Wuhan Univ. J. Nat. Sci.
Volume 28, Number 4, August 2023
Page(s): 309-316
DOI: https://doi.org/10.1051/wujns/2023284309
Published online: 06 September 2023
Computer Science
CLC number: TP391.9
Improved Design of Architecture of the Battlefield Situation Comprehensive Display System
Teaching and Research Department of Unit 91976, PLA, Guangzhou 510430, Guangdong, China
Received: 15 November 2022
The traditional chimney-style software architecture, designed around a single application object, scale and equipment, can hardly meet the urgent need for a multi-resolution, multi-viewpoint, two-dimensional and three-dimensional consistent comprehensive display of the battlefield situation. Referring to the idea of the U.S. Army's common operational picture, the architecture of the current battlefield situation comprehensive display system is upgraded and improved. Aiming at the problems of information splitting and real-time undifferentiated access to multi-domain heterogeneous data, technical means such as spatial analysis and multi-source data integration are put forward, and the problems of fast loading and visualization of 3D models of the large battlefield space are solved by designing force clustering and motion trajectory smoothing algorithms. This method improves the service ability and level of the battlefield situation comprehensive display system and has a broad range of applications.
Key words: common operational picture (COP) / architecture of battlefield situation / visualization / computer model
Biography: LIU Yu, male, Ph.D., research direction: military modeling and simulation, operational training system. Email: 2942509486@qq.com
Foundation item: Support map operations such as zooming in, zooming out, and roaming, and support the display and control of moving targets.
© Wuhan University 2023
This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
0 Introduction
Under informatized conditions, the focus of competition among the warring parties in joint operations is information superiority, and the key to obtaining it is acquiring the battlefield situation promptly and accurately. The Common Operational Picture (COP) is the most effective platform for commanders to perceive and share the battlefield situation and coordinate planning, enabling them to make decisions quickly^{[1,2]}. At present, the research and application of COP in the U.S. military is relatively mature, and COP played an important role in the operational command of the U.S. military in the Afghan War and the Iraq War.
Some domestic institutions and research institutes have developed information systems with COP characteristics, such as the Army's "United 99" and the "Military Situation Integrated Decision-Making System". Some systems have initially achieved integration across the campaign and strategic levels of training, and some have even integrated the main battle weapons with the command system^{[3-5]}. However, most systems suffer from slow data updates and an imperfect release/reading mechanism, resulting in poor interaction. It is difficult to achieve consistency and seamless switching between the global situation and the local situation under different perspectives and resolutions.
The battlefield situation comprehensive display system (hereafter "the System") overlays the force distribution and situation changes of both sides on an electronic map and presents them from the perspective of equipment or individual soldiers, giving users an immersive feeling. The System has two forms of expression, two-dimensional and three-dimensional: the former presents a multi-resolution battlefield situation, and the latter presents a realistic three-dimensional battlefield environment based on virtual reality technology^{[6,7]}. Both meet the personalized view needs of different users.
Based on the typical COP architecture, the System is reconstructed: it is divided into a data layer, a technical layer and an application layer according to logical function, and the technical and tactical indicators are listed^{[8]}. Multivariate data integration technology solves the problem of fusing multivariate heterogeneous data in the data layer, and spatial analysis technology solves the problem of information splitting. Five application subsystems are designed and the interface relations between them are given. A force clustering algorithm and a motion trajectory smoothing algorithm are designed to optimize the display effect and efficiency of large-scale target data in the application layer. Through the aggregation and decoupling of situation information, the granularity can self-adjust in real time at the strategic, campaign, tactical, individual-soldier and equipment levels. Through the sharing of situation information between different levels and combat units, users at all levels of the System can achieve consistent perception and understanding of the battlefield situation.
1 Requirement Analysis
Modern war has a large scale and a wide battlefield space, including physical spaces such as sea, land and air, and abstract spaces such as the electromagnetic, network, information, psychological and social spaces. According to these characteristics, the battlefield space can be divided into the social domain, cognitive domain, information domain and physical domain^{[9]} (see Fig. 1(a)). The battlefield space is analyzed and formally described from two aspects: battlefield situation factors and combat tasks. The general composition of the battlefield data sample space is shown in Fig. 1(b). The elements of the battlefield situation are closely related to combat tasks: battlefield situation factors constrain combat tasks, and different combat tasks correspond to different battlefield situation elements.
Fig. 1 Typical operational domain and battlefield data composition of modern war 
A battlefield situation display system that meets these needs provides the following:
1) A platform that integrates all the information of battlefield space, such as battlefield environment, combat situation, etc.
2) A visual three-dimensional virtual battlefield environment that not only presents the battlefield situation realistically, but also supports spatial analysis, virtual manipulation, immersive scene roaming and other functions.
3) A platform that can provide personalized views for users at different levels.
4) An interface that can be connected to the command automation system and integrated into the combat command system.
The information transmission process from data to the battlefield situation is shown in Fig.2.
Fig. 2 The information transmission process from data to battlefield situation 
2 Architecture Design
The service-oriented architecture of COP provides a variety of information services over the network and also makes it convenient for non-professional users to make military thematic maps, which can solve many information-support problems of the existing battlefield situation comprehensive display system. Based on the typical COP architecture, the battlefield situation display system is divided into a data layer, a technology layer and an application layer according to its logical function. The lower layer provides service support for the upper layer to invoke, and the upper layer aggregates the functions of the lower layer^{[10]} (Fig. 3).
Fig. 3 Architecture diagram of the System based on COP 
According to the relationships between the information, two different visualization methods are adopted to present it (Fig. 4). Data describing the topography, terrain, elevation, electromagnetism and other environmental elements of the task space are integrated into the basic layer and can be presented directly by the System. Data describing the combat forces and other elements of the mission space are integrated into the model layer. Static combat entities describe personnel, weapons and equipment, and dynamic combat entities describe the motion trajectories and trends of static combat entities. The basic layer and model layer support and influence each other: the basic layer provides scene data for loading and display control to the model layer, and the behavior of combat forces in the model layer affects the basic-layer data; for example, an explosion will alter the topography.
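The two-layer organization described above can be sketched as simple data structures; the class and field names below are illustrative assumptions, not identifiers from the System itself.

```python
from dataclasses import dataclass, field

@dataclass
class BasicLayer:
    """Environmental data: topography, terrain, elevation, electromagnetism."""
    terrain: dict = field(default_factory=dict)  # tile id -> elevation grid

    def apply_effect(self, tile_id, new_grid):
        """Model-layer behavior (e.g. an explosion) feeds back into the terrain."""
        self.terrain[tile_id] = new_grid

@dataclass
class StaticEntity:
    """A static combat entity: personnel, a weapon or a piece of equipment."""
    entity_id: str
    position: tuple  # (x, y, z)

@dataclass
class DynamicEntity:
    """A dynamic combat entity: motion trajectory and trend of a static entity."""
    base: StaticEntity
    trajectory: list = field(default_factory=list)  # ordered (t, x, y, z) samples

@dataclass
class ModelLayer:
    """Combat forces of the mission space."""
    entities: list = field(default_factory=list)
```

The mutual influence between layers is then a method call in each direction: the basic layer hands scene tiles to the model layer for loading, and model-layer effects update the terrain through `apply_effect`.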
Fig. 4 Information relationship diagram 
The application layer of the System is composed of the two-dimensional situation generation subsystem, the two-dimensional situation real-time rendering engine, the three-dimensional situation generation subsystem, the three-dimensional situation real-time rendering engine, and the multi-source data access and situation comprehensive display control subsystem. The main timing of the System is as follows:
1) The System starts, and the two/three-dimensional situation real-time rendering engines load the basic data from the local two/three-dimensional environment data subsystem and the multi-source data access and situation comprehensive display control subsystem to generate the two/three-dimensional basic situation.
2) At the same time, the simulation dynamic data interface receives, collates and parses the external real-time data, and then distributes it to the two/three-dimensional situation generation subsystems.
3) The two/three-dimensional situation generation subsystems generate the two/three-dimensional dynamic situation. The dynamic situation and the basic situation are synthesized into a comprehensive situation that is finally displayed to the user.
4) The multi-source data access and situation comprehensive display control subsystem detects whether there is a user operation, receives and processes human-computer interaction instructions, carries out environmental control, calculation and operation, and finally outputs the situation and rendering signals.
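The four timing steps can be condensed into a minimal render loop. The subsystem interfaces are reduced to plain callables here, so every name is a placeholder for illustration rather than part of the System's real API.

```python
def main_loop(base_data, external_feed, get_user_input, render):
    """Minimal sketch of the main timing, steps 1)-4): load the basic
    situation once, then merge incoming dynamic data and user commands."""
    base_situation = dict(base_data)          # 1) load basic data once
    base_situation.setdefault("tracks", [])
    while True:
        for msg in external_feed():           # 2) receive, collate, parse
            base_situation["tracks"].append(msg)
        comprehensive = dict(base_situation)  # 3) synthesize the comprehensive situation
        command = get_user_input()            # 4) human-computer interaction
        if command == "quit":
            break
        render(comprehensive)                 # display to the user
```

In the real System each step is a separate subsystem communicating over the interfaces of Fig. 5; collapsing them into one loop only makes the ordering of the four steps explicit.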
2.1 Interface Relationship
The information interaction and internal interface between the five subsystems in the application layer of the System are shown in Fig.5.
Fig. 5 Internal interface diagram of the application layer 
2.2 Information Splitting
Information splitting in the System refers to the process of responding to user operations, determining the information flow according to user needs, the resources available in the combat area, infrastructure availability and security policies, and showing targeted, consistent information to users in different roles. Spatial analysis, which is a process of selecting areas for analysis, is used to split the information in the System.
Intercept a square with a side length of 2 in the area around the entity and make the following assumptions about the relevant parameters of the entity.
1) Treat the entity as a geometric sphere with a radius of R.
2) The average speed is S.
3) The average length of stay in the square is T.
Regardless of any prior knowledge, the relationship between the square region and the entities within the region can be expressed as
$M=K\times f(R, S, T)$   (1)
where K is a proportional constant.
Assume that the entity moves only in the plane. In the worst case, there is a 50% chance that the entity remains in the square after moving for a certain period of time. By calculating the area ratio of the two square regions, the constant K can be obtained (see Fig. 6).
$\frac{{(Kd-d)}^{2}}{{(Kd)}^{2}}=0.5$   (2)
Fig.6 Method of calculating constants 
Similarly, assume that there is a 50% probability that the entity is still in the cube region after moving in 3D space for a certain period of time. By calculating the volume ratio of the two cube regions, the constant K can also be obtained.
$\frac{{(Kd-d)}^{3}}{{(Kd)}^{3}}=0.5$   (3)
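Because the side length d cancels out of both ratios, the equations fix K independently of the region size: $((K-1)/K)^{\dim}=0.5$, hence $K=1/(1-0.5^{1/\dim})$. A minimal sketch of solving them (function name is illustrative):

```python
def retention_constant(dim: int, p: float = 0.5) -> float:
    """Solve ((K - 1) / K)^dim = p for K, i.e. the ratio of the inner
    region (side/edge Kd - d) to the outer region (side/edge Kd).
    dim = 2 gives the square case of Eq. (2), dim = 3 the cube of Eq. (3)."""
    return 1.0 / (1.0 - p ** (1.0 / dim))

K2 = retention_constant(2)  # plane motion: K is approximately 3.414
K3 = retention_constant(3)  # 3D motion:    K is approximately 4.847
```

The 3D constant is larger than the 2D one, reflecting that an entity free to move in three dimensions leaves a region of a given linear size more easily.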
2.3 Data Acquisition and Processing
The data comes from a variety of sources and must be preprocessed before it can be used. Data acquisition and processing are carried out according to the following steps.
1) Receive data. The data mainly come from the following sources.
a. Remote sensing satellite, navigation satellite, early warning satellite, meteorological satellite, etc.
b. Ground measurement and control station, data receiving station, etc.
c. Report of electromagnetic, nuclear, biochemical, surveying and mapping departments.
d. Information bulletin from higher departments.
e. Data from radar, infrared, photoelectric, sonar and other sensors.
f. Data obtained by our combat units through their own equipment.
g. Processed information about the enemy's actions, plans, intentions, etc.
h. Computer-generated force data that act as enemy or supporting forces.
2) Integrate data. Clean the original data from different times and different sources, resolve consistency of the time-space benchmark, data semantics, data format and so on, and then build a multi-source spatial situation data assimilation model.
3) Fuse data. Simulate the way the human brain deals with complex problems to analyze data, fuse and reason out deeper information.
4) Maintain the consistency of data. Data is stored in a distributed database. The data synchronization mechanism of the database management system is used to maintain the consistency among the nodes of the database.
The integration of multi-source data ensures the uniqueness and interoperability of the data. By integrating data, target situation information with high accuracy and reliability can be obtained^{[11]}. Through multi-level, large-scale data fusion, deeper information can be mined to improve the recognition rate of target situation information and expand the space-time range of situation information.
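Step 2) above, unifying the time-space benchmark and data format across sources, can be sketched as a small normalization pass. The field names and formats below are assumptions for illustration only; real sources would each need their own adapter.

```python
from datetime import datetime, timezone

def normalize_report(raw: dict) -> dict:
    """Map one source-specific report onto a common time-space benchmark:
    UTC epoch seconds for time, (lon, lat) degrees for position."""
    ts = raw.get("time")
    if isinstance(ts, str):                   # e.g. "2022-11-15T08:00:00"
        ts = datetime.fromisoformat(ts).replace(tzinfo=timezone.utc).timestamp()
    # Sources may disagree on coordinate key names; accept both spellings.
    lon = raw.get("lon", raw.get("longitude"))
    lat = raw.get("lat", raw.get("latitude"))
    return {"time": float(ts), "lon": float(lon), "lat": float(lat),
            "source": raw.get("source", "unknown")}
```

After every report is in this common form, the fusion step 3) can compare and merge records from radar, satellite and manual bulletins without per-source special cases.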
2.4 Main Index
Function items are the concrete implementation of system requirements and the direct result of architecture design, and can be refined according to the actual situation. They mainly include the following aspects.
1) Support to add and delete 3D models, adjust 3D effects, and control position, posture, size, speed and other states.
2) Support to call physical engine interfaces such as collision detection, motion computing, physical damage, etc.
3) Support distance measurement, area coverage calculation, plotting and layer control.
4) Support local loading of commonly used raster files (such as bmp, png, jpg, tiff, geotiff) and vector files (such as shp, mif, tab, gdb), and support layer manipulation.
5) Support displaying map layers at different scales.
6) Support saving and printing maps as picture files in pdf, png, jpeg, bmp and tiff formats.
7) Support multi-dimensional, perceptible and measurable rendering of the virtual battlefield environment, including the basic geographical environment such as topography and landform, and thematic environments such as electromagnetism, meteorology, hydrology and network.
8) Support access to battlefield situational awareness information resources from sea, land, air, space, network and electromagnetic space. Support a comprehensive display of the full-space battlefield situation. Support the visualization and analysis of the working range of sensors and communication links. Support the display of routes, orbits, coordinate systems and grids.
9) Support the dynamic presentation of the position, trajectory and air posture of weapons and equipment.
10) Support operations such as compression, denoising, smoothing and playback of target tracks. Support outlier monitoring, trajectory feature point extraction and target track fusion.
11) Support the display, control and linkage of 2D and 3D views.
12) Support high-resolution, multi-channel, parallel rendering output.
Performance items intuitively reflect the constraints that the environment and interaction place on the architecture, including the following aspects.
1) Real-time requirements for the delay of simulation synchronization information.
2) The time requirement for refreshing the display of a 2D map after loading 2D military symbols.
3) The time requirement for refreshing the display of a 2D map after loading the 2D situation layer file. Requirements for the number of symbols plotted on each layer.
4) Frame rate requirements for refreshing the 2D dynamic targets.
5) Frame rate requirements for the number of 3D dynamic targets displayed on the same screen.
6) Requirements for data types for 3D visualization.
3 Key Technology
3.1 Force Clustering Algorithm
On the battlefield, discrete combat forces generally move in formation; therefore, distance can be used to cluster military situation information. Aggregating targets improves the efficiency of scene rendering, which helps the System respond to user operations more quickly.
The definition of the distance-based force clustering algorithm is given below. A maximum distance d is given for the n points representing targets in N-dimensional space $\mathbb{R}^{N}$, and the n discrete points are clustered according to the following conditions:

$\forall {x}_{i}\in {S}_{l},\ \exists {x}_{j}\in {S}_{l}\ (j\ne i):\ \Vert {x}_{i}-{x}_{j}\Vert \le d$, where ${S}_{l}$ denotes the $l$th cluster;

$\forall {x}_{i}\in {S}_{l},\ \forall {x}_{j}\in {S}_{m}\ (l\ne m):\ \Vert {x}_{i}-{x}_{j}\Vert > d$, where ${S}_{m}$ denotes the $m$th cluster.
There are two advantages of the force clustering algorithm:
1) It is not an NP-hard problem^{[12]}.
2) The distance between two points can be computed efficiently with a spatial geometric model.
The implementation process of the force clustering algorithm is shown in Fig.7.
Fig.7 Force clustering algorithm based on distance 
Input: n points representing targets
Output: clusters representing battlefield situations
Algorithm description:
1) Capture the combat forces in battlefield space and mark them as points.
2) Connect all points into a complete graph, and compute the distance between each pair of points with a spatial geometric model.
3) Cut the edges longer than the maximum distance d.
4) Each connected point set forms a cluster. Output all clusters.
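The four steps above amount to single-linkage clustering with threshold d: cut the long edges of the complete graph and collect connected components. A sketch that merges points pair by pair with a union-find structure, which yields the same clusters without storing the graph explicitly:

```python
import math

def cluster_forces(points, d):
    """Cluster target points so that every point has a neighbour within
    distance d inside its cluster, and distinct clusters are separated
    by more than d (single-linkage clustering with threshold d)."""
    n = len(points)
    parent = list(range(n))

    def find(i):
        # Find the root of point i with path compression.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    # Steps 2)-3): examine every edge of the complete graph and keep
    # (union) only those not longer than the maximum distance d.
    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(points[i], points[j]) <= d:
                parent[find(i)] = find(j)

    # Step 4): each connected component is one output cluster.
    clusters = {}
    for i in range(n):
        clusters.setdefault(find(i), []).append(points[i])
    return list(clusters.values())
```

The pairwise loop is O(n²), matching the complete-graph formulation; for very large n a spatial index could prune distant pairs, but that optimization is outside what the paper specifies.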
3.2 Motion Trajectory Smoothing Technology
The trajectory data generated by a moving target is a series of constantly changing, unpredictable points. When visualizing the trajectory, directly connecting adjacent points in order makes the curve jump, which looks unnatural, as shown in Fig. 8(a). The trajectory curve therefore needs to be smoothed. The smoothing process can be divided into two stages, calculating and updating, as shown in Fig. 8(b).
Fig.8 Examples of motion trajectory smoothing technology 
When the moving target does not generate new trajectory point data, its future direction and position are calculated from its current motion state. Considering the inertia of motion, it is usually assumed that the velocity and acceleration of the moving target remain constant. Because of the large number of moving targets in the battlefield space, the calculating functions should be as simple and uniform as possible so that the System responds quickly to user operations. No matter how fast the moving target is, low-degree polynomial functions achieve good results.
Set the initial coordinates of the target as $({x}_{0},{y}_{0},{z}_{0})$, the initial velocity as $({v}_{{x}_{0}},{v}_{{y}_{0}},{v}_{{z}_{0}})$, and the initial acceleration as $({a}_{{x}_{0}},{a}_{{y}_{0}},{a}_{{z}_{0}})$. The coordinates $({x}_{k},{y}_{k},{z}_{k})$ of the target after moving for time $k\Delta T$ in the 3D scene can be calculated with the following linear and quadratic function groups.
Linear function group:
$\left\{\begin{array}{l}{x}_{k}={x}_{0}+{v}_{{x}_{0}}\cdot k\cdot \Delta T\\ {y}_{k}={y}_{0}+{v}_{{y}_{0}}\cdot k\cdot \Delta T\\ {z}_{k}={z}_{0}+{v}_{{z}_{0}}\cdot k\cdot \Delta T\end{array}\right.$
Quadratic function group:
$\left\{\begin{array}{l}{x}_{k}={x}_{0}+{v}_{{x}_{0}}\cdot k\cdot \Delta T+0.5\cdot {a}_{{x}_{0}}\cdot {(k\cdot \Delta T)}^{2}\\ {y}_{k}={y}_{0}+{v}_{{y}_{0}}\cdot k\cdot \Delta T+0.5\cdot {a}_{{y}_{0}}\cdot {(k\cdot \Delta T)}^{2}\\ {z}_{k}={z}_{0}+{v}_{{z}_{0}}\cdot k\cdot \Delta T+0.5\cdot {a}_{{z}_{0}}\cdot {(k\cdot \Delta T)}^{2}\end{array}\right.$
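The two function groups are ordinary dead reckoning under constant velocity and acceleration; the quadratic case can be sketched directly:

```python
def predict_position(p0, v0, a0, k, dt):
    """Dead-reckon a target's position after k steps of length dt,
    assuming constant velocity and acceleration (the quadratic function
    group). p0, v0, a0 are (x, y, z) tuples; returns the predicted tuple."""
    t = k * dt  # elapsed time k * delta_T
    return tuple(p + v * t + 0.5 * a * t * t for p, v, a in zip(p0, v0, a0))
```

Setting the acceleration to zero reduces this to the linear function group, so one routine serves both cases, in line with the paper's goal of simple, uniform calculating functions.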
When the moving target generates new trajectory point data, the System updates the trajectory curve. Data transmission in the System generally has a time delay, and the longer the delay, the greater the deviation between the calculated coordinates and the actual coordinates. To reduce the deviation, the relevant parameters of the calculating function group are updated as soon as new data is received, and the target is moved to its current actual coordinates in one step according to the time difference; the calculating function above then continues to compute coordinates at the preset step. This strategy corrects the deviation between the calculated and actual coordinates in one step, but if the time delay is too long, the trajectory curve may show an unnatural jump. The following smoothing function solves this problem:
${x}_{i}={x}_{\mathrm{old}}+i\cdot \frac{{x}_{\mathrm{new}}-{x}_{\mathrm{old}}}{n}$
where ${x}_{i}$ is the smoothed value at step $i$, ${x}_{\mathrm{old}}$ is the calculated coordinate when the new data is received, ${x}_{\mathrm{new}}$ is the revised new coordinate, $n$ is the number of steps, and $i\in [1,n]$.
The value of n is critical: the larger n is, the better the smoothing effect, but the longer the smoothing takes; conversely, the smaller n is, the shorter the time, but the worse the effect. After the smoothing finishes, the calculating function above continues to be used to compute the coordinates.
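A minimal sketch of the smoothing function, applied per coordinate axis:

```python
def smooth_transition(x_old, x_new, n):
    """Return the n intermediate values x_i = x_old + i * (x_new - x_old) / n
    for i = 1..n: a linear walk from the stale dead-reckoned coordinate
    x_old to the corrected coordinate x_new, ending exactly at x_new."""
    step = (x_new - x_old) / n
    return [x_old + i * step for i in range(1, n + 1)]
```

With n = 5 the target reaches the corrected coordinate in five equal steps, trading a short transition time for a jump-free curve, which is exactly the n trade-off described above.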
4 Conclusion
Using the core concept of the Common Operational Picture to improve the architecture of the System can effectively solve existing problems such as information splitting and real-time undifferentiated access to multi-domain heterogeneous data. The System enables users to accurately and intuitively obtain the required information in a dynamic, interactive 3D virtual battlefield environment, supports battlefield planning, command decision-making and command and control with visualization technology, and helps users understand the current battlefield situation and predict its future development trend. The next step will focus on refining the models, improving the algorithms and handling massive spatiotemporal data more effectively.
References
[1] Bouzekri E, Canny A, Martinie C, et al. Deep system knowledge required: Revisiting UCD contribution in the design of complex command and control systems[C]// 2019 International Federation for Information Processing. Washington D C: IEEE, 2019: 699-720.
[2] Armenis D. An experiment on the utility of blue force tracker: The costs and benefits of having God's eye view[J]. International Journal of Intelligent Defence Support Systems, 2010, 3(3/4): 207-224.
[3] Vacca W A. The social context of technological systems: Dreadnoughts, computers, and flags[J]. Environment Systems and Decisions, 2019, 39(2): 154-162.
[4] Schmidtke H R. TextMap: A general purpose visualization system[J]. Cognitive Systems Research, 2020, 59: 27-36.
[5] Zhang J R, Wang G, Wang S Y. Command and control system construction in big data era[C]// 2018 International Conference on Computer Information Science and Application Technology. New York: IEEE, 2018: 1-6.
[6] Dong J, Wu G W, Yang T T, et al. Battlefield situation awareness and networking based on agent distributed computing[J]. Physical Communication, 2019, 33(C): 178-186.
[7] Lin K, Xia F Z, Li C S, et al. Emotion-aware system design for the battlefield environment[J]. Information Fusion, 2019, 47: 102-110.
[8] Gao Y, Ma X H, Jiang T, et al. Based on two and three dimensional technology to quickly build a virtual battlefield[J]. Energy Procedia, 2012, 17: 630-637.
[9] Regragui Y, Moussa N. Agent-based system simulation of wireless battlefield networks[J]. Computers and Electrical Engineering, 2016, 56: 313-333.
[10] Zong W, Chow Y W, Susilo W. Interactive three-dimensional visualization of network intrusion detection data for machine learning[J]. Future Generation Computer Systems, 2019, 102: 292-306.
[11] Fan L J, Ling Y X, Liao L C, et al. An improved evaluation method based on cloud models for situation consistency within the battlefield of joint operations[J]. Procedia Engineering, 2012, 29: 1590-1595.
[12] Li J C, Ge B F, Zhao D L, et al. Meta-path-based weapon-target recommendation in heterogeneous combat network[J]. IEEE Systems Journal, 2019, 13(4): 4433-4441.