Open Access
Wuhan Univ. J. Nat. Sci.
Volume 28, Number 4, August 2023
Page(s) 309 - 316
Published online 06 September 2023

© Wuhan University 2023

This is an Open Access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

0 Introduction

In informatized joint operations, the warring parties compete above all for information superiority, and the key to gaining it is to obtain the battlefield situation promptly and accurately. The Common Operational Picture (COP) is the most effective platform for commanders to perceive and share the battlefield situation and to coordinate planning, enabling rapid decision-making[1,2]. At present, research on and application of COP in the U.S. military are relatively mature, and COP played an important role in U.S. operational command during the wars in Afghanistan and Iraq.

Some domestic institutions and research institutes have developed information systems with COP characteristics, such as the Army's "United 99" and the "Military Situation Integrated Decision-Making System". Some systems have initially integrated training at the campaign and strategic levels, and some have even integrated the main battle weapons with the command system[3-5]. However, most systems suffer from slow data updates and an imperfect release/reading mechanism, resulting in poor interactivity, and they find it difficult to keep the global and local situations consistent and to switch between them seamlessly across different perspectives and resolutions.

The battlefield situation comprehensive display system (hereafter "the System") overlays the force distribution and situation changes of both sides on an electronic map and presents them from the perspective of equipment or individual soldiers, giving users an immersive experience. The System has two forms of expression, two-dimensional and three-dimensional: the former presents a multi-resolution battlefield situation, while the latter presents a realistic three-dimensional battlefield environment based on virtual reality technology[6,7]. Both meet the personalized view needs of different users.

Based on the typical COP architecture, the System is reconstructed: it is divided by logical function into a data layer, a technical layer and an application layer, and its technical and tactical indicators are listed[8]. Multivariate data integration technology solves the fusion of heterogeneous multi-source data in the data layer, and spatial analysis technology solves the problem of information splitting. Five application subsystems are designed and the interfaces between them are specified. A force clustering algorithm and a motion trajectory smoothing algorithm are designed to optimize the display effect and efficiency for large volumes of target data in the application layer. Through the aggregation and decoupling of situation information, the granularity can adjust itself in real time at every level: strategy, campaign, tactics, individual soldiers, equipment and so on. By sharing situation information across levels and combat units, users at all levels of the System achieve a consistent perception and understanding of the battlefield situation.

1 Requirement Analysis

Modern war is large in scale and wide in battlefield space, spanning physical spaces such as sea, land and air as well as abstract spaces such as the electromagnetic, network, information, psychological and social spaces. By their characteristics, these can be divided into the social, cognitive, information and physical domains[9] (see Fig. 1(a)). The battlefield space is analyzed and formally described from two aspects: battlefield situation factors and combat tasks. The general composition of the battlefield data sample space is shown in Fig. 1(b). The elements of the battlefield situation are closely related to combat tasks: battlefield situation factors constrain combat tasks, and different combat tasks correspond to different battlefield situation elements.

Fig. 1

Typical operational domain and battlefield data composition of modern war

A battlefield situation display system that meets these needs should provide:

1) A platform that integrates all the information of battlefield space, such as battlefield environment, combat situation, etc.

2) A visual three-dimensional virtual battlefield environment that not only presents the battlefield situation realistically, but also supports spatial analysis, virtual manipulation, immersive scene roaming and other functions.

3) A platform that can provide personalized views for users at different levels.

4) An interface that can be connected to the command automation system and integrated into the combat command system.

The information transmission process from data to the battlefield situation is shown in Fig. 2.

Fig. 2

The information transmission process from data to battlefield situation

2 Architecture Design

The service-oriented COP architecture provides a variety of information services over the network and also makes it convenient for non-professional users to produce military thematic maps, which solves many problems of existing battlefield situation comprehensive display systems with information support. Based on the typical COP architecture, the battlefield situation display system is divided by logical function into a data layer, a technology layer and an application layer. Each lower layer provides services for the layer above to invoke, and each upper layer aggregates the functions of the layer below[10] (Fig. 3).

Fig. 3

Architecture diagram of the System based on COP

According to the relationships among the information, two different visualization methods are adopted (Fig. 4). Data describing the topography, terrain, elevation, electromagnetism and other environmental elements of the task space are integrated into the basic layer and can be presented directly by the System. Data describing the combat forces and other elements of the mission space are integrated into the model layer: static combat entities describe personnel, weapons and equipment, while dynamic combat entities describe the motion trajectories and trends of the static entities. The basic layer and the model layer both support and influence each other: the basic layer provides the scene data for loading and display control in the model layer, while the behavior of the combat forces in the model layer can alter the basic layer data, for example when an explosion destroys the terrain.

Fig. 4

Information relationship diagram

The application layer of the System comprises five subsystems: the two-dimensional situation generation subsystem, the two-dimensional situation real-time rendering engine, the three-dimensional situation generation subsystem, the three-dimensional situation real-time rendering engine, and the multi-source data access and situation comprehensive display control subsystem. The main operating sequence of the System is as follows:

1) The System starts, and the two/three-dimensional situation real-time rendering engine loads the basic data of the local two/three-dimensional environment data subsystem and multi-source data access and situation comprehensive display control subsystem to generate the two/three-dimensional basic situation.

2) At the same time, the simulation dynamic data interface receives, collates and parses the external real-time data, and then distributes it to the two/three-dimensional situation generation subsystem.

3) The two/three-dimensional situation generation subsystem generates the two/three-dimensional dynamic situation. The dynamic situation and the basic situation are synthesized to generate a comprehensive situation that is finally displayed to the user.

4) The multi-source data access and situation comprehensive display control subsystem detects whether there is a user operation, receives and processes human-computer interaction instructions, carries out environmental control, calculation and operation, and finally outputs the situation and rendering signals.

2.1 Interface Relationship

The information interaction and internal interface between the five subsystems in the application layer of the System are shown in Fig.5.

Fig. 5

Internal interface diagram of the application layer

2.2 Information Splitting

Information splitting in the System is the process of responding to users' operations, determining the information flow according to user needs, the resources available in the combat area, infrastructure availability and security policies, and showing targeted and consistent information to users in different roles. In the System, information is split using spatial analysis, a process of selecting areas for analysis.

Intercept a square with a side length of 2 in the area where the entity is located, and make the following assumptions about the entity's parameters.

1) Treat the entity as a geometric sphere with a radius of R.

2) The average speed is S.

3) The average length of stay in the square is T.

Regardless of any prior knowledge, the relationship between the square region and the entities within the region can be expressed as


where K is a proportional constant.

Assume that the entity moves only in the plane. In the worst case, there is a 50% chance that the entity remains in the square after moving for a certain period of time. The constant K can then be obtained by calculating the area ratio of the two square regions (see Fig. 6).


Fig. 6

Method of calculating constants

Similarly, assume there is a 50% probability that the entity is still in the cube region after moving in 3D space for a certain period of time. The constant K can then be obtained by calculating the volume ratio of the two cube regions.
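Since the equations themselves are not reproduced in this text, the constant K can only be sketched under an assumed reading: with a 50% retention probability, K is taken as the side-length ratio of two nested squares (2D) or cubes (3D) whose area or volume ratio is 1/2.

```python
def retention_constant(dim: int, p: float = 0.5) -> float:
    """Side-length ratio of two nested square (dim=2) or cube (dim=3)
    regions whose area (or volume) ratio equals the retention
    probability p. This reading of the derivation is an assumption."""
    return p ** (1.0 / dim)

# 2D: area ratio 1/2 -> side ratio sqrt(0.5), roughly 0.707
K_2D = retention_constant(2)
# 3D: volume ratio 1/2 -> side ratio 0.5**(1/3), roughly 0.794
K_3D = retention_constant(3)
```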


2.3 Data Acquisition and Processing

The data comes from a variety of sources and must be preprocessed before it can be used. Data acquisition and processing are carried out according to the following steps.

1) Receive data. The data mainly come from the following sources.

a. Remote sensing satellite, navigation satellite, early warning satellite, meteorological satellite, etc.

b. Ground measurement and control station, data receiving station, etc.

c. Report of electromagnetic, nuclear, biochemical, surveying and mapping departments.

d. Information bulletin from higher departments.

e. Data from radar, infrared, photoelectric, sonar and other sensors.

f. Data obtained by our combat units through their own equipment.

g. Processed information about the enemy's actions, plans, intentions, etc.

h. Computer-generated force data that act as enemy or support forces.

2) Integrate data. Clean the original data from different times and sources, resolve the consistency of the time-space benchmark, data semantics and data format, and then build a multi-source spatial situation data assimilation model.

3) Fuse data. Simulate the way the human brain deals with complex problems to analyze data, fuse and reason out deeper information.

4) Maintain the consistency of data. Data is stored in a distributed database. The data synchronization mechanism of the database management system is used to maintain the consistency among the nodes of the database.

Multi-source data integration technology ensures the uniqueness and interoperability of the data. By integrating data, target situation information with high accuracy and reliability can be obtained[11]. Through multi-level and large-scale data fusion, deeper information can be mined to improve the recognition rate of target situation information and expand the space-time range of the situation information.
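The integration step above (unifying the time-space benchmark and formats of multi-source records) can be sketched as follows; the record fields, source names and validity checks here are hypothetical stand-ins for the System's real assimilation model.

```python
from dataclasses import dataclass
from datetime import datetime, timezone, timedelta

@dataclass
class Track:
    source: str          # e.g. "radar", "satellite" (illustrative)
    timestamp: datetime  # may arrive with an arbitrary UTC offset
    lon: float           # degrees; assumed on a common datum after cleaning
    lat: float

def normalize(records):
    """Clean multi-source records onto a common time-space benchmark:
    convert all timestamps to UTC, drop records whose coordinates fall
    outside the valid range (a stand-in for fuller consistency checks),
    and sort the result by time."""
    cleaned = []
    for r in records:
        ts = r.timestamp.astimezone(timezone.utc)
        if -180.0 <= r.lon <= 180.0 and -90.0 <= r.lat <= 90.0:
            cleaned.append(Track(r.source, ts, r.lon, r.lat))
    return sorted(cleaned, key=lambda r: r.timestamp)
```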

2.4 Main Index

Function items are both the concrete implementation of the system requirements and the direct result of the architecture design, and they can be refined according to the actual situation. They mainly include the following aspects.

1) Support to add and delete 3D models, adjust 3D effects, and control position, posture, size, speed and other states.

2) Support to call physical engine interfaces such as collision detection, motion computing, physical damage, etc.

3) Support distance measurement, area coverage calculation, plotting and layer control.

4) Support local loading of commonly used raster files (such as bmp, png, jpg, tiff, geotiff) and vector files (such as shp, mif, tab, gdb), and manipulation of their layers.

5) Support displaying map layers at different scales.

6) Support saving and printing maps as picture files in pdf, png, jpeg, bmp and tiff formats.

7) Support multi-dimensional, perceptible and measurable rendering of the virtual battlefield environment, including the basic geographical environment such as topography and landform, and thematic environments such as electromagnetism, meteorology, hydrology and network.

8) Support access to battlefield situational awareness information resources from sea, land, air, space, network and electromagnetic space. Support a comprehensive display of the full-space battlefield situation. Support visualization and analysis of the working ranges of sensors and communication links. Support the display of routes, orbits, coordinate systems and grids.

9) Support dynamic presentation of the position, trajectory and air posture of weapons and equipment.

10) Support operations such as compression, denoising, smoothing and playback of target tracks. Support outlier monitoring, trajectory feature point extraction and target track fusion.

11) Support the display, control and linkage of 2D and 3D views.

12) Support high-resolution, multi-channel, parallel rendering output.

Performance items directly reflect the architecture's environmental and interaction constraints, including the following aspects.

1) Latency requirements for the real-time delivery of simulation synchronization information.

2) The time requirement for refreshing the display of a 2D map after loading 2D military symbols.

3) The time requirement for refreshing the display of a 2D map after loading the 2D situation layer file. Requirements for the number of symbols plotted on each layer.

4) Frame rate requirements for refreshing the 2D dynamic targets.

5) Frame rate requirements for the number of 3D dynamic targets displayed on the same screen.

6) Requirements for data types for 3D visualization.

3 Key Technology

3.1 Force Clustering Algorithm

On the battlefield, discrete combat forces generally move in formation; therefore, distance can be used to cluster military situation information. Aggregating targets improves the efficiency of scene rendering, which helps the System respond more quickly to users' operations.

The distance-based force clustering algorithm is defined as follows. Given n points representing targets in N-dimensional space and a maximum distance R, the n discrete points are clustered according to the following conditions:

If the distance between two points does not exceed R, they belong to the same cluster.

If two points are not connected by any chain of points with pairwise distances no greater than R, they belong to different clusters.

There are two advantages of the force clustering algorithm:

1) It is not an NP-hard problem[12].

2) The distance between two points can be calculated efficiently using a spatial geometric model.

The implementation process of the force clustering algorithm is shown in Fig.7.

Fig. 7

Force clustering algorithm based on distance

Input: n points representing targets

Output: clusters representing battlefield situations

Algorithm description:

1) Capture the combat forces in battlefield space and mark each of them with a point.

2) Connect all points to form a complete graph, and calculate the distance between each pair of points using a spatial geometric model.

3) Cut the edges longer than the maximum distance R.

4) Each connected point set forms a cluster. Output all clusters.
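Steps 1)-4) amount to computing the connected components of the graph whose remaining edges are the pairwise distances not exceeding R. A minimal union-find sketch (the point representation and function name are ours):

```python
import math

def force_clusters(points, R):
    """Cluster points (coordinate tuples) so that two points share a
    cluster iff a chain of edges of length <= R connects them.
    Implemented as union-find over all pairwise distances."""
    n = len(points)
    parent = list(range(n))

    def find(i):
        # Find the root representative, with path halving.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    def union(i, j):
        parent[find(i)] = find(j)

    for i in range(n):
        for j in range(i + 1, n):
            # Keeping only edges <= R is the "cut" step of the algorithm.
            if math.dist(points[i], points[j]) <= R:
                union(i, j)

    clusters = {}
    for i in range(n):
        clusters.setdefault(find(i), []).append(points[i])
    return list(clusters.values())
```

Cutting edges longer than R and taking connected components is equivalent to single-linkage clustering with distance threshold R, which is why the problem stays polynomial rather than NP-hard.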

3.2 Motion Trajectory Smoothing Technology

The trajectory data generated by a moving target is a series of constantly changing, unpredictable points. When visualizing the trajectory, directly connecting adjacent points in order makes the trajectory curve jump, which looks unnatural, as shown in Fig. 8(a). The trajectory curve therefore needs to be smoothed to appear more natural. The smoothing process can be divided into two stages, calculating and updating, as shown in Fig. 8(b).

Fig. 8

Examples of motion trajectory smoothing technology

When the moving target has not yet generated new trajectory point data, its future direction and position are calculated from its current motion state. Considering the inertia of motion, the velocity and acceleration of the moving target are usually assumed to remain constant. Because the battlefield space contains a large number of moving targets, the calculating functions should be as simple and uniform as possible so that the System responds quickly to users' operations; no matter how fast a target moves, low-order polynomial functions achieve good results.

Let the initial coordinate of the target be (x0, y0, z0), its initial velocity (vx, vy, vz) and its acceleration (ax, ay, az). The coordinates (x, y, z) of the target after moving for time t in the 3D scene can be calculated with the following linear or quadratic function groups.

Linear function group: x = x0 + vx·t, y = y0 + vy·t, z = z0 + vz·t

Quadratic function group: x = x0 + vx·t + ax·t²/2, y = y0 + vy·t + ay·t²/2, z = z0 + vz·t + az·t²/2
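The linear and quadratic function groups are uniform-velocity and uniform-acceleration kinematics applied per axis; a sketch with assumed symbol names:

```python
def predict_linear(p0, v, t):
    """Uniform velocity per axis: x = x0 + vx*t."""
    return tuple(x + vx * t for x, vx in zip(p0, v))

def predict_quadratic(p0, v, a, t):
    """Uniform acceleration per axis: x = x0 + vx*t + 0.5*ax*t**2."""
    return tuple(x + vx * t + 0.5 * ax * t * t
                 for x, vx, ax in zip(p0, v, a))
```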

When the moving target generates new trajectory point data, the System updates the trajectory curve. Data transmission in the System generally has a time delay: the longer the delay, the greater the deviation between the calculated coordinates and the actual coordinates. To reduce the deviation, the parameters of the calculating function group are updated as soon as new data are received, and the target is moved to its current actual coordinate in one step according to the time difference; the calculating functions above then continue to compute coordinates at the preset step. This strategy corrects the deviation between the calculated and actual coordinates in one step, but if the time delay is too long, the target's trajectory curve may show an unnatural jump in the System. The following smoothing function solves this problem:

p_i = p_c + (i/n)·(p_r − p_c), i = 1, 2, …, n

where p_i is the smoothing value of step i, p_c is the calculated coordinate when the new data is received, p_r is the revised new coordinate, and n is the number of steps.

The value of n is critical: the larger n is, the better the smoothing effect but the longer it takes; conversely, the smaller n is, the shorter it takes but the worse the smoothing effect. After the smoothing process finishes, the calculating functions above continue to compute the coordinates.
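Assuming the smoothing function linearly interpolates from the calculated coordinate p_c toward the revised coordinate p_r over n steps (our reading; the exact formula is not reproduced in this text), it can be sketched as:

```python
def smooth_steps(calculated, revised, n):
    """Spread the correction from the calculated coordinate to the
    revised coordinate over n rendering steps, returning the list of
    intermediate values p_i = p_c + (i/n) * (p_r - p_c), i = 1..n."""
    return [
        tuple(c + (i / n) * (r - c) for c, r in zip(calculated, revised))
        for i in range(1, n + 1)
    ]
```

A larger n yields a gentler correction at the cost of taking longer to converge on the actual coordinate, matching the trade-off described above.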

4 Conclusion

Using the core concept of the Common Operational Picture to improve the system architecture can effectively solve existing problems such as information splitting and real-time access to multi-domain, heterogeneous multivariate data. The System enables users to obtain the required information accurately and intuitively in a dynamic, interactive 3D virtual battlefield environment; supports battlefield planning, command decision-making and command and control through visualization technology; and helps users understand the current battlefield situation and predict its future development. The next step will focus on refining the models, improving the algorithms and handling massive spatio-temporal data more effectively.


  1. Bouzekri E, Canny A, Martinie C, et al. Deep system knowledge required: Revisiting UCD contribution in the design of complex command and control systems[C]// 2019 International Federation for Information Processing. Washington D C: IEEE, 2019: 699-720. [Google Scholar]
  2. Armenis D. An experiment on the utility of blue force tracker: The costs and benefits of having God's eye view[J]. International Journal of Intelligent Defence Support Systems, 2010, 3(3/4): 207-224. [CrossRef] [Google Scholar]
  3. Vacca W A. The social context of technological systems: Dreadnoughts, computers, and flags[J]. Environment Systems and Decisions, 2019, 39(2): 154-162. [CrossRef] [Google Scholar]
  4. Schmidtke H R. TextMap: A general purpose visualization system[J]. Cognitive Systems Research, 2020, 59: 27-36. [CrossRef] [Google Scholar]
  5. Zhang J R, Wang G, Wang S Y. Command and control system construction in big data era[C]// 2018 International Conference on Computer Information Science and Application Technology. New York: IEEE, 2018: 1-6. [Google Scholar]
  6. Dong J, Wu G W, Yang T T, et al. Battlefield situation awareness and networking based on agent distributed computing[J]. Physical Communication, 2019, 33(c): 178-186. [CrossRef] [Google Scholar]
  7. Lin K, Xia F Z, Li C S, et al. Emotion-aware system design for the battlefield environment[J]. Information Fusion, 2019, 47: 102-110. [CrossRef] [Google Scholar]
  8. Gao Y, Ma X H, Jiang T, et al. Based on two and three dimensional technology to quickly build a virtual battlefield[J]. Energy Procedia, 2012, 17: 630-637. [CrossRef] [Google Scholar]
  9. Regragui Y, Moussa N. Agent-based system simulation of wireless battlefield networks[J]. Computers and Electrical Engineering, 2016, 56: 313-333. [CrossRef] [Google Scholar]
  10. Zong W, Chow Y W, Susilo W. Interactive three-dimensional visualization of network intrusion detection data for machine learning[J]. Future Generation Computer Systems, 2019, 102: 292-306. [Google Scholar]
  11. Fan L J, Ling Y X, Liao L C, et al. An improved evaluation method based on cloud models for situation consistency within the battlefield of joint operations[J]. Procedia Engineering, 2012, 29: 1590-1595. [CrossRef] [Google Scholar]
  12. Li J C, Ge B F, Zhao D L, et al. Meta-path-based weapon-target recommendation in heterogeneous combat network[J]. IEEE Systems Journal, 2019, 13(4): 4433-4441. [NASA ADS] [CrossRef] [Google Scholar]

