By Mike Engle, Shahram Sarkani and Thomas Mazzuchi


Abstract

Multisensor data fusion (MSDF) has been researched for decades, yet programs relying on it to provide a situational, or threat, assessment continue to be less than successful. To alleviate the too-much-information, too-few-analysts problem, a better approach must be found. A survey of recent and current data fusion programs was conducted, along with a literature review of how different organizations handle a fusion-based assessment. Key points from this study were used to develop an adaptation of existing models that can provide an improved assessment while simplifying the process needed to get there.

MSDF has long been a goal of the DoD and the warfighter. It promises to combine information from multiple sensors to determine what traditionally could not be determined by any one sensor alone, whether because of technological limitations or geographic restrictions. Multisensor systems can be used to increase geolocation accuracy, reduce uncertainty, automatically extract man-made features, and quickly identify potential targets. When expanded to include higher-level fusion capabilities, MSDF tools can help anticipate the future actions of these potential targets or provide recommendations for anticipated decision points. It is no longer enough to simply provide image registration or to combine sensor-level information when higher-level, fusion-based software promises improved situational awareness and autonomous decision-making aids.

The demand for MSDF systems has only increased in the era of near-ubiquitous sensors. With more sensors, especially with the move toward persistent collection, come more data and the need for more analysts to review them. The problem has long since arrived: there is too much data for too few analysts. The goal is not to replace the analyst but to better enable analysts to use the information that is already available. How often has the world been surprised by a significant incident, only to find later that indicators were available to prevent it? Events like the 2009 Christmas Day bombing attempt or the Ft. Hood shooting were preceded by sufficient indicators; all that was needed was someone to piece together the parts in a timely manner.

It should already be clear why data fusion has been researched for decades. Still today, dozens of contractors and universities working with multiple agencies continue searching for solutions [1, 2]. The National Geospatial-Intelligence Agency (NGA) lists multi-source and multi-INT fusion as a research priority and has asked for help in tackling what it considers a hard problem [3]. In fact, the NGA has increased research into different fusion technologies to such an extent that other agencies have reduced their funding [4].

Unfortunately, many of these fusion programs have been less than successful, and the golden age of sensor fusion has not yet arrived [3, 4]. Several factors contribute to this. At the sensor level, these systems must combine data with varying temporal, spatial, spectral, and radiometric characteristics. They “may be heterogeneous, possibly asynchronous, and not identically georeferenced due to motion, limited fields of view, or constraints on power and/or the GPS signal” [4]. At the program level, problems have arisen from overly ambitious initial goals, the requirements of a wide range of disciplines not traditionally used in systems or software engineering, and the use of what are traditionally very stove-piped, isolated tradecrafts.

Sensor Fusion Defined

There may be as many interpretations of what defines data fusion as there are people trying to solve it. The sensor fusion domain includes not only combining the outputs of single-modal, single-phenomenology sensors but also the predictive assessments provided by systems that are multi-platform (different unmanned aerial vehicles, for example) or multi-INT (combining multiple intelligence types, such as imagery intelligence and electronic-signals-derived data). An instructive way to define data fusion (DF) while conveying its wide scope is to use a process model. The most referenced model within the DoD appears to be the Joint Directors of Laboratories (JDL) data fusion model shown in Figure 1 [5].

The JDL no longer exists as an organization, but in the 1980s it was tasked to develop a model for data fusion. The JDL model, revised in 1999, was created to show a general data fusion process with wide applicability to both government and academia. It standardizes communication between engineers but dictates neither the actual steps of performing fusion nor which levels must be used. The model shows multiple potential data sources on the left that can be directed to any of a number of processes within the fusion domain, with the resulting output provided on the right. Table 1 describes the most common fusion levels [6].

As an example, detecting a man-made object at a specific location, classifying it as a tank, and even identifying it specifically as a T-72 are all covered under object assessment (level-1). Situation assessment (level-2) can use prior information to indicate that this Soviet-designed main battle tank is possibly a friendly unit of the Iraqi Army. The number found and their location would further indicate unit size, if not the exact unit, and possibly the unit’s disposition, such as movement to contact. Impact assessment (level-3) could then use this information to indicate that the explosions detected by acoustic sensors may not have been an attack directly on coalition forces; however, units should be moved to support the Iraqi Army.
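To make this progression concrete in software terms, the following minimal sketch (in Python, with hypothetical names, values, and thresholds not drawn from any fielded system) shows how simple rules might lift level-1 object reports into level-2 and level-3 statements for the scenario above:

# Illustrative sketch only: all names, values, and thresholds are hypothetical.

level1_reports = [  # output of object assessment (level-1)
    {"type": "T-72", "location": "grid 123456", "count": 14},
]

prior_info = {  # prior knowledge used by situation assessment (level-2)
    "T-72": {"operator": "Iraqi Army", "affiliation": "friendly"},
}

acoustic_events = [{"event": "explosions", "near": "grid 123456"}]

# Level-2: relate detected objects to prior information and to each other.
situation = []
for report in level1_reports:
    known = prior_info.get(report["type"], {})
    situation.append({
        "unit": known.get("operator", "unknown") + " armor element",
        "affiliation": known.get("affiliation", "unknown"),
        "estimated_size": "company or larger" if report["count"] >= 10 else "platoon",
        "location": report["location"],
    })

# Level-3: project impact -- do the acoustic events threaten coalition forces?
for event in acoustic_events:
    near_friendly = any(
        s["affiliation"] == "friendly" and s["location"] == event["near"]
        for s in situation
    )
    if near_friendly:
        print("Explosions likely involve a friendly unit; recommend support, "
              "not treating them as a direct attack on coalition forces.")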

Though the model does not require that one level be completed before the next, programs traditionally start at level-1 and then determine what must be accomplished to reach each successive level. This has led to a large number of programs that have worked through level-1 processing, while few have successfully developed level-3 capabilities [2]. When working through each level in this manner, level-3 becomes an increasingly complex goal. That complexity is further increased when the need for greater situational awareness requires moving from the fusion of different single-INT sensors to the fusion of different multi-INT sensors.

An Analysis of Recent Projects

An existing initiative, the National Technology Alliance (NTA), already offers a concise summary of current technology-based programs. One of the benefits provided by the NTA is simplified USG access to commercial technology, specifically dual-use technology where cost sharing can be attained. It also provides an independent assessment and evaluation of government users’ needs and identifies optimum technology solutions to technical challenges [1]. Several of these analyses have covered data fusion research, but the 2009 Multi-Source and Multi-INT Fusion Technology Survey and Analysis report, conducted in conjunction with the Pennsylvania State University, aligns directly with the type of work needed. Though the report covered several hundred government and COTS sensor fusion solutions, 24 separate projects ranging from basic research to tool development were selected for additional study. These projects represent the work funded by a single R&D office whose goal was the advancement of available sensor-fusion-based tools.

First, information was collected from the project summaries as a starting point to show the breakdown of the sample space. Then a more in-depth analysis of each project was made to provide an independent look while ensuring each was evaluated by a single person; this was done to remove any bias, or at least provide a consistent bias, across all 24 projects. Finally, a third review was attempted after approximately one year to determine each project’s status.

The assumption was that the majority of the programs would concentrate on both a single intelligence-gathering discipline (INT) and lower-level sensor fusion techniques. Once the information was collected, 11 of the selected projects were found to concentrate on a single INT, although most did span multiple phenomenologies. Seven programs were described as multi-INT, though most of these simply provided a common geospatial reference to a specific type of non-geospatial intelligence data. The remaining six included support items such as database development and were determined not to be directly applicable to this breakdown. Figure 2 shows that of the 18 represented projects, nine were considered level-1 fusion, seven level-2, and the final two level-3.

The independent audit of the 18 represented projects showed that 15 were likely level-1 fusion technologies. This left only one of the original seven level-2 projects in place to support situational assessment. The two projects originally indicated as level-3 fusion remained level-3 (Figure 3). Of these final two, one turned out to be a study. The study was not rejected as a level-3 project because it potentially laid important groundwork for follow-on multi-INT work; however, it did not provide any actual data fusion in itself. This left a single project out of 24 with the potential to become a higher-level data-fusion-based software tool.

Several issues were noted during the review. A large percentage of the level-1 fusion projects required multiple separate hard problems to be solved in order to succeed. Some of these problems were the same but were approached separately by the individual projects and were therefore redundant efforts. In one instance a problem was worked through using a supporting technology that was known to be untested and at a very low technology readiness level. Though there was testing as part of the normal tool development process, none of it was meant to test the performance of individual technologies before they were integrated into the tool. This shows that projects were initiated without determining existing capability gaps and continued using high-risk methodologies.

A third round of review was undertaken after all work was initially planned to be complete. This review was less successful. It was not possible to find the exact status of any individual project the organization was working on; it was only possible to find artifacts of work leaving the organization, including projects being sent out for independent testing or transitioning to a semi-operational status. From what was found, the initial 24 projects roughly correlated to only three available MSDF tools. These conclusions also support previously cited literature stating that these types of programs tend to be mostly lower-level data fusion efforts with few successful higher-level programs.

Evaluating Alternative Approaches to Data Fusion

After reviewing the types of existing sensor fusion programs, the next step was to evaluate the process other organizations used to attain what could be interpreted as level-3 data fusion. Areas covered included legacy military, finance, and weather projects. This investigation converged on one manually intensive procedure that closely parallels the MSDF process discussed earlier. It is described in the Army’s Field Manual on Intelligence, FM 2-0 [7].

The Army defines a procedure, through the military decision-making process (MDMP), to help identify the information most important to a commander. This is important because there will likely always be too much information available, and the commander does not need to track the status and updates from each individual information source. FM 2-0 includes this process in the key intelligence task “conduct ISR,” summarized in Figure 4. This is an exhaustive and iterative procedure involving several key personnel with an in-depth understanding of the environment, unit capabilities, and what must happen to achieve mission success.

This process starts with an understanding of the mission that needs to be accomplished. Different courses of action are then developed and analyzed against the threat and environmental factors to produce a set of intelligence requirements (IRs) and Priority Intelligence Requirements (PIR). Information deemed sufficiently important but not necessarily mission impacting becomes an IR; information on hostile forces essential to support key decisions that must be made in order to accomplish the mission is classified as a PIR. The process continues with an analysis of all available ISR (intelligence, surveillance and reconnaissance) assets and their capabilities. This, along with the initial MDMP, helps identify collectable indicators of threat intentions and objectives, which can then be used to task subordinate units and ISR collection platforms.
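As a minimal sketch of the PIR/IR distinction described above (the data model and the example requirements are illustrative assumptions, not taken from FM 2-0), the classification could be expressed as:

# Hypothetical sketch of the PIR vs. intelligence requirement (IR) distinction.
from dataclasses import dataclass

@dataclass
class InfoRequirement:
    text: str
    about_hostile_forces: bool   # does it concern threat forces?
    supports_key_decision: bool  # is it essential to a commander's decision point?

def classify(req: InfoRequirement) -> str:
    # Per the description above: information on hostile forces essential to key
    # decisions becomes a PIR; other important information remains an IR.
    if req.about_hostile_forces and req.supports_key_decision:
        return "PIR"
    return "IR"

requirements = [
    InfoRequirement("Will the enemy defend the river crossing?", True, True),
    InfoRequirement("What is the trafficability of Route X?", False, True),
    InfoRequirement("What is the local weather over the objective?", False, False),
]
for r in requirements:
    print(classify(r), "-", r.text)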

Combining New Technology with Proven Process

One way to simplify building a set of fusion algorithms that accounts for any number of sensor inputs, without trying to think through a possibly infinite set of scenarios, is to start at the traditional end point (level-3 DF), determine what actually needs to be assessed, and then move backwards by determining what must be obtained in order to get what is needed. In other words, if the traditional progression of data fusion is reversed and combined with the Army’s intelligence synchronization discussed in the previous section, a skeleton process for simplified multisensor data fusion starts to take shape (Figure 5).

Taken a step further, IRs can be analyzed using knowledge of the organization’s existing intelligence capabilities to determine which could be met using an automated fusion process. These are labeled Fusion Information Requirements (FIRs): the intelligence requirements that can be autonomously processed by current and potential sensors and used in fusion processing. Each FIR is broken down into supporting indicators, which can be labeled as facts. These facts are the actual observations that can be detected by any of the available intelligence sensors and matched against prior information. In other words, the indicators are used to support any one of a number of situation assessments that have been predetermined as necessary to match a threat assessment or answer a PIR.
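A minimal sketch of this triage step follows; the capability names and requirements are hypothetical, and the point is simply that an IR is promoted to a FIR only when every observable it needs can be collected and processed automatically:

# Hypothetical sketch: promote an IR to a FIR when all of the sources it needs
# can be collected and processed automatically. All identifiers are illustrative.
available_automated_sources = {"EO", "SAR", "SIGINT", "acoustic"}

intelligence_requirements = {
    "IR-1: Is armor staging near the river crossing?": {"EO", "SAR"},
    "IR-2: What is local sentiment toward coalition forces?": {"HUMINT"},
    "IR-3: Is the air defense radar at site X active?": {"SIGINT"},
}

fusion_info_requirements = [
    ir for ir, needed_sources in intelligence_requirements.items()
    if needed_sources <= available_automated_sources  # every needed source is automatable
]
print(fusion_info_requirements)  # IR-1 and IR-3 become FIRs; IR-2 remains a manual IR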

Next, the most likely methods to observe these indicators are thought through. Each may have multiple methods of detection. Depending on timeliness requirements, available sensors and the environment, each reasonable detection is used to create a Collection Requirement (CR). CRs are the tasking to the specific intelligence collector such as aircraft, soldiers or ground sensors that are most likely to observe what is needed in the time frame that it is needed. Each CR is added to the existing requirements management process. An example is shown in Figure 6 where three separate FIRs are broken down into their applicable facts. FIR-1 needs three Facts meet in order to be satisfied. Each fact can be met through a set of detections using Boolean logic.
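A minimal sketch of this Boolean logic is shown below; the FIR, fact, and detection names are hypothetical stand-ins for the entries in Figure 6:

# Hypothetical sketch of the Figure 6 logic: a FIR is satisfied when all of its
# facts are met, and a fact is met when any one of its alternative detection
# sets has been fully observed. All identifiers are illustrative.

# FIR-1 requires three facts; each fact lists alternative ways to detect it.
FIR_1 = {
    "Fact-A": [{"EO_image_of_vehicle"}, {"SAR_detection", "MTI_track"}],
    "Fact-B": [{"SIGINT_emitter_fix"}],
    "Fact-C": [{"acoustic_event"}, {"UGS_seismic_hit"}],
}

def fact_met(alternatives, observed):
    # OR across the alternatives, AND within each alternative detection set.
    return any(required <= observed for required in alternatives)

def fir_satisfied(fir, observed):
    # AND across all facts that support the FIR.
    return all(fact_met(alts, observed) for alts in fir.values())

observed_detections = {"EO_image_of_vehicle", "SIGINT_emitter_fix", "acoustic_event"}
print(fir_satisfied(FIR_1, observed_detections))  # True: all three facts are met

# Facts that are not yet met point to the Collection Requirements (CRs)
# that should be tasked next.
unmet = [name for name, alts in FIR_1.items()
         if not fact_met(alts, observed_detections)]
print("Facts still needing collection:", unmet)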

Figure 7 shows how the FIRs (previously PIRs), facts (indicators), and CRs loosely align with, but move in the opposite direction of, the more traditional object, situation, and threat assessment functions of the JDL MSDF model. This process continues in cycles as the threat evolves, new PIRs are determined, or the availability of different ISR platforms changes. This creates both a synchronized collection effort and a modular approach to MSDF. It also provides a basis for real-time information collection and processing without adding redundant processes to an organization. Even if the result is only an alert in an operations center or an email sent to the responsible analyst, pertinent and timely information reaches the specific person who needs it, in near real time, without anyone having to monitor countless hours of data feeds.
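Continuing the sketch, the hand-off could be as simple as the following (the routing table and message format are illustrative assumptions for the “email sent to the responsible analyst” case):

# Hypothetical sketch: route an alert once a FIR has been evaluated as satisfied.
from datetime import datetime, timezone

analyst_for_fir = {"FIR-1": "responsible.analyst@example.mil"}  # illustrative routing table

def send_alert(fir_name: str, supporting_facts: list) -> None:
    recipient = analyst_for_fir.get(fir_name, "ops-center-watch@example.mil")
    timestamp = datetime.now(timezone.utc).isoformat()
    # A real system would hand this to a messaging or alerting service;
    # here we simply print the message that would be sent.
    print(f"[{timestamp}] ALERT to {recipient}: {fir_name} satisfied "
          f"based on {', '.join(supporting_facts)}")

send_alert("FIR-1", ["Fact-A", "Fact-B", "Fact-C"])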

Conclusion

Higher-level multisensor data fusion programs allow for a solution that is more significant than the sum of the data supplied to them. They take new and known information and provide a level of data abstraction that helps users understand what is going on, and they do so more quickly than would normally be possible. This allows timely decisions to be made as events occur or statuses change, instead of after analysts have had time to analyze each situation manually.

Care should be taken to limit work that is too similar to work already funded or completed, from basic R&D initiatives to acquisition programs placing major end items into combat. Care should also be taken to limit the overall scope of what an MSDF program covers: if the intent is to develop a new MSDF system for a specific purpose, do not add new and unrelated capabilities. Many very capable systems already exist, but varying missions and the effects of rapid fielding initiatives have limited their capabilities and their interconnections with other systems. By applying principles of modular systems engineering and borrowing from aspects of different levels of sensor fusion (fusion at the sensor, object, or decision level), a simplified method of improving the common operational picture may be possible while leveraging existing capabilities.

Though many MSDF programs have met with limited success, it seems entirely possible that simply reversing the order in which most programs run may produce positive outcomes. By taking what absolutely must be known (the facts), finding ways to first characterize and then indicate these facts through available sensor detections, and then providing an output based largely on relatively simple Boolean logic, a whole new model for future programs is created. When taken in the traditional lower-to-higher-level DF order, advanced processing must be developed to account for countless possible combinations of unknown future indications. The reverse model alleviates the need for such advanced methodologies, such as cognitive engineering and neural networks, and simply waits for the detections that can answer the commander’s priority information requirements.

Tables and Figures:

Figure 1: The revised JDL Data Fusion Model (Hall, Liggins, & Llinas, 2009)

Table 1: JDL Fusion Levels 1-4 with descriptions

Figure 2: Represented programs broken down by JDL fusion level IAW project description

Figure 3: Represented programs broken down by JDL fusion level as determined by independent audit

Figure 4: Army ISR Task Development Process from FM 2-0, which takes the mission, threat, and environment into account to determine the most significant intelligence requirements

Figure 5: The Reverse Data Fusion Model shown over the initial JDL fusion model with traditional workflow indicated

Figure 6: PIRs that can be used as fusion information requirements are further broken down into Facts and Detections

Figure 7: Reverse higher-level data fusion model


References and Notes

1. National Technology Alliance. (2009). Rosettex NTA project portfolio (No. TR-001-072709-554). Retrieved 13 March 2011 from http://www.rosettex.com/nta/Rosettex%20Project%20Portfolio%20Sept%202009%20v1-1a.pdf
2. National Technology Alliance. (2009). Multi-source and multi-INT fusion technology survey and analysis, version 3 (No. FR-001-085-052609-538PR). Retrieved 13 March 2011 from http://portal.opengeospatial.org/files/?artifact_id=38981
3. NGA. (2009). Cooperative research and development agreement (CRADA) handbook (No. 10-008). VA: National Geospatial-Intelligence Agency.
4. National Research Council. (2006). Priorities for GEOINT research at the National Geospatial-Intelligence Agency. Washington, DC: The National Academies Press.
5. Hall, D. L., Liggins, M. E., & Llinas, J. (2009). Handbook of multisensor data fusion: Theory and practice (2nd ed.). Boca Raton, FL: CRC Press.
6. Hall, D. L., & Llinas, J. (2002). An introduction to multisensor data fusion. Proceedings of the IEEE, 85, 6-23.
7. Department of the Army. (2008). FM 2-0 intelligence (C1 ed.). Washington, DC: Department of the Army.

Mike Engle


Mike Engle has a Bachelor of Science in Mechanical Engineering from the Pennsylvania State University and a Master of Science in Systems Engineering from the George Washington University. He has provided systems engineering support to a variety of R&D and software engineering organizations over the past 10 years. Prior to graduate school, Mike Engle was a US Army aviation officer.

The George Washington University
1776 G Street, NW Suite 101
Washington, DC 20052
E-mail: tme110@gwu.edu

Shahram Sarkani


Dr. Shahram Sarkani joined the faculty of the School of Engineering and Applied Science (SEAS) at The George Washington University in 1986. He currently serves as the faculty advisor for Off-Campus Programs in the Department of Engineering Management and Systems Engineering. From 1994 to 1997, he served as chair of the Civil, Mechanical, and Environmental Engineering Department. From 1997 to 2001, he was SEAS interim associate dean for Research and Development. Dr. Sarkani holds a BS and MS in Civil Engineering from Louisiana State University and a PhD in Civil Engineering from Rice University.

The George Washington University
1776 G Street, NW Suite 101
Washington, DC 20052
E-mail: sarkani@gwu.edu

Thomas Mazzuchi


Dr. Thomas Mazzuchi is a professor of Operations Research and Engineering Management at The George Washington University. His current research interests include reliability and risk analysis, Bayesian inference, quality control, stochastic models of operations research, and time series analysis. Dr. Mazzuchi earned a BA in Mathematics from Gettysburg College and an MS and DSc in Operations Research.

The George Washington University
1776 G Street, NW Suite 101
Washington, DC 20052
E-mail: mazzu@gwu.edu

