By C. Warren Axelrod, Ph.D.


Abstract

Many legacy embedded systems, such as aircraft flight-control systems and weapon fire-control systems, remain in use decades after their introduction. At the same time, modern tablets and laptops are being used to make up for the functionality and ease-of-use limitations of legacy systems. As long as modern information systems and legacy embedded systems remain independent of one another, the latter are not subject to conventional cyber attacks. However, once these systems are interconnected and interoperate, previously avoided cybersecurity risks may be introduced. This article looks at how these risks might be mitigated.

Background

Since the advent of digital computers more than a half-century ago, IT (information technology) and control software have advanced much more rapidly than the underlying technologies of military aircraft, ships, ground vehicles and weapons. Physical equipment may have to remain in use well beyond its anticipated decommissioning date, especially if replacements have been delayed or not approved. Such equipment will eventually contain obsolescent computer hardware and software components, which hamper the effectiveness of their hosts. In response, programs are undertaken to modernize older equipment by adding or fusing IT systems onto legacy systems. This approach, however, introduces the cybersecurity issues that we examine in this article.

Relative Useful Lives of Systems

Donzelli [1] describes how the operational lifespan of military aircraft has increased from about 15-20 years in the 1940s to about 40-60 years at the turn of the 21st century. In the author’s experience, the same holds true for some artillery. Computer technologies, on the other hand, generally have much shorter lifespans: software mostly lasts 5-15 years, according to Tamai [2], with successful software lasting about 10-20 years, per Rajlich [3]. Hardware technologies and programming languages have seen new generations every decade or so since the 1940s, as shown in Table 1.

From a categorization perspective, the first three generations shown in Table 1 took some 30 years in total (averaging a decade per generation of computer technology), whereas the fourth-generation underlying technology (integrated circuits and microprocessors) has lasted for more than 40 years. This is somewhat misleading, however, since that period has seen major advances in the size reduction and cost of computer devices. As indicated in the column of other noteworthy events, there were also game-changing advances such as GPS, whose development began in 1973; Ethernet, introduced in 1980; the World Wide Web, which emerged at the start of the 1990s; and mobile computing, which took off around 2005.

The underlying thesis is that today’s military aircraft, ships, vehicles, weapons and munitions can have useful lives of half a century or more, whereas computer hardware and software produce new generations on a 10-20 year cycle. Consequently, one might expect the computer hardware and software to be updated between two and five times over the lifetime of the equipment. While this type of cycle is reasonable for control systems and data-processing systems, it does not account for game-changing “noteworthy events” such as the Web, GPS, mobile computing and touch screens. These paradigms are often dealt with by “bolting on” additional capabilities that were not anticipated when the original systems were designed. As we shall discuss later, software engineers usually do not account for the resulting exposure to cybersecurity attacks.
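
To make the refresh arithmetic concrete, the short calculation below assumes a 50-year platform life and 10-20 year computing generations (the figures discussed above); it simply bounds the number of technology refreshes one might expect.

    # Refresh-cycle arithmetic using the lifespans discussed above:
    # a 50-year platform against 10-20 year computing generations.
    PLATFORM_LIFE_YEARS = 50

    for cycle_years in (10, 20):
        refreshes = PLATFORM_LIFE_YEARS // cycle_years
        print(f"{cycle_years}-year technology cycle -> about {refreshes} refreshes")

    # Prints:
    # 10-year technology cycle -> about 5 refreshes
    # 20-year technology cycle -> about 2 refreshes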

Software components and communications networks have become increasingly critical to the effective operation of mission-critical resources. Hence there is a push to transition to newer computer and communications technologies and infrastructures. However, in many cases, newer technologies have to be bolted onto legacy systems rather than being incorporated during the design, development and manufacture of software and devices [Note 1]. The former approach results in significantly higher cybersecurity risk, as systems that were previously physically and logically independent become interconnected into systems of systems [7]. It takes an understanding of both modern computer and communications technologies and the technologies incorporated in older embedded systems to design and develop overall systems that demonstrate acceptable levels of safety and security [10].

Cyber-Physical Systems

A critical issue with IT systems, one that did not plague embedded systems until very recently, is the high likelihood of cybersecurity compromise, which affects not only the IT systems themselves but also any other systems with which they interoperate.

To better understand what is taking place, we will examine the general structure of cyber-physical systems. NIST defines cyber-physical systems as “the tight conjoining of and coordination between computational and physical resources.” Figure 1 illustrates such a relationship.

It is important to distinguish between control and administrative applications, which are usually built into embedded systems and accessed (when necessary) by administrators or operators, and data-processing or IT systems, which are separately developed or acquired applications operated by internal or external end users. As long as these systems operate independently, there is little risk of cyber attack. It is when these systems are interfaced logically (shown by the double-ended arrow) that cybersecurity problems arise, particularly when the interoperability was not contemplated.

Legacy military real-time embedded systems, such as flight-control systems found in older aircraft and fire-control systems still operating in older weapons, were never designed to be connected to modern information systems, which are often connected to the Internet, let alone fused with them into tightly-bound cyber-physical systems. Yet, as we have discussed, there is an increasing need to extend the lives of legacy resources that continue in service well beyond their expected retirement dates.

There are a number of ways in which the useful life of legacy software systems can be extended in order to avoid having to replace existing systems or bolt on modern front-end systems [Note 2]. However, the use of these latter systems, many running on personal computers and tablets that are in turn connected to the Internet, is inevitable, since organizations such as the DoD cannot afford to provide the required functionality via the traditional approach of building systems in-house.

Security and Safety of Cyber-Physical Systems

When IT systems (which are connected to private and public networks) and embedded systems (which historically have been standalone with restricted access) are interfaced with one another, the overall system of systems is vulnerable to the threats typical of both types of system and is subject to the consequences both of the hacking of front-end IT systems and of the malfunctioning and failure of back-end control systems. This situation is illustrated in Figure 2.

The diagram shows that security-critical information systems, which are connected to public networks (such as the Internet), are affected by both external and internal threats and exploits, whereas safety-critical control systems traditionally were minimally affected by external threats, if at all. Also, designers and developers of security-critical information systems formerly were not concerned about their systems causing physical harm or damage to the environment. However, when information and control systems are interconnected, they inherit both the positive and negative characteristics of each. In particular, information systems might become a conduit for malware into control systems, and information systems take on some of the liability for malfunctions or failures of the control systems. The bulk of the responsibility for protecting the overall software environment appears to lie with the information systems, since they present the pathways for malicious activities. Nevertheless, software engineers need to understand the implications of adverse behavior of the control systems so that they can focus their attention on protecting against particular events, such as vehicle crashes or inaccurate weapon aim.
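
Because the information systems present the pathways for malicious activity, one protective pattern is a validating gateway at the IT/control boundary that forwards only whitelisted, range-checked commands to the control side. The Python sketch below illustrates the idea; the command names and limits are hypothetical, not drawn from any actual fire-control or flight-control interface.

    # Minimal sketch of a validating gateway at the IT/control boundary.
    # The command set and parameter limits are hypothetical examples.
    ALLOWED_COMMANDS = {
        "SET_HEADING": (0.0, 359.9),  # permitted range, degrees
        "SET_SPEED": (0.0, 30.0),     # permitted range, knots
    }

    def validate(command: str, value: float) -> bool:
        """Admit only whitelisted commands whose values are in range."""
        limits = ALLOWED_COMMANDS.get(command)
        if limits is None:
            return False              # unknown command: reject outright
        low, high = limits
        return low <= value <= high

    def forward_to_control_system(command: str, value: float) -> None:
        """Hand off to the restricted control-system link only after validation."""
        if not validate(command, value):
            raise PermissionError(f"rejected at boundary: {command}={value}")
        # ... transmit on the control-system link here ...

The design choice is deny-by-default: anything not explicitly permitted is rejected, which is the appropriate posture when the back-end consequences are physical.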

Brief History of Real-Time Tactical Computer Systems

The evolution of real-time naval tactical digital computer systems began with the transition from analog systems in the early 1960s with the Naval Tactical Data System (NTDS) according to David Boslaugh’s detailed accounts [12], [13]. The original digital computers were standalone minicomputers connected to analog servomechanism-based control systems.

In Tables 2a and 2b we show the characteristics of this and subsequent phases in the evolution of such systems, the threats and vulnerabilities that make up the risks to and from them, and the measures that can be put in place to mitigate those risks.

Beginning with Phase 5, systems become exposed to increasing outside threats due to connection to public networks such as the Internet, initially through separate systems but increasingly through interconnected and interoperating architectures.

Certification

Certification is mandatory for software aboard commercial aircraft, for example. The basic certification standard used is DO-178C [16], which superseded DO-178B in January 2012. This standard has been adopted by the DoD as guidance for certifying military avionics [17].

The DO-178C certification standard categorizes types of software as to the severity of the consequences if the system were to fail. This is shown in Table 3.

This clearly shows that the standards are much more stringent for flight control and management systems, as would be expected, since the consequences of failure are usually catastrophic. Onboard information systems, on the other hand, are shown to have minimal consequences. As long as the information systems (which include tablets used by pilots for navigation, and which have already caused the grounding of commercial aircraft [18]) are kept separate from the control systems, such a classification appears reasonable. However, as soon as links between the two are created, a cyber attack can have catastrophic consequences. The risk from onboard entertainment systems may be lower for military aircraft than for civilian commercial planes, but the risk from commercial mobile devices (CMDs) is likely to be the same, if not greater, for military aircraft.
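
For readers without Table 3 at hand, the pairing below summarizes the published DO-178C categories of software level and failure-condition severity (a summary sketch, not a reproduction of the article’s table):

    # DO-178C software levels and the failure-condition severities they
    # address (summary of the standard's published categories).
    DO_178C_LEVELS = {
        "A": "Catastrophic - failure may prevent continued safe flight",
        "B": "Hazardous    - large reduction in safety margins",
        "C": "Major        - significant reduction in safety margins",
        "D": "Minor        - slight reduction in safety margins",
        "E": "No effect    - e.g., onboard information or entertainment systems",
    }

    for level, severity in DO_178C_LEVELS.items():
        print(f"Level {level}: {severity}")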

In Figure 3 we illustrate how cyber-physical systems can be viewed as layers, with IT systems at the perimeter and control systems at the center. The various levels can be entered by certain authorized groups, and also by attackers if the systems are fused together, since there may not be effective barriers to entry or to the exfiltration of sensitive information and control data.
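
One way to make this layering operational is to treat the layers as nested trust zones in which every inward transition requires its own explicit authorization check, so that breaching the IT perimeter does not by itself reach the control core. The zone and role names in the sketch below are illustrative only.

    # Nested trust zones in the spirit of Figure 3 (names illustrative).
    # Each move toward the center requires its own authorization check.
    AUTHORIZED_ZONES = {
        "end user": {"public network", "IT systems"},
        "administrator": {"public network", "IT systems", "administrative"},
        "operator": {"administrative", "control core"},
    }

    def may_enter(role: str, zone: str) -> bool:
        """True only if the role is explicitly authorized for the zone."""
        return zone in AUTHORIZED_ZONES.get(role, set())

    # A compromised end-user account should never reach the core:
    assert not may_enter("end user", "control core")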

It is important to consider such systems of systems holistically from both the security and safety perspectives, which can be achieved only if information security professionals and safety engineers work collaboratively throughout the system development lifecycle, including operation, updating and decommissioning.

Cybersecurity Risk of Safety-Critical Systems

The DoD chief information officer, Teri Takai, announced on March 12, 2014 that DIACAP (DoD Information Assurance Certification and Accreditation Process) was to be replaced as of that date by the NIST (National Institute of Standards and Technology) risk management framework governed by NIST Special Publications SP 800-37, SP 800-39 and SP 800-53 [19].

NIST SP 800-53 [20] provides a three-tiered risk management approach that addresses strategic and tactical risk at the corresponding organization, mission/business process and information system levels.

A Risk Management Framework (RMF), defined in detail in SP 800-37 and summarized in SP 800-53, consists of the following six steps (sketched in code after the list):

• Categorize information systems based on impact assessment

• Select the applicable security control baseline

• Implement the security controls and document their design, development and implementation details

• Assess security controls as to their meeting security requirements

• Authorize information system operation

• Monitor security controls
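
The framework is intended to operate as a continuous cycle, with monitoring feeding back into re-categorization as systems and threats change. The sketch below expresses that cycle; it is an illustration, not an official NIST artifact.

    # The six RMF steps as a repeating cycle: Monitor loops back to
    # Categorize as the system and its threat environment evolve.
    RMF_STEPS = [
        "Categorize",  # impact assessment of the information system
        "Select",      # choose the applicable security control baseline
        "Implement",   # implement controls; document the details
        "Assess",      # verify controls meet security requirements
        "Authorize",   # formal approval to operate
        "Monitor",     # continuous monitoring; informs re-categorization
    ]

    def next_step(current: str) -> str:
        """Return the step after `current`, wrapping Monitor to Categorize."""
        i = RMF_STEPS.index(current)
        return RMF_STEPS[(i + 1) % len(RMF_STEPS)]

    assert next_step("Monitor") == "Categorize"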

As described in [10] for software systems generally and in [21] for avionics software, security requirements have to be inserted early in the software development lifecycle and carried through design, development, testing and implementation.

In [21] the author quotes Robert Dewar, president of AdaCore, as saying, “We have fortunately not experienced an aircraft accident where a software bug has resulted in loss of human life.” Sadly, there was a recent accident in which an Airbus A400M military cargo and troop transport plane crashed on a test flight on May 9, 2015, resulting in the deaths of four persons. Several weeks later it was revealed that the crash was caused by a faulty software installation [22] [Note 3].

Kidnapping, Defection, Threats, Bribery, Spying and Industrial Espionage

One might ask why these topics appear in an article about cybersecurity. Surely today’s major concerns relate to cyber rather than physical attacks? Don’t methods such as kidnapping belong to a bygone era, when computers operated in isolation in guarded data centers and there was no way for outsiders to access the systems? The reality is that there is often a physical component even when the most visible aspect of an attack is via the Internet.

As far back as the 1960s and 1970s, even civilian computer security experts were concerned about being kidnapped by foreign powers seeking their specialized knowledge of “system internals,” the underlying software that controls the operation of computers [Note 4]. Since the early 1990s there have been a number of movies, such as Sneakers (1992), Swordfish (2001), Firewall (2006) and Live Free or Die Hard (2007), based on the idea that experts could be kidnapped by criminals and forced to give up information on how computer systems in the government and private sectors could be used for nefarious purposes.

While reporters are quick to attribute major cyber attacks against computer systems, in which huge amounts of sensitive personal data and intellectual property are obtained, to hackers in Russia, North Korea, Iran, China and the like, few mention that many attacks are facilitated by insider knowledge of the systems and that analysis of the stolen data requires subject-matter expertise. Such expert knowledge can only be obtained from insiders, former employees or contractors, whether voluntarily (from defectors), involuntarily (through kidnapping or threats) or inadvertently (via social engineering).

Therefore, when it comes to protecting modern tactical systems, one must consider not only the possibility of hacking into operational systems and taking over control systems, but also cyber and physical attacks against defense and intelligence agencies, contractor systems and personnel, and former employees and contractors, all aimed at obtaining information about the design, programming and operation of tactical systems.

Conclusion

Perhaps the best way to ensure that real-time tactical systems are not subjected to cyber attacks is to keep them separate from IT systems, particularly those IT systems that use commercial and open-source software and hardware. However, this is increasingly infeasible for technological and economic reasons. We simply have to face the fact that there is pressure to use relatively inexpensive off-the-shelf technology and free open-source software components, and to interface these systems, via loose coupling or tight interoperability, with legacy systems in order to attain desired levels of functionality and usability.

Given this situation, it is important to take a proactive stance by examining the cybersecurity impact of each and every change to cyber-physical system environments, rather than merely succumbing to pressure and responding to problems as they occur. In addition, if modern IT systems, particularly those that access the Internet or utilize cloud-computing services, are to be integrated with legacy real-time tactical systems so that they interoperate, then the combined system must undergo extensive validation and verification to ensure that the tactical control systems cannot be compromised by someone entering via the front-end IT system.
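
Part of that validation and verification can be automated as negative testing at the boundary: commands the front-end IT system must never be able to issue are asserted to be rejected. The sketch below reuses the hypothetical gateway whitelist from earlier in the article; all names remain illustrative.

    # Hypothetical negative tests of the IT/control boundary, reusing the
    # illustrative whitelist from the earlier gateway sketch.
    import unittest

    ALLOWED_COMMANDS = {"SET_HEADING": (0.0, 359.9), "SET_SPEED": (0.0, 30.0)}

    def validate(command: str, value: float) -> bool:
        limits = ALLOWED_COMMANDS.get(command)
        return limits is not None and limits[0] <= value <= limits[1]

    class BoundaryRejectionTests(unittest.TestCase):
        """Assert what the front-end system must NOT be able to do."""

        def test_unknown_command_rejected(self):
            self.assertFalse(validate("ARM_WEAPON", 1.0))   # not whitelisted

        def test_out_of_range_value_rejected(self):
            self.assertFalse(validate("SET_SPEED", 999.0))  # beyond limit

    if __name__ == "__main__":
        unittest.main()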

Further Reading

1. Axelrod, C. Warren. “Trading Security and Safety Risks within Systems of Systems.” INCOSE Insight 14.2 (July 2011): 26-29.

2. Corman, David. “The IULS Approach to Software Wrapper Technology for Upgrading Legacy Systems.” CrossTalk 14.12 (December 2001): 9-13.

3. DoD Inspector General. Improvements Needed with Tracking and Configuring Army Commercial Mobile Devices. Report No. DODIG-2013-060, March 26, 2013.

4. Hecht, Herbert. Systems Reliability and Failure Prevention. Norwood, MA: Artech House, 2004.

5. Joyce, Robert R. History of the AN/UYK-20(V) Data Processing System Acquisition and its Impact on Tactical Systems Development. Master’s Thesis, Naval Postgraduate School, Monterey, CA, September 1976.

6. Kölle, Rainer, Garik Markarian, and Alex Tarter. Aviation Security Engineering: A Holistic Approach. Norwood, MA: Artech House, 2011.

7. Kornecki, Andrew J., and Janusz Zalewski. “The Qualification of Software Development Tools from the DO-178B Certification Perspective.” CrossTalk 19.4 (April 2006): 19-22.

8. Kornecki, Andrew J., and Janusz Zalewski. “Certification of Software for Real-Time Safety-Critical Systems: State of the Art.” Innovations in Systems and Software Engineering 5.2 (June 2009): 149-161.

9. Leveson, Nancy G. Engineering a Safer World: Systems Thinking Applied to Safety. Cambridge, MA: MIT Press, 2011.

10. Littlejohn, Kenneth, Michael V. DelPrincipe, Jonathan D. Preston, and Ben A. Calloni. “Reengineering: An Affordable Approach for Embedded Software Upgrade.” CrossTalk 14.12 (December 2001): 4-8.

11. Luke, Jahn A., Douglas Halderman, and William J. Cannon. “A COTS-Based Replacement Strategy for Aging Avionics Computers.” CrossTalk 14.12 (December 2001): 14-17.

12. Rosengard, Phillip I. Avionic Computer Software Interpreter. Patent No. US6564241, May 13, 2003.

13. Spitzer, Cary R., ed. Avionics: Elements, Software and Functions. Boca Raton, FL: CRC Press, 2007.

14. Swassing, Margaret M. “A Brief History of Military Avionics.” 2014 SFTE (Society of Flight Test Engineers)/SETP (Society of Experimental Test Pilots) Southwest Symposium, Fort Worth, TX, October 24-26, 2014.


References and Notes

Notes:

1. In order to extend the useful lives of various munitions, GPS and inertial guidance systems and adjustable tail fins were added to existing projectiles and bombs. These add-on features provide much greater precision in hitting a target and may also extend the range. Two examples are the Excalibur 155 mm precision-guided artillery shell [8] and the JDAM bomb [9]. In these cases, existing “dumb” munitions are retrofitted with the requisite technologies.

2. A number of these methods, such as using software wrappers and executing legacy code on modern microprocessors, are described in the December 2001 issue of CrossTalk, whose theme was “Software Legacy Systems: Bridging the Past to the Future.” This issue is available online.

3. In August 2005, the author happened to be on a brand-new cruise ship leaving St. Petersburg. The ship suddenly stopped and did not move again for about five hours. The captain, in an attempt to assuage passenger concerns, reported that we had not run aground but that the engines had failed due to a “software problem.” As more vehicles are built with software-managed and “fly-by-wire” control systems, this type of problem will surely become much more common. While the risk to cruise-ship passengers may be small, the same claim cannot be made for aircraft, trains or road vehicles.

4. The author was aware, in the 1970s, of a group of a dozen experts in a particular computer system who were so concerned about being kidnapped that they organized a formal contact system. In one case, a member of the group was at an airport, about to board a plane, when he called his colleagues. The latter validated his concerns about whom he was supposedly meeting and he cancelled the trip.

References:

1. Donzelli, Paolo, and Roberto Marozza. “Customizing the Software Process to Support Avionics Systems Enhancements.” CrossTalk 14.9 (September 2001): 10-14.

2. Tamai, Tetsuo, and Yohsuke Torimitsu. “Software Lifetime and its Evolution Process over Generations.” Proc. of the Conference on Software Maintenance, Durham, UK, 1992: 63-69. <http://tamai-lab.ws.hosei.ac.jp/pub/icsm92.pdf> Accessed 7 June 2015.

3. Rajlich, Václav. “Software Lifespan Models.” Chapter 2 in Software Engineering: The Current Practice. Boca Raton, FL: Chapman and Hall/CRC, 2011: 19-30.

4. Beal, Vangie. “The Five Generations of Computers.” Webopedia, 22 April 2015. Accessed 7 June 2015.

5. “The Five Generations of Software.” RBV Web Solutions, 20 March 2000. Accessed 7 June 2015.

6. “The Four Generations of Computer Hardware.” RBV Web Solutions, 10 December 2005. Accessed 7 June 2015.

7. Baldwin, Kristen, Judith Dahmann, and Jonathan Goodnight. “Systems of Systems and Security: A Defense Perspective.” INCOSE Insight 14.2 (July 2011): 22-25.

8. “M982 Excalibur.” Wikipedia. Accessed 7 June 2015.

9. Harris, Tom. “How Smart Bombs Work.” Howstuffworks.com, 20 March 2003. Accessed 7 June 2015.

10. Axelrod, C. Warren. Engineering Safe and Secure Software Systems. Norwood, MA: Artech House, 2012.

11. Axelrod, C. Warren. “Mitigating the Risks of Cyber-Physical Systems.” Proc. of the 2013 IEEE LISAT Conference, Farmingdale, NY, 2013.

12. Boslaugh, D.L. “First-Hand: No Damned Computer is Going to Tell Me What to DO - The Story of the Naval Tactical Data System, NTDS.” Engineering and Technology History Wiki. Accessed 7 June 2015.

13. Boslaugh, D.L. When Computers Went to Sea: The Digitization of the United States Navy. Los Alamitos, CA: IEEE Computer Society, 1999.

14. Brosgol, Benjamin M. “Ada 2005: A Language for High-Integrity Applications.” CrossTalk 19.8 (August 2006): 8-11.

15. Almeshekah, Mohammed H., and Eugene H. Spafford. “Using Deceptive Information in Computer Security Defenses.” International Journal of Cyber Warfare and Terrorism 4.3 (July-September 2014): 63-80.

16. “DO-178C.” Wikipedia, 5 January 2012. Accessed 7 June 2015.

17. Romanski, George. “The Challenges of Software Certification.” CrossTalk 14.9 (September 2001): 15-18.

18. BBC News. “American Airlines planes grounded by iPad app error.” 29 April 2015. Accessed 7 June 2015.

19. Perera, David. “DoD Abandons DIACAP in Favor of the NIST Risk Management Framework.” FierceGovernmentIT, 18 March 2014. <http://www.fiercegovernmentit.com/story/dod-abandons-diacap-favor-nist-risk-management-framework/2014-03-18> Accessed 7 June 2015.

20. National Institute of Standards and Technology. Security and Privacy Controls for Federal Information Systems and Organizations. NIST Special Publication 800-53 Revision 4, April 2013 (updated as of 22 January 2015). <http://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-53r4.pdf> Accessed 7 June 2015.

21. Howard, Courtney E. “Safety- and Security-Critical Avionics Software.” Military & Aerospace Electronics, 1 February 2011. Accessed 7 June 2015.

22. Wheatley, Mike. “Faulty Software Install Led to Airbus A400M Plane Crash.” SiliconAngle, 1 June 2015. <http://siliconangle.com/blog/2015/06/01/faulty-software-install-led-to-airbus-a400m-plane-crash/> Accessed 7 June 2015.


C. Warren Axelrod, Ph.D.


C. Warren Axelrod, Ph.D., is a senior consultant with Delta Risk, a consultancy specializing in cyber defense, resiliency, and risk management. Previously, Axelrod was the chief privacy officer and business information security officer for US Trust.
He was a co-founder of the FS-ISAC (Financial Services Information Sharing and Analysis Center). He represented the financial services sector at the national command center over the Y2K weekend and testified before Congress about cyber security in 2001. He has participated in a number of initiatives at sector and national levels.
Dr. Axelrod was honored with the prestigious ISE (Information Security Executive) Luminary Leadership Award in 2007 and, in 2003, he received the Computerworld Premier 100 IT Leaders Award and Best in Class Award. His article “Accounting for Value and Uncertainty in Security Metrics” won ISACA’s Michael P. Cangemi Best Book/Best Article Award in 2009.
Dr. Axelrod has published five books on various IT risk, outsourcing, cyber security, privacy and safety topics. His most recent book is Engineering Safe and Secure Software Systems, released in 2012 by Artech House. He has published three prior articles in CrossTalk magazine.
He holds a Ph.D. (managerial economics) from Cornell University and MA (economics and statistics) and B.Sc. (electrical engineering) honors degrees from the University of Glasgow. He is certified as a CISSP and CISM.

Phone 917-670-1720
E-mail: waxelrod@delta-risk.net
