Safe and Secure Cyber-Physical Systems and Internet-of-Things Systems
Ebook · 202 pages · 1 hour

About this ebook

This book provides the first comprehensive view of safe and secure CPS and IoT systems. The authors address in a unified manner both safety (physical safety of operating equipment and devices) and computer security (correct and sound information), which are traditionally separate topics, practiced by very different people.


  • Offers readers a unified view of safety and security, from basic concepts through research challenges;
  • Provides a detailed comparison of safety and security methodologies;
  • Describes a comprehensive threat model including attacks, design errors, and faults;
  • Identifies important commonalities and differences in safety and security engineering.

Language: English
Publisher: Springer
Release date: Sep 24, 2019
ISBN: 9783030258085
Author

Marilyn Wolf

Marilyn Wolf is Elmer E. Koch Professor of Engineering and Chair of the Department of Computer Science and Engineering at the University of Nebraska–Lincoln. She received her BS, MS, and PhD in electrical engineering from Stanford University. She was with AT&T Bell Laboratories from 1984 to 1989, was on the faculty of Princeton University from 1989 to 2007, and was Farmer Distinguished Chair in Embedded Computing Systems and GRA Eminent Scholar at the Georgia Institute of Technology from 2007 to 2019. Her research interests include cyber-physical systems, the Internet of Things, embedded computing, embedded computer vision, and VLSI systems. She has received the IEEE Computer Society Goode Memorial Award, the ASEE Terman Award, and the IEEE Circuits and Systems Society Education Award. She is a Fellow of the IEEE and the ACM and a Golden Core member of the IEEE Computer Society. Professor Wolf is the author of several successful Morgan Kaufmann textbooks: Computers as Components, Fifth Edition (2022); High-Performance Embedded Computing, Second Edition (2014); The Physics of Computing, First Edition (2016); and Embedded System Interfacing, First Edition (2019).

    Book preview

    © Springer Nature Switzerland AG 2020

    M. Wolf, D. Serpanos, Safe and Secure Cyber-Physical Systems and Internet-of-Things Systems, https://doi.org/10.1007/978-3-030-25808-5_1

    1. The Safety and Security Landscape

    Marilyn Wolf¹ and Dimitrios Serpanos²

    (1) School of Computer Science and Engineering, University of Nebraska – Lincoln, Lincoln, NE, USA

    (2) Electrical and Computer Engineering, University of Patras, Patras, Greece

    Keywords

    Safety · Security · Folk wisdom · Information technology · Operational technology · Privacy

    1.1 Introduction

    Safety and security are both important, established, and very distinct engineering disciplines. Each discipline has developed its own methodologies and tools based on a set of goals. However, we can no longer treat these two disciplines as separate. The introduction of real-time embedded computing systems that control physical objects means that physical safety and computer security must be treated as a single discipline; the design of cyber-physical (CPS) and Internet-of-Things (IoT) systems must be based on this unitary goal of safe and secure systems.

    The traditional definitions of these fields can be briefly summarized:

    Physical safety is the result of the absence or minimization of hazards that may harm life or property.

    Howard’s early analysis of Internet security [How97] defines computer security as preventing attackers from achieving objectives through unauthorized access or unauthorized use of computers and networks.

    In the modern world, these two goals cannot be cleanly separated. The impact of computer security on safety is easy to see—attackers gain unauthorized access to a cyber-physical system and command it to do bad things. However, safety engineering also has an important influence on computer security practices, which rely heavily on updates to fix newly discovered vulnerabilities. Physical systems cannot be stopped arbitrarily—an airplane cannot be stopped mid-flight for a software update. Even a planned shutdown of a physical plant can take hours given the physical constraints on the system's operation.

    Cyber-physical attacks differ from cyber attacks in that they directly threaten physical systems: infrastructure, civil structures, and people. Cyber-physical attacks can kill people and cause damage to physical plants that can take months to repair. Infrastructure equipment is built for small markets and in some cases is one of a kind. Large-scale damage to civil infrastructure—water heaters, refrigeration equipment, etc.—can overwhelm standard production and result in lengthy delays for replacements and repairs.

    This book elaborates several themes:

      • As stated above, safety and security are inseparable in CPS and IoT systems.
      • Neither the safety nor the security discipline offers all the answers on its own.
      • Safety and security vary in their use of short-term vs. long-term approaches and in their use of prevention vs. remediation. The new field of safe and secure systems should operate at all time scales and from the earliest stages of design through updates.
      • System designers must accept that there is no end to the design process, because Internet threats keep evolving. Systems must be designed to be adaptable so that they can counter those threats.
      • Suites of standardized design templates help to reduce design risks.
      • Modern systems must combine design-time analysis and architected safety + security features with run-time monitoring (a minimal monitoring sketch follows this list).
      • Safety and security should be assessed in part by probabilistic assertions about the health of the system.
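
    To make the last two themes concrete, the sketch below shows one way a run-time monitor could turn individual invariant checks into a probabilistic statement about system health. The sensor name, threshold values, and the simple Beta-posterior estimate are illustrative assumptions, not a mechanism prescribed by this book.

from dataclasses import dataclass

@dataclass
class HealthMonitor:
    passes: int = 0      # run-time checks that satisfied their invariant
    failures: int = 0    # run-time checks that violated their invariant

    def observe(self, name: str, ok: bool) -> None:
        """Record the outcome of one run-time check (e.g., a range invariant)."""
        if ok:
            self.passes += 1
        else:
            self.failures += 1
            print(f"invariant violated: {name}")

    def health_estimate(self) -> float:
        """Posterior mean of Beta(1 + passes, 1 + failures): chance the next check passes."""
        return (1 + self.passes) / (2 + self.passes + self.failures)

# Hypothetical usage: monitor a coolant temperature against a design-time bound.
monitor = HealthMonitor()
for reading in [71.2, 70.8, 98.6, 72.0]:   # degrees C, invented samples
    monitor.observe("coolant_temp_in_range", 20.0 <= reading <= 90.0)
print(f"estimated probability of a healthy check: {monitor.health_estimate():.2f}")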

    The next section reviews several case studies of safety and security problems in cyber-physical and IoT systems and the lessons we can draw from them. Section 1.3 surveys the remaining chapters in this book.

    1.2 Case Studies

    A few notes on terminology are useful. Threats may come from several sources: deliberate attack, design flaws, physical faults, and improper operation of the system.

    A few examples of accidents involving cyber-physical systems indicate a range of specific causes:

    An Airbus A400M crashed after takeoff at Sevilla Airport in Spain. Later analysis determined that the aircraft’s engine control software had been incorrectly installed during its final assembly. That improper installation led to engine failure [Keo15].

    Analysis of the design of Toyota automobiles [Koo14] identified failures to apply well-known engineering techniques in several areas, including protection from cosmic ray-induced data errors and application of software engineering principles.

    Dieselgate [Dav15] was the result of a decision by Volkswagen management to design the software in many of their diesel vehicles to provide inaccurate testing data that incorrectly gave the appearance of satisfying emissions regulations in several countries.

    As will be discussed below, several attacks on cyber-physical systems have been reported. The NotPetya attack [Gre18] was strictly a cyber attack and did not attack physical plants. However, the virulence of the attack, which wiped out vast amounts of data and system configurations, shows the level of chaos that can be generated by a determined attacker.

    Cyber attacks and physical attacks can be used in tandem to create a cyber-physical attack. Safety problems demonstrate the physical damage that can be inflicted by cyber-physical systems. And in some cases, they expose flaws that could also have been used by attackers.

    A commonplace of safety engineering is that accidents often have multiple causes—a chain of events must line up to produce the final accident, which results in lower accident rates than would otherwise occur. However, an examination of designs and their behavior suggests that many systems have multiple flaws: security vulnerabilities and safety hazards. These multiple sources of problems suggest that failures may be more frequent than we would like them to be.

    The sections below discuss several observations about the safety and security of cyber-physical and IoT systems. Section 1.2.1 discusses ease of attack. Sections 1.2.2 and 1.2.3 discuss the serious implications of safety failures and attacks. Sections 1.2.4, 1.2.5, 1.2.6, 1.2.7, and 1.2.8 all describe various limitations on existing methodologies and architectures. Section 1.2.9 reviews the importance of privacy in cyber-physical and IoT systems.

    1.2.1 Cyber-Physical Systems Are Shockingly Easy to Attack

    Rouf et al. [Rou10] demonstrated vulnerabilities in the tire pressure monitoring systems (TPMS) that are legally required for many types of vehicles in several countries. Direct TPMS devices are mounted in the wheels and send data on tire pressure to the car's electronic control units (ECUs) using radio signals. Rouf et al. showed that the packets were not encrypted, that they could be received at a distance of 40 m, and that they could be spoofed to the ECU.
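
    To illustrate why unauthenticated broadcasts are so easy to forge, the sketch below contrasts a bare TPMS-style pressure packet with one that carries a counter and a message authentication code. The packet layout, key, and use of HMAC-SHA256 are hypothetical assumptions for illustration; real TPMS radios use their own proprietary formats.

import hashlib
import hmac
import struct

SENSOR_ID = 0x1A2B3C4D               # hypothetical 32-bit sensor identifier
SHARED_KEY = b"demo-key-not-real"    # assumed pre-shared key between sensor and ECU

def bare_packet(pressure_kpa: int) -> bytes:
    """Unauthenticated packet: anyone who knows the layout can forge it."""
    return struct.pack(">IH", SENSOR_ID, pressure_kpa)

def authenticated_packet(pressure_kpa: int, counter: int) -> bytes:
    """Same payload plus a message counter (anti-replay) and a truncated HMAC tag."""
    payload = struct.pack(">IHI", SENSOR_ID, pressure_kpa, counter)
    tag = hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()[:8]
    return payload + tag

def ecu_accepts(packet: bytes) -> bool:
    """ECU-side check: recompute the tag and compare it in constant time."""
    payload, tag = packet[:-8], packet[-8:]
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()[:8]
    return hmac.compare_digest(tag, expected)

# A forged bare packet claiming 0 kPa is indistinguishable from a genuine one.
print(bare_packet(0).hex())
# A forged "authenticated" packet fails verification without the shared key.
forged = struct.pack(">IHI", SENSOR_ID, 0, 99) + b"\x00" * 8
print(ecu_accepts(authenticated_packet(pressure_kpa=220, counter=41)))  # True
print(ecu_accepts(forged))                                              # False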

    Checkoway et al. [Che11] identified a number of vulnerabilities in an example car, with each attack providing complete control over the car’s systems. Vulnerabilities included the car CD player, the OBD-II (on-board diagnostics) port required in the USA, telematics links such as those used for emergency services, and Bluetooth ports.

    1.2.2 Cyber-Physical Systems Can Kill People

    Leveson and Turner [Lev93] analyzed the causes of a series of accidents related to the Therac-25 medical radiation device. They identified several problems with the Therac-25 design, including mechanical systems, user interface design, and software design. These devices administered several radiation overdoses, some of which were fatal. These multiple accidents appear to have resulted from several distinct root causes. Leveson and Turner identified several contributing factors: lack of procedures for reacting to reported incidents, overconfidence in software, poor software engineering, and unrealistic risk assessments.

    HatMan [NCC18] attacks safety controllers from Triconex. Safety controllers are PLCs used for safety procedures such as disabling equipment or inhibiting operations. HatMan can both read/modify memory and execute arbitrary code on a controller; it is believed to be designed to not only reconnoiter industrial systems but also implement physical attacks.

    1.2.3 Cyber-Physical System Disruptions Require Extensive and Lengthy Repairs

    Stuxnet is widely considered to be the first cyber-physical attack [Fal10]. It reprogrammed particular programmable logic controllers (PLCs) used in industrial control systems, members of the Siemens Step 7 family. It was designed to attack PLCs at a specific industrial control system in Iran, located in that country's nuclear facilities, and to control their behavior, generally to operate them in a way that would damage the related equipment.

    Stuxnet was deployed in at least two phases. Stuxnet 0.5 [McD13] was in the wild by November 2007. It was designed to manipulate valves in uranium enrichment equipment at the Natanz, Iran, facility in order to damage centrifuges and other equipment. It used a replay attack to hide its changes to valve settings during the physical attack. This version spread via Step 7 project archives.
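
    The replay technique reported for Stuxnet 0.5 can be illustrated in a few lines: a window of pre-recorded "normal" sensor readings is fed back to the monitoring side while the physical process is driven somewhere else entirely. The process model, variable names, and numbers below are invented for illustration and do not describe the actual malware.

import itertools

def plant_pressure(t: int, valves_sabotaged: bool) -> float:
    """Hypothetical true process value: it drifts upward once the valves are misused."""
    return 50.0 + (2.5 * t if valves_sabotaged else 0.0)

# Phase 1: the attacker records a window of normal readings.
recorded = [plant_pressure(t, valves_sabotaged=False) for t in range(5)]

# Phase 2: during the physical attack, the operator's display is fed the recording
# in a loop while the real pressure climbs out of its normal range.
replay = itertools.cycle(recorded)
for t in range(5, 10):
    shown_to_operator = next(replay)                     # stale, benign-looking value
    actual = plant_pressure(t, valves_sabotaged=True)    # what the plant really does
    print(f"t={t}: display={shown_to_operator:.1f}  actual={actual:.1f}")

# The display never leaves 50.0 even though the actual value reaches 72.5, which is
# why independent, out-of-band measurements matter for detecting such attacks.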

    W32.Stuxnet [Fal11] conducted more extensive attacks. It propagated using vulnerabilities in the Windows Print Spooler and in the handling of removable drives. It used infected Windows computers to modify PLC code. Its physical attack caused centrifuges to run too fast, damaging them. The attacks are believed to have caused significant damage to Natanz's equipment and to have reduced its productivity.

    The Dragonfly group [Sym14], also known as Energetic Bear, was identified as surveilling a large number of targets in multiple countries. The campaign’s targets included energy companies, petroleum pipeline operators, and energy industry equipment providers as well as defense and aviation companies. Espionage and reconnaissance were believed to be the primary goals of the campaign. Phishing and watering hole attacks were used to obtain credentials from authorized users. Malware installed on target systems gathered a range of data. Symantec identified a Dragonfly 2.0 campaign active in the USA, Turkey, and Switzerland, starting as early as December 2015 [Sec17].

    Ukraine's power grid was attacked in December 2015 [Goo16]. The physical attack disconnected electrical substations, causing hundreds of thousands of homes to lose power. The National Cybersecurity and Communications Integration Center (NCCIC) identified CrashOverride as malware used to attack critical infrastructure in Ukraine in 2016 [NCC17].

    1.2.4 Patch and Pray Considered Harmful

    Information security practices emphasize the importance of applying updates to software at all levels of the stack, from operating system to application.
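
    Section 1.1 noted that a cyber-physical plant often cannot be halted on demand to take a patch. One common compromise, sketched below under assumed state and patch names, is to stage verified updates and apply them only when the plant reports a state in which a restart is acceptable; this is an illustrative sketch, not a mechanism described in this chapter.

from enum import Enum, auto
from typing import List

class PlantState(Enum):
    RUNNING = auto()            # producing; restarting controllers is unsafe
    PLANNED_SHUTDOWN = auto()   # maintenance window; a restart is acceptable

class UpdateManager:
    def __init__(self) -> None:
        self.staged: List[str] = []   # downloaded but not yet applied patches

    def stage(self, patch_id: str) -> None:
        """Download and verify a patch immediately; do not apply it yet."""
        self.staged.append(patch_id)

    def maybe_apply(self, state: PlantState) -> List[str]:
        """Apply staged patches only during a planned shutdown."""
        if state is not PlantState.PLANNED_SHUTDOWN or not self.staged:
            return []
        applied, self.staged = self.staged, []
        return applied

mgr = UpdateManager()
mgr.stage("controller-fw-2.4.1")                     # hypothetical patch identifier
print(mgr.maybe_apply(PlantState.RUNNING))           # [] - deferred while running
print(mgr.maybe_apply(PlantState.PLANNED_SHUTDOWN))  # ['controller-fw-2.4.1']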
