Cyber Resilience for the UK Industrial Strategy
Amethyst’s Steve Mash on why integrating cyber security risk assessment and management is essential
This week, the Government announced its Industrial Strategy for the UK, which presents opportunities for the development and large-scale adoption of emerging technologies. However, the strategy arrives at a time when information systems face an ever-growing global threat of attack from state-sponsored hackers and organised criminals. If robotics and autonomous systems are the future, then cyber resilience needs to be designed in as a fully integrated function from day one rather than added as a last-minute bolt-on feature.
Novel systems are designed and developed by engineers and scientists who are highly skilled in their specialist areas of expertise, but they are not security specialists. Cyber security has not traditionally been part of the development life cycle, so even the most brilliant engineers will rarely consider security in their designs. Security is itself a highly specialised field: describing security threats and vulnerabilities well enough that appropriate countermeasures can be implemented requires particular skills and experience.
In an ever more interconnected world, external threats can have significant consequences. Where a robotic or autonomous system performs a critical process (i.e. a process whose failure can cause significant damage, loss or harm) and is interconnected with other systems, it may be vulnerable to interference from accidental or malicious commands received through those interconnections. Such systems are designed with controls in depth, so that only a sequence of improbable, independent failures or events could produce an effect that causes injury or loss of life. This strength in depth is the resilience of the system.
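The defence-in-depth principle above can be made concrete with a little arithmetic: if the control layers fail independently, the probability of the whole chain failing is the product of the individual failure probabilities. The following is a minimal sketch of that calculation; the function name and the failure figures are illustrative assumptions, not drawn from any particular system.

```python
import math

def combined_failure_probability(layer_probs):
    """Probability that every independent control layer fails at once.

    Assumes the layers are genuinely independent -- a correlated or
    common-cause failure would invalidate this simple product.
    """
    return math.prod(layer_probs)

# Three hypothetical controls, each with a 1-in-100 chance of failing:
print(combined_failure_probability([0.01, 0.01, 0.01]))  # roughly 1e-06
```

This is also why a single weak or bypassed layer matters so much: remove one of the 0.01 factors and the combined probability of a harmful outcome rises a hundredfold.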
The resilience of a system is its intrinsic ability to manage and maintain safe and secure operation before, during, and after any change or disturbance, whether arising from expected or unexpected conditions. To be fully resilient, the system needs both proactive and reactive behavioural properties, managing unexpected changes to boundary conditions, combinations of external events, or challenges to its underlying assumptions.
A key aspect of a system's resilience is that its properties are temporal in nature: as the components of the system and the environment in which it operates change over time, the resilience of the system changes too. Resilience can decrease if external events that were originally improbable become increasingly probable; conversely, it can increase if those events become less probable. Similarly, resilience can decrease if the reliability of controls degrades over time, and increase if controls are replaced with more reliable ones. Where control systems include human intervention as part of the control process, resilience can rise or fall with changes to the training, experience, practices and culture of the personnel involved.
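One way to picture this temporal aspect is a residual-risk figure that is re-evaluated periodically as threat likelihoods and control reliability drift. The sketch below is a hypothetical illustration of the idea only; the function name and all figures are assumptions, not a method from this article.

```python
def residual_risk(threat_likelihood, control_reliability, impact):
    """Risk remaining after a single control: the threat must occur
    AND the control must fail for the impact to be realised."""
    return threat_likelihood * (1.0 - control_reliability) * impact

# Year 0: an improbable threat guarded by a reliable control.
print(residual_risk(0.001, 0.99, 100))
# Year 5: the same control, but the threat has become far more probable,
# so residual risk has grown even though the system itself is unchanged.
print(residual_risk(0.05, 0.99, 100))
```

The second figure is fifty times the first, which is exactly the point made above: resilience decays not only when controls wear out, but when the world around the system changes.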
Cyber resilience assessment methodologies need to consider the consequences of internal technical and operator failure events, as well as malicious actions both within and outside the boundaries of the system. By integrating cyber security risk assessment and management into development, a set of controls can be built into the system that is proportionate and leaves residual risks that are as low as reasonably practicable. Treating cyber security separately at the end of the development life cycle may result in duplication of effort or, more seriously, undetected gaps in the controls.
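In practice, a proportionate assessment of this kind often reduces to scoring likelihood against impact and classifying the result against tolerability bands, as in ALARP reasoning. The scales and band thresholds below are hypothetical, chosen purely to show the shape of such an assessment.

```python
def risk_rating(likelihood, impact):
    """Classify a risk scored on hypothetical 1-5 likelihood and impact scales."""
    score = likelihood * impact
    if score >= 15:
        return "intolerable: stronger controls required"
    if score >= 6:
        return "tolerable only if reduced as low as reasonably practicable"
    return "broadly acceptable"

print(risk_rating(5, 4))  # intolerable: stronger controls required
print(risk_rating(2, 2))  # broadly acceptable
```

Running such a scoring exercise alongside design, rather than after it, is what lets the resulting controls be proportionate instead of bolted on.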