“Financial markets have evolved to become complex adaptive systems highly reliant on the communication speeds and processing power afforded by digital systems,” explained the report's co-authors, Robin Bloomfield, a researcher at the City University London Centre for Software Reliability, and Anne Wetherilt, a Bank of England security expert. And much like a nuclear reactor, “their failure could cause severe disruption to the provision of financial services and possibly the wider economy.”
The rise of computer-based trading has created new challenges for the industry. These include understanding the interaction between human traders and computer algorithms, assessing the implications for systemic risk, and developing new risk controls for use by both market participants and infrastructure providers.
Thus, issues and practices in the nuclear industry can offer some guidance when considering the evolution of computer-based trading, the authors maintain. The top-line considerations are the approaches to systemic risk definition and evaluation; the definition of protection system parameters, risk controls and architecture; and the need for trust in computer-based systems.
The nuclear and finance industries may seem worlds apart on the surface. A nuclear plant rests on decades of science-based engineering; it is static, physically identifiable and remotely located, and each reactor is owned by and licensed to a single operator with strong incentives to ensure safety and keep the remaining risks tolerable, the report noted.
Meanwhile, the finance industry relies on centuries-old risk concepts, yet is fluid, innovative and fast-changing. “Risk taking is an intrinsic part of its day-to-day functioning,” the authors said. “Diversity abounds, both in terms of market participants and infrastructure providers. Competition between participants and infrastructure providers drives both innovation and risk taking. Technology allows participants to be present in multiple venues at once.”
Yet Bloomfield and Wetherilt define a serious nuclear incident, one with the potential for a release of radioactivity and associated plant damage, as a “systemic event,” making it very similar in anatomy to a financial market crash.
As in computer-based finance, “the development of the nuclear industry approach to safety has been driven by the need to engineer systems that provide social and economic benefits with tolerable risks, to evaluate and explain the nature and extent of these risks and to provide a framework that allows for scrutiny at varying levels of independence ranging from technical experts within the industry as well as pressure groups and those who, quite legitimately, hold very different values and worldviews,” the authors explain.
The bottom line? Both industries are concerned with safety and with mitigating systemic risk and its impact on the broader economy. And in both, market participants and infrastructure providers have incentives to ensure the system is robust and inspires confidence, the researchers conclude.
Thus, the questions the nuclear industry asks of its systems can also be asked when developing an appropriate, systemic approach to security. The authors lay out fourteen questions across the aforementioned three buckets of considerations: systemic risk, protection systems and computer assurance.
For instance, is it possible to have a more precise description of risk categories (e.g., in terms of the type of consequences, who is affected, and the initiating events that precipitated them)? Is it possible to define numerical targets? If not, how does one define ‘acceptable’ risk? What would the protection and control envelopes look like? Which parameters would need to be measured, and what would be inferred from them? How do they relate to existing controls such as price limits or circuit breakers? And what additional understanding (and research) is needed, given the complex adaptive systems nature of markets?
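To make the circuit-breaker comparison concrete, here is a minimal sketch of how a trading venue's price-band circuit breaker might work: if the traded price moves more than a set percentage from a reference price within a rolling window, trading halts for a cooling-off period. The class, thresholds, window and halt duration below are illustrative assumptions for this article, not parameters from the Bloomfield–Wetherilt report.

```python
from collections import deque

class CircuitBreaker:
    """Illustrative price-band circuit breaker.

    All parameters are assumptions for this sketch, not values drawn
    from the Bloomfield-Wetherilt report.
    """

    def __init__(self, max_move_pct=5.0, window_secs=300.0, halt_secs=300.0):
        self.max_move_pct = max_move_pct  # allowed % move from the reference price
        self.window_secs = window_secs    # rolling window that defines the reference
        self.halt_secs = halt_secs        # length of the trading halt once tripped
        self.prices = deque()             # (timestamp, price) pairs inside the window
        self.halted_until = float("-inf")

    def on_trade(self, price, now):
        """Record a trade; return True if trading may continue, False if halted."""
        if now < self.halted_until:
            return False
        # Drop prices that have aged out of the rolling window.
        while self.prices and now - self.prices[0][0] > self.window_secs:
            self.prices.popleft()
        self.prices.append((now, price))
        reference = self.prices[0][1]     # oldest surviving price is the reference
        move_pct = abs(price - reference) / reference * 100.0
        if move_pct > self.max_move_pct:
            self.halted_until = now + self.halt_secs
            self.prices.clear()           # a fresh reference forms after the halt
            return False
        return True

breaker = CircuitBreaker(max_move_pct=5.0)
assert breaker.on_trade(100.0, now=0)       # establishes the reference price
assert breaker.on_trade(103.0, now=60)      # 3% move, inside the band
assert not breaker.on_trade(94.0, now=120)  # 6% move trips the breaker
```

In this sketch the oldest price in the window serves as the reference, so the breaker reacts to cumulative drift as well as sudden jumps; real venues typically use more elaborate reference prices and tiered thresholds, which is exactly the kind of parameter choice the authors' questions probe.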
By asking the right questions, the authors suggest, financial market trading and other functions can be made more robust and evolve into a trusted environment.
“Over time, risk frameworks may become obsolete as market participants and computer algorithms adapt their behavior,” write Bloomfield and Wetherilt. “But adaptive behavior has positive features too: feedback and learning are essential parts of a safety culture. The nuclear industry has had its share of accidents and incidents (from Windscale to Fukushima) that have caused reflection and reanalysis of its risk frameworks.”