The Law and “Ultrafast Extreme Events” – Is it Possible to Regulate “Machine Ecology” If it Moves Faster than the Human Mind Can React?
In a fascinating new article in Nature’s Scientific Reports, researchers describe a “machine ecology” humans have built, through which we have ceded decisionmaking across a wide array of domains to technologies that move faster than the human mind can react. Consider that a new transatlantic cable now underway is being built to shave another 5 milliseconds off communication times, and that a new chip designed for financial trading can execute trades in just 740 nanoseconds (that’s 0.00074 milliseconds!), whereas even in its fastest modes (flight from danger and competition) the human mind makes important decisions in just under 1 second. As the article abstract suggests, the proliferation of this machine ecology could present as many problems as benefits:
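To make that speed gap concrete, here is a back-of-the-envelope calculation using only the figures quoted above (740 nanoseconds per trade, roughly 1 second per human decision):

```python
# Back-of-the-envelope comparison of machine vs. human timescales,
# using the figures quoted in the post (illustrative only).

TRADE_NS = 740                      # chip executes a trade in 740 nanoseconds
HUMAN_REACTION_NS = 1_000_000_000   # ~1 second human reaction, in nanoseconds

trades_per_reaction = HUMAN_REACTION_NS // TRADE_NS
print(f"Trades possible within one human reaction: {trades_per_reaction:,}")
# Roughly 1.35 million trades can clear before a person can even react once.
```

By the time a human trader (or regulator) registers that something is happening, the machines have already acted over a million times.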
Society’s techno-social systems are becoming ever faster and more computer-orientated. However, far from simply generating faster versions of existing behaviour, we show that this speed-up can generate a new behavioural regime as humans lose the ability to intervene in real time. Analyzing millisecond-scale data for the world’s largest and most powerful techno-social system, the global financial market, we uncover an abrupt transition to a new all-machine phase characterized by large numbers of subsecond extreme events. The proliferation of these subsecond events shows an intriguing correlation with the onset of the system-wide financial collapse in 2008. Our findings are consistent with an emerging ecology of competitive machines featuring ‘crowds’ of predatory algorithms, and highlight the need for a new scientific theory of subsecond financial phenomena.
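The abstract’s “subsecond extreme events” can be pictured as abrupt one-directional price runs completed faster than any human could respond. Here is a hypothetical sketch of how one might flag such events in tick data; the thresholds (`min_ticks`, `window_ms`) and the run-based definition are my illustrative assumptions, not the paper’s exact methodology:

```python
# Illustrative sketch: flag "ultrafast extreme events" in tick data as
# monotone price runs of at least `min_ticks` steps completed within
# `window_ms` milliseconds. Thresholds are assumptions for illustration,
# not the paper's precise definition.

def find_uees(ticks, min_ticks=10, window_ms=1500):
    """ticks: list of (timestamp_ms, price), sorted by timestamp."""
    events = []
    i = 0
    while i < len(ticks) - 1:
        # direction of the run starting at tick i: +1 up, -1 down
        direction = 1 if ticks[i + 1][1] > ticks[i][1] else -1
        j = i + 1
        # extend the run while price keeps moving the same direction
        while j + 1 < len(ticks) and (ticks[j + 1][1] - ticks[j][1]) * direction > 0:
            j += 1
        run_len = j - i
        duration = ticks[j][0] - ticks[i][0]
        if run_len >= min_ticks and duration <= window_ms:
            events.append((ticks[i][0], ticks[j][0], direction * run_len))
        i = j
    return events

# tiny demo: a 10-tick downward crash spanning only 500 ms
demo = [(t * 50, 100.0 - 0.01 * t) for t in range(11)]
print(find_uees(demo))  # one downward event: [(0, 500, -10)]
```

Nothing in this loop waits for a human; detection after the fact is easy, but intervening inside the 1,500-millisecond window is the regulatory puzzle.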
One has to wonder how we can design regulatory mechanisms that will prove effective in controlling “ultrafast extreme events” and how legal doctrine will handle issues of liability, property, and contract when such events move at nanosecond speeds beyond human recognition. Indeed, the article’s authors focus on the financial system, observing that the extent to which the thousands of ultrafast extreme events (UEEs) their research detected during the financial crisis were actually “provoked by regulatory and institutional changes around 2006, is a fascinating question whose answer depends on a deeper understanding of the market microstructure.” I’d love to see how Congress tees up that committee hearing!
If you’ve heard the term “systemic risk” it was most likely in connection with that little financial system hiccup we’re still recovering from. But the concept of systemic risk is not limited to financial systems–it applies to all complex systems. I have argued in a forthcoming article, for example, that complex legal systems experience systemic risk leading to episodes of widespread regulatory failure.
Dirk Helbing of the Swiss Federal Institute of Technology has published an article in Nature, Globally Networked Risks and How to Respond, that does the best job I’ve seen of explaining the concept of systemic risk and relating it to practical contexts. He defines systemic risk as
the risk of having not just statistically independent failures, but interdependent, so-called “cascading” failures in a network of N interconnected system components. That is, systemic risks result from connections between risks (“networked risks”). In such cases, a localized initial failure (“perturbation”) could have disastrous effects and cause, in principle, unbounded damage as N goes to infinity…. Even higher risks are multiplied by networks of networks, that is, by the coupling of different kinds of systems. In fact, new vulnerabilities result from the increasing interdependencies between our energy, food and water systems, global supply chains, communication and financial systems, ecosystems and climate.
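Helbing’s cascading failures can be made concrete with a toy threshold model (my illustration, not a model from his paper): a node fails once enough of its neighbors have failed, so a single local perturbation can propagate through the whole network.

```python
# Toy threshold-cascade model of "networked risk" (illustrative only).
# A node fails once at least `threshold` of its neighbors have failed;
# one seed failure can then propagate system-wide.

def cascade(neighbors, seed, threshold=0.5):
    """neighbors: dict mapping each node to its list of adjacent nodes."""
    failed = {seed}
    changed = True
    while changed:
        changed = False
        for node, nbrs in neighbors.items():
            if node in failed or not nbrs:
                continue
            frac = sum(n in failed for n in nbrs) / len(nbrs)
            if frac >= threshold:
                failed.add(node)
                changed = True
    return failed

# A small chain of tightly coupled components: one localized failure
# ("perturbation") ends up taking down every node in the network.
chain = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(sorted(cascade(chain, seed=0)))  # [0, 1, 2, 3]
```

The point of the sketch is Helbing’s point: the damage is a property of the connections, not of the initial failure, and it grows with the size and coupling of the network.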
As Helbing notes, the World Economic Forum has described this global environment as a “hyper-connected” world exposed to massive systemic risks. Helbing’s paper does a wonderful job of working through the drivers of systemic instability (such as tipping points, positive feedback, and complexity) and explaining how they affect various global systems (such as finance, communications, and social conflict). Along the way he makes some fascinating observations and poses some important questions. For example:
- He suggests that catastrophic damage scenarios are increasingly realistic. Is it possible, he asks, that “our worldwide anthropogenic system will get out of control sooner or later” and make possible the conditions for a “global time bomb”?
- He observes that “some of the worst disasters have happened because of a failure to imagine that they were possible,” yet our political and economic systems simply are not wired with the incentives needed to imagine and guard against these “black swan” events.
- He asks “if a country had all the computer power in the world and all the data, would this allow government to make the best decisions for everybody?” In a world brimming with systemic risk, the answer is no–the world is “too complex to be optimized top-down in real time.”
OK, so what’s this rather scary picture of our hyper-connected world got to do with Law 2050? Quite simply, we need to build systemic risk into our scenarios of the future. I argue in my paper that the legal system must (1) anticipate systemic failures in the systems it is designed to regulate, but also (2) anticipate systemic risk in the legal system as well. I offer some suggestions for how to do that, including greater use of “sensors” style regulation and a more concerted effort to evaluate law’s role in systemic failures. More broadly, Helbing suggests the development of a “Global Systems Science” discipline devoted to studying the interactions and interdependencies in the global techno-socio-economic-environmental system leading to systemic risk.
There is no way to root out systemic risk in a complex system–it comes with the territory–but we don’t have to be stupid about it. Helbing’s article goes a long way toward getting smart about it.