Rather than reflecting on the unlikelihood of rare catastrophes after the fact, Elisabeth Paté-Cornell, a Stanford professor of management science and engineering and an expert in risk analysis, prescribes an engineering approach: anticipate such events when possible, and manage them when not.
Kelly Servick is a science-writing intern at the Stanford University School of Engineering. In a recent article, she reviews Paté-Cornell's work in this area. Click here for Ms. Servick's article, and click here for Paté-Cornell's paper on the subject.
The terms "black swan" and "perfect storm" have become part of the public vocabulary for describing disasters ranging from the 2008 meltdown in the financial sector to the terrorist attacks of Sept. 11, 2001. (The former term was popularized by Nassim Nicholas Taleb's book "The Black Swan.") But Paté-Cornell argues that using these labels too liberally in the aftermath of a disaster is often just an excuse for poor planning.
Her research on risk analysis was published in the November issue of the journal Risk Analysis. In it, she suggests that other fields could borrow risk
analysis strategies from engineering to make better management decisions, even
in the case of once-in-a-blue-moon events where statistics are scant,
unreliable or even non-existent.
A true
"black swan" – an event that is impossible to imagine because we've
known nothing like it in the past – is extremely rare. The AIDS virus is an example. More often, there are important clues and warning signs of
emerging hazards (e.g., a new flu virus) that can be monitored to guide quick
risk management responses.
The 9/11 attacks, for example, were not a black swan: the FBI already knew that questionable people were taking flying lessons on large aircraft.
Similarly, she argues that the risk of a "perfect storm," in which multiple forces join to create a disaster greater than the sum of its parts, can be assessed systematically before the event: even though such conjunctions are rare, the events that compose them, and the dependencies among them, have been observed in the past.
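To make that concrete, here is a minimal sketch in Python of how the probability of such a conjunction can be pieced together from its component events; the event names and numbers are hypothetical illustrations, not figures from Paté-Cornell's paper.

# Hypothetical example: the chance that a storm, a pump failure and a flood
# all occur together, built up from observable pieces via the chain rule.
p_storm = 0.02                               # assumed weekly probability of a severe storm
p_pump_failure_given_storm = 0.10            # pumps fail more often under storm stress
p_flood_given_storm_and_pump_failure = 0.50  # flooding is likely once both occur

# Chain rule: P(A and B and C) = P(A) * P(B | A) * P(C | A, B)
p_perfect_storm = (p_storm
                   * p_pump_failure_given_storm
                   * p_flood_given_storm_and_pump_failure)

print(f"Estimated probability of the conjunction: {p_perfect_storm:.4%}")

The point is that each factor on the right-hand side can be estimated from past observations even if the full conjunction has never been seen.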
An engineering risk analysis is based on a system, its functional components and their dependencies. For instance, many plants require cooling; generators, turbines, water pumps, safety valves and more all contribute to making the system work. The risk analyst must therefore first understand how the system works as a whole in order to identify how it could fail. The same methods can be applied to medical, financial or ecological systems.
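As a toy illustration of that kind of functional decomposition, consider the Python sketch below; the components, the redundancy structure and the failure probabilities are assumptions made for the example, not a model from her work.

# Hypothetical plant cooling function: two redundant pumps, a safety valve,
# and a shared power supply that both pumps depend on.
p_power_loss = 0.02   # assumed annual probability of losing electrical supply
p_pump_a = 0.05       # assumed failure probability of the primary pump
p_pump_b = 0.05       # assumed failure probability of the backup pump
p_valve = 0.01        # assumed failure probability of the safety valve

# With power available, pumping is lost only if both (assumed independent) pumps fail.
p_pumping_lost = p_pump_a * p_pump_b

# Cooling is lost if power fails, or if power holds but pumping or the valve fails.
p_cooling_lost = p_power_loss + (1 - p_power_loss) * (
    1 - (1 - p_pumping_lost) * (1 - p_valve)
)

print(f"Estimated annual probability of losing cooling: {p_cooling_lost:.3%}")

Mapping the dependencies first, such as both pumps sharing one power supply, is what exposes the failure paths that a component-by-component view would miss.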
Paté-Cornell says that a systematic
approach is also relevant to human aspects of risk analysis.
"Some argue that in engineering
you have hard data about hard systems and hard architectures, but as soon as
you involve human beings, you cannot apply the same methods due to the
uncertainties of human error. I do not believe this is true," she said.
In fact, she and her colleagues
have long been incorporating "soft" elements into their systems
analysis to calculate the probability of human error. They look at all the
people with access to the system and factor in any available information about
past behaviors, training and skills. By doing this, she has found that human
errors, far from being unpredictable, are often rooted in the way an
organization is managed. "We look at how the management has trained,
informed and given incentives to people to do what they do and assign risk based
on those assessments."
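A minimal sketch of that idea, with entirely hypothetical factors and numbers, might look like this in Python:

def operator_error_probability(recently_trained: bool, safety_incentives: bool,
                               base_rate: float = 0.01) -> float:
    """Hypothetical model: management choices scale an assumed base human-error rate."""
    rate = base_rate
    rate *= 0.5 if recently_trained else 1.0    # assumed benefit of recent training
    rate *= 0.7 if safety_incentives else 1.5   # assumed effect of safety incentives
    return rate

# Combine the human factor with an assumed hardware failure probability:
# the system fails on a given demand if either the hardware or the operator fails.
p_hardware = 0.002
p_human = operator_error_probability(recently_trained=False, safety_incentives=True)
p_system_failure = 1 - (1 - p_hardware) * (1 - p_human)

print(f"Per-demand failure probability: {p_system_failure:.4%}")

The "soft" inputs enter the calculation the same way a component failure rate would, which is exactly why management decisions show up in the final risk number.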
Paté-Cornell has successfully
applied this approach to the field of finance, where she has estimated the
probability that an insurance company would fail given its age and its size.
She has found that companies need forward-looking models that their financial analysts generally do not provide. Traditional financial analysis is based on evaluating statistical data about past events, like trying to drive by only looking in the rear-view mirror.
In her view, analysts can better anticipate market failures
– like the financial crisis that began in 2008 – by recognizing precursors and
warning signs, and factoring them into a systemic probabilistic analysis.
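One simple way to picture such a forward-looking, precursor-based estimate is a Bayesian update; the prior and likelihoods below are invented for illustration only.

# Hypothetical precursor: a sharp, economy-wide rise in leverage.
p_crisis_prior = 0.05              # assumed base rate of a market failure next year
p_signal_given_crisis = 0.80       # the precursor usually shows up before a crisis
p_signal_given_no_crisis = 0.20    # but it also appears in years that end calmly

# Bayes' rule: revise the probability of a crisis after observing the precursor.
numerator = p_signal_given_crisis * p_crisis_prior
denominator = numerator + p_signal_given_no_crisis * (1 - p_crisis_prior)
p_crisis_posterior = numerator / denominator

print(f"Prior: {p_crisis_prior:.1%} -> posterior after the warning sign: {p_crisis_posterior:.1%}")

Unlike a purely historical average, the estimate moves as new warning signs arrive.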
Medical specialists must also make
decisions in the face of limited statistical data, and Paté-Cornell says the
same approach is useful for calculating patient risk. She used systems analysis to assess
data about anesthesia accidents. Based
on her results, she suggested retraining and recertification procedures for
anesthesiologists to make their system safer.
"Lots of people don't like
probability because they don't understand it," she said, "and they
think if they don't have hard statistics, they cannot do a risk analysis."
In fact, we generally do a system-based risk analysis because we do not
have reliable statistics about the performance of the whole system.