Risk


Risk is the potential future harm that may arise from some present action. It is often combined or confused with the probability of an undesirable event. Usually, the probability and some assessment of the expected harm must be combined into a believable scenario that weighs risk, regret and reward probabilities into an expected value. However, many informal methods are also used to assess (or to "measure", although it is not usually possible to measure directly) risk.
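As a minimal sketch of this idea (the scenarios, probabilities and harm figures below are invented purely for illustration), combining assessed harms with their probabilities into an expected value amounts to a weighted sum:

    # Minimal sketch: combining assessed harms with probabilities into an
    # expected value. All scenarios and numbers here are hypothetical.
    scenarios = [
        {"name": "no incident",    "probability": 0.90, "harm": 0.0},
        {"name": "minor incident", "probability": 0.09, "harm": 10_000.0},
        {"name": "major incident", "probability": 0.01, "harm": 1_000_000.0},
    ]

    expected_harm = sum(s["probability"] * s["harm"] for s in scenarios)
    print(f"Expected harm: {expected_harm:,.0f}")
    # 0.90 * 0 + 0.09 * 10,000 + 0.01 * 1,000,000 = 10,900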


Risk is not the same as threat

In scenario analysis, "risk" is distinct from "threat". A "threat" is a very low-probability but high-impact event, one which cannot typically be assigned a probability in a risk assessment because it has never occurred, and for which no effective preventive measure is available. The difference is most clearly illustrated by the precautionary principle, which seeks to reduce threat by requiring that it be reduced to a set of well-defined risks before an action, project, innovation or experiment is allowed to proceed.

A more specific example is the preparedness of the United States of America prior to the attacks of September 11, 2001. Although the Central Intelligence Agency had often warned of a "clear and present danger" of planes being used as weapons, this was considered a threat, not a risk. Accordingly, no comprehensive scenarios of probabilities and counter-measures were ever prepared for the type of attack that occurred. In general, a threat cannot be characterized as a risk without at least one specific incident in which the threat can be said to have been "realized". From that point, there is at least some basis on which to characterize a probability, e.g. "in the entire history of air travel, X flights have led to 1 incident of..."

Professions and Governments manage Risk

Means of measuring and assessing risk vary widely across professions; indeed, the means of doing so may define a profession, e.g. a doctor manages medical risk, while a civil engineer manages the risk of structural failure.

A professional code of ethics is usually focused on risk assessment and mitigation (by the professional on behalf of client, public, society or life in general).

Some theorists of political science, notably Carol Moore and Jane Jacobs, emphasize that smaller political units and careful separation of the roles of regulator and trader can improve professional ethics and subordinate them to uniform risk limits that would apply to a particular locale, e.g. an entire urban area.

The political ideal of bioregional democracy arose in part in response to these ideas, and to the problem of professional jargon and associations alienating power from real people living in real places.

"A profession by definition is in a conflict of interest with respect to the risk passed on to its clients." - Steven Rapaport.

Risk as Regret?

Risk has no single definition, but some theorists, notably Ron Dembo, have defined quite general methods to assess risk as an expected after-the-fact level of regret. Such methods have been uniquely successful in limiting interest rate risk in financial markets, which are considered a proving ground for general methods of risk assessment.
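A minimal sketch of the general idea (not Dembo's actual formulation; the payoffs and probabilities below are invented): the regret of a decision in a given scenario is its shortfall against the best decision one could have made in hindsight, and expected regret is the probability-weighted average of that shortfall.

    # Sketch of an expected-regret measure (illustrative only, not Dembo's
    # actual method). payoffs[decision][scenario] is the outcome of a
    # decision if that scenario occurs.
    payoffs = {
        "hedge":    {"rates_rise": 95, "rates_fall": 95},
        "no_hedge": {"rates_rise": 60, "rates_fall": 120},
    }
    probabilities = {"rates_rise": 0.5, "rates_fall": 0.5}

    def expected_regret(decision):
        # Regret in a scenario = best payoff achievable in hindsight
        # minus the payoff actually obtained; weight by scenario probability.
        total = 0.0
        for scenario, p in probabilities.items():
            best_in_hindsight = max(d[scenario] for d in payoffs.values())
            total += p * (best_in_hindsight - payoffs[decision][scenario])
        return total

    for decision in payoffs:
        print(decision, expected_regret(decision))
    # hedge: 0.5*0 + 0.5*25 = 12.5    no_hedge: 0.5*35 + 0.5*0 = 17.5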

However, these methods are also hard to understand, and the mathematical difficulty interferes with other social goods such as disclosure, valuation and transparency.

In particular, it is often difficult to tell if such financial instruments are "hedging" (decreasing measurable risk by giving up certain windfall gains) or "gambling" (increasing measurable risk and exposing the investor to catastrophic loss in pursuit of very high windfalls that increase expected value).
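The distinction can be shown with a toy example (all payoffs below are hypothetical): a hedge narrows the spread of outcomes at the cost of the windfall, while a gamble widens it in pursuit of one.

    import statistics

    # Hypothetical payoffs across three equally likely scenarios.
    # A hedge trims the upside to cut the measurable spread of outcomes;
    # a gamble widens the spread in pursuit of a large windfall.
    positions = {
        "unhedged": [60, 100, 140],
        "hedged":   [95, 100, 105],   # windfall given up, downside protected
        "gamble":   [10, 100, 250],   # catastrophic loss possible, huge windfall possible
    }

    for name, outcomes in positions.items():
        print(name,
              "mean:", statistics.mean(outcomes),
              "spread (stdev):", round(statistics.pstdev(outcomes), 1))
    # The gamble raises the mean (expected value) but also the spread;
    # the hedge leaves the mean alone and shrinks the spread.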

As regret measures rarely reflect actual human risk-aversion, it is difficult to determine if the outcomes of such transactions will be satisfactory.

In financial markets one may need to measure credit risk, information timing and source risk, probability model risk, and legal risk if there are regulatory or civil actions taken as a result of some "investor's regret".

Tough Choices

Financial markets illustrate a more general problem in defining and assessing risk-- the ways that different types of risk combine.

It can be hard to see how the relative risks from different sources should affect one's decisions. For example, when treating a disease a doctor might have the choice of either using a drug that has a high probability of causing minor side effects, or carrying out an operation with a low probability of causing very severe damage.

According to regret theory, the only way to resolve such dilemmas might be to find out more about the patient's life and ambitions. If, for instance, the patient's greatest desire centers on raising children, one might prefer the drug even if it somewhat limits mobility or physical capacity. However, if the patient has already risked their life several times in extreme sporting events, the decision to do so one more time and recover full capacity may be far preferable.
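A crude sketch of how such patient-specific priorities might tilt the calculation (the probabilities, losses and weights below are entirely invented for illustration):

    # Hypothetical sketch: how a patient's own priorities might tilt the
    # drug-versus-operation choice. Probabilities, losses and weights invented.
    options = {
        # drug: high chance of a minor, mobility-limiting side effect
        "drug":      {"p_bad": 0.60, "loss": {"mobility": 3,  "survival": 0}},
        # operation: small chance of very severe (life-threatening) damage
        "operation": {"p_bad": 0.05, "loss": {"mobility": 0,  "survival": 10}},
    }

    def expected_weighted_loss(option, weights):
        o = options[option]
        severity = sum(weights[k] * v for k, v in o["loss"].items())
        return o["p_bad"] * severity

    # A patient focused on raising children weights survival far above mobility;
    # an extreme-sports enthusiast weights full physical capacity highly.
    patients = {
        "parent":  {"mobility": 1,  "survival": 10},
        "athlete": {"mobility": 10, "survival": 5},
    }

    for name, w in patients.items():
        scores = {opt: round(expected_weighted_loss(opt, w), 2) for opt in options}
        print(name, scores)
    # parent:  drug 1.8 < operation 5.0   -> drug preferred
    # athlete: drug 18.0 > operation 2.5  -> operation preferred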

This highlights a major problem in professional ethics: knowing when the judgment (and cognitive bias) of the professional, rather than that of the client (or "patient"), should dominate, and which choices each is best placed to make.

Framing

Framing is a fundamental problem with all forms of risk assessment. The examples above (the body, threat, the price of a life, professional ethics and regret) show that the risk adjustor or assessor often faces a serious conflict of interest. The assessor also faces cognitive bias and cultural bias, and cannot always be trusted to avoid all moral hazards. This represents a risk in itself, which grows as the assessor becomes less like the client.

For instance, an extremely disturbing event that all participants wish not to happen again may be ignored in analysis despite the fact it has occurred and has a nonzero probability. Or, an event that everyone agrees is inevitable may be ruled out of analysis due to greed or an unwillingness to admit that it is believed to be inevitable.

These human tendencies to error and wishful thinking often affect even the most rigorous applications of the scientific method and are a major concern of the philosophy of science.

But all decision-making under uncertainty must consider cognitive bias, cultural bias and notational bias: no group of people assessing risk is immune to "groupthink", the acceptance of obviously wrong answers simply because it is socially painful to disagree.

One effective way to solve framing problems in risk assessment or measurement (although some argue that risk cannot be measured, only assessed) is to require, as a strict rule, that scenarios include unpopular and perhaps unbelievable (to the group) high-impact, low-probability "threat" and/or "vision" events.

This permits participants in a risk assessment to raise fears or personal ideals by way of completeness, without others concluding that they have done so for any reason other than satisfying this formal requirement.

For example, an intelligence analyst with a scenario for an attack by hijacking might have been able to insert mitigation of this threat into the U.S. budget, since it would be admitted as a formal risk with a nominally low probability. This would permit coping with threats even when they were dismissed by the analyst's superiors.

Even small investments in diligence on this matter might have disrupted or prevented the attack-- or at least "hedged" against the risk that an Administration might be mistaken.

Insurance

Although military decision making tends to dominate risk theory, its most sophisticated daily practice is in the insurance industry.

Insurers have the well-defined roles of actuary, underwriter, agent, auditor and adjustor. Each of these is an assessor in somewhat different circumstances or at different stages of the insuring, reinsuring, adjustment, recovery and claims payment processes.

Military leads Insurance leads Finance leads Government

In very broad terms, military and insurance decision making is quite a bit more formal and sophisticated than the equivalent processes in financial markets; regret theory has done much to equalize this by incorporating many common military and insurance practices and putting formal trappings on them.

Generally, methods must be worked through in the military, insurance, financial and other professional fields before they become prevalent in government policy.

Risk assessments, with their differing ways of determining public concerns, are a major concern of political parties, which compete to impose their views on foreign policy, the judicial system, law enforcement and legislation.

The techniques flow slowly from one field to the next. To illustrate the long timelines involved: scenario analysis matured during Cold War confrontations between major powers, notably the USA and USSR, but was not widespread in insurance circles until the 1970s, when major oil tanker disasters forced more comprehensive foresight. It did not enter finance until the 1980s, when financial derivatives proliferated, and did not reach most professions in general until the 1990s, when personal computers proliferated.

Governments are apparently only now learning to use sophisticated risk methods, most obviously to set standards for environmental regulation, e.g. "pathway analysis" as practiced in the US EPA.

Civilization as Risk Reduction?

"Civilization advances by extending the number of important operations which we can perform without thinking about them." - Alfred North Whitehead.

If Whitehead is right, then the perfect civilization is the perfect risk reduction algorithm-- capable of warning us long in advance of foreseeable problems, and assuring us that any surprises were unforeseeable in principle.

Unfortunately, this vision of a risk-reducing symbiote or prosthetic for human judgement remains elusive, fragmented, and unlikely to be realized.

Fear as Intuitive Risk Assessment?

For the time being, we must rely on our own fear and hesitation to keep us out of the most profoundly unknown circumstances.

In "The Gift of Fear[?]", Gavin de Becker[?] argues that "True fear is a gift." (from book jacket) "It is a survival signal that sounds only in the presence of danger. Yet unwarranted fear has assumed a power over us that it holds over no other creature on Earth. It need not be this way."

Risk could be said to be the way we collectively measure and share this "true fear" - a fusion of rational doubt, irrational fear, and a set of unquantified biases from our own experience.

The field of behavioral finance focuses on human risk aversion, asymmetric regret, and other ways that human financial behavior varies from what analysts call "rational."
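As an illustrative sketch of what "asymmetric" means here (the loss-aversion coefficient below is a commonly quoted textbook figure, not a value taken from this article): losses are felt more strongly than gains of the same size, so a bet that is fair in money terms can still feel like a bad one.

    # Sketch of loss aversion: losses weighted more heavily than equal gains.
    # The coefficient 2.25 is a commonly quoted textbook figure, used here
    # purely for illustration.
    LOSS_AVERSION = 2.25

    def felt_value(change):
        """Subjective value of a monetary change; losses loom larger than gains."""
        return change if change >= 0 else LOSS_AVERSION * change

    coin_flip = [(0.5, +100), (0.5, -100)]   # a bet that is fair in money terms

    expected_money   = sum(p * x for p, x in coin_flip)               # 0.0
    expected_feeling = sum(p * felt_value(x) for p, x in coin_flip)   # -62.5

    print(expected_money, expected_feeling)
    # In money terms the bet is neutral, but it "feels" like a loss,
    # which is one reason real behavior departs from the analysts' "rational".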

See also: Safety engineering, Civil defense

Whitehead quotations (http://www-groups.dcs.st-and.ac.uk/~history/Quotations/Whitehead)


Risk is the name of a popular board game and an album by Megadeth.


