Archive for the ‘Decision Making’ category

AI Can Help the CRO

March 27, 2024

A Guest Post by ChatGPT

For Chief Risk Officers (CROs) navigating the complex and rapidly evolving landscape of risk in financial institutions, artificial intelligence (AI) presents a suite of powerful tools to enhance decision-making, improve risk assessment, and optimize risk management processes. AI’s capabilities can significantly impact various aspects of a CRO’s job, making it a pivotal ally in addressing strategic, operational, and financial risks.

Enhanced Risk Identification and Assessment

AI can process vast amounts of data from diverse sources, including market trends, operational metrics, and social media, to identify and assess risks more efficiently than traditional methods. This capability allows CROs to detect emerging risks faster and with greater accuracy, facilitating proactive risk management. For instance, machine learning models can predict potential default risks by analyzing patterns in credit history, market conditions, and economic indicators, thereby enhancing the accuracy of credit risk assessments.
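For illustration, here is a minimal sketch of the kind of model described above: a logistic regression that scores the probability of default from a few borrower and market features. The feature names, the synthetic data, and the model choice are assumptions made for the example, not a production credit model.

```python
# Minimal sketch: scoring default risk from hypothetical borrower features.
# Feature names and data are illustrative assumptions, not a real dataset.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5_000

# Hypothetical features: past delinquencies, credit utilization,
# a market stress index, and a local unemployment rate.
X = np.column_stack([
    rng.poisson(0.5, n),          # past delinquencies
    rng.uniform(0, 1, n),         # credit utilization
    rng.normal(0, 1, n),          # market stress index
    rng.normal(5, 1, n),          # local unemployment rate (%)
])

# Synthetic default flag driven by the same features (illustration only).
logit = -4 + 1.2 * X[:, 0] + 2.0 * X[:, 1] + 0.5 * X[:, 2] + 0.3 * (X[:, 3] - 5)
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Predicted probability of default for held-out borrowers.
pd_scores = model.predict_proba(X_test)[:, 1]
print("AUC:", round(roc_auc_score(y_test, pd_scores), 3))
```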

Strategic Decision Support

AI supports strategic decision-making by providing CROs with data-driven insights into risk-return trade-offs associated with different strategic choices. By simulating various scenarios and analyzing their potential impacts on the organization’s risk profile, AI helps CROs in making informed decisions that align with the company’s risk appetite and strategic objectives.

Operational Risk Management

AI can automate the monitoring of operational risks by analyzing transaction patterns, employee activities, and compliance with procedures, identifying anomalies that may indicate fraud, errors, or inefficiencies. This real-time monitoring capability enables CROs to swiftly address operational risks, reducing potential losses and improving operational resilience. Furthermore, AI-powered process automation can streamline risk management processes, enhancing efficiency and reducing the likelihood of human error.
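As one concrete possibility, a minimal sketch of the anomaly flagging mentioned above, using an unsupervised isolation forest. The transaction features, the injected outliers, and the contamination rate are all assumptions made for the example rather than any specific monitoring product.

```python
# Minimal sketch: flagging unusual transactions with an isolation forest.
# Feature choices (amount, hour of day, approvals) are illustrative assumptions.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)

# Mostly routine transactions, plus a couple of odd ones appended at the end.
routine = np.column_stack([
    rng.lognormal(mean=4, sigma=0.5, size=1000),   # amount
    rng.normal(13, 2, size=1000),                  # hour of day
    rng.integers(1, 3, size=1000),                 # approvals obtained
])
odd = np.array([[50_000, 3, 0], [75_000, 2, 0]])   # large, off-hours, unapproved
transactions = np.vstack([routine, odd])

detector = IsolationForest(contamination=0.01, random_state=1).fit(transactions)
flags = detector.predict(transactions)             # -1 marks an anomaly

print("Flagged rows:", np.where(flags == -1)[0])
```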

Financial Risk Analysis

In the realm of financial risks, AI models excel at analyzing market data, economic indicators, and financial trends to forecast future market movements and assess the potential impact on the organization’s financial health. This analysis can include stress testing, value-at-risk (VaR) calculations, and sensitivity analyses, providing CROs with a comprehensive understanding of financial risks and the effectiveness of hedging strategies.
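A minimal sketch of one such calculation, a 1-day value-at-risk by historical simulation, together with a crude shock scenario. The return series and portfolio size are made up for the example.

```python
# Minimal sketch: 1-day 99% value-at-risk by historical simulation.
# The return series and portfolio value are simulated / assumed for illustration.
import numpy as np

rng = np.random.default_rng(2)
portfolio_value = 100_000_000                            # $100m, an assumed figure
daily_returns = rng.standard_t(df=5, size=1000) * 0.01   # fat-tailed daily returns

# VaR is the loss at the chosen percentile of the historical loss distribution.
losses = -daily_returns * portfolio_value
var_99 = np.percentile(losses, 99)
print(f"1-day 99% VaR: ${var_99:,.0f}")

# A simple stress test: apply a hypothetical -10% market shock.
stress_loss = 0.10 * portfolio_value
print(f"Loss under a 10% shock scenario: ${stress_loss:,.0f}")
```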

Risk Reporting and Communication

AI can also revolutionize risk reporting and communication by generating dynamic, real-time risk reports that offer insights into the current risk landscape. These reports can be tailored to different audiences, from the board of directors to operational teams, ensuring that all stakeholders have the information they need to understand and manage risks effectively.

Conclusion

For CROs, the adoption of AI in risk management offers a transformative approach to navigating the complexities of risk in the financial services industry. By enhancing risk assessment, supporting strategic decision-making, improving operational efficiency, and facilitating effective risk communication, AI enables CROs to manage risks more proactively and strategically. As the risk landscape continues to evolve, leveraging AI will be crucial for CROs aiming to foster a strong risk management culture and drive their organizations towards sustainable growth and resilience.

Variety of Decision Making

July 20, 2022

Over the past several years, an anthropologist (Thompson), a control engineer (Beck) and an actuary (Ingram) have formed an unlikely collaboration that has resulted in countless discussions among the three of us along with several published (and posted) documents.

Our work was first planned in 2018. One further part of what was planned is still under development — the application of these ideas to economic thinking. This is previewed in document (2) below, where it is presented as Institutional Evolutionary Economics.

Here are abstracts and links to the existing documents:

  1. Model Governance and Rational Adaptability in Enterprise Risk Management, January 2020, AFIR-ERM section of the International Actuarial Association. The problem context here is what has been called the “Insurance Cycle”. In this cycle we recognize four qualitatively different risk environments, or seasons of risk. We address the use of models for supporting an insurer’s decision making for enterprise risk management (ERM) across all four seasons of the cycle. In particular, the report focuses expressly on: first, the matter of governance for dealing with model risk; and, second, model support for Rational Adaptability (RA) at the transitions among the seasons of risk. This latter examines what may happen around the turning points in the insurance cycle (any cycle, for that matter), when the risk of a model generating flawed foresight will generally be at its highest.
  2. Modeling the Variety of Decision Making, August 2021, Joint Risk Management Section. The four qualitatively different seasons of risk call for four distinctly different risk-coping decision rules. And if exercising those strategies is to be supported and informed by a model, four qualitatively different parameterizations of the model are also required. This is the variety of decision making that is being modeled. In addition, we propose and develop in this work a first blueprint for a fifth decision-making strategy, which we refer to as the adaptor. It is a strategy for assisting the process of RA in ERM and navigating adaptively through all the seasons of risk, insurance cycle after insurance cycle. What is more, the variety of everyday risk-coping decision rules and supporting models can be replaced by a single corresponding rule and model whose parameters vary (slowly) with time, as the model tracks the seasonal business and risk transitions.
  3. The Adaptor Emerges, December 2021, The Actuary Magazine, Society of Actuaries. The adaptor strategy focuses on strategic change: on the chops and changes among the seasons of risk over the longer term. The attention of actuaries coping with everyday risk is necessarily focused on the short term. When the facts change qualitatively, as indeed they did during the pandemic, mindsets, models, and customary everyday rules must be changed. Our adaptor indeed emerged during the pandemic, albeit coincidentally, since such was already implied in RA for ERM.
  4. An Adaptor Strategy for Enterprise Risk Management, April 2022, Risk Management Newsletter, Joint Risk Management Section. In our earlier work (2009-13), something called the “Surprise Game” was introduced and experimented with. In it, simulated businesses are obliged to be surprised and shaken into eventually switching their risk-coping decision strategies as the seasons of risk undergo qualitative seasonal shifts and transitions. That “eventually” can be much delayed, with poor business performance accumulating all the while. In control engineering, the logic of the Surprise Game is closely similar to something called cascade control. We show how the adaptor strategy is akin to switching the “autopilot” in the company driving seat of risk-coping, but ideally much more promptly than waiting (and waiting) for any eventual surprise to dawn on the occupant of the driving seat.
  5. An Adaptor Strategy for Enterprise Risk Management (Part 2), July 2022, Risk Management Newsletter, Joint Risk Management Section. Rather than its switching function, the priority of the adaptor strategy should really be that of nurturing the human and financial resources in the makeup of a business — so that the business can perform with resilience, season in, season out, economic cycle after economic cycle. The nurturing function can be informed and supported by an adaptor “dashboard”. For example, the dashboard can be designed to alert the adaptor to the impending loss or surfeit of personnel skilled in implementing any one of the four risk-coping strategies of RA for ERM. We cite evidence of such a dashboard from both the insurance industry and an innovation ecosystem in Linz, Austria.
  6. Adaptor Exceptionalism: Structural Change & Systems Thinking, March 2022, RISKVIEWS. Here we link Parts 1 and 2 of the Risk Management Newsletter article ((4) and (5) above). When we talk of “when the facts change, we change our mindsets”, we are essentially talking about structural change in a system, most familiarly, the economy. One way of grasping the essence of this, hence the essence of the invaluable (but elusive) systemic property of resilience, is through the control engineering device of a much simplified model of the system with a parameterization that changes relatively slowly over time — the adaptor model of document (2) above, in fact. This work begins to show how the nurturing function of the adaptor strategy is so important for the achievement of resilient business performance.
  7. Adaptor Strategy: Foresight, May 2022, RISKVIEWS. This is a postscript to the two-part Newsletter article and, indeed, its linking technical support material of document (6). It identifies a third possible component of an adaptor strategy: that of deliberately probing the uncertainties in business behaviour and its surrounding risk environment. This probing function derives directly from the principle of “dual adaptive control” — something associated with systems such as guided missiles. Heaven forbid: that such should be the outcome of a discussion between the control engineer, the actuary, and the anthropologist!

Still to be completed is the full exposition of Institutional Evolutionary Economics that is previewed in Section 1 of Modeling the Variety of Decision Making (Item 2 above).

First Quarter GDP

April 30, 2022

Do you notice anything unusual in the graph above that occurred in the first quarter of 2022? This graph says that in January about 6% of Americans were sick. That is about 25% of all of the COVID infections over the past 26 months. Other than January 2022, COVID infections averaged 2.4 million per month.
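As a rough check of that arithmetic (assuming a US population of about 330 million, which is not stated in the post):

```python
# Rough arithmetic behind the figures above (population assumed ~330 million).
us_population = 330_000_000
january_share_sick = 0.06

january_infections = us_population * january_share_sick     # ~19.8 million
total_infections = january_infections / 0.25                 # January was ~25% of all cases
other_months = (total_infections - january_infections) / 25  # remaining 25 of 26 months

print(f"January 2022 infections: {january_infections / 1e6:.1f} million")
print(f"Implied total over 26 months: {total_infections / 1e6:.0f} million")
print(f"Average in the other months: {other_months / 1e6:.1f} million per month")
```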

First quarter GDP fell by 1.4% in 2022.

I would bet that some of the GDP drop was due to the absolutely extraordinary level of illness in the first quarter.

I haven’t noticed any commentary that agrees with this point. But I am guessing that since we are all feeling that we have turned the corner on COVID, we are deliberately putting it out of our minds. That may cause us to draw erroneous conclusions about what is happening with the economy and to take actions to fix something that may have been driven to some extent by the pandemic, rather than by some other weakness in the economy.

Risk Intelligence IV

March 20, 2019

Overcoming Biases

In a recent post, RISKVIEWS proposed that Risk Intelligence would overcome biases.  Here are some specifics…

Biases

  • Anchoring – too much reliance on first experience
  • Availability – overestimate likelihood of events that readily come to mind
  • Confirmation Bias – look for information that confirms bias
  • Endowment effect – overvalue what you already have
  • Framing effect – conclusion depends on how the question is phrased
  • Gambler’s Fallacy – belief that independent future probabilities are altered by past outcomes, e.g. expecting reversion to the mean
  • Hindsight bias – things seem to be predictable after they happen
  • Illusion of control – overestimate degree of control over events
  • Overconfidence – believe own answers are more correct
  • Status Quo bias – Expect things to stay the same
  • Survivorship bias – only look at the people who finished a process, not all who started
  • Ostrich Effect – Ignore negative information

Each of Education, Experience and Analysis should reduce all of these.

Experience should provide the feedback that most of these ideas are simply wrong.  The original work that identified these biases followed the standard psychology approach of excluding anyone with experience and of prohibiting subjects from trying any of the questions a second time.  So learning to identify and avoid these biases through experience has had limited testing.

Education for a risk manager should directly cover all of these biases and their adverse consequences.  Many risk managers receiving that education will ever after seek to avoid making those mistakes.

But some will be blinded by the perceptual biases and therefore resist abandoning their gut feel that actually follows the biases.

Analysis may provide the information to convince some of these remaining holdouts.  Analysis, if done correctly, will follow the logic of economic rationality, which is the metric that was used to identify the wrong decisions that were eventually aggregated as biases.

So there may still be some people who, even in the face of:

  • Experience of less than optimal outcomes
  • Education that provides discussion and examples of the adverse impact of decision-making based upon the biases
  • Analysis that provides numerical back-up for unbiased decision making

will still want to trust their own gut to make decisions regarding risk.

You can probably weed out those folks in hiring.

Management by Onside Kick

June 6, 2016

Many American football fans can recall a game when their team drove the ball 80 or more yards in the waning moments of the game to pull within a touchdown of the team that had been dominating them. Then they call for the onside kick, recover the ball, and charge to a win within a few more plays.

But according to NFL stats, that onside kick succeeds only 20% of the time in the waning minutes of the game.

Mid-game onside kicks – the ones that come as surprises – work 60% of the time.

But mostly it is the successful onside kicks that make the highlight reel. RISKVIEWS guesses that the kicks shown on highlight reels succeed 80% or more of the time.
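A quick simulation shows how a highlight reel can turn a 20% play into an 80% play. The 20% base rate comes from the NFL stats cited above; the shares of successes and failures that make the highlights are assumptions for the sketch.

```python
# Minimal sketch: how sampling from highlight reels inflates a success rate.
# The 20% late-game base rate is from the post; highlight selection is assumed.
import numpy as np

rng = np.random.default_rng(3)
attempts = 1_000
true_success_rate = 0.20

recovered = rng.random(attempts) < true_success_rate

# Assume nearly every successful kick makes the highlights, but few failures do.
makes_highlights = np.where(recovered,
                            rng.random(attempts) < 0.90,
                            rng.random(attempts) < 0.05)

highlight_rate = recovered[makes_highlights].mean()
print(f"True success rate:        {recovered.mean():.0%}")
print(f"Highlight-reel estimate:  {highlight_rate:.0%}")
```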

And if you look back on the games of the teams that make it to the Super Bowl, they probably were successful the few times that they called that play.

What does that mean for risk managers?

Be careful where you get your statistics. Big data is now very popular. Winners use Big Data. So many conclude that it will give better indications. But make sure that your data inputs are not from highlight reels or from the records of the best year for a company.

Many firms, for example, use default data collected by rating agencies to parameterize their credit models. But the rating agencies would point out that the data covers rated companies only. This makes little difference for rated bonds, which are rated from issue to maturity or default. But if you want to build a default model of insurers or reinsurers, then you need to know that many insurers and some reinsurers will drop their rating if it falls below a level where it hurts their business. So ratings transition statistics for insurers look more like highlight reels below a certain rating level.

Some models of dynamic hedging strategies were in effect taking the mid-game success rates and assuming that they would apply in bad times. But as with the onside kick, things worked very differently.

So realize that a business strategy, and especially a risk mitigation strategy, may work differently when things have all gone to a mess.

And an onside kick is nothing more than putting the ball in play and praying that something good will happen.

Top 10 RISKVIEWS Posts of 2014 – ORSA Heavily Featured

December 29, 2014

RISKVIEWS believes that this may be the best top 10 list of posts in the history of this blog.  Thanks to our readers whose clicks resulted in their selection.

  • Instructions for a 17 Step ORSA Process – Own Risk and Solvency Assessment is here for Canadian insurers, coming in 2015 for US and required in Europe for 2016. At least 10 other countries have also adopted ORSA and are moving towards full implementation. This post leads you to 17 other posts that give a detailed view of the various parts to a full ORSA process and report.
  • Full Limits Stress Test – Where Solvency and ERM Meet – This post suggests a link between your ERM program and your stress tests for ORSA that is highly logical, but not generally practiced.
  • What kind of Stress Test? – Risk managers need to do a better job communicating what they are doing. Much of the communication about risk models and stress tests is fairly mechanical and technical. This post suggests some plain English terminology to describe the stress tests to non-technical audiences such as boards and top management.
  • How to Build and Use a Risk Register – A first RISKVIEWS post from a new regular contributor, Harry Hall. Watch for more posts along these lines from Harry in the coming months. And catch Harry on his blog, http://www.pmsouth.com
  • ORSA ==> AC – ST > RCS – You will notice a recurring theme in 2014 – ORSA. That topic has taken up much of RISKVIEWS time in 2014 and will likely take up even more in 2015 and after as more and more companies undertake their first ORSA process and report. This post is a simple explanation of the question that ORSA is trying to answer that RISKVIEWS has used when explaining ORSA to a board of directors.
  • The History of Risk Management – Someone asked RISKVIEWS to do a speech on the history of ERM. This post and the associated new permanent page are the notes from writing that speech. Much more here than could fit into a 15 minute talk.
  • Hierarchy Principle of Risk Management – There are thousands of risks faced by an insurer that do not belong in their ERM program. That is because of the Hierarchy Principle. Many insurers who have followed someone’s urging that ALL risks need to be included in ERM belatedly find out that no one in top management wants to hear from them or to let them talk to the board. A good dose of the Hierarchy Principle will fix that, though it will take time. Bad first impressions are difficult to fix.
  • Risk Culture, Neoclassical Economics, and Enterprise Risk Management – A discussion of the different beliefs about how business and risk work. The difference between the beliefs taught in MBA and finance programs and the beliefs about risk that underpin ERM makes it difficult to justify spending time and money on risk management.
  • What CEO’s Think about Risk – A discussion of three different aspects of decision-making as practiced by top management of companies, and of how the decision-making processes taught to quants can make quants less effective when trying to explain their work and conclusions.
  • Decision Making Under Deep Uncertainty – Explores the concepts of Deep Uncertainty and Wicked Problems. Of interest if you have any risks that you find yourself unable to clearly understand or if you have any problems where all of the apparent solutions are strongly opposed by one group of stakeholders or another.

Decision Making Under Deep Uncertainty

October 20, 2014

The above is a part of the title of a World Bank report.  The full title of that report is

Investment Decision Making Under Deep Uncertainty – Application to Climate Change

While that report focuses on one specific activity – Investing – and one area of deep uncertainty – Climate Change – it contains some very interesting suggestions that can be applied more broadly.

First, let’s look at the idea of Deep Uncertainty.  They define it as:

deep uncertainty is a situation in which analysts do not know or cannot agree on (1) models that relate key forces that shape the future, (2) probability distributions of key variables and parameters in these models, and/or (3) the value of alternative outcomes.

In 1973, Horst W.J. Rittel and Melvin M. Webber, two Berkeley professors, published an article in Policy Sciences introducing the notion of “wicked” social problems. The article, “Dilemmas in a General Theory of Planning,” named 10 properties that distinguished wicked problems from hard but ordinary problems.

1. There is no definitive formulation of a wicked problem. It’s not possible to write a well-defined statement of the problem, as can be done with an ordinary problem.

2. Wicked problems have no stopping rule. You can tell when you’ve reached a solution with an ordinary problem. With a wicked problem, the search for solutions never stops.

3. Solutions to wicked problems are not true or false, but good or bad. Ordinary problems have solutions that can be objectively evaluated as right or wrong. Choosing a solution to a wicked problem is largely a matter of judgment.

4. There is no immediate and no ultimate test of a solution to a wicked problem. It’s possible to determine right away if a solution to an ordinary problem is working. But solutions to wicked problems generate unexpected consequences over time, making it difficult to measure their effectiveness.

5. Every solution to a wicked problem is a “one-shot” operation; because there is no opportunity to learn by trial and error, every attempt counts significantly. Solutions to ordinary problems can be easily tried and abandoned. With wicked problems, every implemented solution has consequences that cannot be undone.

6. Wicked problems do not have an exhaustively describable set of potential solutions, nor is there a well-described set of permissible operations that may be incorporated into the plan. Ordinary problems come with a limited set of potential solutions, by contrast.

7. Every wicked problem is essentially unique. An ordinary problem belongs to a class of similar problems that are all solved in the same way. A wicked problem is substantially without precedent; experience does not help you address it.

8. Every wicked problem can be considered to be a symptom of another problem. While an ordinary problem is self-contained, a wicked problem is entwined with other problems. However, those problems don’t have one root cause.

9. The existence of a discrepancy representing a wicked problem can be explained in numerous ways. A wicked problem involves many stakeholders, who all will have different ideas about what the problem really is and what its causes are.

10. The planner has no right to be wrong. Problem solvers dealing with a wicked issue are held liable for the consequences of any actions they take, because those actions will have such a large impact and are hard to justify.

These Wicked Problems sound very similar to Deep Uncertainty.

The World Bank report suggests that “Accepting uncertainty mandates a focus on robustness”.

A robust decision process implies the selection of a project or plan which meets its intended goals – e.g., increase access to safe water, reduce floods, upgrade slums, or many others – across a variety of plausible futures. As such, we first look at the vulnerabilities of a plan (or set of possible plans) to a field of possible variables. We then identify a set of plausible futures, incorporating sets of the variables examined, and evaluate the performance of each plan under each future. Finally, we can identify which plans are robust to the futures deemed likely or otherwise important to consider.

That sounds a lot like a risk management approach: taking your plans and looking at how they work under a range of scenarios.
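A minimal sketch of that screening logic: score each candidate plan under every plausible future and keep the plans whose worst case stays above a tolerable floor. The plans, futures, payoffs, and threshold below are made up; only the robustness test itself follows the description quoted above.

```python
# Minimal sketch of a robustness screen: evaluate every plan under every
# plausible future and keep the plans that perform acceptably in all of them.
# Plans, futures, payoffs, and the floor are illustrative assumptions.
import numpy as np

plans = ["expand coverage", "raise rates", "buy reinsurance"]
futures = ["benign", "soft market", "catastrophe year", "inflation shock"]

# payoff[i, j] = outcome of plan i under future j (e.g., economic profit, $m).
payoff = np.array([
    [30, 10, -60, -5],
    [15, 20, -10,  5],
    [10,  5,  15,  0],
])

acceptable_floor = -15   # an assumed minimum tolerable outcome

worst_case = payoff.min(axis=1)
robust = [p for p, w in zip(plans, worst_case) if w >= acceptable_floor]

for p, w in zip(plans, worst_case):
    print(f"{p:<16} worst case: {w:>4}")
print("Robust plans:", robust)
```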

This is a different approach from what business managers are trained to take.  And it is a clear example of the fundamental conflict between risk management thinking and the predominant thinking of company management.

What business managers are taught to do is to predict the most likely future scenario and to make plans that will maximize the results under that scenario.

And that approach makes sense when faced with a reliably predictable world.  But in those situations when you are faced with Deep Uncertainty or Wicked Problems, the Robust Approach should be the preferred approach.

Risk managers need to understand that businesses mainly need to apply the Robust Approach of risk management to these Wicked Problems and Deep Uncertainties.  It is a major waste of time to apply the Robust Approach when the situation is not that extreme.  So risk managers need to develop the skills and processes to identify those situations, and they need to “sell” this approach to top management.  Risks need to be divided into two classes – “normal” and “Deep Uncertain/Wicked” – with the Robust Approach used for planning what to do about the business activities subject to the latter.  The Deep Uncertainty may not exist now, but the risk manager needs to have built credibility with top management for the day when they bring their reasoning for identifying a new situation of Deep Uncertainty.

Communicating with CEOs

September 24, 2014

The point of communication isn’t to speak. It’s to be heard and understood – to have influence and motivate action. Effective communication requires knowing what information you want to convey and what action you want to motivate, but that’s not enough. You must also know your audience – in this case, CEOs – well enough to determine what factors will truly resonate and motivate them to take the desired action based on your information.

CEOs often are not thinking about their key decisions in the same statistical terms that a risk manager or other quantitative analyst would favor.  Several different studies show that most experienced decision makers do not apply statistical thinking either.  Instead, they apply a natural decision-making process assisted liberally by heuristics.

CEOs and other leaders also commonly have different perspectives on priorities than risk managers and analysts.  Analysts will tend to see the world “realistically”, with a balance between risks and rewards, while CEOs may have reached their position, in part, because they see the world “optimistically”, as containing plenty of opportunities where rewards are much more likely than the overstated risks.  Of course, from the perspective of the CEO, the analysts are “pessimistic” and they themselves are “realistic”.

To communicate with CEOs, risk managers and analysts need to learn to frame the results of their work in terms that make sense to CEOs.  That will often be in terms of Natural Decision Making, Heuristics and Opportunities.

For more on this topic, see Actuarial Review “How to Talk to a CEO“. 

 

Irrational means you don’t agree with me

February 5, 2014

The term “Rational” is used in economics to mean using the decision process that results in the best economic outcome.

And the best economic outcome is often defined as the one that results in the highest amount of money for the decision maker.  That at least is the theory.  There is an entire body of analysis under the title “Game Theory” that shows how such Rational Decision making would apply to many situations.

Herbert Simon actually showed a fundamental flaw in approaches like Game Theory and other forms of economic rationalism.

The flaw is that to really satisfy the rules of rationality, the decision maker would need to have infinite time to make the decision and would also need access to all knowledge, all of which must be reviewed to see if it pertains to the problem.

So Simon proposed that what was really meant by economists when they used the idea of rational decision making was something that he called “Bounded Rationality”.  The boundaries to rationality were necessary to get to a decision before tea time next spring.

Rational decision makers needed to apply heuristics to determine the actual amount of time and the pertinent information that would be needed for making any decision.  Heuristics are seen as the opposite of rationality.  They are the “gut feel” way to decide something.  So Simon showed that Rational Decision Makers must be using gut feel.

Just think if physicists considered anyone who did not solve physical problems using the best equations that physics has to offer to be “irrational”.  Everyone who drives a car, or catches a ball, would be found to be totally irrational, because those activities always rely upon heuristics rather than physics equations.  Instead, physicists would readily admit that the person who can run across center field and arrive at just the right time to catch the baseball is actually applying physics properly, rather than the opposite.

And RISKVIEWS would extend Simon’s arguments to suggest that the heuristics used are not neutral to the decision.  “Rational” decision makers will all apply their own heuristics to decide what needs to go into a decision.  Some of those heuristics will be based not on a “rational” evaluation of the value of the information not included or the analysis not performed, but will be biased to leave out the information and analysis that lead away from their preferred solution.

What Simon deduced is that there is no purely rational decision making process.

And RISKVIEWS is saying that anyone who proposes that their decision is made rationally should be suspect.  Are they using the term rational to persuade?  Or do they not even know about the limitations of their own analysis?

Good analysis should include information about the way that the analyst decided on the boundaries for that analysis.  Someone who simply states the assumptions underlying their analysis is not giving you a solution, they are giving you a puzzle to solve.  The solution to the puzzle is the knowledge of when the analysis may be true and when it may be untrue.  Solving that puzzle involves understanding the bounded rationality of the analyst and the degree to which reality may or may not be outside of those bounds.