The quantification of risk has been rising in popularity in cybersecurity circles over the past couple of years, as reported by the Wall Street Journal. However, WSJ’s report leaves out one of the most impactful measures for risk: Factor Analysis of Information Risk (FAIR), the only internationally recognized standard for quantifying risk. The FAIR Institute has developed a robust system of risk management based entirely on quantification. So, how is risk exposure calculated in FAIR?
The short answer: FAIR assigns dollar values to losses you are likely to face, then factors in various “threat” and “vulnerability” values to determine if, or how often, you’ll face them.
The longer answer needs a little more detail.
How is Risk Exposure Calculated in FAIR?
The FAIR Institute doesn’t publicize its specific formulas for risk. Both the actual dollar amounts and the variables used for their risk calculations are secret. However, FAIR’s categorical scheme and the flow of analysis used to calculate risk are accessible (and covered below).
To understand how the assessment of risk operates in its nitty-gritty details, you need to understand FAIR’s overall approach to risk and what constitutes effective management.
In the sections that follow, we’ll cover:
- FAIR’s definition of and approach to risk
- FAIR’s breakdown of cost magnitude
- FAIR’s figures for loss frequency
With this, you’ll be ready to begin the process of risk quantification at your company, either on your own or with the help of qualified cybersecurity experts.
Risk as Loss: Risk Quantification FAIR Methodology
The FAIR Institute prioritizes a proactive approach to assessing and managing risk.
This runs counter to what it categorizes as reactive, compliance-based approaches. Thus, FAIR risk training involves unlearning passive, qualitative definitions of risk and acclimating to a more dynamic, proactive, and quantified approach. Robust, cost-effective risk management requires sound models.
But that brings us to the heart of the matter — how do you create mathematically sound models for risk?
On the FAIR model flowchart, “risk” is defined as: “The probable frequency and probable magnitude of future loss.” In other words, FAIR sees risk as probable frequency multiplied by probable magnitude. This simple formula bears similarity to the “frequency-severity method,” a classic actuarial science model used to calculate risk in insurance and corporate finance.
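Though FAIR’s exact formulas aren’t public, that top-level relationship can be sketched as a simple expected-value calculation. The function below is a hypothetical illustration for demonstration purposes, not FAIR’s actual implementation; the parameter names and example figures are assumptions.

```python
def annualized_loss_exposure(loss_event_frequency: float,
                             loss_magnitude: float) -> float:
    """Illustrative expected annual loss in dollars.

    loss_event_frequency: expected number of loss events per year
    loss_magnitude: expected dollar loss per event
    """
    return loss_event_frequency * loss_magnitude

# Example: an event expected once every two years (0.5 events/year),
# costing roughly $200,000 per occurrence.
print(annualized_loss_exposure(0.5, 200_000))  # 100000.0
```

Expressing exposure as an annualized dollar figure is what lets different risks be compared and prioritized on a common scale.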
But FAIR’s model isn’t just a simple multiplication; it involves a complex flow of categories, branching out from the two main factors:
- Loss magnitude
- Frequency of loss events
So, to better understand the risk quantification FAIR methodology (loss magnitude x frequency of loss events), let’s take a closer look at each side’s respective half of the flowchart.
FAIR’s Forms and Magnitude of Loss
According to FAIR, the shape that loss takes is half of what risk entails. Understanding the magnitude of loss involves not just quantifying it with a dollar amount, but also classifying it according to its particular type. Different types of loss will impact your company differently.
The forms of loss enumerated by FAIR in the flowchart include:
- Productivity – Costs of incapacitation and inability to deliver key products and services.
- Response – The immediate costs of responding to a risk or threat as it happens.
- Replacement – Short- and long-term costs of replacing any compromised assets.
- Reputation – Stock price drops and other costs of fluctuating shareholder perception.
- Competitive advantage – Costs of losing an edge, like intellectual property or a market share.
- Judgments and fines – Court-ordered fines, settlements, and criminal punishments.
Across these forms of loss, companies can quantify even costs that might be considered “intangible” or immaterial. For instance, a company with an accurate overview of its market and customer base should be able to estimate the cost of a blow to its reputation. While publicly available information might not suffice here, internal metrics are invaluable.
Ultimately, the “magnitude” side of the main flow chart leads to two destinations:
- Primary loss – A quantification (in dollars) of loss across the categories listed above, as well as any other niche costs specific to your organization (e.g., scheduling).
- Secondary risk – A new tree, stemming from any risks opened up by the first. Crucially, this second tree compounds with the first, adding its expected costs to the primary loss:
- Secondary loss event frequency, taking the primary event into account.
- Secondary loss magnitude, combined with that of the first.
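Putting the two destinations together, here is a hypothetical sketch of how primary and secondary losses might combine into a total expected loss magnitude. The function name and the additive treatment of secondary loss are illustrative assumptions, not FAIR’s published math.

```python
def total_loss_magnitude(primary_loss: float,
                         secondary_lef: float,
                         secondary_lm: float) -> float:
    """Combine primary loss with expected secondary loss (illustrative).

    primary_loss:  dollar loss from the initial event
    secondary_lef: probability/frequency of secondary loss events,
                   given that the primary event occurred
    secondary_lm:  dollar magnitude of those secondary losses
    """
    return primary_loss + secondary_lef * secondary_lm

# Example: a $150,000 primary loss with a 30% chance of follow-on
# reputation or fine-related losses averaging $400,000.
print(total_loss_magnitude(150_000, 0.3, 400_000))  # 270000.0
```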
All told, the values from this half of the chart make up half of the equation. The other half of calculating risk involves looking at how often any of these losses can be expected to occur.
Frequency of FAIR Loss Events
The frequency half of the flowchart is more complex than the magnitude side. While the form a loss takes does factor into its calculation, the magnitude side ultimately has fewer branches and terms, since every kind of loss resolves into a single dollar amount of real or estimated cost.
On the frequency side, however, the task is to calculate the number of times or the rate at which a given loss event is likely to occur. This breaks down into the following branches:
- Threat event frequency – The number of times a threat or potential risk may occur, including the frequency with which “threat agents” may initiate a loss event.
- Vulnerability – The likelihood of a threat becoming a risk, expressed as a probability.
These branches introduce the operating terms of “threat” and “vulnerability,” respectively, which work to quantify events that could become risks, as well as how likely it is that they will. This calculation involves another level of subdivision into further sub-branches.
The first sub-branch, threat event frequency, leads into:
- Contact frequency – The number of times, or the rate at which, a threat agent can be expected to come into contact with assets or resources, creating a potential threat.
- Probability of action – How likely it is that the threat agent will take action upon contacting an asset or other resource, represented as a probability.
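The two sub-branches above suggest a straightforward combination: threat event frequency as contact frequency scaled by the probability of action. This is a hypothetical sketch of that relationship, with assumed names and figures, not FAIR’s published formula.

```python
def threat_event_frequency(contact_frequency: float,
                           probability_of_action: float) -> float:
    """Expected threat events per year (illustrative).

    contact_frequency:     expected contacts per year between a
                           threat agent and the asset
    probability_of_action: chance the agent acts on any given contact
    """
    return contact_frequency * probability_of_action

# Example: ~50 contacts per year, 10% of which lead to an attempt.
print(threat_event_frequency(50, 0.10))  # 5.0
```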
And the second sub-branch, vulnerability, leads into:
- Threat capability – The level of force a threat agent is likely to be able to apply to a given asset or resource, depending on the agent’s skills and resources.
- Resistance strength – The ability of an asset or resource to resist a threat agent’s attempts to compromise it, including built-in cyberdefense mechanisms.
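One common way to read this pairing: a threat event becomes a loss event when the agent’s capability exceeds the asset’s resistance. The scoring below (both sides on a 0–100 scale, compared directly) is an assumption for illustration; analyses of this kind typically work with distributions rather than the point values shown here.

```python
def vulnerability(threat_capability: float,
                  resistance_strength: float) -> float:
    """Probability a threat event becomes a loss event (illustrative).

    Both inputs are scored 0-100. With single point values the
    comparison is all-or-nothing; with distributions of capability
    and resistance, it would yield a fraction between 0 and 1.
    """
    return 1.0 if threat_capability > resistance_strength else 0.0

print(vulnerability(70, 60))  # 1.0 -- attacker overcomes the control
print(vulnerability(40, 60))  # 0.0 -- the control holds
```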
The values from the frequency side of the flowchart are multiplied by corresponding magnitude figures to produce risk estimates. But ultimately, the estimated probability and cost in dollars of any given threat or risk is less valuable than the sum total of the model that produces it.
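Tying the two halves together, here is a Monte Carlo sketch in the spirit of the frequency-severity method: draw frequency and magnitude from estimated ranges and average the product over many trials. Every distribution choice, range, and name here is an assumption for demonstration; FAIR’s actual calculations are not public.

```python
import random

def simulate_exposure(lef_low, lef_mode, lef_high,
                      lm_low, lm_mode, lm_high,
                      trials=10_000, seed=42):
    """Average annual loss over simulated trials (illustrative).

    Uses triangular (min / most-likely / max) estimates, a common
    way to encode expert judgment when exact data is unavailable.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        lef = rng.triangular(lef_low, lef_high, lef_mode)  # events/year
        lm = rng.triangular(lm_low, lm_high, lm_mode)      # $/event
        total += lef * lm
    return total / trials

# Example: 0.1-2 loss events/year (most likely 0.5), costing
# $50k-$1M per event (most likely $200k).
estimate = simulate_exposure(0.1, 0.5, 2.0, 50_000, 200_000, 1_000_000)
print(round(estimate))
```

Running the simulation many times over ranges, rather than multiplying two point estimates, is what makes the model as a whole more valuable than any single risk figure it produces.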
Manage Risks FAIR-ly and Professionally
Quantifying risk is far from the only thing you need to do in order to maintain robust vulnerability management across your company. You’ll also need to specify particular plans of action for addressing identified threats and risks, as well as full-blown cybersecurity events. To that effect, managed IT and security services help protect your company from a wide range of risks and threats.
RSI Security is your best option for all your cyberdefense needs. Our talented team of experts has served companies of all sizes for over a decade. When it comes to FAIR, we know compliance is only the beginning of security. So, we’ll help build your security architecture from the ground up.
To circle back to our question from above: how is the risk exposure calculated in FAIR? It’s run through a version of the flowchart we detailed, breaking down into measures of frequency and magnitude of loss. To help run the calculations for your company, contact RSI Security today!