The average cost of a data breach continues to climb, from $3.85 million in 2020 to nearly $4.9 million in 2024. In total, cybercrime costs global businesses more than $9 trillion a year. How much risk do cyberattacks pose to your business? Many organizations are turning to cyber risk quantification (CRQ) to answer that question in a rapidly changing threat environment. This guide explains five excellent quantitative risk frameworks you can use for CRQ.
1. Monte Carlo Quantitative Risk Framework

Monte Carlo risk models are complex frameworks that use probability distributions and random variables to calculate outcomes. The name is no coincidence; like the roulette tables at Monte Carlo, this statistical method relies on randomness to deliver comprehensive probabilities for situations that are hard to predict mathematically.
Monte Carlo simulations also form the backbone of well-known risk management frameworks such as the Factor Analysis of Information Risk (FAIR) model, an open standard for quantifying cyber risk. FAIR uses a Monte Carlo engine to assign values for loss magnitude, loss event frequency, vulnerability, and threat event calculations.
Are Monte Carlo Simulations Good for Cybersecurity Risk Calculations?
Monte Carlo is one of the most widely used and trusted quantitative risk assessment methods for infosec — if you have the necessary processing power and expertise. What makes it powerful is its ability to account for multiple variables and unknown obstacles.
In the world of cyber threats, unpredictability is a given:
- Human error
- Insider threats
- Zero-day exploits and other emergent threats
- Business email compromise attacks
- Software flaws and errors
- Supply-chain vulnerabilities
New cyber threats can appear at a moment’s notice. Many events are outside of your control, such as supply chain attacks or user behaviors. Monte Carlo quantitative risk frameworks solve this problem by using a distribution of probabilities instead of concrete figures.
How Does the Monte Carlo Method Work for Quantitative Risk?
This framework involves building detailed computer models of the risks your organization is likely to face:
- Identify the main variables that influence each risk.
- Define a probability distribution for each variable, typically its minimum, most likely, and maximum values.
- Run thousands of randomized trials through a risk management software engine to produce a distribution of outcomes.
- Interpret the results, set priorities, and apply the findings to your risk management program.
Monte Carlo frameworks shine in unpredictable situations where you need a quantitative analysis but lack historical data to draw on.
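The steps above can be sketched in a few lines of Python. This is a minimal illustration, not a production model: the frequency and impact figures are made-up assumptions, and real CRQ tools use far richer distributions and data.

```python
import random

# Hypothetical inputs (illustrative assumptions, not real data):
# annual loss-event frequency and per-event loss, each as
# (minimum, most likely, maximum).
FREQ_MIN, FREQ_MODE, FREQ_MAX = 0, 2, 12            # events per year
LOSS_MIN, LOSS_MODE, LOSS_MAX = 10_000, 150_000, 2_000_000  # dollars per event

def simulate_annual_losses(trials=20_000, seed=42):
    """Run Monte Carlo trials and return one simulated annual loss per trial."""
    rng = random.Random(seed)
    losses = []
    for _ in range(trials):
        # Draw a random number of loss events for the year...
        events = round(rng.triangular(FREQ_MIN, FREQ_MAX, FREQ_MODE))
        # ...then a random loss amount for each event.
        losses.append(sum(rng.triangular(LOSS_MIN, LOSS_MAX, LOSS_MODE)
                          for _ in range(events)))
    return losses

losses = sorted(simulate_annual_losses())
median = losses[len(losses) // 2]
p95 = losses[int(len(losses) * 0.95)]
print(f"Median simulated annual loss: ${median:,.0f}")
print(f"95th percentile annual loss:  ${p95:,.0f}")
```

Instead of a single "expected loss" figure, the simulation yields a full distribution, so you can report tail risk (the 95th percentile) alongside the typical year.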
2. Decision Tree Analysis Method
Risk-averse organizations often use a decision tree risk analysis framework to minimize the negative impacts of investments and maximize the returns. This model uses detailed decision trees to show the short-term and long-term costs associated with different paths.
One example of a decision-tree framework in action is deciding which third-party vendors to work with. There are costs and benefits to each option:
- The vendor’s reputation for quality and dependability
- The cost of services
- Vendor certifications
- Time in business and expertise
- Practical considerations, such as project delivery timelines
- Likelihood and severity of risks associated with each vendor
Vendor A may be a well-known global firm, such as PwC. There’s usually a correlation between reputation, price, and risk factors, so Vendor A probably costs more than Vendor B. Before you choose a “less expensive” option, though, you need to calculate the potential costs of regulatory violations, data breaches, downtime, and reputational harm.
Other uses for decision models include:
- Technology decisions, such as continuous network monitoring services
- Personnel decisions, calculating the right size for your IT security or compliance team
- Policy decisions
- Risk management decisions, determining whether to mitigate, transfer, or accept risk
This model calculates the costs and risks associated with each decision node and overall path, helping you make data-driven decisions.
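The vendor comparison above can be reduced to expected values at each decision node. The probabilities, fees, and incident costs below are purely illustrative assumptions, but they show how a cheaper vendor can carry a higher expected total cost.

```python
# Hypothetical decision tree: each vendor branch lists (probability, cost)
# outcomes.  All figures are illustrative assumptions.
VENDORS = {
    "Vendor A (established)": {
        "service_fee": 500_000,
        "outcomes": [(0.02, 1_000_000),  # 2% chance of a costly incident
                     (0.98, 0)],         # 98% chance of no incident
    },
    "Vendor B (budget)": {
        "service_fee": 300_000,
        "outcomes": [(0.15, 2_500_000),  # 15% chance of a costly incident
                     (0.85, 0)],
    },
}

def expected_cost(vendor):
    """Service fee plus the probability-weighted cost of each outcome."""
    risk = sum(p * cost for p, cost in vendor["outcomes"])
    return vendor["service_fee"] + risk

for name, vendor in VENDORS.items():
    print(f"{name}: expected total cost ${expected_cost(vendor):,.0f}")
```

With these assumed numbers, Vendor A's expected total cost is about $520,000 against Vendor B's $675,000: the "less expensive" option is the worse bet once incident risk is priced in.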
3. Expected Monetary Value Assessment Technique

The expected monetary value quantitative risk framework speaks in a language that shareholders and C-suite executives understand readily. It expresses cyber risks in financial terms, calculating probabilities and financial losses associated with risks.
The EMV formula is simple:
- EMV = probability of the event × financial impact
Some cyber threats have a low likelihood but extreme costs, such as your CFO falling for a phishing scam. Other possibilities are likely but less impactful, such as an employee accidentally deleting a document. The most critical threats combine high likelihood with high impact, such as ransomware attacks, which reportedly affect 75% of enterprises and result in staggering losses.
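Applying the formula to a small threat register makes the trade-offs concrete. The probabilities and impacts below are illustrative assumptions, not benchmarks.

```python
def emv(probability, impact):
    """Expected monetary value: probability of the event times its cost."""
    return probability * impact

# Hypothetical annual threat register (all figures are assumptions).
threats = [
    ("CFO phishing compromise",   0.01, 5_000_000),  # rare but extreme
    ("Accidental file deletion",  0.60, 2_000),      # likely but cheap
    ("Ransomware attack",         0.30, 1_500_000),  # likely and costly
]

# Rank threats by EMV, highest first.
for name, p, impact in sorted(threats, key=lambda t: emv(t[1], t[2]),
                              reverse=True):
    print(f"{name}: EMV ${emv(p, impact):,.0f}")
```

Ranking by EMV surfaces the ransomware scenario (EMV of roughly $450,000 here) ahead of both the rare catastrophe and the frequent nuisance, which is exactly the prioritization signal executives want.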
4. Sensitivity Analysis
Also known as a “what-if” analysis, a sensitivity risk framework examines how changes in input variables affect a target output variable. This allows your business to forecast the probable impacts and risks associated with specific changes.
This quantitative risk framework has important advantages:
- Shows which factors are relevant or irrelevant
- Allows for predictive analysis
- Offers excellent precision when paired with historical data
- Allows stakeholders to know which warning signs to watch for
- Identifies the most important areas of concern
Imagine you’re calculating the pros and cons of using an automation tool for regulatory compliance. A sensitivity analysis shows which areas of your workflow will be affected, the benefits, and the costs.
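A simple one-at-a-time sensitivity sketch looks like this: shock each input by a fixed percentage and see how much the target moves. The cost model and all figures below are hypothetical assumptions for illustration.

```python
# Hypothetical baseline inputs for an annual compliance-cost model.
BASELINE = {
    "audit_hours": 400,        # staff hours spent on compliance work
    "hourly_rate": 150,        # fully loaded cost per hour, dollars
    "violation_prob": 0.10,    # chance of a violation this year
    "violation_fine": 500_000, # expected fine if a violation occurs
}

def annual_cost(params):
    """Assumed cost model: labor plus probability-weighted fines."""
    return (params["audit_hours"] * params["hourly_rate"]
            + params["violation_prob"] * params["violation_fine"])

base = annual_cost(BASELINE)
print(f"Baseline annual cost: ${base:,.0f}")

# Shock each input by +10% in turn and record the change in the target.
for key in BASELINE:
    shocked = dict(BASELINE, **{key: BASELINE[key] * 1.10})
    delta = annual_cost(shocked) - base
    print(f"+10% {key}: cost changes by ${delta:,.0f}")
```

The inputs producing the largest deltas are the ones worth monitoring most closely, which is exactly the "warning signs" insight the bullet list above describes.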
5. Fault Tree Quantitative Risk Assessment

The fault tree method of quantitative risk analysis flips the approach. Instead of starting with risks and working out the consequences, you start with the failure state and work backward to identify its causes.
This is helpful for cybersecurity risk assessments and regulatory compliance. Fault trees are exceptionally useful at locating vulnerabilities and attack surfaces.
If your organization manages a SaaS platform, you already know that you need to avoid data breaches at all costs. The question is where to strengthen your defenses.
One step at a time, you use logic gates such as AND, OR, CONDITION, or VOTING to map the chains of events that can lead to the failure state. Each gate has a probability based on organizational data, such as network traffic, failed login attempts, or employee compliance failures.
The results of this analysis can help you prioritize specific corrective measures, technology investments, and mitigation strategies, such as zero-trust policies and role-based access controls.
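The gate math behind a fault tree is straightforward to sketch. This minimal example assumes independent basic events with made-up probabilities, working backward from a "data breach" top event.

```python
def p_and(*probs):
    """AND gate: all child events must occur (product of probabilities)."""
    result = 1.0
    for p in probs:
        result *= p
    return result

def p_or(*probs):
    """OR gate: at least one child event occurs (1 minus none occurring)."""
    none = 1.0
    for p in probs:
        none *= 1 - p
    return 1 - none

# Hypothetical basic-event probabilities (illustrative assumptions).
phishing_success = 0.05  # an employee clicks a malicious link
weak_credentials = 0.10  # password reuse or a weak password
mfa_bypassed     = 0.02  # MFA missing or defeated

# Top event: a breach requires compromised credentials AND a bypassed
# second factor; credentials can fall via phishing OR weak passwords.
creds_compromised = p_or(phishing_success, weak_credentials)
data_breach = p_and(creds_compromised, mfa_bypassed)
print(f"P(data breach) = {data_breach:.4f}")
```

Recomputing the top-event probability after a proposed control change, such as cutting `mfa_bypassed` with stricter MFA enforcement, shows which investment reduces breach risk the most.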
Leverage Accurate Insights for Quantitative Risk Frameworks and Risk Management Decisions
The quality of your data has an enormous impact on the results you get from quantitative risk frameworks. Industry trends aren’t as accurate as historical data from your business. Use a governance, risk, and compliance monitoring solution to perform reliable risk assessments. Contact us to discover Compyl’s state-of-the-art GRC risk analysis tools today.