Increasing forecasting confidence with sensitivity analysis

Every business has to cope with uncertainty about the future – but a method known as ‘sensitivity analysis’ can boost confidence in our financial forecasts. So what is sensitivity analysis, and how can it help? Carl Seidman explains.
 
As financial planners, we are not fortune-tellers: we do not know, with certainty, what the future holds. Forecasts will almost always be wrong, because our assumptions will be incorrect and will often change.

But when it comes to helping financial leaders plan and make decisions, that uncertainty should not make forecasting less useful – as long as our forecasts offer deep insight into the business drivers and risks involved, and how these may change.

So, as modelers, we should seek to understand not only our sources of data, but also the basis and quality of our assumptions – and how changing those assumptions may affect the conclusions we draw.

In particular, given the uncertainty of forecasting, we need to “sensitize” the variables we use – to determine how strongly they affect the outputs we care about. To do this, we can run a sensitivity analysis.

What is sensitivity analysis?

A sensitivity analysis asks: “How far are outcomes impacted by key variables and business drivers?” At face value, it is a straightforward technique: we vary a given input across a range of values and observe how particular outputs respond.

Take this example: a company wants to gauge how likely it is to hit an earnings metric (for example, EBITDA) 12 months into the future. To do this, it must understand the drivers and inputs which influence this metric.

Now, EBITDA is a function of many changing assumptions – including, but not limited to, sales mix, variable margin, SG&A, fixed costs, and other overheads. To run a sensitivity analysis, we adjust each of these variables in turn and examine the extent to which it influences the output.
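To make this concrete, here is a minimal one-at-a-time sensitivity sketch in Python. The simplified EBITDA model and every figure in it are illustrative assumptions, not data from the article: each hypothetical driver is flexed ±10% on its own while the others stay at base, and the resulting swing in EBITDA is reported.

```python
# One-at-a-time sensitivity sketch (all figures are illustrative assumptions).
BASE = {
    "revenue": 10_000_000,    # annual sales, $
    "variable_margin": 0.35,  # contribution margin on sales
    "sga": 1_200_000,         # SG&A, $
    "fixed_costs": 900_000,   # other fixed overheads, $
}

def ebitda(d):
    """Simplified EBITDA model used only for this sketch."""
    return d["revenue"] * d["variable_margin"] - d["sga"] - d["fixed_costs"]

base_ebitda = ebitda(BASE)

# Flex each driver +/-10% on its own and record the swing in EBITDA.
for driver in BASE:
    swings = []
    for pct in (-0.10, 0.10):
        scenario = dict(BASE, **{driver: BASE[driver] * (1 + pct)})
        swings.append(ebitda(scenario) - base_ebitda)
    print(f"{driver:16s} EBITDA swing: {min(swings):+,.0f} to {max(swings):+,.0f}")
```

The drivers with the widest swings are the ones that deserve the closest attention in the forecast.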

The four sensitivity quadrants

When I advise financial planning professionals on how to do a sensitivity analysis, I encourage them to understand two key elements of a business driver’s sensitivity: its range of variability, and the extent of its impact.

This means there are four possible combinations of what I call “sensitivity quadrants”:

  • Level 1: High variability, high impact
  • Level 2: Low variability, high impact
  • Level 3: High variability, low impact
  • Level 4: Low variability, low impact
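To make the quadrants concrete, here is a minimal classification sketch. The 0–1 scoring scale, the 0.5 threshold, and the driver scores are all assumptions added for illustration; in practice you would score variability and impact from your own data and judgment.

```python
# Classify drivers into the four sensitivity quadrants.
# The scores and threshold below are illustrative assumptions.
def quadrant(variability, impact, threshold=0.5):
    """Scores are on a 0-1 scale; 'high' means at or above the threshold."""
    high_var = variability >= threshold
    high_imp = impact >= threshold
    if high_var and high_imp:
        return "Level 1: high variability, high impact"
    if high_imp:
        return "Level 2: low variability, high impact"
    if high_var:
        return "Level 3: high variability, low impact"
    return "Level 4: low variability, low impact"

drivers = {
    "direct labor hours": (0.8, 0.9),  # hypothetical (variability, impact) scores
    "sales mix": (0.7, 0.6),
    "cell phone expense": (0.1, 0.1),
}
for name, (var, imp) in drivers.items():
    print(f"{name:20s} -> {quadrant(var, imp)}")
```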

Sensitivity analysis: risk management

To help understand how risks are managed, I also encourage professionals to analyze things qualitatively, by asking the following questions:

  1. What can be controlled and to what extent?
  2. What cannot be controlled to any extent?
  3. What may be difficult to control and to what extent?

Scenario: a forecasting analysis in practice

Here’s an example of a forecasting analysis I provided for a former client. Assume, for this example, that we were forecasting cell phone expenses and direct labor costs for a manufacturer of industrial fasteners and bolts.

At the outset, cell phone expenses were classified as Level 4: low variability, low impact. After all, rates should be known in advance (allowing for an advance-purchase discount); and HR should also be able to provide headcount, along with hiring/severance expectations.

Both factors are largely controllable. Cell phone expense is not a direct cost, has little impact on strategic decisions, and is unlikely to change from period to period. So, when forecasting this line item, we can probably project it with a high degree of precision – either based upon historical run rates or as the product of future estimates of headcount and monthly rates.
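For illustration, here is a short sketch of the two projection approaches just mentioned – a historical run rate versus headcount multiplied by a monthly rate. All figures are hypothetical.

```python
# Cell phone expense projection sketch (hypothetical figures).

# 1) Historical run rate: average of recent actuals.
recent_actuals = [18_400, 18_900, 18_650]  # last three months, $
run_rate_forecast = sum(recent_actuals) / len(recent_actuals)

# 2) Driver-based: headcount from HR times a negotiated monthly rate per line.
forecast_headcount = 310
monthly_rate = 60  # $ per employee per month
driver_forecast = forecast_headcount * monthly_rate

print(f"Run-rate forecast:     ${run_rate_forecast:,.0f}/month")
print(f"Driver-based forecast: ${driver_forecast:,.0f}/month")
```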

Because of the low variability and low impact, I considered a sensitivity analysis largely unnecessary.

Direct labor costs, on the other hand, were given a Level 1 classification – high variability, high impact – with a view to improving them to Level 2 (low variability, high impact) through better planning and cost control.

Part of the challenge in this case was that the company was spending 22 cents on direct labor for every dollar of sales – a ratio I considered high relative to the company’s peers and the wider industry, and therefore high-impact.

Labor rates are generally known in advance, but the volume of hours – time spent on the manufacturing process – was highly variable, because the number and size of projects for future months was constantly changing.

So it was vital to conduct a sensitivity analysis on these costs. The analysis would illustrate at what levels of project volume, and over what period of time, we were likely to see corresponding labor shortages and overtime.
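A simplified version of that analysis might look like the sketch below: it steps through a range of monthly project counts and flags where required labor hours exceed straight-time capacity, which is where shortages and overtime begin. The capacity, hours-per-project, and rate figures are hypothetical, not the client’s data.

```python
# Labor shortage / overtime sensitivity sketch (hypothetical figures).
AVAILABLE_HOURS = 12_000  # straight-time hours available per month
HOURS_PER_PROJECT = 450   # average direct labor hours per project
BASE_RATE = 28.0          # $ per straight-time hour
OVERTIME_PREMIUM = 0.5    # overtime paid at 1.5x the base rate

print(f"{'projects':>8} {'required':>9} {'shortage':>9} {'OT cost':>10}")
for projects in range(20, 36):
    required = projects * HOURS_PER_PROJECT
    shortage = max(0, required - AVAILABLE_HOURS)
    overtime_cost = shortage * BASE_RATE * (1 + OVERTIME_PREMIUM)
    print(f"{projects:8d} {required:9,d} {shortage:9,d} {overtime_cost:10,.0f}")
```

In this toy version, overtime begins at around 27 projects a month; the point of the exercise is to locate that threshold and the cost curve beyond it.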

Conducting a sensitivity analysis is the easy part; interpreting its findings and acting upon them is harder. What this sensitivity analysis showed was that certain, known levels of manufacturing activity made labor inefficiencies worse.

One remedy we identified was to allocate labor hours across projects in similar stages of completion. We soon saw that the wide range of unexpected hours – which had led to increased overtime spend and a need for additional labor – narrowed to more manageable levels.

While we could have come to this conclusion without a sensitivity analysis, the tool gave us the insight which helped put the decision into context.

How to do a sensitivity analysis when things get complex

The most common tools for sensitivity analysis are easy to use and can be widely applied. For forecast models that are simple – or where an analyst seeks to analyze a small number of variables – these tools are excellent. However, financial modelers accustomed to working in traditional spreadsheet software may oversimplify their sensitivity analyses to fit the tools those programs conveniently provide.

Because these tools are rudimentary, they quickly become cumbersome when more than two input values vary at the same time – as is the case in most business scenarios. Forecasts are often complex, considering a wide range of simultaneously changing variables. In fact, in my advisory work, financial leaders often deflate basic sensitivity analyses with one question: “What’s the likelihood of that outcome?”

To answer this question, rather than rely on basic tools for advanced forecasting, companies should consider using “probabilistic” sensitivity analysis. This handles uncertainty across all variables, simultaneously.

With this kind of analysis, the extent of the variability must be defined using either known or assumed distributions. Known distributions may be based upon historical data or behavior; assumed distributions may be based upon expert best guesses or computer-generated estimates.

Once base parameters are defined, the analysis will show not only the range of possible outcomes when key drivers change, but also the likelihood of those outcomes – addressing one of the core questions decision-makers are likely to ask.
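Here is a minimal probabilistic sketch using Monte Carlo simulation with Python’s standard library. The distributions, the EBITDA target, and all figures are illustrative assumptions: one ‘known’ distribution is drawn from an assumed historical mean and volatility, the others use best-guess ranges, and the simulation reports the likelihood of hitting the target alongside the spread of outcomes.

```python
# Probabilistic sensitivity sketch via Monte Carlo (illustrative assumptions only).
import random

random.seed(42)
TRIALS = 10_000
EBITDA_TARGET = 1_400_000  # hypothetical 12-month EBITDA target, $

outcomes = []
hits = 0
for _ in range(TRIALS):
    # "Known" distribution: revenue modelled from historical mean and volatility.
    revenue = random.gauss(10_000_000, 800_000)
    # "Assumed" distributions: best-guess ranges where history is thin.
    variable_margin = random.triangular(0.30, 0.40, 0.35)  # low, high, mode
    sga = random.uniform(1_100_000, 1_350_000)
    fixed_costs = random.gauss(900_000, 50_000)

    result = revenue * variable_margin - sga - fixed_costs
    outcomes.append(result)
    hits += result >= EBITDA_TARGET

outcomes.sort()
print(f"P(EBITDA >= target): {hits / TRIALS:.1%}")
print(f"P10 / P50 / P90: {outcomes[1_000]:,.0f} / {outcomes[5_000]:,.0f} / {outcomes[9_000]:,.0f}")
```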

In the direct labor case, for example, we could determine that we were likely to experience a labor shortage in about 25%–30% of manufacturing runs, which meant delays in project completion and an erosion of gross margin. By knowing the likelihood of issues like this, we can determine the extent of the impact and how we wish to remedy it.

Sensitivity analysis: risk of losing sight of key variables

Despite our analytical abilities and the capabilities of modern software, analysts should be careful not to overwhelm their planning models or analyses by sensitizing too many variables; doing so makes it easy to lose sight of the most important elements. In the EBITDA example above, sensitizing so many drivers at once could lead us to inadvertently downplay the impact of one element (such as sales mix) while unconsciously overweighting others.

Tools such as Monte Carlo simulation can help an analyst determine which variables have the greatest impact on outcomes, allowing them to pinpoint which variables should be sensitized. Instead of merely reporting what the planning models or analyses say, financial professionals can consider sensitivities, understand them thoroughly, and make better recommendations.
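One simple way to produce that ranking, sketched below under assumed distributions, is to correlate each sampled input with the simulated EBITDA and sort drivers by the strength of that correlation – a rough stand-in for the tornado-style ranking that commercial tools produce.

```python
# Rank drivers by how strongly they move simulated EBITDA.
# Distributions are illustrative assumptions; statistics.correlation needs Python 3.10+.
import random
import statistics

random.seed(7)
TRIALS = 5_000
samples = {"revenue": [], "variable_margin": [], "sga": []}
ebitdas = []

for _ in range(TRIALS):
    revenue = random.gauss(10_000_000, 800_000)
    margin = random.triangular(0.30, 0.40, 0.35)
    sga = random.uniform(1_100_000, 1_350_000)
    samples["revenue"].append(revenue)
    samples["variable_margin"].append(margin)
    samples["sga"].append(sga)
    ebitdas.append(revenue * margin - sga - 900_000)  # fixed costs held constant

# Sort drivers by the absolute correlation of their samples with EBITDA.
ranking = sorted(
    ((name, statistics.correlation(values, ebitdas)) for name, values in samples.items()),
    key=lambda pair: abs(pair[1]),
    reverse=True,
)
for name, corr in ranking:
    print(f"{name:16s} correlation with EBITDA: {corr:+.2f}")
```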

And if you know which variables can be controlled and to what extent, you can make recommendations which help the organization stay on track toward the outcomes it wants.

 

Remember, it’s insight that counts

In summary, you could perform sensitivity analysis on virtually any financial forecast and on any line item. When running such an analysis, be prudent in selecting the inputs that have the greatest amount of variability and/or impact, to determine which deserve the most attention.

Finally, recognize that many analysts myopically view sensitivities merely as a mechanical exercise in displaying how a range of inputs impacts outputs. Yet the greatest value in sensitivity analysis is understanding what the results mean – and how to influence performance.

At Unit4, we recognize that financial planning software should help you act on insights to achieve results. Read more about Unit4’s intelligent financial planning and analysis tools.
