The Excel paradox: a flawed forecasting tool we can’t live without

Forecasting · ~7 minute read

Two things every planner knows

Every workforce planner with more than a year of experience knows two things at the same time. The first is that Excel is a poor place to run a serious forecast — brittle, error-prone, hard to audit, and limited in the statistical methods it can support. The second is that they cannot do their job without it. The result is a quiet, slightly guilty professional dependency that almost nobody talks about openly. This article argues that the contradiction is real but not embarrassing — that Excel is genuinely flawed as a forecasting platform and genuinely irreplaceable as a forecasting tool, and that the difference between an immature planning function and a mature one is not whether they use Excel but how.

Why Excel really is flawed for forecasting

The case against Excel as a serious forecasting environment is well documented. Spreadsheets have no real version control: the same workbook on two analysts’ machines is two different artefacts within five minutes, and reconciling them is a manual process with no audit trail. Formulas can be silently overwritten with hard-coded values, and the only person who notices is the next planner who tries to recalculate the model six months later and cannot work out why it now produces a different number.

Errors are easy to make and hard to catch. The famous cases — JP Morgan’s London Whale VaR model, with its copy-paste errors, the Reinhart-Rogoff economics paper, the Public Health England test-result truncation — are extreme but not unusual. Studies of operational spreadsheets repeatedly find error rates of one to five percent on non-trivial cells, which means that any large workbook contains arithmetic errors as a matter of statistical certainty. Workforce planning models routinely span hundreds of thousands of cells across volume, AHT, shrinkage, schedule, and headcount calculations; a planner who is confident that no errors are present is, almost always, confidently wrong.

The statistical toolkit is also limited. Excel can run a moving average and basic exponential smoothing, but proper Holt-Winters, ARIMA, regression with multiple regressors, or anything resembling a modern machine-learning approach requires either painful manual implementation, third-party add-ins, or an escape hatch into Python or R. Multi-channel and multi-skill operations, where the forecast is really a system of interacting forecasts, push past the limits of what a spreadsheet can represent long before the operations themselves become genuinely complex.

Reproducibility is the deepest problem. A serious forecast should be defensible: another analyst, given the same inputs, should produce the same output. In a spreadsheet, the same inputs feed through different versions of the same workbook, with different overrides, different assumption sheets, and different undocumented manual adjustments, to produce subtly different numbers. The forecast becomes attached to the analyst rather than to the methodology, and the operation becomes vulnerable the day that analyst leaves.

Why we keep using it anyway

None of this stops Excel being the most important tool a workforce planner owns, and the reason is not laziness. Excel does several things uniquely well, and any serious replacement has to either match them or accept the cost of losing them.

First, it is universal. Every stakeholder in the conversation already has it installed and already knows how to read a sheet, sort a column, and follow a formula. The cost of communicating a number through a spreadsheet is, for most audiences, lower than communicating it through any dedicated platform. When finance asks “why has next month’s headcount gone up by twelve?”, the answer almost always lands as a spreadsheet because that is the medium of the conversation.

Second, it is transparent in a way that proprietary forecasting tools usually are not. A good Excel model lets you see every formula, every assumption, and every derivation. The black-box nature of many WFM platforms — including expensive ones — is a genuine problem when the planner needs to defend a number, debug an unexpected output, or explain an edge case to a sceptical operations manager. Excel’s transparency is one of the reasons even mature operations re-export from their WFM platform into Excel before any serious finance conversation.

Third, it is fast. Modelling a one-off scenario, a what-if, an adjustment for a known event, or a quick sense-check of a vendor’s number is a five-minute job in Excel and a half-day project in most platforms. The flexibility that creates the brittleness is also the flexibility that creates the speed.

Fourth, it is trusted. People believe what they can read, and they read Excel. A platform that produces the same number through a more rigorous methodology will, in many organisations, struggle for credibility against a spreadsheet the operations manager has been using for ten years. Trust is earned slowly and lost quickly, and the trust attached to the spreadsheet is real even when the underlying methodology would not stand serious scrutiny.

The actual problem isn’t Excel

The mistake is not using Excel; it is using Excel as the system of record for forecasting. A system of record is the canonical store of a number, the place that authoritatively says “this is the forecast.” When that role sits inside a spreadsheet, all the brittleness, error-proneness, and reproducibility problems described above become production risks. When it sits inside a dedicated forecasting platform — whether a commercial WFM tool, an in-house Python or R service, or a database-backed model — the spreadsheet is freed up to be what it is genuinely good at: the analyst’s workbench, the what-if scratchpad, and the communication layer.

This is the distinction that separates immature planning functions from mature ones. The immature function has Excel everywhere, doing everything: producing the baseline, holding the assumptions, carrying the overrides, and presenting the result. The mature function has a clearer separation. The baseline lives in a tool that is built for it, version-controlled, audited, and reproducible. Excel is used downstream for analysis, sense-checking, and stakeholder communication. The same number arrives in finance’s inbox, but the journey is different, and the journey is what matters when something goes wrong.

What right use of Excel looks like

In a mature operation, Excel earns its keep in three specific roles. The first is what-if analysis. Once the production forecast exists in a system of record, an analyst can pull the inputs into Excel, model a scenario the platform does not directly support — a different shrinkage assumption, a one-off campaign, a sensitivity to AHT — and bring the answer back into the conversation. The Excel artefact is disposable; the production model is not.
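A what-if of this kind is often just a few lines. The sketch below shows a shrinkage sensitivity using a simple workload formula (hours of work over paid hours, not Erlang C); every figure is illustrative, and a real version would pull the volume and AHT from the system of record.

```python
# Throwaway what-if: how sensitive is required FTE to the shrinkage
# assumption? Simple workload formula, illustrative figures only.
WEEKLY_CONTACTS = 10_000
AHT_SECONDS = 360
PAID_HOURS_PER_WEEK = 37.5

# Total handling workload in hours for the week.
workload_hours = WEEKLY_CONTACTS * AHT_SECONDS / 3600

for shrinkage in (0.25, 0.30, 0.35):
    fte = workload_hours / (PAID_HOURS_PER_WEEK * (1 - shrinkage))
    print(f"shrinkage {shrinkage:.0%}: {fte:.1f} FTE")
```

Whether this lives in a scratch spreadsheet or a scratch script matters less than the fact that it is disposable: the answer goes back into the conversation, and the artefact does not become the forecast.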

The second is reconciliation. The number a WFM platform produces is rarely the same as the number finance is expecting, because the two are usually built off slightly different assumptions, shrinkage definitions, or contractual scopes. Reconciling them is a spreadsheet job, and the spreadsheet that reconciles them is itself a useful artefact — saved, named, dated, and kept in a shared location with naming conventions that make the lineage clear.
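The same reconciliation can equally be sketched in code, which makes the structure of the job explicit: join the two views on a common key, take the difference, and flag what needs a conversation. Queue names, column names, and the one-FTE threshold below are all assumptions for illustration.

```python
# Sketch of a forecast-vs-finance reconciliation, done in pandas
# rather than a spreadsheet. All names and figures are illustrative.
import pandas as pd

wfm = pd.DataFrame({
    "queue": ["sales", "service", "retention"],
    "wfm_fte": [42.0, 118.5, 23.0],
})
finance = pd.DataFrame({
    "queue": ["sales", "service", "retention"],
    "finance_fte": [40.0, 121.0, 23.0],
})

# Outer join so a queue present on only one side still surfaces.
recon = wfm.merge(finance, on="queue", how="outer")
recon["delta"] = recon["wfm_fte"] - recon["finance_fte"]
# Flag anything worth a conversation, e.g. more than one FTE apart.
recon["investigate"] = recon["delta"].abs() > 1.0
print(recon)
```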

The third is presentation. The board, the senior management team, and most of finance want to see numbers in a familiar format with a structure they can read. The system of record might produce the data, but the spreadsheet (or a slide derived from one) is what gets discussed. Investing time in a clean, well-labelled presentation Excel is not waste; it is the medium through which the work becomes visible.

A practical migration path

For planning functions that recognise themselves in the “Excel everywhere” description, the path forward is gradual and unglamorous. Identify the single most important forecast — usually the rolling 12-week tactical forecast for the largest queue — and move that one out of Excel into something more disciplined first. The destination does not need to be a six-figure platform: a Python script under version control, a SQL-backed model, or even a dedicated Excel artefact with proper change control and named ranges is a meaningful improvement. The goal is to remove the “analyst-on-a-laptop” risk from the most consequential numbers, then expand from there.

The aim is never to ban Excel. The planners who try usually fail and lose credibility on the way. The aim is to put Excel in its proper place — as a brilliantly capable and irreplaceable analytical and communication tool — rather than letting it carry roles it was never designed for.

Conclusion

The Excel paradox is genuine. It is the wrong tool for production forecasting and the right tool for almost everything else a planner needs to do around a forecast. Holding both ideas in mind at the same time is the start of a more mature relationship with the spreadsheet — and a more defensible forecasting practice. The planners and leaders who acknowledge this openly tend to invest in the right places: a serious system of record where it matters, and a clean, disciplined Excel practice where it adds value. The ones who pretend Excel is the problem, or pretend it is the solution, end up with the worst of both.

If you build a lot of forecasts in Excel, the simple volume forecaster on this site uses the same Holt-Winters method as a sense-check — useful for benchmarking your spreadsheet output against a clean reference implementation.
