Excel is often the right tool, but for the wrong job

Excel’s been with us a long time, and let’s face it, we all love Excel! But many of us rely on Excel for far more than it was ever intended to do. That’s understandable in an age where tools come and go.

What is the issue with Excel?

Excel is a fantastic and reliable tool backed up by an almost ubiquitous level of user familiarity. Used well, it’s an extremely powerful and accessible modelling tool.

But Excel is often used as a corporate accounting or data management tool, and that’s when problems start to occur. As companies’ data grows to incorporate portfolio analysis, profit and loss, and scenario planning, the limitations of Excel begin to show. Why? Because fundamentally, Excel remains a two-dimensional personal productivity accounting tool.

Even if your data is accurate (and often it isn’t, thanks to multiple entries and sources), the final results may still contain errors, because multiple people are entering data in multiple places and then calculating, importing, and exporting it in their own ways.

Data issues – it’s not a database

One of the biggest misconceptions or misuses of Excel is as a database. It’s not. And there are several ways this causes problems.

Excel is not tailored to your organization or the job you’re using it for, and it cannot comply with current security standards. It’s a generic tool with built-in assumptions about data types. For example, text strings like SEPT2 and MARCH1 will, when loaded into Excel, default to dates in your spreadsheet.

I’m using this specific example because those terms happen to be the scientific codes for a couple of genes. Incredibly, a 2016 study of publicly available spreadsheets in gene research found that around 20% of the papers that used Excel contained gene-name errors of exactly this kind – scary stuff!
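To make the point concrete, here is a minimal sketch (in Python with pandas; the file and column names are hypothetical) of what it looks like to declare your data types yourself at import time rather than letting the tool guess, so identifiers such as SEPT2 and MARCH1 stay as plain text:

```python
import pandas as pd

# Hypothetical CSV of gene expression results - the names are illustrative only.
df = pd.read_csv(
    "expression_results.csv",
    dtype={"gene_symbol": str, "sample_id": str},  # identifiers are forced to text
)

print(df.dtypes)                 # gene_symbol and sample_id stay as text
print(df["gene_symbol"].head())  # e.g. SEPT2, MARCH1 - exactly as exported
```

The point is not the particular library; it is that the assumptions about data types are written down where they can be reviewed, rather than applied silently.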

So how can you rely on results from a system that makes assumptions about your data that you never set? And if they go unspotted, these assumptions can infect far more than the single cell where they appear, because other cells, sheets, and documents depend on them.

The difficulty is that you have no common or agreed standards for process controls and validations. More importantly, you can’t create them. While you can make some changes to each file, Excel is not yours to adapt at the software level. This can force remarkably complex, expensive, and time-consuming workarounds just to build the processes that were supposed to save you time and money. In fact, a report published in August 2020 noted that the scientific community is now renaming human genes (SEPT1 becoming SEPTIN1, for example) to work around the Excel issue! Surely software should be there to support us, not the other way around.
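As an illustration of the kind of automated validation Excel itself won’t give you, the sketch below (again in Python with pandas; the workbook, sheet, and column names are hypothetical) scans a gene-symbol column and flags any entries that have already been turned into dates, so the damage is at least caught before it spreads to other sheets and reports:

```python
import datetime
import pandas as pd

# Hypothetical workbook, sheet, and column names, purely for illustration.
df = pd.read_excel("gene_study.xlsx", sheet_name="results")

def looks_like_a_date(value) -> bool:
    """True if a supposed gene symbol has been coerced into a date."""
    if isinstance(value, (datetime.date, datetime.datetime)):
        return True
    # Excel sometimes leaves the damage behind as text such as "2-Sep" or "Sep-21".
    text = str(value)
    for fmt in ("%d-%b", "%b-%y", "%d/%m/%Y", "%Y-%m-%d"):
        try:
            datetime.datetime.strptime(text, fmt)
            return True
        except ValueError:
            continue
    return False

suspect = df[df["gene_symbol"].map(looks_like_a_date)]
if not suspect.empty:
    print(f"{len(suspect)} gene symbols look like they were converted to dates:")
    print(suspect["gene_symbol"].tolist())
```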

Inconsistent version control and missing data

As we discussed in our recent eBook, “Excel spreadsheets were not designed for multiple users or for making numerous updates and changes. Doing this here tends to cause confusion, frustration, and, of more concern, distrust in the numbers provided.”

There are other well-publicized examples highlighting the limitations of Excel, and if you want to learn more, a quick internet search will provide plenty of reading material. The two I refer to most often are the 65,536-row limit in older versions of Excel (how many companies are still using those older versions? More than we realize, I suspect, including a number of government departments) and the 2001 Enron collapse, where investigators found severe issues with more than 9,000 spreadsheets containing over 20.2 million formulas.

The expert conundrum

Even when Excel works well, that success brings its own problems, often in the form of expert knowledge. Getting the most out of Excel takes specialist skills to build and understand the kinds of formulas many companies rely on these days. It has almost become an industry in itself.

In fact, I know someone who studied chemistry at university and now applies the advanced mathematics learned there to build these formulas in the financial services industry. They develop financial derivative models in Excel using obscure, highly advanced mathematics that only other clever, appropriately trained, and experienced people can understand. And all of those people are doing the same thing in highly paid roles: building ever more complex derivative models in an un-auditable framework, surrounded by flawed risk-control systems (also built in Excel).

This is great for them but can create several problems for you.

  1. It’s costly. This kind of expertise is not readily available, and those who have it can charge a premium.
  2. It’s time-consuming. Solutions built this way take time to develop and test, which hurts your agility, slows your problem-solving, and keeps you on the back foot.
  3. It’s impractical. This point, and all the others above, mean that your people can’t simply use data as they need to. More worryingly still, they can’t be confident in the data they are using or in the results it produces.

And ultimately, it is this lack of confidence that makes using Excel for the wrong job such a concern.

The power of data you can trust and use

Data’s power comes from trusting its accuracy and its applicability to the task at hand. People need to be able to access and use data instantly, and to know that the data and the findings it delivers are accurate and fit for purpose for the job they are doing.

Unit4 has been helping organizations like yours access and use data to its full potential for over 40 years. And if you’d like to see just how much your people could achieve with your data, take a look at our People Experience Suite today.

Bob Elliot

ACMA/CGMA, Senior Solution Manager - FP&A, Unit4

Bob is a qualified finance professional (CGMA/ACMA) with extensive solution-architect and client-engagement expertise, and cross-sector experience in Enterprise/Corporate Performance Management (EPM/CPM) and BI.