So You Got Some Statistics From Your Data…So What?

When you begin to extract some metrics and statistics from your analytics efforts, it’s easy to attach some arbitrary meaning to them before you have the full picture in front of you. Statistics would tell you that only one out of seven dwarves is happy, but why is that?

Ok, so that’s a humorous example of a statistic that could lead to assumptions being made about the general well-being of a particular group of dwarves, but the point is still a valid one. More often than not, you will need additional layers of information to make full sense of the numbers your data is showing you.

Using raw figures to identify the scale behind the statistics is one way to begin building a bigger picture. Using our dwarves example from above, we know that when they were asked if they were happy, only one in seven answered “Yes!”, or 1/7. But we know this was a small-scale (pun intended) survey, so what happens when we begin to look at larger data sets?

25% looks very different depending on the scale. It could be 1/4, 25/100, or 150/600, and so on, leading us to the conclusion that statistics on their own mean nothing and can often result in people asking “So what?”.
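To make that concrete, here is a minimal sketch in Python (not tied to this article’s data, and using a standard normal approximation for the confidence interval) showing why the denominator matters: the same 25% carries very different uncertainty at different scales.

```python
import math

def proportion_interval(successes, total, z=1.96):
    """Rough 95% confidence interval for a proportion
    (normal/Wald approximation -- crude for small samples)."""
    p = successes / total
    se = math.sqrt(p * (1 - p) / total)
    return p, max(0.0, p - z * se), min(1.0, p + z * se)

# The same 25%, at three different scales:
for successes, total in [(1, 4), (25, 100), (150, 600)]:
    p, low, high = proportion_interval(successes, total)
    print(f"{successes}/{total}: {p:.0%} (roughly {low:.0%} to {high:.0%})")
```

The 1/4 figure is compatible with almost any underlying rate, while 150/600 pins the picture down far more tightly, even though both headline as “25%”.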

Leaving the little people to one side for the moment, and focusing on a more real world scenario, let’s say that you discover 25% (125/500) of your sales team hit their targets last month…So what? Why was that? What’s significant about it? You could rightly state that 75% of them missed their targets too, but the stats themselves don’t give you the full picture either way.

To understand what really happened, you’ll need to add layers of information on top, gradually building up awareness until the numbers begin to reveal the true story. It would be easy to assume that 75% of your sales team are just no good at their jobs, and that you should go on an immediate recruitment drive to fix the problem, so what other layers can we add to gain greater insight?

We could look at the number of salespeople who have received full training, and see if there’s a link between that and performance. Maybe a certain number of the 75% are new to the job, and are still building up their client bases. Perhaps there was a change of management that unsettled the team. Did some people take their foot off the gas following a bonus payout? Was there an external impact? For example, did a regional warehouse flood? Was a ship’s consignment from abroad delayed? Or did a competitor launch something new that blew your product out of the water?
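As a rough illustration of what “adding layers” might look like in practice, here is a small Python sketch using pandas. The data frames, column names (rep, hit_target, fully_trained, months_in_role) and values are entirely made up for the example; the point is simply that joining contextual information onto the raw hit/miss figures lets you ask those “why” questions of the data directly.

```python
import pandas as pd

# Hypothetical data -- your own CRM and HR exports will differ.
sales = pd.DataFrame({
    "rep":        ["A", "B", "C", "D", "E", "F"],
    "hit_target": [True, False, False, True, False, False],
})
hr = pd.DataFrame({
    "rep":            ["A", "B", "C", "D", "E", "F"],
    "fully_trained":  [True, False, False, True, True, False],
    "months_in_role": [24, 3, 5, 36, 18, 2],
})

# Layer the HR context onto the raw hit/miss figures...
combined = sales.merge(hr, on="rep")

# ...then ask whether training or tenure lines up with performance.
print(combined.groupby("fully_trained")["hit_target"].mean())
print(combined.groupby(combined["months_in_role"] < 6)["hit_target"].mean())
```

Once the context is joined on, each extra layer is just another column you can group or filter by, which is far easier to reason about than the headline 25% on its own.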

As you can see, there can be a whole host of reasons why something happened, and without digging deeper into that why, you’re only ever going to be basing your decisions on a very small part of the puzzle. The data at your disposal are pieces of the puzzle, and by having a clear understanding of the questions you want answers to, you can add or subtract these pieces to give you the bigger picture.

The ultimate goal of data analysis is to learn something. That something is either a positive thing that happened, and something you definitely want to repeat, or it’s a negative thing that you want to avoid at all costs. Becoming curious about what your data can tell you will naturally lead you to better define the questions you want answered.

It’s worth mentioning again that the human elements of your business can often add vital layers of information that simply do not exist in any of your data sets. If you ask people on the ground what’s happening, you may just get some insights that no amount of analysis could ever discover!

The next time somebody throws some arbitrary stats in your direction, just answer them with “So what?”.

