When delivering actionable analytics within an organisation I often spend some time removing commentary from management reports. I have found that the regular reporting packs produced for senior management are often drowning in long commentaries that attempt to explain the data in the packs.
This immediately gets me asking why most of the commentary exists.
Why can’t the data do the talking? What is it about the way the data is organised and presented that makes it necessary to add all of this text?
In my experience there are a number of reasons for this.
The first is poor design of the report. Not enough thought has been given to either the purpose of the report (i.e. what are the objectives? what decisions do we want taken?) or the audience (i.e. what are the information requirements of the decision makers? how do they prefer information to be presented?).
A second reason may be limitations in the tools being used to produce the report. Even the best business intelligence tools have their functional limitations, and, as the first rule of report design states:
“Within any given set of business analytic requirements there will be at least one that is impossible to deliver using the available toolset.”
If you are unfamiliar with these rules, then you should know that this immediately leads to the second rule of report design:
“The amount of insight delivered by a report is halved by each undelivered business analytic requirement.”
A third reason is less benign than the first two. I use the word ‘spin’ to describe the collection of responses that the business ‘reporter’ makes when they fear that the ‘reportee’ will:
- respond negatively to the data presented; or
- not sufficiently appreciate the ‘brilliance’ that was required to deliver the result.
In both cases the intent of the manager reporting the figures is to ‘spin’ the message given to the report reader, and nine times out of ten that spin is intended to mislead.
Whatever the cause, extensive report-pack commentary means that, at best, the report data fails to provide the insight the business users require. At worst, it misleads decision makers into making bad decisions.
I like to think that my job is to create an environment where good decisions can be made because they are based on unbiased information that fairly measures performance. In practical terms, this translates into reducing commentary by improving the quality of the automatically produced data. That sounds simple, but the simplicity is deceptive: it is very challenging to achieve in practice.
Here is a little video I have made that helps explain how identical words can be spun to justify diametrically opposed management messages. It’s a bit of fun (adapted from a political campaign ad I saw some years ago), which is why I have found it effective. Enjoy.
Download Spinning_Decisions.m4v