Analytics: Thought Leadership's Magic Wand

The August 2015 Gartner Hype Cycle for Emerging Technologies placed “advanced analytics with self-service delivery” at the very crest of the “Peak of Inflated Expectations.”

Of course, you know what comes next: the dreaded “Trough of Disillusionment.”

It’s been a long slog to the top for analytics, let alone the advanced, self-service kind (which is sort of DIY analytics, meaning you don’t need a data scientist to do it for you; all you need are the right tools, such as dashboards designed for line-of-business executives. Sure.)

One could say that analytics began worming its way into the public ear and consciousness with Tom Davenport’s 2006 HBR article, “Competing on Analytics,” or his 2007 book of the same name. Or, one could argue that analytics – which is really just collecting a lot of data and crunching it with software to reveal hidden patterns and surprising connections within the data that can guide all sorts of smart decisions – has been around a lot longer, maybe since the first computers. But whenever it started, it’s hard to find a presumptive thought leader today who hasn’t recommended its use to figure out how to improve business processes, sell more stuff to consumers, or create new products – filling the hitherto unknown needs that all those crunched numbers representing buying patterns, trends, cycles, wishes, hopes, and dreams have revealed.

Unfortunately, these recommendations usually boil down to “use analytics” and, as easy as A, B, C, ye shall become more successful than Amazon, Buffett, and Croesus.

Use analytics. That’s really not much help, is it?

If it were that simple, wouldn’t everyone do it? And if everyone did it, why would one business ever become more successful than another?

They wouldn’t, because it’s not that easy.

But many thought leaders fail to mention the effort and costs that go into using analytics to improve business – the data scientists who need to be hired; the cultural change required to get people to trust the numbers and not their experience; the computing power and storage space necessary; the expensive software needed to make sense of the data and make it usable. To be fair, sometimes these thought leaders note that by deploying or implementing or leveraging analytics (never “using,” because that sounds pedestrian), some business has improved profitability by X percent, cut costs by Y percent, or increased market share by Z percent. But thought leaders almost never describe the how of analytics projects. Who decided what data to go after? How was it collected? What patterns were revealed? How were they interpreted? Who interpreted them, and how was that understanding translated into the actions that produced those sparkling numbers? Describing all that would be helpful. Describing that would be thought leadership.
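
For the sake of argument, here is what even a thumbnail sketch of that “how” might look like. Everything below is invented for illustration: a toy table of purchase records, made-up column names, and an “analysis” that is nothing fancier than grouping and averaging. The point is only that the mechanics can be described in a few lines a business reader could follow.

    # Illustrative sketch only: invented purchase records and column names.
    import pandas as pd

    # A toy extract of transactions (in practice, pulled from a sales or CRM system).
    purchases = pd.DataFrame({
        "customer_segment": ["new", "returning", "returning", "new", "returning", "new"],
        "channel":          ["web", "store", "web", "web", "web", "store"],
        "order_value":      [42.0, 110.0, 95.0, 38.0, 120.0, 55.0],
    })

    # The "hidden pattern": average order value by segment and channel,
    # sorted largest first, so the combination worth chasing sits at the top.
    pattern = (
        purchases
        .groupby(["customer_segment", "channel"])["order_value"]
        .mean()
        .sort_values(ascending=False)
    )

    print(pattern)

Who chose those columns, why order value rather than margin, and what the business actually did with the top row of that output: that is the part worth writing about.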

A typical paragraph I recently ran across (disguised a bit to protect the not-so-innocent thought leader) ran: “They deployed analytics to maximize processes. Revenue went up 20%. Analytics is changing their business model.”

One could just as easily (and usefully) write: “They deployed a magic wand to maximize processes. Revenue went up 20%. The magic wand is changing their business model.”

That’s not thought leadership. That’s Harry Potter.

True, analytics is a complex science, and thought leaders may have to work hard to explain analytics projects in plain business terms. But if you can’t describe the project approach in terms a line-of-business executive can understand, you’re cribbing from the Hogwarts syllabus.

By itself, the word “analytics” means very little, and it doesn’t tell the readers of your putative thought leadership enough about why they should consider the tools or strategy being explored. You can’t expect business people to take your word on faith and buy analytics software without knowing how to use it.

Because if they do, you know what comes next . . . the Trough of Disillusionment.

I hope thought leaders put down their magic wands before the businesses that listen to them find themselves in their own troughs, not only disillusioned, but staring bleakly at an expensive dashboard flashing red for loss. 

Comments

The 'How' of Analytics can be described by Research Managers, who actually design the approach, the source and scope of the data, the methodology to collect and analyze it, and the interpretations. Data Scientists would be too technical, while a Research Manager has a functional and, to some extent, a technical orientation. As such, they should be able to fill this gap in Thought Leadership.

Submitted by Tim Parker on

Jitesh, I agree. I think, though, the problem is often that before publishing their prescriptions, many authors don't spend time with research managers to figure out an approach that a reader might find useful.
