How Helpful Is Your Business Information "Dashboard?"

Every executive I know wants easier ways to get information about what is happening in their business. Most big software applications include some kind of “Executive Dashboard” in order to meet this need.

If only the problem were solvable with software!

It’s not.

A discussion of this topic that is well worth reading was posted on April 30 by Avinash Kaushik, a blogger who works for Google: The “Action Dashboard” (An Alternative to Crappy Dashboards)

Avinash’s point is that the analyst responsible for producing the data can make things better by presenting it in a context that is helpful for executives. He took a human-psychology perspective in his recommendations for improving a dashboard’s effectiveness:

  • Only report the three or five (at most!) metrics that define success for the whole business.
  • For each metric, include:
         1) the data in trend format (i.e., charted over time),
         2) interpretation of the trends and the context,
         3) actions/steps to take, based on investigation, and
         4) impact on the company/customer.
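The four-part structure above can be sketched as a simple data record. This is a hypothetical illustration of the idea, not Avinash's implementation; the field names and example values are my own:

```python
from dataclasses import dataclass

@dataclass
class DashboardMetric:
    """One of the (at most!) three to five metrics that define success."""
    name: str
    trend: list[tuple[str, float]]  # (period, value) pairs, to be charted over time
    interpretation: str             # what the trend means, in business context
    actions: list[str]              # steps to take, based on investigation
    impact: str                     # effect on the company/customer

# Hypothetical example entry
pipeline_metric = DashboardMetric(
    name="Qualified opportunities created",
    trend=[("2008-Q1", 42.0), ("2008-Q2", 57.0)],
    interpretation="Up sharply after the new webinar series launched.",
    actions=["Expand webinar topics", "Retire the lowest-performing trade show"],
    impact="Keeps each rep's pipeline at roughly 3x quota coverage.",
)
```

The point of the structure is that the trend data never travels alone: interpretation, actions, and impact are part of the record, not an afterthought.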

These are very good recommendations, because they stretch analysts, forcing them to learn a bit more about the business so they can present information in an appropriate context. They also make the information more useful for executives, because it is tied to issues they care about.

Additional Requirement for Good Executive Dashboard Information
If you’ve been following my writing for a while, you might guess where I will go with this: Dashboards work far better for production of goods than they do for production of sales revenue.

There is a reason for that problem, and it is entirely fixable.

A dashboard that tracks data around the flow of inventory production has a high likelihood of providing meaningful data, because people can trace everything to concrete observable events (inventory transactions). Classify the events correctly and add up the right numbers and you produce important facts.

A dashboard that tracks the flow of leads, opportunities, and deals has a much lower likelihood of providing meaningful data. That’s because people have not developed the means of tracing everything to concrete observable events. Executives need to know the returns they are getting on their marketing spend. They need to know if salespeople have enough potential deals in their pipelines to make the numbers.

This kind of information is very high level. It is the kind of information executives usually can’t get, so they are used to relying on gut feeling rather than on facts. But it is information that could be generated from field data; that’s what dashboard technology is supposed to do.

Suppose the dashboard includes some measure of lead generation, since that is obviously important to ensure salespeople’s pipeline is full.

If some of the “leads” are defined as “business cards collected at a trade show,” for example, then the dashboard will present confusing information. That’s because a “package deal” (a false assumption) is getting in the way. The dashboard will show that high (or low) production of “leads” has virtually no relationship to sales revenue.

What’s the false assumption?

The idea that “business cards collected at a trade show” has something to do with putting opportunities in salespeople’s pipelines. That assumption might possibly be true (although probably not). However, until you have tested and proved it and the sales team accepts it, you don’t KNOW that it is true. Hence, the false assumption.
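Testing such an assumption is mostly a matter of counting: of the leads from each source, how many ever became real opportunities? A minimal sketch, with entirely hypothetical lead records:

```python
# Hypothetical lead records: (source, became_opportunity)
leads = [
    ("trade_show", False), ("trade_show", False), ("trade_show", True),
    ("referral", True), ("referral", True), ("referral", False),
]

def conversion_rate(records: list[tuple[str, bool]], source: str) -> float:
    """Fraction of leads from `source` that became opportunities."""
    matched = [became for src, became in records if src == source]
    return sum(matched) / len(matched) if matched else 0.0

print(f"trade_show: {conversion_rate(leads, 'trade_show'):.0%}")
print(f"referral:   {conversion_rate(leads, 'referral'):.0%}")
```

Until a count like this has been run, and the sales team accepts the result, "trade-show leads fill the pipeline" remains an untested assumption.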

Consider: Has anyone in your organization taken the time to trace, to audit, the validity of the terms in your company’s dashboard? The IT world is filled with examples of spreadsheets containing huge mistakes, of retired ladies receiving phone bills for $100,000, and of so-called “web site traffic” turning out to be undigested logs of meaningless server activity.

Even a dashboard that incorporates Avinash’s suggestions will provide confusing, even misleading, “information” if the data it shows cannot be traced to hard evidence, as it can in production inventory systems. Everyone must understand what observable characteristics cause a prospect to be classified as a lead, a qualified opportunity, or a customer.

This is one of the key foundations of sales process improvement: Establishing observable definitions (ideally based on customer actions) for the stages of sales production. Get these perfected within your company’s culture, and the rest really is just a software problem.
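To make the idea concrete, here is a hedged sketch of what observable, action-based stage definitions might look like once written down. The stage names and customer actions are hypothetical examples, not definitions from the article; the point is that each classification follows from recorded customer behavior, not from a salesperson's gut feeling:

```python
def classify_prospect(observed_actions: set[str]) -> str:
    """Classify a prospect from concrete, observable customer actions."""
    if "signed_order" in observed_actions:
        return "customer"
    if {"agreed_to_evaluation", "confirmed_budget"} <= observed_actions:
        return "qualified_opportunity"
    if "requested_information" in observed_actions:
        return "lead"
    # e.g., a business card collected at a trade show, with no action behind it
    return "unclassified"

print(classify_prospect({"requested_information"}))
print(classify_prospect({"agreed_to_evaluation", "confirmed_budget"}))
```

Notice that a bare trade-show business card falls into "unclassified" here: until the prospect takes an observable action, it never enters the pipeline counts at all.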

Michael J. Webb
June 15, 2008



Michael Webb

Michael Webb founded Sales Performance Consultants to create a data-driven alternative to the slogans and shallow impact offered by typical sales training, sales consulting, and CRM companies. Michael helped organize, and delivered the keynote speeches for, the first conferences ever held on applying Six Sigma to marketing and sales. Connect with him on LinkedIn.
