Top BI Dashboard Design and Delivery Worst Practices

It’s child’s play, don’t you know? Business Intelligence (BI) dashboards are the proverbial cherry-on-top of your business analytics project – they’re easy as pie. Sprinkle a few metrics here, drizzle a report there, then simply smoosh it all together – and voilà, we have a dashboard capable of enhancing decision-making capabilities far beyond the fluffiest clouds of your wildest dreams. And inducing a serious hunger. Right?


A far more apt analogy for such a blasé, fanciful approach would be to compare dashboard design and delivery with Tom Holland’s nightmarish Child’s Play – the 1988 American horror film centered on the homicidal urges of “Good-Guy” doll, Chucky. This type of child’s play can be deadly – for your business, that is…

Unfortunately, a careless approach to BI dashboards is far too common, and has the potential to derail and devalue your entire analytics project.

To avoid becoming an unwitting victim of cavalier dashboard cowboys – and ensure that your BI dashboards deliver accurate, timely, relevant and actionable insights at-a-glance – avoid these design and delivery mistakes.

Keen to understand some of the fundamental best practices that should be observed when designing and delivering BI dashboards? Register for the vendor-agnostic webinar series BI Dashboard Best Practices and join in the conversation. Register Here >

1. Taking a ‘she’ll be right’ attitude to data quality

A dashboard should be thought of as the icing on top of a seriously dense layer cake. Remember, a dashboard is the centerpiece of a reporting presentation layer – it’s not an ETL tool.

Firstly, you must ensure that the necessary data exists, within a usable format, to support the reliable reporting of desired metrics linked to defined business goals. Remember: ‘Rubbish in equals rubbish out’. Dodgy data leads to dubious decision-making, which undermines the integrity of your dashboard rollout.

Executive disdain will ensue, and depending on the inadequacy of the data, project abandonment may follow. Even if the initiative retains executive support, a poor initial experience will result in a disgruntled user base. Convincing them of the information’s trustworthiness thereafter will be an unenviable challenge.

So before you embark upon the pretty stuff, ask whether the underlying data can fully support your reporting objectives. Is the right data there? Is the data of sufficient consistency and quality? Is it in a usable format? Do you have a data integration strategy that will allow you to effectively combine data required from disparate sources in order to provide users with a unified view?

Ultimately, poor quality data is misinformation, which will lead to miscommunication, misunderstanding and inaccurate analysis.
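If your team scripts its data checks, the kind of pre-flight validation described above might look something like this minimal Python sketch – field names and rules here are hypothetical examples, not a prescription:

```python
# Minimal pre-dashboard data quality checks (illustrative only).
# Field names like "region" and "revenue" are hypothetical examples.

def check_sales_records(records):
    """Return a list of data quality problems found in raw sales records."""
    problems = []
    required_fields = {"date", "region", "revenue"}
    for i, rec in enumerate(records):
        missing = required_fields - rec.keys()
        if missing:
            problems.append(f"row {i}: missing fields {sorted(missing)}")
            continue
        # Usable format: revenue must be numeric, not a string like "N/A"
        if not isinstance(rec["revenue"], (int, float)):
            problems.append(f"row {i}: revenue is not numeric ({rec['revenue']!r})")
        # Consistency: regions should come from a known, controlled list
        if rec["region"] not in {"North", "South", "East", "West"}:
            problems.append(f"row {i}: unknown region {rec['region']!r}")
    return problems

rows = [
    {"date": "2024-01-05", "region": "North", "revenue": 1200.0},
    {"date": "2024-01-06", "region": "north", "revenue": "N/A"},  # dodgy data
]
for p in check_sales_records(rows):
    print(p)
```

Checks like these won’t replace a proper data integration strategy, but running them before any design work starts is a cheap way to surface ‘rubbish in’ early.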

Letting your available data define your key dashboard metrics
Sometimes relying on existing resources just won’t give you the output you need to succeed.
Business goals should always define the content of your dashboard – not the data that happens to be most readily available. Observing this rule is the only way to ensure the effectiveness, appropriateness and expediency of your dashboard. If you discover that the data you really need isn’t yet accessible for reporting purposes, it’s time to put your dashboard project on hold.

2. Failing to consult end-users in an iterative manner

Business users – most often the primary target audience of dashboard deployments – are not BI experts. As such, they’ll find it difficult to articulate which key metrics they most want to track and monitor. The irony is that, as business functions and departments gain an increasingly large stake in BI purchase and rollout decisions, many stakeholders won’t be able to see their own fallibility. But it’s a reality that must be recognized.

As such, you can’t expect to nail down all user requirements in one initial sitting. The needs and wants of each user group will continue to shift as they begin to understand which parts of the business they really need to keep a handle on. The business environment within which each user group is operating will also change over time. Regular consultation between developers, executive sponsors and each defined user group – throughout the development and management phases of dashboard implementations – will enable you to satisfy expectations on an ongoing basis.

Neglecting to provide adequate training
Dashboards, well thought-out and designed, should be intuitive and deliver actionable insights at-a-glance. But, keep in mind that someone once showed you how to use a mouse and search for answers on Google, too.

Dashboards do enable a wide variety of not-particularly-technologically-savvy business users (me) to take advantage of the fact-based insights gleaned from business analytics. The whole point is that users don’t have to transform themselves into bespoke-analytics-coding-propeller-hat-wearing data scientists. So give them the guidance and support they need.

Adopting a ‘build it and they will come’ approach: Letting IT tell business users what they want
Yes, IT might know the data, but that doesn’t mean they know which metrics business users need to monitor. Further, the manner in which the information is displayed and ordered may not match the users’ perception of their job function. For example, Rick – Sales Manager for music chain Rock On – may be used to viewing and analyzing sales by instrument (guitar, drums, violin, etc.), region, store and then individual sales rep. If IT creates a sales dashboard – without proper consultation – that begins by analyzing sales by store, it will immediately fail to meet Rick’s expectations.
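To make Rick’s example concrete, here’s a tiny Python sketch – with made-up sales figures – showing how the same records answer a different first question depending on which grouping leads the dashboard:

```python
# Group the same sales records two ways: Rick's expected order
# (instrument first) vs. a store-first layout. All data is made up.

sales = [
    {"instrument": "guitar", "store": "Downtown", "amount": 900},
    {"instrument": "drums",  "store": "Downtown", "amount": 1500},
    {"instrument": "guitar", "store": "Uptown",   "amount": 700},
]

def totals_by(rows, key):
    """Top-level dashboard grouping: total sales per value of `key`."""
    totals = {}
    for r in rows:
        totals[r[key]] = totals.get(r[key], 0) + r["amount"]
    return totals

# Rick expects to start here:
print(totals_by(sales, "instrument"))  # {'guitar': 1600, 'drums': 1500}
# A store-first version answers a different first question entirely:
print(totals_by(sales, "store"))       # {'Downtown': 2400, 'Uptown': 700}
```

Neither grouping is wrong in the abstract – the point is that only consultation tells you which one matches the user’s mental model.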

Dashboards created without appropriate user consultation and negotiation often lead to a gulf between dashboard content or functionality and user needs or expectations.

Users will quickly rebel against such I-know-what-you-need condescension, and resort to finding answers the old way – with the omnipresent Excel spreadsheet.

Contrary to what you might have learnt from 1989’s Field of Dreams, you can’t just build it, because they won’t just come.

Assuming that requirements gathering can happen ‘on-the-fly’ because an iterative approach is employed
Don’t mistake iterative for careless or haphazard. Dashboard developers and end-users should liaise on a regular and ongoing basis to ensure that the moving target of constantly shifting business requirements is being adequately met. But, this doesn’t mean that initial requirements – carefully mapped to KPIs, core metrics and business goals – shouldn’t be judiciously gathered. You’ve got to ensure that the project is on track from the beginning to avoid user disgruntlement. Consistent consultation simply helps it remain on track, and delivering business value, well beyond the ‘go-live’ date.

3. The MIA executive: Developing directionless dashboards

While many user groups will struggle to clearly and definitively articulate which business goals they hope to better achieve by having access to customized BI dashboards, this is not an invitation to abandon proper processes for gathering user requirements.

However, if key stakeholders haven’t spent the time carefully considering their needs, pull back until they’re able to offer a reasonable level of detail. While any dashboard deployment needs robust backing from senior business figures, don’t yield to ‘rogue’ executives who make vague requests: “Oh, I haven’t worked out the finer points yet, just make me something – you know, just include all the obvious stuff.” Don’t know what “all the obvious stuff” actually entails? Then don’t make a move until they’re willing to come, prepared, to the requirements gathering table. Be especially wary if you receive such a nebulous response from one of your project ‘champions’. Losing their trust and support may undermine project development down the track.

Remember, if users don’t know what they want, you can’t deliver what they want. It’s a lose-lose situation.

4. Sidelining IT: Failing to understand IT and business roles in dashboard project management

The business side of many organizations is becoming increasingly influential throughout dashboarding projects. But IT still has a definite and integral role to play.

While business users have growing governance over dashboard purchase decisions, requirements gathering, design as well as the strategic direction of the project, IT must position themselves as the ‘data custodians’ of the development. This includes ensuring:

  • Trustworthy data quality
  • Appropriate data security and segmentation
  • Robust data integration procedures and policies
  • All aspects of application – and application infrastructure – management, governance and maintenance

To bridge this clear division of roles between the business and IT portions of your dashboard initiative, assign a dedicated project manager to enable constant and consistent communication. This person should also act as a conduit between business and IT to ensure that requirements are gathered properly, that agreed deliverables and timelines are being met, and that all stakeholders are ‘on the same page’.

5. Providing historical, not up-to-date and actionable, data

Hindsight has undoubted worth. It allows us to learn from our mistakes and, hopefully, make better choices as a result.

But historical information has limitations. A properly designed dashboard should help decision-makers track and understand the KPIs that lead to a desired outcome. You need to find a balance between outcome or goal KPIs – which are based on measures of things that have already happened – and KPIs which track the metrics that drive those desired outcomes.

For example, the primary goal of your CEO might be to boost company revenue. So of course tracking revenue progression towards predefined benchmarks will be important. But, to help diagnose the underlying cause of revenue fluctuations as they occur, they’ll also want to monitor the metrics which directly affect revenue – such as number of new sales leads, new customers and new consulting engagements per month. Understanding and improving these underlying metrics is the key to achieving the ultimate goal of boosting overall organizational revenue.
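To illustrate the balance between an outcome KPI and its drivers, here’s a minimal Python sketch – all figures and metric names are invented – that lays month-on-month revenue changes alongside changes in the metrics that feed it:

```python
# Outcome KPI (revenue) alongside the driver metrics that feed it.
# All figures and metric names are made up for illustration.

monthly = [
    {"month": "Jan", "revenue": 410_000, "new_leads": 120, "new_customers": 18},
    {"month": "Feb", "revenue": 395_000, "new_leads": 95,  "new_customers": 14},
    {"month": "Mar", "revenue": 430_000, "new_leads": 150, "new_customers": 22},
]

def kpi_snapshot(rows):
    """Pair the outcome KPI with its drivers, as month-on-month changes."""
    snapshot = []
    for prev, cur in zip(rows, rows[1:]):
        snapshot.append({
            "month": cur["month"],
            "revenue_change": cur["revenue"] - prev["revenue"],
            "leads_change": cur["new_leads"] - prev["new_leads"],
            "customers_change": cur["new_customers"] - prev["new_customers"],
        })
    return snapshot

for row in kpi_snapshot(monthly):
    print(row)
```

Seen side by side like this, a dip in revenue arrives with its likely cause attached – falling leads and new customers the month before – rather than as an unexplained historical fact.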

6. Wasting time: Building and rebuilding

Even if you think you’ve meticulously mapped out the requirements of each user group, that doesn’t mean everybody left the meeting with the same vision.

To ensure that different perspectives don’t result in avoidable dashboard rebuilds, and needlessly delay project delivery, try drawing a simple dashboard mock-up encompassing requested metrics and KPIs. Sharing this with your users prior to dashboard creation can help secure agreement on critical design issues, leaving plenty of time to effortlessly include any requested alterations.

7. Failing to cater for different user types and groups

It’s fairly obvious that different user groups – the sales department compared to the finance department for example – will want to track and act on different KPIs.

But, each user group will also most likely need to explore the data relevant to them in different ways. Not only that, different data – as well as the presentation of that data – will be relevant to different user types within the same user group. Just because you’ve nailed down the core requirements and KPIs for your HR, sales and marketing dashboard, doesn’t mean that every person within each department will require the same information. For example, one finance dashboard won’t suit everyone – the CFO will probably just want a summary level view of finance KPIs, with the ability to drill to deeper detail as required.

When designing dashboards for user groups, consider:

  • Their level of technical skill – are they likely to explore data or just view it?
  • The level of interactivity required – will users need analysis functionality such as drill down capabilities?
  • The frequency with which they’ll access their dashboards and the impact that has on required data velocity
  • The level of detail required (granularity of data) – think back to our CFO example
  • Catering for roles within roles (user types)
  • After the initial rollout, time permitting, don’t be afraid to spend time tweaking individual dashboards of key stakeholders
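As a rough illustration of the ‘roles within roles’ point, the sketch below – in Python, with hypothetical roles and field names – serves the same finance data at two levels of granularity:

```python
# One dataset, two user types: a summary view for the CFO and a
# detailed view for a finance analyst. Roles and fields are hypothetical.

transactions = [
    {"dept": "Sales",     "category": "Travel",   "amount": 4200},
    {"dept": "Sales",     "category": "Software", "amount": 1800},
    {"dept": "Marketing", "category": "Ads",      "amount": 9500},
]

def finance_view(rows, role):
    """Return dashboard data tailored to the user's role."""
    if role == "cfo":
        # Summary level: total spend per department; drill-down lives elsewhere
        totals = {}
        for r in rows:
            totals[r["dept"]] = totals.get(r["dept"], 0) + r["amount"]
        return totals
    if role == "analyst":
        # Full granularity: every transaction line
        return rows
    raise ValueError(f"unknown role: {role}")

print(finance_view(transactions, "cfo"))
print(finance_view(transactions, "analyst"))
```

The data source is identical; only the level of detail changes – which is precisely why one finance dashboard won’t suit everyone.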

8. Not adhering to Ronan Keating’s less-is-more message

Did you know that singer Ronan Keating espoused a sound dashboard development philosophy when he famously said, “You say it best, when you say nothing at all”? Well, it almost works…

When designing dashboards, it’s crucial that all elements added have a specific informational goal aimed at conveying the relevance of organizational data as it relates to defined business objectives.

Selecting ‘noisy’ data visualizations
Purveyor of all things data visualization, the sage Stephen Few claimed in an article that visualizing your KPIs is integral because 70% of the sense receptors in the human body reside in our eyes. That being the case, it’s safe to assume that the inaccurate or unnecessary visualization of information has the ability to be grossly misleading.

So, it’s vital that you take the blank canvas approach to dashboard data visualization: Every line, dot or color should be added because it conveys a specific and intended message that helps users better understand and accurately interpret the information presented.

Ask yourself: Is each visualization highlighting or distracting from the intended message? And, how exactly does the third dimension on that pie chart help communicate the significance of the data it represents?

Designing a ‘noisy’ dashboard
What’s the goal of dashboard design? Are you aiming to deliver technology or insights? Don’t let the technology define the information displayed; this must be driven by a clearly stated business need. Dashboards should be fashioned to best communicate KPIs – not the other way round.

As with the data visualizations selected, every element of dashboard design should help convey a meaningful, useable message.

One of the core dashboard design issues relates to the grouping of information. Few said it best when he wrote: “Dashboard content must be organized in a way that reflects the nature of the information and that supports efficient and meaningful monitoring.”

Carefully consider how individual reports within a dashboard, and the metrics within every report, relate to each other. The Center for Information Based Competition found that 75% of BI success is determined by factors other than data and technology – so understanding the impact of data presentation and grouping is critical.

Information cramming
A customized dashboard is not an overview of all business information relating to a particular business function.

If you can’t interpret a dashboard at-a-glance, the information is fighting against its intended purpose – it’s too detailed. Can you still see the forest for the trees?

Try to just display what’s crucial – not everything is a KPI. If your requirements gathering was sketchy, then poor information focus is a common symptom.

All KPIs displayed should be actionable – able to be improved upon as a result of actions taken by users. This may sound obvious, but all-too-often organizations become over-enthused and insist on designing measures for almost every aspect of the business. KPIs won’t stand out if they’re shrouded by a mist of extraneous measures.

9. Trying to solve everyone’s information needs at once

Just as the pre ‘go-live’ design and development of dashboards should be an iterative and consultative process, so should the rollout and management phases.

Dashboard deployment should be staggered amongst previously identified primary users, and user groups, to assist sustained adoption and boost project manageability.

Equally, the metrics and KPIs selected for display within each dashboard should also be introduced in a phased process. Those of highest priority must be delivered upfront, with new metrics added over time through cyclical evaluation and consultation, to ensure that each addition helps users address particular business goals. Not only will a staged rollout help remove the danger of dashboard abandonment and information clutter, periodic evaluation will also help ensure that metrics can be intermittently tweaked to keep up with a changing business environment and constantly evolving user demand.

Adopting this approach will enable realistic development and delivery deadlines to be established and met. It also has a higher likelihood of achieving user satisfaction, as phased delivery allows users to derive value from the project faster, and have evolving requirements addressed.

Wait, there’s more. Stay tuned for the second installment of this two-part blog post. You might even take home a free set of steak knives.*

*Note: Probably won’t take home a free set of steak knives.