Dashboard Design and Delivery Worst Practices


In the first post of this two-part blog series – Chucky’s top BI Dashboard design and delivery worst practices (Part One) – we explored and analyzed nine common pitfalls encountered by the nonchalant dashboard yahoo who mistakenly believes dashboard design and delivery is Child’s Play.

So, to escape accidental dashboard martyrdom – and ensure that you’re delivering the best insights to the right people – here are eight more dashboard destroyers to circumvent.

10. Failing to select the most appropriate data visualizations

Certain metrics and measures are best represented by particular types of charts or tables. It’s important to select and create visualizations that effectively communicate the significance of the figures displayed in the most accurate, direct, succinct and digestible way possible.

For example, if comparing two metrics over time, a line graph is almost always the clearest way to present that information.

Similarly, suppose the national sales manager for Paul’s Audio wants to compare the volume of sales for each of its major product categories – speakers, amplifiers and CD players. Because many customers purchase these items in a bundled package, sales are fairly even across the three product categories, so a pie chart would offer very little insight. When comparing several roughly equal quantitative values, a simple table or horizontal bar graph offers much easier at-a-glance insight.

Further, as a more general rule, select simple 2D chart types (such as bar and line), whenever possible, to assist fast information absorption. Conversely, avoid superfluous or complex charts (think 3D pie charts and scatterplots).
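To make those rules concrete, here’s a minimal sketch of the decision logic in Python. The function name and rules are purely illustrative – an assumption for this post, not any BI tool’s API:

```python
def suggest_chart_type(over_time, roughly_equal=False):
    """Suggest a simple 2D chart type, following the rules above.

    Hypothetical heuristic: the name and branching are illustrative only.
    """
    if over_time:
        return "line"            # trends over time read best as lines
    if roughly_equal:
        return "horizontal bar"  # near-equal values: bars beat pies
    return "vertical bar"        # simple categorical comparison
```

Paul’s Audio’s roughly equal category sales, for instance, would come back as a horizontal bar chart rather than a pie.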

Constructing misleading visualizations
And, always ensure that your visualizations are proportionally accurate. If you work for Better Bakers, and you’ve sold twice as many pies as pasties, ensure that the line, dot or squiggle you use to denote those metrics is proportionally representative of that fact.
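If you want to sanity-check proportionality programmatically, a tiny helper like this will do – the function is hypothetical, and it simply verifies that every value-to-length ratio is the same:

```python
def is_proportional(values, lengths, tol=1e-9):
    """Return True if the visual lengths preserve the ratios of the values.

    Illustrative check: Better Bakers' 2:1 pies-to-pasties sales should
    map to bars (or dots, or squiggles) in the same 2:1 ratio.
    """
    ratios = {length / value for value, length in zip(values, lengths)}
    return max(ratios) - min(ratios) <= tol
```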

11. Providing too much detail

Sensing a little bit of history repeating?

Close, but not quite – Shirley Bassey’s not in the house.

Part one of this post mentioned the importance of avoiding “information cramming”, as a dashboard is not simply an overview of all information relating to a particular business function.

Displaying unnecessary detail is a similar, but slightly different, problem. Remember: the mandate of the dashboard is to provide at-a-glance understanding of KPIs. While users might need to drill to further detail to understand the underlying trigger(s) behind a pattern, trend or outlier, a dashboard should focus on rapidly drawing attention and indicating the need to act.

For example, displaying a running total of your organization’s total revenue to date, down to three decimal places, is unlikely to be necessary – certainly in the context of a dashboard at least.
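A small, illustrative formatter shows the idea – abbreviate rather than display every decimal place (the function name and thresholds are assumptions for this sketch, not any standard):

```python
def format_kpi(value):
    """Round a revenue figure for at-a-glance dashboard display.

    Hypothetical helper: '$1.2M' reads faster than '$1,234,567.891'.
    """
    for threshold, suffix in ((1e9, "B"), (1e6, "M"), (1e3, "K")):
        if abs(value) >= threshold:
            return f"${value / threshold:.1f}{suffix}"
    return f"${value:.0f}"
```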

12. Abusing color (knowingly or unknowingly)

Stephen Few noted in a recent blog – titled Signal Detection: An Important Skill in a Noisy World and published on his website perceptualedge.com – that “Essentially, data analysis is the process of signal detection”. Few went on to lament the fact that “Signal detection is actually getting harder with the advent of so-called Big Data”, because “Finding a needle in a haystack doesn’t get easier as you’re tossing more and more hay on the pile.”

Part one of Chucky’s bloodthirsty BI dashboard design worst practices rampage emphasized the importance of carefully selecting:

  • Metrics tied to specific business goals
  • Complementary data visualizations that highlight, rather than distract from, the significance of KPIs displayed
  • The design of your dashboard to ensure every element helps convey a meaningful, usable message

But, judiciously following these steps is still insufficient. These elements need to be combined with the appropriate and prudent use of color. As with dashboard design in general, every speck of color added to a chart should be linked to a specific informational purpose.

While matching colors – used to depict metrics within a dashboard – to corporate color schemes may look cute, it’s completely unhelpful for assisting the accurate interpretation of analytical information.

Effective color usage should, in the context of dashboard design, aim to leverage the strengths of human perception, connotations and semiotics. For example, if a metric or measure represents a ‘good’ outcome – say, exceeding a predefined benchmark – it would be logical to color it green: most cultures associate red with something negative and green with something positive. Brightness is also usually associated with urgency, so reserve your brightest colors for the data that requires users’ most urgent attention.

Aside from selecting colors that fit logically with how we interpret the world, it’s also important to consider how the items within a dashboard relate to each other. For example, it might be best to denote two measures that directly relate to each other with different shades of the same color, while stark contrasts in color imply that the metrics or measures being displayed are distinct. The same concept of contrast should be applied to your data and its report / dashboard background – ideally a white background, which deemphasizes itself and, by direct comparison, helps highlight the data.

Above all, keep in mind that the superfluous variation of color and shading actually inhibits our ability to interpret the significance of information. If you highlight everything, you effectively highlight nothing.

Only assign different colors to metrics or measures to indicate a difference in meaning that is not already communicated via another mechanism – such as a label, annotation or index. Simply wanting to make it different from other metrics or measures, for the sake of variation, isn’t a good enough reason.
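One way to enforce that discipline is to define a small, fixed palette up front and map metric status to it, reserving the one bright color for genuine urgency. The palette and function below are purely illustrative assumptions:

```python
# Muted palette with a single bright alert color reserved for urgency.
PALETTE = {
    "good": "#2e7d32",     # muted green: benchmark met or exceeded
    "bad": "#c62828",      # muted red: benchmark missed
    "urgent": "#ff1744",   # brightest red: needs attention right now
}

def metric_color(value, benchmark, urgent=False):
    """Map a metric's status to a palette color.

    Hypothetical helper: brightness is spent only on urgency, so
    highlighting everything (and thereby nothing) is impossible.
    """
    if urgent:
        return PALETTE["urgent"]
    return PALETTE["good"] if value >= benchmark else PALETTE["bad"]
```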

Remember to embrace white space
Not to be confused with white noise, incorporating sufficient white, or blank, space also helps users efficiently and effectively decipher visual information.

Creating breathing space for your eye (who doesn’t like a mixed metaphor, anyway?) makes highlighted information appear more prominent by comparison.

Appropriately spacing content will also assist the human eye to innately pick up on informational groupings and interrelationships.

13. Making users scroll

While not always achievable, it’s good practice to design dashboards so that they can fit into the bounds of a single screen.

Being able to simultaneously view all information within a dashboard helps contextualize its meaning, enabling you to understand the relevance and interconnectivity of different pieces of information as they relate to each other. Constraining the proportions of your dashboard within a single screen helps you identify relationships that may have otherwise gone unnoticed.

Not only does a single screen view give you the ‘big picture’, it can ultimately provide a mechanism for faster, more accurate, and deeper information interpretation.

Form before function
However, rigidly conforming to the above rule puts the form of your dashboard above its function. This too, is bad practice. The form of your dashboard should be designed in a way that best supports its function – the point of its existence. So ask yourself: What’s the point of any dashboard? Hopefully you said something like ‘delivering personalized, understandable and actionable information to people in a way that allows them to make better, faster fact-based decisions wherever and whenever necessary.’

So, consider some important functional factors that may affect the construction of your dashboard, such as:

Mobility: Will your user group be accessing their dashboard from mobile devices? If so, how will the screen size affect the volume, positioning and type of information displayed?

Frequency: How often will they be looking for updates? And what impact will this have on the frequency with which data needs to be updated, or the granularity of information required for display?

Time: In what context will the dashboard be viewed? Will it be scrutinized at a desk, or glanced at before an important meeting? The amount of time that users can reasonably devote to data analysis and interpretation will directly impact the level of detail, as well as the level of interactivity, required. Will users want to drill into the details behind high-level numbers, or is summary level information enough to satisfy their needs and expectations?
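These functional factors can be captured explicitly at design time rather than left implicit. Here’s a hypothetical sketch – the class, field names and rule are illustrative assumptions only:

```python
from dataclasses import dataclass

@dataclass
class DashboardContext:
    """Illustrative capture of the functional factors above."""
    mobile: bool          # viewed on a phone or tablet?
    refresh_minutes: int  # how often users expect fresh data
    glance_only: bool     # glanced at before a meeting, not scrutinized?

    def detail_level(self):
        """Mobile or glance-only use argues for fewer, coarser widgets."""
        if self.mobile or self.glance_only:
            return "summary"
        return "drill-down"
```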


14. Visualization variation: Pandering to the vanity of your users

Yes it’s true. We human beings can be a fickle, irrational bunch. But logic, as Star Trek’s Spock would attest, must triumph.

While it may be tempting to showcase a wide variety of chart types within a dashboard simply to keep users ‘entertained’, unnecessary variation is not a virtue. In fact, pointless data visualization variation can inhibit users’ interpretive capabilities. Always select the best and most appropriate visualization for the data and context – even if it results in a superficially repetitive dashboard.

After all, your primary aim is to deliver the right information to the right people in the most easily digestible form possible. You won’t experience user backlash for delivering accurate, understandable insights that enable people to make better, faster decisions. You might, however, experience considerable blowback should you introduce unnecessary visual variation that hinders information interpretation, leading to slower and potentially less accurate decision-making.

Ignoring the need for visual appeal altogether
But, unlike Vulcans, we are human.

And, while there are many other more pressing factors to consider and address first, creating a visual experience equal to the dog world’s ‘Pug’ should be avoided.


Note: Apologies to all Pug owners, but you can’t make a dashboard ‘so ugly that it’s cute’.

Widely deployed as the key apparatus for communicating BI content to users of all backgrounds, a dashboard must:

  • Have enough visual appeal to avoid developing a reputation as the ‘ugly duckling’ of your BI deployment
  • Encourage users to embrace it

Please be advised: The second point is not an invitation to add meaningless decoration. You must make your dashboards as visually appealing as possible within a ‘function first’ framework.

15. Poor dashboard structure

When it comes to dashboard layout, many simply take the drag-drop-done approach, placing charts and graphs together in no particular order. However, as dashboards are rolled out to increasingly diverse and less technical audiences, helping users digest the information presented as easily as possible is non-negotiable.

In this new business-user-oriented environment, careful and deliberate attention to dashboard structure is important – and fairly easy.

Just apply some basic ‘common sense’ rules to the structure of your dashboard content:

  • Apply a natural viewing sequence: Place charts in order of importance (the level of attention and / or action required), arranged from left to right and top to bottom
  • Grouping: Group data which is directly related, and designed to be compared, together to support easy information interpretation and relationship identification
  • Flow-based: Visualizations within a dashboard that monitor time-bound processes – such as tracking a sales process from initial lead, through to evaluation, proof of concept and eventual purchase order – should be arranged to mirror that process (again, from top left to bottom right corner)
  • Dashboard help: Assist casual business users by reducing the number of clicks necessary to explore the dashboard, while carefully positioning informational cues that outline how to utilize its features and functions
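The first two rules can be sketched as a simple ordering function – the names, importance scores and two-column grid are illustrative assumptions:

```python
def layout_positions(charts, columns=2):
    """Assign grid positions left-to-right, top-to-bottom by importance.

    `charts` is a list of (name, importance) pairs, where importance 1
    demands the most attention and therefore lands top-left.
    """
    ordered = sorted(charts, key=lambda chart: chart[1])
    return {
        name: (i // columns, i % columns)  # (row, col), top-left first
        for i, (name, _) in enumerate(ordered)
    }
```

Directly related charts can then be given adjacent importance scores so that the grouping rule falls out of the same ordering.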

16. Selecting the wrong metrics

Chucky part one emphasized that, when designing dashboards, it’s crucial that all elements added have a specific informational goal aimed at conveying the relevance of organizational data as it relates to defined business objectives. But how do you select, and define, the best metrics – ones which will be commonly accepted, underpin organizational strategy, and effectively track the factors that lead to an established delineation of ‘success’?

As a general principle, your chosen metrics should:

  • Be tied to explicit and agreed business goals
  • Be actionable – users must be able to take action to address a specific issue, which in turn will assist progression towards a defined business goal
  • Not require the undertaking of an onerous process to acquire the necessary data (types and sources)
  • Be easily recognizable – there is general acceptance of the metric’s meaning, and the factors that underpin it, amongst your target audience(s)
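That checklist can even be turned into a lightweight validation step. The keys and wording below are illustrative assumptions, not a formal standard:

```python
def metric_issues(metric):
    """Check a proposed metric (a dict) against the checklist above.

    Returns the rules the metric fails; an empty list means it passes.
    """
    checks = {
        "tied to an agreed business goal": metric.get("business_goal"),
        "actionable by its audience": metric.get("actionable"),
        "data acquirable without onerous effort": metric.get("data_available"),
        "meaning recognized by the audience": metric.get("widely_understood"),
    }
    return [rule for rule, ok in checks.items() if not ok]
```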

17. Failing to determine the type of dashboard you’re building

You may have:

  • Carefully defined your metrics
  • Selected appropriate visualizations
  • Structured and designed your dashboard for ease of use and navigability
  • Prudently considered the informational needs of your users, and how those needs impact the type of data and the style in which it’s displayed

You may have even framed the content of your dashboard around a specific business question you’re seeking to address.

And yet, you still haven’t arrived at your destination. Forgotten something?

If you fail to determine the type of dashboard you’re building from the outset, it’s unlikely to help users achieve the business outcomes they seek.

Most dashboards fall under three main umbrella types:

  • Strategic: Used to monitor progression towards overarching and predefined business goals.
  • Tactical: Used to track trends that relate to specific business objectives – which usually combine with others to support and drive overarching and predefined business goals. Think back to an example we used in part one of this post. A CEO may monitor advancement towards a defined organizational revenue benchmark (overarching business goal). At a more granular level, a marketing manager may track the number of product evaluation requests registered as a result of a recent advertising campaign (specific business objective) – this figure will play a role in determining overall company revenue.
  • Operational: Often used to trace and assess detailed, department specific, activities at a reasonably meticulous level. Operational dashboards are also frequently associated with the need for near-real-time data access.
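As a rough, purely illustrative mapping of those three descriptions (the scope labels are assumptions for this sketch):

```python
def dashboard_type(scope, near_real_time=False):
    """Map a dashboard's scope to one of the three umbrella types above.

    scope: "company goal", "business objective", or "department activity".
    """
    if near_real_time or scope == "department activity":
        return "operational"   # detailed, often near-real-time
    if scope == "business objective":
        return "tactical"      # trends behind a specific objective
    return "strategic"         # progress towards overarching goals
```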

So, which type of dashboard are you building?
