The Pretty Chart Dilemma

Whenever a client says, “We just need the charts to be pretty,” I pause and weigh my response. The comment shows they place some value on the user experience, but it also misses the point of information design and effective data visualization. My dilemma becomes: what’s the right response?

To be clear, I’m not rehashing the data visualization aesthetics debate. I’m wrestling with how to win over product leadership on implementing data applications the right way, i.e. functional vs. pretty. When I’m working on a project, it’s all about delivering a functional design that supports an existing workflow and leads users to a set of results they can act on. Pretty can’t do that.

The tide has definitely turned. Product leadership recognizes they must deliver well-designed data applications to customers. However, even though there are dashboard UI user stories in everyone’s upcoming sprints, design is still treated as something that happens to the display (colors, fonts and charts) and not between the display and the user (insights, actions and to-do lists). My fear is that when mediocre designs get implemented and results are mixed, design budgets will only be cut further. Without a design that gets users to take action and drive improved results, subscription attrition, tepid customer survey responses and low adoption will continue to be the norm.

So, how should we engage with people when they disproportionately place value on pretty?

First, it's important to recognize the pretty chart dilemma in all its manifestations.   It can appear as variations of the following:

  • “The charts need to look better so users will use it.”

  • “Make sure the dashboard uses approved company branded colors.”

  • “They just need to be prettier than today’s version.”

  • “The charts should ‘pop’ for the user.”

  • “Can we do a Sankey chart?”

Here’s one of my favorite examples of pretty over functional.  Note the icon relaxing in the upper right corner.  Is the call to action for Vacation Days Utilization ever to take a nap?

[Image: Necto dashboard]

Here’s how I’ve learned to tackle the pretty chart dilemma.

Address Comments Right Away

The best way to deal with the pretty chart dilemma is to address it immediately when it arises. Whether that’s during the selling process or mid-project, there’s no better time than the present. One phrase I’ve repeated several times is, “We don’t focus on pretty; we’re implementing effective.” I also emphasize that the changes and design we’re implementing are intended to impact metrics or goals such as utilization, improved feedback or reduced user attrition.

Change the Vocabulary

It's also important to have everyone on the team continually use “non-pretty” language with the client. Eventually, they will ask questions or just start mimicking you.  Here are the words and phrases I use and those that I try to avoid.

Encouraged Vocabulary        Discouraged Vocabulary

Useful                       Good-looking
Distinct                     Pretty
Well-labeled                 Attractive
Clear Call to Action         Eye-pleasing
Functional                   Beautiful
Engaging                     Cool

Yes, I’m avoiding “actionable” these days; it makes me feel the same way I do when I hear “synergy.”

Highlight the Process

Be sure to highlight the design process and your principles at work throughout the implementation. Don’t hide the science or methodology behind your design phases from the client; lead with it. Also, demonstrate how information design works well with Agile development. Oftentimes the Pretty Chart Dilemma arises when product owners are squeezed for time and want to reduce the number of points given to a UX user story in an upcoming sprint.

Focus on Users not Charts

Oftentimes the dilemma is the result of the product team having the wrong perspective. They’re focused on features, sprint completion and their own preferences vs. those of their users. This is pretty natural. Don’t be hesitant to ask questions like, “Would a user value this chart type or color palette over a clear call to action?” Consider the Juice dashboard white paper of 10 years ago as a resource to help with language or audience focus. Note the white paper title: it’s not about pretty or beautiful, but about designing dashboards that people (users) love.

Still not sure how to talk to clients about the Pretty Chart Dilemma?  

Visit the Juice Analytics resource page. Feel free to use and share the content provided with clients and your team to deal with the dilemma, while giving us the proper attribution of course. Need a more hands-on approach? Schedule a session to talk through the best way to handle your specific dilemma.

Getting Data Product Requirements Right

Often customer data products or applications go awry because of poor requirements.  While customers can describe a billing workflow or a mobile app feature, explaining how data should be used is less clear. Merely documenting a wish list of reports, fields and filters is a recipe for low adoption and canceled subscriptions.

To ensure that data requirements are relevant and the solution is useful to customers (and profitable too), consider the expression “walking a mile in their shoes.” The original saying, which has evolved and been used in different forms, is “before you judge a man, walk a mile in his shoes.” Collecting good requirements is less about a laundry list of charts and metrics and more about understanding how information can transform the business from how it exists today.

In 2017 I had the opportunity to work on an insurance industry project for the first time. The challenge was to deliver the industry’s first insurance agency analytics solution. The product team showed us their competitor’s dashboards and suggested we replicate them. The support team demanded more ad-hoc reporting functionality on top of the Crystal Reports report writer. Customers wanted an embedded BI tool to make themselves more data-driven. Needless to say, all parties were miffed when we accommodated none of their requests.

What we did was contrary to what everyone expected. We didn’t talk about the data (at least not in the beginning) or ask for their report wish list; instead we strove to understand the drivers of success and behavior within an insurance agency. To walk in their shoes, we scheduled agency office visits, held discovery meetings with executives, observed workflow and documented data usage. In the discovery meetings we asked questions related to the end user’s data experience, how and when information was being used and what decisions were made using data.

Here’s a sample of our questions.

Data Consumers (Users)

  1. How well does the user understand the data?

  2. How much expertise do they have in the industry?

  3. What are some examples of industry best practices?

  4. Are customers looking for data or insights?

  5. Does the end user trust the data?

Data Consumption

  1. What are some examples of business processes being influenced by data insights?

  2. What are the top 3 questions each audience wants to answer? 

  3. When and how is data being used (e.g. daily, weekly, monthly, in account reviews)?

  4. How is information currently being displayed and disseminated?

Decision-Making

  1. What are the current metrics that measure business success?

  2. What are the key decisions made every day?

  3. What are the decisions not made or delayed because of missing data?

  4. What are the top data conversations happening today, and which ones need to be improved?

  5. What are the metrics that drive top line revenue?

  6. What business processes will be impacted by this new information?

  7. What are some example actions that might be taken as a result of insights? 

Data

  1. What are the relevant time intervals that information will be updated, distributed and reviewed?

  2. What are the most relevant time comparisons: prior week, prior month, prior year?

  3. Are these dashboard(s) meant to be exploratory or explanatory in nature?

  4. What offers the most relevant context to the end user?

Getting users to adopt 20 new reports or a single new dashboard can be challenging when habits are already in place. Your best bet for successful data product adoption is to improve existing workflows and/or meetings using the newly uncovered insights. In the case of the insurance project, customers already had access to 300 reports before we created their new analytics solution.

For the insurance project, our first phase developed three new data applications:

  1. Daily Cash Flow Application (printed) 

  2. Weekly Sales Meeting Dashboard (TV Monitor) 

  3. Monthly Carrier Marketing Meeting Presentation (2 Desktop Dashboards)

These solutions or apps solved specific problems and fit into their existing workflow.  In each case we developed a data application based on industry best practices.  

Just “knowing your audience” isn’t enough to get data requirements right.  Walking in their footsteps means understanding how their business works and how the right insights can impact it.  Some of the other benefits from this approach are:

  • Quantifiable Returns - It was easier to talk about the benefits of a data product when it was tied to a process where the time or effort saved could be measured.

  • Increased Credibility - By taking the time to walk with customers, we established credibility.

  • Improved Stickiness - Tying new applications to existing processes not only aided adoption, but made them harder to turn off over time as usage increased.

Much of what was discussed above can be found in the Juice design principles, resource page or in Data Fluency; however, the quickest way to find out more is to schedule a case study review. Click the button below, send us a message and indicate your industry. We’ll walk you through a successful case study in a relevant industry and answer your questions.



The difference between visualization and data storytelling


Preet Bharara, former United States Attorney for the Southern District of New York, shared some thoughts on writing his new book "Doing Justice" (from the New York Times Book Review podcast — starts at 21:30). These lessons mirror the challenges we see when we think about the distinction between making good visualizations and crafting data stories.

Anything of length is difficult in a different way.

Lots of restaurants have good appetizers, but to sustain a great meal through the appetizer then the main course then dessert is more difficult.

Lots of people can write an article, but to sustain a book is difficult.

Lots of movies have great opening scenes, but to sustain it for 2 hours in an arc that is paced properly is a much different thing.

I also struggled with figuring out which stories to tell and which stories not to tell...it was too much.

Also the difficulty for me was wanting to write a book that wasn't for lawyers...that is [a book that is] page turning.

Those are three succinct messages about why crafting data stories is different from making good visualizations:

  1. It takes a lot more effort to tell the whole story — with a beginning, end, and narrative flow.

  2. Don’t share all your data, just the most important stuff.

  3. Communicating to another data analyst isn’t the goal. You need to be able to communicate to people who don’t have the same foundation of understanding.

Hilburn's Law of Data Intentionality

Hilburn's Law of Data Intentionality identifies a positive correlation between the intentionality of data collection and the intentionality of data communication [citation needed].

When a person or organization makes deliberate and purposeful choices about the data gathered, they tend to place similar weight and effort into the presentation of that data. The inverse is also true: data that is not well-considered or valued is typically presented in ways that show little consideration of a specific message or purpose.

The following diagram represents this relationship between intentional data collection and intentional data presentation.

[Diagram: Hilburn's Law quadrants of intentional data collection vs. intentional data presentation]

The four quadrants represented above can be explained as follows:

(A) LOW intentionality of data collection and presentation.

It is common for large volumes of data to be gathered without premeditated thought about how the data will be used. As a result, the presentation of this data often lacks a specific purpose. An example of this scenario is web analytics platforms that gather a broad array of measures about visitors to a website without a focus on specific hypotheses or behaviors. This general dashboard or analytics tool approach asks the data audience to find their own intentionality within the data.

[Image: Web analytics dashboard]

(B) HIGH intentionality of data collection but LOW intentionality of data presentation.

We must also consider the exceptions that prove the rule. In this quadrant are the researchers who have invested time, money, and effort into gathering valuable data but neglect to carry that effort forward into data presentation to their audience. Some syndicated research studies, for example, are presented as long written reports with appendices of data tables. Healthcare analytics start-ups and data scientists can find themselves in this quadrant when they lack the time, resources, or training to properly communicate their hard-earned insights.

(C) HIGH intentionality of data collection and presentation.

This quadrant represents the scenarios where there is consistent and persistent effort to extract value from data. Data experts consider what data they want to collect, the message they find in their cultivated data, and to whom they want to communicate the results. Creators of market or customer surveys, consultants, and analytics businesses often fall into this category.


(D) LOW intentionality of data collection but HIGH intentionality of data presentation.

Finally, this quadrant is another uncommon scenario. Data visualization students and practitioners will sometimes use standard data sets (e.g. US Census data, import-export data) as an easily accessible raw material for elaborate data presentation. 

It is important to note that every situation calls for its own level of effort, intentionality, and purposefulness, so there are legitimate reasons why someone would choose to invest or not invest in either intentional data collection or presentation.

Your Healthcare Analytics Solution: From Concept to Launch in 100 Days

Below is the video recording of a Juicebox healthcare analytics product webinar. The video is about 45 minutes and includes our tips for successful launches, a quick data product demo, and a Juicebox Q&A session at the end.

Our tips include which project tasks are best done slowly (e.g. getting your data ready) and which tasks need to move fast (e.g. putting a working data product prototype in front of customers). Knowing what to do fast and what to do more slowly is a big part of the tips we share in the video.

We also touch on how Juicebox’s strengths align to support both the fast and the slow tasks, such as:

🐇 Design development agility

🐢 Enterprise-grade app development

🐇 Understandable data stories

🐢 Capture your customer needs

Enjoy the webinar. After watching, you can schedule a personal demo where we can dig into Juicebox or case studies. https://www.juiceanalytics.com/get-started

Customer-Facing Data Solutions Need a Product Mindset

If most customer data initiatives are risky, have low adoption, and often don’t meet their goals, what steps can we take to improve success rates?

What we’ve learned at Juice over 10 years and hundreds of implementations is that customer-facing data initiatives succeed when you deliver a product, not a project. You need a product mindset to reduce risks, make customers happy, and deliver revenue. Having a product mindset means:

  1. Acting like you are delivering a game release to millions of gamers for a midnight deadline.

  2. Treating every deadline as a missed product launch (with lost revenue).

  3. Recognizing that disappointing users will hurt your organization’s reputation.

Here is an illustration of how a new way of designing and delivering data products looks compared to the traditional IT BI project.

[Diagram: Juicebox product launch timeline vs. a traditional IT BI project]

Here are three things you need to do to improve the odds for your external data initiative (e.g. customer reporting, embedded analytics, or a customer-facing machine learning product).

1. Set a product launch date

With a product mindset you have a product launch, not a project plan. Product launches involve more stakeholders, like marketing and sales, so accountability is extended beyond the project manager. With many teams committed to an on-time date and increased executive visibility, there are lots of folks with skin in the game.

Products (not projects) have a revenue commitment. As a general rule, you won’t get any of marketing’s or sales’ time if there isn’t revenue attached to the initiative. One agile practice we love for getting the organization bought into the initiative is the release readiness meeting. If facilitated well, each team actively participates and has a stake in go/no-go decisions. This forces stakeholders to make commitments and take an interest in the outcome.

Missing dates isn’t the real issue; it’s the consequences of missed dates that dramatically affect your data product initiative. Missed dates make additional funding more challenging as executives question the initiative’s (and team’s) ability to meet goals. They also affect your reputation with customers and their confidence in you to deliver. Note the diagram above: as soon as the launch date is missed there is lost revenue, and that gets noticed more than a missed project date.

2. Plan for a 2.0 version from Day One

When you have a product and agile mindset, it's all about getting a small, valued first release into the market because you know there will be a 2.0 version. With a 2.0 of your data application, you’re not trying for the perfect data product, but one the market will use. This changes the way you gather requirements and prioritize features. When it's only a one-time project, it is easy to go overboard, taking weeks to define your requirements. Here’s a good article describing an MVP product’s functionality and their illustration of how to prioritize features for your initial product.

[Illustration: how to prioritize features for an MVP]

Talk about 2.0 often, particularly with executives, to set the expectation that funding the initiative is an ongoing endeavor and not just a single capital or funding request. When executives and customers know there is a 2.0, not every feature has to be in the first release, which helps reduce product complexity and the likelihood of missed dates. Success improves if you build and share prototypes and explicitly deliver an MVP.

3. Talk about apps, not reports

We’ve touched on this point before, but it bears repeating. When you think of your solution as an application, it forces you to solve a problem. With a product mindset you spend more time understanding the problem you want to solve vs. capturing feature wish lists. As a data product manager, your new product mindset and focus on a series of apps vs. one large data project forces you to segment your audiences. You can now target each audience’s specific needs. Now that you’re delivering an app, your users will better understand when and how to use it. Apps, by their nature, have training and support built in, reducing risks in implementation and adoption.

There are some challenges that will arise with an “app focus.” When delivering data apps, you’re creating a more prescriptive experience for your users. Those users or customers wanting to explore insights and define their own solutions could be disappointed. The right way to address this challenge is to view these data explorers as a unique audience with their own application needs, since exploring data and presenting results are different problems.

Sure, we’ve now taken your single data initiative and turned it into numerous projects. While it might be scary at first, it is most likely the reality of what your customers expect anyway. You didn’t expect to be able to meet the needs of all customers with a single offering, did you? With your newfound perspective you’re ready to think about your data initiative not as a single project, but eventually as a portal or app store delivering many new products.

The Product Mindset

The product mindset is rooted in adopting agile methodology. The big IT project mentality hurts your customer data initiative and adds more risk. There are many excuses for why folks don’t want to adopt a product mindset or agile for BI projects. We’ve heard them all; however, if you feel like your organization wants to be part of the data economy, then building and launching data products is in your future.

Want to learn more about our formula for successful customer data initiatives and see what a successful data product looks like? Schedule a discussion below.

What does 'Bandersnatch' teach us about data storytelling?


“TV of tomorrow is now here.” So says The Guardian. The TV show that brought us into the future is Bandersnatch, the recently released interactive television show from the Black Mirror anthology. Bandersnatch is a sort of modern reincarnation of the Choose Your Own Adventure books of your childhood. Some reviewers raved about the new experience:

"This is what Bandersnatch gave me that no other movie had ever been able to. Before you say it, I know it’s just an illusion of choice and I had no real control over what played out on screen, but it still provided me with more influence on a narrative that wasn’t my own than I’d ever had before, and I revelled in the possibilities. To me, Bandersnatch is both a movie and a game and something entirely new. It’s a lesson in human psychology, a thesis on the illusion free will, and one hell of an entertaining few hours all rolled into one. And, perhaps most importantly, it’s only the beginning.” Lauren O’Callaghan, Gamesradar

My intrigue with Bandersnatch relates to its similarities with our Juicebox-created interactive data stories. I thought it was worth examining some of the technical and narrative challenges faced by this TV experiment to find lessons for data storytelling.

The very first lesson: be careful about using the term “Choose Your Own Adventure.” Netflix is getting sued. The other lessons fall into a couple of categories: 1) functionality for interactive storytelling; 2) understanding audience needs.

Essential Functionality for Interactive Storytelling

Netflix appreciates the necessity of teaching the audience how to use the interactive experience. You are confronted with the selection mechanism from the start, even before the show properly begins. To help the audience learn how it works, the next couple of choices are trivial ones, e.g. which type of cereal or music our character would like. These interactions build a sense of comfort before some of the more dreadful decisions arrive.

[Image: Bandersnatch cereal choice scene]

In Bandersnatch, as in most analytical exercises, it is possible to make choices that result in a dead end to your story. For example, when our hero Stefan decides to produce his game with a game company, we quickly learn that this choice won’t result in a realization of his gaming vision. Netflix provides a quick mechanism to teach you that this was a wrong turn and gets you back to a place in the story where you can pick a better option. Imagine the same careful handholding in an Excel PivotTable: at the moment you choose an option that creates one of those 200-column nightmares, Excel instantly asks if you’d like to make a better choice. If only.

Perhaps Netflix’s greatest accomplishment is with their underlying technology for Bandersnatch that enables a seamless video experience. As a viewer, you make a choice and the video immediately launches you into the next chapter of the story. The same seamless flow needs to exist in data stories; make a selection, immediately see the results. It is a requirement to make users excited to explore the world you are creating.

The Challenges of Interactive Storytelling

Choose Your Own Adventure-style storytelling comes with its own challenges. The author needs to define many endings. Each permutation needs to find a way to connect — all while still developing characters and themes. In fact, one theme of Bandersnatch is that the complexity of this type of storytelling may drive a person to madness. Here’s a look at the decision-tree behind the show:

[Image: Redditor u/alpine’s flow chart of the Bandersnatch decision tree]

What else do we need to consider in telling stories of this form?

To start, we might consider how different audiences react differently to the injection of choice into their entertainment. For every person who is intrigued by asserting more control over the story, there will be others who are looking for passive entertainment.

"Even the most complex, arresting, emotionally draining show is essentially escapism because all the work is done for us.” …do people want to be the decision makers? It is hard work.

As a viewer, I will wave from the shores of traditional TV, happy to be spoonfed my entertainment and hoping that the young folk are having fun.” — The Guardian

Secondly, we have to ask: does choice come at the expense of characters, coherence, and clarity of storytelling? Bandersnatch struggled to build interesting characters. Traditional stories control the audience’s perception of a character at all times, and therefore can build a foundation of what makes that character work, layer by layer. By ceding control of that process to the audience, the author provides a collection of character “bricks” that haven’t been constructed into anything.

"It rarely deviated from the expected deviance, rarely landed in an unexpected place or – and this was where it most resembled its videogaming ancestry – had energy to spare to make the characters much more than ciphers.” — The Guardian

Finally, the audience of an interactive story has to ask themselves (just as Stefan asks): Are we really in control of our choices, or is there a hidden power that is flipping the switches? Are we only getting the illusion of choice?

Bandersnatch is mostly satire, too, but the "gameplay" jumps around a confusing timeline, making you repeat past scenes with different decisions. How you interact with your therapist, whether you agree to take drugs, and if you manage to open a secret safe, for example, all bring you down different paths and to several Game Over screens. These soft endings then send you back to earlier scenes, so you can choose the "correct" choice to further your progress. You have to do this numerous times to eventually receive a true ending. The concept of "right" and "wrong" choices bothered me and cornered me into decisions I didn't want to pick. — Elise Favis, Game Informer

For TV, this is an evolving entertainment form. In the world of data, we are also creating an evolving communication form. How do we find the balance of choice and flexibility with message? How can we engage and entertain without heaping the burden of authorship on an audience? These are questions we’ll need to continue to explore.

New Year’s Resolutions to be a Better Data Product Manager

It is the New Year, my favorite time for New Year’s resolutions. Time to look inward to see how we can change ourselves to change our world.

If you’re responsible for a data product or analytical solution, you might consider a little self-reflection in pursuit of a better solution for your customers. Here are a few places to start:


Empathy

the ability to understand and share the feelings of another.

When it comes to data products, you’ll want to foster empathy for the users of your data. More likely than not, they have concerns such as:

  • Your data may replace their power in the decision-making process.

  • They don’t have the data fluency skills to properly interpret the data and what it means for their decisions.

  • They are afraid of changes that will impact how they do their work.

Appreciating and acknowledging these fears is a first step in building trust with your users.


Learn to flow

“I would love to live like a river flows, carried by the surprise of its own unfolding.” — John O’Donohue

We are all a little guilty of wanting to make others bend to our view of how things should work. This year, you may resolve instead to “flow like water.”

Data products should enhance how people make decisions, giving them the right information at the right time. This is best accomplished when the data product can fit into the existing workflows so you are augmenting the user’s role rather than trying to change it.


Patience

“Wise to resolve, and patient to perform.” — Homer

Patience is accepting that progress takes baby steps. This is a critical skill to help manage your data product ambitions. The possibilities for analytical features can seem limitless — there are so many questions that should be asked and answered.

Beware this temptation. You’ll want to find the most impactful data first and let your users learn what they can from it. Before you try to do it all, have the patience to gather feedback and plan your next release.


Growth mindset

“People believe that their most basic abilities can be developed through dedication and hard work.” — Carol Dweck

Analytics is best served by a growth mindset, the belief that taking on a challenge (and sometimes failing) will expand one’s mind and open up new horizons. Useful analysis begets questions, which leads to more analysis and even better questions.

As a data product manager, you want to foster this growth mindset in your customers, encouraging and enabling them to expand their understanding of their world.


Inclusive

“We are less when we don't include everyone.” — Stuart Milk

Every year I tell myself I need to be better at meeting new people and keeping up with old friends. It’s a good ambition if you are leading efforts on a data product. It takes a diverse set of roles to get the support and commitment of your organization. Have you gotten legal on board? How about IT security? Do marketing and sales understand the value of your data product and who you are trying to target? You may need to change the way people think about making use of data to build company-wide support for your solution.

10 Visualizations of Juicebox

Christmas is a special time of year. We all have our favorite aspects of the season. In the spirit of Christmas and the carol “The 12 Days of Christmas,” here are the Juice team’s 10 favorite visualizations.

You will recognize some of these as your own favorites, but some are exclusive to the Juicebox platform. To learn more about the visualizations exclusive to Juicebox and Juice design schedule some time with us.


Leaderboard

An exclusive Juicebox visualization. Leaderboards are a great way to look at a dimension or group across multiple rankings. Who really is best on the team? It’s never just one metric, and a leaderboard lets you compare across multiple metrics. Here’s a video showing a leaderboard in action from a few years ago.
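To make the multi-metric idea concrete, here is a minimal sketch (not the Juicebox implementation) of how a leaderboard might rank a group across several metrics with pandas; the player names and metric columns are invented for illustration.

```python
import pandas as pd

# Hypothetical team stats; the players and metrics are made up for illustration.
stats = pd.DataFrame({
    "player": ["Avery", "Blake", "Casey", "Drew"],
    "points": [22, 18, 25, 15],
    "assists": [7, 9, 4, 11],
    "rebounds": [5, 8, 10, 6],
})

metrics = ["points", "assists", "rebounds"]

# Rank each metric independently (1 = best), then average the ranks
# to get an overall leaderboard position.
ranks = stats[metrics].rank(ascending=False, method="min").astype(int)
leaderboard = stats[["player"]].assign(
    **{f"{m}_rank": ranks[m] for m in metrics},
    avg_rank=ranks.mean(axis=1),
).sort_values("avg_rank")

print(leaderboard.to_string(index=False))
```

The takeaway is simply that “best on the team” depends on which metrics you rank and how you combine them, which is exactly what a leaderboard view makes visible.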

 
Flower

A very engaging way to compare performance across locations, such as hospitals or schools. Each entity is represented as a flower and every metric is represented by a petal. We were inspired by the work of Moritz Stefaner.

 

Ranked List

It’s not just a horizontal bar chart, but an interactive way to see a top ranking as well as a way to explore a long list. The Juicebox way of letting a user explore a long list is unique. It’s easy to understand because of its familiarity, while delivering a lot of interactivity for exploration. Here’s an example from our Notre Dame application.

 

Sankey

This visualization isn’t exclusive to Juicebox, but it is well-loved by clients because it offers an easy way to explore changes over time among groups. Our version is much easier, with more options, than the Tableau version with its dynamically generated polygons.

 
Distribution

This one is exclusive to Juicebox despite its less-than-creative name. The data is binned to show the distribution of values while also emphasizing the individual items that make up the “bars” (details appear on roll-over). A really easy, yet powerful way for users to explore their data.
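As a rough illustration of the binning idea (this is not Juicebox code, and the item names, values, and bin edges below are invented), the same “bars plus members” structure can be built with pandas:

```python
import pandas as pd

# Hypothetical scores for individual items; the values are made up for illustration.
items = pd.DataFrame({
    "name": [f"Item {i}" for i in range(1, 13)],
    "score": [12, 35, 38, 41, 47, 52, 55, 58, 63, 71, 88, 95],
})

# Bin the scores into ranges, but keep track of which items fall in each bin
# so a display layer could reveal them on roll-over.
items["bin"] = pd.cut(items["score"], bins=[0, 25, 50, 75, 100])
summary = items.groupby("bin", observed=True).agg(
    count=("name", "size"),
    members=("name", list),
)

print(summary)
```

Each row of `summary` corresponds to one “bar”: its height is the count, and its member list is what a roll-over detail view would show.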

 
Orbit

A variation on the bubble chart that shows relationships. This breaks some data visualization rules, but is helpful for exploring hierarchy and avoiding too much overlap. Like a bubble chart it uses size and color to convey information.

 
Scatter Plot

The scatter plot is a common visualization for data exploration. Juicebox adds panels and panel colors to help the user see which values are good or bad. By clicking on a panel, the user can focus on a specific group of items for action.
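Here is a minimal sketch of the panel idea (not the Juicebox implementation; the metrics and thresholds are invented): each point is assigned to a named quadrant so the display can color the regions and let a click select everything inside one of them.

```python
import pandas as pd

# Hypothetical scatter data; the columns and thresholds are made up for illustration.
df = pd.DataFrame({
    "item": ["A", "B", "C", "D", "E"],
    "cost": [120, 80, 95, 150, 60],
    "quality": [0.70, 0.90, 0.40, 0.85, 0.55],
})

cost_ok = df["cost"] <= 100         # threshold splitting the x-axis into panels
quality_ok = df["quality"] >= 0.75  # threshold splitting the y-axis into panels

# Label each point with the panel it falls into.
df["panel"] = "high cost, low quality"
df.loc[cost_ok & quality_ok, "panel"] = "good"
df.loc[cost_ok & ~quality_ok, "panel"] = "cheap but low quality"
df.loc[~cost_ok & quality_ok, "panel"] = "good but expensive"

# "Clicking a panel" is then just a filter on the label.
print(df[df["panel"] == "good"])
```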

 
Map

We love using maps as a filter for other visualizations. Additional encoding with dynamic labels further adds to a user’s understanding of the information.

 
Treemap

We’ve been doing treemaps since 2009, so they hold a special place for us. This is still one of the best ways to show hierarchical data with values that can be aggregated.

 

Key Metric

While not quite a visualization, almost every data story starts with a quick summary of metrics, often with a comparison to goals or benchmarks. A key metric visual sets the foundation for what a user will get in their dashboard or application.

 
Lollipop

The Lollipop is another good way to show comparisons among groups. Lollipop is our preferred way of sharing metrics when the metrics can be compared along a common scale. This is a good alternative to a bullet chart.

Three Types Of Context To Make Your Audience Care About Your Data

The following scene is one of the most pivotal moments in the Game of Thrones series.

To a loyal viewer, this scene represents a turning point for Tyrion. He has reached a breaking point after a lifetime of conflict with his father. His speech is the moment he sets out on a different path, a path that ultimately leads to (spoiler) the murder of his father and (unsurprisingly) a deep schism with his family.

For a new viewer, it is a courtroom confession in costume.

Your experience of entertainment is entirely different based on the context you bring. It makes a world of difference to know: Why are we in this room? Who are these characters? What are their motivations?

Context is the foundation that gives a scene something to build on. Context makes your audience care.

It is the same thing when you design a dashboard, report, or analytical interface (with less beheading and back-stabbing). Lack of context — the set-up that explains the background and motivation for the data — may be one of the primary reasons why dashboards and reports fail to connect to audiences. And it may be the reason you can’t get your colleagues to open that spreadsheet you just sent.

How do you make someone care? You want to anticipate and answer a few inevitable questions:

  1. Why does this data matter to me?

  2. What am I about to see?

  3. What actions can I take based on this data?

Let’s explore these three elements of context with a few examples.

1. Why does this data matter to me?

Context should make it clear why the information is important. At Juice, we always start designing a data story by defining the audience we want to reach. It is best if we can be specific about the kind of person and role that they play in their organization. This person has things they want to accomplish that will make them successful. A good design takes all of that into account.

When it comes time to show the data, there is no reason to be secretive about who should be engaging with the data and why it is designed for them. As an example, take the following introduction to an analytical tool, the New York Times’ Buy or Rent Calculator.


2. What am I going to see?

"Tell them what you are going to tell them, tell them, then tell them what you told them."

This famous piece of advice is often ignored by dashboard and report designers. A title isn’t enough; you should explain the scope of the content and, ideally, how the different elements fit together. Is there a structure or framework that undergirds your choice of metrics? Explain this visually before tossing your audience into the deep water.

One way that we’ve found to deliver this context is to provide an automated step-by-step tour of the content. You’ve undoubtedly experienced this approach when trying a new mobile app. The app designers walk you through the workflow and explain features. If done well, you’ve helped new users wrap their heads around what they are going to see.

The following example prominently features a descriptive legend showing how to read the glyphs.

You may also want to consider ways to help the user understand the interactions of your data interface, or even show them the types of insights they can glean from the data. Here’s a great example showing survey data about the challenges women face in different countries.


3. What actions can I take on this information? 

Finally, effective context setting explains exactly how the data can guide your audience to smarter actions. Your report or dashboard should lead to actions, not just show interesting data. It should point to what comes next.

The following example shows data about inequality in travel visas by country. For an individual, the actionable question is: For my country, where can I easily travel to? Data products are inherently personal so you want to highlight this in your context.


Context along a timeline

In summary, we can think about these three essential elements of context along a timeline. You want to explain for your audience:

1) Looking backward, what brought them to this data?

2) Now, what should they see when they engage with the information?

3) Looking forward, what action can come from exploring the data?