Make Your Data Relatable

I’m a big admirer of Nancy Duarte and her new book DataStory. One thing that Nancy knows is that data communication isn’t just about data visualization — any more than a movie is just about moving images. Audiences are moved by a movie because of so much more: storyline, characters, conflict, context, mood, specificity, meaning, and relatability.

She makes this point in a recent HBR article:

The more data we collect, the more mind-boggling these figures become. Though an audience may intellectually understand the measurement, they might fail to relate or connect with it emotionally. For numbers to inspire action, they have to do more than make sense — they have to make meaning.

Connecting to people requires connecting to things they can relate to. When we’re learning a word in a new language, pointing at the picture of a biblioteca can be the easiest way to make the connection. So too in teaching the language of data.

Let’s check out some examples of ways to relate a data value to something your audience already understands.

In a simple example, we might modify a statement like “sales of hydration bottles are expected to reach $10.3 billion by 2023…” by appending a comparison: “…that’s more than twice the size of the current fiction book market.” Now your audience can relate — we’re spending twice as much on water bottles as we are on summer reading.
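The arithmetic behind such comparisons is trivial, but worth standardizing if you make them often. Here’s a minimal, hypothetical Python helper (the function name and the reference figure below are illustrative assumptions, not sourced numbers):

```python
def relate(value, reference_value, reference_name):
    """Express a value as a multiple of a quantity the audience already knows."""
    ratio = value / reference_value
    return f"that's about {ratio:.1f}x the size of {reference_name}"

# Hypothetical reference figure, for illustration only
print(relate(10.3e9, 5.0e9, "the current fiction book market"))
# → that's about 2.1x the size of the current fiction book market
```

The point isn’t the code — it’s picking a `reference_value` your audience already has a feel for.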

Duarte shares this excellent example from Neil deGrasse Tyson:

A site called TheTrueSize shows us how the United States, India, and China can all fit within the geography of Africa.


Choosing relatable units can make all the difference. Thus, “a banana for scale.” The New York Times expresses the amount of water used to grow produce in (a) gallons of water and (b) a typical serving size.


In the ‘Wait But Why’ article “From 1 to 1,000,000,” the authors show how to use unit charts to provide an intuitive sense of scale when the raw numbers alone might fail to convey meaning.
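A unit chart is easy to prototype even in plain text. Here’s a hedged Python sketch (the function name, mark symbol, and row width are my choices, not from the article) that renders a count as rows of marks, each mark standing for a fixed number of units:

```python
def unit_chart(count, units_per_symbol, per_row=50, symbol="▪"):
    """Render `count` as rows of symbols, each representing `units_per_symbol` units."""
    n_symbols = round(count / units_per_symbol)
    # Build full rows of `per_row` marks, with a shorter final row if needed
    rows = [symbol * min(per_row, n_symbols - start)
            for start in range(0, n_symbols, per_row)]
    return "\n".join(rows)

# 1,000,000 at 2,500 units per mark: 400 marks laid out in 8 rows of 50
print(unit_chart(1_000_000, 2_500))
```

Seeing 400 marks fill the screen conveys “a million” far better than the digits do.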


Send me your examples and I’ll add them in.

Teach Once, Use Often

“Can’t we sprinkle in some pie charts just to change things up?”

If you’re in the information design business, you’ve probably heard some version of this request. Variety is the spice of life, right?

I say: Get your spice elsewhere. Your audience has no interest in learning a bunch of different chart types, or in carrying the unnecessary cognitive load that comes with them. Instead, teach once, use often.

Here’s a great example from a Pudding data story that explores “How many high school stars make it in the NBA?” The following visualization is a little complex and non-standard, and will take readers a moment to grasp. Each dot represents a top-100 high school player and the progress they made on their way to stardom in the NBA.

Why ask your readers to learn this new Plinko-style visualization?

Because once they understand it, the data story goes on to use the same structure many times. Different segments of players (e.g., top 10 high school players, straight-to-NBA players, players from specific colleges) are shown in the same model, each time exposing a new facet of insight into the data. By the second or third iteration, I’d expect most readers to understand the visual vocabulary and focus on what they can learn from the data. And that’s the point of data visualization — not to create a spicy grab-bag of charts.

Toward a Data Personality Framework

With all the talk about Data Literacy (led by folks like @Ben Jones, @Jordan Morrow, @Valerie Logan) and/or Data Fluency (👀@Dalton Ruer), the time is right for a more rigorous methodology for understanding the audiences for data dialogue.

Which got me thinking: What if there were a Myers-Briggs-style personality type indicator for data personalities? It could be used to predict how someone will respond to data when it is presented and, by extension, what the best ways are to get the desired outcome.

I’d like to share a framework for profiling Data Personalities. Like Myers-Briggs, it has four dimensions, on each of which an individual can exist on a spectrum between two extremes.


Here’s how I think about each dimension:

  1. Decision-making approach. How does this person integrate data into their decision-making process? Do they lean on data to guide their thinking or are they more likely to depend on their experience and instinct?

  2. Types of decisions. Is this person in a role where they make decisions that have a long-term and strategic perspective, balancing more uncertainty and more sources of information? Alternatively, are most of this person’s decisions more real-time, tactical, or operational in nature?

  3. Experience with data. Does this person have a high level of data literacy — comfort analyzing, communicating, and interpreting data? Or are you dealing with someone who is a relative novice at working with data and who may express discomfort with it?

  4. Role in the organization. Is this person in a role where they can make decisions directly from the data they are presented? Alternatively, will this person take your data and use it to influence other people?

As a data author intent on encouraging smarter decisions, there is nothing more important than understanding your audience. The Data Personality Profile (DPP) is a good place to start. Now we just need some data.
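To start collecting that data, the four dimensions map naturally onto a simple data structure. Here’s a hypothetical Python sketch (the field names, the 0-to-1 scale, and the trait labels are my assumptions, not part of any published framework):

```python
from dataclasses import dataclass

@dataclass
class DataPersonalityProfile:
    """Sketch of a DPP: each dimension is a position between two extremes (0.0 to 1.0)."""
    decision_approach: float  # 0 = instinct/experience, 1 = leans on data
    decision_types: float     # 0 = tactical/operational, 1 = strategic/long-term
    data_experience: float    # 0 = data novice, 1 = highly data-literate
    org_role: float           # 0 = influences others with data, 1 = decides directly

    def dominant_traits(self, threshold=0.5):
        """Label each dimension by which end of its spectrum the person falls on."""
        labels = {
            "decision_approach": ("instinct-driven", "data-driven"),
            "decision_types": ("tactical", "strategic"),
            "data_experience": ("novice", "data-literate"),
            "org_role": ("influencer", "direct decision-maker"),
        }
        return {dim: high if getattr(self, dim) >= threshold else low
                for dim, (low, high) in labels.items()}

# A hypothetical audience member: data-driven, tactical, experienced, mixed role
analyst = DataPersonalityProfile(0.9, 0.3, 0.8, 0.6)
```

A profile like this could then shape how you present: a “tactical, data-literate” reader gets dense detail; an “instinct-driven influencer” gets the headline and the story.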

Explaining Data Teams to HR

The importance of a good team to build data solutions can’t be overemphasized. If you’ve read anything like Francois Ajenstat’s recent Forbes article or Roger Peng’s e-book on building effective data science teams, you get many of the key points. However, I would argue that in addition you need to invest time with your Human Resources (HR) team and make them an integral part of your success. Developing their data literacy should be part of your plan for building a successful team.

The following isn’t a prescription for a single conversation, presentation or analysis for your HR team, but a way to develop their data literacy around what constitutes a great data team.

Skill Diversity

Your HR team will focus on inclusion and diversity, but may not understand how diverse skills and experiences contribute to creating great dashboards, models, and so on. At Juice we’ve found on numerous occasions that Zach’s experience in digital marketing has opened up new insights or ways of designing valuable healthcare analytics solutions. It could be as simple as asking a different question or offering a solution from a similar problem in a different industry (by the way, funnel visualizations in healthcare are amazing). If everyone on the team is from the same industry or has similar experiences, how do we get the HR team to view this as a concern or red flag? How do we convince them to find someone with complementary skills and background?

The right way to think about skills is less about an individual’s skills and more about the team’s overall skill set. Your series of HR conversations should build an understanding of what the team is good at, where their blind spots are, and what skills are needed. HR also needs to understand how to ask questions like, “Describe some of what you’ve built in Python and how users were impacted” rather than “How many years of Python experience do you have?”


One of my favorite quotes I’ve read recently on building data teams is “Hire people, not experience.” It comes from this piece on Medium by Murilo Nigris. How do we get to know people? Rather than asking them to talk about the tools they’ve worked with, invite them to tell you a story. It’s in those stories that you’ll understand their values and priorities.

For the HR team to be able to assess fit, you need to decide what your team’s values are. Do you value speed, creativity, production-quality code, or collaboration? “All of the above” is not a valid answer. Give your HR team three to five values to screen for. Take the time to explain why these values matter. Include examples of how someone currently on the team exhibits these values and how they make the team successful.

Defining and describing values will sound like a lot of work; however, it will be a fraction of the effort of having to let someone go because they weren’t a fit.

Data-Driven Job Descriptions

Many of the job descriptions I read for data positions are painful to read. The biggest miss, in my mind, is that I never really know what the person will actually be doing on Day 1 or Day 500. Your job descriptions should read more like these on the Salesloft website:


  • Build a prototype application that will be posted on our website.

  • Complete a data visualization online class to bring your data literacy vocabulary in line with the team’s.


  • Conduct product feedback interviews to gather input on existing features and discuss upcoming features.

  • Successfully lead a scrum team by running planning meetings daily.

A nice benefit is that a job description written like this becomes the individual’s performance plan and goals if they are hired. As another example, here’s a template from Google that offers a way to think about job descriptions.

Skills Assessment

Most new candidates have to go through some technical assessment. Make sure your HR team is involved with the assessment. Don’t let them punt on the skills assessment because it’s “too technical.” If you can’t explain the skills assessment, or if they don’t understand its goals, then you have a problem. Use the opportunity to explain the assessment as one way to develop their data literacy. They can also spot blind spots in the assessment and help make sure it isn’t biased.

Also, make sure they know the skills assessment changes as new technologies are adopted and implemented, so it’s never a static test.

Recruiting Talent

Often you are sharing the HR team with other departments. As a result, the amount of time that HR will actively recruit new candidates is limited. In my experience, the HR team will send you three to five candidates or resumes, and if you elect not to choose any, then you’re completely dependent on whoever finds your website.

When discussing recruiting efforts with your HR team ask the following questions:

  1. What is your time commitment on this opening?

  2. What kinds of efforts will we make to find candidates who are probably already employed?

  3. What can our team do to supplement your efforts? (What are we allowed to do?)

  4. Are there any monetary incentives for us to find our own candidates?

  5. Are OPT candidates a viable option? Do they understand OPT?

After your disappointment diminishes, here are some steps you can take to supplement their efforts:

  1. Have your team share the job posting link with their social networks

  2. Volunteer to present at local meetups, events, universities and conferences. Try to do at least 2 per position.

This will seem like a lot of work, but building models, visualizations and data solutions without a full team is time consuming too. Note that the lessons above are very applicable to bringing on contractors or consultants to your data team as well.

The initial HR conversations will be hard, but keep the dialogue going even when you don’t have openings.

To learn more about data culture and teams, make sure to get your copy of Data Fluency: Empowering Your Organization with Effective Data Communication. If the timeline for your customer-facing data project doesn’t include time to get HR on board and fluent, reach out to us to learn how the Juicebox platform can handle some of the challenges of getting the right data team in place.

The 2020 Twitter Election: Explore the 20+ Democratic Candidates

Our goal at Juice is to give everyone the ability to design and share compelling data stories. We're always inspired by the New York Times information design group (and many other data story authors). We want to bring this kind of data communication to every organization.

However, we sometimes forget to share publicly all the cool stuff we can do. We’re going to fix that, starting with this data story, an exploration of the Democratic presidential candidates and their influence on Twitter. It was crafted as a passion project by our very own Susan Brake.

She set out to answer a few key questions:

How do the candidates compare in the reach of their Twitter audience?

Who has Twitter momentum?

What are the candidates saying on Twitter that is drawing the most attention?

Give it a try. I expect you’ll learn something and enjoy the journey.

If you like it, keep in mind that we work with organizations of all types — start-ups, non-profits, large enterprises, and the public sector — to help them tell the stories in their data.

2019 Data Summer Reading List

“Deep summer is when laziness finds respectability.”

Sam Keen

Now that summer is here, it’s a great time to recharge our batteries, whether that’s with a much-needed vacation, a nap in the hammock, hours watching soccer games, or curling up with a good book. Here are the books that made it onto the Juice summer reading list this year. We’ve started some of them, but plan to get through the entire list by Labor Day.

[Book cover image]

After hearing Alberto speak recently in Atlanta on his book tour we added it to our list. We’re sure it will make it to the Juice reference library along with his other books.

[Book cover image]

This book was on Bill Gates’ summer reading list last year, and we’re finally getting around to reading it. Each chapter tells a great story about how to think about data in the context of real life.

[Book cover image]

This book has gotten a lot of interest in the data visualization community, so it’s hard to ignore and not make a focal part of our summer.

[Book cover image]

As Juicebox supports data storytelling at scale, we love to read anything we can get our hands on about stories. This one came highly recommended to us.

[Book cover image]

We’re always up for some clever humor. This book fits the bill, and just skimming it made us laugh.


This book is a beautiful compilation of maps and is hard to put down. It’s very enjoyable to skim and appreciate the illustrations on a rainy day or summer afternoon.

Your Data Story Needs More Than Data

Data stories use the techniques of traditional storytelling — narrative flow, context, structure — to guide audiences through data and offer flexibility to find insights relevant to them. Data may be the star, but your data story won’t cohere without a mix of additional ingredients.

There are at least four things that you’ll want to incorporate into your data story that go beyond the data visualizations:

1. Context

The first step in a data story is to set the stage. You want to explain to your readers why they should care about what you’re going to tell them. This is also an opportunity to let readers customize the data they are seeing to make it more relevant to them. A couple of good examples:

2. Educate your readers

Before plunging your audience into a complex or innovative visualization, take some time and space to explain how that visualization works. Tooltips and gradual animation can help the user absorb how to read the visualization. Try these examples out:

3. Explanation of insights, notes, help

Data stories shouldn’t create more questions than they answer. In some cases, you may want to be explicit about what meaning a reader should take from a visualization.

4. Actions and recommendations

A data story should lead to action. Make some space to explain what recommended actions your readers might take based on the results.

The Pretty Chart Dilemma

Whenever a client says, “We just need the charts to be pretty,” I pause and weigh my response. While the comment places some value on the user experience, it misses the point of information design and effective data visualization. My dilemma then becomes: what’s the right response to this statement?

To be clear, I’m not rehashing the data visualization aesthetics debate, but wrestling with how to win over product leadership on implementing data applications the right way, i.e., functional vs. pretty. When I’m working on a project, it’s all about delivering a functional design that supports an existing workflow and leads users to a set of results they can act on. Pretty can’t do that.

The tide has definitely turned. Product leadership recognizes they must deliver well-designed data applications to customers. However, even though there are dashboard UI user stories in everyone's upcoming sprints, design is still treated as something that happens to the display (colors, fonts, and charts) and not between the display and the user (insights, actions, and to-do lists). My fear is that when mediocre designs get implemented and results are mixed, design budgets will only be cut further in the future. Without a design that gets users to take action and drives improved results, subscription attrition, tepid customer survey responses, and low adoption will continue to be the norm.

So, how should we engage with people when they disproportionately place value on pretty?

First, it's important to recognize the pretty chart dilemma in all its manifestations.   It can appear as variations of the following:

  • “The charts need to look better so users will use it.”

  • “Make sure the dashboard uses approved company branded colors.”

  • “They just need to be prettier than today’s version.”

  • “The charts should ‘pop’ for the user.”

  • “Can we do a Sankey chart?”

Here’s one of my favorite examples of pretty over functional.  Note the icon relaxing in the upper right corner.  Is the call to action for Vacation Days Utilization ever to take a nap?

[Necto dashboard screenshot]

Here’s how I’ve learned to tackle the pretty chart dilemma.

Address Comments Right Away

The best way to deal with the pretty chart dilemma is to address it immediately when it arises, whether that’s during the selling process or mid-project; there’s no better time than the present. One phrase I’ve repeated several times is, “We don’t focus on pretty; we implement effective.” I will also emphasize that the changes and design we’re implementing are intended to move metrics or goals such as utilization, improved feedback, or user attrition.

Change the Vocabulary

It's also important to have everyone on the team continually use “non-pretty” language with the client. Eventually, they will ask questions or just start mimicking you.  Here are the words and phrases I use and those that I try to avoid.

Encouraged Vocabulary     Discouraged Vocabulary
Useful                    Good-looking
Distinct                  Pretty
Well-labeled              Attractive
Clear Call to Action      Eye-pleasing
Functional                Beautiful
Engaging                  Cool

Yes, I am avoiding “actionable” these days, as it makes me feel the same way I do when I hear “synergy.”

Highlight the Process

Be sure to highlight and emphasize the design process and your principles at work throughout the implementation. Don’t hide the science or methodology behind your design phases from the client; be sure to lead with it. Also, demonstrate how information design works well with agile development. Oftentimes the pretty chart dilemma arises when product owners are squeezed for time and want to reduce the number of points given to a UX user story in an upcoming sprint.

Focus on Users not Charts

Oftentimes the dilemma is the result of the product team having the wrong perspective. They're focused on features, sprint completion, and their own preferences rather than those of their users. This is pretty natural. Don’t hesitate to ask questions like, “Would a user value this chart type or color palette over a clear call to action?” Consider the Juice dashboard white paper from 10 years ago as a resource to help with language or audience focus. Note the white paper’s title: it’s not about pretty or beautiful, but about designing dashboards that people (users) love.

Still not sure how to talk to clients about the pretty chart dilemma?

Visit the Juice Analytics resource page. Feel free to use and share the content provided with clients and your team to deal with the dilemma, while giving us proper attribution, of course. Need a more hands-on approach? Schedule a session to talk through the best way to handle your specific dilemma.

Getting Data Product Requirements Right

Often, customer data products or applications go awry because of poor requirements. While customers can describe a billing workflow or a mobile app feature, explaining how data should be used is less clear. Merely documenting a wish list of reports, fields, and filters is a recipe for low adoption and canceled subscriptions.

To ensure that data requirements are relevant and the solution is useful to customers (and profitable too), consider the expression “walking a mile in their shoes.” The original saying, which has evolved and been used in different forms, is “before you judge a man, walk a mile in his shoes.” Collecting good requirements is less about a laundry list of charts and metrics and more about understanding how information can transform the business from how it operates today.

In 2017 I had the opportunity to work on an insurance industry project for the first time. The challenge was to deliver the industry’s first insurance agency analytics solution. The product team showed us their competitors’ dashboards and suggested we replicate them. The support team demanded more ad hoc reporting functionality on top of the Crystal Reports report writer. Customers wanted an embedded BI tool to make themselves more data-driven. Needless to say, all parties were miffed when we accommodated none of their requests.

What we did was contrary to what everyone expected. We didn’t talk about the data (at least not in the beginning) or ask for a report wish list; instead, we strove to understand the drivers of success and behavior within an insurance agency. To walk in their shoes, we scheduled agency office visits, held discovery meetings with executives, observed workflows, and documented data usage. In the discovery meetings we asked questions about the end user’s data experience, how and when information was being used, and what decisions were made using data.

Here’s a sample of our questions.

Data Consumers (Users)

  1. How well does the user understand the data?

  2. How much expertise do they have in the industry?

  3. What are some examples of industry best practices?

  4. Are customers looking for data or insights?

  5. Does the end user trust the data?

Data Consumption

  1. What are some examples of business processes being influenced by data insights?

  2. What are the top 3 questions each audience wants to answer? 

  3. When and how is data being used (e.g., daily, weekly, monthly, in account reviews)?

  4. How is information currently being displayed and disseminated?


Business Decisions

  1. What are the current metrics that measure business success?

  2. What are the key decisions made every day?

  3. What are the decisions not made or delayed because of missing data?

  4. What are the top data conversations had or that need to be improved?

  5. What are the metrics that drive top line revenue?

  6. What business processes will be impacted by this new information?

  7. What are some example actions that might be taken as a result of insights? 


Timing and Context

  1. What are the relevant time intervals at which information will be updated, distributed, and reviewed?

  2. What are the most relevant time comparisons (prior week, prior month, prior year)?

  3. Are these dashboard(s) meant to be exploratory or explanatory in nature?

  4. What offers the most relevant context to the end user?

Getting users to adopt 20 new reports or a single new dashboard can be challenging when habits are already in place. Your best bet for successful data product adoption is to improve existing workflows and/or meetings using the newly uncovered insights. In the case of the insurance project, customers already had access to 300 reports before we created their new analytics solution.

For the insurance project, our first phase developed three new data applications:

  1. Daily Cash Flow Application (printed) 

  2. Weekly Sales Meeting Dashboard (TV Monitor) 

  3. Monthly Carrier Marketing Meeting Presentation (2 Desktop Dashboards)

These solutions, or apps, solved specific problems and fit into existing workflows. In each case we developed a data application based on industry best practices.

Just “knowing your audience” isn’t enough to get data requirements right. Walking in their footsteps means understanding how their business works and how the right insights can impact it. Some of the other benefits of this approach are:

  • Quantifiable Returns - It was easier to talk about the benefits of a data product when tied to a process where time or effort saved can be measured.

  • Increased Credibility - By taking the time to walk with customers we establish credibility.

  • Improved Stickiness - Tying new applications to existing processes not only aided adoption, but made them harder to turn off over time as usage increased.

Much of what’s discussed above can be found in the Juice design principles, on our resource page, or in Data Fluency; however, the quickest way to find out more is to schedule a case study review. Click the button below, send us a message, and indicate your industry. We’ll walk you through a successful case study in a relevant industry and answer your questions.

The difference between visualization and data storytelling


Preet Bharara, former United States Attorney for the Southern District of New York, shared some thoughts on writing his new book "Doing Justice" (from the New York Times Book Review podcast — starts at 21:30). These lessons mirror the challenges we see when we think about the distinction between making good visualizations and crafting data stories.

Anything of length is difficult in a different way.

Lots of restaurants have good appetizers, but to sustain a great meal through the appetizer then the main course then dessert is more difficult.

Lots of people can write an article, but to sustain a book is difficult.

Lots of movies have great opening scenes, but to sustain it for 2 hours in an arc that is paced properly is a much different thing.

I also struggled with figuring out which stories to tell and which stories not to tell; there was too much.

Also the difficulty for me was wanting to write a book that wasn't for lawyers...that is [a book that is] page turning.

Those are three succinct messages about why crafting data stories is different from making good visualizations:

  1. It takes a lot more effort to tell the whole story — with a beginning, end, and narrative flow.

  2. Don’t share all your data, just the most important stuff.

  3. Communicating to another data analyst isn’t the goal. You need to be able to communicate to people who don’t have the same foundation of understanding.