The 2020 Twitter Election: Explore the 20+ Democratic Candidates

Our goal at Juice is to give everyone the ability to design and share compelling data stories. We're always inspired by The New York Times information design group (and many other data story authors). We want to bring this kind of data communication to every organization.

However, we sometimes forget to share publicly all the cool stuff we can do. We’re going to fix that, starting with this data story, an exploration of the Democratic presidential candidates and their influence on Twitter. It was crafted as a passion project by our very own Susan Brake.

She set out to answer a few key questions:

How do the candidates compare in the reach of their Twitter audience?

Who has Twitter momentum?

What are the candidates saying on Twitter that is drawing the most attention?

Give it a try. I expect you’ll learn something and enjoy the journey.

If you like it, keep in mind that we work with organizations of all types — start-ups, non-profits, large enterprises, and the public sector — to help them tell the stories in their data.

2019 Data Summer Reading List

“Deep summer is when laziness finds respectability.”

Sam Keen

Now that summer is here, it’s a great time to recharge our batteries, whether that’s a much-needed vacation, a nap in the hammock, hours watching soccer games, or curling up with a good book. Here are the books that made it onto the Juice summer reading list this year. We’ve started some of them, but plan to get through the entire list by Labor Day.

[Book cover]

After hearing Alberto speak recently in Atlanta on his book tour, we added it to our list. We’re sure it will make it into the Juice reference library along with his other books.

[Book cover]

This book was on Bill Gates’s summer reading list last year, and we’re finally getting around to reading it. Each chapter tells a great story about how to think about data in the context of real life.

[Book cover]

This book has gotten a lot of interest in the data visualization community, so it was hard to ignore and not make a focal part of our summer.

[Book cover]

As Juicebox supports data storytelling at scale, we love to read anything we can get our hands on about stories. This one came highly recommended to us.

[Book cover]

We’re always up for some clever humor. This book fits the bill, and just skimming it made us laugh.


This book is a beautiful compilation of maps and hard to put down. Very enjoyable to skim and appreciate the illustrations on a rainy day or summer afternoon.

Your Data Story Needs More Than Data

Data stories use the techniques of traditional storytelling — narrative flow, context, structure — to guide audiences through data and offer flexibility to find insights relevant to them. Data may be the star, but your data story won’t cohere without a mix of additional ingredients.

There are at least four things that you’ll want to incorporate into your data story that go beyond the data visualizations:

1. Context

The first step in a data story is to set the stage. You want to explain to your readers why they should care about what you’re going to tell them. This is also an opportunity to let your readers customize the data they are seeing to make it more relevant to them.

2. Educate your readers

Before plunging your audience into a complex or innovative visualization, take some time and space to explain how that visualization works. Tooltips and gradual animation can help the user absorb how to read the visualization.

3. Explanation of insights, notes, help

Data stories shouldn’t create more questions than they answer. In some cases, you may want to be explicit about what meaning a reader should take from a visualization.

4. Actions and recommendations

A data story should lead to action. Make some space to explain what recommended actions your readers might take based on the results.

The Pretty Chart Dilemma

Whenever a client says, “We just need the charts to be pretty,” I pause and weigh my response. While the comment clearly places some value on the user experience, it misses the point of information design and effective data visualization. My dilemma then becomes: what’s the right response to this statement?

To be clear, I’m not rehashing the data visualization aesthetics debate, but wrestling with how to win over product leadership on implementing data applications the right way, i.e. functional vs. pretty. When I’m working on a project, it’s all about delivering a functional design that supports an existing workflow and leads users to a set of results they can act on. Pretty can’t do that.

The tide has definitely turned. Product leadership recognizes they must deliver well-designed data applications to customers. However, even though there are dashboard UI user stories in everyone's upcoming sprints, design is still treated as something that happens to the display (colors, fonts, and charts) and not between the display and the user (insights, actions, and to-do lists). My fear is that when mediocre designs get implemented and results are mixed, design budgets will only be cut further in the future. Without a design that gets users to take action and drives improved results, subscription attrition, tepid customer survey responses, and low adoption will continue to be the norm.

So, how should we engage with people when they disproportionately place value on pretty?

First, it's important to recognize the pretty chart dilemma in all its manifestations. It can appear as variations of the following:

  • “The charts need to look better so users will use it."

  • “Make sure the dashboard uses approved company branded colors."

  • “They just need to be prettier than today’s version."

  • “The charts should ‘pop’ for the user."

  • “Can we do a Sankey chart?"

Here’s one of my favorite examples of pretty over functional.  Note the icon relaxing in the upper right corner.  Is the call to action for Vacation Days Utilization ever to take a nap?

[Image: Necto dashboard]

Here’s how I’ve learned to tackle the pretty chart dilemma.

Address Comments Right Away

The best way to deal with the pretty chart dilemma is to address it immediately when it arises, whether that’s during the selling process or mid-project; there’s no better time than the present. One phrase I’ve repeated several times is, “We don’t focus on pretty; we’re implementing effective.” I will also emphasize that the changes and design we’re implementing are intended to impact metrics or goals such as utilization, improved feedback, or user attrition.

Change the Vocabulary

It's also important to have everyone on the team continually use “non-pretty” language with the client. Eventually, they will ask questions or just start mimicking you.  Here are the words and phrases I use and those that I try to avoid.

Encouraged Vocabulary       Discouraged Vocabulary
Useful                      Good-looking
Distinct                    Pretty
Well-labeled                Attractive
Clear call to action        Eye-pleasing
Functional                  Beautiful
Engaging                    Cool

Yes, I am avoiding “actionable” these days, as it makes me feel the same way I do when I hear “synergy.”

Highlight the Process

Be sure to highlight and emphasize the design process and your principles at work throughout the implementation. Don’t hide the science or methodology behind your design phases from the client; lead with it. Also, demonstrate how information design works well with agile development. Oftentimes the pretty chart dilemma arises when product owners are squeezed for time and want to reduce the number of points given to a UX user story in an upcoming sprint.

Focus on Users not Charts

Oftentimes the dilemma is the result of the product team having the wrong perspective. They're focused on features, sprint completion, and their own preferences vs. those of their users. This is pretty natural. Don’t be hesitant to ask questions like, “Would a user value this chart type or color palette over a clear call to action?” Consider the Juice dashboard white paper from 10 years ago as a resource to help with language or audience focus. Note the white paper’s title: it’s not about pretty or beautiful, but designing dashboards that people (users) love.

Still not sure how to talk to clients about the Pretty Chart Dilemma?  

Visit the Juice Analytics resource page. Feel free to use and share the content provided with clients and your team to deal with the dilemma, while giving us the proper attribution, of course. Need a more hands-on approach? Schedule a session to talk through the best way to handle your specific dilemma.

Getting Data Product Requirements Right

Customer data products and applications often go awry because of poor requirements. While customers can describe a billing workflow or a mobile app feature, explaining how data should be used is less clear. Merely documenting a wish list of reports, fields, and filters is a recipe for low adoption and canceled subscriptions.

To ensure that data requirements are relevant and the solution is useful to customers (and profitable, too), consider the expression “walking a mile in their shoes.” The original expression, which has evolved and been used in different forms, is “before you judge a man, walk a mile in his shoes.” Collecting good requirements is less about a laundry list of charts and metrics than about understanding how information can transform the business from how it exists today.

In 2017 I had the opportunity to work on an insurance industry project for the first time. The challenge was to deliver the industry’s first insurance agency analytics solution. The product team showed us their competitor’s dashboards and suggested we replicate them. The support team demanded more ad-hoc reporting functionality on top of the Crystal Reports report writer. Customers wanted an embedded BI tool to make themselves more data-driven. Needless to say, all parties were miffed when we accommodated none of their requests.

What we did was contrary to what everyone expected.  We didn’t talk about the data (at least not in the beginning) or ask them their report wish list, but strived to understand the drivers of success and behavior within an insurance agency.  To walk in their shoes, we scheduled agency office visits, had discovery meetings with executives, observed workflow and documented data usage.  In the discovery meetings we asked questions related to the end user’s data experience, how and when information was being used and what decisions were made using data.

Here’s a sample of our questions.

Data Consumers (Users)

  1. How well does the user understand the data?

  2. How much expertise do they have in the industry?

  3. What are some examples of industry best practices?

  4. Are customers looking for data or insights?

  5. Does the end user trust the data?

Data Consumption

  1. What are some examples of business processes being influenced by data insights?

  2. What are the top 3 questions each audience wants to answer? 

  3. When is data being used, and how (e.g. daily, weekly, monthly, in account reviews)?

  4. How is information currently being displayed and disseminated?


  1. What are the current metrics that measure business success?

  2. What are the key decisions made every day?

  3. What are the decisions not made or delayed because of missing data?

  4. What are the top data conversations had or that need to be improved?

  5. What are the metrics that drive top line revenue?

  6. What business processes will be impacted by this new information?

  7. What are some example actions that might be taken as a result of insights? 


  1. What are the relevant time intervals that information will be updated, distributed and reviewed?

  2. What are the most relevant time comparisons, prior week, prior month, prior year? 

  3. Are these dashboard(s) meant to be exploratory or explanatory in nature?

  4. What offers the most relevant context to the end user?

Getting users to adopt 20 new reports or a single new dashboard can be challenging when habits are already in place. Your best bet for successful data product adoption is to improve existing workflows and/or meetings using the newly uncovered insights. In the case of the insurance project, customers already had access to 300 reports before we created their new analytics solution.

For the insurance project, our first phase developed three new data applications:

  1. Daily Cash Flow Application (printed) 

  2. Weekly Sales Meeting Dashboard (TV Monitor) 

  3. Monthly Carrier Marketing Meeting Presentation (2 Desktop Dashboards)

These solutions, or apps, solved specific problems and fit into existing workflows. In each case we developed a data application based on industry best practices.

Just “knowing your audience” isn’t enough to get data requirements right. Walking in their footsteps means understanding how their business works and how the right insights can impact it. Some of the other benefits of this approach are:

  • Quantifiable Returns - It was easier to talk about the benefits of a data product when tied to a process where time or effort saved can be measured.

  • Increased Credibility - By taking the time to walk with customers we establish credibility.

  • Improved Stickiness - Tying new applications to existing processes not only aided adoption, but made them harder to turn off as usage increased over time.

Much of what was discussed above can be found in the Juice design principles, on our resource page, or in Data Fluency; however, the quickest way to find out more is to schedule a case study review. Click the button below, send us a message, and indicate your industry. We’ll walk you through a successful case study in a relevant industry and answer your questions.

The difference between visualization and data storytelling


Preet Bharara, former United States Attorney for the Southern District of New York, shared some thoughts on writing his new book "Doing Justice" (from the New York Times Book Review podcast — starts at 21:30). These lessons mirror the challenges we see when we think about the distinction between making good visualizations and crafting data stories.

Anything of length is difficult in a different way.

Lots of restaurants have good appetizers, but to sustain a great meal through the appetizer then the main course then dessert is more difficult.

Lots of people can write an article, but to sustain a book is difficult.

Lots of movies have great opening scenes, but to sustain it for 2 hours in an arc that is paced properly is a much different thing.

I also struggled with figuring out which stories to tell and which stories not to tell.

Also the difficulty for me was wanting to write a book that wasn't for lawyers...that is [a book that is] page turning.

That’s a succinct set of three messages about why crafting data stories is different from making a good visualization:

  1. It takes a lot more effort to tell the whole story — with a beginning, end, and narrative flow.

  2. Don’t share all your data, just the most important stuff.

  3. Communicating to another data analyst isn’t the goal. You need to be able to communicate to people who don’t have the same foundation of understanding.

Hilburn's Law of Data Intentionality

Hilburn's Law of Data Intentionality identifies a positive correlation between the intentionality of data collection and the intentionality of data communication [citation needed].

When a person or organization makes deliberate and purposeful choices about the data gathered, they tend to place similar weight and effort on the presentation of that data. The inverse relationship is also true: data that is not well-considered or valued is typically presented in ways that show little consideration of a specific message or purpose.

The following diagram represents this relationship between intentional data collection and intentional data presentation.


The four quadrants represented above can be explained as follows:

(A) LOW intentionality of data collection and presentation.

It is common for large volumes of data to be gathered without premeditated thought about how the data will be used. As a result, the presentation of this data often lacks a specific purpose. An example of this scenario is web analytics platforms that gather a broad array of measures about visitors to a website without a focus on specific hypotheses or behaviors. This general dashboard or analytics tool approach asks the data audience to find their own intentionality within the data.

Web Analytics Dashboard


(B) HIGH intentionality of data collection but LOW intentionality of data presentation.

We must also consider the exceptions that prove the rule. In this quadrant are the researchers who have invested time, money, and effort into gathering valuable data but neglect to carry that effort forward into data presentation to their audience. Some syndicated research studies, for example, are presented as long written reports with appendices of data tables. Healthcare analytics start-ups and data scientists can find themselves in this quadrant when they lack the time, resources, or training to properly communicate their hard-earned insights.

(C) HIGH intentionality of data collection and presentation.

This quadrant represents the scenarios where there is consistent and persistent effort to extract value from data. Data experts consider what data they want to collect, the message they find in their cultivated data, and to whom they want to communicate the results. Creators of market or customer surveys, consultants, and analytics businesses often fall into this category.


(D) LOW intentionality of data collection but HIGH intentionality of data presentation.

Finally, this quadrant is another uncommon scenario. Data visualization students and practitioners will sometimes use standard data sets (e.g. US Census data, import-export data) as an easily accessible raw material for elaborate data presentation. 

It is important to note that every situation calls for its own level of effort, intentionality, and purposefulness, so there are legitimate reasons why someone would choose to invest or not invest in either intentional data collection or presentation.

Your Healthcare Analytics Solution: From Concept to Launch in 100 Days

Below is the video recording of a Juicebox healthcare analytics product webinar. The video is about 45 minutes and includes our tips for successful launches, a quick data product demo, and a Juicebox Q&A session at the end.

Our tips include which project tasks are best done slowly (e.g. getting your data ready) and which need to move fast (e.g. putting a working data product prototype in front of customers). Knowing what to do fast and what to do more slowly is a big part of the tips we share in the video.

We also touch on Juicebox strengths and alignment to support fast and slow tasks, such as:

🐇 Design development agility

🐢 Enterprise-grade app development

🐇 Understandable data stories

🐢 Capture your customer needs

Enjoy the webinar. After watching you can schedule a personal demo where we can go into Juicebox or case studies.

Customer-Facing Data Solutions Need a Product Mindset

If most customer data initiatives are risky, have low adoption, and often don’t meet their goals, what steps can we take to improve success rates?

What we’ve learned at Juice over 10 years and hundreds of implementations is that customer-facing data initiatives succeed when you deliver a product, not a project. You need a product mindset to reduce risks, make customers happy, and deliver revenue. Having a product mindset means:

  1. Acting like you are delivering a game release to millions of gamers for a midnight deadline.

  2. Treating every deadline as a missed product launch (with lost revenue).

  3. Recognizing that disappointing users will hurt your organization’s reputation.

Here is an illustration of how a new way of designing and delivering data products compares to the traditional IT BI project.

[Image: Juicebox product launch]

Here are three things you need to do to improve the odds for your external data initiative, e.g. customer reporting, embedded analytics, a customer-facing machine learning product, etc.

1. Set a product launch date

With a product mindset you have a product launch, not a project plan. Product launches involve more stakeholders, like marketing and sales, so accountability extends beyond the project manager. With many teams committed to an on-time date and increased executive visibility, there are lots of folks with skin in the game.

Products (not projects) have a revenue commitment. As a general rule, you won’t get any of marketing’s or sales’ time if there isn’t revenue attached to the initiative. One agile practice we love for getting the organization bought into the initiative is the release readiness meeting. If facilitated well, each team actively participates and has a stake in go/no-go decisions. This forces stakeholders to make commitments and take interest in the outcome.

Missing dates isn’t the real issue; it’s the consequences of missed dates that dramatically affect your data product initiative. Missed dates make additional funding more challenging as executives question the initiative’s (and team’s) ability to meet goals. They affect your reputation with customers and their confidence in your ability to deliver. Note the diagram above: as soon as the launch date is missed, there is lost revenue, and that gets noticed more than a missed project date.

2. Plan for 2.0 version from Day One

When you have a product and agile mindset, it's all about getting a small, valued first release into the market, because you know there will be a 2.0 version. With a 2.0 of your data application, you’re not trying for the perfect data product, but one the market will use. This changes the way you gather requirements and prioritize features. When it's only a one-time project, it is easy to go overboard, taking weeks to define your requirements. Here’s a good article describing an MVP product’s functionality, with an illustration of how to prioritize features for your initial product.


Talk about 2.0 often, particularly with executives, to set the expectation that funding the initiative is an ongoing endeavor and not just a single capital or funding request. When executives and customers know there is a 2.0, every feature doesn’t have to be in the first release, which helps reduce product complexity and the likelihood of missed dates. Success improves if you build and share prototypes and explicitly deliver an MVP.

3. Talk about apps, not reports

We’ve touched on this point before, but it bears repeating. When you think of your solution as an application, it forces you to solve a problem. With a product mindset you spend more time on understanding the problem you want to solve vs. capturing feature wish-lists. As a data product manager, your new product mindset and focus on a series of apps vs. one large data project forces you to segment your audiences. You can now target each audience’s specific needs. Now that you’re delivering an app, your users will be more inclined to know when and how to use it. Apps, by their nature, have training and support built-in, reducing risks in implementation and adoption. 

There are some challenges that will arise with an “app focus.” When delivering data apps, you’re creating a more prescriptive experience for your users. Those users or customers who want to explore insights and define their own solutions could be disappointed. The right way to address this challenge is to view these data explorers as a unique audience with their own application needs, since exploring data and presenting results are different problems.

Sure, we’ve now taken your single data initiative and turned it into numerous projects. While that might be scary at first, it is most likely the reality of what your customers expect anyway. You didn’t expect to be able to meet the needs of all customers with a single offering, did you? With your newfound perspective, you’re ready to think about your data initiative not as a single project, but eventually as a portal or app store delivering many new products.

The Product Mindset

The product mindset is rooted in adopting agile methodology. The big IT project mentality hurts your customer data initiative and adds risk. There are many excuses for why folks don’t want to adopt a product mindset or agile for BI projects. We’ve heard them all; however, if you feel your organization wants to be part of the data economy, then building and launching data products is in your future.

Want to learn more about our formula for successful customer data initiatives and see what a successful data product looks like? Schedule a discussion below.

What does 'Bandersnatch' teach us about data storytelling?


“TV of tomorrow is now here.” So says The Guardian. The show that brought us into the future is Bandersnatch, the recently released interactive television show from the Black Mirror anthology. Bandersnatch is a sort of modern reincarnation of the Choose Your Own Adventure books of your childhood. Some reviewers raved about the new experience:

“This is what Bandersnatch gave me that no other movie had ever been able to. Before you say it, I know it’s just an illusion of choice and I had no real control over what played out on screen, but it still provided me with more influence on a narrative that wasn’t my own than I’d ever had before, and I revelled in the possibilities. To me, Bandersnatch is both a movie and a game and something entirely new. It’s a lesson in human psychology, a thesis on the illusion of free will, and one hell of an entertaining few hours all rolled into one. And, perhaps most importantly, it’s only the beginning.” — Lauren O’Callaghan, Gamesradar

For me, the intrigue of Bandersnatch lies in its similarities to our Juicebox-created interactive data stories. I thought it was worth examining some of the technical and narrative challenges faced by this TV experiment to find lessons for data storytelling.

The very first lesson: be careful about using the term “Choose Your Own Adventure.” Netflix is getting sued. The other lessons fall into two categories: 1) functionality for interactive storytelling; 2) understanding audience needs.

Essential Functionality for Interactive Storytelling

Netflix appreciates the necessity of teaching its audience how to use the interactive experience. You are confronted with the selection mechanism even before the show begins. To confirm that the audience is learning how it works, your first couple of choices are trivial ones, e.g. which type of cereal or music our character would like. These interactions build a sense of comfort before some of the more dreadful decisions arrive.


In Bandersnatch, as in most analytical exercises, it is possible to make choices that lead your story to a dead end. For example, when our hero Stefan decides to produce his game with a game company, we quickly learn that this choice won’t realize his gaming vision. Netflix provides a quick mechanism to teach you that this was a wrong turn and gets you back to a place in the story where you can pick a better option. Imagine the same careful handholding in an Excel PivotTable: at the moment you choose an option that creates one of those 200-column nightmares, Excel instantly asks if you’d like to make a better choice. If only.
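This dead-end-and-rewind mechanism can be sketched as a small branching-story structure. The following is a hypothetical illustration only — node names, scene text, and the `rewind` convention are invented for this sketch, not Netflix's actual implementation:

```python
# Hypothetical branching story: each node holds scene text and either choices
# or a "rewind" pointer. A dead-end node sends the viewer back to an earlier
# scene instead of terminating the story.
STORY = {
    "start": {
        "text": "Stefan pitches his game to the studio.",
        "choices": {"accept": "dead_end", "decline": "work_alone"},
    },
    "dead_end": {
        "text": "The game ships rushed and flops.",
        "rewind": "start",  # soft ending: loop back to an earlier scene
    },
    "work_alone": {
        "text": "Stefan builds the game his own way.",
        "choices": {},  # a true ending: no further choices
    },
}

def advance(node_id, choice=None):
    """Return the next node id: follow a choice, or rewind from a dead end."""
    node = STORY[node_id]
    if "rewind" in node:  # dead end: send the viewer back automatically
        return node["rewind"]
    # Follow the chosen branch; stay put on an ending or an invalid choice.
    return node.get("choices", {}).get(choice, node_id)
```

A data story can borrow the same pattern: when a selection yields an empty or unreadable view, route the reader back to the last useful state instead of leaving them stranded.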

Perhaps Netflix’s greatest accomplishment is with their underlying technology for Bandersnatch that enables a seamless video experience. As a viewer, you make a choice and the video immediately launches you into the next chapter of the story. The same seamless flow needs to exist in data stories; make a selection, immediately see the results. It is a requirement to make users excited to explore the world you are creating.

The Challenges of Interactive Storytelling

Choose Your Own Adventure-style storytelling comes with its own challenges. The author needs to define many endings. Each permutation needs to find a way to connect — all while still developing characters and themes. In fact, one theme of Bandersnatch is that the complexity of this type of storytelling may drive a person to madness. Here’s a look at the decision-tree behind the show:

Redditor u/alpine’s flow chart of the Bandersnatch decision tree


What else do we need to consider in telling stories of this form?

To start, we might consider how different audiences react differently to the injection of choice into their entertainment. For every person who is intrigued by asserting more control over the story, there will be others who are looking for passive entertainment.

“Even the most complex, arresting, emotionally draining show is essentially escapism because all the work is done for us. … do people want to be the decision makers? It is hard work. As a viewer, I will wave from the shores of traditional TV, happy to be spoonfed my entertainment and hoping that the young folk are having fun.” — The Guardian

Secondly, we have to ask: does choice come at the expense of characters, coherence, and clarity of storytelling? Bandersnatch struggled to build interesting characters. Traditional stories control the audience’s perception of a character at all times, and therefore can build a foundation of what makes that character work, layer by layer. By ceding control of that process to the audience, the author provides a collection of character “bricks” that haven’t been constructed into anything.

"It rarely deviated from the expected deviance, rarely landed in an unexpected place or – and this was where it most resembled its videogaming ancestry – had energy to spare to make the characters much more than ciphers.” — The Guardian

Finally, the audience of an interactive story has to ask themselves (just as Stefan asks): Are we really in control of our choices, or is there a hidden power that is flipping the switches? Are we only getting the illusion of choice?

Bandersnatch is mostly satire, too, but the "gameplay" jumps around a confusing timeline, making you repeat past scenes with different decisions. How you interact with your therapist, whether you agree to take drugs, and if you manage to open a secret safe, for example, all bring you down different paths and to several Game Over screens. These soft endings then send you back to earlier scenes, so you can choose the "correct" choice to further your progress. You have to do this numerous times to eventually receive a true ending. The concept of "right" and "wrong" choices bothered me and cornered me into decisions I didn't want to pick. — Elise Favis, Game Informer

For TV, this is an evolving entertainment form. In the world of data, we are also creating an evolving communication form. How do we find the balance of choice and flexibility with message? How can we engage and entertain without heaping the burden of authorship on an audience? These are questions we’ll need to continue to explore.