Explaining Data Teams to HR

The importance of a good team to build data solutions can’t be overemphasized. If you’ve read anything like Francois Ajenstat’s recent Forbes article or Roger Peng’s e-book on building effective data science teams, you get many of the key points. However, I would argue that in addition to these points you need to invest time with your Human Resources (HR) team and make them an integral part of the success. Developing their data literacy should be part of your objective of building a successful team.

The following isn’t a prescription for a single conversation, presentation or analysis for your HR team, but a way to develop their data literacy around what constitutes a great data team.

Skill Diversity

Your HR team will focus on inclusion and diversity, but may not understand how diverse skills and experiences contribute to creating great dashboards, models, etc. At Juice we’ve found on numerous occasions that Zach’s experience in digital marketing has opened new insights or ways of designing valuable healthcare analytics solutions. It could be just asking a different question or offering up a solution to a similar problem in a different industry (btw, funnel visualizations in healthcare are amazing). If everyone on the team is from the same industry or has similar experiences, how do we get the HR team to view this as a concern or red flag? How do we convince them to find someone with complementary skills and background?

The right way to think about skills is less about an individual’s skills than about the team’s overall skill set. Your series of HR conversations should build an understanding of what the team is good at, where their blind spots are, and what skills are needed. Also, HR needs to understand how to ask questions like, “Describe to me some of what you’ve built in Python and how users were impacted” vs. “How many years of Python experience do you have?”

Fit

One of my favorite quotes I’ve read recently on building data teams is “Hire people, not experience.” It comes from this piece on Medium by Murilo Nigris. How do we get to know people? Rather than ask them to talk about the tools they’ve worked with, convince them to tell you a story. It’s in those stories that you’ll understand their values and priorities.

For the HR team to be able to assess fit, you need to decide what your team’s values are. Do you value speed, creativity, production-quality code, or collaboration? All of the above is not a valid answer. Give your HR team 3 to 5 values to screen for. Take the time to explain why these values matter. Include examples of how someone on the team currently exhibits these values and how they make the team successful.

Defining and describing values will sound like a lot of work; however, it will be a fraction of the effort of having to let someone go because they weren’t a fit.

Data-Driven Job Descriptions

Many of the job descriptions for data positions are painful to read. The biggest miss, in my mind, is that I never really know what the person will actually be doing on Day 1 or Day 500. Your job descriptions should read more like these on the Salesloft website.

WITHIN ONE MONTH, YOU’LL:

  • Build a prototype application that will be posted on our website.

  • Complete a data visualization online class to bring your data literacy vocabulary in line with the team.

WITHIN SIX MONTHS, YOU’LL:

  • Conduct product interviews to gather feedback on existing features and to speak to upcoming features.

  • Successfully lead a scrum team by running planning meetings daily.

A nice benefit is that a job description written like this becomes the individual’s performance plan and goals if they are hired. As another example, this template from Google offers a way to think about job descriptions: https://hire.google.com/job-description-template/

Skills Assessment

Most new candidates have to go through some technical assessment. Make sure your HR team is involved with the assessment. Don’t let them punt involvement in the skills assessment because it’s “too technical”. If you can’t explain the skills assessment, or if they don’t understand its desired goals, then you have a problem. Use the opportunity to explain the assessment as one way to develop their data literacy. They can also spot blind spots in the assessment and make sure it isn’t biased.

Also, make sure they know the skills assessment changes as new technologies are adopted and implemented, so it’s never a static test.

Recruiting Talent

Often you are sharing the HR team with other departments. As a result, the amount of time that HR will actively recruit new candidates is limited. In my experience, the HR team will send you 3 to 5 candidates or resumes, and if you elect not to choose any, then you’re completely dependent on whoever finds your website.

When discussing recruiting efforts with your HR team ask the following questions:

  1. What is your time commitment on this opening?

  2. What kinds of efforts will we make to find candidates that are probably already employed?

  3. What can our team do to supplement your efforts? (What are we allowed to do?)

  4. Are there any monetary incentives for us to find our own candidates?

  5. Are OPT candidates a viable option? Do they understand OPT? (https://www.internationalstudentinsurance.com/opt/what-is-opt.php)

After your disappointment diminishes, here are some steps you can take to supplement their efforts:

  1. Have your team share the job posting link with their social networks

  2. Volunteer to present at local meetups, events, universities and conferences. Try to do at least 2 per position.

This will seem like a lot of work, but building models, visualizations, and data solutions without a full team is time consuming too. Note that the lessons above are just as applicable to bringing contractors or consultants onto your data team.

The initial HR conversations will be hard, but keep the dialogue going even when you don’t have openings.

To learn more about data culture and teams, make sure to get your copy of Data Fluency: Empowering Your Organization with Effective Data Communication. If the timeline for your customer-facing data project doesn’t include time to get HR on board and fluent, reach out to us to learn how the Juicebox platform can handle some of the challenges with getting the right data team in place.

The 2020 Twitter Election: Explore the 20+ Democratic Candidates

Our goal at Juice is to give everyone the ability to design and share compelling data stories. We’re always inspired by The New York Times information design group (and many other data story authors). We want to bring this kind of data communication to every organization.

However, we sometimes forget to share publicly all the cool stuff we can do. We’re going to fix that, starting with this data story, an exploration of the Democratic presidential candidates and their influence on Twitter. It was crafted as a passion project by our very own Susan Brake.

She set out to answer a few key questions:

How do the candidates compare in the reach of their Twitter audience?

Who has Twitter momentum?

What are the candidates saying on Twitter that is drawing the most attention?

Give it a try. I expect you’ll learn something and enjoy the journey.

If you like it, keep in mind that we work with organizations of all types — start-ups, non-profits, large enterprises, and the public sector — to help them tell the stories in their data.
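For the curious, the “momentum” question in a story like this can come down to simple arithmetic. Here is a minimal sketch in Python, assuming a hypothetical CSV of daily follower counts per candidate (not the actual data behind this story), that ranks candidates by week-over-week follower growth:

```python
import pandas as pd

# Hypothetical input (not the data behind this story), one row per
# candidate per day:
# candidate,date,followers
# Candidate A,2019-06-01,1200000
df = pd.read_csv("follower_counts.csv", parse_dates=["date"])

# Compare the latest snapshot against the one from seven days earlier.
latest = df["date"].max()
week_ago = latest - pd.Timedelta(days=7)
wide = (
    df[df["date"].isin([latest, week_ago])]
    .pivot(index="candidate", columns="date", values="followers")
)

# "Momentum" here is simply week-over-week percentage growth in followers.
wide["growth_pct"] = (wide[latest] - wide[week_ago]) / wide[week_ago] * 100
print(wide["growth_pct"].sort_values(ascending=False).round(2))
```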

2019 Data Summer Reading List

“Deep summer is when laziness finds respectability.”

Sam Keen

Now that summer is here, it’s a great time to recharge our batteries, whether that’s a much-needed vacation, a nap in the hammock, hours watching soccer games, or curling up with a good book. Here are the books that made it onto the Juice summer reading list this year. We’ve started some of them, but plan to get through the entire list by Labor Day.

[Book cover image]

After hearing Alberto speak recently in Atlanta on his book tour, we added it to our list. We’re sure it will make it into the Juice reference library along with his other books.

 
[Book cover image]

This book was on Bill Gates’ summer reading list last year, and we’re finally getting around to reading it. Each chapter tells a great story about how to think about data in the context of real life.

 
[Book cover image]

This book has gotten a lot of interest in the data visualization community, so it was hard to ignore and not make it a focal part of our summer.

 
[Book cover image]

As Juicebox supports data storytelling at scale, we love to read anything we can get our hands on about stories. This one came highly recommended to us.

 
[Book cover image]

We’re always up for some clever humor. This book fits the bill, and just skimming it made us laugh.

 
[Book cover image]

This book is a beautiful compilation of maps and hard to put down. It’s very enjoyable to skim and appreciate the illustrations on a rainy day or summer afternoon.

Your Data Story Needs More Than Data

Data stories use the techniques of traditional storytelling — narrative flow, context, structure — to guide audiences through data and offer flexibility to find insights relevant to them. Data may be the star, but your data story won’t cohere without a mix of additional ingredients.

There are at least four things that you’ll want to incorporate into your data story that go beyond the data visualizations:

1. Context

The first step in a data story is to set the stage. You want to explain to your readers why they should care about what you’re going to tell them. This is also an opportunity to let your reader customize the data they are seeing to make it more relevant to them.

2. Educate your readers

Before plunging your audience into a complex or innovative visualization, take some time and space to explain how that visualization works. Tooltips and gradual animation can help the user absorb how to read the visualization.

3. Explanation of insights, notes, help

Data stories shouldn’t create more questions than they answer. In some cases, you may want to be explicit about what meaning a reader should take from a visualization.

4. Actions and recommendations

A data story should lead to action. Make some space to explain what recommended actions your readers might take based on the results.

The Pretty Chart Dilemma

Whenever a client says, “We just need the charts to be pretty,” I pause and weigh my response. While the comment places some value on the user experience, it misses the point of information design and effective data visualization. My dilemma then becomes: what’s the right response to this statement?

To be clear, I’m not rehashing the data visualization aesthetics debate, but wrestling with how to win over product leadership on implementing data applications the right way, i.e. functional vs. pretty. When I’m working on a project, it’s all about delivering a functional design that supports an existing workflow and leads users to a set of results they can act on. Pretty can’t do that.

The tide has definitely turned. Product leadership recognizes they must deliver well-designed data applications to customers. However, even though there are dashboard UI user stories in everyone’s upcoming sprints, design is still treated as something that happens to the display (colors, fonts, and charts) and not between the display and the user (insights, actions, and to-do lists). My fear is that when mediocre designs get implemented and results are mixed, design budgets will be cut further in the future. Without a design that gets users to take action and drive improved results, subscription attrition, tepid customer survey responses, and low adoption will continue to be the norm.

So, how should we engage with people when they disproportionately place value on pretty?

First, it's important to recognize the pretty chart dilemma in all its manifestations.   It can appear as variations of the following:

  • “The charts need to look better so users will use it."

  • “Make sure the dashboard uses approved company branded colors."

  • “They just need to be prettier than today’s version."

  • “The charts should ‘pop’ for the user."

  • “Can we do a Sankey chart?"

Here’s one of my favorite examples of pretty over functional.  Note the icon relaxing in the upper right corner.  Is the call to action for Vacation Days Utilization ever to take a nap?

[Image: Necto dashboard]

Here’s how I’ve learned to tackle the pretty chart dilemma.

Address Comments Right Away

The best way to deal with the pretty chart dilemma is to address it immediately when it arises. Whether that’s during the selling process or mid-project, there’s no better time than the present. One phrase I’ve repeated several times is, “We don’t focus on pretty; we’re implementing effective.” I will also emphasize that the changes and design we’re implementing are intended to impact metrics or goals such as utilization, improved feedback, or user attrition.

Change the Vocabulary

It’s also important to have everyone on the team continually use “non-pretty” language with the client. Eventually, they will ask questions or just start mimicking you. Here are the words and phrases I use and those I try to avoid.

Encouraged Vocabulary        Discouraged Vocabulary

Useful                       Good-looking
Distinct                     Pretty
Well-labeled                 Attractive
Clear Call to Action         Eye-pleasing
Functional                   Beautiful
Engaging                     Cool

Yes, I am avoiding “actionable” these days, as it makes me feel the same way I do when I hear “synergy.”

Highlight the Process

Be sure to highlight and emphasize the design process and your principles at work throughout the implementation. Don’t hide the science or methodology at work during your design phases from the client; lead with it. Also, demonstrate how information design works well with agile development. Oftentimes the pretty chart dilemma arises when product owners are squeezed for time and want to reduce the number of points given to a UX user story in an upcoming sprint.

Focus on Users not Charts

Oftentimes the dilemma is the result of the product team having the wrong perspective. They’re focused on features, sprint completion, and their own preferences vs. those of their users. This is pretty natural. Don’t hesitate to ask questions like, “Would a user value this chart type or color palette over a clear call to action?” Consider the Juice dashboard white paper from 10 years ago as a resource to help with language or audience focus. Note the white paper title: it’s not about pretty or beautiful, but about designing dashboards that people (users) love.

Still not sure how to talk to clients about the Pretty Chart Dilemma?  

Visit the Juice Analytics resource page. Feel free to use and share the content provided with clients and your team to deal with the dilemma (while giving us the proper attribution, of course). Need a more hands-on approach? Then schedule a session to talk through the best way to handle your specific dilemma.

Getting Data Product Requirements Right

Customer data products or applications often go awry because of poor requirements. While customers can describe a billing workflow or a mobile app feature, explaining how data should be used is less clear. Merely documenting a wish list of reports, fields, and filters is a recipe for low adoption and canceled subscriptions.

To ensure that data requirements are relevant and the solution is useful to customers (and profitable too), consider the expression “walking a mile in their shoes.” The original expression, which has evolved and been used in different forms, is “before you judge a man, walk a mile in his shoes.” Collecting good requirements is less about a laundry list of charts and metrics than about understanding how information can transform the business from how it exists today.

In 2017 I had the opportunity to work on an insurance industry project for the first time. The challenge was to deliver the industry’s first insurance agency analytics solution. The product team showed us their competitor’s dashboards and suggested we replicate them. The support team demanded more ad-hoc reporting functionality on top of the Crystal Reports report writer. Customers wanted an embedded BI tool to make themselves more data-driven. Needless to say, all parties were miffed when we accommodated none of their requests.

What we did was contrary to what everyone expected. We didn’t talk about the data (at least not in the beginning) or ask for their report wish list, but strived to understand the drivers of success and behavior within an insurance agency. To walk in their shoes, we scheduled agency office visits, had discovery meetings with executives, observed workflow, and documented data usage. In the discovery meetings we asked questions related to the end user’s data experience, how and when information was being used, and what decisions were made using data.

Here’s a sample of our questions.

Data Consumers (Users)

  1. How well does the user understand the data?

  2. How much expertise do they have in the industry?

  3. What are some examples of industry best practices?

  4. Are customers looking for data or insights?

  5. Does the end user trust the data?

Data Consumption

  1. What are some examples of business processes being influenced by data insights?

  2. What are the top 3 questions each audience wants to answer? 

  3. When and how is data being used (e.g., daily, weekly, monthly, in account reviews)?

  4. How is information currently being displayed and disseminated?

Decision-Making

  1. What are the current metrics that measure business success?

  2. What are the key decisions made every day?

  3. What are the decisions not made or delayed because of missing data?

  4. What are the top data conversations being had, or that need to be improved?

  5. What are the metrics that drive top line revenue?

  6. What business processes will be impacted by this new information?

  7. What are some example actions that might be taken as a result of insights? 

Data

  1. What are the relevant time intervals at which information will be updated, distributed, and reviewed?

  2. What are the most relevant time comparisons (prior week, prior month, prior year)? (See the sketch after this list.)

  3. Are these dashboard(s) meant to be exploratory or explanatory in nature?

  4. What offers the most relevant context to the end user?
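Answers to the time-comparison question translate directly into implementation. As a minimal sketch (assuming a hypothetical table of monthly revenue, purely for illustration), prior-period comparisons in pandas might look like this:

```python
import pandas as pd

# Hypothetical input, one row per month:
# month,revenue
# 2019-01,105000
df = pd.read_csv("monthly_revenue.csv", parse_dates=["month"])
df = df.sort_values("month").set_index("month")

# Percent change vs. the prior month and vs. the same month a year earlier.
df["vs_prior_month"] = df["revenue"].pct_change(periods=1) * 100
df["vs_prior_year"] = df["revenue"].pct_change(periods=12) * 100

print(df.tail(3).round(1))
```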

Getting users to adopt 20 new reports or a single new dashboard can be challenging when habits are already in place. Your best bet for successful data product adoption is to improve existing workflows and/or meetings using the newly uncovered insights. In the case of the insurance project, customers already had access to 300 reports before we created their new analytics solution.

For the insurance project, our first phase developed three new data applications:

  1. Daily Cash Flow Application (printed) 

  2. Weekly Sales Meeting Dashboard (TV Monitor) 

  3. Monthly Carrier Marketing Meeting Presentation (2 Desktop Dashboards)

These solutions or apps solved specific problems and fit into their existing workflow.  In each case we developed a data application based on industry best practices.  

Just “knowing your audience” isn’t enough to get data requirements right. Walking in their footsteps means understanding how their business works and how the right insights can impact it. Some of the other benefits of this approach are:

  • Quantifiable Returns - It is easier to talk about the benefits of a data product when it is tied to a process where the time or effort saved can be measured.

  • Increased Credibility - By taking the time to walk with customers we establish credibility.

  • Improved Stickiness - Tying new applications to existing processes not only aided adoption, but made them harder to turn off over time as usage increased.

Much of what was discussed above can be found in the Juice design principles, on our resource page, or in Data Fluency; however, the quickest way to find out more is to schedule a case study review. Click the button below, send us a message, and indicate your industry. We’ll walk you through a successful case study in a relevant industry and answer your questions.



The difference between visualization and data storytelling


Preet Bharara, former United States Attorney for the Southern District of New York, shared some thoughts on writing his new book "Doing Justice" (from the New York Times Book Review podcast — starts at 21:30). These lessons mirror the challenges we see when we think about the distinction between making good visualizations and crafting data stories.

Anything of length is difficult in a different way.

Lots of restaurants have good appetizers, but to sustain a great meal through the appetizer then the main course then dessert is more difficult.

Lots of people can write an article, but to sustain a book is difficult.

Lots of movies have great opening scenes, but to sustain it for 2 hours in an arc that is paced properly is a much different thing.

I also struggled with figuring out which stories to tell and which stories not to tell...it was too much.

Also the difficulty for me was wanting to write a book that wasn't for lawyers...that is [a book that is] page turning.

Those are three succinct messages about why crafting data stories is different from making good visualizations:

  1. It takes a lot more effort to tell the whole story — with a beginning, end, and narrative flow.

  2. Don’t share all your data, just the most important stuff.

  3. Communicating to another data analyst isn’t the goal. You need to be able to communicate to people who don’t have the same foundation of understanding.

Hilburn's Law of Data Intentionality

Hilburn's Law of Data Intentionality identifies the existence of a positive correlation between the intentionality of data collection and the intentionality of data communication [citation needed].

When a person or organization makes deliberate and purposeful choices about the data gathered, they tend to place similar weight and effort into the presentation of that data. The reverse is also true: data that is not well-considered or valued is typically presented in ways that show little consideration of a specific message or purpose.

The following diagram represents this relationship between intentional data collection and intentional data presentation.

[Diagram: Hilburn's Law quadrants]

The four quadrants represented above can be explained as follows:

(A) LOW intentionality of data collection and presentation.

It is common for large volumes of data to be gathered without premeditated thought about how the data will be used. As a result, the presentation of this data often lacks a specific purpose. An example of this scenario is web analytics platforms that gather a broad array of measures about visitors to a website without a focus on specific hypotheses or behaviors. This general dashboard or analytics tool approach asks the data audience to find their own intentionality within the data.

[Image: web analytics dashboard]

(B) HIGH intentionality of data collection but LOW intentionality of data presentation.

We must also consider the exceptions that prove the rule. In this quadrant are the researchers who have invested time, money, and effort into gathering valuable data but neglect to carry that effort forward into data presentation to their audience. Some syndicated research studies, for example, are presented as long written reports with appendices of data tables. Healthcare analytics start-ups and data scientists can find themselves in this quadrant when they lack the time, resources, or training to properly communicate their hard-earned insights.

(C) HIGH intentionality of data collection and presentation.

This quadrant represents scenarios where there is consistent and persistent effort to extract value from data. Data experts consider what data they want to collect, the message they find in their cultivated data, and to whom they want to communicate the results. Creators of market or customer surveys, consultants, and analytics businesses often fall into this category.


(D) LOW intentionality of data collection but HIGH intentionality of data presentation.

Finally, this quadrant is another uncommon scenario. Data visualization students and practitioners will sometimes use standard data sets (e.g. US Census data, import-export data) as easily accessible raw material for elaborate data presentations.

It is important to note that every situation calls for its own level of effort, intentionality, and purposefulness, so there are legitimate reasons why someone would choose to invest or not invest in either intentional data collection or presentation.
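To make the quadrant labels concrete, here is a minimal sketch in Python. The 0-to-10 scores and the midpoint threshold are illustrative assumptions, not part of the law itself:

```python
def quadrant(collection: float, presentation: float, midpoint: float = 5.0) -> str:
    """Map intentionality of data collection and presentation (0-10 scores,
    an illustrative scale) to one of the four quadrants described above."""
    high_collect = collection >= midpoint
    high_present = presentation >= midpoint
    if high_collect and high_present:
        return "C: intentional collection and presentation (e.g. survey-based data story)"
    if high_collect:
        return "B: intentional collection, unconsidered presentation (e.g. report appendix)"
    if high_present:
        return "D: stock data, elaborate presentation (e.g. student visualization project)"
    return "A: unconsidered collection and presentation (e.g. generic web analytics)"


print(quadrant(collection=8, presentation=2))  # falls in quadrant B
```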

Your Healthcare Analytics Solution: From Concept to Launch in 100 Days

Below is the video recording of a Juicebox healthcare analytics product webinar. The video is about 45 minutes and includes our tips for successful launches, a quick data product demo, and a Juicebox Q&A session at the end.

Our tips include which project tasks are best done slowly (e.g. getting your data ready) and which tasks need to move fast (e.g. putting a working data product prototype in front of customers). Knowing what to do fast and what to do more slowly is a big part of the tips we share in the video.

We also touch on Juicebox strengths and alignment to support fast and slow tasks, such as:

🐇 Design development agility

🐢 Enterprise-grade app development

🐇 Understandable data stories

🐢 Capture your customer needs

Enjoy the webinar. After watching you can schedule a personal demo where we can go into Juicebox or case studies. https://www.juiceanalytics.com/get-started

Customer-Facing Data Solutions Need a Product Mindset

If most customer data initiatives are risky, have low adoption, and often don’t meet their goals, what steps can we take to improve success rates?

What we’ve learned at Juice over 10 years and hundreds of implementations is that customer-facing data initiatives succeed when you deliver a product, not a project. You need a product mindset to reduce risks, make customers happy, and deliver revenue. Having a product mindset means:

  1. Acting like you are delivering a game release to millions of gamers for a midnight deadline.

  2. Treating every deadline as a missed product launch (with lost revenue).

  3. Recognizing that disappointing users will hurt your organization’s reputation.

Here is an illustration of how a new way of designing and delivering data products looks compared to the traditional IT BI project.

[Diagram: Juicebox product launch vs. traditional IT BI project]

Here are three things you need to do to improve your odds for your external data initiative, i.e. customer reporting, embedded analytics, customer-facing machine learning product, etc.

1. Set a product launch date

With a product mindset, you have a product launch and not a project plan. Product launches involve more stakeholders, like marketing and sales, so accountability is extended beyond the project manager. With many teams committed to an on-time date and increased executive visibility, there are lots of folks with skin in the game.

Products (not projects) have a revenue commitment. As a general rule, you won’t get any of marketing’s or sales’ time if there isn’t revenue attached to the initiative. One agile practice we love for getting the organization bought into the initiative is the release readiness meeting. If facilitated well, each team actively participates and has a stake in go/no-go decisions. This forces stakeholders to make commitments and take interest in the outcome.

Missing dates isn’t the real issue; it’s the consequences of missed dates that dramatically affect your data product initiative. Missed dates make additional funding more challenging as executives question the initiative’s (and team’s) ability to meet goals. They affect your reputation with customers and their confidence in you to deliver. Note the diagram above: as soon as the launch date is missed, there is lost revenue, and that gets noticed more than a missed project date.

2. Plan for a 2.0 version from Day One

When you have a product and agile mindset, it’s all about getting a small, valued first release into the market, because you know there will be a 2.0 version. With a 2.0 of your data application, you’re not trying for the perfect data product, but one the market will use. This changes the way you gather requirements and prioritize features. When it’s only a one-time project, it is easy to go overboard, taking weeks to define your requirements. Here’s a good article describing an MVP product’s functionality, along with an illustration of how to prioritize features for your initial product.

[Illustration: MVP feature prioritization]
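The article’s framework isn’t reproduced here, but one common way to operationalize MVP prioritization is a simple value-versus-effort score. Here is a minimal sketch, with hypothetical feature names, scores, and release cutoff:

```python
# Hypothetical features scored by customer value and build effort (1-5 scales).
features = [
    {"name": "core dashboard", "value": 5, "effort": 2},
    {"name": "scheduled email export", "value": 3, "effort": 1},
    {"name": "custom report builder", "value": 4, "effort": 5},
    {"name": "sankey chart", "value": 1, "effort": 3},
]

# Rank by value-to-effort ratio; using a ratio of at least 1.0 to mark the
# 1.0 release is an illustrative cutoff, not a rule from the article.
for f in sorted(features, key=lambda f: f["value"] / f["effort"], reverse=True):
    ratio = f["value"] / f["effort"]
    target = "1.0" if ratio >= 1.0 else "2.0"
    print(f"{f['name']:<24} ratio={ratio:.1f} -> release {target}")
```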

Talk about 2.0 often, particularly with executives, to set the expectation that funding the initiative is an ongoing endeavor and not just a single capital or funding request. When executives and customers know there is a 2.0, then every feature doesn’t have to be in the first release, which helps reduce product complexity and the likelihood of missed dates. Success improves if you build and share prototypes and explicitly deliver an MVP.

3. Talk about apps, not reports

We’ve touched on this point before, but it bears repeating. When you think of your solution as an application, it forces you to solve a problem. With a product mindset, you spend more time understanding the problem you want to solve vs. capturing feature wish lists. As a data product manager, your new product mindset and focus on a series of apps vs. one large data project forces you to segment your audiences. You can now target each audience’s specific needs. Now that you’re delivering an app, your users will be more inclined to know when and how to use it. Apps, by their nature, have training and support built in, reducing risks in implementation and adoption.

There are some challenges that will arise with an “app focus”. When delivering data apps, you’re creating a more prescriptive experience for your users. Those users or customers who want to explore insights and define their own solutions could be disappointed. The right way to address this challenge is to view these data explorers as a unique audience with their own application needs, since exploring data and presenting results are different problems.

Sure, we’ve now taken your single data initiative and made it into numerous projects. While it might be scary at first, it is most likely the reality of what your customers expect anyway. You didn’t expect to be able to meet the needs of all customers with a single offering, did you? With your newfound perspective, you’re ready to think about your data initiative not as a single project, but eventually as a portal or app store delivering many new products.

The Product Mindset

The product mindset is rooted in adopting agile methodology. The big IT project mentality hurts your customer data initiative and adds more risk. There are many excuses as to why folks don’t want to adopt a product mindset or agile for BI projects. We’ve heard them all; however, if you feel like your organization wants to be part of the data economy, then building and launching data products is in your future.

Want to learn more about our formula for successful customer data initiatives and see what a successful data product looks like? Then schedule a discussion below.