
Survey Results: Are the Viz-Pundits Really Helping?

A few weeks ago Juice asked our readers to give us a few insights into whether we and other info-viz sites are actually helping them and their organizations communicate information more effectively.

Well, the time has come to take a look at the results (oooh - pins and needles). The survey was way more popular than we expected, receiving well over 500 responses.

We had a few questions of the form "select the answer that best describes you" but, for the most part, we focused on text-based answers so that we could avoid directing the answers and could demonstrate some non-traditional visualization styles for exploring the results. As a side note, the open-ended answers to the text-based questions were truly intriguing to read - hopefully the presentation of the results below will give you a small insight into what we learned.

So, here are the results.

Survey Results

The first section of questions dealt with getting some context about our readers. Since the questions were multiple choice, we’re showing the results in traditional bar chart format.

Question 1

In terms of size, which of the following is your company most like?

  • A one man band
  • The Dirty Dozen
  • The University of Rhode Island
  • Microsoft
Q1: Company Size
Question 2

In terms of information presentation expertise, who do you see yourself as?

  • The Excel Chart Wizard incarnate (I’m happy with the quickest route)
  • Harold and the Purple Crayon (I’m pretty good, but not too finicky)
  • A Tufte clone (every chart is carefully and lovingly crafted with intention)
Q2: Expertise
Question 3

If your company were stuck on Gilligan’s Island, would you be able to use information presentation to get rescued?

  • No, Gilligan keeps using our Tufte books to prop up the break room table.
  • Maybe. The Skipper rigged up this island beacon system using coconuts, vines, and tiki torches.
  • You betcha! The Professor could build a huge island-sized information display that could be seen, understood, and acted upon by the astronauts on the International Space Station.
Q3: Escape from Gilligan’s Island
Question 4

What two information sources do you most frequently use for information presentation tips, trends, and best practices?

  • BI Vendor’s website (e.g., Business Objects, Tableau, Cognos, etc.)
  • The Dashboard Spy
  • Dashboards by Example
  • FlowingData
  • Infographic News
  • Information Aesthetics
  • Jorge Camoes’ Charts
  • Juice Analytics
  • Junk Charts
  • Tufte’s web site
  • Visual Business Intelligence (Stephen Few’s site)
  • VizThink
  • Other
Q4: Popular Sites

However, what we really want to know is which sites are most closely related. So we tried looking at them with a phrase net from ManyEyes:

Q4: Phrasenet

( You can experiment with it yourself here. )

This is a great way to demonstrate how sites are "connected". We see a very strong relationship between Juice and the other non-Juice sites, but not a strong relationship between the non-Juice sites themselves. In retrospect, the question would have been more effective had we asked respondents for their "top three or four" sites (roughly the total number of options ÷ 3).
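
If you want to replicate the counting that drives this kind of view, the core of it is just tallying co-occurrences: how often two sites are named together by the same respondent. Here's a minimal Python sketch; the CSV file name and the q4_sites column are hypothetical stand-ins for our actual survey export:

```python
# Count how often two sites are named together by the same respondent.
# The file name and column name are hypothetical stand-ins.
from collections import Counter
from itertools import combinations
import csv

pair_counts = Counter()
with open("survey_responses.csv", newline="") as f:
    for row in csv.DictReader(f):
        # e.g. "Juice Analytics; Junk Charts" -- two sites per respondent
        sites = sorted(s.strip() for s in row["q4_sites"].split(";") if s.strip())
        for pair in combinations(sites, 2):
            pair_counts[pair] += 1

# The strongest pairs are the most closely "connected" sites
for (a, b), n in pair_counts.most_common(10):
    print(f"{a} <-> {b}: {n}")
```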

The next group of questions was crafted to help us understand the problems our users and their organizations are encountering when it comes to presenting information to stakeholders and users. For most of these questions we broke the number one rule of surveys: stay away from text-based answers.

Question 5

Using one word for each, list the three things you most frequently find useful from these sources.

Q5: Tag Cloud

( You can experiment with it yourself here. )

This was one of the most useful result sets and clearly shows that people like examples and new ideas for visualizations, followed by tips on how to get it done. (I’m hoping this post meets all of those criteria to some level.)
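
If you'd like to build a cloud like this from your own survey, the counting underneath is trivial; the only real work is normalizing the free-text answers first. A minimal sketch, with made-up sample answers:

```python
# Normalize free-text answers, then tally word frequencies for a tag cloud.
# "answers" stands in for the real Q5 responses.
from collections import Counter
import re

answers = ["Examples", "ideas", "examples.", "Tips", "inspiration", "ideas"]

def normalize(word):
    return re.sub(r"[^a-z]", "", word.lower())  # lowercase, strip punctuation

counts = Counter(normalize(w) for w in answers)
for word, n in counts.most_common():
    print(word, n)  # font size in the cloud is proportional to n
```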

Question 6

Within your organization, would you say the understanding of information visualization best practices is:

  • Staying the same
  • Improving
Q6: Improving?
Question 7

What one word describes the biggest barrier to improved information presentation at your company?

I selected a Wordle (as opposed to a tag cloud) for questions 7 and 8 because I wanted to see the results in a way that would give me the general feeling of the barriers and benefits - I wanted the answers to spur some sort of emotive response. I think a Wordle does this better than a tag cloud.

Q7: Barriers

( You can experiment with it yourself here. )

Question 8

What one word describes the biggest boon to improved information presentation at your company?

Q8: Benefits

( You can experiment with it yourself here. )

While the "barriers" answers were interesting, there are some real nuggets hidden in these "benefits" results.

Question 9

Finish this sentence: "My company would be oh so much better at information presentation if we just had..."

What we really want to know is the patterns and relationships between words. Having said that, the most common words are still interesting to see:

Q9: What would be better?

( You can experiment with it yourself here. )

But we are really interested in the word patterns, so we used Concentrate, Juice's search-pattern tool, to identify them. The top patterns were:

Pattern              Count
more X                  76
more time X             30
better X                29
X data                  15
X time                  15
more time to X          14
time X                  12
a better X              11
X data.                  9
X more time              9
people X                 8
more people X            7
more resources X         6
the right X              6
more people who X        5
people who X             5
time to X                5
more time and X          4
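
For the curious, the mechanics behind this kind of pattern mining are straightforward: slide an n-gram window over each answer and replace one end with a wildcard "X", then count the resulting templates. This is a minimal sketch of the idea, not Concentrate's actual implementation, and the sample answers are invented:

```python
# Count wildcard templates ("more X", "X data", ...) across answers.
from collections import Counter

answers = [
    "more time to analyze our data",
    "more training and better tools",
    "a better understanding of our data",
]

patterns = Counter()
for text in answers:
    words = text.lower().split()
    for n in range(2, 5):                               # 2- to 4-word templates
        for i in range(len(words) - n + 1):
            gram = words[i:i + n]
            patterns[" ".join(["X"] + gram[1:])] += 1   # wildcard first word
            patterns[" ".join(gram[:-1] + ["X"])] += 1  # wildcard last word

for p, n in patterns.most_common(10):
    print(p, n)
```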

Now, if we look at how the "non-common" words relate visually, here’s what we get:

Q9: Phrasenet

( You can experiment with it yourself here. )

Question 10

Finish this sentence: "If I were to advise someone on how to best improve your capability to create really useful information presentation solutions, I’d say don’t forget..."

Again, it’s interesting to see the most commonly used words:

Q10: How to improve

( You can experiment with it yourself here. )

But the most value again comes from looking at the phrase net:

Q10: Phrasenet

( You can experiment with it yourself here. )

Question 11

Finally, we’re going to post results on our blog for free download. However, if you want us to notify you when the report is ready, please provide your email address below. (And because we have a large international following, please add your country as well, if you don’t mind. Why? ’cuz we’re just curious. Thanks!)

So, we’re going to show only the countries here, no email addresses (whew!). Let’s start by looking at the standard distribution:

Q11: Respondent Countries

And here’s the geographic representation from Many Eyes:

Q11: Many Eyes Map

( You can experiment with it yourself here. )

But, having looked at that, I thought it might be a little more interesting to look at the country locations like this (text sized based on number of participants):

Q11: Country Cloud

Additional Insights

And that was all of the questions in the survey. However, I thought some of the multiple-choice "context" questions required just a bit more analysis; there were some questions I still had that weren’t yet answered. So, I loaded the data into Tableau Public to get a little more analysis flexibility. Here is the dashboard I created to better understand expertise:

Characteristics of expertise

What this shows is that organizations that are more capable of responding to tough information presentation challenges have a substantially higher ratio of "Tufte Clones".

And this made me wonder how the skills base might be affecting different sizes of companies:

What companies are improving?

A pretty nice linear correlation between company size and improvement trends, don’t you think?

You made it to the end!

This post turned out to be much longer than I wanted it to be, but hopefully you found it interesting and learned a few things about your fellow readers and how to display different kinds of survey responses. If you have other insights you think you see, please comment below! Thanks for participating!

Survey Analysis Grows Up with SurveyVisualizer

Luc Girardin of Macrofocus contacted us in response to our post "When Will Survey Analysis Grow Up?" to point us to their SurveyVisualizer analysis tool. I had a little time this weekend to download and play with this application. There is a lot to like.

SurveyVisualizer is designed for surveys that have a hierarchical or tree structure. Luc describes the relevant data structure in a background paper about the product:

The questions—also called quality criteria—are then aggregated into 23 quality dimensions (e.g. network quality, ticketing, cleanliness, security, reliability). They represent the level of satisfaction with a whole group of questions pertaining to a particular issue. The quality dimensions themselves are further aggregated into three different customer satisfaction indices, reflecting the different areas of responsibility.

The free download has multiple satisfaction "criteria" (e.g. friendliness of crew) rolling up to "dimensions" (e.g. cabin crew), which fall under "indices" (e.g. index of flight services). This may be an appropriate structure for a satisfaction survey—but it isn’t one I’ve encountered before.

Despite this limitation, the analysis capabilities delivered by SurveyVisualizer are intuitive and innovative. For example, all your survey data is displayed at once in a kind of relational map. This lets users visually identify patterns in the full set of results. Each of the vertical hashes represents a question or roll-up of questions. Clicking on any one of these hashes highlights the hierarchical relationships. The "ghost" lines represent the results across questions for a multitude of dimensions or respondent types.

SurveyVisualizer 1

Users have the ability to select specific dimensions to identify patterns in the corresponding results. An easy-to-use interface lets you choose a dimension then apply a color to the line within the relational map.

SurveyVisualizer 2

Also, users can click on individual display lines to investigate the results (e.g., I wonder who gave that particularly crappy score for flight delays?).
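
If you'd like a feel for this style of display without downloading the tool, here's a rough matplotlib approximation: questions along the x-axis, one line per respondent group, faint "ghost" lines with one group highlighted. All scores here are invented:

```python
# Rough approximation of the "relational map": each respondent group is a
# line across all questions; gray lines are the "ghosts", one is highlighted.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
questions = ["crew", "food", "seating", "delays", "ticketing", "cleanliness"]
scores = rng.uniform(2, 5, size=(30, len(questions)))  # 30 groups x 6 questions

x = np.arange(len(questions))
for row in scores:
    plt.plot(x, row, color="gray", alpha=0.2)          # ghost lines
plt.plot(x, scores[0], color="crimson", lw=2, label="selected group")
plt.xticks(x, questions, rotation=30)
plt.ylabel("satisfaction score")
plt.legend()
plt.show()
```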

If your analysis requirements don’t fit this particular structure, Macrofocus has a more general-purpose tool called InfoScope.

When Will Survey Analysis Grow Up?

Malcolm Gladwell at TED

At the 2004 TED conference, Malcolm Gladwell tells the story of Dr. Howard Moskowitz, a man who revolutionized the prepared food industry through a new kind of analytical thinking. Long story short: Dr. Moskowitz was one of the first people to argue that companies should pursue multiple products targeted at customer subsegments rather than try to create the perfect product for all customers. He realized that an attempt to create a "platonic ideal"—whether it was pickles, mustard, or pasta sauce—would yield a suboptimal result for most consumers. Consumers are individuals with preferences that are better clustered than averaged. Mr. Gladwell states that this change in business thinking (spurred by Moskowitz’s study of pasta sauce) mirrors a more general scientific shift from a focus on universal truths to the study of variation.

The prepared food industry gets it—as evidenced by nine variations of Ragu sauce on the grocery shelves—but I’m not convinced that these lessons have permeated the rest of the business analytics landscape. In particular, I am struck by the inability of most survey analyses to reveal insights about respondents.

The tools may be part of the problem. Here’s an example of what WebSurveyor provides its users to help them analyze online surveys:

Sample Survey Analysis

Their site tells us:

"Each question is graphed independently allowing you greater flexibility in customizing the layout of reporting for each question...Filter results based on specific responses or cross-tabulate results from two different questions, giving you powerful tools for detailed analysis."

Powerful? Flexible? More like barebones. WebSurveyor is putting the analyst in a very constrained box that won’t help deliver a better understanding of respondents. WebSurveyor’s tool demands "question-centric", not "customer-centric", analysis.

Consider how this typical survey approach would serve you in an effort to understand the passengers of Noah’s Ark. A surveyor would ask each animal to fill out basic information about their height, weight, number of legs, food preference, etc. The results would then let us know that the average animal weighs 23 pounds, stands 1.2 feet tall, has 5.6 legs, is 30% omnivore, and so on. All of which would miss the essential insight about the animals on board: there are two of each.
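
The Ark example is easy to reproduce in a few lines. In this minimal sketch (invented numbers), the column averages are technically correct and completely uninformative, while a simple group count reveals the "two of each" structure:

```python
# Averages vs. groups on a toy Noah's Ark roster.
import pandas as pd

ark = pd.DataFrame({
    "species": ["elephant", "elephant", "mouse", "mouse", "owl", "owl"],
    "weight_lbs": [9000, 9200, 0.05, 0.04, 3.1, 2.9],
    "legs": [4, 4, 4, 4, 2, 2],
})

print(ark[["weight_lbs", "legs"]].mean())  # the "average animal" -- nonsense
print(ark["species"].value_counts())       # two of each -- the real story
```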

Unfortunately, the kind of analysis needed to reveal personality / needs / behavior clusters in your respondent population isn’t well supported by out-of-the-box analytical tools. One approach is factor analysis—a statistical technique that is used in marketing to "identify the salient attributes consumers use to evaluate products in a category" (Wikipedia). Another approach is to examine visual representations of individual respondents—a technique that we term (rather clumsily): customer flashcards.
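
As a small illustration of the factor analysis route, here's a scikit-learn sketch on synthetic data: many question columns get reduced to a couple of latent attributes, and the loadings show which questions define each one. This is a generic sketch, not a recipe from any particular survey tool:

```python
# Reduce many survey questions to a few latent attributes via factor analysis.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
# 200 respondents x 8 questions; two hidden "attitudes" drive the answers
latent = rng.normal(size=(200, 2))
loadings = rng.normal(size=(2, 8))
responses = latent @ loadings + rng.normal(scale=0.5, size=(200, 8))

fa = FactorAnalysis(n_components=2, random_state=0)
scores = fa.fit_transform(responses)  # (200, 2): each respondent's factor scores
print(np.round(fa.components_, 2))    # (2, 8): which questions load on each factor
```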

My not-so-Jiffy experience

Funny thing happened to me last week on my way to an oil change - my car’s engine was destroyed.

It all happened in a blink: I stopped by Jiffy Lube on my way to the bakery and swung up to the garage entrance, first in line. If you’ve ever had an oil change, you’ve probably experienced the old "preventative maintenance" up-sell: a technician pulls you out of the waiting room, gives you a grim, disappointed look, then explains the various parts of your car that are in severe need of service. In the past, I’ve been good at standing up to these automotive authority figures. I’d mumble "no thanks, maybe next time," not daring to look into the eyes which so clearly said: Don’t you care about your own safety? This time, however, I broke down and gave the go ahead for an engine flush. I was assured that any sane car owner would have this procedure done every 15k miles; here I am at 80k without my first flush.

I knew something was wrong when I saw them pushing my car out of the garage half an hour later. I was assured it was no problem; they just needed to dry off my spark plugs. Two hours later I was calling for a ride.

All of which would have been a small inconvenience if I hadn’t gotten a call the next morning letting me know they would need to replace my engine. Clearly something had gone terribly wrong with that engine flush.

I should say: I have little reason to gripe about Jiffy Lube. They are covering the engine replacement and a rental car. That said, there are a few lessons Jiffy Lube management might take from this situation:

  • The edge cases matter. A while back we wrote (here and here) about analysis of anomalies and the opportunity for learning. One point that applies in this case: Collectively, outlier customers provide a service: they stress test the product and highlight unrealized strengths or weaknesses. In its desire to relentlessly upsell, Jiffy Lube has extended its service outside its comfort zone to a point of weakness.
  • Data can make you smarter. I had an interesting conversation with the outside mechanic who is installing the replacement engine. He said I was lucky. My engine has a known problem with high levels of sludge build-up. He has seen other instances where an engine can be so full of sludge that an engine flush is incapable of breaking through the muck (like clogged arteries, I imagined) and the result is ruin. I get a refurbished engine with 80% new parts in place of an engine that was like the heart of an overweight cholesterol-holic. Maybe Jiffy Lube shouldn’t be indiscriminately upselling every customer. It wouldn’t be difficult to build some filters into their system for high-risk maintenance (a sketch of the idea follows this list).
  • Communicate with unhappy customers. Most companies would benefit from a simple alarm system for catching and responding to customers with particularly bad experiences. Something to appease them before they tell all their friends, family, and co-workers about their crappy experience (heck, they might even blog about it). All I ask as a customer is: a) recognize that you have created an inconvenience for me; b) convey that this isn’t a status quo situation; and c) assure me that you will make me whole. Jiffy Lube wasn’t effective in communicating any of these. They had an odd nonchalance that suggested this happens all the time, no single point of contact to speak to, and no apology for the inconvenience.
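
On the second point above, the filter really could be simple. This is a purely hypothetical sketch: the thresholds and the sludge-prone engine list are invented for illustration, not taken from any real service system:

```python
# Hypothetical pre-upsell check: flag customers for whom an engine
# flush could do more harm than good. All values are invented.
SLUDGE_PRONE_ENGINES = {"toyota 3.0 v6", "vw 1.8t", "chrysler 2.7 v6"}

def flush_is_risky(engine, mileage, miles_since_last_flush):
    """Return True if an upsold engine flush looks like a bad idea."""
    if engine.lower() in SLUDGE_PRONE_ENGINES and mileage > 60_000:
        return True  # known sludge problem plus high mileage
    if miles_since_last_flush > 50_000:
        return True  # too much accumulated sludge to flush safely
    return False

print(flush_is_risky("Toyota 3.0 V6", 80_000, 80_000))  # True: don't upsell
```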

Judge customers by behavior, not fur color

To a stranger, my two dogs look alike. To me, they couldn’t be more different. They came from different dog shelters and are more than two years apart (that’s 14+ human years). Here they are: Maddie has her chin resting on Ally.

Dogs

Ally is twitchy, a mama’s girl, frightened of loud noises, and getting creaky. Maddie is confident, independent, curious about the loud noise, and energetic. Ally loves other dogs and distrusts new people. Maddie adores all people and is suspicious around certain dogs. Their features and personalities couldn’t be more different. I’ve had some time to get to know them.

When we meet a stranger on a walk (particularly one who isn’t a dog owner), we often get: "They must be related." Our denials don’t seem to faze these people as they point to the obvious evidence: "...but they are exactly the same size and color."

Superficial judgments are natural - a first line of defense for categorizing and managing a complex world. However, it’s unhealthy not to dig deeper. For some businesses, superficial characteristics are as far as the analysis goes when segmenting or profiling customers. A better approach is to look at customer behaviors, which provide a much more accurate reflection of interests and needs. Jim Novo, marketing consultant, agrees:

Customer behavior is a much stronger predictor of your future relationship with a customer than demographic information ever will be.

Simple customer characteristics can be easy to come by; age, income, and zip code are probably part of your basic customer database. In contrast, behavioral segmentation is a more intimidating analytical challenge. Here’s the approach we’ve used successfully at Juice:

  1. Create individual pictures of customers that visually show their behaviors over time. The trick is to create a "visual language" that represents actions and is intuitive.
  2. With a dash of Excel, SAS, and Python code, we generate thousands of these pictures of individual customers (a toy version is sketched after this list).
  3. We visually scan for common patterns of behaviors and the associated success/failure points (e.g. repurchase, upsell, churn, etc.).
  4. Finally, we work backwards from our new understanding of behaviors to segment customers based on statistical measures of those behaviors.
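
To make step 2 concrete, here's a toy version of a customer flashcard in matplotlib: one timeline per customer, with event type encoded by color. The events and the "visual language" here are invented for illustration, not our production code:

```python
# Toy "customer flashcards": one event timeline per customer, drawn as
# small multiples. Generated in a loop, this yields thousands of pictures.
import matplotlib.pyplot as plt

EVENT_COLORS = {"purchase": "green", "support_call": "orange", "return": "red"}

def flashcard(customer_id, events, ax):
    """events: list of (day, event_type) tuples for one customer."""
    for day, kind in events:
        ax.axvline(day, color=EVENT_COLORS[kind], lw=2)
    ax.set_yticks([])
    ax.set_xlim(0, 365)
    ax.set_title(f"customer {customer_id}", fontsize=8)

sample = {
    101: [(10, "purchase"), (40, "support_call"), (45, "return")],
    102: [(5, "purchase"), (90, "purchase"), (200, "purchase")],
    103: [(30, "purchase"), (31, "support_call"), (32, "support_call")],
}
fig, axes = plt.subplots(3, 1, figsize=(6, 3), sharex=True)
for ax, (cid, events) in zip(axes, sample.items()):
    flashcard(cid, events, ax)
plt.tight_layout()
plt.show()
```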

This approach differs from traditional data mining-based approaches that drill down from the top looking for patterns. We start at a very granular level and look for patterns (using the power of the human visual system). It may sound a little crazy, but we’ve found that it can be both insightful and highly predictive.

Know your customers

The best businesses connect with their customers. They build intimate relationships, learn, and extend their products using this knowledge. After Apple learned that customers were using iPods to save addresses and data, they incorporated this feature into their next release. Intuit heard their small business customers saying, “I need to keep the books without the complexities of accounting" and QuickBooks was born.

Many companies have a different story. For them, technology has been a killer app—it’s killed the ability of individuals in the company to see their customers as individuals. Customers are a list to be manipulated, a total in a spreadsheet. They aren’t seen as people, much less as potential innovators. Dependence on big information systems is a source of the problem. These technology solutions are built to be comprehensive; built for speed; built for anywhere, anytime access. They aren’t built to understand individuals one at a time.

Sometimes the inability to understand customers stems from a business’ impatience and short-term focus on ROI. Tom Asacker pulls out an early marketing guru to make his point:

Abraham Lincoln on chopping down a tree: "If I had six hours to chop down a tree, I’d spend the first four hours sharpening the axe".

Instead, what do most marketers do? They take a whack at the tree, put down the axe, measure the cut, pick up the axe, whack the tree in a different spot, and repeat ad nauseam. Exhausting, to say the least.

If you are in an information-rich business with many customer interactions—you can know your customers intimately. You can look at individual customer behaviors and start to recognize important and startling patterns. It will take some time, but Abe would say it is time well spent.

Learning from the edge cases (Part 2)

In my previous post, I mused on the subject of edge cases and the learning opportunity they provide. I want to touch on how this applies to customer analytics.

This weekend I read a blurb in the Washington Post Food section. A customer by the name of Anne Monahan complained about the “dark blue menus printed in black ink" at a local restaurant. “In dim light, the menus were nearly impossible to read," she remarked. The restaurant co-owner said that she hadn’t noticed the problem before, and vowed to change the hue of the menus the next day.

It would be easy to dismiss Ms. Monahan as an outlier and a whiner. After all, this complaint was rare. An edge case. But the restaurateur decided to respond to the issue. Perhaps other patrons were bothered by it, but hadn’t commented.

This is not to say that companies should be a slave to the edge case. But don’t throw them out. Listen to what they have to say and be willing to respond, because:

  • They may be the canary in the mineshaft — telling you something that others haven’t yet realized

  • They may be an extreme case of common behavior that shows up more subtly amongst other customers. Recognizing this behavior can only help you better meet customer needs in the future.

  • They may offer new ways to think about the business or customers. As I said in my last post, edge cases help define the boundaries of reality.

  • Collectively, outlier customers provide a service: they stress test the product and highlight unrealized strengths or weaknesses.

Of course, there is also much to learn from the ordinary cases — the mainstream customers. I think most companies already understand the ordinary. In fact, the ordinary is already deeply embedded in the business’ assumptions.

Most statistical analysis tells you: watch out for outliers. They are the data points that can screw up your averages. Because of their rarity, they aren’t deemed worth focusing on. I disagree.

I hope to return to this topic as we find ways to apply it at our clients.

Learning from the edge cases (part 1)

I’ve recently developed an interest in "edge cases" - the extreme situations or data points that fall far outside the norm.

It was first piqued when I read a post by James Vornov about the impact of extreme cases on decision making. He notes: "Studies of decision making have shown that people are strongly influenced by single, uncommon events. Even when the pattern of frequent events indicates one type of behavior, the uncommon event prevails."

Edge cases do more than create the deepest impression; they also offer rich ground for learning. Consider two catalysts for learning: 1) frequent but ordinary events and 2) extreme events. Each offers a different type of lesson. When we learn from the ordinary, we gain the ability to predict likely outcomes and put clear dimensions around expected results. Learning from the edge cases is wholly different: it helps us define the bounds of reality. It tests our assumptions and creates sharp contrasts.

Storytelling is just one example of edge cases as a teaching tool; there are plenty of others:

  • The legal profession uses the extreme cases to define precedents and test the limits of laws
  • Engineers conduct stress testing on materials or products to understand the limits of capabilities. Similarly, programmers test code by defining edge cases.
  • Individually, I think we learn most about ourselves in situations when we experience something new, unusual, and challenging.

Another way to view edge cases is that they test our common sense. Tato on Everything2 points out that "as science pushes our understanding of the universe and our selves, we are confronted with new complexities and edge cases where these instincts [common sense] are actually dysfunctional, or wrong."

Michael Feldstein offers a similar view in his blog when he says: "In any field of inquiry, the edge cases are where some of the most interesting work gets done."

In each case, edge cases help us understand the far reaches of the possible. They help us map out reality. In a future post, I want to talk about how businesses can use edge cases, in particular outlying customer data points, to better understand their products, customers, and marketplace.