Five Reasons Your Client Analytics Dashboard Project Will Fail, and How to Avoid Them
Everything Goes Wrong
I had a good friend call me up with a problem he was having at work.
He had been hired on to create a new, analytical product for his company—one that would embed analytics into informational dashboards for clients. This would be a totally new line of business, and the analytical product was well poised to bring in millions of dollars of new revenue.
My friend had worked in the analytics space for years. He’d had great success. He was confident he could get this job done for them and come away a hero.
But that’s not what happened.
When I received the phone call, he was exasperated and at his wit’s end. Nothing he was trying was working. Clients weren’t willing to pay for the product, they didn’t see value in it, and the ones that did buy weren’t engaging with it. They left shortly after buying.
Why wasn’t the product resonating? Why didn’t anyone care about the embedded analytics? My friend had killer analytical skills, a great eye for design, and a good business mind. And yet, the product was failing.
It was failing because embedded analytics are hard. And not for the reasons you may think. I’d like to say that I was able to turn things around for my friend. That I had some killer advice for him. At the time, I didn’t. But it started me down the path of research, trial and error, and speaking with successful people to figure out why a seemingly perfect scenario set up for success with embedded analytics had gone horribly wrong.
And here we are. If this story sounds familiar to you, read on. If it doesn’t, read on so that it never happens to you.
Searching for the Solution
After a decade of work in the space and years of interviews with business leaders in data and analytics, I’ve worked embedded analytics down to a process. One that identifies the hidden nuances of failure and capitalizes on the hidden opportunities for success.
My hope is that the few items outlined here will set you off on the right path, give you some important things to consider, and help you find success with your embedded analytics project.
There is a lot of potential for embedded analytics to make impactful, meaningful, successful products, and it’s our obligation to deliver a product to our clients that gets the job done right. Here are five things that will kill your embedded analytics project. Make sure that they don’t.
1. Poor Design Distracts From True Insight
I mentioned earlier that my friend had a great eye for design. Even though that wasn’t able to save his product (more on that later), having poor visual design would certainly be a way to tank a good embedded analytics project, even if it gets everything else right.
While the ways to mess up data visualizations are numerous (with entire blogs dedicated to horrendous charts that exist in the wild), I’m going to go ahead and assume no one reading this would be okay with producing something like this:
But even with a basic grasp of chart decency, it’s easy to mess up visualizations and dashboards in a way that will ruin your embedded analytics project.
A while back I interviewed an organization called Transparency International. The founder of the organization, Peter Eigen, used to work for the World Bank over 25 years ago. He was something of a rebel, and he was frustrated because no one in the World Bank would talk about or recognize government corruption in the countries that the bank would lend to.
Peter saw that public money that was supposed to be going to help people was being siphoned off by corrupt governments, and the intended assistance wouldn’t get to the people that needed it. Still, people within the World Bank wouldn’t discuss the topic of corruption. It was seen as a political issue, and not the World Bank’s business. It was taboo, and Peter almost got fired for trying to get people to pay attention.
So he decided to leave. This was a big problem, and if he couldn’t fix it within the World Bank, he’d find some other way. Transparency International was born.
Over the past 25 years, Transparency International has completely changed the way the media, international businesses, and international aid organizations talk about, think about, and tackle the issue of corruption. What once was a taboo topic that was hard to define and see has now become front and center in people’s development efforts.
How did they do it?
A simple data visualization.
Have a look. Yellow is less corrupt, red is more corrupt.
Every year they publish this map. And every year intense scrutiny, media coverage, discussions, and development initiatives are born from it. It’s been pretty successful. And it’s just a simple map.
And that’s the point.
Most people wouldn’t recognize this as an example of embedded analytics, but I disagree. I think it’s a really important example. We’ll talk more about what embedded analytics is and isn’t later, and why most people’s definition is dead wrong.
But the point at hand is design. And this map says it all. Complexity and clutter are among the biggest enemies of good data visualization design. Everyone wants to put too much information into their dashboard because they don’t want to leave anything out. They don’t want to make the hard decisions about what the data should and shouldn’t say.
And therein lies the factor that will kill any embedded analytics project. You have to be willing to make hard decisions, remove clutter, and highlight what’s most important. Design your dashboard and charts to do this. Make the most important parts the largest. Highlight them. Put them front and center. Or even better, just get rid of everything else, like Transparency International.
- The most important insights should be the most noticeable item presented.
- The most important insights should be highlighted and clear. Consider removing everything that isn’t the most important insight.
- Don’t give the customer everything. Make decisions on what is most important.
2. You Don’t Speak the Client’s Language
The words and labels we use to describe data on a visualization or dashboard don’t get enough credit. They are often an afterthought, after all of the “real work” of data analysis has been done.
This is a huge mistake.
Labels and words are small, seemingly simple things. But so are ship rudders. And they steer entire battleships. So, show some respect.
I recently spoke with Ryan Deeds about his work transforming analytically immature companies into data-driven machines. He works specifically with insurance agencies, but the approach and experience he shared applies in a much broader scope.
He told me that one of the very first things he does when he starts consulting with a company is hold interviews with everyone: executives, management, and front-line salespeople. He learns the kinds of words people use to describe their work, their goals, and the metrics they measure themselves by. Even within the same company, the words people use for the same things can be different.
After the interviews, he then builds consensus around what words are going to be used to describe specific metrics going forward. Everyone agrees on a common term, and whatever terms people used to apply to the metric are now obsolete. Everyone is expected to use the same word to describe the metric.
This common language is important. The point of analytics is understanding, and language is inherent to understanding. If two people see the same word, but understand different concepts, you’ve already lost, no matter what your data says.
You won’t always have the ability to control what words people use to describe metrics and concepts, especially if you are working on client-facing embedded analytics. But you can at least do the interviews and research you need to in order to understand what word conveys the correct idea to the most people you are working with. Talk with them, ask them questions, and write down the words they use.
When putting labels and titles on your dashboards and charts, take care to use their words, not yours. Then, when showing them the dashboard, ask them what they understand from the words and titles. If they aren’t getting the right concept, ask them what word would convey that concept better.
This is very much a conversational exercise, and it takes time. But getting it right is the difference between clarity and confusion.
- Talk to your clients to understand their vocabulary.
- Use their words on the dashboards, not yours.
- If you are doing internal embedded analytics, get company consensus around a single term per concept.
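One lightweight way to make an agreed vocabulary concrete in a reporting pipeline is a simple mapping from legacy terms to the canonical label everyone settled on. This is only an illustrative sketch; the insurance-flavored metric names below are hypothetical, not from the interview.

```python
# Hypothetical example: map every legacy metric name a team used onto the
# single label the company agreed on during the vocabulary interviews.
CANONICAL_TERMS = {
    # legacy term -> agreed canonical label
    "book of business": "premium volume",
    "written premium": "premium volume",
    "client count": "active policies",
}

def canonical_label(term: str) -> str:
    """Return the agreed-upon label for a metric name; pass through unknowns."""
    return CANONICAL_TERMS.get(term.lower(), term)

print(canonical_label("Written Premium"))  # -> premium volume
```

Keeping a single lookup table like this (and using it everywhere labels are rendered) makes the obsolete terms visibly obsolete, instead of leaving the old vocabulary scattered across dashboards.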
3. The Sophistication Level Is Too High or Too Low
When I was first starting my career, I was hired onto a small team that had just been acquired by a much larger company. They told me what I’ve since heard over and over again from employers—”We have a bunch of data and we don’t know what to do with it. But we think it’s valuable. Can you figure that out?”
So I set to work. I was the only person doing analytics on the team. I had to teach myself SQL, database schemas, Python, and visualization on the job. My analysis and data work started to have an impact, and fairly soon I found myself doing data analysis not for the small acquired company, but for the entire firm as a whole. As it turns out, no one was doing analytics on the corporate level.
I eventually found myself with a booked meeting with the Chief Marketing Officer. I felt encouraged, excited, and slightly nervous. After being brand new in the workforce and quickly moving up the ranks, I felt like this was a big moment. I was going to present some important information to the top management of the company.
I prepared my data and my slides. I worked hard. I learned my talking points. I knew the value of each chart that I was going to share.
But when the time came, the presentation fell flat.
The meeting did not go as I had envisioned, and I left without making any kind of difference. It was a long while before I got another meeting like that again.
It felt terrible. So I promised it would never happen again, and started to think through why things had gone wrong.
It seemed that in the meeting, the CMO didn’t care about any of the points I was making. He didn’t care about the validity of the analysis. He didn’t care about the possibilities of data exploration my analysis brought to the table.
He didn’t care because it wasn’t his job to care. I had gone in with a presentation that was too sophisticated, and it completely missed the mark.
All the CMO cared about was specific actions he could take, or ask someone else to take, that would make a direct, measurable difference on the bottom line. My analysis would have gotten us there, but I was too far in the weeds for him to see it.
The level of sophistication matters when we embed analytics. We have to understand what mode our customers will be in when looking at the data.
The CMO was in action mode. He wanted to know what to do, right now. He didn’t have time to sift through the analysis and make sure everything was statistically valid. That was my job.
If you are embedding analytics for people in action mode, keep the sophistication low. Only provide what they need to know to take action or make a strategic decision.
If you are embedding analytics for people in exploration mode, keep the sophistication higher: filters, navigation, parameters, and a lot of data, plus validity measures and statistical information. People in this mode want to sift through the information, find the insights, and make sure they are valid.
One extra note: there is also a third group, people in action mode who don’t necessarily trust you yet. They don’t want to sift through the data, but they do need a reason to trust you. Sometimes that means layering some measures of validity into the analysis, along with the assumptions you made to reach your conclusions. Sometimes it just means borrowing authority from another source, like an impressive client list, or mentioning that your methods mimic those of a trusted entity.
Whatever the case, make sure that the level of sophistication of your embedded analytics matches the needs of the customer.
- People are in one of two modes when consuming analytics:
- Action mode—They need low sophistication
- Exploration mode—They need high sophistication
- People in action mode that don’t trust you yet may also need some level of sophistication, or at least some authority to lean on, so they know your conclusions are valid
4. You Don’t Have the Right Delivery System
Midway through my career as an employee, I was working at a company in a capacity to embed analytics across all of its processes. At this point, I had learned a lot about presentation, about sophistication, and about design. I was able to make things happen.
And then I hit an interesting fork in the road. I was working with two different departments of the company—retention and marketing.
Everything was working well on the retention side. I had gone through a process of iterations to make sure that the analytics made sense and were targeted. The retention team was using them to great effect, having answers to questions they’d never been able to answer before and making better strategic and tactical decisions.
Marketing was a different story. I went through the same process with them, but invariably they got no insights from the analytics. My work would fall by the wayside, unused.
I had done everything the same on both sides, and yet one project flourished while the other languished.
So, I started to look at the differences in the teams, and how they would consume the analytics I provided.
The retention team would take the analysis and actually expand on it. They would build out new views, think through what the data meant, and answer new questions.
The marketing team would just come back to me with questions about how to use the dashboards. How to do the filtering, what the analysis meant, and how to match it up with other data they were seeing. The Tableau workbook never really stuck for them, and they never learned how to use it.
The level of sophistication wasn’t necessarily the problem here. Both teams needed to be able to explore the data. What was different, however, was how each team was used to consuming data in general.
The retention team was used to doing things in Excel. They would manipulate data, update sheets, and prepare presentations. They were used to using a more open-format tool.
The marketing team was used to seeing the data inside of the different systems they used—the marketing automation system, the web analytics system, and the CRM system. They were used to looking at stock reports in these systems.
My delivery vehicle worked for the retention team because they were already used to a more open format. It was already in their mental process, and the Tableau dashboards fit into that mental model.
Here is one of the keys to delivery. The more you can match the current processes and mental models your customer uses to consume data, the more successful you will be.
For one, it’s less friction if you can integrate into whatever system they are currently using. Then they don’t have to log in to yet another system to get more data. This reduces the technical friction your customers experience.
Sometimes integrating into the actual system they use in order to reduce technical friction can be difficult, or even impossible if you can’t get access to the tool. But here is a pro secret—there is a system that almost everyone is used to using and you can almost always get access to it. That system is their inbox.
Emailing reports, alerts, and links to a dashboard system is a great, universal way to meet people where they are and reduce technical friction.
Secondly, if you can mimic how they are used to seeing and processing data themselves, it will reduce their mental friction to your analytics. Some people are used to paper reports, some people are used to Excel, some people are used to meetings and presentations. Meet them where they are.
- Deliver your analytics in a way that matches how people are used to consuming data
- Integrate into the technical systems they use (reduced technical friction)
- A ubiquitous way of doing this is by sending emails
- Follow the same workflow and mental models they currently use (reduced mental friction)
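The email delivery idea above can be sketched in a few lines of Python. This is a minimal, hypothetical example: the sender address, metric, threshold, SMTP host, and dashboard URL are all made up, and the actual send step is commented out because it requires real credentials.

```python
# Minimal sketch of email-based report delivery: build a plain-text alert
# that links back to the dashboard. All names and addresses are hypothetical.
from email.message import EmailMessage

def build_alert(recipient: str, metric: str, value: float,
                threshold: float, dashboard_url: str) -> EmailMessage:
    """Build an alert email with a link back to the full dashboard."""
    msg = EmailMessage()
    msg["To"] = recipient
    msg["From"] = "reports@example.com"  # hypothetical sender
    msg["Subject"] = f"Alert: {metric} crossed {threshold}"
    msg.set_content(
        f"{metric} is now {value} (threshold: {threshold}).\n"
        f"Full details: {dashboard_url}"
    )
    return msg

msg = build_alert("client@example.com", "Churn rate", 0.12, 0.10,
                  "https://dashboards.example.com/retention")

# Sending works the same against almost any provider; commented out here
# because it needs a real SMTP host and credentials:
# import smtplib
# with smtplib.SMTP("smtp.example.com", 587) as server:
#     server.starttls()
#     server.login(user, password)
#     server.send_message(msg)
```

The point is not the code itself but the delivery channel: the inbox is a system your customers already check, so an alert like this meets them where they are and pulls them into the dashboard only when there is a reason to look.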
5. You Aren’t Focused on Changing Customer Behavior
Everything I’ve mentioned up to now is important.
But this one is the most important.
If you take nothing away from this report, please understand this point.
I did an interview with Tyler Folkman from Ancestry.com, and he was talking about all of the different ways Ancestry uses data in its business. Their customers get on the site to build out their family tree. They put in known relatives’ information, find other relatives in the larger database, and connect everyone so they can have a look at their family history.
The more engaged people are with their family history, the longer they stay with the company. Ancestry, then, has a vested interest in helping their customers build out their family tree, and they use a lot of machine learning and data analytics to do so.
While there is a lot that they could do with typical embedded analytics like dashboards and charts, they have a specific feature that I would argue is more true to actual embedded analytics. It’s a little, shaking green leaf icon.
When Ancestry identifies a potential relative for you that you haven’t added yourself, it lets you know with a little green leaf icon. You then have the ability to accept or reject the suggestion. These suggestions also work for adding information to people already in your tree.
This embedded analytics device covers all the bases. The design is clear and to the point. It fits within the user’s workflow. And it changes, or in this case augments, a positive behavior—adding more information to your family tree. This change in behavior ultimately drives engagement and revenue for the Ancestry business.
I love this example because nowhere is there mention of a dashboard, or even a graph. Everything happens behind the scenes, and the user is shown only what is needed to promote a positive behavior. In the end, this is really what embedded analytics is all about.
Never start with the data when designing a dashboard or embedded analytics system. Always start with understanding the behavior you want to change with the analytics. And always tie that changed behavior to how much revenue or impact it will eventually have.
Don’t listen to the vendors and the pundits out there that keep pushing embedded analytics dashboards as though they are the panacea to your problems. Embedded dashboards *might* be a part of your embedded analytics strategy, but they are not the end game and they are not what you are driving for.
Know what behavior you are changing and why. Then provide the data and analytics in whatever format, system, and sophistication level best meets the need of changing that behavior.
- Define the behavior you want to change or promote, and design your solution around that
- The solution may or may not include a dashboard or even charts. It doesn’t have to.
- Tie the positive behavior back to a positive outcome for the business to understand the overall value to the business
As I’ve worked in this industry and interviewed people about it, I’ve noticed patterns in how people get to changed behavior with embedded analytics, and each type of behavior you want to change merits a different kind of embedded analytics model and approach. There are several models we’ve identified that can serve as templates to help create solutions that match the situation. Some of these include:
- Data Coaching Model
- Conversation Starter Model
- Tactical Action Model
- Researcher Model
- Strategic Action Model
- Metric Optimization Model
I’ll be detailing how these models work in future articles, as well as giving examples of them getting results out in the wild. Stay tuned!
In the meantime, if you are working on client-facing dashboards or analytics, we can help you cut through the noise and get it right. Sign up for our Client-Facing Analytics Accelerator Program, or contact me personally if you have questions or need some guidance!