Monday, May 30, 2016

I want to measure everything. I want it all. And I want it now.


The sentence most often heard from Managers: we must measure everything. However, this is far from true. Not everything must be measured.



Let's imagine you own a business. An e-business. Maybe successful, maybe not (yet). You are at the point of considering a tracking tool (Google Analytics, Omniture, MixPanel, etc.). Then the fateful sentence comes: we must measure everything. Please track every single click, every single action, every field entered in a form, etc. EVERYTHING! When a Manager is asked why everything should be measured, the usual answer is: because everything can be optimized with data.

At this point two sentences cross the Manager's mind:
"If you can't measure it, you can't improve it."
"If you didn't measure it, it didn't happen."

The first sentence is absolutely true in almost all situations. The second one needs some elaboration. Indeed, it is true: if you didn't measure it, it didn't happen. My question, as an analyst, is: "Yes, we did not measure it. Yes, indeed, it did not happen. So?". I will state it very clearly here and now: it is not mandatory to track everything. Why? Simply because you can't optimize everything at the same time, or because the benefit of optimizing some features is insignificant.

When we measure everything we overcomplicate the implementation. It becomes a pain, it becomes never-ending. The analyst is always thinking about what to measure next instead of acting on the data that the current implementation already produces. The tool's interface turns into a nightmare: sampling, unclean data, impractical volumes of data to process, etc. The answer to this is: keep it simple.

How do we keep a tracking tool implementation simple? The key is to design Measurement Plans. Avinash Kaushik expresses this in a majestic way. A Measurement Plan consists of five steps (a minimal sketch of a plan follows the list):

- Goals: identify the business objectives (sell more, get more leads, increase CLV, decrease returns, improve margins, etc.). According to Kaushik's framework, the Goals must be Doable, Understandable, Manageable, and Beneficial.
- Strategies: for each objective, identify crisp, specific strategies that will be used to accomplish it (increase repurchase ratio, increase new users, decrease budget for some marketing campaigns, etc.).
- KPIs: no need to explain what a Key Performance Indicator is. These are the metrics that tell you how you are doing with respect to the established strategies.
- Targets: not mandatory but very useful. They establish an end point for each KPI.
- Segments: the most important part of the plan. We take segments of users or behaviors that we'll analyze in order to understand where the failure or the success lies (new users, paid campaign users, mobile users, users that land on the homepage, etc.). This is the hardest point within the Measurement Plan, and this is really where actionability arises.
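
A Measurement Plan can live on a napkin or in a spreadsheet; purely as an illustration, here is how one might be written down as a plain data structure. The business, strategy names, KPIs, targets, and segments below are invented for the example, not part of Kaushik's framework itself.

```python
# A minimal, hypothetical Measurement Plan for an e-commerce site.
# Goal, strategies, KPIs, targets, and segments are invented for illustration only.
measurement_plan = {
    "goal": "Sell more",  # business objective: doable, understandable, manageable, beneficial
    "strategies": [
        {
            "strategy": "Increase repurchase ratio",
            "kpis": [
                {"name": "repeat_purchase_rate", "target": 0.25},  # target = end point for the KPI
                {"name": "days_between_orders", "target": 45},
            ],
            "segments": ["new users", "returning users", "paid campaign users", "mobile users"],
        },
    ],
}

# Anything on the site that does not feed one of these KPIs or segments
# simply does not get tracked.
for strategy in measurement_plan["strategies"]:
    for kpi in strategy["kpis"]:
        print(f'{measurement_plan["goal"]} -> {strategy["strategy"]} -> {kpi["name"]} (target: {kpi["target"]})')
```

The point of writing it down this explicitly is that the plan, not the tool, decides what gets tracked.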

The Measurement Plan is devoted to actionability. Whatever brings no action is not a valid Goal or Strategy. A KPI or a segment that does not serve a strategy is not useful. In the end it is very simple. If your Measurement Plan does not say that a given button should be tracked, don't track it! If your Measurement Plan does not say that the fields of a form should be tracked, don't track them! If your Measurement Plan does not say that scrolling should be tracked, don't track it! In this way you will have a clean tracking tool interface full of data that can be transformed into actionable information.

Ideally the Measurement Plans are built by the team of analysts. They (should) know the business and they (should) know the technology. They are able to talk to the business stakeholders and to the developers. They can gather business requirements across the company, think about business opportunities, and transform them into technical specifications. They are also able to gather all the data, build the KPIs, compare them to the targets, and recommend actions. In other words, they must take ownership of the Measurement Plans, from conceptualization to implementation.

At this point, I want to recall Kaushik's "Three Layers of So What" test. It's a very simple test that will help you decide whether a KPI (or a metric) is useful or not. Of every metric you want to report, ask the question "So what?" three times. Each question provides an answer that raises another question. If after the third time you don't get a clear action that must be taken, then you are facing a nice-to-have metric, not an actionable one. Non-actionable metrics keep the focus off what is really important. Non-actionable metrics, and hence non-actionable tracking, are like Diogenes syndrome for data: you collect, collect, and collect data without extracting any useful information from it.
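
To make the test concrete, here is a toy walkthrough of a metric going through the three layers. The metric and the answers are made up for illustration; in practice the answers come from you, not from code.

```python
# Toy walkthrough of the "Three Layers of So What" test.
# The metric and the answers below are invented for illustration.
metric = "Bounce rate on the homepage went up from 40% to 55%"

so_what_layers = [
    "Visitors from the new paid campaign leave without clicking anything.",
    "We are paying for traffic that never sees a product page.",
    "Pause the campaign and rewrite the landing page for that audience.",  # a clear action: the metric passes
]

print(metric)
for layer, answer in enumerate(so_what_layers, start=1):
    print(f"  So what? ({layer}) -> {answer}")

# If the third answer is not an action you can actually take,
# the metric is nice-to-have, not actionable: drop it from the report.
```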

Last but not least, don't feel guilty about leaving features of your site untracked. Feel proud of the recommendations and actions that come out of the items you do track. And, again, keep it simple.

Wednesday, May 11, 2016

Using data effectively. I bet you don't!

Every company has data. Every company understands (or should understand) the value of data. But, are you using data in an effective way?

According to a FORTUNE post from earlier this year, only 27% of C-level executives think their company makes "highly effective" use of data. Now, the question is: is your company making "highly effective" use of data? Or even: is your company making "effective" use of data? Or even more: is your company using data at all?

Every company has data, and working with data is simple to start: just begin collecting and processing some. You will find very soon that making (highly) effective use of data is much harder than desired. In any case, for the rest of the post I will focus on the case in which your company does use data.

The key (and very complex) exercise is to define what (highly) effective use of data means. Let's do it the opposite way: let's define what an ineffective use of data means by pointing out some situations that we've come across over the last few years. Will you be able to pass these tests? And, please, be honest!

Test #1: Which three indicators do you check every morning when you sit down at your desk? If you can't answer this question with a clear set of KPIs or a clear set of dashboards, then your company is making an ineffective use of data.

Test #2: Do you trust the data you retrieve? Is it reliable? Is it clean? Is it readable? If a single answer is "no", then your company is making an ineffective use of data.

Test #3: Does data take ages to be retrieved? Then your company is making an ineffective use of data.

Test #4: Can you retrieve joined data from different sources? If not, or if you need to join it manually, or if some sources are not accessible, then your company is making an ineffective use of data.

Test #5: Can you read and relate large amounts of data? If Excel is not enough and you use no other tool to do so, then your company is making an ineffective use of data.

Test #6: Can you retrieve simple data by yourself? If you constantly need to ask for help (either because the systems are too complex, or because you simply don't want to do it yourself), then your company is making an ineffective use of data.

Test #7: Are the insights properly communicated and understood (not necessarily agreed) by everybody? If data is misunderstood, or poorly communicated, then your company is making an ineffective use of data.

Test #8: Do you have tons of bureaucracy that keeps relevant information from reaching the decision makers who need to see it? Then your company is making an ineffective use of data.

Test #9: Do you figure out the specific question you need to answer, and then determine whether the right information exists and where it's located in the organization? If not, then your company is making an ineffective use of data.

Test #10: Do you take actions based on the data and insights? If not, then you only have nice-to-have data. Hence, your company is making an ineffective use of data.

Your company will achieve a truly data-driven culture if and only if none of these 10 situations take place. So, how do we solve them?

Solution for #1: Define KPIs, organize them, and conceptualize dashboards. Start with a napkin, then draw them on a piece of paper. Have somebody generate a PDF file with them and make sure it is in your inbox every morning. Then evolve toward a BI tool.
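
As a minimal sketch of the "summary in your inbox every morning" idea, before any BI tool exists: the script below computes a few KPIs from a hypothetical orders.csv export and writes a one-page text summary. The file name, column names, and the KPIs themselves are assumptions made for the example.

```python
# Minimal morning-KPI sketch, long before a BI tool is in place.
# "orders.csv" and its columns (date, revenue, new_customer) are assumptions.
import pandas as pd

orders = pd.read_csv("orders.csv", parse_dates=["date"])
latest_day = orders[orders["date"] == orders["date"].max()]

kpis = {
    "Revenue": latest_day["revenue"].sum(),
    "Orders": len(latest_day),
    "Average order value": latest_day["revenue"].mean(),
    "Share of new customers": latest_day["new_customer"].mean(),
}

with open("morning_kpis.txt", "w") as report:
    for name, value in kpis.items():
        report.write(f"{name}: {value:.2f}\n")

# Mail or print morning_kpis.txt; the PDF and the dashboard can come later.
```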

Solution for #2: If data is not reliable then you must investigate whether the source is shaky, the retrieval processes have flaws, or the consolidation and calculation rules are buggy. The first case would probably require data-freezing processes and rules. The second and third cases would probably require new data manipulation rules.

Solution for #3: If data takes a long time to be retrieved, you must investigate the cause. It can be that the sources are slow to access and retrieve. It can also be that the transformation and manipulation processes are buggy. It can also be that the reporting tool is not optimized. Last but not least, it can also be that your BI department is flooded with requests.

Solution for #4: If you have many sources that need to be accessed and joined, then you must define ETL (Extract, Transform, Load) processes. If the volume and number of sources are really big, then a data warehouse is a good solution.
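
As a rough illustration of what a small ETL step can look like before a full data warehouse is justified: the sketch below extracts two hypothetical CSV exports, joins and aggregates them, and loads the result into a local SQLite table. All file names and column names are assumptions for the example.

```python
# Tiny ETL sketch: extract two exports, transform (join + aggregate), load into SQLite.
# "web_sessions.csv", "crm_customers.csv" and their columns are assumptions.
import sqlite3
import pandas as pd

# Extract: pull the raw exports from each source
sessions = pd.read_csv("web_sessions.csv")    # e.g. customer_id, visits, revenue
customers = pd.read_csv("crm_customers.csv")  # e.g. customer_id, segment

# Transform: join the sources and aggregate per segment
joined = sessions.merge(customers, on="customer_id", how="left")
per_segment = joined.groupby("segment", as_index=False)[["visits", "revenue"]].sum()

# Load: one consolidated table that analysts can query directly
with sqlite3.connect("analytics.db") as conn:
    per_segment.to_sql("revenue_by_segment", conn, if_exists="replace", index=False)
```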

Solution for #5: First you need to wonder whether you actually need that amount of data. Pre-processing and calculating on the server side are good ideas as well. If none of these apply (though I bet they do), then you must find a tool able to read that amount of data.
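
A hedged example of what "pre-processing on the server side" can mean in practice: instead of shipping millions of raw rows to a spreadsheet, aggregate first and hand over only the summary. The file and columns below are assumptions for the example.

```python
# Pre-aggregate on the server so nobody has to open millions of raw rows.
# "clickstream.csv" and its columns (date, channel, revenue) are assumptions.
import pandas as pd

raw = pd.read_csv("clickstream.csv", parse_dates=["date"])  # could be millions of rows
raw["day"] = raw["date"].dt.date

daily = raw.groupby(["day", "channel"], as_index=False)["revenue"].sum()
daily.to_csv("daily_revenue_by_channel.csv", index=False)   # small enough for any tool, Excel included
```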

Solution for #6: Empowerment is a must-have in any data-driven company. Start by saying no to silly data requests. Train your people. Make it simple: implement a reporting tool and teach people how to use it! You can start with fact tables that can't be modified. Then (much sooner than you think) users will start asking for editing capabilities!

Solution for #7: Check where the bottlenecks are. Make sure your analysts develop soft skills such as communication techniques. Apply Barbara Minto's Pyramid Principle to your communication. Avoid presentations with 1000 slides. Focus, focus, and focus.

Solution for #8: Again, check where the bottlenecks are. Make sure decision makers can read data (and make sure they use it!). Improve transparency in the BI or Analytics department, and make sure there are proper feedback loops.

Solution for #9: Please avoid Diogenes syndrome for data. Don't store ALL data waiting for a miracle to occur and insights to appear out of it. Know your business and identify the pain points. Then, and only then, figure out which data is needed. If the data is there, use it. If not, start recording it now!

Solution for #10: Avoid having too many operational and strategic dashboards. Kill the non-essential indicators. A good hint here is the Three Layers of So What test. Ask of every indicator, analysis, or insight the question "so what?" three times. Each question provides an answer that in return raises another question (a "so what?" again). If at the third "so what?" you don't get a recommendation for an action you should take, then you have the wrong information. Kill it.

In any case, if this sounds complicated or unachievable, reach out to us: info@ducks-in-a-row.es.

To sum up, data is devoted to actionability. For this to happen it must be accessible, reliable, and properly communicated. Then and only then will your company be making a highly effective use of data.