Tuesday, September 6, 2016

Making effective use of data. Chances are you aren't.


Data is the new oil. As such, it must be processed and refined before it can be used. Does your company make effective use of data?

As we discussed in our founding post, according to FORTUNE in a post from early 2016, only 27% of executives think their company makes "highly effective" use of data. Now, the million-dollar question is: does your company make effective use of data? Or, better still: does your company use data at all?

Every company has data, and getting started with it is relatively easy: simply start collecting some data across various systems and do some processing on it (even if it is manual). Very quickly you realize that making (highly) effective use of data is considerably harder than you had imagined. In any case, the rest of this post focuses on companies that, one way or another, already use data and want to make more effective use of it.

The key exercise is to determine what making (highly) effective use of data means. We will approach it in reverse: we will characterize ineffective use of data through situations we have encountered over the last few years. Will you be able to pass the test?

Test #1: Which three KPIs (key performance indicators) do you check every morning when you sit down at your desk? If you cannot answer with a clear, concise set of indicators or dashboards, then your company is making ineffective use of data.

Test #2: Does the data you retrieve raise any doubts about its quality? Is it reliable? Is it readable? If a single answer is "no", then your company is making ineffective use of data.

Test #3: Does it take you too long to retrieve data? Then your company is making ineffective use of data.


Test #4: Can you retrieve and join data from different sources? If not, or if it is a manual process, or if some of the sources are not accessible, then your company is making ineffective use of data.

Test #5: Can you read large amounts of data? If Excel falls short and no other tool is used for the job, then your company is making ineffective use of data.

Test #6: Can you retrieve data by yourself? If you permanently need to ask for help (usually because of the complexity of the systems involved), then your company is making ineffective use of data.

Test #7: Are insights communicated to and understood (not necessarily agreed with) by everybody? If data leads to confusion or is poorly communicated, then your company is making ineffective use of data.

Test #8: Do you face piles of bureaucracy that keep decision makers from receiving the right information on time? Then your company is making ineffective use of data.

Test #9: Do you start from specific questions you need to answer, and then determine where the information required to answer them lives? If not, then your company is making ineffective use of data.

Test #10: Do you take concrete actions based on the data and the insights? If not, then you are merely collecting information. Hence, your company is making ineffective use of data.

So your company achieves a data-driven culture if and only if none of the ten situations above takes place. How do we solve them, then?

Solution for #1: Define KPIs, organize them, and conceptualize dashboards. Start on a napkin, then move them to paper. Then have somebody prepare them every morning as a properly distributed PDF, and make sure it lands in your inbox each day. Finally, evolve towards a BI tool.
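
Before jumping to a BI tool, that intermediate "PDF in your inbox" step can be automated with a few lines of code. Below is a minimal sketch in Python; the SQLite file, the orders table with its amount and created_at columns, the e-mail addresses, and the SMTP host are all hypothetical placeholders, not a prescription:

```python
# Minimal sketch: compute three KPIs, render them to a one-page PDF,
# and e-mail it (schedule with cron to land in the inbox every morning).
# All names below (warehouse.db, orders, amount, created_at, addresses,
# SMTP host) are hypothetical placeholders.
import smtplib
import sqlite3
from email.message import EmailMessage

import matplotlib.pyplot as plt
from matplotlib.backends.backend_pdf import PdfPages


def compute_kpis(db_path="warehouse.db"):
    """Return yesterday's order count, revenue, and average basket."""
    con = sqlite3.connect(db_path)
    row = con.execute("""
        SELECT COUNT(*), COALESCE(SUM(amount), 0), COALESCE(AVG(amount), 0)
        FROM orders
        WHERE DATE(created_at) = DATE('now', '-1 day')
    """).fetchone()
    con.close()
    return dict(zip(("orders", "revenue", "avg basket"), row))


def render_pdf(kpis, path="kpis.pdf"):
    """Write the KPIs on a single PDF page."""
    with PdfPages(path) as pdf:
        fig, ax = plt.subplots(figsize=(8, 4))
        ax.axis("off")
        text = "\n".join(f"{name}: {value:,.2f}" for name, value in kpis.items())
        ax.text(0.1, 0.5, text, fontsize=18, va="center")
        pdf.savefig(fig)
        plt.close(fig)
    return path


def send_report(pdf_path, to="ceo@example.com"):
    """E-mail the PDF as an attachment."""
    msg = EmailMessage()
    msg["Subject"], msg["From"], msg["To"] = "Morning KPIs", "bi@example.com", to
    msg.set_content("Yesterday's KPIs are attached.")
    with open(pdf_path, "rb") as f:
        msg.add_attachment(f.read(), maintype="application",
                           subtype="pdf", filename="kpis.pdf")
    with smtplib.SMTP("localhost") as smtp:   # replace with your SMTP host
        smtp.send_message(msg)


if __name__ == "__main__":
    send_report(render_pdf(compute_kpis()))
```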

Solution for #2: If the data is not reliable, then we must investigate at which point it stops being so, both between systems and within them. Likewise, there may be errors in the retrieval process, or the transformation and manipulation processes may be flawed. The first case will probably require data-freezing processes. The second and third cases will probably require new manipulation rules.

Solution for #3: If data takes ages to retrieve, the sources themselves may be slow (the SalesForce API, for example). The transformation and manipulation processes may also be buggy, or the reporting tool may not be optimized. Last but not least, your BI department may be drowning in requests.

Solution for #4: If you have many data sources that need to be accessed and joined, you should define ETL (Extract, Transform, Load) processes. If the data volume is truly large, a Data Warehouse (DWH) can be a good solution.
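
As a toy illustration of what those three letters mean in practice, here is a sketch in Python. The CSV file, table, and column names are made up for the example; a production pipeline would add scheduling, logging, and incremental loads on top of this skeleton:

```python
# Minimal ETL sketch: extract a hypothetical CRM export, transform it
# into a clean schema, and load it into a small SQLite "warehouse".
# File and column names (crm_export.csv, customer_id, country) are
# illustrative, not a prescription.
import csv
import sqlite3


def extract_crm(path="crm_export.csv"):
    """Extract: read the raw rows from the source export."""
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))


def transform(crm_rows):
    """Transform: normalize types and drop rows failing basic checks."""
    clean = []
    for row in crm_rows:
        if not row.get("customer_id"):
            continue  # unusable without a key
        clean.append((int(row["customer_id"]),
                      row.get("country", "").strip().upper()))
    return clean


def load(rows, db_path="warehouse.db"):
    """Load: upsert the clean rows into the warehouse dimension table."""
    con = sqlite3.connect(db_path)
    con.execute("""CREATE TABLE IF NOT EXISTS dim_customer
                   (customer_id INTEGER PRIMARY KEY, country TEXT)""")
    con.executemany("INSERT OR REPLACE INTO dim_customer VALUES (?, ?)", rows)
    con.commit()
    con.close()


if __name__ == "__main__":
    load(transform(extract_crm()))
```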

Solution for #5: First, ask yourself whether you really need that much data. If you do, then data pre-processing and calculations can be done on the server side. If that is not possible (although we bet it is), you will need to find a tool capable of reading that amount of data.

Solution for #6: Self-service is a pipe dream for many companies, yet it is something worth pursuing. Start by saying no to simple requests. Train your people and make it very simple (not necessarily cheap): implement a reporting tool and teach everybody (from the analyst to the CEO) how to use it. You can start with very simple tables that cannot be modified. Later on (much sooner than you may think), users will start asking for modification and editing rights.

Solution for #7: Check where the bottlenecks are. Make sure your analysts have good communication skills. Apply Barbara Minto's Pyramid Principle. And, please, avoid presentations with dozens of slides. Focus is the key.

Solution for #8: Again, analyze where the bottlenecks are. Make sure decision makers can access the data (and make sure they use it!). The BI or Analytics department may also need to improve its transparency. The key may lie in improving the feedback loops.

Solution for #9: Please avoid the data version of the Diogenes Syndrome. There is no need to store all your data hoping that, by some miracle, insights will emerge from it. You need to know the business strategy and operations well enough to identify the key points that require a decision. Then, and only then, figure out which data you need. If the data is there, use it. If not, start recording it immediately.

Solution for #10: A good piece of advice is to avoid piling up dashboards and KPIs (again, focus is the key). A good tool is the "Three Layers of So What" test. Ask every indicator, dashboard, or analysis: so what? Frequently, the answer raises the same question again: so what? If by the third time you ask no clear recommendation for action has emerged, then the analysis, KPI, or dashboard is useless. Throw it away.

In any case, if all this sounds too complicated, just reach out to us: es.ducks-in-a-row.es.

To wrap up, we just want to stress that data represents, or should represent, action. For that to happen, data must be accessible, reliable, and efficiently communicated. Then, and only then, will your company be making effective use of data.

Tuesday, August 30, 2016

Bed & Breakfast Analytics: we are launching a blog in Spanish

If you talk to a man in a language he understands, that goes to his head. If you talk to him in his language, that goes to his heart. Nelson Mandela.

We had been thinking for a while about opening a section of Bed & Breakfast Analytics for Spanish speakers. After a lot of work on several fronts, we have taken the plunge and started to post in Spanish (sorry, we could not find a Spanish translation for that word that convinced us). Faithful, of course, to our own style.

As befits a founding post, we want to talk about who we are and what we do. Our blog, which we have called Bed & Breakfast Analytics, is the blog of our company, Ducks|in|a|row. We are a recently founded consultancy specializing in the effective and efficient management of data. Our services include: predictive analytics, customer analytics, digital analytics, conversion rate optimization, reporting, information flow management, process optimization, and training.

The central axis connecting all these points is data. Every company has data. It is often scattered across several systems that, very frequently, speak different languages and hardly understand each other. As a result, the consolidation of the different information sources, when it is possible at all, ends up being done manually. The net result is a considerable investment of time in manipulating the data rather than analyzing it, which is the only thing that makes data actionable. If an analysis, a report, a chart, or a single figure does not ultimately lead to an action or a decision, then we are wasting our time. This is the main focus of Ducks|in|a|row: generating action efficiently.

What does efficient mean? An article in Fortune from early 2016 mentions three big problems in data management. Let's analyze them in some detail, although we will come back to this point in future posts.

Around 80% of large companies have seen some important strategic decision go down the drain because of faulty data. This usually happens when data sources are unstable or must be manipulated frequently and manually.

72% of those companies have experienced delays in delivering information to the people who have to make the decisions. This is an obvious consequence of unstructured data, of an excess of manipulation (and a lack of analysis), and of a reporting strategy that is either non-existent or poorly implemented.

Only 27% of executives believe their company makes effective use of data, and 32% think that the mountains of data have made things worse. This is a symptom we have noticed very frequently: a tendency to invest a lot of time (and money) in collecting data and very little in consolidating or analyzing it.

It is often said that data is the new oil. Indeed, a lot of information can be extracted from data but, just as with oil, the refining processes must be well defined. In the case of data, this means automating the cleaning, consolidation, reporting, and analysis processes.

In short, making effective use of data is not easy, but it is possible. We can help you.

We hope you will keep on reading us. We will publish as often as our workload allows!

Monday, June 13, 2016

Chicken or the egg dilemma: tools or analysis, what does come first?

We need to install tool X, says the CEO. How much does it cost, asks the CFO. Why do we need it, asks the analyst. There is no best-tool-for-everything. Your own, unique context should determine the right tools for your company.



This post can be thought of as a continuation of our previous post, "The broken pyramid, or the hungry hungry hippos game". On that occasion we discussed what happens when a company decides to stop its data-related processes at the reporting step and forgets about further analysis and predictive analytics. In this post we are going to discuss what happens when a tool is chosen without contemplating the scenarios for its usage. This is another major disaster. Let me recall four different stories I've faced over the last years.

The current tool is the worst you could ever have.
We were doing consultancy for an e-retailer in the European market. The company decided to open a new channel: wines. To that end, they hired an experienced wine buyer, an expert in its logistics (pretty fascinating, by the way). The guy, as you can imagine, had no experience in the digital world and hardly used data for his strategy and daily operations (according to Avinash, such people should be immediately fired). It was a promising start! Right after he joined, I explained to him how to use our BI tool (Qlikview). He did not seem impressed. When I asked him why, he answered: "SAP is the best tool for this kind of thing". Of course, we did not implement SAP. For the business complexity the company faced at that moment, Qlikview was more than enough. Actually, it still is. He never used Qlikview; he never used data. He was fired three months later.

The best-tool-according-to-the-CEO.
Another situation I can recall is when the CEO of another e-company came to me and said: I've been at a conference, and now I'm convinced this is the tool we need for A/B testing. I don't recall the name, but it was not one of the most used tools in the market. The tool was very complex to implement, the setup of every test was a nightmare, and we hardly managed to implement a single complex test successfully. And, on top of that, it was very expensive. Result: no more A/B testing, because "it's a framework that brings no added value" (sic). Pretty disappointing.

We have a great tool, but let's try another one.
Another situation that needs to be avoided is changing tools for no apparent reason. I remember a case in which we were using Google Analytics for web tracking. The CIO came to us and said he had managed to get a free trial of another tool (Mixpanel). We were reluctant, because the tool we were already using (fully deployed and operational) was enough for our present and short- and mid-term needs. We were happy with it, users were empowered, lots of decisions were taken out of it, and we did not understand why we should try another one. We implemented Mixpanel while working with Google Analytics in parallel. Of course, we could not use all of Mixpanel's features, and the final conclusion, according to the CIO, was: "Mixpanel is a bad tool", which is totally inaccurate. The implementation of a tool requires time and focus. If you don't have them, don't start a new implementation.

I need this-and-that, but I will use none.
Knowing and understanding the needs is also a very complex task. We were doing that job for another company, and I recall a CMO saying that he wanted a BI tool with real-time reporting capabilities. When I asked him why, he was not able to answer. He just wanted it, despite having no resources and no process defined for reacting to information arising from real-time data collection. Result: we implemented something (very expensive) that could be sold as "real-time". It was hardly used. After tons of money spent and an extremely complex implementation, we ended up with a tool that had lots of features but was, by far, underexploited.

Based on these four (horrible) experiences, we can tell that the tool should be the result of your needs, and that it deeply depends on your own, unique context. The opposite will hardly ever work. This, however, is a very complex process, and you need to proceed thoroughly in order to get the best possible tool for your case. Communication with the stakeholders is the key: know their needs, know the current status, know the roadmap for the short and mid-term, and so on. If you can't do this on your own, I strongly recommend hiring a consultant to do the job with an independent view.

To summarize, always take your context into account when deciding which tool to use:

- Talk to every potential user of the tool.
- Establish realistic needs.
- Stick to the tool unless a new set of realistic needs appears and your current tool can't fulfil it.

And, last but not least, remember the 90/10 rule: 10% of your budget for tools, 90% for people exploiting them.

Good hunting! There is always a right tool if you understand your context.

Wednesday, May 11, 2016

Using data effectively. I bet you don't!

Every company has data. Every company understands (or should understand) the value of data. But are you using data in an effective way?

According to FORTUNE, in a post from early this year, only 27% of C-level executives think their company makes "highly effective" use of data. Now, the question is: is your company making "highly effective" use of data? Or even: is your company making "effective" use of data? Or even more: is your company using data at all?

Every company has data, and working with data is simple to start: just begin collecting and processing some of it. You will find very soon that making (highly) effective use of data is much harder than expected. In any case, I will focus for the rest of the post on the case in which your company does use data.

The key (and very complex) exercise is to define what (highly) effective use of data means. Let's do it the opposite way: let's define what an ineffective use of data means by pointing out some situations we've encountered over the last years. Will you be able to pass these tests? And please, be honest!

Test #1: Which three indicators do you check every morning when sitting down at your desk? If you can't answer this question with a clear set of KPIs or a clear set of dashboards, then your company is making an ineffective use of data.

Test #2: Do you have any doubts about the data you retrieve? Is it reliable? Is it clean? Is it readable? If one single answer is "no", then your company is making an ineffective use of data.

Test #3: Does data take ages to be retrieved? Then your company is making an ineffective use of data.

Test #4: Can you retrieve joined data from different sources? If not, or if you need to join it manually, or if some sources are not accessible, then your company is making an ineffective use of data.

Test #5: Can you read and relate large amounts of data? If Excel is not enough and you use no other tool to do so, then your company is making an ineffective use of data.

Test #6: Can you retrieve simple data by yourself? If you permanently need to ask for help (either because the systems are too complex, or because you simply don't want to do it yourself), then your company is making an ineffective use of data.

Test #7: Are the insights properly communicated and understood (not necessarily agreed) by everybody? If data is misunderstood, or poorly communicated, then your company is making an ineffective use of data.

Test #8: Do you face tons of bureaucracy that keeps relevant information from reaching the decision makers who need to see it? Then your company is making an ineffective use of data.

Test #9: Do you figure out the specific question you need to answer, and then determine whether the right information exists and where it's located in the organization? If not, then your company is making an ineffective use of data.

Test #10: Do you take actions out of the data and insights? If not, then your data is merely nice-to-have. Hence, your company is making an ineffective use of data.

Your company will achieve a truly data-driven culture if and only if none of these 10 situations takes place. So, how do we solve them?

Solution for #1: Define KPIs, organize them, and conceptualize dashboards. Start with a napkin, then draw them on a piece of paper. Have somebody generate a PDF file with them and make sure it is in your inbox every morning. Then evolve with a BI tool.

Solution for #2: If data is not reliable, then you must investigate whether the source is shaky, the retrieval processes have flaws, or the consolidation and calculation rules are buggy. The first case would probably require data-freezing processes and rules. The second and third cases would probably require new data manipulation rules.
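
One cheap way to locate where data stops being reliable is to reconcile simple aggregates (row counts, sums) between consecutive hops of the pipeline. A minimal sketch, assuming a hypothetical orders table that exists both in a source database and in the warehouse:

```python
# Minimal reconciliation sketch: compare daily order counts and revenue
# between the source system and the warehouse, and print the days where
# they diverge. Database files and the orders schema are hypothetical.
import sqlite3

QUERY = """SELECT DATE(created_at) AS day,
                  COUNT(*)         AS orders,
                  SUM(amount)      AS revenue
           FROM orders GROUP BY day"""


def daily_totals(db_path):
    """Return {day: (orders, revenue)} for one database."""
    con = sqlite3.connect(db_path)
    totals = {day: (orders, revenue)
              for day, orders, revenue in con.execute(QUERY)}
    con.close()
    return totals


def reconcile(source_db="source.db", warehouse_db="warehouse.db"):
    """Flag every day where the two hops disagree."""
    src, wh = daily_totals(source_db), daily_totals(warehouse_db)
    for day in sorted(set(src) | set(wh)):
        if src.get(day) != wh.get(day):
            print(f"{day}: source={src.get(day)} warehouse={wh.get(day)}")


if __name__ == "__main__":
    reconcile()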

Solution for #3: If data takes a long time to be retrieved, you must investigate the cause. It can be that the sources are slow to access and retrieve from. It can also be that the transformation and manipulation processes are buggy, or that the reporting tool is not optimized. Last but not least, it can be that your BI department is flooded with requests.

Solution for #4: If you have many sources that need to be accessed and joined, then you must define ETL processes (Extract, Transform, Load). If the volume and number of sources are really big, then a data warehouse is a good solution.

Solution for #5: First you need to wonder whether you actually need such an amount of data. Data pre-processing and calculation on the server side are good ideas as well. If none of these apply (though I bet they do), then you must find a tool able to read such an amount of data.
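
To make "calculation at the server side" concrete, compare the two queries below. The schema is hypothetical, and the point carries over from SQLite to any database engine:

```python
# Sketch: let the server aggregate instead of shipping raw rows.
# The first query returns every row for the client to crunch; the
# second returns one row per month. The orders schema is hypothetical.
import sqlite3

con = sqlite3.connect("warehouse.db")

# Anti-pattern: pull every row, then aggregate client-side (in Excel,
# pandas, or whatever the reader happens to have open).
raw = con.execute("SELECT created_at, amount FROM orders").fetchall()

# Better: pre-aggregate in the database and read only the summary.
monthly = con.execute("""
    SELECT STRFTIME('%Y-%m', created_at) AS month,
           COUNT(*)    AS orders,
           SUM(amount) AS revenue
    FROM orders
    GROUP BY month
    ORDER BY month
""").fetchall()
con.close()

print(f"{len(raw)} raw rows vs {len(monthly)} summary rows")
```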

Solution for #6: Empowerment is a must-have in any data-driven company. Start by saying no to silly data requests. Train your people. Make it simple: implement a reporting tool and teach people how to use it! You can start with fact tables that can't be modified. Then (much sooner than you think) users will start asking for editing capabilities!
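
A non-modifiable starting point can be as simple as a database view over the warehouse. A sketch with hypothetical orders and dim_customer tables; on a full-blown engine you would also lock it down with read-only grants (GRANT SELECT):

```python
# Sketch: expose a curated fact view for self-service users. They query
# the view; the underlying tables stay out of reach. Table and column
# names are illustrative.
import sqlite3

con = sqlite3.connect("warehouse.db")
con.execute("""
    CREATE VIEW IF NOT EXISTS fact_daily_sales AS
    SELECT DATE(o.created_at) AS day,
           c.country          AS country,
           COUNT(*)           AS orders,
           SUM(o.amount)      AS revenue
    FROM orders o
    JOIN dim_customer c ON c.customer_id = o.customer_id
    GROUP BY day, country
""")
con.commit()
con.close()
```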

Solution for #7: Check where the bottlenecks are. Make sure your analysts develop soft skills such as communication techniques. Apply Barbara Minto's Pyramid Principle to your communications. Avoid presentations with 1000 slides. Focus, focus, and focus.

Solution for #8: Again, check where the bottlenecks are. Make sure decision makers can read data (and make sure they use it!). Improve transparency in the BI or Analytics department, and make sure you have proper feedback loops.

Solution for #9: Please avoid the Diogenes Syndrome for data. Don't store ALL the data waiting for a miracle to occur and insights to appear out of it. Know your business and identify the pain points. Then, and only then, figure out which data is needed. If the data is there, use it. If not, start recording it now!

Solution for #10: Avoid having too many operational and strategic dashboards. Kill the non-essential indicators. A good hint here is the Three Layers of So What test. Ask every indicator, analysis, or insight the question "so what?" three times. Each question provides an answer that in turn raises another question (another "so what?"). If at the third "so what?" you don't get a recommendation for an action you should take, then you have the wrong information. Kill it.

In any case, if this sounds complicated or unachievable, reach us: info@ducks-in-a-row.es.

As a summary, data is devoted to actionability. For this to happen, it must be accessible, reliable, and properly communicated. Then, and only then, will your company be making a highly effective use of data.

Tuesday, April 14, 2015

Bed & Breakfast Analytics: The 10 Motivations and the 10 Foundations

We are surrounded by data. What can we do with it? For what? How? What can we expect and what not? What are the common errors? Does size matter? Open source or commercial tools? Here you should find some tips to chart your own journey through the data realm. Bon voyage!

I still remember a nearly hilarious situation I faced in my first job as a data analyst. The CEO came to me and asked for tons of data, very important for a strategic decision. Wow! Panic! Just graduated from college. Just landed in this job. No clue about the business. No clue about the data structure. No clue about KPI names. No clue about anything. After some minutes of panic, I took a deep breath and tried to deliver what I had been asked for. I promise I did the best I could: I gathered data from different departments (no data warehouse, no single source of truth) and different people, in very different formats, used some advanced and fancy stuff in Excel, and, after 10 hours of intense work, delivered a kind-of-report. I really had no idea what I was doing. I had no idea what data I was delivering. Some days later I went back to my CEO and asked him how useful my data had been. His answer was: "Which data? Ah, that report. Well, we did not use it. We took decision XXX based on a market research report the CMO found in a blog". I suppose I should say thanks.

Sorry. Probably your data setup is not correct. You should consider starting to read this.

After years of experience, you've probably heard these stories many times. The Marketing Manager (a random Manager example) requires some data. Let's depict some standard scenarios. The requested data corresponds to...

1. ... clicks, sessions, bounces, etc. This one should be easy. The Web Analytics Manager easily performs this task (it is part of his or her basic skill set), probably by applying some complex advanced segments to the data (easy does not necessarily mean simple). Nowadays, implementations of web analytics tools tend to be very complex, mainly because they need to cover a lot of business cases. Simple, right? Well, now imagine that, for some unfortunate reason, the Web Analytics Manager is on vacation. Panic! The request is then handed to, let's say, a Campaign Manager. Of course, they have access to the web analytics tool, and hence try to retrieve the requested data. A bit of panic appears, as the data seems incoherent (of course it does: they are not applying those complex advanced segments they should be applying). They then search for some documentation on these topics and... surprise! They find no such documentation. Finally, they deliver some numbers, but everybody knows those numbers might not be totally reliable. In the end, as the requests get more complex, the process to retrieve the data gets more complex as well. If the process is not clear enough for all stakeholders, the result is a lack of trust in the delivered data, leading to a lack of trust in the data strategy (if such a thing exists in the company).

Here I already find my first three motivations:

  • Motivation 1: there is a lack of proper documentation. Knowledge transfer is virtually non-existent in many e-commerce companies, especially for data-related topics.
  • Motivation 2: business complexity translates directly into data complexity. Not every stakeholder understands this implication.
  • Motivation 3: a wrong data strategy leads to a lack of trust and, even worse, to wrong decisions.

2. ... revenues, sales, etc. This one gets a bit trickier. The Marketing Manager pings somebody in BI, or in Finance. Traditionally, the request is incomplete or poorly written: time frames missing, before/after refunds unspecified, etc. Normally, even such simple requests require 2-3 iterations, leading, again, to a lack of trust in the provided data. In some cases, a variation of this scenario takes place: reports and datasets are built by manually joining data retrieved from different sources, as the full data map is not clear to everyone.

Again, three more motivations appear:

  • Motivation 4: it's very hard to write clear requests.
  • Motivation 5: outside our comfort area, finding data can be a challenge, even when we have a data warehouse or a nice-and-expensive-but-totally-useless BI tool.
  • Motivation 6: it's easier to request than to retrieve, and it's easier to retrieve than to process.

3. ... data that has already been requested at some point (many times?) before. This one is quite disappointing. There is nothing more frustrating, in either direction, than performing a recurrent request or being asked for the same data time after time. Assume for a second that, indeed, such data is available. Why is recurrently needed data not easily available? Even worse, what if we have (as I mentioned in the previous paragraph) a very nice BI tool? Why are some users reluctant to use self-service data platforms? Now, assume that the data isn't available. Tough times are about to come: it's time to reach out to IT to start gathering this data. Normally, it is very hard for a BI/Data department to write clear specifications for IT to start gathering some data, for several reasons: lack of knowledge of the platform, lack of database architecture knowledge, etc.

With this, two further motivations appear:
  • Motivation 7: having a BI tool does not ensure self-service. Having a self-service platform does not ensure data availability.
  • Motivation 8: communication between BI and IT could be a struggle.

4. ... data or analyses that we don't know whether they can be delivered, or data for which it is not clear how it is going to be used upon delivery. The first challenge when receiving a request, or when performing it, is to determine whether it can be done or not (and whether it makes sense or not). Many analysts work directly with the data, without designing a plan for the analysis or request. That is, both requesters and analysts work without an analysis framework, even when it's clear that the analysis will require some time to be finished, probably due to its complexity. A different case appears when the request comes from the CEO. We have to admit that it is very hard to say no to our CEO. However, the CEO does not know everything, and he's not always right. Even the CEO's requests need to be challenged, understood, and accepted.

With this, we find my two final motivations:
  • Motivation 9: working with an analysis framework is a must-have.
  • Motivation 10: determine whether a request (for data or for an analysis) makes sense. Find a way to challenge every single request.

The Decalogue: the 10 Foundations of Bed & Breakfast Analytics

With my thoughts laid out on the desk, and the motivations I drew from them, I'm ready to state my Decalogue.

1. Burn the silos! Managing data requires transversality and depth in each vertical. Skill silos are not suitable any more.

2. Complexity matters! Understand how business complexity affects data complexity.

3. Better alone than... No data is better than wrong data.

4. Write, write, and write. Documentation is a must-have. Learn how to document and learn how to request.

5. Go beyond your comfort area. You should consider expanding your comfort area. Even more, you should consider not having any comfort area at all.

6. Bring order to chaos. Narrow your analysis: understand the need, design a framework, and only then retrieve the data.

7. Communication is the key. Your CEO does not care about regression models, decision trees, or how fast your database engine is. He wants a way to run a sustainable and profitable business.

8. Choose wisely. The right tool for the right set-up. Self-service is not always the best solution.

9. Don't rush! Data is a path with some mandatory steps. Cheating leads to frustration and lack of trust.

10. So, how are you doing? Integrate data. Move away from data silos. Design KPIs, reports, and dashboards based on integrated data.


So, what's next?

With all this, I want to share with you how I measure, why I measure, and how I analyze, in the hope that you will join me in walking the learning curve of the data-related world. In a bed and breakfast you share your experiences with many others, and you get a clean and cheap base for sightseeing. That is exactly what I intend to do here: every two or three weeks I will share my thoughts, tips, tools, and techniques. Everything I know will be shared. I'm willing to do so!

Hope you find this interesting, and welcome aboard!