A couple of weeks ago, I wrote about creating systems to generate more quality insights. I presented how you could increase your team's performance by working on areas like processes, tools, culture, etc., but I never defined what I meant by “quality,” so this week we'll dive deeper into this concept.
Typically, when someone talks about the quality of a data study, we immediately jump to "make sure the data analysis is robust and the results are reliable." I think this is only part of the definition. Based on my more than eight years of experience in analytics, for a data analysis to count as "good work," it has to combine three fundamental elements:
- It responds to a real need in a timely manner.
- It is backed by a solid, proven methodology.
- It is digestible by the organization.
Let's dive in!
For a data analysis to be truly impactful, it is essential that it addresses a genuine, well-defined need. This means understanding exactly what problem is being addressed, identifying who it affects, recognizing why it is relevant at this specific time, and being clear about exactly how the results will be used. The accuracy of this understanding directly correlates with the value your analysis brings to your end users.
And it is vital to identify a real need, as opposed to a perceived one. This will ensure that the analysis is not only useful theoretically but also applicable in practice. It will ensure that on the last day of the project, when you present it to stakeholders, you don't get questions like “what now?” It makes the difference between providing insightful, actionable data and offering information that, while interesting, may not be immediately beneficial.
For example, a retail company might perceive a need to analyze customer demographics broadly, but the real need might be to understand the purchasing patterns of a specific age group. The latter directly influences marketing strategies and inventory decisions, thus having a deeper impact on business operations.
Equally important is the timeliness of the analysis. This aspect comprises two key elements: the relevance of the need at the current time and the speed with which the analysis is provided.
- Relevance of the need: Business needs are often urgent and can evolve quickly, especially if you are a rapidly moving organization. An analysis that addresses a current pressing issue is much more valuable than one that comes too late or was done too early. For example, an analysis of consumer trends in the run-up to a major holiday season can be invaluable to a company in terms of stocking and marketing, but if done after the season has begun, the opportunity is lost.
- Analysis speed: The speed at which the analysis is performed is equally critical, because it determines whether the results arrive while the need is still relevant. This is an important factor to keep in mind, as you may sometimes have to make trade-offs between the thoroughness of the study and speed: for example, if your company wants an analysis to take advantage of a viral social media topic, you can't take two months to deliver results.
In short, the chances of success of your data analysis are significantly higher when you accurately identify and address a real, current need and deliver the results in a timely manner, ensuring maximum relevance and impact.
Too often I see data analysis that doesn't use any standard methodology. And while this doesn't necessarily mean the study won't be good, the chances of producing high-quality work are greatly reduced if you don't follow a proven methodology.
A structured/standardized approach ensures completeness and also enhances the credibility and replicability of the analysis.
One methodology I find easy to follow is the Cross-Industry Standard Process for Data Mining (CRISP-DM). After almost a decade in the field, it is still my frame of reference when starting an analysis from scratch. This framework, often described as the de facto standard process for data science and data analytics projects, has six main phases:
- Business understanding: During this phase, the data analyst must thoroughly understand the business context of the question: what problem we are trying to solve, what has been done in the past, who the "actors" are, what the risks and resources are, and, just as importantly, what the success criteria of the project will be.
- Data understanding: This phase involves becoming familiar with the data through descriptive and exploratory analysis and identifying data quality problems. It is your own "preliminary survey," where you begin to grasp the nuances and potential of the data.
- Data preparation: This phase involves selecting the data you want to work with, based on inclusion/exclusion criteria, and then cleaning and transforming it into a format suitable for analysis. It's like preparing the ingredients before cooking a meal: essential to obtaining a good result.
- Modeling: The idea of "modeling" may sound daunting to some people, but a model can be as simple as a threshold applied to a true/false metric (for example, if your project's goal is to define churn; see the short sketch after this list). During this phase, various modeling techniques are applied to the prepared data so you can compare them against each other and understand which ones are most successful.
- Evaluation: The models are now critically evaluated to ensure they meet the business objectives and success criteria established during the business understanding phase. This often leads to insights that you can use to revisit your business understanding.
- Deployment: The final phase involves applying the model to real-world data and situations, effectively putting the analysis into action, and beginning to use the insights to improve team operations.
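To make the modeling phase less abstract, here is a minimal sketch in Python of what a threshold-based churn definition might look like. The column names, the 30-day cutoff, and the success criterion are illustrative assumptions for this example, not part of CRISP-DM itself.

```python
import pandas as pd

# Hypothetical per-user data: days since each user's last activity.
# Column names and values are assumptions made up for this sketch.
users = pd.DataFrame({
    "user_id": [1, 2, 3, 4],
    "days_since_last_activity": [3, 45, 12, 90],
})

# Data preparation: exclude users with no recorded activity
# (an illustrative exclusion criterion).
eligible = users.dropna(subset=["days_since_last_activity"]).copy()

# Modeling: the "model" here is a single agreed-upon threshold.
CHURN_THRESHOLD_DAYS = 30
eligible["is_churned"] = eligible["days_since_last_activity"] > CHURN_THRESHOLD_DAYS

# Evaluation: sanity-check the outcome against a success criterion defined
# during business understanding (e.g., the flagged churn rate should be
# plausible to the retention team).
churn_rate = eligible["is_churned"].mean()
print(f"Churn rate under a {CHURN_THRESHOLD_DAYS}-day definition: {churn_rate:.0%}")
```

The point is not the code itself, but that the "model" becomes an explicit, documented rule that can be compared against alternative thresholds during evaluation and then deployed.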
This framework increases the chances that your analysis will be robust by forcing you to go through these distinct steps, while still leaving room for creativity.
Digestibility is not just about simplifying complex information and making slides easier to understand. It involves two integral aspects: (1) fostering a deep level of understanding on the part of the audience and (2) allowing them to apply this knowledge in a practical and impactful way. This process is similar to how the body not only breaks down food but also uses nutrients to power various functions.
Foster a deep level of audience understanding
Achieving this requires making data accessible and resonant with the audience. This is where subject matter experts (SMEs) play a crucial role. By involving SMEs early in the analysis process, their domain knowledge can guide the framing and interpretation of the data, ensuring that the analysis aligns with real-world contexts and is presented in a way that is meaningful for the intended audience.
Another key strategy to improve digestibility is the implementation of a "stage-gate" process, involving regular check-ins and updates with stakeholders or the receiving team. This approach avoids overwhelming them with a large amount of complex information at the end of the study. Instead, stakeholders participate in the journey, allowing them to gradually assimilate new knowledge. It also opens avenues for continuous feedback, ensuring that the analysis remains aligned with the audience's changing needs and expectations.
Imagine you are in a large organization implementing a new data-driven strategy. If the data team only presents the final analysis, without any prior involvement, it may be difficult for stakeholders to grasp the nuances or see its relevance to their specific contexts. However, by engaging these stakeholders at regular intervals (through recurring presentations or workshops), they become more familiar with the data and its implications. They can provide valuable feedback, steering the analysis towards the areas most relevant to them and ensuring that the end result is not only understandable but also immediately actionable and tailored to their needs.
Allow the audience to apply knowledge
Actionability revolves around translating this deep understanding into real-world applications or decisions. It is about ensuring that the audience can effectively use the knowledge to generate tangible results. It's about really thinking about the “last mile” between your analysis and real-life impact, and how you can help remove any friction in adopting your insights.
For example, if you're working on a project whose goal is to define user churn, making your study more digestible could include creating a dashboard that lets your business stakeholders see concretely what the results look like for their users.
Other ideas include holding workshops, developing interactive visualizations, etc. – anything that makes it easier to get the team up and running.
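As a rough illustration of the dashboard idea above, the underlying data can be as simple as a per-segment churn summary that a BI tool then visualizes. The segment names and fields below are hypothetical, assuming a churn flag like the one sketched earlier.

```python
import pandas as pd

# Hypothetical per-user data with a churn flag already computed
# (for instance, using a threshold-based definition).
users = pd.DataFrame({
    "segment": ["SMB", "SMB", "Enterprise", "Enterprise", "Consumer"],
    "is_churned": [True, False, False, True, True],
})

# Aggregate into the kind of per-segment summary a dashboard would display:
# number of users and churn rate for each segment.
summary = (
    users.groupby("segment")["is_churned"]
    .agg(users="count", churn_rate="mean")
    .reset_index()
)
print(summary)
```

A dashboard or notebook can then render this table as a chart, so stakeholders can inspect what the churn definition means for their segment without reading the analysis code.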
In summary, the digestibility of a data analytics project is significantly improved by involving SMEs from the beginning and maintaining ongoing communication with stakeholders. This collaborative approach ensures that the study is not only understandable but also directly relevant and valuable to those it is intended to benefit.
Successful data analytics is a combination of technical competence, strategic alignment, and practical applicability. It's not just about following a series of steps, but about understanding and adapting those steps to the unique context of each project. Timeliness, proper execution, and attention to real organizational needs are the pillars supporting the bridge that connects data analysis with organizational success. The ultimate goal is to transform data into actionable insights that drive value and inform strategic decision-making.
I hope you enjoyed reading this article! Do you have any tips you would like to share? Let everyone know in the comments section!
PS: This article was published in Analysis Explained, a newsletter where I summarize what I've learned in various analytics roles (from Singapore startups to SF tech majors) and answer reader questions about analytics, growth, and career.