In “The E-Myth Revisited: Why Most Small Businesses Don’t Work and What to Do About It”, Michael E. Gerber invites small business owners to stop working “in their business” and start working “on their business.” One of the central theses of the book is that SME owners should act as if they wanted to franchise their business. This forces them to (1) take a hard look at all of their activities and processes and (2) optimize and standardize those activities and processes. By doing so, they maximize the performance of their business and make it replicable. This idea is similar to one expressed by Ray Dalio in “Principles”: for a team to be successful, its manager must work on the team (and not in the team) and build a system that maximizes the return on any given input.
To some extent, those tips can also be applied to analytics teams. For an analytics team, schematically, the input is the time spent converting data into insights, the output is “quality insights,” and the relationship between the two can be represented as follows:
# of quality insights per month = time spent converting data into insights / average time needed to convert data into quality insights
To increase the amount of quality insights generated by your team, you should work to increase the time spent converting data into insights or to decrease the average time needed to convert data into quality insights. You can do this by building “systems.”
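As a quick sanity check of the formula above, here is a back-of-the-envelope sketch in Python (the function name and the numbers are made up purely for illustration):

```python
def insights_per_month(hours_on_insights: float, avg_hours_per_insight: float) -> float:
    """Expected number of quality insights produced in a month."""
    return hours_on_insights / avg_hours_per_insight

# Example: a team spending 300 hours/month converting data into insights,
# at an average of 20 hours per quality insight
print(insights_per_month(300, 20))  # → 15.0
```

Both levers show up directly: raise the numerator (more insight time) or lower the denominator (less time per insight) and the monthly output goes up.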
Increase time spent turning data into insights
The time spent converting data into insights is clearly a function of your total headcount, so increasing headcount is the obvious solution, but it may not be the easiest.
Another way to look at it is that the time spent converting data into insights is the result of the following equation:
Time spent converting data into insights = Total staff time – Time spent on non-data work
Time spent on non-data work includes things like “stakeholder alignment,” “communication,” etc.
- These tasks are essential to the success of any good work with data (what’s the point of generating insights if there is no interest in them or if they are not communicated properly?).
- But these tasks are usually treated as “afterthoughts.” It is quite rare to see a team with a clear strategy or process around those elements, most likely because this is not as “cool” as working with real data, and also because this is not necessarily part of their skill set.
- As a result, these tasks often take longer than expected, and more time than necessary to ensure the success of the actual data work they support.
By (1) defining clear processes for how to perform these tasks and (2) standardizing and optimizing these processes over time, you can generate significant time savings (i.e., reduce the time spent on non-data work) and improve the quality of your output at the same time.
A concrete example of this around cross-functional alignment could be to start holding prioritization sessions at the beginning of each month. In the first month of doing this, you realize that to have a good prioritization session you need to have a standard framework for making prioritization decisions. You introduce it in month 2 and it works, but then you realize that to make it even better, you need to have a better process for mapping potential projects to the team, so you introduce it in month 3, etc. Over time, with this iterative approach, you can arrive at a very effective process, allowing your team to spend less time on “political work” and focus more on knowledge creation.
Another example related to company-wide communication: you start without a clear process in month 1 and realize that your study is not being consumed as much as it should be. So in month 2, you launch a monthly forum. During those monthly forums, you realize that your stakeholders need to see the data presented in a certain way to make it more digestible for them, so you adopt a certain format/template, etc.
Again, by optimizing those processes, you not only save time that you can reinvest in creating insights, but you also set yourself up for success, because those time-consuming non-data processes support your team’s ability to create quality insights.
Reduce the average time needed to convert data into quality insights
There are several factors that can influence the time it takes to convert data into quality insights. To name a few:
- The skills of the analyst.
- The support of the team.
- Data availability.
- The existence of tools.
The first strategy is to upskill your analysts to reduce the time it takes them to convert data into quality insights. The higher their skills and the more experience they have, the faster they can turn data into quality insights. While team-level training or individual coaching can typically generate a lot of value, a “soft” way to upskill is to create project “templates” so more junior analysts can adopt best practices and learn quickly. For example, templates can force them to think about key questions like “what is the problem” and “how will the results be used in real life,” which will ultimately help them formulate stronger problem statements before beginning their study.
Creating ways for the team to collaborate and share their knowledge can also be a way to reduce time to insight. It can be as easy as creating Slack channels or Google groups and finding some incentive for people to participate, but those small actions can go a long way. Once those “places” exist, analysts can find support when they are unsure how to proceed, tap into the collective knowledge of the team, and create discussions that inspire new ideas. That’s why I also think it’s great to have regular meetings where analysts can present what they worked on, focusing on the methodology they used, as it spreads knowledge and can spark new ideas.
Data availability can be a big obstacle. If you have to spend your time performing complicated queries because there are no simple aggregate databases, and if you have to triple check your results each time because there is no certified or centralized data source, not only will that create unnecessary stress for the team, but you will waste precious time. Creating the right data pipelines to facilitate downstream analysis can be an effective strategy, if it hasn’t already been done.
Finally, if you have to do the same analysis quite frequently, tooling can be a way to reduce the time you spend on repetitive work. This is pretty common for things like A/B testing, where you can build or purchase licenses for automated tools to do all the statistical testing for you, so you don’t have to reinvent the wheel every time you get data from an experiment. It requires having a specific and repeated use case, but when that’s the case, it can be a great way to reduce time to insight (and a bonus point: this is also a great way to standardize the quality of the output).
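To illustrate the kind of repetitive work such tooling automates, here is a minimal two-proportion z-test in Python. The function name and experiment numbers are made up for the example; in practice a library (e.g., scipy or statsmodels) or a dedicated A/B testing tool would handle this, along with many subtleties this sketch ignores:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for a difference in conversion rates between two variants."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Illustrative experiment: 120/2400 conversions in control, 150/2400 in treatment
z, p = two_proportion_z_test(conv_a=120, n_a=2400, conv_b=150, n_b=2400)
print(f"z = {z:.2f}, p = {p:.4f}")
```

Wrapping this once (or buying it) means no analyst has to re-derive it per experiment, and every readout uses the same methodology.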
Ultimately, there are a few ways to reduce your average time to quality insights, and I’m pretty far from exhaustive here. You can also think about knowledge management, data discoverability, etc.; it all depends on the biggest pain points your team is facing.
In conclusion
We can rework our initial formula:
# of quality insights per month = (total staff time – time spent on non-data work) / average time needed to convert data into quality insights
And while increasing your overall headcount is one way to address the problem, you could achieve similar results by taking a closer look at your processes, infrastructure, tools, and “analyst support” strategy.
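To make that trade-off concrete, here is a quick back-of-the-envelope comparison in Python (all numbers are illustrative):

```python
def insights_per_month(total_staff_hours: float, non_data_hours: float,
                       avg_hours_per_insight: float) -> float:
    """Reworked formula: only the hours left after non-data work produce insights."""
    return (total_staff_hours - non_data_hours) / avg_hours_per_insight

# Baseline: 5 analysts × 160h/month, 40% lost to non-data work, 20h per insight
baseline = insights_per_month(800, 320, 20)  # → 24.0
# Halving non-data work through better processes frees 160 insight-hours,
# i.e., the equivalent of a full extra analyst-month, without hiring anyone
improved = insights_per_month(800, 160, 20)  # → 32.0
```

The same output gain could be bought with headcount, but process improvements get you there with the team you already have.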
This article was cross-published on Analysis explained, a newsletter where I summarize what I’ve learned in various analytics roles (from Singapore startups to SF tech majors) and answer reader questions about analytics, growth, and career.