Introduction
Data and analytics functions aren’t new. Since the 1950s, organizations have used data-driven decision-making to improve business performance and gain an edge over the competition[1]. From the 1950s through the 1990s, data and analytics functions were generally the domain of a few large companies. Beginning in the early 2000s, breakthroughs in information management and the widespread adoption of the World Wide Web made data, and the technology to store, process, and analyze it, available to almost every organization regardless of size.
In more recent years, the extraordinary promise of data science, machine learning, and artificial intelligence has set off a scramble to build data science teams in every organization. In many cases, the rationale seems to be “If you build it, value will come.” As a result, many organizations are adopting technology and analytical capabilities faster than their capacity to adapt. The rapid pace of change in technology, knowledge, skills, and abilities is creating confusion about the capabilities and value potential of data science.
Challenges are to be expected in a nascent field. Over time, business managers and data and analytics functions will learn how to work together and add value. Until then, the haphazard adoption of data and analytics functions across so many organizations and industries has the potential to harm business performance, careers, and the credibility of data and analytics on a monumental scale. It doesn’t have to be this way. With proper planning, execution, and integration, data and analytics functions will improve the performance of almost any organization.
This series of articles aims to identify, and provide solutions to, many of the challenges organizations, business managers, and data & analytics professionals face when developing data & analytics functions. The recommendations are aimed primarily at data & analytics teams and functions that support the business performance of an organization, rather than groups that develop customer-facing products. You will also notice that I use the term data & analytics, or D&A, functions or teams; this umbrella covers the many names given to these roles, from data scientist to marketing analyst and many more[2].
The series will cover the following topics with the potential for deeper dives into subtopics:
- Chapter 1: From Data to Value - Wherein we explore process-supported alignment of leaders and D&A teams.
- Chapter 2: All Models Are Wrong, Some Are Useful - Wherein we explore a hypothesis-driven approach to building practical intelligence, tools, and resources.
- Chapter 3: System of Organizational Intelligence - Wherein we explore a system that enables the continuous advancement of applied intelligence.
Chapter 1: From Data to Value
Wherein we explore process-supported alignment of leaders and D&A teams
“The world’s most valuable resource is no longer oil, but data,” said The Economist. The internet is littered with articles echoing the same sentiment. While this is certainly true in terms of the net profits of a handful of tech giants like Alphabet, Amazon, and Meta, the value of data outside these select few is less clear. A 2023 study by NewVantage found that only 23.9% of companies characterize themselves as data-driven and only 20.6% say they have developed a data culture[3]. A KPMG study found that 67% of CEOs often prefer to make decisions based on their intuition and experience rather than on insights from analytics[4]. In another survey of data and analytics leaders, less than half (44%) reported that their team is effective in providing value to their organization[5].
If data is so valuable, why is it so hard to create a data-driven organization? Because those articles are wrong: data isn’t valuable in itself. Like oil, data in its crude form isn’t useful. It has to be refined into products and services that customers need. Alphabet, Meta, Amazon, and the rest don’t sell data; they sell products and services. These companies have identified customers’ needs and refined raw data into products and services that deliver on those needs. For most organizations, the potential value of data isn’t in creating customer-facing products but in developing internal products, such as intelligence, tools, and resources, that improve business performance. Failure to deliver value most often reflects a failure to create a shared vision of internal customer needs and of the products that will improve business performance, rather than a failure of data science or analytics.
The Rift that Undermines Credibility
The unbridled enthusiasm surrounding AI, machine learning, and data science has driven a rush to adopt or extend data and analytics (D&A) functions across the business world. Business leaders are making considerable investments in, and having to manage, capabilities outside their domain of expertise that are evolving so quickly even D&A professionals can struggle to keep up. The D&A team brings substantial expertise in its own domain, along with its own impenetrable language, but often lacks expertise in the organization’s business. Business leaders and D&A teams often go together like vegans and barbecues: where business leaders are all EBITDA and taking action, D&A professionals are p-values and scientific discovery. Terms like AI, machine learning, and actionable insights mean very different things to each group. These deep differences, coupled with rapidly changing technology and ambiguous terminology, create an environment where business leaders and D&A teams are unwittingly at cross purposes.
Consider the following scenario:
The CMO informs the D&A team that the leadership team wants a customer segmentation and market size analysis to determine the ideal customer targets for the upcoming launch of Super Widget 2.0. In addition to analyzing first-party consumer data, the CFO has approved funds to field a custom segmentation survey.
The D&A team designs a survey to distinguish customer types across factors like socio-economic status, demographics, and psychographics, and distributes the survey to stakeholders for approval. After a couple of rounds of edits, and the addition of several unrelated items because “since we’re doing a survey anyway…”, the survey is approved and fielded. Once the data is collected, the D&A team retreats to their corner of the building where they practice the dark magic of data science.
Two weeks later, they emerge with a 78-slide PowerPoint deck (plus appendix), excited to share their findings, including a new classification algorithm developed by the team. After 71 slides of methodology, utility scores, Mahalanobis distances, ROC curves, and so on, they present the consumer segment profiles. The segment descriptions are full of rich detail about each segment’s likes, dislikes, recreational activities, and so forth. The primary, secondary, and tertiary targets, and population estimates for each, are identified. Each segment is distinct from the others, and they have catchy names like “Neo-Classical Urban Hipster” and “Empty-Nest Suburban Luddite.”
At the end of the presentation, the COO points out that the population estimate for the primary target is 10 times greater than the total widget industry forecast.
The CMO says, “This is great! How do I use it?” The D&A team exchanges blank stares, and the CFO grumbles something about dollars and a toilet.
In this admittedly exaggerated example, a poor result was almost certain before the first data point was collected, because no one established a shared line of sight to the objective, the use cases, and how the work would deliver value to the organization. The CMO’s concept of segmentation differed from the D&A team’s and the COO’s. No one considered what information media planners would need to implement the results. Yet everyone was confident they understood each other until the presentation. Even if the D&A team can go back and revise the analysis to meet the business leaders’ needs, they have likely lost some credibility with those leaders. The D&A team may be viewed as a group of really smart folks who don’t quite get the business. With diminished credibility, the planning process becomes less collaborative, often leaving D&A efforts dictated by reactive ad hoc requests that confuse the urgent with the important. Under these conditions, it’s not uncommon to find D&A teams that are overworked yet deliver little, if any, measurable value to the organization.
Value-Based D&A Product Planning Process
Considering the differences in backgrounds, communication styles, and life experiences, misunderstandings between business leaders and D&A teams are unavoidable. People-led efforts to develop a shared vision of D&A products are likely to be inconsistent at best. The problem is not unlike the communication failures once common in operating rooms, which led to increased surgical complications. The WHO created the Surgical Safety Checklist, which has demonstrated a significant reduction in adverse outcomes and served as “a scaffold on which attitudes towards teamwork and communication can be encouraged and improved”[6]. Shifting the burden of communication from individuals to a process adds the structure and repeatability needed for consistent alignment across D&A product opportunities.
The process below is a starting point that should be adapted to your organization. The objective is to align business leaders, the D&A team, and the end users/internal customers (if different) on the opportunities that will return the greatest value, and the process should minimize the effort required to meet that objective. That said, if business leaders and the D&A team approach the process with sincerity, it can be a meeting of the minds working together to make a better organization. These interactions are engaging and rewarding, perhaps even fun.
The bulk of the planning process is conducted annually and aligned with the organization’s goals and strategy. The idea is to create a proactive plan for D&A product opportunities, outside the influence of urgent tasks that divert attention and resources. While the D&A team needs to remain flexible and support the urgent, the plan provides a measure of the opportunity cost of those urgent diversions, helping keep everyone on the path of greatest value.
Opportunity Discovery
The objective of Opportunity Discovery is to identify opportunities to support the business strategy and objectives, resolve problems, improve processes and efficiency, and surface other areas where the D&A team can develop products that will improve performance. My use of ‘Discovery’ here is deliberate: the planning team should expect to discover opportunities that weren’t on their radar. It is critical to obtain input from a broad cross-section of functions and levels to ensure you aren’t missing high-value opportunities. The planning team is composed of selected business leaders, D&A team members, and the end users of the D&A products to be developed. The end users should work closely with the D&A team throughout development to ensure the end product adds value and integrates into their workflow. Chapter 3 will cover organizational systems that support opportunity discovery without becoming a burden to the planning team.
Opportunity Types
The D&A products developed to fit the opportunity can be broadly classified into one of three product types: intelligence, tools, and resources. The characteristics of each type have implications for the potential value and the probability of a successful outcome.
Intelligence
Intelligence in this context is applied knowledge, another term for ‘Actionable Insights’. An effective intelligence product increases the user’s knowledge of the principles, relationships, and other forces that govern some aspect of business or market performance. The user can apply that knowledge to make better decisions that will improve business performance. Developing intelligence products can be exciting. The chance to delve into the unknown and discover something no one else knows is thrilling, at least to people like me. However, the objectives are inherently ambiguous, and with that ambiguity comes risk: it is much more difficult to create a shared vision of success and to estimate the return on effort. When evaluating intelligence products, be mindful of this and vet the opportunity accordingly. Too often, intelligence products are born from fire drills where business leaders want to understand a specific event, such as “Our sales dropped by 7% month over month; why?” The opportunity as presented will almost certainly add near-zero value to the organization. None of this is to say that organizations should not develop intelligence products, only to highlight the need to be vigilant in assessing the opportunity. Chapter 2 will address this topic in greater detail.
Tools
Tools are D&A products designed to improve repeated business decisions or processes. These are the bread and butter of D&A products. Unlike intelligence products, the use case, objectives, current performance, and value potential are knowable from the outset. And while the value of a single decision may be tiny, D&A tools add value through scale. Developing D&A tools requires working closely with the end users and system administrators to ensure the tool integrates seamlessly into the end user’s workflow and the existing IT systems. Be mindful of the cost of deployment as well: it’s surprisingly easy to rack up hundreds of thousands of dollars in data queries and compute time, and cloud service billing terms are inscrutable, so work with your system administrators and the end user to minimize the cost of deployment.
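To make deployment costs concrete before committing to a design, a back-of-the-envelope estimate helps. Below is a minimal sketch in Python; the per-run prices are placeholders I chose for illustration, not actual cloud rates, so substitute the unit prices from your own provider’s bill.

```python
def annual_job_cost(runs_per_day: float, cost_per_run: float) -> float:
    """Back-of-the-envelope annual cost of a scheduled query/compute job.
    cost_per_run is a placeholder; use real unit prices from your cloud bill."""
    return runs_per_day * 365 * cost_per_run

# Four runs per day at $5.00 per run (the figures in the appendix's example brief):
print(f"${annual_job_cost(4, 5.00):,.0f} per year")     # $7,300 per year

# The same job run hourly at a hypothetical $25.00 per run quietly becomes:
print(f"${annual_job_cost(24, 25.00):,.0f} per year")   # $219,000 per year
```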
Resources
Resources is a catch-all category for products that don’t fit the other two types. For example, a self-serve analytics suite that media buyers could use to explore the past performance of ad campaigns by campaign characteristics, customer types, media platforms, and so on.
Opportunity Value Estimation
Opportunity Brief
Just as the WHO Surgical Safety Checklist improved surgical outcomes, the opportunity brief is designed to support teamwork and communication during D&A product planning and development. It guides business leaders, D&A teams, and end users through establishing unambiguous product specifications and value estimates (see the appendix for an example of a completed opportunity brief). The template below is a starting point, and should be adapted to your organization’s needs; a structured sketch of the brief follows the template.
Opportunity Description
What is the opportunity, and what is the proposed solution?
Product Type
Intelligence, Tools, Resources?
Estimated Value
How will the product improve business performance?
How will performance be measured?
Use Case
Who are the end users/internal customers?
How will they use it?
How often will they use it?
Who is the user side point of contact?
Integration
How will the product fit into the user workflow or systems?
Resources Required for Development
Data - granularity, update frequency
Infrastructure
ETL process
Compute time
Labor, etc.
Resources Required for Maintenance
Data - granularity, update frequency
Infrastructure
ETL process
Compute time
Labor, etc.
Time to Develop
How long will it take to develop?
Estimated Cost of Development & Maintenance
Total cost of additional resources including the fully burdened cost of labor to develop
Total cost of additional resources including the fully burdened cost of labor to maintain
Probability of Success
How likely is it that the product will fully realize the opportunity?
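Here is the structured sketch mentioned above: one way to capture the template’s fields as a machine-readable record so briefs can be compared, tracked, and archived. The Python representation and field names are my own illustration, not a prescribed format.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class OpportunityBrief:
    """Illustrative record mirroring the template above; field names are hypothetical."""
    description: str                  # Overview of the opportunity and proposed solution
    product_type: str                 # "intelligence", "tool", or "resource"
    estimated_annual_value: float     # Expected performance improvement, dollars per year
    value_measurement: str            # How actual performance gains will be measured
    end_users: List[str]              # Who the end users / internal customers are
    usage_frequency: str              # How often they will use the product
    user_contact: str                 # User-side point of contact
    integration_notes: str            # How the product fits existing workflows and systems
    dev_resources: List[str]          # Data, infrastructure, ETL, compute, labor to develop
    maintenance_resources: List[str]  # The same, to maintain
    weeks_to_develop: float
    dev_cost: float                   # Fully burdened cost to develop
    annual_maintenance_cost: float    # Fully burdened cost to maintain
    probability_of_success: float     # 0.0 to 1.0
```

Keeping briefs in a structure like this also makes the prioritization and archiving steps later in the process easier to support.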
Creating the opportunity brief requires balancing clarity against effort. The purpose is not an exhaustive accounting of each opportunity, but to establish concrete descriptions that allow comparison across opportunities and expose the uncertainties most likely to derail them.
Quantifying the potential value of an opportunity in terms of revenue will be the most difficult and time-consuming part of the process. In some instances, a qualitative assessment of the impact may have to suffice; new or revolutionary ideas, for example, are unlikely to have sufficient data to estimate revenue potential accurately. Don’t pass on a great opportunity because you lack the data to generate a statistically valid revenue forecast. That said, I’d hesitate to prioritize opportunities where any ambiguity remains regarding product requirements and required resources.
Prioritization
Prioritizing opportunities for development means selecting the mix of opportunities that will yield the greatest value for the organization. The plan should reflect your organization’s objectives. For example, if a video streaming company wants to hit revenue growth of 5% year over year through increased premium subscriptions, opportunities that support premium subscription acquisition or reduce churn would make sense to prioritize. Depending on the size and skill sets of the D&A team, you may also need to prioritize opportunities that maximize utilization of your team. Whatever the planning team selects, it is good practice to share the plan with business leaders for approval.
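As one simplified illustration of selecting the “best mix”, opportunities can be ranked by risk-adjusted first-year return: estimated annual value discounted by the probability of success, net of cost. In the sketch below, the numbers for the A/B testing tool come from the example brief in the appendix; the other two entries are hypothetical placeholders, and the formula deliberately ignores strategic fit and team utilization, which the planning team still has to weigh.

```python
# (estimated annual value, total first-year cost, probability of success)
candidates = {
    "A/B testing tool": (1_209_000, 12_300, 0.90),  # from the appendix example brief
    "Churn model":      (500_000, 60_000, 0.50),    # hypothetical placeholder
    "Self-serve suite": (200_000, 40_000, 0.75),    # hypothetical placeholder
}

def risk_adjusted_return(value: float, cost: float, p_success: float) -> float:
    """Expected first-year value discounted by success probability, net of cost."""
    return value * p_success - cost

# Rank candidates from highest to lowest risk-adjusted return.
for name, args in sorted(candidates.items(),
                         key=lambda kv: risk_adjusted_return(*kv[1]),
                         reverse=True):
    print(f"{name}: ${risk_adjusted_return(*args):,.0f}")
```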
Don’t toss opportunities not selected for development. Document the reasons they weren’t included and the next steps, if any. There is value in tracking ideas that showed little promise, even if only to remind you why an idea was dropped the last time it came up. If possible, store the opportunity briefs, tagged, in a searchable location for future reference.
Final Thoughts
Any new process viewed as additional work is likely to be met with some resistance. Integrating the Value-Based D&A Product Planning Process into the regular workflow of your organization will require support from senior leadership and champions to extol the value of the process and encourage adoption. Also, encourage feedback and be prepared to adapt the process to best serve your organization’s needs.
Ironically, the differences in experience, perspectives, education, etc. that often hinder understanding between business leaders and D&A teams are also the very things that are needed to consistently transform raw data into high-value D&A products. The processes I’ve proposed are designed to support business leaders and D&A teams in merging their respective knowledge, skills, and abilities toward that end. However, process alone will not be enough; all parties must also come together with a sincere commitment to listen, learn, and understand their colleagues. Ultimately, respect, humility, and kindness are essential to creating teams that are able to leverage the diversity of the members to achieve high performance.
In the next chapter, we explore a hypothesis-driven approach to building practical intelligence, tools, and resources.
Appendix
Example Opportunity Brief
Background
The media buying team has identified the current ad campaign split testing process as an opportunity for improvement. The current A/B testing process is manual and time-consuming; it suffers from input errors, unclear or contradictory outcomes, and false positives; test wins do not generalize to post-test performance gains; and there is no opportunity for meta-analysis across tests.
Opportunity Brief
Product Description - The D&A team will create a revised A/B testing process that does the following:
Automate the data collection and version testing process using the data already used to create tests.
Create a new report with enhanced usability and self-serve analytics.
Implement a new performance criterion, revenue per thousand clicks, replacing the current metrics of conversion rate and average order value, which have encouraged cherry-picking of results.
Implement proper sampling methods and statistical methods to reduce false positives, identify anomalies, and improve the generalizability of results (a minimal sketch follows this list).
Implement a variation-type classification system which will allow for meta-analysis to discover attributes of successful campaigns that can be applied broadly.
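A minimal sketch of the proposed criterion and variant comparison, assuming per-click revenue samples for each variant. The use of Welch’s t-test is an illustrative choice on my part; the brief specifies only “proper sampling methods and statistical methods.”

```python
import numpy as np
from scipy import stats

def revenue_per_thousand_clicks(per_click_revenue: np.ndarray) -> float:
    """Proposed performance criterion: revenue per 1,000 clicks for one variant."""
    return 1000 * per_click_revenue.mean()

def compare_variants(a: np.ndarray, b: np.ndarray, alpha: float = 0.05) -> dict:
    """Welch's t-test on per-click revenue (an illustrative choice of test)."""
    _, p_value = stats.ttest_ind(a, b, equal_var=False)
    return {"rptc_a": revenue_per_thousand_clicks(a),
            "rptc_b": revenue_per_thousand_clicks(b),
            "p_value": p_value,
            "significant": p_value < alpha}

# Hypothetical per-click revenue samples for two ad variants.
rng = np.random.default_rng(0)
variant_a = rng.exponential(scale=0.85, size=5000)
variant_b = rng.exponential(scale=0.90, size=5000)
print(compare_variants(variant_a, variant_b))
```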
Product Type
Tool
Estimated Value
How will the product improve business performance?
Reducing media buyer time spent manually entering data by approximately 100 hours per week at an average fully burdened rate of $44.98/hour (100 hours × $44.98 × 52 weeks) ≅ $234k annually
Average campaign performance improvement of 0.1% per year ≅ $975k annually
How will actual performance gains be measured?
Changes in campaign performance can be measured by comparing performance under the old system vs. the revised system.
Better understanding of the true attributes of successful campaigns due to a less contaminated results history and the ability to perform meta-analysis. Revenue impact is included in the previous estimate.
Increased employee engagement due to a reduction in low-value activities and more time spent on creative problem-solving (not quantified)
Use Case
Who are the end users?
Media Buyers
How will they use it?
Test results will be updated in the A/B Test Report and Media Buyers will push winners to production
How often will they use it?
3-6 tests per week
Who will be the user point of contact?
Media Man Dan
Integration
How will the product fit into the user workflow or systems?
Media Buyer creates an A/B test in Salesforce using the current process.
The automated system samples campaign performance from the database at defined intervals, performs tests, and appends new results to the results data table.
Media Buyer receives an email alert when a test is completed.
A/B Test Report updates to include new results (connected to data table).
Resources Required for Development
120 hours labor
Campaign data at the individual level (ready)
A/B Test data at the campaign and variation level (ready)
Virtual Machine for compute (ready).
Resources Required for Production - Data, infrastructure, ETL, compute time, labor, etc.
1 hour labor per week
Campaign data at the individual level (ready)
A/B Test data at the campaign and variation level (ready)
Virtual Machine for compute (ready).
Time to Develop
4 weeks
2.5 weeks initial development
1.5 weeks pilot and revision
Estimated Cost of Development & Maintenance - Total cost of additional resources including fully burdened cost of labor.
60 hours of data science labor at $80.60 ≅ $5,000
4x Daily queries and writes at $5.00 per instance ≅ $7,300 per year
Probability of Success
90% confidence. The D&A team has extensive experience with the data, sampling, statistics, and automation programming required. The performance estimates are conservative and reachable. The risk of failure is low, with strong upside potential.
[1] Parra, X., Tort-Martorell, X., Alvarez-Gomez, F., & Ruiz-ViƱals, C. (2022). Chronological evolution of the information-driven decision-making process (1950–2020). Journal of the Knowledge Economy. https://doi.org/10.1007/s13132-022-00917-y
[2] Some more examples: Data Scientist, Marketing; Data Strategy, Research & Insights; Data Solutions; Business Intelligence; Marketing Science…
[3] NewVantage Partners. (2023, January). Data and Analytics Leadership Annual Executive Survey 2023. https://www.wavestone.us/wp-content/uploads/2022/12/Design-2023-Data-Analytics-Survey-Report.pdf
[4] Leavy, E. (2022, June 28). Data Quality Crisis: New Survey Reveals 77% of Organizations Have Quality Issues. https://www.aidataanalytics.network/data-governance/articles/data-quality-crisis-new-survey-reveals-77-of-organizations-have-quality-issues
[5] Gartner. (2022, September 15). Gartner Survey Reveals Marketing Analytics are Only Influencing 53% of Decisions. https://www.gartner.com/en/newsroom/press-releases/2022-09-15-gartner-survey-reveals-marketing-analytics-are-only-influencing-53-percent-of-decisions
[6] Pugel, A. E., Simianu, V. V., Flum, D. R., & Patchen Dellinger, E. (2015). Use of the surgical safety checklist to improve communication and reduce complications. Journal of infection and public health, 8(3), 219–225. https://doi.org/10.1016/j.jiph.2015.01.001