Feb 27, 2024

Dangerously Data-Driven

Reflections from my time at Google: Examining the Hidden Risks of Relying Solely on Data in Decision-Making

I squinted at the data my team was laying out in front of me, my nose practically pressed against the giant monitor. 

"Are we sure this is right?" I asked, repeating the question for the third time in that cramped conference room. 

The chorus of exasperated yeses deepened the pit in my stomach.

My engineering and analytics team on Android Messages was detailing the outcomes of our massive integration with the Google Photos team.

The results were underwhelming.

For months, we had developed a feature allowing users to share album links directly with their iPhone friends, circumventing the quality degradation typical of sharing media via MMS, a.k.a. the Green Bubble problem.

Users had requested this capability for months, and analytics showed that many even resorted to other apps to sidestep the issue.

Intuitively, streamlining an existing user behavior should've boosted engagement. 

Yet, in half an hour, I would stand before a panel of Google directors to answer: "Why isn't anyone using this?"

Understanding that user reaction can be unpredictable post-launch didn't fully prepare me for the magnitude of our miscalculation.

The project was a testament to our data-driven approach, underpinned by months of research, analysis, and prototyping—all predicting a surge in adoption and engagement.

Moving from ideation to development, we validated our project at every stage with prototypes, customer research, and more analysis.

We did everything we were supposed to. As a staunch practitioner of data-driven decision-making, I found this experience to be a wake-up call.

It reinforced a critical lesson: even the most well-intentioned approach can lead to bad outcomes when mindlessly followed.

The Pitfalls of Being Data-Driven

With the benefit of hindsight, I recognize that our well-intentioned, rigorous, data-driven approach led us into three significant pitfalls:

Pitfall #1: Execution Procrastination

The mantra "Data is the new oil" implies that perfect information leads to perfect decisions.

But this is misguided. High-quality decision-making doesn't solely emerge from some form of information arbitrage. If that were true, then the tech giants with coffers of user data would always make the right moves.

Asking for more analysis just bred more questions, which drew us into a futile quest for absolute certainty that doesn't exist in complex, dynamic environments.

Instead of spending months trying to predict the future, we should've been building, launching, and learning from real-world feedback.

Pitfall #2: Equating Measurability with Quality

Our over-reliance on metrics steered us towards problems that were easier to quantify. 

As deadlines approached, ROI-based prioritization led us to favor measurable outcomes, sidelining harder-to-quantify aspects.

Consequently, we often opted for quick fixes that improved metrics like latency and error rates, neglecting elements like design polish or more ambiguous explorations of alternative solutions. 

Our approach yielded a functional yet compromised user experience, which I believe contributed to the feature's lack of adoption.

This experience underscored a vital lesson: the difficulty in measuring something doesn't diminish its value.

The visual aesthetics of a Tesla or Duolingo's irreverent marketing voice weren't easily quantifiable, yet they were pivotal in driving product adoption.

Instead of letting data dictate every decision, we needed to let our product values and a commitment to quality guide us.

Pitfall #3: Reinforcing Cognitive Biases

Data, while offering an illusion of certainty, can reinforce some of our worst decision-making biases due to our innate discomfort with ambiguity:

  1. Confirmation Bias: The more ways you can slice data, the easier it is to cherry-pick justifications for pre-determined decisions. We're more likely to interpret new information in ways that entrench existing beliefs instead of ones that challenge them.

  2. Certainty Bias: The quest for certainty often outweighs the pursuit of understanding. We're prone to accept concrete, albeit flawed, conclusions over ambiguous insights that might offer deeper truth. You see this when analyses that highlight correlation get presented as statements of causality.

  3. Availability Bias: We prioritize readily available information, potentially missing out on crucial insights. This bias can solidify existing mental models without new data to challenge them.

Strict adherence to a data-driven approach didn't eliminate biases; it reinforced them.

From Data-Driven to Decision-Driven

The shortcomings we faced served as a turning point for our team and marked a significant evolution in my approach as a product leader. 

We recognized the need to refine our decision-making process, gradually adopting what Kyle Byrd would characterize as a Decision-Driven approach.

Before jumping into the data or analysis, we evaluated the decision that needed to be made. Data was still a critical input, but it was no longer the center of gravity behind everything we did. The means to measure shouldn’t drive the decision; the decision should drive our investment in what to measure.

Embracing this approach meant learning to navigate the ambiguity inherent in most decisions rather than expecting data to render everything in stark black and white. 

Analysis was still vital, but we established a culture and systems that forced us to apply critical thinking and discretion.

We implemented several key practices that led to substantial improvements:

  • Sufficient Information Over Complete Information: We prioritized gathering just enough information to make informed decisions, understanding that waiting for complete data came at the cost of launching and iterating.

  • Directing Analysis Towards Decisions: We ensured every analysis directly contributed to making a decision, preventing unnecessary data collection and clarifying the value of analysis in terms of reducing uncertainty.

  • Embracing Uncertainty: We acknowledged that no amount of planning could eliminate all uncertainty. Planning tools like forecasts, OKRs, and roadmaps became means to communicate intentions, progress, and risks rather than rigid plans to be blindly followed.

  • Question-Led Conversations: Our discussions centered on identifying the right questions to ask. By focusing on the decision-making process instead of just the conclusions, we fostered a more thoughtful approach to problem-solving.

  • Balancing Speed with Rigor: Decisions were made within a specific timeframe, but not without due diligence. Decision-makers were encouraged to seek diverse perspectives and articulate their reasoning. We subscribed to "Think deeply to move quickly" instead of "Move fast (recklessly) and break things."

  • Valuing Personal Judgment: Ultimately, we recognized that while data and structured processes are invaluable, they cannot substitute for human judgment and discretion.

💭 Parting Thoughts

We were still far from perfect, but by embracing a decision-driven approach, our team struck a balance between leveraging data and following intuition.

Our analyses became more focused on facilitating decisions, allowing us to explore a broader range of projects and easily adapt to new information.

While the optimal routines and rituals will vary, I believe a decision-driven approach can benefit any team.

This shift towards more decisive, adaptable decision-making enabled us to accelerate execution, enhance our resilience, and improve the quality of our outcomes.

Thanks for stopping by ❤️

- Phil