A year or so later, you read about someone else succeeding with that very same idea. Thoughtland has produced another false negative and claimed another victim.
If you are reading this book, you are probably the type of person who regularly comes up with ideas for new products and businesses, and I bet that false-negative story I just described sounds familiar to you. I’ve been through this scenario myself too many times to count. On some occasions, I was the one with the idea being dismissed and ridiculed; on other occasions, I was the one doing the dismissing. Let me share more examples with you.
When I was offered the opportunity to join a little-known startup called Google as the company’s first engineering director, almost all my friends and former colleagues thought it would be a bad move. In their opinion, there were already several well-established search engines (remember AltaVista?) and other, even better portals to the web such as Yahoo! In other words, that market was already spoken for and saturated.
I ignored their opinions and joined Google anyway. A few months later, I tried to get a good friend to join me and work in the Google ads team. I told him that we were building one of the most amazing money-making machines ever developed and that in a few years it would generate billions of dollars. His response was: “It may make a few bucks, but I don’t think it will be big at all. I never click on online ads.”
Another friend I tried to recruit chose a lesser position at Yahoo! In his opinion, Google was a minor internet player and would remain one. He thought that Google’s minimalistic home page was a stupid waste of valuable screen real estate compared to Yahoo’s everything-but-the-kitchen-sink landing page and that the results produced by its PageRank algorithm were inferior to Yahoo’s manually curated ones.
In Google’s case, I made the right bet, and those two friends are still licking their wounds. But I’ve been on the wrong side of the fence many times myself. Here’s a short list of relatively recent ideas I snickered at when I first heard them, along with my initial reaction:
Twitter: Who’d want to follow people like me? Heck, I don’t even want to be followed. And what’s with that 140-character limit anyway?
Uber: No thanks. If I can’t find a taxi and can’t afford a limo, I’d rather take the bus than hop in a car driven by an unlicensed stranger. What’s next, go spend the night in a stranger’s bedroom?
Airbnb: You mean that people rent rooms to virtual strangers for the night? I can’t imagine enough people willing to open their home like that or enough people willing to go to sleep in a stranger’s home. Have you ever watched a horror movie?
Tesla’s original Roadster: $120,000 for an all-electric two-seater that runs on batteries? I can buy a Porsche or a used Ferrari for that kind of cash.
I could go on for pages, but you get the point. I should add that I was far from alone in my negative initial opinion of the aforementioned ideas and companies; most of the people I talked to had similar reactions. It happens all the time, to most new ideas.
Just this morning, I read that Amazon agreed to buy Ring,* a company that makes wi-fi–enabled video doorbells and other home security products. The acquisition price has not been disclosed yet, but it’s rumored to be in the neighborhood of a billion dollars. Why do I mention Ring? Because just a few years earlier Ring’s founder and CEO, Jamie Siminoff, couldn’t convince anyone to invest in his video doorbell idea. He couldn’t even get a deal on the popular TV show Shark Tank. In fact, one of the show’s savviest investors and an expert in television infomercials, Lori Greiner, the “Queen of QVC TV,” made the comment: “You’ll never be able to sell this on QVC.”
In a recent QVC appearance, Siminoff said that he sold 140,000 units of his video doorbell ($22.5 million worth) in twenty-four hours—one of the most successful QVC sales of the year. Oops. Add to that a couple million more units sold through various outlets, and what we have is another spectacular false negative courtesy of Thoughtland. If a tank full of sharks can get it so wrong, what chance do we little guppies have?
Escape from Thoughtland
When you combine the Law of Market Failure and Thoughtland, chances are that you will be a victim of one of two possible scenarios:
Disregard for the Law of Market Failure combined with false-positive responses from Thoughtland leads you to overinvest in The Wrong It—an idea that is destined to fail.
Fear of failure combined with false-negative feedback from Thoughtland stops you from pursuing an idea that has the potential to be The Right It—an idea that, if competently executed, is destined to succeed.
As I’ve already mentioned, opinions from Thoughtland and the real world do align sometimes. False positives and false negatives are the norm, but true positives and true negatives do happen.
Sometimes, an idea gets an enthusiastic response in Thoughtland, is launched based on that, and succeeds. “I knew that this was going to be big!” And sometimes an idea gets slammed in Thoughtland, is executed despite the negative reactions, and flops miserably. “Told you so. Do you think you can get your old job back?”
How can you know if the negative or positive responses you are getting from Thoughtland are true or false? My conclusion is that you can’t. Given the Lost-in-Translation Problem, the Prediction Problem, the No-Skin-in-the-Game Problem, and the Confirmation-Bias Problem, there are too many ways to misjudge an idea’s likelihood of success.
But if you can’t trust your opinions, other people’s opinions, or even the opinions of experts, how can you decide if the idea you have and would like to develop is likely to succeed?
You need data!
3
Data Beats Opinions
The title for this chapter, “Data Beats Opinions,” comes from one of Google’s key operational principles. I had always believed that, being a rational guy, I had been making most of my work-related decisions based on data and hard facts. It wasn’t until I started working at Google in 2001 that I realized how much my own opinions, preferences, and biases affected my decision process. I won’t go as far as saying that opinions carried zero weight at Google, but after a few meetings I learned that if I could not support my opinions with sufficient objective data, I had little chance of winning arguments or convincing my colleagues.
Not only that, but I also learned that the company’s data-driven decision process was very exacting. What most people would consider data did not pass muster at Google. In order to get serious consideration in the decision process, the data had to satisfy a number of key criteria:
Freshness: The data has to be fresh—the fresher the better. That’s because what was true a few years (or months or weeks) ago may not be true today. This is particularly important in high-tech businesses and the online world, where people’s attitudes and expectations change the most rapidly. In the late 1990s, for example, one of the performance rules of thumb for websites was that a page had to load in eight seconds or less. Data from a widely publicized study showed that if a web page took longer than eight seconds to load, at least 50% of the website visitors would lose patience and leave the site.
These days, those eight seconds would feel like an eternity to 90% of users. We expect web pages to load instantly, and if they take more than a couple of seconds, we are gone. The “eight-second rule” became the “two-second rule,” and in a few years it will probably be the “half-second rule.” Some types of data spoil faster than a banana left in the back seat of a hot car; others remain valid longer. Unfortunately, unlike ripe bananas, aging data does not develop brown spots and go mushy to alert you, nor does it come with a helpful expiration date. So it’s up to you to be careful with the data you choose to use. If in doubt about its freshness, throw it out.
Strong relevance: The data must be directly applicable to the specific product or decision being evaluated. This may sound like an obvious criterion, but you’d be surprised how often data with weak relevance can trickle into the decision process. The fact that, say, most McDonald’s customers would not order onion rings with their burgers even if they were offered does not mean that you should leave them off the menu for your burger food truck idea.
Known provenance: You should not rely on data collected by other people, in other organizations, or for other projects to make your decisions. Who knows what methods those people used to collect and filter the data? And who knows what biases, influences, and motivations may have affected them when they compiled and summarized it? The “eight-second rule” study and similar research mentioned earlier, for example, were sponsored and publicized by companies that sold products and services to accelerate website performance, so they had a vested interest in showing data that supported their business offering. Make sure you know where your data comes from and how it was collected and filtered.
Statistical significance: The data must be statistically significant. It must be based on a sufficiently large sample to ensure that the result cannot be attributed to chance. And unless you want to be humiliated in front of your colleagues, don’t try to present personal experiences or one-off stories as data. I made that mistake twice in my early tenure at Google, and both times I was quickly reprimanded with a chorus of “Anecdotes are not data.”
To be clear, nobody at Google sat me down and formally walked me through such a list of criteria. But after a few meetings, I learned that the term data in “data beats opinions” meant fresh, relevant, trustworthy, and statistically significant data. I also learned that the quickest and most reliable way to get that kind of data was to collect it myself. As a result, I developed a deep-seated distrust of Other People’s Data.
Other People’s Data
You should not rely on Other People’s Data (OPD) to determine whether your idea is likely to succeed in the market. It’s a tempting but lazy and dangerous shortcut.
First, let me define what I mean by Other People’s Data. OPD is any market data collected and compiled by other people, for other projects, at other times, in other places, with other methods, and for other purposes. OPD violates one or more of the criteria of freshness, relevance, trustworthiness, and significance that we’ve just outlined. The data derived from the experiments, actions, and decisions of other people working on ideas similar to yours can be used to supplement and inform your own actions and decisions. But it’s not sufficient and it should not be a substitute for collecting your own data. Let me explain why.
Whenever you think you have a new idea for a business, product, or service, there are five possible scenarios:
You are the first person in history to come up with that idea. There’s nothing in the world like it.
Other people have had the same or a similar idea and:
they chose not to pursue it;
they are actively pursuing it, but have not launched it yet;
they pursued it, launched it, and failed;
they pursued it, launched it, and succeeded.
Let’s examine each of these scenarios more closely:
Scenario 1: You are the first to come up with that idea. This scenario is extremely unlikely. I know that because I am constantly trying to come up with new and unique, but somewhat plausible, product ideas to use as examples in my pretotyping classes and workshops, and it’s a nearly impossible task. Even when I push the limits of plausibility, tastefulness, or ethics (e.g., beer for dogs, squirrel burgers, cat cloning), I find that someone has beaten me to it. Furthermore, even with all the information now available on the internet, there’s no way of knowing for sure that nobody else is thinking about, or secretly working on, a similar idea somewhere in the world. In the extremely unlikely event that you are indeed the first person in the world to think of a totally new idea, you don’t need to worry about OPD anyway—because there is none. You are in virgin territory, and you will have to collect every bit of data yourself.
Scenario 2a: Others have had a similar idea and chose not to pursue it. This scenario provides us with zero applicable data. The fact that nobody else decided to develop an idea similar to yours does not mean that that idea cannot succeed in the market—if competently executed. Coming up with new ideas is easy; pursuing them requires effort, sacrifices, skin in the game, and more. Most people have lots of ideas but don’t do anything about them; that may tell you something about those people, but it tells you nothing about the idea’s potential for success.
Scenario 2b: Others are actively pursuing a similar idea, but they haven’t launched it yet. This scenario also fails to provide us with any meaningful market data because, unless you spy on them, you can’t know how similar their idea is to yours, what market tests and experiments (if any) they ran to inform their decision, what their risk tolerance is, and so on.
Scenario 2c: Others pursued a similar idea, launched it, and failed. This scenario provides us with some data, but not enough to make a decision based on it. They may have botched one or more aspects of their execution, or their product may have differed from your idea in a small but meaningful way. Not to mention the fact that if that idea was tried at some other time, in another location, or on a different population, their results may not apply to your idea and target market. The history of business is full of ideas that failed at some time or in some place, but flourished at other times and in other places. McDonald’s McSpaghetti, for example, flopped in most countries, but is, believe it or not, popular in the Philippines.
Scenario 2d: Others pursued a similar idea, launched it, and succeeded. This case provides us with potentially relevant market data, but not enough to make a decision based on it. Just because someone succeeds with an idea does not mean that you will succeed with a similar idea. The fact that, in 1983, Stephen King’s book about a homicidal car, Christine, sold well and was even made into a movie does not mean that Madeline, Alberto Savoia’s idea for a book about a murderous motorcycle, will succeed.
The bottom line is this. You should not make a decision about your idea based solely on what other people did or did not do with an idea similar to yours. Their experience, results, and data are not necessarily applicable to your idea.
Am I telling you to ignore any and all data from other people who have pursued ideas and markets similar to yours? Not exactly. I am not telling you to completely ignore it, because there may be something, perhaps even a lot, that you can learn from OPD. But I am telling you not to depend on it, because OPD is not sufficient. When it comes to determining the market potential of a new idea, OPD is simply not enough, and it’s no substitute for your own data.
You Must Get Your Own DAta
Your Own DAta (YODA) is market data collected firsthand by your own team to validate your own idea. To qualify as YODA, the data must satisfy the criteria of freshness, relevance, trustworthiness, and significance. YODA is the opposite of OPD—and heaps more valuable. OPD may seem easier to collect, especially with all the data available online these days. But don’t be lured by the ease with which you can obtain it, because an ounce of YODA is worth a ton of OPD. And the best news is that collecting YODA is neither difficult nor time consuming nor expensive. In fact, getting fresh YODA is often easier, faster, and more fun than digging out and dusting off some stale OPD—especially if you use the tools and tactics you will learn in Parts II and III.
Quick Recap
In Part I, we spent a lot of time learning about failure. This may not have been the most encouraging way to start, but understanding how and why most new products fail in the market is a key prerequisite for appreciating and understanding the tools, techniques, and tactics you will learn in Parts II and III. Before we proceed, however, let me take a moment to summarize what we’ve learned so far.
Here are the main takeaways from Part I, worth repeating, memorizing, and perhaps tattooing somewhere on your body as a reminder:
The Law of Market Failure: Most new ideas will fail in the market—even if competently executed.
Most new ideas fail in the market because they are The Wrong It—ideas that the market is not interested in regardless of how well they are executed.
Your best chance for succeeding in the market is to combine an idea that is The Right It with competent execution.
You cannot depend on your intuition, other people’s opinions, or other people’s data to determine if a new idea is The Right It.
The most reliable way to determine if a new idea is likely to be The Right It is to collect Your Own DAta (YODA).
In Part II of this book, I will introduce you to three categories of tools and techniques to help you collect, analyze, and interpret your own data: thinking tools, to help clarify your idea and identify the data you need to collect; pretotyping tools, to help you test your idea in the market so you can collect YODA efficiently; and analysis tools, to help you interpret the data you collect with objectivity and help you translate that data into decisions.
Part II
Sharp Tools
4
Thinking Tools
We’ve seen what can happen in Thoughtland when you combine fuzzy thinking with a lack of reality checks. We will address reality checks a bit later; first we need to fix the way we think about our ideas for new products.
Clarity of thought is paramount. If your new product idea is vague, imprecise, ambiguous, or open to multiple interpretations, then you don’t have a solid foundation for going forward. Before you can put an idea to the test, you must be able to articulate it with enough clarity and precision to guide the design of meaningful and revealing tests, tests whose results you can trust.