Most people educated in the Western world are taught to trust numbers. We find it easier to work with numbers and to make comparisons with them than with abstract feelings or ideas. Decision and utility theory, mentioned briefly earlier, depend on this notion, claiming that we make better decisions if we can convert our desires and the probabilities of choices into numbers and make calculations based on them. Despite my earlier criticism of these theories, sometimes forcing ourselves to put numerical values on things can help us define our true opinions and act on them.
But decisions aside, we commonly like to see evidence for claims in numeric form. There is a difference in usefulness and believability between someone saying “Our search engine is 12% slower on 3-word queries” and “The system is slow.” Numerical data gives a kind of precision that ordinary language cannot. What’s more, people often demand numerical data to support the claims others make. The statement “The system is slow” invites the question “How do you know this?” Without some kind of study or research behind the answer, the claim is difficult to trust, or rests solely on the opinion and judgment of the person making it. Sometimes, a specific piece of information answers an important question and resolves a decision much faster than would otherwise be possible.
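A claim like “12% slower on 3-word queries” is only as trustworthy as the measurement behind it. As a minimal sketch of where such a number might come from (all timings here are invented for illustration), a percentage slowdown is usually just two means and a ratio:

```python
# Hypothetical latency samples (in ms) for 3-word queries, measured
# before and after some change. All numbers are made up.
baseline = [102, 98, 105, 99, 101]
current = [114, 110, 118, 111, 112]

def mean(samples):
    return sum(samples) / len(samples)

# Relative slowdown of the current system against the baseline.
slowdown = (mean(current) - mean(baseline)) / mean(baseline)
print(f"{slowdown:.0%} slower")  # prints: 12% slower
```

Even here, the precise-sounding “12%” hides choices worth asking about: which queries were sampled, how many runs were made, and whether the mean is the right summary at all.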
Data does not make decisions
The first misconception about information is that it will make a decision for you; it rarely does. A good piece of information works like a flashlight. It helps illuminate a space and allows someone who is looking carefully to see details and boundaries that were invisible before. If there is currently no data or research into important claims, taking the time to get data can accelerate the decision-making process. The fog starts to lift and things become clear. But the returns diminish over time. After the first light has been lit and the basic details have been revealed, no amount of information can change the nature of what’s been seen. If you’re stranded in the middle of the Pacific Ocean, knowing the current water temperature or the subspecies of fish nearby won’t factor much in any of the decisions you’re likely to make (but knowing the water currents, trade routes, and constellations might). For most tough decisions, the problem isn’t a lack of research or data. Tough decisions exist in this universe no matter how much information you have. I think the phenomenon of analysis paralysis, where people analyze and discuss obsessively, is symptomatic of the desperate belief that if only there were enough data, the decision would resolve itself. Sadly, this isn’t so. Information helps, but only up to a point.
It’s easy to misinterpret data
The second misconception about data is that it’s all created equal. It turns out that when working with numbers, it’s very easy to misinterpret information. As Darrell Huff wrote in How to Lie with Statistics (W.W. Norton, 1993), “The secret language of statistics, so appealing in a fact-minded culture, is employed to sensationalize, inflate, confuse, and oversimplify.” Huff categorizes the many simple ways the same data can be manipulated to make opposing arguments, and he offers advice that should be standard training for decision makers everywhere. Most of the tricks involve the omission of important details or the exclusive selection of information that supports a desired claim.
For example, let’s say a popular sports drink has an advertisement that claims “Used by 5 out of 6 superstars.” It sounds impressive, but which superstars are using the product? What exactly separates a star from a superstar? Whoever they are, how were they chosen for the survey? How do they use the drink? To wash their cars? Were they paid first, or were they rejected from the survey if they didn’t already use the drink? Who knows. The advertisement certainly wouldn’t say. If you look carefully at all kinds of data, from medical research to business analysis to technological trends, you’ll find all kinds of startling assumptions and caveats tucked away in the fine print, or not mentioned at all. Many surveys and research reports are funded primarily by people who have much to gain from particular results. Worse, in many cases, it’s magazine and newspaper articles written by people other than those doing the research that are our point of contact with the information, and their objectives and sense of academic scrutiny are often not as high as we’d like them to be.
Research as ammunition
The last thing to watch out for is ammunition pretending to be research. There is a world of difference between trying to understand something and trying to support a specific pet theory. What happens all too often is someone (let’s call him Skip) has an idea, but no data, and seeks out data that fits his theory. As soon as Skip finds it, he returns to whomever he’s trying to convince and says, “See! This proves I’m right.” Not having any reason to doubt the data, the person yields and Skip gets his way. But sadly, Skip’s supporting evidence proves almost nothing. One pile of research saying Pepsi is better than Coke doesn’t mean there isn’t another pile of research somewhere that proves the exact opposite. Research, to be of honest use, has to seek out evidence for the claim in question and evidence to dispute the claim (this is a very simple and partial explanation of what is often referred to as the scientific method). Good researchers and scientists do this. Good advertisers, marketers, and people trying to sell things (including ideas) typically don’t.
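Skip’s habit of hunting for agreeable data is easy to simulate. The sketch below (all numbers are invented) keeps quietly re-drawing small survey panels until one happens to support a claim that is false for the population as a whole:

```python
import random

random.seed(7)  # fixed seed so the run is repeatable

# Hypothetical population: only 15% actually prefer the product.
population = [random.random() < 0.15 for _ in range(10_000)]

# Honest approach: draw one panel of 6 and report whatever it says.
honest_panel = random.sample(population, 6)

# Ammunition approach: discard panels until one agrees with the theory.
attempts = 1
panel = random.sample(population, 6)
while sum(panel) < 5:  # keep fishing for "5 out of 6"
    panel = random.sample(population, 6)
    attempts += 1

print(f"honest panel: {sum(honest_panel)} of 6 prefer it")
print(f"after {attempts} quiet re-draws: {sum(panel)} of 6 prefer it")
```

Skip never reports `attempts`; the discarded panels are exactly the disputing evidence that honest research would have to account for.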
The best defense against data manipulation and misinterpretation is direct communication between people. Talk to the person who wrote the report instead of just reading it. Avoid second-, third-, and fourth-hand information whenever possible. Talking to the expert directly often reveals details and nuances that are useful but were inappropriate for inclusion in a report or presentation. Instead of depending exclusively on that forwarded bit of email, call the programmer or marketer on the phone and get his opinion on the decision you’re facing. There’s always greater value in people than in information. The person writing the report learned 1,000 things she couldn’t include in it but would now love to share with someone curious enough to ask.
Aside from using people as sources, a culture of questioning is the best way to understand and minimize the risks of information. As we covered earlier in matters of design and decision making, questions lead to alternatives, and they help everyone to consider what might be missing or assumed in the information presented. Questioning also leads to the desire for data from different sources, possibly from people or organizations with different agendas or biases, allowing the decision maker and the group to obtain a clear picture of the world they’re trying to make decisions in.
Precision is not accuracy
As a last note about information and data, many of us forget the distinction between precision and accuracy. Precision is how specific a measurement is; accuracy is how close to reality a measurement is. Simply because we are offered a precise number (say, a work estimate of 5.273 days) doesn’t mean it has any greater likelihood of being accurate than a fuzzier number (4 or 5 days). We tend to confuse precision with accuracy because we assume that if someone has taken the time to figure out such a specific number, the analysis behind it should improve the odds that the estimate is good. The trap is that bogus precision is free. If I take a wild-assed guess (a.k.a. WAG) at next year’s revenue ($5.5 million), and another one at next year’s expenses ($2.35 million), I can combine them to produce a convincing-sounding profit projection: $3.15 million. Precise? Yes. Accurate? Who knows. Without asking “How do you know this?” or “How was this data produced?”, it’s impossible to tell whether those decimal places represent accuracy or just precision. Make a habit of breaking other people’s bad habits of misleading uses of precision.
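The WAG arithmetic can be made concrete. In this minimal sketch (the 30% error bands are an assumption added for illustration, not part of the original example), the subtraction comes out exact to the penny while the honest range behind it stays enormous:

```python
revenue = 5.5e6    # wild-assed guess at next year's revenue ($)
expenses = 2.35e6  # wild-assed guess at next year's expenses ($)

# The subtraction is exact, no matter how rough the inputs were.
profit = revenue - expenses
print(f"point estimate: ${profit:,.0f}")  # prints: point estimate: $3,150,000

# Suppose each guess could plausibly be off by 30% in either direction.
low = revenue * 0.7 - expenses * 1.3   # pessimistic on both guesses
high = revenue * 1.3 - expenses * 0.7  # optimistic on both guesses
print(f"honest range: ${low:,.0f} to ${high:,.0f}")
```

The point estimate didn’t get any more accurate by carrying its digits through the arithmetic; only the range, and the “How do you know this?” behind it, says anything about accuracy.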