Counter Common Sense

As long as we employ “common sense” to guide our own actions, we can’t really go wrong, because there is an almost infinite supply of commonsense advice for dealing with our daily situations. If some of these commonsense-guided actions seem inconsistent, so be it; life goes on. Similarly, when we employ common sense in decisions that impact large numbers of people, we can usually find some commonsense explanation to cite when confronted with criticism. In a way, common sense becomes the shield for our hubris. Politicians of any stripe can always find commonsense explanations that appeal to their supporters, however much those explanations evoke disbelief in their opponents. Managers can usually justify their decisions to their peers and superiors, but not to the people whose lives are most impacted.

As I mentioned at the beginning of this “common sense” journey, one of the major problems with using common sense to predict others’ behavior is that we inevitably assume too much. Erroneous assumptions on a large scale lead to all kinds of unintended consequences. Duncan Watts suggests that instead of relying on the old but untrustworthy “predict and control” model, we may want to switch to “measure and react.” As Watts points out in Everything Is Obvious, laypeople’s predictions are often no worse than the “experts’,” and frequently the layperson and the expert are equally wrong. If the prediction is off, then the planning and control steps that follow it are predisposed to go awry.


Just because we can’t confidently predict most complex systems doesn’t mean that we can’t use probability to help make decisions. Of course, we still need to understand the nature of the phenomenon we are confronting. It’s one thing to plan for social behavior that happens with regularity, such as flu season or holiday shopping; it’s another to plan for known but infrequent phenomena, such as the impact of a category 4 hurricane, or for surprises like the “ice bucket challenge.” Seriously, who could have predicted the success of the “ice bucket challenge”?
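As a toy illustration of planning for the regular kind of phenomenon, here is a minimal sketch that estimates next flu season’s vaccine demand from past seasons plus a crude safety buffer. The figures and the two-standard-deviation margin are made up for illustration; they are my own example, not something from Watts.

```python
import statistics

# Hypothetical doses administered (in thousands) over the last five flu seasons.
past_seasons = [410, 395, 430, 420, 405]

mean_demand = statistics.mean(past_seasons)
buffer = 2 * statistics.stdev(past_seasons)  # crude safety margin of about two standard deviations

print(f"Plan for roughly {mean_demand + buffer:.0f}k doses")
```

No comparable calculation exists for the “ice bucket challenge” kind of event, which is precisely the point.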

In addition, we should be cautious about relying on “experts’” opinions. Why? Watts explains that it is because we largely consult “experts” only one at a time. We would be much better off relying on polls of many people, experts and non-experts alike (or no experts at all), for input. Not only do experts cost more; they also tend to advocate more sophisticated models for “better control.” From Watts’ many experiments and reviews of others’ experiments, we learn that in making predictions, simple models do just about as well as the more sophisticated ones. Put differently, the more sophisticated models don’t bring enough return on the investment of all the additional information you have to acquire (at a cost, of course). Watts uses the example of sports games: the key factors for predicting which team might win are whether it’s a home game and what the historical data say about the teams. All the additional nuanced information helps only a little, not enough to make a significant difference.
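To make that concrete, here is a minimal sketch of such a simple model; it is my own illustration, not code from Watts’ book. It predicts a winner from nothing more than each team’s historical win rate plus a small, assumed home-field bonus. The team names, win rates, and the size of the home edge are all hypothetical.

```python
def predict_winner(home_team, away_team, win_rates, home_edge=0.04):
    """Pick a winner from historical win rate plus a small, assumed home-field bonus."""
    home_score = win_rates.get(home_team, 0.5) + home_edge
    away_score = win_rates.get(away_team, 0.5)
    return home_team if home_score >= away_score else away_team

# Hypothetical historical win rates for two made-up teams.
win_rates = {"Falcons": 0.58, "Comets": 0.55}

print(predict_winner("Comets", "Falcons", win_rates))  # the small home edge tips it to "Comets"
```

The point is not that this tiny function is accurate, but that piling on many more finely tuned inputs tends to improve on it only marginally.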


Through experiments, a “measure and react” approach would give organizations more immediate information about what the next step should be and how to implement it. For example, a company can run advertising in one geographic area or for one demographic group and compare the results with similar markets (a rough sketch of such a comparison appears further below). Of course, not every decision allows for experiments; imagine launching a military surge in one town but not in others.

Watts offers these additional, interconnected principles: “local knowledge,” “bright spot success stories,” and “bootstrapping.” Local knowledge brings more accurate information and focused skills to bear on specific problems; in other words, one size cannot possibly fit all. “Local” personnel have a much better grasp of whom to contact, what resources are needed and in what amounts, and where to focus them. Most of the issues that organizations face are not brand new, so it’s efficient to look for ideas that have already been tried. But don’t just copy: by studying those ideas closely, you can see how to adapt them to your needs. Underlying all of this is the notion of “humility.” Watts quotes William Easterly:

A Planner thinks he already knows the answer; he thinks of poverty [or whatever issue] as a technical engineering problem that his answers will solve. A Searcher admits he doesn’t know the answers in advance; he believes that poverty is a complicated tangle of political, social, historical, institutional, and technological factors…and hopes to find answers to individual problems by trial and error…A Planner believes outsiders know enough to impose solutions. A Searcher believes only insiders have enough knowledge to find solutions, and that most solutions must be homegrown.

Watts drives the point home further with this observation: “[Planners] develop plans on the basis of intuition and experience alone. Plans fail, in other words, not because planners ignore common sense, but rather because they rely on their own common sense to reason about the behavior of people who are different from them.”
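Returning to the earlier advertising example, here is a minimal sketch, with entirely made-up numbers, of the kind of “measure and react” comparison described above: run the campaign in one test market, then compare its change in sales against similar markets that never saw the campaign. The sales figures and the helper names are hypothetical.

```python
def average(values):
    return sum(values) / len(values)

def measured_lift(test_before, test_after, control_before, control_after):
    """Lift = the test market's change minus the average change in the control markets."""
    test_change = test_after - test_before
    control_change = average([after - before for before, after in zip(control_before, control_after)])
    return test_change - control_change

# Hypothetical weekly sales (in thousands) before and after the campaign.
lift = measured_lift(
    test_before=120, test_after=135,
    control_before=[118, 125, 122], control_after=[121, 127, 124],
)
print(f"Estimated lift attributable to the ad: {lift:.1f}k")
```

The “react” part is simply that the next decision (expand, tweak, or drop the campaign) is based on the measured lift rather than on an up-front prediction.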

Of course, what I’ve been presenting in this space is based on my own “expertise and experience,” which is likely to commit the same commonsense fallacy, even as I have been learning from Mr. Watts. So, I strongly suggest that you read Everything Is Obvious: How Common Sense Fails Us for yourself.

Till next time,

Staying Sane and Charging Ahead.

Direct Contact: taso100@gmail.com
