
Archive for the ‘Articles’ Category

Human knowledge in the linear domain does not transfer properly to the complex domain

Nassim Nicholas Taleb is a brilliant thinker, philosopher and writer.  I have read and enjoyed Fooled by Randomness as well as Black Swan and am looking forward to his most recent, Antifragile (reviewed in The Economist here).

Taleb had an article in Foreign Affairs in 2011 which hit on some of his main arguments (as applied to international relations) – The Black Swan of Cairo.  Black Swan is Taleb’s term for an unpredicted (and essentially unpredictable) event that disrupts the status quo.

Complex systems that have artificially suppressed volatility tend to become extremely fragile, while at the same time exhibiting no visible risks. In fact, they tend to be too calm and exhibit minimal variability as silent risks accumulate beneath the surface. Although the stated intention of political leaders and economic policymakers is to stabilize the system by inhibiting fluctuations, the result tends to be the opposite. These artificially constrained systems become prone to “Black Swans”—that is, they become extremely vulnerable to large-scale events that lie far from the statistical norm and were largely unpredictable to a given set of observers.

Such environments eventually experience massive blowups, catching everyone off-guard and undoing years of stability or, in some cases, ending up far worse than they were in their initial volatile state. Indeed, the longer it takes for the blowup to occur, the worse the resulting harm in both economic and political systems.

[snip]

Humans simultaneously inhabit two systems: the linear and the complex.  The linear domain is characterized by its predictability and the low degree of interaction among its components, which allows the use of mathematical methods that make forecasts reliable. In complex systems, there is an absence of visible causal links between the elements, masking a high degree of interdependence and extremely low predictability. Nonlinear elements are also present, such as those commonly known, and generally misunderstood, as “tipping points.” Imagine someone who keeps adding sand to a sand pile without any visible consequence, until suddenly the entire pile crumbles. It would be foolish to blame the collapse on the last grain of sand rather than the structure of the pile, but that is what people do consistently, and that is the policy error.

[snip]

Engineering, architecture, astronomy, most of physics, and much of common science are linear domains. The complex domain is the realm of the social world, epidemics, and economics. Crucially, the linear domain delivers mild variations without large shocks, whereas the complex domain delivers massive jumps and gaps. Complex systems are misunderstood, mostly because humans’ sophistication, obtained over the history of human knowledge in the linear domain, does not transfer properly to the complex domain. Humans can predict a solar eclipse and the trajectory of a space vessel, but not the stock market or Egyptian political events. All man-made complex systems have commonalities and even universalities.  Sadly, deceptive calm (followed by Black Swan surprises) seems to be one of those properties.

Taleb also mentions, but does not elaborate on, the important issue of “the illusion of local causal chains—that is, confusing catalysts for causes and assuming that one can know which catalyst will produce which effect.”  There is a tendency to see the last event as the “cause” of something when in fact it is sometimes simply the catalyst of a systemic readjustment, i.e. the straw that broke the camel’s back.  It wasn’t the straw per se, but the cumulative weight that preceded it.
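The sand-pile image in the excerpt can be made concrete.  The sketch below is a minimal version of the Bak–Tang–Wiesenfeld sandpile model (a standard toy model from complexity science, not anything taken from Taleb's own work; grid size, drop count, and seed are arbitrary choices for illustration): grains accumulate quietly for long stretches, and then a single grain triggers a cascade whose size the last grain does not explain.

```python
import random

def simulate_sandpile(size=11, drops=2000, seed=0):
    """Bak-Tang-Wiesenfeld sandpile: grains are added one at a time;
    any cell holding 4+ grains topples, sending one grain to each of
    its four neighbors (grains falling off the edge are lost).
    Returns the avalanche size (topplings) triggered by each drop."""
    rng = random.Random(seed)
    grid = [[0] * size for _ in range(size)]
    avalanche_sizes = []
    for _ in range(drops):
        r, c = rng.randrange(size), rng.randrange(size)
        grid[r][c] += 1
        topples = 0
        unstable = [(r, c)] if grid[r][c] >= 4 else []
        while unstable:
            i, j = unstable.pop()
            if grid[i][j] < 4:
                continue  # already relaxed by an earlier topple
            grid[i][j] -= 4
            topples += 1
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < size and 0 <= nj < size:
                    grid[ni][nj] += 1
                    if grid[ni][nj] >= 4:
                        unstable.append((ni, nj))
        avalanche_sizes.append(topples)
    return avalanche_sizes

sizes = simulate_sandpile()
quiet = sum(1 for s in sizes if s == 0)
print(f"quiet drops: {quiet}/{len(sizes)}, largest avalanche: {max(sizes)} topplings")
```

Most drops change nothing visible, which is exactly the deceptive calm Taleb describes; the rare large avalanche is a property of the accumulated structure, not of the final grain.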

There are echoes of Stephen Jay Gould and Niles Eldredge’s theory of punctuated equilibrium, in which they argued that evolution is not a smooth, continual process but one that proceeds in fits and starts.

Taleb is arguing that our efforts to ensure near-term tactical stability are often at odds with desirable system evolution over the long run.  It is a classic trade-off decision.  He has observed many times that good tactical intentions often end up unintentionally leading to catastrophic strategic outcomes.  An example is forest fire management.  Nobody wants forest fires, and for decades the strategy was simple fire suppression: keep fires from happening, and put them out as fast as possible when they do.

In reducing near-term fires, regrettably, we have allowed forests to accumulate much greater fuel loads than they would carry under natural conditions, where lightning-strike fires periodically clear dead brush.  The result has been increasingly frequent, vast, and intense wildfires beyond our ability to control.  A strategy for achieving near-term stability (fewer wildfires) has ended up worsening the situation in the long run.

When making a strategic decision, it is important to consider the historical context.  Has the existing system evolved over time, and therefore acquired some base level of stability, or has it existed in an unnatural state of artificial stability with all variance suppressed?  If it is the latter, then any new action may have unanticipated consequences that have little to do with the intended plan and much to do with the cumulative evolution that has been avoided along the way.


Go away and think

Wise counsel on understanding the history and context of any fact or assumed causation.  When solving problems, we often want to start with a clean slate, unencumbered by history.  Even where that is possible, it is not always wise.  Only when we understand the nature of the status quo do we begin to be in a position to intelligently change it.  From G.K. Chesterton in The Thing (1929), Ch. IV: The Drift From Domesticity.

In the matter of reforming things, as distinct from deforming them, there is one plain and simple principle; a principle which will probably be called a paradox. There exists in such a case a certain institution or law; let us say, for the sake of simplicity, a fence or gate erected across a road. The more modern type of reformer goes gaily up to it and says, “I don’t see the use of this; let us clear it away.” To which the more intelligent type of reformer will do well to answer: “If you don’t see the use of it, I certainly won’t let you clear it away. Go away and think. Then, when you can come back and tell me that you do see the use of it, I may allow you to destroy it.”

This paradox rests on the most elementary common sense. The gate or fence did not grow there. It was not set up by somnambulists who built it in their sleep. It is highly improbable that it was put there by escaped lunatics who were for some reason loose in the street. Some person had some reason for thinking it would be a good thing for somebody. And until we know what the reason was, we really cannot judge whether the reason was reasonable.  It is extremely probable that we have overlooked some whole aspect of the question, if something set up by human beings like ourselves seems to be entirely meaningless and mysterious. There are reformers who get over this difficulty by assuming that all their fathers were fools; but if that be so, we can only say that folly appears to be a hereditary disease. But the truth is that nobody has any business to destroy a social institution until he has really seen it as an historical institution. If he knows how it arose, and what purposes it was supposed to serve, he may really be able to say that they were bad purposes, that they have since become bad purposes, or that they are purposes which are no longer served. But if he simply stares at the thing as a senseless monstrosity that has somehow sprung up in his path, it is he and not the traditionalist who is suffering from an illusion.

Trials and tribulations of evidence based decision-making

Take Back Your Pregnancy by Emily Oster.

The key to good decision making is evaluating the available information—the data—and combining it with your own estimates of pluses and minuses. As an economist, I do this every day. It turns out, however, that this kind of training isn’t really done much in medical schools. Medical school tends to focus much more, appropriately, on the mechanics of being a doctor.

When I asked my doctor about drinking wine, she said that one or two glasses a week was “probably fine.” But “probably fine” isn’t a number. In search of real answers, I combed through hundreds of studies—the ones that the recommendations were based on—to get to the good data. This is where another part of my training as an economist came in: I knew enough to read the numbers correctly. What I found was surprising.

The key problem lies in separating correlation from causation. The claim that you should stop having coffee while pregnant, for instance, is based on causal reasoning: If you change nothing else, you’ll be less likely to have a miscarriage if you drink less coffee. But what we see in the data is only a correlation—the women who drink coffee are more likely to miscarry. There are also many other differences between women who drink coffee and those who don’t, differences that could themselves be responsible for the differences in miscarriage rates.
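Oster's coffee example is easy to reproduce in a toy simulation.  In the sketch below, every probability is invented purely for illustration (these are not real medical figures): coffee has no causal effect on miscarriage at all, yet a naive comparison of rates makes it look harmful, because a hidden factor (here, nausea, a marker of a healthy pregnancy) suppresses both coffee drinking and miscarriage risk.

```python
import random

def simulate_confounding(n=100_000, seed=1):
    """Toy confounding model: coffee never appears in the miscarriage
    probability, so any observed difference in rates is driven entirely
    by the hidden nausea variable. Returns miscarriage rate by
    coffee-drinking status."""
    rng = random.Random(seed)
    stats = {True: [0, 0], False: [0, 0]}  # coffee -> [miscarriages, total]
    for _ in range(n):
        nausea = rng.random() < 0.7
        # Nausea suppresses coffee drinking...
        coffee = rng.random() < (0.2 if nausea else 0.6)
        # ...and independently signals lower miscarriage risk.
        # Note: `coffee` plays no role in this probability.
        miscarry = rng.random() < (0.05 if nausea else 0.15)
        stats[coffee][0] += miscarry
        stats[coffee][1] += 1
    return {k: v[0] / v[1] for k, v in stats.items()}

rates = simulate_confounding()
print(f"miscarriage rate, coffee drinkers: {rates[True]:.3f}")
print(f"miscarriage rate, non-drinkers:    {rates[False]:.3f}")
```

The simulated coffee drinkers miscarry noticeably more often even though, by construction, coffee does nothing: exactly the correlation-without-causation trap Oster describes.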

Well worth a read.  Her experience is not dissimilar to that of the science writer Stephen Jay Gould when he was diagnosed with an uncommon malignant cancer.  He documented his exploration of the facts and statistics of his diagnosis (a median survival time of eight months) in the article The Median Isn’t the Message.  Understanding what the real facts are is a task in itself.
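Gould's statistical point can be illustrated with hypothetical numbers (the survival times below are made up for illustration, not drawn from his case): a right-skewed distribution can have a median of eight months while the mean, and nearly half of the actual outcomes, sit far out in the right tail.

```python
# Hypothetical survival times in months for 13 patients; the numbers
# are illustrative only. The distribution is right-skewed: a few long
# survivors pull the mean far above the median.
survival = sorted([2, 3, 4, 5, 6, 7, 8, 10, 14, 24, 60, 120, 240])

median = survival[len(survival) // 2]   # middle value of 13 sorted items
mean = sum(survival) / len(survival)

print(f"median survival: {median} months")   # 8
print(f"mean survival:   {mean:.1f} months") # ~38.7
```

The median tells you where the middle patient falls; it says nothing about how long the tail extends, which is precisely why Gould refused to read "median eight months" as a personal death sentence.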

On a separate note, Oster alludes to decision-making being the evaluation of available information.  That is part of it, but I would suggest it is actually an iterative balance of four activities.

1)  Intentions – What are your goals, how will you measure them, what are the parameters that cannot be exceeded, and how will you know when you are successful?

2) Evidence – What does the evidence actually say?

3) Estimates – For the critical knowledge needed to make the decision and which is not available, what are the best available estimates and what risks are associated with those estimates?

4) Forecasts – Once causal analysis is completed, what are the actions necessary to achieve the desired outcomes and what are the forecasts of both required resources and effort as well as forecasts of outcomes?

Intentions, Evidence, Estimates, Forecasts – that seems a fuller and more accurate description.

Measuring expert opinion

The value and dangers of forecasting.  Everybody’s An Expert: Putting predictions to the test by Louis Menand.

Well documented and researched.


None of those things are observable

An interesting article, The STEM Crisis Is a Myth by Robert N. Charette.

For years I have been hearing about the STEM shortage, but every time I look at the employment and salary numbers, all I see is the market functioning normally.  People with STEM degrees working in a STEM capacity in a STEM field are always in demand, and their average compensation, as the article points out, has been fairly steady.

Granted, there are emergent fields, unique circumstances, and pressing needs that will suddenly create a temporary demand for those with a very particular STEM skill set, but the market functions: more people move into or specialize in the hot area, and pretty soon things are back to normal.  Also granted, the best people in any of the STEM fields can command very high premiums over the novice.  You might not like having to pay $150,000 for an experienced ERP implementation manager, and you might wish they were cheaper, but that does not necessarily mean there is a shortage of experienced ERP implementation managers.

I think this is once again an issue arising from particular advocates wanting to use the coercive force of government to achieve their own objectives.  Engineers too expensive?  Issue more green cards; the increased supply will reduce the cost.

Given all of the above, it is difficult to make a case that there has been, is, or will soon be a STEM labor shortage. “If there was really a STEM labor market crisis, you’d be seeing very different behaviors from companies,” notes Ron Hira, an associate professor of public policy at the Rochester Institute of Technology, in New York state. “You wouldn’t see companies cutting their retirement contributions, or hiring new workers and giving them worse benefits packages. Instead you would see signing bonuses, you’d see wage increases. You would see these companies really training their incumbent workers.”

“None of those things are observable,” Hira says. “In fact, they’re operating in the opposite way.”

Read the whole thing.

If you want Wal-Mart to have a labor force like Trader Joe’s and Costco, you probably want them to have a business model like Trader Joe’s and Costco

From Why Wal-Mart Will Never Pay Like Costco by Megan McArdle.

In other words, Trader Joe’s and Costco are the specialty grocer and warehouse club for an affluent, educated college demographic. They woo this crowd with a stripped-down array of high quality stock-keeping units, and high-quality customer service. The high wages produce the high levels of customer service, and the small number of products are what allow them to pay the high wages. Fewer products to handle (and restock) lowers the labor intensity of your operation. In the case of Trader Joe’s, it also dramatically decreases the amount of space you need for your supermarket … which in turn is why their revenue per square foot is so high. (Costco solves this problem by leaving the stuff on pallets, so that you can be your own stockboy).

Both these strategies work in part because very few people expect to do all their shopping at Trader Joe’s, and no one expects to do all their shopping at Costco. They don’t need to be comprehensive. Supermarkets, and Wal-Mart, have to devote a lot of shelf space, and labor, to products that don’t turn over that often.

Wal-Mart’s customers expect a very broad array of goods, because they’re a department store, not a specialty retailer; lots of people rely on Wal-Mart for their regular weekly shopping. The retailer has tried to cut the number of SKUs it carries, but ended up having to put them back, because it cost them in complaints, and sales. That means more labor, and lower profits per square foot. It also means that when you ask a clerk where something is, he’s likely to have no idea, because no person could master 108,000 SKUs. Even if Wal-Mart did pay a higher wage, you wouldn’t get the kind of easy, effortless service that you do at Trader Joe’s because the business models are just too different. If your business model inherently requires a lot of low-skill labor, efficiency wages don’t necessarily make financial sense.

That’s not the only reason that the Trader Joe’s/Costco model wouldn’t work for Wal-Mart. For one thing, it’s no accident that the high-wage favorites cited by activists tend to serve the affluent; lower income households can’t afford to pay extra for top-notch service. If it really matters to you whether you pay 50 cents a loaf less for generic bread, you’re not going to go to the specialty store where the organic produce is super-cheap and the clerk gave a cookie to your kid. Every time I write about Wal-Mart (or McDonald’s, or [insert store here]), several people will e-mail, or tweet, or come into the comments to say they’d be happy to pay 25 percent more for their Big Mac or their Wal-Mart goods if it means that the workers are well paid. I have taken to asking them how often they go to Wal-Mart or McDonald’s. So far, no one has reported going as often as once a week; the modal answer is a sudden disappearance from the conversation. If I had to guess, I’d estimate that most of the people making such statements go to Wal-Mart or McDonald’s only on road trips.

However, there are people for whom the McDonald’s Dollar Menu is a bit of a splurge, and Wal-Mart’s prices mean an extra pair of shoes for the kids. Those people might theoretically favor high wages, but they do not act on those beliefs — just witness last Thanksgiving’s union action against Wal-Mart, which featured indifferent crowds streaming past a handful of activists, most of whom did not actually work for Wal-Mart.

If you want Wal-Mart to have a labor force like Trader Joe’s and Costco, you probably want them to have a business model like Trader Joe’s and Costco — which is to say that you want them to have a customer demographic like Trader Joe’s and Costco. Obviously if you belong to that demographic — which is to say, if you’re a policy analyst, or a magazine writer — then this sounds like a splendid idea. To Wal-Mart’s actual customer base, however, it might sound like “take your business somewhere else.”

There is a correct answer to that question, but it’s unlikely we’ll ever know what it was.

From Why Do Education and Health Care Cost So Much? by Megan McArdle.  A great example of the challenges related to causal density.  We may accurately identify all the causes of an outcome yet still be unable to predict outcomes, because we poorly understand the relationships among the root causes.  Absent accurate prediction, we don’t really understand the nature of the problem at all.

So how do we explain health care and college cost inflation? Well, health care economist David Cutler once offered me the following observation: In health care, as in education, the output is very important, and impossible to measure accurately. Two 65-year-olds check into two hospitals with pneumonia; one lives, one dies. Was the difference in the medical care, or their constitutions, or the bacteria that infected them? There is a correct answer to that question, but it’s unlikely we’ll ever know what it was.

Similarly, two students go to different colleges; one flunks out, while the other gets a Rhodes Scholarship. Is one school better, or is one student? You can’t even answer these questions by aggregating data; better schools may attract better students. Even when you control for income and parental education, you’re left with what researchers call “omitted variable bias” — a better school may attract more motivated and education-oriented parents to enroll their kids there.

So on the one hand, we have two inelastic goods with a high perceived need; and on the other hand, you have no way to measure quality of output. The result is that we keep increasing the inputs: the expensive professors and doctors and research and facilities.

I would quibble with McArdle.  There are actually two problems.  It is true that education and health outcomes are hard to measure, and that is a challenge.  But even if we could measure them with great precision and accuracy, that is still not the same as forecasting.  Measurement is a predicate to forecasting, not a substitute for it.

If we precisely and accurately measure our initiating action X, we want to know with some level of accuracy and certainty that X will lead to Y, the outcome we desire.  If we cannot predict the outcome, it means we do not understand the relationships among the various causes.

Solution to low public transportation utilization

From Mobility for the Poor: Car-Sharing, Car Loans, and the Limits of Public Transit by Jeff Khau.  An example of the importance of establishing the difference between correlation and causation; of the directionality of causation; of context; of root cause analysis; and of goal definition.

Theoretically, one can look at this graph and make a seemingly legitimate argument that, in order to increase utilization of public transportation, one ought to increase the average commute time.  It is a good exercise in critical thinking to spot the fallacy in such an interpretation.

Precision and Accuracy

While it might seem an exercise in splitting hairs, in Approximate quotations by Mark Liberman, the author opens the basis for a good discussion of the differences between approximation, precision, and accuracy.  All three are important and necessary, but they are distinct, and each is more appropriate in different contexts.

An approximate quote can yield a better general sense of what was being communicated, but its accuracy then depends on the journalist’s interpretation.  A direct quote, as from a transcript, is more precise but more burdensome to the reader.

Sometimes the need for precision is paramount.  On other occasions, an approximation is more efficient.  It is a matter of horses for courses, as long as we keep the distinction between precision and accuracy clear.

Fairy tales masquerading as evidence

Science bible stories, take 27 by Mark Liberman is a useful discussion (including in the comments) about the tendency of media to take up a topical research paper without regard to the methodological robustness of the study or whether the results are meaningfully true.

As I observed a few years ago, “scientific studies”  have taken over the place that bible stories used to occupy. It’s only fundamentalists like me who worry about whether they’re true. For most people, it’s enough that they can be interpreted to be morally instructive.

[snip]

I’d add a third important factor: by and large, the “wise men” (and now the “wise women”) don’t really care about whether the empirical and theoretical foundations of their opinions are sound. They care about readers, ratings, and reputation — and in some cases about political outcomes or cultural values — with truth relevant only insofar as it affects those goals.

I think Liberman is correct.  People rarely consider what evidence they need in order to make an argument; instead, they go after information that is convenient to get.  At the same time, the market structure for ideas and information is such that there are incentives to produce information affirming a range of prejudices, regardless of the truth of the matter.  Elsewhere I refer to this as cognitive pollution, as it constitutes dirt in the system that tends to occlude rather than clarify.

RELATED:  The culturomic psychology of urbanization by Mark Liberman

Decision Clarity Consulting has no affiliation with the Scheier Group, owner of the DECISION CLARITY®, and the Scheier Group does not sponsor, endorse, or control any material on this site. Decision Clarity Consulting, in accordance with an agreement with Scheier Group, performs no work with non-profit or government entities.