
Archive for November, 2013

Causal density, trade-offs and unsatisfactory outcomes

Here is an interesting juxtaposition of articles.  The first, Death of an Adjunct by Daniel Kovalik, came out in September 2013 and chronicled the tribulations and death of Margaret Mary Vojtko, an 83-year-old adjunct professor at Duquesne University.  It seems to tell an appalling tale of neglect and the exploitation of a vulnerable old woman by a heartless educational bureaucracy.

But then there is Death of a Professor by L.V. Anderson, covering exactly the same ground and arriving at an entirely different, and much more nuanced, conclusion.

Kovalik is a lawyer for the union seeking to organize adjunct professors, and his article is clearly an exercise in rhetoric to advance his cause.  Given how broadly it was republished, it was a reasonably effective exercise in rhetoric.  Kovalik is essentially arguing that had there been a union representing Vojtko’s interests, she would not have died in circumstances of near poverty.

I think the juxtaposition of the two articles highlights 1) the importance of assembling the full dossier of evidence and 2) the importance of problem definition.

Anderson does not significantly contradict anything in Kovalik’s rendition of events.  All she does is provide a more complete context.  In doing so, she pretty much demolishes his argument.  With the fuller picture she provides, it is clear that Vojtko’s penurious circumstances were the result of a range of decisions on her part (causal density), most of which could not have been addressed through the existence of a union.  This is not to deny the tragedy of the story but rather to point out the obvious – things are not always the way they seem, and few human issues have simple answers.

In fact, the latter point is illustrated by a follow-up article written by Anderson, Why Adjunct Professors Don’t Just Find Other Jobs.  She answers her own question in three parts.  First, they still hope to find a path to a tenured position.  Given the terrible working conditions of the adjunct professor, this is a testament to the value of tenure.  Second, there are scheduling challenges to finding jobs outside of academia (I didn’t say that Anderson’s reasons were good reasons).  Third, they really, really like teaching more than anything else they might do.

So if your concern is the financial well-being of adjunct professors, then Anderson’s second article means that there is little that can be done.  Adjunct professors will do just about anything in the hope of getting on a tenure track, they aren’t all that interested in non-academic jobs, and they prefer teaching, even at low rates of pay, to just about any alternative.  And there are a lot of them.  Given limited tenured positions and an increasing supply of individuals freely seeking those positions, there is little that can be done to change the economic circumstances of adjunct professors.  Excessive supply and limited demand inherently mean that compensation will be low.  You can either increase demand (increase the number of tenured positions) or reduce the supply of adjunct professors – neither action being particularly feasible in a free market environment where people are free to make their own choices.

What are the decision-making lessons from these related articles?  I would argue there are three.  1) Work hard to get the complete picture.  The emotionally satisfying story is often not the complete story.  2) Problem definition is critical.  3) Causal density means that there are many human problems, tragedies even, which do not have acceptable solutions.

As an exercise, what systemic actions, consistent with a free citizenry and the rule of law, could have precluded the actual outcome?

Human knowledge in the linear domain does not transfer properly to the complex domain

Nassim Nicholas Taleb is a brilliant thinker, philosopher and writer.  I have read and enjoyed Fooled by Randomness as well as The Black Swan and am looking forward to his most recent, Antifragile (reviewed in The Economist).

Taleb had an article in Foreign Affairs in 2011, The Black Swan of Cairo, which hit on some of his main arguments as applied to international relations.  A Black Swan is Taleb’s term for an unpredicted (and essentially unpredictable) event that disrupts the status quo.

Complex systems that have artificially suppressed volatility tend to become extremely fragile, while at the same time exhibiting no visible risks. In fact, they tend to be too calm and exhibit minimal variability as silent risks accumulate beneath the surface. Although the stated intention of political leaders and economic policymakers is to stabilize the system by inhibiting fluctuations, the result tends to be the opposite. These artificially constrained systems become prone to “Black Swans”—that is, they become extremely vulnerable to large-scale events that lie far from the statistical norm and were largely unpredictable to a given set of observers.

Such environments eventually experience massive blowups, catching everyone off-guard and undoing years of stability or, in some cases, ending up far worse than they were in their initial volatile state. Indeed, the longer it takes for the blowup to occur, the worse the resulting harm in both economic and political systems.

[snip]

Humans simultaneously inhabit two systems: the linear and the complex.  The linear domain is characterized by its predictability and the low degree of interaction among its components, which allows the use of mathematical methods that make forecasts reliable. In complex systems, there is an absence of visible causal links between the elements, masking a high degree of interdependence and extremely low predictability. Nonlinear elements are also present, such as those commonly known, and generally misunderstood, as “tipping points.” Imagine someone who keeps adding sand to a sand pile without any visible consequence, until suddenly the entire pile crumbles. It would be foolish to blame the collapse on the last grain of sand rather than the structure of the pile, but that is what people do consistently, and that is the policy error.

[snip]

Engineering, architecture, astronomy, most of physics, and much of common science are linear domains. The complex domain is the realm of the social world, epidemics, and economics. Crucially, the linear domain delivers mild variations without large shocks, whereas the complex domain delivers massive jumps and gaps. Complex systems are misunderstood, mostly because humans’ sophistication, obtained over the history of human knowledge in the linear domain, does not transfer properly to the complex domain. Humans can predict a solar eclipse and the trajectory of a space vessel, but not the stock market or Egyptian political events. All man-made complex systems have commonalities and even universalities.  Sadly, deceptive calm (followed by Black Swan surprises) seems to be one of those properties.

Taleb also mentions, but does not elaborate on, the important issue of “the illusion of local causal chains—that is, confusing catalysts for causes and assuming that one can know which catalyst will produce which effect.”  There is a tendency to see the last event as the “cause” of something when in fact it is sometimes simply the catalyst for a systemic readjustment, i.e. the straw that broke the camel’s back.  It wasn’t the straw per se, but the cumulative weight that preceded it.
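To make the catalyst-versus-cause distinction concrete, here is a minimal sketch (my own illustration, not from Taleb’s article) of the last-straw dynamic: small, unremarkable loads accumulate against an assumed hidden threshold, and the grain that finally triggers the collapse tells you almost nothing about the size of the collapse.

```python
# A minimal, illustrative sketch of the "last straw" fallacy (assumed numbers):
# small random loads accumulate until a hidden threshold is crossed, and the
# size of each collapse is set by the accumulated weight, not by the trigger.
import random

random.seed(1)

THRESHOLD = 100.0   # hidden capacity of the system (assumed for illustration)
collapses = []      # (triggering grain, accumulated weight released)

load = 0.0
for _ in range(10_000):
    grain = random.uniform(0.0, 1.0)   # each new "grain" is small and unremarkable
    load += grain
    if load > THRESHOLD:               # tipping point: the pile lets go
        collapses.append((grain, load))
        load = 0.0                     # the system resets after the blow-up

avg_trigger = sum(g for g, _ in collapses) / len(collapses)
avg_release = sum(w for _, w in collapses) / len(collapses)
print(f"average triggering grain: {avg_trigger:.2f}")
print(f"average weight released:  {avg_release:.2f}")
```

Every collapse is triggered by an ordinary grain of less than one unit, yet each one releases roughly one hundred units of accumulated weight.  Blaming the final grain is exactly the error of confusing the catalyst for the cause.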

There are echoes of Stephen Jay Gould and Niles Eldredge’s punctuated equilibrium, in which they argued that evolution is not a smooth, continual process but rather one that proceeds in fits and starts.

Taleb is arguing that our efforts to ensure near-term tactical stability are often at odds with desirable system evolution over the long run.  It is a classic trade-off decision.  He has observed many times that good tactical intentions often end up unintentionally leading to catastrophic strategic outcomes.  An example is forest fire management.  Nobody wants forest fires, and for decades the strategy was simple fire suppression: keep fires from happening, and put them out as fast as possible when they do.

In reducing near-term fires, regrettably, forests have accumulated much greater fuel loads than they would under natural conditions, where lightning-strike fires periodically clear dead brush.  The result has been increasingly frequent, vast, and intense wildfires that are beyond control.  A strategy for achieving near-term stability (fewer wildfires) has ended up worsening the situation over the long run.
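The trade-off can be illustrated with a toy model (my own sketch, with assumed parameters, not drawn from Taleb or from actual fire-management data): fuel accumulates every year, lightning strikes arrive at random, and a suppression policy extinguishes any fire small enough to control while leaving the unburned fuel in place.

```python
# A toy model (illustrative only) of fire suppression versus letting fires burn.
# Assumptions: one unit of fuel accumulates per year, lightning strikes arrive
# at random, and a fire's size equals the fuel that has built up when it ignites.
import random

def simulate(years, suppress, strike_prob=0.2, controllable=5, seed=0):
    rng = random.Random(seed)
    fuel, fires = 0, []
    for _ in range(years):
        fuel += 1                                # dead brush accumulates each year
        if rng.random() < strike_prob:           # a lightning strike occurs
            if suppress and fuel <= controllable:
                continue                         # small fire put out; fuel stays in place
            fires.append(fuel)                   # the fire burns off the accumulated fuel
            fuel = 0
    return fires

natural = simulate(1_000, suppress=False)
managed = simulate(1_000, suppress=True)
print(f"natural policy:     {len(natural)} fires, average size {sum(natural) / len(natural):.1f}")
print(f"suppression policy: {len(managed)} fires, average size {sum(managed) / len(managed):.1f}")
```

The specific numbers depend entirely on the assumed parameters, but the shape of the result does not: suppression produces fewer fires in the near term, and every fire that does escape control burns substantially more fuel than it would have under the natural regime – near-term calm purchased at the price of larger, less controllable events later.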

When making a strategic decision, it is important to consider the historical context.  Has the existing system evolved over time, and therefore has some base level of stability, or has it existed in an unnatural state of artificial stability with all variance suppressed?  If it is the latter, then any new action may have unanticipated consequences that have little to do with the intended plan of action and much to do with the cumulative, avoided evolution of the system.

 

 
