Thursday, August 8, 2013

Cluelessness, complacency and the great unknown

The case of the self-blind psychologist

An experienced forensic psychologist -- let's call him Dr. Short -- applies for a job as a forensic evaluator. He is rejected based on his written work sample. He files a formal protest, insisting that the report was fine.

As you all know, forensic reports should be (in the words of an excellent trainer I once had) both "fact-based" and "fact-limited." In other words, we must (a) carefully explain the data that support our opinion, and (b) exclude irrelevant information, especially that which is intrusive or prejudicial.[1]

Dr. Short's report was neither fact-based nor fact-limited. The adduced evidence did not support his forensic opinions, and the report was littered with extraneous material insinuating bad moral character. We learned of the subject's unorthodox sexual tastes and former gang associations, neither of which were relevant to the very limited forensic issue at hand. Using ethnic terms to describe the subject's hair, Dr. Short inadvertently revealed more about his biases than about the subject.

Obviously, based on his vehement insistence that his report was fine, Dr. Short was blind to these deficiencies. Which got me to thinking: Since biases are largely unconscious, can people be made aware of them? Can blind spots be overcome? How can we come to understand what we do not know?

"The anosognosia of everyday life"

Pondering these questions in connection with one of my seminars at Bond University, I stumbled across some intriguing philosophical discourse on the various types of unknowns, and how to remedy them:

The simplest type of unknown has been labeled a "known unknown." This is something we don't know, and know we don't know. Let’s say you learn that someone you are evaluating in a sanity proceeding had ingested an obscure substance just before the crime. If you don’t know the substance’s potential effects, the solution is straightforward (assuming you are motivated): Do the research.

In some cases, we know the question, but no answer exists. For example, we know that six out of ten individuals who score as high risk on actuarial instruments will not reoffend violently, due to the base rates of violence. What we don’t know is how to distinguish the true from false positives. So that’s a known unknown with an unknown answer. But if we are at least aware of the issue, we can explain the field’s empirical knowledge gap in our reports.
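The arithmetic behind that known unknown can be made concrete with Bayes' rule. The numbers below (a 15 percent base rate and 75 percent sensitivity and specificity) are illustrative assumptions for demonstration only, not values from any particular actuarial instrument:

```python
# Illustrative sketch: why most "high risk" scorers do not reoffend.
# All numbers are hypothetical assumptions, not actual instrument data.

base_rate = 0.15     # assumed proportion of evaluees who will reoffend violently
sensitivity = 0.75   # assumed P(high score | will reoffend)
specificity = 0.75   # assumed P(low score | will not reoffend)

# Bayes' rule: P(reoffend | high score) = positive predictive value
true_pos = sensitivity * base_rate
false_pos = (1 - specificity) * (1 - base_rate)
ppv = true_pos / (true_pos + false_pos)

print(f"Of those scoring high risk, {ppv:.0%} reoffend; "
      f"{1 - ppv:.0%} are false positives.")
```

With these assumed numbers, roughly two thirds of high scorers turn out to be false positives, close to the six-out-of-ten figure cited above; the lower the base rate, the worse the problem gets.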

However, unknown unknowns [2] are an entirely different kettle of fish. These are things we don't know and don't realize that we don't know. We don't know that there even IS a question that needs to be asked. Without being able to frame the question, we obviously cannot figure out an answer. Put simply: We are clueless.

Unknown unknowns are a major problem in forensic psychology, with its dearth of racial, ethnic and cultural diversity among researchers and practitioners.[3] Vast experiential divides lead evaluators to impose their own moral standards without even realizing they are doing so. In condemning his subject's sexual promiscuity and drug use, for example, Dr. Short made false and universalizing assumptions that revealed ignorance of lifestyles other than his own. (This reminded me of the dilemma faced by African American prisoners interacting with white guards in remote, rural prisons; because the farming communities from which these guards were recruited are devoid of mainstream African Americans, the guards tended to assume that all Black people had the characteristics of Black convicts.)

"The anosognosia of everyday life" is the rather gloomy term coined by David Dunning of Cornell University, who specializes in decision-making processes, to describe such routine ignorance.[4] Dunning is a great believer in ignorance as a driving force that shapes our lives in ways in which we will never know. 
"Put simply, people tend to do what they know and fail to do that which they have no conception of. In that way, ignorance profoundly channels the course we take in life."

Apropos of Dr. Short's report, Dunning notes that cluelessness on the part of a so-called expert does not imply dishonesty or a lack of caring:
"People can be clueless in a million different ways, even though they are largely trying to get things right in an honest way. Deficits in knowledge, or in information the world is giving them, leads people toward false beliefs and holes in their expertise."

Laziness a major culprit

Unknown unknowns are not unfathomable mysteries that can never be solved. They are caused by laziness and complacency, which block the process of discovery as surely as a dam holds back water. It’s what German cognitive scientist Dietrich Dorner was talking about when he wrote, in The Logic of Failure, that “to the ignorant, the world looks simple.”[5] We’ve all known people who are incompetent, but whose very incompetence robs them of the ability to recognize that incompetence. There’s even an unwieldy term for this condition (named after the researchers who studied it, naturally): Just call it the Dunning-Kruger Effect. Quoting Dunning yet again: 
"Unknown unknown solutions haunt the mediocre without their knowledge. The average detective does not realize the clues he or she neglects. The mediocre doctor is not aware of the diagnostic possibilities or treatments never considered. The run-of-the-mill lawyer fails to recognize the winning legal argument that is out there. People fail to reach their potential as professionals, lovers, parents and people simply because they are not aware of the possible."

Before leaving the topic of the great unknowns, I must mention one final type of unknown, an especially pernicious one in forensic work. Unknown knowns, which undoubtedly beset Dr. Short, are unconscious beliefs and prejudices that guide one’s perceptions and interactions. Perhaps the 19th century humorist Josh Billings captured the quality of these unknowns the best when he wryly observed:
"It ain't what you don't know that gets you into trouble. It's what you think you know that just ain't so." [6]

Tackling the great unknown

So, is there any hope for our wayward Dr. Short, oblivious to his biases and blind spots? The answer, as in many facets of life, is: It depends. One of the most elementary lessons one learns as a novice psychologist is that people don’t change unless they are motivated to change. (Hence the existence of a whole area of psychology devoted to enhancing motivation to change, through so-called “motivational interviewing.”) Effective change is rarely compelled. If Dr. Short is open to feedback and correction, this experience could be a wake-up call. On the other hand, his very protest speaks to an impaired capacity for self-reflection, a brittle ego defense that may be difficult to penetrate.

Either way, Dr. Short's dilemma can serve as a lesson for others, including both students and practitioners. The key to opening the locks on the dam of knowledge is readily available: It is simply a genuine desire to learn, and a willingness to confront life’s complexities. To those with a thirst for knowledge, the world is complex, and that complexity is what makes it so fascinating.

Here, in a nutshell, is the advice I gave to the graduate students at Bond during last week’s lecture that touched on the paradoxes of the unknowns:
  • To reduce the unknown unknowns, seek broad knowledge. Seek out people from other walks of life, who may not share your views or experiences. Travel outside your comfort zone, not just geographically but culturally as well. These experiences can open one’s eyes to difference. Travel vicariously by reading widely, especially OUTSIDE of the insular, micro-focused and ahistorical field of psychology.
  • Study up on cognitive biases and how they work. Especially, understand confirmatory bias, and build in hypothesis testing (including the testing of alternate hypotheses) as a routine practice. (Excellent resources on cognitive biases include Nate Silver's The Signal and the Noise and Carol Tavris and Elliot Aronson's Mistakes Were Made (But Not by Me), which brilliantly and unforgettably explains how two people can start out much the same but diverge dramatically, until they ultimately stare at each other as strangers across a great chasm.)
  • Create formal feedback loops so that you learn how cases you were involved in were resolved, how your work was received, and whether your opinions proved accurate. 
  • Don't assume you know the answer. Ask questions. And then ask more questions. 

  • Stay humble. Arrogance, or overconfidence in one’s wisdom, can short-circuit understanding as surely as TSA security checkpoints destroy the fun of flying. (That rather strained metaphor is a clue that this post was penned from 40,000 feet in the air.)
  • Finally, and most critically: When you look across the table, try to see a fellow human being, someone who perhaps lost their way in life's dark wood, rather than an alien or a monster. Before you judge someone, try to walk a mile in his shoes.

Ultimately, Dr. Short's dilemma flows not only from complacency but from an essential deficit in empathy, an inability to truly see -- and understand -- the fellow human being sitting across from him in that forensic interview room.

* * * * *

Notes
  1. This is discussed in both the American Psychological Association's Ethics Code (Standard 4.04, Minimizing Intrusions on Privacy, states that psychologists should include in written reports "only information germane to the purpose for which the communication is made") as well as the Specialty Guidelines for Forensic Psychology (see, for example, 10.01, Focus on Legally Relevant Factors).

  2. The term "unknown unknown" is sometimes credited to US Secretary of Defense Donald Rumsfeld, who used it to explain why the United States went to war with Iraq over mythical Weapons of Mass Destruction (WMD’s). Although the phrase gained currency at this time, others had already used it

  3. Heilbrun, K., & Brooks, S. (2010). Forensic psychology and forensic science: A proposed agenda for the next decade. Psychology, Public Policy, and Law, 16, 219-253. 

  4. For further conversation on this topic, see: Morris, E. (2010, June 20). The anosognosic's dilemma: Something's wrong but you'll never know what it is. New York Times blog. Also see: Dunning, D. (2005). Self-Insight: Roadblocks and Detours on the Path to Knowing Thyself (Essays in Social Psychology). Psychology Press, pp. 14-15; Dunning, D., & Kruger, J. (1999). Unskilled and unaware of it: How difficulties in recognizing one's own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121-1134.

  5. I cribbed the Dorner quote from Dr. Wayne Petherick, Associate Professor of Criminology and coordinator of the criminology program at Bond University. 

  6. Some attribute this quote not to Josh Billings but to Mark Twain, who was kicking around at the same time. 


 