By Peter Facione
“Thinking” in Cognitive Science
Theories emerging from more scientific studies of human thinking and decision-making in recent years propose that thinking is more integrated and less dualistic than the notions in popular culture suggest. We should be cautious about proposals suggesting oversimplified ways of understanding how humans think. We should avoid harsh, rigid dichotomies such as “reason vs. emotion,” “intuitive vs. linear,” “creativity vs. criticality,” “right brained vs. left brained,” “as on Mars vs. as on Venus.”
There is often a kernel of wisdom in popular beliefs, and perhaps that gem this time is the realization that sometimes we decide things very quickly, almost as spontaneous, intuitive reactions to the situation at hand. Many accidents on the freeways of this nation are avoided precisely because drivers are able to see and react to dangerous situations so quickly. Many good decisions which feel intuitive are really the fruit of expertise.
Decisions good drivers make in those moments of crisis, just like the decisions which practiced athletes make in the flow of a game or the decisions that a gifted teacher makes as she or he interacts with students, are born of expertise, training, and practice.
At the same time that we are immersed in the world around us and in our daily lives, constantly making decisions unreflectively, we may also be thinking quite reflectively about something. Perhaps we’re worried about a decision which we have to make about an important project at work, or about a personal relationship, or about a legal matter, whatever.
We gather information, consider our options, explore possibilities, formulate some thoughts about what we propose to do and why this choice is the right one. In other words, we make a purposeful, reflective judgment about what to believe or what to do – precisely the kind
of judgment which is the focus of critical thinking.
Recent integrative models of human decision-making propose that the thinking processes of our species are not best described as a conflictive duality, as in “intuitive vs. reflective,” but rather as the integrative functioning of two mutually supportive systems, “intuitive and reflective.” These two systems of thinking are present in all of us and can act in parallel to process cognitively the matters over which we are deciding.
One system is more intuitive, reactive, quick, and holistic. So as not to confuse things with the notions of thinking in popular culture, cognitive scientists often name this system “System 1.” The other (yes, you can guess its name) is more deliberative, reflective, computational, and rule-governed. You are right, it is called “System 2.”
In System 1 thinking, one relies heavily on a number of heuristics (cognitive maneuvers), key situational characteristics, readily associated ideas, and vivid memories to arrive quickly and confidently at a judgment. System 1 thinking is particularly helpful in familiar situations when time is short and immediate action is required.
While System 1 is functioning, another powerful system is also at work, that is, unless we shut it down by abusing alcohol or drugs, or with fear or indifference. Called “System 2,” this is our more reflective thinking system. It is useful for making judgments when you find yourself in unfamiliar situations and have more time to figure things out. It allows us to process abstract concepts, to deliberate, to plan ahead, to consider options carefully, to review and revise our work in the light of relevant guidelines or standards or rules of procedure. While System 2 decisions are also influenced by the correct or incorrect application of heuristic maneuvers, this is the system which relies on well-articulated reasons and more fully developed evidence. It is reasoning based on what we have learned through careful analysis, evaluation, explanation, and self-correction. This is the system which values intellectual honesty, analytically anticipating what happens next, maturity of judgment, fair-mindedness, elimination of biases, and truth-seeking. This is the system which we rely on to think carefully through complex, novel, high-stakes, and highly integrative problems. (3)
Educators urge us to improve our critical thinking skills and to reinforce our disposition to use those skills because that is perhaps the best way to develop and refine our System 2 reasoning.
System 1 and System 2 are both believed to be vital decision-making tools when stakes are high and when uncertainty is an issue. Each of these two cognitive systems is believed to be capable of monitoring and potentially overriding the other. This is one of the ways our species reduces the chance of making foolish, sub-optimal, or even dangerous errors in judgment. Human thinking is far from perfect. Even a good thinker makes both System 1 and System 2 errors. At times we misinterpret things, or we get our facts wrong, and we make mistakes as a result. But often our errors are directly related to the influences and misapplications of cognitive heuristics. Because we share the propensity to use these heuristics as we make decisions, let’s examine how some of them influence us.
Cognitive heuristics are thinking maneuvers which, at times, appear to be almost hardwired into our species. They influence both systems of thinking, the intuitive thinking of System 1 and the reflective reasoning of System 2. Five heuristics that often seem to operate more frequently in our System 1 reasoning are known as availability, affect, association, simulation, and similarity.
Availability, the coming to mind of a story or vivid memory of something that happened to you or to someone close to you, inclines a person to make inaccurate estimates of the likelihood of that thing’s happening again. People tell stories of things that happened to themselves or their friends all the time as a way of explaining their own decisions. The stories may not be scientifically representative; the events may be mistaken, misunderstood, or misinterpreted. But all that aside, the power of the story is to guide, often in a good way, the decision toward one choice rather than another.
The Affect heuristic operates when you have an immediate positive or negative reaction to some idea, proposal, person, object, whatever. Sometimes called a “gut reaction,” this affective response sets up an initial orientation in us, positive or negative, toward the object. It takes a lot of System 2 reasoning to overcome a powerful affective response to an idea, but it can be done. And at times it should be, because there is no guarantee that your gut reaction is always right.
The Association heuristic is operating when one word or idea reminds us of something else. For example, some people associate the word “cancer” with “death.” Some associate “sunshine” with “happiness.” These kinds of associational reasoning responses can be helpful at times, as for example if associating cancer with death leads you not to smoke and to go in for regular checkups. At other times the same association may influence a person to make an unwise decision, as for example if associating “cancer” with “death” were to lead you to be so fearful and pessimistic that you did not seek diagnosis and treatment of a worrisome cancer symptom until it was too late to do anything.
The Simulation heuristic is working when you are imagining how various scenarios will unfold. People often imagine how a conversation will go, or how they will be treated by someone else when they meet the person, or what their friends or boss or lover will say and do when they have to address some difficult issue. These simulations, like movies in our heads, help us prepare and do a better job when the difficult moment arrives. But they can also lead us to have mistaken expectations. People may not respond as we imagined, things may go much differently. Our preparations may fail us because the ease of our simulation misled us into thinking that things would have to go as we had imagined them. And they did not.
The Similarity heuristic operates when we notice some way in which we are like someone else and infer that what happened to that person is therefore more likely to happen to us. The similarity heuristic functions much like an analogical argument or metaphorical model. The similarity we focus on might be fundamental and relevant, which would make the inference more warranted. For example, the boss fired your coworker for missing sales targets, and you draw the reasonable conclusion that if you miss your sales targets you’ll be fired too. Or the similarity that comes to mind might be superficial or not connected with the outcome, which would make the inference unwarranted. For example, you see a TV commercial showing trim-figured young people enjoying fattening fast foods and infer that because you’re young too you can indulge your cravings for fast foods without gaining a lot of excess unsightly poundage.
Heuristics and biases often appearing to be somewhat more associated with System 2 thinking include: satisficing, risk/loss aversion, anchoring with adjustment, and the illusion of control.
Satisficing occurs as we consider our alternatives. When we come to one which is good enough to fulfill our objectives we often regard ourselves as having completed our deliberations. We have satisficed. And why not? The choice is, after all, good enough. It may not be perfect, it may not be optimal, it may not even be the best among the options available. But it is good enough. Time to decide and move forward.
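Satisficing is, at bottom, a stopping rule: accept the first alternative that clears a threshold rather than searching for the best. The sketch below illustrates that rule in Python; the apartment-hunting scenario, scoring function, and threshold are illustrative assumptions, not taken from the text.

```python
# A minimal sketch of satisficing as a stopping rule.
# The options, scoring function, and "good enough" threshold
# are hypothetical, chosen only to illustrate the idea.

def satisfice(options, score, good_enough):
    """Return the first option whose score meets the threshold,
    instead of continuing to search for the optimal one."""
    for option in options:
        if score(option) >= good_enough:
            return option  # deliberation stops here: "good enough"
    return None  # no option met the threshold

# Usage: pick the first apartment scoring at least 70.
apartments = [("A", 55), ("B", 72), ("C", 90)]
choice = satisfice(apartments, score=lambda a: a[1], good_enough=70)
# "B" is chosen even though "C" scores higher: the search ended early.
```

The point of the sketch is the early return: once a good-enough option appears, later (possibly better) options are never examined.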
The running mate of satisficing is temporizing.
Temporizing is deciding that the option which we have come to is “good enough for now.” We often move through life satisficing and temporizing. At times we look back on our situations and wonder why it is that we have settled for far less than we might have. If we had only studied harder, worked out a little more, taken better care of ourselves and our relationships, perhaps we would not be living as we are now. But, at the time, each of the decisions along the way was “good enough for the time being.”
We are by nature a species that is averse to risk and loss. Often we make decisions on the basis of what we are worried about losing, rather than on the basis of what we might gain. This works out to be a rather serviceable approach in many circumstances. People do not want to lose control, they do not want to lose their freedom, they do not want to lose their lives, their families, their jobs, their possessions. High-stakes gambling is best left to those who can afford to lose the money. Las Vegas didn’t build all those multi-million dollar casino hotels because vacationers are winning all the time! And so, in real life, we take precautions. We avoid unnecessary risks. The odds may not be stacked against us, but the consequences of losing at times are so great that we would prefer to forego the possibilities of gain in order not to lose what we have. And yet, on occasion this can be a most unfortunate decision too. History has shown time and time again that businesses which avoid risks often are unable to compete successfully with those willing to move more boldly into new markets or into new product lines.
Any heuristic is only a maneuver, perhaps a shortcut or impulse to think or act in one way rather than another, but certainly not a failsafe rule. It may work out well much of the time to rely on the heuristic, but it will not work out for the best all of the time.
For example, people with something to lose tend toward conservative choices politically as well as economically. Nothing wrong with that necessarily. Just an observation about the influence of the Loss Aversion heuristic on actual decision making. We are more apt to endure the status quo, even as it slowly deteriorates, than we are to call for “radical” change. Regrettably, however, when the call for change comes, it often requires a far greater upheaval to make the necessary transformations, or, on occasion, the situation has deteriorated beyond the point of no return. In those situations we find ourselves wondering why we waited so long before doing something.
The heuristic known as Anchoring with Adjustment is operative when we find ourselves making evaluative judgments. The natural thing for us to do is to locate or anchor our evaluation at some point along whatever scale we are using. For example, a professor says that the student’s paper is a C+. Then, as other information comes our way, we may adjust that judgment. The professor, for example, may decide that the paper is as good as some others that were given a B-, and so adjust the grade upward.
The interesting thing about this heuristic is that we do not normally start over with a fresh evaluation. We have dropped anchor, and we may drag it upward or downward a bit, but we do not pull it off the bottom of the sea to relocate our evaluation. First impressions, as the saying goes, cannot be undone. The good thing about this heuristic is that it permits us to move on. We have done the evaluation; there are other papers to grade, other projects to do, other things in life that need attention. We could not long endure if we had to constantly reevaluate everything anew. The unfortunate thing about this heuristic is that when we drop anchor in the wrong place, we have a hard time giving people a second chance at making a good first impression.
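The grading example can be sketched as a small procedure: the evaluation starts at an anchor on the scale and is nudged a step at a time by later evidence, never recomputed from scratch. The grade scale and one-step adjustment size below are illustrative assumptions, not claims about how any professor actually grades.

```python
# Illustrative sketch of anchoring with adjustment. The grade scale
# and the one-step-per-nudge adjustment are hypothetical simplifications.

GRADES = ["D", "C-", "C", "C+", "B-", "B", "B+", "A-", "A"]

def adjust(anchor, nudges):
    """Start from an anchored grade and move one step on the scale
    per piece of later evidence (+1 raises, -1 lowers), clamped to
    the ends of the scale. The anchor is never discarded."""
    i = GRADES.index(anchor)
    for n in nudges:
        i = max(0, min(len(GRADES) - 1, i + n))
    return GRADES[i]

# The professor anchors at C+, then compares the paper with others
# graded B- and adjusts upward one step.
print(adjust("C+", [+1]))  # prints "B-"
```

Note what the code cannot do: there is no path from the adjustment loop back to a fresh evaluation; every outcome is reachable only by stepping away from the original anchor, which is exactly the bias the heuristic describes.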