
Metacognition demystified

Nov. 1, 2019
You might think you’re a reasonable person with broad knowledge, who is open to others’ ideas, is not biased, and who weighs all opinions fairly and thoughtfully. I hate to tell you, but if you believe this, you’re wrong. If you’re human, you will have biases and mental models.1

Cognitive science tells us that we navigate the world through mental models that help us make sense of it. These models are representations of the surrounding world, the relationships between its various parts, and our intuitive perceptions about our actions and their consequences. A mental model is a kind of internal symbol of external reality, and it shapes our behavior as well as our approach to solving problems and completing tasks.2

These constructs from which we operate are formed by multiple factors, including our culture, our experiences, our profession, and the time and era in which we live.3 For example, people once believed that the world was flat and that they would fall off the earth if they sailed to the horizon. We now know that this “knowledge” is not true.4 Mental models can be flawed, incomplete, closed to new information, and limited by our experience. Being human exposes us to flawed thinking, yet we continue to follow it as if our way were the only and best way. We’re married to our own limited assumptions, and we disregard other people’s limited assumptions (figure 1).

Think about a forest. A biologist sees a forest and looks at the ecosystem; an environmentalist looks at a forest and thinks about climate change; a plant engineer thinks about the growth of the trees; and a businessman thinks about the value of the land. Although no one discipline is wrong, none is fully correct. Alone, no single discipline can take in the full scope of the forest, but together they can form a more complete picture. A web of information forms that picture, and each discipline can help expand the knowledge and conservation of the forest.

As professionals, we need to look through our dental lenses to address dental needs, but we also need to look at a broader spectrum to ensure that our patients are getting the best care possible. For example, is this the right time to perform extensive dental work on a patient who is experiencing acute anxiety? Perhaps a case worker needs to attend an older person’s appointment because the case worker will be charged with the home care. How can we better meet the requirements of special needs patients?

Our mental models can affect us in other ways also. Consider that you have a patient with poor oral hygiene who wants an implant, but you determine that his oral hygiene care would set him (and you) up for failure. You refuse to proceed with the procedure, but the patient is persistent. Eventually, with borderline improvement in his home care, you place the implant. 

When healing does not occur, you assume it is due to lack of home care, so you encourage more regular dental visits, multiple rounds of antibiotics, antimicrobial rinses, and every tool available for improving home care. When there is still no improvement, you advise the patient to modify his diet and nutrition by adding a vitamin C supplement. You even consult a colleague to see what else might be available to improve home care. Nothing changes and eventually you refer the patient to a medical doctor, who immediately refers him to a specialist. You learn that the patient has leukemia, and three months later, he’s dead.

Sometimes our own ways of thinking can be disastrous. Are we cognizant of how we arrive at a diagnosis? We can get stuck in our thinking and look for evidence to support our hypotheses and diagnoses. We all have limited views of the world and we make assumptions based on those limited perspectives. We might make a diagnosis and then unknowingly tailor our questions to the patient to support our preconceived diagnosis. We might unconsciously ignore evidence contrary to our diagnosis because we’re already invested in a train of thought. Being aware of this thinking and having knowledge about our knowledge (what scientists call metacognition) allows us to broaden our resources for a complete and accurate picture. Metacognition leads to greater accuracy in our thinking, such as when we diagnose conditions. 

Of course, some shared mental models are necessary to operate a dental office. Responding to an emergency is an example. Training ensures that everyone understands what procedures to follow if a patient goes into shock, faints, has a drop in blood sugar, or experiences another emergency. Even chairside procedures are shared mental models between the doctor and the assistant. 

It is imperative that we follow best-practice standards, but we also need to guard against “groupthink,” a common side effect of standardized policies and procedures. Groupthink can cause us to shut down or disregard other information that may be pertinent to our patients or the running of the practice. Creating an environment where team members feel free to bring up new ideas may in fact prevent catastrophic mistakes.

Research by Nobel Prize recipient Daniel Kahneman shows that the brain uses two systems for processing information—system I and system II.5

System I: Cognitive ease or fluency is the measure of how easy it is for our brains to process information. According to Dr. Kahneman, cognitive ease is both a cause and a consequence of a pleasant feeling. Cognitive ease makes us feel more favorable toward things that are familiar and easy to understand. We can operate on automatic pilot with system I: we don’t need to consciously focus. There is a tendency for people to have positive feelings toward information that is widely accepted, easily recalled, and frequently repeated. Little cognitive energy is required with system I.

If we feel uneasy or uncertain, or sense a problem, we engage system II (cognitive strain). Science supports the idea that the brain defaults to whichever system offers cognitive ease. People apparently don’t like to think!

System II: Cognitive strain occurs when a person makes multiple mental calculations, reads instructions in a poor or faint font, decodes complicated language, or is in a bad mood.6 Our brains do not work on automatic pilot; they have to focus. There are many examples of cognitive strain in dentistry, such as dealing with disgruntled patients, diagnosing difficult cases, or performing new procedures. Raising awareness about how our brains function and respecting the focus required to perform new procedures can go a long way toward the optimal care not only of our patients, but of ourselves (figure 2).

Kahneman’s research supports the position that cognitive biases are often the result of the brain’s attempt to simplify information processing: rules of thumb that help people make sense of the world and reach decisions quickly. Experts have identified 188 cognitive biases. These can affect how we hire people, how we interact with others, and whether we have an inclusive culture in our workplace, and together they have a huge impact on our decisions.7 Arming ourselves with knowledge about our thinking reduces blind spots and enables broader thinking, and therefore better decision-making.

The field of metacognition is mushrooming, as is the science of how to actively and consciously achieve a sense of well-being while becoming more objective in our thinking. The results are fewer blind spots and more effective decisions. Being mindful about our thinking improves our faulty mental models.

REFERENCES

1. Getting better by being wrong with Annie Duke. Podcast. Farnam Street Blog website. https://fs.blog/annie-duke/. Published 2019.

2. Furlough CS, Gillan DJ. Mental models: Structural differences and the role of experience. J Cog Eng and Decis Mak. 2018;12(4):269-287.

3. Irrationality, bad decisions, and the truth about lies: My conversation with Dan Ariely. Podcast. Farnam Street Blog website. https://fs.blog/dan-ariely/. Published 2019.

4. Nease B. How your brain keeps you believing crap that isn’t true. Fast Company website. https://www.fastcompany.com/3063319/how-your-brain-keeps-you-believing-crap-that-isnt-true. Published August 31, 2016.

5. Kahneman D. Thinking, Fast and Slow. New York, NY: Farrar, Straus and Giroux; 2011.

6. The art of changing minds: My conversation with Julia Galef. Podcast. Farnam Street Blog website. https://fs.blog/julia-galef/.

7. Hanson R. Buddha’s Brain. Oakland, CA: New Harbinger Publications; 2008.

Dorothy Garlough, MPA, RDH, is an innovation architect, facilitating strategy sessions and forums to orchestrate change within dentistry. As an international speaker and writer, Garlough trains others to broaden their skill sets to include creativity, collaborative innovation, and forward thinking. She recognizes that engagement is the outcome when the mechanisms are put in place to drive new innovations. Connect with her at [email protected] or visit engagingteams.com.
