12 Common Barriers to Critical Thinking (And How to Overcome Them)

As you know, critical thinking is a vital skill necessary for success in life and work. Unfortunately, barriers to critical thinking can hinder a person’s ability to use it. This piece will discuss some of the most common internal and external barriers to critical thinking and what you should do if one of them hinders your ability to think critically.

Critical Thinking Challenges

You already know that critical thinking is the process of analyzing and evaluating a situation or person so that you can make a sound judgment. You normally use the judgment derived from that process to make crucial decisions, and the choices you make affect your workplace, your relationships, and your life goals and achievements.

Several barriers to critical thinking can skew your judgment, even when you have a large amount of data and information pointing to the contrary. The result might be a poor or ineffective decision instead of a choice that could improve your quality of life. These are some of the top obstacles that hinder and distort the ability to think critically:

1. Using Emotions Instead of Logic

Failing to remove one’s emotions from a critical thinking analysis is one of the biggest barriers to the process. People make this mistake most often in relationships, choosing partners based on how they “make them feel” instead of on the information they have collected.

The correct way to decide about a relationship is to use all facts, data, opinions, and circumstances to make a final judgment call. More often than not, however, individuals use their hearts instead of their minds.

Emotions can hinder critical thinking in the workplace as well. One example is an employee who reacts negatively to a business decision, change, or process without gathering more information. The relationship between that employee and the employer could be severed by that lack of critical thinking instead of being salvaged by further investigation and a rational response.

2. Personal Biases

Personal biases can come from past negative experiences, skewed teachings, and peer pressure. They create a huge obstacle in critical thinking because they overshadow open-mindedness and fairness.

One example is failing to hire someone because of their race, age, religious preference, or perceived attitude. The hiring manager circumvents critical thinking by accepting his or her biases as truth. Thus, the entire process of information gathering and objective analysis gets lost in the mix.

3. Obstinacy

Stubbornness almost always ruins the critical thinking procedure. Sometimes, people get so wrapped up in being right that they fail to look at the big picture. Big-picture thinking is a large part of critical thinking; without it, all judgments and choices are rash and incomplete.

4. Unbelief

It’s difficult for a person to do something he or she doesn’t believe in. It’s also challenging to engage in something that seems complex. Many people don’t think critically because they believe they must be scholarly to do so. The truth is that  anyone  can think critically by practicing the following steps:

  1. Gather as much data as possible.
  2. Have an opinion, but be open to changing it.
  3. Understand that assumptions are not the truth, and opinions are not facts.
  4. Think about the scenario, person, or problem from different angles.
  5. Evaluate all the information thoroughly.
  6. Ask simple, precise, and abundant questions.
  7. Take time to observe.
  8. Don’t be afraid to spend time on the problem or issue.
  9. Ask for input or additional information.
  10. Make it make sense.

5. Fear of Failure or Change

Fear of change and failure often hinders a person’s critical thinking process because it leaves no room for thinking outside the box. Sometimes, the most efficient way to resolve a problem is to be open to changing something.

That change might be a different way of doing something, a relationship termination, or a shift of positions at a workplace. Fear can block out all possible scenarios in the critical thinking cycle. The result is often one-dimensional thinking, tunnel vision, or proverbial head-banging.

6. Egocentric Thinking

Egocentric thinking is also one of the main barriers to critical thinking. It occurs when a person examines everything through a “me” lens. Evaluating something properly requires an individual to understand and consider other people’s perspectives, plights, goals, input, etc.

7. Assumptions

Assumptions are one of the negative  factors that affect critical thinking . They are detrimental to the process because they cause distortions and misguided judgments. When using assumptions, an individual could unknowingly insert an invalid prejudgment into a stage of the thought process and sway the final decision.

It’s never wise to assume anything about a person, entity, or situation because it could be 100 percent wrong. The correct way to deal with assumptions is to store them in a separate thought category of possibilities and then use the data and other evidence to validate or nullify them.

For example, XYZ might be why ABC happened, but there isn’t enough information or data to conclude that it is. The same is true for the rest of the possibilities, and thus it’s necessary to research and analyze the facts before accepting them as truths.

8. Group Thinking

Group thinking is another one of the  barriers to critical thinking  that can block sound decisions and muddy judgments. It’s similar to peer pressure, where the person takes on the viewpoint of the people around him or her to avoid seeming “different.”

This barrier is dangerous because it affects how some people think about right and wrong. It’s most prevalent among teens. One example is the “everybody’s doing it (drugs, bullying), so I should too” mindset.

Unfortunately, this barrier can sometimes spill over into the workplace and darken the environment when workers can’t think for themselves. Workers may end up breaking policies, engaging in negative behavior, or harassing the workers who don’t conform.

Group thinking can also skew someone’s opinion of another person before the individual gets a chance to collect facts and evaluate that person for themselves. You’ve probably heard of smear campaigns. They work so well against their targets because the parties involved don’t use the critical thinking process at all.

9. Impulsivity

Impulsivity is the tendency to do things without thinking, and it’s a bona fide critical thinking killer. It skips right by  every  step in the critical thinking process and goes directly to what feels good in the moment.

Breaking the habit takes practice and dedication. The first step is to set aside time, when impulsive urges arise, to think through all aspects of the situation. It may take an impulsive person a while to develop a good critical thinking strategy, but it can work with time.

10. Not Knowing What’s Fact and Opinion

Critical thinking requires the thinker to know the difference between facts and opinions. Opinions are statements based on other people’s evaluative processes, and those processes may not be critical or analytical. Facts are unemotional, unbiased pieces of data that one can verify; statistics and governmental texts are examples.

11. Having a Highly Competitive Nature

A “winning” mindset can overshadow the fair and objective evaluation of a problem, task, or person and undermine critical thinking. People who think competitively can lose sight of what’s right and wrong in the pursuit of a selfish goal.

12. Basing Statements on Popularity

This problem is prevalent in today’s world. Many people will accept anything a celebrity, political figure, or popular person says as gospel, but discredit or discount other people’s input. An adept critical thinker knows how to separate  what’s  being said from  who  said it and perform the necessary verification steps.

How To Overcome Barriers in Critical Thinking

If you can identify any of the above-mentioned  barriers , your critical thinking may be flawed. These are some tips for overcoming such barriers:

1. Know your flaws.

The very first step toward improving anything is to know and admit your flaws. If you can do that, you are halfway to using better critical thinking strategies.

2. Park your emotions.

Use logic, not emotion, when you are evaluating something to form a judgment. It’s not the time to think with your heart.

3. Be mindful of others.

Try to put yourself in other people’s shoes to understand their stance. A little empathy goes a long way.

4. Avoid black-and-white thinking.

Understand that there’s always more than one way to solve a problem or achieve a goal. Additionally, consider that not every person is all bad or all good.

5. Dare to be unpopular.

Avoid making decisions just to please other people. Instead, evaluate all of the information and make the decision you feel is best.

6. Don’t assign unjustified merit.

Don’t assume someone is telling the truth or giving you more accurate information because of his or her name or status. Evaluate  all  people’s input equally.

7. Avoid judging others.

Try to keep biases and prejudices out of your decision-making processes. That will make them fair and just.

8. Be patient with yourself.

Take all the days you need to pick apart a situation or problem and resolve it. Don’t rush to make hasty decisions.

9. Accept different points of view.

Not everyone will agree with you or tell you what you want to hear.

10. Embrace change.

Don’t ever be afraid of changing something or trying something new. Thinking outside the box is an integral part of the critical thinking process.

Now you know the answer to the question, “What are the challenges of critical thinking?” Use this information about the barriers to critical thinking to improve your critical thinking process and make healthier, more beneficial decisions for everyone.

Jenny Palmer

Founder of Eggcellentwork.com. With over 20 years of experience in HR and various roles in the corporate world, Jenny shares tips and advice to help professionals advance in their careers. Her blog is a go-to resource for anyone looking to improve their skills, land their dream job, or make a career change.

An Evaluative Review of Barriers to Critical Thinking in Educational and Real-World Settings

Associated Data

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Abstract

Though a wide array of definitions and conceptualisations of critical thinking have been offered in the past, further elaboration on some concepts is required, particularly with respect to various factors that may impede an individual’s application of critical thinking, such as in the case of reflective judgment. These barriers include varying levels of epistemological engagement or understanding, issues pertaining to heuristic-based thinking and intuitive judgment, as well as emotional and biased thinking. The aim of this review is to discuss such barriers and evaluate their impact on critical thinking in light of perspectives from research in an effort to reinforce the ‘completeness’ of extant critical thinking frameworks and to enhance the potential benefits of implementation in real-world settings. Recommendations and implications for overcoming such barriers are also discussed and evaluated.

1. Introduction

Critical thinking (CT) is a metacognitive process—consisting of a number of skills and dispositions—that, through purposeful, self-regulatory reflective judgment, increases the chances of producing a logical solution to a problem or a valid conclusion to an argument ( Dwyer 2017 , 2020 ; Dwyer et al. 2012 , 2014 , 2015 , 2016 ; Dwyer and Walsh 2019 ; Quinn et al. 2020 ).

CT has long been identified as a desired outcome of education ( Bezanilla et al. 2019 ; Butler et al. 2012 ; Dwyer 2017 ; Ennis 2018 ), given that it facilitates a more complex understanding of information ( Dwyer et al. 2012 ; Halpern 2014 ), better judgment and decision-making ( Gambrill 2006 ) and less dependence on cognitive bias and heuristic thinking ( Facione and Facione 2001 ; McGuinness 2013 ). A vast body of research (e.g., Dwyer et al. 2012 ; Gadzella 1996 ; Hitchcock 2004 ; Reed and Kromrey 2001 ; Rimiene 2002 ; Solon 2007 ), including various meta-analyses (e.g., Abrami et al. 2008 , 2015 ; Niu et al. 2013 ; Ortiz 2007 ), indicates that CT can be enhanced through targeted, explicit instruction. Though CT can be taught in domain-specific areas, its domain-generality means that it can be taught across disciplines and in relation to real-world scenarios ( Dwyer 2011 , 2017 ; Dwyer and Eigenauer 2017 ; Dwyer et al. 2015 ; Gabennesch 2006 ; Halpern 2014 ). Indeed, the positive outcomes associated with CT transcend educational settings into real-world, everyday situations, which is important because CT is necessary for a variety of social and interpersonal contexts where good decision-making and problem-solving are needed on a daily basis ( Ku 2009 ). However, regardless of domain-specificity or domain-generality of instruction, the transferability of CT application has been an issue in CT research (e.g., see Dumitru 2012 ). This is an important consideration because issues with transferability—for example, in real-world settings—may imply something lacking in CT instruction.

In light of the large, aforementioned body of research focusing on enhancing CT through instruction, a growing body of research has also evaluated the manner in which CT instruction is delivered (e.g., Abrami et al. 2008 , 2015 ; Ahern et al. 2019 ; Cáceres et al. 2020 ; Byerly 2019 ; Dwyer and Eigenauer 2017 ), along with additional considerations for and the barriers to such education, faced by teachers and students alike (e.g., Aliakbari and Sadeghdaghighi 2013 ; Cáceres et al. 2020 ; Cornell et al. 2011 ; Lloyd and Bahr 2010 ; Ma and Liu 2022 ; Ma and Luo 2021 ; Rear 2019 ; Saleh 2019 ); for example, those regarding conceptualisation, beliefs about CT, having feasible time for CT application and CT’s aforementioned transferability. However, there is a significant lack of research investigating barriers to CT application by individuals in real-world settings, even by those who have enjoyed benefits from previous CT instruction. Thus, perhaps the previously conjectured ‘something lacking in CT instruction’ refers to, in conjunction with the teaching of what CT consists of, making clear to students what barriers to CT application we face.

Simply, CT instruction is designed in such a way as to enhance the likelihood of positive decision-making outcomes. However, there are a variety of barriers that can impede an individual’s application of CT, regardless of past instruction with respect to ‘how to conduct CT’. For example, an individual might be regarded as a ‘critical thinker’ because they apply it in a vast majority of appropriate scenarios, but that does not ensure that they apply CT in all such appropriate scenarios. What keeps them from applying CT in those scenarios might well be one of a number of barriers to CT that often go unaddressed in CT instruction, particularly if such instruction is exclusively focused on skills and dispositions. Perhaps too much focus is placed on what educators are teaching their students to do in their CT courses as opposed to what educators should be recommending their students to look out for or advising what they should not be doing. That is, perhaps just as important for understanding what CT is and how it is conducted (i.e., knowing what to do) is a genuine awareness of the various factors and processes that can impede CT; and so, for an individual to think critically, they must know what to look out for and be able to monitor for such barriers to CT application.

To clarify, thought has not changed regarding what CT is or the cognitive/metacognitive processes at its foundation (e.g., see Dwyer 2017 ; Dwyer et al. 2014 ; Ennis 1987 , 1996 , 1998 ; Facione 1990 ; Halpern 2014 ; Paul 1993 ; Paul and Elder 2008 ); rather, additional consideration of issues that have potential to negatively impact CT is required, such as those pertaining to epistemological engagement; intuitive judgment; as well as emotional and biased thinking. This notion has been made clear through what might be perceived of as a ‘loud shout’ for CT over at least the past 10–15 years in light of growing political, economic, social, and health-related concerns (e.g., ‘fake news’, gaps between political views in the general population, various social movements and the COVID-19 pandemic). Indeed, there is a dearth of research on barriers to CT ( Haynes et al. 2016 ; Lloyd and Bahr 2010 ; Mangena and Chabeli 2005 ; Rowe et al. 2015 ). As a result, this evaluative perspective review aims to provide an impetus for updating the manner in which CT education is approached and, perhaps most importantly, applied in real-world settings—through further identifying and elaborating on specific barriers of concern in order to reinforce the ‘completeness’ of extant CT frameworks and to enhance the potential benefits of their implementation 1 .

2. Barriers to Critical Thinking

2.1. Inadequate Skills and Dispositions

In order to better understand the various barriers to CT that will be discussed, the manner in which CT is conceptualised must first be revisited. Though debate over its definition and what components are necessary to think critically has existed over the 80-plus years since the term’s coining (i.e., Glaser 1941 ), it is generally accepted that CT consists of two main components: skills and dispositions ( Dwyer 2017 ; Dwyer et al. 2012 , 2014 ; Ennis 1996 , 1998 ; Facione 1990 ; Facione et al. 2002 ; Halpern 2014 ; Ku and Ho 2010a ; Perkins and Ritchhart 2004 ; Quinn et al. 2020 ). CT skills—analysis, evaluation, and inference—refer to the higher-order, cognitive, ‘task-based’ processes necessary to conduct CT (e.g., see Dwyer et al. 2014 ; Facione 1990 ). CT dispositions have been described as inclinations, tendencies, or willingness to perform a given thinking skill (e.g., see Dwyer et al. 2016 ; Siegel 1999 ; Valenzuela et al. 2011 ), which may relate to attitudinal and intellectual habits of thinking, as well as motivational processes ( Ennis 1996 ; Norris 1994 ; Paul and Elder 2008 ; Perkins et al. 1993 ; Valenzuela et al. 2011 ). The relationship between CT skills and dispositions has been argued to be mutually dependent. As a result, overemphasising or encouraging the development of one over the other is a barrier to CT as a whole. Though this may seem obvious, it remains the case that CT instruction often places added emphasis on skills simply because they can be taught (though that does not ensure that everyone has or will be taught such skills), whereas dispositions are ‘trickier’ (e.g., see Dwyer 2017 ; Ku and Ho 2010a ). That is, it is unlikely that simply ‘teaching’ students to be motivated towards CT or to value it over short-instructional periods will actually meaningfully enhance it. Moreover, debate exists over how best to train disposition or even measure it. With that, some individuals might be more ‘inherently’ disposed to CT in light of their truth-seeking, open-minded, or inquisitive natures ( Facione and Facione 1992 ; Quinn et al. 2020 ). The barrier, in this context, is how we can enhance the disposition of those who are not ‘inherently’ inclined. For example, though an individual may possess the requisite skills to conduct CT, it does not ensure the tendency or willingness to apply them; and conversely, having the disposition to apply CT does not mean that one has the ability to do so ( Valenzuela et al. 2011 ). Given the pertinence of CT skills and dispositions to the application of CT in a broader sense, inadequacies in either create a barrier to application.

2.2. Epistemological (Mis)Understanding

To reiterate, most extant conceptualisations of CT focus on the tandem working of skills and dispositions, though significantly fewer emphasise the reflective judgment aspect of CT that might govern various associated processes ( Dawson 2008 ; Dwyer 2017 ; Dwyer et al. 2014 , 2015 ; King and Kitchener 1994 , 2004 ; Stanovich and Stanovich 2010 ). Reflective judgment (RJ) refers to a self-regulatory process of decision-making, with respect to taking time to engage one’s understanding of the nature, limits, and certainty of knowing and how this can affect the defense of their reasoning ( Dwyer 2017 ; King and Kitchener 1994 ; Ku and Ho 2010b ). The ability to metacognitively ‘think about thinking’ ( Flavell 1976 ; Ku and Ho 2010b ) in the application of critical thinking skills implies a reflective sensibility consistent with epistemological understanding and the capacity for reflective judgement ( Dwyer et al. 2015 ; King and Kitchener 1994 ). Acknowledging levels of (un)certainty is important in CT because the information a person is presented with (along with that person’s pre-existing knowledge) often provides only a limited source of information from which to draw a conclusion. Thus, RJ is considered a component of CT ( Baril et al. 1998 ; Dwyer et al. 2015 ; Huffman et al. 1991 ) because it allows one to acknowledge that epistemological understanding is necessary for recognising and judging a situation in which CT may be required ( King and Kitchener 1994 ). For example, the interdependence between RJ and CT can be seen in the way that RJ influences the manner in which CT skills like analysis and evaluation are conducted or the balance and perspective within the subsequent inferences drawn ( Dwyer et al. 2015 ; King et al. 1990 ). Moreover, research suggests that RJ development is not a simple function of age or time but more so a function of the amount of active engagement an individual has working in problem spaces that require CT ( Brabeck 1981 ; Dawson 2008 ; Dwyer et al. 2015 ). The more developed one’s RJ, the better able one is to present “a more complex and effective form of justification, providing more inclusive and better integrated assumptions for evaluating and defending a point of view” ( King and Kitchener 1994, p. 13 ).

Despite a lesser focus on RJ, research indicates a positive relationship between it and CT ( Baril et al. 1998 ; Brabeck 1981 ; Dawson 2008 ; Dwyer et al. 2015 ; Huffman et al. 1991 ; King et al. 1990 )—the understanding of which is pertinent to better understanding the foundation to CT barriers. For example, when considering one’s proficiency in CT skills, there might come a time when the individual becomes so good at using them that their application becomes something akin to ‘second nature’ or even ‘automatic’. However, this creates a contradiction: automatic thinking is largely the antithesis of reflective judgment (even though judgment is never fully intuitive or reflective; see Cader et al. 2005 ; Dunwoody et al. 2000 ; Hamm 1988 ; Hammond 1981 , 1996 , 2000 )—those who think critically take their time and reflect on their decision-making; even if the solution/conclusion drawn from the automatic thinking is ‘correct’ or yields a positive outcome, it is not a critically thought out answer, per se. Thus, no matter how skilled one is at applying CT skills, once the application becomes primarily ‘automatic’, the thinking ceases to be critical ( Dwyer 2017 )—a perspective consistent with Dual Process Theory (e.g., Stanovich and West 2000 ). Indeed, RJ acts as System 2 thinking ( Stanovich and West 2000 ): it is slow, careful, conscious, and consistent ( Kahneman 2011 ; Hamm 1988 ); it is associated with high cognitive control, attention, awareness, concentration, and complex computation ( Cader et al. 2005 ; Kahneman 2011 ; Hamm 1988 ); and accounts for epistemological concerns—consistent not only with King and Kitchener’s ( 1994 ) conceptualisation but also Kuhn’s ( 1999 , 2000 ) perspective on metacognition and epistemological knowing . This is where RJ comes into play as an important component of CT—interdependent among the requisite skills and dispositions ( Baril et al. 1998 ; Dwyer et al. 2015 )—it allows one to acknowledge that epistemological understanding is vital to recognising and judging a situation in which CT is required ( King and Kitchener 1994 ). With respect to the importance of epistemological understanding, consider the following examples for elaboration.

The primary goal of CT is to enhance the likelihood of generating reasonable conclusions and/or solutions. Truth-seeking is a CT disposition fundamental to the attainment of this goal ( Dwyer et al. 2016 ; Facione 1990 ; Facione and Facione 1992 ) because if we just applied any old nonsense as justification for our arguments or solutions, they would fail in the application and yield undesirable consequences. Despite what may seem like truth-seeking’s obvious importance in this context, all thinkers succumb to unwarranted assumptions on occasion (i.e., beliefs presumed to be true without adequate justification). It may also seem obvious, in context, that it is important to be able to distinguish facts from beliefs. However, the concepts of ‘fact’ or ‘truth’, with respect to how much empirical support they have to validate them, also require consideration. For example, some might conceptualise truth as factual information or information that has been or can be ‘proven’ true. Likewise, ‘proof’ is often described as evidence establishing a fact or the truth of a statement—indicating a level of absolutism. However, the reality is that we cannot ‘prove’ things—as scientists and researchers well know—we can only disprove them, such as in experimental settings where we observe a significant difference between groups on some measure—we do not prove the hypothesis correct, rather, we disprove the null hypothesis. This is why, in large part, researchers and scientists use cautious language in reporting their results. We know the best our findings can do is reinforce a theory—another concept often misconstrued in the wider population as something like a hypothesis, as opposed to what it actually entails: a robust model for how and/or why a given phenomenon might occur (e.g., gravity). Thus, theories will hold ‘true’ until they are falsified—that is, disproven (e.g., Popper [1934] 1959 , 1999 ).

Unfortunately, ‘proof’, ‘prove’, and ‘proven’—words that ensure certainty to large populations—actually disservice the public in subtle ways that can hinder CT. For example, a company that produces toothpaste might claim its product to be ‘clinically proven’ to whiten teeth. Consumers purchasing that toothpaste are likely to expect to have whiter teeth after use. However, what happens—as often may be the case—if it does not whiten their teeth? The word ‘proven’ implies a false claim in context. Of course, those in research understand that the word’s use is a marketing ploy, given that ‘clinically proven’ sounds more reassuring to consumers than ‘there is evidence to suggest…’; but, by incorrectly using words like ‘proven’ in our daily language, we reinforce a misunderstanding of what it means to assess, measure and evaluate—particularly from a scientific standpoint (e.g., again, see Popper [1934] 1959 , 1999 ).

Though this example may seem like a semantic issue, it has great implications for CT in the population. For example, a vast majority of us grew up being taught the ‘factual’ information that there were nine planets in our solar system; then, in 2006, Pluto was reclassified as a dwarf planet—no longer being considered a ‘major’ planet of our solar system. As a result, we now have eight planets. This change might be perceived in two distinct ways: (1) ‘science is amazing because it’s always developing—we’ve now reached a stage where we know so much about the solar system that we can differentiate celestial bodies to the extent of distinguishing planets from dwarf planets’; and (2) ‘I don’t understand why these scientists even have jobs, they can’t even count planets’. The first perspective is consistent with that of an individual with epistemological understanding and engagement that previous understandings of models and theories can change, not necessarily because they were wrong, but rather because they have been advanced in light of gaining further credible evidence. The second perspective is consistent with that of someone who has failed to engage epistemological understanding, who does not necessarily see that the change might reflect progress, who might be resistant to change, and who might grow in distrust of science and research in light of these changes. The latter point is of great concern in the CT research community because the unwarranted cynicism and distrust of science and research, in context, may simply reflect a lack of epistemological understanding or engagement (e.g., to some extent consistent with the manner in which conspiracy theories are developed, rationalised and maintained (e.g., Swami and Furnham 2014 )). Notably, this should also be of great concern to education departments around the world, as well as society, more broadly speaking.

Upon considering epistemological engagement in more practical, day-to-day scenarios (or perhaps a lack thereof), we begin to see the need for CT in everyday 21st-century life—heightened by the ‘new knowledge economy’, which has resulted in exponential increases in the amount of information made available since the late 1990s (e.g., Darling-Hammond 2008 ; Dwyer 2017 ; Jukes and McCain 2002 ; Varian and Lyman 2003 ). Though increased amounts of and enhanced access to information are largely good things, what is alarming about this is how much of it is misinformation or disinformation ( Commission on Fake News and the Teaching of Critical Literacy in Schools 2018 ). Truth be told, the new knowledge economy is anything but ‘new’ anymore. Perhaps, over the past 10–15 years, there has been an increase in the need for CT above and beyond that seen in the ‘economy’s’ wake—or maybe ever before; for example, in light of the social media boom, political unrest, ‘fake news’, and issues regarding health literacy. The ‘new’ knowledge economy has made it so that knowledge acquisition, on its own, is no longer sufficient for learning—individuals must be able to work with and adapt information through CT in order to apply it appropriately ( Dwyer 2017 ).

Though extant research has addressed the importance of epistemological understanding for CT (e.g., Dwyer et al. 2014 ), it does not address how not engaging it can substantially hinder it—regardless of how skilled or disposed to think critically an individual may be. Notably, this is distinct from ‘inadequacies’ in, say, memory, comprehension, or other ‘lower-order’ cognitively-associated skills required for CT ( Dwyer et al. 2014 ; Halpern 2014 ; see, again, Note 1) in that reflective judgment is essentially a pole on a cognitive continuum (e.g., see Cader et al. 2005 ; Hamm 1988 ; Hammond 1981 , 1996 , 2000 ). Cognitive Continuum Theory postulates a continuum of cognitive processes anchored by reflective judgment and intuitive judgment, which represents how judgment situations or tasks relate to cognition, given that thinking is never purely reflective, nor is it completely intuitive; rather, it rests somewhere in between ( Cader et al. 2005 ; Dunwoody et al. 2000 ). It is also worth noting that, in Cognitive Continuum Theory, neither reflective nor intuitive judgment is assumed, a priori, to be superior ( Dunwoody et al. 2000 ), despite most contemporary research on judgment and decision-making focusing on the strengths of RJ and limitations associated with intuitive judgment ( Cabantous et al. 2010 ; Dhami and Thomson 2012 ; Gilovich et al. 2002 ). Though this point regarding superiority is acknowledged and respected (particularly in non-CT cases where it is advantageous to utilise intuitive judgment), in the context of CT, it is rejected in light of the example above regarding the automaticity of thinking skills.

2.3. Intuitive Judgment

The manner in which human beings think, and the way that thinking has evolved over millions of years, is a truly amazing thing. Such evolution has made it so that we can observe a particular event and make complex computations regarding predictions, interpretations, and reactions in less than a second (e.g., Teichert et al. 2014 ). Unfortunately, we have become so good at it that we often over-rely on ‘fast’ thinking and intuitive judgments, and we have become ‘cognitively lazy’, given the speed at which we can make decisions with little energy ( Kahneman 2011 ; Simon 1957 ). In the context of CT, this ‘lazy’ thinking is an impediment (standing in opposition to reflective judgment). For example, consider a time in which you were presented with numeric data on a topic, and you instantly aligned your perspective with what the ‘numbers indicate’. Of course, numbers do not lie… but people do—that is not to say that the person who initially interpreted and then presented you with those numbers is trying to disinform you; rather, the numbers presented might not tell the full story (i.e., the data are incomplete or inadequate, unbeknownst to the person reporting on them); and thus, there might be alternative interpretations of the data in question. With that, there most certainly are individuals who will wish to persuade you to align with their perspective, which only strengthens the impetus for being aware of intuitive judgment as a barrier. Consider another example: have you ever accidentally insulted someone at work, school, or in a social setting? Was it because the statement you made was based on some kind of assumption or stereotype? It may have been an honest mistake, but if a statement is made based on what one thinks they know, as opposed to what they actually know about the situation—without taking the time to recognise that all situations are unique and that reflection is likely warranted in light of such uncertainty—then it is likely that the schema-based ‘intuitive judgment’ is what is at fault here.

Our ability to construct schemas (i.e., mental frameworks for how we interpret the world) is evolutionarily adaptive in that these scripts allow us to: make quick decisions when necessary and without much effort, such as in moments of impending danger; answer questions in conversation; interpret social situations; or try to stave off cognitive load or decision fatigue ( Baumeister 2003 ; Sweller 2010 ; Vohs et al. 2014 ). To reiterate, research in the field of higher-order thinking often focuses on the failings of intuitive judgment ( Dwyer 2017 ; Hamm 1988 ) as being limited, misapplied, and, sometimes, yielding grossly incorrect responses—thus, leading to faulty reasoning and judgment as a result of systematic biases and errors ( Gilovich et al. 2002 ; Kahneman 2011 ; Kahneman et al. 1982 ; Slovic et al. 1977 ; Tversky and Kahneman 1974 ), whether in terms of schematic thinking ( Leventhal 1984 ), system 1 thinking ( Stanovich and West 2000 ; Kahneman 2011 ), miserly thinking ( Stanovich 2018 ) or even heuristics ( Kahneman and Frederick 2002 ; Tversky and Kahneman 1974 ). Nevertheless, it remains that such protocols are learned—not just through experience (as discussed below), but often through more ‘academic’ means. For example, consider again the anecdote above about learning to apply CT skills so well that it becomes like ‘second nature’. Such skills become a part of an individual’s ‘mindware’ ( Clark 2001 ; Stanovich 2018 ; Stanovich et al. 2016 ) and, in essence, become heuristics themselves. Though their application requires RJ for them to be CT, it does not mean that the responses yielded will be incorrect.

Moreover, despite the descriptions above, it would be incorrect, and a disservice to readers to imply that RJ is always right and intuitive judgment is always wrong, especially without consideration of the contextual issues—both intuitive and reflective judgments have the potential to be ‘correct’ or ‘incorrect’ with respect to validity, reasonableness or appropriateness. However, it must also be acknowledged that there is a cognitive ‘miserliness’ to depending on intuitive judgment, in which case, the ability to detect and override this dependence ( Stanovich 2018 )—consistent with RJ, is of utmost importance if we care about our decision-making. That is, if we care about our CT (see below for a more detailed discussion), we must ignore the implicit ‘noise’ associated with the intuitive judgment (regardless of whether or not it is ‘correct’) and, instead, apply the necessary RJ to ensure, as best we can, that the conclusion or solution is valid, reasonable or appropriate.

Such a recommendation, though, is much easier said than done. One problem with relying on mental shortcuts afforded by intuition and heuristics is that they are largely experience-based protocols. Though that may sound like a positive thing, using ‘experience’ to draw a conclusion in a task that requires CT is erroneous because it essentially acts as ‘research’ based on a sample size of one; and so, ‘findings’ (i.e., one’s conclusion) cannot be generalised to the larger population—in this case, other contexts or problem-spaces ( Dwyer 2017 ). Despite this, we often over-emphasise the importance of experience in two related ways. First, people have a tendency to confuse experience for expertise (e.g., see the Dunning–Kruger Effect, i.e., the tendency for low-skilled individuals to overestimate their ability in tasks relevant to said skill and highly skilled individuals to underestimate their ability in such tasks; see also Kruger and Dunning 1999 ; Mahmood 2016 ), wherein people may not necessarily be expert; rather, they may just have a lot of experience completing a task imperfectly or wrong ( Dwyer and Walsh 2019 ; Hammond 1996 ; Kahneman 2011 ). Second, depending on the nature of the topic or problem, people often evaluate experience on par with research evidence (in terms of credibility), given its personalised nature, which is reinforced by self-serving bias(es).

When evaluating topics in domains wherein one lacks expertise, the need for intellectual integrity and humility ( Paul and Elder 2008 ) in their RJ is increased so that the individual may assess what knowledge is required to make a critically considered judgment. However, this is not necessarily a common response to a lack of relevant knowledge, given that when individuals are tasked with decision-making regarding a topic in which they do not possess relevant knowledge, these individuals will generally rely on emotional cues to inform their decision-making (e.g., Kahneman and Frederick 2002 ). Concerns here are not necessarily about the lack of domain-specific knowledge necessary to make an accurate decision, but rather the (1) belief of the individual that they have the knowledge necessary to make a critically thought-out judgment, even when this is not the case—again, akin to the Dunning–Kruger Effect ( Kruger and Dunning 1999 ); or (2) lack of willingness (i.e., disposition) to gain additional, relevant topic knowledge.

One final problem with relying on experience for important decisions, as alluded to above, is that when experience is engaged, it is not necessarily an objective recollection of the procedure. It can be accompanied by the individual’s beliefs, attitudes, and feelings—how that experience is recalled. The manner in which an individual draws on their personal experience, in light of these other factors, is inherently emotion-based and, likewise, biased (e.g., Croskerry et al. 2013 ; Loftus 2017 ; Paul 1993 ).

2.4. Bias and Emotion

Definitions of CT often reflect that it is to be applied to a topic, argument, or problem of importance that the individual cares about ( Dwyer 2017 ). The issue of ‘caring’ is important because it excludes judgment and decision-making in day-to-day scenarios that are not of great importance and do not warrant CT (e.g., ‘what colour pants best match my shirt’ and ‘what to eat for dinner’); again, for example, in an effort to conserve time and cognitive resources (e.g., Baumeister 2003 ; Sweller 2010 ). However, given that ‘importance’ is subjective, it essentially boils down to what one cares about (e.g., issues potentially impactful in one’s personal life; topics of personal importance to the individual; or even problems faced by an individual’s social group or work organisation (in which case, care might be more extrinsically-oriented)). This is arguably one of the most difficult issues to resolve in CT application, given its contradictory nature—where it is generally recommended that CT should be conducted as void of emotion and bias as possible, it is, at the same time, also recommended that it should only be applied to things we care about. As a result, the manner in which care is conceptualised requires consideration. For example, in terms of CT, care can be conceptualised as ‘concern or interest; the attachment of importance to a person, place, object or concept; and serious attention or consideration applied to doing something correctly or to avoid damage or risk’; as opposed to some form of passion (e.g., intense, driving or over-powering feeling or conviction; emotions as distinguished from reason; a strong liking or desire for or devotion to some activity, object or concept). In this light, care could be argued to be more of a dispositional or self-regulatory factor than an emotional bias, thus making it useful to CT. Though this distinction is important, the manner in which care is labeled does not lessen the potential for biased emotion to play a role in the thinking process. For example, it has been argued that if one cares about the decision they make or the conclusion they draw, then the individual will do their best to be as objective as possible ( Dwyer 2017 ). However, it must also be acknowledged that this may not always be the case or even completely feasible (i.e., how can any decision be fully void of emotional input?)—though one may strive to be as objective as possible, such objectivity is not ensured given that implicit bias may infiltrate their decision-making (e.g., taking assumptions for granted as facts in filling gaps (unknowns) in a given problem-space). Consequently, such implicit biases may be difficult to amend, given that we may not be fully aware of them at play.

With that, explicit biases are just as concerning, despite our awareness of them. For example, the more important an opinion or belief is to an individual, the greater the resistance to changing their mind about it ( Rowe et al. 2015 ), even in light of evidence indicating the contrary ( Tavris and Aronson 2007 ). In some cases, the provision of information that corrects the flawed concept may even ‘backfire’ and reinforce the flawed or debunked stance ( Cook and Lewandowsky 2011 ). This cognitive resistance is an important barrier to CT to consider for obvious reasons—as a process; it acts in direct opposition to RJ, the skill of evaluation, as well as a number of requisite dispositions towards CT, including truth-seeking and open-mindedness (e.g., Dwyer et al. 2014 , 2016 ; Facione 1990 ); and at the same time, yields important real-world impacts (e.g., see Nyhan et al. 2014 ).

The notion of emotion impacting rational thought is by no means a novel concept. A large body of research indicates a negative impact of emotion on decision-making (e.g., Kahneman and Frederick 2002 ; Slovic et al. 2002 ; Strack et al. 1988 ), higher-order cognition ( Anticevic et al. 2011 ; Chuah et al. 2010 ; Denkova et al. 2010 ; Dolcos and McCarthy 2006 ) and cognition, more generally ( Iordan et al. 2013 ; Johnson et al. 2005 ; Most et al. 2005 ; Shackman et al. 2006 ) 2 . However, less attention has specifically focused on emotion’s impact on the application of critical thought. This may be a result of assumptions that if a person is inclined to think critically, then what is yielded will typically be void of emotion—which is true to a certain extent. However, despite the domain generality of CT ( Dwyer 2011 , 2017 ; Dwyer and Eigenauer 2017 ; Dwyer et al. 2015 ; Gabennesch 2006 ; Halpern 2014 ), the likelihood of emotional control during the CT process remains heavily dependent on the topic of application. Consider again, for example, that there is no guarantee that an individual who generally applies CT to important topics or situations will do so in all contexts. Indeed, depending on the nature of the topic or the problem faced, an individual’s mindware ( Clark 2001 ; Stanovich 2018 ; Stanovich et al. 2016 ; consistent with the metacognitive nature of CT) and the extent to which a context can evoke emotion in the thinker will influence what and how thinking is applied. As addressed above, if the topic is something about which the individual feels passionate, then it will more likely be a greater challenge for them to remain unbiased and develop a reasonably objective argument or solution.

Notably, self-regulation is an important aspect of both RJ and CT ( Dwyer 2017 ; Dwyer et al. 2014 ), and, in this context, it is difficult not to consider the role emotional intelligence might play in the relationship between affect and CT. For example, though there are a variety of conceptualisations of emotional intelligence (e.g., Bar-On 2006 ; Feyerherm and Rice 2002 ; Goleman 1995 ; Salovey and Mayer 1990 ; Schutte et al. 1998 ), the underlying thread among these is that, similar to the concept of self-regulation, emotional intelligence (EI) refers to the ability to monitor (e.g., perceive, understand and regulate) one’s own feelings, as well as those of others, and to use this information to guide relevant thinking and behaviour. Indeed, extant research indicates that there is a positive association between EI and CT (e.g., Afshar and Rahimi 2014 ; Akbari-Lakeh et al. 2018 ; Ghanizadeh and Moafian 2011 ; Kaya et al. 2017 ; Stedman and Andenoro 2007 ; Yao et al. 2018 ). To shed light upon this relationship, Elder ( 1997 ) addressed the potential link between CT and EI through her description of the latter as a measure of the extent to which affective responses are rationally-based , in which reasonable desires and behaviours emerge from such rationally-based emotions. Though there is extant research on the links between CT and EI, it is recommended that future research further elaborate on this relationship, as well as with other self-regulatory processes, in an effort to further establish the potentially important role that EI might play within CT.

3. Discussion

3.1. Interpretations

Given difficulties in the past regarding the conceptualisation of CT ( Dwyer et al. 2014 ), efforts have been made to be as specific and comprehensive as possible when discussing CT in the literature to ensure clarity and accuracy. However, it has been argued that such efforts have actually added to the complexity of CT’s conceptualisation and had the opposite effect on clarity and, perhaps, more importantly, the accessibility and practical usefulness for educators (and students) not working in the research area. As a result, when asked what CT is, I generally follow up the ‘long definition’, in light of past research, with a much simpler description: CT is akin to ‘playing devil’s advocate’. That is, once a claim is made, one should second-guess it in as many conceivable ways as possible, in a process similar to the Socratic Method. Through asking ‘why’ and conjecturing alternatives, we ask the individual—be it another person or even ourselves—to justify the decision-making. It keeps the thinker ‘honest’, which is particularly useful if we’re questioning ourselves. If we do not have justifiable reason(s) for why we think or intend to act in a particular way (above and beyond considered objections), then it should become obvious that we either missed something or we are biased. It is perhaps this simplified description of CT that gives such impetus for the aim of this review.

Whereas extant frameworks often discuss the importance of CT skills, dispositions, and, to a lesser extent, RJ and other self-regulatory functions of CT, they do so with respect to components of CT or processes that facilitate CT (e.g., motivation, executive functions, and dispositions), without fully encapsulating cognitive processes and other factors that may hinder it (e.g., emotion, bias, intuitive judgment and a lack of epistemological understanding or engagement). With that, this review is neither a criticism of existing CT frameworks nor is it to imply that CT has so many barriers that it cannot be taught well, nor does it claim to be a complete list of processes that can impede CT (see again Note 1). To reiterate, education in CT can yield beneficial effects ( Abrami et al. 2008 , 2015 ; Dwyer 2017 ; Dwyer and Eigenauer 2017 ); however, such efficacy may be further enhanced by presenting students and individuals interested in CT with the barriers they are likely to face in its application; explaining how these barriers manifest and operate; and offering potential strategies for overcoming them.

3.2. Further Implications and Future Research

Though the barriers addressed here are by no means new to the arena of research in higher-order cognition, there is a novelty in their collated discussion as impactful barriers in the context of CT, particularly with respect to extant CT research typically focusing on introducing strategies and skills for enhancing CT, rather than identifying ‘preventative measures’ for barriers that can negatively impact CT. Nevertheless, future research is necessary to address how such barriers can be overcome in the context of CT. As addressed above, it is recommended that CT education include discussion of these barriers and encourage self-regulation against them; and, given the vast body of CT research focusing on enhancement through training and education, it seems obvious to make such a recommendation in this context. However, it is also recognised that simply identifying these barriers and encouraging people to engage in RJ and self-regulation to combat them may not suffice. For example, educators might very well succeed in teaching students how to apply CT skills , but just as these educators may not be able to motivate students to use them as often as they might be needed or even to value such skills (such as in attempting to elicit a positive disposition towards CT), it might be the case that without knowing about the impact of the discussed barriers to CT (e.g., emotion and/or intuitive judgment), students may be just as susceptible to biases in their attempts to think critically as others without CT skills. Thus, what such individuals might be applying is not CT at all; rather, just a series of higher-order cognitive skills from a biased or emotion-driven perspective. As a result, a genuine understanding of these barriers is necessary for individuals to appropriately self-regulate their thinking.

Moreover, though the issues of epistemological beliefs, bias, emotion, and intuitive processes are distinct in the manner in which they can impact CT, these do not have set boundaries; thus, an important implication is that they can overlap. For example, epistemological understanding can influence how individuals make decisions in real-world scenarios, such as through intuiting a judgment in social situations (i.e., without considering the nature of the knowledge behind the decision, the manner in which such knowledge interacts [e.g., correlation v. causation], the level of uncertainty regarding both the decision-maker’s personal stance and the available evidence), when a situation might actually require further consideration or even the honest response of ‘I don’t know’. The latter concept—that of simply responding ‘I don’t know’ is interesting to consider because though it seems, on the surface, to be inconsistent with CT and its outcomes, it is commensurate with many of its associated components (e.g., intellectual honesty and humility; see Paul and Elder 2008 ). In the context this example is used, ‘I don’t know’ refers to epistemological understanding. With that, it may also be impacted by bias and emotion. For example, depending on the topic, an individual may be likely to respond ‘I don’t know’ when they do not have the relevant knowledge or evidence to provide a sufficient answer. However, in the event that the topic is something the individual is emotionally invested in or feels passionate about, an opinion or belief may be shared instead of ‘I don’t know’ (e.g., Kahneman and Frederick 2002 ), despite a lack of requisite evidence-based knowledge (e.g., Kruger and Dunning 1999 ). An emotional response based on belief may be motivated in the sense that the individual knows that they do not know for sure and simply uses a belief to support their reasoning as a persuasive tool. On the other hand, the emotional response based on belief might be used simply because the individual may not know that the use of a belief is an insufficient means of supporting their perspective– instead, they might think that their intuitive, belief-based judgment is as good as a piece of empirical evidence; thus, suggesting a lack of empirical understanding. With that, it is fair to say that though epistemological understanding, intuitive judgment, emotion, and bias are distinct concepts, they can influence each other in real-world CT and decision-making. Though there are many more examples of how this might occur, the one presented may further support the recommendation that education can be used to overcome some of the negative effects associated with the barriers presented.

For example, in Ireland, students are not generally taught about academic referencing until they reach third-level education. Anecdotally, I was taught about referencing at age 12 and had to use it all the way through high school when I was growing up in New York. In the context of these referencing lessons, we were taught about the credibility of sources, as well as how to analyse and evaluate arguments and subsequently infer conclusions in light of these sources (i.e., CT skills). We were motivated by our teacher to find the ‘truth’ as best we could (i.e., a fundament of CT disposition). Now, I recognise that this experience cannot be generalised to larger populations, given that I am a sample size of one, but I do look upon such education, perhaps, as a kind of transformative learning experience ( Casey 2018 ; King 2009 ; Mezirow 1978 , 1990 ) in the sense that such education might have provided a basis for both CT and epistemological understanding. For CT, we use research to support our positions, hence the importance of referencing. When a ‘reference’ is not available, one must ask if there is actual evidence available to support the proposition. If there is not, one must question the basis for why they think or believe that their stance is correct—that is, whether there is logic to the reasoning or whether the proposition is simply an emotion- or bias-based intuitive judgment. So, in addition to referencing, the teaching of some form of epistemology, perhaps early in children’s secondary school careers, might benefit students in future efforts to overcome some barriers to CT. Likewise, presenting examples of the observable impact that bias, emotions, and intuitive thought can have on their thinking might also facilitate overcoming these barriers.

As addressed above, it is acknowledged that we may not be able to ‘teach’ people not to be biased or emotionally driven in their thinking, because such thinking occurs naturally (Kahneman 2011)—regardless of how ‘skilled’ one might be in CT. For example, though research suggests that components of CT, such as disposition, can improve over relatively short periods of time (e.g., over the duration of a semester-long course; Rimiene 2002), less is known about how such components are enhanced, given the difficulty often associated with trying to teach something like disposition (Dwyer 2017); that is, to reiterate, it is unlikely that simply ‘teaching’ (or telling) students to be motivated towards CT or to value it (or its associated concepts) will actually enhance it over short periods of time (e.g., semester-long training). Nevertheless, it is reasonable to suggest that, in light of such research, educators can encourage dispositional growth and provide opportunities to develop it. Likewise, it is recommended that educators encourage students to be aware of the cognitive barriers discussed and provide chances to engage in CT scenarios where such barriers are likely to play a role, thus giving students opportunities to acknowledge the barriers and practice overcoming them. Moreover, making students aware of such barriers at younger ages—in a simplified manner—may promote the development of personal perspectives and approaches that are better able to overcome the discussed barriers to CT. This perspective is consistent with research on RJ (Dwyer et al. 2015), which recommended that such enhancement requires not only time to develop (be it over the course of a semester or longer) but also increased opportunities to engage in CT. Through the possibilities described, individuals may learn both how to overcome barriers to CT and from the positive outcomes of applying CT, and may, perhaps, engage in some form of transformative learning (Casey 2018; King 2009; Mezirow 1978, 1990) that facilitates an enhanced ‘valuing’ of, and motivation towards, CT. For example, through a growing understanding of the nature of epistemology, intuitive thinking, emotion, bias, and the manner in which people often succumb to faulty reasoning in light of these, individuals may come to better understand the limits of knowledge and the barriers to CT, as well as how both understandings can be applied; thus, growing further appreciation of the process as it is needed.

To reiterate, research suggests that there may be a developmental trajectory, above and beyond the parameters of a semester-long training course, that is necessary to develop the RJ required to think critically and, likewise, to engage an adequate epistemological stance and self-regulate against impeding cognitive processes (Dwyer et al. 2015). Though such research suggests that this development may be less an issue of time than of the number of opportunities to engage RJ and CT, there is a dearth of recommendations with respect to how this could be accomplished in practice. Moreover, the how and what of such ‘opportunities for engagement’ require further investigation. For example, does this require additional academic work outside the classroom in a formal manner, or does it require informal ‘exploration’ of the world of information on one’s own? If the latter, the question of motivational and dispositional levels once again arises; thus, even further consideration is needed. One way or another, future research is necessary to identify how best to make individuals aware of barriers to CT, encourage them to self-regulate against those barriers, and identify means of increasing opportunities to engage RJ and CT.

4. Conclusions

Taking heed that it is unnecessary to reinvent the CT wheel (Eigenauer 2017), the aim of this review was to further elaborate on the processes associated with CT and to make a valuable contribution to its literature with respect to conceptualisation—not just in light of making people explicitly aware of what CT is, but also of what it is not and how it can be impeded (e.g., through inadequate CT skills and dispositions, epistemological misunderstanding, intuitive judgment, and bias and emotion)—a perspective consistent with that of ‘constructive feedback’, wherein students need to know both what they are doing right and what they are doing wrong. This review further contributes to the CT education literature by identifying the importance of (1) engaging understanding of the nature, limits, and certainty of knowing as individuals traverse the landscape of evidence bases in their research and ‘truth-seeking’; (2) understanding how emotions and biases can affect CT, regardless of the topic; (3) managing gut-level intuition until RJ has been appropriately engaged; and (4) the manner in which language is used to convey meaning for important and/or abstract concepts (e.g., ‘caring’, ‘proof’, causation/correlation). Consistent with the perspectives on research advancement presented in this review, it is acknowledged that the issues addressed here may not be complete and may themselves be advanced upon and updated in time; thus, future research is recommended and welcomed to improve and further establish our working conceptualisation of critical thinking, particularly in real-world applications.

Acknowledgments

The author would like to acknowledge, with great thanks and appreciation, John Eigenauer (Taft College) for his consult, review and advice regarding earlier versions of this manuscript.

Funding Statement

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Data Availability Statement

Conflicts of Interest

The author declares no conflict of interest.

1 Notably, though inadequacies in cognitive resources (apart from those explicitly set within the conceptualisations of CT discussed; e.g., see Section 2.1) are acknowledged as impediments to one’s ability to apply CT (e.g., a lack of relevant background knowledge, as well as broader cognitive abilities and resources (Dwyer 2017; Halpern 2014; Stanovich and Stanovich 2010)), these will not be discussed here, as the focus is largely restricted to cognitive processes that ‘naturally’ act as barriers in their functioning. Moreover, such inadequacies may be more a matter of individual differences than ongoing issues that everyone, regardless of ability, would face in CT (e.g., the impact of emotion and bias). Nevertheless, it is recommended that future research further investigate the influence of such inadequacies in cognitive resources on CT.

2 There is also some research suggesting that emotion may mediate enhanced cognition (Dolcos et al. 2011, 2012). However, this discrepancy in findings may result from the types of emotion studied—such as task-relevant emotion and task-irrelevant emotion. The distinction between the two is important to consider in terms of, for example, the difference between one’s general mood and feelings specific to the topic under consideration. Though mood may play a role in the manner in which CT is conducted (e.g., making judgments about a topic one is passionate about may elicit positive or negative emotions that affect the thinker’s mood in some way), this discussion focuses on task-relevant emotion and associated biases that negatively impact the CT process. This is also an important distinction because an individual may generally think critically about ‘important’ topics, but may fail to do so when faced with a cognitive task requiring CT about which the individual has a strong emotional perspective (e.g., in terms of passion, as described above).

Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

  • Abrami Philip C., Bernard Robert M., Borokhovski Eugene, Waddington David I., Wade C. Anne, Persson Tonje. Strategies for teaching students to think critically: A meta-analysis. Review of Educational Research. 2015; 85 :275–314. doi: 10.3102/0034654314551063. [ CrossRef ] [ Google Scholar ]
  • Abrami Philip C., Bernard Robert M., Borokhovski Evgueni, Wade Anne, Surkes Michael A., Tamim Rana, Zhang Dai. Instructional interventions affecting critical thinking skills and dispositions: A stage 1 meta-analysis. Review of Educational Research. 2008; 78 :1102–34. [ Google Scholar ]
  • Afshar Hassan Soodmand, Rahimi Masoud. The relationship among critical thinking, emotional intelligence, and speaking abilities of Iranian EFL learners. Procedia-Social and Behavioral Sciences. 2014; 136 :75–79. doi: 10.1016/j.sbspro.2014.05.291. [ CrossRef ] [ Google Scholar ]
  • Ahern Aoife, Dominguez Caroline, McNally Ciaran, O’Sullivan John J., Pedrosa Daniela. A literature review of critical thinking in engineering education. Studies in Higher Education. 2019; 44 :816–28. doi: 10.1080/03075079.2019.1586325. [ CrossRef ] [ Google Scholar ]
  • Akbari-Lakeh M., Naderi A., Arbabisarjou A. Critical thinking and emotional intelligence skills and relationship with students’ academic achievement. Prensa Médica Argentina. 2018; 104 :2. [ Google Scholar ]
  • Aliakbari Mohammad, Sadeghdaghighi Akram. Teachers’ perception of the barriers to critical thinking. Procedia-Social and Behavioral Sciences. 2013; 70 :1–5. doi: 10.1016/j.sbspro.2013.01.031. [ CrossRef ] [ Google Scholar ]
  • Anticevic Alan, Repovs Grega, Corlett Philip R., Barch Deanna M. Negative and nonemotional interference with visual working memory in schizophrenia. Biological Psychiatry. 2011; 70 :1159–68. doi: 10.1016/j.biopsych.2011.07.010. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Baril Charles P., Cunningham Billie M., Fordham David R., Gardner Robert L., Wolcott Susan K. Critical thinking in the public accounting profession: Aptitudes and attitudes. Journal of Accounting Education. 1998; 16 :381–406. doi: 10.1016/S0748-5751(98)00023-2. [ CrossRef ] [ Google Scholar ]
  • Bar-On Reuven. The Bar-On model of emotional-social intelligence (ESI) Psicothema. 2006; 18 :13–25. [ PubMed ] [ Google Scholar ]
  • Baumeister Roy. The psychology of irrationality: Why people make foolish, self-defeating choices. The Psychology of Economic Decisions. 2003; 1 :3–16. [ Google Scholar ]
  • Bezanilla María José, Fernández-Nogueira Donna, Poblete Manuel, Galindo-Domínguez Hector. Methodologies for teaching-learning critical thinking in higher education: The teacher’s view. Thinking Skills and Creativity. 2019; 33 :100584. doi: 10.1016/j.tsc.2019.100584. [ CrossRef ] [ Google Scholar ]
  • Brabeck Mary Margaret. The relationship between critical thinking skills and development of reflective judgment among adolescent and adult women; Paper presented at the 89th annual convention of the American Psychological Association; Los Angeles, CA, USA. August 24–26; 1981. [ Google Scholar ]
  • Butler Heather A., Dwyer Christopher P., Hogan Michael J., Franco Amanda, Rivas Silvia F., Saiz Carlos, Almeida Leandro S. The Halpern Critical Thinking Assessment and real-world outcomes: Cross-national applications. Thinking Skills and Creativity. 2012; 7 :112–21. doi: 10.1016/j.tsc.2012.04.001. [ CrossRef ] [ Google Scholar ]
  • Byerly T. Ryan. Teaching for intellectual virtue in logic and critical thinking classes: Why and how. Teaching Philosophy. 2019; 42 :1. doi: 10.5840/teachphil201911599. [ CrossRef ] [ Google Scholar ]
  • Cabantous Laure, Gond Jean-Pascal, Johnson-Cramer Michael. Decision theory as practice: Crafting rationality in organizations. Organization Studies. 2010; 31 :1531–66. doi: 10.1177/0170840610380804. [ CrossRef ] [ Google Scholar ]
  • Cáceres Martín, Nussbaum Miguel, Ortiz Jorge. Integrating critical thinking into the classroom: A teacher’s perspective. Thinking Skills and Creativity. 2020; 37 :100674. [ Google Scholar ]
  • Cader Raffik, Campbell Steve, Watson Don. Cognitive continuum theory in nursing decision-making. Journal of Advanced Nursing. 2005; 49 :397–405. doi: 10.1111/j.1365-2648.2004.03303.x. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Casey Helen. Doctoral dissertation. National University of Ireland; Galway, Ireland: 2018. Transformative Learning: An Exploration of the BA in Community and Family Studies Graduates’ Experiences. [ Google Scholar ]
  • Chuah Lisa YM, Dolcos Florin, Chen Annette K., Zheng Hui, Parimal Sarayu, Chee Michael WL. Sleep deprivation and interference by emotional distracters. SLEEP. 2010; 33 :1305–13. doi: 10.1093/sleep/33.10.1305. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Clark Andy. Mindware: An Introduction to the Philosophy of Cognitive Science. Oxford University Press; New York: 2001. [ Google Scholar ]
  • Commission on Fake News and the Teaching of Critical Literacy in Schools . Fake News and Critical Literacy: Final Report. National Literacy Trust; London: 2018. [ Google Scholar ]
  • Cook John, Lewandowsky Stephan. The Debunking Handbook. University of Queensland; St. Lucia: 2011. [ Google Scholar ]
  • Cornell Paul, Riordan Monica, Townsend-Gervis Mary, Mobley Robin. Barriers to critical thinking: Workflow interruptions and task switching among nurses. JONA: The Journal of Nursing Administration. 2011; 41 :407–14. doi: 10.1097/NNA.0b013e31822edd42. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Croskerry Pat, Singhal Geeta, Mamede Sílvia. Cognitive debiasing 2: Impediments to and strategies for change. BMJ Quality and Safety. 2013; 22 :ii65–ii72. doi: 10.1136/bmjqs-2012-001713. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Darling-Hammond Linda. How can we teach for meaningful learning? In: Darling-Hammond L., editor. Powerful Learning. Wiley; New York: 2008. pp. 1–10. [ Google Scholar ]
  • Dawson Theo L. Prepared in Response to Tasking from ODNI/CHCO/IC Leadership Development Office. Developmental Testing Service, LLC; Northampton: 2008. Metacognition and learning in adulthood. [ Google Scholar ]
  • Denkova Ekaterina, Wong Gloria, Dolcos Sanda, Sung Keen, Wang Lihong, Coupland Nicholas, Dolcos Florin. The impact of anxiety-inducing distraction on cognitive performance: A combined brain imaging and personality investigation. PLoS ONE. 2010; 5 :e14150. doi: 10.1371/journal.pone.0014150. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Dhami Mandeep K., Thomson Mary E. On the relevance of cognitive continuum theory and quasirationality for understanding management judgment and decision making. European Management Journal. 2012; 30 :316–26. doi: 10.1016/j.emj.2012.02.002. [ CrossRef ] [ Google Scholar ]
  • Dolcos Florin, Iordan Alexandru D., Dolcos Sanda. Neural correlates of emotion–cognition interactions: A review of evidence from brain imaging investigations. Journal of Cognitive Psychology. 2011; 23 :669–94. doi: 10.1080/20445911.2011.594433. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Dolcos Florin, McCarthy Gregory. Brain systems mediating cognitive interference by emotional distraction. Journal of Neuroscience. 2006; 26 :2072–79. doi: 10.1523/JNEUROSCI.5042-05.2006. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Dolcos Florin, Denkova Ekaterina, Dolcos Sanda. Neural correlates of emotional memories: A review of evidence from brain imaging studies. Psychologia. 2012; 55 :80–111. doi: 10.2117/psysoc.2012.80. [ CrossRef ] [ Google Scholar ]
  • Dumitru Daniela. Critical thinking and integrated programs. The problem of transferability. Procedia-Social and Behavioral Sciences. 2012; 33 :143–47. doi: 10.1016/j.sbspro.2012.01.100. [ CrossRef ] [ Google Scholar ]
  • Dunwoody Philip T., Haarbauer Eric, Mahan Robert P., Marino Christopher, Tang Chu-Chun. Cognitive adaptation and its consequences: A test of cognitive continuum theory. Journal of Behavioral Decision Making. 2000; 13 :35–54. doi: 10.1002/(SICI)1099-0771(200001/03)13:1<35::AID-BDM339>3.0.CO;2-U. [ CrossRef ] [ Google Scholar ]
  • Dwyer Christopher P. Doctoral thesis. National University of Ireland; Galway, Ireland: 2011. The Evaluation of Argument Mapping as a Learning Tool. [ Google Scholar ]
  • Dwyer Christopher P. Critical Thinking: Conceptual Perspectives and Practical Guidelines. Cambridge University Press; Cambridge: 2017. [ Google Scholar ]
  • Dwyer Christopher P. Teaching critical thinking. The SAGE Encyclopedia of Higher Education. 2020; 4 :1510–12. [ Google Scholar ]
  • Dwyer Christopher P., Walsh Anne. A case study of the effects of critical thinking instruction through adult distance learning on critical thinking performance: Implications for critical thinking development. Educational Technology and Research. 2019; 68 :17–35. doi: 10.1007/s11423-019-09659-2. [ CrossRef ] [ Google Scholar ]
  • Dwyer Christopher P., Eigenauer John D. To Teach or not to Teach Critical Thinking: A Reply to Huber and Kuncel. Thinking Skills and Creativity. 2017; 26 :92–95. doi: 10.1016/j.tsc.2017.08.002. [ CrossRef ] [ Google Scholar ]
  • Dwyer Christopher P., Hogan Michael J., Stewart Ian. An evaluation of argument mapping as a method of enhancing critical thinking performance in e-learning environments. Metacognition and Learning. 2012; 7 :219–44. doi: 10.1007/s11409-012-9092-1. [ CrossRef ] [ Google Scholar ]
  • Dwyer Christopher P., Hogan Michael J., Stewart Ian. An integrated critical thinking framework for the 21st century. Thinking Skills and Creativity. 2014; 12 :43–52. doi: 10.1016/j.tsc.2013.12.004. [ CrossRef ] [ Google Scholar ]
  • Dwyer Christopher P., Hogan Michael J., Stewart Ian. The evaluation of argument mapping-infused critical thinking instruction as a method of enhancing reflective judgment performance. Thinking Skills & Creativity. 2015; 16 :11–26. [ Google Scholar ]
  • Dwyer Christopher. P., Hogan Michael J., Harney Owen M., Kavanagh Caroline. Facilitating a Student-Educator Conceptual Model of Dispositions towards Critical Thinking through Interactive Management. Educational Technology & Research. 2016; 65 :47–73. [ Google Scholar ]
  • Eigenauer John D. Don’t reinvent the critical thinking wheel: What scholarly literature says about critical thinking instruction. NISOD Innovation Abstracts. 2017; 39 :2. [ Google Scholar ]
  • Elder Linda. Critical thinking: The key to emotional intelligence. Journal of Developmental Education. 1997; 21 :40. doi: 10.5840/inquiryctnews199616211. [ CrossRef ] [ Google Scholar ]
  • Ennis Robert H. A taxonomy of critical thinking dispositions and abilities. In: Baron J. B., Sternberg R. J., editors. Teaching Thinking Skills: Theory and Practice. W.H. Freeman; New York: 1987. pp. 9–26. [ Google Scholar ]
  • Ennis Robert H. Critical Thinking. Prentice-Hall; Upper Saddle River: 1996. [ Google Scholar ]
  • Ennis Robert H. Is critical thinking culturally biased? Teaching Philosophy. 1998; 21 :15–33. doi: 10.5840/teachphil19982113. [ CrossRef ] [ Google Scholar ]
  • Ennis Robert. H. Critical thinking across the curriculum: A vision. Topoi. 2018; 37 :165–84. doi: 10.1007/s11245-016-9401-4. [ CrossRef ] [ Google Scholar ]
  • Facione Noreen C., Facione Peter A. Analyzing explanations for seemingly irrational choices: Linking argument analysis and cognitive science. International Journal of Applied Philosophy. 2001; 15 :267–68. [ Google Scholar ]
  • Facione Peter A. The Delphi Report: Committee on Pre-College Philosophy. California Academic Press; Millbrae: 1990. [ Google Scholar ]
  • Facione Peter A., Facione Noreen C. CCTDI: A Disposition Inventory. California Academic Press; Millbrae: 1992. [ Google Scholar ]
  • Facione Peter A., Facione Noreen C., Blohm Stephen W., Giancarlo Carol Ann F. The California Critical Thinking Skills Test: CCTST. California Academic Press; San Jose: 2002. [ Google Scholar ]
  • Feyerherm Ann E., Rice Cheryl L. Emotional intelligence and team performance: The good, the bad and the ugly. International Journal of Organizational Analysis. 2002; 10 :343–63. doi: 10.1108/eb028957. [ CrossRef ] [ Google Scholar ]
  • Flavell John H. Metacognitive aspects of problem solving. The Nature of Intelligence. 1976:231–36. [ Google Scholar ]
  • Gabennesch Howard. Critical thinking… what is it good for? (In fact, what is it?) Skeptical Inquirer. 2006; 30 :36–41. [ Google Scholar ]
  • Gadzella Bernadette M. Teaching and Learning Critical Thinking Skills. 1996.
  • Gambrill Eileen. Evidence-based practice and policy: Choices ahead. Research on Social Work Practice. 2006; 16 :338–57. [ Google Scholar ]
  • Ghanizadeh Afsaneh, Moafian Fatemeh. Critical thinking and emotional intelligence: Investigating the relationship among EFL learners and the contribution of age and gender. Iranian Journal of Applied Linguistics. 2011; 14 :23–48. [ Google Scholar ]
  • Gilovich Thomas, Griffin Dale, Kahneman Daniel., editors. Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge University Press; Cambridge: 2002. [ Google Scholar ]
  • Glaser Edward. M. An Experiment in the Development of Critical Thinking. Teachers College of Columbia University, Bureau of Publications; New York: 1941. [ Google Scholar ]
  • Goleman Daniel. Emotional Intelligence. Bantam; New York: 1995. [ Google Scholar ]
  • Halpern Diane F. Thought & Knowledge: An Introduction to Critical Thinking. 5th ed. Psychology Press; London: 2014. [ Google Scholar ]
  • Hamm Robert M. Clinical intuition and clinical analysis: Expertise and the cognitive continuum. In: Dowie J., Elstein A. S., editors. Professional Judgment: A Reader in Clinical Decision Making. Cambridge University Press; Cambridge: 1988. pp. 78–105. [ Google Scholar ]
  • Hammond Kenneth R. Principles of Organization in Intuitive and Analytical Cognition. Center for Research on Judgment and Policy, University of Colorado; Boulder: 1981. Report No. 231. [ Google Scholar ]
  • Hammond Kenneth R. Upon reflection. Thinking and Reasoning. 1996; 2 :239–48. doi: 10.1080/135467896394537. [ CrossRef ] [ Google Scholar ]
  • Hammond Kenneth R. Judgments Under Stress. Oxford University Press on Demand; New York: 2000. [ Google Scholar ]
  • Haynes Ada, Lisic Elizabeth, Goltz Michele, Stein Barry, Harris Kevin. Moving beyond assessment to improving students’ critical thinking skills: A model for implementing change. Journal of the Scholarship of Teaching and Learning. 2016; 16 :44–61. doi: 10.14434/josotl.v16i4.19407. [ CrossRef ] [ Google Scholar ]
  • Hitchcock David. The effectiveness of computer-assisted instruction in critical thinking. Informal Logic. 2004; 24 :183–218. doi: 10.22329/il.v24i3.2145. [ CrossRef ] [ Google Scholar ]
  • Huffman Karen, Vernoy Mark W., William Barbara F. Studying Psychology in Action: A Study Guide to Accompany Psychology in Action. Wiley; Hoboken: 1991. [ Google Scholar ]
  • Iordan Alexandru D., Dolcos Sanda, Dolcos Florin. Neural signatures of the response to emotional distraction: A review of evidence from brain imaging investigations. Frontiers in Human Neuroscience. 2013; 7 :200. doi: 10.3389/fnhum.2013.00200. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Johnson Marcia K., Raye Carol L., Mitchell Karen J., Greene Erich J., Cunningham William A., Sanislow Charles A. Using fMRI to investigate a component process of reflection: Prefrontal correlates of refreshing a just-activated representation. Cognitive, Affective, & Behavioral Neuroscience. 2005; 5 :339–61. doi: 10.3758/CABN.5.3.339. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Jukes I., McCain T. Minds in Play: Computer Game Design as a Context of Children’s Learning. Erlbaum; Hillsdale: 2002. [ Google Scholar ]
  • Kahneman Daniel. Thinking Fast and Slow. Penguin; Great Britain: 2011. [ Google Scholar ]
  • Kahneman Daniel, Frederick Shane. Representativeness revisited: Attribute substitution in intuitive judgment. In: Gilovich T., Griffin D., Kahneman D., editors. Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge University Press; New York: 2002. pp. 49–81. [ Google Scholar ]
  • Kahneman Daniel, Slovic Paul, Tversky Amos., editors. Judgment under Uncertainty: Heuristics and Biases. Cambridge University Press; Cambridge: 1982. [ Google Scholar ]
  • Kaya Hülya, Şenyuva Emine, Bodur Gönül. Developing critical thinking disposition and emotional intelligence of nursing students: A longitudinal research. Nurse Education Today. 2017; 48 :72–77. doi: 10.1016/j.nedt.2016.09.011. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • King Kathleen P. Adult Education Special Topics: Theory, Research, and Practice in Lifelong Learning. Information Age Publishing; Charlotte: 2009. The Handbook of the Evolving Research of Transformative Learning Based on the Learning Activities Survey. [ Google Scholar ]
  • King Patricia M., Kitchener Karen S. Reflective judgment: Theory and research on the development of epistemic assumptions through adulthood. Educational Psychologist. 2004; 39 :5–18. doi: 10.1207/s15326985ep3901_2. [ CrossRef ] [ Google Scholar ]
  • King Patricia M., Wood Phillip K., Mines Robert A. Critical thinking among college and graduate students. The Review of Higher Education. 1990; 13 :167–86. doi: 10.1353/rhe.1990.0026. [ CrossRef ] [ Google Scholar ]
  • King Patricia. M., Kitchener Karen. Developing Reflective Judgment: Understanding and Promoting Intellectual Growth and Critical Thinking in Adolescents and Adults. Jossey Bass; San Francisco: 1994. [ Google Scholar ]
  • Kruger Justin, Dunning David. Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-Assessments. Journal of Personality and Social Psychology. 1999; 77 :1121–34. doi: 10.1037/0022-3514.77.6.1121. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Ku Kelly Y. L. Assessing students’ critical thinking performance: Urging for measurements using multi-response format. Thinking Skills and Creativity. 2009; 4 :70–76. doi: 10.1016/j.tsc.2009.02.001. [ CrossRef ] [ Google Scholar ]
  • Ku Kelly Y. L., Ho Irene T. Dispositional factors predicting Chinese students’ critical thinking performance. Personality and Individual Differences. 2010a; 48 :54–58. doi: 10.1016/j.paid.2009.08.015. [ CrossRef ] [ Google Scholar ]
  • Ku Kelly Y. L., Ho Irene T. Metacognitive strategies that enhance critical thinking. Metacognition and Learning. 2010b; 5 :251–67. doi: 10.1007/s11409-010-9060-6. [ CrossRef ] [ Google Scholar ]
  • Kuhn Deanna. A developmental model of critical thinking. Educational Researcher. 1999; 28 :16–25. doi: 10.3102/0013189X028002016. [ CrossRef ] [ Google Scholar ]
  • Kuhn Deanna. Metacognitive development. Current Directions in Psychological Science. 2000; 9 :178–81. doi: 10.1111/1467-8721.00088. [ CrossRef ] [ Google Scholar ]
  • Leventhal Howard. A perceptual-motor theory of emotion. Advances in Experimental Social Psychology. 1984; 17 :117–82. [ Google Scholar ]
  • Lloyd Margaret, Bahr Nan. Thinking critically about critical thinking in higher education. International Journal for the Scholarship of Teaching and Learning. 2010; 4 :1–16. [ Google Scholar ]
  • Loftus Elizabeth. F. Eavesdropping on memory. Annual Review of Psychology. 2017; 68 :1–18. doi: 10.1146/annurev-psych-010416-044138. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Ma Lihong, Luo Haifeng. Chinese pre-service teachers’ cognitions about cultivating critical thinking in teaching English as a foreign language. Asia Pacific Journal of Education. 2021; 41 :543–57. doi: 10.1080/02188791.2020.1793733. [ CrossRef ] [ Google Scholar ]
  • Ma Lihong, Liu Ning. Teacher belief about integrating critical thinking in English teaching in China. Journal of Education for Teaching. 2022; 49 :137–52. doi: 10.1080/02607476.2022.2044267. [ CrossRef ] [ Google Scholar ]
  • Mahmood Khalid. Do people overestimate their information literacy skills? A systematic review of empirical evidence on the Dunning-Kruger effect. Communications in Information Literacy. 2016; 10 :199–213. doi: 10.15760/comminfolit.2016.10.2.24. [ CrossRef ] [ Google Scholar ]
  • Mangena Agnes, Chabeli Mary M. Strategies to overcome obstacles in the facilitation of critical thinking in nursing education. Nurse Education Today. 2005; 25 :291–98. doi: 10.1016/j.nedt.2005.01.012. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • McGuinness Carol. Teaching thinking: Learning how to think; Presented at the Psychological Society of Ireland and British Psychological Association’s Public Lecture Series; Galway, Ireland. March 6; 2013. [ Google Scholar ]
  • Mezirow Jack. Perspective Transformation. Adult Education. 1978; 28 :100–10. doi: 10.1177/074171367802800202. [ CrossRef ] [ Google Scholar ]
  • Mezirow Jack. How Critical Reflection Triggers Transformative Learning. In: Mezirow J., editor. Fostering Critical Reflection in Adulthood. Jossey Bass; San Francisco: 1990. pp. 1–20. [ Google Scholar ]
  • Most Steven B., Chun Marvin M., Widders David M., Zald David H. Attentional rubbernecking: Cognitive control and personality in emotion-induced blindness. Psychonomic Bulletin and Review. 2005; 12 :654–61. doi: 10.3758/BF03196754. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Niu Lian, Behar-Horenstein Linda S., Garvan Cyndi W. Do instructional interventions influence college students’ critical thinking skills? A meta-analysis. Educational Research Review. 2013; 9 :114–28. doi: 10.1016/j.edurev.2012.12.002. [ CrossRef ] [ Google Scholar ]
  • Norris Stephen P. Critical Thinking: Current Research, Theory, and Practice. Kluwer; Dordrecht: 1994. The meaning of critical thinking test performance: The effects of abilities and dispositions on scores; pp. 315–29. [ Google Scholar ]
  • Nyhan Brendan, Reifler Jason, Richey Sean, Freed Gary L. Effective messages in vaccine promotion: A randomized trial. Pediatrics. 2014; 133 :E835–E842. doi: 10.1542/peds.2013-2365. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Ortiz Claudia Maria Alvarez. Master’s thesis. University of Melbourne; Melbourne, VIC, Australia: 2007. Does Philosophy Improve Critical Thinking Skills? [ Google Scholar ]
  • Paul Richard W. Critical Thinking: What Every Person Needs to Survive in a Rapidly Changing World. Foundation for Critical Thinking; Santa Barbara: 1993. [ Google Scholar ]
  • Paul Richard, Elder Linda. Critical Thinking. The Foundation for Critical Thinking; Dillon Beach: 2008. [ Google Scholar ]
  • Perkins David N., Jay Eileen, Tishman Shari. Beyond abilities: A dispositional theory of thinking. Merrill Palmer Quarterly. 1993; 39 :1. [ Google Scholar ]
  • Perkins David, Ritchhart Ron. Motivation, Emotion, and Cognition. Routledge; London: 2004. When is good thinking? pp. 365–98. [ Google Scholar ]
  • Popper Karl R. The Logic of Scientific Discovery. Routledge; London: 1959. First published 1934. [ Google Scholar ]
  • Popper Karl R. All Life Is Problem Solving. Psychology Press; London: 1999. [ Google Scholar ]
  • Quinn Sarah, Hogan Michael, Dwyer Christopher, Finn Patrick, Fogarty Emer. Development and Validation of the Student-Educator Negotiated Critical Thinking Dispositions Scale (SENCTDS) Thinking Skills and Creativity. 2020; 38 :100710. doi: 10.1016/j.tsc.2020.100710. [ CrossRef ] [ Google Scholar ]
  • Rear David. One size fits all? The limitations of standardised assessment in critical thinking. Assessment & Evaluation in Higher Education. 2019; 44 :664–75. [ Google Scholar ]
  • Reed Jennifer H., Kromrey Jeffrey D. Teaching critical thinking in a community college history course: Empirical evidence from infusing Paul’s model. College Student Journal. 2001; 35 :201–15. [ Google Scholar ]
  • Rimiene Vaiva. Assessing and developing students’ critical thinking. Psychology Learning & Teaching. 2002; 2 :17–22. [ Google Scholar ]
  • Rowe Matthew P., Gillespie B. Marcus, Harris Kevin R., Koether Steven D., Shannon Li-Jen Y., Rose Lori A. Redesigning a general education science course to promote critical thinking. CBE—Life Sciences Education. 2015; 14 :ar30. doi: 10.1187/cbe.15-02-0032. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Saleh Salamah Embark. Critical thinking as a 21st century skill: Conceptions, implementation and challenges in the EFL classroom. European Journal of Foreign Language Teaching. 2019; 4 :1. doi: 10.5281/zenodo.2542838. [ CrossRef ] [ Google Scholar ]
  • Salovey Peter, Mayer John D. Emotional intelligence. Imagination, Cognition and Personality. 1990; 9 :185–211. doi: 10.2190/DUGG-P24E-52WK-6CDG. [ CrossRef ] [ Google Scholar ]
  • Schutte Nicola S., Malouff John M., Hall Lena E., Haggerty Donald J., Cooper Joan T., Golden Charles J., Dornheim Liane. Development and validation of a measure of emotional intelligence. Personality and Individual Differences. 1998; 25 :167–77. doi: 10.1016/S0191-8869(98)00001-4. [ CrossRef ] [ Google Scholar ]
  • Shackman Alexander J., Sarinopoulos Issidoros, Maxwell Jeffrey S., Pizzagalli Diego A., Lavric Aureliu, Davidson Richard J. Anxiety selectively disrupts visuospatial working memory. Emotion. 2006; 6 :40–61. doi: 10.1037/1528-3542.6.1.40. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Siegel Harvey. What (good) are thinking dispositions? Educational Theory. 1999; 49 :207–21. doi: 10.1111/j.1741-5446.1999.00207.x. [ CrossRef ] [ Google Scholar ]
  • Simon Herbert A. Models of Man. Wiley; New York: 1957. [ Google Scholar ]
  • Slovic Paul, Fischhoff Baruch, Lichtenstein Sarah. Behavioral decision theory. Annual Review of Psychology. 1977; 28 :1–39. doi: 10.1146/annurev.ps.28.020177.000245. [ CrossRef ] [ Google Scholar ]
  • Slovic Paul, Finucane Melissa, Peters Ellen, MacGregor Donald G. Rational actors or rational fools: Implications of the affect heuristic for behavioral economics. The Journal of SocioEconomics. 2002; 31 :329–42. doi: 10.1016/S1053-5357(02)00174-9. [ CrossRef ] [ Google Scholar ]
  • Solon Tom. Generic critical thinking infusion and course content learning in Introductory Psychology. Journal of Instructional Psychology. 2007; 34 :95–109. [ Google Scholar ]
  • Stanovich Keith E. Miserliness in human cognition: The interaction of detection, override and mindware. Thinking & Reasoning. 2018; 24 :423–44. [ Google Scholar ]
  • Stanovich Keith E., Stanovich Paula J. A framework for critical thinking, rational thinking, and intelligence. In: Preiss D. D., Sternberg R. J., editors. Innovations in Educational Psychology: Perspectives on Learning, Teaching, and Human Development. Springer Publishing Company; Berlin/Heidelberg: 2010. pp. 195–237. [ Google Scholar ]
  • Stanovich Keith E., West Richard F. Individual differences in reasoning: Implications for the rationality debate? Behavioral and Brain Sciences. 2000; 23 :645–65. doi: 10.1017/S0140525X00003435. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Stanovich Keith E., West Richard F., Toplak Maggie E. The Rationality Quotient: Toward a Test of Rational Thinking. MIT Press; Cambridge: 2016. [ Google Scholar ]
  • Stedman Nicole LP, Andenoro Anthony C. Identification of relationships between emotional intelligence skill and critical thinking disposition in undergraduate leadership students. Journal of Leadership Education. 2007; 6 :190–208. doi: 10.12806/V6/I1/RF10. [ CrossRef ] [ Google Scholar ]
  • Strack Fritz, Martin Leonard L., Schwarz Norbert. Priming and communication: Social determinants of information use in judgments of life satisfaction. European Journal of Social Psychology. 1988; 18 :429–42. doi: 10.1002/ejsp.2420180505. [ CrossRef ] [ Google Scholar ]
  • Swami Viren, Furnham Adrian. Political paranoia and conspiracy theories. In: van Prooijen J. W., van Lange P. A. M., editors. Power, Politics, and Paranoia: Why People Are Suspicious of Their Leaders. Cambridge University Press; Cambridge: 2014. pp. 218–36. [ Google Scholar ]
  • Sweller John. Cognitive load theory: Recent theoretical advances. In: Plass J. L., Moreno R., Brünken R., editors. Cognitive Load Theory. Cambridge University Press; New York: 2010. pp. 29–47. [ Google Scholar ]
  • Tavris Carol, Aronson Elliot. Mistakes Were Made (But Not by Me) Harcourt; Orlando: 2007. [ Google Scholar ]
  • Teichert Tobias, Ferrera Vincent P., Grinband Jack. Humans optimize decision-making by delaying decision onset. PLoS ONE. 2014; 9 :e89638. doi: 10.1371/journal.pone.0089638. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Tversky Amos, Kahneman Daniel. Judgment under uncertainty: Heuristics and biases. Science. 1974; 185 :1124–31. doi: 10.1126/science.185.4157.1124. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Valenzuela Jorge, Nieto Ana, Saiz Carlos. Critical Thinking Motivational Scale: A contribution to the study of relationship between critical thinking and motivation. Journal of Research in Educational Psychology. 2011; 9 :823–48. doi: 10.25115/ejrep.v9i24.1475. [ CrossRef ] [ Google Scholar ]
  • Varian Hal, Lyman Peter. How Much Information? School of Information Management and Systems, UC Berkeley; Berkeley: 2003. [ Google Scholar ]
  • Vohs Kathleen D., Baumeister Roy F., Schmeichel Brandon J., Twenge Jean M., Nelson Noelle M., Tice Dianne M. Making choices impairs subsequent self-control: A limited-resource account of decision making, self-regulation, and active initiative. Personality Processes and Individual Differences. 2014; 94 :883–98. doi: 10.1037/2333-8113.1.S.19. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Yao Xiaonan, Yuan Shuge, Yang Wenjing, Chen Qunlin, Wei Dongtao, Hou Yuling, Zhang Lijie, Qiu Jiang, Yang Dong. Emotional intelligence moderates the relationship between regional gray matter volume in the bilateral temporal pole and critical thinking disposition. Brain Imaging and Behavior. 2018; 12 :488–98. doi: 10.1007/s11682-017-9701-3. [ PubMed ] [ CrossRef ] [ Google Scholar ]

Christopher Dwyer Ph.D.

5 Barriers to Critical Thinking

What holds us back from thinking critically in day-to-day situations.

Posted January 18, 2019 | Reviewed by Davia Sills


Quite often, discussions of Critical Thinking (CT) revolve around tips for what you or your students should be doing to enhance CT ability. However, it seems that there’s substantially less discussion of what you shouldn’t be doing—that is, barriers to CT.

About a year ago, I posted "5 Tips for Critical Thinking" to this blog, and after thinking about it in terms of what not to do, along with more modern conceptualizations of CT (see Dwyer, 2017), I’ve compiled a list of five major barriers to CT. Of course, these are not the only barriers to CT; rather, they are five that may have the most impact on how one applies CT.

1. Trusting Your Gut

"Trust your gut" is a piece of advice often thrown around when you're in doubt. However, relying on intuitive judgment is actually the last thing you want to be doing if critical thinking is your goal. Intuitive judgment has been described as "the absence of analysis" (Hamm, 1988); and automatic cognitive processing—which generally lacks effort, intention, awareness, or voluntary control—is usually experienced as perceptions or feelings (Kahneman, 2011; Lieberman, 2003).

Given that intuitive judgment operates automatically and cannot be voluntarily "turned off," associated errors and unsupported biases are difficult to prevent, largely because reflective judgment has not been consulted. Even when errors appear obvious in hindsight, they can only be prevented through the careful, self-regulated monitoring and control afforded by reflective judgment. Such errors and flawed reasoning include cognitive biases and logical fallacies.

Going with your gut—experienced as perceptions or feelings—generally leads the thinker to favor perspectives consistent with their own personal biases and experiences or those of their group.

2. Lack of Knowledge

CT skills are key components of what CT is, and in order to conduct it, one must know how to use these skills. Not knowing the skills of CT—analysis, evaluation, and inference (i.e., what they are or how to use them)—is, of course, a major barrier to its application. However, consideration of a lack of knowledge does not end with the knowledge of CT skills.

Let’s say you know what analysis, evaluation, and inference are, as well as how to apply them. The question then becomes: Are you knowledgeable in the topic area to which you have been asked to apply CT? If not, intellectual honesty and reflective judgment should be engaged to allow you to consider the nature, limits, and certainty of the knowledge you do have, so that you can evaluate what is required of you to gain the knowledge necessary to make a critically thought-out judgment.

However, the barrier here may not necessarily be a lack of topic knowledge, but perhaps rather believing that you have the requisite knowledge to make a critically thought-out judgment when this is not the case or lacking the willingness to gain additional, relevant topic knowledge.

3. Lack of Willingness

In addition to skills, disposition towards thinking is also key to CT. Disposition towards thinking refers to the extent to which an individual is willing or inclined to perform a given thinking skill, and is essential for understanding how we think and how we can make our thinking better, in both academic settings and everyday circumstances (Norris, 1992; Siegel, 1999; Valenzuela, Nieto, & Saiz, 2011; Dwyer, Hogan & Stewart, 2014).

Dispositions can’t be taught, per se, but they do play a large role in determining whether or not CT will be performed. Simply put, it doesn’t matter how skilled one is at analysis, evaluation, and inference—if they’re not willing to think critically, CT is not likely to occur.

4. Misunderstanding of Truth

Truth-seeking is one such disposition towards thinking, which refers to a desire for knowledge; to seek and offer both reasons and objections in an effort to inform and to be well-informed; a willingness to challenge popular beliefs and social norms by asking questions (of oneself and others); to be honest and objective about pursuing the truth, even if the findings do not support one’s self-interest or pre-conceived beliefs or opinions; and to change one’s mind about an idea as a result of the desire for truth (Dwyer, 2017).


Though this is something many of us strive for, or even just assume we do, the truth is that we all succumb to unwarranted assumptions from time to time—that is, beliefs presumed to be true without adequate justification. For example, we might make a judgment based on an unsubstantiated stereotype or a commonsense/belief statement that has no empirical evidence to justify it. When using CT, it’s important to distinguish facts from beliefs and, also, to dig a little deeper by evaluating "facts" with respect to how much empirical support they have to validate them as fact (see "The Dirtiest Word in Critical Thinking: 'Proof' and its Burden").

Furthermore, sometimes the truth doesn’t suit people, and so they might choose to ignore it or try to manipulate knowledge or understanding to accommodate their bias. For example, some people may engage in wishful thinking, in which they believe something is true because they wish it to be; others might engage in relativistic thinking, in which, for them, the truth is subjective or just a matter of opinion.

5. Closed-mindedness

In one of my previous posts, I laid out "5 Tips for Critical Thinking"—one of which is to play Devil’s Advocate, which refers to the "consideration of alternatives." There’s always more than one way to do or think about something—why not engage in such consideration?

The willingness to play Devil’s Advocate implies a sensibility consistent with open-mindedness (i.e., an inclination to be cognitively flexible and avoid rigidity in thinking; to tolerate divergent or conflicting views and treat all viewpoints alike, prior to subsequent analysis and evaluation; to detach from one’s own beliefs and consider, seriously, points of view other than one’s own without bias or self-interest; to be open to feedback by accepting positive feedback, and to not reject criticism or constructive feedback without thoughtful consideration; to amend existing knowledge in light of new ideas and experiences; and to explore such new, alternative, or "unusual" ideas).

At the opposite end of the spectrum, closed-mindedness is a significant barrier to CT. By this stage, you have probably identified the inherent nature of bias in our thinking. The first step of CT is always going to be to evaluate this bias. However, one’s bias may be so strong that it leads them to become closed-minded and renders them unwilling to consider any other perspectives.

Another way in which someone might be closed-minded is by having properly researched and critically thought about a topic and then deciding that this perspective will never change, as if their knowledge will never need to adapt. However, critical thinkers know that knowledge can change and adapt. An example I’ve used in the past is quite relevant here: growing up, I was taught that there were nine planets in our solar system; however, based on further research, that knowledge has since been amended, and only eight of those are now classified as planets.

Being open-minded is a valuable disposition, but so is skepticism (i.e., the inclination to challenge ideas; to withhold judgment before engaging all the evidence or when the evidence and reasons are insufficient; to take a position and be able to change position when the evidence and reasons are sufficient; and to look at findings from various perspectives).

However, one can be both open-minded and skeptical. It is closed-mindedness that is the barrier to CT, so please note that closed-mindedness and skepticism are distinct dispositions.

Dwyer, C.P. (2017). Critical thinking: Conceptual perspectives and practical guidelines. UK: Cambridge University Press.

Dwyer, C.P., Hogan, M.J. & Stewart, I. (2014). An integrated critical thinking framework for the 21st century. Thinking Skills & Creativity, 12, 43-52.

Hamm, R. M. (1988). Clinical intuition and clinical analysis: expertise and the cognitive continuum. In J. Dowie & A. Elstein (Eds.), Professional judgment: A reader in clinical decision making, 78–105. Cambridge: Cambridge University Press.

Kahneman, D. (2011). Thinking fast and slow. Penguin: Great Britain.

Lieberman, M. D. (2003). Reflexive and reflective judgment processes: A social cognitive neuroscience approach. Social Judgments: Implicit and Explicit Processes, 5, 44–67.

Norris, S. P. (Ed.). (1992). The generalizability of critical thinking: Multiple perspectives on an educational ideal. New York: Teachers College Press.

Siegel, H. (1999). What (good) are thinking dispositions? Educational Theory, 49, 2, 207–221.

Valenzuela, J., Nieto, A. M., & Saiz, C. (2011). Critical thinking motivational scale: A contribution to the study of relationship between critical thinking and motivation. Journal of Research in Educational Psychology, 9, 2, 823–848.

Christopher Dwyer Ph.D.

Christopher Dwyer, Ph.D., is a lecturer at the Technological University of the Shannon in Athlone, Ireland.

  • Dwyer, Christopher P., and Anne Walsh. 2019. A case study of the effects of critical thinking instruction through adult distance learning on critical thinking performance: Implications for critical thinking development. Educational Technology and Research 68: 17–35. [ Google Scholar ] [ CrossRef ]
  • Dwyer, Christopher P., and John D. Eigenauer. 2017. To Teach or not to Teach Critical Thinking: A Reply to Huber and Kuncel. Thinking Skills and Creativity 26: 92–95. [ Google Scholar ] [ CrossRef ]
  • Dwyer, Christopher P., Michael J. Hogan, and Ian Stewart. 2012. An evaluation of argument mapping as a method of enhancing critical thinking performance in e-learning environments. Metacognition and Learning 7: 219–44. [ Google Scholar ] [ CrossRef ]
  • Dwyer, Christopher P., Michael J. Hogan, and Ian Stewart. 2014. An integrated critical thinking framework for the 21st century. Thinking Skills and Creativity 12: 43–52. [ Google Scholar ] [ CrossRef ]
  • Dwyer, Christopher P., Michael J. Hogan, and Ian Stewart. 2015. The evaluation of argument mapping-infused critical thinking instruction as a method of enhancing reflective judgment performance. Thinking Skills & Creativity 16: 11–26. [ Google Scholar ]
  • Dwyer, Christopher. P., Michael J. Hogan, Owen M. Harney, and Caroline Kavanagh. 2016. Facilitating a Student-Educator Conceptual Model of Dispositions towards Critical Thinking through Interactive Management. Educational Technology & Research 65: 47–73. [ Google Scholar ]
  • Eigenauer, John D. 2017. Don’t reinvent the critical thinking wheel: What scholarly literature says about critical thinking instruction. NISOD Innovation Abstracts 39: 2. [ Google Scholar ]
  • Elder, Linda. 1997. Critical thinking: The key to emotional intelligence. Journal of Developmental Education 21: 40. [ Google Scholar ] [ CrossRef ]
  • Ennis, Robert H. 1987. A taxonomoy of critical thinking dispositions and abilities. In Teaching Thinking Skills: Theory and Practice . Edited by J. B. Baron and R. J. Sternberg. New York: W.H. Freeman, pp. 9–26. [ Google Scholar ]
  • Ennis, Robert H. 1996. Critical Thinking . Upper Saddle River: Prentice-Hall. [ Google Scholar ]
  • Ennis, Robert H. 1998. Is critical thinking culturally biased? Teaching Philosophy 21: 15–33. [ Google Scholar ] [ CrossRef ]
  • Ennis, Robert. H. 2018. Critical thinking across the curriculum: A vision. Topoi 37: 165–84. [ Google Scholar ] [ CrossRef ]
  • Facione, Noreen C., and Peter A. Facione. 2001. Analyzing explanations for seemingly irrational choices: Linking argument analysis and cognitive science. International Journal of Applied Philosophy 15: 267–68. [ Google Scholar ]
  • Facione, Peter A. 1990. The Delphi Report: Committee on Pre-College Philosophy . Millbrae: California Academic Press. [ Google Scholar ]
  • Facione, Peter A., and Noreen C. Facione. 1992. CCTDI: A Disposition Inventory . Millbrae: California Academic Press. [ Google Scholar ]
  • Facione, Peter A., Noreen C. Facione, Stephen W. Blohm, and Carol Ann F. Giancarlo. 2002. The California Critical Thinking Skills Test: CCTST . San Jose: California Academic Press. [ Google Scholar ]
  • Feyerherm, Ann E., and Cheryl L. Rice. 2002. Emotional intelligence and team performance: The good, the bad and the ugly. International Journal of Organizational Analysis 10: 343–63. [ Google Scholar ] [ CrossRef ]
  • Flavell, John H. 1976. Metacognitive aspects of problem solving. The Nature of Intelligence , 231–36. [ Google Scholar ]
  • Gabennesch, Howard. 2006. Critical thinking… what is it good for? (In fact, what is it?). Skeptical Inquirer 30: 36–41. [ Google Scholar ]
  • Gadzella, Bernadette M. 1996. Teaching and Learning Critical Thinking Skills .
  • Gambrill, Eileen. 2006. Evidence-based practice and policy: Choices ahead. Research on Social Work Practice 16: 338–57. [ Google Scholar ]
  • Ghanizadeh, Afsaneh, and Fatemeh Moafian. 2011. Critical thinking and emotional intelligence: Investigating the relationship among EFL learners and the contribution of age and gender. Iranian Journal of Applied Linguistics 14: 23–48. [ Google Scholar ]
  • Gilovich, Thomas, Dale Griffin, and Daniel Kahneman, eds. 2002. Heuristics and Biases: The Psychology of Intuitive Judgment . Cambridge: Cambridge University Press. [ Google Scholar ]
  • Glaser, Edward. M. 1941. An Experiment in the Development of Critical Thinking . New York: Teachers College of Columbia University, Bureau of Publications. [ Google Scholar ]
  • Goleman, Daniel. 1995. Emotional Intelligence . New York: Bantam. [ Google Scholar ]
  • Halpern, Diane F. 2014. Thought & Knowledge: An Introduction to Critical Thinking , 5th ed. London: Psychology Press. [ Google Scholar ]
  • Hamm, Robert M. 1988. Clinical intuition and clinical analysis: Expertise and the cognitive continuum. In Professional Judgment: A Reader in Clinical Decision Making . Edited by J. Dowie and A. S. Elstein. Cambridge: Cambridge University Press, pp. 78–105. [ Google Scholar ]
  • Hammond, Kenneth R. 1981. Principles of Organization in Intuitive and Analytical Cognition . Report No. 231. Boulder: Center for Research on Judgment and Policy, University of Colorado. [ Google Scholar ]
  • Hammond, Kenneth R. 1996. Upon reflection. Thinking and Reasoning 2: 239–48. [ Google Scholar ] [ CrossRef ]
  • Hammond, Kenneth R. 2000. Judgments Under Stress . New York: Oxford University Press on Demand. [ Google Scholar ]
  • Haynes, Ada, Elizabeth Lisic, Michele Goltz, Barry Stein, and Kevin Harris. 2016. Moving beyond assessment to improving students’ critical thinking skills: A model for implementing change. Journal of the Scholarship of Teaching and Learning 16: 44–61. [ Google Scholar ] [ CrossRef ]
  • Hitchcock, David. 2004. The effectiveness of computer-assisted instruction in critical thinking. Informal Logic 24: 183–218. [ Google Scholar ] [ CrossRef ]
  • Huffman, Karen, Mark W. Vernoy, and Barbara F. William. 1991. Studying Psychology in Action: A Study Guide to Accompany Psychology in Action . Hoboken: Wiley. [ Google Scholar ]
  • Iordan, Alexandru D., Sanda Dolcos, and Florin Dolcos. 2013. Neural signatures of the response to emotional distraction: A review of evidence from brain imaging investigations. Frontiers in Human Neuroscience 7: 200. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Johnson, Marcia K., Carol L. Raye, Karen J. Mitchell, Erich J. Greene, William A. Cunningham, and Charles A. Sanislow. 2005. Using fMRI to investigate a component process of reflection: Prefrontal correlates of refreshing a just-activated representation. Cognitive, Affective, & Behavioral Neuroscience 5: 339–61. [ Google Scholar ] [ CrossRef ]
  • Jukes, I., and T. McCain. 2002. Minds in Play: Computer Game Design as a Context of Children’s Learning . Hillsdale: Erlbaum. [ Google Scholar ]
  • Kahneman, Daniel. 2011. Thinking Fast and Slow . Great Britain: Penguin. [ Google Scholar ]
  • Kahneman, Daniel, and Shane Frederick. 2002. Representativeness revisited: Attribute substitution in 240 intuitive judgment. In Heuristics and Biases: The Psychology of Intuitive Judgment . Edited by T. Gilovich, D. Griffin and D. Kahneman. New York: Cambridge University Press, pp. 49–81. [ Google Scholar ]
  • Kahneman, Daniel, Paul Slovic, and Amos Tversky, eds. 1982. Judgment under Uncertainty: Heuristics and Biases . Cambridge: Cambridge University Press. [ Google Scholar ]
  • Kaya, Hülya, Emine Şenyuva, and Gönül Bodur. 2017. Developing critical thinking disposition and emotional intelligence of nursing students: A longitudinal research. Nurse Education Today 48: 72–77. [ Google Scholar ] [ CrossRef ]
  • King, Kathleen P. 2009. The Handbook of the Evolving Research of Transformative Learning Based on the Learning Activities Survey. In Adult Education Special Topics: Theory, Research, and Practice in Lifelong Learning . Charlotte: Information Age Publishing. [ Google Scholar ]
  • King, Patricia M., and Karen S. Kitchener. 2004. Reflective judgment: Theory and research on the development of epistemic assumptions through adulthood. Educational Psychologist 39: 5–18. [ Google Scholar ] [ CrossRef ]
  • King, Patricia M., Phillip K. Wood, and Robert A. Mines. 1990. Critical thinking among college and graduate students. The Review of Higher Education 13: 167–86. [ Google Scholar ] [ CrossRef ]
  • King, Patricia. M., and Karen Kitchener. 1994. Developing Reflective Judgment: Understanding and Promoting Intellectual Growth and Critical Thinking in Adolescents and Adults . San Francisco: Jossey Bass. [ Google Scholar ]
  • Kruger, Justin, and David Dunning. 1999. Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-Assessments. Journal of Personality and Social Psychology 77: 1121–34. [ Google Scholar ] [ CrossRef ]
  • Ku, Kelly Y. L. 2009. Assessing students’ critical thinking performance: Urging for measurements using multi-response format. Thinking Skills and Creativity 4: 70–76. [ Google Scholar ] [ CrossRef ]
  • Ku, Kelly Y. L., and Irene T. Ho. 2010a. Dispositional factors predicting Chinese students’ critical thinking performance. Personality and Individual Differences 48: 54–58. [ Google Scholar ] [ CrossRef ]
  • Ku, Kelly Y. L., and Irene T. Ho. 2010b. Metacognitive strategies that enhance critical thinking. Metacognition and Learning 5: 251–67. [ Google Scholar ] [ CrossRef ]
  • Kuhn, Deanna. 1999. A developmental model of critical thinking. Educational Researcher 28: 16–25. [ Google Scholar ] [ CrossRef ]
  • Kuhn, Deanna. 2000. Metacognitive development. Current Directions in Psychological Science 9: 178–81. [ Google Scholar ] [ CrossRef ]
  • Leventhal, Howard. 1984. A perceptual-motor theory of emotion. Advances in Experimental Social Psychology 17: 117–82. [ Google Scholar ]
  • Lloyd, Margaret, and Nan Bahr. 2010. Thinking critically about critical thinking in higher education. International Journal for the Scholarship of Teaching and Learning 4: 1–16. [ Google Scholar ]
  • Loftus, Elizabeth. F. 2017. Eavesdropping on memory. Annual Review of Psychology 68: 1–18. [ Google Scholar ] [ CrossRef ]
  • Ma, Lihong, and Haifeng Luo. 2021. Chinese pre-service teachers’ cognitions about cultivating critical thinking in teaching English as a foreign language. Asia Pacific Journal of Education 41: 543–57. [ Google Scholar ] [ CrossRef ]
  • Ma, Lihong, and Ning Liu. 2022. Teacher belief about integrating critical thinking in English teaching in China. Journal of Education for Teaching 49: 137–52. [ Google Scholar ] [ CrossRef ]
  • Mahmood, Khalid. 2016. Do people overestimate their information literacy skills? A systematic review of empirical evidence on the Dunning-Kruger effect. Communications in Information Literacy 10: 199–213. [ Google Scholar ] [ CrossRef ]
  • Mangena, Agnes, and Mary M. Chabeli. 2005. Strategies to overcome obstacles in the facilitation of critical thinking in nursing education. Nurse Education Today 25: 291–98. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • McGuinness, Carol. 2013. Teaching thinking: Learning how to think. Presented at the Psychological Society of Ireland and British Psychological Association’sPublic Lecture Series, Galway, Ireland, March 6. [ Google Scholar ]
  • Mezirow, Jack. 1978. Perspective Transformation. Adult Education 28: 100–10. [ Google Scholar ] [ CrossRef ]
  • Mezirow, Jack. 1990. How Critical Reflection Triggers Transformative Learning. In Fostering Critical Reflection in Adulthood . Edited by J. Mezirow. San Francisco: Jossey Bass, pp. 1–20. [ Google Scholar ]
  • Most, Steven B., Marvin M. Chun, David M. Widders, and David H. Zald. 2005. Attentional rubbernecking: Cognitive control and personality in emotioninduced blindness. Psychonomic Bulletin and Review 12: 654–61. [ Google Scholar ] [ CrossRef ]
  • Niu, Lian, Linda S. Behar-Horenstein, and Cyndi W. Garvan. 2013. Do instructional interventions influence college students’ critical thinking skills? A meta-analysis. Educational Research Review 9: 114–28. [ Google Scholar ] [ CrossRef ]
  • Norris, Stephen P. 1994. The meaning of critical thinking test performance: The effects of abilities and dispositions on scores. In Critical Thinking: Current Research, Theory, and Practice . Dordrecht: Kluwer, pp. 315–29. [ Google Scholar ]
  • Nyhan, Brendan, Jason Reifler, Sean Richey, and Gary L. Freed. 2014. Effective messages in vaccine promotion: A randomized trial. Pediatrics 133: E835–E842. [ Google Scholar ] [ CrossRef ]
  • Ortiz, Claudia Maria Alvarez. 2007. Does Philosophy Improve Critical Thinking Skills? Master’s thesis, University of Melbourne, Melbourne, VIC, Australia. [ Google Scholar ]
  • Paul, Richard W. 1993. Critical Thinking: What Every Person Needs to Survive in a Rapidly Changing World . Santa Barbara: Foundation for Critical Thinking. [ Google Scholar ]
  • Paul, Richard, and Linda Elder. 2008. Critical . Thinking. Dillon Beach: The Foundation for Critical Thinking. [ Google Scholar ]
  • Perkins, David N., Eileen Jay, and Shari Tishman. 1993. Beyond abilities: A dispositional theory of thinking. Merrill Palmer Quarterly 39: 1. [ Google Scholar ]
  • Perkins, David, and Ron Ritchhart. 2004. When is good thinking? In Motivation, Emotion, and Cognition . London: Routledge, pp. 365–98. [ Google Scholar ]
  • Popper, Karl R. 1959. The Logic of Scientific Discovery . London: Routledge. First published 1934. [ Google Scholar ]
  • Popper, Karl R. 1999. All Life Is Problem Solving . London: Psychology Press. [ Google Scholar ]
  • Quinn, Sarah, Michael Hogan, Christopher Dwyer, Patrick Finn, and Emer Fogarty. 2020. Development and Validation of the Student-Educator Negotiated Critical Thinking Dispositions Scale (SENCTDS). Thinking Skills and Creativity 38: 100710. [ Google Scholar ] [ CrossRef ]
  • Rear, David. 2019. One size fits all? The limitations of standardised assessment in critical thinking. Assessment & Evaluation in Higher Education 44: 664–75. [ Google Scholar ]
  • Reed, Jennifer H., and Jeffrey D. Kromrey. 2001. Teaching critical thinking in a community college history course: Empirical evidence from infusing Paul’s model. College Student Journal 35: 201–15. [ Google Scholar ]
  • Rimiene, Vaiva. 2002. Assessing and developing students’ critical thinking. Psychology Learning & Teaching 2: 17–22. [ Google Scholar ]
  • Rowe, Matthew P., B. Marcus Gillespie, Kevin R. Harris, Steven D. Koether, Li-Jen Y. Shannon, and Lori A. Rose. 2015. Redesigning a general education science course to promote critical thinking. CBE—Life Sciences Education 14: ar30. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Saleh, Salamah Embark. 2019. Critical thinking as a 21st century skill: Conceptions, implementation and challenges in the EFL classroom. European Journal of Foreign Language Teaching 4: 1. [ Google Scholar ] [ CrossRef ]
  • Salovey, Peter, and John D. Mayer. 1990. Emotional intelligence. Imagination, Cognition and Personality 9: 185–211. [ Google Scholar ] [ CrossRef ]
  • Schutte, Nicola S., John M. Malouff, Lena E. Hall, Donald J. Haggerty, Joan T. Cooper, Charles J. Golden, and Liane Dornheim. 1998. Development and validation of a measure of emotional intelligence. Personality and Individual Differences 25: 167–77. [ Google Scholar ] [ CrossRef ]
  • Shackman, Alexander J., Issidoros Sarinopoulos, Jeffrey S. Maxwell, Diego A. Pizzagalli, Aureliu Lavric, and Richard J. Davidson. 2006. Anxiety selectively disrupts visuospatial working memory. Emotion 6: 40–61. [ Google Scholar ] [ CrossRef ]
  • Siegel, Harvey. 1999. What (good) are thinking dispositions? Educational Theory 49: 207–21. [ Google Scholar ] [ CrossRef ]
  • Simon, Herbert A. 1957. Models of Man . New York: Wiley. [ Google Scholar ]
  • Slovic, Paul, Baruch Fischhoff, and Sarah Lichtenstein. 1977. Behavioral decision theory. Annual Review of Psychology 28: 1–39. [ Google Scholar ] [ CrossRef ]
  • Slovic, Paul, Melissa Finucane, Ellen Peters, and Donald G. MacGregor. 2002. Rational actors or rational fools: Implications of the affect heuristic for behavioral economics. The Journal of SocioEconomics 31: 329–42. [ Google Scholar ] [ CrossRef ]
  • Solon, Tom. 2007. Generic critical thinking infusion and course content learning in Introductory Psychology. Journal of Instructional Psychology 34: 95–109. [ Google Scholar ]
  • Stanovich, Keith E. 2018. Miserliness in human cognition: The interaction of detection, override and mindware. Thinking & Reasoning 24: 423–44. [ Google Scholar ]
  • Stanovich, Keith E., and Paula J. Stanovich. 2010. A framework for critical thinking, rational thinking, and intelligence. In Innovations in Educational Psychology: Perspectives on Learning, Teaching, and Human Development . Edited by D. D. Preiss and R. J. Sternberg. Berlin/Heidelberg: Springer Publishing Company, pp. 195–237. [ Google Scholar ]
  • Stanovich, Keith E., and Richard F. West. 2000. Individual differences in reasoning: Implications for the rationality debate? Behavioral and Brain Sciences 23: 645–65. [ Google Scholar ] [ CrossRef ]
  • Stanovich, Keith E., Richard F. West, and Maggie E. Toplak. 2016. The Rationality Quotient: Toward a Test of Rational Thinking . Cambridge: MIT Press. [ Google Scholar ]
  • Stedman, Nicole LP, and Anthony C. Andenoro. 2007. Identification of relationships between emotional intelligence skill and critical thinking disposition in undergraduate leadership students. Journal of Leadership Education 6: 190–208. [ Google Scholar ] [ CrossRef ]
  • Strack, Fritz, Leonard L. Martin, and Norbert Schwarz. 1988. Priming and communication: Social determinants of information use in judgments of life satisfaction. European Journal of Social Psychology 18: 429–42. [ Google Scholar ] [ CrossRef ]
  • Swami, Viren, and Adrian Furnham. 2014. Political paranoia and conspiracy theories. In Power, Politics, and Paranoia: Why People Are Suspicious of Their Leaders . Edited by J. W. van Prooijen and P. A. M. van Lange. Cambridge: Cambridge University Press, pp. 218–36. [ Google Scholar ]
  • Sweller, John. 2010. Cognitive load theory: Recent theoretical advances. In Cognitive Load Theory . Edited by J. L. Plass, R. Moreno and R. Brünken. New York: Cambridge University Press, pp. 29–47. [ Google Scholar ]
  • Tavris, Carol, and Elliot Aronson. 2007. Mistakes Were Made (But Not by Me) . Orlando: Harcourt. [ Google Scholar ]
  • Teichert, Tobias, Vincent P. Ferrera, and Jack Grinband. 2014. Humans optimize decision-making by delaying decision onset. PLoS ONE 9: e89638. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Tversky, Amos, and Daniel Kahneman. 1974. Judgment under uncertainty: Heuristics and biases. Science 185: 1124–31. [ Google Scholar ] [ CrossRef ] [ PubMed ]
  • Valenzuela, Jorge, Ana Nieto, and Carlos Saiz. 2011. Critical Thinking Motivational Scale: A 253 contribution to the study of relationship between critical thinking and motivation. Journal of Research in Educational Psychology 9: 823–48. [ Google Scholar ] [ CrossRef ]
  • Varian, Hal, and Peter Lyman. 2003. How Much Information? Berkeley: School of Information Management and Systems, UC Berkeley. [ Google Scholar ]
  • Vohs, Kathleen D., Roy F. Baumeister, Brandon J. Schmeichel, Jean M. Twenge, Noelle M. Nelson, and Dianne M. Tice. 2014. Making choices impairs subsequent self-control: A limited-resource account of decision making, self-regulation, and active initiative. Personality Processes and Individual Differences 94: 883–98. [ Google Scholar ] [ CrossRef ]
  • Yao, Xiaonan, Shuge Yuan, Wenjing Yang, Qunlin Chen, Dongtao Wei, Yuling Hou, Lijie Zhang, Jiang Qiu, and Dong Yang. 2018. Emotional intelligence moderates the relationship between regional gray matter volume in the bilateral temporal pole and critical thinking disposition. Brain Imaging and Behavior 12: 488–98. [ Google Scholar ] [ CrossRef ]

Wishful Thinking: Why It Happens, How to Prevent It, and How to Return to Reality

Wishful thinking is expensive. It causes delays, rework, and even outright failure on every scale, from the individual project to the enterprise. In this program, we explore tools we can use to detect wishful thinking while we're doing it, and then describe several techniques for preventing it, noticing it, and, finally, repairing its consequences when it happens.

When things go wrong, and we look back at how we got there, we must sometimes admit to wishful thinking. In this program, we explore why wishful thinking is so common and suggest techniques for limiting its frequency and effects.

Wishful Thinking


Consider the IKEA effect: the tendency for people to overvalue objects that they partially assembled themselves, such as furniture from IKEA, independent of the quality of the end result. In organizations, this phenomenon might explain the "not invented here" syndrome, which is responsible for so much waste of resources and loss of market position.

In this program we provide practices and procedures that individual decision makers and groups of decision makers can use to detect and prevent wishful thinking. The work is based on Virginia Satir's tools for improving interpersonal communication, and on recent advances in our understanding of the psychological phenomena known as cognitive biases.

This program gives attendees the tools and concepts they need to detect and prevent wishful thinking, and once detected, to intervene constructively to limit its effects. It deals with issues such as:

  • How does wishful thinking affect our awareness of situations?
  • How does it influence our ability to remain attentive to tasks?
  • How does it affect the likelihood of seeing patterns that aren't actually present?
  • How does it contribute to the sunk cost effect, thereby interfering with attempts to cancel failed efforts?
  • What is the sunk time effect and how does it affect management?
  • How does wishful thinking prime us to make bad decisions?
  • How does wishful thinking affect negotiation?
  • Can it distort our assessment of the state of mind of superiors, subordinates, colleagues, or rivals? How?

This program is available as a keynote, seminar, or breakout. For the shorter formats, coverage of the outline below is selective.

Learning objectives

This program helps people who make decisions or who want or need to influence others as they make decisions. As it turns out, that's just about everyone in the knowledge-oriented workplace. Participants learn:

  • The concept and importance of cognitive biases
  • The Satir Interaction Model of communication, and how to use it as a framework for detecting wishful thinking
  • How cognitive biases contribute to the incidence of wishful thinking
  • What conditions expose groups to the risk of wishful thinking
  • Indicators of those cognitive biases that most affect individual and group decision making through wishful thinking
  • A basic checklist to use to evaluate the likelihood that wishful thinking has affected group decisions through cognitive biases

Participants learn to appreciate the causes of wishful thinking, both personal and organizational. Most important, they learn strategies and tactics for limiting its effects, or, having discovered that wishful thinking is in action, how to intervene to enhance decision quality.

Program structure and content

We learn through presentation, discussion, exercises, simulations, and post-program activities. We can tailor a program for you that addresses your specific challenges, or we can deliver a tried-and-true format that has worked well for other clients. Participants usually favor a mix of presentation, discussion, and focused exercises.

Whether you're a veteran decision maker, or a relative newcomer to high-stakes decision making as a workplace practice, this program is a real eye-opener.

Learning model

When we learn most new skills, we intend to apply them in situations with low emotional content. But knowledge about how people work together is most needed in highly charged situations. That's why we use a learning model that goes beyond presentation and discussion — it includes in the mix simulation, role-play, metaphorical problems, and group processing. This gives participants the resources they need to make new, more constructive choices even in tense situations. And it's a lot more fun for everybody.

Target audience

Decision makers at all levels, including managers of global operations, sponsors of global projects, managers, business analysts, team leads, project managers, and team members.

Program duration

Available formats range from 50 minutes to one full day. The longer formats allow for coverage of more material, more experiential content, and a deeper treatment of issues specific to audience experience.



  • Accounting & Finance
  • Communication
  • Critical Thinking
  • Marketing & Strategy
  • Starting a Business
  • Team Management
  • Corporate Philosophy
  • Diversity, Equity, & Inclusion
  • Kokorozashi
  • Sustainable Business
  • AI Ventures
  • Machine Learning
  • Alumni Voices
  • Yoshito Hori Blog
  • Unlimited Insights
  • Career Skills

How to Identify and Remove Barriers to Critical Thinking

An illustration of an office worker jumping over a brick wall representing barriers to critical thinking.


Contrary to popular belief, being intelligent or logical does not automatically make you a critical thinker.

People with high IQs are still prone to biases, complacency, overconfidence, and stereotyping that affect the quality of their thoughts and performance at work. But people who score high in critical thinking—a reflection of sound analytical, problem-solving, and decision-making abilities—report having fewer negative experiences in and out of the office.

Top 5 Barriers to Critical Thinking

To learn how to think critically, you’ll need to identify and understand what prevents people from doing so in the first place. Catching yourself (and others) engaging in these critical thinking no-no’s can help prevent costly mistakes and improve your quality of life.

Here are five of the most common barriers to critical thinking.

Egocentric Thinking

Egocentrism, or viewing everything in relation to yourself, is a natural human tendency and a common barrier to critical thinking. It often leads to an inability to question one's own beliefs, sympathize with others, or consider different perspectives.

Egocentricity is an inherent character flaw. Understand that, and you’ll gain the open-minded point of view required to assess situations outside your own lens of understanding.

Groupthink and Social Conditioning

Everyone wants to feel like they belong. It's a basic survival instinct, a psychological mechanism that helped keep our species alive: historically, humans banded together to survive in the wild against predators and one another. That desire to “fit in” persists today as groupthink, the tendency to agree with the majority and suppress independent thoughts and actions.

Groupthink is a serious threat to diversity in that it supports social conditioning, or the idea that we should all adhere to a particular society or culture’s most “acceptable” behavior.

Overcoming groupthink and cultural conditioning requires the courage to break free from the crowd. It’s the only way to question popular thought, culturally embedded values, and belief systems in a detached and objective manner.


Drone Mentality and Cognitive Fatigue

Turning on “autopilot” and going through the motions can lead to a lack of spatial awareness. This is known as drone mentality, and it's detrimental not only to you, but to those around you as well.

Studies show that monotony and boredom are bad for mental health. Cognitive fatigue, caused by long-term mental activity without appropriate stimulation, such as an unchanging daily routine full of repetitive tasks, impairs cognitive functioning and critical thinking.

Although you may be tempted to flip on autopilot when things get monotonous, as a critical thinker you need to challenge yourself to make new connections and find fresh ideas. Explore different schools of thought, and keep both your learning and teaching methods exciting and innovative; that will foster an environment of critical thinking.


Personal Biases and Preferences

Everyone internalizes certain beliefs, opinions, and attitudes that manifest as personal biases. You may feel that you're open-minded, but these subconscious judgments are more common than most people realize. They can distort your thinking patterns and sway your decision making in the following ways:

  • Confirmation bias: favoring information that reinforces your existing viewpoints and beliefs
  • Anchoring bias: being overly influenced by the first piece of information you come across
  • False consensus effect: believing that most people share your perspective
  • Normalcy bias: assuming that things will stay the same despite significant changes to the status quo

The critical thinking process requires being aware of personal biases that affect your ability to rationally analyze a situation and make sound decisions.
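As a rough illustration of how a bias like confirmation bias can distort an otherwise sound analysis, the toy model below compares an unbiased belief update with one that quietly down-weights disconfirming evidence. The weighting scheme and scenario are invented purely for the example.

```python
# Toy model (illustrative only): confirmation bias as asymmetric evidence weighting.
# An unbiased reasoner updates on every observation equally; a biased reasoner
# discounts observations that contradict the currently favored view.

def update(prior, likelihood_if_true, likelihood_if_false, weight=1.0):
    """One weighted Bayesian update of P(hypothesis is true)."""
    # Raising likelihoods to a weight < 1 softens the impact of the observation.
    num = prior * likelihood_if_true ** weight
    den = num + (1 - prior) * likelihood_if_false ** weight
    return num / den

# Ten observations: 1 supports the hypothesis, 0 contradicts it.
# Only 4 of 10 support it, so an unbiased reasoner should end well below 0.5.
observations = [1, 0, 0, 1, 0, 0, 1, 0, 1, 0]

unbiased = biased = 0.5  # both start undecided
for obs in observations:
    like_true, like_false = (0.7, 0.3) if obs else (0.3, 0.7)
    unbiased = update(unbiased, like_true, like_false)
    # Confirmation bias: contradicting evidence only gets 30% of its proper weight.
    w = 1.0 if obs else 0.3
    biased = update(biased, like_true, like_false, weight=w)

print(f"unbiased belief: {unbiased:.2f}")  # roughly 0.16: follows the evidence
print(f"biased belief:   {biased:.2f}")    # roughly 0.87 on the very same data
```

The same data lead to opposite conclusions, which is exactly why catching these subconscious judgments matters.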

Allostatic Overload

Research shows that persistent stress causes a phenomenon known as allostatic overload. It's serious business, affecting your attention span, memory, mood, and even physical health.

When under pressure, your brain is forced to channel energy into the regions responsible for processing necessary information, at the expense of rest. That's why people experience memory lapses in fight-or-flight situations. Prolonged stress also reduces activity in the prefrontal cortex, the part of the brain that handles executive functions.

Avoiding cognitive impairments under pressure begins by remaining as calm and objective as possible. If you’re feeling overwhelmed, take a deep breath and slow your thoughts. Assume the role of a third-party observer. Analyze and evaluate what can be controlled instead of what can’t.

Train Your Mind Using the 9 Intellectual Standards

The bad news is that barriers to critical thinking can really sneak up on you and be difficult to overcome. But the good news is that anyone can learn to think critically with practice.

Unlike raw intelligence, which is largely determined by genetics, critical thinking can be mastered using nine teachable standards of thought:

  • Clarity: Is the information or task at hand easy to understand and free from obscurities?
  • Precision: Is it specific and detailed?
  • Accuracy: Is it correct, free from errors and distortions?
  • Relevance: Is it directly related to the matter at hand?
  • Depth: Does it consider all other variables, contexts, and situations?
  • Breadth: Is it comprehensive, and does it encompass other perspectives?
  • Logic: Is it internally consistent and free from contradictions?
  • Significance: Is it important in the first place?
  • Fairness: Is it free from bias, deception, and self-interest?

When evaluating any task, situation, or piece of information, consider these intellectual standards to hone your critical thinking skills in a structured, practiced way. Keep it up, and eventually critical thinking will become second nature.
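One practical way to keep the standards in view is to run a claim or deliverable through them as an explicit checklist. The sketch below is just one illustrative way to structure that; the function, scoring, and example are hypothetical, not part of any established tool.

```python
# Illustrative sketch: the nine intellectual standards as an explicit review checklist.
# The questions mirror the list above; the yes/no scoring is a made-up convention.

STANDARDS = {
    "clarity": "Is it easy to understand and free from obscurities?",
    "precision": "Is it specific and detailed?",
    "accuracy": "Is it correct, free from errors and distortions?",
    "relevance": "Is it directly related to the matter at hand?",
    "depth": "Does it consider the relevant variables, contexts, and situations?",
    "breadth": "Does it encompass other perspectives?",
    "logic": "Is it internally consistent and free from contradictions?",
    "significance": "Is it important in the first place?",
    "fairness": "Is it free from bias, deception, and self-interest?",
}

def failing(answers: dict[str, bool]) -> list[str]:
    """Return the standards that are not yet satisfied."""
    return [f"{name}: {STANDARDS[name]}" for name, ok in answers.items() if not ok]

# Example: reviewing a draft recommendation before sending it upward.
claim = "Adopt vendor X for all teams"
answers = {name: True for name in STANDARDS}
answers["precision"] = False   # no numbers or specifics yet
answers["breadth"] = False     # no alternative options considered

gaps = failing(answers)
print(f"'{claim}' still fails {len(gaps)} standard(s):")
for gap in gaps:
    print(" -", gap)
```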


  • Our native egocentrism: “to view everything in the world in relationship to oneself, to be self-centered” (Webster’s New World Dictionary); to view the world in self-validating or selfish terms.
  • Our native sociocentrism: to view everything in relationship to one’s group; to be group-centered; to attach ourselves to others and together create beliefs, rules, and taboos to which those in the group are expected to adhere, and against which the behavior of those outside the group is judged; to view the world in group-validating or groupish terms.


Critical Thinking vs. Wishful Thinking


Dr. Hurd: I absolutely loved what you said in your article about “confidence, but no competence.”

I’d like to add, though, that the reason parents don’t teach their kids about the tools to gain that competence is simply because … they don’t really understand them, either.

I see many previous generations having “done” something … because they believed they had to. They couldn’t explain why (and believe me, at a much younger age, I asked older folks many times…and I got nothing but a run-around that would make President Obama blush; no joke), they just did. It was supposedly just, “What you were supposed to do, because you were.”

If the previous generations can’t explain to succeeding generations why what’s being done needs to be done, then what we all see is those future generations living by the ideology that the previous generations believed but didn’t want to admit.

Perfect example: Why did the Baby Boomer generation turn on to the Hippie Movement and revolt against the “man” when their parents seemed so incredibly patriotic? Maybe because deep down their parents weren’t very patriotic, they just claimed they were–to their kids and to themselves, and it wasn’t questioned.

Again, fantastic piece; I agree with you all the way.

Dr. Hurd replies:

In my newly released book, BAD THERAPY GOOD THERAPY, I made the distinction between “do as I say” dogmatism and “do as you feel” subjectivism. You’re talking about the first, and you’re so right that this leads to one of two things: either mindless rebellion (as in the 1960s student movements), or subservience, with one generation after the next not thinking about how wrong many unchallenged assumptions are.

The antidote to all of this is critical thinking. Good teachers can — at least theoretically — be hired to teach the skills of critical thinking to children, but parents must above all foster and provide the leadership for it. I don’t care whether a particular parent has a Ph.D. in chemistry or an M.D., or barely has a high school degree. Everyone has the capacity for critical thinking within his or her context of knowledge. Critical thinking means objective analysis, logic and reason. It means NOT blindly accepting vaguely held, unidentified assumptions, and it means NOT refusing to accept something as true just because everyone else seems to accept it as true.

Critical thinkers are self-aware about the content of their own minds, and always willing to identify and challenge their underlying or “hidden” assumptions.

Critical thinking is independent, by definition. It’s not subjective and emotion-driven, but it’s not “by-the-book” either. If you conclude something is true that happens to be conventional, and a majority happens to agree, then fine. You still concluded it with your own independent, reasoning mind. You understand the reasons and arguments in favor of the conclusion that others might also accept. By that same method, you can determine when a majority is off course, or guilty of sloppy or erroneous thinking.

Parents have to be the leader on this issue. The way to be the leader is to live your life as a critical thinker, something you should be doing even without children. The other way to be the leader is to coach and guide your children through whatever critical thinking they’re capable of. You take the time to find out what they’re capable of by having intelligent discussions with them about anything at all of concern to them. It could be events in a book, on a television show, in the neighborhood, at the school, or within the family. Every conversation about anything is a potential opportunity to demonstrate critical thinking. Critical thinking is not lecturing. It’s logical, factual, and sometimes even commonsensical analysis. “Here’s my answer to the question, and here are my reasons. What do you think?”

A good therapist, as I write in my book, thinks along with the therapy client. This is different from thinking FOR the client. The same applies to your relationship with your child. If your child, at a given stage of mental development, is honestly unable to grasp the essentials of a certain subject, then of course you do this for him. You decide, for example, where you’re going to live and whether you can afford a particular purchase. But always be open to the possibility that your child is capable of at least a certain amount of thinking on any subject, including a subject about which he still won’t be making any final decisions. When possible, let him make his own decisions even when — gasp! — he’s plainly wrong. (Most American parents cannot stand to let their children be wrong. This is why so many children grow up into these helpless and entitled adults.) When it’s physically safe, let your child make mistakes and learn from them. Critically think with him along the way. Reason with him, and think it all out along with him. Don’t talk down to your child; think along with him, as much as he’s able.

I understand what you’re saying in your comments. Not all parents do this. It’s truly sad. When I look at what passes for intellectual, moral, social and political “leadership” in this country, I recognize that these are the choices of people — of grown adults — who truly are lost and clueless when it comes to critical thinking. There would be no President Obama nor a President Bush, nor a President Romney or President Gingrich, in a culture where people engaged in even a little bit of critical thinking about political matters. In politics, most people are engaging in wishful thinking, not critical thinking, which is why most of us keep hiring idiots and liars to lead us. The same applies in the realm of what passes for moral and spiritual leadership, whether in the “do as I say” dogmatic context of traditional church, or the “do as you feel” subjectivism of the so-called self-help movement, including but not limited to all things Oprah.

The real battle in society, and within any individual soul, is the battle between wishful thinking and critical thinking. Critical thinking does not always lead to the truth, but it provides the means for correction while getting to the truth. And it helps you face and identify the truth, once there. Wishful thinking gives up on the concept of objective truth altogether, and replaces it with the nonsense that permeates our entire culture, from individual households and minds, to the President, to most of our celebrities and academic leaders, and everything in between.

America may go down as a civilization that got so much done, but that ultimately lacked a regard for the rational mind — and paid a terrible price for it. It’s the same for an individual, regardless of the civilization he happens to live in. If he thinks critically, he’ll flourish to the greatest degree possible in that society. If he doesn’t think critically, the greatest civilization in the world, while doing him some material good, will not do a thing for his mind, his intellect, emotions or soul.

So Good It Has to Be True: Wishful Thinking in Theory of Mind

Competing Interests: The authors have no significant competing financial, professional, or personal interests that might have influenced the execution or presentation of the work described in this manuscript.


Daniel Hawthorne-Madell, Noah D. Goodman; So Good It Has to Be True: Wishful Thinking in Theory of Mind. Open Mind 2017; 1 (2): 101–110. doi: https://doi.org/10.1162/OPMI_a_00011


In standard decision theory, rational agents are objective, keeping their beliefs independent from their desires. Such agents are the basis for current computational models of Theory of Mind (ToM), but the accuracy of these models is unknown. Do people really think that others do not let their desires color their beliefs? In two experiments we test whether people think that others engage in wishful thinking. We find that participants do think others believe that desirable events are more likely to happen, and that undesirable ones are less likely to happen. However, these beliefs are not well calibrated, as people do not let their desires influence their beliefs in the task. Whether accurate or not, thinking that others wishfully think has consequences for reasoning about them. We find one such consequence—people learn more from an informant who thinks an event will happen despite wishing it were otherwise. People’s ToM therefore appears to be more nuanced than current rational accounts in that it allows others’ desires to directly affect their subjective probability of an event.

Whether thinking “I can change him/her” about a rocky relationship or the more benign “those clouds will blow over” at a picnic, people’s desires seem to color their beliefs. However, such an explanation presupposes a direct link between a person’s desires and beliefs, a link that is currently absent from normative behavioral models and current Theory of Mind (ToM) models.

Does a causal link between desires and beliefs actually exist? The evidence is mixed. A number of compelling studies find “wishful thinking,” or a “desirability bias,” both in carefully controlled laboratory studies (Mayraz, 2011) and in real-world settings, such as the behavior of sports fans (Babad, 1987; Babad & Katz, 1991), expert investors (Olsen, 1997), and voters (Redlawsk, 2002). Other researchers, however, have failed to observe the effect (e.g., Bar-Hillel and Budescu’s “The Elusive Wishful Thinking Effect,” 1995), have provided alternative accounts of previous experiments (Hahn & Harris, 2014), and have argued that there is insufficient evidence for a systematic wishful thinking bias (Hahn & Harris, 2014; Krizan & Windschitl, 2007).

Whether or not there actually is a direct effect of desires on beliefs, people might think that there is and use that assumption when reasoning about other people. That is to say, people’s ToM might incorporate the wishful thinking link seen in Figure 1b. The direct influence of desires on beliefs is a departure from classic belief–desire “folk” psychology, in which beliefs and desires are independent and jointly cause action (Figure 1a). Previous models of ToM formalize belief–desire psychology into probabilistic models of action and belief formation. They show that inferring others’ beliefs (Baker, Saxe, & Tenenbaum, 2011), preferences (Jern, Lucas, & Kemp, 2011), and desires (Baker, Saxe, & Tenenbaum, 2009) can be understood as Bayesian reasoning over these generative models. A fundamental assumption of these models is that beliefs are formed on the basis of evidence, and are a priori independent of desire. We will call models that make this assumption rational theories of mind (rToM). We can contrast this rationally motivated theory with one that incorporates the rose-colored lenses of a desire–belief link: an optimistic ToM (oToM). We use their qualitative predictions to motivate two experiments into the presence (and calibration) of wishful thinking in ToM and its impact on social reasoning.
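To make the qualitative contrast concrete, here is a minimal, illustrative sketch rather than the paper’s formal model (which is specified in its supplemental materials): in an rToM the attributed belief depends only on the evidence, while in an oToM the agent’s utility for the outcome also shifts the attributed belief. The logistic evidence mapping and the strength of the wishful shift are assumptions for illustration.

```python
import math

def evidence_belief(marble_x: float) -> float:
    """Belief that the marble lands in the target bin, from evidence alone.
    A hypothetical logistic mapping from drop position (0 = far side,
    1 = directly above the target bin) to probability."""
    return 1.0 / (1.0 + math.exp(-6.0 * (marble_x - 0.5)))

def rtom_belief(marble_x: float, utility: float) -> float:
    """Rational ToM: the attributed belief ignores what the agent wants."""
    return evidence_belief(marble_x)  # utility intentionally unused

def otom_belief(marble_x: float, utility: float, wishful: float = 1.0) -> float:
    """Optimistic ToM: desire nudges the attributed belief.
    `wishful` (assumed) scales how strongly utility shifts the log-odds."""
    p = evidence_belief(marble_x)
    log_odds = math.log(p / (1.0 - p)) + wishful * utility
    return 1.0 / (1.0 + math.exp(-log_odds))

# Same evidence (drop position 0.4), different stakes for the agent:
for u in (-1.0, 0.0, +1.0):   # losing $1, nothing at stake, winning $1
    print(f"U={u:+.0f}  rToM={rtom_belief(0.4, u):.2f}  oToM={otom_belief(0.4, u):.2f}")
```

Under the oToM sketch the attributed probability rises when the outcome pays the agent and falls when it costs them, which is the signature the experiments look for.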

Figure 1. Competing models of Theory of Mind (ToM). Causal models of (a) rational ToM based upon classic belief–desire psychology and (b) optimistic ToM that includes a direct “wishful thinking” link between desires and beliefs.


In Experiment 1 we explore wishful thinking in both ToM and behavior. In the third-person point-of-view (3-PoV) condition, we test whether people use an rToM or an oToM when reasoning about how others play a simple game—will manipulating an agent’s desire for an outcome affect people’s judgments about the agent’s belief in that outcome? In the first-person point-of-view (1-PoV) condition we test whether people actually exhibit wishful thinking when playing the game themselves. We carefully match the 3-PoV and 1-PoV conditions and run them concurrently to have a clear test of whether people’s ToM assumptions lead them to make appropriate inferences about people’s behavior in the game. Regardless of its appropriateness, people’s ToM should have consequences for both how they reason about others’ actions and how they learn from them. If people do attribute wishful thinking to others, it would have a dramatic impact on their interpretation of others’ behavior. In Experiment 2 we therefore test for a social learning pattern that only reasoners using an oToM would exhibit, highlighting the impact ToM assumptions have on social reasoning.

3-PoV Condition

To test for the presence of wishful thinking in people’s mental models of others we introduced Josh, a person playing a game with a transparent causal structure. The causal structure of the game was conveyed via the physical intuitions of the Galton board pictured in Figure 2b (in which a simulated ball bounces off pegs to land in one of two bins). The outcome of the game is binary (there are two bins) with different values associated with each outcome (money won or lost). We call the value of an outcome (i.e., the amount that Josh stands to win or lose) the utility of that outcome, U(outcome). Participants were asked what they think about Josh’s belief in the likelihood of the outcome, p_j(outcome). By manipulating outcome values we are able to test for wishful thinking. If people incorporate wishful thinking into their ToM, we should find that increasing an outcome’s utility results in higher estimates of Josh’s belief in the outcome’s occurrence, p_j(outcome).
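Since the board’s exact geometry isn’t reproduced here, the following Monte Carlo sketch only illustrates the kind of evidence a drop position provides: marbles dropped nearer one side are more likely to end up in that side’s bin. The slot count, number of peg rows, and bin boundary are assumptions, not the experiment’s actual parameters.

```python
import random

def p_lands_left(drop_slot: int, n_slots: int = 8, n_rows: int = 6,
                 n_sims: int = 20_000, seed: int = 0) -> float:
    """Monte Carlo estimate of the chance the marble ends in the left bin.

    Hypothetical geometry: `n_slots` drop positions across the top, `n_rows`
    of pegs, each peg deflecting the marble half a slot left or right with
    equal probability; the left bin collects everything left of center.
    """
    rng = random.Random(seed)
    center = n_slots / 2.0
    left = 0
    for _ in range(n_sims):
        x = drop_slot + 0.5  # start at the middle of the chosen slot
        for _ in range(n_rows):
            x += 0.5 if rng.random() < 0.5 else -0.5
        left += x < center
    return left / n_sims

# Four illustrative drop positions (labels only; the real board's layout is assumed):
for slot in (1, 3, 4, 6):
    print(f"drop position {slot}: P(left bin) ~= {p_lands_left(slot):.2f}")
```

The point of the design is that the drop position gives graded, objective evidence, so any extra shift in the attributed belief can be pinned on the outcome’s utility.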

Figure 2. Stimuli used in Experiment 1. (a) The wheel used to determine the payout for the next outcome and (b) the Galton board used to decide the outcome. The blue arrow at the top indicates where the marble will be dropped. The numbers indicate the four drop positions used in the experiment.


We first measured p_j(outcome | evidence) without manipulating the desirability of the outcome in the “baseline” block of trials. Then, in the “utility” block of trials, we assigned values to outcomes, manipulating Josh’s U(outcome). In the utility block of trials we used a spinning wheel (Figure 2a) to determine what Josh stood to win or lose based on the outcome of the marble drop. By comparing these two blocks of trials we test for the presence of wishful thinking in people’s ToM.

1-PoV Condition

To test whether people’s desires directly influence their beliefs in the Galton board game, we simply had the participant directly play the game (replacing Josh) and asked them about their belief in the likelihood of the outcome [their “self” belief, p_s(outcome)].

Participants

Eighty participants (24 female; μ_age = 32.93, σ_age = 9.68) were randomly assigned to either the 3-PoV or the 1-PoV condition, such that there were 40 in each.

Design and Procedure

Participants were first introduced to Josh, who was playing a marble-drop game with a Galton board (as seen in Figure 2b). Josh was personified as a stick figure and appeared on every screen. We then presented the causal structure (i.e., physics) of the game by dropping a marble from the center of the board two times, with one landing in the orange bin (Figure 2b, left bin) and one landing in the purple bin (Figure 2b, right bin). After observing the two marble drops, participants began the baseline block of trials. In the four baseline trials, the marble’s drop position varied and participants were asked, “What do you think Josh thinks is the chance that the marble lands in the bin with the purple/orange box?” Participants’ responses were recorded on a continuous slider with endpoints labeled “Certainly Will” and “Certainly Won’t.” Color placement was randomized on each trial, and the color of the box in question varied between participants. The marble’s drop position was indicated with a blue arrow at the top of the Galton board, and there were four drop positions used (marble_x; top of Figure 2b) that varied in how likely they were to deliver the marble into the bin in question. In the baseline and subsequent trials, participants did not observe the marble drop and outcome; they only observed the position the marble would be dropped from.

In the 1-PoV condition, the procedure mirrored the 3-PoV condition, with the participant taking the place of Josh. All questions were therefore reframed to ask about the participant's own beliefs about the outcome. Participants were initially given a $1 bonus and instructed that one trial would be selected at random to augment that bonus; that is, they could gain or lose $1.

In a rational theory of mind, beliefs and desires are a priori independent. Manipulating Josh's desires therefore should not affect his beliefs, and we would predict that the utility trials look like the baseline trials. However, as seen in Figure 3a , the utility trials differed systematically from the baseline trials and, therefore, from the predictions of an rToM. To quantify this deviation we fit a logistic mixed-effects model to participants' p j ( outcome ) responses. The model used marble x and the categorically coded value of the outcome (negative, baseline, and positive) as fixed effects and included a random slope of marble x and a random intercept for each participant. The resulting model indicated that if an outcome was associated with a utility for Josh, participants thought that it would impact his beliefs about the probability of that outcome.
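
The paper reports a logistic mixed-effects analysis; as a rough illustration of the fixed-effects structure only, here is a sketch in Python on simulated data. The per-participant random effects are omitted, the statsmodels binomial GLM treats the slider responses in (0, 1) as proportions, and all variable names and generated numbers are assumptions for illustration, not the authors' data or code.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 600
df = pd.DataFrame({
    # drop position, scaled to [0, 1]: the direct evidence for the outcome
    "marble_x": rng.uniform(0, 1, n),
    # categorically coded value of the outcome for Josh
    "value": rng.choice(["negative", "baseline", "positive"], n),
})
# Simulated generator with a wishful-thinking-like shift (illustrative only).
shift = df["value"].map({"negative": -0.7, "baseline": 0.0, "positive": 0.9})
logit = -3 + 6 * df["marble_x"] + shift
df["p_response"] = 1 / (1 + np.exp(-(logit + rng.normal(0, 0.5, n))))

# Slider responses in (0, 1) treated as proportions under a binomial GLM;
# the baseline level serves as the reference category, as in the analysis.
model = smf.glm(
    "p_response ~ marble_x + C(value, Treatment(reference='baseline'))",
    data=df, family=sm.families.Binomial(),
).fit()
print(model.summary())
```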

Figure 3. Experiment 1 data. The effect of an agent's desire for an outcome on the mean subjective pj(outcome) attributed to the agent (with 95% CIs). For each participant, the mean effect of the positive utility ($1) and the negative utility (−$1) was determined by taking the difference between the pj(outcome) in each utility trial and the corresponding baseline trial. The effect is shown for the (a) 3-PoV (point-of-view) and (b) 1-PoV condition [where ps(outcome) is displayed]. These data are compared with the posterior predictives of the (c) optimistic and (d) rational Theory of Mind (ToM) models (see Supplemental Materials [Hawthorne-Madell & Goodman, 2017]).


Participants thought that Josh would believe that an outcome that lost him money was less likely than in the corresponding baseline trial (β = −0.70, z = −2.10, p = .036). 6 They also thought that Josh would believe an outcome that would net him money was more likely than in the corresponding baseline trial (β = 0.96, z = 2.87, p = .004). 7 Finally, marble x , the direct evidence, had a significant influence (β = 10.37, z = 11.78, p < .001). There was no evidence that the effect of the outcome value was moderated by marble x : the interactive model did not provide a superior fit, χ2(2) = 0.68, p = .736.

Unlike in the 3-PoV condition, as seen in Figure 3b , there was no effect of utility on participants' p s ( outcome ) responses compared with their baseline responses. Using the same logistic mixed-effects model employed in the 3-PoV condition, neither outcomes that would lose the participant money (β = 0.09, z = 0.30, p = .760) nor outcomes that would win them money (β = −0.09, z = 0.30, p = .760) influenced participants' p s ( outcome ) responses. As in the 3-PoV condition, a strong effect of the marble's drop position was observed (β = 8.88, z = 11.95, p < .001).

Comparing Conditions

To formalize the difference in the effect of utility across conditions, we analyzed the two conditions together with a logistic mixed-effects model. We used the same model described previously, except that utility was coded continuously and we added an interaction between utility and condition. The resulting model showed a significant interaction between PoV (condition) and the effect of utility on participants' p j / s ( outcome ) responses (β = 0.43, z = 3.83, p < .001), and this interactive model provided a better fit than the additive model, χ2(1) = 15.11, p < .001.
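
The cross-condition comparison adds a utility × condition interaction and evaluates it with a likelihood-ratio test. A hedged sketch of that comparison on simulated data (again omitting random effects, with illustrative variable names and effect sizes, not the authors' analysis code) might look like this.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(2)
n = 800
df = pd.DataFrame({
    "marble_x": rng.uniform(0, 1, n),
    "utility": rng.choice([-1, 0, 1], n),            # continuously coded value
    "condition": rng.choice(["1-PoV", "3-PoV"], n),  # point of view
})
# Simulated generator: utility shifts responses only in the 3-PoV condition.
shift = np.where(df["condition"] == "3-PoV", 0.8 * df["utility"], 0.0)
logit = -3 + 6 * df["marble_x"] + shift
df["p_response"] = 1 / (1 + np.exp(-(logit + rng.normal(0, 0.5, n))))

additive = smf.glm("p_response ~ marble_x + utility + C(condition)",
                   data=df, family=sm.families.Binomial()).fit()
interactive = smf.glm("p_response ~ marble_x + utility * C(condition)",
                      data=df, family=sm.families.Binomial()).fit()

lr = 2 * (interactive.llf - additive.llf)         # likelihood-ratio statistic
extra = interactive.df_model - additive.df_model  # one extra parameter
print(f"chi2({extra:.0f}) = {lr:.2f}, p = {stats.chi2.sf(lr, extra):.3f}")
```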

The results from the 3-PoV condition indicate that people's ToM includes a direct "wishful thinking" link. This is consistent with the qualitative predictions of the oToM model (see the Supplemental Materials [Hawthorne-Madell & Goodman, 2017]; Equation 2), unlike rToM models where beliefs and desires are a priori independent. 8 However, the 1-PoV condition found no evidence that people are biased by their desires in the Galton board game. This disconnect suggests that people's attribution of wishful thinking in this situation is miscalibrated. That is, Experiment 1 represents a situation where wishful thinking is present in ToM reasoning but absent in actual behavior: people think others will behave wishfully when, in fact, they do not.

This miscalibration is consistent with an overattribution of wishful thinking. However, the present study does not provide insight into why the miscalibration arises. Any number of incorrect assumptions could lead to these results. Perhaps people think that everyone wishfully thinks, but that only they are clever enough to correct for it. Alternatively, they could think that $1 or $5 is much more desirable for others than it is for themselves. There are a number of actor–observer asymmetries and self-enhancement biases that could plausibly underpin the observed inconsistency (Jones & Nisbett, 1971; Kunda, 1999). Further study is necessary to determine the cause of the overattribution.

Regardless of whether people actually engage in wishful thinking, if people assume others do, then that assumption should affect how they interpret others' actions and learn from them. In Experiment 2 we therefore turn to social learning situations, where oToM (but, crucially, not rToM) predicts that desires affect a social source's influence.

Do people consider a social source's desires when learning from them? Doing so would be important if they think that the source's desires have a direct influence on his or her beliefs. Consider a learner using an oToM to reason about her uncle, a Chicago Cubs fan, who proudly proclaims that this is the year the Cubs will win it all. Though her uncle knows a lot about baseball, the oToM learner is unmoved from her (understandably) skeptical stance. However, if her aunt, a lifelong Chicago White Sox fan (crosstown rival to the Cubs), agrees that the Cubs do look better than the Sox this year, then an oToM learner considers this a much stronger teaching signal. In fact, a learner with an oToM would consider her aunt's testimony more persuasive than that of an impartial source (see Figure 4b ). A learner reasoning with an rToM wouldn't distinguish between these three social sources, 9 as seen in Figure 4c .

Figure 4. Experiment 2 data. The effect of a social source's desire on how others learn from them: (a) data with 95% CIs, which we compare to the posterior predictives of (b) an optimistic Theory of Mind (ToM) and (c) a rational ToM. Points represent the mean p(team x) response after hearing equally knowledgeable sources place a bet on team x that is either consistent, unrelated, or inconsistent with their desires.


We investigated which ToM best describes learning from social sources in a controlled version of this biased-opinion scenario. Participants were asked how likely a team ( x ) was to win an upcoming match, p ( team x ), in a fictional college soccer tournament after seeing a knowledgeable student bet on the team. The student was either a fan of one of the teams facing off or indifferent to the outcome. There were therefore three trial types: the consistent trial, where the student bet on the team he wanted to win; the inconsistent trial, where he bet on the team he wished would lose; and the impartial trial, where he did not care which team won before he bet.

One hundred twenty participants were randomly assigned to the consistent, inconsistent, or impartial condition.

Participants were first introduced to a (fictional) annual British collegiate soccer tournament and told that they would see bets on these matches from a student who "Unbeknownst to his friends makes a £100 bet online on which team he thinks will win this year's game." 10 The student was either a fan of one of the teams (attending that college) or of neither team (attending a different college). The students were equally knowledgeable across conditions, each described as having seen the outcomes of the last 10 matches these teams had played against each other.

After the introduction, participants were given a test trial appropriate for their (randomly assigned) condition, in which the student bet consistently with his school, bet against his school, or was impartial (not a fan of either school). After observing the student's bet and allegiance, participants were asked, "What do you think is the chance that team x wins the match this year?"

As seen in Figure 4a , participants' responses were sensitive to the student's a priori desires, consistent with learners who reason with an oToM (but not an rToM). Participants who saw an impartial student bet on team x thought the team was more likely to win than participants who saw a fan of team x place an identical bet (d = 0.80, 95% CI [0.33, 1.27], z = 3.35, p < .001 11 ). This is consistent with the learner thinking that the fan's desire to see his team win made him think a win was objectively more likely. Additionally, participants who saw a fan of the other team bet on team x were more influenced than those who saw the same bet from the impartial student (d = 0.67, 95% CI [0.21, 1.14], z = 2.87, p = .004). As predicted by the model of the oToM learner, a bet against one's desires is more diagnostic of team x being dominant than a bet from the impartial source. The oToM learner thinks that team x had to be clearly dominant to overcome the wishful thinking of a fan rooting against them.
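
The pairwise comparisons above are reported with a Fisher–Pitman permutation test (see the footnotes). A minimal sketch of that style of test, run on simulated stand-ins for the p(team x) ratings rather than the real data, is shown below.

```python
import numpy as np

rng = np.random.default_rng(3)
impartial = rng.normal(0.70, 0.12, 40)   # ratings after an impartial student's bet (simulated)
consistent = rng.normal(0.60, 0.12, 40)  # ratings after a fan bets on his own team (simulated)

observed = impartial.mean() - consistent.mean()
pooled = np.concatenate([impartial, consistent])
n_perm, count = 10_000, 0
for _ in range(n_perm):
    # Shuffle the group labels and recompute the mean difference.
    rng.shuffle(pooled)
    diff = pooled[:impartial.size].mean() - pooled[impartial.size:].mean()
    if abs(diff) >= abs(observed):       # two-sided test
        count += 1
print(f"observed difference = {observed:.3f}, permutation p ~ {count / n_perm:.4f}")
```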

Assuming that fans engage in wishful thinking allows oToM learners to make stronger inferences about the strength of the fans' evidence in some cases. For an rToM learner, the fan would have to have seen team x win a majority of the 10 observed matches in order to bet on them, regardless of their predilections, resulting in the flat predictions seen in Figure 4c . Meanwhile, the oToM learner thinks that a fan of team x could bet on them even if the fan only observed them win a few times. 12 If, however, the fan bets against their team, the oToM learner assumes that the fan must have seen their team trounced in the 10 observed matches. Using these insights, an oToM learner using Bayesian inference to learn from the fan will exhibit the qualitative pattern seen in Figure 4b , which is consistent with participants' behavior (as seen in Figure 4a ). The pattern of results is consistent with the predictions of a learner using an oToM (but see the discussion of limitations and additional potential explanations in the Supplemental Materials [Hawthorne-Madell & Goodman, 2017 ]).
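
The authors' formal rToM and oToM models are specified in the Supplemental Materials; the sketch below is only an illustrative reconstruction of the inference described in this paragraph, with the grid prior, the Laplace belief rule, and the logistic desire-to-belief bias (WISH) chosen by assumption. Setting WISH to zero recovers a rational source, in which case the learner draws the same conclusion from all three sources.

```python
import numpy as np
from scipy.stats import binom

N_MATCHES = 10   # matches the source is said to have observed
WISH = 1.5       # assumed strength of the desire->belief bias; 0 gives an rToM source
thetas = np.linspace(0.01, 0.99, 99)      # candidate strengths of team x
prior = np.ones_like(thetas) / thetas.size

def p_bet_on_x(k, desire, wish=WISH):
    """Probability the source bets on team x after seeing k wins in N_MATCHES.

    desire: +1 fan of team x, -1 fan of the rival, 0 impartial.
    The source's rational belief (Laplace rule) is tilted toward the desired
    outcome on the log-odds scale; the bet follows the (possibly biased) belief.
    """
    belief = (k + 1) / (N_MATCHES + 2)
    logodds = np.log(belief / (1 - belief)) + wish * desire
    return 1.0 if logodds > 0 else 0.0

def learner_estimate(desire, wish=WISH):
    """Learner's posterior mean of team x's strength after seeing a bet on x."""
    ks = np.arange(N_MATCHES + 1)
    post = np.zeros_like(thetas)
    for i, th in enumerate(thetas):
        # Marginalize over the evidence k the source could have seen.
        like = sum(binom.pmf(k, N_MATCHES, th) * p_bet_on_x(k, desire, wish) for k in ks)
        post[i] = prior[i] * like
    post /= post.sum()
    return float((post * thetas).sum())

for label, desire in [("consistent (fan of x)", +1), ("impartial", 0), ("inconsistent (rival fan)", -1)]:
    print(f"{label:26s} P(team x wins) ~ {learner_estimate(desire):.2f}")
# With WISH = 0 (an rToM learner), all three estimates coincide.
```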

Current computational models of theory of mind are built upon the assumption that beliefs are a priori independent of desires. Whether social reasoners use such a rational ToM (rToM) is an empirical question. In two experiments we tested the independence of beliefs and desires in ToM and found that people behave as if they think that others are wishful thinkers whose beliefs are colored by their desires.

In the 3-PoV condition of Experiment 1, we found that people believe that others inflate the probability of desirable outcomes and underestimate the probability of undesirable ones, as they would if they had an optimistic ToM (oToM) with a direct link between desires and beliefs ( Figure 3 ). If people broadly attribute wishful thinking to others (as Experiment 1 suggests), it should be reflected in their social reasoning. For example, social learners using an oToM to make sense of an agent's beliefs would be sensitive to that agent's relevant desires. This is exactly what we found in Experiment 2 ( Figure 4 )—how much people learned from an agent's beliefs depended on his desires. Agents whose beliefs ran against their desires were more influential than impartial agents, who, in turn, were more influential than agents with consistent beliefs and desires.

The observed presence of wishful thinking in ToM has no necessary relation to its existence in people's "online" belief formation. Indeed, the 1-PoV conditions of Experiment 1 indicate that people's model of others' wishful thinking is not perfectly calibrated. They overattribute wishful thinking to others in situations where they would actually form their beliefs independently of their desires. Charting the situations where wishful thinking is overapplied in this way may be a fruitful avenue for further research. At the extreme, we could imagine finding that everyone thinks one another wishfully thinks, but in fact everyone forms their beliefs independent of their desires! This radical thesis is surely too strong, 13 but oToM may well overestimate the strength of wishful thinking and overgeneralize it—amplifying a small online effect into a larger social cognition effect. Attention to whether a task engages (potentially amplified) oToM representations could provide insight into the considerable heterogeneity of the wishful thinking effect as it has been studied. Specifically, it could help explain why first-person wishful thinking is reliably found in some paradigms and not others.

The paradigms in which wishful thinking is reliably found involve participants reasoning about themselves or others, such as the 3-PoV condition of Experiment 1, where participants reasoned about Josh's beliefs (for a review of many tasks that may engage social reasoning, see, e.g., Shepperd, Klein, Waters, & Weinstein, 2013, and Weinstein, 1980, but see Harris & Hahn, 2011, and Hahn & Harris, 2014, for an alternative explanation). Asocial paradigms involving direct estimation of probabilities, by contrast, usually do not find the effect, like the 1-PoV condition of Experiment 1, where participants directly estimated the chance that the ball would fall into a particular bin (for other examples of wishful thinking paradigms that do not involve social reasoning, see Study 1 of Bar-Hillel & Budescu, 1995, and for a more general review of asocial bias experiments, see the "bookbags" and "pokerchips" paradigms cited in Hahn & Harris, 2014; but see Francis Irwin's series of experiments for an example of asocial paradigms that do find a wishful thinking effect, starting with Irwin, 1953).

Where people's predictions of others' behavior (3-PoV, Experiment 1) and their actual behavior (1-PoV, Experiment 1) diverge is also important to map, because these disconnects inject a systematic bias into social reasoning. Taking the social learning of Experiment 2 as an example, oToM learners ignored the belief of the agent whose bet was consistent with his desires. However, if this agent actually formed his beliefs without bias, then the learner would be missing a valuable learning opportunity. Asserting that others let their desires cloud their beliefs allows people to "explain away" those beliefs without seriously considering the possible evidence on which they are based. Future work should explore the details of these effects. For example, does a learner attribute bias equally to those who share his desires and those who hold competing ones?

The experiments presented here suggest that people think that others are wishful thinkers; this has broad consequences for social reasoning ranging from our inferences about heated scientific debates to pundit-posturing. Our findings highlight the importance of further research into the true structure of theory of mind. Do people think that others exhibit loss aversion or overweight low probabilities? Is the connection between beliefs and desires bidirectional? Rigorous examination of questions like these may buttress new, empirically motivated computational models of ToM that capture the nuance of human social cognition—an idea so good it has to be true.

All authors developed the study concept and design. Testing, data collection, and analysis were performed by DHM under supervision of NDG. DHM drafted the manuscript and NDG provided critical revisions. All authors approved the final version of the manuscript for submission.

This work was supported by ONR Grants N000141310788 and N000141310341, and a James S. McDonnell Foundation Scholar Award. We would also like to thank Joshua Hawthorne-Madell, Gregory Scontras, and Andreas Stuhlmüller for their careful reading and thoughtful comments on the manuscript.

While the causal link between desires and beliefs may, in fact, be bidirectional, we will focus on the evidence for the a priori effect of desires on beliefs.

We formally describe Bayesian models of both rToM and oToM in the Supplemental Materials (Hawthorne-Madell & Goodman, 2017 ).

Experiment 1 is a slightly modified replication of the two conditions previously run as separate experiments (see Supplemental Materials [Hawthorne-Madell & Goodman, 2017 ]).

Crucially, Josh’s U ( outcome ) should not be chosen by him, for example, “I bet $5 that it lands in the right bin,” as such an action would render U ( outcome ) and p ( outcome ) conditionally dependent and both rToM and oToM would predict influence of desire on belief judgments. To test pure wishful thinking, Josh’s U ( outcome ) has to be assigned to him by a process independent of p ( outcome )—in our case, a spinner.

All p values reported for Experiment 1 are based on the asymptotic Wald test.

There was no evidence of loss aversion in the relative magnitude of the wishful thinking effect for positive and negative utilities. In fact, the magnitude of the wishful thinking effect was slightly stronger for positive utilities.

Interestingly, there was consistency in the magnitude of this effect when Josh stood to gain $1 (as in the present experiment) or $5 in Experiment 1b (see the Supplemental Materials [Hawthorne-Madell & Goodman, 2017 ]). The extent to which people attributed wishful thinking to Josh was therefore not sensitive to the magnitude of Josh’s potential payout for this range (where payout is our operationalization of his desire).

This assumes that the three sources are equally knowledgeable and that their statements have no causal influence on the game. If, for example, the uncle were an umpire, his desires could matter through more objective routes.

See the Supplemental Materials [Hawthorne-Madell & Goodman, 2017 ] for complete experimental materials.

Calculated with the Fisher–Pitman permutation test.

In fact, if the oToM learner thinks that the fan is a completely wishful thinker, then his bet is no longer diagnostic of his evidence (he could have seen anything!).

As seen in well-controlled examples of desires influencing online belief formation (e.g., Mayraz, 2011 ).


EduFusion.org


Fear, Anger, and Denial—How do critical thinkers deal with barriers during a crisis?


Have you ever made a bad choice in life? Most of us have at some point, and if we go back and analyze what went wrong, we often find that some barrier to our critical thinking kept us from making the best choice among the alternatives we had. Understanding and spotting these barriers is important because we can often develop strategies for dealing with these roadblocks to good decision making.

Today I will focus on just a few, with examples that highlight how we are currently dealing with the COVID-19 outbreak. The barriers that have surfaced are fear, anger, denial, egocentrism, and sociocentrism. Each of these barriers can have a significant impact on our decision making, and with so many important decisions that must be made right now, we want to be certain we are making the best possible choices.

Fear is perhaps one of those barriers that are partly ingrained through our evolution. In essence, fear pushes us to act, either rationally or irrationally. Take, for example, how people are dealing with the outbreak of the coronavirus. There are many examples of people reacting out of fear rather than using critical thinking to work through and find a solution. Whether we are talking about hoarding something like toilet tissue or something more dangerous like attacking someone for not social distancing, how we react can be traced back to this barrier. Roalfe (1929) posits that "the real force or menace behind all fear and anxiety is desire" (p. 35). Our desire for safety and security can essentially overwhelm our ability to use critical thinking to make good choices.

Anger is another barrier to thinking. When people are angry, they will often ignore important information that might help resolve the root cause of the anger. Berkowitz and Harmon-Jones (2004) explain that "anger is often intermixed with the fright so that the fearful persons are all too apt to be angry as well, particularly when they think they cannot escape from the danger" (p. 152). This is particularly pertinent as people react to the COVID-19 crisis. Many people simply cannot escape an outbreak, and they will react with both anger and fear. One barrier not yet mentioned, though, is denial.

Denial, or repression, is perhaps more dangerous than our initial impulse of anger, because we can rationalize ourselves into believing something that is not factual rather than dealing with the issue. Whether it manifests as a refusal to take protective measures or leads to risky behavior, the outcome can be detrimental not just to the person but to others as well. Even when confronted with the facts, denial can keep a person from making the decisions needed for personal protection. We saw this play out again and again as people carried on as if nothing was really happening, even as hospitals were flooded with the sick and the numbers grew daily.

As critical thinkers, we can use the same methods we use for daily problem solving to help us reach the best possible solution, even when dealing with a life-altering event like the current COVID-19 pandemic. The first thing that must happen is the realization that there is a problem. As mentioned above, this can be very difficult when so many different voices are trying to influence our understanding, given how we are wired psychologically. Without recognizing the problem, though, there can be no solutions. But how do we decide among such a fractured set of possible solutions? El-Hani and Machado (2020) state that

when the problem is not consensually stated to a significant extent, the search for solutions becomes open-ended, with different stakeholders championing alternative solutions and competing with one another to frame 'the problem' in a way that directly connects their preferred solution and their preferred problem definition, and, besides, fulfills their own interests and pleases their own values. (p. 6)

What this suggests is that we must advocate for ourselves and seek out the best possible information. Whether we are reviewing information from the CDC or our state health department, the voices of the professionals can help us gain the insight we need. Understanding is key because it can help us to explore the alternatives we have available as we move forward towards a solution.

Exploring the alternatives is also important because in this stage we can weigh the pros and cons of how the possible decisions will affect us. This stage can be as complex as any, and, as reflected in the quote above, how we view the possibilities is shaped by how the issue is framed.

Then we make the decision and implement the plan. In this stage there will likely be some uncertainty, and how we deal with that uncertainty can affect how willing we are to accept the risk in the decision. A young, healthy person is not likely to see the same risk as an older person with a pre-existing condition. As the available research notes, though, even the young can be affected by the virus, and young people can spread the disease when asymptomatic.

The final step in decision making is a review of how well the plan is working. As I write today, the country has started opening back up and the number of cases has started to rise. As critical thinkers, we track our progress and make adjustments when needed. The question remains, though: will people use social distancing and face coverings as a way to slow the disease as the economy restarts, or will some slip into a state of denial?

Berkowitz, L., & Harmon-Jones, E. (2004). More thoughts about anger determinants. Emotion, 4(2), 151–155. doi:http://dx.doi.org/10.1037/1528-3542.4.2.151

El-Hani, C., & Machado, V. (2020). COVID-19: The need of an integrated and critical view. Ethnobiology and Conservation, 9. Retrieved from https://search.proquest.com/docview/2404320755?accountid=35812

Roalfe, W. R. (1929). The psychology of fear. The Journal of Abnormal and Social Psychology, 24(1), 32–40. doi:http://dx.doi.org/10.1037/h0071654


11 Common Barriers To Critical Thinking – A Simple Guide

Critical thinking is the capacity to think in a clear and rational way. It is a perspective on what one should do and what one should believe.

But what makes critical thinking a harder task to do? There are certain barriers that get in the way of critical thinking.

Critical thinking is not just about collecting information, and a good IQ and broad knowledge are not enough on their own.

A person can think critically only if he or she can draw conclusions from that knowledge.

11 Most Common Barriers To Critical Thinking

In this blog post, we’ll talk about the most common barriers to critical thinking and how you can overcome them.

1. Not Being Able To Tell The Difference Between A Fact And An Opinion

The first barrier to critical thinking is confusing facts with opinions. Facts are indisputable and can be verified, whereas opinions are not. For example, "water boils at 100 °C at sea level" is a fact you can easily check; "tea tastes better than coffee" is an opinion.

2. The Person Is Too Self-Obsessed To See Anything Else:

This is one of the most difficult barriers: it makes a person see nothing but themselves. Such people consider themselves an indispensable asset to the world.

This barrier keeps you from acknowledging other people and their viewpoints.

Critical thinking demands analyzing different perspectives to test their validity; finding what is good in those perspectives is also part of critical thinking.

But being self-obsessed is the most difficult barrier to overcome.

3. A Trend Of Brainstorming Together – A Barrier To Critical Thinking:

Critical thinking rests on examining objectives, beliefs, and ideas for oneself. When people think collectively, it becomes harder for everyone to think in their own space.

Critical thinking requires that people be able to think independently even while working in a group.

To break this barrier, everyone in the group must maintain their individuality.

4. Barriers To Critical Thinking – Emotions Are Heavier Than Logic:

People are becoming more sensitive to opposing views as time passes. So when people face the challenge of disagreement, logic flies out of the window.

Irrational reactions then take the place of logic, defying reason and disrupting management.

Deciding on the basis of emotions is itself a barrier, and emotion-based decision-making is bad for organizations.

5. The Competition Is Really Hard:

Both sides are more interested in winning the argument than in reaching the truth.

6. Barriers To Critical Thinking – Overly Relying On Experiences: 

When this barrier comes up in discussion, many people get defensive.

Every person is different, and so is every geographical region. So treat experience as one individual's experience rather than as a universal rule.

7. Accepting Statements Of Superhumans:

This barrier occurs when we judge statements by who said them rather than on their merits. In such cases, people accept statements as true simply because of the source.

Conversely, people will reject a statement if it comes from a person they don't like.

8. Intellect Is Greater Than Excellence:

For a very long time, IQ (intelligence quotient) was treated as the measure of intelligence.

But over time we have learned that intelligence has a number of different dimensions.

9. Blindly Following What A Myth Says:

Following myths amounts to accepting things on the basis of stereotypes.

Stereotypes and assumptions ignore individual thinking; they weaken a person's will to analyze the facts and figures.

They also make people believe that what they are doing is right, so they are unable to recognize and accept that they are making assumptions.

Under such conditions, people never realize that their judgments are based on stereotypes.

10. Barriers To Critical Thinking – Grinding In The Same Cycle:

We don't mean that routine is a bad thing, but it lessens one's ability to think in an analytical way.

If a person does the same thing day after day, week after week, or even for a whole lifetime, thinking slips into autopilot and analysis fades.

11. Following Authority:

You may accept your boss's views on a certain topic even though you think the opposite.

You do so because of the boss's authority and the discipline of the organization.

Conclusion:

Critical thinking is so important because it exposes fallacies and bad reasoning.

It also plays an important role in cooperative reasoning and constructive tasks.

Do mention in a comment which barrier you think you are facing.

