
Yes, We Can Define, Teach, and Assess Critical Thinking Skills


Jeff Heyck-Williams (He, His, Him), Director of the Two Rivers Learning Institute in Washington, DC


Today’s learners face an uncertain present and a rapidly changing future that demand far different skills and knowledge than were needed in the 20th century. We also know so much more about enabling deep, powerful learning than we ever did before. Our collective future depends on how well young people prepare for the challenges and opportunities of 21st-century life.

While the idea of teaching critical thinking has been bandied around in education circles since at least the time of John Dewey, it has taken greater prominence in the education debates with the advent of the term “21st century skills” and discussions of deeper learning. There is increasing agreement among education reformers that critical thinking is an essential ingredient for long-term success for all of our students.

However, there are still those in the education establishment and in the media who argue that critical thinking isn’t really a thing, or that these skills aren’t well defined and, even if they could be defined, they can’t be taught or assessed.

To those naysayers, I have to disagree. Critical thinking is a thing. We can define it; we can teach it; and we can assess it. In fact, as part of a multi-year Assessment for Learning Project, Two Rivers Public Charter School in Washington, D.C., has done just that.

Before I dive into what we have done, I want to acknowledge that some of the criticism has merit.

First, there are those who argue that critical thinking can only exist when students have a vast fund of knowledge, meaning that a student cannot think critically if they don’t have something substantive about which to think. I agree. Students do need a robust foundation of core content knowledge to think critically effectively. Schools still have a responsibility for building students’ content knowledge.

However, I would argue that students don’t need to wait until they have mastered some arbitrary amount of knowledge to start thinking critically. They can begin building critical thinking skills when they walk in the door. All students come to school with experience and knowledge about which they can immediately think critically. In fact, some of the thinking that they learn to do helps augment and solidify the discipline-specific academic knowledge that they are learning.

The second criticism is that critical thinking skills are always highly contextual. In this argument, the critics make the point that the types of thinking that students do in history are categorically different from the types of thinking students do in science or math. Thus, the idea of teaching broadly defined, content-neutral critical thinking skills is impossible. I agree that there are domain-specific thinking skills that students should learn in each discipline. However, I also believe that there are several generalizable skills that elementary school students can learn that have broad applicability to their academic and social lives. That is what we have done at Two Rivers.

Defining Critical Thinking Skills

We began this work by first defining what we mean by critical thinking. After a review of the literature and looking at the practice at other schools, we identified five constructs that encompass a set of broadly applicable skills: schema development and activation; effective reasoning; creativity and innovation; problem solving; and decision making.


We then created rubrics to provide a concrete vision of what each of these constructs looks like in practice. Working with the Stanford Center for Assessment, Learning and Equity (SCALE), we refined these rubrics to capture clear and discrete skills.

For example, we defined effective reasoning as the skill of creating an evidence-based claim: students need to construct a claim, identify relevant support, link their support to their claim, and identify possible questions or counter claims. Rubrics provide an explicit vision of the skill of effective reasoning for students and teachers. By breaking the rubrics down for different grade bands, we have been able not only to describe what reasoning is but also to delineate how the skills develop in students from preschool through 8th grade.
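To make the rubric's four components concrete, here is a minimal sketch of an evidence-based claim represented as a checklist. The class and field names are hypothetical illustrations, not Two Rivers' or SCALE's actual rubric tooling:

```python
from dataclasses import dataclass, field

@dataclass
class EvidenceBasedClaim:
    """One student argument, mirroring the four rubric components.

    All names here are hypothetical, for illustration only.
    """
    claim: str = ""
    support: list = field(default_factory=list)        # relevant evidence
    links: list = field(default_factory=list)          # how each piece of support connects to the claim
    counterclaims: list = field(default_factory=list)  # questions or opposing claims considered

    def components_present(self):
        """Report which of the four rubric components the argument contains."""
        return {
            "claim": bool(self.claim),
            "support": bool(self.support),
            "links": bool(self.links),
            "counterclaims": bool(self.counterclaims),
        }

# A sample argument that has a claim, support, and a link, but no counterclaim yet.
argument = EvidenceBasedClaim(
    claim="Recess improves afternoon focus",
    support=["Students finished more problems after recess"],
    links=["The only change between the two class periods was recess"],
)
print(argument.components_present())
```

A teacher-facing rubric would of course score quality, not mere presence; the point of the sketch is only that the construct decomposes into discrete, checkable parts.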


Before moving on, I want to freely acknowledge that in narrowly defining reasoning as the construction of evidence-based claims we have disregarded some elements of reasoning that students can and should learn. For example, the difference between constructing claims through deductive versus inductive means is not highlighted in our definition. However, by privileging a definition that has broad applicability across disciplines, we are able to gain traction in developing the roots of critical thinking: in this case, the ability to formulate well-supported claims or arguments.

Teaching Critical Thinking Skills

The definitions of critical thinking constructs were only useful to us insofar as they translated into practical skills that teachers could teach and students could learn and use. Consequently, we found that to teach a set of cognitive skills, we needed thinking routines that defined the regular application of these critical thinking and problem-solving skills across domains. Building on Harvard’s Project Zero Visible Thinking work, we have named routines aligned with each of our constructs.

For example, with the construct of effective reasoning, we aligned the Claim-Support-Question thinking routine to our rubric. Teachers then were able to teach students that whenever they were making an argument, the norm in the class was to use the routine in constructing their claim and support. The flexibility of the routine has allowed us to apply it from preschool through 8th grade and across disciplines from science to economics and from math to literacy.


Kathryn Mancino, a 5th grade teacher at Two Rivers, has deliberately taught three of our thinking routines to students using anchor charts. Her charts name the components of each routine and have a place for students to record when they’ve used it and what they have figured out about the routine. By using this structure with a chart that can be added to throughout the year, students see the routines as broadly applicable across disciplines and are able to refine their application over time.

Assessing Critical Thinking Skills

By defining specific constructs of critical thinking and building thinking routines that support their implementation in classrooms, we have operated under the assumption that students are developing skills that they will be able to transfer to other settings. However, we recognized both the importance and the challenge of gathering reliable data to confirm this.

With this in mind, we have developed a series of short performance tasks around novel discipline-neutral contexts in which students can apply the constructs of thinking. Through these tasks, we have been able to provide an opportunity for students to demonstrate their ability to transfer the types of thinking beyond the original classroom setting. Once again, we have worked with SCALE to define tasks where students easily access the content but where the cognitive lift requires them to demonstrate their thinking abilities.

These assessments demonstrate that it is possible to capture meaningful data on students’ critical thinking abilities. They are not intended to be high-stakes accountability measures. Instead, they are designed to give students, teachers, and school leaders discrete formative data on hard-to-measure skills.

While it is clearly difficult, and we have not solved all of the challenges to scaling assessments of critical thinking, we can define, teach, and assess these skills. In fact, knowing how important they are for the economy of the future and our democracy, it is essential that we do.

Jeff Heyck-Williams (He, His, Him)

Director of the Two Rivers Learning Institute

Jeff Heyck-Williams is the director of the Two Rivers Learning Institute and a founder of Two Rivers Public Charter School. He has led work around creating school-wide cultures of mathematics, developing assessments of critical thinking and problem-solving, and supporting project-based learning.



Critical Thinking Testing and Assessment

The purpose of assessment in instruction is improvement. The purpose of assessing instruction for critical thinking is improving the teaching of discipline-based thinking (historical, biological, sociological, mathematical, etc.). It is to improve students’ abilities to think their way through content using disciplined skill in reasoning. The more particular we can be about what we want students to learn about critical thinking, the better we can devise instruction with that particular end in view.


The Foundation for Critical Thinking offers assessment instruments which share in the same general goal: to enable educators to gather evidence relevant to determining the extent to which instruction is teaching students to think critically (in the process of learning content). To this end, the Fellows of the Foundation recommend:

that academic institutions and units establish an oversight committee for critical thinking, and

that this oversight committee utilize a combination of assessment instruments (the more the better) to generate incentives for faculty, by providing them with as much evidence as feasible of the actual state of instruction for critical thinking.

The following instruments are available to generate evidence relevant to critical thinking teaching and learning:

Course Evaluation Form: Provides evidence of whether, and to what extent, students perceive faculty as fostering critical thinking in instruction (course by course). Machine-scoreable.

Online Critical Thinking Basic Concepts Test: Provides evidence of whether, and to what extent, students understand the fundamental concepts embedded in critical thinking (and hence tests student readiness to think critically). Machine-scoreable.

Critical Thinking Reading and Writing Test: Provides evidence of whether, and to what extent, students can read closely and write substantively (and hence tests students' abilities to read and write critically). Short-answer.

International Critical Thinking Essay Test: Provides evidence of whether, and to what extent, students are able to analyze and assess excerpts from textbooks or professional writing. Short-answer.

Commission Study Protocol for Interviewing Faculty Regarding Critical Thinking: Provides evidence of whether, and to what extent, critical thinking is being taught at a college or university. Can be adapted for high school. Based on the California Commission Study. Short-answer.

Protocol for Interviewing Faculty Regarding Critical Thinking: Provides evidence of whether, and to what extent, critical thinking is being taught at a college or university. Can be adapted for high school. Short-answer.

Protocol for Interviewing Students Regarding Critical Thinking: Provides evidence of whether, and to what extent, students are learning to think critically at a college or university. Can be adapted for high school. Short-answer.

Criteria for Critical Thinking Assignments: Can be used by faculty in designing classroom assignments, or by administrators in assessing the extent to which faculty are fostering critical thinking.

Rubrics for Assessing Student Reasoning Abilities: A useful tool in assessing the extent to which students are reasoning well through course content.

All of the above assessment instruments can be used as part of pre- and post-assessment strategies to gauge development over various time periods.
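As a minimal illustration of the pre/post strategy, the sketch below averages per-student score changes on a numeric rubric scale. The scale, scores, and function name are hypothetical, not taken from any of the instruments above:

```python
def mean_gain(pre_scores, post_scores):
    """Average change from pre- to post-assessment for matched students."""
    if len(pre_scores) != len(post_scores):
        raise ValueError("pre and post lists must cover the same students")
    gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
    return sum(gains) / len(gains)

# Hypothetical rubric scores (1-4 scale) for five students at the
# start and end of a term, listed in the same student order.
pre = [1, 2, 2, 3, 1]
post = [2, 3, 2, 4, 2]
print(mean_gain(pre, post))  # average improvement on the rubric scale
```

Pairing each student's own pre and post scores, rather than comparing two group averages of unmatched tests, is what lets the comparison speak to development over the period.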

Consequential Validity

All of the above assessment instruments, when used appropriately and graded accurately, should lead to a high degree of consequential validity. In other words, the use of the instruments should cause teachers to teach in such a way as to foster critical thinking in their various subjects. In this light, for students to perform well on the various instruments, teachers will need to design instruction so that students can perform well on them. Students cannot become skilled in critical thinking without learning (first) the concepts and principles that underlie critical thinking and (second) applying them in a variety of forms of thinking: historical thinking, sociological thinking, biological thinking, etc. Students cannot become skilled in analyzing and assessing reasoning without practicing it. However, when they have routine practice in paraphrasing, summarizing, analyzing, and assessing, they will develop skills of mind requisite to the art of thinking well within any subject or discipline, not to mention thinking well within the various domains of human life.


Measuring Critical Thinking: Can It Be Done?

Why Should We Measure Critical Thinking?

Critical Thinking Is an Objective of Education

An important reason for measuring critical thinking is that it is a key part of education, whether it is explicitly mentioned within the curriculum or not. The UK’s Framework for Higher Education Qualifications (FHEQ) includes descriptors that are specific to critical thinking, including:

“Critically evaluate arguments, assumptions, abstract concepts and data (that may be incomplete), to make judgements, and to frame appropriate questions to achieve a solution – or identify a range of solutions – to a problem.”

This illustrates the importance of critical thinking for academic success in higher education.

In the USA, critical thinking is actively embedded within the curriculum at a secondary school level in the ‘Common Core’ learning objectives. Students are required to learn to make strong arguments and to be able to structure them in a way that leads smoothly to support a conclusion. Including critical thinking in the secondary curriculum is important, since experts have found that critical thinking should be taught prior to higher education in order to be effective. This makes it important to include within learning objectives and assessment criteria, as a measure of academic achievement.


Employers Look for Critical Thinking Skills

Critical thinking is also essential for professional success. As a result, it is no surprise that a number of prominent employers test critical thinking skills as part of their recruitment process. The Watson Glaser Test, discussed in our article about measuring critical thinking, has been used by the Bank of England and the Government Legal Service among other employers. Given that critical thinking has an increasing role in employability, it makes sense that critical thinking should be taught and tested among students in education.

However, critical thinking is an abstract and complex term, which makes it more difficult to measure and assess than other essential skills like reading and writing. The recognition that critical thinking is an important skill has led to the development of a number of assessment methods, although all of these methods have considerable limitations.

How is Critical Thinking Measured?

Traditional Measurement of Critical Thinking in Education

Traditionally, a student’s ability to think critically has been measured through essay-based assessment within humanities, arts and social sciences subjects. As a result, critical thinking ability is only one aspect of the assessment: students are simultaneously marked on their subject knowledge, reading and writing skills. Therefore, providing sufficient feedback about each aspect of the assessment is incredibly time-intensive for teachers. With limited feedback from teachers, students may experience a cycle of writing essays and receiving negative feedback, without fully understanding why their argument is weak or how it can be improved.


This can lead to confusion and limited improvement in later essays. For students who struggle with reading and writing, this challenge is only exacerbated. Furthermore, this method fails to produce a separate, useful measurement of critical thinking that teachers can use to identify problems and drive improvement.

Critical Thinking as an A Level Subject

The momentum for assessing critical thinking skills reached a peak in the UK with the introduction of school subjects dedicated to critical thinking. A number of exam boards included a dedicated A Level subject in the early 2000s, but only Cambridge International’s “Thinking Skills” curriculum remains. For this A Level, students are examined using a familiar exam format, with a mixture of short answer questions and essay style questions. Two papers, “critical thinking” and “applied reasoning” assess critical thinking, while the others focus on problem solving, with a greater focus on numerical questions.


Undoubtedly, students have more time to develop critical thinking skills through studying a dedicated course. However, there are problems with this approach. Despite the use of shorter questions, strong reading and writing abilities are still necessary, and this heavily impacts the criteria for achieving high marks. There is also a high workload for the teacher. Additionally, choosing to teach or study this subject means choosing against more traditional or accessible subjects. UK universities also do not accept it as part of their entrance requirements, as they want to see students demonstrate their critical thinking within the traditional subjects. For these reasons, critical thinking as a subject has failed to make it into the curriculum in most schools.

Critical Thinking Tests for Employment

Some critical thinking tests are designed as a way of evaluating job applicants or of testing employee skills, rather than to test structured learning of critical thinking. The Watson Glaser test is perhaps the most well-known critical thinking test of this kind and is used most commonly by law employers.

The Watson Glaser test uses short multiple choice questions, which reduces verbal load and makes marking easier. The results can be transformed easily into statistics for the quantitative measure of critical thinking among a sample group. However, this means the questions are simple, so the test does not measure proficiency in thinking critically about longer, more complex arguments. The questions are also based on scenarios that may not be familiar to exam takers. Other employment tests use a similar format, although many limit critical thinking to a small section of the test.
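The kind of machine scoring described above, and the aggregation into group statistics, can be sketched as follows. The answer key, responses, and function names are hypothetical illustrations, not the actual Watson Glaser materials:

```python
def score_test(answers, key):
    """Percent of multiple-choice items answered correctly."""
    correct = sum(1 for a, k in zip(answers, key) if a == k)
    return 100.0 * correct / len(key)

def group_summary(all_answers, key):
    """Mean score across a sample group, for quantitative comparison."""
    scores = [score_test(a, key) for a in all_answers]
    return sum(scores) / len(scores)

key = ["B", "A", "D", "C", "A"]  # hypothetical 5-item answer key
group = [
    ["B", "A", "D", "C", "A"],   # all five correct
    ["B", "C", "D", "C", "A"],   # four correct
    ["A", "A", "D", "B", "A"],   # three correct
]
print(group_summary(group, key))  # mean percent correct for the group
```

The ease of this kind of aggregation is exactly the strength the text describes; the corresponding weakness is that nothing in the score reflects reasoning about longer, more complex arguments.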

Summarising the Strengths and Weaknesses of Critical Thinking Tests

[Table: strengths and weaknesses of each critical thinking test]

The table above summarises the strengths and weaknesses of each critical thinking test in this article. To learn more about measuring critical thinking, click here to read our longer article.

Look out for our next blog post, where we discuss how to improve the measurement of critical thinking.



Critical Thinking

Critical thinking is a widely accepted educational goal. Its definition is contested, but the competing definitions can be understood as differing conceptions of the same basic concept: careful thinking directed to a goal. Conceptions differ with respect to the scope of such thinking, the type of goal, the criteria and norms for thinking carefully, and the thinking components on which they focus. Its adoption as an educational goal has been recommended on the basis of respect for students’ autonomy and preparing students for success in life and for democratic citizenship. “Critical thinkers” have the dispositions and abilities that lead them to think critically when appropriate. The abilities can be identified directly; the dispositions indirectly, by considering what factors contribute to or impede exercise of the abilities. Standardized tests have been developed to assess the degree to which a person possesses such dispositions and abilities. Educational intervention has been shown experimentally to improve them, particularly when it includes dialogue, anchored instruction, and mentoring. Controversies have arisen over the generalizability of critical thinking across domains, over alleged bias in critical thinking theories and instruction, and over the relationship of critical thinking to other types of thinking.

2.1 Dewey’s Three Main Examples
2.2 Dewey’s Other Examples
2.3 Further Examples
2.4 Non-Examples
3. The Definition of Critical Thinking
4. Its Value
5. The Process of Thinking Critically
6. Components of the Process
7. Contributory Dispositions and Abilities
8.1 Initiating Dispositions
8.2 Internal Dispositions
9. Critical Thinking Abilities
10. Required Knowledge
11. Educational Methods
12.1 The Generalizability of Critical Thinking
12.2 Bias in Critical Thinking Theory and Pedagogy
12.3 Relationship of Critical Thinking to Other Types of Thinking
Other Internet Resources
Related Entries

Use of the term ‘critical thinking’ to describe an educational goal goes back to the American philosopher John Dewey (1910), who more commonly called it ‘reflective thinking’. He defined it as

active, persistent and careful consideration of any belief or supposed form of knowledge in the light of the grounds that support it, and the further conclusions to which it tends. (Dewey 1910: 6; 1933: 9)

and identified a habit of such consideration with a scientific attitude of mind. His lengthy quotations of Francis Bacon, John Locke, and John Stuart Mill indicate that he was not the first person to propose development of a scientific attitude of mind as an educational goal.

In the 1930s, many of the schools that participated in the Eight-Year Study of the Progressive Education Association (Aikin 1942) adopted critical thinking as an educational goal, for whose achievement the study’s Evaluation Staff developed tests (Smith, Tyler, & Evaluation Staff 1942). Glaser (1941) showed experimentally that it was possible to improve the critical thinking of high school students. Bloom’s influential taxonomy of cognitive educational objectives (Bloom et al. 1956) incorporated critical thinking abilities. Ennis (1962) proposed 12 aspects of critical thinking as a basis for research on the teaching and evaluation of critical thinking ability.

Since 1980, an annual international conference in California on critical thinking and educational reform has attracted tens of thousands of educators from all levels of education and from many parts of the world. Also since 1980, the state university system in California has required all undergraduate students to take a critical thinking course. Since 1983, the Association for Informal Logic and Critical Thinking has sponsored sessions in conjunction with the divisional meetings of the American Philosophical Association (APA). In 1987, the APA’s Committee on Pre-College Philosophy commissioned a consensus statement on critical thinking for purposes of educational assessment and instruction (Facione 1990a). Researchers have developed standardized tests of critical thinking abilities and dispositions; for details, see the Supplement on Assessment. Educational jurisdictions around the world now include critical thinking in guidelines for curriculum and assessment.

For details on this history, see the Supplement on History.

2. Examples and Non-Examples

Before considering the definition of critical thinking, it will be helpful to have in mind some examples of critical thinking, as well as some examples of kinds of thinking that would apparently not count as critical thinking.

Dewey (1910: 68–71; 1933: 91–94) takes as paradigms of reflective thinking three class papers of students in which they describe their thinking. The examples range from the everyday to the scientific.

Transit : “The other day, when I was down town on 16th Street, a clock caught my eye. I saw that the hands pointed to 12:20. This suggested that I had an engagement at 124th Street, at one o’clock. I reasoned that as it had taken me an hour to come down on a surface car, I should probably be twenty minutes late if I returned the same way. I might save twenty minutes by a subway express. But was there a station near? If not, I might lose more than twenty minutes in looking for one. Then I thought of the elevated, and I saw there was such a line within two blocks. But where was the station? If it were several blocks above or below the street I was on, I should lose time instead of gaining it. My mind went back to the subway express as quicker than the elevated; furthermore, I remembered that it went nearer than the elevated to the part of 124th Street I wished to reach, so that time would be saved at the end of the journey. I concluded in favor of the subway, and reached my destination by one o’clock.” (Dewey 1910: 68–69; 1933: 91–92)

Ferryboat : “Projecting nearly horizontally from the upper deck of the ferryboat on which I daily cross the river is a long white pole, having a gilded ball at its tip. It suggested a flagpole when I first saw it; its color, shape, and gilded ball agreed with this idea, and these reasons seemed to justify me in this belief. But soon difficulties presented themselves. The pole was nearly horizontal, an unusual position for a flagpole; in the next place, there was no pulley, ring, or cord by which to attach a flag; finally, there were elsewhere on the boat two vertical staffs from which flags were occasionally flown. It seemed probable that the pole was not there for flag-flying.

“I then tried to imagine all possible purposes of the pole, and to consider for which of these it was best suited: (a) Possibly it was an ornament. But as all the ferryboats and even the tugboats carried poles, this hypothesis was rejected. (b) Possibly it was the terminal of a wireless telegraph. But the same considerations made this improbable. Besides, the more natural place for such a terminal would be the highest part of the boat, on top of the pilot house. (c) Its purpose might be to point out the direction in which the boat is moving.

“In support of this conclusion, I discovered that the pole was lower than the pilot house, so that the steersman could easily see it. Moreover, the tip was enough higher than the base, so that, from the pilot’s position, it must appear to project far out in front of the boat. Moreover, the pilot being near the front of the boat, he would need some such guide as to its direction. Tugboats would also need poles for such a purpose. This hypothesis was so much more probable than the others that I accepted it. I formed the conclusion that the pole was set up for the purpose of showing the pilot the direction in which the boat pointed, to enable him to steer correctly.” (Dewey 1910: 69–70; 1933: 92–93)

Bubbles : “In washing tumblers in hot soapsuds and placing them mouth downward on a plate, bubbles appeared on the outside of the mouth of the tumblers and then went inside. Why? The presence of bubbles suggests air, which I note must come from inside the tumbler. I see that the soapy water on the plate prevents escape of the air save as it may be caught in bubbles. But why should air leave the tumbler? There was no substance entering to force it out. It must have expanded. It expands by increase of heat, or by decrease of pressure, or both. Could the air have become heated after the tumbler was taken from the hot suds? Clearly not the air that was already entangled in the water. If heated air was the cause, cold air must have entered in transferring the tumblers from the suds to the plate. I test to see if this supposition is true by taking several more tumblers out. Some I shake so as to make sure of entrapping cold air in them. Some I take out holding mouth downward in order to prevent cold air from entering. Bubbles appear on the outside of every one of the former and on none of the latter. I must be right in my inference. Air from the outside must have been expanded by the heat of the tumbler, which explains the appearance of the bubbles on the outside. But why do they then go inside? Cold contracts. The tumbler cooled and also the air inside it. Tension was removed, and hence bubbles appeared inside. To be sure of this, I test by placing a cup of ice on the tumbler while the bubbles are still forming outside. They soon reverse” (Dewey 1910: 70–71; 1933: 93–94).

Dewey (1910, 1933) sprinkles his book with other examples of critical thinking. We will refer to the following.

Weather : A man on a walk notices that it has suddenly become cool, thinks that it is probably going to rain, looks up and sees a dark cloud obscuring the sun, and quickens his steps (1910: 6–10; 1933: 9–13).

Disorder : A man finds his rooms on his return to them in disorder with his belongings thrown about, thinks at first of burglary as an explanation, then thinks of mischievous children as being an alternative explanation, then looks to see whether valuables are missing, and discovers that they are (1910: 82–83; 1933: 166–168).

Typhoid : A physician diagnosing a patient whose conspicuous symptoms suggest typhoid avoids drawing a conclusion until more data are gathered by questioning the patient and by making tests (1910: 85–86; 1933: 170).

Blur : A moving blur catches our eye in the distance, we ask ourselves whether it is a cloud of whirling dust or a tree moving its branches or a man signaling to us, we think of other traits that should be found on each of those possibilities, and we look and see if those traits are found (1910: 102, 108; 1933: 121, 133).

Suction pump : In thinking about the suction pump, the scientist first notes that it will draw water only to a maximum height of 33 feet at sea level and to a lesser maximum height at higher elevations, selects for attention the differing atmospheric pressure at these elevations, sets up experiments in which the air is removed from a vessel containing water (when suction no longer works) and in which the weight of air at various levels is calculated, compares the results of reasoning about the height to which a given weight of air will allow a suction pump to raise water with the observed maximum height at different elevations, and finally assimilates the suction pump to such apparently different phenomena as the siphon and the rising of a balloon (1910: 150–153; 1933: 195–198).

Diamond : A passenger in a car driving in a diamond lane reserved for vehicles with at least one passenger notices that the diamond marks on the pavement are far apart in some places and close together in others. Why? The driver suggests that the reason may be that the diamond marks are not needed where there is a solid double line separating the diamond lane from the adjoining lane, but are needed when there is a dotted single line permitting crossing into the diamond lane. Further observation confirms that the diamonds are close together when a dotted line separates the diamond lane from its neighbour, but otherwise far apart.

Rash : A woman suddenly develops a very itchy red rash on her throat and upper chest. She recently noticed a mark on the back of her right hand, but was not sure whether the mark was a rash or a scrape. She lies down in bed and thinks about what might be causing the rash and what to do about it. About two weeks before, she began taking blood pressure medication that contained a sulfa drug, and the pharmacist had warned her, in view of a previous allergic reaction to a medication containing a sulfa drug, to be on the alert for an allergic reaction; however, she had been taking the medication for two weeks with no such effect. The day before, she began using a new cream on her neck and upper chest; counting against the new cream as the cause was the mark on the back of her hand, which had not been exposed to the cream. She began taking probiotics about a month before. She also recently started new eye drops, but she supposed that manufacturers of eye drops would be careful not to include allergy-causing components in the medication. The rash might be a heat rash, since she recently was sweating profusely from her upper body. Since she is about to go away on a short vacation, where she would not have access to her usual physician, she decides to keep taking the probiotics and using the new eye drops but to discontinue the blood pressure medication and to switch back to the old cream for her neck and upper chest. She forms a plan to consult her regular physician on her return about the blood pressure medication.

Candidate : Although Dewey included no examples of thinking directed at appraising the arguments of others, such thinking has come to be considered a kind of critical thinking. We find an example of such thinking in the performance task on the Collegiate Learning Assessment (CLA+), which its sponsoring organization describes as

a performance-based assessment that provides a measure of an institution’s contribution to the development of critical-thinking and written communication skills of its students. (Council for Aid to Education 2017)

A sample task posted on its website requires the test-taker to write a report for public distribution evaluating a fictional candidate’s policy proposals and their supporting arguments, using supplied background documents, with a recommendation on whether to endorse the candidate.

Immediate acceptance of an idea that suggests itself as a solution to a problem (e.g., a possible explanation of an event or phenomenon, an action that seems likely to produce a desired result) is “uncritical thinking, the minimum of reflection” (Dewey 1910: 13). On-going suspension of judgment in the light of doubt about a possible solution is not critical thinking (Dewey 1910: 108). Critique driven by a dogmatically held political or religious ideology is not critical thinking; thus Paulo Freire (1968 [1970]) is using the term (e.g., at 1970: 71, 81, 100, 146) in a more politically freighted sense that includes not only reflection but also revolutionary action against oppression. Derivation of a conclusion from given data using an algorithm is not critical thinking.

What is critical thinking? There are many definitions. Ennis (2016) lists 14 philosophically oriented scholarly definitions and three dictionary definitions. Following Rawls (1971), who distinguished his conception of justice from a utilitarian conception but regarded them as rival conceptions of the same concept, Ennis maintains that the 17 definitions are different conceptions of the same concept. Rawls articulated the shared concept of justice as

a characteristic set of principles for assigning basic rights and duties and for determining… the proper distribution of the benefits and burdens of social cooperation. (Rawls 1971: 5)

Bailin et al. (1999b) claim that, if one considers what sorts of thinking an educator would take not to be critical thinking and what sorts to be critical thinking, one can conclude that educators typically understand critical thinking to have at least three features.

  • It is done for the purpose of making up one’s mind about what to believe or do.
  • The person engaging in the thinking is trying to fulfill standards of adequacy and accuracy appropriate to the thinking.
  • The thinking fulfills the relevant standards to some threshold level.

One could sum up the core concept that involves these three features by saying that critical thinking is careful goal-directed thinking. This core concept seems to apply to all the examples of critical thinking described in the previous section. As for the non-examples, their exclusion depends on construing careful thinking as excluding jumping immediately to conclusions, suspending judgment no matter how strong the evidence, reasoning from an unquestioned ideological or religious perspective, and routinely using an algorithm to answer a question.

If the core of critical thinking is careful goal-directed thinking, conceptions of it can vary according to its presumed scope, its presumed goal, one’s criteria and threshold for being careful, and the thinking component on which one focuses. As to its scope, some conceptions (e.g., Dewey 1910, 1933) restrict it to constructive thinking on the basis of one’s own observations and experiments, others (e.g., Ennis 1962; Fisher & Scriven 1997; Johnson 1992) to appraisal of the products of such thinking. Ennis (1991) and Bailin et al. (1999b) take it to cover both construction and appraisal. As to its goal, some conceptions restrict it to forming a judgment (Dewey 1910, 1933; Lipman 1987; Facione 1990a). Others allow for actions as well as beliefs as the end point of a process of critical thinking (Ennis 1991; Bailin et al. 1999b). As to the criteria and threshold for being careful, definitions vary in the term used to indicate that critical thinking satisfies certain norms: “intellectually disciplined” (Scriven & Paul 1987), “reasonable” (Ennis 1991), “skillful” (Lipman 1987), “skilled” (Fisher & Scriven 1997), “careful” (Bailin & Battersby 2009). Some definitions specify these norms, referring variously to “consideration of any belief or supposed form of knowledge in the light of the grounds that support it and the further conclusions to which it tends” (Dewey 1910, 1933); “the methods of logical inquiry and reasoning” (Glaser 1941); “conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication” (Scriven & Paul 1987); the requirement that “it is sensitive to context, relies on criteria, and is self-correcting” (Lipman 1987); “evidential, conceptual, methodological, criteriological, or contextual considerations” (Facione 1990a); and “plus-minus considerations of the product in terms of appropriate standards (or criteria)” (Johnson 1992). 
Stanovich and Stanovich (2010) propose to ground the concept of critical thinking in the concept of rationality, which they understand as combining epistemic rationality (fitting one’s beliefs to the world) and instrumental rationality (optimizing goal fulfillment); a critical thinker, in their view, is someone with “a propensity to override suboptimal responses from the autonomous mind” (2010: 227). These variant specifications of norms for critical thinking are not necessarily incompatible with one another, and in any case presuppose the core notion of thinking carefully. As to the thinking component singled out, some definitions focus on suspension of judgment during the thinking (Dewey 1910; McPeck 1981), others on inquiry while judgment is suspended (Bailin & Battersby 2009, 2021), others on the resulting judgment (Facione 1990a), and still others on responsiveness to reasons (Siegel 1988). Kuhn (2019) takes critical thinking to be more a dialogic practice of advancing and responding to arguments than an individual ability.

In educational contexts, a definition of critical thinking is a “programmatic definition” (Scheffler 1960: 19). It expresses a practical program for achieving an educational goal. For this purpose, a one-sentence formulaic definition is much less useful than articulation of a critical thinking process, with criteria and standards for the kinds of thinking that the process may involve. The real educational goal is recognition, adoption and implementation by students of those criteria and standards. That adoption and implementation in turn consists in acquiring the knowledge, abilities and dispositions of a critical thinker.

Conceptions of critical thinking generally do not include moral integrity as part of the concept. Dewey, for example, took critical thinking to be the ultimate intellectual goal of education, but distinguished it from the development of social cooperation among school children, which he took to be the central moral goal. Ennis (1996, 2011) added to his previous list of critical thinking dispositions a group of dispositions to care about the dignity and worth of every person, which he described as a “correlative” (1996) disposition without which critical thinking would be less valuable and perhaps harmful. An educational program that aimed at developing critical thinking but not the correlative disposition to care about the dignity and worth of every person, he asserted, “would be deficient and perhaps dangerous” (Ennis 1996: 172).

Dewey thought that education for reflective thinking would be of value to both the individual and society; recognition in educational practice of the kinship to the scientific attitude of children’s native curiosity, fertile imagination and love of experimental inquiry “would make for individual happiness and the reduction of social waste” (Dewey 1910: iii). Schools participating in the Eight-Year Study took development of the habit of reflective thinking and skill in solving problems as a means to leading young people to understand, appreciate and live the democratic way of life characteristic of the United States (Aikin 1942: 17–18, 81). Harvey Siegel (1988: 55–61) has offered four considerations in support of adopting critical thinking as an educational ideal. (1) Respect for persons requires that schools and teachers honour students’ demands for reasons and explanations, deal with students honestly, and recognize the need to confront students’ independent judgment; these requirements concern the manner in which teachers treat students. (2) Education has the task of preparing children to be successful adults, a task that requires development of their self-sufficiency. (3) Education should initiate children into the rational traditions in such fields as history, science and mathematics. (4) Education should prepare children to become democratic citizens, which requires reasoned procedures and critical talents and attitudes. To supplement these considerations, Siegel (1988: 62–90) responds to two objections: the ideology objection that adoption of any educational ideal requires a prior ideological commitment and the indoctrination objection that cultivation of critical thinking cannot escape being a form of indoctrination.

Despite the diversity of our 11 examples, one can recognize a common pattern. Dewey analyzed it as consisting of five phases:

  • suggestions , in which the mind leaps forward to a possible solution;
  • an intellectualization of the difficulty or perplexity into a problem to be solved, a question for which the answer must be sought;
  • the use of one suggestion after another as a leading idea, or hypothesis , to initiate and guide observation and other operations in collection of factual material;
  • the mental elaboration of the idea or supposition as an idea or supposition ( reasoning , in the sense in which reasoning is a part, not the whole, of inference); and
  • testing the hypothesis by overt or imaginative action. (Dewey 1933: 106–107; italics in original)

The process of reflective thinking consisting of these phases would be preceded by a perplexed, troubled or confused situation and followed by a cleared-up, unified, resolved situation (Dewey 1933: 106). The term ‘phases’ replaced the term ‘steps’ (Dewey 1910: 72), thus removing the earlier suggestion of an invariant sequence. Variants of the above analysis appeared in (Dewey 1916: 177) and (Dewey 1938: 101–119).

The variant formulations indicate the difficulty of giving a single logical analysis of such a varied process. The process of critical thinking may have a spiral pattern, with the problem being redefined in the light of obstacles to solving it as originally formulated. For example, the person in Transit might have concluded that getting to the appointment at the scheduled time was impossible and have reformulated the problem as that of rescheduling the appointment for a mutually convenient time. Further, defining a problem does not always follow after or lead immediately to an idea of a suggested solution. Nor should it do so, as Dewey himself recognized in describing the physician in Typhoid as avoiding any strong preference for this or that conclusion before getting further information (Dewey 1910: 85; 1933: 170). People with a hypothesis in mind, even one to which they have a very weak commitment, have a so-called “confirmation bias” (Nickerson 1998): they are likely to pay attention to evidence that confirms the hypothesis and to ignore evidence that counts against it or for some competing hypothesis. Detectives, intelligence agencies, and investigators of airplane accidents are well advised to gather relevant evidence systematically and to postpone even tentative adoption of an explanatory hypothesis until the collected evidence rules out with the appropriate degree of certainty all but one explanation. Dewey’s analysis of the critical thinking process can be faulted as well for requiring acceptance or rejection of a possible solution to a defined problem, with no allowance for deciding in the light of the available evidence to suspend judgment. Further, given the great variety of kinds of problems for which reflection is appropriate, there is likely to be variation in its component events. 
Perhaps the best way to conceptualize the critical thinking process is as a checklist whose component events can occur in a variety of orders, selectively, and more than once. These component events might include (1) noticing a difficulty, (2) defining the problem, (3) dividing the problem into manageable sub-problems, (4) formulating a variety of possible solutions to the problem or sub-problem, (5) determining what evidence is relevant to deciding among possible solutions to the problem or sub-problem, (6) devising a plan of systematic observation or experiment that will uncover the relevant evidence, (7) carrying out the plan of systematic observation or experimentation, (8) noting the results of the systematic observation or experiment, (9) gathering relevant testimony and information from others, (10) judging the credibility of testimony and information gathered from others, (11) drawing conclusions from gathered evidence and accepted testimony, and (12) accepting a solution that the evidence adequately supports (cf. Hitchcock 2017: 485).

Checklist conceptions of the process of critical thinking are open to the objection that they are too mechanical and procedural to fit the multi-dimensional and emotionally charged issues for which critical thinking is urgently needed (Paul 1984). For such issues, a more dialectical process is advocated, in which competing relevant world views are identified, their implications explored, and some sort of creative synthesis attempted.

If one considers the critical thinking process illustrated by the 11 examples, one can identify distinct kinds of mental acts and mental states that form part of it. To distinguish, label and briefly characterize these components is a useful preliminary to identifying abilities, skills, dispositions, attitudes, habits and the like that contribute causally to thinking critically. Identifying such abilities and habits is in turn a useful preliminary to setting educational goals. Setting the goals is in its turn a useful preliminary to designing strategies for helping learners to achieve the goals and to designing ways of measuring the extent to which learners have done so. Such measures provide both feedback to learners on their achievement and a basis for experimental research on the effectiveness of various strategies for educating people to think critically. Let us begin, then, by distinguishing the kinds of mental acts and mental events that can occur in a critical thinking process.

  • Observing : One notices something in one’s immediate environment (sudden cooling of temperature in Weather , bubbles forming outside a glass and then going inside in Bubbles , a moving blur in the distance in Blur , a rash in Rash ). Or one notes the results of an experiment or systematic observation (valuables missing in Disorder , no suction without air pressure in Suction pump ).
  • Feeling : One feels puzzled or uncertain about something (how to get to an appointment on time in Transit , why the diamonds vary in spacing in Diamond ). One wants to resolve this perplexity. One feels satisfaction once one has worked out an answer (to take the subway express in Transit , diamonds closer when needed as a warning in Diamond ).
  • Wondering : One formulates a question to be addressed (why bubbles form outside a tumbler taken from hot water in Bubbles , how suction pumps work in Suction pump , what caused the rash in Rash ).
  • Imagining : One thinks of possible answers (bus or subway or elevated in Transit , flagpole or ornament or wireless communication aid or direction indicator in Ferryboat , allergic reaction or heat rash in Rash ).
  • Inferring : One works out what would be the case if a possible answer were assumed (valuables missing if there has been a burglary in Disorder , earlier start to the rash if it is an allergic reaction to a sulfa drug in Rash ). Or one draws a conclusion once sufficient relevant evidence is gathered (take the subway in Transit , burglary in Disorder , discontinue blood pressure medication and new cream in Rash ).
  • Knowledge : One uses stored knowledge of the subject-matter to generate possible answers or to infer what would be expected on the assumption of a particular answer (knowledge of a city’s public transit system in Transit , of the requirements for a flagpole in Ferryboat , of Boyle’s law in Bubbles , of allergic reactions in Rash ).
  • Experimenting : One designs and carries out an experiment or a systematic observation to find out whether the results deduced from a possible answer will occur (looking at the location of the flagpole in relation to the pilot’s position in Ferryboat , putting an ice cube on top of a tumbler taken from hot water in Bubbles , measuring the height to which a suction pump will draw water at different elevations in Suction pump , noticing the spacing of diamonds when movement to or from a diamond lane is allowed in Diamond ).
  • Consulting : One finds a source of information, gets the information from the source, and makes a judgment on whether to accept it. None of our 11 examples include searching for sources of information. In this respect they are unrepresentative, since most people nowadays have almost instant access to information relevant to answering any question, including many of those illustrated by the examples. However, Candidate includes the activities of extracting information from sources and evaluating its credibility.
  • Identifying and analyzing arguments : One notices an argument and works out its structure and content as a preliminary to evaluating its strength. This activity is central to Candidate . It is an important part of a critical thinking process in which one surveys arguments for various positions on an issue.
  • Judging : One makes a judgment on the basis of accumulated evidence and reasoning, such as the judgment in Ferryboat that the purpose of the pole is to provide direction to the pilot.
  • Deciding : One makes a decision on what to do or on what policy to adopt, as in the decision in Transit to take the subway.

By definition, a person who does something voluntarily is both willing and able to do that thing at that time. Both the willingness and the ability contribute causally to the person’s action, in the sense that the voluntary action would not occur if either (or both) of these were lacking. For example, suppose that one is standing with one’s arms at one’s sides and one voluntarily lifts one’s right arm to an extended horizontal position. One would not do so if one were unable to lift one’s arm, if for example one’s right side was paralyzed as the result of a stroke. Nor would one do so if one were unwilling to lift one’s arm, if for example one were participating in a street demonstration at which a white supremacist was urging the crowd to lift their right arm in a Nazi salute and one were unwilling to express support in this way for the racist Nazi ideology. The same analysis applies to a voluntary mental process of thinking critically. It requires both willingness and ability to think critically, including willingness and ability to perform each of the mental acts that compose the process and to coordinate those acts in a sequence that is directed at resolving the initiating perplexity.

Consider willingness first. We can identify causal contributors to willingness to think critically by considering factors that would cause a person who was able to think critically about an issue nevertheless not to do so (Hamby 2014). For each factor, the opposite condition thus contributes causally to willingness to think critically on a particular occasion. For example, people who habitually jump to conclusions without considering alternatives will not think critically about issues that arise, even if they have the required abilities. The contrary condition of willingness to suspend judgment is thus a causal contributor to thinking critically.

Now consider ability. In contrast to the ability to move one’s arm, which can be completely absent because a stroke has left the arm paralyzed, the ability to think critically is a developed ability, whose absence is not a complete absence of ability to think but absence of ability to think well. We can identify the ability to think well directly, in terms of the norms and standards for good thinking. In general, to be able to do well the thinking activities that can be components of a critical thinking process, one needs to know the concepts and principles that characterize their good performance, to recognize in particular cases that the concepts and principles apply, and to apply them. The knowledge, recognition and application may be procedural rather than declarative. It may be domain-specific rather than widely applicable, and in either case may need subject-matter knowledge, sometimes of a deep kind.

Reflections of the sort illustrated by the previous two paragraphs have led scholars to identify the knowledge, abilities and dispositions of a “critical thinker”, i.e., someone who thinks critically whenever it is appropriate to do so. We turn now to these three types of causal contributors to thinking critically. We start with dispositions, since arguably these are the most powerful contributors to being a critical thinker, can be fostered at an early stage of a child’s development, and are susceptible to general improvement (Glaser 1941: 175).

8. Critical Thinking Dispositions

Educational researchers use the term ‘dispositions’ broadly for the habits of mind and attitudes that contribute causally to being a critical thinker. Some writers (e.g., Paul & Elder 2006; Hamby 2014; Bailin & Battersby 2016a) propose to use the term ‘virtues’ for this dimension of a critical thinker. The virtues in question, although they are virtues of character, concern the person’s ways of thinking rather than the person’s ways of behaving towards others. They are not moral virtues but intellectual virtues, of the sort articulated by Zagzebski (1996) and discussed by Turri, Alfano, and Greco (2017).

On a realistic conception, thinking dispositions or intellectual virtues are real properties of thinkers. They are general tendencies, propensities, or inclinations to think in particular ways in particular circumstances, and can be genuinely explanatory (Siegel 1999). Sceptics argue that there is no evidence for a specific mental basis for the habits of mind that contribute to thinking critically, and that it is pedagogically misleading to posit such a basis (Bailin et al. 1999a). Whatever their status, critical thinking dispositions need motivation for their initial formation in a child—motivation that may be external or internal. As children develop, the force of habit will gradually become important in sustaining the disposition (Nieto & Valenzuela 2012). Mere force of habit, however, is unlikely to sustain critical thinking dispositions. Critical thinkers must value and enjoy using their knowledge and abilities to think things through for themselves. They must be committed to, and lovers of, inquiry.

A person may have a critical thinking disposition with respect to only some kinds of issues. For example, one could be open-minded about scientific issues but not about religious issues. Similarly, one could be confident in one’s ability to reason about the theological implications of the existence of evil in the world but not in one’s ability to reason about the best design for a guided ballistic missile.

Facione (1990a: 25) divides “affective dispositions” of critical thinking into approaches to life and living in general and approaches to specific issues, questions or problems. Adapting this distinction, one can usefully divide critical thinking dispositions into initiating dispositions (those that contribute causally to starting to think critically about an issue) and internal dispositions (those that contribute causally to doing a good job of thinking critically once one has started). The two categories are not mutually exclusive. For example, open-mindedness, in the sense of willingness to consider alternative points of view to one’s own, is both an initiating and an internal disposition.

Using the strategy of considering factors that would block people with the ability to think critically from doing so, we can identify as initiating dispositions for thinking critically attentiveness, a habit of inquiry, self-confidence, courage, open-mindedness, willingness to suspend judgment, trust in reason, wanting evidence for one’s beliefs, and seeking the truth. We consider briefly what each of these dispositions amounts to, in each case citing sources that acknowledge them.

  • Attentiveness : One will not think critically if one fails to recognize an issue that needs to be thought through. For example, the pedestrian in Weather would not have looked up if he had not noticed that the air was suddenly cooler. To be a critical thinker, then, one needs to be habitually attentive to one’s surroundings, noticing not only what one senses but also sources of perplexity in messages received and in one’s own beliefs and attitudes (Facione 1990a: 25; Facione, Facione, & Giancarlo 2001).
  • Habit of inquiry : Inquiry is effortful, and one needs an internal push to engage in it. For example, the student in Bubbles could easily have stopped at idle wondering about the cause of the bubbles rather than reasoning to a hypothesis, then designing and executing an experiment to test it. Thus willingness to think critically needs mental energy and initiative. What can supply that energy? Love of inquiry, or perhaps just a habit of inquiry. Hamby (2015) has argued that willingness to inquire is the central critical thinking virtue, one that encompasses all the others. It is recognized as a critical thinking disposition by Dewey (1910: 29; 1933: 35), Glaser (1941: 5), Ennis (1987: 12; 1991: 8), Facione (1990a: 25), Bailin et al. (1999b: 294), Halpern (1998: 452), and Facione, Facione, & Giancarlo (2001).
  • Self-confidence : Lack of confidence in one’s abilities can block critical thinking. For example, if the woman in Rash lacked confidence in her ability to figure things out for herself, she might just have assumed that the rash on her chest was the allergic reaction to her medication against which the pharmacist had warned her. Thus willingness to think critically requires confidence in one’s ability to inquire (Facione 1990a: 25; Facione, Facione, & Giancarlo 2001).
  • Courage : Fear of thinking for oneself can stop one from doing it. Thus willingness to think critically requires intellectual courage (Paul & Elder 2006: 16).
  • Open-mindedness : A dogmatic attitude will impede thinking critically. For example, a person who adheres rigidly to a “pro-choice” position on the issue of the legal status of induced abortion is likely to be unwilling to consider seriously the issue of when in its development an unborn child acquires a moral right to life. Thus willingness to think critically requires open-mindedness, in the sense of a willingness to examine questions to which one already accepts an answer but which further evidence or reasoning might cause one to answer differently (Dewey 1933; Facione 1990a; Ennis 1991; Bailin et al. 1999b; Halpern 1998, Facione, Facione, & Giancarlo 2001). Paul (1981) emphasizes open-mindedness about alternative world-views, and recommends a dialectical approach to integrating such views as central to what he calls “strong sense” critical thinking. In three studies, Haran, Ritov, & Mellers (2013) found that actively open-minded thinking, including “the tendency to weigh new evidence against a favored belief, to spend sufficient time on a problem before giving up, and to consider carefully the opinions of others in forming one’s own”, led study participants to acquire information and thus to make accurate estimations.
  • Willingness to suspend judgment : Premature closure on an initial solution will block critical thinking. Thus willingness to think critically requires a willingness to suspend judgment while alternatives are explored (Facione 1990a; Ennis 1991; Halpern 1998).
  • Trust in reason : Since distrust in the processes of reasoned inquiry will dissuade one from engaging in it, trust in them is an initiating critical thinking disposition (Facione 1990a: 25; Bailin et al. 1999b: 294; Facione, Facione, & Giancarlo 2001; Paul & Elder 2006). In reaction to an allegedly exclusive emphasis on reason in critical thinking theory and pedagogy, Thayer-Bacon (2000) argues that intuition, imagination, and emotion have important roles to play in an adequate conception of critical thinking that she calls “constructive thinking”. From her point of view, critical thinking requires trust not only in reason but also in intuition, imagination, and emotion.
  • Seeking the truth : If one does not care about the truth but is content to stick with one’s initial bias on an issue, then one will not think critically about it. Seeking the truth is thus an initiating critical thinking disposition (Bailin et al. 1999b: 294; Facione, Facione, & Giancarlo 2001). A disposition to seek the truth is implicit in more specific critical thinking dispositions, such as trying to be well-informed, considering seriously points of view other than one’s own, looking for alternatives, suspending judgment when the evidence is insufficient, and adopting a position when the evidence supporting it is sufficient.

Some of the initiating dispositions, such as open-mindedness and willingness to suspend judgment, are also internal critical thinking dispositions, in the sense of mental habits or attitudes that contribute causally to doing a good job of critical thinking once one starts the process. But there are many other internal critical thinking dispositions. Some of them are parasitic on one’s conception of good thinking. For example, it is constitutive of good thinking about an issue to formulate the issue clearly and to maintain focus on it. For this purpose, one needs not only the corresponding ability but also the corresponding disposition. Ennis (1991: 8) describes it as the disposition “to determine and maintain focus on the conclusion or question”, Facione (1990a: 25) as “clarity in stating the question or concern”. Other internal dispositions are motivators to continue or adjust the critical thinking process, such as willingness to persist in a complex task and willingness to abandon nonproductive strategies in an attempt to self-correct (Halpern 1998: 452). For a list of identified internal critical thinking dispositions, see the Supplement on Internal Critical Thinking Dispositions .

Some theorists postulate skills, i.e., acquired abilities, as operative in critical thinking. It is not obvious, however, that a good mental act is the exercise of a generic acquired skill. Inferring an expected time of arrival, as in Transit , has some generic components but also uses non-generic subject-matter knowledge. Bailin et al. (1999a) argue against viewing critical thinking skills as generic and discrete, on the ground that skilled performance at a critical thinking task cannot be separated from knowledge of concepts and from domain-specific principles of good thinking. Talk of skills, they concede, is unproblematic if it means merely that a person with critical thinking skills is capable of intelligent performance.

Despite such scepticism, theorists of critical thinking have listed as general contributors to critical thinking what they variously call abilities (Glaser 1941; Ennis 1962, 1991), skills (Facione 1990a; Halpern 1998) or competencies (Fisher & Scriven 1997). Amalgamating these lists would produce a confusing and chaotic cornucopia of more than 50 possible educational objectives, with only partial overlap among them. It makes sense instead to try to understand the reasons for the multiplicity and diversity, and to make a selection according to one’s own reasons for singling out abilities to be developed in a critical thinking curriculum. Two reasons for diversity among lists of critical thinking abilities are the underlying conception of critical thinking and the envisaged educational level. Appraisal-only conceptions, for example, involve a different suite of abilities than constructive-only conceptions. Some lists, such as Glaser’s (1941), are put forward as educational objectives for secondary school students, whereas others are proposed as objectives for college students (e.g., Facione 1990a).

The abilities described in the remaining paragraphs of this section emerge from reflection on the general abilities needed to do well the thinking activities identified in section 6 as components of the critical thinking process described in section 5 . The derivation of each collection of abilities is accompanied by citation of sources that list such abilities and of standardized tests that claim to test them.

Observational abilities : Careful and accurate observation sometimes requires specialist expertise and practice, as in the case of observing birds and observing accident scenes. However, there are general abilities of noticing what one’s senses are picking up from one’s environment and of being able to articulate clearly and accurately to oneself and others what one has observed. It helps in exercising them to be able to recognize and take into account factors that make one’s observation less trustworthy, such as prior framing of the situation, inadequate time, deficient senses, poor observation conditions, and the like. It helps as well to be skilled at taking steps to make one’s observation more trustworthy, such as moving closer to get a better look, measuring something three times and taking the average, and checking what one thinks one is observing with someone else who is in a good position to observe it. It also helps to be skilled at recognizing respects in which one’s report of one’s observation involves inference rather than direct observation, so that one can then consider whether the inference is justified. These abilities come into play as well when one thinks about whether and with what degree of confidence to accept an observation report, for example in the study of history or in a criminal investigation or in assessing news reports. Observational abilities show up in some lists of critical thinking abilities (Ennis 1962: 90; Facione 1990a: 16; Ennis 1991: 9). There are items testing a person’s ability to judge the credibility of observation reports in the Cornell Critical Thinking Tests, Levels X and Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005). Norris and King (1983, 1985, 1990a, 1990b) is a test of ability to appraise observation reports.

Emotional abilities : The emotions that drive a critical thinking process are perplexity or puzzlement, a wish to resolve it, and satisfaction at achieving the desired resolution. Children experience these emotions at an early age, without being trained to do so. Education that takes critical thinking as a goal needs only to channel these emotions and to make sure not to stifle them. Collaborative critical thinking benefits from ability to recognize one’s own and others’ emotional commitments and reactions.

Questioning abilities : A critical thinking process needs transformation of an inchoate sense of perplexity into a clear question. Formulating a question well requires not building in questionable assumptions, not prejudging the issue, and using language that in context is unambiguous and precise enough (Ennis 1962: 97; 1991: 9).

Imaginative abilities : Thinking directed at finding the correct causal explanation of a general phenomenon or particular event requires an ability to imagine possible explanations. Thinking about what policy or plan of action to adopt requires generation of options and consideration of possible consequences of each option. Domain knowledge is required for such creative activity, but a general ability to imagine alternatives is helpful and can be nurtured so as to become easier, quicker, more extensive, and deeper (Dewey 1910: 34–39; 1933: 40–47). Facione (1990a) and Halpern (1998) include the ability to imagine alternatives as a critical thinking ability.

Inferential abilities : The ability to draw conclusions from given information, and to recognize with what degree of certainty one’s own or others’ conclusions follow, is universally recognized as a general critical thinking ability. All 11 examples in section 2 of this article include inferences, some from hypotheses or options (as in Transit , Ferryboat and Disorder ), others from something observed (as in Weather and Rash ). None of these inferences is formally valid. Rather, they are licensed by general, sometimes qualified substantive rules of inference (Toulmin 1958) that rest on domain knowledge—that a bus trip takes about the same time in each direction, that the terminal of a wireless telegraph would be located on the highest possible place, that sudden cooling is often followed by rain, that an allergic reaction to a sulfa drug generally shows up soon after one starts taking it. It is a matter of controversy to what extent the specialized ability to deduce conclusions from premisses using formal rules of inference is needed for critical thinking. Dewey (1933) locates logical forms in setting out the products of reflection rather than in the process of reflection. Ennis (1981a), on the other hand, maintains that a liberally-educated person should have the following abilities: to translate natural-language statements into statements using the standard logical operators, to use appropriately the language of necessary and sufficient conditions, to deal with argument forms and arguments containing symbols, to determine whether in virtue of an argument’s form its conclusion follows necessarily from its premisses, to reason with logically complex propositions, and to apply the rules and procedures of deductive logic. Inferential abilities are recognized as critical thinking abilities by Glaser (1941: 6), Facione (1990a: 9), Ennis (1991: 9), Fisher & Scriven (1997: 99, 111), and Halpern (1998: 452). 
Items testing inferential abilities constitute two of the five subtests of the Watson Glaser Critical Thinking Appraisal (Watson & Glaser 1980a, 1980b, 1994), two of the four sections in the Cornell Critical Thinking Test Level X (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), three of the seven sections in the Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), 11 of the 34 items on Forms A and B of the California Critical Thinking Skills Test (Facione 1990b, 1992), and a high but variable proportion of the 25 selected-response questions in the Collegiate Learning Assessment (Council for Aid to Education 2017).

Experimenting abilities : Knowing how to design and execute an experiment is important not just in scientific research but also in everyday life, as in Rash . Dewey devoted a whole chapter of his How We Think (1910: 145–156; 1933: 190–202) to the superiority of experimentation over observation in advancing knowledge. Experimenting abilities come into play at one remove in appraising reports of scientific studies. Skill in designing and executing experiments includes the acknowledged abilities to appraise evidence (Glaser 1941: 6), to carry out experiments and to apply appropriate statistical inference techniques (Facione 1990a: 9), to judge inductions to an explanatory hypothesis (Ennis 1991: 9), and to recognize the need for an adequately large sample size (Halpern 1998). The Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005) includes four items (out of 52) on experimental design. The Collegiate Learning Assessment (Council for Aid to Education 2017) makes room for appraisal of study design in both its performance task and its selected-response questions.

Consulting abilities : Skill at consulting sources of information comes into play when one seeks information to help resolve a problem, as in Candidate . Ability to find and appraise information includes ability to gather and marshal pertinent information (Glaser 1941: 6), to judge whether a statement made by an alleged authority is acceptable (Ennis 1962: 84), to plan a search for desired information (Facione 1990a: 9), and to judge the credibility of a source (Ennis 1991: 9). Ability to judge the credibility of statements is tested by 24 items (out of 76) in the Cornell Critical Thinking Test Level X (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005) and by four items (out of 52) in the Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005). The College Learning Assessment’s performance task requires evaluation of whether information in documents is credible or unreliable (Council for Aid to Education 2017).

Argument analysis abilities : The ability to identify and analyze arguments contributes to the process of surveying arguments on an issue in order to form one’s own reasoned judgment, as in Candidate . The ability to detect and analyze arguments is recognized as a critical thinking skill by Facione (1990a: 7–8), Ennis (1991: 9) and Halpern (1998). Five items (out of 34) on the California Critical Thinking Skills Test (Facione 1990b, 1992) test skill at argument analysis. The College Learning Assessment (Council for Aid to Education 2017) incorporates argument analysis in its selected-response tests of critical reading and evaluation and of critiquing an argument.

Judging skills and deciding skills : Skill at judging and deciding is skill at recognizing what judgment or decision the available evidence and argument supports, and with what degree of confidence. It is thus a component of the inferential skills already discussed.

Lists and tests of critical thinking abilities often include two more abilities: identifying assumptions and constructing and evaluating definitions.

In addition to dispositions and abilities, critical thinking needs knowledge: of critical thinking concepts, of critical thinking principles, and of the subject-matter of the thinking.

We can derive a short list of concepts whose understanding contributes to critical thinking from the critical thinking abilities described in the preceding section. Observational abilities require an understanding of the difference between observation and inference. Questioning abilities require an understanding of the concepts of ambiguity and vagueness. Inferential abilities require an understanding of the difference between conclusive and defeasible inference (traditionally, between deduction and induction), as well as of the difference between necessary and sufficient conditions. Experimenting abilities require an understanding of the concepts of hypothesis, null hypothesis, assumption and prediction, as well as of the concept of statistical significance and of its difference from importance. They also require an understanding of the difference between an experiment and an observational study, and in particular of the difference between a randomized controlled trial, a prospective correlational study and a retrospective (case-control) study. Argument analysis abilities require an understanding of the concepts of argument, premiss, assumption, conclusion and counter-consideration. Additional critical thinking concepts are proposed by Bailin et al. (1999b: 293), Fisher & Scriven (1997: 105–106), Black (2012), and Blair (2021).
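The confusion of necessary and sufficient conditions mentioned above can be made concrete with a mechanical check. The following sketch (illustrative only; the function name `implies` and the variable names are mine, not drawn from the critical thinking literature) enumerates a truth table to confirm that “P is sufficient for Q” (P → Q) does not guarantee “P is necessary for Q” (Q → P):

```python
from itertools import product

def implies(a: bool, b: bool) -> bool:
    """Material conditional: a -> b is false only when a is true and b is false."""
    return (not a) or b

# Search the truth table for cases where P -> Q holds but Q -> P fails.
# Any such case shows that sufficiency does not entail necessity.
counterexamples = [(p, q) for p, q in product([False, True], repeat=2)
                   if implies(p, q) and not implies(q, p)]

# The single counterexample is P false, Q true: e.g., "being a square" is
# sufficient for "being a rectangle", but a non-square rectangle shows it
# is not necessary.
```

The one counterexample found, (False, True), is exactly the case a critical thinker must watch for when evaluating claims of the form “if P then Q”.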

According to Glaser (1941: 25), ability to think critically requires knowledge of the methods of logical inquiry and reasoning. If we review the list of abilities in the preceding section, however, we can see that some of them can be acquired and exercised merely through practice, possibly guided in an educational setting, followed by feedback. Searching intelligently for a causal explanation of some phenomenon or event requires that one consider a full range of possible causal contributors, but it seems more important that one implements this principle in one’s practice than that one is able to articulate it. What is important is “operational knowledge” of the standards and principles of good thinking (Bailin et al. 1999b: 291–293). But the development of such critical thinking abilities as designing an experiment or constructing an operational definition can benefit from learning their underlying theory. Further, explicit knowledge of quirks of human thinking seems useful as a cautionary guide. Human memory is not just fallible about details, as people learn from their own experiences of misremembering, but is so malleable that a detailed, clear and vivid recollection of an event can be a total fabrication (Loftus 2017). People seek or interpret evidence in ways that are partial to their existing beliefs and expectations, often unconscious of their “confirmation bias” (Nickerson 1998). Not only are people subject to this and other cognitive biases (Kahneman 2011), of which they are typically unaware, but it may be counter-productive for one to make oneself aware of them and try consciously to counteract them or to counteract social biases such as racial or sexual stereotypes (Kenyon & Beaulac 2014). 
It is helpful to be aware of these facts and of the superior effectiveness of blocking the operation of biases—for example, by making an immediate record of one’s observations, refraining from forming a preliminary explanatory hypothesis, blind refereeing, double-blind randomized trials, and blind grading of students’ work. It is also helpful to be aware of the prevalence of “noise” (unwanted unsystematic variability of judgments), of how to detect noise (through a noise audit), and of how to reduce noise: make accuracy the goal, think statistically, break a process of arriving at a judgment into independent tasks, resist premature intuitions, in a group get independent judgments first, favour comparative judgments and scales (Kahneman, Sibony, & Sunstein 2021). It is helpful as well to be aware of the concept of “bounded rationality” in decision-making and of the related distinction between “satisficing” and optimizing (Simon 1956; Gigerenzer 2001).

Critical thinking about an issue requires substantive knowledge of the domain to which the issue belongs. Critical thinking abilities are not a magic elixir that can be applied to any issue whatever by somebody who has no knowledge of the facts relevant to exploring that issue. For example, the student in Bubbles needed to know that gases do not penetrate solid objects like a glass, that air expands when heated, that the volume of an enclosed gas varies directly with its temperature and inversely with its pressure, and that hot objects will spontaneously cool down to the ambient temperature of their surroundings unless kept hot by insulation or a source of heat. Critical thinkers thus need a rich fund of subject-matter knowledge relevant to the variety of situations they encounter. This fact is recognized in the inclusion among critical thinking dispositions of a concern to become and remain generally well informed.

Experimental educational interventions, with control groups, have shown that education can improve critical thinking skills and dispositions, as measured by standardized tests. For information about these tests, see the Supplement on Assessment .

What educational methods are most effective at developing the dispositions, abilities and knowledge of a critical thinker? In a comprehensive meta-analysis of experimental and quasi-experimental studies of strategies for teaching students to think critically, Abrami et al. (2015) found that dialogue, anchored instruction, and mentoring each increased the effectiveness of the educational intervention, and that they were most effective when combined. They also found that in these studies a combination of separate instruction in critical thinking with subject-matter instruction in which students are encouraged to think critically was more effective than either by itself. However, the difference was not statistically significant; that is, it might have arisen by chance.

Most of these studies lack the longitudinal follow-up required to determine whether the observed differential improvements in critical thinking abilities or dispositions continue over time, for example until high school or college graduation. For details on studies of methods of developing critical thinking skills and dispositions, see the Supplement on Educational Methods .

12. Controversies

Scholars have denied the generalizability of critical thinking abilities across subject domains, have alleged bias in critical thinking theory and pedagogy, and have investigated the relationship of critical thinking to other kinds of thinking.

McPeck (1981) attacked the thinking skills movement of the 1970s, including the critical thinking movement. He argued that there are no general thinking skills, since thinking is always thinking about some subject-matter. It is futile, he claimed, for schools and colleges to teach thinking as if it were a separate subject. Rather, teachers should lead their pupils to become autonomous thinkers by teaching school subjects in a way that brings out their cognitive structure and that encourages and rewards discussion and argument. As some of his critics (e.g., Paul 1985; Siegel 1985) pointed out, McPeck’s central argument needs elaboration, since it has obvious counter-examples in writing and speaking, for which (up to a certain level of complexity) there are teachable general abilities even though they are always about some subject-matter. To make his argument convincing, McPeck needs to explain how thinking differs from writing and speaking in a way that does not permit useful abstraction of its components from the subject-matters with which it deals. He has not done so. Nevertheless, his position that the dispositions and abilities of a critical thinker are best developed in the context of subject-matter instruction is shared by many theorists of critical thinking, including Dewey (1910, 1933), Glaser (1941), Passmore (1980), Weinstein (1990), Bailin et al. (1999b), and Willingham (2019).

McPeck’s challenge prompted reflection on the extent to which critical thinking is subject-specific. McPeck argued for a strong subject-specificity thesis, according to which it is a conceptual truth that all critical thinking abilities are specific to a subject. (He did not however extend his subject-specificity thesis to critical thinking dispositions. In particular, he took the disposition to suspend judgment in situations of cognitive dissonance to be a general disposition.) Conceptual subject-specificity is subject to obvious counter-examples, such as the general ability to recognize confusion of necessary and sufficient conditions. A more modest thesis, also endorsed by McPeck, is epistemological subject-specificity, according to which the norms of good thinking vary from one field to another. Epistemological subject-specificity clearly holds to a certain extent; for example, the principles in accordance with which one solves a differential equation are quite different from the principles in accordance with which one determines whether a painting is a genuine Picasso. But the thesis suffers, as Ennis (1989) points out, from vagueness of the concept of a field or subject and from the obvious existence of inter-field principles, however broadly the concept of a field is construed. For example, the principles of hypothetico-deductive reasoning hold for all the varied fields in which such reasoning occurs. A third kind of subject-specificity is empirical subject-specificity, according to which as a matter of empirically observable fact a person with the abilities and dispositions of a critical thinker in one area of investigation will not necessarily have them in another area of investigation.

The thesis of empirical subject-specificity raises the general problem of transfer. If critical thinking abilities and dispositions have to be developed independently in each school subject, how are they of any use in dealing with the problems of everyday life and the political and social issues of contemporary society, most of which do not fit into the framework of a traditional school subject? Proponents of empirical subject-specificity tend to argue that transfer is more likely to occur if there is critical thinking instruction in a variety of domains, with explicit attention to dispositions and abilities that cut across domains. But evidence for this claim is scanty. There is a need for well-designed empirical studies that investigate the conditions that make transfer more likely.

It is common ground in debates about the generality or subject-specificity of critical thinking dispositions and abilities that critical thinking about any topic requires background knowledge about the topic. For example, the most sophisticated understanding of the principles of hypothetico-deductive reasoning is of no help unless accompanied by some knowledge of what might be plausible explanations of some phenomenon under investigation.

Critics have objected to bias in the theory, pedagogy and practice of critical thinking. Commentators (e.g., Alston 1995; Ennis 1998) have noted that anyone who takes a position has a bias in the neutral sense of being inclined in one direction rather than others. The critics, however, are objecting to bias in the pejorative sense of an unjustified favouring of certain ways of knowing over others, frequently alleging that the unjustly favoured ways are those of a dominant sex or culture (Bailin 1995). These ways favour:

  • reinforcement of egocentric and sociocentric biases over dialectical engagement with opposing world-views (Paul 1981, 1984; Warren 1998)
  • distancing from the object of inquiry over closeness to it (Martin 1992; Thayer-Bacon 1992)
  • indifference to the situation of others over care for them (Martin 1992)
  • orientation to thought over orientation to action (Martin 1992)
  • being reasonable over caring to understand people’s ideas (Thayer-Bacon 1993)
  • being neutral and objective over being embodied and situated (Thayer-Bacon 1995a)
  • doubting over believing (Thayer-Bacon 1995b)
  • reason over emotion, imagination and intuition (Thayer-Bacon 2000)
  • solitary thinking over collaborative thinking (Thayer-Bacon 2000)
  • written and spoken assignments over other forms of expression (Alston 2001)
  • attention to written and spoken communications over attention to human problems (Alston 2001)
  • winning debates in the public sphere over making and understanding meaning (Alston 2001)

A common thread in this smorgasbord of accusations is dissatisfaction with focusing on the logical analysis and evaluation of reasoning and arguments. While these authors acknowledge that such analysis and evaluation is part of critical thinking and should be part of its conceptualization and pedagogy, they insist that it is only a part. Paul (1981), for example, bemoans the tendency of atomistic teaching of methods of analyzing and evaluating arguments to turn students into more able sophists, adept at finding fault with positions and arguments with which they disagree but even more entrenched in the egocentric and sociocentric biases with which they began. Martin (1992) and Thayer-Bacon (1992) cite with approval the self-reported intimacy with their subject-matter of leading researchers in biology and medicine, an intimacy that conflicts with the distancing allegedly recommended in standard conceptions and pedagogy of critical thinking. Thayer-Bacon (2000) contrasts the embodied and socially embedded learning of her elementary school students in a Montessori school, who used their imagination, intuition and emotions as well as their reason, with conceptions of critical thinking as

thinking that is used to critique arguments, offer justifications, and make judgments about what are the good reasons, or the right answers. (Thayer-Bacon 2000: 127–128)

Alston (2001) reports that her students in a women’s studies class were able to see the flaws in the Cinderella myth that pervades much romantic fiction but in their own romantic relationships still acted as if all failures were the woman’s fault and still accepted the notions of love at first sight and living happily ever after. Students, she writes, should

be able to connect their intellectual critique to a more affective, somatic, and ethical account of making risky choices that have sexist, racist, classist, familial, sexual, or other consequences for themselves and those both near and far… critical thinking that reads arguments, texts, or practices merely on the surface without connections to feeling/desiring/doing or action lacks an ethical depth that should infuse the difference between mere cognitive activity and something we want to call critical thinking. (Alston 2001: 34)

Some critics portray such biases as unfair to women. Thayer-Bacon (1992), for example, has charged modern critical thinking theory with being sexist, on the ground that it separates the self from the object and causes one to lose touch with one’s inner voice, and thus stigmatizes women, who (she asserts) link self to object and listen to their inner voice. Her charge does not imply that women as a group are on average less able than men to analyze and evaluate arguments. Facione (1990c) found no difference by sex in performance on his California Critical Thinking Skills Test. Kuhn (1991: 280–281) found no difference by sex in either the disposition or the competence to engage in argumentative thinking.

The critics propose a variety of remedies for the biases that they allege. In general, they do not propose to eliminate or downplay critical thinking as an educational goal. Rather, they propose to conceptualize critical thinking differently and to change its pedagogy accordingly. Their pedagogical proposals arise logically from their objections. They can be summarized as follows:

  • Focus on argument networks with dialectical exchanges reflecting contesting points of view rather than on atomic arguments, so as to develop “strong sense” critical thinking that transcends egocentric and sociocentric biases (Paul 1981, 1984).
  • Foster closeness to the subject-matter and feeling connected to others in order to inform a humane democracy (Martin 1992).
  • Develop “constructive thinking” as a social activity in a community of physically embodied and socially embedded inquirers with personal voices who value not only reason but also imagination, intuition and emotion (Thayer-Bacon 2000).
  • In developing critical thinking in school subjects, treat as important neither skills nor dispositions but opening worlds of meaning (Alston 2001).
  • Attend to the development of critical thinking dispositions as well as skills, and adopt the “critical pedagogy” practised and advocated by Freire (1968 [1970]) and hooks (1994) (Dalgleish, Girard, & Davies 2017).

A common thread in these proposals is treatment of critical thinking as a social, interactive, personally engaged activity like that of a quilting bee or a barn-raising (Thayer-Bacon 2000) rather than as an individual, solitary, distanced activity symbolized by Rodin’s The Thinker . One can get a vivid description of education with the former type of goal from the writings of bell hooks (1994, 2010). Critical thinking for her is open-minded dialectical exchange across opposing standpoints and from multiple perspectives, a conception similar to Paul’s “strong sense” critical thinking (Paul 1981). She abandons the structure of domination in the traditional classroom. In an introductory course on black women writers, for example, she assigns students to write an autobiographical paragraph about an early racial memory, then to read it aloud as the others listen, thus affirming the uniqueness and value of each voice and creating a communal awareness of the diversity of the group’s experiences (hooks 1994: 84). Her “engaged pedagogy” is thus similar to the “freedom under guidance” implemented in John Dewey’s Laboratory School of Chicago in the late 1890s and early 1900s. It incorporates the dialogue, anchored instruction, and mentoring that Abrami et al. (2015) found to be most effective in improving critical thinking skills and dispositions.

What is the relationship of critical thinking to problem solving, decision-making, higher-order thinking, creative thinking, and other recognized types of thinking? One’s answer to this question obviously depends on how one defines the terms used in the question. If critical thinking is conceived broadly to cover any careful thinking about any topic for any purpose, then problem solving and decision making will be kinds of critical thinking, if they are done carefully. Historically, ‘critical thinking’ and ‘problem solving’ were two names for the same thing. If critical thinking is conceived more narrowly as consisting solely of appraisal of intellectual products, then it will be disjoint with problem solving and decision making, which are constructive.

Bloom’s taxonomy of educational objectives used the phrase “intellectual abilities and skills” for what had been labeled “critical thinking” by some, “reflective thinking” by Dewey and others, and “problem solving” by still others (Bloom et al. 1956: 38). Thus, the so-called “higher-order thinking skills” at the taxonomy’s top levels of analysis, synthesis and evaluation are just critical thinking skills, although they do not come with general criteria for their assessment (Ennis 1981b). The revised version of Bloom’s taxonomy (Anderson et al. 2001) likewise treats critical thinking as cutting across those types of cognitive process that involve more than remembering (Anderson et al. 2001: 269–270). For details, see the Supplement on History .

As to creative thinking, it overlaps with critical thinking (Bailin 1987, 1988). Thinking about the explanation of some phenomenon or event, as in Ferryboat, requires creative imagination in constructing plausible explanatory hypotheses. Likewise, thinking about a policy question, as in Candidate, requires creativity in coming up with options. Conversely, creativity in any field needs to be balanced by critical appraisal of the draft painting or novel or mathematical theory.

  • Abrami, Philip C., Robert M. Bernard, Eugene Borokhovski, David I. Waddington, C. Anne Wade, and Tonje Person, 2015, “Strategies for Teaching Students to Think Critically: A Meta-analysis”, Review of Educational Research , 85(2): 275–314. doi:10.3102/0034654314551063
  • Aikin, Wilford M., 1942, The Story of the Eight-year Study, with Conclusions and Recommendations , Volume I of Adventure in American Education , New York and London: Harper & Brothers. [ Aikin 1942 available online ]
  • Alston, Kal, 1995, “Begging the Question: Is Critical Thinking Biased?”, Educational Theory , 45(2): 225–233. doi:10.1111/j.1741-5446.1995.00225.x
  • –––, 2001, “Re/Thinking Critical Thinking: The Seductions of Everyday Life”, Studies in Philosophy and Education , 20(1): 27–40. doi:10.1023/A:1005247128053
  • American Educational Research Association, 2014, Standards for Educational and Psychological Testing / American Educational Research Association, American Psychological Association, National Council on Measurement in Education , Washington, DC: American Educational Research Association.
  • Anderson, Lorin W., David R. Krathwohl, Peter W. Airiasian, Kathleen A. Cruikshank, Richard E. Mayer, Paul R. Pintrich, James Raths, and Merlin C. Wittrock, 2001, A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives , New York: Longman, complete edition.
  • Bailin, Sharon, 1987, “Critical and Creative Thinking”, Informal Logic , 9(1): 23–30. [ Bailin 1987 available online ]
  • –––, 1988, Achieving Extraordinary Ends: An Essay on Creativity , Dordrecht: Kluwer. doi:10.1007/978-94-009-2780-3
  • –––, 1995, “Is Critical Thinking Biased? Clarifications and Implications”, Educational Theory , 45(2): 191–197. doi:10.1111/j.1741-5446.1995.00191.x
  • Bailin, Sharon and Mark Battersby, 2009, “Inquiry: A Dialectical Approach to Teaching Critical Thinking”, in Juho Ritola (ed.), Argument Cultures: Proceedings of OSSA 09 , CD-ROM (pp. 1–10), Windsor, ON: OSSA. [ Bailin & Battersby 2009 available online ]
  • –––, 2016a, “Fostering the Virtues of Inquiry”, Topoi , 35(2): 367–374. doi:10.1007/s11245-015-9307-6
  • –––, 2016b, Reason in the Balance: An Inquiry Approach to Critical Thinking , Indianapolis: Hackett, 2nd edition.
  • –––, 2021, “Inquiry: Teaching for Reasoned Judgment”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 31–46. doi: 10.1163/9789004444591_003
  • Bailin, Sharon, Roland Case, Jerrold R. Coombs, and Leroi B. Daniels, 1999a, “Common Misconceptions of Critical Thinking”, Journal of Curriculum Studies , 31(3): 269–283. doi:10.1080/002202799183124
  • –––, 1999b, “Conceptualizing Critical Thinking”, Journal of Curriculum Studies , 31(3): 285–302. doi:10.1080/002202799183133
  • Blair, J. Anthony, 2021, Studies in Critical Thinking , Windsor, ON: Windsor Studies in Argumentation, 2nd edition. [Available online at https://windsor.scholarsportal.info/omp/index.php/wsia/catalog/book/106]
  • Berman, Alan M., Seth J. Schwartz, William M. Kurtines, and Steven L. Berman, 2001, “The Process of Exploration in Identity Formation: The Role of Style and Competence”, Journal of Adolescence , 24(4): 513–528. doi:10.1006/jado.2001.0386
  • Black, Beth (ed.), 2012, An A to Z of Critical Thinking , London: Continuum International Publishing Group.
  • Bloom, Benjamin Samuel, Max D. Engelhart, Edward J. Furst, Walter H. Hill, and David R. Krathwohl, 1956, Taxonomy of Educational Objectives. Handbook I: Cognitive Domain , New York: David McKay.
  • Boardman, Frank, Nancy M. Cavender, and Howard Kahane, 2018, Logic and Contemporary Rhetoric: The Use of Reason in Everyday Life , Boston: Cengage, 13th edition.
  • Browne, M. Neil and Stuart M. Keeley, 2018, Asking the Right Questions: A Guide to Critical Thinking , Hoboken, NJ: Pearson, 12th edition.
  • Center for Assessment & Improvement of Learning, 2017, Critical Thinking Assessment Test , Cookeville, TN: Tennessee Technological University.
  • Cleghorn, Paul, 2021, “Critical Thinking in the Elementary School: Practical Guidance for Building a Culture of Thinking”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 150–167. doi: 10.1163/9789004444591_010
  • Cohen, Jacob, 1988, Statistical Power Analysis for the Behavioral Sciences , Hillsdale, NJ: Lawrence Erlbaum Associates, 2nd edition.
  • College Board, 1983, Academic Preparation for College. What Students Need to Know and Be Able to Do , New York: College Entrance Examination Board, ERIC document ED232517.
  • Commission on the Relation of School and College of the Progressive Education Association, 1943, Thirty Schools Tell Their Story , Volume V of Adventure in American Education , New York and London: Harper & Brothers.
  • Council for Aid to Education, 2017, CLA+ Student Guide . Available at http://cae.org/images/uploads/pdf/CLA_Student_Guide_Institution.pdf ; last accessed 2022 07 16.
  • Dalgleish, Adam, Patrick Girard, and Maree Davies, 2017, “Critical Thinking, Bias and Feminist Philosophy: Building a Better Framework through Collaboration”, Informal Logic , 37(4): 351–369. [ Dalgleish et al. available online ]
  • Dewey, John, 1910, How We Think , Boston: D.C. Heath. [ Dewey 1910 available online ]
  • –––, 1916, Democracy and Education: An Introduction to the Philosophy of Education , New York: Macmillan.
  • –––, 1933, How We Think: A Restatement of the Relation of Reflective Thinking to the Educative Process , Lexington, MA: D.C. Heath.
  • –––, 1936, “The Theory of the Chicago Experiment”, Appendix II of Mayhew & Edwards 1936: 463–477.
  • –––, 1938, Logic: The Theory of Inquiry , New York: Henry Holt and Company.
  • Dominguez, Caroline (coord.), 2018a, A European Collection of the Critical Thinking Skills and Dispositions Needed in Different Professional Fields for the 21st Century , Vila Real, Portugal: UTAD. Available at http://bit.ly/CRITHINKEDUO1 ; last accessed 2022 07 16.
  • ––– (coord.), 2018b, A European Review on Critical Thinking Educational Practices in Higher Education Institutions , Vila Real: UTAD. Available at http://bit.ly/CRITHINKEDUO2 ; last accessed 2022 07 16.
  • ––– (coord.), 2018c, The CRITHINKEDU European Course on Critical Thinking Education for University Teachers: From Conception to Delivery , Vila Real: UTAD. Available at http://bit.ly/CRITHINKEDU03; last accessed 2022 07 16.
  • Dominguez, Caroline and Rita Payan-Carreira (eds.), 2019, Promoting Critical Thinking in European Higher Education Institutions: Towards an Educational Protocol , Vila Real: UTAD. Available at http://bit.ly/CRITHINKEDU04; last accessed 2022 07 16.
  • Ennis, Robert H., 1958, “An Appraisal of the Watson-Glaser Critical Thinking Appraisal”, The Journal of Educational Research , 52(4): 155–158. doi:10.1080/00220671.1958.10882558
  • –––, 1962, “A Concept of Critical Thinking: A Proposed Basis for Research on the Teaching and Evaluation of Critical Thinking Ability”, Harvard Educational Review , 32(1): 81–111.
  • –––, 1981a, “A Conception of Deductive Logical Competence”, Teaching Philosophy , 4(3/4): 337–385. doi:10.5840/teachphil198143/429
  • –––, 1981b, “Eight Fallacies in Bloom’s Taxonomy”, in C. J. B. Macmillan (ed.), Philosophy of Education 1980: Proceedings of the Thirty-seventh Annual Meeting of the Philosophy of Education Society , Bloomington, IL: Philosophy of Education Society, pp. 269–273.
  • –––, 1984, “Problems in Testing Informal Logic, Critical Thinking, Reasoning Ability”, Informal Logic , 6(1): 3–9. [ Ennis 1984 available online ]
  • –––, 1987, “A Taxonomy of Critical Thinking Dispositions and Abilities”, in Joan Boykoff Baron and Robert J. Sternberg (eds.), Teaching Thinking Skills: Theory and Practice , New York: W. H. Freeman, pp. 9–26.
  • –––, 1989, “Critical Thinking and Subject Specificity: Clarification and Needed Research”, Educational Researcher , 18(3): 4–10. doi:10.3102/0013189X018003004
  • –––, 1991, “Critical Thinking: A Streamlined Conception”, Teaching Philosophy , 14(1): 5–24. doi:10.5840/teachphil19911412
  • –––, 1996, “Critical Thinking Dispositions: Their Nature and Assessability”, Informal Logic , 18(2–3): 165–182. [ Ennis 1996 available online ]
  • –––, 1998, “Is Critical Thinking Culturally Biased?”, Teaching Philosophy , 21(1): 15–33. doi:10.5840/teachphil19982113
  • –––, 2011, “Critical Thinking: Reflection and Perspective Part I”, Inquiry: Critical Thinking across the Disciplines , 26(1): 4–18. doi:10.5840/inquiryctnews20112613
  • –––, 2013, “Critical Thinking across the Curriculum: The Wisdom CTAC Program”, Inquiry: Critical Thinking across the Disciplines , 28(2): 25–45. doi:10.5840/inquiryct20132828
  • –––, 2016, “Definition: A Three-Dimensional Analysis with Bearing on Key Concepts”, in Patrick Bondy and Laura Benacquista (eds.), Argumentation, Objectivity, and Bias: Proceedings of the 11th International Conference of the Ontario Society for the Study of Argumentation (OSSA), 18–21 May 2016 , Windsor, ON: OSSA, pp. 1–19. Available at http://scholar.uwindsor.ca/ossaarchive/OSSA11/papersandcommentaries/105 ; last accessed 2022 07 16.
  • –––, 2018, “Critical Thinking Across the Curriculum: A Vision”, Topoi , 37(1): 165–184. doi:10.1007/s11245-016-9401-4
  • Ennis, Robert H., and Jason Millman, 1971, Manual for Cornell Critical Thinking Test, Level X, and Cornell Critical Thinking Test, Level Z , Urbana, IL: Critical Thinking Project, University of Illinois.
  • Ennis, Robert H., Jason Millman, and Thomas Norbert Tomko, 1985, Cornell Critical Thinking Tests Level X & Level Z: Manual , Pacific Grove, CA: Midwest Publication, 3rd edition.
  • –––, 2005, Cornell Critical Thinking Tests Level X & Level Z: Manual , Seaside, CA: Critical Thinking Company, 5th edition.
  • Ennis, Robert H. and Eric Weir, 1985, The Ennis-Weir Critical Thinking Essay Test: Test, Manual, Criteria, Scoring Sheet: An Instrument for Teaching and Testing , Pacific Grove, CA: Midwest Publications.
  • Facione, Peter A., 1990a, Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction , Research Findings and Recommendations Prepared for the Committee on Pre-College Philosophy of the American Philosophical Association, ERIC Document ED315423.
  • –––, 1990b, California Critical Thinking Skills Test, CCTST – Form A , Millbrae, CA: The California Academic Press.
  • –––, 1990c, The California Critical Thinking Skills Test--College Level. Technical Report #3. Gender, Ethnicity, Major, CT Self-Esteem, and the CCTST , ERIC Document ED326584.
  • –––, 1992, California Critical Thinking Skills Test: CCTST – Form B, Millbrae, CA: The California Academic Press.
  • –––, 2000, “The Disposition Toward Critical Thinking: Its Character, Measurement, and Relationship to Critical Thinking Skill”, Informal Logic , 20(1): 61–84. [ Facione 2000 available online ]
  • Facione, Peter A. and Noreen C. Facione, 1992, CCTDI: A Disposition Inventory , Millbrae, CA: The California Academic Press.
  • Facione, Peter A., Noreen C. Facione, and Carol Ann F. Giancarlo, 2001, California Critical Thinking Disposition Inventory: CCTDI: Inventory Manual , Millbrae, CA: The California Academic Press.
  • Facione, Peter A., Carol A. Sánchez, and Noreen C. Facione, 1994, Are College Students Disposed to Think? , Millbrae, CA: The California Academic Press. ERIC Document ED368311.
  • Fisher, Alec, and Michael Scriven, 1997, Critical Thinking: Its Definition and Assessment , Norwich: Centre for Research in Critical Thinking, University of East Anglia.
  • Freire, Paulo, 1968 [1970], Pedagogia do Oprimido . Translated as Pedagogy of the Oppressed , Myra Bergman Ramos (trans.), New York: Continuum, 1970.
  • Gigerenzer, Gerd, 2001, “The Adaptive Toolbox”, in Gerd Gigerenzer and Reinhard Selten (eds.), Bounded Rationality: The Adaptive Toolbox , Cambridge, MA: MIT Press, pp. 37–50.
  • Glaser, Edward Maynard, 1941, An Experiment in the Development of Critical Thinking , New York: Bureau of Publications, Teachers College, Columbia University.
  • Groarke, Leo A. and Christopher W. Tindale, 2012, Good Reasoning Matters! A Constructive Approach to Critical Thinking , Don Mills, ON: Oxford University Press, 5th edition.
  • Halpern, Diane F., 1998, “Teaching Critical Thinking for Transfer Across Domains: Disposition, Skills, Structure Training, and Metacognitive Monitoring”, American Psychologist , 53(4): 449–455. doi:10.1037/0003-066X.53.4.449
  • –––, 2016, Manual: Halpern Critical Thinking Assessment , Mödling, Austria: Schuhfried. Available at https://pdfcoffee.com/hcta-test-manual-pdf-free.html; last accessed 2022 07 16.
  • Hamby, Benjamin, 2014, The Virtues of Critical Thinkers , Doctoral dissertation, Philosophy, McMaster University. [ Hamby 2014 available online ]
  • –––, 2015, “Willingness to Inquire: The Cardinal Critical Thinking Virtue”, in Martin Davies and Ronald Barnett (eds.), The Palgrave Handbook of Critical Thinking in Higher Education , New York: Palgrave Macmillan, pp. 77–87.
  • Haran, Uriel, Ilana Ritov, and Barbara A. Mellers, 2013, “The Role of Actively Open-minded Thinking in Information Acquisition, Accuracy, and Calibration”, Judgment and Decision Making , 8(3): 188–201.
  • Hatcher, Donald and Kevin Possin, 2021, “Commentary: Thinking Critically about Critical Thinking Assessment”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 298–322. doi: 10.1163/9789004444591_017
  • Haynes, Ada, Elizabeth Lisic, Kevin Harris, Katie Leming, Kyle Shanks, and Barry Stein, 2015, “Using the Critical Thinking Assessment Test (CAT) as a Model for Designing Within-Course Assessments: Changing How Faculty Assess Student Learning”, Inquiry: Critical Thinking Across the Disciplines , 30(3): 38–48. doi:10.5840/inquiryct201530316
  • Haynes, Ada and Barry Stein, 2021, “Observations from a Long-Term Effort to Assess and Improve Critical Thinking”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 231–254. doi: 10.1163/9789004444591_014
  • Hiner, Amanda L., 2021, “Equipping Students for Success in College and Beyond: Placing Critical Thinking Instruction at the Heart of a General Education Program”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 188–208. doi: 10.1163/9789004444591_012
  • Hitchcock, David, 2017, “Critical Thinking as an Educational Ideal”, in his On Reasoning and Argument: Essays in Informal Logic and on Critical Thinking , Dordrecht: Springer, pp. 477–497. doi:10.1007/978-3-319-53562-3_30
  • –––, 2021, “Seven Philosophical Implications of Critical Thinking: Themes, Variations, Implications”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 9–30. doi: 10.1163/9789004444591_002
  • hooks, bell, 1994, Teaching to Transgress: Education as the Practice of Freedom , New York and London: Routledge.
  • –––, 2010, Teaching Critical Thinking: Practical Wisdom , New York and London: Routledge.
  • Johnson, Ralph H., 1992, “The Problem of Defining Critical Thinking”, in Stephen P. Norris (ed.), The Generalizability of Critical Thinking , New York: Teachers College Press, pp. 38–53.
  • Kahane, Howard, 1971, Logic and Contemporary Rhetoric: The Use of Reason in Everyday Life , Belmont, CA: Wadsworth.
  • Kahneman, Daniel, 2011, Thinking, Fast and Slow , New York: Farrar, Straus and Giroux.
  • Kahneman, Daniel, Olivier Sibony, & Cass R. Sunstein, 2021, Noise: A Flaw in Human Judgment , New York: Little, Brown Spark.
  • Kenyon, Tim, and Guillaume Beaulac, 2014, “Critical Thinking Education and Debasing”, Informal Logic , 34(4): 341–363. [ Kenyon & Beaulac 2014 available online ]
  • Krathwohl, David R., Benjamin S. Bloom, and Bertram B. Masia, 1964, Taxonomy of Educational Objectives, Handbook II: Affective Domain , New York: David McKay.
  • Kuhn, Deanna, 1991, The Skills of Argument , New York: Cambridge University Press. doi:10.1017/CBO9780511571350
  • –––, 2019, “Critical Thinking as Discourse”, Human Development, 62 (3): 146–164. doi:10.1159/000500171
  • Lipman, Matthew, 1987, “Critical Thinking–What Can It Be?”, Analytic Teaching , 8(1): 5–12. [ Lipman 1987 available online ]
  • –––, 2003, Thinking in Education , Cambridge: Cambridge University Press, 2nd edition.
  • Loftus, Elizabeth F., 2017, “Eavesdropping on Memory”, Annual Review of Psychology , 68: 1–18. doi:10.1146/annurev-psych-010416-044138
  • Makaiau, Amber Strong, 2021, “The Good Thinker’s Tool Kit: How to Engage Critical Thinking and Reasoning in Secondary Education”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 168–187. doi: 10.1163/9789004444591_011
  • Martin, Jane Roland, 1992, “Critical Thinking for a Humane World”, in Stephen P. Norris (ed.), The Generalizability of Critical Thinking , New York: Teachers College Press, pp. 163–180.
  • Mayhew, Katherine Camp, and Anna Camp Edwards, 1936, The Dewey School: The Laboratory School of the University of Chicago, 1896–1903 , New York: Appleton-Century. [ Mayhew & Edwards 1936 available online ]
  • McPeck, John E., 1981, Critical Thinking and Education , New York: St. Martin’s Press.
  • Moore, Brooke Noel and Richard Parker, 2020, Critical Thinking , New York: McGraw-Hill, 13th edition.
  • Nickerson, Raymond S., 1998, “Confirmation Bias: A Ubiquitous Phenomenon in Many Guises”, Review of General Psychology , 2(2): 175–220. doi:10.1037/1089-2680.2.2.175
  • Nieto, Ana Maria, and Jorge Valenzuela, 2012, “A Study of the Internal Structure of Critical Thinking Dispositions”, Inquiry: Critical Thinking across the Disciplines , 27(1): 31–38. doi:10.5840/inquiryct20122713
  • Norris, Stephen P., 1985, “Controlling for Background Beliefs When Developing Multiple-choice Critical Thinking Tests”, Educational Measurement: Issues and Practice , 7(3): 5–11. doi:10.1111/j.1745-3992.1988.tb00437.x
  • Norris, Stephen P. and Robert H. Ennis, 1989, Evaluating Critical Thinking (The Practitioners’ Guide to Teaching Thinking Series), Pacific Grove, CA: Midwest Publications.
  • Norris, Stephen P. and Ruth Elizabeth King, 1983, Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland.
  • –––, 1984, The Design of a Critical Thinking Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland. ERIC Document ED260083.
  • –––, 1985, Test on Appraising Observations: Manual , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland.
  • –––, 1990a, Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland, 2nd edition.
  • –––, 1990b, Test on Appraising Observations: Manual , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland, 2nd edition.
  • OCR [Oxford, Cambridge and RSA Examinations], 2011, AS/A Level GCE: Critical Thinking – H052, H452 , Cambridge: OCR. Past papers available at https://pastpapers.co/ocr/?dir=A-Level/Critical-Thinking-H052-H452; last accessed 2022 07 16.
  • Ontario Ministry of Education, 2013, The Ontario Curriculum Grades 9 to 12: Social Sciences and Humanities . Available at http://www.edu.gov.on.ca/eng/curriculum/secondary/ssciences9to122013.pdf ; last accessed 2022 07 16.
  • Passmore, John Arthur, 1980, The Philosophy of Teaching , London: Duckworth.
  • Paul, Richard W., 1981, “Teaching Critical Thinking in the ‘Strong’ Sense: A Focus on Self-Deception, World Views, and a Dialectical Mode of Analysis”, Informal Logic , 4(2): 2–7. [ Paul 1981 available online ]
  • –––, 1984, “Critical Thinking: Fundamental to Education for a Free Society”, Educational Leadership , 42(1): 4–14.
  • –––, 1985, “McPeck’s Mistakes”, Informal Logic , 7(1): 35–43. [ Paul 1985 available online ]
  • Paul, Richard W. and Linda Elder, 2006, The Miniature Guide to Critical Thinking: Concepts and Tools , Dillon Beach, CA: Foundation for Critical Thinking, 4th edition.
  • Payette, Patricia, and Edna Ross, 2016, “Making a Campus-Wide Commitment to Critical Thinking: Insights and Promising Practices Utilizing the Paul-Elder Approach at the University of Louisville”, Inquiry: Critical Thinking Across the Disciplines , 31(1): 98–110. doi:10.5840/inquiryct20163118
  • Possin, Kevin, 2008, “A Field Guide to Critical-Thinking Assessment”, Teaching Philosophy , 31(3): 201–228. doi:10.5840/teachphil200831324
  • –––, 2013a, “Some Problems with the Halpern Critical Thinking Assessment (HCTA) Test”, Inquiry: Critical Thinking across the Disciplines , 28(3): 4–12. doi:10.5840/inquiryct201328313
  • –––, 2013b, “A Serious Flaw in the Collegiate Learning Assessment (CLA) Test”, Informal Logic , 33(3): 390–405. [ Possin 2013b available online ]
  • –––, 2013c, “A Fatal Flaw in the Collegiate Learning Assessment Test”, Assessment Update , 25 (1): 8–12.
  • –––, 2014, “Critique of the Watson-Glaser Critical Thinking Appraisal Test: The More You Know, the Lower Your Score”, Informal Logic , 34(4): 393–416. [ Possin 2014 available online ]
  • –––, 2020, “CAT Scan: A Critical Review of the Critical-Thinking Assessment Test”, Informal Logic , 40 (3): 489–508. [Available online at https://informallogic.ca/index.php/informal_logic/article/view/6243]
  • Rawls, John, 1971, A Theory of Justice , Cambridge, MA: Harvard University Press.
  • Rear, David, 2019, “One Size Fits All? The Limitations of Standardised Assessment in Critical Thinking”, Assessment & Evaluation in Higher Education , 44(5): 664–675. doi: 10.1080/02602938.2018.1526255
  • Rousseau, Jean-Jacques, 1762, Émile , Amsterdam: Jean Néaulme.
  • Scheffler, Israel, 1960, The Language of Education , Springfield, IL: Charles C. Thomas.
  • Scriven, Michael, and Richard W. Paul, 1987, Defining Critical Thinking , Draft statement written for the National Council for Excellence in Critical Thinking Instruction. Available at http://www.criticalthinking.org/pages/defining-critical-thinking/766 ; last accessed 2022 07 16.
  • Sheffield, Clarence Burton Jr., 2018, “Promoting Critical Thinking in Higher Education: My Experiences as the Inaugural Eugene H. Fram Chair in Applied Critical Thinking at Rochester Institute of Technology”, Topoi , 37(1): 155–163. doi:10.1007/s11245-016-9392-1
  • Siegel, Harvey, 1985, “McPeck, Informal Logic and the Nature of Critical Thinking”, in David Nyberg (ed.), Philosophy of Education 1985: Proceedings of the Forty-First Annual Meeting of the Philosophy of Education Society , Normal, IL: Philosophy of Education Society, pp. 61–72.
  • –––, 1988, Educating Reason: Rationality, Critical Thinking, and Education , New York: Routledge.
  • –––, 1999, “What (Good) Are Thinking Dispositions?”, Educational Theory , 49(2): 207–221. doi:10.1111/j.1741-5446.1999.00207.x
  • Simon, Herbert A., 1956, “Rational Choice and the Structure of the Environment”, Psychological Review , 63(2): 129–138. doi: 10.1037/h0042769
  • Simpson, Elizabeth, 1966–67, “The Classification of Educational Objectives: Psychomotor Domain”, Illinois Teacher of Home Economics , 10(4): 110–144, ERIC document ED0103613. [ Simpson 1966–67 available online ]
  • Skolverket, 2018, Curriculum for the Compulsory School, Preschool Class and School-age Educare , Stockholm: Skolverket, revised 2018. Available at https://www.skolverket.se/download/18.31c292d516e7445866a218f/1576654682907/pdf3984.pdf; last accessed 2022 07 15.
  • Smith, B. Othanel, 1953, “The Improvement of Critical Thinking”, Progressive Education , 30(5): 129–134.
  • Smith, Eugene Randolph, Ralph Winfred Tyler, and the Evaluation Staff, 1942, Appraising and Recording Student Progress , Volume III of Adventure in American Education , New York and London: Harper & Brothers.
  • Splitter, Laurance J., 1987, “Educational Reform through Philosophy for Children”, Thinking: The Journal of Philosophy for Children , 7(2): 32–39. doi:10.5840/thinking1987729
  • Stanovich, Keith E. and Paula J. Stanovich, 2010, “A Framework for Critical Thinking, Rational Thinking, and Intelligence”, in David D. Preiss and Robert J. Sternberg (eds), Innovations in Educational Psychology: Perspectives on Learning, Teaching and Human Development , New York: Springer Publishing, pp. 195–237.
  • Stanovich, Keith E., Richard F. West, and Maggie E. Toplak, 2011, “Intelligence and Rationality”, in Robert J. Sternberg and Scott Barry Kaufman (eds.), Cambridge Handbook of Intelligence , Cambridge: Cambridge University Press, 3rd edition, pp. 784–826. doi:10.1017/CBO9780511977244.040
  • Tankersley, Karen, 2005, Literacy Strategies for Grades 4–12: Reinforcing the Threads of Reading , Alexandria, VA: Association for Supervision and Curriculum Development.
  • Thayer-Bacon, Barbara J., 1992, “Is Modern Critical Thinking Theory Sexist?”, Inquiry: Critical Thinking Across the Disciplines , 10(1): 3–7. doi:10.5840/inquiryctnews199210123
  • –––, 1993, “Caring and Its Relationship to Critical Thinking”, Educational Theory , 43(3): 323–340. doi:10.1111/j.1741-5446.1993.00323.x
  • –––, 1995a, “Constructive Thinking: Personal Voice”, Journal of Thought , 30(1): 55–70.
  • –––, 1995b, “Doubting and Believing: Both are Important for Critical Thinking”, Inquiry: Critical Thinking across the Disciplines , 15(2): 59–66. doi:10.5840/inquiryctnews199515226
  • –––, 2000, Transforming Critical Thinking: Thinking Constructively , New York: Teachers College Press.
  • Toulmin, Stephen Edelston, 1958, The Uses of Argument , Cambridge: Cambridge University Press.
  • Turri, John, Mark Alfano, and John Greco, 2017, “Virtue Epistemology”, in Edward N. Zalta (ed.), The Stanford Encyclopedia of Philosophy (Winter 2017 Edition). URL = < https://plato.stanford.edu/archives/win2017/entries/epistemology-virtue/ >
  • Vincent-Lancrin, Stéphan, Carlos González-Sancho, Mathias Bouckaert, Federico de Luca, Meritxell Fernández-Barrerra, Gwénaël Jacotin, Joaquin Urgel, and Quentin Vidal, 2019, Fostering Students’ Creativity and Critical Thinking: What It Means in School. Educational Research and Innovation , Paris: OECD Publishing.
  • Warren, Karen J., 1988, “Critical Thinking and Feminism”, Informal Logic , 10(1): 31–44. [ Warren 1988 available online ]
  • Watson, Goodwin, and Edward M. Glaser, 1980a, Watson-Glaser Critical Thinking Appraisal, Form A , San Antonio, TX: Psychological Corporation.
  • –––, 1980b, Watson-Glaser Critical Thinking Appraisal: Forms A and B; Manual , San Antonio, TX: Psychological Corporation.
  • –––, 1994, Watson-Glaser Critical Thinking Appraisal, Form B , San Antonio, TX: Psychological Corporation.
  • Weinstein, Mark, 1990, “Towards a Research Agenda for Informal Logic and Critical Thinking”, Informal Logic , 12(3): 121–143. [ Weinstein 1990 available online ]
  • –––, 2013, Logic, Truth and Inquiry , London: College Publications.
  • Willingham, Daniel T., 2019, “How to Teach Critical Thinking”, Education: Future Frontiers , 1: 1–17. [Available online at https://prod65.education.nsw.gov.au/content/dam/main-education/teaching-and-learning/education-for-a-changing-world/media/documents/How-to-teach-critical-thinking-Willingham.pdf.]
  • Zagzebski, Linda Trinkaus, 1996, Virtues of the Mind: An Inquiry into the Nature of Virtue and the Ethical Foundations of Knowledge , Cambridge: Cambridge University Press. doi:10.1017/CBO9781139174763
  • Association for Informal Logic and Critical Thinking (AILACT)
  • Critical Thinking Across the European Higher Education Curricula (CRITHINKEDU)
  • Critical Thinking Definition, Instruction, and Assessment: A Rigorous Approach
  • Critical Thinking Research (RAIL)
  • Foundation for Critical Thinking
  • Insight Assessment
  • Partnership for 21st Century Learning (P21)
  • The Critical Thinking Consortium
  • The Nature of Critical Thinking: An Outline of Critical Thinking Dispositions and Abilities , by Robert H. Ennis

abilities | bias, implicit | children, philosophy for | civic education | decision-making capacity | Dewey, John | dispositions | education, philosophy of | epistemology: virtue | logic: informal

Copyright © 2022 by David Hitchcock < hitchckd @ mcmaster . ca >


The Stanford Encyclopedia of Philosophy is copyright © 2024 by The Metaphysics Research Lab , Department of Philosophy, Stanford University

Library of Congress Catalog Data: ISSN 1095-5054

Promoting and Assessing Critical Thinking

Critical thinking is a high-priority outcome of higher education – critical thinking skills are crucial for independent thinking and problem solving in both our students’ professional and personal lives. But what does it mean to be a critical thinker, and how do we promote and assess it in our students? Critical thinking can be defined as the ability to examine an issue by breaking it down and evaluating it in a conscious manner, while providing arguments and evidence to support the evaluation. Below are some suggestions for promoting and assessing critical thinking in our students.

Thinking through inquiry

Asking questions and using the answers to understand the world around us is what drives critical thinking. In inquiry-based instruction, the teacher asks students leading questions to draw out information, inferences, and predictions about a topic. Below are some generic question stems that can serve as prompts for generating critical thinking questions. Consider providing prompts such as these to students to help them ask these questions of themselves and others. If we want students to generate good questions on their own, we need to teach them how by providing the structure and guidance of example questions, whether in written form or through our own use of questions in the classroom.

Generic question stems

  • What are the strengths and weaknesses of …?
  • What is the difference between … and …?
  • Explain why/how …?
  • What would happen if …?
  • What is the nature of …?
  • Why is … happening?
  • What is a new example of …?
  • How could … be used to …?
  • What are the implications of …?
  • What is … analogous to?
  • What do we already know about …?
  • How does … affect …?
  • How does … tie in with what we have learned before?
  • What does … mean?
  • Why is … important?
  • How are … and … similar/different?
  • How does … apply to everyday life?
  • What is a counterargument for …?
  • What is the best … and why?
  • What is a solution to the problem of …?
  • Compare … and … with regard to …?
  • What do you think causes …? Why?
  • Do you agree or disagree with this statement? What evidence is there to support your answer?
  • What is another way to look at …?
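
The stems above are simple templates: a topic slots into each blank. Instructors who maintain question banks electronically can automate that templating. The sketch below is purely illustrative (the `STEMS` selection and the `make_questions` helper are our own inventions, not part of any cited framework):

```python
# Illustrative sketch: filling generic question stems with a course topic.
# The stem selection and helper name are hypothetical.

STEMS = [
    "What are the strengths and weaknesses of {topic}?",
    "What would happen if {topic} did not exist?",
    "What is a new example of {topic}?",
    "How does {topic} apply to everyday life?",
    "What is a counterargument for {topic}?",
]

def make_questions(topic: str) -> list[str]:
    """Return one draft critical thinking prompt per stem."""
    return [stem.format(topic=topic) for stem in STEMS]
```

For example, `make_questions("photosynthesis")` yields five ready-to-edit prompts for a biology lesson.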

Critical thinking through writing

Another essential ingredient in critical thinking instruction is the use of writing. Writing converts students from passive to active learners and requires them to identify issues and formulate hypotheses and arguments. The act of writing requires students to focus and clarify their thoughts before putting them down on paper, hence taking them through the critical thinking process. Writing requires that students make important critical choices and ask themselves (Gocsik, 2002):

  • What information is most important?
  • What might be left out?
  • What is it that I think about this subject?
  • How did I arrive at what I think?
  • What are my assumptions? Are they valid?
  • How can I work with facts, observations, and so on, in order to convince others of what I think?
  • What do I not yet understand?

Consider providing the above questions to students so that they can evaluate their own writing as well. Some suggestions for critical thinking writing activities include:

  • Give students raw data and ask them to write an argument or analysis based on the data.
  • Have students explore and write about unfamiliar points of view or “what if” situations.
  • Think of a controversy in your field, and have the students write a dialogue between characters with different points of view.
  • Select important articles in your field and ask the students to write summaries or abstracts of them. Alternatively, you could ask students to write an abstract of your lecture.
  • Develop a scenario that places students in realistic situations relevant to your discipline, where they must reach a decision to resolve a conflict.

See the Centre for Teaching Excellence (CTE) teaching tip “Low-Stakes Writing Assignments” for examples of critical thinking writing assignments.

Critical thinking through group collaboration

Opportunities for group collaboration could include discussions, case studies, task-related group work, peer review, or debates. Group collaboration is effective for promoting critical thought because:

  • An effective team has the potential to produce better results than any individual;
  • Students are exposed to different perspectives while clarifying their own ideas; and
  • Collaborating on a project or studying with a group for an exam generally stimulates interest and increases understanding and knowledge of the topic.

See the CTE teaching tip “Group Work in the Classroom: Types of Small Groups” for suggestions on forming small groups in your classroom.

Assessing critical thinking skills

You can also use the students’ responses from the activities that promote critical thinking to assess whether they are, indeed, reaching your critical thinking goals. It is important to establish clear criteria for evaluating critical thinking. Even though many of us may be able to identify critical thinking when we see it, explicitly stated criteria help both students and teachers know the goal toward which they are working. Effective criteria measure which skills are present, to what extent, and which skills require further development. The following are characteristics of work that may demonstrate effective critical thinking:

  • Accurately and thoroughly interprets evidence, statements, graphics, questions, literary elements, etc.
  • Asks relevant questions.
  • Analyses and evaluates key information and alternative points of view clearly and precisely.
  • Fair-mindedly examines beliefs, assumptions, and opinions and weighs them against facts.
  • Draws insightful, reasonable conclusions.
  • Justifies inferences and opinions.
  • Thoughtfully addresses and evaluates major alternative points of view.
  • Thoroughly explains assumptions and reasons.
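
One way to make such characteristics operational is to encode them as rubric criteria and record a level for each. The sketch below is a generic illustration, not an official rubric: the criterion names and the 1–4 scale are assumptions chosen for the example.

```python
# Generic rubric-scoring sketch; the criterion names and the 1-4
# scale are hypothetical, for illustration only.

CRITERIA = [
    "interprets evidence accurately",
    "asks relevant questions",
    "evaluates alternative points of view",
    "justifies inferences and opinions",
]

def score_work(ratings: dict[str, int]) -> dict:
    """Summarize per-criterion ratings (1 = emerging .. 4 = strong)."""
    missing = [c for c in CRITERIA if c not in ratings]
    if missing:
        raise ValueError(f"unrated criteria: {missing}")
    return {
        "average": sum(ratings[c] for c in CRITERIA) / len(CRITERIA),
        "needs_development": [c for c in CRITERIA if ratings[c] <= 2],
    }
```

Reporting the per-criterion breakdown alongside the average tells students not just how well they did, but which specific skills still need development.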

It is also important to note that assessment is a tool to use throughout a course, not just at the end. Assessing students early and often lets you see whether your criteria require further clarification, and lets students test their understanding of those criteria and receive feedback. Also consider distributing your criteria with your assignments so that students receive guidance about your expectations; this will help them reflect on their own work and improve the quality of their thinking and writing.

See the CTE teaching tip sheets “Rubrics” and “Responding to Writing Assignments: Managing the Paper Load” for more information on rubrics.

If you would like support applying these tips to your own teaching, CTE staff members are here to help. View the CTE Support page to find the most relevant staff member to contact.

  • Gocsik, K. (2002). Teaching critical thinking skills. UTS Newsletter, 11(2), 1–4.
  • Facione, P. A., & Facione, N. C. (1994). Holistic critical thinking scoring rubric. Millbrae, CA: California Academic Press. www.calpress.com/rubric.html (retrieved September 2003).
  • King, A. (1995). Inquiring minds really do want to know: Using questioning to teach critical thinking. Teaching of Psychology, 22(1), 13–17.
  • Wade, C., & Tavris, C. (1987). Psychology (1st ed.). New York: Harper. In: Wade, C. (1995). Using writing to develop and assess critical thinking. Teaching of Psychology, 22(1), 24–28.

A Short Guide to Building Your Team’s Critical Thinking Skills

  • Matt Plummer

Critical thinking isn’t an innate skill. It can be learned.

Most employers lack an effective way to objectively assess critical thinking skills and most managers don’t know how to provide specific instruction to team members in need of becoming better thinkers. Instead, most managers employ a sink-or-swim approach, ultimately creating work-arounds to keep those who can’t figure out how to “swim” from making important decisions. But it doesn’t have to be this way. To demystify what critical thinking is and how it is developed, the author’s team turned to three research-backed models: The Halpern Critical Thinking Assessment, Pearson’s RED Critical Thinking Model, and Bloom’s Taxonomy. Using these models, they developed the Critical Thinking Roadmap, a framework that breaks critical thinking down into four measurable phases: the ability to execute, synthesize, recommend, and generate.

With critical thinking ranking among the most in-demand skills for job candidates , you would think that educational institutions would prepare candidates well to be exceptional thinkers, and employers would be adept at developing such skills in existing employees. Unfortunately, both are largely untrue.

  • Matt Plummer (@mtplummer) is the founder of Zarvana, which offers online programs and coaching services to help working professionals become more productive by developing time-saving habits. Before starting Zarvana, Matt spent six years at Bain & Company spin-out, The Bridgespan Group, a strategy and management consulting firm for nonprofits, foundations, and philanthropists.  

Thinking Critically and Analytically about Critical-Analytic Thinking: an Introduction

  • Published: 09 October 2014
  • Volume 26, pages 469–476 (2014)

  • Patricia A. Alexander

Acknowledgments

The invitation conference that was the catalyst for this special issue was supported in part by a grant from the Interdisciplinary Research Conference Program of the American Education Research Association, and funding from the College of Education, University of Maryland, and the Department of Human Development and Quantitative Methodology.

Alexander, P. A. Thinking Critically and Analytically about Critical-Analytic Thinking: an Introduction. Educ Psychol Rev 26, 469–476 (2014). https://doi.org/10.1007/s10648-014-9283-1


Frequently Asked Questions

Who uses IA assessments?

Educational institutions utilize IA critical thinking assessments for several key purposes:

  • Selective Admissions: Identify students with strong critical thinking abilities, ensuring high-performing cohorts.
  • Student Advising and Retention Programs: Guide students based on their critical thinking strengths and areas for improvement, aiding in retention.
  • Documentation of Student Learning Outcomes: Measure and showcase the effectiveness of educational programs in cultivating critical thinking.
  • Student Development and Enrichment Programs: Design initiatives that cater to students' needs.
  • Program Evaluation and Accreditation: Maintain standards and achieve accreditation by demonstrating a commitment to critical thinking education.
  • Predicting Successful Professional Licensure: Assess the likelihood of students achieving professional licenses based on their critical thinking prowess.
  • Reporting to Performance-based Funding Agencies: Substantiate the quality of education and students' critical thinking advancement, ensuring sustained funding.
  • Educational Research Studies: Contribute to studies by reputable organizations like NSF, DOE, RWJ, and others, pushing forward the frontiers of understanding in critical thinking education.

Employers utilize IA critical thinking assessments in various strategic ways:

  • Global Applicant Screening : Ensure candidates, regardless of their location, possess the desired critical thinking skills and mindset.
  • Hiring and On-boarding : Place emphasis on self-development in critical thinking from the initial stages, laying the groundwork for future performance.
  • Employee Development : Initiate training programs and evaluation measures centered on enhancing problem-solving and risk management capabilities through critical thinking.
  • Leadership Identification : Recognize and promote employees to unit leadership and management roles based on their critical thinking proficiencies.

Military and government organizations harness the power of IA Critical Thinking Assessments in multiple crucial areas:

  • Hiring Practices : Ensure that personnel have a documented strength in critical thinking to handle complex and high-stakes tasks.
  • Training Effectiveness : Evaluate the success of training initiatives in imparting and strengthening critical thinking capabilities.
  • Emergency Response Evaluation : Gauge the risk management and problem-solving skills and mindset of those responsible for emergency situations, ensuring timely and effective responses.
  • Service Improvement : Deploy programs aimed at enhancing the efficacy and reliability of essential government services through informed, critical thinking.
  • Innovation and Strategy : Identify individuals with exceptional problem-solving abilities, empowering them to create strategic options for challenges.
  • Military Decision-Making : Bolster the decision-making prowess of military personnel, facilitating improved mission planning, execution, and post-action evaluation.

*Insight Assessment is an approved GSA Vendor 

  • Selective Admissions : Select students based on their inherent critical thinking abilities, ensuring an enriched learning environment.
  • Program Evaluation : Continuously assess and refine educational programs with a focus on nurturing critical thinking.
  • Accreditation Review : Support and substantiate the school's commitment to promoting critical thinking during accreditation processes.
  • Documenting Learning Goals : Track and showcase the achievement of set critical thinking milestones within the learning journey.
  • Advising and Enrichment : Tailor programs to bolster the reasoning skills and thinking mindset of students, catering to their individual needs.
  • Celebrating Thinking Prowess : Recognize and celebrate students who display exceptional strength in reasoning skills and mindset, motivating and setting standards for peers.

Hospitals and medical centers utilize critical thinking training and assessments to ensure the highest standards in patient care and staff competence:

  • Hiring Practices : Prioritize critical thinking strengths when hiring professionals across all levels to ensure effective decision-making in patient care.
  • Benchmarking : Compare potential hires against a national pool of applicants, ensuring only the best critical thinkers are selected.
  • On-boarding Guidance : Ensure that newly-hired clinical professionals align with the center's critical thinking standards from day one.
  • Staff Development : Offer self-improvement initiatives to existing staff and clinical managers, focusing on elevating their critical thinking abilities.
  • Evaluating Practice Initiatives : Assess the impact and effectiveness of new medical practices and protocols through the lens of critical thinking.

Designing Your Project

IA assessments are specifically designed for organizations looking to evaluate the critical thinking skills and mindset of their candidates, students, or employees. Our assessments cater to various sectors including educational institutions, corporate entities, training programs, and more. If understanding and advancing the critical thinking capabilities of your members is vital to your objectives, then our assessments would be an excellent fit. Remember, our aim is ‘Advancing Thinking Worldwide,’ and we’re here to help organizations like yours achieve that goal.

For assistance selecting the best critical thinking skills or mindset assessment for your project, please contact our customer support representatives. 

The assessment instruments we provide build on a multidisciplinary international expert consensus research project that defined “critical thinking” and identified the thinking skills inherent in the human cognitive process of purposeful reflective judgment. To that foundation we add more than 30 years of international endorsement of these now well-defined concepts. Validity and reliability coefficients meet the highest standards for all instruments (see additional comments below). Each assessment has been psychometrically evaluated in collaboration with researchers, educators, trainers, and working professionals to assure cultural sensitivity and language competence for its intended test-taker group. The executive summary of that international expert consensus project, “The Delphi Report,” is available free in the Resources section of our website.

We’ve piloted items/scales across diverse populations, including trainees, students, employees, military personnel, and more. An extensive body of both collaborative and independent research underpins our tools. We pride ourselves on the cultural and linguistic competence of our instruments, thanks to our work with global scholars. Ultimately, the validity and reliability coefficients of all our assessments meet the industry’s highest standards, confirming our unwavering commitment to advancing critical thinking. 

Our assessment instruments measuring reasoning skills and thinking mindset are developed and studied in representative samples appropriate to each tailored instrument. For example, assessments developed for each population group (undergraduate students, graduate students, employees and trainees, military officers and enlisted personnel, government employees, children at every K-12 grade level, health professionals across the spectrum of health care disciplines, law students, MBA students, technical and community college students, and the general population) are examined in validation studies to assure their performance in the intended population and to uncover any unintended non-inclusive language or cultural bias. The resulting item pool and measurement scales used in our assessments are updated to maintain their validity, reliability, and cultural relevance.

Multiple-choice items are used to measure reasoning skills. These are not memory or content-knowledge recall items; instead, our questions provide the content to which the test-taker must apply their thinking skills. Consistent with best practices in the measurement of internal motivations, intentions, and beliefs, carefully worded Likert-style items measure thinking mindset attributes. The scale metrics of the reasoning skills assessments capture the cognitive skills that the APA Delphi Report defines as intrinsic to critical thinking. The mindset metrics are framed using that study’s description of the ideal critical thinker, informed by discussions with industry leaders and expert teachers and trainers of reasoning and decision-making. The resulting instruments are built from scales that have been validated separately and as a group in the composite assessment, and have been tested against social desirability bias.

For information relating to content validity; construct validity; convergent construct validity; discriminant/divergent validity; criterion/predictive validity; the relationship of critical thinking to age, sex, grade point average, or educational opportunity; controls for social desirability response bias; internal consistency; and test-retest reliability see the user manual for the assessment instrument you are considering. 
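
As a point of reference for readers unfamiliar with these terms, internal consistency is commonly summarized with Cronbach’s alpha. The sketch below is a textbook-style illustration of that statistic, not Insight Assessment’s actual analysis pipeline:

```python
# Textbook Cronbach's alpha sketch (illustrative only).
# data: rows are test-takers, columns are items on one scale.

def cronbach_alpha(data: list[list[float]]) -> float:
    k = len(data[0])  # number of items

    def var(xs: list[float]) -> float:
        m = sum(xs) / len(xs)  # sample mean
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [var([row[j] for row in data]) for j in range(k)]
    total_var = var([sum(row) for row in data])  # variance of total scores
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)
```

Highly correlated items drive alpha toward 1.0, while unrelated items drive it toward 0; published instruments typically report this figure alongside test-retest reliability.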

Calibration to Project Needs: Ensure the assessment is at the appropriate level for those being evaluated and measures the skills and/or mindset the project aims to analyze. Ensure language comprehension so that assessment outcomes are accurate; this is why our assessments cater to various ages, educational levels, roles, and languages. Recognize the significance of culturally relevant, real-world question scenarios. At Insight Assessment, we work globally to ensure our assessments’ cultural sensitivity.

IA takes pride in being a global leader in critical thinking assessment. Our online tools have been meticulously crafted to offer a multi-language experience, ensuring both language flexibility and cultural sensitivity. Our instruments have been designed to be as universally applicable as possible, minimizing cultural biases and ensuring the relevancy of questions across diverse backgrounds. This means whether you’re a university in Asia assessing American candidates or a European company evaluating local talents, our tools are adaptable and relevant. Presently, our assessments are trusted by institutions in over sixty nations and are available in over twenty languages. Our commitment is to provide assessments that transcend cultural and linguistic boundaries, ensuring validity and reliability worldwide.

Insight Assessment partners with Measured Reasons to offer professional training and workshops for organizations. For individuals looking to hone their critical thinking skills and mindset, we recommend visiting Insight Basecamp.

Absolutely. Clients can incorporate up to 10 custom demographic questions for test-takers to complete upon their initial login. These questions facilitate more detailed analysis of assessment results based on categories like department, years of service, education level, geographic location, and more. Possible responses can be open-ended or provide a drop-down menu of predefined options. Reach out to our assessment specialists for assistance in framing these questions and tailoring them to your project’s needs.

Numeracy, often referred to as quantitative reasoning, is the ability to reason and solve problems involving numbers. It’s vital in various domains:

  • STEM Careers : Central to professions in science, technology, engineering, and mathematics.
  • Business & Military : Essential for decision-making, risk management, and strategic planning.
  • Health Care : Crucial for patient management, dosage determination, treatment evaluation, and ensuring patient safety.
  • Development & Risk Management : Important for developers and managers dealing with probabilities, estimates, and likelihoods.

Measuring numeracy ensures that professionals in these fields possess the foundational skills necessary for effective decision-making and problem-solving.

Testing & Administration

Our test-taker interface is designed for universal accessibility. It can be accessed on computers, tablets, and smartphones using any standard browser, online, worldwide, 24/7/365 without the need for additional software. If you have specific compatibility concerns or organizational needs, please contact your assessment representative for assistance.

  • Interface Accessibility : Our test-taker interface is WCAG 2.0 AA level, ensuring it meets recognized accessibility standards.
  • Extended Time : We can provide logins for users requiring additional time to complete an assessment.
  • Cognitive Processes : Since our tests measure cognitive problem analysis and problem-solving, there are no accommodations that can alter this core function of the test without affecting its purpose.
  • Responsibility & Accommodations : While we aim to make our assessments as accessible as possible, specific accommodation decisions remain with the purchaser. If you encounter a limitation or need guidance on potential accommodations, please contact us. Common accommodations include providing additional time or reading test items aloud to test-takers without interpreting or explaining items.
  • Test Security : All accommodations must uphold the terms and conditions relating to test security. This includes no unauthorized duplication, copying, digital capture of test items, or administration via non-IA software.

Yes, IA provides assessments in various languages to cater to our global clientele. The available languages for each mindset or skills assessment can be found on its respective product page. We’re committed to ensuring that language isn’t a barrier to critical thinking assessment.

Yes. We support anonymous testing for clients and projects that do not want the end user’s name or other personal identifying information to appear on reports. Contact our assessment support team to discuss your specific project needs.

Our thinking mindset assessments utilize a Likert format (agree, disagree) and typically measure six to ten mindset attributes relevant to the assessed population. The number of items varies based on the population, but all assessments are designed to be completed within 25 minutes.
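
Scoring Likert-style scales conventionally means summing item responses, with negatively worded items reverse-keyed. The sketch below illustrates only that general convention; the 6-point scale and the item keys are assumptions, and this is not IA’s actual scoring method.

```python
# Conventional Likert scoring sketch (not IA's actual method).
# Assumed scale: 1 = strongly disagree .. 6 = strongly agree.

SCALE_MAX = 6

def reverse_key(response: int) -> int:
    """Flip a negatively worded item: 1 <-> 6, 2 <-> 5, 3 <-> 4."""
    return SCALE_MAX + 1 - response

def scale_score(responses: list[int], reversed_items: set[int]) -> int:
    """Sum responses, reverse-keying the item indices flagged as negative."""
    return sum(
        reverse_key(r) if i in reversed_items else r
        for i, r in enumerate(responses)
    )
```

Reverse-keying prevents a respondent who agrees with everything from scoring uniformly high, which is one standard guard against acquiescence bias.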

Our thinking skills assessments often include short scenarios, images, or graphics with 4 or 5 response options. The number of questions varies by the population, with scores reported for individual skills and an overall critical thinking score. All assessments in this category are designed for completion within a 45-60 minute period. Check our Products page for more details.

For details about the time allotted for any specific assessment, see that assessment’s product page.

While there’s no specific “test prep” required for a critical thinking assessment, you can enhance your performance by practicing some general strategies:

  • Familiarize Yourself: Get to know the format and types of questions used in critical thinking assessments using the free assessments available in our testing system or the quizzes and surveys on Insight Basecamp .
  • Critical Thinking Exercises:  Engage in exercises or activities that involve problem-solving, logical reasoning, and decision-making.
  • Read Widely:  Explore diverse topics and viewpoints to broaden your knowledge and perspective.
  • Practice Under Time Constraints:  Work on answering questions within the allotted time to improve your time management skills.
  • Stay Calm:  Relax on the assessment day, stay focused, and trust your critical thinking skills.

Remember that the goal of a critical thinking assessment is to measure your natural ability to think critically, so there’s no need for extensive preparation. Just be yourself and approach the assessment with a clear mind.

For those looking to strengthen their critical thinking skills and thinking mindset, try our Critical Thinking e-courses and exercises available on  Insight Basecamp .

Click the blue LOGIN button at the top right of our website. Use the credentials supplied by your test administrator to log in. Select your assessment by clicking on it, choose a language, then Start. Free sample assessments are also available to help you become familiar with the testing interface. For more detailed instructions, download this PDF.

Results & Reports

Assessment Report packages typically include the following components:

  • Individual score reports for each person assessed.
  • Group-level spreadsheet and presentation-ready graphics that can be downloaded as needed.
  • Customizable demographic profile data to match your privacy and data analysis requirements. Plus the option to use specific group variables for group reports on subsets of your test takers.

Clients can decide whether test-takers will have online access to their own score results. We recommend discussing the advantages and disadvantages of permitting or withholding score reports with one of our knowledgeable assessment support agents to ensure the best fit for your project.

Yes. Available percentile comparison groups for benchmarking reasoning skills assessments are listed on the product page for each assessment. Percentile rankings are reported with the score results data delivered to the client.
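
A percentile rank of this kind is straightforward to compute against a norm group. The sketch below uses the mid-rank convention (percent of the comparison group at or below a score, counting ties at half weight); the norm scores are made up, and actual norming tables may use a different convention:

```python
from bisect import bisect_left, bisect_right

def percentile_rank(score: float, norm_group: list) -> float:
    """Percent of the comparison group scoring at or below `score`,
    counting ties at half weight (mid-rank convention)."""
    ranks = sorted(norm_group)
    below = bisect_left(ranks, score)   # scores strictly below
    at = bisect_right(ranks, score) - below  # ties with `score`
    return 100 * (below + 0.5 * at) / len(ranks)

norms = [12, 15, 17, 18, 18, 20, 22, 23, 25, 28]  # made-up norm scores
print(round(percentile_rank(20, norms)))  # prints 55
```

The choice of convention matters at the extremes: under mid-rank, no one lands at exactly the 0th or 100th percentile unless their score falls outside the norm group entirely.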

Pricing & Placing an Order

New clients: Get started by reaching out to us at [email protected] or by calling (650) 697-5628. Our assessment specialists are here to discuss your project, help identify the best testing tool for you, and provide a no-obligation price quote.

Established clients: Contact your account representative directly by email, or phone (650) 697-5628 for assistance with your new project or to request a price quote. Alternatively, use our Request a Price Quote form.

We negotiate discounts for multi-year and high-volume client organizations. We offer discounts to governmental agencies, academic non-profit institutions, competitive externally funded peer-reviewed research studies, doctoral dissertation research, and 501(c)(3) organizations. Contact our representatives for information about discount pricing and eligibility.

Absolutely! We believe in the value and effectiveness of our assessments and are happy to offer demos to prospective clients. To schedule a demo and explore our testing interface, please contact our customer service agents. This will give you a firsthand experience of the quality and usability of our platform before making a purchase decision.

Payments can be made via credit card, check, ACH bank transfer, or wire transfer.

Christopher Dwyer Ph.D.

Critical Thinking About Measuring Critical Thinking

A list of critical thinking measures.

Posted May 18, 2018


In my last post, I discussed the nature of engaging the critical thinking (CT) process and mentioned individuals who draw a conclusion and wind up being correct. But just because they’re right doesn’t mean they used CT to get there. I exemplified this through an observation made in recent years regarding extant measures of CT, many of which assess CT via multiple-choice questions (MCQs). In the case of CT MCQs, you can guess the "right" answer 20–25% of the time, without any need for CT. So, the question is: are these CT measures really measuring CT?
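
The guessing problem is easy to quantify. Blind guessing on an MCQ test follows a binomial distribution, so we can compute both the expected chance score and how often a test-taker using no CT at all would clear a given cutoff. The sketch below uses an 80-item, four-option test purely as an illustration:

```python
from math import comb

def guessing_distribution(n_items: int, n_options: int) -> list:
    """P(k correct) for k = 0..n_items under blind guessing (binomial)."""
    p = 1 / n_options
    return [comb(n_items, k) * p**k * (1 - p) ** (n_items - k)
            for k in range(n_items + 1)]

# Illustrative only: an 80-item test with four options per item.
probs = guessing_distribution(80, 4)
expected = 80 / 4            # expected chance score: 20 items correct
p_pass_30 = sum(probs[30:])  # chance of scoring 30+ with no CT at all
print(expected, p_pass_30)
```

Pure guessing reliably yields around a quarter of the items correct, which is exactly why open-ended formats, discussed below, are attractive: they have no guessing floor.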

As my previous articles explain, CT is a metacognitive process consisting of a number of sub-skills and dispositions that, when applied through purposeful, self-regulatory, reflective judgment, increase the chances of producing a logical solution to a problem or a valid conclusion to an argument (Dwyer, 2017; Dwyer, Hogan & Stewart, 2014). Most definitions, though worded differently, tend to agree with this perspective: CT consists of certain dispositions, specific skills, and a reflective sensibility that governs the application of those skills. That’s how it’s defined; however, it’s not necessarily how it’s been operationally defined.

Operationally defining something means specifying the process or measure required to determine the nature and properties of a phenomenon. Simply put, it is defining a concept in terms of how it can be done, assessed, or measured. If the manner in which you measure something does not assess the parameters set out in the way you define it, then you have not successfully operationally defined it.

Though most theoretical definitions of CT are similar, the manner in which they vary often impedes the construction of an integrated theoretical account of how best to measure CT skills. As a result, researchers and educators must consider the wide array of CT measures available, in order to identify the best and the most appropriate measures, based on the CT conceptualisation used for training. There are various extant CT measures – the most popular amongst them include the Watson-Glaser Critical Thinking Assessment (WGCTA; Watson & Glaser, 1980), the Cornell Critical Thinking Test (CCTT; Ennis, Millman & Tomko, 1985), the California Critical Thinking Skills Test (CCTST; Facione, 1990a), the Ennis-Weir Critical Thinking Essay Test (EWCTET; Ennis & Weir, 1985) and the Halpern Critical Thinking Assessment (Halpern, 2010).

It has been noted by some commentators that these different measures of CT ability may not be directly comparable (Abrami et al., 2008). For example, the WGCTA consists of 80 MCQs that measure the ability to draw inferences; recognise assumptions; evaluate arguments; and use logical interpretation and deductive reasoning (Watson & Glaser, 1980). The CCTT consists of 52 MCQs which measure skills of critical thinking associated with induction; deduction; observation and credibility; definition and assumption identification; and meaning and fallacies. Finally, the CCTST consists of 34 multiple-choice questions (MCQs) and measures CT according to the core skills of analysis, evaluation and inference, as well as inductive and deductive reasoning.

As addressed above, the MCQ-format of these three assessments is less than ideal – problematic even, because it allows test-takers to simply guess when they do not know the correct answer, instead of demonstrating their ability to critically analyse and evaluate problems and infer solutions to those problems (Ku, 2009). Furthermore, as argued by Halpern (2003), the MCQ format makes the assessment a test of verbal and quantitative knowledge rather than CT (i.e. because one selects from a list of possible answers rather than determining one’s own criteria for developing an answer). The measurement of CT through MCQs is also problematic given the potential incompatibility between the conceptualisation of CT that shapes test construction and its assessment using MCQs. That is, MCQ tests assess cognitive capacities associated with identifying single right-or-wrong answers and as a result, this approach to testing is unable to provide a direct measure of test-takers’ use of metacognitive processes such as CT, reflective judgment, and disposition towards CT.

Instead of using MCQ items, a better measure of CT might ask open-ended questions, which would allow test-takers to demonstrate whether or not they spontaneously use a specific CT skill. One commonly used CT assessment, mentioned above, that employs an open-ended format is the Ennis-Weir Critical Thinking Essay Test (EWCTET; Ennis & Weir, 1985). The EWCTET is an essay-based assessment of the test-taker’s ability to analyse, evaluate, and respond to arguments and debates in real-world situations (Ennis & Weir, 1985; see Ku, 2009 for a discussion). The authors of the EWCTET provide what they call a “rough, somewhat overlapping list of areas of critical thinking competence”, measured by their test (Ennis & Weir, 1985, p. 1). However, this test, too, has been criticised – for its domain-specific nature (Taube, 1997), the subjectivity of its scoring protocol and its bias in favour of those proficient in writing (Adams, Whitlow, Stover & Johnson, 1996).

Another, more recent CT assessment that utilises an open-ended format is the Halpern Critical Thinking Assessment (HCTA; Halpern, 2010). The HCTA consists of 25 open-ended questions based on believable, everyday situations, followed by 25 specific questions that probe for the reasoning behind each answer. The multi-part nature of the questions makes it possible to assess the ability to use specific CT skills when the prompt is provided (Ku, 2009). The HCTA’s scoring protocol also provides comprehensible, unambiguous instructions for how to evaluate responses by breaking them down into clear, measurable components. Questions on the HCTA represent five categories of CT application: hypothesis testing (e.g. understanding the limits of correlational reasoning and how to know when causal claims cannot be made), verbal reasoning (e.g. recognising the use of pervasive or misleading language), argumentation (e.g. recognising the structure of arguments, how to examine the credibility of a source and how to judge one’s own arguments), judging likelihood and uncertainty (e.g. applying relevant principles of probability, how to avoid overconfidence in certain situations) and problem-solving (e.g. identifying the problem goal, generating and selecting solutions among alternatives).

Up until the development of the HCTA, I would have recommended the CCTST for measuring CT, despite its limitations. What’s nice about the CCTST is that it assesses the three core skills of CT: analysis, evaluation, and inference, which other scales do not (explicitly). So, if you were interested in assessing students’ sub-skill ability, this would be helpful. However, as we know, though CT skill performance is a sequence, it is also a collation of these skills, meaning that for any given problem or topic, each skill is necessary. Administering an analysis problem, an evaluation problem, and an inference problem, on all three of which a student scores top marks, doesn’t guarantee that the student will apply all three to a broader problem that requires them together. That is, these questions don’t measure CT skill ability per se, but rather analysis, evaluation, and inference skills in isolation. Simply, scores may predict CT skill performance, but they don’t measure it.


What may be a better indicator of CT performance is assessment of CT application. As addressed above, there are five general applications of CT: hypothesis testing, verbal reasoning, argumentation, problem-solving, and judging likelihood and uncertainty, all of which require a collation of analysis, evaluation, and inference. Though the sub-skills of analysis, evaluation, and inference are not directly measured in this case, their collation is measured through five distinct applications, which, as I see it, provides a 'truer' assessment of CT. In addition to assessing CT via an open-ended, short-answer format, the HCTA measures CT according to the five applications of CT; thus, I recommend its use for measuring CT.

However, that’s not to say that the HCTA is perfect. Though it consists of 25 open-ended questions, followed by 25 specific questions that probe for the reasoning behind each answer, when I first used it to assess a sample of students, I found in setting up my data file that there were actually 165 opportunities for scoring across the test. Past research suggests the assessment takes roughly 45 to 60 minutes to complete; however, many of my participants reported it requiring closer to two hours (sometimes longer). It’s a long assessment: thorough, but long. Fortunately, adapted, shortened versions are now available, and it’s an adapted version that I currently administer to assess CT. Another limitation is that, despite the rationale above, it would be nice to have some indication of how participants get on with the sub-skills of analysis, evaluation, and inference, as I do think there’s a potential predictive element in the relationship among the individual skills and the applications. With that, I suppose it is feasible to administer both the HCTA and CCTST to assess such hypotheses.

Though it’s obviously important to consider how assessments actually measure CT and the nature in which each is limited, the broader, macro-problem still requires thought. Just as conceptualisations of CT vary, so too do the reliability and validity of the different CT measures, which has led Abrami and colleagues (2008, p. 1104) to ask: “How will we know if one intervention is more beneficial than another if we are uncertain about the validity and reliability of the outcome measures?” Abrami and colleagues add that, even when researchers explicitly declare that they are assessing CT, there still remains the major challenge of ensuring that measured outcomes are related, in some meaningful way, to the conceptualisation and operational definition of CT that informed the teaching practice in cases of interventional research. Often, the relationship between the concepts of CT that are taught and those that are assessed is unclear, and a large majority of studies in this area include no theory to help elucidate these relationships.

In conclusion, solving the problem of consistency across CT conceptualisation, training, and measure is no easy task. I think recent advancements in CT scale development (e.g. the development of the HCTA and its adapted versions) have eased the problem, given that they now bridge the gap between current theory and practical assessment. However, such advances need to be made clearer to interested populations. As always, I’m very interested in hearing from any readers who may have any insight or suggestions!

Abrami, P. C., Bernard, R. M., Borokhovski, E., Wade, A., Surkes, M. A., Tamim, R., & Zhang, D. (2008). Instructional interventions affecting critical thinking skills and dispositions: A stage 1 meta-analysis. Review of Educational Research, 78(4), 1102–1134.

Adams, M.H., Whitlow, J.F., Stover, L.M., & Johnson, K.W. (1996). Critical thinking as an educational outcome: An evaluation of current tools of measurement. Nurse Educator, 21, 23–32.

Dwyer, C.P. (2017). Critical thinking: Conceptual perspectives and practical guidelines. Cambridge, UK: Cambridge University Press.

Dwyer, C.P., Hogan, M.J. & Stewart, I. (2014). An integrated critical thinking framework for the 21st century. Thinking Skills & Creativity, 12, 43-52.

Ennis, R.H., Millman, J., & Tomko, T.N. (1985). Cornell critical thinking tests. CA: Critical Thinking Co.

Ennis, R.H., & Weir, E. (1985). The Ennis-Weir critical thinking essay test. Pacific Grove, CA: Midwest Publications.

Facione, P. A. (1990a). The California critical thinking skills test (CCTST): Forms A and B;The CCTST test manual. Millbrae, CA: California Academic Press.

Facione, P.A. (1990b). The Delphi report: Committee on pre-college philosophy. Millbrae, CA: California Academic Press.

Halpern, D. F. (2003). The “how” and “why” of critical thinking assessment. In D. Fasko (Ed.), Critical thinking and reasoning: Current research, theory and practice. Cresskill, NJ: Hampton Press.

Halpern, D.F. (2010). The Halpern critical thinking assessment: Manual. Vienna: Schuhfried.

Ku, K.Y.L. (2009). Assessing students’ critical thinking performance: Urging for measurements using multi-response format. Thinking Skills and Creativity, 4(1), 70–76.

Taube, K.T. (1997). Critical thinking ability and disposition as factors of performance on a written critical thinking test. Journal of General Education, 46, 129-164.

Watson, G., & Glaser, E.M. (1980). Watson-Glaser critical thinking appraisal. New York: Psychological Corporation.


Christopher Dwyer, Ph.D., is a lecturer at the Technological University of the Shannon in Athlone, Ireland.


Critical Thinking 911

How Can Critical Thinking Be Measured?

Critical thinking is a crucial skill that plays a significant role in solving complex problems, making informed decisions, and evaluating arguments. It involves the ability to analyze information, identify biases and assumptions, and develop logical arguments. In today's world, where information is readily available and often conflicting, the need for critical thinking has become even more pressing. Therefore, it is essential to be able to measure critical thinking accurately. In this article, we will explore different methods of measuring critical thinking and their reliability and validity.

Critical Thinking Assessment Test (CAT)

The Critical Thinking Assessment Test (CAT) is one of the most commonly used methods of measuring critical thinking. It is a standardized test that includes multiple-choice questions and is designed to assess a person's ability to think critically. The test measures various aspects of critical thinking, including analysis, inference, evaluation, and deductive reasoning.

The CAT test is widely used in educational settings and has been shown to be reliable and valid. However, some critics argue that the CAT test does not measure real-world critical thinking skills accurately. They claim that the test is too focused on formal reasoning and does not take into account the context in which critical thinking occurs. Additionally, some critics argue that the test is culturally biased and may disadvantage certain groups of people.

Halpern Critical Thinking Assessment

The Halpern Critical Thinking Assessment is another commonly used method of measuring critical thinking. It is a test that measures a person's critical thinking skills across different domains, including verbal reasoning, argument analysis, and decision-making. The test includes multiple-choice questions and open-ended questions, and it is designed to assess a person's ability to think critically in real-world situations.

The Halpern Critical Thinking Assessment has been shown to be reliable and valid in various studies. However, some critics argue that the test is too focused on the cognitive aspects of critical thinking and does not take into account the affective and behavioral aspects of critical thinking.

Watson-Glaser Critical Thinking Appraisal

The Watson-Glaser Critical Thinking Appraisal is a test that measures critical thinking skills in various domains, including inference, deduction, interpretation, evaluation, and analysis. The test includes multiple-choice questions and is designed to assess a person's ability to think critically and solve problems in real-world situations.

The Watson-Glaser Critical Thinking Appraisal has been shown to be reliable and valid in various studies. However, some critics argue that the test is too focused on verbal reasoning and may disadvantage people who are more visually oriented.

Cornell Critical Thinking Test

The Cornell Critical Thinking Test is a test that measures critical thinking skills in various domains, including induction, deduction, credibility, and identification of assumptions. The test includes multiple-choice questions and is designed to assess a person's ability to think critically and solve problems in real-world situations.

The Cornell Critical Thinking Test has been shown to be reliable and valid in various studies. However, some critics argue that the test is too focused on analytical reasoning and does not take into account the affective and behavioral aspects of critical thinking.

California Critical Thinking Skills Test (CCTST)

The California Critical Thinking Skills Test (CCTST) is a test that measures critical thinking skills in various domains, including analysis, inference, evaluation, and deduction. The test includes multiple-choice questions and is designed to assess a person's ability to think critically and solve problems in real-world situations.

The CCTST has been shown to be reliable and valid in various studies. However, some critics argue that the test is too focused on formal reasoning and does not take into account the context in which critical thinking occurs.

International Critical Thinking Test

The International Critical Thinking Test is a test that measures critical thinking skills in various domains, including recognition of assumptions, deduction, induction, and interpretation. The test includes multiple-choice questions and is designed to assess a person's ability to think critically and solve problems in real-world situations.

The International Critical Thinking Test has been shown to be reliable and valid in various studies. However, some critics argue that the test is too focused on analytical reasoning and does not take into account the affective and behavioral aspects of critical thinking.

Measuring critical thinking is essential for identifying areas of strength and weakness in a person's critical thinking skills. Different methods of measuring critical thinking have been developed, and each has its strengths and weaknesses. However, it is important to note that no single test can accurately measure all aspects of critical thinking. Therefore, a combination of different tests and assessment methods may be the most effective way to assess a person's critical thinking skills accurately.


More From Forbes

5 Steps Leaders Can Take To Make Better Decisions


If you’re plagued by indecisiveness and often doubt every decision you must make as a leader, you’re certainly not alone.

Leaders often fear making tough decisions because they are acutely aware of the potential consequences and the impact on their team and organization. This reluctance can stem from a fear of failure, the possibility of making the wrong choice, and anxiety over how others will perceive their decisions. Additionally, the weight of the responsibility and the desire to maintain harmony can make leaders hesitant to make decisions that may lead to conflict or dissatisfaction among team members.

In a global survey of 14,250 people in January 2023, Oracle and economist Seth Stephens-Davidowitz found that 85% of business leaders have suffered from decision distress – regretting, feeling guilty about, or questioning a decision they made in the past year. Not only that, but 70% of those leaders would prefer “to have a robot make their decisions.”

However, according to the Cloverpop 2023 Decision IQ Benchmark Survey , the quality of a company’s decision-making is its most valuable asset. Do leaders really want to leave their most valuable asset up to robots?

Getting to a decision point isn’t always easy. Often, these situations involve significant change, intense pressure, or even balancing opposing opinions – all of which can make figuring out what decision to make much harder.

So, what steps can you take to improve your decision-making?

1. Know You Must Make The Decision


For many leaders, the biggest obstacle to decision-making is ignoring that a decision must be made and that the responsibility to make it rests with you. Ignoring a problem or delaying a decision, hoping it will resolve itself, is a common but flawed approach. Problems rarely disappear on their own; in fact, they often escalate if not addressed promptly.

So, the first critical step is acknowledging the existence of an issue and understanding that it requires your action.

· Develop your awareness and observation skills to recognize moments when timely decisions are necessary

· Cultivate a mindset that embraces responsibility and accountability for your decision-making

· Prioritize timely actions to prevent minor issues from growing into significant challenges

2. Embrace Data-Driven Decision Making

To make effective, solid decisions, leaders shouldn’t rely solely on their “gut” as the foundation for making an important decision. Decisions made after reviewing solid data are more likely to lead to successful outcomes.

As a leader, you must combine your experience with the information presented to you and then analyze it to find the best solution.

· Collect and analyze relevant data

· Use analytical tools and software (for example, Alteryx, Google Data Studio, Microsoft Power BI) to interpret data patterns

· Combine quantitative data with qualitative insights for a more comprehensive view

Gathering information is essential: it helps ensure more objective decisions and reduces bias.

3. Promote A Collaborative Environment

Leaders often make the best decisions when they gather information from diverse perspectives. By seeking input from various stakeholders, leaders can uncover blind spots and consider a broader range of solutions.

· Cultivate an inclusive culture, so team members feel safe sharing their opinions

· Encourage open dialogue and active listening during discussions

· Ask exploratory questions to gather as much information as possible

· Leverage the collective intelligence of your team to explore various solutions

Collaboration enhances decision quality and cultivates a sense of ownership and commitment among team members.

4. Develop Your Critical Thinking Skills

Critical thinking is essential as it equips you with the skill set to analyze situations thoroughly and objectively. It helps you identify critical issues, evaluate alternatives, and make decisions based on sound reasoning.

· Be curious and ask probing questions to challenge your assumptions and examine alternatives

· Engage in regular reflective practices, such as journaling or debriefing after significant decisions

· Attend workshops or training sessions focused on enhancing critical thinking abilities

· Practice genuinely listening to team members to gather valuable insights

5. Be Decisive, But Flexible

Effective leaders know when to make decisions quickly and when to adjust their strategies based on new information. This essential flexibility ensures adaptability in dynamic environments. Balancing these two qualities helps leaders navigate uncertainties and drive their teams forward.

· Set clear priorities so you can identify which decisions can and must be made swiftly and which can afford more time for consideration

· Develop contingency plans so you can quickly pivot if circumstances change

· Review the results from the decisions made and be open to adjusting your approach based on what you learn

Decisiveness is vital in leadership because it keeps projects moving forward and demonstrates confidence to your team. Ultimately, the responsibility of decision-making rests with the leader. Since it is a pivotal asset that significantly influences an organization’s performance, leaders must develop strong decision-making skills and confidence in their ability to make the right decisions.

Dr. Samantha Madhosingh


Big-Picture Scenarios Guide Law Associates’ Critical Thinking

Patricia Libby

A deficit of critical thinking among law firm associates is now a recurrent theme in legal practice. This deficiency impairs an attorney’s individual effectiveness—and it also impacts law firms’ overall efficiency. As firms more frequently integrate sophisticated tools such as generative AI, they must take extra care to ensure associates using these programs assess their output critically.

How should law firms tackle a critical thinking crisis? It may be a challenge, but using strategies such as comprehensive guidance, experiential training, and data-driven feedback can go a long way.

Big Picture

A key step in fostering critical thinking is to give associates a comprehensive lay of the land in a given case. This requires helping them first understand the basic structure of a deal or litigation matter before immersing them in the business realities and context that shape legal outcomes.

Associates can get lost in the weeds of their day-to-day work assignments. Set them up for success by giving them a strong foundational understanding of the practice.

So much of what associates miss is due to a failure to ask “why?” Ensuring associates understand where their assignments fit in the overall deal or lawsuit will sharpen their ability to know when to pursue lines of inquiry.

Experiential Learning

The traditional “lunch and learn” training format, while convenient and well-intended, often fails to engage associates in meaningful learning. Training should be interactive and demanding, requiring active participation rather than passive absorption.

The best way to achieve this is to have associates complete mock assignments and receive feedback.

For transactional associates, this could mean reviewing and marking up agreements in anticipation of negotiation, or orchestrating a closing with timelines and deal documents. For litigators, it could involve writing sections of a motion to dismiss or drafting a meet-and-confer email to opposing counsel about a discovery dispute.

Firms should create a dynamic learning environment where associates can learn by doing—and make mistakes in a safe environment where they may question, discuss, and interact with both the material and their peers.

This method reinforces legal concepts, improves retention, and encourages critical thinking by immersing associates in real-world scenarios that require problem-solving and decision-making skills. Active participation can also build confidence in newer attorneys, which empowers them to handle their everyday tasks more effectively.

As much as possible, training should use actual deal documents, complex procedural histories, and messy fact patterns that typify actual client work.

Rather than rely on simplified or simulated scenarios, training should require new associates to grapple with the square peg, round hole problems that lawyers face every day. Mock assignments should require associates to synthesize different, imperfect, and often conflicting pieces of information—and then actually complete the assignment.

While reviewing mock deal documents and issue-spotting is better than having associates listen to a lecture, it’s still too passive and doesn’t allow associates the practice needed to develop their critical thinking muscles. Having associates work with these real deal documents to complete mock assignments is crucial.

Data-Driven Feedback

In addition to having associates go through mock assignments, firms should review those assignments and provide meaningful feedback. They should base their assessment structure on a consistent rubric and watch out for any gaps in crucial critical thinking skills.

While the rubric isn’t for the purpose of giving grades, it will help firms gather key data on associate performance and fairly inform evaluator feedback. Firms can use this rubric both to give individual associate feedback and to inform observations on trends across associate classes.

A sample rubric for these assessments could determine whether the associate:

  • Followed specific instructions
  • Applied the correct concept and adapted it for the given situation
  • Identified all legal and practical issues
  • Devised a unique business solution
  • Used best practices (such as sound drafting techniques and approach)

Feedback from partners and other supervisors should be constructive, focusing on encouraging questions, deeper analysis, and reflection.

Firms can also have associates self-assess their performance on the assignments. Comparing associates' self-assessments with evaluators' feedback helps associates see how others view their work.
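To make the feedback loop concrete, here is a minimal sketch of how rubric data might be tallied across assignments; the criterion names, the 1-5 scale, and the gap threshold are illustrative assumptions, not any firm's actual assessment system.

```python
# Hypothetical sketch: tally rubric scores from mock-assignment reviews.
# Criterion names, the 1-5 scale, and the gap threshold are illustrative
# assumptions, not a prescribed standard.

RUBRIC = [
    "followed_instructions",
    "applied_correct_concept",
    "identified_issues",
    "devised_business_solution",
    "used_best_practices",
]

def self_awareness_gaps(self_scores, evaluator_scores, threshold=2):
    """Criteria where an associate rates themselves noticeably higher
    than the evaluator does: candidates for targeted feedback."""
    return [
        c for c in RUBRIC
        if self_scores.get(c, 0) - evaluator_scores.get(c, 0) >= threshold
    ]

def class_trends(evaluations):
    """Average each criterion across an associate class to surface
    systemic gaps rather than individual ones."""
    return {
        c: sum(e.get(c, 0) for e in evaluations) / len(evaluations)
        for c in RUBRIC
    }
```

A firm could, for instance, run class_trends over a cohort's evaluations each quarter and aim additional training at the lowest-averaging criterion, while self_awareness_gaps informs individual feedback conversations.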

Interactive group review sessions can be especially effective for fostering critical thinking. Partners or reviewers can discuss the assignment in small groups, weaving in tailored feedback, war stories, and best practices while encouraging associates to ask “why” and discuss gray areas.

These review sessions should nurture inquiry and critical thinking by encouraging associates to question assumptions, explore alternative solutions, and discuss business context.

Developing critical thinking skills in law firm associates isn’t just about enhancing individual capabilities—it’s also about keeping up with the demands of modern legal practice. By adopting these strategies, firms can position themselves as leaders and mentors to a new generation of lawyers.

This article does not necessarily reflect the opinion of Bloomberg Industry Group, Inc., the publisher of Bloomberg Law and Bloomberg Tax, or its owners.

Author Information

Patricia Libby is executive legal editor at AltaClaro, an experiential learning platform that bridges the training gap between law school and legal practice for large firm associates.



How can genAI be used to promote critical thinking skills?

Learn how genAI can be used to promote critical thinking skills.

Presenter: Alexa Alice Joubin, Ph.D. | Professor of English, East Asian Lang & Lit, and Theatre & Dance

Recorded and produced: Spring 2024

AI tool(s) demonstrated: Adobe Firefly


The Role of Humans in an AI World


  • Now more than ever, b-school curriculum needs to complement technical skills with human creativity and other unique characteristics, to enhance the capabilities of both.
  • Business educators need to teach students how to think critically about employing AI to identify problems and interpret solutions, rather than allowing AI tools, such as ChatGPT, to determine what’s relevant.
  • A new value framework—based on critical thinking and purposeful analysis—can help guide business schools, and accordingly students, to more meaningful solutions.

Ana Freire: [0:15] In my view, AI will not replace everything. There are some human characteristics that need to be in place always, such as creativity, or the natural intelligence, the common sense, and many other features inherent from human behavior.

[0:32] We just need to find the way in how to basically combine artificial intelligence and the most wonderful characteristics of human beings in order to multiply the effects that both parts can generate together.

[0:51] Higher education institutions have the responsibility to not just teach the technological content, which is behind artificial intelligence, but also soft skills like communication or critical thinking in order to give the future leaders the opportunity to decide or to augment their own capabilities in order to multiply the effects of artificial intelligence when it's needed.

[1:17] Because maybe not in all environments and in all tasks artificial intelligence will be needed in our work.

Higher education institutions have the responsibility to not just teach the technological content, but also soft skills like communication or critical thinking.

David De Cremer : [1:26] Creativity is seen as something uniquely human. If you look at what creativity is about, it's really finding new solutions to problems that are relevant and meaningful to us.

[1:37] If you use that definition and we apply it to generative AI like ChatGPT, we see that there are a number of skills that our educators should train. First of all, ChatGPT provides solutions, it generates solutions, but who phrases the question? A human.

[1:54] It's about identifying a problem and then specifying in a question. It generates something, but who interprets what is generated? A human as well, because it needs to be seen as meaningful and relevant to a human. These are two important skills.

[2:10] Identifying the question relates to using generative AI in a way that pushes our students to think critically. What are the big business questions? What are the big questions in life that business can help? Those are the problems that we define.

[2:24] That's related to what we call prompt engineering as well. If you know the right question, do you also know how to prompt ChatGPT to come up with an answer that's relevant?

Prompt engineering is a good skill to have, but it's not the job of the future.

[2:35] Prompt engineering is not the job of the future, I must say. Some people think it still is the case, but you have to remember ChatGPT is probabilistic. It's not deterministic, which means sometimes even with the same prompt, it may generate a different answer.

[2:52] Prompt engineering is a good skill to have, but it's not the job of the future. The job of the future in my view, is much more looking at what has been generated when you ask the right kind of question. That's based on your own purpose, what kind of value you want to create.

[3:06] That's why critical thinking about what is it that I'm doing, what is the value of my business, is important because you interpret from that framework. What's going to be the job of the future is really a content analyst.

[3:17] As humans, we participate in the real world. AI doesn't, so we assess the relevance and the meaning of it, and we can do so because we are active participants.

[3:28] Being a content analyst, knowing this is an outcome that ChatGPT generated, how can I transfer that into knowledge that I can use to come up with a solution for a problem? Those are skills that as educators, we really need to foster and ChatGPT is a very helpful tool to help in that process.

  • artificial intelligence
  • critical thinking
  • future of work
  • higher education
  • soft skills


Democrats are talking about replacing Joe Biden. That wouldn't be so easy.

President Joe Biden's performance in the first debate Thursday has sparked a new round of criticism from Democrats, as well as public and private musing about whether he should remain at the top of the ticket.

In the modern era, a national party has never tried to adversarially replace its nominee, in part because it knows it would most likely fail. The issue came before both parties in 2016, but neither took action.

Party rules make it almost impossible to replace nominees without their consent, let alone smoothly replace them with someone else. And doing so would amount to party insiders’ overturning the results of primaries in which Democratic voters voted overwhelmingly to nominate Biden. He won almost 99% of all delegates.

And at the moment, there’s no known, serious effort to push him off the top of the ticket.

Still, the Democratic National Committee's charter does make some provisions in case the party’s nominee is incapacitated or opts to step aside, and an anti-Biden coup at the convention is theoretically possible, if highly unlikely. So how would it work?

What happens if Biden drops out before the convention?

The only plausible scenario for Democrats to get a new nominee would be for Biden to decide to withdraw, which he has sworn off repeatedly during other bumpy stretches of his campaign.  

He could do so while serving out the rest of his term in the White House, as Lyndon Johnson did in 1968. 

If Biden were to drop out before he is scheduled to be formally nominated in August, it would create a free-for-all among Democrats, because there’s no mechanism for him or anyone else to anoint a chosen successor.

It takes a majority of the roughly 4,000 pledged delegates to win the party’s nomination. Biden’s won 3,900 of them. Under recent reforms, the party’s more than 700 superdelegates — Democratic lawmakers and dignitaries — are allowed to vote only if no one wins a majority of pledged delegates on the first ballot, so their votes could be crucial in a contested convention. 

Because Biden's opponents all won effectively no delegates throughout the Democratic nominating process, there'd be a virtual clean slate heading into the convention, and the decision would most likely come down to the convention delegates who were initially pledged to Biden.

Biden would have some influence over his pledged delegates, but ultimately, they can vote as they please, so candidates would most likely campaign aggressively to win over each individual delegate.

However, there's a potentially important wrinkle: Democrats plan to formally nominate Biden virtually ahead of the late-August convention to sidestep any potential concerns about ballot access in Ohio, where a technical quirk has complicated things.

Democrats decided to plan a virtual nomination for Biden after Ohio Republicans balked at passing pro forma legislation that would allow Biden to be on the ballot, even though the convention falls after a state deadline. But while Republicans passed a law to shift the deadline, Democrats decided to move forward with a virtual nomination nonetheless.

Could Democrats replace Biden against his will?

There’s no evidence the party would entertain a change without Biden’s consent. But even if it did, there’s no mechanism for it to replace a candidate before the convention, and certainly no way for it to anoint a chosen successor.

If large swaths of the Democratic Party lost faith in Biden, delegates to the national convention could theoretically defect en masse. Of course, they were chosen to be delegates because of their loyalty to Biden and have pledged to support him at the convention.

But, unlike many Republican delegates, Democratic delegates aren’t technically bound to their candidate. DNC rules allow delegates to “in all good conscience reflect the sentiments of those who elected them,” providing some wiggle room.

The party’s charter does include provisions to replace the nominee in the event of a vacancy. The measure is intended to be used in case of death, resignation or incapacitation, not to replace someone who has no desire to step down.

That was the measure that Donna Brazile, then the interim DNC chair, considered invoking after Hillary Clinton collapsed two months before the 2016 election, she wrote in her memoir.

In her memoir, released a year later, Brazile wrote that she was worried “not just about Hillary’s health but about her anemic campaign ... so lacking in the spirit of fight.” 

“Perhaps changing the candidate was a chance to win this thing, to change the playing field in a way that would send Donald Trump scrambling and unable to catch up,” she wrote, adding that aides to other would-be candidates contacted her, including then-Vice President Biden’s chief of staff.

But after less than 24 hours of consideration, Brazile realized the idea was untenable without Clinton’s cooperation and likely to only divide her party further. “I could not make good on my threat to replace her," she wrote.

Current DNC Chair Jaime Harrison is a longtime Biden ally who serves, essentially, at the pleasure of the president. And the national party has certainly given no indication it’s anything but fully behind his re-election.  

What happens if Biden withdraws after the convention?

To fill a vacancy on the national ticket, the chair can call a “special meeting” of the full DNC, which includes about 500 members. On paper, at least, all it takes is a majority vote of those present to pick new presidential and vice presidential nominees. But that process would most likely be anything but smooth and be filled with behind-the-scenes jockeying and public pressure campaigns. 

If a vacancy were to occur close to the November election, however, it could raise constitutional, legal and practical concerns. Among other issues, ballots have to be printed well in advance of the election, and it might not be possible to change them in time.

Would Kamala Harris replace Biden?

If Biden were to relinquish the presidency, Vice President Kamala Harris would automatically become president — but not the Democratic Party’s nominee. Nor would she necessarily be the nominee if Biden withdrew from his re-election bid while he remained in the White House.

She might be politically favored, but party rules give the vice president no major mechanical benefit over other candidates. 

Biden’s delegates wouldn’t automatically transfer to Harris, and the convention holds separate votes on nominees for president and vice president. So she would still need to win a majority of delegates at the convention. 

If the top of the ticket were vacated after the convention, she would still need to win a majority of votes at the special meeting of the DNC.

That is all, at least, under current party rules. But a vacancy at the top of the ticket is the kind of dramatic moment that might lead party leaders to revisit them in the name of easing the transition. Harris has some close allies in key places at the DNC, including a co-chair of the party’s Rules and Bylaws Committee. But nothing would be likely to happen without a fight.


Ben Kamisar is a national political reporter for NBC News.


Alex Seitz-Wald is a senior politics reporter for NBC News.


Who Won the Debate? Biden Stumbles Left Trump on Top

A halting debate performance by President Biden left Democratic strategists reeling, raising questions about his fitness to stay in the race.


By Alan Rappeport

Reporting from Washington

In the first presidential debate of the year between the leading Democratic and Republican candidates, President Biden and former President Donald J. Trump clashed on inflation, taxes, Ukraine and the future of democracy.

A halting performance from Mr. Biden and a relatively steady and measured delivery by Mr. Trump left Democrats deeply concerned about Mr. Biden’s prospects. Personal attacks overshadowed discussions of policy during the debate, with the candidates sparring over who had a better golf game, their respective cognitive abilities and their legal problems.

On cable news and social media, strategists from both parties wondered if Mr. Biden could continue in the race against Mr. Trump. Few Democrats could muster an upbeat assessment of the president’s performance.

Here is a sampling of the reaction.

“It was a really disappointing debate performance from Joe Biden. I don’t think there’s any other way to slice it. His biggest issue was to prove to the American people that he had the energy, the stamina — and he didn’t do that,” Kate Bedingfield, Mr. Biden’s former White House communications director, said on CNN.

“Biden is even whiffing on his easy pitches — abortion and Jan. 6. I mean, my God,” said Matt Gorman, a Republican strategist and former senior adviser to the presidential campaign for Senator Tim Scott of South Carolina.

“Look, I debated Joe 7 times in 2020. He’s a different guy in 2024,” Andrew Yang, a Democratic presidential candidate in 2020, said on the social media platform X, adding the hashtag #swapJoeout.

