CBE—Life Sciences Education, 17(1), Spring 2018

Understanding the Complex Relationship between Critical Thinking and Science Reasoning among Undergraduate Thesis Writers

Jason E. Dowd

† Department of Biology, Duke University, Durham, NC 27708

Robert J. Thompson, Jr.

‡ Department of Psychology and Neuroscience, Duke University, Durham, NC 27708

Leslie A. Schiff

§ Department of Microbiology and Immunology, University of Minnesota, Minneapolis, MN 55455

Julie A. Reynolds


This study empirically examines the relationship between students’ critical-thinking skills and scientific reasoning as reflected in undergraduate thesis writing in biology. Writing offers a unique window into studying this relationship, and the findings raise potential implications for instruction.

Developing critical-thinking and scientific reasoning skills are core learning objectives of science education, but little empirical evidence exists regarding the interrelationships between these constructs. Writing effectively fosters students’ development of these constructs, and it offers a unique window into studying how they relate. In this study of undergraduate thesis writing in biology at two universities, we examine how scientific reasoning exhibited in writing (assessed using the Biology Thesis Assessment Protocol) relates to general and specific critical-thinking skills (assessed using the California Critical Thinking Skills Test), and we consider implications for instruction. We find that scientific reasoning in writing is strongly related to inference , while other aspects of science reasoning that emerge in writing (epistemological considerations, writing conventions, etc.) are not significantly related to critical-thinking skills. Science reasoning in writing is not merely a proxy for critical thinking. In linking features of students’ writing to their critical-thinking skills, this study 1) provides a bridge to prior work suggesting that engagement in science writing enhances critical thinking and 2) serves as a foundational step for subsequently determining whether instruction focused explicitly on developing critical-thinking skills (particularly inference ) can actually improve students’ scientific reasoning in their writing.

INTRODUCTION

Critical-thinking and scientific reasoning skills are core learning objectives of science education for all students, regardless of whether they intend to pursue a career in science or engineering. Consistent with the view of learning as the construction of understanding and meaning ( National Research Council, 2000 ), the pedagogical practice of writing has been found to be effective not only in fostering students’ conceptual and procedural knowledge ( Gerdeman et al. , 2007 ) and communication skills ( Clase et al. , 2010 ), but also in fostering their scientific reasoning ( Reynolds et al. , 2012 ) and critical-thinking skills ( Quitadamo and Kurtz, 2007 ).

Critical thinking and scientific reasoning are similar but distinct constructs that include various types of higher-order cognitive processes, metacognitive strategies, and dispositions involved in making meaning of information. Critical thinking is generally understood as the broader construct ( Holyoak and Morrison, 2005 ), comprising an array of cognitive processes and dispositions that are drawn upon differentially in everyday life and across domains of inquiry such as the natural sciences, social sciences, and humanities. Scientific reasoning, then, may be interpreted as the subset of critical-thinking skills (cognitive and metacognitive processes and dispositions) that 1) are involved in making meaning of information in scientific domains and 2) support the epistemological commitment to scientific methodology and paradigm(s).

Although there has been an enduring focus in higher education on promoting critical thinking and reasoning as general or “transferable” skills, research evidence provides increasing support for the view that reasoning and critical thinking are also situational or domain specific ( Beyer et al. , 2013 ). Some researchers, such as Lawson (2010) , present frameworks in which science reasoning is characterized explicitly in terms of critical-thinking skills. However, there are few coherent frameworks and little empirical evidence regarding either the general or domain-specific interrelationships between scientific reasoning, as it is most broadly defined, and critical-thinking skills.

The Vision and Change in Undergraduate Biology Education Initiative provides a framework for thinking about these constructs and their interrelationship in the context of the core competencies and disciplinary practice they describe ( American Association for the Advancement of Science, 2011 ). These learning objectives aim for undergraduates to “understand the process of science, the interdisciplinary nature of the new biology and how science is closely integrated within society; be competent in communication and collaboration; have quantitative competency and a basic ability to interpret data; and have some experience with modeling, simulation and computational and systems level approaches as well as with using large databases” ( Woodin et al. , 2010 , pp. 71–72). This framework makes clear that science reasoning and critical-thinking skills play key roles in major learning outcomes; for example, “understanding the process of science” requires students to engage in (and be metacognitive about) scientific reasoning, and having the “ability to interpret data” requires critical-thinking skills. To help students better achieve these core competencies, we must better understand the interrelationships of their composite parts. Thus, the next step is to determine which specific critical-thinking skills are drawn upon when students engage in science reasoning in general and with regard to the particular scientific domain being studied. Such a determination could be applied to improve science education for both majors and nonmajors through pedagogical approaches that foster critical-thinking skills that are most relevant to science reasoning.

Writing affords one of the most effective means for making thinking visible ( Reynolds et al. , 2012 ) and learning how to “think like” and “write like” disciplinary experts ( Meizlish et al. , 2013 ). As a result, student writing provides opportunities both to foster and to examine the interrelationship of scientific reasoning and critical-thinking skills within and across disciplinary contexts. The purpose of this study was to better understand the relationship between students’ critical-thinking skills and scientific reasoning skills as reflected in the genre of undergraduate thesis writing in biology departments at two research universities, the University of Minnesota and Duke University.

In the following subsections, we discuss in greater detail the constructs of scientific reasoning and critical thinking, as well as the assessment of scientific reasoning in students’ thesis writing. In subsequent sections, we discuss our study design, findings, and the implications for enhancing educational practices.

Critical Thinking

Advances in cognitive science in the 21st century have increased our understanding of the mental processes involved in thinking and reasoning, as well as memory, learning, and problem solving. Critical thinking is understood to include both a cognitive dimension and a disposition dimension (e.g., reflective thinking) and is defined as “purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which that judgment is based” ( Facione, 1990, p. 3 ). Although various other definitions of critical thinking have been proposed, researchers have generally coalesced on this consensus “expert” view ( Blattner and Frazier, 2002 ; Condon and Kelly-Riley, 2004 ; Bissell and Lemons, 2006 ; Quitadamo and Kurtz, 2007 ) and the corresponding measures of critical-thinking skills ( August, 2016 ; Stephenson and Sadler-McKnight, 2016 ).

Both the cognitive skills and dispositional components of critical thinking have been recognized as important to science education ( Quitadamo and Kurtz, 2007 ). Empirical research demonstrates that specific pedagogical practices in science courses are effective in fostering students’ critical-thinking skills. Quitadamo and Kurtz (2007) found that students who engaged in a laboratory writing component in the context of a general education biology course significantly improved their overall critical-thinking skills (and their analytical and inference skills, in particular), whereas students engaged in a traditional quiz-based laboratory did not improve their critical-thinking skills. In related work, Quitadamo et al. (2008) found that a community-based inquiry experience, involving inquiry, writing, research, and analysis, was associated with improved critical thinking in a biology course for nonmajors, compared with traditionally taught sections. In both studies, students who exhibited stronger presemester critical-thinking skills exhibited stronger gains, suggesting that “students who have not been explicitly taught how to think critically may not reach the same potential as peers who have been taught these skills” ( Quitadamo and Kurtz, 2007 , p. 151).

Recently, Stephenson and Sadler-McKnight (2016) found that first-year general chemistry students who engaged in a science writing heuristic laboratory, which is an inquiry-based, writing-to-learn approach to instruction ( Hand and Keys, 1999 ), had significantly greater gains in total critical-thinking scores than students who received traditional laboratory instruction. Each of the four components—inquiry, writing, collaboration, and reflection—has been linked to critical thinking ( Stephenson and Sadler-McKnight, 2016 ). Like the other studies, this work highlights the value of targeting critical-thinking skills and the effectiveness of an inquiry-based, writing-to-learn approach to enhance critical thinking. Across studies, authors advocate adopting critical thinking as the course framework ( Pukkila, 2004 ) and developing explicit examples of how critical thinking relates to the scientific method ( Miri et al. , 2007 ).

In these examples, the important connection between writing and critical thinking is highlighted by the fact that each intervention involves the incorporation of writing into science, technology, engineering, and mathematics education (either alone or in combination with other pedagogical practices). However, critical-thinking skills are not always the primary learning outcome; in some contexts, scientific reasoning is the primary outcome that is assessed.

Scientific Reasoning

Scientific reasoning is a complex process that is broadly defined as “the skills involved in inquiry, experimentation, evidence evaluation, and inference that are done in the service of conceptual change or scientific understanding” ( Zimmerman, 2007 , p. 172). Scientific reasoning is understood to include both conceptual knowledge and the cognitive processes involved in hypothesis generation and testing (i.e., the inductive processes used to generate hypotheses and the deductive processes used to test them), experimentation strategies, and evidence evaluation strategies. These dimensions are interrelated, in that “experimentation and inference strategies are selected based on prior conceptual knowledge of the domain” ( Zimmerman, 2000 , p. 139). Furthermore, the conceptual and procedural knowledge and cognitive process dimensions can be both general and domain specific (or discipline specific).

With regard to conceptual knowledge, attention has been focused on the acquisition of core methodological concepts fundamental to scientists’ causal reasoning and metacognitive distancing (or decontextualized thinking), which is the ability to reason independently of prior knowledge or beliefs ( Greenhoot et al. , 2004 ). The latter involves what Kuhn and Dean (2004) refer to as the coordination of theory and evidence, which requires that one question existing theories (i.e., prior knowledge and beliefs), seek contradictory evidence, eliminate alternative explanations, and revise one’s prior beliefs in the face of contradictory evidence. Kuhn and colleagues (2008) further elaborate that scientific thinking requires “a mature understanding of the epistemological foundations of science, recognizing scientific knowledge as constructed by humans rather than simply discovered in the world,” and “the ability to engage in skilled argumentation in the scientific domain, with an appreciation of argumentation as entailing the coordination of theory and evidence” ( Kuhn et al. , 2008 , p. 435). “This approach to scientific reasoning not only highlights the skills of generating and evaluating evidence-based inferences, but also encompasses epistemological appreciation of the functions of evidence and theory” ( Ding et al. , 2016 , p. 616). Evaluating evidence-based inferences involves epistemic cognition, which Moshman (2015) defines as the subset of metacognition that is concerned with justification, truth, and associated forms of reasoning. Epistemic cognition is both general and domain specific (or discipline specific; Moshman, 2015 ).

There is empirical support for the contributions of both prior knowledge and an understanding of the epistemological foundations of science to scientific reasoning. In a study of undergraduate science students, advanced scientific reasoning was most often accompanied by accurate prior knowledge as well as sophisticated epistemological commitments; additionally, among students with comparable levels of prior knowledge, skillful reasoning was associated with a strong epistemological commitment to the consistency of theory with evidence ( Zeineddin and Abd-El-Khalick, 2010 ). These findings highlight the need for instructional activities that intentionally help learners develop sophisticated epistemological commitments focused on the nature of knowledge and the role of evidence in supporting knowledge claims ( Zeineddin and Abd-El-Khalick, 2010 ).

Scientific Reasoning in Students’ Thesis Writing

Pedagogical approaches that incorporate writing have also focused on enhancing scientific reasoning. Many rubrics have been developed to assess aspects of scientific reasoning in written artifacts. For example, Timmerman and colleagues (2011) , in the course of describing their own rubric for assessing scientific reasoning, highlight several examples of scientific reasoning assessment criteria ( Haaga, 1993 ; Tariq et al. , 1998 ; Topping et al. , 2000 ; Kelly and Takao, 2002 ; Halonen et al. , 2003 ; Willison and O’Regan, 2007 ).

At both the University of Minnesota and Duke University, we have focused on the genre of the undergraduate honors thesis as the rhetorical context in which to study and improve students’ scientific reasoning and writing. We view the process of writing an undergraduate honors thesis as a form of professional development in the sciences (i.e., a way of engaging students in the practices of a community of discourse). We have found that structured courses designed to scaffold the thesis-writing process and promote metacognition can improve writing and reasoning skills in biology, chemistry, and economics ( Reynolds and Thompson, 2011 ; Dowd et al. , 2015a , b ). In the context of this prior work, we have defined scientific reasoning in writing as the emergent, underlying construct measured across distinct aspects of students’ written discussion of independent research in their undergraduate theses.

The Biology Thesis Assessment Protocol (BioTAP) was developed at Duke University as a tool for systematically guiding students and faculty through a “draft–feedback–revision” writing process, modeled after professional scientific peer-review processes ( Reynolds et al. , 2009 ). BioTAP includes activities and worksheets that allow students to engage in critical peer review and provides detailed descriptions, presented as rubrics, of the questions (i.e., dimensions, shown in Table 1 ) upon which such review should focus. Nine rubric dimensions focus on communication to the broader scientific community, and four rubric dimensions focus on the accuracy and appropriateness of the research. These rubric dimensions provide criteria by which the thesis is assessed, and therefore allow BioTAP to be used as an assessment tool as well as a teaching resource ( Reynolds et al. , 2009 ). Full details are available at www.science-writing.org/biotap.html .

Table 1. Thesis assessment protocol dimensions

In previous work, we have used BioTAP to quantitatively assess students’ undergraduate honors theses and explore the relationship between thesis-writing courses (or specific interventions within the courses) and the strength of students’ science reasoning in writing across different science disciplines: biology ( Reynolds and Thompson, 2011 ); chemistry ( Dowd et al. , 2015b ); and economics ( Dowd et al. , 2015a ). We have focused exclusively on the nine dimensions related to reasoning and writing (questions 1–9), as the other four dimensions (questions 10–13) require topic-specific expertise and are intended to be used by the student’s thesis supervisor.

Beyond considering individual dimensions, we have investigated whether meaningful constructs underlie students’ thesis scores. We conducted exploratory factor analysis of students’ theses in biology, economics, and chemistry and found one dominant underlying factor in each discipline; we termed the factor “scientific reasoning in writing” ( Dowd et al. , 2015a , b , 2016 ). That is, each of the nine dimensions could be understood as reflecting, in different ways and to different degrees, the construct of scientific reasoning in writing. The findings indicated evidence of both general and discipline-specific components to scientific reasoning in writing that relate to epistemic beliefs and paradigms, in keeping with broader ideas about science reasoning discussed earlier. Specifically, scientific reasoning in writing is more strongly associated with formulating a compelling argument for the significance of the research in the context of current literature in biology, making meaning regarding the implications of the findings in chemistry, and providing an organizational framework for interpreting the thesis in economics. We suggested that instruction, whether occurring in writing studios or in writing courses to facilitate thesis preparation, should attend to both components.

Research Question and Study Design

The genre of thesis writing combines the pedagogies of writing and inquiry found to foster scientific reasoning ( Reynolds et al. , 2012 ) and critical thinking ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ; Stephenson and Sadler-McKnight, 2016 ). However, there is no empirical evidence regarding the general or domain-specific interrelationships of scientific reasoning and critical-thinking skills, particularly in the rhetorical context of the undergraduate thesis. The BioTAP studies discussed earlier indicate that the rubric-based assessment produces evidence of scientific reasoning in the undergraduate thesis, but it was not designed to foster or measure critical thinking. The current study was undertaken to address the research question: How are students’ critical-thinking skills related to scientific reasoning as reflected in the genre of undergraduate thesis writing in biology? Determining these interrelationships could guide efforts to enhance students’ scientific reasoning and writing skills through focusing instruction on specific critical-thinking skills as well as disciplinary conventions.

To address this research question, we focused on undergraduate thesis writers in biology courses at two institutions, Duke University and the University of Minnesota, and examined the extent to which students’ scientific reasoning in writing, assessed in the undergraduate thesis using BioTAP, corresponds to students’ critical-thinking skills, assessed using the California Critical Thinking Skills Test (CCTST; August, 2016 ).

Study Sample

The study sample was composed of students enrolled in courses designed to scaffold the thesis-writing process in the Department of Biology at Duke University and the College of Biological Sciences at the University of Minnesota. Both courses complement students’ individual work with research advisors. The course is required for thesis writers at the University of Minnesota and optional for thesis writers at Duke University. Not all students are required to complete a thesis, but a thesis is required to graduate with honors; at the University of Minnesota, such students are enrolled in an honors program within the college. In total, 28 students were enrolled in the course at Duke University and 44 students were enrolled in the course at the University of Minnesota. Of those students, two did not consent to participate in the study; additionally, five did not validly complete the CCTST (i.e., they attempted fewer than 60% of items or completed the test in less than 15 minutes). Thus, our overall rate of valid participation is 90%, with 27 students from Duke University and 38 students from the University of Minnesota. We found no statistically significant differences in thesis assessment between students with valid and invalid CCTST scores. Therefore, for most of this study, we focus on the 65 students who consented to participate and for whom we have complete and valid data. Additionally, in asking students for their consent to participate, we allowed them to choose whether to provide or decline access to academic and demographic background data. Of the 65 students who consented to participate, 52 granted access to such data. Therefore, for additional analyses involving academic and background data, we focus on these 52 students. We note that the 13 students who participated but declined to share additional data scored slightly lower on the CCTST than the other 52 (perhaps suggesting that they differ on other measures as well, but we cannot determine this with certainty). Among the 52 students, 60% identified as female and 10% identified as being from underrepresented ethnicities.

In both courses, students completed the CCTST online, either in class or on their own, late in the Spring 2016 semester. This is the same assessment that was used in prior studies of critical thinking ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ; Stephenson and Sadler-McKnight, 2016 ). It is “an objective measure of the core reasoning skills needed for reflective decision making concerning what to believe or what to do” ( Insight Assessment, 2016a ). In the test, students are asked to read and consider information as they answer multiple-choice questions. The questions are intended to be appropriate for all users, so there is no expectation of prior disciplinary knowledge in biology (or any other subject). Although actual test items are protected, sample items are available on the Insight Assessment website ( Insight Assessment, 2016b ). We have included one sample item in the Supplemental Material.

The CCTST is based on a consensus definition of critical thinking, measures cognitive and metacognitive skills associated with critical thinking, and has been evaluated for validity and reliability at the college level ( August, 2016 ; Stephenson and Sadler-McKnight, 2016 ). In addition to providing an overall critical-thinking score, the CCTST assesses seven dimensions of critical thinking: analysis, interpretation, inference, evaluation, explanation, induction, and deduction. Scores on each dimension are calculated based on students’ performance on items related to that dimension. Analysis focuses on identifying assumptions, reasons, and claims and examining how they interact to form arguments. Interpretation, related to analysis, focuses on determining the precise meaning and significance of information. Inference focuses on drawing conclusions from reasons and evidence. Evaluation focuses on assessing the credibility of sources of information and the claims they make. Explanation, related to evaluation, focuses on describing the evidence, assumptions, or rationale for beliefs and conclusions. Induction focuses on drawing inferences about what is probably true based on evidence. Deduction focuses on drawing conclusions about what must be true when the context completely determines the outcome. These are not independent dimensions; the fact that they are related supports their collective interpretation as critical thinking. Together, the CCTST dimensions provide a basis for evaluating students’ overall strength in using reasoning to form reflective judgments about what to believe or what to do ( August, 2016 ). Each of the seven dimensions and the overall CCTST score are measured on a scale of 0–100, where higher scores indicate superior performance. Scores correspond to superior (86–100), strong (79–85), moderate (70–78), weak (63–69), or not manifested (62 and below) skills.
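For readers who want to work with CCTST scale scores programmatically, the banding described above maps directly onto a simple threshold function. The short Python sketch below (a hypothetical helper, using only the thresholds quoted in the text) illustrates that mapping.

```python
def cctst_band(score: float) -> str:
    """Map a 0-100 CCTST scale score to the qualitative bands quoted above."""
    if score >= 86:
        return "superior"
    if score >= 79:
        return "strong"
    if score >= 70:
        return "moderate"
    if score >= 63:
        return "weak"
    return "not manifested"

print(cctst_band(85))  # -> 'strong'
```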

Scientific Reasoning in Writing

At the end of the semester, students’ final, submitted undergraduate theses were assessed using BioTAP, which consists of nine rubric dimensions that focus on communication to the broader scientific community and four additional dimensions that focus on the exhibition of topic-specific expertise ( Reynolds et al. , 2009 ). These dimensions, framed as questions, are displayed in Table 1 .

Student theses were assessed on questions 1–9 of BioTAP using the same procedures described in previous studies ( Reynolds and Thompson, 2011 ; Dowd et al. , 2015a , b ). In this study, six raters were trained in the valid, reliable use of BioTAP rubrics. Each dimension was rated on a five-point scale: 1 indicates the dimension is missing, incomplete, or below acceptable standards; 3 indicates that the dimension is adequate but not exhibiting mastery; and 5 indicates that the dimension is excellent and exhibits mastery (intermediate ratings of 2 and 4 are appropriate when different parts of the thesis make a single category challenging). After training, two raters independently assessed each thesis and then discussed their independent ratings with one another to form a consensus rating. The consensus score is not an average score, but rather an agreed-upon, discussion-based score. On a five-point scale, raters independently assessed dimensions to be within 1 point of each other 82.4% of the time before discussion and formed consensus ratings 100% of the time after discussion.
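The pre-discussion agreement statistic reported above (independent ratings within 1 point of each other 82.4% of the time) is straightforward to compute. The sketch below uses made-up ratings purely for illustration; it is not the study’s data or code.

```python
import numpy as np

# Hypothetical independent ratings of the same thesis dimensions by two trained raters,
# each on the 1-5 BioTAP scale (illustrative values only).
rater1 = np.array([5, 4, 3, 5, 2, 4, 3, 5, 4, 3])
rater2 = np.array([4, 4, 3, 3, 2, 5, 3, 5, 5, 4])

# Proportion of ratings on which the two raters fall within 1 point of each other,
# i.e., the kind of pre-consensus agreement reported above.
within_one = np.mean(np.abs(rater1 - rater2) <= 1)
print(f"Agreement within 1 point: {within_one:.1%}")
```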

In this study, we consider both categorical (mastery/nonmastery, where a score of 5 corresponds to mastery) and numerical treatments of individual BioTAP scores to better relate the manifestation of critical thinking in BioTAP assessment to all of the prior studies. For comprehensive/cumulative measures of BioTAP, we focus on the partial sum of questions 1–5, as these questions relate to higher-order scientific reasoning (whereas questions 6–9 relate to mid- and lower-order writing mechanics [ Reynolds et al. , 2009 ]), and the factor scores (i.e., numerical representations of the extent to which each student exhibits the underlying factor), which are calculated from the factor loadings published by Dowd et al. (2016) . We do not focus on questions 6–9 individually in statistical analyses, because we do not expect critical-thinking skills to relate to mid- and lower-order writing skills.
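To make the two cumulative measures concrete, the sketch below computes a partial sum over questions 1–5 and a simple loading-weighted factor score for one student. The loadings and standardization constants here are placeholders, not the values published by Dowd et al. (2016), and the weighted-sum approach is only one common approximation to factor scoring.

```python
import numpy as np

# One student's consensus BioTAP scores on questions 1-9 (1-5 scale); values are illustrative.
biotap = np.array([5, 4, 4, 3, 5, 4, 5, 4, 5])

# Partial-sum measure: questions 1-5, the higher-order reasoning dimensions.
partial_sum = biotap[:5].sum()

# Factor-score sketch: a loading-weighted sum of standardized dimension scores.
# These loadings and sample statistics are placeholders only.
loadings = np.array([0.60, 0.70, 0.65, 0.55, 0.70, 0.50, 0.45, 0.40, 0.50])
dim_mean, dim_sd = 3.8, 0.9
factor_score = float(np.dot(loadings, (biotap - dim_mean) / dim_sd))

print(partial_sum, round(factor_score, 2))
```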

The final, submitted thesis reflects the student’s writing, the student’s scientific reasoning, the quality of feedback provided to the student by peers and mentors, and the student’s ability to incorporate that feedback into his or her work. Therefore, our assessment is not the same as an assessment of unpolished, unrevised samples of students’ written work. While one might imagine that such an unpolished sample may be more strongly correlated with critical-thinking skills measured by the CCTST, we argue that the complete, submitted thesis, assessed using BioTAP, is ultimately a more appropriate reflection of how students exhibit science reasoning in the scientific community.

Statistical Analyses

We took several steps to analyze the collected data. First, to provide context for subsequent interpretations, we generated descriptive statistics for the CCTST scores of the participants based on the norms for undergraduate CCTST test takers. To determine the strength of relationships among CCTST dimensions (including overall score) and the BioTAP dimensions, partial-sum score (questions 1–5), and factor score, we calculated Pearson’s correlations for each pair of measures. To examine whether falling on one side of the nonmastery/mastery threshold (as opposed to a linear scale of performance) was related to critical thinking, we grouped BioTAP dimensions into categories (mastery/nonmastery) and conducted Student’s t tests to compare the mean scores of the two groups on each of the seven dimensions and the overall score of the CCTST. Finally, for the strongest relationship that emerged, we included additional academic and background variables as covariates in multiple linear-regression analysis to explore how much of the observed relationship between critical-thinking skills and science reasoning in writing might be explained by variation in these other factors.
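The first two steps of this pipeline (correlations between continuous measures, then t tests across the mastery/nonmastery split) can be sketched as follows. The data here are randomly generated stand-ins, and the variable names only mirror the study’s measures; this is not the authors’ code.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 65  # number of students with valid data

# Illustrative stand-ins for the study variables; real data are not reproduced here.
cctst_overall = rng.normal(86, 6, n)        # overall CCTST score
biotap_partial_sum = rng.normal(20, 3, n)   # partial sum of BioTAP questions 1-5
biotap_q5 = rng.integers(2, 6, n)           # BioTAP question 5 score (1-5 scale)

# 1) Pearson's correlation between a CCTST measure and a BioTAP measure.
r, p_corr = stats.pearsonr(cctst_overall, biotap_partial_sum)

# 2) Student's t test comparing CCTST scores for mastery (score of 5) vs. nonmastery on question 5.
mastery = biotap_q5 == 5
t, p_t = stats.ttest_ind(cctst_overall[mastery], cctst_overall[~mastery])

print(f"r = {r:.2f} (p = {p_corr:.3f}); t = {t:.2f} (p = {p_t:.3f})")
```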

Although BioTAP scores represent discreet, ordinal bins, the five-point scale is intended to capture an underlying continuous construct (from inadequate to exhibiting mastery). It has been argued that five categories is an appropriate cutoff for treating ordinal variables as pseudo-continuous ( Rhemtulla et al. , 2012 )—and therefore using continuous-variable statistical methods (e.g., Pearson’s correlations)—as long as the underlying assumption that ordinal scores are linearly distributed is valid. Although we have no way to statistically test this assumption, we interpret adequate scores to be approximately halfway between inadequate and mastery scores, resulting in a linear scale. In part because this assumption is subject to disagreement, we also consider and interpret a categorical (mastery/nonmastery) treatment of BioTAP variables.

We corrected for multiple comparisons using the Holm-Bonferroni method ( Holm, 1979 ). At the most general level, where we consider the single, comprehensive measures for BioTAP (partial-sum and factor score) and the CCTST (overall score), there is no need to correct for multiple comparisons, because the multiple, individual dimensions are collapsed into single dimensions. When we considered individual CCTST dimensions in relation to comprehensive measures for BioTAP, we accounted for seven comparisons; similarly, when we considered individual dimensions of BioTAP in relation to overall CCTST score, we accounted for five comparisons. When all seven CCTST and five BioTAP dimensions were examined individually and without prior knowledge, we accounted for 35 comparisons; such a rigorous threshold is likely to reject weak and moderate relationships, but it is appropriate if there are no specific pre-existing hypotheses. All p values are presented in tables for complete transparency, and we carefully consider the implications of our interpretation of these data in the Discussion section.
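The Holm-Bonferroni logic is compact enough to sketch directly: the smallest p value is tested against α/m, the next smallest against α/(m − 1), and so on, stopping at the first failure. A minimal Python sketch of that procedure (not the authors’ code) follows.

```python
def holm_bonferroni(p_values, alpha=0.05):
    """Holm (1979) step-down procedure: return a reject/retain decision for each p value."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    reject = [False] * m
    for rank, idx in enumerate(order):
        # The smallest p value is compared with alpha/m, the next with alpha/(m-1), and so on;
        # once one comparison fails, all remaining (larger) p values are retained as well.
        if p_values[idx] <= alpha / (m - rank):
            reject[idx] = True
        else:
            break
    return reject

# With 35 comparisons, the most stringent cutoff is 0.05/35, roughly 0.00143 (as quoted below).
print(round(0.05 / 35, 5))
```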

RESULTS

CCTST scores for students in this sample ranged from the 39th to 99th percentile of the general population of undergraduate CCTST test takers (mean percentile = 84.3, median = 85th percentile; Table 2 ); these percentiles reflect overall scores that range from moderate to superior. Scores on individual dimensions and overall scores were sufficiently normal and far enough from the ceiling of the scale to justify subsequent statistical analyses.

Table 2. Descriptive statistics of CCTST dimensions a

| Dimension      | Minimum | Mean | Median | Maximum |
| -------------- | ------- | ---- | ------ | ------- |
| Analysis       | 70      | 88.6 | 90     | 100     |
| Interpretation | 74      | 89.7 | 87     | 100     |
| Inference      | 78      | 87.9 | 89     | 100     |
| Evaluation     | 63      | 83.6 | 84     | 100     |
| Explanation    | 61      | 84.4 | 87     | 100     |
| Induction      | 74      | 87.4 | 87     | 97      |
| Deduction      | 71      | 86.4 | 87     | 97      |
| Overall        | 73      | 86   | 85     | 97      |

a Scores correspond to superior (86–100), strong (79–85), moderate (70–78), weak (63–69), or not manifested (62 and lower) skills.

The Pearson’s correlations between students’ cumulative scores on BioTAP (the factor score based on loadings published by Dowd et al. , 2016 , and the partial sum of scores on questions 1–5) and students’ overall scores on the CCTST are presented in Table 3 . We found that the partial-sum measure of BioTAP was significantly related to the overall measure of critical thinking ( r = 0.27, p = 0.03), while the BioTAP factor score was marginally related to overall CCTST ( r = 0.24, p = 0.05). When we looked at relationships between comprehensive BioTAP measures and scores for individual dimensions of the CCTST ( Table 3 ), we found significant positive correlations between both the BioTAP partial-sum and factor scores and CCTST inference ( r = 0.45, p < 0.001, and r = 0.41, p < 0.001, respectively). Although some other relationships have p values below 0.05 (e.g., the correlations between BioTAP partial-sum scores and CCTST induction and interpretation scores), they are not significant when we correct for multiple comparisons.

Table 3. Correlations between dimensions of CCTST and dimensions of BioTAP a

a In each cell, the top number is the correlation, and the bottom, italicized number is the associated p value. Correlations that are statistically significant after correcting for multiple comparisons are shown in bold.

b This is the partial sum of BioTAP scores on questions 1–5.

c This is the factor score calculated from factor loadings published by Dowd et al. (2016) .

When we expanded comparisons to include all 35 potential correlations among individual BioTAP and CCTST dimensions—and, accordingly, corrected for 35 comparisons—we did not find any additional statistically significant relationships. The Pearson’s correlations between students’ scores on each dimension of BioTAP and students’ scores on each dimension of the CCTST range from −0.11 to 0.35 ( Table 3 ); although the relationship between discussion of implications (BioTAP question 5) and inference appears to be relatively large ( r = 0.35), it is not significant ( p = 0.005; the Holm-Bonferroni cutoff is 0.00143). We found no statistically significant relationships between BioTAP questions 6–9 and CCTST dimensions (unpublished data), regardless of whether we correct for multiple comparisons.

The results of Student’s t tests comparing scores on each dimension of the CCTST of students who exhibit mastery with those of students who do not exhibit mastery on each dimension of BioTAP are presented in Table 4 . Focusing first on the overall CCTST scores, we found that the difference between those who exhibit mastery and those who do not in discussing implications of results (BioTAP question 5) is statistically significant ( t = 2.73, p = 0.008, d = 0.71). When we expanded t tests to include all 35 comparisons—and, as above, corrected for 35 comparisons—we found a significant difference in inference scores between students who exhibit mastery on question 5 and students who do not ( t = 3.41, p = 0.0012, d = 0.88), as well as a marginally significant difference in these students’ induction scores ( t = 3.26, p = 0.0018, d = 0.84; the Holm-Bonferroni cutoff is p = 0.00147). Cohen’s d effect sizes, which reveal the strength of the differences for statistically significant relationships, range from 0.71 to 0.88.

Table 4. The t statistics and effect sizes of differences in dimensions of CCTST across dimensions of BioTAP a

a In each cell, the top number is the t statistic for each comparison, and the middle, italicized number is the associated p value. The bottom number is the effect size. Differences that are statistically significant after correcting for multiple comparisons are shown in bold.
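Cohen’s d effect sizes such as those in Table 4 are conventionally computed with a pooled standard deviation; the text does not state which variant the authors used, so the sketch below is illustrative only (made-up group scores, not the study data).

```python
import numpy as np

def cohens_d(group1, group2):
    """Cohen's d using a pooled standard deviation (a common convention for two-group t tests)."""
    g1, g2 = np.asarray(group1, dtype=float), np.asarray(group2, dtype=float)
    n1, n2 = len(g1), len(g2)
    pooled_var = ((n1 - 1) * g1.var(ddof=1) + (n2 - 1) * g2.var(ddof=1)) / (n1 + n2 - 2)
    return (g1.mean() - g2.mean()) / np.sqrt(pooled_var)

# Illustrative example: CCTST inference scores for mastery vs. nonmastery groups (fabricated values).
print(round(cohens_d([92, 89, 95, 90, 88], [85, 83, 88, 82, 86]), 2))
```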

Finally, we more closely examined the strongest relationship that we observed, which was between the CCTST dimension of inference and the BioTAP partial-sum composite score (shown in Table 3 ), using multiple regression analysis ( Table 5 ). Focusing on the 52 students for whom we have background information, we looked at the simple relationship between BioTAP and inference (model 1), a robust background model including multiple covariates that one might expect to explain some part of the variation in BioTAP (model 2), and a combined model including all variables (model 3). As model 3 shows, the covariates explain very little variation in BioTAP scores, and the relationship between inference and BioTAP persists even in the presence of all of the covariates.

Table 5. Partial sum (questions 1–5) of BioTAP scores ( n = 52)

| Variable                  | Model 1  | Model 2 | Model 3 |
| ------------------------- | -------- | ------- | ------- |
| CCTST inference           | 0.536*** |         | 0.491** |
| Grade point average       |          | 0.176   | 0.092   |
| Independent study courses |          | −0.087  | 0.001   |
| Writing-intensive courses |          | 0.131   | 0.021   |
| Institution               |          | 0.329   | 0.115   |
| Male                      |          | 0.085   | 0.041   |
| Underrepresented group    |          | −0.114  | −0.060  |
| Adjusted R²               | 0.273    | −0.022  | 0.195   |

** p < 0.01.

*** p < 0.001.
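The three-model comparison in Table 5 follows a standard hierarchical regression layout: inference alone, background covariates alone, then all predictors together. The sketch below reproduces that structure with fabricated data and statsmodels; it is not the authors’ code, the column names only mirror the Table 5 covariates, and the coefficients it prints will not match the published values.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 52  # size of the subsample with background data

# Fabricated stand-in data; values are illustrative only.
df = pd.DataFrame({
    "inference":         rng.normal(88, 6, n),
    "gpa":               rng.normal(3.6, 0.3, n),
    "independent_study": rng.integers(0, 5, n),
    "writing_intensive": rng.integers(0, 6, n),
    "institution":       rng.integers(0, 2, n),
    "male":              rng.integers(0, 2, n),
    "underrepresented":  rng.integers(0, 2, n),
})
df["biotap_partial_sum"] = 0.2 * df["inference"] + rng.normal(0, 2, n)

covariates = "gpa + independent_study + writing_intensive + institution + male + underrepresented"
m1 = smf.ols("biotap_partial_sum ~ inference", data=df).fit()                  # Model 1
m2 = smf.ols(f"biotap_partial_sum ~ {covariates}", data=df).fit()              # Model 2
m3 = smf.ols(f"biotap_partial_sum ~ inference + {covariates}", data=df).fit()  # Model 3

for name, m in [("Model 1", m1), ("Model 2", m2), ("Model 3", m3)]:
    print(name, "adjusted R^2 =", round(m.rsquared_adj, 3))
```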

DISCUSSION

The aim of this study was to examine the extent to which the various components of scientific reasoning—manifested in writing in the genre of undergraduate thesis and assessed using BioTAP—draw on general and specific critical-thinking skills (assessed using CCTST) and to consider the implications for educational practices. Although science reasoning involves critical-thinking skills, it also relates to conceptual knowledge and the epistemological foundations of science disciplines ( Kuhn et al. , 2008 ). Moreover, science reasoning in writing , captured in students’ undergraduate theses, reflects habits, conventions, and the incorporation of feedback that may alter evidence of individuals’ critical-thinking skills. Our findings, however, provide empirical evidence that cumulative measures of science reasoning in writing are nonetheless related to students’ overall critical-thinking skills ( Table 3 ). The particularly significant roles of inference skills ( Table 3 ) and the discussion of implications of results (BioTAP question 5; Table 4 ) provide a basis for more specific ideas about how these constructs relate to one another and what educational interventions may have the most success in fostering these skills.

Our results build on previous findings. The genre of thesis writing combines pedagogies of writing and inquiry found to foster scientific reasoning ( Reynolds et al. , 2012 ) and critical thinking ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ; Stephenson and Sadler-McKnight, 2016 ). Quitadamo and Kurtz (2007) reported that students who engaged in a laboratory writing component in a general education biology course significantly improved their inference and analysis skills, and Quitadamo and colleagues (2008) found that participation in a community-based inquiry biology course (that included a writing component) was associated with significant gains in students’ inference and evaluation skills. The shared focus on inference is noteworthy, because these prior studies actually differ from the current study; the former considered critical-­thinking skills as the primary learning outcome of writing-­focused interventions, whereas the latter focused on emergent links between two learning outcomes (science reasoning in writing and critical thinking). In other words, inference skills are impacted by writing as well as manifested in writing.

Inference focuses on drawing conclusions from argument and evidence. According to the consensus definition of critical thinking, the specific skill of inference includes several processes: querying evidence, conjecturing alternatives, and drawing conclusions. All of these activities are central to the independent research at the core of writing an undergraduate thesis. Indeed, a critical part of what we call “science reasoning in writing” might be characterized as a measure of students’ ability to infer and make meaning of information and findings. Because the cumulative BioTAP measures distill underlying similarities and, to an extent, suppress unique aspects of individual dimensions, we argue that it is appropriate to relate inference to scientific reasoning in writing . Even when we control for other potentially relevant background characteristics, the relationship is strong ( Table 5 ).

In taking the complementary view and focusing on BioTAP, when we compared students who exhibit mastery with those who do not, we found that the specific dimension of “discussing the implications of results” (question 5) differentiates students’ performance on several critical-thinking skills. To achieve mastery on this dimension, students must make connections between their results and other published studies and discuss the future directions of the research; in short, they must demonstrate an understanding of the bigger picture. The specific relationship between question 5 and inference is the strongest observed among all individual comparisons. Altogether, perhaps more than any other BioTAP dimension, this aspect of students’ writing provides a clear view of the role of students’ critical-thinking skills (particularly inference and, marginally, induction) in science reasoning.

While inference and discussion of implications emerge as particularly strongly related dimensions in this work, we note that the strongest contribution to “science reasoning in writing in biology,” as determined through exploratory factor analysis, is “argument for the significance of research” (BioTAP question 2, not question 5; Dowd et al. , 2016 ). Question 2 is not clearly related to critical-thinking skills. These findings are not contradictory, but rather suggest that the epistemological and disciplinary-specific aspects of science reasoning that emerge in writing through BioTAP are not completely aligned with aspects related to critical thinking. In other words, science reasoning in writing is not simply a proxy for those critical-thinking skills that play a role in science reasoning.

In a similar vein, the content-related, epistemological aspects of science reasoning, as well as the conventions associated with writing the undergraduate thesis (including feedback from peers and revision), may explain the lack of significant relationships between some science reasoning dimensions and some critical-thinking skills that might otherwise seem counterintuitive (e.g., BioTAP question 2, which relates to making an argument, and the critical-thinking skill of argument). It is possible that an individual’s critical-thinking skills may explain some variation in a particular BioTAP dimension, but other aspects of science reasoning and practice exert much stronger influence. Although these relationships do not emerge in our analyses, the lack of significant correlation does not mean that there is definitively no correlation. Correcting for multiple comparisons suppresses type 1 error at the expense of exacerbating type 2 error, which, combined with the limited sample size, constrains statistical power and makes weak relationships more difficult to detect. Ultimately, though, the relationships that do emerge highlight places where individuals’ distinct critical-thinking skills emerge most coherently in thesis assessment, which is why we are particularly interested in unpacking those relationships.

We recognize that, because only honors students submit theses at these institutions, this study sample is composed of a selective subset of the larger population of biology majors. Although this is an inherent limitation of focusing on thesis writing, links between our findings and results of other studies (with different populations) suggest that the observed relationships may occur more broadly. The goal of improved science reasoning and critical thinking is shared among all biology majors, particularly those engaged in capstone research experiences. So, while the implications of this work most directly apply to honors thesis writers, we provisionally suggest that all students could benefit from further study of these relationships.

There are several important implications of this study for science education practices. Students’ inference skills relate to the understanding and effective application of scientific content. The fact that we find no statistically significant relationships between BioTAP questions 6–9 and CCTST dimensions suggests that such mid- to lower-order elements of BioTAP ( Reynolds et al. , 2009 ), which tend to be more structural in nature, do not focus on aspects of the finished thesis that draw strongly on critical thinking. In keeping with prior analyses ( Reynolds and Thompson, 2011 ; Dowd et al. , 2016 ), these findings further reinforce the notion that disciplinary instructors, who are most capable of teaching and assessing scientific reasoning and perhaps least interested in the more mechanical aspects of writing, may nonetheless be best suited to effectively model and assess students’ writing.

The goal of the thesis writing course at both Duke University and the University of Minnesota is not merely to improve thesis scores but to move students’ writing into the category of mastery across BioTAP dimensions. Recognizing that students with differing critical-thinking skills (particularly inference) are more or less likely to achieve mastery in the undergraduate thesis (particularly in discussing implications [question 5]) is important for developing and testing targeted pedagogical interventions to improve learning outcomes for all students.

The competencies characterized by the Vision and Change in Undergraduate Biology Education Initiative provide a general framework for recognizing that science reasoning and critical-thinking skills play key roles in major learning outcomes of science education. Our findings highlight places where science reasoning–related competencies (like “understanding the process of science”) connect to critical-thinking skills and places where critical thinking–related competencies might be manifested in scientific products (such as the ability to discuss implications in scientific writing). We encourage broader efforts to build empirical connections between competencies and pedagogical practices to further improve science education.

One specific implication of this work for science education is to focus on providing opportunities for students to develop their critical-thinking skills (particularly inference). Of course, as this correlational study is not designed to test causality, we do not claim that enhancing students’ inference skills will improve science reasoning in writing. However, as prior work shows that science writing activities influence students’ inference skills ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ), there is reason to test such a hypothesis. Nevertheless, the focus must extend beyond inference as an isolated skill; rather, it is important to relate inference to the foundations of the scientific method ( Miri et al. , 2007 ) in terms of the epistemological appreciation of the functions and coordination of evidence ( Kuhn and Dean, 2004 ; Zeineddin and Abd-El-Khalick, 2010 ; Ding et al. , 2016 ) and disciplinary paradigms of truth and justification ( Moshman, 2015 ).

Although this study is limited to the domain of biology at two institutions with a relatively small number of students, the findings represent a foundational step in the direction of achieving success with more integrated learning outcomes. Hopefully, it will spur greater interest in empirically grounding discussions of the constructs of scientific reasoning and critical-thinking skills.

This study contributes to the efforts to improve science education, for both majors and nonmajors, through an empirically driven analysis of the relationships between scientific reasoning reflected in the genre of thesis writing and critical-thinking skills. This work is rooted in the usefulness of BioTAP as a method 1) to facilitate communication and learning and 2) to assess disciplinary-specific and general dimensions of science reasoning. The findings support the important role of the critical-thinking skill of inference in scientific reasoning in writing, while also highlighting ways in which other aspects of science reasoning (epistemological considerations, writing conventions, etc.) are not significantly related to critical thinking. Future research into the impact of interventions focused on specific critical-thinking skills (i.e., inference) for improved science reasoning in writing will build on this work and its implications for science education.

Supplementary Material

ACKNOWLEDGMENTS

We acknowledge the contributions of Kelaine Haas and Alexander Motten to the implementation and collection of data. We also thank Mine Çetinkaya-Rundel for her insights regarding our statistical analyses. This research was funded by National Science Foundation award DUE-1525602.

REFERENCES

  • American Association for the Advancement of Science. (2011). Vision and change in undergraduate biology education: A call to action. Washington, DC. Retrieved September 26, 2017, from https://visionandchange.org/files/2013/11/aaas-VISchange-web1113.pdf
  • August D. (2016). California Critical Thinking Skills Test user manual and resource guide. San Jose: Insight Assessment/California Academic Press.
  • Beyer C. H., Taylor E., Gillmore G. M. (2013). Inside the undergraduate teaching experience: The University of Washington’s growth in faculty teaching study. Albany, NY: SUNY Press.
  • Bissell A. N., Lemons P. P. (2006). A new method for assessing critical thinking in the classroom. BioScience, (1), 66–72. https://doi.org/10.1641/0006-3568(2006)056[0066:ANMFAC]2.0.CO;2
  • Blattner N. H., Frazier C. L. (2002). Developing a performance-based assessment of students’ critical thinking skills. Assessing Writing, (1), 47–64.
  • Clase K. L., Gundlach E., Pelaez N. J. (2010). Calibrated peer review for computer-assisted learning of biological research competencies. Biochemistry and Molecular Biology Education, (5), 290–295.
  • Condon W., Kelly-Riley D. (2004). Assessing and teaching what we value: The relationship between college-level writing and critical thinking abilities. Assessing Writing, (1), 56–75. https://doi.org/10.1016/j.asw.2004.01.003
  • Ding L., Wei X., Liu X. (2016). Variations in university students’ scientific reasoning skills across majors, years, and types of institutions. Research in Science Education, (5), 613–632. https://doi.org/10.1007/s11165-015-9473-y
  • Dowd J. E., Connolly M. P., Thompson R. J., Jr., Reynolds J. A. (2015a). Improved reasoning in undergraduate writing through structured workshops. Journal of Economic Education, (1), 14–27. https://doi.org/10.1080/00220485.2014.978924
  • Dowd J. E., Roy C. P., Thompson R. J., Jr., Reynolds J. A. (2015b). “On course” for supporting expanded participation and improving scientific reasoning in undergraduate thesis writing. Journal of Chemical Education, (1), 39–45. https://doi.org/10.1021/ed500298r
  • Dowd J. E., Thompson R. J., Jr., Reynolds J. A. (2016). Quantitative genre analysis of undergraduate theses: Uncovering different ways of writing and thinking in science disciplines. WAC Journal, 36–51.
  • Facione P. A. (1990). Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction. Research findings and recommendations. Newark, DE: American Philosophical Association. Retrieved September 26, 2017, from https://philpapers.org/archive/FACCTA.pdf
  • Gerdeman R. D., Russell A. A., Worden K. J. (2007). Web-based student writing and reviewing in a large biology lecture course. Journal of College Science Teaching, (5), 46–52.
  • Greenhoot A. F., Semb G., Colombo J., Schreiber T. (2004). Prior beliefs and methodological concepts in scientific reasoning. Applied Cognitive Psychology, (2), 203–221. https://doi.org/10.1002/acp.959
  • Haaga D. A. F. (1993). Peer review of term papers in graduate psychology courses. Teaching of Psychology, (1), 28–32. https://doi.org/10.1207/s15328023top2001_5
  • Halonen J. S., Bosack T., Clay S., McCarthy M., Dunn D. S., Hill G. W., Whitlock K. (2003). A rubric for learning, teaching, and assessing scientific inquiry in psychology. Teaching of Psychology, (3), 196–208. https://doi.org/10.1207/S15328023TOP3003_01
  • Hand B., Keys C. W. (1999). Inquiry investigation. Science Teacher, (4), 27–29.
  • Holm S. (1979). A simple sequentially rejective multiple test procedure. Scandinavian Journal of Statistics, (2), 65–70.
  • Holyoak K. J., Morrison R. G. (2005). The Cambridge handbook of thinking and reasoning. New York: Cambridge University Press.
  • Insight Assessment. (2016a). California Critical Thinking Skills Test (CCTST). Retrieved September 26, 2017, from www.insightassessment.com/Products/Products-Summary/Critical-Thinking-Skills-Tests/California-Critical-Thinking-Skills-Test-CCTST
  • Insight Assessment. (2016b). Sample thinking skills questions. Retrieved September 26, 2017, from www.insightassessment.com/Resources/Teaching-Training-and-Learning-Tools/node_1487
  • Kelly G. J., Takao A. (2002). Epistemic levels in argument: An analysis of university oceanography students’ use of evidence in writing. Science Education, (3), 314–342. https://doi.org/10.1002/sce.10024
  • Kuhn D., Dean D., Jr. (2004). Connecting scientific reasoning and causal inference. Journal of Cognition and Development, (2), 261–288. https://doi.org/10.1207/s15327647jcd0502_5
  • Kuhn D., Iordanou K., Pease M., Wirkala C. (2008). Beyond control of variables: What needs to develop to achieve skilled scientific thinking? Cognitive Development, (4), 435–451. https://doi.org/10.1016/j.cogdev.2008.09.006
  • Lawson A. E. (2010). Basic inferences of scientific reasoning, argumentation, and discovery. Science Education, (2), 336–364. https://doi.org/10.1002/sce.20357
  • Meizlish D., LaVaque-Manty D., Silver N., Kaplan M. (2013). Think like/write like: Metacognitive strategies to foster students’ development as disciplinary thinkers and writers. In Thompson R. J. (Ed.), Changing the conversation about higher education (pp. 53–73). Lanham, MD: Rowman & Littlefield.
  • Miri B., David B.-C., Uri Z. (2007). Purposely teaching for the promotion of higher-order thinking skills: A case of critical thinking. Research in Science Education, (4), 353–369. https://doi.org/10.1007/s11165-006-9029-2
  • Moshman D. (2015). Epistemic cognition and development: The psychology of justification and truth. New York: Psychology Press.
  • National Research Council. (2000). How people learn: Brain, mind, experience, and school (expanded ed.). Washington, DC: National Academies Press.
  • Pukkila P. J. (2004). Introducing student inquiry in large introductory genetics classes. Genetics, (1), 11–18. https://doi.org/10.1534/genetics.166.1.11
  • Quitadamo I. J., Faiola C. L., Johnson J. E., Kurtz M. J. (2008). Community-based inquiry improves critical thinking in general education biology. CBE—Life Sciences Education, (3), 327–337. https://doi.org/10.1187/cbe.07-11-0097
  • Quitadamo I. J., Kurtz M. J. (2007). Learning to improve: Using writing to increase critical thinking performance in general education biology. CBE—Life Sciences Education, (2), 140–154. https://doi.org/10.1187/cbe.06-11-0203
  • Reynolds J. A., Smith R., Moskovitz C., Sayle A. (2009). BioTAP: A systematic approach to teaching scientific writing and evaluating undergraduate theses. BioScience, (10), 896–903. https://doi.org/10.1525/bio.2009.59.10.11
  • Reynolds J. A., Thaiss C., Katkin W., Thompson R. J. (2012). Writing-to-learn in undergraduate science education: A community-based, conceptually driven approach. CBE—Life Sciences Education, (1), 17–25. https://doi.org/10.1187/cbe.11-08-0064
  • Reynolds J. A., Thompson R. J. (2011). Want to improve undergraduate thesis writing? Engage students and their faculty readers in scientific peer review. CBE—Life Sciences Education, (2), 209–215. https://doi.org/10.1187/cbe.10-10-0127
  • Rhemtulla M., Brosseau-Liard P. E., Savalei V. (2012). When can categorical variables be treated as continuous? A comparison of robust continuous and categorical SEM estimation methods under suboptimal conditions. Psychological Methods, (3), 354–373. https://doi.org/10.1037/a0029315
  • Stephenson N. S., Sadler-McKnight N. P. (2016). Developing critical thinking skills using the science writing heuristic in the chemistry laboratory. Chemistry Education Research and Practice, (1), 72–79. https://doi.org/10.1039/C5RP00102A
  • Tariq V. N., Stefani L. A. J., Butcher A. C., Heylings D. J. A. (1998). Developing a new approach to the assessment of project work. Assessment and Evaluation in Higher Education, (3), 221–240. https://doi.org/10.1080/0260293980230301
  • Timmerman B. E. C., Strickland D. C., Johnson R. L., Payne J. R. (2011). Development of a “universal” rubric for assessing undergraduates’ scientific reasoning skills using scientific writing. Assessment and Evaluation in Higher Education, (5), 509–547. https://doi.org/10.1080/02602930903540991
  • Topping K. J., Smith E. F., Swanson I., Elliot A. (2000). Formative peer assessment of academic writing between postgraduate students. Assessment and Evaluation in Higher Education, (2), 149–169. https://doi.org/10.1080/713611428
  • Willison J., O’Regan K. (2007). Commonly known, commonly not known, totally unknown: A framework for students becoming researchers. Higher Education Research and Development, (4), 393–409. https://doi.org/10.1080/07294360701658609
  • Woodin T., Carter V. C., Fletcher L. (2010). Vision and Change in Biology Undergraduate Education: A Call for Action—Initial responses. CBE—Life Sciences Education, (2), 71–73. https://doi.org/10.1187/cbe.10-03-0044
  • Zeineddin A., Abd-El-Khalick F. (2010). Scientific reasoning and epistemological commitments: Coordination of theory and evidence among college science students. Journal of Research in Science Teaching, (9), 1064–1093. https://doi.org/10.1002/tea.20368
  • Zimmerman C. (2000). The development of scientific reasoning skills. Developmental Review, (1), 99–149. https://doi.org/10.1006/drev.1999.0497
  • Zimmerman C. (2007). The development of scientific thinking skills in elementary and middle school. Developmental Review, (2), 172–223. https://doi.org/10.1016/j.dr.2006.12.001

Critical Thinking and Decision-Making: What Is Critical Thinking?

Lesson 1: What Is Critical Thinking?

Critical thinking is a term that gets thrown around a lot. You've probably heard it used often over the years, whether in school, at work, or in everyday conversation. But when you stop to think about it, what exactly is critical thinking, and how do you do it?

Simply put, critical thinking is the act of deliberately analyzing information so that you can make better judgments and decisions. It involves using logic, reasoning, and creativity to draw conclusions and generally understand things better.

This may sound like a pretty broad definition, and that's because critical thinking is a broad skill that can be applied to so many different situations. You can use it to prepare for a job interview, manage your time better, make decisions about purchasing things, and so much more.

The process

illustration of "thoughts" inside a human brain, with several being connected and "analyzed"

As humans, we are constantly thinking . It's something we can't turn off. But not all of it is critical thinking. No one thinks critically 100% of the time... that would be pretty exhausting! Instead, it's an intentional process , something that we consciously use when we're presented with difficult problems or important decisions.

Improving your critical thinking

In order to become a better critical thinker, it's important to ask questions when you're presented with a problem or decision, before jumping to any conclusions. You can start with simple ones like What do I currently know? and How do I know this? These can help to give you a better idea of what you're working with and, in some cases, simplify more complex issues.  

Real-world applications

Let's take a look at how we can use critical thinking to evaluate online information . Say a friend of yours posts a news article on social media and you're drawn to its headline. If you were to use your everyday automatic thinking, you might accept it as fact and move on. But if you were thinking critically, you would first analyze the available information and ask some questions :

  • What's the source of this article?
  • Is the headline potentially misleading?
  • What are my friend's general beliefs?
  • Do their beliefs inform why they might have shared this?

illustration of "Super Cat Blog" and "According to survery of cat owners" being highlighted from an article on a smartphone

After analyzing all of this information, you can draw a conclusion about whether or not you think the article is trustworthy.

Critical thinking has a wide range of real-world applications . It can help you to make better decisions, become more hireable, and generally better understand the world around you.

Critical Thinking vs Problem Solving: Navigating Cognitive Approaches

Critical thinking and problem solving are closely related skills that often go hand in hand. Critical thinking is a prerequisite for effective problem-solving. While they are distinct concepts, they are interdependent and complement each other in various ways. Here's a breakdown of the relationship between critical thinking and problem solving, and strategies to strengthen both skill sets.

Critical Thinking vs. Problem Solving

Critical thinking involves the ability to analyze, evaluate, and assess information, ideas, or arguments in a logical and systematic manner. It includes skills such as reasoning, analyzing evidence, identifying biases, and making informed judgments. Problem solving is the process of finding solutions to specific challenges or issues. It typically involves defining a problem, generating potential solutions, evaluating those solutions, and implementing the best one.

Critical thinking is often considered the foundation of effective problem solving. To solve a problem effectively, you first need to critically assess and understand the problem itself. Critical thinking helps you define the problem, identify its root causes, and gather relevant information.

Both critical thinking and problem solving contribute to informed decision-making. Critical thinking helps individuals evaluate the pros and cons of different solutions, while problem-solving skills help in selecting the most suitable solution. Critical thinking and problem-solving skills promote continuous improvement within an organization in response to changing needs and conditions. Individuals who engage in critical thinking continuously refine their problem-solving abilities, leading to more effective solutions over time.

In summary, critical thinking and problem solving are interconnected skills that support each other. Critical thinking provides the analytical and evaluative tools needed to approach problems effectively, while problem solving puts critical thinking into action by applying these skills to real-world challenges. Together, they enable individuals and teams to make well-informed decisions and find innovative solutions to complex issues.

Understanding Critical Thinking

Critical thinking is a cognitive skill and a mental process that involves the objective, deliberate, and systematic evaluation of information, ideas, situations, or problems in order to form well-reasoned judgments, make informed decisions, and solve complex issues. It is a fundamental human capability that goes beyond mere acceptance of information at face value and instead encourages individuals to approach information critically, examining its validity, relevance, and potential biases.

At its core, critical thinking involves these key components:

  • Analysis: Critical thinkers carefully examine information or situations by breaking them down into their constituent parts. They dissect complex ideas or problems into manageable components, making it easier to understand and address.
  • Evaluation: Critical thinking requires individuals to assess the quality and credibility of information or arguments. It involves considering the source, evidence, and reasoning behind a statement or claim, and determining whether it is well-founded.
  • Inference: Critical thinkers draw logical and reasonable conclusions based on available evidence and information. They avoid making assumptions or jumping to unwarranted conclusions.
  • Problem-Solving: Critical thinking is a valuable problem-solving tool. It involves identifying problems, exploring potential solutions, and evaluating those solutions to determine the most effective course of action.
  • Decision-Making: Informed decision-making is a crucial aspect of critical thinking. It helps individuals choose the most appropriate course of action among several options, taking into account the potential consequences and ethical considerations.
  • Reflection: Critical thinkers engage in self-reflection, questioning their own beliefs and assumptions. This self-awareness allows for personal growth and intellectual development.

The Power of Critical Thinking

Critical thinking skills are essential for analyzing complex problems within your organization. When faced with a problem, individuals and teams must critically examine the various components, potential causes, and consequences of the issue. Critical thinking helps break down complex problems into manageable parts. Critical thinking skills also play a crucial role in identifying problems accurately. Without the ability to critically assess situations, individuals may misinterpret problems or focus on symptoms rather than root causes. ‍

Informed Decision-Making

Critical thinking enables leaders and employees to make informed decisions. It involves evaluating information, considering alternatives, and weighing the pros and cons before choosing the best course of action. In a business context, this can lead to better strategic decisions, efficient resource allocation, and effective problem-solving. ‍

Problem Solving

Businesses face a wide range of complex challenges. Critical thinking equips individuals with the skills to analyze problems, identify root causes, and develop innovative solutions. This is crucial for addressing issues promptly and effectively, whether they involve market competition, operational inefficiencies, or customer satisfaction.

Innovation

Critical thinking fosters creativity and innovation. It encourages employees to think outside the box, challenge conventional wisdom, and explore unconventional solutions. This is essential for staying competitive and developing new products, services, or processes. ‍

Conflict Resolution

Conflicts are inevitable in any organization. Critical thinking skills enable individuals to approach conflicts objectively, understand the underlying issues, and propose constructive solutions. This promotes a healthier work environment and fosters collaboration. ‍

Customer Satisfaction

Understanding and meeting customer needs and expectations are crucial in business. Critical thinking helps organizations analyze customer feedback, identify areas for improvement, and innovate to provide better products or services. ‍

Ethical Decision-Making

In an age where ethics and corporate responsibility are paramount, critical thinking plays a role in ethical decision-making. It helps individuals and organizations assess the ethical implications of their actions and make choices that align with their values and societal expectations. ‍

Competitive Advantage

Businesses that encourage and develop critical thinking skills in their employees can gain a competitive advantage. A workforce that can analyze data, adapt to changes, and innovate is more likely to thrive in a rapidly evolving market. ‍

Enhancing Critical Thinking Abilities

Enhancing critical thinking abilities is a valuable skill that can improve decision-making, problem solving, and overall cognitive function. Here are some strategies to help you develop and enhance your critical thinking abilities:

Ask Questions: Encourage curiosity by asking open-ended questions about the information or situation at hand. Questions like "Why?" and "How?" can prompt deeper thinking and analysis.

Gather Information: Seek out diverse sources of information and perspectives. Be open to exploring various viewpoints, even if they challenge your existing beliefs or assumptions.

Evaluate Sources: Assess the credibility and reliability of information sources. Consider the author's qualifications, the publication's reputation, and potential biases.

Analyze Arguments: Break down arguments into their components. Identify premises, conclusions, and any logical fallacies. Evaluate the strength of the evidence and the soundness of the reasoning.

Practice Reflective Thinking: Regularly take time to reflect on your thoughts, experiences, and decisions. Consider what you've learned and how you can apply it to future situations.

Consider Alternative Perspectives: Put yourself in someone else's shoes and try to understand their viewpoint, even if you disagree. This helps you develop empathy and a more comprehensive understanding of issues.

Socratic Questioning: Employ Socratic questioning techniques, which involve asking a series of probing questions to explore ideas and uncover deeper insights.

Problem-Solving Exercises: Regularly tackle problems or puzzles that require analytical thinking. This can be anything from brain teasers to real-world challenges.

Debate and Discussion: Engage in debates and discussions with others, particularly those with different viewpoints. Constructive debates can sharpen your critical thinking skills as you defend your position and respond to counter-arguments.

Continual Learning: Embrace a growth mindset and a commitment to lifelong learning. New information and experiences can challenge and expand your thinking.

Seek Feedback: Encourage others to provide constructive feedback on your thoughts and ideas. Constructive criticism can help you refine your thinking.

Use Critical Thinking Tools: Familiarize yourself with critical thinking tools like the SWOT analysis, the 5 Whys technique, and decision matrices. These tools can help structure your thinking and decision-making process; a small decision-matrix sketch follows this list.

Take Courses: Consider enrolling in courses or workshops focused on critical thinking and problem solving. Many educational institutions and online platforms offer such courses.

Collaborate: Collaborate with others on projects or problem-solving tasks. Different perspectives and skills can enhance your critical thinking abilities.

Remember that developing critical thinking is an ongoing process, and improvement takes time. Be patient with yourself and keep practicing these strategies; your critical thinking abilities will strengthen over time.
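
To make the decision-matrix idea concrete, here is a minimal, illustrative Python sketch. It is not taken from any particular tool or methodology; the options, criteria, weights, and ratings are made-up placeholders you would replace with your own.

```python
# Minimal weighted decision matrix: score each option against weighted criteria.
# The options, criteria, weights, and ratings below are hypothetical placeholders.

criteria_weights = {"cost": 0.4, "quality": 0.35, "speed": 0.25}

# Ratings on a 1-5 scale for each option against each criterion.
options = {
    "Vendor A": {"cost": 4, "quality": 3, "speed": 5},
    "Vendor B": {"cost": 2, "quality": 5, "speed": 3},
    "Vendor C": {"cost": 3, "quality": 4, "speed": 4},
}

def weighted_score(ratings, weights):
    """Sum of rating * weight across all criteria."""
    return sum(ratings[criterion] * weight for criterion, weight in weights.items())

scores = {name: weighted_score(ratings, criteria_weights) for name, ratings in options.items()}

# Print options from highest to lowest weighted score.
for name, score in sorted(scores.items(), key=lambda item: item[1], reverse=True):
    print(f"{name}: {score:.2f}")

best = max(scores, key=scores.get)
print(f"Highest-scoring option: {best}")
```

Changing the weights and re-running the sketch is a quick way to see how sensitive the ranking is to what you choose to value most, which is itself a useful critical-thinking exercise.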

Problem Solving: A Key Skillset

Developing critical thinking skills will help you and your team become better problem-solvers. A strong problem-solving skillset is of paramount importance in a professional or business context for several compelling reasons. In the complex and ever-evolving landscape of modern work environments, individuals and organizations alike face a myriad of challenges that demand effective problem-solving abilities.

First and foremost, problem-solving skills empower individuals to tackle obstacles and setbacks with confidence and efficiency. In a professional context, this means overcoming workplace challenges, meeting project deadlines, and addressing unexpected issues head-on. Whether it's resolving technical glitches, navigating interpersonal conflicts, or devising strategies to meet changing market demands, problem-solving is the linchpin that ensures operations run smoothly.

Moreover, in business, problem-solving is intricately linked to innovation and growth. Companies that foster a culture of problem-solving encourage their employees to think creatively and proactively identify opportunities for improvement. These organizations not only respond effectively to market disruptions but also stay ahead of the competition by consistently delivering innovative products, services, and solutions. Problem-solving skillsets, therefore, serve as a catalyst for driving innovation and maintaining a competitive edge in today's fast-paced business world.

Additionally, problem-solving skills facilitate effective decision-making. Professionals who can critically analyze information, weigh alternatives, and assess potential risks are better equipped to make sound decisions that align with their organizations' strategic objectives. From financial choices to resource allocation and market entry strategies, well-honed problem-solving skills are instrumental in choosing the most suitable and advantageous courses of action.

In conclusion, the importance of a problem-solving skillset in a professional or business context cannot be overstated. It empowers individuals to navigate challenges, fosters innovation, supports effective decision-making, and ultimately contributes to the success and growth of both individuals and organizations. In today's dynamic and competitive work environments, honing problem-solving abilities is an investment with dividends that extend far beyond immediate problem resolution. ‍

Unleashing Problem-Solving Abilities

Developing and honing problem-solving abilities at work is a valuable skill that can enhance your effectiveness and contribute to your career growth. Here are some practical strategies to help you cultivate and improve your problem-solving skills in the workplace:

Recognize the Importance: Acknowledge the significance of problem-solving skills in your job and career. Understand that the ability to solve problems efficiently is a valuable asset that can set you apart.

Understand the Problem: Take the time to fully understand the problem at hand. Define the problem clearly and identify its root causes. This step is crucial for finding effective solutions.

Gather Information: Collect relevant data and information related to the problem. This may involve research, data analysis, or consulting with colleagues who have expertise in the area.

Break It Down: Divide complex problems into smaller, more manageable components. This can make the problem-solving process less daunting and help you focus on solving one aspect at a time.

Brainstorm Solutions: Encourage brainstorming sessions with colleagues or team members. Diverse perspectives can lead to innovative solutions. Be open to ideas and avoid judgment during this phase.

Evaluate Solutions: Assess each potential solution objectively. Consider the pros and cons, feasibility, and potential risks associated with each option. Critical thinking is essential at this stage.

Select the Best Solution: Based on your evaluation, choose the solution that seems most effective and suitable for the situation. Consider both short-term and long-term implications.

Create an Action Plan: Develop a clear and actionable plan to implement the chosen solution. Define roles and responsibilities, set deadlines, and allocate resources as needed.

Implement and Monitor: Put the plan into action, and closely monitor its progress. Be prepared to make adjustments if necessary as you encounter new information or challenges.

Learn from Failure: Understand that not all solutions will be successful. When problems persist or new ones arise, view them as opportunities for growth and learning. Analyze what went wrong and use that knowledge to improve.

Seek Feedback: Don't hesitate to seek feedback from colleagues, mentors, or supervisors. Constructive criticism can provide valuable insights and help you refine your problem-solving skills.

Continuous Learning: Stay updated on industry trends, best practices, and emerging technologies. Expanding your knowledge base can provide you with new tools and perspectives for problem-solving.

Practice Patience: Complex problems may not have immediate solutions. Exercise patience and persistence, and don't get discouraged if a solution doesn't come quickly.

Embrace Challenges: Seek out challenging projects or assignments that require problem-solving skills. The more you practice, the more confident and skilled you'll become.

Mentorship: If possible, find a mentor or coach who excels in problem-solving. Learning from someone with experience can accelerate your growth.

Use Problem-Solving Tools: Familiarize yourself with problem-solving methodologies and tools like the 5 Whys technique, root cause analysis, SWOT analysis, and decision matrices. These frameworks can guide your problem-solving process; a brief 5 Whys sketch follows this list.

Develop Soft Skills: Effective problem-solving often involves strong communication, teamwork, and interpersonal skills. Work on improving these soft skills to collaborate effectively with others in solving problems.
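
Purely as an illustration, the 5 Whys technique can be sketched in a few lines of Python. The problem statement and the chain of answers below are hypothetical; in practice the answers come from the people investigating the issue, and the code only records and replays the chain.

```python
# Illustrative 5 Whys walk-through: start from a problem statement and ask "Why?"
# repeatedly (typically five times) until a plausible root cause emerges.
# The chain of answers below is a made-up example for a late-delivery problem.

problem = "Customer orders are arriving late."

why_chain = [
    "Packages leave the warehouse after the courier cutoff time.",
    "Order picking is not finished until late afternoon.",
    "Pickers wait for a single daily batch of printed pick lists.",
    "The order system exports pick lists only once per day.",
    "No one has reviewed the export schedule since the system was installed.",
]

print(f"Problem: {problem}")
for i, answer in enumerate(why_chain, start=1):
    print(f"Why #{i}: {answer}")

# The last answer in the chain is treated as the candidate root cause.
root_cause = why_chain[-1]
print(f"Candidate root cause: {root_cause}")
```

The value of the exercise lies in generating honest answers at each step, not in the tooling; a whiteboard works just as well as a script.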

Problem Solving in Action

Here are some real-life examples of problem-solving in action in various organizational or business contexts:

  • ‍ Customer Complaint Resolution: A customer service team at an e-commerce company receives numerous complaints about delayed deliveries. The team investigates the root causes, which may include inefficient order processing or problems with third-party couriers. They implement process improvements, streamline communication with courier services, and provide proactive delivery updates to customers to address the issue and enhance customer satisfaction. ‍
  • Cost Reduction Initiative: A manufacturing company realizes that its production costs are escalating, affecting its profit margins. To solve this problem, the company engages in a cost reduction initiative. They scrutinize every aspect of their operations, identify inefficiencies, negotiate better deals with suppliers, optimize production processes, and implement energy-saving measures, ultimately reducing production costs without compromising quality. ‍
  • Employee Engagement Improvement: An organization observes declining employee engagement levels, leading to increased turnover. HR and management collaborate to identify the underlying issues, which may include inadequate training, lack of career growth opportunities, or poor work-life balance. They develop and implement employee engagement strategies, such as training programs, mentorship initiatives, and flexible work arrangements, to boost morale and retain talent. ‍
  • Product Quality Enhancement: A technology company receives customer complaints about a particular product's reliability and performance. The engineering team conducts root cause analysis, identifies design flaws, and works on product improvements. They also set up a system for gathering feedback from customers to continuously refine the product's quality.

These real-life examples demonstrate that problem-solving is an essential skill in organizations and businesses across various sectors. Effective problem-solving often involves collaboration among teams, data analysis, creative thinking, and the implementation of well-thought-out solutions to overcome challenges and achieve strategic objectives. ‍

Unraveling the Critical Thinking Puzzle

In a professional or organizational context, the relationship between critical thinking and problem-solving is symbiotic and indispensable. Critical thinking serves as the foundational framework that underpins effective problem-solving, while problem-solving is the practical application of critical thinking skills to address real-world challenges. Together, they form a dynamic duo that drives success and innovation.

Critical thinking equips individuals and teams with the capacity to analyze information, evaluate alternatives, and make informed decisions. It encourages open-mindedness, skepticism, and the ability to see beyond the surface, fostering a culture of intellectual rigor. In the context of problem-solving, critical thinking helps define the problem accurately, assess potential solutions objectively, and identify the most appropriate course of action.

Problem-solving, on the other hand, puts critical thinking into action. It involves taking the insights gained through critical analysis and applying them to real-world scenarios. Effective problem-solving hinges on the ability to break down complex challenges, generate creative solutions, and adapt strategies as circumstances evolve—all skills deeply rooted in critical thinking.

In sum, critical thinking without effective problem-solving remains theoretical, and problem-solving without critical thinking lacks depth and efficacy. In the professional and organizational realm, these two capabilities complement and strengthen each other, fostering innovation, informed decision-making, and the ability to navigate the complexities of today's dynamic business environments. Together, they empower individuals and organizations to thrive, adapt, and excel in a world where challenges and opportunities abound.

Critical Thinking and Problem-Solving

Jump to:

  • What Is Critical Thinking?
  • Characteristics of Critical Thinking
  • Why Teach Critical Thinking?
  • Teaching Strategies to Help Promote Critical Thinking Skills
  • References and Resources

What Is Critical Thinking?

When examining the vast literature on critical thinking, various definitions of critical thinking emerge. Here are some samples:

  • "Critical thinking is the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action" (Scriven, 1996).
  • "Most formal definitions characterize critical thinking as the intentional application of rational, higher order thinking skills, such as analysis, synthesis, problem recognition and problem solving, inference, and evaluation" (Angelo, 1995, p. 6).
  • "Critical thinking is thinking that assesses itself" (Center for Critical Thinking, 1996b).
  • "Critical thinking is the ability to think about one's thinking in such a way as 1. To recognize its strengths and weaknesses and, as a result, 2. To recast the thinking in improved form" (Center for Critical Thinking, 1996c).

Perhaps the simplest definition is offered by Beyer (1995) : "Critical thinking... means making reasoned judgments" (p. 8). Basically, Beyer sees critical thinking as using criteria to judge the quality of something, from cooking to a conclusion of a research paper. In essence, critical thinking is a disciplined manner of thought that a person uses to assess the validity of something (statements, news stories, arguments, research, etc.).

Characteristics of Critical Thinking

Wade (1995) identifies eight characteristics of critical thinking. Critical thinking involves asking questions, defining a problem, examining evidence, analyzing assumptions and biases, avoiding emotional reasoning, avoiding oversimplification, considering other interpretations, and tolerating ambiguity. Dealing with ambiguity is also seen by Strohm & Baukus (1995) as an essential part of critical thinking, "Ambiguity and doubt serve a critical-thinking function and are a necessary and even a productive part of the process" (p. 56).

Another characteristic of critical thinking identified by many sources is metacognition. Metacognition is thinking about one's own thinking. More specifically, "metacognition is being aware of one's thinking as one performs specific tasks and then using this awareness to control what one is doing" (Jones & Ratcliff, 1993, p. 10 ).

In the book, Critical Thinking, Beyer elaborately explains what he sees as essential aspects of critical thinking. These are:

  • Dispositions: Critical thinkers are skeptical, open-minded, value fair-mindedness, respect evidence and reasoning, respect clarity and precision, look at different points of view, and will change positions when reason leads them to do so.
  • Criteria: To think critically, one must apply criteria and establish the conditions that must be met for something to be judged as believable. Although the argument can be made that each subject area has different criteria, some standards apply to all subjects. "... an assertion must... be based on relevant, accurate facts; based on credible sources; precise; unbiased; free from logical fallacies; logically consistent; and strongly reasoned" (p. 12).
  • Argument: A statement or proposition with supporting evidence. Critical thinking involves identifying, evaluating, and constructing arguments.
  • Reasoning: The ability to infer a conclusion from one or multiple premises. To do so requires examining logical relationships among statements or data.
  • Point of View: The way one views the world, which shapes one's construction of meaning. In a search for understanding, critical thinkers view phenomena from many different points of view.
  • Procedures for Applying Criteria: Other types of thinking use a general procedure. Critical thinking makes use of many procedures. These procedures include asking questions, making judgments, and identifying assumptions.

Why Teach Critical Thinking?

Oliver & Utermohlen (1995) see students as too often being passive receptors of information. Through technology, the amount of information available today is massive. This information explosion is likely to continue in the future. Students need a guide to weed through the information and not just passively accept it. Students need to "develop and effectively apply critical thinking skills to their academic studies, to the complex problems that they will face, and to the critical choices they will be forced to make as a result of the information explosion and other rapid technological changes" (Oliver & Utermohlen, p. 1).

As mentioned in the section, Characteristics of Critical Thinking , critical thinking involves questioning. It is important to teach students how to ask good questions, to think critically, in order to continue the advancement of the very fields we are teaching. "Every field stays alive only to the extent that fresh questions are generated and taken seriously" (Center for Critical Thinking, 1996a ).

Beyer sees the teaching of critical thinking as important to the very state of our nation. He argues that to live successfully in a democracy, people must be able to think critically in order to make sound decisions about personal and civic affairs. If students learn to think critically, then they can use good thinking as the guide by which they live their lives.

Teaching Strategies to Help Promote Critical Thinking

The 1995 issue of the journal Teaching of Psychology (Volume 22, Issue 1) is devoted to the teaching of critical thinking. Most of the strategies included in this section come from the articles in that issue.

  • CATS (Classroom Assessment Techniques): Angelo stresses the use of ongoing classroom assessment as a way to monitor and facilitate students' critical thinking. An example of a CAT is to ask students to write a "Minute Paper" responding to questions such as "What was the most important thing you learned in today's class? What question related to this session remains uppermost in your mind?" The teacher selects some of the papers and prepares responses for the next class meeting.
  • Cooperative Learning Strategies: Cooper (1995) argues that putting students in group learning situations is the best way to foster critical thinking. "In properly structured cooperative learning environments, students perform more of the active, critical thinking with continuous support and feedback from other students and the teacher" (p. 8).
  • Case Study /Discussion Method: McDade (1995) describes this method as the teacher presenting a case (or story) to the class without a conclusion. Using prepared questions, the teacher then leads students through a discussion, allowing students to construct a conclusion for the case.
  • Using Questions: King (1995) identifies ways of using questions in the classroom:
  • Reciprocal Peer Questioning: Following the lecture, the teacher displays a list of question stems (such as "What are the strengths and weaknesses of ...?"). Students must write questions about the lecture material. In small groups, the students ask each other the questions. Then the whole class discusses some of the questions from each small group.
  • Reader's Questions: Require students to write questions on assigned reading and turn them in at the beginning of class. Select a few of the questions as the impetus for class discussion.
  • Conference Style Learning: The teacher does not "teach" the class in the sense of lecturing. The teacher is a facilitator of a conference. Students must thoroughly read all required material before class. Assigned readings should be in the zone of proximal development. That is, readings should be able to be understood by students, but also challenging. The class consists of the students asking questions of each other and discussing these questions. The teacher does not remain passive, but rather, helps "direct and mold discussions by posing strategic questions and helping students build on each others' ideas" (Underwood & Wald, 1995, p. 18 ).
  • Use Writing Assignments: Wade sees the use of writing as fundamental to developing critical thinking skills. "With written assignments, an instructor can encourage the development of dialectic reasoning by requiring students to argue both [or more] sides of an issue" (p. 24).
  • Written dialogues: Give students written dialogues to analyze. In small groups, students must identify the different viewpoints of each participant in the dialogue. They must look for biases, presence or exclusion of important evidence, alternative interpretations, misstatement of facts, and errors in reasoning. Each group must decide which view is the most reasonable. After coming to a conclusion, each group acts out their dialogue and explains their analysis of it.
  • Spontaneous Group Dialogue: One group of students is assigned roles to play in a discussion (such as leader, information giver, opinion seeker, and disagreer). Four observer groups are formed with the functions of determining what roles are being played by whom, identifying biases and errors in thinking, evaluating reasoning skills, and examining ethical implications of the content.
  • Ambiguity: Strohm & Baukus advocate producing much ambiguity in the classroom. Don't give students clear cut material. Give them conflicting information that they must think their way through.

References and Resources

  • Angelo, T. A. (1995). Beginning the dialogue: Thoughts on promoting critical thinking: Classroom assessment for critical thinking. Teaching of Psychology, 22(1), 6-7.
  • Beyer, B. K. (1995). Critical thinking. Bloomington, IN: Phi Delta Kappa Educational Foundation.
  • Center for Critical Thinking (1996a). The role of questions in thinking, teaching, and learning. [On-line]. Available HTTP: http://www.criticalthinking.org/University/univlibrary/library.nclk
  • Center for Critical Thinking (1996b). Structures for student self-assessment. [On-line]. Available HTTP: http://www.criticalthinking.org/University/univclass/trc.nclk
  • Center for Critical Thinking (1996c). Three definitions of critical thinking [On-line]. Available HTTP: http://www.criticalthinking.org/University/univlibrary/library.nclk
  • Cooper, J. L. (1995). Cooperative learning and critical thinking. Teaching of Psychology, 22(1), 7-8.
  • Jones, E. A. & Ratcliff, G. (1993). Critical thinking skills for college students. National Center on Postsecondary Teaching, Learning, and Assessment, University Park, PA. (Eric Document Reproduction Services No. ED 358 772)
  • King, A. (1995). Designing the instructional process to enhance critical thinking across the curriculum: Inquiring minds really do want to know: Using questioning to teach critical thinking. Teaching of Psychology, 22 (1) , 13-17.
  • McDade, S. A. (1995). Case study pedagogy to advance critical thinking. Teaching Psychology, 22(1), 9-10.
  • Oliver, H. & Utermohlen, R. (1995). An innovative teaching strategy: Using critical thinking to give students a guide to the future.(Eric Document Reproduction Services No. 389 702)
  • Robertson, J. F. & Rane-Szostak, D. (1996). Using dialogues to develop critical thinking skills: A practical approach. Journal of Adolescent & Adult Literacy, 39(7), 552-556.
  • Scriven, M. & Paul, R. (1996). Defining critical thinking: A draft statement for the National Council for Excellence in Critical Thinking. [On-line]. Available HTTP: http://www.criticalthinking.org/University/univlibrary/library.nclk
  • Strohm, S. M., & Baukus, R. A. (1995). Strategies for fostering critical thinking skills. Journalism and Mass Communication Educator, 50 (1), 55-62.
  • Underwood, M. K., & Wald, R. L. (1995). Conference-style learning: A method for fostering critical thinking with heart. Teaching Psychology, 22(1), 17-21.
  • Wade, C. (1995). Using writing to develop and assess critical thinking. Teaching of Psychology, 22(1), 24-28.

Other Reading

  • Bean, J. C. (1996). Engaging ideas: The professor's guide to integrating writing, critical thinking, & active learning in the classroom. Jossey-Bass.
  • Bernstein, D. A. (1995). A negotiation model for teaching critical thinking. Teaching of Psychology, 22(1), 22-24.
  • Carlson, E. R. (1995). Evaluating the credibility of sources. A missing link in the teaching of critical thinking. Teaching of Psychology, 22(1), 39-41.
  • Facione, P. A., Sanchez, C. A., Facione, N. C., & Gainen, J. (1995). The disposition toward critical thinking. The Journal of General Education, 44(1), 1-25.
  • Halpern, D. F., & Nummedal, S. G. (1995). Closing thoughts about helping students improve how they think. Teaching of Psychology, 22(1), 82-83.
  • Isbell, D. (1995). Teaching writing and research as inseparable: A faculty-librarian teaching team. Reference Services Review, 23(4), 51-62.
  • Jones, J. M. & Safrit, R. D. (1994). Developing critical thinking skills in adult learners through innovative distance learning. Paper presented at the International Conference on the practice of adult education and social development. Jinan, China. (Eric Document Reproduction Services No. ED 373 159)
  • Sanchez, M. A. (1995). Using critical-thinking principles as a guide to college-level instruction. Teaching of Psychology, 22(1), 72-74.
  • Spicer, K. L. & Hanks, W. E. (1995). Multiple measures of critical thinking skills and predisposition in assessment of critical thinking. Paper presented at the annual meeting of the Speech Communication Association, San Antonio, TX. (Eric Document Reproduction Services No. ED 391 185)
  • Terenzini, P. T., Springer, L., Pascarella, E. T., & Nora, A. (1995). Influences affecting the development of students' critical thinking skills. Research in Higher Education, 36(1), 23-39.

On the Internet

  • Carr, K. S. (1990). How can we teach critical thinking. Eric Digest. [On-line]. Available HTTP: http://ericps.ed.uiuc.edu/eece/pubs/digests/1990/carr90.html
  • The Center for Critical Thinking (1996). Home Page. Available HTTP: http://www.criticalthinking.org/University/
  • Ennis, Bob (No date). Critical thinking. [On-line], April 4, 1997. Available HTTP: http://www.cof.orst.edu/cof/teach/for442/ct.htm
  • Montclair State University (1995). Curriculum resource center. Critical thinking resources: An annotated bibliography. [On-line]. Available HTTP: http://www.montclair.edu/Pages/CRC/Bibliographies/CriticalThinking.html
  • No author, No date. Critical Thinking is ... [On-line], April 4, 1997. Available HTTP: http://library.usask.ca/ustudy/critical/
  • Sheridan, Marcia (No date). Internet education topics hotlink page. [On-line], April 4, 1997. Available HTTP: http://sun1.iusb.edu/~msherida/topics/critical.html

The Peak Performance Center

Critical Thinking

Critical thinking refers to the process of actively analyzing, assessing, synthesizing, evaluating and reflecting on information gathered from observation, experience, or communication. It is thinking in a clear, logical, reasoned, and reflective manner to solve problems or make decisions. Basically, critical thinking is taking a hard look at something to understand what it really means.

Critical Thinkers

Critical thinkers do not simply accept all ideas, theories, and conclusions as facts. They have a mindset of questioning ideas and conclusions. They make reasoned judgments that are logical and well thought out by assessing the evidence that supports a specific theory or conclusion.

When presented with a new piece of information, critical thinkers may ask questions such as:

“What information supports that?”

“How was this information obtained?”

“Who obtained the information?”

“How do we know the information is valid?”

“Why is it that way?”

“What makes it do that?”

“How do we know that?”

“Are there other possibilities?”

Combination of Analytical and Creative Thinking

Many people perceive critical thinking just as analytical thinking. However, critical thinking incorporates both analytical thinking and creative thinking. Critical thinking does involve breaking down information into parts and analyzing the parts in a logical, step-by-step manner. However, it also involves challenging consensus to formulate new creative ideas and generate innovative solutions. It is critical thinking that helps to evaluate and improve your creative ideas.

Elements of Critical Thinking

Critical thinking involves:

  • Gathering relevant information
  • Evaluating information
  • Asking questions
  • Assessing bias or unsubstantiated assumptions
  • Making inferences from the information and filling in gaps
  • Using abstract ideas to interpret information
  • Formulating ideas
  • Weighing opinions
  • Reaching well-reasoned conclusions
  • Considering alternative possibilities
  • Testing conclusions
  • Verifying if evidence/argument support the conclusions

Developing Critical Thinking Skills

Critical thinking is considered a higher-order thinking skill, alongside skills such as analysis, synthesis, deduction, inference, reasoning, and evaluation. To demonstrate critical thinking, you would need to develop skills in:

Interpreting : understanding the significance or meaning of information

Analyzing : breaking information down into its parts

Connecting : making connections between related items or pieces of information.

Integrating : connecting and combining information to better understand the relationship between the information.

Evaluating : judging the value, credibility, or strength of something

Reasoning : creating an argument through logical steps

Deducing : forming a logical opinion about something based on the information or evidence that is available

Inferring : figuring something out through reasoning based on assumptions and ideas

Generating : producing new information, ideas, products, or ways of viewing things.

Critical thinking definition

Critical thinking, as described by Oxford Languages, is the objective analysis and evaluation of an issue in order to form a judgement.

The critical thinking process requires the active and skillful evaluation, assessment, and synthesis of information obtained from, or generated by, observation, knowledge, reflection, or conversation, used as a guide to belief and action, which is why it is so often emphasized in education and academic work.

Some even view it as a backbone of modern thought.

However, it is a skill, and skills must be trained and encouraged if they are to be used to their full potential.

People turn to various approaches to improve their critical thinking, such as:

  • Developing technical and problem-solving skills
  • Engaging in more active listening
  • Actively questioning their assumptions and beliefs
  • Seeking out more diversity of thought
  • Opening up their intellectual curiosity

Is critical thinking useful in writing?

Critical thinking can help in planning your paper and making it more concise, though the connection is not obvious at first. Here are some of the questions you should ask yourself when bringing critical thinking into your writing:

  • What information should be included?
  • Which information resources should the author look to?
  • What degree of technical knowledge should the report assume its audience has?
  • What is the most effective way to show information?
  • How should the report be organized?
  • How should it be designed?
  • What tone and level of language difficulty should the document have?

Critical thinking applies not only to the outline of your paper; it also raises the question of how critical thinking can be used to solve problems within the topic you are writing about.

Say you have a PowerPoint presentation on how critical thinking can reduce poverty in the United States. You will first have to define critical thinking for your audience, and then use critical thinking questions and related terms so that viewers become familiar with your methods and can follow the thinking process behind them.

Relationship between problem solving and critical thinking skills

Table 1. Relationship between problem solving and critical thinking skills (table not reproduced here).

Algorithmic thinking, cooperativity, creativity, critical thinking, and problem solving: exploring the relationship between computational thinking skills and academic performance

  • Published: 11 August 2017
  • Volume 4, pages 355–369 (2017)

  • Tenzin Doleck, Paul Bazelais, David John Lemay, Anoop Saxena & Ram B. Basnet

The continued call for twenty-first century skills renders computational thinking a topical subject of study, as it is increasingly recognized as a fundamental competency for the contemporary world. Yet its relationship to academic performance is poorly understood. In this paper, we explore the association between computational thinking and academic performance. We test a structural model—employing a partial least squares approach—to assess the relationship between computational thinking skills and academic performance. Surprisingly, we find no association between computational thinking skills and academic performance (except for a link between cooperativity and academic performance). These results are discussed respecting curricular mandated instruction in higher-order thinking skills and the importance of curricular alignment between instructional objectives and evaluation approaches for successfully teaching and learning twenty-first-century skills.
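
The authors' structural model is not reproduced here, but purely to illustrate the general idea of relating computational thinking subscale scores to an academic performance measure with a partial least squares approach, the following is a minimal sketch using scikit-learn's PLSRegression on randomly generated placeholder data. The variable names, sample size, and the weak simulated effect are assumptions for illustration only, not the study's data or its modeling software.

```python
# Illustrative only: relate computational-thinking subscale scores to an academic
# performance measure with partial least squares regression. The data are random
# placeholders, and scikit-learn's PLSRegression stands in for whatever PLS tool
# a study of this kind might actually use.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_students = 200

# Hypothetical subscale scores: algorithmic thinking, cooperativity, creativity,
# critical thinking, problem solving (one column each, one row per student).
X = rng.normal(size=(n_students, 5))

# Hypothetical academic performance (a GPA-like score) with a weak dependence on
# the second subscale plus noise, mimicking a small, isolated association.
y = 3.0 + 0.2 * X[:, 1] + rng.normal(scale=0.5, size=n_students)

pls = PLSRegression(n_components=2)
pls.fit(X, y)

# Cross-validated R^2 gives a rough sense of how much variance the latent
# components explain; values near zero suggest little or no association.
r2 = cross_val_score(pls, X, y, cv=5, scoring="r2").mean()
print(f"Mean cross-validated R^2: {r2:.3f}")
print("PLS regression coefficients per subscale:", np.round(pls.coef_.ravel(), 3))
```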

Author information

Authors and affiliations

McGill University, 3700 McTavish St., Montreal, QC, H3A 1Y2, Canada

Tenzin Doleck, Paul Bazelais, David John Lemay & Anoop Saxena

Colorado Mesa University, Grand Junction, CO, USA

Ram B. Basnet

Corresponding author

Correspondence to Tenzin Doleck.

About this article

Doleck, T., Bazelais, P., Lemay, D.J. et al. Algorithmic thinking, cooperativity, creativity, critical thinking, and problem solving: exploring the relationship between computational thinking skills and academic performance. J. Comput. Educ. 4, 355–369 (2017). https://doi.org/10.1007/s40692-017-0090-9

Received: 27 May 2017

Revised: 10 July 2017

Accepted: 07 August 2017

Published: 11 August 2017

Issue Date: December 2017

DOI: https://doi.org/10.1007/s40692-017-0090-9

Keywords

  • Computational thinking
  • Computational thinking skills
  • Academic performance
  • CEGEP students
  • Curricular alignment

IMAGES

  1. What is Critical Thinking and Problem Solving?

  2. What is Critical Thinking and Problem Solving?

  3. Critical Thinking and Problem Solving Skills by Heba Mansour on Prezi

  4. Critical Thinking Skills

  5. What Is Critical Thinking And Creative Problem Solving

  6. problem solving and critical thinking are two main forms of

VIDEO

  1. Core Critical thinking Skills

  2. The Relationship between Critical Thinking Skills and Academic Writing

  3. Top Critical Thinking Skills

  4. Creative Thinking VS Critical Thinking

  5. Critical Thinking and Problem-Solving Skills for High School and College Students

  6. Teaching Critical Thinking by @TeacherToolkit

COMMENTS

  1. Critical Thinking vs. Problem-Solving: What's the Difference?

    Critical thinking and problem-solving can both help you resolve challenges, but the two practices have distinct purposes and strategies. Critical thinking is a mode of thinking, whereas problem-solving is a set of solution-oriented strategies.

  2. Critical Thinking versus Problem Solving

    The first step to enhancing your critical-thinking and problem-solving skills is to think about them and become aware of them; then you can actively practice to improve them. Critical thinking and problem-solving are two important "soft" or essential skills hiring managers look for. According to a LinkedIn survey, 57% of business ...

  3. Understanding the Complex Relationship between Critical Thinking and

    The advances in cognitive science in the 21st century have increased our understanding of the mental processes involved in thinking and reasoning, as well as memory, learning, and problem solving. Critical thinking is understood to include both a cognitive dimension and a disposition dimension (e.g., reflective thinking) and is defined as ...

  4. Critical Thinking and Decision-Making

    Definition. Simply put, critical thinking is the act of deliberately analyzing information so that you can make better judgements and decisions. It involves using things like logic, reasoning, and creativity, to draw conclusions and generally understand things better. This may sound like a pretty broad definition, and that's because critical ...

  5. Bridging critical thinking and transformative learning: The role of

    In recent decades, approaches to critical thinking have generally taken a practical turn, pivoting away from more abstract accounts - such as emphasizing the logical relations that hold between statements (Ennis, 1964) - and moving toward an emphasis on belief and action. According to the definition that Robert Ennis (2018) has been advocating for the last few decades, critical thinking is ...

  6. Critical Thinking and Problem-Solving Skills

    Critical thinking helps people solve problems systematically using facts and data. It involves going through a process of gathering information, analyzing, evaluating, and synthesizing the information to help solve problems in a timely manner. Logic plays a predominant role in critical thinking, because the goal of critical thinking is to solve ...

  7. Critical Thinking vs Problem Solving: Navigating Cognitive Approaches

    In a professional or organizational context, the relationship between critical thinking and problem-solving is symbiotic and indispensable. Critical thinking serves as the foundational framework that underpins effective problem-solving, while problem-solving is the practical application of critical thinking skills to address real-world challenges.

  8. Critical Thinking and Problem-Solving

    Critical thinking involves asking questions, defining a problem, examining evidence, analyzing assumptions and biases, avoiding emotional reasoning, avoiding oversimplification, considering other interpretations, and tolerating ambiguity. Dealing with ambiguity is also seen by Strohm & Baukus (1995) as an essential part of critical thinking ...

  9. Critical Thinking

    Critical thinking refers to the process of actively analyzing, assessing, synthesizing, evaluating and reflecting on information gathered from observation, experience, or communication. It is thinking in a clear, logical, reasoned, and reflective manner to solve problems or make decisions. Basically, critical thinking is taking a hard look at ...

  10. Critical Thinking and Problem Solving Skills: How these Skills are

    Pearson correlation test results (r) between students' critical-thinking skills and problem-solving ability did not show a strong relationship; the correlation was even slightly negative (-.05). Based on the results ... (A minimal sketch of how such a correlation is computed appears after this list.)

  11. PDF The Relationship between Critical Thinking Disposition and Problem

    Objective: The aim of the study was to investigate the relationship between critical thinking disposition and problem-solving skills in nurses. Material and method: The research was a descriptive, correlational study carried out at a private hospital in İstanbul, Turkey, between April and July 2018.

  12. PDF Investigating The Relationship Between Critical Thinking Skills and

    investigating the relationship between critical thinking and problem-solving skills, including the skills used to overcome problems of daily life, across different educational grades and professional groups (Koray et al., 2007; Cantürk, Günhan & Başer, 2009; Beşer & Kıssal, 2009; Choi, Lindquist & Song, 2014). A similar relationship is also considered to ...

  13. PDF Examining the Relationship between Pre-Service Teachers' Critical

    2011) and the relationship between problem-solving skills and self-efficacy (Altunçekiç, Yaman, Koray, 2005; Aylar & Aksin, 2011; Kesicioğlu & Güven, 2014; Yenice, 2012). These studies show that critical thinking disposition and problem-solving skills are related to teachers' self-efficacy. However, studies on the predictive power of

  14. Using Critical Thinking in Essays and other Assignments

    Critical thinking, as described by Oxford Languages, is the objective analysis and evaluation of an issue in order to form a judgement. The active and skillful evaluation, assessment, and synthesis of information obtained from, or generated by, observation, knowledge, reflection, acumen, or conversation, as a guide to belief and action, requires the critical-thinking process ...

  15. Collaborative Learning and Critical Thinking

    Collaborative learning is a relationship among learners that fosters positive interdependence, individual accountability, and interpersonal skills. "Critical thinking" involves asking appropriate questions, gathering and creatively sorting through relevant information, relating new information to existing knowledge, reexamining beliefs ...

  16. Relationship between problem solving and critical thinking skills

    Siswanto. Critical thinking skills are important in solving geometric problems. The intelligence aspect also influences problem-solving. Therefore, this study aimed to describe students ...

  17. Determine the Relationship Between the Disposition of Critical Thinking

    In their study "An Investigation of University Students’ Critical Thinking Disposition and Perceived Problem Solving Skills", Tumkaya, Albek and Aldag (2009) aimed to determine whether there was a significant relationship between disposition of critical thinking and problem solving skills, and whether university students' disposition of ...

  18. Algorithmic thinking, cooperativity, creativity, critical thinking, and

    The continued call for twenty-first century skills renders computational thinking a topical subject of study, as it is increasingly recognized as a fundamental competency for the contemporary world. Yet its relationship to academic performance is poorly understood. In this paper, we explore the association between computational thinking and academic performance. We test a structural model ...
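
Item 18 describes testing a structural model that links computational-thinking skills to academic performance. The study's actual analysis is a partial least squares path model; as a simplified, hypothetical stand-in, the sketch below regresses an invented GPA-like outcome on five invented subscale scores using ordinary least squares, only to make the idea of "association between skill measures and performance" concrete. Every variable name and value here is an assumption for illustration, not data or code from the study.

    # Minimal sketch: OLS regression of academic performance on five
    # computational-thinking subscales (all data invented for illustration).
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100

    # Columns: algorithmic thinking, cooperativity, creativity,
    # critical thinking, problem solving (standardized, hypothetical).
    X = rng.normal(size=(n, 5))
    # GPA-like outcome weakly related to the subscales, plus noise.
    gpa = 3.0 + X @ np.array([0.05, 0.02, 0.01, 0.08, 0.04]) \
          + rng.normal(scale=0.3, size=n)

    # Fit by least squares with an intercept column.
    X_design = np.column_stack([np.ones(n), X])
    coef, *_ = np.linalg.lstsq(X_design, gpa, rcond=None)

    labels = ["intercept", "algorithmic", "cooperativity",
              "creativity", "critical_thinking", "problem_solving"]
    for name, b in zip(labels, coef):
        print(f"{name:>18}: {b:+.3f}")

The estimated coefficients play the role that path weights play in a structural model: each one summarizes how strongly a given subscale is associated with the performance measure once the other subscales are held constant.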
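
The excerpt in item 10 above reports a Pearson correlation of about -.05 between students' critical-thinking skills and problem-solving ability. As a purely illustrative aid (not code or data from that study), the following minimal Python sketch shows how such a correlation coefficient is computed; the score values are invented.

    # Minimal sketch: Pearson correlation between two sets of scores.
    # The data are hypothetical, invented for illustration only.
    import numpy as np
    from scipy import stats

    # Hypothetical scores for 10 students on two instruments.
    critical_thinking = np.array([72, 65, 80, 58, 90, 75, 62, 85, 70, 68])
    problem_solving = np.array([61, 78, 55, 70, 64, 59, 73, 66, 60, 72])

    r, p_value = stats.pearsonr(critical_thinking, problem_solving)
    print(f"Pearson r = {r:.2f}, p = {p_value:.3f}")

A coefficient near zero (or slightly negative, such as -.05) indicates essentially no linear relationship between the two sets of scores, which is how the result quoted in item 10 should be read.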