Evidence Summary
Critical Thinking Exercises in the Classroom are
Helpful but Not Sufficient for Improving Critical Thinking Test Scores
A Review of:
Wallace, E. D., & Jefferson, R. N. (2013). Developing Critical
Thinking Skills For Information Seeking Success. New Review of Academic
Librarianship, 19(3), 246-255. http://dx.doi.org/10.1080/13614533.2013.802702
Reviewed by:
Heather Coates
Digital Scholarship & Data Management Librarian
Indiana University-Purdue University Indianapolis
(IUPUI) University Library
Indianapolis, Indiana, United States of America
Email: hcoates@iupui.edu
Received: 7 Mar. 2014    Accepted: 30 Apr. 2014
© 2014 Coates.
This is an Open Access article distributed under the terms of the Creative
Commons-Attribution-Noncommercial-Share Alike License 2.5 Canada (http://creativecommons.org/licenses/by-nc-sa/2.5/ca/),
which permits unrestricted use, distribution, and reproduction in any medium,
provided the original work is properly attributed, not used for commercial
purposes, and, if transformed, the resulting work is redistributed under the
same or similar license to this one.
Abstract
Objective – To determine whether a series of workbook exercises contributed to
improved critical thinking test scores.
Design – Quasi-experimental post-test design with a control group.
Setting – Military college in the United States of America.
Subjects – 76 undergraduates enrolled in a required freshman orientation
seminar.
Methods – Approximately one third of the enrolled participants (n=26) were
provided with a copy of the book Critical
Thinking: Building the Basics. A subset of exercises was completed
independently over three to four class sessions during the first three weeks of
the semester. The control group (n=50) did not receive any critical thinking
skills instruction. The iCritical Thinking Skills Test, an online exam
provided by Educational Testing Service (ETS), was administered to both groups
during a class session. The exam consists of seven task types (define, access,
evaluate, manage, integrate, create, and communicate), assessed through 14 tasks
based on real-world scenarios.
Main Results –
Approximately 20% of all students (15 of 76) passed the test: 9 from the
intervention group and 6 from the control group. Significant differences were detected
between the groups on the Integrate and Manage subtests. The range for
individual subtests and total scores was wide. Scores for two of the seven
subtests, Create and Evaluate, showed the greatest amount of variability; the
Communicate subtest scores had the least.
Conclusion – Limitations of the study include potential motivational differences
between the groups. Students who completed workbook exercises appeared to be
motivated to do well on the test, while those who did not seemed less
motivated. The authors suggest that the effectiveness of the exercises in
developing critical thinking skills may persuade administrators to consider
using such exercises in the classroom.
Commentary
Despite their perceived importance for student learning outcomes,
critical thinking skills are rarely taught explicitly in the college classroom.
In part, this is because few definitions of critical thinking are operational
enough to inform instruction and evaluation. The definitions included in a
recent systematic review (Behar-Horenstein & Niu, 2011) describe critical thinking
as an attitude, application of skills, and a process. Similarly, many
instructional methods have been used, but are not sufficiently characterized in
the literature. This, along with the preponderance of pre- and
quasi-experimental studies using small sample sizes, threatens internal
validity, thus limiting applicability to other instructional settings because
causal effects for the interventions cannot be asserted.
The study reviewed here examines this relatively understudied area, the
role of critical thinking skills in information seeking and use. Glynn's (2006)
critical appraisal tool, developed for library and information science research,
was used to appraise the article. Unfortunately, the article does not include
sufficient detail to support a comprehensive appraisal, and the study mirrors
many of the design weaknesses described above.
The major strengths of the study by Wallace and Jefferson (2013) are its use of
explicit instruction and active learning strategies. However, it is difficult to
establish the quality and generalizability of the evidence reported due to
incomplete description of the study. Critical elements of the study are not
described, including the population, sample, and the details of the
instructional intervention. Without this information, it is difficult to
determine the relevance to professional practice.
Another area of confusion is which of two testing instruments was used.
The authors report using the “iCritical Thinking Skills Test” offered by ETS.
No such test is listed by ETS, though ETS does offer the iCritical Thinking
Certification (ETS, 2010) as well as the iSkills assessment. Both tests include
the same seven task areas, but the iSkills test was designed and validated to
assess information literacy skills. Since the authors' citation does not match the
narrative description, it is impossible to determine whether the instrument
used was appropriate.
Further concerns about the testing situation bear on the validity of the test
results. The class period was not long enough to allow for
completion of the registration process (15 minutes) and the test (60 minutes).
If students perceived that the exercises and/or the test were irrelevant to the
course, it is likely that their performance does not reflect their actual
abilities. A lack of motivation may be a confounding variable for test performance.
Furthermore, the accuracy of the ANOVA results for the subtest differences is
questionable: the reported ANOVA results are inconsistent with the values in
the table, though they are consistent with the narrative. More generally, the authors do not
adequately discuss the implications of the findings, particularly the study
limitations and how they may be addressed in future research. Overall, the
missing and conflicting information presented in this article raises significant
concerns as to the validity and applicability of the findings.
Despite the methodological concerns, this study addresses a gap in
the literature. Given the increasing demand to demonstrate the value of higher
education, this is an area ripe for further study. However, future studies
should address the design limitations outlined by Behar-Horenstein and Niu
(2011) by using carefully designed quasi-experimental or experimental studies
that combine quantitative and qualitative approaches for measuring change in
critical thinking ability.
References
Behar-Horenstein, L., & Niu, L. (2011). Teaching critical thinking skills in higher education: A review of the literature. Journal of College Teaching & Learning, 8(2), 25-42. Retrieved from http://journals.cluteonline.com/index.php/TLC/article/view/3554/3601
Educational Testing Service. (2010). iCritical Thinking Certification. Retrieved from http://www.certiport.com/portal/common/documentlibrary/iCritical_Thinking-datasheet.pdf
Glynn, L. (2006). A critical appraisal tool for library and information research. Library Hi Tech, 24(3), 387-399. doi:10.1108/07378830610692154