Center for Assessment & Improvement of Learning
Sam Houston State University
Quality Enhancement Plan
In 2009, a team of faculty at Sam Houston State University (SHSU) developed a new course called Foundations of Science as part of the university's Quality Enhancement Plan (QEP) for reaccreditation by the Southern Association of Colleges and Schools. The primary goal of the course was to improve the scientific literacy and critical thinking of our students. We defined critical thinking as the process of drawing reasonable, fair-minded conclusions based on evidence and logic. This definition matches the process of scientific reasoning, wherein empirical evidence is evaluated in an objective manner. As part of the reaccreditation process, we were required to evaluate our students' gains in critical thinking resulting from having taken the course. We considered several possible instruments designed to assess critical thinking and chose the Critical Thinking Assessment Test (CAT) because the nature of the questions on the CAT fit perfectly with our working definition of critical thinking. Specifically, the test requires students to examine data and information, develop hypotheses, propose a means of testing those hypotheses, and draw reasonable conclusions. It also tests students' ability to recognize one of the most common fallacies that scientists (and people in general) must avoid, namely the False Cause fallacy. One of the many strong points of the instrument is that it does not require specific knowledge of a scientific discipline or set of facts; rather, it assesses students' scientific and logical reasoning process, so it can be used with students in any discipline.
We have administered the CAT almost every semester since 2009, both as a pre-test and a post-test, in order to gauge student gains in critical thinking skills. We hold a formal scoring session at the end of each semester and invite faculty members from across campus to participate in scoring the tests. Our results in all semesters have shown a statistically significant increase in critical thinking.
The CAT instrument has also served to highlight the importance of reading and written communication skills. Indeed, scorers frequently comment on the weaknesses in these areas revealed by students' responses to the test. Moreover, many faculty members have left the scoring sessions with a desire to incorporate CAT-like questions into their coursework in order to promote critical thinking.
The use of the CAT has been so successful at SHSU that we plan to continue using it for the Foundations of Science course, and also to assess the gains in critical thinking of students as a result of their overall education at SHSU. Specifically, it will be given to students in their junior/senior year and the results will be compared to those from other universities. The ability to compare results with those from other universities across the U.S. is one of the benefits of using the CAT instrument.
Marcus Gillespie, Associate Dean, GEO_BMG@SHSU.EDU
University Assessment of Learning Outcomes
We are using the CAT as an end-of-experience measure, in order to capture data on how well our students are doing with critical thinking and problem solving as they prepare to graduate. To this end, we are administering the CAT in 3000- and 4000-level courses from each of our academic colleges over a three-year cycle. Each year, we anticipate collecting approximately 500 completed CAT tests for scoring by a cross-discipline group of faculty scorers. These data will be used to inform our core assessment efforts; however, results will also be provided to the participating colleges and departments for use in programmatic assessment. Over time, it is hoped that these data will help inform curricular and pedagogical changes to improve critical thinking for our students. We are particularly excited to dive deeper into the data we receive about our students' performance. Although the various institutional and college averages have their role, we want to begin examining the data by student demographic characteristics, such as gender, ethnicity, and socio-economic status. Over time, we are also looking to align the data we get from the CAT with other critical thinking measures we have in place on campus, to better inform both our CAT results and the results from those locally developed measures.
Jeff Roberts, Director of Assessment, jlr022@SHSU.EDU
Florida State University has adopted the Critical Thinking Assessment Test as the assessment instrument for its Quality Enhancement Plan, Think FSU. The decision was based on several factors, including proven validity and reliability and widespread usage at similar institutions. Our choice has proven to be a solid one. In addition to the characteristics mentioned above, we have been extremely pleased with the broad support we receive from the CAT team.
Early in our implementation, we visited with the team to discuss the test and to sketch out an assessment plan. This experience was typical of our interactions with the CAT group. They worked to understand our objectives and, based upon that information, provided advice on how to make the most of our assessments. These conversations included recommendations ranging from sampling strategies to nuancing the script in order to get authentic student participation. This individualized attention is not limited to FSU. At other training events, colleagues from around the country have commented on the outstanding level of support they receive from the CAT group.
An added feature of our involvement with the CAT group has been the attention paid to methods for enhancing critical thinking skills among our students. Assessment is important, but the ultimate goal is improved skills. As part of our faculty development efforts, we brought the CAT folks to FSU for a session on constructing assignments aimed at boosting critical thinking. Participants continue to talk about the lessons learned as part of the sessions. There is little doubt that these sessions resulted in changes to the classroom experience around campus.
Because of the statistical strength of the CAT and the comprehensive support offered, our decision to utilize the CAT has proven to be one of the smartest we have made to date.
Lynn Hogan, Director, Critical Thinking Initiatives, firstname.lastname@example.org
Laurie Molina, Associate Director, Critical Thinking Initiatives, email@example.com
Keene State College uses the Critical Thinking Assessment Test to assess critical thinking as one of our college-wide learning outcomes. We chose the CAT over other options because we have local control over how we use the test. We chose to administer it as a pre- and post-test in selected first-year and upper-level courses across the curriculum. We also chose the CAT because scoring the tests locally provides important faculty development opportunities. We believe that, over time, using the CAT will improve the teaching of critical thinking skills on our campus. The follow-up rescoring and data analysis by Tennessee Tech ensures that our scores are accurate and comparable to those of other institutions.