Reducing test anxiety with standards-based grading
What we know -- and don't know -- about test anxiety and alternative assessments.
If you talk with someone who uses some sort of alternative assessment — whether standards-based grading, mastery-based testing, specifications grading, or something else — they will almost certainly tell you how it reduces students’ test anxiety.
In today’s post, I’m going to review what we actually know, what we think we know, and what we really don’t know about how these sorts of assessments affect test anxiety.
I’m going to stick with the term “SBG” (Standards-Based Grading), since that’s the context in which the most relevant work has been done. However, authors have examined anxiety across many forms of alternative assessments: Specifications grading, mastery-based testing, and others.
When we talk about assessments, the topic of “anxiety” or “stress” inevitably comes up. These words can mean many things, and articles often use these words in an interchangeable and general sense. There are many more specific, and better-studied, forms of anxiety. Some common ones include test anxiety, subject-specific anxiety (such as “math anxiety”), social anxiety, and communication anxiety.
We’ll mostly focus on test anxiety: Anxiety that is induced by concern about potential negative consequences resulting from performance on an exam or similar assessment. There are a huge range of symptoms: Difficulty concentrating, panic attacks, racing thoughts, and headaches are just a few of the more common ones. These can happen both before and during assessments.
It’s well established that students with higher test anxiety perform worse than those with lower test anxiety.1 There is debate about why this happens, but students regularly report that the symptoms directly interfere with their ability to process, understand, and recall material. To measure “performance”, studies use overall GPAs, grades in individual classes, or scores on specific assessments.
It’s also known that female students tend to have higher levels of test anxiety than male students — although this doesn’t necessarily correspond to lower performance than male students, due to a variety of confounding factors.2
One note for what follows: I’ll be careful to write test anxiety when discussing results specific to this type of anxiety. I’ll say “anxiety” or “stress” for cases where authors are discussing more generalized, non-specific anxiety.
If there’s one thing we know for sure, it’s this: Students self-report lower stress in any kind of SBG class. It’s almost impossible to find an SBG article that doesn’t mention self-reported lower stress.
Many of the arguments about why SBG should reduce stress are based on student comments, often from SETs (Student Evaluations of Teaching) or instructor-created surveys. Other arguments are from first principles, and go something like this: A key tenet of SBG is that students should have multiple opportunities, without penalty, to demonstrate their understanding of each objective. If students know that there are multiple chances to earn credit, then each individual assessment has lower stakes, and hence lower stress. (But, be careful: Literature seems to be mixed on whether lower-stakes assignments actually reduce test anxiety compared to higher-stakes ones.)
As a practitioner of SBG, I’ve seen these exact things, and made the same arguments. But wouldn’t it be nice to have some slightly harder data?
One of the best articles studying general stress is:
Harsy, A., Carlson, C., & Klamerus, L. (2020). An Analysis of the Impact of Mastery-Based Testing in Mathematics Courses. PRIMUS, 1-18. Available online.
In this article, students took instructor-created surveys at different points in the semester. On these, they self-reported how much anxiety they felt before exams in an SBG class vs. other classes, and also self-reported how their anxiety changed throughout the semester. The key takeaways are:
Students in SBG classes overwhelmingly reported less anxiety before exams (compared to traditionally graded classes).
Students in SBG classes started the semester significantly more anxious but ended significantly less anxious (again, versus traditionally graded classes).
A likely source of the early anxiety is the unfamiliar assessment system itself. This anxiety reduces as students become familiar with the system.
This is one article, but it’s representative of many others. There are some caveats: All amounts and changes in stress are self-reported. Survey methods vary a lot across articles, and few (if any) use a validated instrument to measure any type of anxiety. Surveys tend to mix “anxiety”, “test anxiety”, “stress”, etc., so it’s not clear that instructors and students even mean the same things when they talk about “stress”.
Enter a new article:
Lewis, D. (2019). Student anxiety in standards-based grading in mathematics courses. Innovative Higher Education, 1-12. Available online.
Lewis administered several validated instruments that measure test anxiety, including the Short Test Anxiety Inventory (STAI). These were administered both pre- and post-semester in SBG classes.
Students self-reported lower test anxiety in the SBG classes, as expected.
But... their actual test anxiety (as measured by the STAI) increased during the semester.
It’s not surprising that test anxiety could increase throughout a semester, even in an SBG class. Reassessments eventually come to an end. Exams are still exams. But how could students self-report less test anxiety, yet still experience more?
Perhaps their stress was increasing, but just not as much as in non-SBG classes? Nope: The negative correlation between test anxiety and number of standards completed (standing in for “performance”) was in line with expectations from the literature. As Lewis says, “If the test anxiety was having less of an effect in these SBG courses, this correlation should be closer to zero.”
Here’s where a second (and very recent) article comes in:
Lewis, D. Impacts of Standards-Based Grading on Students’ Mindset and Test Anxiety. To appear in Journal of the Scholarship of Teaching and Learning.
Lewis was just as confused as you and I are right now, and designed a follow-up study to try to clear up this seeming paradox. Lewis conjectured that students were thinking about test anxiety in other classes, not just the one SBG class in which the survey was given.
In this new study, students in several SBG classes again took pre- and post-surveys using the STAI. This time, they took the post-survey twice: Once with instructions to complete the survey as it applied to their SBG class, and once as it applied to their other classes.
The results aren’t just good, they’re awesome:
Students had significantly lower test anxiety in the SBG classes than in their other classes (on the post-test).
In the SBG classes, test anxiety decreased throughout the semester.
Test anxiety significantly increased in the other (non-SBG) classes.
In addition, and as expected, female students reported significantly higher test anxiety on the pre-test. But (in the author’s words), the post-tests showed that “the difference [in test anxiety] between male and female students was eliminated in the SBG course but persisted in students’ other courses.” (See the diagram below.)
There are some limitations: These articles focus only on certain early-curriculum math classes. In addition, the classes were taught by two instructors who had developed their approach to SBG together, so the assessment structures were quite similar.
This leaves a lot of questions open. Do the results apply beyond math, or even beyond the first few years of the math curriculum? How do things change with other instructors, who may use their own variations on SBG? What about related systems like mastery-based testing or specifications grading? Is it true that the reassessment aspect of these systems is the key feature in reducing test anxiety, or is it something else? Perhaps it’s the whole package of new assessment ideas and instructor attitude towards them?
So, what do we know? Students feel lower stress in SBG classes. There are some reasonable theoretical arguments for why students should feel this way. Validated surveys, when done carefully, confirm this and indicate that SBG can reduce or eliminate the gender difference in test anxiety. But these studies are limited in scope.
There’s a lot remaining to do: in particular, replicating Lewis’s second study in different classes, in different disciplines, and with different assessment structures. If you’re reading this and are in a position to do some follow-up studies, you could contribute a lot to our overall understanding of test anxiety and alternative assessments.
Schwarzer, R. (1990). Current trends in anxiety research. European perspectives in psychology, 2, 225-244. The correlation reported by Schwarzer is -0.21. Using a different approach, Cassady and Johnson report that cognitive test anxiety accounts for 7-8% of variance in exam scores: Cassady, J. C., & Johnson, R. E. (2002). Cognitive test anxiety and academic performance. Contemporary educational psychology, 27(2), 270-295.
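One standard way to connect a correlation to a “percent of variance” framing (a general property of Pearson r, not a computation from either paper): squaring r gives the shared variance, so Schwarzer’s r = -0.21 implies roughly 4.4%. Cassady and Johnson’s 7-8% comes from a different instrument and analysis, so the two figures need not match.

```python
# Shared variance implied by a Pearson correlation is r squared.
r = -0.21                        # correlation reported by Schwarzer (1990)
shared_variance = r ** 2         # fraction of variance in common
print(f"{shared_variance:.1%}")  # prints "4.4%"
```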
Hembree, R. (1988). Correlates, causes, effects, and treatment of test anxiety. Review of educational research, 58(1), 47-77.