Research shows that practice exams are among the most effective ways to prepare for real exams. They work through retrieval practice: the learner pulls information out of memory to apply it to each question, and that act of retrieval strengthens the information in memory (Oakley & Sejnowski, 2018). Practice exams also alert students to topics they do not fully understand and need to study before the real exam.

Practice exams also advance educational equity by reducing performance gaps rooted in students' different backgrounds. Many students arrive without the academic testing experience that others have had. Private college-prep schools are designed to develop the skills needed to succeed in college, so they model the exams and exam atmosphere that students will encounter there. Students from large public schools, which must serve a wide range of students, including those with no plans to attend college and those with special needs, often miss those experiences. They arrive at college already behind their peers despite being no different in innate academic ability.

Putting online practice exams into practice

I have used practice tests since I started teaching as a graduate student in the 1990s, and recently I decided to conduct a formal experiment on whether, and how much, they improve performance on the real thing. In my three fall 2020 classes (government, economics, and law), I found that practice tests (in this case, multiple choice, though I also provide sample essay tests and sample essays) boosted undergraduate scores by wide margins for both political science majors and nonmajors, as shown in Figure 1.

Figure 1. Summary of government, economics, and law exam scores

As you can see, students improved by nearly 25 percentage points from the practice test to the actual test in all three classes. The gains were not limited to a few strong performers: nearly 90 percent of students did better on the actual exam than on the simulation. (The rest matched their practice grade or saw their scores decline.) Those who took the practice quiz outperformed those who did not by an average of nearly 15 percentage points. The practice questions were not identical to the exam questions but were similar enough to show students what the exam would cover.

Adesope, Trevisan, and Sundararajan (2017) conducted a meta-analysis "to summarize the learning benefits of taking a practice test versus other forms of non-testing learning conditions" (p. 682). Specifically, they "examined all studies that compared learning benefits on a final test among practice retrieval and nonretrieval learning conditions . . . [with] results from 272 independent effects from 118 separate experiments" (p. 682). They found that "post hoc analysis also revealed that multiple-choice practice tests were associated with higher weighted mean effect size than short-answer tests, and differed significantly from them" (p. 673). The authors also noted that one full-length practice test was preferable to multiple pretest quizzes and that it should take place within a week of the exam. Moreover, Waddell (2017) found that "the benefits of practice testing were greater when the practice test and the final test formats were identical rather than dissimilar (assuming the practice test and final test utilized only one question type). This is due to a phenomenon known as Transfer-Appropriate Processing, which suggests that memories are easier to retrieve when the retrieval process is similar to how they were encoded during an initial learning activity."

It is also helpful to advise students to "learn from [their] mistakes" on the practice exams and to "decide how [they] can improve future test results" (Bryant, n.d.). To this end, Waddell (2017) recommends that instructors "increase the number of low-stakes quizzes on material students need to retain." St. Mary's College of California (n.d.) suggests that students "create test questions to help [them] actively think about the exam." The college adds that "many students fail multiple-choice questions because their expectations are that questions will be straightforward and easily recognized. Most professors develop multiple-choice questions by synthesizing material from more than one source, creating a dual-layered question demanding analysis of the question, rather than rote memory." Given that, students clearly need simulated exams so they know ex ante what to expect on the real test.

Some students took the practice exam multiple times to improve their scores, without any suggestion from me to do so. By the end of the semester, fewer and fewer students skipped the practice quiz; in some classes, every student took it.

Further benefits of exam simulations

I found that exam simulations also provide a trove of information. I could see more quickly which questions students struggled with, which told me to check whether I had keyed the correct answer, worded the question awkwardly, or not provided enough instruction on a particular point and needed to call attention to it. I could also use the data to determine whether students were improving from the practice exam to the real one. Such information was vital for my annual reports, which increasingly call for quantitative evidence to supplement a qualitative narrative.
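For instructors who want to do this kind of item analysis by hand, the idea can be sketched in a few lines of Python. The response data below are invented for illustration; in practice you would substitute an item-level export from your quiz tool.

```python
# Hypothetical sketch: flagging practice-quiz questions that many
# students missed, so the instructor can check the answer key,
# the wording, or the underlying instruction.
responses = {
    # question_id -> one correct/incorrect flag per student (invented data)
    "q1": [True, True, False, True],
    "q2": [False, False, True, False],   # widely missed -> worth reviewing
    "q3": [True, True, True, True],
}

FLAG_THRESHOLD = 0.5  # flag questions answered correctly by half the class or fewer

flagged = [
    q for q, marks in responses.items()
    if sum(marks) / len(marks) <= FLAG_THRESHOLD
]
print("Questions to review:", flagged)
```

With this toy data, only the widely missed question is flagged; the threshold is an assumption you would tune to your own classes.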

These exams also seemed to help our students on other types of tests. For comparative purposes, our students are required to take a nationally normed exam, which is administered online. When I was in charge of the process for our program, our students tended to rank below both our overall college scores and the national results. Since I began offering in-class simulations of multiple-choice and essay exams, the results have reversed: our majors now exceed the college and national averages.

Here are four tips for instructors willing to employ these methods in their classes, whether in person or online.

  1. Allow students to take practice exams multiple times so they can become more familiar with (and better at) the simulations.
  2. Be ready to read the results in real time, to correct any errors in the answers, and to boost instruction on test questions where many answers are missed.
  3. Collect the data so you can compare the results on the virtual quizzes to the real exams for purposes of evaluating your teaching methods.
  4. Write the exam first and the practice quiz second so you know that the latter is a good representation of the former. That also means getting that practice test to students early instead of at the last minute so they can take it multiple times and you can monitor the process for problems.
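For tip 3, the comparison of practice-quiz and real-exam results can be sketched with a short Python script. The scores below are invented for illustration; a real analysis would read an export from your LMS gradebook.

```python
# Hypothetical sketch for tip 3: comparing each student's practice-quiz
# score to their real-exam score (all data invented for illustration).
scores = [
    # (student_id, practice_pct, exam_pct)
    ("s01", 62, 88),
    ("s02", 70, 93),
    ("s03", 55, 79),
    ("s04", 81, 80),  # one student whose score slipped slightly
]

gains = [exam - practice for _, practice, exam in scores]
mean_gain = sum(gains) / len(gains)
share_improved = sum(g > 0 for g in gains) / len(gains)

print(f"Mean gain: {mean_gain:+.1f} points")
print(f"Improved: {share_improved:.0%} of students")
```

Summaries like these (average gain, share of students who improved) are exactly the quantitative evidence an annual report can cite alongside a qualitative narrative.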

The evidence strongly suggests that exam simulations improve both student performance and learning.


John A. Tures, PhD, is a professor of political science at LaGrange College in LaGrange, Georgia.

References

Adesope, O. O., Trevisan, D. A., & Sundararajan, N. (2017). Rethinking the use of tests: A meta-analysis of practice testing. Review of Educational Research, 87(3), 659–701. https://doi.org/10.3102/0034654316689306

Bryant, S. (n.d.). Test taking strategies seminar. https://my.ciu.edu/ICS/icsfs/Test_Taking_Strategies_Handout.pdf.pdf?target=fd2db443-4b34-48d1-a2a7-bd1179827824

Oakley, B., & Sejnowski, T. (2018). Learning how to learn. Penguin.

St. Mary's College of California. (n.d.). Test taking strategies. Tutorial and Academic Skills Center. https://www.stmarys-ca.edu/tutorial-and-academic-skills-center/additional-resources/test-taking-strategies

Waddell, R. (2017, October 26). 5 proven ways to get the most out of practice testing. Edmentum. https://blog.edmentum.com/5-proven-ways-get-most-out-practice-testing


