REVIEW: EFFECTIVE LEARNING

Learners are often confronted with large amounts of information to store and understand. Some cope with this excellently; others have more trouble. When a student has yet again barely passed a certain course, I sometimes ask what made the course such a struggle. Hardly ever do I hear a student admit that his or her study method was suboptimal. Yet research has shown that many learners tend to use study strategies that are not very effective, such as rereading or highlighting (e.g., Dunlosky et al., 2013). So what should students do to study optimally? And do teachers play a role in this? Those questions were addressed during the conference on Effective Learning of the Open University, held in Eindhoven on 7 November 2018, with the aim of bringing together two different worlds: that of teachers and that of educational scientists.

The conference started with a short introduction by the organiser, dr. Gino Camp. Next, prof. dr. Shana Carpenter from Iowa State University delivered a keynote on using pre-questions to enhance student learning. This was followed by a number of parallel sessions, for which participants could choose between presentations and workshops on a variety of topics. I attended the presentations by Ms. Milou van Harsel on model learning, by Ms. Sanne Rovers on self-regulation in problem-based learning, and by Mr. Tim Surma on effective learning and the role of the teacher.

As a scientist, I have always been very much interested in the learning brain. For many years, I focused on cognitive enhancement in vulnerable populations, such as elderly people with memory impairments, where the aim was to improve brain functioning pharmacologically. More recently, however, I have become interested in cognitive enhancement in an educational setting. Within the education system of my university, we tend to offer students an enormous amount of literature, yet expect them to know by themselves how to deal with it in the best way possible. I think this expectation is a bridge too far. Could we improve the academic performance of our own students by paying more attention to the learning process rather than the content that must be learned? If so, what actually happens in the brain while performing this more optimal activity? Finally, what is the role of the teacher in all of this? Should we assume that students are capable of finding their optimal study strategy themselves, or do we need to help them? I hoped to find a number of answers to these questions during this conference. First, the keynote on pre-questions and the presentation on self-regulation in problem-based learning could help me learn more about optimizing the study process of my students, as I work in a problem-based learning setting. Second, the presentation on the role of the teacher could provide me with insights into how I can shape the learning process of my students more effectively.

Keynote

Before the keynote started, dr. Gino Camp welcomed all participants to the conference. He expressed his hope that the participants would use the opportunity to discuss educational research from the perspective of implementing its outcomes in the classroom. Next, he introduced the main speaker, prof. dr. Shana Carpenter from Iowa State University. She delivered her keynote about the enhancement of learning by using pre-questions.

Prof. Carpenter is interested in the topic of retrieval practice. She explained what retrieval practice is by presenting perhaps the first study ever on this topic, published by Herbert Spitzer in 1939. Spitzer was originally interested in the so-called Ebbinghaus forgetting curve, which estimates how long newly stored information stays in long-term memory before it is forgotten (see also Murre & Dros, 2015). Spitzer was not primarily interested in the influence of practice testing itself. During the learning process in his experiment, he gave his pupils the opportunity to practice the material with multiple-choice questions, simply hoping to induce longer retention of the material, that is, a flatter forgetting curve. This was exactly what he found. Immediate recall in the form of a test helped pupils retain information for a longer period than they normally would (Spitzer, 1939). Now, many decades later, we call the paradigm in which somebody reads information and then practices with questions ‘retrieval practice’ and the outcome of such a study the ‘testing effect’ (Dunlosky et al., 2013).
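For intuition, the forgetting curve is often summarized by a simple exponential decay; this is one of several candidate functions compared by Murre and Dros (2015), and the notation below is illustrative rather than Spitzer's own:

```latex
% Retention R as a function of time t since learning,
% with S the stability (strength) of the memory trace:
R(t) = e^{-t/S}
% A successful retrieval attempt can be thought of as
% increasing S, flattening the curve and slowing forgetting.
```

On this reading, practice testing does not merely measure memory; it changes the decay rate itself, which is why tested material survives longer retention intervals.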

According to Carpenter, there is no doubt that the testing effect exists, given the many situations in which it has been found (for a review, see Karpicke, 2017). However, in contrast to Spitzer's original work, scientists now always use control groups in their research, she said. In the ‘new era’ of this type of research, Roediger and Karpicke (2006) were among the first to study this topic. The type of setup they used is still the most common in this field. They asked the experimental group to read a text and then perform retrieval practice on it. The control group read and then reread the text. After two days and after one week, participants took a final test. Not only were the participants in the experimental group better able to recall the information after two days, but this effect also lasted at least until after one week (Roediger & Karpicke, 2006).

What is interesting in this respect, but what Carpenter did not mention, is that the timing of the final test plays a crucial role in the findings, and it is not the same across studies. If one compares performance between the experimental and control groups directly after learning, it is not uncommon to find better performance in the control group than in the experimental group, so an opposite effect. A meta-analysis by Rowland (2014) indeed showed that testing effects are larger after longer retention intervals. For a novice researcher in this field, like myself, this may be one of the pitfalls of this research. Recently, I performed my first experiment on retrieval practice. I was mainly interested in how the brain responds when a participant performs such a paradigm; the retention interval was of minor importance to me at the time. Consequently, I did not pay attention to it and asked my participants to take their final test half an hour after the retrieval practice had taken place. This led my control group to outperform the retrieval-practice group. Interestingly, though, the brain responses suggested deeper memorization in the experimental group (Sambeth et al., unpublished data; see also Jia et al., 2020). In a discussion with a colleague from my university, and also with the organiser of the conference, dr. Camp, we concluded that this is in line with the type of learning taking place during retrieval practice. According to them, retrieval practice leads to deeper processing of material, which will eventually lead to better retention at longer intervals. Rereading, on the other hand, leads to surface learning, which is optimal at short intervals but not at longer ones. The brain may already show this differential response during learning.

Prof. Carpenter continued by exploring new avenues in this research field. Beyond advising our learners to do retrieval practice after processing material, is there more we can teach them based on scientific output? Carpenter elaborated on her keen interest in a particular type of practice testing. Sometimes, teachers use pre-questions before they even start providing information to their pupils. This way, they try to find out how much the learners already know and which topics might need particular attention in class. Carpenter wondered whether this type of questioning could increase the testing effect even further compared to merely doing retrieval practice after dealing with the course content. For this purpose, she has performed a number of lab and field experiments.

First, she explained one of the early studies in this field, by Rickards (1976). This researcher asked students to read a passage of text about an African country that does not exist. Half of the students received pre-questions relating to the passage, then read the content, and finally were tested with the same pre-questions again plus a number of new questions on that content, called post-questions. The control group read the passage and only received the post-questions at the end of class. The experimental group clearly performed better on the pre-questions than on the post-questions. On the post-questions, though, there was a slight benefit for the control group compared to the experimental group (Rickards, 1976), probably because they were not distracted by the pre-questions. Carpenter found similar results in her own study when the instruction material consisted of a video rather than a text, except that the experimental group also outperformed the control group on the post-questions (Carpenter & Toftness, 2017). This led her to conclude that pre-questions may have both specific and general benefits for learning, at least when videos are presented.

Somebody from the audience asked whether this would also work in a real-life setting instead of under such controlled circumstances. After all, in Carpenter's first study, standard videos were shown. If an experienced teacher discussed a topic without a script to work from, would the advantage of pre-questions still be there? This was exactly what prof. Carpenter studied next in a psychology class in which students were required to write a lab report. In this field study, the experimental group was first presented with pre-questions, then with a lecture on the course material, and finally with the pre-questions and post-questions (Carpenter et al., 2018). One week later, the students answered these questions again, now with new questions about the material added. The control group received only the post-questions on the learning day and the new questions one week later. Carpenter and colleagues found results similar to her first study, only this time the experimental and control groups performed equally well on the post-questions. One week later, the difference between the pre-questions and post-questions just failed to reach significance. However, both groups did better on the questions seen directly after learning than on the new questions, indicating that a general testing effect was still apparent.

After presenting yet another study, Carpenter concluded her keynote with the notion that using pre-questions may sometimes be more beneficial than administering only post-questions, so it might boost the general effects of retrieval practice. What, according to prof. Carpenter, is currently not clear is whether transfer takes place from presenting factual pre-questions to obtaining conceptual knowledge. This is what she is currently studying. I do not think this would work, since retrieval practice on factual questions has previously not resulted in more conceptual knowledge in middle school and college students (Agarwal, 2019). Nevertheless, adding pre-questions to the next lecture I deliver may be an interesting way to engage my students and, with a bit of luck, they will also learn more from my lecture.

Presentation: Model learning

Milou van Harsel, a PhD student working at Avans College, delivered the next presentation. She started by asking her audience to perform a short task: folding an origami fox with the help of instructions. Afterward, she asked the audience whether this had been difficult. Some people found the exercise hard and frustrating, whereas for others it was relatively easy. After this observation, she revealed that she had instructed the participants differently. Those who found the task relatively easy had been allowed to watch a short example video of somebody folding the fox. The others had to make do with a written text, which was likely the reason for the discrepancy between those doing well and those doing not so well. Van Harsel then explained that this is exactly the type of research she conducts in her PhD project.

First, Van Harsel described the prior research in this field, which often uses worked examples or modelling examples to teach a certain skill. Worked examples are step-by-step procedures accompanied by an explanation of how and why each step is undertaken to solve a certain problem. Modelling examples, on the other hand, are step-by-step examples explaining only the how, according to Van Harsel. Prior research has shown that learning with such examples (E) leads to better learning than only providing problems (P), similar to the written assignment for the origami fox, at least in novice learners (van Gog et al., 2011).

In her own research, Van Harsel and colleagues (2019) used the same design as originally used by van Gog, only with more practice trials. Van Harsel's participants, students in a technical educational programme, studied mathematical equations in four trials, after which their performance on such equations was tested. Group 1 studied an example in all trials (i.e., EEEE), the second group studied an example in Trials 1 and 3 and a problem in Trials 2 and 4 (i.e., EPEP), the third group studied the reversed order compared to the second group (i.e., PEPE), and the final group studied a problem in all trials (i.e., PPPP). Based on the research by van Gog and colleagues (2011), Van Harsel hypothesized that the first two groups would perform better than the other two. However, this was not the case. Group 3, which had started with a problem and then continued with an example, did equally well as the groups starting their learning with an example. The final group, which studied using problems only, did show less effective learning, similar to what van Gog had found. Van Harsel concluded that, if you offer more than two learning trials, starting to present examples at some point will help the student master the skill after all (Van Harsel et al., 2019).

Van Harsel's research differs from that of prof. Carpenter in that Van Harsel is interested in skill learning, whereas Carpenter studies memorization and comprehension. One can find some resemblance between the lines of research as well, though. If you see processing examples as doing practice, and processing problems as simple reading, what Van Harsel showed is that doing (retrieval) practice is highly effective when learning a skill. Taking this one step further, it would not matter much whether you start with the practice or do it later during the learning process; in all cases, performance will exceed that of only reading. This reminds me of when I was a student learning about Biopsychology. Due to time limitations, I did not even read the book and only practiced the 300 example multiple-choice questions our professors had offered us, and I got an excellent grade. However, in my case, this only worked because I received feedback about the correct answer each time, which is easy with multiple-choice questions. Without feedback, or with only open-ended questions, you might fail if you do not have any background information to fall back on. Therefore, the type of test used could matter as well. Indeed, although McDermott and colleagues (2014) did not find differences between retrieval practice using multiple-choice and short-answer questions, Ramraje and Sable (2011) did find an advantage of multiple-choice questions over short-answer questions. So far, only limited research has been done on this particular issue, but in the future, I would prompt my students with multiple-choice questions. The only remaining question is whether this helps my students gain conceptual knowledge, which prof. Carpenter is currently studying.

Presentation: Self-regulation in higher education

The next presentation was delivered by Ms. Sanne Rovers, a PhD student from Maastricht University. She is interested in how we can optimize the use of study strategies by students in a medical curriculum. For this purpose, she performed a focus-group study in which 26 first-year students, known to use effective techniques, took part. The idea behind this, according to Rovers, was to learn from students who study effectively and later teach some of their methods to students who do not use effective study strategies.

In problem-based learning, as used at Maastricht University, students are far more responsible for their learning process than students at other universities. This is because students are first offered real-life problems, then determine the knowledge gaps as a team, formulate learning goals, and finally dive into a large amount of literature to find answers to those learning goals. The learning process ends with a discussion of the learning goals with their peers under the guidance of a tutor, Rovers explained (see also EDview, 2018).

Rovers referred to the 2013 paper by Dunlosky and colleagues, which presents a list of ten study strategies that are not all equally effective. Teachers often notice students using highlighting or rereading as their strategies, even though these are not considered effective by Dunlosky and colleagues (2013). Rovers wondered what the effective students in the medical curriculum do when they prepare for a session or study for an exam. She divided the students into four groups and conducted focus groups with them twice: initially and again after a couple of months.

Interestingly, the participants to some extent used quite ineffective study strategies, such as summarizing texts. However, Rovers noticed that the way those techniques were employed was more active than described in the paper by Dunlosky et al. (2013). Based on the focus-group sessions, Rovers developed a model describing different key aspects of the strategies her participants used. First, all these students set personal learning goals for a study session, such as how many pages they want to read. Second, they constantly monitor their work, for instance by checking their notes at the end of the day or by performing a number of practice questions. A third finding was the active processing of the material: even when summaries were made, students would write them in their own words and then try to link the different topics wherever possible. Finally, the participants seemed to have metacognitive knowledge of their study behaviour. They indicated that they knew why a certain strategy was good for them and that they could adjust it if needed.

Rovers did not yet draw clear implications from her results. What I take from this presentation, though, is that the topic of effective studying is more complex than often presented in the literature. It is valuable that Dunlosky and colleagues (2013) were able to pinpoint ten strategies and explain why they are or are not effective. However, students do not take that paper and decide to employ just one of those strategies. As Rovers' study showed, students employ several strategies simultaneously. For instance, a student summarizing the material and monitoring it with practice questions at the end of the study session would actually be using a highly ineffective and a highly effective technique at the same time. Hardly any research has been done on combinations of study techniques. However, O’Day and Karpicke (2020) found that combining the two effective strategies of retrieval practice and concept mapping did not lead to superior performance compared to using only one strategy. Therefore, I can hardly believe that combining retrieval practice with a less effective technique would do the trick either. I think that the retrieval practice performed at the end of the learning session is what makes this particular student an effective learner, not the generally active attitude the student takes when studying. Therefore, providing practice questions to my students is probably still the main thing I should do.

Presentation: Effective learning and the role of the teacher

The final presentation I attended was by Mr. Tim Surma, a math teacher from Belgium who is now doing his PhD at the Open University. He is interested in effective learning and the role of the teacher in it. First, he explained three relevant and effective study strategies, after which he presented how one can use them when giving instruction in class.

Surma delved into the effectiveness of retrieval practice, distributed practice, and interleaved practice. The first I have already described extensively in this paper and will not revisit. Distributed practice is the spreading of learning over time, rather than massing everything at once. Research has shown that learners in a distributed condition outperform those who do massed learning (e.g., Goossens et al., 2012). According to the reminding framework, this is because a repeated encounter with certain information reminds the learner of the prior episode, including the context in which that prior encounter took place, which enhances memory for the content of that information (Benjamin & Tullis, 2010).

The third strategy Surma explained is one I find highly intriguing: interleaved practice. This technique differs from distributed practice in that the to-be-learned content not only needs to be spaced in time, but the types of content should also be mixed. In a practice session, the teacher therefore shuffles different topics, which the learner actively has to reflect upon, realizing that the solution for each topic may be different, Surma explained. Students commonly do not like this type of learning, according to Surma, but research has clearly shown its benefits compared to what is called blocked practice, the practice of each topic separately (Brunmair & Richter, 2019).

Why do I find interleaved practice so intriguing? Well, in my first study on retrieval practice, the participants were also asked to perform interleaved practice, after which I again looked into what the brain does while performing such a task. Each participant was asked how effective each method felt to them. My participants thought that blocked practice would lead to better learning. However, those same participants ultimately performed significantly better on material studied using interleaved practice than on material studied in blocks (Sambeth et al., unpublished data). In other words, students do not see the benefits of interleaved practice, so here the teacher can play an important role.

After introducing the three effective study strategies, Surma gave some examples of how a teacher may improve his or her instruction using these techniques. According to Surma, students will not start practicing these optimal strategies spontaneously. They simply study in a way that takes them the least time, which is not necessarily how they should study. It is Surma's opinion that the teacher must help students study effectively. I fully agree with this notion, although I am afraid that not all my fellow teachers would.

Reflection

I attended this conference with the aim of learning more about how I can optimize the study process of my students, as well as finding out what the role of the teacher, in general, should be in this respect. Before this conference, I had already performed an experiment on how the brain responds when using two strategies known to be effective, namely retrieval practice and interleaved practice. What I now realise is that the techniques I chose are also seen as most relevant by others; otherwise, they would not perform research on them or explain them in detail to their audiences. In the future, I would like to address the use of pre-questions, given that they may prompt students to pay attention to the most relevant aspects discussed in a class. How to implement this in an experiment on the brain's responses is something I will have to figure out, though, as is placing this work in a better theoretical perspective. There are various theories on why certain strategies may work, but a conclusive answer has not yet been found (see also Karpicke, 2017).

The role of the teacher is still a difficult one for me. Tim Surma gave a number of examples of what a teacher can do in class to stimulate better learning, such as asking students practice questions on various occasions, preferably returning to each topic after a while. However, this does not fit the educational method employed at my university, where students work in teams to solve real-life problems. I cannot simply instruct them in a lecture, as the active learning we value might partly be lost. Therefore, I need to find ways to encourage effective learning during the interactive sessions they have with their peers. The presentation by Sanne Rovers might help me here, as she found that effective students set personal learning goals, monitor themselves, and study actively, all because they have the metacognitive knowledge that this will work. I can prompt my students to be that active too.

One issue I find particularly difficult is that I often hear my fellow teachers being critical about our role as teachers. When I talk about my new scientific interest and how I think we can help our students, they often state that students should already know how to study and that we should not waste our time on this, but rather focus on the vast amount of knowledge the students must obtain. Is this the view of the average teacher, or am I surrounded by staff without much interest in teaching? For this reason, I am now preparing a survey study in which I will ask staff about their knowledge of which techniques are effective, how these techniques can be employed in a problem-based learning curriculum, and whether we should teach students those effective strategies at the start of their studies. If the results are indeed that negative, I will advise the management to look into this issue. In any case, through this conference, and thanks especially to Tim Surma, I feel empowered in the knowledge that I am not alone in this!

Sources

Agarwal, P. K. (2019). Retrieval practice & Bloom’s taxonomy: Do students need fact knowledge before higher order learning? Journal of Educational Psychology, 111(2), 189–209. https://doi.org/10.1037/edu0000282

Benjamin, A. S., & Tullis, J. (2010). What makes distributed practice effective? Cognitive Psychology, 61, 228-247. https://doi.org/10.1016/j.cogpsych.2010.05.004

Brunmair, M., & Richter, T. (2019). Similarity matters: A meta-analysis of interleaved learning and its moderators. Psychological Bulletin, 145(11), 1029-1052. https://doi.org/10.1037/bul0000209

Carpenter, S. K., Rahman, S., & Perkins, K. (2018). The effects of prequestions on classroom learning. Journal of Experimental Psychology: Applied, 24(1), 34-42. https://doi.org/10.1037/xap0000145

Carpenter, S. K., & Toftness, A. R. (2017). The effect of prequestions on learning from video presentations. Journal of Applied Research in Memory and Cognition, 6(1), 104-109. https://doi.org/10.1016/j.jarmac.2016.07.014

Dunlosky, J., Rawson, K. A., Marsh, E. J., Nathan, M. J., & Willingham, D. T. (2013). Improving students’ learning with effective learning techniques: Promising directions from cognitive and educational psychology. Psychological Science in the Public Interest, 14(1), 4-58.

EDview (2018). Position Paper: The Full Potential of PBL Philosophy. Diversifying Education at UM. Retrieved from https://edlab.nl/wp-content/uploads/2018/10/EDview_Position-Paper.pdf

Goossens, N. A. M. C., Camp, G., Verkoeijen, P. P. J. L., Tabbers, H. K., & Zwaan, R. A. (2012). Spreading the words: a spacing effect in vocabulary learning. Journal of Cognitive Psychology, 24(8), 965-971. https://doi.org/10.1080/20445911.2012.722617

Jia, X., Gao, C., Cui, L., & Guo, C. (2020). Neurophysiological evidence for the retrieval practice effect under emotional context. International Journal of Psychophysiology, 147, 224-231. https://doi.org/10.1016/j.ijpsycho.2019.12.008

Karpicke, J. D. (2017). Retrieval-based learning: A decade of progress. In J. T. Wixted (Ed.), Cognitive psychology of memory, Vol. 2 of Learning and memory: A comprehensive reference (pp. 487-514). Oxford: Academic Press. https://doi.org/10.1016/B978-0-12-809324-5.21055-9

McDermott, K. B., Agarwal, P. K., D’Antonio, L., Roediger, H. L., III, & McDaniel, M. A. (2014). Both multiple-choice and short-answer quizzes enhance later exam performance in middle and high school classes. Journal of Experimental Psychology: Applied, 20(1), 3-21. https://doi.org/10.1037/xap0000004

Murre, J. M. J., & Dros, J. (2015). Replication and analysis of Ebbinghaus’ forgetting curve. PLoS ONE, 10(7), e0120644. https://doi.org/10.1371/journal.pone.0120644

Ramraje, S. N., & Sable, P. L. (2011). Comparison of the effect of post-instruction multiple-choice and short-answer tests on delayed retention learning. The Australasian Medical Journal, 4, 332–339. https://doi.org/10.4066/AMJ.2011.727

Rickards, J. P. (1976). Interaction of position and conceptual level of adjunct questions on immediate and delayed retention of text. Journal of Educational Psychology, 68(2), 210–217. https://doi.org/10.1037/0022-0663.68.2.210

Roediger, H. L., III, & Karpicke, J. D. (2006). The power of testing memory: Basic research and implications for educational practice. Perspectives on Psychological Science, 1(3), 181-210. https://doi.org/10.1111/j.1745-6916.2006.00012.x

Rowland, C. A. (2014). The effect of testing versus restudy on retention: A meta-analytic review of the testing effect. Psychological Bulletin, 140(6), 1432-1463. https://doi.org/10.1037/a0037559

Sambeth, A., Pieters, J., Biwer, F., & De Bruin, A. B. H. (in preparation). The neural correlates of retrieval practice and interleaved practice: N400 for familiarity and P600 for recollection.

Spitzer, H. F. (1939). Studies in retention. Journal of Educational Psychology, 30(9), 641–656. https://doi.org/10.1037/h0063404

Van Gog, T., Kester, L., & Paas, F. (2011). Effects of worked examples, example-problem, and problem-example pairs on novices’ learning. Contemporary Educational Psychology, 36(3), 212-218. https://doi.org/10.1016/j.cedpsych.2010.10.004

Van Harsel, M., Hoogerheide, V., Verkoeijen, P., & van Gog, T. (2019). Effects of different sequences of examples and problems on motivation and learning. Contemporary Educational Psychology, 58, 260-275. https://doi.org/10.1016/j.cedpsych.2019.03.005