Increasing Student Engagement with Practical Classes Through Online Pre-Lab Quizzes

Laboratory practical classes are an essential component of all science degrees, but they are a pinch point because of rising student numbers, rising student expectations and falling student exposure to laboratory work prior to entering higher education. Augmenting physical laboratory work with online interventions is not new, but as virtual laboratories become increasingly sophisticated, cutting-edge approaches have become less available to many institutions, which cannot meet the investment or provide the specialist skills needed to build and maintain these complex tools. This case study examines the possibilities for increasing student engagement with practical work using the simplest tools available in any standard virtual learning environment. Results obtained from a large student cohort indicate that this low-cost, low-tech approach can achieve high levels of student satisfaction.

For many students, routine practical classes can feel disconnected from the cutting-edge science to which they are exposed in lectures and other aspects of their course. Bombarded with information, many students find that preparing in advance for the unfamiliar practical environment is a logistical and intellectual challenge. But without adequate preparation, they frequently struggle in the laboratory, getting lost in complex protocols and losing sight of the bigger picture. Battling with information overload, students are busy doing, not busy thinking (Adams 2009).
The idea of using technology to encourage and engage students with practical tasks is not new. In 1980 Christine Case described the use of companion audio recordings for laboratory classes (Case 1980). With the emergence of the Internet in the 1990s there was a fresh wave of innovation in this area. Between 1994 and 1997 I developed my own series of online laboratory simulations (Cann 1999). Many others followed similar approaches (Rolfe and Gray 2011), with varying degrees of sophistication depending on the resources available for development: notably the Howard Hughes Medical Institute Virtual Lab Series (Huang 2004). Virtual dissections have been a particularly common theme, with dozens if not hundreds developed. The objective of these exercises was not only cost saving, but also the development of Price and Rogers' concept of 'digitally augmented physical spaces' (Price and Rogers 2004). In the UK, the eBiolabs system developed in Bristol has been an award-winning example of this approach. This sophisticated online environment has three components: online experimental information, including simulations and videos; a pre-laboratory session quiz; and a post-laboratory assignment (Cameron 2010).
But as online learning has become more sophisticated, have we tended to lose sight of the original objectives of online adjunct laboratory activities? Do we need to get back to the low-hanging fruit? Although eBiolabs has been adopted and adapted by a few other institutions, it has not become the dominant model of practical teaching, mostly because of the time and resources needed to use this complex bespoke system. This paper explores a simpler approach to online pre-lab activities which is available to all.

Implementation
A series of simple online exercises were developed for a large cohort of first year biological sciences students using only standard tools available in any virtual learning environment (VLE). In this case, Blackboard was used as the platform for development and delivery of the materials. Similar tools are freely available in other VLEs, such as Moodle or Canvas. A total of 315 students were involved in three phases of an action research-based study designed to adapt to findings and student feedback from each phase (Wallace and Atkins 2012). As this work was undertaken as part of a normal course of study, institutional ethical approval was deemed not to be required.
Phase I: Cell and Developmental Biology, n = 182, Semester 1. Students were asked to complete an online quiz via Blackboard in the week before each of the four weekly practicals on this module. We deliberately kept the number of sequential weekly quizzes to three or four to prevent interest waning due to the arduous nature of an overly repetitive online task. The initial quiz design for this module deliberately required students to write short essay-style answers of approximately 100 words, not simply to respond to prompts from multiple-choice questions. The quizzes were designed to be quite challenging for first year students and to require independent reading beyond the content of the module/practical handbook. Figure 1 shows an example of the question formats used.
It was decided not to allocate any of the module marks to the quizzes on this module, in order to determine what the response rate would be without the driver of summative assessment. As there were no concerns about summative assessment, detailed instantaneous feedback, including links to online resources and videos, was given to students immediately after the quiz answers were submitted, in order to boost students' engagement with the content of the practicals while their attention was on the subject matter. However, essay-format questions require manual marking, so a score for the quiz was not available until a member of staff had checked and marked each of the essay questions (feedback was still delivered immediately on question submission). Although credit was given for participation (defined as a reasonable attempt at an answer, i.e. not a blank space or an obviously irrelevant answer) rather than for writing quality, this still delayed the return of quiz scores to students by approximately 24 h, and took up staff time in checking all submissions manually. For this reason, in subsequent quizzes on this module, fill-in-the-blank questions were substituted for the essay-format questions, allowing quiz scores to be returned to students instantly without any staff input. If a student did not submit an answer to a question, they did not receive the feedback for that question and were prompted to submit an answer. This change was possible because the quizzes were formatively rather than summatively marked.
Phase II: Biochemistry, n = 325, Semester 1. This second phase consisted of three online quizzes but differed from phase I in that no long-form student writing was required. These quizzes used a mixture of multiple-choice questions (MCQs), including visual questions about using laboratory equipment, calculated answers and fill-in-the-blank questions, all automatically marked. Students were given short confirmatory feedback on correct answers and asked to resubmit incorrect answers. As in phase I, these quizzes were entirely formative and did not contribute to module marks. Figure 2 shows an example of the question formats used.
Phase III: Microbiology, n = 315, Semester 2. In contrast to phases I and II, the pre-lab quizzes on this module were summatively assessed as part of the practical submission, contributing a small percentage of the overall mark (2.25%). Students completed four online quizzes in the week before each practical class. A wide variety of standard question formats was used in each quiz, including multiple-choice questions, image-based MCQs, multiple-answer MCQs, true/false, ordering and fill-in-the-blank questions (marked using pattern matching). All questions were automatically marked, and marks and detailed feedback, including links to online resources, were given to all students immediately on submission. Figure 3 shows an example of the question formats used.
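Pattern-matched marking of fill-in-the-blank answers can be sketched in a few lines. The following is a minimal illustration of the general technique only; Blackboard's actual matching logic is not public, and the function name and accepted-answer patterns here are hypothetical.

```python
import re

def mark_blank(response, patterns):
    """Return True if the student's response matches any accepted pattern.

    Responses are normalised (trimmed, lower-cased) before matching, so
    accepted patterns only need to cover genuine spelling variants.
    This is an illustrative sketch, not Blackboard's implementation.
    """
    cleaned = response.strip().lower()
    return any(re.fullmatch(p, cleaned) for p in patterns)

# Hypothetical example: accept 'gram-negative', 'gram negative' or 'gramnegative'
accepted = [r"gram[\s-]?negative"]
mark_blank("Gram-negative", accepted)   # True
mark_blank("gram positive", accepted)   # False
```

Writing patterns this way lets one question tolerate hyphenation and capitalisation differences without manual checking, which is what removes the staff-time cost once a quiz is set up.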
An average of 2-3 h of staff time was required to produce and set up each of the quizzes in this study; after that, apart from the essay-format questions used at the beginning of phase I, no further staff time was required.

Phase I
Although students were aware that the online quizzes were not formally assessed, their behaviour still seemed to be driven by the marks obtained, suggesting a game-like element. Many students retook the quizzes once or more to achieve a higher (preferably perfect) score. A common pattern was to resubmit answers a few minutes after the first submission, presumably after having read the feedback, so that Blackboard displayed a higher recorded mark. This may reflect student anxieties at the beginning of their higher education career, something that later phases of this study were designed to test. Figure 4 shows submission rates (measured before the start of each practical class) for the four quizzes, which averaged 79%, with no drop-off between week 1 and week 4 in this case.
Although quizzes were available online in the week before the practical class, unsurprisingly peak submission time was in the 24 h before the start of the practical (86% of responses), with 14% submitting on the day of the practical itself. Submission patterns became set early on, with the same students tending to complete or not complete each week. This could be monitored as a trigger for early intervention, in a comparable way to attendance monitoring (Bevitt, Baldwin, and Calvert 2010). Student feedback, collected via an anonymous end-of-module questionnaire, was generally positive. Many students commented (some unfavourably) on being prompted to do extra reading in order to answer the quiz questions rather than simply reading the practical book in the lab.

Figure 4. Submission rates (measured before the start of each practical class) for the four quizzes in phase I. Submitted: these students submitted their answers to Blackboard and received the online feedback before the practical class. Looked at: these students had opened the quiz but not submitted answers before the practical class (some of this group submitted answers after the class). Incomplete: refers to Weeks 1 and 2 only, when essay-format questions were used, before the switch to fill-in-the-blank format for the reasons already described. Not opened: these students had not opened the quiz and consequently had not seen the questions or feedback.

Feedback included the following responses to the question 'What did you think of the pre-lab quizzes on this module, and how did they affect your approach to the practical classes?':
Helped with my understanding as practical classes were quite fast-paced.

They helped you get a sense of the background of the practical. This helped when trying to complete the write-ups for each practical, as you already had read up on the science of why the results should have been as they were. It also helped me to understand more about what we were supposed to achieve from doing these practicals, and how they related to the lectures we were attending.

Appropriately challenging. The practical work often felt very disconnected from the work we were doing in lectures; however, perhaps a lecture focusing on what we did and the significance of it would be a good idea.

I would have liked to have more questions to motivate me to do more work outside lectures.

They were an integral part of my preparation for the lab classes: if there was anything I didn't understand in the quiz I could go and look it up and come into the practical more confident about what I was going to do.

I thought the pre-lab quizzes were good because they forced me to read around the subject that was being investigated in the next practical, so during the practical I had a greater understanding of what was going on. To improve the quizzes, information could be given on where to look for relevant information to help complete the quiz.

They were useful as a method of giving you a bit of background to the area you'd be working on, but some of the questions, especially the fill-in-the-blanks and 'write 100 or so words' ones, were too vague or difficult to understand how much detail or what was being asked for in order to get the full marks, which is more difficult when you've had to research the topic yourself.

I love them and they give you a good insight on what to expect before the practical. A fantastic idea and very helpful.

Phase II
The phase II pre-lab quizzes followed on from phase I in the second part of Semester 1 and involved an overlapping, slightly larger cohort of students. As with the phase I quizzes, these exercises were not summatively assessed. Pre-lab submission rates for the three exercises were lower than in phase I, at 57, 40 and 40%, respectively. This possibly reflects increasing workloads later in the semester, or may reflect a change in student attitudes as they transition to higher education. Again, submission patterns were fixed, with the same students submitting or not submitting answers each week. In the anonymous end-of-module questionnaire, no students commented on the pre-lab exercises either positively or negatively, reflecting the relatively low prominence of these exercises in this module.

Phase III
In accordance with the action research design of this study, and in response to the phase II findings, the four pre-lab quizzes in phase III (run in Semester 2) were the only exercises to be summatively assessed. On this module, 95% of students completed the tests before the laboratory sessions, with an average score of 70%. Performance on all four tests was similar. Not surprisingly, given the link to summative assessment, the 95% completion rate was an improvement on the 79% and 40-57% completion rates for the formatively assessed tests in Semester 1. The overall end-of-module feedback was positive (response rate 265/315, 84%):

How useful were the practical preparation quizzes to you?
Very useful: 27%; Useful: 57%; Neither useful nor not useful: 13%; Not useful: 3%; Not useful at all: 0%

How difficult were the practical preparation quizzes?
Too easy: 0%; Easy: 9%; About right: 73%; Difficult: 18%; Too difficult: 0%

The main advantage of the practical preparation quizzes was:

It made me look in more detail at the topics that would come up in the practical, therefore I was more prepared for each practical.

Gave some background knowledge to what we were doing; gave some deeper understanding before the practical so I understood the practical better.

By taking the quizzes, it helped to be able to answer the questions in the practical, as it was all relevant preparation and I knew roughly what to expect and how to interpret my experiments.
The questions made you investigate things that you would be doing in the laboratory so that when you were doing them, you had an understanding of why and the expected results.

Most questions I didn't know the answer to, so I had to do some background reading on the topics that were associated with our practicals.

It was a good way of motivating me to not just read, but study and understand the lab protocol and why we were carrying out the experiments we were doing.

It got me to look at what I was doing in my practical in more detail before I actually did it. That helped because I knew what to expect and what to look for, and I could follow the procedure properly.

It forced me to prepare for the practical sessions before I attended the lab, so I knew what I was going to do.
Knowing the theory behind the practical before going into the lab, especially the various tests for the identification of microorganisms.
The main disadvantage of the practical preparation quizzes was:

Certain questions were not easy and required you to do some research in order for you to answer the question.

I'm not sure they should actually count towards the module mark; it would make more sense for them to be formative.

Some of the questions required a lot of research, such as the question about UK legal limits for bottled water. Consequently the quizzes became quite time-consuming, time that could have been spent looking over the lecture material.

As most of the material had not been covered in lectures, it took a long time to search and find the answers.

Possibly difficult to research some subjects by yourself if you couldn't understand a concept.

I knew very few answers; it was ALL new and untaught.

They are just another deadline. Not that that's a complaint; life is a deadline.

I wasn't familiar with a lot of the terms used.

Could have given better feedback when the questions were wrong.
Any other comments about these quizzes:

The questions needed to be closer to the practical books.

Giving the correct answers at the end of quizzes allowed people to copy the answers, achieving 100% without doing the work.

I think that generally the quizzes being assessed was a good idea, because it motivated students to research practical material. There was the right number of questions, and the fact that the questions were multiple choice meant that even if you weren't entirely confident of the correct answer, you could always have an educated guess.

Generally were very good. I think some more in-depth questions have to be added.

Surprisingly worthwhile. They were a very good length. Videos that were located in answers were very useful.

These quizzes have been greatly helpful to my understanding of the practical, and all modules should run such quizzes before a practical session; currently not all modules do this, and it would be very beneficial.

Good variety of question formats, and most questions required independent research.

They are very useful, but the questions could be longer and the quizzes could be longer.

Discussion
As online education matured, increasing amounts of time and effort were poured into sophisticated virtual recreations of the laboratory environment. Inevitably, these are reported in the literature with positive outcomes. Initially, perhaps more enamoured of the technology than the pedagogy, the approach tended towards literal recreation of physical spaces online (Peat 2000). This was reflected by the Second Life mania of the late noughties, when investment in online substitution peaked (Wang and Burton 2013).
Clunky, as well as difficult and expensive to develop, Second Life and home-brewed equivalents were at odds with the trend towards mobile devices, and interest eventually waned with the unstoppable rise of Facebook and the social media boom still underway.
Recently there has, thankfully, been a return to a more thoughtful approach, one that augments and enhances physical experiences rather than replacing them (Jones and Edwards 2010).
The objectives of the studies reported here were to explore student responses to simple, low-cost online interventions available to all institutions, and to investigate the behaviour of a large first year cohort adapting to the demands of higher education throughout a transitional year. Working with a large cohort, we sought to gain maximum educational benefit from minimum input. Early in their university career, the majority of students readily engaged with online quizzes without the driver of assessment. The results presented here show that by the end of their first semester this engagement had waned, but could be rekindled by the use of minimal summative assessment as a gentle push to refocus attention on pre-laboratory preparation. These results confirm recently published findings and show that even relatively simple online interventions can increase student engagement with practicals (Whittle and Bickerdike 2014).
The outcomes of laboratory practical classes are complex and difficult to measure because they are multifactorial. Anecdotally, staff involved in delivering these classes felt that students seemed well prepared in the labs, which may have been due in part to the online quizzes. This impression is supported by informal comments from students, but it is extremely difficult to measure the full impact of such an intervention accurately. Direct comparison with student attainment data from previous years would be misleading because of confounding factors such as staff changes and cohort effects. The cost of mounting the online quizzes is so low that, given the data presented here and the student feedback received, it is difficult to argue against their continued use. A similar approach may also be valuable for preparing postgraduate demonstrators for laboratory classes, and for students at higher levels of study than year 1. The responses of these groups may differ from those of the first year cohort, but since the investment required is low, this would be a relatively easy question to investigate, although it was beyond the scope of the present study.
Although simple numerical measures of student attainment may not reflect relatively small-scale interventions such as the online quizzes described here, the data presented, and in particular the extremely positive student response to this approach, have convinced staff that a change in practice is worthwhile. By using simple tools already present in the virtual learning environment, the problem of preparing increasingly large student cohorts, most of whom have much less experience of practical laboratory work than previous generations of students, for expensive laboratory classes has been mitigated at minimal cost.