Now that the semester is over, it is time to reflect on the Show, Zap, Pat project and analyse the results of the surveys, focus groups and interviews we conducted over the year. I've been pretty pleased with the results so far... tick for us!... though, as expected, I also have a pretty long list of 'what to do better' as well!
First of all, a reflection on the method used to collect student feedback: by asking students to complete pen-and-paper forms at the beginning of the lab classes, we were able to get very close to a 100% response rate. This meant that we were able to reduce the potential for sampling bias (i.e. that those who have strong feelings are more likely to respond to online surveys than those who do not) and gain an understanding of the whole class.
As a reminder, the research question for the project was:
Are the practical laboratory skills (and in-situ recall of practical theory) of students performing recrystallisation experiments improved by providing feedback in the form of online self-directed development exercises before subsequent attempts of those skills?
Therefore the surveys contained four multiple-choice questions and one recall question to assess this. We were particularly interested in the recall question, which asked students what they thought made a 'successful recrystallisation'. As reported earlier, the results from the baseline group in semester 1 (who were not provided with any online or in-class resources relating to the project) showed that only a quarter of the students were able to answer this question, and this figure was almost the same as the semester 2 cohort's response at the beginning of their first lab, before completing any labs in the unit. It should be noted that students in the second cohort were given details about the Show, Zap, Pat project and access to all online resources before their first lab, though they were not specifically directed to any resources until the project commenced. When the same survey was given to the second-semester cohort after they had participated in Show, Zap, Pat, the number of students who were able to correctly answer the recall question doubled, with just under half of the students able to recall, without prompting, the purpose and practice of the recrystallisation skill. The graphs below compare the findings for the recall question from all three surveys.
As might be expected, the number of correct responses to the multiple-choice questions was considerably higher than for the recall question in all three surveys, though cohort 2's end-of-semester responses showed an improvement over both their beginning-of-semester responses and those of cohort 1 (the baseline). I will update with these results shortly!
A large part of this project was testing whether instant feedback was possible for students using the LMS and portable devices. Despite my grand visions of a sophisticated system whereby students receive email feedback instantly at the click of a button by the lab demonstrator... this proved to be the most difficult and awkward part of the Show, Zap, Pat project.
The problem came with the grade book: it has been designed for someone sitting at a desktop computer entering grades, and thus it has essentially zero functionality on a mobile phone. The stiff and unintuitive format proved too much, and at the training session we held to educate demonstrators on the new initiative, we abandoned the plan for demonstrators to use their own mobile phones for grading. In our practice run using a mobile phone, it took several minutes to navigate to, identify and fill in a grade for a single sub-skill, of which up to four were being assessed in a single session. In a class of 20-30 students, this was completely unworkable.
The situation was slightly better with a tablet (we used iPads), as the bigger screen size meant that more of the grade book was visible, which meant less fiddling around to make sure the correct square was selected to enter a grade. However, and it is a big however, this still proved a stiff and time-consuming way of entering a grade. The grade book required two confirmations for a grade to be entered, and then another if it was amended. This proved tedious.
So how did we do the in class assessment?
To avoid interfering with the function of the labs, we added extra demonstrators whose role was to assess the Show, Zap, Pat skills in a lab class. They wore white coats to avoid confusion with the actual demonstrators and moved around the lab when students were performing key skills (about one hour into each lab). We tried this in two formats: one where these extra demonstrators used iPads and one where they used pen and paper. Overwhelmingly, the demonstrators preferred the pen-and-paper method, even though it meant they had to go back after class and enter all the grades in the LMS. The reason for this was the frustrating lack of functionality in the grade book, which consumed their time and prevented them from speaking to the students. Given that our initial plan was to give students instant feedback, we asked that demonstrators continue to hand out stickers with QR codes and let students know how their performance was rated. We are going to host student and demonstrator focus groups to ascertain how the students felt about this type of assessment and whether this is a practicable way to assess lab skills.
Like most things that involve creating something from not much, the broad brush strokes are easy to discuss and do, but it is the minute detail that really makes the idea come alive... and takes up all your time!
As the beginning of our live trial grows nearer, my to-do list (and anxiety) grow larger. Finishing off the final details for the student resources has meant that several have had to be made from scratch, as extensive searching for existing resources did not prove as fruitful as I had hoped, with many available resources being too complex or too different from our accepted practices. This has made me realise how valuable good, clear instructions broken down into steps are!
Next week we go live, with a short presentation in a lecture to 'sell' the idea to the class, and then the first week of labs. We have taken the route of allowing students to ease into their roles in the labs, and thus will not ask them to do any of the 'show' aspect of the programme in the first session. Instead, we will focus on collecting the baseline data in the form of a survey, and then arm our demonstrating staff with the 'zap' element (QR-code stickers) to hand out where needed as a form of feedback. The following lab session will give the students the opportunity to demonstrate elements of the recrystallisation skill and put the idea to the test. In the meantime, we will recruit a small group of students to interview about their experiences along the way.
Now back to work!
We are now well underway with the development of the online resources. This has involved curating a YouTube video library, as well as shooting and editing videos and making posters for resources that were not available or suitable online.
Perhaps more importantly, we have developed the set of criteria that students will need to meet in order to be awarded a badge.
Shannan was asked to participate in the recent 7th Wave conference hosted by Education Futures, where she was given the opportunity to talk about the project and its achievements so far to prospective scholarship applicants. This slide sums up how we are integrating instant feedback and gamification into the lab.
Finally, we have been developing a third-party website to host the resources that will be linked through QR codes and given to students as feedback during the semester. You can check it out: showzappat.weebly.com
In late May, we rolled out the baseline questionnaire on students' knowledge of recrystallisation skills and theory. We were fortunate to receive 270 responses (a 91.5% response rate).
The survey was given in the last lab session of semester. What did we find?
59% of students felt confident enough in their skills to attempt a recrystallisation solo, if given instructions.
60% of students were able to identify a faulty (incorrect) lab procedure.
62% of students were able to identify the correct volume of solvent to use in a recrystallisation.
66% of students were able to correctly identify the purpose of using filter paper in a recrystallisation.
Interestingly, only 24% of students were able to describe what makes a recrystallisation successful.
The results from this survey, along with the demonstrator focus group feedback (see last post), will be used to drive the focus of the online resources and the changes we will be making to the lab program in semester 2. Now comes the hard part, where our LMS platform, resource development and digital badge scheme need to be created, tested and rolled out by early August!
Lured with the promise of a free afternoon tea, eight sessional staff members were asked to participate in a focus group about improving the development of practical skills, as well as the feedback students receive on their performance of those skills. We were also very interested in exploring these staff members' experiences in the Chem1002 laboratory classes. Prior to the focus group meeting, the participants were given a list of questions pertaining to student development of practical recrystallisation skills (Questions for semester 1 demonstrators). These questions formed the basis for the discussion.
The discussion was very productive, with some interesting points raised and great suggestions made. A summary of the meeting follows.
Important Theoretical Concepts
- Why do a recrystallisation? Students don't know why we would want them to dissolve the solid they already have and then form new crystals.
- The concept of solvents: why a specific solvent is chosen (though keep this to low-level theory and a choice of only two) and why hot and cold temperatures are used.
- Why acid etc. is added to make the first round of unpurified crystals (i.e. speciation/ionisation)
- Why use boiling chips
- Why fold filter papers
Practical skills that students stumble on
- Hot filtration: keeping it hot and why this is important (also, why the crystals are not collected in the filter paper). There is a problem with the wrong filter paper being used in the labs; the current stock is too thick and slow! Yield is the students' motivation in the marking scheme.
- Folding filter paper and why
- The amount of MgSO4 to add and why it is added (a 'snow globe effect' video was suggested). Some concerns that the consistency is too coarse.
- Glass rods are not scratchy enough for seeding; students are usually just instructed to use a spatula.
- Rather than the solvent range given in the lab manual, a minimal solvent amount would be more helpful, with instruction on how to tell if more needs to be added and how much.
Practical skills that students seem to ‘get’
- Vacuum filtration
- Setting up condensers
General suggestions
- Provide students with a flow chart for the process, with a troubleshooting aspect to it.
- Focus on general skills of time management, i.e. reading ahead and knowing what they need to do next and why.
- Demonstrators would like to know what is going on in lectures, e.g. via a weekly email update.
- Focus on motivators to get students reading and responding to feedback.
- Make sure that students don’t get huge amounts of negative feedback.
- Provide better written instructions for practical skills, and don't replace paper information with videos; provide both.
- A demonstrator LMS for the transfer of knowledge and skills between demonstrators
Suggestions for videos used as learning resources
- Make sure the videos use equipment similar to what we use in the labs
- Show the same technique that their lab demonstrators will tell them to use
- In-house videos are more personal, so a scattering of them might help
- Students respond well to well-presented and entertaining videos so quality control is important.
Our initial project proposal included a small-scale pilot study in first semester, whereby a live test of the online badge system would be implemented. This did not prove feasible in the restricted timeline, however, and on further discussion we decided that baseline data would be more beneficial for comparative discussion and course design. By baseline data I mean determining what level of practical skills and knowledge of practical theory students are obtaining at the completion of the unit, before any changes are made to the course. I was also interested in talking to the lab demonstrators and understanding their main concerns and frustrations with the lab course, and what improvements might be made to convey the practical skill set.
What did we do?
After talking to our postgraduate consultants (see previous post), we identified that, beyond the physical side of practical skills, there is a large element of theory behind practical work in the chemistry laboratory that students are unaware of. The mentality of following the instructions blindly, without thought for this theory, was identified as a key area of concern. To this end, we developed a questionnaire (Survey on Recrystalisation in Chem1002 lab classes) for students to complete that assessed their confidence to complete a recrystallisation without demonstrator assistance, as well as their understanding of key elements of theory behind this skill set. Finally, students were asked to define what a 'successful' recrystallisation is in an open-worded answer. The survey was administered in class before the final practical session of the semester. Of the 295 students enrolled in the unit, 270 responses were recorded (91.5%).
What did we find?
Results from this baseline study are still being collated. However, a preliminary assessment of the final question, regarding the definition of a 'successful' recrystallisation, showed that few students were able to identify the key purpose of the procedure. The previously highlighted propensity of students to follow the method blindly was demonstrated by many students defining a successful recrystallisation as one where the instructions were followed correctly. We await the full findings of the baseline to help us identify the key areas needed for our online learning resources.
Having met with the Blackboard experts, we have had to change our 'dream' platform slightly to accommodate the limitations of the LMS. Instead, we will need to rely on QR-coded stickers and demonstrator interaction to provide the instant feedback we had originally hoped would take the form of an email sent straight to each student's inbox. We have decided upon a 'points' system where students must obtain a certain number of points in order to be awarded a digital badge. Points are awarded for completing online exercises and in-class demonstrations of practical skills. Students will have the opportunity to re-attempt certain tasks and skills in order to obtain a higher-level badge (i.e. completed, silver, gold, platinum).
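The points-to-badge idea can be sketched in a few lines of Python. This is a minimal illustration only: the tier names come from the scheme described above, but the point thresholds and the re-attempt rule are hypothetical placeholders, since the real logic will live inside the LMS.

```python
# Minimal sketch of the Show, Zap, Pat points-to-badge idea.
# Tier names are from the project; the thresholds below are
# hypothetical placeholders, not the values used in the unit.
BADGE_THRESHOLDS = [
    ("platinum", 40),
    ("gold", 30),
    ("silver", 20),
    ("completed", 10),
]


def award_badge(points):
    """Return the highest badge tier earned for a points total, or None."""
    for badge, threshold in BADGE_THRESHOLDS:
        if points >= threshold:
            return badge
    return None


def add_points(total, attempt_points, previous_best=0):
    """Credit a re-attempted task only for its improvement over the
    previous best, so students can climb tiers without double-counting."""
    return total + max(attempt_points - previous_best, 0)
```

For example, a student on 18 points who re-attempts a task and improves from 2 to 6 points would move to 22 points and reach the (hypothetical) silver threshold.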
Meanwhile, we have identified the key skill set (recrystallisation) on which to focus the pilot study. We have employed postgraduate student sessional teaching staff who are familiar with the laboratory course to do a skills audit of each practical class and identify the sub-skills of recrystallisation. This will enable us to develop self-assessment and feedback loops to help students build these sub-skills. The current idea is that each lab will focus on developing specific sub-skills of recrystallisation.
We are also pleased to say that our human ethics application for this project has been submitted and accepted.
After a bit of a late start, the project is well underway. I'm excited this week because we are meeting with a Blackboard guru to get an idea of how many of the fantastic features we want our LMS platform to have are actually possible.
Meanwhile, I'm playing catch-up, wading through the forms needed to actually make this project a reality. Tasks this week include beginning a full skills assessment of the laboratory course we have decided to tackle, consulting on and developing appropriate questions to include in the evaluation surveys, and completing the (dreaded) human ethics application.
I remain as ever, optimistic!