Our online interpreting exam experience: students as partners
COVID-19 brought its own online teaching challenges. But that was the easy part. Wait until you hear about the summative assessment challenges! Here is how we did it. It may help you shape your own approach to online final exams.
Traditionally, conference interpreting is assessed in many different ways, but always face to face. Some universities assess their students as they interpret live speeches, whilst others prefer their students to use recorded speeches. Some universities organise a final panel to assess candidates; others assess recorded exams. This post is not about the format of interpreting exams, but rather about the approach to online exams at a time of crisis.
At London Metropolitan University, the format of our interpreting exams is as follows: each module is individually assessed. Each module could combine one or more of the following formats: practical interpreting exams (consecutive and simultaneous), presentations, essays, and portfolios of practice.
I would like to focus on the assessment of interpreting skills (10 min of simultaneous interpretation; 6 min of consecutive interpretation in different language combinations).
Before the exam:
If we were worried about exams, imagine how worried students were. Imagine how worried you would be if this were your own exam. Most students had moved back to their home countries. They depended on their home technology. They had to take the exams from home, and some of them were actually in quarantine. This is hardly the context one would describe as conducive to excelling in a challenging final interpretation assessment. But this was the context, and we had to make the most of it.
A student agency approach
Based on our understanding of this specific context, our approach to the final exams was defined by a genuine partnership between students and staff. The COVID-19 crisis offered the opportunity to move away from the traditional approach to assessment, whereby an institution, represented by lecturers, dictates the exam format and conditions. We were able to rethink the assessment approach and take collective ownership of the final exams. In my mind, student agency is a game changer that deserves to be explored beyond the COVID-19 context.
The MACI is a professional course. The final exam has to assess skills at the professional level stated in the course specifications.
It is accredited by a UK Higher Education institution, with strict rules and regulations we have to comply with.
This was not negotiable.
What was negotiable was the “how”. How do we go about this? How do we make this happen knowing that every student is now in a unique technical context?
Start planning early: engage students in a collective reflection
Our exams were in mid-May. We started planning at the beginning of April. Here is how we went about it:
All staff met online and one question was asked: are students making the progress we expected them to make before moving to online teaching? How does this compare to previous years?
We agreed that students were on track, making very good progress and fully engaged during online teaching.
As a result, we thought it was fair to keep the same quality standards as for face to face assessment (levels of speeches, preparation time, choice of topics).
Before the Easter break, we organised an online meeting with students to brainstorm everything around the assessment (technology; emotions; support; teaching and learning). Students knew they could not change the format of the exams, but they were invited to take ownership of the “HOW”. For me, it was extraordinary to see how constructive and positive this approach was. The interaction between us all generated solutions that none of us could have come up with individually. One very striking example was the idea of inviting “a friend” into the student’s assessment environment for support and help with the technology. This option could relieve some of the pressure on students, who could then concentrate on their exams.
Selecting the best suited technology: simplicity and familiarity
We decided to use tools we were very familiar with:
Zoom to conduct exams synchronously.
Students selected two familiar devices (smartphone, tablet, laptop, or desktop), one of them dedicated to recording, plus a headset.
Students selected their preferred recording software.
Google Drive to upload recordings.
We opted for audio recordings of exams instead of video recording to reduce uploading time and technical challenges for students.
Students filled in information about their individual technical solution on an assessment planner, which allowed us to help them out on the day. (Available here)
At this point, we had 4 mock conferences left (16 hours in total). Each mock conference was an opportunity to use the technology and approach we had adopted for the final exams:
Each speech was pre-recorded and uploaded to YouTube.
The link to the speech was only made available at the last minute.
Students used their exam dedicated devices and headsets to interpret and record.
They uploaded their recording (audio file) into a dedicated, named folder on Google Drive.
At the end of each mock conference, we conducted a survey to measure their confidence in their technology and the processes involved.
In parallel, as we were going through this experience, I wrote a step-by-step document aimed at students and staff for the day of the exams. It was divided into three sequential steps: before, during, and after the exam. The document was continuously updated by students, staff and myself so that it included every single observation or tip. It aimed at enhancing predictability and reducing the level of stress on exam day for both students and staff. (Available here)
On the day of the exam: predictability at its best
I opened the exam space one hour before the exam started. This gave me plenty of time to set up Zoom and all the links to the speeches. It also allowed students to check their connection and benefit from last-minute reassurance (“Yes, the link works well”; “Yes, you are connected”; “Yes, we can hear you very well”).
As a virtual background, we selected a photo of the interpreting suite at university. This was a reminder of our familiar professional set up and added a personal touch to the exam context.
Students with the same language combination took their exams at the same time.
The Zoom waiting room allowed us to control who moved into the assessment room. We were able to send a dedicated message to students in the waiting room (“Hello John, we can see you are waiting. We will let you in in 10 min”).
Students were then welcomed into the assessment room. From that point, we recorded the full exam procedure to provide the university with evidence that we had conducted the exams according to the assessment regulations.
There was no informal chit-chat to relax students before the exams. We did not want to overload them with unnecessary information or create confusion.
We briefed students using the same wording as in the step-by-step document we had created together.
Students were given the topic and key words for the speech (shared screen) and 10 min to prepare. We were able to see them at all times.
Then the link of the pre-recorded speech (YouTube) was delivered on both Zoom chat and our dedicated course social platform (MeWe).
Students opened the link, played the video, and adjusted the volume control during the introduction of the speech.
They then paused the file and gave a thumbs up to say they could hear well. When we could see all thumbs up (a Zoom feature), we gave them the go-ahead to interpret.
We could not hear the students interpret but we could see them at all times (marking was done from the recorded interpretation).
Once the interpretation was over, students had to upload their file to their named folder on Google Drive, to which we were also connected.
We opened each audio file and listened to a snapshot of the beginning, middle and end. If we were happy the recording had worked, we sent a message in the chat to the student concerned.
Students had to stay in the exam room until everyone had uploaded their file and was given the go ahead to leave.
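That beginning/middle/end spot-check was done entirely by ear on the day. For anyone who would rather semi-automate it, here is a minimal sketch, assuming the uploaded recordings are 16-bit mono WAV files sitting in a locally synced Drive folder; the file name and the silence check are illustrative assumptions, not part of our actual process.

```python
import math
import struct
import wave

def spot_check(path, snippet_seconds=2):
    """Open a 16-bit mono WAV file and sample three snippets (start,
    middle, end). Returns (duration_in_seconds, ok); ok is False when
    any snippet is pure digital silence, i.e. worth a manual listen."""
    with wave.open(path, "rb") as wf:
        rate = wf.getframerate()
        n_frames = wf.getnframes()
        duration = n_frames / rate
        snippet = int(rate * snippet_seconds)
        ok = True
        for start in (0, n_frames // 2, max(0, n_frames - snippet)):
            wf.setpos(start)
            frames = wf.readframes(min(snippet, n_frames - start))
            samples = struct.unpack("<%dh" % (len(frames) // 2), frames)
            if not samples or max(abs(s) for s in samples) == 0:
                ok = False  # silent or empty snippet: flag for a manual listen
    return duration, ok

# Demo: build a one-second 440 Hz test tone so the sketch runs end to end.
rate = 16000
with wave.open("demo_exam_recording.wav", "wb") as wf:
    wf.setnchannels(1)
    wf.setsampwidth(2)
    wf.setframerate(rate)
    tone = b"".join(
        struct.pack("<h", int(10000 * math.sin(2 * math.pi * 440 * t / rate)))
        for t in range(rate)
    )
    wf.writeframes(tone)

print(spot_check("demo_exam_recording.wav"))  # → (1.0, True)
```

A script like this would only confirm that a file exists, opens, and is not silent at the three points; it is no substitute for actually listening, but it can triage a folder of uploads quickly.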
Did we experience any issues?
Yes we did! But you have to expect that you will! And this is what we had discussed with students. All the “what ifs” had been brainstormed, and we had agreed on the strategy to follow:
As soon as a student was aware something was wrong, they wrote a dedicated message in the chat box to let us know.
We then moved the student to a breakout room called the “help room”, where we could comfortably speak with them and help out without disturbing anyone else.
We used the help room once only.
The uploading of the audio files to Google Drive sometimes took longer than anticipated.
I felt students were nervous about the exam, but they controlled the exam environment very well.
With the assessment planner, we knew what devices and software students were using and could give advice.
The “friends” were extremely helpful and truly made a great contribution on the day, especially when the technology played up.
Student agency means students take ownership of their assessment together with staff. It is no longer “the” exam, it is “our” exam. This sense of togetherness was built on trust and one idea only: we all knew that every decision was taken with each other’s best interests at heart. We all wanted the assessment to be fair and conducted in an environment that was conducive to success. As students and staff, we may have played different roles, but we all belonged to the same community, committed to our collective interest.