Kinder, Gentler Oral Exams

Let me start off by saying that, as a student, I found oral exams to be very intimidating and frustrating. I could see their value as assessment tools, but in practice they were simply a source of personal dread. Fast-forward to 2012: I am now using oral assessments with my own students, but I have tried to minimize what I found intimidating and frustrating about them. I have made my oral assessments kinder and gentler.

The strengths of oral assessments

In my opinion, the strengths of oral assessments are a result of their interactive nature.

If a student is stuck on a minor point, or even a major one, you can give them a hint or use some leading questions to help them along. Compare this to what happens when a student gets stuck on a written exam question: the oral assessment gives you a much better picture of student understanding than an incomplete or nonsensical written response does.

Another strength is that no ambiguity needs to go unresolved. If some sort of ambiguous statement comes out of a student’s mouth, you can ask them to clarify or expand on what they have said, instead of dealing with the common grader’s dilemma of sitting in front of a written response trying to make judgement calls about ambiguous student work.

Some other benefits: marking is a breeze (I will discuss my specific marking scheme later), and I have found that I can generate “good” oral exam questions much more quickly than written ones.

My perception of the weaknesses of traditional oral assessments

The following are common, but not universal, characteristics of oral assessments.

Public – Looking dumb in front of me may not be fun, but it is far more comfortable than looking dumb in front of a room full of your peers or discipline experts. Having spent some time on both sides of the desk, I don’t feel that my students ever “look dumb”, but as a student I remember feeling dumb on many occasions (here I also include comprehensive exams, dissertation defences and question periods after oral presentations in my definition of oral assessments). I guess I’m saying that it feels worse than it looks, and doing it in public makes it feel even worse.

A lack of time to think – This is actually my biggest beef with oral assessments. In a written assessment you can read the question, collect your thoughts, brainstorm, make some mistakes, try multiple paths, and then finally try to put together a cohesive answer. I realize that you can do all of these things in an oral assessment as well, but there is a certain time pressure hanging over your head the whole while. And there is a difference between privately pursuing different paths before settling on one and having people scrutinize your every step while you do it.

Inauthentic – By inauthentic, I mean that oral exams (and, for the most part, written ones too) isolate you from resources and come with some sort of urgent time pressure. When we are confronted with a challenging problem or question in the real world, we usually have access to the internet, textbooks, journals and even experts, and we are able to use those resources to help build or clarify our understanding before having to present our solution. On the flip side, the question period after a presentation is also a kind of real-world assessment, and there we are usually expected to have answers at our fingertips without consulting any resources. So arguments can be made both for and against the authenticity of an oral assessment.

Context (Advanced Lab)

Before I break down my kinder, gentler oral exams, I want to discuss the course in which I was using them. This course was my Advanced Lab (see an earlier post), where students work in pairs on roughly month-long experimental physics projects. One student is asked to be in charge of writing about the background and theory and the other about the experimental details, and then on the second project they switch. For their oral assessments I used the same set of questions for both partners, but the actual questions (see below) were very project-specific. My hope was that using the same questions for both partners would force them to pay much closer attention to what the other had written.

It took at most 2 hours total to come up with the 6 sets of questions (one set per pair, 12 students total in the course), plus 9 hours of actual oral exams, which works out to just under an hour per student (11 hours for 12 students). I would say that this is roughly equivalent to the time I would have spent creating and marking that many different written exams, but it was much more pleasant for me than all that marking.

Kinder, gentler oral exams

I will describe the format that I use and then highlight some of the key changes that I made to improve on what I perceive to be the weaknesses of traditional oral exams.

I book a 45-minute time slot for each student and they come to my office one at a time. When they show up I have 3 questions for them. They have 10 minutes to gather their thoughts and use whatever resources they brought (including the internet, but not consulting another person) to help formulate some coherent answers. I also give them a nice big whiteboard to use however they see fit.

Once their 10 minutes are up (it is not uncommon for them to take a couple of extra minutes if they want that little bit of extra time), they answer the questions in whatever order they please. For each question I try, not always successfully, to let them get their answer out before I start asking clarification, leading or follow-up questions. If they are on completely the wrong track or get stuck, I step in much earlier. If the leading questions do not help them get to the correct answer, we discuss the question on the spot until I feel that the student “gets” the answer. Sometimes these discussions immediately follow the question and sometimes I wait until they have had a chance to answer all three questions.

After they have answered all three questions and we have discussed the correct answers, I pull out the rubric (see below) and we try to come to a consensus on their grade for each question. They leave my office with a grade and knowledge of the correct answers to all three questions.

The key changes:

  • Private – I have them come to my office and do the assessment one-on-one instead of in front of the whole class.
  • 10 minutes to collect their thoughts and consult resources – This is similar to the perceived security blanket offered by an open-book exam. Students that were well-prepared rarely used the entire time, and students that were not well-prepared tried to cram but did not do very well, since I would always ask some clarification or follow-up questions. I have some post-course feedback interviews planned to learn more about the student perspective on this, but my perception is that the preparation time was helpful even for the well-prepared students. It gave them a chance to build some confidence in their answers, and I often delighted in how well they were able to answer the questions. That time also offered an opportunity to get some minor details straight, which helped both their confidence and the quality of their answers. And finally, knowing that they had those 10 minutes of live prep time seemed to reduce their pre-test stress.
  • Immediate feedback – Discussing the correct answer with the student immediately after they have answered a question is a potential confidence killer. I suspect that the students would prefer to wait until after they have answered all the questions before discussing the correct answers, and I am interested to see what I will learn in my feedback interviews.
  • Grading done as a collaborative process with the student – In practice I would usually suggest a grade for a question, mention some examples from their answer (including how much help they needed from me), and then ask them if they thought that was fair. If they felt they should have earned a higher grade, they were invited to give examples of how their answer fell in the higher rubric category, and on many occasions those students received higher grades. However, this is a squeaky-wheel situation, and it is hard to know whether it is entirely fair to all students. In the cases where I asked students to tell me what grade they thought they had earned before saying anything myself, they were far more likely to self-assess lower than I would have assessed them than to self-assess higher.

Grading rubric

The grading rubric used was as follows:

Grade | Meets expectations? | Criteria
100% | Greatly exceeds expectations | The student displayed an understanding which went far beyond the scope of the question.
90% | Exceeds expectations | Everything was explained correctly without leading questions.
75% | Meets expectations | The major points were all explained correctly, but some leading questions were needed to get there. There may have been a minor point which was not explained correctly.
60% | Approaching expectations | There was a major point or many minor points which were not explained correctly, but the student was able to communicate a correct overall understanding.
45% | Below expectations | Some of the major points were explained correctly, but the overall explanation was mostly incorrect.
30% | Far below expectations | Some of the minor points were explained correctly, but the overall explanation was mostly incorrect.
0% | x_X | X_x

Some example questions

General

  • I would pull a figure from their lab report and ask them to explain the underlying physics or experimental details behind a specific feature of the figure.

Superconductor experiment

  • “Run me through the physics of how you were able to get a current into the superconducting loop. Why did you have to have the magnet in place before the superconducting transition?”
  • “Describe the physics behind how the Hall sensor gave a voltage output which is proportional (when zeroed) to the external field. How do the external magnetic field and the Hall sensor need to be oriented with respect to each other?” (The aside after this list gives the standard relation.)
  • “Explain superconductivity to me in a way which a student, just finishing up first-year science, would understand.”
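
An aside that is not part of the original post, for readers who want the standard textbook relation behind the Hall-sensor question: for a sensor of thickness $t$ carrying current $I$, the Hall voltage is

$$V_H = \frac{I B_\perp}{n q t},$$

where $n$ is the charge-carrier density, $q$ is the carrier charge, and $B_\perp$ is the component of the magnetic field normal to the face of the sensor. Only $B_\perp$ contributes, which is why the sensor face has to be oriented perpendicular to the field it is meant to measure.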

Electron-spin resonance experiment

  • “Discuss how the relative alignment between your experiment and the Earth’s magnetic field might affect your results.”
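
An aside that is not part of the original post: the reason alignment with the Earth’s field matters follows from the standard ESR resonance condition. Resonance occurs when the photon energy matches the Zeeman splitting,

$$h\nu = g\mu_B B,$$

where $g$ is the electron g-factor and $\mu_B$ is the Bohr magneton, and the relevant $B$ is the magnitude of the total field $\left|\vec{B}_\text{coil} + \vec{B}_\text{Earth}\right|$. The Earth’s field (roughly 50 μT) therefore shifts the resonance by an amount that depends on how it is aligned with the applied field.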

Gamma-ray spectroscopy

  • “In what ways did your detector resolution not agree with what was expected according to the lab manual? What are some reasonable steps that you could take to TRY to improve this agreement?”
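
An aside that is not part of the original post, since “resolution” can be defined in more than one way: a common convention in gamma-ray spectroscopy is the fractional full width at half maximum of a photopeak,

$$R = \frac{\Delta E_\text{FWHM}}{E},$$

and for scintillation detectors whose response is dominated by photoelectron counting statistics one expects roughly $\Delta E \propto \sqrt{E}$, so that $R \propto 1/\sqrt{E}$.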

Some other directions to take oral assessments

A couple of my blogger buddies have also been writing about using oral assessments, and I really like what they are up to.

Andy Rundquist has written quite a bit about oral assessments (one example) because they are central to his Standards-Based Grading implementation. One of the things that he has been doing lately is giving a student a question ahead of time and asking them to prepare a page-length solution to bring to class. In class the student projects their solution via doc-cam, Andy studies it a bit, and then he starts asking the student questions. To my mind this is most similar to the question period after a presentation: the student has had some time, in isolation, to put together the pieces to answer the question, and the questions are used to see how well they understood all the pieces required to put together the solution. Another thing that Andy does is to get the whole class to publicly participate in determining the student’s overall grade on that assessment. I love that idea, but feel like I have some work to do in terms of creating an appropriate classroom environment to do that.

Bret Benesh wrote a couple of posts (1, 2) discussing his use of oral exams. His format is closer to mine than to Andy’s, but Bret’s experience was that even when students knew the exam question ahead of time, he could easily tell the difference between those who understood their answers and those who did not. I really want to try giving my students the questions ahead of time now.

One final note

I am giving a short AAPT talk on my kinder, gentler oral exams, so any feedback that would help with my presentation will be greatly appreciated. Are there points that should have been emphasized but were not?


8 Comments on “Kinder, Gentler Oral Exams”

  1. bretbenesh says:

    Hi Joss,

    Great stuff, as usual. I would be interested in finding out whether there is a difference in giving your students the questions ahead of time. I kind of think that my results had a lot to do with my particular type of student (elementary education major), and that physics majors might be different. But I am not sure, and I would love to find out!
    Bret

    • Joss Ives says:

      Hi Bret,

      I suspect that I will have to do a really good job of record keeping to be able to key in on the differences between when they are provided the question ahead of time and when they are not. I will also need to keep tabs on how they do on the original question as compared to the follow-up questions. Looks like a good research question!

  2. Andy "SuperFly" Rundquist says:

    For my students, there were two major types of problems they would do: 1) a standard like “I can calculate the electric field for a complicated situation”, and 2) a standard like “derive the Euler-Lagrange equation.” For (1), it was great to see what type of situation the student came up with, and it led to very different types of questions for students doing similar standards. For (2) there were some very fine-grained details that I would end up asking about, because usually the derivations were done in the book. I liked discussing type (1) problems with students better, but I feel that type (2) problems were more fairly graded.

    Regarding your point about inauthentic-ness (is that a word?), I would say that I think about the oral exams as related to casual conversations about the material in the course. If a student needs 10 minutes to pull her thoughts together, that’s different than such a casual conversation. I sometimes was surprised by the coarse-level details that students had to think about in the oral exam situation. I thought those would be things we could have such a casual conversation about, especially as we started digging deeper into the fine-grained details.

    • Joss Ives says:

      Andy, I like this idea of the casual conversation. It helps me differentiate between what I think they should have at their fingertips and what they should be able to explain given some time to organize their thoughts. In a casual conversation, a student that just spent a month working on an experiment should be able to describe how a piece of equipment in their experiment works or how their results mesh with the underlying theory.

  3. LE says:

    As it happens, I submitted a paper on orals to TPT a couple of months ago that runs along pretty similar lines. I wish more instructors used them; they’re such powerful tools.

    • Joss Ives says:

      Hi Ellie,

      Our interests sure seem to cross paths often. We should chat about this topic more at AAPT. I’m looking forward to reading your paper.

  4. […] document, and then walk you through it? Some of this reminds me, by the way, of the great talk that Joss Ives gave at AAPT about giving his students time to think during oral […]

  5. […] This would be based on a blog post and conference presentation that I gave last year on kinder, gentler oral exams. […]

