Kinder, Gentler Oral Exams

Let me start off by saying that, as a student, I found oral exams to be very intimidating and frustrating. I could see their value as assessment tools, but found that in practice they were simply a source of personal dread. Fast-forward to 2012: I am now using oral assessments with my own students, but I have tried to minimize what I found intimidating and frustrating about oral exams. I have made my oral assessments kinder and gentler.

The strengths of oral assessments

In my opinion, the strengths of oral assessments are a result of their interactive nature.

If a student is stuck on a minor point, or even a major one, you can give them a hint or use some leading questions to help them along. Compare this to what happens when a student gets stuck on a written exam question, and you can see how an oral assessment gives you a much better picture of student understanding than an incomplete or nonsensical written response.

Another strength is that no ambiguity needs to go unresolved. If some sort of ambiguous statement comes out of a student’s mouth, you can ask them to clarify or expand on what they have said, instead of dealing with the common grader’s dilemma of sitting in front of a written response and making judgement calls about ambiguous student work.

Some other benefits are that marking is a breeze (I will discuss my specific marking scheme later) and that I can generate “good” oral exam questions much more quickly than I can written ones.

My perception of the weaknesses of traditional oral assessments

The following are common, but not universal, characteristics of oral assessments.

Public – Looking dumb in front of me may not be fun, but it is far more comfortable than looking dumb in front of a room full of your peers or discipline experts. Having spent some time on both sides of the desk, I don’t feel that my students ever “look dumb”, but as a student I remember feeling dumb on many occasions (here I will also include comprehensive exams, dissertation defences and question periods after oral presentations in my definition of oral assessments). I guess I’m saying that it feels worse than it looks, but doing it in public makes it feel even worse.

A lack of time to think – This is actually my biggest beef with oral assessments. In a written assessment you can read the question, collect your thoughts, brainstorm, make some mistakes, try multiple paths, and then finally try to put together a cohesive answer. I realize that you can do all of these things in an oral assessment as well, but there is a certain time pressure which hangs over your head during an oral assessment. And there is a difference between privately pursuing different paths before arriving at the desired one and having people scrutinize your every step while you do it.

Inauthentic – By inauthentic, I mean that oral exams (and, for the most part, written ones too) isolate you from resources and come with some sort of urgent time pressure. If we are confronted with a challenging problem or question in the real world, we usually have access to the internet, textbooks, journals and even experts, and we are able to use those resources to help build or clarify our understanding before having to present our solution. On the flip side, the question period after a presentation is also a real-world assessment, and there we are usually expected to have answers at our fingertips without consulting any resources, so arguments can be made both for and against the authenticity of an oral assessment.

Context (Advanced Lab)

Before I break down my kinder, gentler oral exams, I want to discuss the course in which I was using them. This course was my Advanced Lab (see an earlier post), where students work in pairs on roughly month-long experimental physics projects. One student is asked to be in charge of writing about the background and theory and the other the experimental details, and then on the second project they switch. For their oral assessments I used the same set of questions for both partners, but the actual questions (see below) were very project-specific. My hope was that using the same questions for both partners would force them to pay much closer attention to what the other had written.

It took at most 2 hours in total to come up with the 6 sets of questions (12 students total in the course), plus 9 hours of actual oral exams, which comes out to less than an hour per student. I would say that this is roughly equivalent to the time I would have spent creating and marking that many different written exams, but it was much more pleasant for me than all that marking.

Kinder, gentler oral exams

I will describe the format that I use and then highlight some of the key changes that I made to improve on what I perceive to be the weaknesses of traditional oral exams.

I book a 45-minute time slot for each student and they come to my office one at a time. When they show up, I have 3 questions for them. They have 10 minutes to gather their thoughts and use whatever resources they brought (including the internet, but not consulting with somebody) to help formulate some coherent answers. I also give them a nice big whiteboard to use however they see fit. Once their 10 minutes are up (it is not uncommon for them to take a couple of extra minutes if they want that little bit of extra time), they are asked to answer the questions in whatever order they please.

For each question I try, though not always successfully, to let them get their answer out before I start asking clarification, leading or follow-up questions. If they are on completely the wrong track or get stuck, I step in much earlier. If the leading questions do not help them get to the correct answer, we discuss the question on the spot until I feel that the student “gets” it. Sometimes these discussions immediately follow the question and sometimes I wait until they have had a chance to answer all three questions. After they have answered all three questions and we have discussed the correct answers, I pull out the rubric (see below) and we try to come to a consensus on their grade for each question. They leave my office with a grade and knowledge of the correct answers to all three questions.

The key changes:

  • Private – I have them come to my office and do the assessment one-on-one instead of in front of the whole class.
  • 10 minutes to collect their thoughts and consult resources – This is similar to the perceived safety blanket offered by an open-book exam. Students who were well prepared rarely used the entire time, and students who were not well prepared tried to cram but did not do very well, since I would always ask some clarification or follow-up questions. I have some post-course feedback interviews planned to learn more about the student perspective on this, but my perception is that the preparation time was helpful, even for the well-prepared students. It gave them a chance to build some confidence in their answers, and I often delighted in how well they were able to answer their questions. That time also offered an opportunity to get some minor details straight, which helps with both confidence and the quality of their answers. And finally, knowing that they had those 10 minutes of live prep time seemed to reduce their pre-test stress.
  • Immediate feedback – Discussing the correct answer with the student immediately after they have answered a question is a potential confidence killer. I suspect that the students would prefer to wait until after they have answered all the questions before discussing the correct answers, and I am interested to see what I will learn in my feedback interviews.
  • Grading done as a collaborative process with the student – In practice I would usually suggest a grade for a question, mention some examples from their answer (including how much help they needed from me), and then ask them if they thought that was fair. If they felt they should have earned a higher grade, they were invited to give examples of how their answer fell into the higher rubric category, and on many occasions those students received higher grades. The problem is that this is a squeaky-wheel situation, and it is hard to figure out if it is entirely fair to all students. In the cases where I asked students to tell me what grade they thought they had earned before saying anything myself, they were far more likely to self-assess lower than I would have assessed them than higher.

Grading rubric

The grading rubric used was as follows:

Grade | Meets expectations? | Criteria
100% | Greatly exceeds expectations | The student displayed an understanding which went far beyond the scope of the question.
90% | Exceeds expectations | Everything was explained correctly without leading questions.
75% | Meets expectations | The major points were all explained correctly, but some leading questions were needed to help get there. There may have been a minor point which was not explained correctly.
60% | Approaching expectations | There was a major point or many minor points which were not explained correctly, but the student was able to communicate an overall understanding which was correct.
45% | Below expectations | Some of the major points were explained correctly, but the overall explanation was mostly incorrect.
30% | Far below expectations | Some of the minor points were explained correctly, but the overall explanation was mostly incorrect.
0% | x_X | X_x
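
To make the mapping concrete, here is a minimal sketch in Python (my own illustration, not something from the original post) of the rubric as a lookup table, along with a hypothetical helper that averages the per-question grades. How the three question scores are combined into an overall exam grade is an assumption on my part.

    # Minimal sketch: the rubric above encoded as a lookup table.
    # Averaging the three question scores is an assumption, not something
    # specified in the post.
    RUBRIC = {
        "Greatly exceeds expectations": 100,
        "Exceeds expectations": 90,
        "Meets expectations": 75,
        "Approaching expectations": 60,
        "Below expectations": 45,
        "Far below expectations": 30,
        "x_X": 0,
    }

    def exam_grade(categories):
        """Average the rubric percentages for the answered questions."""
        scores = [RUBRIC[c] for c in categories]
        return sum(scores) / len(scores)

    # Example: one clean answer, one needing leading questions, and one with a
    # major point missed -> (90 + 75 + 60) / 3 = 75.0
    print(exam_grade(["Exceeds expectations",
                      "Meets expectations",
                      "Approaching expectations"]))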

Some example questions

General

  • I would pull a figure from their lab report and ask them to explain the underlying physics or experimental details that led to a specific feature in the figure.

Superconductor experiment

  • “Run me through the physics of how you were able to get a current into the superconducting loop. Why did you have to have the magnet in place before the superconducting transition?”
  • “Describe the physics behind how the Hall sensor gave a voltage output which is proportional (when zeroed) to the external field. How do the external magnetic field and the Hall sensor need to be oriented with respect to each other?”
  • “Explain superconductivity to me in a way which a student, just finishing up first-year science, would understand.”

Electron-spin resonance experiment

  • “Discuss how the relative alignment between your experiment and the Earth’s magnetic field might affect your results.”

Gamma-ray spectroscopy

  • “In what ways did your detector resolution not agree with what was expected according to the lab manual? What are some reasonable steps that you could take to TRY to improve this agreement?”

Some other directions to take oral assessments

A couple of my blogger buddies have also been writing about using oral assessments, and I really like what they are up to as well.

Andy Rundquist has written quite a bit about oral assessments (one example) because they are quite central to his Standards-Based Grading implementation. One of the things that he has been doing lately is giving a student a question ahead of time and asking them to prepare a page-length solution to bring to class. In class the student projects their solution via doc-cam, Andy studies it a bit, and then he starts asking the student questions. To my mind this is most similar to the question period after a presentation: the student has had some time, in isolation, to put together the pieces needed to answer the question, and the questions are used to see how well they understood all of those pieces. Another thing that Andy does is get the whole class to publicly participate in determining the student’s overall grade on that assessment. I love that idea, but feel like I have some work to do in terms of creating an appropriate classroom environment before trying it.

Bret Benesh wrote a couple of posts (1, 2) discussing his use of oral exams. His format is closer to mine than it is to Andy’s, but Bret’s experience was that even when students knew the exam question ahead of time, he could easily tell the difference between those who understood their answers and those who did not. I really want to try giving my students the questions ahead of time now.

One final note

I am giving a short AAPT talk on my kinder, gentler oral exams, so any feedback that will help with my presentation will be greatly appreciated. Are there points which should have been emphasized, but were not?


The Science Learnification (Almost) Weekly – May 30, 2011

This is a collection of things that tickled my science education fancy in the past couple of weeks or so.

Facilitating student discussion

  • Facilitating Discussion with Peer Instruction: This one was buried somewhere in my to-post pile (the post is almost a month old). The always thoughtful Brian Frank discusses a couple of things that most of us end up doing that are counter-productive when trying to facilitate student discussion. Buried in the comments, he adds a nice list of non-counter-productive things the facilitator can say in response to a student’s point to help keep the discussion going.

Dear Mythbusters, please make your data and unused videos available for public analysis

  • An open letter to Mythbusters on how to transform science education: John Burk shares his thoughts with the Mythbusters on the good they are doing for science education and the public perception of science (and scientists) and then goes one step further and asks them to share their raw experimental data and video for all their experiments and trials, failed and successful. Worth noting is that Adam Savage is very active in the skeptical movement, a group of folks that consider science education to be a very high priority.

End of year reflections

Well, it is that time of year when classes are wrapping up and folks are reflecting on the year. Here are a few such posts.

  • Time for New Teaching Clothes: SBG Reflections: Terie Engelbrecht had a handful of reflection posts over the past couple of weeks. In this post she does a nice job of reminding us that, for any instructional strategy that is unfamiliar to students, we need to communicate WHY we have chosen to use it. And this communication needs to happen early (as in the first day) and be repeated often (since the first day is a murky blur to most of them). On a personal side note, I spend most of my first day of class communicating to my students that the instructional strategies I use were chosen (to the best of my abilities and knowledge) to best help them learn, because I care about their learning. Earlier this month I had a parent tell me that after the first day of class her daughter came home very excited about my class because of my message about caring about her learning. I couldn’t have smiled bigger.
  • Thoughts on the culture of an inverted classroom: Robert Talbert discusses what is essentially a buy-in issue, with his end-of-term feedback showing 3/4 of his students seeing the value of his flipped/inverted classroom approach. This number is pretty consistent with my own experience, where I judge buy-in by the fraction of students that complete their pre-lecture assignments. He makes a nice point at the end that students used to an inverted classroom would probably be much more appalled by a regular lecture course than vice versa.
  • “Even our brightest students…” Part II: Michael Rees writes about his own (student) perspective on Standards-Based Grading. We need more of these student-perspective education blogs; they are fantastic.

An experiment in not using points in the classroom

  • Pointslessness: An Experiment in Teenage Psychology: Shawn Cornally ran a bioethics class in which the students’ work for almost the entire year did not count toward their grade, and they discussed readings and movies which were “interesting” (I’m not sure what was used to qualify these things as interesting, but looking through the list I’m pretty sure I would find most of them interesting). Without marks attached, the students engaged in the discussions for the sake of engaging in the discussions, and those students who usually try to glean what is going on from the classroom discussions alone (instead of doing the readings themselves) would often go and do the readings after the discussions.

Effective communication of physics education research

  • Get the word out: Effective communication of physics education research: Stephanie Chasteen posts and discusses her fantastic talk from the Foundations and Frontiers in Physics Education Research – Puget Sound conference (FFPER and FFPER-PS are by far my favorite conferences, btw). The talk discussed the generally poor job that physics education researchers do of communicating with the outside world and offered some strategies for becoming more effective in this communication.

A few more posts of interest

  • It is just fine to give a quiz based on the homework that’s due today: Agreed! I do it too, but I use online homework that provides instant feedback, so students show up in class having already received some feedback on their understanding.
  • Why Schools Should Embrace the Maker Movement: I’m hoping to develop an upper-year electronics course based on Arduino, and requiring only intro computer science and physics as prerequisites. Go Makers!
  • Probing potential PhDs: One of a grad student’s typical responsibilities is to be a teaching assistant, and some folks at Stony Brook are taking this into account when interviewing potential new grad students by asking them to explain, at an undergraduate level, the answers to conceptual challenge problems. I think I want this collection of challenge problems for my own use.

The Science Learnification Weekly (March 13, 2011)

This is a collection of things that tickled my science education fancy in the past week or so. They tend to be clumped together in themes because an interesting thing on the internet tends to lead to more interesting things.

Moving beyond plug-n-chug Physics problems

  • Dolores Gende talks about representations in Physics and how these can be used to move the student problem-solving process beyond just formula hunting. Translation of representation is a very challenging task for novice Physics students, and a typical end-of-chapter exercise can be made much more challenging by asking them to translate from one representation to another, such as asking them to extract the “known quantities” from a graph instead of having them given explicitly in the word problem. I must say that I prefer Knight over other intro texts as a source of homework and quiz problems because he has a lot of these physics exercise + translation of representation questions. Gende links to the Rutgers University PAER (Physics and Astronomy Education Research) multiple-representation resources, but there are a ton of other excellent resources throughout the PAER pages.

Scientific thinking and the nature of science

  • Early this past week, Chad Orzel of the Uncertain Principles blog published three posts related to scientific thinking and the general population: Everybody Thinks Scientifically, Teaching Ambiguity and the Scientific Method, and Scientific Thinking, Stereotypes, and Attitudes. I won’t even try to summarize the posts here, but one of the main messages is that letting the average person believe that science is too difficult for them is not a great idea.
  • On Thursday I wrote a post which featured a couple of activities that can help teach about the nature of science. In the comments, Andy Rundquist brought up the mystery tube activity, which was also discussed in a recent Scientific American post arguing that schools should teach more about how science is done.
  • Habits of scientific thinking is a post from John Burk of the Quantum Progress blog. A nice discussion follows in the comments. His example habit is…

    “Estimate: use basic numeric sense, unit analysis, explicit assumptions and mathematical reasoning to develop a reasonable estimate to a particular question, and then be able to examine the plausibility of that estimate based on its inputs.”

  • Chains of Reasoning is a post from the Newton’s Minions blog. He is trying to work on getting his physics students from information to conclusion through logical (inference) steps, working directly and explicitly on having them reason well in physics. His main message for his students is one that sums up well the disconnect between the common perception of science and the true nature of science:

    “Science isn’t about ‘knowing;’ it’s about being able to figure out something that you don’t know!  If you can’t reason, then you’re not doing science.”

What Salman Khan might be getting right

  • Mark Hammond’s first post on his Physics & Parsimony blog talks about some of the positive things that we can take away from Salman Khan’s TED talk, which has recently been a hotly discussed topic on the old internet. I had been paying some attention to the discussion, but didn’t actually watch the talk until after reading Hammond’s post. It is much easier to tear something apart than to do as Mark did and pull out some important lessons. Mark’s two things that Khan is getting right are related to flipped classrooms and mastery learning, and it is important to remember that the audience being reached by this talk has mostly never heard of these education paradigms, which are generally supported by the greater education reform community (myself included). I commented on Mark’s blog:

    “In terms of public service, I feel that he could have sold the idea of the flipped classroom as something that every teacher can do, even without his videos, but that his academy makes it even easier for teachers to implement. I’m sure this is the first time that many people have heard of a flipped classroom, and it would be nice if people understand that this is a general teaching strategy and not something brand-new that you can all of a sudden do thanks to Khan.”

Collaborative scoring on oral assessments

  • Andy Rundquist posted a couple of videos showing him collaborating with students to score oral assessments for his upper-division Mathematical Physics course, which also happens to be his first Standards-Based Grading implementation.

The Science Learnification Weekly (March 6, 2011)