Looking at the number of drafts submitted for project reports in the Advanced Lab

In this post I take a quick look at the number of drafts of their project papers that students submitted in my January 2012 Advanced Lab course. This course had a minimum bar for the paper grades and the students were allowed to revise and resubmit as many times as needed to get there, with an average of 3.22 drafts needed. I decided to look at these numbers for the purpose of communicating realistic expectations to students currently registered for my fall section of the course and thought I would share those numbers.

I am starting to prepare for my fall Advanced Lab course. Here is a quick overview of this course from a previous post:

This type of course, a standard course in most physics departments, is a standalone lab course without any associated lecture course. There is an amazing amount of variability from one Advanced Lab course to the next and they range in format from one experiment per week with everything already set up and cookbook procedures ready to be followed, to a single student-developed project over the entire term (or year!).

In my specific incarnation, we spend the first month doing some introductory activities to build up some foundational skills which are mostly related to data analysis and presentation. For the rest of the course pairs of students work on two month-long experimental physics projects. The students are guided to work on projects that can be viewed as being part of a larger research line, where they build on the work of previous students and future students will build on their work. Thus no two groups will ever perform identical experiments.

A major piece of the course is that they have to write a journal-style article to communicate the results of one of their projects. To help them practice revising their own writing and impress upon them that effective writing requires many revisions, I require that students earn a grade equivalent to a B on their paper according to this rubric, and are allowed to revise and resubmit as many times as needed to reach that threshold grade.

The overall grade for these papers was calculated as 25% from the first graded draft and 75% from the final draft. They were allowed to submit an initial draft, which was not graded, on which I would spend a maximum of half an hour reading over the paper and providing feedback. Students were encouraged to have a peer read through their paper and provide some feedback before submitting this initial draft. After reaching the threshold B-grade, they were allowed to resubmit one final draft. At some point in the revision process I also had a formal process where students provided each other with peer feedback on their papers.
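Just to make that weighting concrete, here is a quick sketch of the calculation (the function name is mine, not part of the course materials):

```python
def overall_paper_grade(first_graded_draft, final_draft):
    """Weighted overall paper grade: 25% first graded draft, 75% final draft."""
    return 0.25 * first_graded_draft + 0.75 * final_draft

# A student whose first graded draft earned 70% and whose final draft
# earned 90% would end up with an overall paper grade of 85%.
print(overall_paper_grade(70, 90))  # 85.0
```

The heavy weighting on the final draft is the point: the revision process, not the first attempt, carries most of the grade.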

A quick summary of the numbers is in order. Of the twelve students, three of them gave up at some point before reaching the threshold B-grade on the journal-style article. Those students were only given partial credit for the last grade that their paper received. Of the nine students whose papers reached the threshold B-grade, five of them submitted a final draft to improve their overall paper grade.

Of the 9 papers that were accepted (met the minimum grade threshold of a B), 5 of them were revised at least one additional time.

The number of drafts in this graph includes the initial ungraded draft, but does not include the final revision that 5 of 9 students submitted after their papers reached the B-grade threshold.

What is the take-home message here? Based on this system, students should expect to submit three or more drafts of a paper in order to meet the threshold grade.

This coming fall, I plan to adopt some new feedback strategies that take the focus off grammatical correctness and similar issues in the hope of focusing more on the ideas in the papers. As part of this, I may move to a reviewer report style of feedback (for example, this is the one for AJP) and away from detailed rubrics, but I haven’t quite made up my mind on this yet. My grading philosophy in the course this fall will be that their course grade will represent the quality of the recommendation that I would give them in a reference letter based on their work in the course, and I want to do my best to make sure all of the individual components are assessed in ways that match up with this overall grading philosophy.


Reflecting on what I have read so far in John C. Bean’s “Engaging Ideas”

Ugh. I just had one of those moments where I lost a bunch of what I have written. I recovered what I could, but don’t feel like re-writing it all so instead will treat you to a fairly short post.

My interest in and engagement with student writing comes mostly from my use of the journal article genre for lab reports in my Advanced Lab course. Through attending a Writing Across the Curriculum workshop last month, I was invited to participate in planning a workshop built around Bean’s book “Engaging Ideas: The Professor’s Guide to Integrating Writing, Critical Thinking and Active Learning in the Classroom”. I have been skimming some parts of the book and reading other parts very carefully, and along the way I have been reflecting on the places where the student journal articles intersect with ideas from the book. What is proving to be very interesting is the grey area where I can debate with myself (and at some point with others) about places where these intersections might exist, or perhaps should exist.

There is a lot of very practical information in this book. He has chapters on using rubrics, on handling the paper load and on writing comments on students’ papers. I haven’t read those yet, but in reading through some of the earlier chapters, I came across two things that he wrote or referenced that struck a chord with me.

…many teachers read student essays with the primary purpose of finding errors, whereas they read their own colleagues’ drafts-in-progress for ideas

and

…for many college writers, the freedom of an open-topic research paper is debilitating.

My approach to the student journal articles thus far has mostly been that they are an information dump meant to follow the guidelines of the genre. As you can imagine, this is a vastly different approach from Bean’s approach to student writing. I am interested to see where I will end up after finishing the book and after having a chance to interact more with the colleagues with whom I am planning this workshop (as well as the workshop attendees). Although it is possible that I will continue to feel that the majority of the book does not apply to my situation, the conflicting ideas whirling around in my brain suggest that I will experience a significant shift in how I approach student writing. I originally had a lot more to say about these things, but will leave it at that for now.


Student Interview Feedback for Advanced Lab, Spring 2012, Part 2

This is part 2 (part 1 here) of my post discussing feedback I got from a couple of my students after the conclusion of my Advanced Lab course. This went long again so it looks like there will have to be a part 3.

The 8-hour work-week and filling out time sheets

The combination of this having been only my second time teaching the course and my policy that student experiments always build on, but never repeat, the projects of previous groups made it very hard for me to figure out projects of appropriate scope. So my solution was to ask that the students put a minimum of 8 hours each week into the course, and then I had to make sure that the projects consisted of sequences of achievable milestones. With that in place, I was happy to accept however far along each group made it with their projects as long as those 8 hours each week were actually spent working productively on the course.

So I got them to fill out and submit time sheets. I was worried that they would perceive these as being too strict or beneath them or terrible in some other way.

Interview feedback: No complaints. The time sheets were fine and did a good job of encouraging them to dedicate an appropriate amount of time to the course even when they felt like doing something else. Yay!

Future plan: It looks like I will continue to use these time sheets. The thoughtful-assessment part of my conscience doesn’t really like having to use these, but for the most part these students have never had to budget time for a longer project and they really need the scaffolding so that they don’t fall on their faces.

Oral and written dissemination

One of my major guiding principles in this course was (and continues to be) to try to make sure that the communication of their project was directed toward authentic audiences. For the weekly group meetings, they were bringing me up to speed on their project as well as informally presenting their work to people with less project-specific expertise (the rest of their classmates). Since projects are always meant to build on previous projects, their formal reports are going to be part of the literature used by the next group building on that same project. Their formal oral presentations were targeted at peers that lacked project-specific expertise (again, the rest of the class).

The first time I taught the course, I had the students write journal-style articles (each partner wrote one). There were two problems. First, the partner that was not writing the article ended up contributing very little to the analysis and usually didn’t dig deep enough into how everything worked from either the theoretical or the experimental side of things (which is part of why I implemented the oral assessments into the course). Second, the background and theory sections often lacked an authentic audience for multiple reasons: (A) they were often vaguely repeating the work from a source journal article; (B) if they were building on a previous group’s work, writing their own background and theory sections would be mostly redundant; and (C) the topics were often deep enough that it was not reasonable to expect them to develop the project-specific expertise to do a very good job on these sections.

So in this second incarnation of the course I decided to split the journal article up into two pieces, one for each partner: a LaTeX “technote” and a wiki entry for the background and theory. The idea was that future groups could add to the wiki entry, which would eliminate the redundancy of recreating essentially the same theory and background sections for future groups working on that research line. With the theory and background stripped out of the journal article, all I asked in the technote was that all equations and important terminology be clearly defined within the technote, and no other theory was needed. I thought this would have the added benefit of having both partners invested in a writing task for each project. But the whole thing did not work very well. The technotes worked fine, but the wiki entries ended up being so disconnected from the technotes that partners often didn’t even use the same notation between the wiki entry and technote.

It is worth noting that between a time crunch and the technote+wiki not working as well as I liked, I got the students to team up and write something a bit closer to a journal article for their second project.

In addition to their technote and wiki entry, each student gave a 12-15 minute formal oral presentation on one of their projects (each partner presented on the project for which they wrote the technote) instead of writing a final exam.

One of the things I wanted to discuss in the interview was what sort of improvements we could make to this dissemination process. I had some ideas in my head to discuss, such as poster presentations and articles written for a lay-audience.

Interview feedback: the suggestion was that for each project, one person would write a journal article and the other would prepare and present a poster at a research symposium. The logistics of this still need to be worked out and we discussed a number of combinations of dissemination methods before coming to a consensus on this specific one. The interviewed students saw communicating science to a lay-audience (the research symposium attendees) as an important thing for them to practice.

Future plan: I really like how this combination makes each member of the group responsible for communicating all the important pieces of their project. Targeting them at different audiences means that they will be able to work together while still ultimately having to produce their own non-redundant (relative to each other) work.

There are a lot of logistical issues to work out here. Our university has a student research day a couple of weeks before the end of our winter term and that would be a perfect place for them to present their poster. The problem is that, with a proper revision cycle for their poster, they will essentially have to have completed both projects a month before the end of the term. I’m not certain I can make that work. We can always have our own research symposium, but it seems ideal to get involved with an existing one that already has an audience.

The other piece here is that I will probably ask them to keep the journal articles closer to the technotes than a real journal article (meaning bare-bones theory and background).

A vague notion of a plan dawned on me while proof-reading this post. I could probably get the timing with the student research symposium to work if I reduce the scope of each project by roughly a week and then in the final month of the course I could ask each group to revisit one of their experiments and push it a bit further forward. There are all sorts of problems with this plan, such as how they will disseminate this additional work and the experimental apparatus probably having been torn down, but it is still something to consider.

The timing of peer review for their lab reports

Each technote and journal article was allowed as many drafts as needed to get the paper up to “accepted for publication with minor revisions” standards (a B-grade) based on a very picky rubric. After that, they were allowed one final draft if they wished to try to earn an A grade. A typical number of drafts was 3 or 4, but there were exceptions in both directions.

For the first report, I had each person do a peer review of another student’s submission. One of the questions I had on my mind for the feedback interview had to do with the timing of the peer review in the draft cycle. The first draft of the first paper is always an extremely rough thing to slog through, even for those written by very strong students. Thus, asking them to do peer review on a first draft is asking them to do something very painful. But having to critically apply a rubric and provide constructive feedback does wonders for getting students to pay much better attention to the specifics of the writing assignment, and the sooner that happens in the course, the sooner I see those improvements in their writing.

Interview feedback: not too sure if it is best to do peer review on a first or second draft. We discussed this for a bit, decided we could see both options as equally valid, and never came to any real conclusion.

Future plan: dunno yet. I could sign my course up for the Journal of the Advanced Undergraduate Physics Laboratory Investigation tool. They have peer review calibration tasks and the added benefit of anonymous peer reviewers from other institutions, but since JAUPLI is still small, the timing all has to work out magically well.


Student Interview Feedback for Advanced Lab, Spring 2012, Part 1

Once I started writing this it got pretty long so I will call this part 1 and work on part 2 another day.

A month ago I took a couple of my students to a local coffee shop, filled them full of treats, and picked their brains for a couple of hours about my Advanced Laboratory course that ended in April, 2012. I’m summarizing their feedback here to make sure I have a good record of it and, of course, to share. For any given piece of feedback from the students I will try my best to explain the context, and by the end you should have a pretty good idea of how the course looked and where it will be headed in the future.

Let’s start with my definition of an Advanced Laboratory course from an earlier post:

This type of course, a standard course in most physics departments, is a standalone lab course without any associated lecture course. There is an amazing amount of variability from one Advanced Lab course to the next and they range in format from one experiment per week with everything already set up and cookbook procedures ready to be followed, to a single student-developed project over the entire term (or year!).

In my specific incarnation, we spend the first month doing some introductory activities to build up some foundational skills which are mostly related to data analysis and presentation. For the rest of the course pairs of students work on two month-long experimental physics projects. The students are guided to work on projects that can be viewed as being part of a larger research line, where they build on the work of previous students and future students will build on their work. Thus no two groups will ever perform identical experiments.

Onto the feedback!

Weekly research group meetings

Each week we had a research group meeting where each group was asked to post a couple of figures or tables to the course wiki and quickly bring us up to speed on what they had done the previous week as well as what they planned on doing the next week. A very small part of their grade was based on how much they contributed to these research group meetings. My expectation was that, averaged over multiple meetings, each student would ask at least one meaningful question to the presenters per meeting or contribute in some other way to the conversations surrounding the projects of the other groups. I had twelve students enrolled so I split the class into two larger research groups so each research group consisted of myself and three pairs of students.

I was quite happy with how these meetings worked and felt it was really valuable for the presenters to have to frame things so that people other than me and the presenters actually understood what they were up to. Anecdotally it felt like students spent more of their initial time learning about the basics (experiment and theory) of their projects than in the past because they were going to have to explain things to somebody else.

Interview feedback: the meetings felt too long. Their initial suggestion was to make the meetings every other week or to time-limit them somehow. A research group meeting took somewhere between a half-hour and an hour each week. The actual presentations were reasonably concise, but there were always lots of questions from the other groups and feedback from me. This was also the place where we hashed out a lot of the gory details of what they should try to do in the coming week. The thing was that the other groups often contributed to these discussions on what to tackle in the coming week, so I felt like the whole process was extremely valuable for all parties. Still, it could probably be tightened up.

Future plan: Partners will alternate being the main presenter each week (previously they were both expected to present each week) and will be asked to present 1 or 2 tables or figures. The feedback was that it sometimes felt like a stretch to find that 2nd table or figure to present. The actual presentation will be limited to 5 minutes for a total of 10 minutes per project group between the presentation and the discussion afterwards. I won’t be strict on the time limits, but will be mindful of the clock to help prioritize which discussions to have at that moment and which ones can be saved for private discussions later. One of the students also suggested that having time limits on their presentations would serve as good practice for their formal oral presentations later in the course, which did have strict time limits.

Parallel investigations

Twice during the intro sequence I tried to have a number of small groups working on different things and then had them report out to the class. The second time we did this I called them “parallel investigations” and sent them off to study goodness of fit, Monte-Carlo methods for fitting, or Monte-Carlo methods for error analysis. In addition to orally reporting out their findings, I asked one partner to write their findings up in LaTeX and the other on the course wiki. These two write-ups were allowed to be identical, and the reason that I used two formats was because one partner was going to write up the background and theory for their first project on the wiki and the other was going to write up the analysis and results as a LaTeX “tech note”. Thus I wanted them to have some practice using these writing formats. Note that on the second project they were to switch who wrote on the wiki and who wrote the LaTeX “tech note”.

Interview feedback: this might not have been the best use of their time. Yup, I agree. In the end, for the investigations that they did not perform, each group was simply on the receiving end of a mediocre lecture on the topic and never got a chance to actively engage with the ideas. This is the exact opposite of how I try to run my courses.

Future plan: I’m planning on completely restructuring how the first-month introductory stuff works and will talk about that a bit more later, but I think the parallel investigations idea as it existed is officially dead.

Introduction to LaTeX

This is a place where I have not offered my students a ton of support. I wrote a tutorial so that they can install MiKTeX and TeXnicCenter on their Windows machines and gave them a couple of sample LaTeX documents that cover all the basics, but that’s it.
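For readers curious what such a sample document might look like, here is a minimal sketch along the lines of what I hand out (this particular file is illustrative, not the actual handout):

```latex
\documentclass{article}
\usepackage{amsmath}  % extra math environments
\usepackage{graphicx} % for including figures

\begin{document}

\section{Introduction}
Inline math like $E = mc^2$, and displayed, numbered equations:
\begin{equation}
  \chi^2 = \sum_i \frac{\left(y_i - f(x_i)\right)^2}{\sigma_i^2}
  \label{eq:chisq}
\end{equation}
Equation~\ref{eq:chisq} can then be referenced by its label.

% A figure would be included like this:
% \begin{figure}
%   \includegraphics[width=0.8\textwidth]{myplot.pdf}
%   \caption{A sample figure.}
% \end{figure}

\end{document}
```

The point of the samples is that students can compile a working document immediately and then modify it, rather than building one up from a blank file.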

Interview feedback: they would like some coherent instruction and resources. For most of them this is their first time ever dealing with a markup language and the learning curve seems to be steeper than I have been admitting to myself.

Future plan: It looks like a crew of us on twitter are going to put together a LaTeX workshop for the Summer 2013 AAPT meeting and I am hoping that as part of this process we will have put together a straightforward introduction-to-LaTeX-for-physicists package that I can drop on my students like a big ol’ pile of awesomeness.

Notebook activity

From Student-Generated Scientific Inquiry (Leslie Atkins and Irene Salter) I used their lab notebook activity, which uses pages taken from several famous scientists’ actual lab notebooks. The students are asked, in small groups, to take some notes on how these famous scientists took notes and organized their information. As a large group we then built a rubric for their lab notebooks based on their observations of the pages from the notebooks of the famous scientists. The students were highly engaged in this activity and seemed to be supportive of the rubric that we developed from this activity.

Interview feedback: they thought this activity was great. But they didn’t find it valuable in the way that I expected. The two students I interviewed both had some previous experience with lab notebooks in research labs and had, in the past, put way too much emphasis on maintaining an immaculate lab notebook. This activity had let them know that it was OK to have rough notes in their lab notebook.

Future plan: I hate marking lab notebooks. It is the worst. And with so much of the work they do being digital these days it is really hard to find a solution that fits into their work flow and doesn’t involve pasting umpteen print-outs into their lab notebook. I’m actually planning on backing off of trying to get them to keep a really good lab notebook and emphasize getting them to report at the beginning and end of the day what they planned on doing and what they actually accomplished (science fair style!). I will be checking it every class period and it will be graded as complete or incomplete. Once I feel that I can get a group of students doing a consistently good job of this, I will consider the next step to take.


Kinder, Gentler Oral Exams

Let me start off by saying that, as a student, I found oral exams to be very intimidating and frustrating. I could see their value as assessment tools, but found that in practice they were simply a source of personal dread. Enter 2012, where I am using oral assessments with my own students, but what I have done is try to minimize what I found intimidating and frustrating about oral exams. I have made my oral assessments kinder and gentler.

The strengths of oral assessments

In my opinion, the strengths of oral assessments are a result of their interactive nature.

If a student is stuck on a minor point, or even a major one, you can give them a hint or use some leading questions to help them along. Compare this to what happens if a student gets stuck on a written exam question and you can see how the oral assessment provides you with a much better assessment of student understanding than an incomplete or nonsensical written response.

Another strength is that no ambiguity need be left unturned. If some sort of ambiguous statement comes out of a student’s mouth, you can ask them to clarify or expand on what they have said instead of dealing with the common grader’s dilemma of sitting in front of a written response trying to make judgement calls related to ambiguous student work.

Some other benefits are that marking is a breeze (I will discuss my specific marking scheme later) and I have also found that I can generate “good” oral exam questions much more quickly than I can written ones.

My perception of the weaknesses of traditional oral assessments

The following are common, but not universal, characteristics of oral assessments.

Public – Looking dumb in front of me may not be fun, but it is far more comfortable than looking dumb in front of a room full of your peers or discipline experts. Having spent some time on both sides of the desk, I don’t feel that my students ever “look dumb”, but as a student I remember feeling dumb on many occasions (here I will also include comprehensive exams, dissertation defences and question periods after oral presentations in my definition of oral assessments). I guess I’m saying that it feels worse than it looks, but doing it in public makes it feel even worse.

A lack of time to think – This is actually my biggest beef with oral assessments. In a written assessment you can read the question, collect your thoughts, brainstorm, make some mistakes, try multiple paths, and then finally try to put together a cohesive answer. I realize that you can do all these things in an oral assessment as well, but there is a certain time pressure which hangs over your head during an oral assessment. And there is a difference between privately pursuing different paths before coming to a desired one and having people scrutinize your every step while you do this.

Inauthentic – By inauthentic, I mean that oral exams (and for the most part, written ones too) isolate you from resources and come with some sort of urgent time pressure. If we are confronted with a challenging problem or question in the real world, we usually have access to the internet, textbooks, journals and even experts. We are able to use those resources to help build or clarify our understanding before having to present our solution. On the flip side, we can also consider the question period after a presentation as a real-world assessment and we are usually expected to have answers at our fingertips without consulting any resources so arguments can be made for and against the authenticity of an oral assessment.

Context (Advanced Lab)

Before I break down my kinder, gentler oral exams, I want to discuss the course in which I was using them. This course was my Advanced Lab (see an earlier post) where students work in pairs on roughly month-long experimental physics projects. One student is asked to be in charge of writing about the background and theory and the other the experimental details, and then on the second project they switch. For their oral assessments I used the same set of questions for both partners, but the actual questions (see below) were very project-specific. My hope was that using the same questions for both partners would force them to pay much closer attention to what the other had written.

It took at most a total of 2 hours to come up with the 6 sets of questions (12 students total in the course) and then 9 hours of actual oral exams, which comes out to less than an hour per student. I would say that this is roughly equivalent to the time I would have spent creating and marking that many different written exams, but this was much more pleasant for me than all that marking.

Kinder, gentler oral exams

I will describe the format that I use and then highlight some of the key changes that I made to improve on what I perceive to be the weaknesses of traditional oral exams.

I book a 45-minute time slot for each student and they come to my office one at a time. When they show up in my office I have 3 questions for them. They have 10 minutes to gather their thoughts and use whatever resources they brought (including using the internet, but not consulting with somebody) to help formulate some coherent answers. I also give them a nice big whiteboard to use however they see fit. Once their 10 minutes are up (it is not uncommon for them to take a couple extra minutes if they want that little bit of extra time), they are asked to answer the questions in whatever order pleases them.

For each question I try, but not always successfully, to let them get their answer out before I start asking clarification, leading or follow-up questions. If they are on the completely wrong track or get stuck I will step in much earlier. If the leading questions do not help them get to the correct answer, we will discuss the question on the spot until I feel like the student “gets” the answer. Sometimes these discussions would immediately follow the question and sometimes I would wait until after they had had a chance to answer all three questions. After they have answered all three questions and we have discussed the correct answers, I pull out the rubric (see below) and we try to come to a consensus on their grade for each question. They leave my office with a grade and knowledge of the correct answer to all three questions.

The key changes:

  • Private – I have them come to my office and do the assessment one-on-one instead of in front of the whole class.
  • 10 minutes to collect their thoughts and consult resources – It is similar to the perceived safety blanket offered by an open book exam. Students that were well-prepared rarely used the entire time and students that were not well-prepared tried to cram but did not do very well since I would always ask some clarification or follow-up questions. I have some post-course feedback interviews planned to learn more about the student perspective on this, but my perception is that the preparation time was helpful, even for the well-prepared students. It gave them a chance to build some confidence in their answers and I often delighted in how well they were able to answer their questions. I think that time also offered an opportunity to get some minor details straight, which is beneficial in terms of confidence building and improving the quality of their answers. And finally, knowing that they had that 10 minutes of live prep time seemed to reduce their pre-test stress.
  • Immediate feedback – Discussing the correct answer with the student immediately after they have answered a question is a potential confidence killer. I suspect that the students would prefer to wait until after they have answered all the questions before discussing the correct answers, and I am interested to see what I will learn in my feedback interviews.
  • Grading done as a collaborative process with the student – In practice I would usually suggest a grade for a question, mention some examples from their answer (including how much help they needed from me), and then ask them if they thought that was fair. If they felt they should have earned a higher grade, they were invited to give examples of how their answer fell in the higher rubric category, and there were many occasions where those students received higher grades. The problem, however, is that this is a squeaky-wheel situation and it is hard to figure out if it is entirely fair to all students. For cases where I asked students to tell me what grade they thought they earned before saying anything myself, students were far more likely to self-assess lower than I would have assessed them than to self-assess higher.

Grading rubric

The grading rubric used was as follows:

Grade Meets Expectations? Criteria
100% Greatly exceeds expectations The student displayed an understanding which went far beyond the scope of the question.
90% Exceeds expectations Everything was explained correctly without leading questions.
75% Meets expectations The major points were all explained correctly, but some leading questions were needed to help get there. There may have been a minor point which was not explained correctly.
60% Approaching expectations There was a major point or many minor points which were not explained correctly, but the student was able to communicate an overall understanding which is correct.
45% Below expectations Some of the major points were explained correctly, but the overall explanation was mostly incorrect.
30% Far below expectations Some of the minor points were explained correctly, but the overall explanation was mostly incorrect.
0% x_X X_x

Some example questions

General

  • I would pull a figure from their lab report and ask them to explain the underlying physics or experimental details that led to a specific detail in the figure.

Superconductor experiment

  • “Run me through the physics of how you were able to get a current into the superconducting loop. Why did you have to have the magnet in place before the superconducting transition?”
  • “Describe the physics behind how the Hall sensor gave a voltage output which is proportional (when zeroed) to the external field. How do the external magnetic field and the Hall sensor need to be oriented with respect to each other?”
  • “Explain superconductivity to me in a way which a student, just finishing up first-year science, would understand.”

Electron-spin resonance experiment

  • “Discuss how the relative alignment between your experiment and the Earth’s magnetic field might affect your results.”

Gamma-ray spectroscopy

  • “In what ways did your detector resolution not agree with what was expected according to the lab manual? What are some reasonable steps that you could take to TRY to improve this agreement?”

Some other directions to take oral assessments

A couple of my blogger buddies have also been writing about using oral assessments, and I really like what they are up to as well.

Andy Rundquist has written quite a bit about oral assessments (one example) because they are quite central to his Standards-Based Grading implementation. One of the things that he has been doing lately is giving a student a question ahead of time and asking them to prepare a page-length solution to the question to bring to class. In class the student projects their solution via doc-cam, Andy studies it a bit, and then he starts asking the student questions. To my mind this is most similar to the question period after a presentation. The student has had some time, in isolation, to put together the pieces to answer the question, and the questions are used to see how well they understood all the pieces required to put together the solution. Another thing that Andy does is get the whole class to publicly participate in determining the student’s overall grade on that assessment. I love that idea, but feel like I have some work to do in terms of creating an appropriate classroom environment to do that.

Bret Benesh wrote a couple of posts (1, 2) discussing his use of oral exams. His format is closer to mine than to Andy’s, but Bret’s experience was that even when students knew the exam question ahead of time, he could easily tell the difference between students who understood their answers and those who did not. I really want to try giving them the questions ahead of time now.

One final note

I am giving a short AAPT talk on my kinder, gentler oral exams, so any feedback that will help with my presentation will be greatly appreciated. Are there certain points which were not, but should have been emphasized?


How do YOU give students credit for the scope of their projects?

In my Advanced Laboratory course my students are just about to submit the first drafts of their papers on their first of two projects (the whole course is essentially two projects after some initial introductory activities).

In this course I try to tailor the challenge-level of each project to the ability level of the students in each group. Their grade for each project is mostly based on the quality of the dissemination of their work and on their level of understanding of their project, as assessed by their written report, their presentations at the weekly research group meetings, and an oral assessment given at the end of their project.

But there is nothing explicit in my evaluation scheme that rewards students for tackling challenging projects or penalizes them for shooting really low. I tend to be more generous with my rubrics for students that have taken on the challenging projects, but I feel like I would like something built into the overall evaluation scheme.

So my question to you, dear blog reader, is this: how do you give students credit for the scope or challenge-level of their projects?


More than Make-Work Writing in my Advanced Lab Course

I teach the Advanced Physics Laboratory course in my department. This type of course, a standard course in most physics departments, is a standalone lab course without any associated lecture course. There is an amazing amount of variability from one Advanced Lab course to the next and they range in format from one experiment per week with everything already set up and cookbook procedures ready to be followed, to a single student-developed project over the entire term (or year!).

Communication is often emphasized in these courses, with probably the most common forms of dissemination being formal lab reports modeled after journal articles and oral presentations.

I started preparing in May for my 2nd time teaching this course, even though it doesn’t run until January 2012. I had tweaked the writing tasks that I was going to assign to the students when I came across the following paper:

Inquiry-Based Writing in the Laboratory Course, Cary Moskovitz and David Kellogg, Science Vol. 332 no. 6032 pp. 919-920 (May 2011). DOI: 10.1126/science.1200353

This paper ended up helping me further develop effective student writing tasks by making sure I was giving the students a tangible audience and by cutting out some portions of these writing tasks which were probably more make-work than anything.

Feel free to contact me if you need help finding a copy of this paper.

Paper Summary

I’m not certain I understand what exactly “inquiry-based writing” is (blame how poorly defined inquiry-based anything is), but I am taking points discussed in Moskovitz and Kellogg’s paper as a guide to giving students productive/effective writing tasks in the lab. This is in contrast to assigning the students the standard lab reports which the authors describe as “largely inauthentic and make-work affairs, involving little actual communication beyond the implicit argument for a good grade.”

My rough summary (filtered through my own interests) of their main three points for designing effective writing tasks in the lab are:

  1. Assign forms (genres) of writing that working scientists use, such as journal articles, experimental reports, proposals, peer reviews, and conference posters.
  2. Ensure that students have something meaningful to say. It’s OK for the writing task to include only certain parts of a given genre. For example, if the students didn’t design or modify the procedures, how are they supposed to see writing up a methods section as meaningful?
  3. Create a real communication scenario by providing them with a tangible audience for their written work. This does not mean asking them to imagine addressing scientists in their field, but instead providing them with a real audience that will have interest in their work.

In the context of my course, I have taken these three points and used them to inform all the communication tasks in the lab, not just written tasks. But in this post I will focus on the written tasks.

Summary of my course

The primary characteristics of my Advanced Lab course are

  1. The term is divided roughly into thirds. The first third of the course is dedicated to developing skills[1] critical to (or at least useful for) the Advanced Lab through shorter experimental tasks. Each student spends the remainder of the term working in series on two research projects including mostly written dissemination of their results.
  2. Model the course after a physics research group with the students taking on the role of apprentice scientists (co-op students, graduate students, etc.) and me taking on the role of research supervisor. The entire class is treated as a larger research group and there are weekly group meetings (see this earlier post of mine for more of my thoughts on running a course in a way similar to how a research group functions).
  3. Each student research project is part of a larger ongoing research line. Future projects build upon previous projects instead of repeatedly replicating what previous students have done year after year.

I must tip my hat to Martin Madsen at Wabash College from whom I borrowed/stole/adopted characteristics 2 and 3. His course webpage can be found here and some of the research lines that he offers have seen three or more projects on the same line.

The experimental equipment that I use for each of the research lines typically have multiple well-defined experimental tasks, but also have plenty of room for students to explore some “intellectual phase-space” with novel experimental tasks being possible and plenty of freedom for future projects to build upon previous ones.

Connecting the dots between this paper and my course

My real communication scenario makes future students the tangible audience – The major light bulb that went on in my head thanks to this paper was the realization there was a lack of tangible audience in the student task of writing a journal article for the vague blob of “scientists in their field” or even for a local faux “journal of advanced lab.” In my experience, the journal article (typically in the style of American Journal of Physics) is one of the most common forms of formal writing that Advanced Lab instructors ask of their students, but with very very very few students actually publishing these papers to journals the audience is vague and intangible. Here is where Martin Madsen’s wonderful idea of research lines lends a great big hand to this communication scenario. Instead of the journal articles being written for these intangible audiences, the research line structure makes it so that the tangible audience for each project group is the future groups of students that will be working on their research line. For future students to build on the previous experimental work in a specific research line, they will necessarily have to use the journal articles written by previous students. Authentic audience = [✓].

Before coming across this paper, my plan was to have each student write a journal article for one of their experiments and a popular-science-level article (e.g., New Scientist or Scientific American) for the other. Students are partnered up, so for a given experiment one partner would write the journal article and the other the popular science article. What I really liked about this idea was that I thought that the popular science article would force them to focus on the big picture idea of why their results are interesting without getting so bogged down in the gory experimental and analysis details one would see in a journal article. I also thought it would allow the students to write drastically different papers based on the same shared experimental and background research experiences. Unfortunately, I do not have a target “tangible audience” for these popular-science-level articles.

My new primary writing tasks are a technical note and a wiki entry – Even with a now-tangible audience (future students), the journal article as a writing task is not completely authentic for my students. Once an initial group in a research line has written introduction and theory sections, future groups will simply be parroting that work. Keep in mind that the introduction section in a typical journal article is usually a brief overview of the field for the purposes of motivating that specific published work. Perhaps a new group would extend the theory and motivation a bit because of the specific way that their project goes beyond the original project in that experimental line, but they would still be mostly re-writing the previous work done for these sections by the original group.

The technical note – So let’s just strip that type of introduction and theory section right out of the journal article and re-frame the journal article as a technical (or tech) note. My personal concept of a technical note comes from my time as a particle physics Ph.D. student where our collaboration had written a hundred or so technical notes. Most of these tech notes were written for the purpose of disseminating to the rest of the collaboration (including future collaborators) the experimental results of a study, analysis, measurement or something similar. In these “experimental” tech notes there was no need to rehash the motivation or theory for the collaboration’s full experiment since that was already common knowledge shared within the collaboration. A brief introduction/motivation was still needed for the particular experimental task being discussed in the tech note. But for the most part, an equation was introduced in the form it was going to be needed without worrying too much about the underlying physics needed to get to that equation. Sweet! Stripping the traditional introduction and theory out saves my students the task of doing something which feels like make-work once a previous group has already done it.

Another point that Moskovitz and Kellogg make is that the students often don’t have the perspective, time or expertise to write up the introduction (background) as you would see it in a typical journal article. It is simply too large of a task for them to tackle properly.

The wiki entry – Here’s a question you may be asking yourself: “Is anybody ever going to write up the theory and background for a given research line?” The answer is yes and it will be in the form of a wiki entry. I have a wiki set up for this course and the second major writing task each student will do is create or greatly contribute to a wiki entry for their experimental line. One student in a group will work on the tech note while the other student works on the wiki entry.

The purpose of the main wiki page for a given experimental line is two-fold. First, it is meant to be a place that a curious student can go to read about the experimental line in sufficient detail to determine if a project in that line would interest them, and with enough detail provided that they could develop a decent picture of their potential experimental tasks and the underlying physics. The second purpose is to have a living document which represents the current and collective understanding of the theory, background and experimental equipment of those students who have collaborated on a given research line. This is a collaborative document for which each participating student can take great personal ownership.

I want to try to keep the idea of writing to a sort of popular-science article level for the wiki, and am going to get the students to write their wiki entries with the assumption that the reader has only our introductory physics courses under their belt. Our majors program has very little in terms of required courses, so it is never safe to assume that a given student will have taken a certain class before they walk into the Advanced Lab. With curious students being one of the target audiences, the writing needs to be very understandable and anything being discussed needs to start from fairly basic/common principles.

The writing task will be to contribute a certain number of words to the research line’s wiki entry(ies). I haven’t nailed it down yet, but I’m thinking of something in the neighbourhood of 1000-1500 words. A student would be asked to contribute something meaningful and complete to the wiki. I might suggest to the first student that he/she write an overview of the theory and background. And then based on the conceptual difficulties that I find that group having in the weekly group meetings, I could suggest that a future student flesh out the section(s) related to those conceptual difficulties.

Final thoughts

I really like how this paper helped me tweak my already planned writing tasks for my advanced lab course. With future students being an audience that will actually have to use the written dissemination of previous students to help them build on that work, I feel like I have identified a very tangible audience for students in this course. And by re-framing the journal article as a technical note and moving background and theory to the wiki, I have managed to make sure that the students always have something new and relevant to say.

I’m really not certain how the wiki entries will evolve, but I will get lots of feedback from the students to try to help me refine and improve the wiki writing task to make it feel as productive to them as possible.

Endnotes

  1. These skills include a re-introduction to statistics, error analysis and curve fitting; an introduction to LabVIEW programming; and learning how to use LaTeX.

Running courses in a way similar to how a research group functions

Bret Benesh recently wrote a wonderful post about setting up your courses/policies/actions to maximally respect students. I started writing a comment and it got long, as my comments often do. So I decided to post it here as well as leave it as a comment on Bret’s blog. I have turned off the comments here, so please wander on over to Bret’s post if you want to talk about it with me.

I have yet to move past bribing students to do what is good for them. I am trying to move past this starting with my upper-year courses and slowly bringing my most successful policies into my first-year courses. I love it when Bret says “Part of my job is to help them learn to make responsible decisions. This is impossible to do unless the students are given the opportunity to make actual decisions.” My bribery-type policies are mostly in place to help that bottom quartile of students because many of those students are the ones that in my experience have the most trouble making responsible decisions. They are the ones that, when given very flexible due dates, will simply put things off until the bitter end and then scramble (and usually fail) to get everything in at the last minute. Of course when I use rigid due dates they often don’t bother to turn stuff in at all, so the net effect is probably the same, but I feel much more guilty when I feel like I have put them in a position to fail because it feels to me like I set them up to have the mad scramble at the end.

My new ultimate goal (as of today) is to have my courses feel like a well-functioning research group where I am the supervisor and the students are in the role of grad/co-op/summer student. I am there to support them as little or as much as needed and as a “good” supervisor part of my job is to quickly figure out what level of support they will need to be successful. In this model the students would feel responsible to the entire group to be productive on a regular basis and that other people (not just me) depended on them in different ways so that they could do their own work as well. If, on occasion, a grad student hasn’t done enough work in the past week to present something worthwhile at a research group meeting, the group moves on and the supervisor says something along the lines of “ok, we’ll look at that piece next week instead.” I would like my classes to look like that as well. It’s OK to miss arbitrary deadlines now and then, but the sense of responsibility to the greater group results in most students staying on top of things.

To do this requires some specific structuring of courses in a way such that the student is in fact responsible to the greater group with their weekly work instead of just to me. Some thoughts on how to do this include:

  • Students taking turns presenting or even better running some sort of learning activity on topics that I would normally be in charge of. One thing I have never tried, but just occurred to me, is to have a student be in charge of running the show for a sequence of clicker questions. Hmm…
  • Having in-class small-group activities where each student in the group is responsible for doing some different piece of pre-class preparation so that each student comes into the activity with much different types of expertise and thus each student’s level of preparation is important to their small group. If a student knows that they won’t be able to adequately prepare for their piece they can negotiate to take more responsibility for a future activity in exchange for the group covering for them on the current one.

This idea of running a course like a research group is not something new that came to me today. It is how I plan to run my Advanced Lab course (the name commonly given to the standalone upper-year physics laboratory courses) in January and I have a blog post on it simmering in the background. But there the students are engaged in a much more research-like experience so it only occurred to me today that you could take some of those elements and bring them over to a regular course.

If you want to chat with me about this, head on over to Bret’s post (link at the top of this page).

The DAQ-ness Monster

USB-6009 (left) and Arduino (right). They are both about the size of the palm of your hand.

I spent a nice chunk of time last week playing with different combinations of hardware and software to look at Data Acquisition (DAQ) options for my Advanced Laboratory course (the name given to standalone upper-division physics laboratory courses). I will first talk about the two main hardware devices I was trying out (National Instruments USB-6009 Multifunction DAQ and an Arduino Uno microcontroller) and then look at some combinations of software and hardware that I played with, or at least wanted to play with.

In the end using one of the National Instruments USB DAQ devices with LabVIEW is probably the easiest thing to do thanks to the variety of ways that LabVIEW allows you to quickly create DAQ software on the computer side. But thanks to the huge number of software options that you have to talk to the Arduino and how cheap it is, it is very reasonable to use an Arduino as your general purpose DAQ device.

An example DAQ task

In case you are reading along at home but not too sure what I’m talking about, I will give you an example DAQ task that uses some of the functionality discussed in this post. Let’s say you want to do an (admittedly boring) experiment to figure out which colour of coffee cup keeps your coffee warm the longest. This is a tedious experiment because it takes a long time for the coffee to cool, you want to do multiple runs for each coffee cup, and you don’t want to just sit there heating the coffee up over and over again. So you build a circuit with a little heater that can be turned on and off by sending a digital signal to it. You have a thermocouple with which you will measure temperature. And you will have the computer record your data every 10 seconds and write your data sets to file.

So you hook your thermocouple up to a thermocouple amp and that output goes to one of your analog input channels. One of your digital output channels is used to turn the little heater on and off. And you will have some software that does a few different things:

  • Requests the temperature of the coffee every 10 seconds from the thermocouple (amp);
  • Once the coffee gets down to 50 degrees, it turns on the heater until the coffee is back at 90 degrees;
  • It writes all the collected data to a file.

So to run this experiment you just stick the thermocouple and heater in the coffee cup, press go and wander off for a while. You come back every once in a while to make sure everything is running smoothly and to change to a different coffee cup once you feel you have enough runs. With fairly simple software running on the computer, all the hardware/software DAQ combos I mention below will work just fine.
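As a rough illustration of the control logic described above (this is my own sketch, not code from the course), here is a minimal Python version in which the hardware calls are replaced by a toy cooling/heating model, so the hysteresis logic can be run without any DAQ device:

```python
# Hypothetical sketch of the heater control logic: the real hardware reads
# (thermocouple amp) and writes (digital-out to the heater) are replaced by
# a toy model so the logic runs standalone.

def heater_command(temp_c, heater_on, low=50.0, high=90.0):
    """Hysteresis rule: turn the heater on at `low`, off again at `high`."""
    if temp_c <= low:
        return True
    if temp_c >= high:
        return False
    return heater_on  # between the thresholds, keep the current state

def simulate(start_temp=90.0, steps=200, dt=10.0):
    """Toy model: coffee loses 0.5 C per 10 s step; the heater adds 2 C per step.
    Each logged tuple (time_s, temp_c, heater_on) stands in for a line of the
    data file the real software would write."""
    temp, heater_on, log = start_temp, False, []
    for step in range(steps):
        heater_on = heater_command(temp, heater_on)
        temp += 2.0 if heater_on else -0.5
        log.append((step * dt, temp, heater_on))
    return log

# Running the toy model: the logged temperatures stay within the 50-90 C band.
log = simulate()
```

With real hardware, `simulate` would instead poll the analog input every 10 seconds and push the heater command out the digital line; the hysteresis rule itself is unchanged.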

National Instruments USB-6009 Multifunction DAQ

This (and its cheaper cousin the USB-6008) seem like they have been the standard low-cost USB DAQ devices for a little while, and are used most often with LabVIEW. The analog inputs are 14-bit and 48k samples/second. It has 5V out and a bunch of digital Input/Output channels.

Arduino Uno Microcontroller

Arduino Uno with one hooked up analog input channel (potentiometer) and three hooked up digital out channels (three LEDs).

Arduinos are teeny computers with a bunch of analog and digital Input/Output channels. You write programs for them and send them via USB to the device.  With the appropriate program (firmware) you can have your Arduino duplicate the functionality of the USB-600x, but with much less impressive resolution and sample rate. The Arduino analog inputs are only 10-bit and my best estimate of the sample rate is roughly 100 samples per second (edit: you should be able to do better than this if you temporarily store data in Arduino’s memory and then send it to the computer in chunks). Keep in mind that the Arduinos only cost $30 compared to >$200 for the USB-600x (and in my example DAQ task I only needed 1 sample every 10 seconds). That resolution and sample rate are also more than adequate for monitoring and controlling a lot of experiments. You can also use the Arduino as a remote data logger and controller with the help of wireless/bluetooth chips, SD card readers and all sorts of other fun things.
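To put those resolution numbers in perspective, here is a quick back-of-the-envelope comparison in Python, assuming a 0-5 V input range for both devices (an assumption for illustration; the USB-600x input ranges are configurable):

```python
# Compare the smallest voltage step resolvable by a 10-bit ADC (Arduino)
# and a 14-bit ADC (USB-6009), assuming the same 0-5 V range for both.

def adc_step_volts(bits, vref=5.0):
    """Smallest voltage change one ADC count represents."""
    return vref / (2 ** bits)

def counts_to_volts(counts, bits, vref=5.0):
    """Convert a raw ADC reading to volts (counts run 0 .. 2**bits - 1)."""
    return counts * vref / (2 ** bits - 1)

print(adc_step_volts(10))  # Arduino: about 4.9 mV per count
print(adc_step_volts(14))  # USB-6009: about 0.3 mV per count
```

A 10-bit reading is plenty for watching coffee cool by tens of degrees, but that factor of 16 in voltage resolution is where the USB-600x earns its price difference.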

The hardware/software DAQ combos

National Instruments USB-6009 and LabVIEW

This combination seems to be a very common DAQ solution in recent times. I think the standard is about to become LabVIEW + NI MyDAQ, where the MyDAQ is also a multifunction USB DAQ device. It has fewer channels than a USB-6009 but has 16-bit analog channels, 200k samples/second and an output voltage of 15V as opposed to the 5V of the USB-6009. It also has some neat ready-to-go functionality as a digital multimeter and only costs about $200. I think that my pros and cons below for the USB-6009 + LabVIEW combination also apply equally well to the MyDAQ + LabVIEW combination.

Pros:

  • LabVIEW has a lot of built-in and quick-to-get-up-and-running functionality. Graphing, writing to file, signal processing and Input/Output tasks are all quite easy to do once you learn how to do them. As I mention in the cons, you still have to get used to the visual programming environment which is a huge shift in paradigm for many.
  • LabVIEW has a DAQ assistant that helps you build DAQ software quickly. They (National Instruments) also have a program called LabVIEW SignalExpress that is meant to let you set up DAQ (and control, like turning on the heater) tasks without having to do any actual LabVIEW programming. I have not actually tried it, but it comes bundled with a $60 suite version of the student edition of LabVIEW.

Cons:

  • LabVIEW is crazy expensive, but the student edition of LabVIEW is only $20 (or $60 for the suite) so if you can get the students to buy their own copies to put on their laptops the cost becomes pretty negligible.
  • The USB devices (USB-6008/9 or MyDAQ) are quite reasonably priced ($200-$300), but not as cheap as a $30 Arduino.
  • LabVIEW has a steepish learning curve. It takes a few weeks to become fluent using it and I still can’t remember how to do half the things I want to do (I am probably at the “few weeks” mark). Visual programming (in this case LabVIEW) is quite a different paradigm than text-based programming.

Arduino and free software

Let’s assume for this discussion that you want to duplicate the hardware functionality of the National Instruments multifunction DAQ devices with the Arduino instead of programming the DAQ task directly onto the Arduino. To do this you write your own Arduino sketch (the name for a program/firmware that you write to the Arduino’s memory) that can read the Arduino’s analog line-ins and send out a digital output signal when told to. There are some existing firmwares (is that the plural of firmware?) such as Firmata (which comes bundled with the Arduino software) and a firmware that comes with the Python Arduino Prototyping API. You can then use Python (for either firmware) or Processing (for Firmata) to collect data and send control signals to the Arduino.

I did not have great luck getting a DAQ task up and running using Python and Firmata, but was delightfully successful doing so using the Python Arduino Prototyping API. With a little vPython in tow I’m sure you could figure out some pretty fun stuff to do with the data coming in through the Arduino.

I don’t know too much about Processing, but it is a computer language and it seems to be quite easy to use it to make graphs based on the examples I played with that come included with the Arduino software. There are some libraries that (I understand) make it very easy to talk to an Arduino with the Firmata firmware using Processing.
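As a hedged sketch of the roll-your-own approach (this is my own invented example, not the Firmata or Python Arduino Prototyping API protocol): a homemade Arduino sketch could print one “channel:counts” line per reading over serial, and the Python side just parses those lines. The serial calls are left as comments since they need pyserial and attached hardware; the parsing itself is plain Python.

```python
# Hypothetical sketch: parse "channel:counts" lines (e.g. "A0:512") that a
# homemade Arduino sketch might print over serial. The line format, channel
# names, and 5 V reference are my own assumptions, not any real firmware.

def parse_reading(line, vref=5.0, bits=10):
    """Turn a line like 'A0:512' into (channel, volts)."""
    channel, counts = line.strip().split(":")
    return channel, int(counts) * vref / (2 ** bits - 1)

# With real hardware you would wrap this in something like (requires pyserial):
#   import serial
#   with serial.Serial("/dev/ttyACM0", 9600, timeout=1) as port:
#       channel, volts = parse_reading(port.readline().decode())

channel, volts = parse_reading("A0:512")
```

The appeal of this route is that the Python side stays ordinary text processing, so graphing, logging to file, or sending a control signal back down the serial line are all just more lines of Python.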

Pros:

  • This is the cheapest solution. Python and Processing are both free and the Arduino is $30.

Cons:

  • For my example DAQ task, the sample rate and resolution of the Arduino is more than adequate, but the National Instruments USB DAQ devices are really much higher performance which might be needed for experiments more exciting than watching coffee cool.

Unsure:

  • I didn’t do more than try to read something in and send out a signal with the Python Arduino Prototyping API, so live-graphing the data (easy in LabVIEW) may be simple to get done using matplotlib or it may be quite tedious. I’m really not too sure.
  • Similarly for Processing I only graphed some data coming in and sent out some signals (to turn on LEDs), but since it is a programming language it should be very straightforward to do things like write data to file or tell the Arduino to turn on the heater once the temperature being read drops below some certain value.

Note: there are a bunch of other software interfacing options for Arduino but I just mention the ones that seemed easiest for me. Mathematica (which is not free) is discussed below.

Arduino and LabVIEW

My version of the Dancing LED example included with the LabVIEW Interface for Arduino.

Early in May this year the LabVIEW Interface for Arduino was released. I found it easy to install everything needed and the examples that I tried worked on my first try. It comes with a custom firmware that you put on the Arduino and then you can get input and output signals just like you would with the National Instruments multifunction DAQ devices.

Pros:

  • Easy to install everything needed.
  • Cheap if you are using the student edition of LabVIEW.
  • There’s a nice tutorial and lots of examples.

Cons:

  • You can’t build tasks for interfacing with the Arduino using tools like LabVIEW SignalExpress or the DAQ Assistant, but you can build your own VIs, and the included examples serve as good templates.

Arduino and Mathematica

For those who are already familiar with Mathematica (and perhaps not with some of the software solutions above), you just need the M$ .NET framework installed and then you can chat with the Arduino via serial I/O commands from within Mathematica. I’m not sure whether you can get Mathematica to play nicely with a pre-existing firmware like Firmata. I haven’t used Mathematica much since early in grad school, so I am very much a novice and can pretty much only modify existing code. I did have some trouble getting this whole thing up and running even though Andy Rundquist basically gave me all the code I needed, which was probably mostly due to my novice skill level with Mathematica.

Pros:

  • Mathematica is very good at displaying data (and then doing fun stuff with it after).
  • For those already familiar with Mathematica, the learning curve is minimal.

Cons:

  • For this purpose there are much better (and cheaper) software options out there.

Other interesting stuff

  • Vernier SensorDAQ: This costs about the same as a USB-6009 and has similar performance, but fewer analog and digital I/O channels. What it has instead is three analog connections and one digital connection for Vernier sensors and probes. I am interested in getting one of these to play around with.
  • LabPro toolkit for LabVIEW: Vernier probes usually connect to the computer via a LabPro, which has a bunch of connections for the probes and interfaces with the computer via USB. The computer runs Vernier LoggerPro to read the probes and record the data. The LabPro toolkit lets you use your probes, attached to your LabPro, directly in LabVIEW. I couldn’t get all of this to work, but I didn’t really try that hard either.
  • Pasco ScienceWorkshop Probeware and LabVIEW: This seems to work only with the older ScienceWorkshop probes and not the newer PASPORT probes. There is an adapter, but I don’t know if that will then let the PASPORT probes work with LabVIEW.
  • NXT Sensor Adapter: This lets you use Vernier analog probes with Lego Mindstorms. The first thing that comes to mind is using a force plate as the gas pedal for a Mindstorms RC car. That would be very fun.