Learning Before Class Strategies Part 1: Types of Assignments

This is part 1 of a 3-part series on pre-class learning strategies, which can be used as part of a flipped or inverted class. I have discussed in the past why I implement pre-class learning strategies, and in these posts I want to focus instead on the how.

  • Part 1 (this one) focuses on some common types of assignments/assessments that you can use.
  • Part 2 focuses on some different ways that you can provide your students with the resources that they will be using to do some learning before coming to class.
  • Part 2.5 is a continuation of part 2 and discusses the types of video lectures in a bit more detail.
  • Part 3 (eventually) will discuss some tips and some issues that I have come across trying to implement learning before class strategies.

Common pre-class assignments

These are types of pre-class assignments that I have tried out in my own classes or have heard of being in use out in the wild. Part 2 will go into more detail on the learning resources you can provide to your students, but most of these assignment types will work well with commonly used “flipped class” resources such as textbook readings, pencasts, screencasts and multimedia video.

    1. Reading Quiz – Reading quizzes are usually administered at the beginning of class and marked for correctness. I have used clickers, bubble sheets and index cards to administer them. The problem with reading quizzes is that they have to be reasonably easy and target recall or very basic understanding, so that students who put in an honest effort to do the reading will get most of the questions correct. As a result, it is hard to come up with questions that aren’t reasonably easy to guess. The administration through index cards refers to the times that I have asked students to draw or explain something in a way that shows their understanding, but the overall performance of the class on these types of questions has usually ended up being low. I moved on from reading quizzes after just one term of using them.
    2. Guided Reading Quiz – Instead of asking the students to read and try to figure out what are the most important ideas from their pre-class resources, you supply them with a set of questions to guide their initial learning, and then a small handful of those questions are used for the start-of-class reading quiz. I have not tried this in my own courses, but I came across it in a paper on the effectiveness of peer instruction in computing by my friend and former CWSEI STLF Beth Simon. If I were teaching a giant section of a course, I think that I might use this method. If the students try to take a shortcut and just learn the answers to the questions, they still end up touching base with the most important points from the pre-class resources (assuming well-constructed questions), which means the assignment will still have accomplished its purpose. Haha, gotcha shortcut-takers.
    3. Pre-Class Online Participation Questions – This is my generic name for the type of pre-class assignments used in Just-in-Time Teaching. Students are asked to answer questions online (through your CMS, online homework system or plain old email) at least a couple of hours before class, giving you time to review their answers and modify your lesson plan or seed your class notes with their words and questions. These questions are not marked for correctness, but are instead marked for completion, usually based on answers that show that the student put in some reasonable level of effort to learn the material. I get the most mileage out of asking them to answer relatively easy conceptual multiple-choice questions and asking them to explain their answers. Easy is a very relative term, and I am usually happy if anywhere from 50–100% of the students are able to get the question correct after consuming the pre-class resources, but it is most important to me that I can see from their explanations that they had to mash the ideas around in their heads a bit before being able to answer the question. For these questions I usually just pull a clicker question straight out of my notes. Other types of questions that work well for this type of assignment are estimation questions and short calculations (I will be discussing when these go poorly in Part 3). As I have discussed in a previous post, this is the style of pre-class assignment I currently use for all my courses (intro, upper-year and labs) other than my project-based upper-year lab course. One of my main open questions with this type of assignment (to be discussed more in Part 3) is how best to provide feedback to the students.
    4. Student-Generated Questions – (added an hour after the original post went live) I can’t believe I forgot to include this. As part of the pre-class online participation question assignments, I usually provide an extra box on the web form where students are encouraged to ask any questions that they have regarding the content in question. If there are some common themes to these questions, I bring them up in class. For more isolated questions that won’t be addressed in class, I usually respond to that student’s question via email. This year in my 3rd-year quantum class I had the students generate some questions after reading an 8-page excerpt on the postulates of quantum mechanics and some compare/contrast points between classical and quantum mechanics. They generated fantastic questions, and we spent a whole period going through these questions and tying the ideas from them to each other and to their previous courses. The best part was that if I had prepared a lecture to discuss those exact same ideas from their questions without having had them first generate the questions, they would have been nowhere near as invested in what I had to say. It really personalized the whole thing. I want to try this type of thing again in the future.
    5. Summaries – Get them to write a paragraph or three summarizing the main ideas or their understanding of the pre-class resources.
    6. Reflective Writing – The purpose of the student writing in this type of assignment is for them to focus on the ideas that they are having trouble understanding and to highlight or summarize those ideas through their writing. This type of assignment is marked for completion and evidence that they were writing for their own understanding, but is not marked for correctness. Calvin Kalman is a proponent of this type of writing to learn strategy.
If you have tried other types of assignments for learning before class, let me know about them. Part 2 coming soon.

The Science Learnification (Almost) Weekly – June 19, 2011

This is a collection of things that tickled my science education fancy in the past couple of weeks or so.

Reflections on Standards-Based Grading

Lots of end-of-year reflections from SBG implementers

  • SBG with voice revisions – Andy Rundquist only accepts (re)assessments where he can hear the student’s voice. When they hand in a problem solution, it basically has to be a screencast or pencast (livescribe pen) submission. The post is his reflections on what worked, what didn’t and what to do next time.
  • Standards-Based Feedback and SBG Reflections – Bret Benesh has two SBG posts, one after the other. I was especially fond of the one on Standards-Based Feedback, where he proposes that students would not receive standards-based grades throughout the term but would instead produce a portfolio of their work which best showed their mastery of each standard. This one got my mind racing and my fingers typing.
  • A Small Tweak and a Feedback Inequality – Dan Anderson posts about providing feedback-only on the first assessment, in nerd form: Feedback > Feedback + Grade > Grade. This is his take on the same issue which led Bret Benesh to thinking about Standards-Based Feedback: when both a grade and feedback are provided, the students focus all their attention on the grade. He also has a neat system of calculating the final score for an assessment.
  • Reflections on SBG – Roger Wistar (computer science teacher) discusses his SBG journey and the good and bad of his experience so far.

Modeling

Flipped classrooms and screencasting

Peer Instruction

  • Why should I use peer instruction in my class? – Peter uses a study on student (non)learning from video by the Kansas State Physics Education Research Group to help answer this question. The short answer is “Because they give the students and you the ability to assess the current level of understanding of the concepts. Current, right now, before it’s too late and the house of cards you’re so carefully building comes crashing down.”

The tale of sciencegeekgirl’s career

Getting them to do stuff they are interested in

John Burk gets busy


Rambling thoughts on flipping the physics classroom

I seem to have some sort of knack for writing comments that are longer than my original post ever was. Simon Bates commented on my last post about possibly flipping a couple of courses at his own institution, and I started to write a long comment on some extra things to consider, which I may have discussed had I written a post about flipping my courses in general as opposed to a post specifically about flipping a third-year Quantum Mechanics course. Here is what I was writing as a reply to Simon, massaged instead into a post.

SmartPhysics as an alternative to making my own screencasts for intro Physics

For those teaching intro physics who are more interested in screencasting/pre-class multimedia video presentations instead of pre-class reading assignments, you might wish to take a look at SmartPhysics. It’s a package developed by the PER group at UIUC that consists of online homework, online pre-class multimedia presentations and a shorter-than-usual textbook (read: cheaper than usual) because there are no end-of-chapter questions in the book, and the book’s presentation is geared more toward being a student reference since the multimedia presentations take care of the “first time encountering a topic” level of exposition. My understanding is that they paid great attention to Mayer’s research on minimizing cognitive load during multimedia presentations. I will be using SmartPhysics for the first time this coming fall and will certainly write a post about my experience once I’m up and running.

Level of student participation in pre-lecture learning

I have found that student participation on the pre-class reading assignments with introductory physics students (no matter how many marks I dangle in front of them) is at best the same as student homework completion percentages. In my case this is around 80%, and I have heard similar numbers from others. The thing that I have found the most challenging in using pre-class reading assignments is resisting the urge to quickly “catch up” the 20% that didn’t complete the pre-class assignment. In the end, this just reinforces their behaviour and makes the whole process of flipping my class somewhat redundant. Since my class time is mostly driven by clicker questions, it seems that the reluctant 20% end up building a bit of an understanding of the topic at hand through peer discussion. Of course, the students in that 20% tend to clump themselves together physically in the classroom, making things even more challenging for themselves.

Getting started on screencasting

In terms of resources to help get you up and running doing actual screencasts, some folks in my PLN who have posted about their experiences include: Mylene at Shifting Phases (in her posts you will find a great conversation that we had about screencasting vs. reading assignments), Andy Rundquist at SuperFly Physics, and Robert Talbert at Casting Out Nines.


The Science Learnification Weekly (March 27, 2011)

This is a collection of things that tickled my science education fancy in the past couple of weeks or so (I missed posting a Weekly last week). They tend to be clumped together in themes because an interesting thing on the internet tends to lead to more interesting things.

Getting students to see the value of keeping a good experimental logbook/science notebook

Introducing Science Notebooks is a nice little post over at the Students Talk Science Blog discussing an activity from the Student-Generated Scientific Inquiry project at CSU-Chico. The activity has students studying notebook pages from 6 famous scientists and looking at the commonalities. Based on their observations, the students generate through class-wide discussion a list of what they think science notebooks are for. They are then told that they should model their own notebooks after those of the scientists and that the marking rubric that will be used for their notebooks will be generated from their list of the “kinds of things [that] are reasonable to expect from yourselves in creating a quality notebook.”

I love it! I had my 2nd and 3rd year lab students keep logbooks for their first time last year. I gave them some pretty standard reasons why keeping a proper logbook was good practice, but in the end I’m pretty sure it just looked to them like I was stomping around and demanding that they do things the way I wished them to. This student-generated list of criteria is far superior to what I did. I can’t wait to try it out next time I teach a 200+ level lab course (winter 2012?). I also wonder if you could develop a similar activity for developing rubrics for marking papers, homework solutions and other student work?

Thinking about moving beyond confront-elicit-resolve and other interesting things that followed from discussions regarding the Khan Academy

Haha. That’s my longest heading yet. This little section brings together my recent efforts to learn more about the Investigative Science Learning Environment (ISLE) curriculum and some of the Khan Academy discussions near my local region of the blogosphere. The recent posts of interest are listed here, and discussed in context later:

The ISLE curriculum was developed by the PAER group at Rutgers. They have a very informative 48-page paper that discusses their curriculum in detail and also points to the research that informed their design of this curriculum. To quote them:

[ISLE] helps students learn physics by engaging in processes that mirror the activities of physicists when they construct and apply knowledge. These processes involve observing, finding patterns, building and testing explanations of the patterns, and using multiple representations to reason about physical phenomena.

In a section on “how do people learn and change conceptions?” they point out that the confront-elicit-resolve model of changing student conceptions has traditionally been the basis of instructional strategies promoted by Physics Education Researchers. They point to work by Redish and Hammer, which is now an excellent paper [1]. In that paper, Hammer and Redish talk about getting students to use their existing understanding of the world to build up to the correct conceptual understanding instead of trying to change student conceptions by confronting them head on. In the ISLE curriculum, they use a lot of observational experiments, and they side-step the cognitive-conflict issue by not asking the students to predict the outcome of these experiments. Instead they get students to generate conceptual understanding by looking at the patterns from these experiments and then designing further experiments to refine the concepts. I think I have a longer post in me to discuss moving beyond confront-elicit-resolve further, so I will just leave it there for now.

This leads to Frank Noschese’s Action-Reaction post: “Khan Academy and the Effectiveness of Science Videos”. In this post he discusses a video made by Derek Muller (a.k.a. Veritasium) which discusses, based on Muller’s own PhD research, why the Khan videos might not be effective at teaching scientific concepts. Muller goes all meta on us and produces a video to discuss this issue, and the video is somewhat in the style of Khan’s videos. The short story is that misconceptions need to be specifically addressed in order for conceptual change to occur, or the video lesson will simply be misremembered in a way that makes the information being presented consistent with the viewer’s previous misconceptions. This is basically the same as the discussion of why demos don’t always work. Muller’s own videos include explicit discussions of misconceptions as part of the lesson. So when I was reading the ISLE paper above, I was thinking that what Muller is talking about is basically confront-elicit-resolve. But that’s not really the case. In Muller’s guest Action-Reaction post “What Puts the Pseudo in Pseudoteaching?” he discusses how to make multimedia presentations more effective, and one point he makes, as part of “using previous conceptions as footholds,” is also part of the strategy used by Redish and Hammer (based on earlier work by Elby [2]):

The misconception should be discussed for its own merits – why is this idea so common? In what ways does it correctly reflect observations of the world? In what specific ways does it lead to inaccurate reasoning?

OK, now I have it a bit clearer in my mind how this all fits together.

Onto another piece: could students learn more if Khan made mistakes? In this post, John Burk (Quantum Progress blog) wonders if students would learn more from videos that have purposeful mistakes in them, and for which the student is asked to find the mistake. You can find in the comments from that post a discussion of using the “find the mistake” strategy in written form as well. This has me cooking up a study to look at this very idea.

Misconceptions Misconceived: The example of current: And while we are on the topic of misconceptions, Brian Frank has a very lengthy post/rant on his Teach Brian Teach blog discussing the common “current gets used up” student misconception. Lots of good stuff in there, but my favorite is:

When a student balks at the idea that “current is conserved”, I think, “Great, it doesn’t make sense, does it?” When students don’t want to accept that answer, I see them as being committed to sensibility (not to authority).

I’m MORE worried about the students who quickly accept the “right” answers. I really am.

The effect of Twitter on college student engagement and grades

John Burk posts about a study [3] where they found that the experimental group (lotsa twitter) had higher engagement and higher overall GPAs than the control group (no twitter). John was kind enough to extract their uses of twitter from the study, and many of these are great uses that we could all bring into our own classroom.

References

[1] E.F. Redish and D. Hammer, Reinventing college physics for biologists: Explicating an epistemological curriculum, Am. J. Phys. 77, 629 (2009). [pdf]

[2] A. Elby, Helping physics students learn how to learn, Am. J. Phys. 69, S54 (2001). [preprint]

[3] R. Junco, G. Heiberger and E. Loken, The effect of Twitter on college student engagement and grades, J. Comput. Assist. Learn. 27, 119–132 (2011). doi:10.1111/j.1365-2729.2010.00387.x [pdf]


The Science Learnification Weekly (March 6, 2011)

This is a collection of things that tickled my science education fancy in the past week or so. Some of these things may turn out to be seeds for future posts.

Screencasting in education

Last week I posted links to a couple of posts on screencasting as part of a collection of posts on flipped/inverted classrooms in higher education. Well this week I’m going to post some more on just screencasting.

  • I mentioned this last week, but Robert Talbert has started a series of posts on how he makes screencasts.
    • In the first post, he is kind enough to spell out exactly what a screencast is, among other things.
    • In the second post, he talks about the planning phase of preparing a screencast.
  • Roger Freedman goes all meta on us and posts a screencast about using screencasts (well, it’s actually about his flipped a.k.a. inverted classroom and how he uses screencasts as part of that).
  • Andy Rundquist talks about using screencasting to grade and provide feedback. He also gets his students to submit homework problems or SBG assessments via screencast. He has a ton of other posts on how he uses screencasting in the classroom.
  • It’s unofficially official that #scast is the hashtag of choice for talking about screencasting on twitter.
  • Added March 10: Mylene from the Shifting Phases blog talks about some of the nuts and bolts of preparing her screencasts including pointing out how the improved lesson planning helps her remember to discuss all the important little points.

I taught a 3rd-year quantum mechanics course last year and encouraged the students, using a very small bonus-marks bribe, to read the text before coming to class. I think that, due to the dense nature of the material, their preparation time would be much more productive and enjoyable if I created screencasts for the basic concepts and then had a chance to work on derivations, examples and synthesis in class. With the reading assignments they were forced to try to deal with basic concepts, synthesis, derivations and examples on their own, which was asking quite a lot for their first contact with all those new ideas. I’m pretty interested to try out screencasting.

Arduino

I have been scheming for a while to bring the Arduino microcontroller (a.k.a. very small open-source computer) into my electronics courses, starting with a 2nd-year lab. Arduino is a favorite of home hobbyists and the greater maker community.


Standards-based grading in higher education

Standards-based grading (SBG) is a pretty hot topic in the blogosphere (SBG gala #5) and on twitter (#sbar). There’s a nice short overview of standards-based grading at the chalk|dust blog.

I am very fond of the idea of basing a student’s grade on what they can do by the end of the course instead of penalizing them for what they can’t do along the way when they are still in the process of learning. I also love the huge potential to side-step test anxiety and cramming.

Folks using this grading scheme/philosophy (a.k.a. the SBG borg) are mostly found at the high-school level, but there are some folks in higher ed implementing it as well. I am strongly considering trying out SBG in one of my future upper-division courses, such as Quantum Mechanics, but there are some implementation issues that I want to resolve before I completely sell myself on trying it out. I am in the middle of writing a post about these issues and look forward to discussing them with those that are interested.

Special thanks go to Jason Buell from the Always Formative Blog for bringing most of these higher ed SBG folks to my attention. He has a great bunch of posts on SBG implementation that fork out from this main page.

SBG implementations in higher ed:

  • Andy Rundquist is using collaborative oral assessments as part of his SBG implementation. This is the only higher-ed Physics implementation that I have encountered so far, and I have been chatting Andy up a ton about what he is up to in his first implementation.
  • Adam Glesser from the GL(s,R) blog has tons of SBG posts: he is in his first year of a full SBG implementation in his Calculus courses. He gets bonus points for being a boardgame geek and playing Zelda with his 4-year-old son.
  • Sue VanHattum talks about wading into the water as she slowly moves into SBG implementations by way of a mastery learning implementation. Search her blog for other SBG posts.
  • Bret Benesh comes up with a new grading system for his math courses with help from the SBG borg.
  • Added March 10: Mylene teaches technical school electronics courses and replaces the achievement levels for each standard with a system where the standards build on each other and are assessed using the binary yup or nope system.

That’s it for this week. Enjoy the interwebs everybody.
