The Science Learnification Weekly (March 27, 2011)

This is a collection of things that tickled my science education fancy over the past couple of weeks (I missed posting a Weekly last week). They tend to clump together in themes, because an interesting thing on the internet tends to lead to more interesting things.

Getting students to see the value of keeping a good experimental logbook/science notebook

Introducing Science Notebooks is a nice little post over at the Students Talk Science Blog discussing an activity from the Student-Generated Scientific Inquiry project at CSU-Chico. The activity has students study notebook pages from 6 famous scientists and look for commonalities. Based on their observations, the students generate, through class-wide discussion, a list of what they think science notebooks are for. They are then told that they should model their own notebooks after those of the scientists, and that the marking rubric for their notebooks will be generated from their list of the “kinds of things [that] are reasonable to expect from yourselves in creating a quality notebook.”

I love it! I had my 2nd- and 3rd-year lab students keep logbooks for the first time last year. I gave them some pretty standard reasons why keeping a proper logbook is good practice, but in the end I’m pretty sure it just looked to them like I was stomping around and demanding that they do things the way I wished. This student-generated list of criteria is far superior to what I did. I can’t wait to try it out the next time I teach a 200+ level lab course (winter 2012?). I also wonder whether a similar activity could be used to develop rubrics for marking papers, homework solutions, and other student work.

Thinking about moving beyond confront-elicit-resolve and other interesting things that followed from discussions regarding the Khan Academy

Haha. That’s my longest heading yet. This little section brings together my recent efforts to learn more about the Investigative Science Learning Environment (ISLE) curriculum and some of the Khan Academy discussions in my local region of the blogosphere. The relevant posts are discussed in context below.

The ISLE curriculum was developed by the PAER group at Rutgers. They have a very informative 48-page paper that discusses their curriculum in detail and points to the research that informed its design. To quote them:

[ISLE] helps students learn physics by engaging in processes that mirror the activities of physicists when they construct and apply knowledge. These processes involve observing, finding patterns, building and testing explanations of the patterns, and using multiple representations to reason about physical phenomena.

In a section on “how do people learn and change conceptions?” they point out that the confront-elicit-resolve model of changing student conceptions has traditionally been the basis of instructional strategies promoted by physics education researchers. They point to work by Redish and Hammer, now published as an excellent paper [1]. In that paper, Hammer and Redish talk about getting students to use their existing understanding of the world to build up to the correct conceptual understanding, instead of trying to change student conceptions by confronting them head-on. The ISLE curriculum uses a lot of observational experiments, and it side-steps the cognitive-conflict issue by not asking the students to predict the outcomes of these experiments. Instead, students generate conceptual understanding by looking at the patterns in these experiments and then designing further experiments to refine the concepts. I think I have a longer post in me on moving beyond confront-elicit-resolve, so I will leave it there for now.

This leads to Frank Noschese’s Action-Reaction post “Khan Academy and the Effectiveness of Science Videos”. In it he discusses a video by Derek Muller (a.k.a. Veritasium) which explains, based on Muller’s own PhD research, why the Khan videos might not be effective at teaching scientific concepts. Muller goes all meta on us and produces a video, somewhat in the style of Khan’s videos, to discuss this issue. The short story is that misconceptions need to be specifically addressed for conceptual change to occur; otherwise the video lesson will simply be misremembered in a way that makes the information presented consistent with the viewer’s previous misconceptions. This is basically the same as the discussion of why demos don’t always work. Muller’s own videos include explicit discussions of misconceptions as part of the lesson. So when I was reading the ISLE paper above, I was thinking that what Muller is talking about is basically confront-elicit-resolve. But that’s not really the case. In Muller’s guest Action-Reaction post “What Puts the Pseudo in Pseudoteaching?” he discusses how to make multimedia presentations more effective, and one point he makes, as part of “using previous conceptions as footholds”, is also part of the strategy used by Redish and Hammer (based on earlier work by Elby [2]):

The misconception should be discussed for its own merits – why is this idea so common? In what ways does it correctly reflect observations of the world? In what specific ways does it lead to inaccurate reasoning?

OK, now I have it a bit clearer in my mind how this all fits together.

Onto another piece: could students learn more if Khan made mistakes? In this post, John Burk (Quantum Progress blog) wonders if students would learn more from videos that have purposeful mistakes in them, and for which the student is asked to find the mistake. In the comments on that post you can find discussion of using the “find the mistake” strategy in written form as well. This has me cooking up a study to look at this very idea.

Misconceptions Misconceived: The example of current: While we are on the topic of misconceptions, Brian Frank has a very lengthy post/rant on his Teach Brian Teach blog discussing the common “current gets used up” student misconception. There is lots of good stuff in there, but my favorite is:

When a student balks at the idea that “current is conserved”, I think, “Great, it doesn’t make sense, does it?” When students don’t want to accept that answer, I see them as being committed to sensibility (not to authority).

I’m MORE worried about the students who quickly accept the “right” answers. I really am.

The effect of Twitter on college student engagement and grades

John Burk posts about a study [3] which found that the experimental group (lotsa Twitter) had higher engagement and higher overall GPAs than the control group (no Twitter). John was kind enough to extract the study’s uses of Twitter, and many of them are great uses that we could all bring into our own classrooms.

References

[1] E. F. Redish and D. Hammer, “Reinventing college physics for biologists: Explicating an epistemological curriculum,” Am. J. Phys. 77, 629 (2009). [pdf]

[2] A. Elby, “Helping physics students learn how to learn,” Am. J. Phys. 69, S54 (2001). [preprint]

[3] R. Junco, G. Heiberger, and E. Loken, “The effect of Twitter on college student engagement and grades,” J. Comput. Assist. Learn. 27, 119–132 (2011). doi:10.1111/j.1365-2729.2010.00387.x [pdf]


6 Comments on “The Science Learnification Weekly (March 27, 2011)”

  1. Modeling is similar to ISLE in that it doesn’t use the confront-elicit-resolve strategy either. I’ve heard arguments that C-E-R works really well for adult learners, especially teachers. The “Physics by Inquiry” curriculum (John Burk wrote about it in his Pseudoteaching by Inquiry post) is based entirely on C-E-R. And PbI is NOT recommended for HS.

    If I recall correctly, the reason C-E-R doesn’t work well with HS kids (and college too??) is that it makes kids second-guess their intuition, which is what you DON’T want to do while trying to improve their scientific intuition. It might also make them feel dumb.

    So ISLE and Modeling give kids the tools to analyze the situation properly, which should supplant the misconception. But I think the supplanting isn’t going to happen on its own. The misconception must be addressed AFTER the students have the correct model to apply.

    Take home message: Misconceptions must be addressed, but it is more effective when the students have the proper tools to deal with them.

    Do you agree?

    • Joss Ives says:

      Even in my own experience, C-E-R has a fairly negative effect on affect. I have had a few of my harder-working, middle-of-the-pack students comment to me that they NEVER trust their intuition on my clicker questions. I now try to put in more scaffolding-type clicker questions, where >90% of the students get the questions right and those questions are the pieces needed to answer the challenging Mazur-level clicker question. I let them get a bit comfortable with the tools that they have, and then the hard question is more like a puzzle to them than something meant to expose misconceptions. Etkina and Van Heuvelen also discuss the effect on affect in the 48-page PAER paper that I linked to. Van Heuvelen ran two sections of a course where the only difference was that in one section he asked the students for predictions before he ran his demos (remember, in ISLE they use the data/patterns from the demos to build up the concept), and he ended up with significantly worse evaluations in the class that did the predictions.

      Back to your statement: “Misconceptions must be addressed, but it is more effective when the students have the proper tools to deal with them.” To have a coherent framework that is not just ideas handed down by authority, students need to believe those concepts and develop intuition using them. So it seems that the misconceptions need to be changed or clarified, or, as Redish et al. do, the students need to understand in which contexts these misconceptions apply and how they need to be refined to extend beyond those contexts. I’m still sorting out exactly how misconceptions are addressed within the ISLE curriculum.

      The only reading I have done on Modeling Instruction so far is Modeling instruction: Positive attitudinal shifts in introductory physics measured with CLASS, which is a college-level implementation. I grabbed a couple more of the FIU PER group’s modeling papers and some stuff from the Remodeling University Physics project page and hope to gain a better sense of Modeling Instruction as well.

  2. Here’s a great introduction to modeling: Modeling Instruction: An Effective Model for Science Education

    And then there’s this seminal paper: A Modeling Method for High School Physics Instruction

    I love the ISLE approach, too, and have been devouring anything from Etkina and Van Heuvelen. Do you have his ALPS Kits?

    • Joss Ives says:

      Thanks for adding to my reading queue, Frank!

      Have the ALPS kits been superseded by the Active Learning Guide? I have the Active Learning Guide coming from the publisher, but do not have the ALPS kits.

  3. I think the ALG has superseded ALPS. However, I think the ALPS kits have many excellent exercises that are not in the ALG. Try to get your hands on them. There’s a mechanics ALPS and an E&M ALPS.

  4. Brian says:

    I just caught this conversation this morning, a month after the fact. That’s the downside of reading the posts right away, and forgetting to come back. Great stuff!

    I also have a post on pseudoteaching about a class using PbI. It’s here.

