Student collision mini-projects from my summer 2013 comp-phys course
Posted: October 3, 2013 Filed under: Computational Physics | 2 Comments

The last course that I taught at UFV before taking a job at UBC was an online Computational Physics course. I previously posted the mini-project videos from when I ran the course in the fall, and you can check that post to learn more about the context of these mini-projects. The overall level of creativity seems to have been a bit lower this time than last, which may be due in part to the online nature of the course; the previous offering was in person. Last time, people would show up to the computer lab, see what others were working on, and the stakes would rise. If I ran this type of course online again, I would have people submit regular progress videos so that there was a virtual equivalent of showing up in the lab and feeling the stakes rise.
Mathematica vs. Python in my Computational Physics course
Posted: September 26, 2013 Filed under: Computational Physics | 4 Comments

This past year I taught two sections (fall and summer) of Computational Physics at UFV, which is quite a rare thing at a school where we typically run 3rd- and 4th-year courses every other year. The course was structured so that students got some initial exposure to both Python and Mathematica for physics-related computational tasks, and then, as the course went on, they were given more and more opportunities to choose between the two platforms when completing a task. This post looks at the choices made by the students on a per-task basis and a per-student basis. Out of a total of 297 choices, they chose Mathematica 62.6 ± 2.8% (standard error) of the time. Two things to note. First, there is a small additional bias toward Mathematica because two of the tasks are more Mathematica-friendly and only one of the tasks is more Python-friendly, as will be discussed. Second, some students came into the course with previous Mathematica experience or with a strong computing background in Java or C++, and those students usually gravitated toward Mathematica or Python, respectively. But some students did the opposite and took advantage of the opportunity to become much more familiar with a new-to-them platform, trying to do as much of their work as possible using that platform.
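For the curious, the quoted uncertainty is consistent with a simple binomial standard error. A quick sanity check in Python (the count of 186 Mathematica choices is inferred from 62.6% of 297, so it is an assumption rather than a figure from the post):

```python
import math

n = 297  # total choices
k = 186  # assumed number of Mathematica choices (inferred from 62.6%)

p = k / n
se = math.sqrt(p * (1 - p) / n)  # binomial standard error on a proportion

print(f"{100 * p:.1f} ± {100 * se:.1f}%")  # prints 62.6 ± 2.8%
```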
Task by task comparison
The general format of the course is that, for the first three weeks, the students are given very similar tasks that they must complete using both Python and Mathematica. Examples include importing data and manipulating the resulting arrays/lists to do some plotting, and writing basic iterative (for/do) loops. In each of those weeks there is also a third task (1.3 and 2.3 in Figure 1) in which they build on those initial tasks and can choose to use either Python or Mathematica. In subsequent weeks, the requirement to use both platforms goes away and they can choose which platform to use for each task.
Figure 1 shows their choices for those tasks in which they could choose. In week 3 they learn how to do animations using the Euler-Cromer method in Mathematica and using VPython. Although Mathematica works wonders with animations when using NDSolve, it is really clunky for Euler-Cromer-based animations. Task 4.1 is a mini-project (examples) where they animate a collision, and since at this point they have only learned how to do this type of thing using Euler-Cromer, VPython is the much friendlier platform to use for the mini-project. But as you can see in the third column of Figure 1, some students were already quite set on using Mathematica. Conversely, for tasks 8.2 and 8.3, Mathematica is the much more pleasant platform to use, and you can see that only a few diehards stuck with Python for those ones.
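The Euler-Cromer method mentioned above updates the velocity first and then uses the *new* velocity to update the position, which is why it maps so naturally onto a VPython animation loop. Here is a minimal bouncing-ball sketch in plain Python; all parameter values are chosen purely for illustration and are not from the course materials:

```python
# Euler-Cromer integration of a 1D bouncing ball.
# Updating velocity before position (with the new velocity) is what
# distinguishes Euler-Cromer from the plain Euler method.
g = -9.8    # m/s^2
dt = 0.001  # time step, s
e = 0.9     # coefficient of restitution (assumed value)

y, vy = 2.0, 0.0  # start 2 m up, at rest
for _ in range(5000):  # simulate 5 s
    vy += g * dt          # update velocity from acceleration
    y += vy * dt          # update position using the NEW velocity
    if y < 0 and vy < 0:  # bounce: reverse and damp the velocity
        vy = -e * vy
```

In VPython the body of the loop would be identical, with `y` and `vy` replaced by vector attributes of a sphere object and a `rate()` call added to control the animation speed.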
Student by student comparison
Figure 2 shows the breakdown of these choices by student or group. For those seven students that only used Python by choice once, it was for the mini-project. And the two students that only used Mathematica once used it for the Linear Algebra exercises.
Student feedback
After the course was completed, I asked the students for some feedback on various aspects of the course and one of the questions I asked was
How did you find that the mix of Mathematica and Python worked in this course?
Approximately half of the students submitted feedback and their responses were as follows:

Response | Count | Percentage |
---|---|---|
I would have preferred more emphasis on Mathematica | 1 | 14% |
I would have preferred more emphasis on Python | 0 | 0% |
The mix worked well | 6 | 86% |
And their comments
- Kinda hard to choose one of the above choices. I felt like I wish I could’ve done more stuff in python but I was happy with how proficient I became at using Mathematica. So I think the mix worked well, though I kinda wish I explored python a bit more.
- I think it was very valuable to see the strengths and weaknesses of both Mathematica and Python as well as to get familiar with both of them.
- It was cool to see things done in two languages. I’m sure that people less enthusiastic about programming would probably have preferred one language, but I thought it was good.
- Doing both to start followed by allowing us to pick a preferred system was great.
In closing
I ended up being quite happy with how the mix worked. I learned a ton in the process because I came in being only mildly proficient with each platform. I was also really happy to have the students come out of the course seeing Python as a viable alternative to Mathematica for a wide range of scientific computing tasks, and conversely getting to see some places where Mathematica really excels.
Student collision mini-projects from my fall comp-phys course
Posted: June 3, 2013 Filed under: Computational Physics, Electronics Courses, Mini-Projects | 9 Comments

This past fall I had a revelation which I have yet to harness, but it is hiding out in my brain waiting to be incorporated into future courses. In two of my courses, I had the students work on mini-projects. This was the first time I had used mini-projects in a course, and I was delighted with how independent the students were compared to when working on an overly prescribed task, and also with the quality of their work compared to work from the regular prescribed tasks. Later in this post I share some videos of the comp-phys mini-projects, but I want to discuss a few things first.
In my digital electronics labs course they were asked to take the input from an analog sensor, apply some electronic decision-making to this input and provide some digital output related to the input. An example is to use a photoresistor to monitor room brightness and use 3 different colours of LED to provide feedback related to the room brightness: a red LED is lit if the room is dark, a yellow LED is lit if the room is of “standard” brightness and a green LED is lit if the room is extremely bright.
In my comp-phys course they were asked to make a collision simulation using Mathematica or Python, with at least 3 different parameters which can be manipulated by the user (e.g., mass, velocity, coefficient of restitution, type of object) and at least one challenging piece of physics in the simulation (e.g., rolling friction, a coefficient of restitution which varies between 0 and 1). Example ideas that I provided included the ballistic pendulum and a 2D collision where you have to worry about the angle of attack.
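To give a flavour of the physics involved, here is a minimal sketch of the 1D collision-with-restitution calculation, derived from momentum conservation plus the restitution condition. This is an illustration of the underlying physics, not code from any of the student projects:

```python
def collide_1d(m1, v1, m2, v2, e):
    """Post-collision velocities for a 1D collision with
    coefficient of restitution e (e=1 elastic, e=0 perfectly inelastic).
    Derived from momentum conservation together with the restitution
    condition v2' - v1' = -e * (v2 - v1)."""
    p = m1 * v1 + m2 * v2  # total momentum, conserved
    v1p = (p + m2 * e * (v2 - v1)) / (m1 + m2)
    v2p = (p + m1 * e * (v1 - v2)) / (m1 + m2)
    return v1p, v2p

# Elastic collision, equal masses: the velocities swap.
print(collide_1d(1.0, 1.0, 1.0, 0.0, 1.0))  # prints (0.0, 1.0)
```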
In both cases, the task was designed to be something which should take approximately one week of the regular time that they are expected to put into the course. In both cases I had some small-in-scope expectations related to the documentation/presentation of the mini-project. For the digital mini-project, I asked them to submit a complete circuit diagram and a brief video of them walking me through how the mini-project works. For the comp-phys mini-project, I asked for well-documented code and a brief document which highlighted the physics being simulated and explained how it was implemented in the code.
Before I share the comp-phys mini-projects from the fall, I want to share an “in no particular order” list of things that I liked about the mini-projects above what I would see from a regular prescribed task or series of tasks:
- The students seemed much more willing to take on larger challenges with less support.
- The students were provided with the opportunity to bring some creativity into their work. There seem to be very few such opportunities in most physics programs.
- The quality of student work was consistently higher than usual and competition played a small role in this. With the comp-phys mini-projects, students would show up to class and see what others had done and decide they had to step up their game by adding more bells and whistles than they had originally intended.
- The students had a lot more ownership of the learning task.
I suspect that Andy has seen a lot of these benefits since switching to SBG. A lot of the student submissions for standards that I have seen from his courses seem to involve some creativity, with students taking on larger challenges than would normally be expected. The scope of those standards tends to be smaller than that of the mini-projects I am talking about here, but my experience with mini-projects certainly helps me appreciate even more how powerful SBG can be in terms of giving students some choice in how they show their proficiency.
Mini-project playlist
Below is a playlist of no-audio videos of the 10 mini-projects from the comp-phys course. Each is roughly 30 seconds of me playing around with the various controls and then running the simulation once or twice. Some of the projects were done by groups. The videos are pretty tiny in the embedded player, so I would suggest going full-screen.
Student feedback on having weekly practice quizzes (instead of homework)
Posted: February 14, 2013 Filed under: Group Quizzes, Homework, Introductory Electricity and Magnetism, Practice Quizzes, Weekly Quizzes | Tags: intro physics, practice quiz questions 5 Comments

This term I eliminated the weekly homework assignment from my calc-based intro physics course and replaced it with a weekly practice quiz (not for marks in any way), meant to help them prepare for their weekly quiz. There’s a post coming discussing why I have done this and how it has worked but, à la Brian or Mylene, I think it can be valuable to post this student feedback.
I asked a couple of clicker questions related to how they use the practice quizzes and how relevant they find the practice quiz questions in preparing them for the real quizzes. I also handed out index cards and asked for extra comments.
Aside from changing from homework assignments to practice quizzes, the structure of my intro course remains largely the same. I get them to do pre-class assignments, we spend most of our class time doing clicker questions and whiteboard activities, and there is a weekly two-stage quiz (individual then group). I have added a single problem (well, closer to an exercise) to each weekly quiz, where in the past I would infrequently ask them to work a problem on a quiz.
Clicker Question 1
Clicker Question 2
Just from a quick scan of the individual student responses on this one, I saw that the students with the highest quiz averages (so far) tended to answer A or B, whereas the students with lower quiz averages tended to answer B or C. I will look at the correlations more closely at a later date, but I find this a really interesting piece of insight.
Additional Written Feedback
Most of the time I ask the students for some feedback after the first month and then continue to ask them about various aspects of the course every couple of weeks. In some courses I don’t do such a great job with the frequency.
Usually, for this first round of feedback, the additional comments are dominated by frustration toward the online homework system (I have used Mastering Physics and smartPhysics), requests/demands for me to do more examples in class, and some comments on there being a disconnect between the weekly homework and the weekly quiz. As you can see below, there is none of that this time. The practice quizzes, the inclusion of a problem on each weekly quiz, and perhaps the provided learning goals, seem to do a pretty good job of communicating my expectations to them (and thus minimize their frustration).
Student comments (that were somewhat on topic)
- I feel like the practice quizzes would be more helpful if I did them more often. I forget that they have been posted so maybe an extra reminder as class ends would help.
- The wording is kind of confusing then I over think things. I think it’s just me though. Defining the terms and the equations that go with each question help but the quizzes are still really confusing…
- Curveball questions are important. Memorize concepts not questions. Changes how students approach studying.
- The group quizzes are awesome for verbalizing processes to others. I like having the opportunity to have “friendly arguments” about question we disagree on
- I love the way you teach your class Joss! The preclass assignments are sometimes annoying, but they do motivate me to come to class prepared
- I enjoy this teaching style. I feel like I am actually learning physics, as opposed to just memorizing how to answer a question (which has been the case in the past).
- I really enjoy the group quiz section. It gets a debate going and makes us really think about the concepts. Therefore making the material stick a lot better.
Last thought: With this kind of student feedback, I like to figure out a couple of things that I can improve or change and bring them back to the class as things I will work on. It looks like I will need to ask them a weekly feedback question which asks them specifically about areas of potential improvement in the course.
Syllabus for Digital Electronics Lecture, Fall 2012
Posted: August 14, 2012 Filed under: Electronics Courses, syllabi | 2 Comments

I have three new-to-me courses that I am teaching this fall: comp-phys, digital electronics lecture and digital electronics lab. I am sharing the syllabus for my digital electronics “lecture” course below, but have removed a few things which are only relevant internally.
—————–
UFV Physics 362 – Digital Electronics and Computer Interfacing Syllabus (V1) – Section AB1, Fall 2012
About this course
In addition to learning about digital electronics, one of the main goals of this course is to help you develop as a lifelong independent learner. Robert Talbert puts it much better than I ever could (http://goo.gl/ZIh0R):
“As you move through your degree and eventually into your career and your adult post-college life, your main value to the rest of the world and to the people you love is your ability to learn and grow without needing other people around to make it happen. There are many times in life where you MUST learn something, and you can’t wait for the next semester at the local college to come around for you to sign up for a course. You have to take charge. You have to learn on your own.”
This course is structured around the idea that you will do some initial learning on your own before you come to class and then in class you will work with your peers to deepen your understanding. You will be doing the heavy lifting in class instead of just watching me do examples and derivations on the board (do you remember how proficient you became at sword-fighting by watching the Princess Bride?). Some students find this very disorienting and some of you will find that this course structure will take you out of your normal comfort zone. The best thing you can do is come into the course with a positive attitude and be prepared to tweak your normal recipes for success to be able to get the most out of this course.
Please note that this course has a corequisite lab (Physics 372) which will focus on the hands-on aspects of digital electronics as well as the interplay between theory and hands-on applications.
Course Description (from the official outline)
This course emphasizes elementary digital electronics and interfaces. Topics include gates and Boolean algebra, Karnaugh maps, flip flops, registers, counters and memories, digital components, microprocessor functions and architecture, instruction sets, D/A and A/D converters, and waveshaping. PHYS 372, the laboratory portion of this course, must be taken concurrently. This course is designed to provide practical experience with the basic digital logic chips and how digital circuits can be interfaced with microprocessors.
Learning Goals
Note that we will co-construct a proper set of detailed learning goals as we proceed through this course and those detailed learning goals will define what sort of questions can be asked on the quizzes and the final exam. The learning goals listed below, which are from the official course outline, are meant to be very broad and as such only provide a very rough framework in which we will fit all the fun that is digital electronics.
Learning goals from the official course outline: This course is designed to provide students with:
- the theory needed to understand the purpose and how digital devices function;
- an understanding and an appreciation of how a digital computer functions;
- the ability to design, construct and test simple digital logic circuits;
- an ability to program the common microprocessors;
- how information can be transferred to and from computers.
Textbook
Tony R. Kuphaldt, Lessons in Electric Circuits: Volume IV – Digital, http://openbookproject.net/electricCircuits/Digital/index.html
In addition to this online textbook, I will leave a nice big pile of electronics textbooks in A353 for your use. As a group we can sort out a reasonable scheme for lending out these books while making sure that they are still available to everybody.
Course Components
Pre-class Assignments: The engine that drives this course is the collection of Socratic Electronics worksheets. For each worksheet, you will be assigned to research and answer a subset of the questions. In class you will present your findings in small and large groups. The goal is for you to learn how to locate information, problem-solve, collaborate, and clearly articulate your thoughts while learning about digital electronics. The answers to all the questions will be provided with the worksheets, so it is the solutions in which I am most interested and for which you are responsible in your preparation.
Class Periods: I run each class period under the assumption that you have completed the relevant pre-class-assignment and have made a genuine effort to make some sense of the material before showing up to class. We will use class time to help you clarify your understanding of the material and to build on the core ideas that you wrestled with in your pre-class assignments. In class you will mostly be working in small groups. Not all members of a group will have been assigned the exact same pre-class questions, so the first thing that you will do is present your own findings and come to group consensus on the solutions. In class I will also ask you to work on additional questions from the worksheets as well as other additional questions which I will provide. At appropriate times, I will provide mini-lectures to clarify ideas or to plant the initial seed for an idea which you will be studying on an upcoming pre-class assignment.
Peer and instructor assessment of pre-class and in-class work: Each class period you will be given a number of contribution points to spread among the rest of your group (not including yourself) based on how much their pre-class preparation and in-class work contributed to your group’s overall learning in class. The exact number is 8*(N-1)+1, where N is the number of students in your group. You can give any individual student up to 10 points and do not have to give out all of your points. I will average the points assigned to you by the rest of your group for each class period. If needed, I may adjust this average up or down by up to a couple of points if I feel that your class period average is a very poor match to my own observations of your pre-class preparation and in-class contributions. I prefer not to have to make any adjustments this way and will very clearly spell out for you what factors I have considered when adjusting this daily class period average. I will drop your five worst daily class period averages when calculating your final mark for this category.
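The points arithmetic described above can be sketched as follows; the function names are mine, purely for illustration:

```python
def points_pool(n_group):
    """Contribution points each student gets to distribute among the
    rest of their group, per the 8*(N-1)+1 formula in the syllabus."""
    return 8 * (n_group - 1) + 1

def daily_average(points_received):
    """Average of the points assigned to one student by the rest of
    their group for a single class period."""
    return sum(points_received) / len(points_received)

print(points_pool(4))            # a 4-person group: prints 25
print(daily_average([8, 9, 7]))  # peers gave 8, 9 and 7: prints 8.0
```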
Homework: Nope, but I will make sure that you have sufficient resources for quiz and exam preparation.
Quizzes: Roughly every two weeks we will use the entire class period to have a quiz, for a total of 5 quizzes over the course of the term. The quiz will be split into two pieces: a solo quiz and a group quiz. You will first write the solo quiz and then approximately 2/3rds of the way through the class period I will collect the solo quizzes and then get you to write the group quiz, typically in groups of 3 or 4. The group quiz will mostly be the same as the solo quiz, but will often have some extra questions. If you score higher on the solo quiz than the group quiz, I will use your solo quiz mark when calculating your overall group-quiz average.
Quiz Averages: I will use your best 3 of 5 group quiz scores when calculating your overall group-quiz average. Things are a little more complicated for your overall solo-quiz average. In addition to the three-hour final exam, I will be creating five different half-hour-long re-tests, one for the material covered on each of the five quizzes during the term. You can choose to write two of these re-tests as part of the final exam and your mark from each of those re-tests can replace your earlier mark on the corresponding quiz (including if you missed the earlier quiz completely). The catch here is that I will only allow you to write a given re-test if you demonstrate to me that you have put in a reasonable amount of effort to learn that material. I will expect you to make your case by presenting me with the specific things that you did to learn the material and that you did to learn from your mistakes on the initial quiz.
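The group-quiz averaging rules described above (a group score is replaced by the solo score when the solo score is higher, then the best 3 of 5 results are averaged) can be sketched as follows; the function name and the sample scores are purely illustrative:

```python
def group_quiz_average(solo, group):
    """Overall group-quiz average: for each of the 5 quizzes, take the
    higher of the solo and group scores, then average the best 3."""
    effective = [max(s, g) for s, g in zip(solo, group)]
    best3 = sorted(effective, reverse=True)[:3]
    return sum(best3) / 3

print(group_quiz_average(solo=[70, 85, 60, 90, 75],
                         group=[80, 80, 65, 85, 70]))  # prints 85.0
```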
Evaluation Scheme
Peer and instructor assessment of pre-class and in-class work: |
20% |
Solo quizzes: |
40% |
Group quizzes: |
10% |
Final exam: |
30% |
Tentative Course Schedule
The numbers Sxx indicate the worksheet number for that day’s worksheet. The worksheets can be found at http://www.ibiblio.org/kuphaldt/socratic/doc/topical.html
Week of | Monday | Wednesday | Friday | Notes |
Sept. 3 | No class | D01 – Numeration Systems (S04) | No class | Classes begin Sept. 4. |
Sept. 10 | D02 – Binary Arithmetic (S05) | D03 – Digital Codes (S06) | D04 – Basic Logic Gates (S03) | |
Sept. 17 | D05 – TTL Logic Gates (S07) | D06 – CMOS Logic Gates (S08) | No class | |
Sept. 24 | Quiz 1 | D07 – Trouble Gates (S09) | D08 – Boolean Algebra (S13) | |
Oct. 1 | D09 – Sum-of-Products and Product-of-Sums Expressions (S14) | D10 – Karnaugh Mapping (S15) | No class | |
Oct. 8 | Thanksgiving. No classes. | D11 – Binary Math Circuits (S16) | Quiz 2 | Wednesday Oct. 10 is last day to withdraw without W appearing on transcript. |
Oct. 15 | D12 – Encoders and Decoders (S17) | D13 – Multiplexers and Demultiplexers (S18) | No class | |
Oct. 22 | D14 – Latch Circuits (S21) | D15 – Timer Circuits (S22) | Quiz 3 | |
Oct. 29 | D16 – Flip-flop Circuits (S23) | D17 – Counters (S26) | No class | |
Nov. 5 | D18 – Shift Registers (S28) | Quiz 4 | Remembrance day. No classes. | |
Nov. 12 | D19 – Digital-to-Analog Conversion (S30) | D20 – Analog-to-Digital Conversion (S31) | No class | Tuesday Nov. 13 is the final day to withdraw from courses. |
Nov. 19 | D21 – Memory Devices (S34) | D22 – Optional Topics (see notes) | D23 – Optional Topics | |
Nov. 26 | Quiz 5 | D24 – Optional Topics | No class | Potential topics include digital communication, micro-controllers, state machines and electro-mechanical relays. |
Dec. 3 | D25 – Optional Topics | | | Monday Dec. 3 is the last day of classes. |
Nonlinear narratives in the inverted classroom
Posted: March 2, 2012 Filed under: Flipped Classroom, Introductory Electricity and Magnetism, Inverted Classroom | 2 Comments

I have temporarily taken over an introductory E&M course from one of my colleagues. I’m teaching the course using his format (and notes), which means that I am (A) lecturing and (B) not using pre-class assignments for the first time since 2006. In addition to his format, I am using the odd clicker question here and there.
The thing that has been the most interesting about lecturing in a non-inverted class has been the difference in narrative. In my regular courses, I assume that the students have had contact with all the major ideas from a given chapter or section before I even ask them the first clicker question. Because of this we are able to bring all the relevant ideas from a chapter to bear on each question if needed. This is what I am used to.
My colleague’s notes develop the ideas in a nice linear fashion and are very easy to lecture from, but I just can’t stop myself from bringing in ideas that are multiple sections down the road. I am having a ton of trouble, even with a set of notes in front of me, letting the story develop according to a well laid-out narrative. It has simply been too long since I have presented material in this sort of structured way. Note that when I give a talk at a conference it takes me a ton of practice to massage the talk I have prepared into something which I am able to deliver using a nice linear narrative. Even when it is nicely laid out, I will jump ahead to other ideas if I don’t spend some serious time practicing not doing that.
It has been really interesting being the one completely responsible for the narrative instead of sharing that responsibility with the resources that I provide for my pre-class assignments.
It has also been weird not having the inmates run the asylum.
Homework Presentation Rubric V1
Posted: February 20, 2012 Filed under: Quantum Mechanics, Rubrics | 7 Comments

In my 3rd-year quantum mechanics course last term I had the students each take a turn presenting an additional problem to the class. I wanted them to place emphasis on setting up their problem and interpreting their results rather than on showing the intermediate mathematical grinding.
I wanted to share the rubric because I know how incredibly helpful it was to find rubrics that others had shared when I was putting together my own rubrics for various things. I have always adapted the rubrics that I found to suit my own situation and preferences, but they always provide a very helpful starting point as well as providing a useful framework when trying to put together my criteria.
A few notes first:
- I asked them to give an 8-10 minute presentation, which sets the time scale against which “Appropriateness and depth” was compared.
- Each category is assigned a score according to the lowest of the different things which could be evaluated as part of that category. For example, in “Appropriateness and depth”, a student who gave an overly long talk (say 15 minutes instead of the maximum of 10 minutes that I asked for) [Acceptable] and whose presentation only required minor clarifications [Good] would be assigned an overall score of “Acceptable” for that category. When one of the criteria scores significantly lower than the others, I usually bump up the score; in the example above, if there had been no clarification questions needed at all, I would have scored the overall category as “Good”.
- One of the problems with a rubric with such specific criteria is that students always find amazing and new ways to break the rubric since it is nigh-impossible to anticipate every possible scenario. So I usually find ways to work these things into the rubric as well as I can and err on the side of benefit to the student. One of the ones that annoys me the most is when something comes up that crosses multiple categories of vastly different weights. I try not to double-penalize the students so it will mean that I am choosing between giving students a “Good” in a category worth very few points and one worth many points. And this choice tends to come with a fairly large swing in overall grade. I try to make notes of the occurrences so that I can revise the rubric in the future, but students are good at breaking any system you come up with.
Any and all feedback welcome.
Word version of the rubric: Homework_Presentation_Rubric_V1.docx
Excellent (x1) | Good (x0.75) | Acceptable (x0.5) | Poor (x0.25) | Unacceptable (0) |
---|---|---|---|---|
A1. Roadmap and organization [2 pts] | ||||
The main ideas or overall purpose (“what the question is about”) of the presentation are clearly communicated at the start of the presentation. The purpose of each sub-question is clearly stated before jumping directly into the details. A brief summary is provided for each sub-question, tying the answer back to the original sub-question. If appropriate (e.g., all the sub-questions make up a greater whole), a summary of the overall question is provided. There is room for creative license here, but the main point is that the presentation needs to be well-organized. | Brief purposes and summaries are provided for most of the sub-questions. Some attempt is made to present the main ideas or overall purpose of the question at either the beginning or end of the presentation. | Brief purposes and summaries are provided for most of the sub-questions. No attempt is made to present the main ideas or overall purpose of the question. | Brief purposes and summaries are provided for less than half of the sub-questions. | No attempt is made to present the main ideas or overall purpose of the question. No attempt is made to present the purpose or summarize any part of the question. |
A2. Appropriateness and depth [2 pts] | ||||
The presentation is presented at the appropriate level for another person enrolled in the course (a “peer”) to be able to follow along with only minor clarification questions. Mathematical details are presented in a concise way, but are still worked out in sufficient depth that a peer does not need to fill in important details on their own. The overall presentation makes good use of time. | One or two major clarification questions would be needed to fill in conceptual or important mathematical details that were left out. Mathematical details are mostly presented in a concise way. The presentation ran a little long or a little short, but was overall still reasonable in terms of use of time. | Multiple major clarification questions would be needed to fill in conceptual or important mathematical details that were left out. More effort should have been put in to make the presentation more concise or to make the presentation fill the time allotted. | Due to shooting way too high or way too low, a peer would wonder if this presentation was targeted toward a person in this course. Little effort appears to have been put in to make the presentation concise. | No effort appears to have been put in to make the presentation concise or the presentation lacks enough depth to be informative in any way. |
A3. Consistency and correctness of terminology and notation [2 pts.] | ||||
Terminology is always used correctly or when a mistake in terminology is made it is corrected by the end of the presentation. Notation and terminology are used in a consistent way. | Some terminology is misused or is missing as a result of nervousness or oversight, but the audience recognizes that the presenter would probably be able to correct these errors if follow-up questions were asked. This misuse of terminology does not introduce any significant confusion into the presentation. There are one or two inconsistencies in notation or terminology that are left unaddressed. | Some terminology is grossly misused or missing, and would be distracting to a peer. There are enough inconsistencies in notation and terminology to be distracting to a peer. | Enough terminology is misused or missing to distract a knowledgeable audience and to confuse a peer. There are enough inconsistencies in notation or terminology to be confusing to a peer. | Terminology is misused or notation / terminology are used inconsistently to the point that a peer would find it mostly impossible to follow the presentation. |
A4. Accuracy and completeness of Physics [6 pts.] | ||||
The physics in the presentation is consistently accurate. Corrections to inaccuracies are made at the time of the mistake or by the end of the presentation. | No significant errors or omissions are made. Audience is able to recognize that small errors or omissions are the result of nervousness or oversight. | One significant error or omission is made. | Multiple significant errors or omissions are made. | Errors, contradictions and omissions are apparent and serious enough to make it almost impossible for a peer to determine which information is reliable. |
A5. Interpretation of results [4 pts.] | ||||
Obvious effort is made to interpret results (in terms of analogous results in other contexts, why the result makes sense, or why the result is counterintuitive) whenever possible. The flow of the presentation is such that the mathematical details feel like their purpose is to support the results and their interpretation. | Some effort is made to interpret results, but it feels like these interpretations take a back seat to mathematical details. | There is only a small effort made to interpret results, and one or two results that beg for interpretation (e.g., extremely counterintuitive results, or obviously incorrect results due to execution errors) are mostly overlooked. The purpose of the presentation appears to be a demonstration of mathematical grinding. | Most or all of the results that beg for interpretation are overlooked. | No effort at all is made to interpret any of the results. |
A6. Correctness of execution [2 pts.] | ||||
No mathematical or other execution errors survive uncorrected. | One or two minor mathematical errors are made, but these do not result in answers that are incorrect in a significant way. | There are multiple mathematical errors, but they do not result in answers that are incorrect in a significant way. | One or more errors are made that result in answers whose incorrectness should be apparent if the presenter were to try to interpret the answer or consider physics issues such as units. (Yes, you do get penalized for this sort of thing in multiple categories.) | A step in the solution is purposely manipulated to compensate for an earlier mathematical error and to attempt to force a reasonable or known result. |
A7. Speaking style [1 pt.] | ||||
Presentation is free from vocal fillers. Speaking style is conversational. Vocal variety (pitch, volume, pace, etc.) is used to enhance the message. Words are enunciated clearly. | Vocal fillers are sometimes present, but are not distracting. Speaking adheres mostly to a conversational style. One or two words are not enunciated clearly. | Vocal fillers are often present and are sometimes distracting. Pace is rushed. Speaker sometimes reads passages aloud from the poster or recites them from memory with a complete lack of vocal variety. | Vocal fillers are often present and very distracting. Parts of the presentation are difficult to understand due to a lack of enunciation or appropriate speaking volume. Speaker usually reads passages aloud or recites them from memory with a complete lack of vocal variety. | Most of the presentation is difficult to understand due to a lack of enunciation or appropriate speaking volume. |
A8. Ability to answer questions [2 pts.] | ||||
Speaker answers all reasonable questions correctly and coherently. | Speaker answers most of the reasonable questions correctly and coherently. Answers to questions indicate that the fundamentals are reasonably well understood. | Answers to questions indicate that most of the fundamentals are reasonably well understood, but one or two important fundamental ideas are not. | Answers to questions indicate that many of the fundamentals are not reasonably well understood. | Answers to questions indicate that little to none of the fundamentals are reasonably well understood. |
Overall [21 pts.] |
The rubric was inspired by “NEIU Oral Communication Rubric” and “PHY420 Final Oral Presentation Rubric” by Ernie Behringer at Eastern Michigan University, but no longer bears any real resemblance to those rubrics.