Mathematica vs. Python in my Computational Physics course

This past year I taught two sections (fall and summer) of Computational Physics at UFV, which is quite rare at a school where we typically run 3rd- and 4th-year courses every other year. The course was structured so that students would get some initial exposure to both Python and Mathematica for physics-related computational tasks, and then, as the course went on, they would be given more and more opportunities to choose between the two platforms when completing a task. This post looks at the choices made by the students on a per-task basis and a per-student basis. From a total of 297 choices, they chose Mathematica 62.6 ± 2.8% (standard error) of the time.

Two things to note. First, there is a small additional bias toward Mathematica because two of the tasks are more Mathematica-friendly and only one of the tasks is more Python-friendly, as will be discussed. Second, some students came into the course with previous experience using Mathematica or a strong computing background in Java or C++, and those students usually gravitated toward Mathematica or Python, respectively. But some students did the opposite and took advantage of the opportunity to become much more familiar with a new-to-them platform, trying to do as much of their work as possible using it.
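As a sanity check, the quoted uncertainty matches the usual binomial standard error, sqrt(p(1 − p)/n). A quick sketch in Python:

```python
import math

n = 297   # total platform choices
p = 0.626 # fraction choosing Mathematica
se = math.sqrt(p * (1 - p) / n)  # binomial standard error
print(f"{100 * p:.1f} +/- {100 * se:.1f}%")  # prints "62.6 +/- 2.8%"
```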

Task by task comparison

The general format of the course is that, for the first three weeks, the students are given very similar tasks that they must complete using both Python and Mathematica. Examples include importing data and manipulating the resulting arrays/lists to do some plotting, and writing basic iterative (for/do) loops. In each of those weeks there is a third task (1.3 and 2.3 in Figure 1), where they build on those initial tasks and can choose to use either Python or Mathematica. In subsequent weeks the requirement to use both platforms is dropped, and they are free to choose which platform to use.

Figure 1 shows their choices for those tasks in which they could choose. In week 3 they learn how to do animations using the Euler-Cromer method in Mathematica and using VPython. Although Mathematica works wonders with animations when using NDSolve, it is really clunky for Euler-Cromer-based animations. Task 4.1 is a mini-project (examples) where they animate a collision, and since at this point they have only learned how to do this type of thing using Euler-Cromer, VPython is the much friendlier platform for the mini-project. But as you can see in the third column of Figure 1, some students were already quite set on using Mathematica. Conversely, for tasks 8.2 and 8.3, Mathematica is the much more pleasant platform to use, and you can see that only a few diehards stuck with Python for those ones.
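For context, the Euler-Cromer method updates the velocity first and then uses the *new* velocity to update the position, which keeps the energy of oscillatory systems bounded where the plain Euler method blows up. A minimal plain-Python sketch (the spring-mass parameters here are hypothetical; in the course this kind of update loop would drive a VPython animation):

```python
import math

# Euler-Cromer integration for a mass on a spring (illustrative parameters).
k, m = 1.0, 1.0  # spring constant and mass (hypothetical values)
x, v = 1.0, 0.0  # initial position and velocity
dt = 0.01        # time step

for _ in range(int(2 * math.pi / dt)):  # roughly one oscillation period
    a = -k / m * x   # acceleration from Hooke's law
    v = v + a * dt   # update velocity first
    x = x + v * dt   # then position, using the UPDATED velocity
print(f"x = {x:.3f}")  # x ends near its starting value of 1.0
```

Swapping the order of the last two updates inside the loop gives the plain Euler method, whose amplitude grows steadily instead of staying bounded.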


Figure 1. Comparison between Mathematica and Python usage by task. Each column represents submissions from approximately 25 students (typically 23) or student groups (typically 2). There is some variation in the total counts from column to column due to non-submissions. Final projects were done individually, but for each of the two groups, members made the same choice with regards to platform for their final projects.

Student by student comparison

Figure 2 shows the breakdown of these choices by student or group. For the seven students who chose Python only once, that choice was for the mini-project. And the two students who used Mathematica only once used it for the Linear Algebra exercises.


Figure 2. Comparison of Mathematica and Python usage on a student by student (or group by group) basis. They have been put in order from maximum Mathematica usage to maximum Python usage. Variation in the total counts from column to column is due to non-submissions. Final projects were done individually, but for each of the two groups, members made the same choice with regards to platform for their final projects.

Student feedback

After the course was completed, I asked the students for some feedback on various aspects of the course and one of the questions I asked was

How did you find that the mix of Mathematica and Python worked in this course?

Approximately half of the students submitted feedback and their responses were as follows:

  • I would have preferred more emphasis on Mathematica: 1 (14%)
  • I would have preferred more emphasis on Python: 0 (0%)
  • The mix worked well: 6 (86%)

And their comments

  • Kinda hard to choose one of the above choices. I felt like I wish I could’ve done more stuff in python but I was happy with how proficient I became at using Mathematica. So I think the mix worked well, though I kinda wish I explored python a bit more.
  • I think it was very valuable to see the strengths and weaknesses of both Mathematica and Python as well as to get familiar with both of them.
  • It was cool to see things done in two languages. I’m sure that people less enthusiastic about programming would probably have preferred one language, but I thought it was good.
  • Doing both to start followed by allowing us to pick a preferred system was great.

In closing

I ended up being quite happy with how the mix worked. I learned a ton in the process because I came in being only mildly proficient with each platform. I was also really happy to have the students come out of the course seeing Python as a viable alternative to Mathematica for a wide range of scientific computing tasks, and conversely getting to see some places where Mathematica really excels.


75 vs. 150

As previously mentioned, a significant component of my new job at UBC this year is curriculum design for a first year cohort that will be taking, among other things, Physics, Calculus and Chemistry, and will have English language instruction somehow embedded into the courses or the support pieces around the courses. These students are mostly prospective Math, Physics and Chemistry majors. An interesting discussion we are having right now relates to class size; specifically class sizes of 75 versus 150.

To me, there are two main possible models for the physics courses in this program:

  1. Run them similarly to our other lecture-hall first-year courses, where clickers and worksheets make up a decent portion of the time spent in class. In this case, 75 vs. 150 is mostly the difference in how personal the experience is for the students. Based on my own experience, I feel like 75 students is the maximum number where an instructor putting in the effort will be able to learn all the faces and names. With TAs embedded in the lecture, the personal attention students get in the lecture could be similar for the two sizes, but there is still the difference of the prof specifically knowing who you are.
  2. Rethink the space completely and have a studio or SCALE-UP style of classroom, where students sit around tables in larger groups (9 is common) and can subdivide into smaller groups when the task requires it. This would mean that 75 is the only practical choice of the two sizes. This type of setup facilitates transitioning back and forth between "lecture" and lab, but it is not uncommon for "lecture" and lab to each have their own scheduled time, as is the case in most introductory physics classrooms.

Going with 75 students is going to require additional resources, or for program resources to be re-allocated, so the benefits of the smaller size need to clearly outweigh this additional resource cost. My knee-jerk reaction was that 75 is clearly better because smaller class sizes are better (this seems to be the mantra at smaller institutions), but I got over that quickly and am trying to determine specifically what additional benefits can be offered to the students if the class size is 75 instead of 150. But I am also trying to figure out what additional benefits we could bring to the students if we took the resources needed to support class sizes of 75 and moved them somewhere else within the cohort.

What do you think about trying to argue between these two numbers? Have you taught similar sizes and can offer your perspective? I realize that there are so many possible factors, but I would like to hear which things might be important from your perspective.


Help me figure out which article to write

I have had four paper proposals accepted to the journal Physics in Canada, which is the official journal of the Canadian Association of Physicists. I will only be submitting one paper and would love to hear some opinions on which one to write and submit. I will briefly summarize what they are looking for according to the call for papers and then summarize my own proposals.

Note: My understanding is that the tone of these would be similar to articles appearing in The Physics Teacher.

Call for Papers

Call for papers in a special issue of Physics in Canada on Physics Education Research (PER) or on teaching practices:

  • Active learning and interactive teaching (practicals, labatorials, studio teaching, interactive large classes, etc.)
  • Teaching with technology (clickers, online homework, whiteboards, video analysis, etc.)
  • Innovative curricula (in particular, in advanced physics courses)
  • Physics for non-physics majors (life sciences, engineers, physics for non-scientists)
  • Outreach to high schools and community at large

The paper should be 1500 words maximum.

My proposals

“Learning before class” or pre-class assignments

  • This article would be a how-to guide on using reading and other types of assignments that get the students to start working with the material before they show up in class (based on some blog posts I previously wrote).

Use of authentic audience in student communication

  • Often, when we ask students to do some sort of written or oral communication, we ask that they target that communication toward a specific imagined audience, but the real audience is usually the grader. In this article I will discuss some different ideas (some I have tried, some I have not) for giving student oral and written tasks authentic audiences: audiences that are the target audience and will actually consume those communications. This follows on some work I did this summer co-facilitating a writing-across-the-curriculum workshop based on John Bean's Engaging Ideas.

Making oral exams less intimidating

Update your bag of teaching practices

  • This would be a summary of (mostly research-informed) instructional techniques that your average university instructor might not be aware of. I would discuss how they could be implemented in small and large courses and include appropriate references for people who want to learn more. Techniques I have in mind include pre-class assignments, group quizzes and exams, quiz reflection assignments, using whiteboards in class, and clicker questions beyond one-shot ConcepTests (for example, embedding clicker questions in worked examples).

Your help

This is where you come in: provide me with a bit of feedback as to which article(s) would potentially be of the most interest to an audience of physics instructors ranging from very traditional to full-on PER folks.