Rate My Professors Is a Flawed Site

04/12/2015 10:04 EDT | Updated 06/12/2015 05:59 EDT

The site allows students to write anonymous comments about their professors (or, more strictly speaking, their university teachers, since many are not actually professors) and rate their teaching performance. As a teaching evaluation it is the bluntest of blunt instruments: students give only three ratings, for helpfulness, clarity and easiness.

In addition there is the controversial "Hotness Rating" in chili peppers, described as "a bit of fun" by the site owners, but really just an excuse for massive gender-based bias, and one of the leading sources of complaints, quite understandably, from many university teachers. There is also a really inadequate right to reply by the subject (or should I say "target") of the comments.

As you can imagine, many university teachers loathe it. In fact, it is not so much a teaching evaluation as a review of the student experience in the class. It ignores important context: where the course fits in the program of study, why the chosen teaching methods were used, what resources were available to the teacher, and other necessary background information.

Remember, the course instructor is the subject expert, and understands that, but the student is a learner and cannot be expected to grasp all of the ramifications of the material and the wider context. They will only get that as their knowledge expands beyond the confines of the course.

By the time they finish the degree, they would be able to do a much better job of putting it all in context, but most reviews seem to appear immediately after the course finishes. That raises the possibility of students who performed below their own expectations leaving bad reviews and skewing the averages significantly, particularly with small sample sizes. There is an interesting article and comment thread at Inside Higher Education.

So are there circumstances where it sums up the student experience of a class relatively accurately? An academic study by Coladarci & Kornfield suggests that the results correlate reasonably well with the more sophisticated evaluations done by university surveys.

Where the university teacher is rated relatively highly by the institution, the ratings from RMP tend to be good too. I am fortunate enough to be in that category, and in my case there is a good correlation with my RMP values. One should be very, very careful, though, as gender bias, race bias and all sorts of other prejudices can easily creep into the evaluations. As a white male instructor with a British accent, I am in a favoured group in Canada. Many others are not so fortunate. Hurtful comments based on appearance, accent and gender are distressingly frequent. It is also possible for the same person to post anonymously using different accounts, introducing considerable bias into the average score.

For lower rankings on RMP, a strong dose of caution is advised. Look extremely carefully at the comments to see whether the averages are being lowered by students with an axe to grind against that particular instructor. If there is a split between rave reviews and damning comments, then almost certainly a section of the class did not enjoy it at all, or failed, or did not attain the grades they expected. Unfortunately, it is not always possible to judge why this is so. You will often find comments such as "This Prof was great, ignore the comments below" -- take those seriously; they mean that at least some students in the class found it a rewarding experience.

So given all of these serious issues with RMP, why do people still use it? Simply because most higher education institutions do not release the results of their teaching evaluations of individual professors, so it is impossible for a student to make an informed choice about a particular instructor. It is regrettable that whilst it is easy to find out the research prowess of a faculty member by looking at papers published, grants received, and students supervised, it is almost impossible to find out how they do their job in the classroom.

Word of mouth, RMP and similar sites are the only fact-finding tools available to students when choosing courses. While this situation continues, students will turn to these sites and may be grossly misinformed about an instructor. This is extremely regrettable, and remedying it should be an institutional priority. Students are paying a great deal in tuition fees and should expect good teaching. If the institution is unable to provide them with information, then they will turn to RMP and the like, the "Wild West" of information sources.

If anyone wants to see my Carleton teaching evaluations, they are quite welcome to. I have nothing to hide. Just leave a comment (I could put them in a blog post, but it would be rather boring!).

My RMP ratings are here for Carleton and here for the University of Saskatchewan.

I'm pleased to say that the comments made are extremely similar to the ones I receive from students on the official teaching evaluations. So, proceed with caution. As cartographers used to write on their maps when there was unknown territory: Here Be Dragons.

This article is also posted on my blog, Precarious Physicist.

