Rate My Professors

{{Short description|Review website}}

{{Use mdy dates|date=April 2023}}

{{Infobox website

| logo = RateMyProfessors logo (2021).svg

| logocaption =

| screenshot =

| collapsible =

| collapsetext =

| caption =

| url = {{URL|http://www.RateMyProfessors.com}}

| commercial =

| type = Review site

| registration =

| language = English

| num_users =

| content_license =

| owner = Cheddar (Archetype)

| author =

| editor =

| launch_date = {{Start date and age|1999|5}}

| alexa =

| revenue =

| current_status =

| footnotes =

}}

Rate My Professors (RMP) is a review site that allows anyone to assign ratings to professors and campuses of American, Canadian, and United Kingdom institutions.{{Cite web|url=http://www.ratemyprofessors.com/About.jsp|title=About RateMyProfessors.com|accessdate=28 April 2023}} It was founded in May 1999 by John Swapceinski, a software engineer from Menlo Park, California. The site was originally launched as TeacherRatings.com and converted to RateMyProfessors in 2001. RMP was acquired in 2005 by Patrick Nagle and William DeSantis.{{Cite magazine|url=https://www.wired.com/2005/09/prof-ratings-site-irks-academics/|title=Prof-Ratings Site Irks Academics|first=Joanna|last=Glasner|magazine=Wired |accessdate=28 April 2023|via=www.wired.com}} Nagle and DeSantis resold RMP in 2007 to Viacom's mtvU, MTV's college channel.{{Cite web|url=http://www.prnewswire.com/news-releases/mtv-networks-mtvu-agrees-to-acquire-ratemyprofessorscom-53560002.html|title=MTV Networks' mtvU Agrees to Acquire RateMyProfessors.com|accessdate=28 April 2023}} Viacom owned and operated RateMyProfessors.com for about a decade before Cheddar announced its acquisition of RMP from Viacom in 2018.{{cite web|title=Cheddar buys a user-generated content biz, Rate My Professors, from Viacom|date=25 October 2018 |url=https://techcrunch.com/2018/10/25/cheddar-buys-a-user-generated-content-biz-rate-my-professors-from-viacom/}} Cheddar was acquired by internet service provider Altice USA in 2019 and was then sold to media company Archetype in December 2023. RMP describes itself as the largest online destination for professor ratings, with listings covering more than 8,000 schools, 1.7 million professors, and over 19 million ratings.

== Ratings and reviews ==

On RMP, users may post a rating and review of any professor already listed on the site, and may create a listing for any individual not yet listed. To submit a rating, a user must score the course and/or professor on a 1–5 scale in two categories: "overall quality" and "level of difficulty". The rater may also indicate whether they would take the professor again, whether the class was taken for credit, whether attendance was mandatory, whether the textbook was used, and what grade they received in the course, and may add a comment of up to 350 characters. Raters may also select up to 3 tags (from a list of 20) that describe the professor. Because the website does not require users to create an account, non-students or even professors themselves can post ratings.[http://www.ratemyprofessors.com/AddRating.jsp?tid=1458112 Rate My Professors]{{Cite web|url=http://www.ratemyprofessors.com/help.jsp#tally|title = Rate My Professors Help Center}}
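The rating form described above amounts to a small structured record: two required 1–5 scores, several optional yes/no fields, a grade, a capped comment, and a bounded tag list. The following Python sketch is purely illustrative; the field names, types, and checks are assumptions drawn from this description, not RateMyProfessors' actual data model.

<syntaxhighlight lang="python">
from dataclasses import dataclass, field
from typing import Optional

# Illustrative sketch only: names and checks are assumptions based on the
# description above, not RateMyProfessors' actual implementation.
@dataclass
class RatingSubmission:
    overall_quality: int                     # required, 1-5 scale
    level_of_difficulty: int                 # required, 1-5 scale
    would_take_again: Optional[bool] = None  # optional yes/no fields
    taken_for_credit: Optional[bool] = None
    attendance_mandatory: Optional[bool] = None
    textbook_used: Optional[bool] = None
    grade_received: Optional[str] = None
    comment: str = ""                        # capped at 350 characters
    tags: list[str] = field(default_factory=list)  # up to 3 from a fixed list of 20

    def validate(self) -> None:
        for name, score in (("overall quality", self.overall_quality),
                            ("level of difficulty", self.level_of_difficulty)):
            if not 1 <= score <= 5:
                raise ValueError(f"{name} must be on the 1-5 scale")
        if len(self.comment) > 350:
            raise ValueError("comment exceeds the 350-character limit")
        if len(self.tags) > 3:
            raise ValueError("at most 3 tags may be selected")
</syntaxhighlight>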

According to the website's help page, "a professor’s Overall Quality rating should reflect how well a professor teaches the course material, and how helpful he/she is both inside and outside of the classroom".{{Cite web|url=http://www.ratemyprofessors.com/help.jsp|title = Rate My Professors Help Center}} This rating determines whether the professor's name is accompanied by a smiley face ("Good Quality"), a frowny face ("Poor Quality"), or an expressionless face ("Average Quality").

== Correlation with in-class student evaluations ==

Using data for 426 instructors at the University of Maine, researchers examined the relationship between RMP indices and formal in-class student evaluations of teaching (SET). The study found that the two primary RMP indices correlated moderately with their respective SET items. RMP "overall quality" showed a correlation of r = .68 with the SET item "Overall, how would you rate the instructor?", and RMP "ease" showed a correlation of r = .44 with the SET item "How did the work load for this course compare to that of others of equal credit?" Further, RMP "overall quality" (r = .57) and RMP "ease" (r = .51) were each correlated with their corresponding SET factors derived from a principal components analysis of all 29 SET items. The researchers concluded, "While these RMP/SET correlations should give pause to those who are inclined to dismiss RMP indices as meaningless, the amount of variance left unexplained in SET criteria limits the utility of RMP."{{cite web|title=RateMyProfessors.com versus formal in-class student evaluations of teaching|url=http://pareonline.net/getvn.asp?v=12&n=6}} Formal in-class evaluations can only be completed by students registered in the course, whereas on RMP anyone can post ratings, whether or not they have taken the course.
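The caveat about unexplained variance follows from squaring the correlation coefficient: a correlation of r accounts for roughly r² of the variance in the other measure, so r = .68 leaves more than half the variance in the SET item unexplained. A minimal Python illustration (the paired score vectors at the end are hypothetical toy data, not values from the study):

<syntaxhighlight lang="python">
import numpy as np

# Proportion of variance explained is r**2, so r = .68 explains ~46% and r = .44 ~19%.
for label, r in [("overall quality vs. SET overall item", 0.68),
                 ("ease vs. SET workload item", 0.44)]:
    print(f"{label}: r = {r:.2f}, variance explained = {r**2:.0%}")

# Pearson r for two paired rating vectors (hypothetical toy data).
rmp_scores = np.array([4.5, 3.0, 2.5, 4.0, 5.0, 3.5])
set_scores = np.array([4.0, 3.5, 2.0, 4.5, 4.5, 3.0])
print(f"toy-data Pearson r = {np.corrcoef(rmp_scores, set_scores)[0, 1]:.2f}")
</syntaxhighlight>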

== Criticism ==

=== Reputation protection on rating sites ===

In a 2018 article in the Stanford Technology Law Review, legal scholars Anne S. Y. Cheung and Wolfgang Schulz argue that a legal framework for protecting reputation on online rating platforms should be responsive to the changing set of practices ushered in by the Internet and capable of resolving conflicts in a fair and satisfactory way. In light of recent failed lawsuits against online content providers, they advocate for "a new regime requiring such platforms to formulate an appropriate information policy providing transparency rules, including disclosing how aggregate evaluations are made and providing for a right to respond, to achieve a new body of communication “netiquette” for social evaluation in the online era."Anne SY Cheung and Wolfgang Schulz, "Reputation Protection on Online Rating Sites," 21 STAN. TECH. L. REV. 310 (2018)

=== Positive correlation between ease of class and rating of professor ===

Research on in-class evaluations shows that professor ratings increase when students rate the course as easy.Mau, Ronald R., & Opengart, Rose A. (2012). Comparing Ratings: In-Class (Paper) vs. out of Class (Online) Student Evaluations. Higher Education Studies, 2(3), 55-68. The same relationship has been shown for RMP. In an article in the journal Assessment & Evaluation in Higher Education, Clayson investigated what RMP actually rates and concluded that "students will give higher evaluations to instructors they judge as being easy. There is also a suggestion in these findings that, if students like an instructor (for whatever reason), then the easiness of the class becomes relatively irrelevant."Dennis E. Clayson (2013) What does ratemyprofessors.com actually rate?, Assessment & Evaluation in Higher Education, 39:6, 678-698, DOI: 10.1080/02602938.2013.861384 He further concluded that "the majority of the evidence indicates that [ratemyprofessors.com] is biased by a halo effect, and creates what most accurately could be called a 'likeability' scale." Other analyses of RMP ratings have reached similar conclusions,Legg, Angela & H. Wilson, Janie. (2012). RateMyProfessors.com offers biased evaluations. Assessment & Evaluation in Higher Education. 37. 89-97. 10.1080/02602938.2010.507299.{{cite web|url=http://www.apa.org/gradpsych/features/2007/ratings.aspx|access-date=28 April 2023|title=Ratings|website=APA|year=2007}}Castro, Daniel, and Robert Atkinson. “Why it’s time to disrupt higher education by separating learning from credentialing”. Washington: Information Technology and Innovation Foundation, 2016. Online. Internet. 16 Apr 2018. Available: http://www2.itif.org/2016-disrupting-higher-education.pdf and some have concluded that professor attractiveness is also positively correlated with evaluation scores on RMP.James Felton, Peter T. Koper, John Mitchell, and Michael Stinson. Attractiveness, easiness and other issues: student evaluations of professors on ratemyprofessors.com. Assessment & Evaluation in Higher Education, 33(1):45–61, 2008 Felton et al. evaluated RMP ratings and found that "the hotter and easier professors are, the more likely they’ll get rated as a good teacher."David Epstein, "[http://insidehighered.com/news/2006/05/08/rateprof ‘Hotness’ and Quality]", Inside Higher Ed, 8 May 2006, accessed 10 May 2008.

=== Evaluation bias issues ===

A frequent criticism of RMP is that there is little reason to think that the ratings accurately reflect the quality of the professors rated.{{cite news|author=Pfeiffer, Sacha |title=Ratings sites flourish behind a veil of anonymity|date= September 20, 2006|work= Boston Globe Online |url=http://www.boston.com/business/technology/articles/2006/09/20/ratings_sites_flourish_behind_a_veil_of_anonymity/}}{{cite web|website=UWaterloo.ca|url=http://arts.uwaterloo.ca/~kwesthue/berman.htm|author=Westhues, Kenneth |title=Stephen Berman: Scapegoat|date= December 2006}} Another criticism is that ratings have been shown to reflect gender bias toward the professors evaluated.{{cite news|work=WNYC Morning Edition|date=February 23, 2015|title=How We Talk About Our Teachers|url=http://www.wnyc.org/story/how-we-talk-about-our-teachers/|author=Huntsberry, William}} Furthermore, critics contend that the components RMP asks about ("easiness", "clarity", and "helpfulness") do not constitute a well-designed evaluation instrument.{{cite journal|author=Lang, James M. |title=RateMyBuns.com| journal= Chronicle of Higher Education|date= December 1, 2003 |url=http://chronicle.com/jobs/2003/12/2003120101c.htm}}See Fritz Machlup and T. Wilson, cited in Paul Trout, "[http://mtprof.msun.edu/Fall1998/TroutArt.html Deconstructing an Evaluation Form]", The Montana Professor, Vol. 8 No. 3, Fall 1998, accessed 7 May 2008. Edward Nuhfer argues that both Pickaprof.com and RMP "are transparently obvious in their advocacy that describes a 'good teacher' as an easy grader. Additionally, presenter Phil Abrami ... rated RMP as 'The worst evaluation I've seen' during a panel discussion on student evaluations at the 2005 annual AERA meeting."Edward B. Nuhfer, 2005, "[http://www.isu.edu/ctl/faculty/docs/fractalThinker.pdf A Fractal Thinker Looks at Student Evaluations]", accessed 10 May 2008.

=== Multiple ratings per person ===

A single individual can post multiple separate ratings of the same professor on RMP.Gabriela Montell, "The Art of the Bogus Rating", Chronicle of Higher Education, September 27, 2006 [http://chronicle.com/article/The-Art-of-the-Bogus-Rating/46887/] RMP admitsPfeiffer, "Ratings sites flourish behind a veil of anonymity".{{Better citation needed|reason=The current source is insufficiently reliable (WP:NOTRS).|date=May 2023}} that while it does not allow such multiple ratings from any one IP address, it has no control over raters who use several different computers or who "spoof" IP addresses. There is also no way of knowing whether those who rate a professor's course have actually taken it, making it possible for professors to rate themselves and each other.Montell, "The Art of the Bogus Rating", Chronicle of Higher Education.

=== Rating relevancy ===

Critics have noted that a number of ratings focus on qualities they see as irrelevant to teaching, such as physical appearance.{{cite web |last1=Bates |first1=Laura |title=Female academics face huge sexist bias – no wonder there are so few of them |url=https://www.theguardian.com/lifeandstyle/womens-blog/2015/feb/13/female-academics-huge-sexist-bias-students |website=The Guardian |access-date=30 June 2018 |date=13 February 2015}} In late June 2018, several academics criticized the website's "hotness" score for contributing to sexism in academia.{{Cite web |last=Flaherty |first=Colleen |title=Bye, Bye, Chili Pepper |url=https://www.insidehighered.com/news/2018/07/02/rate-my-professors-ditches-its-chili-pepper-hotness-quotient |access-date=2024-02-29 |website=Inside Higher Ed |language=en}} On June 28, 2018, RateMyProfessors responded that while the feature was intended to "reflect a dynamic/exciting teaching style," it was often misused, and the hotness rating was removed immediately.{{cite web |last1=Dalbey |first1=Alex |title=Ratemyprofessors.com Ends Hotness Rating |url=https://www.dailydot.com/irl/rate-my-professors-hotness-rating/ |website=The Daily Dot |access-date=30 June 2018 |date=29 June 2018}}

RateMyProfessors lets students identify the course they took with a professor, but it combines ratings across all of a professor's courses rather than providing a separate average for each course.
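A brief numeric sketch of why pooling can mislead; the course names and scores below are hypothetical toy data, used only to show that a single combined average may describe none of the underlying courses well.

<syntaxhighlight lang="python">
# Hypothetical toy data: the same professor rated in two different courses.
ratings_by_course = {
    "Intro survey course": [4.8, 4.5, 4.9, 4.7],
    "Required methods course": [2.1, 2.4, 1.9],
}

# Per-course averages versus the single pooled average that combining across courses produces.
per_course = {course: round(sum(r) / len(r), 2) for course, r in ratings_by_course.items()}
all_ratings = [score for scores in ratings_by_course.values() for score in scores]
pooled = round(sum(all_ratings) / len(all_ratings), 2)

print(per_course)  # {'Intro survey course': 4.72, 'Required methods course': 2.13}
print(pooled)      # 3.61 -- a single figure that matches neither course
</syntaxhighlight>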

=== Permanent vs. adjunct faculty ===

Adjunct faculty are not always readily identifiable or verifiable, as such professors may work at multiple universities, change universities frequently, or maintain employment outside an academic setting.{{Cite news |last=Lumpkin |first=Lauren |date=2022-04-28 |title=In a city full of adjunct faculty members, many struggle to get by |url=https://www.washingtonpost.com/education/2022/04/26/adjunct-professor-american-georgetown-gwu/ |access-date=2024-02-29 |work=Washington Post |language=en-US |issn=0190-8286}}{{Cite web |date=2023-03-16 |title=Data Snapshot: Tenure and Contingency in US Higher Education |url=https://www.aaup.org/article/data-snapshot-tenure-and-contingency-us-higher-education |access-date=2024-02-29 |website=AAUP |language=en}}

== Data breach ==

On January 11, 2016, RMP notified its users via email (and with a small notification link on its website) that a decommissioned version of RMP's website suffered a data breach affecting email addresses, passwords, and registration dates.{{Cite web|title = RateMyProfessors.com – Find and rate your professor or campus.|url = http://www.ratemyprofessors.com/securityFAQs|website = www.ratemyprofessors.com|access-date = 2016-01-12}} According to the California Department of Justice website, the security breach occurred six weeks earlier on or about November 26, 2015.{{Cite web|url=https://oag.ca.gov/ecrime/databreach/reports/sb24-59576|title = Submitted Breach Notification Sample|date = 12 January 2016}}

== Website features ==

=== Professor Notes ===

{{Update|section|date=March 2016}}

After mtvU took over the website, a notes feature was added that allows professors to register with the site (using a ".edu" e-mail address) in order to reply to students' comments. Another option, called "Professors Strike Back", featured videos of professors responding to their ratings on RMP.{{Cite web|url=https://www.mtv.com|title=Reality TV Shows, Celebrity News, Pop Culture & Music Videos|website=MTV|access-date=28 April 2023}} In 2015, the site debuted a series, "Professors Read Their Ratings",{{Cite web|url=http://www.ratemyprofessors.com/blog/video/albion-college-professors-read-their-ratings-part-2/|title = Albion College Professors Read Their Ratings – Part 2}} in which professors read and react to their RMP ratings. Students may also submit videos to RMP.[http://www.ratemyprofessors.com/blog/video/submit-to-ratemyprofessors-com submit videos to RMP]

== Recognition ==

In 2008, Time magazine named RMP one of the 50 best websites of the year.

In 2008, student evaluations of professors from RMP accounted for 25% of a school's score in the Forbes annual "America's Best Colleges" ranking; Forbes has since dropped RMP data from its methodology.{{Cite web|url=https://www.forbes.com/sites/cartercoudriet/2017/08/02/top-colleges-2017-the-methodology/|title = Top Colleges 2017: The Methodology|website = Forbes}}

In 2015, the site won two People's Voice Webby Awards after an extensive site overhaul.{{Cite web|url=http://www.ratemyprofessors.com/blog/buzzpost/weve-won-two-peoples-voice-webby-awards/|title = We've Won Two People's Voice Webby Awards!}}

== Competitors ==

RMP has a number of competitors. RateMyTeachers, a similar teacher-rating site, was launched by Patrick Nagle in 2001.{{Cite web|url=http://www.kentwired.com/latest_updates/article_c208ccc8-e885-11e9-8535-2f09c86a4964.html|title = Rate My Professor aids in student choices}}

== References ==

{{reflist}}