You should ignore film ratings on IMDb and Rotten Tomatoes
Picking a film to watch is an emotional rollercoaster. First, you have to deal with the crushing knowledge that none of your streaming services of choice actually have the film you want to watch. Then you narrow the field down to three films that you never really intended to watch but are the only half-decent options available.
At this point, paralysed by the thought of making the wrong decisions in life, you will Google the ratings of these films to find out if they're worth your time. Three hours later – unable to make a decision because of the conflicting information – you realise that it's too late to start watching a film now anyway and settle down to watch old episodes of Parks and Rec.
But why do the big film-ranking sites come up with such radically different rankings? Is The Wizard of Oz really the best film of all time, or is it The Shawshank Redemption? Why does Metacritic think that Ratatouille is the twenty-third best film in the history of cinema?
To answer all these questions, let's take a look at how the three biggest film-ranking sites come up with their ratings, and why you should ignore them all.
Movie-rating: the methodology
On IMDb, all films are given an overall rating out of ten. In a roundabout way, these ratings are derived from votes submitted by IMDb users, not movie critics.
All registered IMDb users can submit a single rating – a number between one and ten – for any film on the website. These votes are then re-jigged so that certain demographics (newly-registered users, for example) don't disproportionately influence the overall ranking of the film. IMDb doesn't disclose how it re-jigs these votes, but what this means is that a film's ranking is not quite a straight average of all its user scores, though it's probably quite close.
Just to be extra helpful, IMDb's Top 250 films are ranked in a slightly different way. Only votes from 'regular IMDb voters' are used to make up these rankings. Helpfully, IMDb doesn't say what makes someone a regular IMDb voter.
In short: IMDb ratings are based on the votes of the website's users, with a little bit of mathematical re-jigging to stop certain groups disproportionately influencing the vote.
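IMDb doesn't disclose its current formula, but the Bayesian weighted average it historically published for its Top 250 gives a feel for the kind of re-jigging involved. Here's a minimal sketch – the `m` and `C` values are illustrative defaults, not IMDb's actual parameters:

```python
def weighted_rating(R, v, m=25_000, C=6.9):
    """Bayesian weighted average, in the style IMDb once published
    for its Top 250. Pulls a film's raw mean score R (from v votes)
    toward a site-wide mean C; the fewer the votes, the harder the
    pull. m is a vote-count threshold. m and C here are illustrative
    guesses, not IMDb's real (undisclosed) parameters."""
    return (v / (v + m)) * R + (m / (v + m)) * C

# A 9.0-rated film with only 500 votes gets dragged down near the
# site mean, while the same rating from 500,000 votes barely moves:
niche = weighted_rating(9.0, 500)        # ~6.9
blockbuster = weighted_rating(9.0, 500_000)  # ~8.9
```

The effect is that a cult film with a handful of enthusiastic voters can't leapfrog a widely-rated classic, which is exactly the "disproportionate influence" problem the re-jigging is meant to solve.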
This all sounds very egalitarian, but as we'll see, most IMDb voters are male, which seems to skew the rankings in favour of films that are aimed more towards men.
IMDb's top ten:
1. The Shawshank Redemption
2. The Godfather
3. The Godfather: Part II
4. The Dark Knight
5. 12 Angry Men
6. Schindler's List
7. Pulp Fiction
8. The Lord of the Rings: The Return of the King
9. The Good, the Bad and the Ugly
10. Fight Club
Rotten Tomatoes gives films a score out of 100, based on the percentage of professional film critics who reviewed the film positively. If a film gets a score of 60 or more it gets a 'fresh' red tomato on the site. Less than 60 and it gets a rotten tomato. The best films are picked out for a 'certified fresh' rating, which usually means the film has at least 80 critical reviews and a score of 75 or more. The website also separately ranks films by user scores, but let's not get distracted by that here.
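The fresh/rotten/certified-fresh labelling described above is simple enough to sketch in a few lines. Note that Rotten Tomatoes' real Certified Fresh rules have extra conditions (top-critic counts, different review thresholds for limited releases) that this toy version ignores:

```python
def tomato_label(score, n_reviews):
    """Rough sketch of the labels as described above: 75+ with at
    least 80 reviews is 'certified fresh', 60+ is 'fresh', below 60
    is 'rotten'. A simplification -- the site's actual Certified
    Fresh criteria include conditions not modelled here."""
    if score >= 75 and n_reviews >= 80:
        return "certified fresh"
    if score >= 60:
        return "fresh"
    return "rotten"

# The Wizard of Oz (99 from 111 reviews) comfortably qualifies:
tomato_label(99, 111)  # -> "certified fresh"
```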
For its main rankings, Rotten Tomatoes only takes into account reviews from approved critics and approved publications. To qualify as an approved critic, you have to write for a large or well-regarded website, magazine or newspaper.
But just to make things a little more complicated, Rotten Tomatoes also weights its rankings depending on how many reviews a film has. That's why The Wizard of Oz, with a score of 99 from 111 reviews, beats Citizen Kane, which has a score of 100 from 75 reviews, to the top spot.
In short: Rotten Tomatoes ranks selected critics' reviews, and tweaks the rankings to favour films with a large number of positive reviews.
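Rotten Tomatoes doesn't publish the formula behind its ranked lists, but a Bayesian-style adjustment – the same idea as IMDb's vote weighting, applied to review counts – reproduces the Wizard of Oz vs Citizen Kane flip. The `m` and `site_mean` values here are pure guesses for illustration:

```python
def adjusted_score(score, n_reviews, m=25, site_mean=80):
    """Hypothetical review-count adjustment: films with fewer reviews
    are pulled harder toward a site-wide mean. m and site_mean are
    illustrative guesses -- Rotten Tomatoes doesn't publish its
    actual ranking formula."""
    return (n_reviews * score + m * site_mean) / (n_reviews + m)

# The Wizard of Oz: 99 from 111 reviews. Citizen Kane: 100 from 75.
oz = adjusted_score(99, 111)    # ~95.5
kane = adjusted_score(100, 75)  # 95.0
```

Under this kind of weighting, Oz's 36 extra reviews outweigh Kane's one-point lead in raw score, which is exactly the behaviour the article describes.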
And, you guessed it, most of Rotten Tomatoes' selected critics are men.
Rotten Tomatoes' top ten:
1. The Wizard of Oz
2. Citizen Kane
3. The Third Man
4. Get Out
5. Mad Max: Fury Road
6. The Cabinet of Dr. Caligari (Das Cabinet des Dr. Caligari)
7. All About Eve
8. Inside Out
9. Metropolis
10. The Godfather
Metacritic also gives films a score out of 100, based on published critics' reviews. The site converts letter or number scores from reviews into a score out of 100 and then weights those scores so that some reviews influence the score a little more than others do.
The website doesn't publish a list of its featured critics, but it does publish a list of the publications it aggregates scores from. This list is updated on a regular basis, but Metacritic doesn't say why it picks some websites and ignores others.
Unlike Rotten Tomatoes, Metacritic seems to calculate its rankings from fairly small numbers of critics' reviews, so its scores are more prone to strange fluctuations.
In short: Metacritic works a bit like Rotten Tomatoes, but with fewer reviews.
Metacritic seems to place a bit more emphasis on publishers than on individual critics, so it's hard to get an idea of what the gender balance of reviewers is. Its top-ranked film – Citizen Kane – is based on reviews from only two women and ten men, though.
Metacritic's top ten:
1. Citizen Kane
2. The Godfather
3. Casablanca
4. Boyhood
5. Three Colors: Red
6. Singin' in the Rain
7. Moonlight
8. Pan's Labyrinth
9. Hoop Dreams
10. My Left Foot
Why you should ignore all movie-ranking sites
Don't be tricked into thinking that movie-ranking sites give some kind of objective rating of how good a film is. All three of the above sites are skewed pretty heavily towards the opinions of men.
Take IMDb's top-ranked film, for example – The Shawshank Redemption. Its score of 9.3 is based on the votes of around 1.86 million IMDb users, 1.2 million of whom are men. IMDb does tweak its rankings to lessen the influence of particular demographics, but men often make up over 70 per cent of the voters for any film.
And it turns out that men tend to look much more favourably on films with more masculine themes, or male leading actors.
A look at the ratings for Sex and the City demonstrates how divided the voting audience on IMDb is. Over 29,000 men gave the film an average rating of 5.8, while 43,000 women came up with a score of 8.1. A straight-up averaging of the scores gives it a ranking of 7.4, but IMDb's maths leaves it with a final score of 7.
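You can check the blend yourself. Using the rounded vote counts quoted above, and looking only at these two subgroups (IMDb also counts voters who don't disclose a gender, which is one reason the site's own figures differ), the per-vote average lands between the two camps:

```python
# Rounded figures quoted above; other demographics (e.g. voters who
# don't disclose a gender) are deliberately left out of this sketch.
men_votes, men_avg = 29_000, 5.8
women_votes, women_avg = 43_000, 8.1

# Per-vote mean over just these two subgroups:
blended = (men_votes * men_avg + women_votes * women_avg) \
          / (men_votes + women_votes)
print(round(blended, 1))  # -> 7.2
```

Whatever the exact blend, IMDb's undisclosed weighting then knocks the published score down to 7 – below what either a simple or a vote-weighted average would give.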
IMDb breaks down the voting demographics for all of its films. Take a flick through them and you\u2019ll see that men consistently rank masculine films higher than films that feature female leads or more traditionally female themes.
Rotten Tomatoes doesn't come off much better. In 2015, Meryl Streep attacked the website for featuring way more male critics. Back then, there were 168 female critics on the website's approved list, and 760 men. A 2016 study from San Diego State University found that only 27 per cent of 'top critics' on the site were women.
There aren't comprehensive breakdowns of the gender balance on Metacritic, but since it shares many of the same sources as Rotten Tomatoes, it's likely that the website suffers from a similar degree of bias.
Just pick a film already
If you came here hoping for a verdict on which site should reign supreme in the movie-ranking stakes, then you must be bitterly disappointed. Really, it boils down to this: if you want to know which movies men on the internet tend to like, look on IMDb. If you're looking for critics' favourites, go for Rotten Tomatoes. If you want a slightly worse version of Rotten Tomatoes, opt for Metacritic.
Or just watch Inside Out now because it\u2019s lovely and heartwarming and you are clearly incapable of making a decision independently.
This article was originally published by WIRED UK