By Ben Spaeth
Photo: Vox
Just about everyone has an opinion about any given piece of media that makes its way through theaters these days. When it comes down to whether you enjoyed the film or not, though, there are really only two camps someone can join. This is precisely how websites like Rotten Tomatoes work: they analyze a review and then determine whether it is positive or negative. The only issue with this process is that most critics don’t grade on a thumbs-up/thumbs-down scale. Usually they’ll give a letter grade, a rating out of 10, or a rating out of five stars. Some critics do have access to their Rotten Tomatoes reviews and can adjust them to properly reflect their full review, but there are still problems with this system. Take, for instance, critics who thought a film was just okay: not bad enough to warrant a rotten tomato, but not good enough to stand alongside the greatest films. The problem arises when a majority of critics agree that a movie was merely okay. Rotten Tomatoes then sees all of those reviews as positive, inflating the film’s score.
Evidence of this can be seen in most Marvel movie scores. Marvel films aren’t blowing critics out of the water with their mise-en-scène, but most entries in the franchise provide enough entertainment value to earn a decent review. The issue is that most Marvel movies do not deserve to sit consistently in the low to mid 90s on any review site that rates out of 100. Personally, I think that territory should be reserved for films that transform the medium, leave a lasting impact, or deserve award consideration.
The system isn’t just flawed from the critics’ perspective, though. The audience reviews on Rotten Tomatoes have been fairly busted for a while. Trolls have tried to “review bomb” films, meaning they spam negative reviews to purposefully lower a film’s audience score on Rotten Tomatoes. Some of the films that have been review bombed include The Last Jedi, Captain Marvel, and Black Panther. One can only imagine the angry geeks who created 15 different email addresses to knock the movie they don’t like because there’s a woman in it down a couple of percentage points on a tomato-themed movie review website. To combat review bombing, Rotten Tomatoes has put in a few safeguards. The first is that you can no longer review a film before it comes out, which makes sense and really only prevents people who attended a prescreening from reviewing. The other is that the primary audience score now displayed comes from verified ticket buyers. There’s still a separate section with all audience scores, but displaying the verified score first diminishes the reward for review bombing any particular film. However, this system doesn’t work for films on streaming services, as there’s currently no way to prove someone streamed a film.
Despite the system’s flaws, it is still an interesting metric for comparing how audiences and critics perceive a film. The two groups tend to have different biases when it comes to particular genres: action and comedy movies generally score higher with audiences than with critics, while critics give higher scores to artsier, slower-paced films than most general audience members do. Some films, though, have absurdly large gaps between their critic and audience scores. One reason for this could be that critic reviews are mostly written around the time the film comes out, whereas audience reviews can be entered at any point after release. Over time, the audience score tracks the continued perception of a film, while the critic score just captures what reviewers thought at the time.
With all this said, I’d like to take a look at some of the films with the largest differences between critic and audience scores. The first film I’d like to examine is Bright (2017). Bright was widely regarded as a hot mess of a film after its release. Critics called the film “flat and generic,” and it received an abysmal 27% critic score on Rotten Tomatoes. Despite this rating, the audience score for Bright sits at 83% across more than 10,000 reviews. This is one of the most drastic examples I could find, but other notable films that critics hated and audiences loved, according to Rotten Tomatoes, include Venom (30% vs 81%), We’re the Millers (49% vs 72%), Passengers (30% vs 63%), and The Greatest Showman (56% vs 86%).
There is also a plethora of films that critics love and audiences hate. I think this is an interesting group to look at, as it further isolates the difference in taste between critics and audiences. Some examples of these films include Indiana Jones and the Kingdom of the Crystal Skull (78% vs 53%), Us (93% vs 60%), Sausage Party (82% vs 50%), and Hail, Caesar! (86% vs 44%).
It’s hard to pinpoint any exact reason for the differences of opinion between audiences and critics. The films listed vary widely in concept, tone, and genre, making it hard to spot any particular cause for the divide other than subjective taste. For now, the mystery of why critics and audiences can be so incredibly divided about particular films remains unsolved.