Films should not be penalized for getting 10/10 ratings - A new look at why not.

  • Question
  • Updated 6 years ago
  • Answered
In an effort to understand IMDb's rating system, I have looked through all of the threads on score bombing and related issues, and I have taken the time to examine how scores are calculated. After digesting everything, I feel there is a problem with IMDb's rating system. Here is why.

I have a film which has done very well as a niche film in the film festival circuit, but it has a very poor rating on IMDB due to the fact that many of the people who watched it rated it a 10/10. It has won numerous prestigious awards and has a very passionate following, although of course it is a niche following and definitely not a mainstream film. It is not comparable to a mainstream film such as Avatar or The Matrix, but within its own spirit-science documentary genre it is at the top of its class.

So here is the issue: my film got hit with three scores of 1 (all within a day) and its rating has dropped dramatically, even though it has 153 scores of 10/10, making it seem like the film is worthless. A film with a passionate fan base gets unfairly penalized in the current system and can be taken down by a few individuals (or even a single one) determined to give the film a bad rating. The higher the true fans rate it, the less their votes count. This has several overall effects:

1) Because the system assumes that ANY film that gets rated 10/10 is fraudulent or the result of score-tampering, films with legitimate high scores are penalized, resulting in only mediocre films succeeding in the system.

2) It keeps good niche films from gaining a wider audience.

I know the weighting has been a contentious issue and that the concerns of film-makers have been responded to over and over. I hope I have presented the problem in a clear light and that someone will take action to correct the scoring system so that it does not unfairly prejudice films that fans rate as exceptional.
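For context, IMDb does not disclose the exact weighting it applies to individual title ratings, but the formula it publishes for the Top 250 — a Bayesian estimate that shrinks a film's mean toward a site-wide prior — illustrates how vote counts interact with the score. This is a sketch only: the prior mean and minimum-vote threshold below are illustrative assumptions, not IMDb's actual parameters (only the 153 tens and three 1s come from my film):

```python
# Sketch of a Bayesian weighted rating in the spirit of the published
# Top 250 formula: WR = (v/(v+m))*R + (m/(v+m))*C
#   v = number of votes, R = mean vote for the film,
#   m = minimum-votes parameter, C = site-wide mean (both assumed here).
def weighted_rating(ratings, prior_mean=6.9, prior_votes=25):
    v = len(ratings)
    r = sum(ratings) / v
    return (v / (v + prior_votes)) * r + (prior_votes / (v + prior_votes)) * prior_mean

fan_votes = [10] * 153            # 153 tens from the film's fan base
bombed = fan_votes + [1] * 3      # plus three 1-votes in a single day

print(round(weighted_rating(fan_votes), 2))  # → 9.56
print(round(weighted_rating(bombed), 2))     # → 9.42
```

Note that under plain shrinkage like this, three 1-votes barely move the score; a drop as dramatic as the one I observed would have to come from IMDb's undisclosed filters discounting or discarding the block of 10/10 votes.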

Thanks for considering this post.

Dan Schmidt

  • 4 Posts
  • 0 Reply Likes

Posted 6 years ago


Dan Dassow, Champion

  • 13457 Posts
  • 13801 Reply Likes
Dan,

I am sympathetic to your problem.

The IMDb ratings are nothing more and nothing less than the aggregate ratings of people who have accounts on IMDb. They are probably not representative of the views of the population as a whole, nor do they reflect any serious critical assessment of the quality of a film. They are at best a qualitative measure of populist film preferences and at worst entirely misleading. It is highly inappropriate to use the IMDb ratings as a means to assess independent or hobbyist films. The “They Shoot Pictures, Don't They?” website (http://www.theyshootpictures.com/gf10...) would be more appropriate.

Consider these inherent shortcomings:
1. The IMDb poll is a self-selected survey. Self-selected surveys are inherently flawed and statistically unsound.
2. Not everyone provides their age and gender when registering for an account, and no one can say with certainty that people provide correct demographic information.
3. Even assuming that people provide accurate demographic information, the IMDb membership is not representative of the general population, and may not be representative of the film-going population. People with accounts on IMDb are predominantly males between the ages of 18 and 29 who live in the United States.
4. Comparing the ratings of any two films is at best problematic, since the populations rating each film may have very little overlap. Because this is a self-selected survey, sampling methodologies cannot be used to make that comparison.
5. There are significantly more votes for recent and heavily publicized films.
6. Initial ratings for recently released films tend to be much higher, reflecting the perspective of the film's fan base, and then decline over time. For instance, Avatar had a rating of 9.03, a weighted rating of 8.95 and a rank of 21 on 21 December 2009. It now has a rating of 7.98, a weighted rating of 7.92 and is no longer in the IMDb Top 250.
7. The methods that IMDb uses to reduce voting fraud generally do not work well for films with few votes.

Dan Dassow, Champion

  • 13457 Posts
  • 13801 Reply Likes
Dan,

You should post your suggestion to use pattern recognition to root out fraudulent scores as a separate suggestion. Click fraud also applies to attempts to inflate STARmeter ratings.

On the other hand, IMDb may already be using IP and other kinds of filtering to reduce the problem. IMDb has indicated that it does not disclose the methods it uses to make it more difficult to game the system.

On a separate note, I find the IMDb Top 250 and ratings interesting, but understand their limitations. Unfortunately, many users of IMDb do not.

Dan Schmidt

  • 4 Posts
  • 0 Reply Likes
I do find IMDb's ratings interesting as well, and generally I find them a great resource. Overall I think their system is doing a pretty good job. But for the smaller niche films that are more susceptible to manipulation, I think they could do better and create a system in which good films can thrive and poor ones get weeded out.

I can say for sure that in my film's case the system did not recognize a very obvious attempt to lower the rating (a number of 1 ratings all on the same day). This indicates the system needs a tweak.
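To make the pattern-recognition idea concrete, here is a minimal sketch of the kind of check I have in mind: flagging any day on which an unusual number of minimum scores arrive at once. The burst threshold and the vote log below are made-up illustrations, not anything IMDb actually uses:

```python
from collections import Counter
from datetime import date

def flag_burst_days(votes, low_score=1, burst_size=3):
    """Flag days on which `burst_size` or more votes at or below
    `low_score` arrive -- a crude signal of coordinated down-voting."""
    low_per_day = Counter(day for day, score in votes if score <= low_score)
    return sorted(day for day, count in low_per_day.items() if count >= burst_size)

# Hypothetical vote log: mostly 10s, then three 1s on a single day.
votes = [
    (date(2014, 5, 1), 10),
    (date(2014, 5, 2), 10),
    (date(2014, 5, 3), 1),
    (date(2014, 5, 3), 1),
    (date(2014, 5, 3), 1),
]
print(flag_burst_days(votes))  # → [datetime.date(2014, 5, 3)]
```

A real system would of course combine this with account age, IP clustering and similar signals, but even a per-day count would have caught the pattern I described.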

Thanks for your comments and feedback on this subject. I will do as you suggest re: posting.

This conversation is no longer open for comments or replies.