# The rating algorithm for top 1000 voters should be revised

• Idea
• Updated 6 years ago
###### Archived and Closed

This conversation is no longer open for comments or replies and is no longer visible to community members. The community moderator provided the following reason for archiving: Old thread

The rating algorithm for top 1000 voters should be revised.

The user ratings from the top 1000 voters for the new movie "Rush", as of 27 September 2013, are as follows:

-----------------------------------------------------
56 Top 1000 voters have given a weighted average vote of 5.7 / 10

Demographic breakdowns are shown below.

| Votes | Percentage | Rating |
|------:|-----------:|-------:|
| 6 | 10.7% | 10 |
| 3 | 5.4% | 9 |
| 15 | 26.8% | 8 |
| 9 | 16.1% | 7 |
| 4 | 7.1% | 6 |
| 0 | 0.0% | 5 |
| 0 | 0.0% | 4 |
| 3 | 5.4% | 3 |
| 0 | 0.0% | 2 |
| 16 | 28.6% | 1 |

Arithmetic mean ≈ 6.0. Median = 7
-----------------------------------------------------

The weighted average would seem to be slightly less than 8, if one were to disregard the statistical outlier votes of 1 and 10.

Now, the top 1000 voters have extra importance in that their votes are shown separately in the breakdown of the general user ratings. The idea, I think, is that these top 1000 voters watch more movies, are therefore more knowledgeable, and are thus able to give a more trustworthy rating, i.e. one not wildly exaggerated either positively or negatively.

However, this does not seem to be the case. 28.6% of the 56 top 1000 voters gave the movie a rating of 1. This is just plain ridiculous. It would seem to be a case of casting votes without doing a proper evaluation. Perhaps they have not even seen the movie and just vote away to have many votes registered, or they want to influence the weighted average.

The 10.7% of votes with a rating of 10 are also outliers, but much less so. A deviation of 2 from the weighted average (not counting the outliers) is not uncommon and perhaps shows that in this case they just like the movie very much. Still, even the 10 votes should count for slightly less in calculating the weighted average.

Now, a rating of 1 is so far off the mark that I would suggest those votes simply be disregarded, and perhaps even that top 1000 voters who vote like this should be monitored: if a top 1000 voter has several or many such totally "off" votes, they should be disqualified as a top 1000 voter and their votes not counted in that category. They could still be counted in the general rating, just not among the top 1000 voters. This way the top 1000 voters category would be more meaningful.
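The monitoring rule suggested above could be sketched as follows. The thresholds and the data shapes are illustrative assumptions, not anything IMDb actually implements:

```python
from statistics import mean

# Illustrative thresholds (assumptions, not IMDb's values):
OFF_DEVIATION = 5   # a vote this far from the consensus counts as "off"
MAX_OFF_VOTES = 3   # this many "off" votes disqualifies the voter

def is_off_vote(vote: int, all_votes: list[int]) -> bool:
    """A vote is 'off' if it deviates wildly from the consensus
    computed without the outlier ratings of 1 and 10."""
    trimmed = [v for v in all_votes if v not in (1, 10)] or all_votes
    return abs(vote - mean(trimmed)) >= OFF_DEVIATION

def still_qualified(voter_history: list[tuple[int, list[int]]]) -> bool:
    """voter_history: one (this voter's rating, all top-1000 ratings
    for that film) pair per film the voter has rated."""
    off = sum(1 for vote, film_votes in voter_history
              if is_off_vote(vote, film_votes))
    return off < MAX_OFF_VOTES
```

With the "Rush" figures, a vote of 1 deviates by about 6.7 from the trimmed consensus and would be flagged, while a 10 deviates by only about 2.3 and would not, matching the point above that the 10s are much less of an outlier.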

Frank
• 2 Posts
• 0 Reply Likes

Posted 6 years ago

• 416 Posts
• 559 Reply Likes
I'm highly skeptical of the whole top 1000 voters system. I would imagine that many, if not most, of the top 1000 voters are vote spammers who often haven't even seen the film they've voted on. That they are given extra weight in the averages only reinforces their incentive to continue.

Every vote should count equally, in my opinion.

Emperor, Champion

• 6418 Posts
• 3004 Reply Likes
See my suggestion here with further discussion on the topic:

https://getsatisfaction.com/imdb/topi...
• 2 Posts
• 0 Reply Likes
That is a good suggestion as well. I think, though, that when it comes to the general rating, so many votes come in that these 1-bombers will not count for much. However, for movies with only a few ratings it may make a difference.

My concern with the top 1000 voters is that they have their own category in the rating breakdown, and so should be scrutinized more closely, unless they are replaced by a category of top / regular reviewers or regular voters as you suggest.

Emperor, Champion

• 6418 Posts
• 3004 Reply Likes
Worth noting that I believe the top voters weighting applies to more people than just the Top 1000 voters who get flagged up in the votes, which makes crunching the numbers a little trickier.

Basically, as it stands, fiddling with the weighting comes too late - the system is clearly flawed (and being abused), so we should either get rid of voter weighting completely or shift it to top reviewers, who have taken the time to publicly explain their choice of vote. The beauty of that is that it removes the main incentive for score bombing and reduces its effect.
• 17 Posts
• 3 Reply Likes
What is happening now is that every new release is quickly voted on by two to three dozen Top 1000 Voter accounts, and each of these accounts votes a score of 1 for every movie. Gravity now has 161 votes from Top 1000 Voters, and 37 of those 161 votes score Gravity a 1/10.

Look, if 37 of the Top 1000 Voters were each trying to stuff the ballot box in order to remain Top 1000 Voters, then why are all 37 giving every movie the exact same rating - a 1/10? One would expect some voters to give every movie a 5, a 6, or a 7. That would be far less conspicuous. But every single one is voting 1/10.

This is automated robotic software in action - each of the 37+ accounts casting automated votes is using the same software, because there is no other reason why they would all vote Gravity and other films a 1/10. The similarity of the voting makes it highly likely that these accounts are coordinated. This is not 37 idiots acting independently - this is 37 accounts using the same voting software or voting instructions.
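The per-account pattern described here - accounts whose every vote is a 1/10 - is straightforward to screen for. A minimal sketch, assuming only a list of an account's recent ratings (the function and thresholds are illustrative, not anything IMDb uses):

```python
def looks_automated(account_votes: list[int],
                    min_votes: int = 10,
                    ones_share: float = 0.9) -> bool:
    """Flag an account whose recent votes are almost all 1/10.

    Accounts with too few votes are treated as inconclusive
    rather than suspicious.
    """
    if len(account_votes) < min_votes:
        return False
    return account_votes.count(1) / len(account_votes) >= ones_share
```

Detecting coordination across accounts (the same films voted on at the same time) would additionally need timestamps and cross-account comparison, but even this per-account check would catch a voter who gives every movie a 1.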

As I have said elsewhere, if IMDb realizes that its Top 1000 Voter totals are being corrupted by automated voting software, then it has two choices: (1) analyze, detect, and block the use of that software, and/or exclude the votes from suspect accounts from the posted totals, or (2) stop reporting vote information for the Top 1000 Voter category. If these Top 1000 Voter totals are so unimportant that they are not worth safeguarding, then they are not worth reporting to the public in the first place.

No company - including IMDb - can maintain the trust of its user community if it fails to safeguard the integrity of the information being posted.
