How to report extreme shilling?

  • 1
  • Problem
  • Updated 4 months ago
  • In Progress
It's fairly obvious where we're supposed to report issues we may have with a user review, but how do we report movies that have obvious and heinous shill activity that has gone far past unconscionable?

SCA

  • 10 Posts
  • 9 Reply Likes
  • Angry and frustrated.

Posted 5 months ago

  • 1

Ed Jones (XLIX)

  • 14433 Posts
  • 16459 Reply Likes
What Movie?

SCA

  • 10 Posts
  • 9 Reply Likes
1. Every negative review, no matter how reasonable, gets absolutely pummeled. Incredibly so.
2. Every positive review, even if it's just "Derp... It made me so happy I drooled", gets wild "helpful" acclaim. Also incredibly so.
3. A large number of the positive reviews are one-review wonders and have that sort of phraseology that has slimy shill written all over it.

Joel, Employee

  • 918 Posts
  • 1094 Reply Likes
Hi SCA,

Thanks for your message.

I've cut a ticket to the necessary team to investigate this.

Have a great day.

Joel 

SCA

  • 10 Posts
  • 9 Reply Likes
Thank you, Joel.

Nick Burfle

  • 97 Posts
  • 125 Reply Likes
Interesting data looking at those reviewers.  Of the 25 that show up on the first page, 10 are substantial, 2 are marginal, and 13 are one- or two-liners.  

Of the 15 non-substantial ones, 4 reviewers have been members for varying times: 2mo, 5mo, 1yr, 9yrs, and have multiple reviews or ratings.  One reviewer has been a member for 1 month; we all have to start somewhere.

Ten (!) have all been members for 6 years.  Not 5, not 7, all 6s.  One has 2 reviews, nine of them have one review.  It certainly looks fishy to me.

But the helpfuls and not helpfuls... I suspect that's largely just how ordinary viewers vote.

SCA

  • 10 Posts
  • 9 Reply Likes
"But the helpfuls and not helpfuls... I suspect that's largely just how ordinary viewers vote."

Really? Well, just for grins and giggles, let's see what sorts of patterns I see.

Of the top 25 user reviews, ranked by "Review Rating":

13 are one-review wonders
4 have 2 or 3 reviews, and all (or 2 of the 3) of the movies they reviewed are by Bhandal
6 are probably legitimate
2 are possibly legitimate

By my math, only 8 of the top 25 user reviews may be legitimate; the other 17 almost certainly are not.
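
If anyone wants to double-check my arithmetic, here is a quick sketch in plain Python, using only the counts from the list above (nothing here is pulled from IMDb itself):

    # Tally of the top-25 breakdown quoted above; counts copied by hand.
    breakdown = {
        "one-review wonders": 13,
        "2-3 reviews, mostly Bhandal titles": 4,
        "probably legitimate": 6,
        "possibly legitimate": 2,
    }
    suspect = breakdown["one-review wonders"] + breakdown["2-3 reviews, mostly Bhandal titles"]
    maybe_legit = breakdown["probably legitimate"] + breakdown["possibly legitimate"]
    print(f"suspect: {suspect}/25, maybe legitimate: {maybe_legit}/25")
    # -> suspect: 17/25, maybe legitimate: 8/25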

Because, you know, that's just "how ordinary viewers vote". Sure.

Of the total of 68 reviews that were extant at the time I wrote this, 30 ranked the movie at 4 stars or below, with overtly negative comments and review titles. Magically, these reviews have helpful ratings like 4 of 154, 13 of 156, 12 of 147, 16 of 165, 13 of 165, 12 of 162, 14 of 165, 13 of 157, 23 of 176 and so on. So, at a glance, that means helpful ratings ranging from roughly 3% to 13%.
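
For anyone who wants to redo those percentages, here is a minimal sketch using just the helpful/total pairs quoted above (again, typed in by hand, not pulled live from IMDb):

    # Helpful ratios for the negative reviews listed above.
    helpful_pairs = [(4, 154), (13, 156), (12, 147), (16, 165), (13, 165),
                     (12, 162), (14, 165), (13, 157), (23, 176)]
    for helpful, total in helpful_pairs:
        print(f"{helpful:>2} of {total} -> {100 * helpful / total:.1f}% helpful")
    # The ratios run from about 2.6% up to about 13%.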

Of the 38 or so that are ranked above that, THEY typically have helpful ratings hovering at about 90% and above, give or take. And sometimes way above that. In other words, the people actually WRITING reviews here are biased AGAINST the movie, ESPECIALLY if you subtract the pretty obvious 17 shills in the first 25 at the top. I don't know about you, but in my general experience the averages of the "helpful" button clickers and the averages of the review writers are never this staggeringly far apart.

Even an utterly useless positive review showing 10 out of 10, with the text "Good...I was sad when lily died...ending was happy :)....anyways was nice and I liked it", gets a helpful rating of 139 of 150, or about 93%.

Because, you know, that's just "how ordinary viewers vote".

Sure.

>:(


Nick Burfle

  • 97 Posts
  • 125 Reply Likes
As I previously agreed, you're right, some of the high reviews are obviously coordinated. We can agree to differ on how many. But everyone is a "one-review wonder" when they write their first review... mine was a 1-star review (my only one), which no doubt looked suspicious for a couple of weeks. It took intense dissatisfaction to get me off my butt to write one.

My point about up and down votes, I stand by. What I see on IMDb is that a great many people upvote a review as "helpful" -- even if it's five words -- if they agree with the rating, and vice versa. That's not shilling. Granted, it's also not serving the intended purpose... but I think it's mostly just human nature.

Prediction: the next time some new guy is chosen to play Batman, a certain percentage of folks who care (not me) will be outraged by the choice, and thousands will flame his first Batman film in the ratings, upvote the 1-star reviews and downvote the 10s, regardless of the quality of the film.  Are they being completely unhelpful? Yep.  Shills, in coordinated action?  I doubt it... because yes, that is just how people vote.

If IMDb can remove and block the coordinated reviews, I think they'll be doing well, and you have done a service by bringing it to their attention.  Thank you for that.  'nuf said.

Ed Jones (XLIX)

  • 14433 Posts
  • 16459 Reply Likes
Tell you what my theory is on the short reviews: the text-gens.
Make reviews a minimum of 250 words.
See what happens.
The reviews done by farms will stick out like a sore thumb!
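
Roughly, the filter would look something like this; the review texts below are made up for illustration, and 250 words is just my suggested cutoff:

    # Flag reviews that fall under a minimum word count (sketch only).
    MIN_WORDS = 250

    def short_reviews(reviews):
        """Return the reviews whose text is under MIN_WORDS words."""
        return [r for r in reviews if len(r["text"].split()) < MIN_WORDS]

    # Hypothetical data standing in for real IMDb reviews.
    reviews = [
        {"user": "one_review_wonder", "text": "Good... ending was happy :) ... I liked it"},
        {"user": "regular_reviewer", "text": "a longer substantive review " * 100},
    ]
    for r in short_reviews(reviews):
        print("under the limit:", r["user"])
    # -> under the limit: one_review_wonder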