Why don't reviews and star-ratings sync up?

  • 8
  • Question
  • Updated 4 years ago
  • Answered
My only problem with AllMusic has never been one of functionality (it has always been, IMO, very user-friendly), but rather with the consistency between reviews and star-ratings. It's all too common to see a review with a phrase like "not only the band's greatest album, but one of the greatest albums of the era," accompanied by a 3-star rating. Then the following album will often be described as "an extreme let-down," showing "signs that the band was beginning to lose focus and run out of steam," accompanied by a four-star rating. What gives here? All I can think is that the reviews are written by individuals, while the star-ratings are assigned by committee. But regardless of the reason(s), it makes for some pretty glaring inconsistencies. Any way you guys could fix that?
Mark Weisinger

  • 31 Posts
  • 7 Reply Likes
  • frustrated

Posted 7 years ago

  • 8
johnbush360

  • 136 Posts
  • 20 Reply Likes
Thanks for the feedback and the concern, but we don't assign our ratings by committee -- they are produced by the writers themselves. I'd like to know the specific albums for any of the extreme examples you mention above, but most of the issues you see are due to our ratings system, which treats artists and genres on their own (i.e. we don't judge Britney Spears according to Leonard Cohen's standards, which are neither better nor worse but entirely different).
paulkent4

  • 0 Posts
  • 0 Reply Likes
Same question here. Item: Cyril Scott's 'Symphony No. 3, etc.' on the Chandos label gets a crap review but a 4.5-star rating.
P.S. I think your reviewer of 19th- and 20th-century English composers should pass the CDs on to someone else. I actually agree with the Penguin & BBC Music Mag high ratings for the Scott/Brabbins set.
Mark Weisinger

  • 31 Posts
  • 7 Reply Likes
If the "This Is 40" Soundtrack is "funny, perceptive, and moving, the best of its breed," then why does it get 3 1/2 stars? Is it just that the breed itself ("upwardly mobile indie refugees") is inherently unworthy or mediocre? If so, I would say that such genre contempt is atypical of the typically objective reviews I associate with this site. Just wondering, because it looks like this plays into a long-standing problem with this site having star-ratings that don't match their reviews. Just wondering what you were thinking here.
johnbush360

  • 136 Posts
  • 20 Reply Likes
Hey Mark,

I asked Stephen Thomas Erlewine about it, here's what he wrote:

'The writing was perhaps a bit too enthusiastic—it’s a very good album but it really doesn’t deserve more than 3 1⁄2 (which is a pretty good rating!)'
Mark Weisinger

  • 31 Posts
  • 7 Reply Likes
Fair enough, although that's essentially the crux of my criticism - the tone of the review in question (in this case This Is 40) doesn't match the star rating. If something is truly the best of its breed, it would seem to me that it would deserve better than 3 1/2 stars, which is an above average rating, but just barely (I would assume, on a 5-point scale, that 3 is average). Also, just for the record, the song "Watch the Moon Come Down" is not a new Graham Parker & the Rumour song - it's from the 1977 album Stick to Me. The new Graham Parker song is "What Do You Like?" (with the Punch Brothers).
Patsy Morita

  • 0 Posts
  • 0 Reply Likes
For classical albums, the rating is based more broadly on the quality of the music in the context of similar music, the composer's whole output and his/her goals for the particular works, the album's program, and the quality of the performance. In this specific instance, the review was clearly assigned the wrong rating, and it has been changed to more accurately reflect the review.
rootsmusic

  • 500 Posts
  • 3 Reply Likes
Patsy, your Senior Editor should expand the answer in the FAQ at http://www.allmusic.com/faq#ratings to explain AMG's rating guidelines in more detail. By the way, see https://getsatisfaction.com/allmusic/...
Mark Weisinger

  • 31 Posts
  • 7 Reply Likes
You know, I shouldn't really need to do this since I am a fan of the site who is trying to help you improve its quality (you know, the purpose of a feedback/comments section), but fine; here's an example: Several years ago, I emailed allmusic to complain that the review of Fleetwood Mac's Tango In the Night didn't match its star-rating. In fact, the last sentence of the review reads, in part, "Buckingham...consistently brings out the best in his colleagues on this superb album." If Buckingham consistently brings out the best in his colleagues, and this is indeed a superb album (not good or even great, mind you, but superb), then how does that translate to a 3-star rating? In what universe is 3-stars superb? I didn't get an answer to that email several years ago, so maybe I can get a real answer now.

Now, I'm not trying to be a jerk here; honestly, I'm trying to help. But it is extremely frustrating for a legitimate complaint (one that several users on this thread have reinforced) to be dismissed by a defensive, often vaguely condescending reply from an "employee" aimed at damage control rather than at actually solving the problem. If the policy is indeed to convince the user that they're imagining a problem rather than to attempt to solve it, then what is the point in even having a comments/suggestions section like this?

I love this site; I do. I come here daily (and have for over a decade) to read reviews of albums I'm considering buying, and I defer to the opinions on this site, more often than not, when I'm in doubt about picking up a vinyl record or CD. So, believe me when I say that I'm not trying to be a jerk, but I do want my question legitimately addressed, if in no other way than for someone besides "an employee" to say something along the lines of "several people have mentioned this problem, and we are currently working to address a possible solution," rather than insulting my intelligence with a pat answer that in no way addresses my complaint or insinuating that I must be delusional about the existence of such a problem in the first place.

So, here's what I plan to do: Whenever I happen to find another example of how the problem in question manifests itself in your site, I will post a brief summary of the album in question and how it demonstrates this problem. I don't know how often that will be (Every day? Every month? Who knows). But I will continue to do so until someone gives me an honest and legitimate answer to my question.
johnbush360

  • 136 Posts
  • 20 Reply Likes
Hi Mark,

I'm sorry you're so dissatisfied with the responses you're getting, and I hope I can help. Although I don't think it particularly matters, if it helps I can tell you that I'm not just "an employee" but the Senior Managing Editor for Pop Music.

First, if any of our comments came across as condescending, I apologize. That was certainly not the intent. I'm afraid we'll have to disagree when you say we were only doing damage control. We can only give you our interpretation of your issue; we can't force you to trust us that our responses are honest or legitimate.

Specific to your Fleetwood Mac comments: Tango in the Night can be a superb Fleetwood Mac album and still receive only 3 stars, for the same reason that Beatles for Sale can be the most uneven Beatles album and still get 5 stars -- we judge different artists and different artists' time periods differently, and we recognize that many qualities go into how most people rate music (and thus produce the truest ratings). If you like Rumours and Tusk, will you think "the superb" Tango in the Night deserves 5 stars, or even 4 stars? We say no, because that's our opinion. Others can say yes, and that's fine, since that's their opinion.

Back to generalities now: We have indeed seen reviews and ratings that don't match, and I'd encourage you to look at my comments below for more information.
https://getsatisfaction.com/allmusic/...

Again, I hope this helps.
rootsmusic

  • 500 Posts
  • 3 Reply Likes
John, please elaborate on what you meant by rating "different artists' time periods differently", because this wasn't explained in the FAQ. See also my question for Zac at https://getsatisfaction.com/allmusic/...
rootsmusic

  • 500 Posts
  • 3 Reply Likes
John, please elaborate on what you meant by rating "different artists' time periods differently" because the FAQ didn't explain this. Are "time periods" referring to an artist's "artistic maturity"?
Mark Weisinger

  • 31 Posts
  • 7 Reply Likes
In answer to your question, specific to Fleetwood Mac, yes, I like Rumours and Tusk (as well as the 1975 S/T), but I'm not really all that crazy about, say, Mirage (or Fleetwood Mac Live, for that matter). Neither are you, apparently, as you give Mirage a mediocre review and a 3-star rating. Tango In the Night, on the other hand, is, in my opinion, a distinctly better album than Mirage, an opinion which it seems, from the reviews of both albums, that AMG holds as well.

All I'm saying (and again, I'm not comparing the Beatles to Fleetwood Mac, so I don't know why that whole "apples to oranges," "Leonard Cohen to Britney Spears" argument keeps coming up) is that if the review of Mirage states that the album "suffers from a lack of substance" and isn't as "compelling" as their best work, while Tango In the Night is a "superb" album and "a creative high note," then shouldn't Tango get a higher rating than Mirage?

I mean, not only is the word "superb" a term that empirically means "above average" (see "magnificent"), and not only is a 3-star rating, by definition, average, but if an album is a "creative high note" for a band, then that means that, relative to that band's other work (again, comparisons to other artists notwithstanding), it stands out as a better-than-average album for that band, deserving of an appropriately higher-than-average rating and a higher rating than another, less compelling album. Maybe this was one of those cases where the reviewer got "over-excited and attempt[ed] to give five stars to an album that hasn't quite proved its classic mettle just yet" and was overruled.

And, as far as that goes, I'm not suggesting that you give Tango a 5-star rating; I think that would be excessive. But, under the circumstances, I believe a 4-star rating would be more appropriate to the review. Either that, or at least dock Mirage a star or a half-star to address the discrepancies in the reviews (although that still leaves the problem of superlatives like "superb" and "creative high note" attached to an average rating). Either way, as it stands, I just see an inconsistency between the reviews and the ratings. That's all. JMO.

Again, though, this is not about Fleetwood Mac; lengthy digression aside, that was merely an example (since you asked for one). It's about a syndrome that I see as troubling within your site. I'm glad that you have "indeed seen reviews and ratings that don't match," but in your responses to myself, Robert Jaz, paulkent4 and rootsmusic, you've still tried to downplay the significance of the problem by implying that such inconsistencies are relatively isolated incidents, when, in fact, they recur somewhat frequently. Still, as long as you're aware of the problem, and you're working to fix it, that's what's most important.
johnbush360

  • 136 Posts
  • 20 Reply Likes
Thanks again, Mark -- I think we're definitely on the same page. I'm sorry if you think I've tried to downplay the significance of the problem; it's just that we often hear from people who aren't familiar with our ratings system and are talking about perceived inconsistencies rather than what we would call real inconsistencies.

I'll definitely have Tom (Stephen Thomas Erlewine), our chief Fleetwood Mac fan, check out the ratings of Tango in the Night vs. Mirage. (It becomes a very compelling argument when you're talking about inconsistencies within the same artist discography and the same relative time period.) It's possible that at one point we gave Alex Henderson the assignment to re-review Tango in the Night, but didn't change the old rating to reflect his new review. This in fact might be the chief culprit of ratings/review inconsistencies: failing to catch that a reviewer's reappraisal of an album should prompt a new look at the rating.

Hope this helps, and I'll let you know what we decide to do with Tango in the Night and Mirage.
johnbush360

  • 136 Posts
  • 20 Reply Likes
I talked to Tom yesterday and he says he agrees that Tango should be boosted, to a new rating of 3-1/2 stars. We just made the change, which should propagate to the website by early next week. Thanks, good catch!
Zac Johnson, Official Rep

  • 3288 Posts
  • 159 Reply Likes
To John's point:
It's possible that at one point we gave Alex Henderson the assignment to re-review Tango in the Night, but didn't change the old rating to reflect his new review. This in fact might be the chief culprit of ratings/review inconsistencies: failing to catch that a reviewer's reappraisal of an album should prompt a new look at the rating.

This is certainly a possibility.

Back in the early days of publishing the All Music Guide books, there were handfuls of reviews that were just one line like "Great live set" or "Hot trio."

Here's one that still exists in the wild!
http://www.allmusic.com/album/MW00000...

When we can, our editorial team combs back through to revisit older reviews that are not up to snuff. If that task ever fell to one of our freelance writers, the older rating might not be associated properly with the updated review.
rootsmusic

  • 500 Posts
  • 3 Reply Likes
Zac, some albums still have "reviews" that are shorter than a complete sentence. Other reviews at least run a couple of sentences, but not all of them actually express a critical opinion. For example, the review at http://www.allmusic.com/album/the-las... argues: "The Last Real Texas Blues Band is a nearly perfect roots record, boasting both a stellar set of songs and exciting, unpredictable performances that make it arguably his best record ever". To me, that sentence is too brief to fully support a superlative like "a nearly perfect roots record", and yet its rating is merely 3 stars?
Andy DeNardi

  • 231 Posts
  • 27 Reply Likes
"Arguably his best record ever", yet two records also also have 3 stars and NINE are rated higher (under his main entry)! How is it possible that a p[ossible best album is ranked at the bottom?

There's also the problem that rootsmusic's example is a standalone entry under Last Texas Blues Band. Sahm's main entry doesn't list it, but it does have a different Last Texas Blues Band album. That's another issue that I'll take up some other time.
johnbush360

  • 136 Posts
  • 20 Reply Likes
I'll have Tom Erlewine take a look at this -- he's the Doug Sahm authority around here, and it definitely looks like some adjusting needs to be done. I can tell at a glance that the Last Texas Blues Band review is an old one.
TalkingBook

  • 0 Posts
  • 0 Reply Likes
I came across this topic while trying to post the same question.

I understand that maintaining a large database of reviews and keeping them up to date is a time- and resource-intensive process. I've run into more than a few cases of rating/review disconnect, and have usually chalked them up to this fact. However, even given the rationale provided in this thread, I'm still confused as to how the following case could come about:

Fripp & Eno's compilation "The Essential Fripp & Eno" is designated as an 'album pick', and receives a rating of 4 1/2 stars; however, reviewer Greg Prato has little more to say about the album than that "the material is rather boring", going on to say that it consists largely of "characterless instrumentals" with "nothing really going on".

It's difficult to believe that the rater and the reviewer are the same person in this case. If this is indeed the result of an old rating not being adjusted to fit a new review, one has to wonder whether any quality control is carried out with regard to new reviews. Surely they are read by somebody before being posted to the site, so wouldn't it make sense to have this person glance at the rating as well to ensure some sort of consistency? As it stands, the disconnect is jarring and confusing.

(As an aside, which I don't expect to be answered here: I strongly disagree with the review I've cited here. Normally, this wouldn't be an issue - there's no accounting for taste. However, the review is also severely lacking in information and detail. The reviews for every other album by Fripp & Eno include information about the unique recording technology employed by the duo, as well as details about specific tracks. This level of detail is entirely absent in Prato's review, which simply sums up the entire album as "rather boring" "electronic instrumentals" which "go on far too long". Compare this to Ted Mills' mixed review of Fripp & Eno's No Pussyfooting, which goes into detail about both the strengths and weaknesses of the album and addresses individual tracks, and thus delivers the kind of information and detail which I've learned to appreciate and expect from AMG.)
johnbush360

  • 136 Posts
  • 20 Reply Likes
Apologies for not responding to this one earlier -- this is a stellar example of something that needs to be replaced.

Nothing against Greg (the original writer) or our assignment editors, but it's a little mystifying why he would be assigned to review music that's obviously quite far afield from his usual area.

I talked to Thom Jurek yesterday and he leaped at the chance to re-review this (those records made a big impact on him).

Look for the new piece to be up within a week to ten days.

Best,
John Bush
AMG Senior Managing Editor, Pop Music
TalkingBook

  • 0 Posts
  • 0 Reply Likes
That's great to hear - better late than never.

Incidentally, I just checked about a week ago to see if Mr. Prato's review was still in place. Even though I'm still a bit confused on how such a blatant discrepancy can occur in the first place, I think it's great that you are acknowledging and attempting to fix the problem. This is much more than I can say for many other sites and services I use.

Thanks.
louisvalentinejohnson

  • 0 Posts
  • 0 Reply Likes
I am requesting that you remove and stop selling my recordings as they all appear to be given a rating of two stars by your reviewers.
It does not seem right that you should be selling an artist's work that is hung with a label of low quality.
Thank You for taking care of this.
Lou V. Johnson
Dawn G.

  • 705 Posts
  • 16 Reply Likes
Hi Louis,

Thank you for your feedback. Allmusic is a site to help music lovers discover new music, revisit the classics and research their favorite artist. We do not sell any products directly on our site. If you have questions, please feel free to get in touch with us again. Thanks for stopping by!
louisvalentinejohnson

  • 0 Posts
  • 0 Reply Likes
Thank you for your nice reply.
What I am asking you to do is remove your information about my recordings from your system.
When might you accomplish that please?
Thank You.
Lou V. Johnson
paulkent4

  • 0 Posts
  • 0 Reply Likes
Dear Louis,
My guess is that they won't, and my hope is that they don't. Reviewers should be free to say what they believe, even if we don't always agree with them. As the great writer Beaumarchais said, 'Without the freedom to criticise, there is no true praise.'
Chrysta Cherrie

  • 731 Posts
  • 7 Reply Likes
Hi Louis,
Because we have samples for some of your releases, we cannot remove them. It's necessary that they remain up for media recognition and historical perspective purposes.

Thank you for using AllMusic!
Andy DeNardi

  • 231 Posts
  • 27 Reply Likes
Here's another example of what we see as a problem.

http://www.allmusic.com/album/essenti...

Essential Blues, vol. 3 - various artists

"Two-disc set of blues recordings that makes the "essential" in the title a very arguable point indeed" ...
"This isn't a bad collection, necessarily, but there are overall far better collections out there to spend your money on."

It's given 4.5 stars! I might expect 3.5 or even 4, but a record rated that highly should have more positive characteristics. Yeah, I know that you're only comparing against other assorted blues collections; you've pushed that several times. But there are hundreds of collections to compare this against, so it shouldn't be that hard to put it in context.

The staff seems to have reined in their desire to hand out five stars to at least one album in each artist's output. They seem better at only doling out 4.5 stars to a very good album. I appreciate that. My expectation is that even among the very best performers there should be only two or three five star albums.
johnbush360

  • 136 Posts
  • 20 Reply Likes
Hi Andy,

Thanks, that's a good one -- we'll check it out and (probably) change it. Sometimes two albums with a very similar (and general) title will get accidentally merged, which causes reviews and ratings to merge in odd (and hilarious) ways.

I think your expectation that "even among the very best performers there should be only two or three five star albums" is valid, up to a point. The rare occasions that break that rule are pretty understandable, I'd hope. (I don't think you'd downgrade a Miles Davis album that is one of the best jazz albums ever, but isn't Walkin' or Steamin' or Kind of Blue or Bitches Brew or something; the same goes for the parade of five star Beatles albums.)

Again, I hope all of this helps.
Andy DeNardi

  • 231 Posts
  • 27 Reply Likes
As long as I have your attention:

The Complete Dinah Washington on Mercury, Vol. 6
"Most of the 73 performances are difficult to sit through."

That's a four and a half star record. The previous five volumes are five star records. The three examples I've brought up, by three different reviewers (Yanow, Owens and the late lamented Cub Koda), illustrate that this is not an isolated problem. It really speaks badly for the integrity of your ratings.

I've made my point, and I won't annoy you with more examples. I realize that you can't go back through thousands of reviews and adjust them, and I know that many questionable entries will remain. But music fans count on you to get this right. We don't have MusicHound and Virgin offering second opinions anymore. Christgau probably won't put out another guide, and Penguin only does jazz. AMG killed them all off; you're the top of the heap. And now you're doing a piss-poor job, just like all the other monopolies.

To your other point: "I don't think you'd downgrade a Miles Davis album that is one of the best jazz albums ever" I agree that your examples are among the best. But the point was made above that the reviewers don't compare them to every other jazz album. They compare them to Miles Davis albums.

I count 77 albums on Miles' main page. Eighteen of them are rated five stars. The man was great, but folks depend on you to tell them where to start. It's a disservice to mark almost a quarter of his output as indispensable. Cut that eighteen to nine.

OR, I really think that some artists need to be divided up. Someone (like myself) who likes Steamin' isn't that keen on Bitches Brew. They are entirely different kinds of music. In my own collection, I have three divisions for Davis: 1949-1961, 1962-1967 and 1968-1991. Coltrane has similar divisions. With sub-categories, you can hand out nine five star albums in each one. Everybody is happy!
Andy DeNardi

  • 231 Posts
  • 27 Reply Likes
re: "the parade of five star Beatles albums"

Hard Day's Night / Revolver / Rubber Soul / Sgt. Pepper / Abbey Road

That's five instead of eleven. Kick the others to 4.5 stars. Knock the three current 4.5-star albums to 4.0. You don't have any of those.

For some young punk that's never heard the Beatles, those five are the ones you need. Five stars for the White Album? Be serious, shoulda been a single album. With The Beatles? Great album if you're sixty but it's awfully dated for today's audiences. I love the Beatles as much as anyone but this isn't a kid's soccer game. Not everyone gets a gold star.
Mark Weisinger

  • 31 Posts
  • 7 Reply Likes
I agree with pretty much everything you said, in theory, except for the example of Van Halen. To my ears, Van Halen and "Van Hagar" are two very, very different bands, even though they have 3/4 of the same lineup. The original band was a very hard rocking, fun, rollicking rock and roll band with a good bit of silliness and camp, while the latter incarnation is a mostly sober, serious, ballad-heavy, middle-of-the-road AOR band that couldn't loosen up if they tried. Ironic, considering that the replacement singer is the guy who wrote songs like "I Can't Drive 55," "Bad Motor Scooter," and "Cruisin' and Boozin'," but I digress. Anyway, while I think a better example would have been someone like AC/DC, the point was still well-made. In practice, though, I tend to agree with Gar Seeya' that it would be a can of worms better left unopened. I mean, who would decide which artists get subcategories and which don't, if you and I can't even agree on Van Halen?
Gar Seeya'

  • 15 Posts
  • 0 Reply Likes
Not only which artists do or don't, but even on the artists you agree need to subcategorize, where do you draw the line? In some cases, a la VH, it is easy because it is based on a line-up change. But where do you draw the line with The Beatles? Or Coltrane? Curious, by the way, which 5 star album turned you off to Coltrane for a few years, Andy, if you don't mind sharing.
Andy DeNardi

  • 231 Posts
  • 27 Reply Likes
I agree that it's impractical to split up bands into phases. I offered it as a solution to looking at the discography page and seeing 15 to 20 five star albums, or over a quarter of a performer's entire output. It's not a point that I'm pushing hard on.

Where would I divide The Beatles? Sgt. Pepper is a clear dividing line so that's the one I'd officially go with. But my belief is that it would actually be Magical Mystery Tour. That's when it became obvious that the tight songwriting team was drifting off into separate rooms and experimenting, not always successfully. There's no doubt that the Beatles were one of the greatest of the century, but AllMusic says that it rates albums against others by the band and not against music as a whole. For that reason, I don't think 11 of 13 albums deserve five stars with another at 4.5. Comparing Beatles to Beatles, I say there are five superb albums, two or three a half star down, and the rest four stars. And Yellow Submarine gets one star.

The Coltrane album was Ascension, a very good one as later period albums go. I was new to jazz, having drifted over from fusion-rock stuff like Mahavishnu Orch, Tangerine Dream and Jean-Luc Ponty. I'd enjoyed Giant Steps, then I hit Ascension and knew that I didn't want to spend money wandering through that record bin if there was a chance of picking up more of that. I didn't really like fusion-rock either, so it took a while to get my footing. I did and it's one of my favorite genres now. Turns out I'm a Bop kind of guy, and I still hate free jazz.
Mark Weisinger

  • 31 Posts
  • 7 Reply Likes
I definitely agree that the Beatles were a completely different band by the time of Abbey Road than they were on Please Please Me, but I find it difficult to pinpoint a specific dividing line, as in my opinion, it was a gradual maturation, rather than a specific awakening, that caused the change. Many people cite Sgt. Pepper's (so much so that it's become conventional wisdom at this point), and tracks like "Being For the Benefit of Mr. Kite" and "A Day In the Life" certainly support that theory, but what about Revolver's "Tomorrow Never Knows," "Love You To" or "She Said She Said?" Or non-album singles like "Rain?"

Honestly, in my opinion, I would mark the most significant change even earlier than that - the album Rubber Soul was where they first encountered Dylan (and by extension, marijuana), and it radically changed their compositional style. Just a couple of years prior, they were writing songs like "I Want to Hold Your Hand," and now they were singing "In My Life" (although, to play my own Devil's advocate, the previous album, Help, spawned "Yesterday" and "You've Got to Hide Your Love Away," so that album could equally garner consideration as a dividing line). I guess my point is that, even with the Beatles, trying to parse a specific sea change in their development proves tricky, to say the least.
Mark Weisinger

  • 31 Posts
  • 7 Reply Likes
Interestingly, though, while I disagree with you on the dividing line for the Beatles (and, in fact, the feasibility of subcategories in general), I agree 100% with your assessment of which Beatles albums are the 5-star selections. Hard Day's Night / Revolver / Rubber Soul / Sgt. Pepper / Abbey Road - all five star albums (Help! is borderline and a bit of a judgment call, but the others are all clearly first-rate). Everything else should be bumped down by at least half a star. And yes, while it may sound harsh, Yellow Submarine is a one or two star album at best. Three is ridiculously generous. There is not a single original track on that album that would be worth purchasing on its own merits for any reason other than completism.
johnbush360

  • 136 Posts
  • 20 Reply Likes
Sorry, I guess we just love music too much... ;>

Also, for what it's worth, we've always provided (in our internal data) a single album that represents the best place to start for every artist, whether it's Miles Davis or Lady Gaga. Unfortunately, the website doesn't differentiate those 'top pick among all picks', but we'll definitely lobby for it (and look at the Dinah Washington example too).
Gar Seeya'

  • 15 Posts
  • 0 Reply Likes
I also often wonder who decides which songs on an album get checked as picks, as I often find songs given particular merit within the review that don't get a check mark.
johnbush360

  • 136 Posts
  • 20 Reply Likes
Sorry, I didn't see this when it was written, but 95% of those are created by the reviewer (the only exceptions being some older albums that were missed by freelancers the first time and need to be filled in by another editor).
rootsmusic

  • 500 Posts
  • 3 Reply Likes
John, you've overlooked other examples in this thread. By the way, any follow-up on https://getsatisfaction.com/allmusic/... ?
Barney Rubble

  • 1 Post
  • 0 Reply Likes
Color me confused. Stephen Thomas Erlewine hits the nail right in the epicentre of its head with his album review of "Beatles for Sale". So why is his review of "the group's most uneven album" accompanied by a 5-star editor rating? Do I smell bias? All original Beatles albums are given five stars (apart from "Let It Be"), whereas their Capitol repackagings are given much lower ratings... Surely it's the same songs, right?

This reply was created from a merged topic originally titled
Re: the "Beatles for Sale" album page..
Sean Alexander Corbin

  • 0 Posts
  • 0 Reply Likes
Old Crow Medicine Show's self-titled album

The review is stellar. There are 7 out of 11 staff picks for its songs. The review only gives positive comments for the album. However, it only gets 3 stars. Mistake, maybe? I just thought it was weird; I don't really mind if you decide it's not wrong, but I thought you might want to look at it.
Chrysta Cherrie

  • 731 Posts
  • 7 Reply Likes
Hi Sean,
I forwarded your question to our editors. One thing to keep in mind is that we rate albums relative to the artist's body of work, so it may just be that our editors think the band has improved since that album.

Thanks.
Mark Weisinger

  • 31 Posts
  • 7 Reply Likes
Not being a fan of Old Crow Medicine Show, I don't have a dog in this race, but upon reading the reviews of their albums (just to test Chrysta Cherrie's theory), I can't help but think that the "we rate albums relative to the artist's body of work" rationale is a rather weak defense in this case, judging from the tones of the reviews of all of the band's albums.

The tone for the review of the first album is indeed enthusiastic, and no less so than for any of the others. In fact, one of the closing sentences for the review of OCMS's second album, Big Iron World, notes that "What it proves is that Big Iron World is no less worthy of praise than their debut..." That, to me, would indicate that the two are on relatively equal footing (otherwise, the sentence would have read, "What it proves is that Big Iron World is perhaps even more worthy of praise than their debut...").

Also, the review of the following album, Tennessee Pusher, notes that the production by Don Was is a distinct drop-off in energy from the production of David Rawlings, "which isn't a good thing." David Rawlings produced the first two records, so one would think that the inference to be drawn here is that Tennessee Pusher is a slightly lesser album than the two that preceded it. At least that's how I'm reading it.

Now, I'm sure that your staff can probably come up with a perfectly good reason why I'm wrong on this, but it doesn't change the fact that, once again, from an objective outsider's point of view, the reviews and star ratings don't seem to match up. JMO.
johnbush360

  • 136 Posts
  • 20 Reply Likes
Hi Mark,

Thanks again for the feedback -- I think this is a good one to bring up, definitely worthy of a closer look by Steve and Thom (who have reviewed the majority of Old Crow Medicine Show's albums). I'll defer to them, but I will definitely bring it up and see what they think -- my guess being that the debut OCMS album should be raised to at least four stars.

I can't think of many perfectly good reasons to explain this one away, aside from noting our human error and the difficulties in consistency when you have dozens of writers reviewing hundreds of albums each week by many different artists, trying to visit (and revisit) them all in the context of each artist's discography. Regardless, it's good feedback to get, and I thank you for it.

Actually, here's a possible reason -- the first album by any band obviously *can't* be judged relative to the artist's body of work, so sometimes you may find inconsistencies when the writer has to review subsequent albums, and judge them based on the first. Not sure if that's necessarily the case here, but it's a problem we run into often.

As always, I hope this helps.

Best
John Bush
AMG Senior Managing Editor, Pop Music
Mark Weisinger

  • 31 Posts
  • 7 Reply Likes
Thanks for the quick response. Again, having never been a fan of Old Crow Medicine Show, I don't have any kind of stake in how, or even if, this specific problem is addressed. It only bothers me in relation to similar problems I encounter on a semi-regular basis with this site. And I know that you don't have the time or the resources to go over every review you've ever written with a fine-toothed comb, looking for inconsistencies, nor would I expect you to. I am glad, however, that you are indeed taking the problem seriously.

As for the specifics of this instance, I understand that it's impossible to judge a debut album relative to an artist's body of work. But I didn't bring up that defense; a member of your staff did. I was merely responding to the argument by noting that, even judged relative to OCMS's other albums, albeit in retrospect, it would seem, from the tones of the varying reviews, that this particular album would not be befitting of a 3-star rating, especially when albums that would appear to be, from the tones of their respective reviews, equal or lesser albums are given higher star-ratings.

I really don't want to beat a dead horse here. It's just that, until the problem is finally completely fixed (which, at this rate, if it is indeed even possible, would likely take years), I will continue to bring such discrepancies to your attention (or address those brought up by others) in order to, hopefully, see progress on this issue, even if that progress may have to be measured in inches, rather than the miles that I might hope for.

And let me stress again that I am a fan of this site. If I wasn't, I wouldn't put this much effort into making criticisms/suggestions. As someone else on this thread mentioned, AllMusic is indeed the go-to site for music aficionados of all stripes. Whether or not this was your intention, it is the result of doing a job well, and I congratulate you for it. However, any project can continually be improved upon, and your responsibility at the vanguard is to meet that challenge. Thank you for continuing to do so.
Gar Seeya'

  • 15 Posts
  • 0 Reply Likes
The glowing review by Thom Jurek for John Abercrombie's "Class Trip" makes reference to another stellar work of Abercrombie's, "Cat 'n' Mouse." However, as much praise as there is in THAT review (by "Rovi"), it's rated only 3 stars. Maybe an update is warranted?

This reply was created from a merged topic originally titled
John Abercrombie sold short on stars given to an excellent review.
johnbush360

  • 136 Posts
  • 20 Reply Likes
I talked to Steve and Thom about Old Crow Medicine Show, and they agreed that the debut should be bumped up -- Thom decided on 3-1/2 stars. I'll also ask him about the John Abercrombie, which he obviously liked quite a bit.

Thanks again,
John