CellarTracker vs. the “Experts”

According to this analysis at Vox.com, the opinions of CellarTracker users are just as good as those of wine critics:

Thanks for posting. The one thing I do like about the professional critics is that they can rate a lot of wines (at least US ones) before the wines are released. CellarTracker, for me, works well after the wines have hit the market.

I think CellarTracker is a fine tool and I find very valuable information there, but there are loads of problems with this analysis. Most amateurs don’t score blind, often know what experts have rated a wine prior to purchasing/drinking it, and are therefore influenced by the score that the “expert” pinned on it.

Although there are plenty of garbage reviews on CT, there are also plenty of garbage reviews by “Experts”: Big J Miller, James (95-point minimum) Suckling, etc. I found this quote from the article noteworthy:

Many professional critics, not surprisingly, have scoffed at the idea that mere amateurs understand, let alone have the ability to rate, wine. In a 2012 column for the website Wine Spectator, critic Matt Kramer described the wisdom of the crowd as a “pernicious delusion.” “One hundred people who don’t know much about, say, Auxey-Duresses,” he wrote, “adds up to 100 muddied, baffled and often duplicative conclusions.” Critic Steve Body concurred in a 2014 post titled “Crowd-Sourced Ratings and Why They Suck” on his website ThePourFool: “The readers and users of these sites are almost always slaves to their personal preferences and current trends.”

You know what the best wine is? The one you like the most (except for stuff from the Loire, neener). If 100 people taste a wine and say it is delicious, but an expert pans the wine and rates it one star, or 7/20, or 71/100, who is to say that the expert is right? Maybe the [self-appointed, ego-obsessed, non-objective] expert is full of crap.

Good luck with that.

From CT:

10/22/2016 - wine_454 Does not like this wine: 91 Points
This was ok, light in color and a spark of taste.


Huh?

First you take on dreidels and now you have it in for wine_454. What the hell is the matter with you?

There are lots of problems with both “professional” critics and with crowd-sourced reviews. In the latter case, you have issues with self-selection (people drink the wines they already like – or that they think they like even though they’ve never tasted them blind against other wines) and suggestion (many knew what the critics’ scores were when they drank).

I’d rather have a critic I know or a CT-scorer I know than an aggregate score from strangers who may have a bias or be clueless.

Fixed it for you: “do have bias and are clueless.”

This is sheer idiocy. Wow.

  1. The underlying assumption is that the crowd tasted the wines ignorant of how the critics rated them, so when the scores appear to be similar, the authors conclude, “hey, these amateurs are just as capable of assessing wine as the pros!” I suspect that most of the people posting scores on CT (or at least a statistically significant percentage of them) are generally aware of the critical reaction to a wine before they score it online, and that they are loath to assign a score that departs dramatically from the critical consensus.

  2. The great number of scores on CT for any given wine has a profoundly homogenizing effect. The “mean” score, which is what they use, is almost always going to be 88-92, even if the wine is extraordinary (extraordinarily good or extraordinarily bad; see the sketch after this list). This is why the mean score is utterly meaningless to me.

  3. Individuals, whether posting on this board, as critics for some publication, or on CT, are or can become known quantities. If Parker says a CdP or CA cab is a 95, I have a pretty good idea of what to expect. If Alfert or Mark G gives the same score, I know that they are enthusiastic about the wine, but I can draw a very different inference because I generally know the kinds of wine they like. If the CT mean is 95, who the hell knows what that means other than that the collective enjoyed the wine (and that it is a dramatic outlier in a world where almost nothing has a mean score of 95)? And a 95 from “wine-idiot545” says absolutely nothing to me; I don’t know if his base of comparison is a 1970s-era Mondavi or a can of Sterno.

  4. To the extent that professional critics provide value, it is because over time you can learn their prejudices and proclivities, something it is impossible to learn about the “mean” palate of the crowd.
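
Point 2 is easy to make concrete. Here is a minimal Python sketch (my own illustration, not from the study; the rater counts and score distributions are invented) of how averaging a large, divided crowd lands the mean in that same 88-92 band no matter how polarizing the wine is:

```python
# Minimal sketch of the homogenizing effect of crowd means.
# Hypothetical polarizing wine: half the raters love it, half pan it.
import random

random.seed(42)

lovers = [random.gauss(96, 1.5) for _ in range(100)]  # the "extraordinary" camp
haters = [random.gauss(85, 1.5) for _ in range(100)]  # the "pan it" camp
scores = lovers + haters

mean = sum(scores) / len(scores)
print(f"min={min(scores):.1f}  max={max(scores):.1f}  mean={mean:.1f}")
# Typical output: min in the low 80s, max near 100, mean about 90.5,
# squarely in the 88-92 band, and a score almost nobody actually gave.
```

The mean of roughly 90 is a score that barely any individual rater assigned, which is exactly why the aggregate number tells you so little.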

CT is a fantastic tool. I use it every day. This “study” compares apples and chainsaws.

Looks like I posted this at the same time this thread went up … maybe the mods can combine threads: Why Amateur Wine Scores Are Every Bit As Good As Professionals' - WINE TALK - WineBerserkers

I’m always wary of average scores. When you drill down into the individual reviews, you frequently find that most are meaningless; I’d suspect that well over 50% are useless. However, there are certain people whose palates I respect, and I do look for their reviews.

Agree totally. In addition to the point about the amateurs copying the pro scores, the bunching of scores is a real problem. On their charts, the axes show scores of 50-100. Are they considering scores of 89 and 92 to be close? Because while those scores might be close if the whole scoring scale were actually used, they’re not that close when only five or six points of the scale are in play to begin with (a quick illustration below). Maybe someone with more statistics training than I have can opine.
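
To put a rough number on that: here is a quick Python sketch (my own back-of-the-envelope; the 88-94 “effective” band is an assumption, not a figure from the article) comparing the same 3-point gap against the nominal 50-100 axis and against the narrow band actually in use:

```python
# Sketch of the range-restriction point: a 3-point gap looks tiny on the
# nominal 50-100 scale but large on the ~6-point band critics actually use.
def gap_fraction(a: float, b: float, lo: float, hi: float) -> float:
    """Score gap expressed as a fraction of the scale [lo, hi]."""
    return abs(a - b) / (hi - lo)

nominal = gap_fraction(89, 92, 50, 100)   # full 50-100 axis
effective = gap_fraction(89, 92, 88, 94)  # assumed ~6-point working band

print(f"nominal scale:   {nominal:.0%} of the range")    # prints 6%
print(f"effective scale: {effective:.0%} of the range")  # prints 50%
```

On the nominal axis, 89 vs. 92 is 6% of the range; on the band that’s actually used, the same gap is half the range. Treating those two scores as “close” quietly inflates the apparent agreement between the crowd and the critics.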

Of all the comments, I find some truth here. I should add that I do not score and do not follow RP or anyone else. I do refer to CT when I have purchased an interesting (new) wine to see what the punters/amateurs think! I also visit CT when, say, I am thinking of opening an ’05 Baudry or a Rioja. The problem there might be that my thoughts on actually drinking the wine are somewhat influenced by what others have posted. I am fortunate to be able to pass by comments of those who do not seem to know what they are talking about… LOL.

I’m similar, Bob. I post notes on CT but not scores. I don’t care about professional scores. To some degree, if a lot of CT posters use it as we do, largely eschewing professional scores for the crowd’s view, then maybe professional scores have minimal effect on CT.

The real story is expert vs expert. They’re all over the map.

I like saying that judging wine is like judging food: I don’t need someone to tell me whether or not I should like something. I’ll taste it and decide.

This is a strawman. I don’t know that many look at professional reviews to determine whether they should like a wine. I think that many look at professional reviews to determine whether they will like a wine. Big difference. If due to circumstances you need to buy before tasting, a critic with whom you generally agree can be useful.

Combining the critic scores for purposes of comparison is another blatant error in the methodology used by the authors. It purposefully obscures the very differences of opinion reflected in the data Brig posted.

And most professional wine critics don’t know jack s*it about some of the wines and regions they review. Portugal and Madeira come to mind, as do Israel and some others. That doesn’t stop them from reviewing them. Just saying.

From CellarTracker:

The wine looks purple colored. The legs are medium. There is light sediment in the bottle. The body is medium/full. 80 Points

Useless note. Useless score.

Well, not really. If the color was red, that probably wouldn’t be good. If it didn’t have legs, how would it stand? If it had lots of sediment, it obviously is flawed. And what’s to hate about a medium/full-bodied wine? We wouldn’t want a low/empty-bodied wine, right? [wow.gif]