Cellartracker vs. the "Experts"

I beg to differ. THIS is shear idiocy.

Bruce (see what I did there?)

Indeed I do. I need an editor in my life

I have written even more useless notes myself. ☹️

I don’t know that many look at professional reviews to determine whether they should like a wine. I think that many look at professional reviews to determine whether they will like a wine.

Not me, man. I drink it and then look up the score so I know if I liked it! What if I accidentally don’t like a 95-point wine? I would feel even more stupid than usual!

Other than that, your first post was right on the money. People see that a wine got 99 points. They run out to buy it or wait anxiously for their mailing list allocation to show up. Then they open it up and hoo boy! It’s 99 points! Or, if they’re uber thoughtful and nerdy and want to show independence, it’s only a 97-point wine. Post on CT and it becomes part of the noise.

As others have posted, you can select someone on CT whom you know or understand, and those scores can become sensible to you. But they mentioned that in the article. And they also focused on California, a market very much driven by scores. Look at this board when the WA scores come out, or even the Vinous scores.

I do love that the correlation between Jancis and everybody else seems pretty weak. It makes sense given that she has very little market influence in the US.

BTW, don’t know if any of you have read Schatzker’s book, The Dorito Effect, but it was pretty interesting.

Well, an important point kind of buried in the article – and ignored in this thread – is that Cellartracker scores correlate pretty well with all of the different critics. Accordingly, why pay to see what any given critic says when you can get a good idea of whether they (individually and collectively) liked a wine from looking at Cellartracker averages?

Don’t they say that an average of all critics analyzed correlates to CT scores? That is very different. Gilman and Dr Big J do not correlate. Jancis and Parker agree on some wines but disagree violently on others.

Again, the median approach they use (and on which CT is based) is custom-made to make variations disappear. Homogenized scores look like homogenized scores. What a shock!
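
To make that concrete, here’s a toy simulation (all numbers invented, not drawn from the article’s data): five critics share a common quality signal plus heavy personal noise, so they agree only weakly with one another, yet the per-wine median washes the disagreements out and correlates comfortably with every one of them.

```python
import numpy as np

rng = np.random.default_rng(0)
n_wines = 500

# Toy model: each wine has a true quality; each of five critics
# adds a large dose of independent personal noise.
quality = rng.normal(90, 3, n_wines)
critics = np.array([quality + rng.normal(0, 3, n_wines) for _ in range(5)])

# Pairwise, individual critics agree only weakly...
pairwise = np.corrcoef(critics)
print("mean pairwise critic correlation:",
      round(pairwise[np.triu_indices(5, k=1)].mean(), 2))

# ...but the per-wine median (a crude stand-in for a CT-style
# aggregate) smooths the noise away and tracks everyone at once.
median_score = np.median(critics, axis=0)
for i in range(5):
    r = np.corrcoef(median_score, critics[i])[0, 1]
    print(f"median vs. critic {i}: r = {r:.2f}")
```

Homogenize the scores and, sure enough, the homogenized number correlates with everybody.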

“All of the critics” is merely WA, IWC (before it was Vinoused), and JR. This doesn’t compare with WS, Burghound or Gilman.

“For the professional scores, we purchased memberships to three sites: Wine Advocate (Robert Parker’s site), International Wine Cellar, and Jancis Robinson. We limited ourselves to these three because they had the most wines in common with CellarTracker.”

Why not WS? I’m sure WS had many more wines in common than Jancis (since they were only doing Cali)?

Of the critics they compared, the correlation is weakest with Jancis (who is probably the least read by CT users) and greatest with WA (probably the most read). Doubt that is a coincidence.


The first wine group I was ever in had a guy who loved WS. He’d bring a wine and announce it got 94 points (for example), but say his judgment was reserved until he tasted it, since only your personal tastes matter. Then we’d taste (not blind), and EVERY time he’d announce something like “well, they were close, but it’s just a tad thin in the midpalate: 93” or “wow, that’s a really great finish: 95.” Great guy, but by the second year I was busting a gut when he bestowed his rating.

All these sources are useful in some way. Just as long as you don’t take them too seriously.

First, the reviews analyzed were for California wines. Accordingly, that cuts down considerably on the aggregation-of-different-critics factor. Parker has done the vast majority of California reviews for the Wine Advocate over the years. Similarly, I believe that only Tanzer did California wines for IWC. Not sure if Jancis Robinson’s site has only her own reviews or those of contributing writers as well.

Second, the article discusses the fact that the various “experts” don’t necessarily correlate, but notes that Cellartracker correlates about equally to all of them.

Thus, again, we seem to be able to get a thumbnail sketch of the critical opinion of a wine based upon the crowdsourced Cellartracker opinion. Again, if you’re only looking for a “thumbs up” or a “thumbs down” on an unknown wine (as opposed to more detailed information about its history, style, etc.), Cellartracker may well save you a few simoleons.

So the experts don’t correlate with each other but the unwashed masses correlate “equally with all of them”? That’s some fuzzy math right there.

I get what you say in that last sentence, and I suppose I can see that (although for myself, I am not convinced). But the purpose of the article was to show that we don’t need no stinking pros because the crowd ends up with the same results, and their “support” for that assertion is no support at all.

I have to be perfectly honest when it comes to the reviews we all mull over. I look at CT to see if there is a general consensus about whether people like or dislike the wine. I want to see the average of their scores rather than a single well-worded review.

I have found that the large, well-known magazines seem, to me anyway, to be more forgiving to those that do a lot of advertising business with them, and if you don’t… LOL
Give me the consensus from CT over that any day.
Almost anyway.
LOL

I’d trust a WA, Vinous, or BH pro over a groupthinking bunch of wannabe critics on CT who try to taste like a somm (barf) any day when making a purchasing decision on something I’m unfamiliar with.

That being said, I love tasting at random first and then checking to see who agrees with me on any given wine… not the other way around, at least under $250/btl, if I’m not familiar with it.

I use Cellartracker all the time, but with some caveats:

  • I disregard scores backed by fewer than two or three different users
  • If there are only a few voters, or multiple scores from the same person, I look at their profiles to see whether my scores agree with theirs on wines we have both tasted, and whether they rate a wide range of wines or just ones from a particular producer/distributor, in case they are merely promoting that winery by pumping up its scores

If there are ample votes from credible users, then I find it to be very useful. But the NOTES are often more useful than the scores, because they let me determine whether I will like this particular style of wine, whether the user’s taste matches mine, and how the maturation is coming along.
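
For what it’s worth, those first two filters are easy to automate if you ever export your notes. A minimal sketch, assuming a hypothetical table with wine/user/score columns (the column names and numbers are made up for illustration):

```python
import pandas as pd

# Hypothetical export of CT-style notes: one row per (wine, user, score).
notes = pd.DataFrame({
    "wine":  ["A", "A", "A", "B", "B", "C", "C", "C", "C"],
    "user":  ["me", "u1", "u2", "me", "u1", "me", "u1", "u2", "u3"],
    "score": [92, 91, 93, 88, 95, 90, 89, 91, 90],
})

# Caveat 1: ignore averages backed by fewer than three distinct raters.
raters = notes.groupby("wine")["user"].nunique()
credible = raters[raters >= 3].index
print("wines with enough raters:", list(credible))

# Caveat 2: check whether another user's palate tracks mine
# on the wines we have both scored.
mine = notes[notes.user == "me"].set_index("wine")["score"]
theirs = notes[notes.user == "u1"].set_index("wine")["score"]
shared = mine.index.intersection(theirs.index)
print("mean gap vs. u1 on shared wines:",
      (mine[shared] - theirs[shared]).mean())
```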

I generally find Cellartracker to be more stringent about point scores than the ‘pros’ actually are. A Parker 99 will be a CT 96. A Galloni or Suckling 95 will be a CT 92.3. That conservatism suits me just fine: it cuts away some hyperbole and marketing b.s. and gives me the real drinking experiences of people who have to spend their own money on the wine, just like me, and unlike the pros.
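
If you wanted to put a number on that deflation, the simplest calibration is just the average gap on wines scored by both sides. A quick sketch with invented numbers (not real scores):

```python
import numpy as np

# Hypothetical paired scores for the same wines (numbers invented).
pro = np.array([99, 95, 95, 93, 96])
ct = np.array([96.0, 92.3, 92.5, 90.1, 93.2])

# Average deflation: how far below the pros CT lands.
offset = (pro - ct).mean()
print(f"CT runs about {offset:.1f} points below the pros here")

# To read a new pro score on the CT scale, subtract the offset.
print("a pro 98 would map to roughly CT", round(98 - offset, 1))
```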

Yeah, the influence of ‘knowing’ a wine is ‘supposed’ to be good can lead to confirmation bias.

That’s why at large tasting events I don’t want to see the scores NOR the prices next to the listed wines when I am tasting and scoring. I want to know what my senses really pick up, not what my mind expects to taste.

To me, scores are mostly useless. I derive real info from comments as to style, aging potential, etc. This may have already been said, but that’s how I form my opinions on wines I have yet to taste.

Berserkers taking a wine question too seriously? Heaven forbid.

Whether or not the statistics and the conclusions drawn from them are bunk, can we at least agree that the quote from experts toward the beginning of the article is laughably self-important masturbation that makes them sound like complete stereotypes? I almost spit out my coffee when I read it. As if Robert Parker is anything but a slave to his personal preferences.

I just imagine that guy seeing himself as a perfect wine-calibrated mass spectrometer robot wearing an ascot.

As we all know, wine tasting and rating is subjective. A wine can taste different depending on your mood, who you are with, etc. I use Cellar Tracker, because I like to see what non-experts think about a wine. Many people I drink wine with are not frequent wine drinkers, so sometimes I like to open a crowd-pleaser. But it’s not my only source of information when making a purchasing decision.

The experts can have varying opinions too. For example:

2012 Terredora Fiano di Avellino
Hard to believe these expert reviews are all of the same wine!

JS93
James Suckling - jamessuckling.com, January 2013
Fascinating nose of star fruit and lychee. Full body, lots of fruit and a dense finish. Loads of character. Goes on for minutes.

W&S90
Wine & Spirits, January 2013
Lively scents of tangerines and roses fill out this rich, nutty white. The floral aspect is a little intense, modulated by salinity and hints of spice. It feels broad yet tightly focused, elegant enough to serve with a seafood terrine.

ST90
Stephen Tanzer’s IWC, January 2013
Pale yellow with a green cast. Aromas of apple, quince and minty white flowers. Then slightly more generous in the mouth, with its broad, bright middle shaped by intense, almost creamy flavors of apple, quince and hazelnut. The finish is long and clean. I usually find this wine easier to drink in the early going than the Fiano di Avellino Campore bottling.

WS88
Wine Spectator, January 2013
Light, elegant and floral, with a fine mix of green melon, citrus and underbrush. Drink now. – NW

WE88
Wine Enthusiast, January 2013
It opens with aromas of stone fruit, pineapple and a hint of mineral. Rich flavors of pear, peach and mango accompany an intriguing smoked note. The palate finishes crisp and dry.

Suckling is always about 3 points or more higher than most.

There are a few straw-man arguments in here. First up, CT is not a mass of amateur vignoramuses and wine bluffs. Every poster has an identity, and we all surely learn quickly whose reviews to trust and respect. Just by reading the notes you can see who is really taking pains to describe the wine, and who is just sitting at the keyboard with a stewed grin on their face, trumpeting about a wine they paid big bucks for (frankly, the exception).

And the killer difference is that the most recent reviews of these mature wines come up first: you have a history of people’s experiences with the same wine over time, and, if you filter by reviewer, you can see how one person whose palate you trust has watched a wine evolve. Among the “professionals”, only John Gilman really offers anything like CT’s scale of retrospective views of mature as well as young wines.