CellarTracker and the problem of groupthink.

- When will it peak?

It’s reading well now, even with a straight click-and-read. I estimate it reads well through Jan 27th of this year… Don’t think it gets much interest the next day.

The truth is that:

  1. We are not on a 100-point scale; it's really 50-100.
  2. We are not on a 50-point scale either.

Looking at the stats above, most people are effectively on a scale between 84 and 100. The rating scale is so subjective. We almost need a poll on each wine we drink that asks 5-10 specific questions, each on a 1-10 scale, which then roll up into our overall score (rough sketch of the idea below).

How was the alcohol level? 1 2 3 4 … 10
How were the tannins? 1 2 3 4 … 10

Etc., etc., etc.

No more "was the wine wet? That's 50 points, automatic."
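Something like this is what I'm picturing. Just a rough sketch of the idea; the questions and weights are completely made up:

```python
# Rough sketch of the poll idea: a handful of 1-10 questions that roll up
# into a single overall score. The questions and weights are made up
# purely for illustration.

QUESTIONS = {
    "alcohol_balance": 1.0,
    "tannin": 1.0,
    "acidity": 1.0,
    "fruit": 1.5,
    "complexity": 1.5,
    "finish": 1.0,
}

def overall_score(answers: dict) -> float:
    """Turn 1-10 answers into a 50-100 style score."""
    total_weight = sum(QUESTIONS.values())
    weighted = sum(QUESTIONS[q] * answers[q] for q in QUESTIONS)
    avg = weighted / total_weight        # still on the 1-10 scale
    return 50 + (avg - 1) / 9 * 50       # map 1-10 onto 50-100

print(round(overall_score({"alcohol_balance": 7, "tannin": 8, "acidity": 6,
                           "fruit": 9, "complexity": 8, "finish": 7}), 1))
# -> 86.9
```

The point isn't these particular questions or weights; it's that the overall number would at least be traceable back to something specific.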

What % of notes are on wines people own / have bought (versus, for example, ordered by someone else at a dinner, or encountered at a tasting)?
That could explain the lack of wines under, say, 88.
Also, any idea what the distribution of wines on Parker is by score?
Also, am I correct that the majority of these were not tasted blind or semi-blind?
Finally, am I right that everyone has to post out of 100? Some people may use a 20-point scale, or even a 5-star (or 4-star) scale, so how do those scores get converted?
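For what it's worth, the naive version of that conversion is just a linear rescale. Purely an illustration of the idea; I have no idea whether CT actually does anything like this:

```python
# Purely hypothetical: a linear rescale from a 20-point or a 5-star rating
# onto a 50-100 range. Not a claim about how CellarTracker actually
# converts anything.

def rescale(value: float, old_min: float, old_max: float,
            new_min: float = 50.0, new_max: float = 100.0) -> float:
    frac = (value - old_min) / (old_max - old_min)
    return new_min + frac * (new_max - new_min)

print(rescale(17, 0, 20))  # 17/20   -> 92.5
print(rescale(4, 0, 5))    # 4 stars -> 90.0
```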

There are Lies, Damn Lies, Statistics, and the worst of all are wine scores!

+1

Not sure how to express this precisely, but I’m pretty sure the 100-point scale implies a lot of spurious precision. How consistently do tasters really distinguish, say, 91- and 92-point wines?

IMO, you are missing the point about the purpose of points. Simply put, points let readers know which wine a specific taster or group of tasters prefers over another wine in the same peer group. That is important to you, or it isn't. But that is the value of points.

lol wow, what a clickbait headline.

CellarTracker isn’t the issue; it’s the lazy end user who treats a score as anything other than directional.

read the notes!!!

I read this all the time, but frankly most notes are not that good.

Jeff, I am not sure what you are saying. Which do you think is more important, the group scores or the scores of individual tasters I am familiar with? [cheers.gif] pileon

I guess I have the 80-100 point scale burned into me from reading WS and RP a lot back in the day. For me, 90 points starts with a wine that is enjoyable and has all the attributes of a well-made wine in its category. The 2013 Markham Merlot fits this bill: it has ripe flavors of cherry, Cherry Garcia oak, and enough tannin to hold it. The 2010 Chateau Andrea, a Bordeaux, has ripe (for France) dark cherry with earthy, loamy tones but a simple presentation and enough tannin to hold it.

As I get to 91 or 92 points, there has to be a tad more complexity and more structure (a la Chapoutier Occultum Lapidem). On CT, for example, many tasters rate this lower; it does not have any sweet impression. I can see the lower scores from those tasters, but I like the “true to soil” iron, meat, blackberry, and structure.

93-94 has to add in what I perceive as riper fruit, varietal character, and structure. The 2010 Brane Cantenac fits this bill: it adds that elegance in the wake of its ripe, complex fruit.

Above 95 is classic, which for me means the wine has the varietal and the country nailed, along with a medium-to-full body, inner-core ripeness, and long-lived structure. Complexity, oodles of complexity, too. Most of my Sine Qua Non Syrahs do this for me: just huge CA Syrah fruit and plenty of structure to hold it… Awesome aromatics too!

I’d rather have a 2010 Brane Cantenac than a Sine Qua Non, reflecting my palate, but I can see the extra shebang in that ripe, powerful Syrah fruit.

Getting into 98-100 territory is Petrus 2000 country for me. And La Tache '99. These wines are just seamless and pure and make me think about life and the marvel of my senses…

Mods, please move this to the Post Your Favorite, but Uninformed, Opinion thread. [snort.gif]

When looking at notes on CellarTracker, I tend to zero in on notes that are part of a larger tasting. Then I will go to the full tasting and see how that wine performed compared to the others for that poster. This does several things. It gives me an idea of the poster's palate. It shows how much variance there is in the scoring across the wines. And it shows how that taster perceived the wine against some potential benchmarks.

And here is a problem with any point system used by multiple people: to me, that sounds like an 85-point wine, at most.

I think CellarTracker is invaluable, and it is my go-to source for quality information on a wine before purchase, playing the role Robert Parker played back when I was starting out way back in the day. But you have to interpret it carefully. Different regions are scored differently. Written notes are crucial. Ageworthy wines will get lower scores in their “dumb” phase. And so on.

The fact that it’s effectively a 15-20 point scale means that single-point differences can mean a lot. For a Bordeaux with a lot of notes, the difference between an 89-point score and a 91-point score is huge. Even the difference between 90 and 91 is important. From Bob’s useful post above, the move from an 89- to a 91-point wine means that you have moved from the top 55% of CT-rated wines to the top third (32%) of CT-rated wines. Also, I believe that CT wines are scored quite differently by region, with softer/sweeter wines scored higher and more homogeneously than more challenging types of wine like traditional left-bank Bordeaux.

I wish Eric made it easier to download CT data, as I think there is some fascinating stuff in there to analyze. But that might interfere with monetizing it, I guess.
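If the raw scores ever were downloadable, the percentile framing from Bob's post would be easy to reproduce. A quick sketch with made-up numbers, just to show the kind of thing I mean:

```python
# Sketch only: given a list of CT scores (made up here), report what share
# of rated wines sit at or above a given score (the "89 = top 55%,
# 91 = top third" style of comparison from Bob's post).

scores = [84, 86, 87, 88, 88, 89, 89, 90, 90, 90, 91, 91, 92, 93, 95]

def top_fraction(threshold, all_scores):
    at_or_above = sum(1 for s in all_scores if s >= threshold)
    return at_or_above / len(all_scores)

for t in (89, 90, 91):
    print(f"{t}+ : top {top_fraction(t, scores):.0%} of these wines")
```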

I don’t care for a “review” solely consisting of a numerical value. I treasure the written tasting notes, like what is found here. :slight_smile:

The concept that CellarTracker is like Yelp is wrong, in my opinion. Most of the reviews I see on Yelp are written by people who go out of their way to vent about a negative dining experience, whereas CellarTracker often serves the opposite purpose, i.e., people recording memorable, positive experiences with specific wines.

Also, a lot of people are buying wines from auction or elsewhere that have not been stored properly, and they write very different notes from what you might see with a wine that has been stored properly throughout its life. And some people are estimating a wine’s life without ever having tasted a mature wine from that producer, or sometimes even a mature wine from any producer, and in any case have never followed a wine from youth to maturity.

Howard, this is quite easy to explain. People should pay more attention to individual tasters that they know or have read a lot of notes from. The reason is simple. For example, in your case, if I score a wine highly, you should avoid it at all costs. But when I say a wine is fresh, bright, crisp or acidic, light or Burgundian in style, back up the truck! neener

Jeff, actually, I would not trust you to know a Burgundy I would like from a Burgundy I would not like. [stirthepothal.gif]

Where I do find your notes useful is when I taste a young Bordeaux that is of a style I like. If both you and I like the same Bordeaux, it likely will age pretty well and be a pretty low risk.

This is true, but a few things. First, it is pretty easy to tell from the notes who has a real sense of what a wine’s aging curve should look like, and to discount low scores from people who say things like “this eight-year-old classed-growth Bordeaux is on its last legs.” Second, if there are many scores and notes, the individual factors you are talking about (like the quality of storage) tend to balance and average out. If you see a lot of variance in the notes on an older wine, you can often figure out that it is about the storage conditions. Third, if I am buying a wine on the secondary market, I want to know how durable it is across the range of storage conditions out there; there is no way to actually tell what storage the bottle I am purchasing has had, so if a wine is very delicate and a lot of people write that it’s shot, that is valuable information.
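To put that second point in concrete terms: with enough notes you can eyeball (or compute) the spread, and a wide spread on an older wine usually says more about the bottles than about the wine. A toy sketch with invented numbers:

```python
# Toy illustration of the variance point: for an older wine with many
# notes, a wide spread of scores is a hint that bottle condition, not the
# wine itself, is driving the disagreement. Numbers are invented.
from statistics import mean, stdev

notes = {
    "well-stored example":      [93, 92, 94, 93, 92, 93],
    "mixed-provenance example": [95, 78, 93, 70, 94, 88],
}

for wine, s in notes.items():
    print(f"{wine}: mean {mean(s):.1f}, spread (stdev) {stdev(s):.1f}")
```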

I think CT is the richest and most valuable source of review info out there, more valuable than any individual critic.