CellarTracker vs. the "professionals"

TBH there have been instances of ‘wisdom of the crowd’ amongst critics - from limiting their tastings to recognised producers, to the case of one great Australian winery that hit problems, where only one local critic was willing/able to call out the issues at the time (Henschke / Jeremy Oliver).

Indeed I sometimes wish for those early days when we sat down as a group with half a dozen bottles and I had no idea what to expect - no Bayesian approach then, just tasting what was in front of me. I even liked that ‘scoring system’ - how much would you pay for a bottle of it? Such innocent days.

regards
Ian

In my mind CT clearly is a great complement to the professionals’ info. I use both to try to get a broad understanding of wines I’ve never tried, or to assess whether wines that I’m aging are ‘ready’ to pop.

I don’t see the CT average score as meaning a whole lot, but I think it means something. A particular year/appellation (say, GC Combes aux Moines 2002) has a very narrow range of scores among all the different producers. That’s kind of interesting to me because I’m thinking about buying some 2010 of this and trying to figure out if producer X is worth 10x producer Y. 2010 scores are relevant too, but since I’m planning on aging this a while, some older scores tell me a relevant story. Plus I might try a bottle of the 2002 to see if I like this terroir; since it seems to have a fairly narrow range of flavor profiles, I’m not going to obsess about only trying, say, the Fourrier.

For some wines CT is a lot harder to use. E.g., let’s say I want to add some 2005 right bank Bordeaux to my collection. A quick search reveals 2,000 or so different wines. Scanning down the list I see Petrus with a 93.5 and Canon-la-Gaffeliere with a 93.4. I guess they’re about the same! So clearly this will take a lot more study to figure out which 2005 right banks carry a brand premium, which are old school vs. new school, which will age well or not, etc. So I’ll read lots of TNs and pros I trust, think about price range, try to go to tastings, etc.

But CT is pretty amazing for 2005 BDX for the sheer “catalog of everything” it provides, with all kinds of experiences and tasting notes from around the world.

Ian, returning to innocence is so easy. Just don’t read wine sites, buy random bottles, and pop them open! Hah, sometimes I feel like that’s what I’m actually doing; I just feel more informed because I study up before buying. For most of my friends, the random experience is actually their primary experience, e.g., when I show up with a couple of random 1988 burgs for a basketball watching party.

Except that normal statistical methods are invalid when applied to scores. Scores represent “compositional data” (in statistics-speak), so statistics derived directly from them are questionable. The problem is further complicated by grade inflation - because wine scores crowd around the 90-100% region, “closure” (caused by the fact that a score cannot go above 100%) becomes a huge problem.

To do anything statistically valid with scores, they first have to be mathematically transformed (the most common approaches involve log-ratios). Unless CT does this, the stats it posts aren’t very useful. Read the work of the British statistician John Aitchison (1982 and on) for further enlightenment - just Google “Aitchison” and “compositional data” and you will find lots of free PDFs.
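To make the suggestion concrete, here is a minimal sketch of the kind of transform being described, using a simple logit (one member of the log-ratio family) on the 100-point scale. The scores are made up, the one-dimensional logit is a stand-in for Aitchison’s multi-part log-ratio machinery, and nothing here implies CT actually computes its averages this way:

```python
import math

# Hypothetical sketch: wine scores live on a bounded 0-100 scale, so a raw
# average ignores the ceiling ("closure"). A logit-style log-ratio maps
# (0, 100) onto the whole real line, where ordinary statistics behave,
# and the result maps back afterwards.

def to_logratio(score: float, cap: float = 100.0) -> float:
    """Map a score in (0, cap) to the real line: log(score / (cap - score))."""
    return math.log(score / (cap - score))

def from_logratio(z: float, cap: float = 100.0) -> float:
    """Invert the transform back onto the bounded 0-cap scale."""
    return cap / (1.0 + math.exp(-z))

scores = [92, 94, 95, 96, 99]  # made-up community scores for one wine

naive_mean = sum(scores) / len(scores)
mean_z = sum(to_logratio(s) for s in scores) / len(scores)
log_ratio_mean = from_logratio(mean_z)

print(f"naive mean:     {naive_mean:.2f}")      # 95.20
print(f"log-ratio mean: {log_ratio_mean:.2f}")  # ~96.01
```

Near the ceiling the transform stretches differences (a 99 sits much further from a 95 than a 92 does), which is exactly the distortion a raw average of scores crowded into the 90-100 region hides.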

But then again, who takes wine scores very seriously?

Just an aside, I thought it kind of funny that the other tasting notes, pro and con, called the Marcassin Chard unctuous, fat, rich, sweet, high-alcohol, and heavily oaked - and Parker thought it tasted like Chablis.

Scores are lame.

Doug,
Perhaps this is often the case, but sometimes the first influencer is not a professional. In my case, I’m thinking Gonon, MacDonald, Ceritas, Rhys, Liquid Farm. I jumped on board with these wines following non-professional comments, long before critics crooned.
Cheers,
Warren

Warren, agree with your comments. I initially read Doug’s post as related to new releases rather than new (or new to me) producers. Looking at it again, I’m not sure that was the intent. They are two different circumstances for me.

New Releases - “which 2011 Napa Cabs outperformed in a difficult vintage”. If the question relates to wines that have been very recently released, or not yet released, CT isn’t particularly valuable. This board is OK, to the extent that posters share their experiences with barrel tastings when asked. But for me, pro reviewers have value because they usually get there first in publishing impressions of new releases.

Introduction to new (to me) producers - like you, I most often find these via non-professionals, more specifically through this board. Off the top of my head, this board has been the primary impetus for me to seek out:

  • Copain
  • Carlisle
  • Bedrock
  • Arnot-Roberts
  • Wind Gap
  • Liquid Farm
  • Lillian / Antica Terra
  • Anthill Farms
  • Rhys
  • Corra
  • Andrew Will
  • Betz
  • etc., etc., etc.

Hmmmm. As I write this, I think I’m starting to better understand my “situation”;)!

I agree with Warren. Some of my recent favorite wines came from Board member recommendations, some with no scores attached or even critics’ reviews. Now, that said, I do like to read reviews from a few select critics who put considerable effort into the French regions that I adore.

So many statements in the previous pages I agree with.

  • Scores are lame.

I find labeling my “favorite tasters” helps me sort through the noise and more easily access useful information.

I gave up on the critics long ago and I don’t bother with any of that content. Instead, I rely on the voice of the community that Eric has fueled through his site, a great place for sure.

For my purposes, CT is very useful, but it requires quite a bit of sifting and reading between the lines.

I find CT to be a tool, THE most useful tool I use when deciding to make a purchase, with “she who must be obeyed” a close second.

I think people have danced around the biggest differentiator:

Critics are almost always rating/drinking the wine unbelievably young.

CellarTracker lets you get ratings from people while your wine ages.

That difference means everything to me, as I can now tell pretty well when my bottles are getting into drinking range.

Dave

Why does it have to be CellarTracker vs. the “professionals”? In my eyes, you get both. The great thing about CellarTracker is that it makes information available from all kinds of wine drinkers - newbies, seasoned vets, professionals, etc. Just look at the number of integrated professional writers/publications - some paid and some free. You have it all in one spot and can use whatever info you wish. It doesn’t get much better than that, and Eric deserves major kudos for allowing this to happen.

* Disclaimer - I am an integrated paid-subscription resource in CellarTracker, but long before that I was a CellarTracker user. I continue to be a proud and satisfied CellarTracker user who enjoys reading what others (both amateur and professional) have to say about a wine.

When I am browsing through my cellar and want to see what is most likely to be enjoyable at present, I will check CellarTracker to decide between a couple of choices. I have found it to be very helpful.
Now, if I just broke down and bought a Coravin, I would just trust myself.

I agree with Brad; I don’t see it as either/or. Now that leading critics like him, Tanzer, Meadows, Jancis, etc. are integrated with CT, I use CT as my portal to view both current CT reviews and the typically older professional reviews.

Like others here, I’m usually looking at CT for how something in my cellar is drinking now. Critics like Meadows will give drinking windows, but current CT notes can be more reliable.

Nowadays I rarely go to the Tanzer or Burghound homepage directly (except for en primeur offers), but the critics I follow, like them, provide valuable data points that I put into the mix with CT reviews.

One minor gripe with CT and the Burghound integration: it fairly often doesn’t work. I’ll be looking at wine x, click the Burghound link, and often end up at wine y. I don’t find that this happens with Tanzer, Gillman, etc. (Otherwise it’s a great feature.)

Howard

Why do you need to click? It WORKS just fine. The review is there. The score is there. The data is 100% accurate. The issue is the link to the same exact review on Burghound.com. How does it work with Gillman (sic)? John Gilman has no website with a database of his reviews. The ONLY digital representation of his reviews is found on CellarTracker.

Sadly, I should just kill all of these stupid links for Burghound. They have fixed it now (as of a month ago), but for 7 years, every time they published a new issue they changed the ID of their wines/reviews. So with each new issue they made scrambled eggs. You could reproduce the same issue just by bookmarking any Burghound review on the Burghound website in your web browser: with each new issue they would break all of your links. So please just don’t click the link. And when I have a full day with nothing better to do, I will just neuter all of the links to point to http://www.burghound.com. And when I want to pay someone for a full 2,000 man-hours of new work, I will eventually fix all of the links.
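To illustrate the failure mode with a toy sketch (hypothetical IDs and wine names, not Burghound’s actual schema): if a publisher re-keys its review IDs with each issue, a saved link keeps resolving, just to the wrong review, which is exactly why a click on wine x lands on wine y:

```python
# Toy model of link rot from re-keyed review IDs; everything here is made up.

# Issue N: an external site (or your browser bookmark) stores a link keyed
# by the publisher's review ID.
issue_n = {101: "Wine X review", 102: "Wine Y review"}
saved_link_id = 101  # bookmarked while reading "Wine X review"

# Issue N+1: the publisher reassigns IDs instead of keeping them stable.
issue_n_plus_1 = {101: "Wine Y review", 102: "Wine X review"}

print(issue_n[saved_link_id])         # Wine X review (what was saved)
print(issue_n_plus_1[saved_link_id])  # Wine Y review (where the link now lands)
```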

One thing people don’t realize is that (a) I don’t ask a penny from any professional reviewer on CellarTracker, (b) all of this data is shown to anyone who subscribes to these publications, regardless of whether they are a paid or unpaid CellarTracker user, and (c) I do all of the work to integrate all of the reviews. As Brad notes above, it is for the greater good of the industry, and I generally feel great about that. However, in some cases I literally get a Microsoft Word document and have to pay someone to slice and dice it all into an Excel spreadsheet. And then I pay someone else to connect the dots between each review and the wine in CellarTracker. It is not cheap. It is not easy. And some of the publications do strange things with their database IDs (or don’t even have their own databases).

There are more than 470,000 professional reviews on CellarTracker. There are only 300,000 reviews on Wine Spectator after 30 years. We have eclipsed that. There are a bit more than 200,000 reviews on eRobertParker. They have about 15 employees. We handle more than twice as many reviews in our spare time and from about 25 publications.

I think the notion of CellarTracker VERSUS professionals is totally bogus. The professional reviewers out there are amazing. We are eager to work with all of them ON BEHALF OF YOU, the consumers. However, it is not easy.

Eric, you should change those links to “click to fax me this review” and send all those requests to Burghound.

Good plan!

Hi Eric,

The point was a minor irritant that I assumed was a problem at the Burghound.com end, not your issue or problem. I actually really like and appreciate the way CT integrates with sites like Tanzer’s. It works most of the time with Burghound too, and in the few cases where it doesn’t, I’m already on Burghound.com and only a few clicks from the review I want.

I wouldn’t want to ask you to spend a lot of time, or pay someone else, to fix this minor issue; it’s not worth it, and it’s not your problem!

And I agree totally with your conclusion: it’s not CT vs. the professionals; both greatly enhance our experience. But if I were only allowed one source, it would be CT. Fantastic concept and site. Keep up the great work; it is much appreciated!

Cheers, Howard

CT is great for wines in the cellar or for backfilling, but as a buying resource it is tough to find notes from anyone but professionals for pre-release wines (which is often when you have to decide whether to pull the trigger), and tough to find a critical mass right at the time of release. As a result, for current new-release buying it is hard to beat the professionals; but since professionals seldom go back and re-review years later, it is hard to beat an aggregation of CT notes for ongoing data points. However, given the unknown mix of tasters, you really have to pay attention to who is posting the review. Overall, the two complement each other, and both prove valuable as sources of comment and knowledge.

CellarTracker (the platform) is fantastic, no doubt about it, but I’m with David in regarding CT as more akin to Yelp than Wikipedia, in that a big part of the experience of wine/dining is necessarily personal and subjective. The one wine where I think the current CT reviews are completely off the mark is the 1995 Faiveley Mazis-Chambertin. I bought a bottle at auction, and none of the current CT reviews mention the hard/gritty tannins (certainly compared to more recent vintages of this wine), and there are very few mentions of the high level of toasty oak it displayed (which I perceived as reminiscent of coconut). In short, a decent effort for the vintage, but I would not be anywhere near as effusive as most of the current reviews of this bottling. In fact, I think it’s drying out.

That said, CellarTracker is a very useful tool, but, like Yelp, I would emphasize that the identity of the person reviewing the wine is a critical factor in how relevant the review is.

Agreed!