The most interesting event of my week: I had lunch by myself and found a 2017 Bdx issue of Wine Spectator magazine in my work cafeteria… I spent 30 minutes eating and comparing the scores provided in the article to CellarTracker scores. … Holy shi7, almost all of the CT scores were within a point of what WS projected.
I know this is only 5 years in, but that is impressive…
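For anyone who wants to repeat the exercise at scale, the same comparison is a few lines of script. A minimal sketch below; the château names and scores are made-up placeholders, not the article's actual data:

```python
# Placeholder data: substitute the real WS article scores and CT averages.
ws = {"Chateau A": 94, "Chateau B": 91, "Chateau C": 89}
ct = {"Chateau A": 93.5, "Chateau B": 91.2, "Chateau C": 90.1}

# Count the wines whose CT average lands within 1 point of the WS score.
matches = [w for w in ws if w in ct and abs(ws[w] - ct[w]) <= 1]
print(f"{len(matches)}/{len(ws)} wines agree within 1 point: {matches}")
```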
This is a problem that often happens with crowdsourced ratings like CT. The CT ratings tend to be “anchored” by prior knowledge of the critic scores. That’s why I think it’s much better to pay attention to who rated it and what they wrote in their TN rather than the average numerical scores that were assigned.
I think you may be wrong. I never read WS. I don't know anyone who does, or who uses them for reference. I do find that CT scores with more than 10 numerical entries get close to my own values.
If you think CT influences my scoring, or that my score is ultimately anchored to a WS score, I think that is a stretch.
I don't read much into CT notes for the current vintage, but I find them extremely helpful for cellared wines. At the very least, I figure that if someone is going to invest the time and energy in cellaring a wine, they're more likely to have an idea of what they're talking about.
Not sure this is some huge insight, given the overlap between the wines WS reviews and those held by CT users. Curious, I took the top 250 '17 Bordeaux held by CT users, and 70% of the scores fall between 89 and 94 points, which I think isn't very surprising. The point is that the great majority of these wines are reviewed within a very narrow score range.
Agree. Yaacov beat me to this. I think the anchoring effect can occur, but the OP is talking about +/-1 point on what is close to a 10-point effective range, so a +/-10% difference, which could easily be statistically significant. Looking at the absolute magnitude as if this were a true 100-point scale is misleading.
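To put a rough number on that: suppose a WS score and a CT average were each drawn independently and uniformly from the 89-94 band. How often would they agree within a point purely by chance? A quick sketch (uniform draws are an assumption; real scores cluster even tighter, which would push this baseline higher):

```python
from itertools import product

band = range(89, 95)  # integer scores 89..94, where ~70% of these wines land
pairs = list(product(band, repeat=2))  # every (WS, CT) combination, equally likely
agree = sum(1 for ws, ct in pairs if abs(ws - ct) <= 1)
print(f"{agree}/{len(pairs)} = {agree/len(pairs):.0%} within 1 point by chance alone")
# -> 16/36 = 44%, so "almost all within a point" has to beat a ~44% chance baseline
```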
The anchoring effect is probably centered less on what critics write and more on what the earliest raters in CT write. Theoretically, those first movers could be influenced by critics. With Bordeaux, though, the CT first movers (e.g., Jeff Leve) are often ahead of the critical publications.
Not surprising at all. I would hope that if it's less than 89, people aren't holding a ton of it and entering it into CT. If it's more than 94, most people can't afford it, and there isn't that much produced.
This also tracks most professional scores (a narrow 89-94 range for the most part). However, it seems reviewer scores may run 93-99 these days. On a side note, I've been considering going back and doing an analysis of the rating breakdowns over the years to see whether we're trending toward higher scores more than in the past.
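If anyone wants to run that analysis, the skeleton is simple. A sketch below, assuming a hypothetical reviews.csv export with vintage and score columns; CellarTracker's real export format may differ, so adjust the field names accordingly:

```python
import csv
from collections import defaultdict

# Hypothetical export: one row per review, with 'vintage' and 'score' columns.
scores_by_vintage = defaultdict(list)
with open("reviews.csv", newline="") as f:
    for row in csv.DictReader(f):
        try:
            scores_by_vintage[int(row["vintage"])].append(float(row["score"]))
        except (KeyError, ValueError):
            continue  # skip rows with missing or non-numeric fields

# A steady climb in the per-vintage means would support the score-inflation
# hunch; flat means across vintages would argue against it.
for vintage in sorted(scores_by_vintage):
    s = scores_by_vintage[vintage]
    print(f"{vintage}: mean {sum(s)/len(s):.1f} over {len(s)} reviews")
```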
What this does tell me is that CT, Yelp, and all other crowd-sourced reviews don't always get it right, but more times than not they do, given a statistically relevant sample size.
Not to mention that the commercial interest in reviews is toward good reviews, not bad ones. Reviews exist for two reasons: to point you to higher-scored wines and, perhaps more importantly, to emotionally confirm your purchase decisions.
I don’t think any commercial reviews necessarily affect CT scores to a great extent, but I do agree that the other reviews in CT affect subsequent reviews to some extent. There are a lot of people, whether we like it or not, who don’t want to be “different”. So if a bunch of people rate a wine 90 on CT, most people won’t deviate much from that no matter what their personal experience is. You have to have a lot of intestinal fortitude to rate a wine 80 points if everyone else is rating it a 90, or at least trust your own palate.
This is why I agree with an earlier comment about paying attention to who posted scores before you post yours. Eventually you figure out whose palate aligns with yours. My palate tends to align with Jeff Leve's, so I pay attention to his reviews. Others, perhaps not so much.
I don't read, and haven't read, any of the "professional" reviews for decades. I do read tasting notes in CT from people that I know to have a palate similar to mine. That is all I need. I couldn't care less what any of the score grinders have to say.