I agree. I had the 2019 Quivet LPV in October and it was much better than a 92 point wine, to my palate. I tend to think Galloni's reviews are at least a few points lower than my own take on most of Mike's cabs.
I completely disagree re: why publish it. If a reviewer believes the wines are "X" and doesn't enjoy them, they should share those reviews just as much as reviews of wines they do enjoy.
Stewart got crushed by these scores, which is interesting. Galloni doesn't seem to love big fruity Napa wines and prefers the more restrained ones when compared to LBP. Scores were about in line with what I'd expect. He really loves Sinegal wines for some reason too.
And if yours is about Stewart, he had a number of low 80s scores. It is what it is. I respect when reviewers say what they think even if I disagree (like the SQN review I posted above).
I don't know what you're looking at. Myriad got generally great scores, with a disappointing score for Bourn and GIII Empyrean. I really appreciate the apparent honesty and Antonio not being afraid to let us know when he thinks a more expensive wine is worse than less expensive wines from the same lineup.
Does anyone know if he had barrel scores out prior to most of these wines being released? This is information I would pay for if it were available before the wines were offered for sale.
Our Berserkerday friends Erin and Massimo did well with AG: Di Costanzo Caldwell = 97 points. FWIW, the 2013 Farella is starting to wake up and it's everything I had hoped.
I subscribe to Vinous chiefly for his Italian coverage and Neal's coverage of France. I do not subscribe to Jeb's site; however, from reading WB and his scores and reviews on various retailer sites, I feel I have an understanding of what he likes in wines. Comparing Galloni's notes in his report for some of the wines discussed here, the scores do not correlate, but the notes are similar. For some of the wines he expressly states the wine isn't for him because he doesn't like how fruit forward and how big it is. To me each critic is correct and honest; their likes and dislikes explain the score variation.
I feel like the critics are damned if they do and damned if they don't. People complain when certain critics dare to give out several 100s, 99s, and 98s, but when they go to 90 or, god forbid, the dreaded 80s, people complain again. While it's very possible that the same people aren't complaining about both ends of the spectrum, it seems like people want critics to use a 90-96 point scale. Furthermore, it sometimes seems as if people are looking for validation of their tastes and preferences more than information to help them better understand a wine, vintage, or producer.
I really appreciate when critics are willing to give their honest opinion of a wine, even if they really disliked it, whether I agree with them or not. It's another useful datapoint for me.
When I said I appreciate his honesty, I more meant I appreciate his ability to score and review without seemingly suffering from an overwhelming effect of bias or cognitive dissonance. While I don't think any other reviewers are being dishonest, at times I feel scores may be just a bit suspect when they align nearly perfectly with the price of the wine, especially when the wines weren't tasted blind. I think a lot of, if not most, people (both professional and casual) have trouble completely separating the price and reputation of a wine or producer from their thoughts on the wine itself. This is supported by basic principles of psychology, and often seems evident in many pro publications as well as on CellarTracker and Vivino.
While I respectfully disagree with your above comments, my question isn't really centered around scores; it's around the content of the reviews themselves.
I appreciate people calling it like they see it, but when multiple publications (whether you like them or not) are all in line and another's experience is drastically different, that strikes me as more than odd.