From the LA Times:
<a href="http://latimesblogs.latimes.com/dailydish/2009/11/why-wine-ratings-are-flawed-wine-spectator-announces-top-100-wines-of-2009.html">Wine Spectator reveals Top 100 Wines, but are all wine rating systems flawed?</a>
<blockquote>A few years ago, Hodgson joined the California State Fair wine competition advisory board, which allowed him to run a controlled scientific study of its tastings.
The results, published in the Journal of Wine Economics, showed that the judges' ratings varied by ±4 points on a standard 100-point rating scale. And "only about one in 10 [judges] regularly rated the same wine within a range of ±2 points."
In September in the private wine newsletter the California Grapevine, Hodgson discussed his analysis of the complete records of several wine competitions. "The distribution of medals," he wrote, "mirrors what might be expected should a gold medal be awarded by chance alone." Ouch. </blockquote>
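To get a feel for why ±4 points of scoring noise is enough to make medals look like a lottery, here's a quick back-of-the-envelope simulation. This is my own sketch, not Hodgson's actual analysis; the wine's "true" score and the medal cutoffs are made-up assumptions.
<pre><code># Sketch: how often does judge-to-judge noise alone change a wine's medal?
import random

random.seed(1)

TRUE_SCORE = 90          # assumed "true" quality of one wine (hypothetical)
GOLD, SILVER = 92, 88    # assumed medal cutoffs (hypothetical)
TRIALS = 10_000

medals = {"gold": 0, "silver": 0, "none": 0}
for _ in range(TRIALS):
    # Noise sized so scores typically land within ~4 points of the true score
    score = TRUE_SCORE + random.gauss(0, 4)
    if score >= GOLD:
        medals["gold"] += 1
    elif score >= SILVER:
        medals["silver"] += 1
    else:
        medals["none"] += 1

for medal, count in medals.items():
    print(f"{medal}: {count / TRIALS:.0%}")
</code></pre>
Under those assumptions, the very same wine golds roughly a third of the time and misses a medal entirely another third - the noise, not the wine, picks the winner.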
My angle? Read the reviews, do your own tastings, see which reviewers you agree with - and use their work to your advantage. I don't care how many points Spectator or Wong gave it: if you hate it, <em>it sucks.</em>