If you think this is a joke, it is not. It seems the highly respected Consumer Reports magazine has decided to get into the wine review business. Who knew? I’m wondering if this mission fits into the publication’s stated direction of “…empowering people to protect themselves.” It seems altruism is alive and well at Consumer Reports. Let’s save the world from drinking cheap plonk! I think anyone can buy into this mission.
This news should not come as a surprise to subscribers. I know this because a representative asked me to provide media coverage on the subject a few years ago. I did not understand the intent of the inquiry and simply offered some constructive criticism, which went unanswered. I thought they would surely dump the idea and it would pass in time. It seems that is not the case.
But before I dive into my rant, I should probably go easy on this publication because, in all fairness, I have relied on this magazine for many large purchases—from cars to dishwashers and ovens. It’s my father who usually provides the guidance: “Did you check out Consumer Reports prior to making a decision?” That is probably a question heard more often by the over-forty crowd, but regardless, anyone can appreciate their tactics for testing products. Consumer Reports’ solid testing methods make informed buying decisions easy. But how and why does this apply to wine?
Well, when a Twitter follower of mine tweeted, “Consumer Reports tests 11 affordable sparkling wines,” it piqued my curiosity. What set me off was the term ‘test’ in the tweet. Can we really test wine to provide consumer recommendations? I think not. Then I read an article: “…Consumer Reports says that when it comes to sparkling wine, a higher price doesn’t necessarily mean higher quality. Four of the wines tested — including a $30 bottle of Piper Heidsieck, a French champagne — weren’t even good enough to make Consumer Reports’ initial cut.”
I can agree that higher prices don’t necessarily equate to higher quality. But a bottle of Piper Heidsieck bubbly wasn’t good enough…to rate? Or…what?
Really? According to what rating criteria? Curious minds want to know.
Then I realized I was probably getting agitated over nothing, and just like sparkling wine, agitation is not a good place to be. Wine critics use criteria to rate wine, so this magazine certainly has a right to voice its opinion. But I was curious to know what criteria they use to rate wine, so I headed to the Consumer Reports page for a little investigation. What instruments do they use to ‘test’ wine quality, and who performs the tests?
[quote]What’s behind our wine Ratings? Experts at our National Testing and Research Center tested 149 models in wines to see which ones perform best.[/quote]
Really? Are we talking cars, women or wine? I should not be so critical, but I am curious whether real wine experts are behind this statement. I would expect seasoned wine critics to take issue with terms such as ‘models’ and wines that ‘perform best’. I know I would. Even more problematic is another statement found on the site: “Scores are based on cutting edge performance, handling and ease of use.” I wish all of my wines were easy to handle and performed well around corners. Joking aside, the tasting criteria continue:
[quote]The tasters learn that personal preference must play no part in taste testing and that they should ignore irrelevant cues like color, which can make a bright red sauce seem more tomatoey than a dull orange sauce, even when it’s not. Some food categories—such as wine—require a specific expertise or knowledge, so experts in that category are used and they adhere to the same testing principles as our in-house panel.[/quote]
Fair enough. And I applaud the removal of personal preference. Too many times I see wine critics dog a wine because of personal preference, and that is not fair to the wine. But let’s get to comparing critic reviews with those from Consumer Reports. I pulled a few from the site:
Beringer Knights Valley 2008
Consumer Reports: 57
Cellar Tracker: 88.1 (average rating)
Wine Spectator: 88
Ken’s Wine Guide: 89.8 (average)
Adelsheim Pinot Gris 2010
Consumer Reports: 60
Wine & Spirits: 90
Cellar Tracker: 88.8 (average rating)
Banfi San Angelo Pinot Grigio 2010
Consumer Reports: 62
Wine Spectator: 87
Ken’s Wine Guide: 87.5 (based on 2 reviews)
The Prisoner Red Blend 2009
Consumer Reports: 71
Ken’s Wine Rating: 91
Yellow Tail Pinot Grigio
Consumer Reports: 59
ZD Chardonnay 2009
Consumer Reports: 53
Wine Spectator: 88
Cellar Tracker: 89.9
Wine Searcher: 88 (average rating)
Wow, Yellow Tail rated higher than ZD, Beringer, Dry Creek Vineyard and Murphy-Goode. At least they got Kim Crawford right. So let’s get back to what criteria they use to score…
[quote]…we develop standards for how an excellent product should taste. Our food experts develop these “criteria for high quality” based on how high-quality ingredients subjected to careful processing and handling would—and wouldn’t—taste. [/quote]
[quote]Many of Consumer Reports’ tests involve the use of sensitive instruments. A liquid chromatograph determines how much caffeine is in coffee, and an atomic absorption spectrophotometer determines the amount of heavy metals in plastics and toys. A digital photometer measures light and color of TV displays. To evaluate a food’s nutrition, we use sophisticated laboratory instruments. But how do we evaluate its sensory quality—the characteristics of its ingredients, the balance of its flavors? To do this, we use a very sensitive instrument called the human palate. [/quote]
Well that makes me feel better, knowing human intervention is part of the wine scoring process.
[quote]The people we hire have normal taste and odor acuity, but they’ve also shown the ability to recall and identify various flavors and textures, and to communicate—in precise terms—what their taste buds are telling them.[/quote]
OK, so is it good or bad if wine reviewers have normal taste and odor acuity? You tell me. Not having to prove I am a supertaster—priceless!
What comes as a surprise is that most of the rated wines fall at or below the 70 mark, and I find it odd that the ‘Wine Cube’ sold at Target outperformed most wines, including well-known brands such as Adelsheim, Banfi and King Estate. Is this a travesty of assessment or honest reviewing? I guess some might look at it as a breath of fresh air. At least their ratings fall between 50 and 90 points, a forty-point spread, which is wider than what most critics use. And a wine that scores 50 on this site is considered a good wine.
So what’s my gripe? I still take issue with using the term ‘testing’ for wine reviews. If we were in a lab testing and analyzing titratable acidity, pH or SO2 levels, that would be an accurate description and this story would take a different direction. But this is not the case. No sensitive instruments or spectrophotometers are used in rating a wine, so why call it testing? And using the term ‘model’ to describe the name and vintage has to vanish. We’re talking wine here, folks, not cars or appliances. We don’t need consumers walking into a store asking the clerk for the Beringer ‘2011 model’. I can only assume this publication uses these terms to maintain consistency throughout the site, but it can’t and won’t work for wine. And although I appreciate how the publication categorizes wines as Old World or New World, it creates confusion when they list a wine as both Old and New World in style. It can’t be both.
As for numbers, it’s confusing to add another rating method that doesn’t quite mirror the industry’s 100-point scoring system. This will only further irritate consumers who do not like ratings and completely confuse those who thought they knew what the numbers meant. Any wine publication that scores a wine around the 50-point mark usually deems it undrinkable or not recommended. So I suspect we won’t be seeing Consumer Reports shelf talkers, because no wine steward in their right mind would hang one (with a 50-point rating) next to a Wine Spectator shelf talker giving the same wine an 89-point rating. On a positive note, the online publication offers a good wine guide with lots of helpful information on wine lingo and food pairings, and the detail pages provide helpful (if short) tasting notes and user reviews. There are more terms that require adjusting (e.g. wine characteristics are listed as features and specs, just as they are for cars), but overall, this site has potential.
I applaud Consumer Reports for stepping up to the plate and shouldering a huge chunk of responsibility for the wine drinking community. Whether it proves successful may take some time to tell, but I would imagine a few small changes to naming conventions could make a big difference. And hopefully their next issue will include a primer on how to use their system. An introduction to the reviewers and their backgrounds would also go a long way toward building credibility.