Wednesday, September 14, 2011

Why Consumer Reports Sucks

When was the last time you picked up a Consumer Reports magazine? I don't know anyone who still subscribes to this popular "reviewed by consumers, for consumers" magazine except my local library and maybe a doctor's office or two. There are plenty of conspiracy theories claiming Consumer Reports isn't the gold standard of unbiased reviews it boldly claims to be, citing industry payouts, manufacturer kickbacks, and pharmaceutical company sponsorship. But when I recently glanced through a copy, I was dismayed with the publication for completely different reasons, and felt compelled to review the reviewers at Consumer Reports. Here's what I noticed.

No Quantitative Data!
I saw plenty of comparisons that caught my eye in the issue I was looking at, but what struck me was that none of them mentioned any numbers. For example, the June 2011 issue has a half-page article about electric razors. Thirteen razors were tested; six were recommended and one was named a Best Buy. That's all lovely, except the table compares things like "Noise," "Battery Life," and "Features" without ever explaining how these variables were measured. I'm sure Consumer Reports has its readers' best interests in mind, but to make a truly informed decision about which products are best, wouldn't it be nice to have a few extra details?

For example, how did they test the battery life? Did they simply turn each electric razor on and run it down until it quit? Did each tester use an assigned razor for a few minutes each day to shave normally, introducing variables like beard toughness and shaving technique? Or did they simulate shaving in a lab setting, cycling each razor on and off for ten minutes at a time over several days to see how well the battery held its charge? Maybe I'm splitting hairs here, but I'd like to see some concrete statistics from Consumer Reports, like "Razor B1 ran for three hours in our rundown test, while razor B2 ran for 2 hours and 45 minutes, so we rated B1 Excellent and B2 Very Good." That way the average Joe consumer could look at the data and make a better-informed buying decision, especially if there's a significant price difference. I don't know about you, but I can live with a few minutes less shaving time for a few dollars less. Yet Consumer Reports insists on hiding the reality behind its results with those annoying circle icons. Are they trying to appease the manufacturers by never publishing actual performance results - never showing just how far ahead of the pack a product is, or how little variation there actually is among products?
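
For the sake of argument, here's a minimal sketch of the kind of comparison real numbers would allow. The runtimes follow my made-up B1/B2 example above, and the prices are invented too, since Consumer Reports publishes neither:

```python
# Hypothetical numbers: the runtimes follow the made-up B1/B2 example
# above, and the prices are invented; CR publishes neither.
razors = {
    "B1": {"battery_min": 180, "price_usd": 69.99},  # 3 hours
    "B2": {"battery_min": 165, "price_usd": 54.99},  # 2 hours 45 minutes
}

for name, r in razors.items():
    per_dollar = r["battery_min"] / r["price_usd"]
    print(f"{name}: {r['battery_min']} min for ${r['price_usd']:.2f} "
          f"-> {per_dollar:.2f} min/dollar")
```

With these invented prices, B2 actually delivers more runtime per dollar, which is exactly the kind of trade-off a circle icon can never show you.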

Another example that concerned me was an in-depth article covering washing-machine testing. The lab rated things like "gentleness," "noise," and "vibration," which all seem like good things to consider when deciding which washing machine to buy. But if you want to know exactly how Consumer Reports tested each machine for gentleness, you'll have to look elsewhere, because there's no explanation in the article. And it's not just the more subjective measurements that are missing - something like noise could easily be measured with a decibel meter and some real numbers printed in the article to show consumers what the tested differences were. Of course the consumer's mileage may vary, since I doubt anyone is washing a test-sized load at home and noise levels differ depending on what you're washing - but why not do a little more science and a little less touting the ratings as gospel?
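
And if they did publish raw decibel readings, that little bit of extra science is genuinely easy. One wrinkle worth knowing: decibels are logarithmic, so averaging several readings correctly means averaging in the power domain, not averaging the dB values directly. A quick sketch with made-up readings:

```python
import math

# Made-up dB readings from several hypothetical test loads. Decibels are
# logarithmic, so an energy average converts each reading to linear power,
# averages those, and converts back to dB.
readings_db = [58.0, 61.5, 63.0, 59.5]

linear = [10 ** (db / 10) for db in readings_db]
avg_db = 10 * math.log10(sum(linear) / len(linear))

print(f"Energy-averaged noise level: {avg_db:.1f} dB")
# A plain arithmetic mean (60.5 dB here) slightly understates the
# contribution of the louder loads (energy average: ~60.9 dB).
```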

Re-Using Data in "Summary" Articles
I'm sure Consumer Reports isn't the only magazine that does this, but I noticed that the June 2011 issue has an in-depth (well, okay, as in-depth as you can get without actually mentioning any data) report on gas grills. They test and rate something like 65 gas grills in a three-page article with their pleasantly mysterious circles. Then in September 2011 there's a whole page devoted to "Great Grills" that shows much of the same information: the top performers from the first grilling test, with their results reprinted for readers to digest again, along with strikingly similar headlines. "Think safety" becomes "Give it a safety check," and "Focus on features" turns into "Choose features you'll use." And judging by last year's "Home and Yard Products roundup" edition, readers can plan on seeing the same tests again in the future.

I'm not sure I'd be comfortable paying for a magazine that claims to be the ultimate buying guide for consumers but really just feeds you the same content over and over again, even if it's only 3% recycled material. Just because their reader base is aging does not mean they've forgotten what they read three months ago! Maybe it's time to do a little comparison of buying guides and give the ratings magazines and websites some pretty circles of their own. And doesn't it scare you just a tiny bit when a magazine puts a little burst on its next-to-last page saying "Please remember CU in your will"? (This sits at the bottom of a page-long list of names of individuals involved in the organization. Why not run another review there instead of all those names!?)

All Reliability Data Comes From Readers
I'm sure Consumer Reports readers are nice people. But judging from the heavily targeted advertising in their magazines - life insurance, medical supplements, Medicare plans, plus promotions reminding readers that the Consumer Reports website is available 24/7 (just like the rest of the internet!) - I get the impression that their readers fall into a specific demographic that could cast a shadow on some of their reliability reports. I'm talking about the mature demographic, age 50 and up. A quick glance at Alexa.com's audience report confirms that most visitors to consumerreports.com are over 50.

The point I'm getting at is that when you ask your average 55-year-old how their dishwasher has been working, in an attempt to determine the most reliable brands, you're going to get an answer skewed by their age group's habits. I can hear them now: "Well, we only run it twice a week." How useful is that data to your average family of four? Is it three loads a week versus two loads a day? As another example, think about where this generation of readers drives. They drive to work, to the store, back home, and to church on weekends. They're not driving to work, to school, to soccer practice to pick up Johnny, to the music clinic to pick up Jenny, then to Walmart, the grocery store, home, the park, and back. I'm not saying the data is faulty - I'm sure Consumer Reports would say the real-world mileage of their reliability reports varies immensely - but wouldn't you like to know a few more details?
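
To make the skew concrete, here's a toy calculation - the per-cycle failure probability is completely invented - showing how the very same machine produces very different annual reliability numbers depending on how often it runs:

```python
# Toy model with an invented per-cycle failure probability; the point is
# only that annual reliability depends heavily on how often the machine runs.
P_FAIL_PER_CYCLE = 0.0005  # invented: roughly 1 failure per 2,000 loads

usage_profiles = {
    "retired couple (2 loads/week)": 2 * 52,
    "family of four (2 loads/day)": 2 * 365,
}

for profile, cycles_per_year in usage_profiles.items():
    # Probability of at least one failure over a year of independent cycles.
    p_fail_year = 1 - (1 - P_FAIL_PER_CYCLE) ** cycles_per_year
    print(f"{profile}: {cycles_per_year} cycles/yr -> "
          f"{p_fail_year:.1%} chance of a failure that year")
```

Under these made-up numbers, the light user sees about a 5% chance of a failure in a year while the heavy user sees around 30% - same dishwasher, wildly different "reliability." A survey dominated by light users will flatter every brand.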

In keeping with the "just take our word for it" theme of the entire publication, Consumer Reports doesn't bother to tell you how many readers they surveyed to get their data, or what the margin of error on that survey might be. So who's to say they aren't just calling a few extra readers until they get the results they're looking (or getting paid) for?
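
For the record, the margin of error on a simple survey percentage is trivial to compute and publish. Here's a sketch using the standard 95%-confidence formula, with an invented repair rate and invented sample sizes, since CR discloses neither:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a survey proportion p
    with sample size n (normal approximation)."""
    return z * math.sqrt(p * (1 - p) / n)

# Invented example: 12% of surveyed owners reported needing a repair.
p = 0.12
for n in (100, 1_000, 10_000):
    moe = margin_of_error(p, n)
    print(f"n={n:>6}: 12% +/- {moe:.1%}")
```

With 100 respondents that 12% repair rate is really "12% plus or minus 6," which could reshuffle half the reliability rankings; with 10,000 it's pinned down to well under a point. That's exactly why the sample size matters, and exactly why it should be printed.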

The Bottom Line
Maybe I'm just spoiled by this generation's fascination with user reviews and the full range of review websites out there that take pride in exposing all the gory technical details of the products they work so hard to review accurately and completely. But the classic Excellent, Very Good, Good, Fair, and Poor circles of Consumer Reports are starting to feel outdated without any real substance behind them, and their attitude of "trust us, we're Consumer Reports" doesn't inspire confidence, especially when they're feeding readers the same data multiple times and mining an increasingly homogeneous reader base for reliability info that may not reflect real-world results. What's your take on Consumer Reports? As always, I'm willing to consider any viewpoint in your comments, provided you can back it up.

8 comments:

  1. Growing up, our family read Consumer Reports like a consumer bible, checking ratings on everything from tools to potato chips; it was held in the same esteem as the New York Times. My first teen job was at a television repair shop - yes, there was a time when appliances were repaired and not discarded. I was surprised to see the repair techs and salespeople scoff at the publication. I thought it was mere sour grapes for not having our high-end equipment listed... only, it wasn't.

    CR tends to construct its variables one-dimensionally, and the most important variable is price, not cost over time. One issue described a GE television as pretty good, but an RCA television as awesome. We knew that both televisions were identical and made by a company called Thomson, and both had 25% failure rates in the first 18 months of ownership; i.e., both were terrible. The pricey Japanese and European stuff we sold was maligned as too costly, despite the fact that we knew it to last into its second decade.

    When a Korean manufacturer with a bad reputation wanted to sell televisions again, we asked them to supply three units of various models that we would strip down and examine for quality and craftsmanship. We found them all to be of excellent quality and sold them by the case. CR reported the brand as having a bad reputation and said they would certainly break down. As it turned out, Samsung would emerge as a quality brand, and CR would emerge as short-sighted.

    Quality is something CR never considers. What materials are used in the construction of electric razors - 304 stainless? 316? Or plain cheap chrome-plated whatever? To them it's just a function of features and price. They also don't seem to test multiple units of each product to account for manufacturing variation.

  2. CR may not be "perfect," but where else can you get decent reviews from a more-or-less unbiased source? Recently it's been shown that most consumer reviews online are bogus (people are paid to write good reviews on Amazon, Yelp, etc.), so until there's a better source for good reviews, CR is about the best out there.

  3. It's true, Consumer Reports is horrible.

    Some of the reasons I hate it: they don't do in-depth reporting on technical details.

    They give horrible online summaries in an effort to get you to pay to see actual reviews.

    Their pools of tested products are shallow, and don't include comparisons to all-time classics, new and upcoming products, or international variants.

    Overall, sloppy "reporting."

    Replies
    1. Hi Blitz, thanks for commenting - I agree with you on the lack of technical details and the shallow product pools. One thing you mentioned was having to pay to see the actual reviews. Not sure if you know this, but most local public libraries let you access the full versions of Consumer Reports articles from their websites via research links. Just ask them about it! (And yes, even though I don't particularly like Consumer Reports, I still occasionally read their reviews... they're better than nothing.)

  4. I use CR mostly online and don't have a magazine subscription. The drawback of this is CR doesn't include review dates in its online assessments. So for items with a limited shelf life - like computer components - you don't know whether what's recommended is 2 months old or 2 years old. I wrote CR about this drawback, but got no response. Better than nothing - but only just.

  5. The site sucks for searches too. Search for "screen door" and you mostly get a bunch of cars, plus a TV, a fireplace, an iPhone, and a safety gate. Absolutely nothing on screen doors.

  6. Just subscribed to CR. It is pretty subjective. Will cancel ASAP. They should get a serviceman to tear into a unit to look for obvious quality issues; then at least it would have relevance. According to the CR rep, they never do this.

  7. Blank page after blank page of what we're supposed to see as "graphics." Look at June 2020: 25% of nothing.
