Consumer Reports' "data" is biased and isn't worth taking too seriously.
Here’s why: Car rankings are based exclusively on surveys completed by Consumer Reports (CR) subscribers. This, in my view, is a fatally flawed approach.
1. CR subscribers aren't representative of the general public. Quantcast.com, which estimates demographic and user data for millions of websites, has provided the following demographic “snapshot” of ConsumerReports.org (see the original report here):
[Chart: Demographic data about the ConsumerReports.org website audience, as determined by Quantcast.com]
As you can see, the typical ConsumerReports.org visitor is more likely to be wealthy ($100k+ annual household income), older, and college-educated. While there’s nothing wrong with being wealthy, older, or educated, I suspect these consumers are a bit biased against American car brands.
Hybrid buyers, for example, are significantly more likely to be educated and wealthy (see Scarborough Research). The best-selling hybrid? A Toyota.
American cars don’t sell well in the country’s wealthiest cities – further evidence that wealthy people are biased against American vehicle brands.
For anyone who thinks that Quantcast’s data might be off, check out this 2009 study of CR’s auto buying guide, which was sponsored by CR. According to the data on page 34, the average CR reader (either online or via magazine subscription) is wealthier, older and more educated than average.
2. CR data is noisy. By “noisy,” I mean that it varies quite a bit from year to year. In this year’s study, Volvo and Chrysler fell 10 and 8 spots in the rankings, while GMC, Cadillac, and Audi skyrocketed 10, 14, and 16 (!) slots.
[Chart: Consumer Reports auto reliability data is noisy]
How can one brand’s reliability ranking surge from the bottom 5 to the top 10 in just one year? Because Consumer Reports data is very “noisy,” and hence not terribly accurate.
Are we honestly supposed to believe that Audi was ranked as one of the least reliable brands last year, and yet somehow ranked top 10 in reliability this year? This is obviously a result of a limited amount of data, which brings me to…
3. CR uses as few as 100 surveys to rate vehicles! That’s right, folks – 100 measly surveys is all it takes for Consumer Reports to assess a specific vehicle’s reliability rating.
100 data points is hardly enough to form a scientific evaluation – it’s embarrassing that CR would admit to this methodology, but they’ve done precisely that:
…The scores are presented as a percentage better or worse than the average of all cars. The minimum sample size is 100 vehicles, but Consumer Reports often gets many more.
While CR might “often” get hundreds of surveys, this hardly seems like a good system. It also explains Audi’s wild change in rankings, doesn’t it?
The bottom line: Don’t trust Consumer Reports quality and reliability data, at least as far as automobiles are concerned.
This is who you are getting your Consumer Reports automotive advice from: