For years, everyone I know has been a fan of Consumer Reports Magazine. To their credit, they remain one of the largest advertiser-free publications in existence. They also maintain one of the largest, if not the largest, databases of customer satisfaction and repair data for vehicles and products (though it’s typically up to a year old by the time it gets to print). But this year they handed out some 2010 Product of the Year awards that cause us to find Consumer Reports testing methodologies flawed, or to conclude that they simply don’t understand certain areas of the market. For one, they just awarded Product of the Year to a drill that comes with NiCad batteries. Huh?
In their latest edition we found this entry:
Power Drill: Porter-Cable PC180DK-2 ($100)
Consumer Reports found the 18-volt Porter-Cable PC180DK-2 to “offer speed and power” in addition to being packed with helpful extras like an LED work light and a smart charger for its two nickel-cadmium batteries.
Wow, an LED light and a smart charger for its NICKEL-CADMIUM batteries? How advanced!
Why We Find Consumer Reports Testing Methodologies Flawed
While we generally like Porter-Cable, a $100 18V product that runs on NiCad batteries is not exactly something we’d call the “Product of the Year”. What about all the new 12V products that have hit the market, some of which are now in their second and third generation? Have the testers at Consumer Reports even used lithium-ion? If all they measure is power and speed out of the gate, we can see how this might be a decent pick for the money; but what they don’t tell you is that NiCad batteries last, in our experience, about 1-2 years at best. And during that time their run-time grows gradually shorter as the dreaded memory effect sets in.
Coupled with the slow fade of battery power, you end up with a tool that runs at full power for a while, eventually ratchets down, and then ceases to charge altogether.
Product of the Year? I think this is a good indication that electronics, an area where I’ve found Consumer Reports to be far off the mark in the past (their selections seem limited to what you can purchase at Sears, Walmart, and Best Buy), aren’t the only category where the company has little perspective when making recommendations.
Other Areas for Concern
Other items where we find Consumer Reports testing methodologies flawed are, believe it or not, vehicles. Despite the company’s excellent database and commitment to testing gas mileage, braking, and steering, it has no real sense of a luxury vehicle’s value in terms of design and handling. In addition, gas mileage, price, and their reliability reports trump just about everything else. This is why a particular truck can win the Top Pick award despite being panned in just about every comparison we’ve seen from professional car and truck magazines. Those publications recognize that consumers value handling, drivability, and performance above fuel efficiency and price, and so that truck ranks behind the other two domestic models in their tests. To avoid getting hammered by the loyalists, we’ll let you research that model yourself.
Getting the Whole Picture
A related issue demonstrates how much it matters how a company “weighs” different factors. Consumer Reports recently pulled the iPhone 4 from their recommended phone list because of an antenna issue: holding the phone a certain way decreases signal strength, causing dropped calls. In this case, they weighed the problem without taking into account the simplicity of the solution, prioritizing it over the product’s entire composite of features and advantages. Using any case (even a $5 one) eliminates the problem, and everyone I know uses a case for their phone, so the problem was moot to begin with. It seemed more like a media phenomenon and an excuse to generate publicity and traffic than a real issue.
Editor’s Note: They did pick the Bosch SHE55M1 dishwasher – a choice we’d heartily back!
So, do you think Consumer Reports is useful for power tools (or anything other than toaster ovens)? Or do you find Consumer Reports testing methodologies flawed? Sound off in the comments, and let us know.