How accurate is ResellerRatings.com?

The website ResellerRatings.com has been around for a very long time, and it is the de facto standard for checking out just about any online retailer in the industry. It provides a place for unbiased reviews of companies, written by people who have actually purchased from them in the past. However, in the last few years it has become increasingly easy to earn a score of 9/10 or better. Can the scores still be trusted, or has ResellerRatings.com become nothing more than a marketing platform? And if so, how is that possible when the reviews cannot be affected by the companies being reviewed?

I very much like the idea of aggregate rating systems. I trust the wisdom of a crowd more than I do any individual, provided the data is properly compiled. For example, I have always found it odd for someone to develop brand loyalty based solely on their own experience.

Person: I always use Brand-X memory, it's definitely the best in my experience.
Me: How many computers have you built with it?
Person: Three, and I've never had a single failure.


Why would this person base his decision on such a small set of experiences when he could look at reliability statistics based on thousands of computers? I believe it is the same reason people buy based on word of mouth alone. A small set of personal experiences feels more real to us than "distant" data, even if that data is far more accurate. While I understand this phenomenon as human nature, I still find it odd!

ResellerRatings.com represents a great concept that provides both aggregate data and personal reviews. I've personally followed it for over 10 years now, and have watched it change over time. As a system builder, we use it quite heavily as a trusted place to send our customers who want unbiased information about our company. I've been frustrated in the last year by reports of some of our competitors "gaming" the system. Companies can try to post fake reviews, but that is easily caught (I believe). But what if they are providing incentives for good reviews? Or, even harder to track, what if they are only asking already-happy customers to provide feedback? While these things are obviously against the rules at ResellerRatings.com, they can be difficult, if not impossible, to enforce. It is frustrating to me to think that other companies could be cheating their way into the same scores that we earn rightfully. It undermines the very purpose of the reviews, and takes away from the credibility of ResellerRatings.com.

One other important way of determining the quality of a company is to look at its Better Business Bureau (BBB) reports. A large number of complaints filed with the BBB is a strong indicator of a company that doesn't care. A complaint typically represents a situation that has gone horribly wrong, where even after discussion the customer still feels the company is not being fair. I've been in only a few of these situations over the years, and they're very unpleasant. If a company has a large number of BBB complaints, it should also have a poor score at ResellerRatings.com, right? I decided to take a snapshot of our industry to see if there is indeed a correlation between the two.

  Company           BBB complaints          ResellerRatings score
                    36 months   12 months   Lifetime   6 months
  Alienware               853         130       7.76       9.36
  AVA Direct                7           7       9.47       9.32
  CyberPower              480         ???       7.51       8.31
  Dell                  11630        4589       4.05       2.45
  Digital Storm            14           6       8.96       9.67
  IBuyPower               256         ???       7.13       8.44
  Puget Systems             3           1       9.9        9.71
  Velocity Micro           24           7       8.48       8.83
  Vigor Gaming              6         ???       9.61       9.89


I plotted this data on a graph, showing the number of complaints at the BBB against the DISTANCE from a perfect score at ResellerRatings.com. For example, since Puget Systems holds a 9.9 lifetime score, our value would be 0.1. At first, the results were hard to see. I switched to the "lifetime" scores for each company, since that number is not as easily skewed by recent events, and I also changed the plot to a logarithmic scale. The correlation amazed me!
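If you want to reproduce this kind of plot yourself, here is a minimal sketch in Python. It uses the 36-month complaint counts and lifetime scores from the table above, assumes numpy and matplotlib are installed, and is only my approximation of the approach described here, not the exact script behind the original graph.

    import numpy as np
    import matplotlib.pyplot as plt

    # Data from the table above: 36-month BBB complaint counts and
    # ResellerRatings lifetime scores.
    data = [
        ("Alienware",       853, 7.76),
        ("AVA Direct",        7, 9.47),
        ("CyberPower",      480, 7.51),
        ("Dell",          11630, 4.05),
        ("Digital Storm",    14, 8.96),
        ("IBuyPower",       256, 7.13),
        ("Puget Systems",     3, 9.90),
        ("Velocity Micro",   24, 8.48),
        ("Vigor Gaming",      6, 9.61),
    ]

    complaints = np.array([c for _, c, _ in data], dtype=float)
    distance = 10.0 - np.array([s for _, _, s in data])  # distance from a perfect 10

    # How well do the two measures line up on a log-log plot?
    r = np.corrcoef(np.log10(complaints), np.log10(distance))[0, 1]
    print(f"correlation of log(complaints) vs. log(distance from perfect): {r:.2f}")

    fig, ax = plt.subplots()
    ax.scatter(complaints, distance)
    for name, c, s in data:
        ax.annotate(name, (c, 10.0 - s))
    ax.set_xscale("log")
    ax.set_yscale("log")
    ax.set_xlabel("BBB complaints (36 months)")
    ax.set_ylabel("Distance from a perfect lifetime score")
    plt.show()

Plotting the distance from 10, rather than the raw score, is what makes the log scale useful here, since the scores themselves are all crowded into the narrow band between 4 and 10.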

This is great news for trusting the scores at ResellerRatings.com (but only the lifetime scores). We have a second set of data, and it lines up remarkably well. But take note of the logarithmic scale, because it is very important: each gridline represents 10 times the value of the one before it. What does this tell us? It tells us that even VERY slight differences in a ResellerRatings.com score matter a great deal. A score in the low to mid 9's is a big difference from a score in the upper 9's.
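To make that concrete, here is the same distance-from-perfect arithmetic for two example scores (my own illustration, not data taken from the site):

    import math

    # Distance from a perfect 10 for two example lifetime scores.
    high = 10.0 - 9.9   # a company in the upper 9's -> 0.1 from perfect
    mid = 10.0 - 9.3    # a company in the low-to-mid 9's -> 0.7 from perfect

    print(f"{mid / high:.1f}x farther from a perfect score")
    print(f"{math.log10(mid / high):.2f} decades apart on a log axis")

A 9.3 is seven times farther from a perfect record than a 9.9, which is most of a full gridline on the log plot, even though the two scores look nearly identical at first glance.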

I do think ResellerRatings.com has some work to do. Even without cheating, it is far too easy to obtain a 9/10 score, and far too easy for recent reviews to skew the score shown. If they don't address this soon, it will begin to affect the lifetime scores too. For now, the lifetime score is a much better indicator, unless you truly believe that something about the company changed in the last 6 months.

Are there other ways you objectively research the quality of a company? I'd love to hear about them!