Online reviews are a quick and easy way to get a sense of how companies similar to your own have experienced different software and services. Choosing a winner should be really simple – just look for the solution with the highest average rating and start digging into individual comments to understand the strengths and weaknesses.
Sounds easy, right? Wrong. When I started digging into G2 Crowd reviews, I noticed a pretty clear pattern: grade inflation.
To be a top-quartile reviewed company, you needed a minimum review score of 4.4. Bottom-quartile companies had scores at or below 4.1. But there's not much of a difference between those two numbers. If this were school, it would mean that scoring 88% or higher on a test is an A, while 82% or lower is an F. Six percentage points separate excellence from mediocrity. That is one weird grading scale.
To be clear, this isn’t anyone’s fault or a conspiracy by tech companies with bad products. It just reflects the nature of review aggregator websites. Most people who take time out of their busy days to write a product review on a third-party site probably have good things to say. And if you’re writing a negative review, you have to be seriously motivated.
Even though the average review score is mostly worthless, there’s still a ton of great information being provided in the reviews. So I worked to design a better way to interpret scores and get a good representation of how people REALLY feel about companies and their products.
The Love Versus Hate Ratio (LVHR)
And thus was born the Love-vs-Hate Ratio (LVHR): the number of five-star reviews divided by the combined count of one-star and two-star reviews.
Here’s what I like about the LVHR:
- Easy to calculate: with nothing more than a basic calculator, you can scan a review distribution and work out the LVHR in seconds
- Provides clear distinction among competitors: the LVHR dramatically widens the difference between top performing products and lower performers. Take a look at the adjusted performance bands for the same products below:
- Focuses on passionate advocates and detractors: my approach borrows heavily from the Net Promoter Score concept. By ignoring the middle-of-the-road reviews (three-star and four-star), you can home in on customers who were a great fit for the product versus those who were a terrible fit. Using average ratings caps the maximum score at 5 (assuming all perfect ratings). The LVHR lets you find the products that customers REALLY love, some of which score over 100 points across thousands of reviews.
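To make the calculation concrete, here's a minimal sketch of the LVHR in Python. The two products and their review counts are made up for illustration (they aren't real G2 Crowd data), but they show how two products with similar average ratings can have very different ratios:

```python
def lvhr(star_counts):
    """Love-vs-Hate Ratio: five-star reviews divided by
    the combined count of one-star and two-star reviews."""
    haters = star_counts.get(1, 0) + star_counts.get(2, 0)
    if haters == 0:
        return float("inf")  # no detractors at all
    return star_counts.get(5, 0) / haters

# Hypothetical review distributions (star rating -> review count)
product_a = {5: 900, 4: 300, 3: 100, 2: 40, 1: 10}  # average ≈ 4.5
product_b = {5: 600, 4: 450, 3: 150, 2: 90, 1: 60}  # average ≈ 4.1

print(lvhr(product_a))  # 900 / (10 + 40)  = 18.0
print(lvhr(product_b))  # 600 / (60 + 90)  =  4.0
```

The averages differ by only 0.4 stars, but the LVHR shows Product A with more than four times the love-to-hate ratio of Product B.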
As a final treat, I’ve included below a chart mapping 30 products to my LVHR scale. Any surprises on the list?