ReviewMeta Analysis Test: Rating Trend

April 27th, 2016

The Rating Trend test analyzes the rate at which reviews for a particular product are submitted over time.  In a perfect world, if reviews were created randomly – independent of any outside forces – we would expect to see these reviews evenly distributed across all the dates that the product has been available for review.  When we analyze a given product and see spikes of reviews submitted on certain days, this can indicate outside forces causing reviews to be created unnaturally.

We call days with unnatural spikes of reviews “high volume” days, and here’s how we determine them: first, we estimate the number of days the product has been available for review by counting the number of days from the first review to the last review.  Then we run a statistical model to estimate the maximum number of reviews per day we’d expect to see if the reviews were created randomly.  If any actual days exceed this threshold, we consider them to be “high volume” days.
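The article doesn't specify which statistical model ReviewMeta uses, but the idea above can be sketched with a standard assumption: if reviews arrive randomly and uniformly, the count on any single day is approximately Poisson-distributed with a rate of total reviews divided by total days. Here is a minimal, hypothetical illustration (function names and the Bonferroni-style cutoff are my own, not ReviewMeta's):

```python
import math

def high_volume_threshold(total_reviews, total_days, alpha=0.01):
    """Estimate the maximum reviews per day we'd expect under a uniform
    random model. Daily counts are approximated as Poisson with rate
    lam = total_reviews / total_days; the threshold is the smallest
    count k whose upper-tail probability, multiplied by the number of
    days observed, drops below alpha (so almost no day exceeds it by
    chance alone)."""
    lam = total_reviews / total_days
    cdf, pmf, k = 0.0, math.exp(-lam), 0
    # Walk the Poisson distribution until the tail beyond k is tiny.
    while (1.0 - cdf - pmf) * total_days > alpha:
        cdf += pmf
        k += 1
        pmf *= lam / k
    return k

def high_volume_days(reviews_per_day, total_reviews, total_days):
    """Return the days whose review counts exceed the threshold."""
    threshold = high_volume_threshold(total_reviews, total_days)
    return [day for day, n in reviews_per_day.items() if n > threshold]
```

For example, a product with 100 reviews spread over 100 days averages one review per day, but a single day with dozens of reviews would far exceed the Poisson threshold and be flagged as high volume.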

While these spikes may be caused by external forces, that doesn't necessarily mean they are caused by dishonest methods. Pretty much anything that could cause an uptick in sales could also, in turn, cause a high volume day. For example:

  • Promotion/Sales: If a product is running a promotion or a sale, sales volume may increase, which will in turn increase the number of reviews in the following weeks or months.
  • Advertising or more media exposure: If a brand is promoting a product through a strong advertising campaign or receives media coverage (positive or negative), it may result in a boost in sales and a subsequent increase in reviews.
  • Seasonal items: Items that are usually bought at a certain time of the year may result in more high volume days. It makes sense for reviews on plastic Santas to not be evenly distributed across the year, but rather cluster around December and January.

It is important to keep in mind that even if these factors are driving reviews, it is still unlikely that they would cause reviewers to post reviews all on the same day.  If a big marketing campaign boosted sales one month, you’d expect to see a gradual increase in reviews the following month, not just one single high volume day of reviews.

While some spikes may be completely natural, certain forms of review manipulation can also cause high volume review days. These can range from the benign to the sinister:

  • Asking for reviews: This seems like an innocent enough request for brands to make of their customers. A brand might include a request for reviews in their customer emails or request them via social media. However, if the brand only asks a specific group for reviews, such as their die-hard fans, it might result in some biased reviews.
  • Incentivized reviews:  A brand might offer their product to a group of reviewers for free or at a discount in exchange for a review.  
  • Fake reviews: A brand might order a run of fake reviews from a third party service which are delivered within a short window of time.

In order to determine whether the high volume review days are malicious or benign, we group all reviews from high volume days and check their overall percentage. While it's not immediately problematic to see a small percentage of reviews created on high volume days, an excessive amount can trigger a warning or failure. Next, we check to see if reviews created on high volume days have a higher average rating than reviews created on normal-volume days.  If they do, we check whether this discrepancy is statistically significant.  This means that we run the data through an equation that takes into account the total number of reviews along with the variance of the individual ratings and tells us whether the discrepancy is more than just the result of random chance. (You can read more about our statistical significance tests here.) If the high volume days have a significantly higher rating than normal volume days, it strongly suggests that the reviews created on high volume days are not benign and are unfairly inflating the overall product rating.
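The exact equation ReviewMeta uses isn't given in the post, but the comparison it describes resembles a standard two-sample mean test. The sketch below uses a Welch-style statistic with a normal-tail approximation (reasonable for large review counts); the function name and the 0.05 cutoff are illustrative assumptions, not ReviewMeta's actual parameters:

```python
import math
from statistics import mean, variance

def rating_discrepancy_significant(high_vol_ratings, normal_ratings, alpha=0.05):
    """Hypothetical sketch of a two-sample significance check: are the
    ratings on high-volume days higher than on normal-volume days by
    more than random chance would explain? Uses a Welch-style statistic
    with a normal approximation, which assumes reasonably large samples."""
    n1, n2 = len(high_vol_ratings), len(normal_ratings)
    m1, m2 = mean(high_vol_ratings), mean(normal_ratings)
    # Standard error accounts for both sample sizes and both variances.
    se = math.sqrt(variance(high_vol_ratings) / n1 + variance(normal_ratings) / n2)
    if se == 0:
        return m1 > m2  # no variance at all: any gap cannot be noise
    z = (m1 - m2) / se
    # One-sided p-value from the standard normal upper tail.
    p = 0.5 * math.erfc(z / math.sqrt(2))
    return m1 > m2 and p < alpha
```

Intuitively, a one-star gap backed by hundreds of reviews is significant, while the same gap across a handful of widely scattered ratings could easily be chance.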

3 responses to “ReviewMeta Analysis Test: Rating Trend”

  1. Kathy Zach says:

    First off, I just want to say that I have recently discovered your amazing website and I am very happy I did! I have passed the word along to all my friends to use ReviewMeta if they are interested in “weeding out” erroneous reviews.

    I am wondering though, on this specific test of Rating Trend, if you have considered the impact that Black Friday and Cyber Monday have on the purchase (and review) of a product? I have personally been shopping on Black Friday (BF) for many years and recently added Cyber Monday (CM) to my Christmas shopping. I can absolutely say that there is a major influx in purchases that wouldn’t normally be seen.
    I do see that you have a very good algorithm in place to see if a suspiciously high volume of reviews are real or fake; and you mention that promotions, advertising, and seasonal items can lead (in part) to an increase in reviewing. However, BF and CM might explain the large amount of reviews on the days following these two major shopping days. Or maybe you have already considered this and just didn’t mention it. Just a thought!

    • Thanks for the kind words, Kathy!

      Just like any test, the Rating Trend test isn’t always going to be 100% perfect. However, there is a bit of a safeguard – we don’t just look at the number of reviews on a given day, we are also comparing the rating from reviews on high-volume days against the reviews from normal-volume days. So if there’s a huge spike in reviews, but those reviews are giving the product the same rating as all other reviews, it will likely only result in a “WARN” and the impact on the overall report will be minimal.
