ReviewMeta Analysis Test: Overlapping Review History

April 27th, 2016

At ReviewMeta, we do a comprehensive examination of the reviewers behind the reviews.  One test that we use to help us identify unnatural reviews is comparing the review history of each reviewer and seeing where there is substantial overlap.

Here is how it works: for each product, we make a list of all other products that the product’s reviewers have also reviewed. Any product that shows up at least twice in this list is considered overlapping. We then narrow the list down to the nine products with the most overlap. Any reviewer who has also reviewed at least one product on this top overlapping products list is considered to have substantial overlapping history.

Here’s a little example: “Product A” has 3 reviews, by “Reviewer 1”, “Reviewer 2” and “Reviewer 3”. Each of these reviewers has reviewed a total of 5 products (including Product A).

[Screenshot: a table of the three reviewers’ five-product review histories, with “Product C” appearing in both Reviewer 1’s and Reviewer 2’s lists]

You’ll notice that Reviewer 1 and Reviewer 2 have both ALSO reviewed another unrelated product, “Product C”.  Sometimes this overlap happens by random chance, but if the overlap is substantial, it could indicate that there is some manipulation happening.
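
In code, the procedure looks something like the minimal Python sketch below, run against the example above. The function name and data layout are ours for illustration (as are the filler product names in each reviewer’s history), not ReviewMeta’s actual implementation:

```python
from collections import Counter

def find_substantial_overlap(review_histories, target_product, top_n=9):
    """Flag reviewers of target_product with substantial overlapping history.

    review_histories maps each reviewer of the target product to the set of
    products they have reviewed. (Illustrative data model, not ReviewMeta's.)
    """
    # Count how many of the target product's reviewers reviewed each other product.
    counts = Counter()
    for products in review_histories.values():
        for product in products:
            if product != target_product:
                counts[product] += 1

    # Products reviewed by at least two of these reviewers are "overlapping";
    # keep the nine with the most overlap.
    top_overlapping = [p for p, c in counts.most_common() if c >= 2][:top_n]

    # A reviewer whose history contains at least one top overlapping product
    # is considered to have substantial overlapping history.
    flagged = {reviewer for reviewer, products in review_histories.items()
               if any(p in products for p in top_overlapping)}
    return top_overlapping, flagged

# The toy example from this post: three reviewers of Product A, five reviews
# each, with Product C shared by Reviewers 1 and 2 (other names invented).
histories = {
    "Reviewer 1": {"Product A", "Product B", "Product C", "Product D", "Product E"},
    "Reviewer 2": {"Product A", "Product C", "Product F", "Product G", "Product H"},
    "Reviewer 3": {"Product A", "Product I", "Product J", "Product K", "Product L"},
}
top, flagged = find_substantial_overlap(histories, "Product A")
print(top)      # ['Product C']
print(flagged)  # {'Reviewer 1', 'Reviewer 2'} (set order may vary)
```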

There are a number of benign reasons we’d see a high percentage of reviewers with substantial overlapping history. Certain products naturally attract these reviewers: if a product is very niche or is used in conjunction with other products, it isn’t strange that many of its reviewers have reviewed the same related products.

However, a high percentage of reviewers with substantial overlapping history is more commonly caused by factors that can introduce bias. If a single brand dominates the top overlapping products, many of these factors are similar to those behind Brand Repeats, such as:

  • Die-hard fans: A brand with a loyal following may have fans who review every single one of their products. These fans may not be reviewing the product with a neutral mindset; their loyalty and past relationship with the brand may positively influence their reviews.
  • Enticing Reviews: Brands will offer discounts or freebies to customers who write reviews for them. The customers they choose to write reviews are intentionally selected to be uncritical; a single negative review could get a customer dropped from the “free stuff” mailing list. This list acts as an incubator for positive reviews that a brand and its subsidiaries can dip into whenever they want to boost the average rating of any of their products.
  • Sockpuppet Accounts: Brands will create fake accounts to post positive reviews on all of their (and any of their sister companies’) products. These reviews are not just biased; they are fraudulent.

But where our Overlapping Review History test really shines is in its ability to find connections between reviewers who aren’t reviewing a common brand. This is essential in spotting the possible use of third party review manipulation services. These companies range from facilitating incentivized reviews to manufacturing outright fraudulent reviews. These third party services don’t have brand loyalty; they will review products for whichever brand pays them. They may boost a lawn mower’s average rating in the morning and then boost a laptop cover’s rating later that day. Since our Overlapping Review History test looks past the brand, we can still find the common connection between these third party service reviewers.

We like to empower our readers with tools to determine the trustworthiness of a product’s reviews. That is why we always show the list of the top nine overlapping products for our readers to examine on their own. While we can never determine with complete certainty what’s causing the overlap, our readers can gain insight by examining this list. Here are a few common scenarios and what they might indicate:

  • Product’s brand dominates: This may be an indication of a brand or its die-hard fans influencing reviews. Checking the product’s Brand Repeats test results may help support this.
  • One or two brands dominate: If the commonly reviewed items only consist of one or two brands, it is possible that these brands are sister brands of the product being reviewed.
  • Products are seemingly unrelated: Seemingly random products with a high level of overlap may be an indication of third party services influencing the reviews. These services range from helping facilitate incentivized reviews to manufacturing fake reviews. Our Incentivized Review test results may help corroborate whether a third party incentivized review service is being used.

Even if we aren’t certain of the cause of reviewers with substantial overlapping history, we can check to see if these reviewers are influencing the average rating of a product. We first look at the raw percentage of these reviewers. While it’s perfectly normal to see some reviewers with substantial overlapping history, an excessive amount can trigger a warning or failure. We next look to see if the average rating from reviewers with substantial overlapping history is higher than the average rating from reviewers without one. If there is a discrepancy, we’ll check to see if it is statistically significant. This means we run the data through an equation that takes into account the total number of reviews along with the variance of the individual ratings, and tells us if the discrepancy is more than just the result of random chance. (You can read more about our statistical significance tests here.) If reviewers with substantial overlapping history have a significantly higher average rating than reviewers without one, it’s a strong indicator that these reviewers aren’t evaluating the product from a neutral mindset and are unfairly inflating the overall product rating.
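
ReviewMeta doesn’t publish the exact equation, but the description above (total number of reviews combined with the variance of the individual ratings) matches a standard two-sample significance test. Here is a hedged sketch using Welch’s t-test on made-up ratings; the numbers and the choice of test are ours, not ReviewMeta’s:

```python
import numpy as np
from scipy import stats

# Hypothetical star ratings, split by whether the reviewer was flagged
# with substantial overlapping history. (Invented data for illustration.)
overlapping = np.array([5, 5, 5, 4, 5, 5, 5, 4, 5, 5])
non_overlapping = np.array([4, 3, 5, 2, 4, 3, 4, 5, 3, 4])

# Welch's t-test accounts for both sample sizes and the variance of the
# individual ratings, as described in the post.
t_stat, p_value = stats.ttest_ind(overlapping, non_overlapping, equal_var=False)

print(f"overlapping mean:     {overlapping.mean():.2f}")      # 4.80
print(f"non-overlapping mean: {non_overlapping.mean():.2f}")  # 3.70
print(f"p-value: {p_value:.4f}")
if p_value < 0.05:
    print("The discrepancy is unlikely to be random chance alone.")
```

A low p-value here would suggest the overlapping group really is rating the product more positively than everyone else, which is the pattern this test penalizes.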


16 responses to “ReviewMeta Analysis Test: Overlapping Review History”

  1. Viking&Eris says:

    As online SW, a camming couple, and a sex positive pair, us trying out new sex toys and having brands overlap is unfortunately going to happen. Overall, I like your site and style. Someone on OF pointed out your rating of us to me, and I think what you do is a noble profession. Viking lost his job when I got covid, and since I’m on immunosuppressive meds, we were quarantined for more than 45 days as I recovered. We had plenty of time to reconnect and talk and decide what was next, and well, so far, this is it. Thank you for keeping the reviewers honest. You wouldn’t believe how many people want us to try, test, and shout out their product. It is the craziest thing in the world. Have a beautiful week and happy fall!

  2. A E says:

    A company often posts on Reddit looking for people to review their stuff for a free product. They have you purchase it, post a glowing review, then they allegedly will send you the entire price of the product back on PayPal. Probably a scam and you could lose your account for a free product that is so crappy it needs fake reviews!

  3. monnom nom says:

    https://uploads.disquscdn.com/images/47505f99b682d1327b187b36c897156fda1b00fefb17ad416b6e07bc016a49de.png

    (sorry, I’m French)
    I don’t understand how this is acceptable.
    Almost half of the reviews overlap.

    But I just discovered ReviewMeta, it’s great :)

  4. Rachel Nador says:

    I had an ASIN: B07T1Q53J1 pass with 43% Substantial Overlapping Review History. Isn’t that a lot of overlap?

  5. Roger Garcia says:

    Hello! I think this is a pretty amazing tool. I wish there were an API with just this focus, so people could add several ASINs at a time and analyze different patterns. Or entire categories, to discover trends.

  6. Guest says:

    Not convinced. I took a look at a review of a Sony TV on Amazon UK, and in the overlapping reviews you flagged a TV mounting bracket. This is surely not suspicious? Especially when nearly all TVs on Amazon have a bundle option where you can add a wall mounting bracket?

    • Hello –
      Please keep in mind several factors:

      1. The whole algorithm is an estimate – there’s no way to get it right 100% of the time, so that’s why we show our work.
      2. There are over a dozen tests so the final adjusted rating is not dependent on just one test.
      3. Just as with all tests, not only are we testing to see if there is a high number of reviews in a “suspicious group”, we’re also checking to see if that group is rating the product significantly more positively.

    • Mosquitobait says:

      Ditto for computer components. I think this test is giving the reverse feedback of what it should be. People who buy and review lots of different computer components should be considered MORE trustworthy and knowledgeable about them, not less…

      • I can see how this could be another false positive here. This is exactly why we try and provide as much data as possible, as well as allow you to actually go in and edit the adjustment yourself. If you think the reviewers with substantial overlapping history are more trustworthy, you can just slide the weight all the way up to 100% and recalculate based on your own preference.

  7. Ken says:

    Regarding overlapping review history, you can only review a product once. How could a review “show up at least twice”?
