ReviewMeta Analysis Test: Suspicious Reviewers

April 27th, 2016

At ReviewMeta we have developed a number of tests that look beyond the review itself and examine the reviewers who write the reviews. A normal reader may look at one or two reviewers’ histories if they think their reviews are suspicious, but without hundreds of hours of compiling stats, they would never be able to see the histories of all of the reviewers for a given product. It is just too labor-intensive for the average reader. Luckily, at ReviewMeta we have the ability to discover meaningful statistics about reviewers’ posting histories. We examine four suspicious reviewer traits in our analysis. Each of these traits can tell us something about who is reviewing these products, and taken together these traits can be very revealing.

One-Hit Wonders: These are reviewers who have written exactly one review, which means they have only reviewed the product being analyzed. Unbiased reviewers tend to be long-term members of a site who post more than a single one-off review. If a given product has a high percentage of One-Hit Wonders, it can indicate that review manipulation is occurring. While there are a number of causes that might result in a One-Hit Wonder, a few common ones include brands creating throwaway accounts to post a single fake review for their product, or brands somehow enticing people who don’t normally write reviews to review this product.

Take-Back Reviewers: Any reviewer who we have discovered to have a deleted review in their history is flagged as a Take-Back Reviewer. These reviewers are suspicious because the review was most likely removed by the review platform due to a violation of the terms of service. This may indicate that the user has been caught manipulating reviews before, and we don’t know for sure whether they have stopped breaking the rules.

Single-Day Reviewers: Reviewers who have posted multiple reviews but have posted all of them on a single day are labeled as Single-Day Reviewers. Like One-Hit Wonders, these reviewers lose trust because they haven’t shown a long-term commitment to the reviewing platform. This lack of long-term commitment may be indicative that the reviewer is posting biased reviews.

Never-Verified Reviewers: These reviewers have never written a verified-purchaser review. This raises a red flag because typical unbiased reviewers will review products after they have purchased them. If a reviewer purchases products from one retailer, it would be odd for them to always write their reviews on a different reviewing platform. This may indicate that the reviewer is not a legitimate account but rather a sockpuppet account set up by a brand in order to post favorable reviews.
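The four traits above can be read as simple predicates over a reviewer’s posting history. Here is a minimal sketch of that classification in Python; the `Reviewer` record and its field names are illustrative assumptions, not ReviewMeta’s actual data model.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical reviewer-history record. The fields below are assumptions
# made for illustration; ReviewMeta's real data model is not public.
@dataclass
class Reviewer:
    review_dates: list      # date of every review in the reviewer's history
    deleted_reviews: int    # reviews we previously observed that have since vanished
    verified_reviews: int   # reviews marked as "verified purchase"

def suspicious_traits(r: Reviewer) -> set:
    """Return the set of suspicious-reviewer traits this history exhibits."""
    traits = set()
    if len(r.review_dates) == 1:
        traits.add("one-hit wonder")          # only ever reviewed this product
    if r.deleted_reviews > 0:
        traits.add("take-back reviewer")      # has at least one deleted review
    if len(r.review_dates) > 1 and len(set(r.review_dates)) == 1:
        traits.add("single-day reviewer")     # multiple reviews, all on one day
    if r.verified_reviews == 0:
        traits.add("never-verified reviewer") # no verified-purchase reviews at all
    return traits
```

Note that a single reviewer can carry several traits at once — for example, a brand-new account with one unverified review is both a One-Hit Wonder and a Never-Verified Reviewer.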

Each of the traits above can have a plausible explanation. Maybe that One-Hit Wonder is a legitimate reviewer who just started reviewing products online. Possibly some reviewers prefer to sit down on one day and review all of the products they bought that year. Perhaps that Take-Back Reviewer accidentally posted a review for the wrong product and didn’t want to tarnish an innocent product’s name. And maybe that Never-Verified Reviewer prefers shopping in brick-and-mortar stores but still wants their opinion to be heard.

In order to see whether each group of suspicious reviewers is benign, we do the following for each of them (One-Hit Wonders, Take-Back Reviewers, Single-Day Reviewers, Never-Verified Reviewers):

First, we check the group’s overall percentage of the reviews. While it’s perfectly normal to see a small percentage of each group of Suspicious Reviewers, an excessive amount can trigger a warning or failure. Next, we check whether any group of Suspicious Reviewers has a higher average rating than all other reviews. If it does, we check whether the discrepancy is statistically significant. This means that we run the data through an equation that takes into account the total number of reviews along with the variance of the individual ratings and tells us whether the discrepancy is more than just the result of random chance. (You can read more about our statistical significance tests here.) If a group of Suspicious Reviewers is giving the product a statistically significant higher rating, it strongly supports the conclusion that the reviewers in that group are not benign and are unfairly inflating the overall product rating.
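The two checks above can be sketched in a few lines of Python. This is a simplified stand-in, not ReviewMeta’s actual equation: the 30% cutoff is an illustrative assumption, and the significance test here is a plain two-sample z-test (normal approximation), which, like the description above, accounts for the number of reviews and the variance of the individual ratings.

```python
from math import sqrt
from statistics import mean, variance

def group_share_excessive(group_count: int, total_count: int,
                          threshold: float = 0.30) -> bool:
    """Flag a suspicious group that makes up more than `threshold` of all
    reviews. The 30% cutoff is an illustrative assumption."""
    return total_count > 0 and group_count / total_count > threshold

def rating_discrepancy_is_significant(group: list, others: list,
                                      z_threshold: float = 1.96) -> bool:
    """Two-sample z-test for whether the suspicious group's mean rating is
    significantly *higher* than everyone else's. A stand-in for ReviewMeta's
    actual significance equation, which is not public."""
    if len(group) < 2 or len(others) < 2:
        return False  # not enough data to estimate variance
    diff = mean(group) - mean(others)
    if diff <= 0:
        return False  # only a higher group rating inflates the product score
    # Standard error of the difference in means, using sample variances
    se = sqrt(variance(group) / len(group) + variance(others) / len(others))
    if se == 0:
        return True   # zero variance: any positive gap is not random noise
    return diff / se > z_threshold
```

For example, a suspicious group averaging 4.8 stars against everyone else’s 3.0 across dozens of reviews would clear the z-threshold easily, while two groups with near-identical means would not.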