Amazon Review Hijacking – How to Spot Sellers that are Recycling Reviews on Amazon

May 30th, 2018

As Buzzfeed recently reported, there are more ways for sellers on Amazon to obtain false reviews on their products than simply manufacturing new ones.  Essentially, sellers can recycle old reviews for entirely different products and use them to bolster their own listings.

We call this practice “Review Hijacking,” and we’ve recently upgraded ReviewMeta to help you identify when a seller engages in this deceptive practice.

How does Review Hijacking work?

Amazon has a “product variation” feature that allows reviews to be “pooled” from different product variations.  The idea is that some products are offered in multiple shapes/sizes/colors and there shouldn’t be a new set of reviews for each one of these slightly different variations.

Here’s an example that Amazon has provided to show how the product variations should be used for a cell phone case with multiple different color options:

[Screenshot: Amazon’s example of product variations for a phone case sold in multiple colors]

This makes complete sense in theory: if a t-shirt has 50 different size/color combos, you wouldn’t want to have to tally up the reviews from each variation to see the entire review picture.

However, some sellers are exploiting this feature and using it to pad their review pool.  We’re seeing sellers hijack unrelated listings and add them as product variations.  The net effect is that the reviews from the hijacked listings are now counting toward the overall pool of reviews for that product.

Technically speaking, the reviews might not be fake, inauthentic or unnatural – they were simply written about completely different products.  This practice is blatantly misleading, completely unethical and definitely against Amazon’s TOS; however, we still see it happening on many different products.

How to Identify Hijacked Reviews

At its core, you’ll notice that the reviews are for an entirely different product.  You might be reading reviews for a bottle of shampoo, but the text of the review says something like “This was a great toenail clipper!”.  Unfortunately, it’s not always practical to read the text of every single review, especially if there are thousands of them.  And sometimes the reviews will not be descriptive at all (e.g. “great product”).  Lastly, these hijacked reviews might be mixed in with actual reviews for the correct product – so they are sometimes hard to spot in the wild.

1. Check the “Product Variations” box on the ReviewMeta report:

We’ve recently beefed up our “Product Variations” feature.  You can see if there are multiple variations of the product contributing to the review pool by looking below the product image on the ReviewMeta report.  (Here’s a link to the product we are using for this example – keep in mind that it may have been “fixed” since we published the post)

[Screenshot: the “Product Variations” box on the ReviewMeta report]

If there is only one variation, you won’t see this box at all.  Click the text to pull up a list of the different variations that we have discovered.  For this example product, we can see many of these variations have absolutely nothing to do with the original product:


Here we can see hundreds of completely unrelated products listed as “variations” of this knee brace.  You can also see the number of reviews and average rating on Amazon, along with the adjusted rating and review count after being processed through ReviewMeta.

Clicking on the “Example Review” link will provide some insight into which product was originally reviewed:

[Screenshots: an example review and the transport chair it was originally written about]

As you can see, this review was clearly written about a transport chair – NOT a knee brace – but it still counts as a 5-star rating toward the knee brace pictured above.  There are 38 total reviews for this transport chair (with an average rating of 4.8 stars) that are counted toward this knee brace.

You’ll also notice that this product receives a “PASS”.  The problem is that these reviews are perfectly natural – they are just for the wrong products.  As of now, it’s really hard for us to automatically detect when the product variations feature is being abused and when it’s being used appropriately, so the best thing to do is check the “Product Variations” box for additional info and draw your own conclusions.

2. Look at the review word cloud:

Sometimes a seller will hijack just a single listing and ride the coattails of all the existing reviews.  In this case, ReviewMeta won’t always be able to show more than one product in the variations list, however there are still some ways to identify the issue.

For this example, we’ll look at this product on Amazon (keep in mind that things may have changed since publishing this article):

[Screenshot: the Amazon listing for the cassette adapter]

So this is clearly a listing for a cassette adapter, and the 243 reviews are surely for this product, right?  Wrong – just take a look at the review word cloud on Amazon:

[Screenshot: the review word cloud on Amazon]

Just reading a few reviews, you can clearly tell that they are describing an iPhone 5 case and not a cassette adapter:

[Screenshot: reviews describing an iPhone 5 case]


  • There’s definitely a lot of angles to look at here. The most important part, as I see it, is just making sure the data is available and making it easy for visitors to reach their own conclusions.

  • Wow. Just extraordinary.

    Whenever you *are* able to reliably detect such fraudulent listings, is there a way for ReviewMeta to report it directly to Amazon?

    • I’d have to really come up with a creative way to automatically test for this – and I can’t really envision one that would be terribly accurate. The problem is that there are legit reasons to have a TON of product variations – I’ve seen listings for bras, pants, shoes, etc that have hundreds of variations that are 100% legit.

      A human can take a quick look at the different product variations and it will be extremely easy to identify when something is amiss. However, getting a computer program to “learn” it can be a huge challenge.

      As far as automated reporting to Amazon goes, this is probably something that I’d have to negotiate directly with them.

      • How about a “too-different” test for the product names/descriptions among the variations? I think *that*’d be close-enough, & quite similar to other lexical analysis you do.

        • That’s essentially the only way to do it – set some limits and see if the words have changed “enough”. The problem is that I could see a ton of exceptions. A seller could tweak the name from “Remote Control Car” to “RC Toy”. Same meaning but 100% different words. Or you could have a bunch of product variations of obscure colors: “34D Midnight Blue”, “28L Charcoal Sapphire -EU SIZING”, etc.

          You could also have some sellers slipping under the radar by using really similar products, or even products that were essentially the same – “Headphone Adapter/Charger for Iphone, Ipad, Ipod Touch” -> “Car Adapter/Charger for Iphone, Ipad, Ipod Touch”. Two different products but could easily trick a computer (and even a human) into believing that they are “close enough”.

          I’m still thinking about how to do this one – there’s got to be a way. What confuses me the most is that Amazon is allowing this to happen. I’m afraid that once I take the time to develop an elegant solution, Amazon will have closed the loophole entirely.
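          To make the idea concrete, here’s a minimal sketch of what such a “too-different” test could look like (the function names and threshold are hypothetical; it uses plain token overlap, so it would be fooled by exactly the “Remote Control Car” → “RC Toy” case described above):

```python
def title_similarity(a: str, b: str) -> float:
    """Jaccard similarity between the word sets of two product titles."""
    wa = set(a.lower().split())
    wb = set(b.lower().split())
    if not wa or not wb:
        return 0.0
    return len(wa & wb) / len(wa | wb)

def flag_suspicious_variations(titles, threshold=0.2):
    """Return variation titles whose word overlap with the first (parent)
    title falls below the threshold -- candidates for a manual look."""
    parent = titles[0]
    return [t for t in titles[1:] if title_similarity(parent, t) < threshold]

variations = [
    "Knee Brace Support Sleeve - Large",
    "Knee Brace Support Sleeve - Medium",
    "Folding Transport Chair with Wheels",
]
print(flag_suspicious_variations(variations))
# → ['Folding Transport Chair with Wheels']
```

          Size/color variations like “Large” vs. “Medium” share most of their words and pass, while a hijacked listing with no words in common gets flagged – but, as noted above, renamed-but-identical products would slip right past it.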

          • HenkPoley

            To some extent Conceptnet Numberbatch could help with that:

            They also have word vector (similarity) data that are not shown on that page.

          • Ah, I see – having a “similar terms” list could help improve the performance, but it still wouldn’t be perfect. Maybe what I could do is automatically scan the list of variations for any that are substantially different, and then offer an option to view just those. It would help viewers cull down the information they have to scan manually. Obviously perfect is impossible, so we’re shooting for proficient.
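            As a rough illustration of how word vectors could soften the exact-word-match problem, here’s a toy sketch – the vectors below are made-up stand-ins, not real ConceptNet Numberbatch embeddings, which in practice would be loaded from the published data files:

```python
import math

# Made-up 3-dimensional stand-ins for real word vectors; real embeddings
# (e.g. ConceptNet Numberbatch) have hundreds of dimensions.
VECTORS = {
    "remote":  [0.9, 0.1, 0.0],
    "control": [0.8, 0.2, 0.1],
    "car":     [0.7, 0.6, 0.0],
    "rc":      [0.85, 0.15, 0.05],
    "toy":     [0.6, 0.7, 0.1],
    "shampoo": [0.0, 0.1, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def title_vector(title):
    """Average the vectors of the known words in a title."""
    vecs = [VECTORS[w] for w in title.lower().split() if w in VECTORS]
    if not vecs:
        return None
    return [sum(col) / len(vecs) for col in zip(*vecs)]

def semantic_similarity(a, b):
    """Similarity of two titles by meaning rather than exact words."""
    va, vb = title_vector(a), title_vector(b)
    return cosine(va, vb) if va and vb else 0.0
```

            With real embeddings, “Remote Control Car” and “RC Toy” would score high despite sharing zero words, while “Remote Control Car” vs. “Shampoo” would score near zero – which is exactly the distinction an exact-match test can’t make.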

          • I suppose the best thing to do is actually contact Amazon & find out how they’d like to proceed, especially if 1 can provide them w/ a large list of potential violators. 🤔

          • I’m just surprised they allow this to happen in the first place. It’s just so blatant.

          • I just ran across a listing that might explain it:


            If you check it, it shows that Amazon also does this improper conflation itself. According to a Woot review (also owned by Amazon), the amp in the combo has problems, so Amazon chooses to combine reviews w/ the 1st-class speakers.

            So, what’s good for the goose….

          • Oh man, this is a much more subtle grouping issue – impressive detective work! Looks like only 3 reviews are for the Amp & Speaker combo (with an average of less than 3 stars) but the other 225 reviews are for just the speakers. To the untrained eye, you’d think the 4.7 star average applies to both variations, but in this case we see a massive discrepancy. Very misleading by Amazon!

          • I think that explains why Amazon doesn’t fix it — it’s a pretty smooth way for them to dump lemons.

            Here’s an idea: Calculate the RM rating for all items (as is done now), compare against the RM rating for *only* the specific item version requested (as much as possible), & flag if significantly different.

          • This is a great idea and something I already started thinking about when I was looking through the data on this. It would have to be “statistically significantly” different to get flagged – you wouldn’t want to flag a version that has just one review. I’d have to play around with the data first, but the skeptic in me thinks that there would be a lot of false flags. Adding it to the dev list.
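            For what it’s worth, a bare-bones version of that check might look like this (illustrative thresholds only; a real implementation would need tuning against actual data to keep the false flags down):

```python
import math
from statistics import mean, variance

def rating_discrepancy(pool, variation, z_thresh=3.0, min_reviews=3):
    """Flag a variation whose mean rating is far from the pooled mean,
    relative to sampling noise (a crude Welch-style z statistic).
    Variations with too few reviews are never flagged."""
    if len(variation) < min_reviews or len(pool) < min_reviews:
        return 0.0, False
    se = math.sqrt(variance(pool) / len(pool) +
                   variance(variation) / len(variation))
    if se == 0:
        return 0.0, False
    z = (mean(pool) - mean(variation)) / se
    return z, abs(z) > z_thresh

# Numbers loosely modeled on the speaker/amp example above:
speakers = [5] * 200 + [4] * 20 + [3] * 5   # ~225 reviews, ~4.9 average
combo = [3, 3, 2]                           # 3 reviews, under 3 stars
z, flagged = rating_discrepancy(speakers, combo)
```

            Even with only 3 combo reviews, a sub-3-star average against a ~4.9-star pool is a huge gap relative to the noise, so it gets flagged – while a single-review variation is skipped entirely.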

          • Here’s another example to test this against:


            It’s also somewhat subtle in that the device is grouped w/ its official cases (so not entirely unrelated items), but not quite as sneaky as a good item bundled w/ a lemon.