
Amazon Review Hijacking – How to Spot Sellers that are Recycling Reviews on Amazon

May 30th, 2018

As BuzzFeed recently reported, sellers on Amazon have more ways to obtain false reviews for their products than simply manufacturing new ones.  Essentially, sellers can recycle old reviews written for entirely different products and use them to bolster their own listings.

We call this practice “Review Hijacking,” and we’ve recently upgraded ReviewMeta to help you identify when a seller engages in this deceptive practice.  One thing to note, however, is that if we haven’t visited a product before the reviews were stolen, we simply won’t have the data to show that it’s changed.  Only Amazon holds all the data, but they refuse to share it or warn users when the products have changed (but the reviews have remained).

How does Review Hijacking work?

Amazon has a “product variation” feature that allows reviews to be “pooled” from different product variations.  The idea is that some products are offered in multiple shapes/sizes/colors and there shouldn’t be a new set of reviews for each one of these slightly different variations.

Here’s an example that Amazon has provided to show how the product variations should be used for a cell phone case with multiple different color options:

[Screenshot: Amazon’s product variation example – a cell phone case offered in multiple colors]

This makes complete sense in theory: if a t-shirt has 50 different size/color combos, you wouldn’t want to have to tally up the reviews from each variation to see the entire review picture.

However, some sellers are exploiting this feature and using it to pad their review pool.  We’re seeing sellers hijack unrelated listings and add them as product variations.  The net effect is that the reviews from the hijacked listings are now counting toward the overall pool of reviews for that product.

Technically speaking, the reviews might not be fake, inauthentic, or unnatural – they were just originally written about completely different products.  This practice is blatantly misleading, completely unethical, and definitely against Amazon’s TOS; however, we still see it happening on many different products.

How to Identify Hijacked Reviews

At its core, you’ll notice that the reviews are for an entirely different product.  You might be reading reviews for a bottle of shampoo, but the text of the review says something like “This was a great toenail clipper!”.  Unfortunately, it’s not always practical to read the text of every single review, especially if there are thousands of them.  And sometimes the reviews will not be descriptive at all (e.g. “great product”).  Lastly, these hijacked reviews might be mixed in with actual reviews for the correct product – so they are sometimes hard to spot in the wild.

1. Check the “Product Variations” box on the ReviewMeta report:

We’ve recently beefed up our “Product Variations” feature.  You can see if there are multiple variations of the product contributing to the review pool by looking below the product image on the ReviewMeta report.  (Here’s a link to the product we are using for this example – keep in mind that it may have been “fixed” since we published the post)

If there is only one variation, you won’t see this box at all.  Click the text to pull up a list of the different variations that we have discovered.  For this example product, we can see many of these variations have absolutely nothing to do with the original product:

[Screenshot: the list of unrelated product variations on the ReviewMeta report]

For this example product we can see hundreds of completely unrelated products that are listed as “variations” of this knee brace.  You can also see the number of reviews and average rating on Amazon along with the adjusted rating and review count after being processed through ReviewMeta.

Clicking on the “Example Review” link will provide some insight into what product was originally reviewed:

[Screenshot: the “Example Review” link on the ReviewMeta report]

[Screenshot: the original review, written about a transport chair]

As you can see, this review was clearly written about a transport chair – NOT a knee brace – but it still counts as a 5-star rating toward the knee brace pictured above.  There are 38 total reviews for this transport chair (with an average rating of 4.8 stars) being counted toward this knee brace.

You’ll also notice that this product receives a “PASS”.  The problem is that these reviews are perfectly natural – they are just for the wrong products.  As of now, it’s really hard for us to automatically detect when the product variations feature is being abused and when it’s being used appropriately, so the best thing to do is check the “Product Variations” button for additional info and draw your own conclusions.
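For the technically curious, here’s a toy sketch of one naive detection approach – and why it isn’t good enough on its own.  Comparing word overlap between the main title and each variation’s title easily catches a transport chair posing as a knee brace, but a legitimate rename that uses entirely different words would get flagged too, and a subtly different product with a near-identical title would slip past.  Everything below (function names, threshold, titles) is illustrative only, not our production logic:

```python
# Toy sketch: measure word overlap (Jaccard similarity) between the main
# product's title and each variation's title, and flag variations that
# share almost no words. Threshold and titles are illustrative assumptions.

def title_similarity(a, b):
    """Jaccard similarity between the lowercase word sets of two titles."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb) if wa and wb else 0.0

def looks_hijacked(main_title, variation_title, threshold=0.2):
    """Flag a variation whose title shares almost no words with the main title."""
    return title_similarity(main_title, variation_title) < threshold

print(looks_hijacked("Knee Brace Compression Sleeve", "Folding Transport Chair"))
# -> True (no shared words at all)
print(looks_hijacked("Knee Brace Compression Sleeve", "Knee Brace Compression Sleeve XL"))
# -> False (nearly identical titles)
```

A check like this could narrow down which variations a human should eyeball, but as discussed in the comments below the post, it can never be a definitive verdict on its own.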

2. Look at the review word cloud:

Sometimes a seller will hijack just a single listing and ride the coattails of all the existing reviews.  In this case, ReviewMeta won’t always be able to show more than one product in the variations list; however, there are still some ways to identify the issue.

For this example, we’ll look at this product on Amazon (keep in mind that things may have changed since publishing this article): https://www.amazon.com/dp/B00IGKTH00

[Screenshot: the Amazon listing for the cassette adapter]

So this is clearly a listing for a cassette adapter, and the 243 reviews are surely for this product, right?  Wrong – just take a look at the review word cloud on Amazon:

[Screenshot: the review word cloud on Amazon for this listing]

Just reading a few reviews, you can clearly tell that they are describing an iPhone 5 case and not a cassette adapter:

[Screenshot: example reviews describing an iPhone 5 case]

 


  • RB

    Is there a way for ReviewMeta to allow us to click which product variations’ reviews we want included in the average rating? For example, let’s say the knee brace has 100 product variations, and we sift through the list and see 15 product variations we think might be legit variations. Can we click those 15, and only those 15 products’ ratings are averaged together and analyzed?

    • Hey RB-

      It’s definitely possible, and I can see how it would help the user. I’ll add it to the dev list and let you know if/when we get to it!

      • Autumn Gray Eakin

        It could be combined with crowdsourced data from RM user efforts to report/flag hijacks.

        • The big issue with crowdsourced data on RM is that it would then open the doors to manipulations by brands. Reviews on Amazon are technically crowdsourced…

      • RB

        Thank you!

  • Smurfette

    How do you report a product/seller that does this? In the Amazon app, you can only report individual reviews. There’s no link to report the product or the seller.

    • I think at the top of the page there should be a link to report “incorrect product information” but that’s about it. Amazon is not clear about what happens when people report reviews, but I think that’s still the best way to go.

  • Biological

    I’m sorry, I don’t understand – how is this hard to filter out? I feel like I must be misunderstanding something here.
    There’s been a feature for a while now where if you go to ALL of the Reviews you can literally just select/tell it to ONLY show you reviews for the specific product you’re looking at. I’ve definitely done this for a while now (at least 1 year?). All you do is select the “All formats” dropdown menu and select specifically the “format” (ie. version of the product) that you were actually looking at/for.

    Example (random):
    https://www.amazon.com/Trailer-Aid-Tandem-Changing-Change-Trailers/product-reviews/B001V8UKBO/

    All you do is select the “All formats” dropdown menu right above where the Reviews begin and select your version (click the highlighted dropdown here, I just took a screenshot: http://prntscr.com/lmom1a)

    By doing so you will see an option to select the specific “Style” (ie. exact product) you were looking at (here’s another screenshot showing this clearly: http://prntscr.com/lmomku).
    You can see in this second screenshot that it gives me the option to specifically filter by the “Trailer Aid Plus” product. If you notice, it even lets you specify it to the color (in this case this random product’s options are Black or Yellow – I happened to have Yellow selected when I took the above 2nd screenshot). If I select this, I see that it tells me there are 239 Reviews (out of 1,098 Total Reviews) specifically related to the [Yellow] “Trailer Aid Plus” product (screenshot #3: http://prntscr.com/lmonjs). If I go back and change my product selection to Black, it allows me to re-filter this, now showing me only the reviews pertaining to the [Black] “Trailer Aid Plus” product (screenshot #4: http://prntscr.com/lmoomt). If I select this, I see that it now tells me there are 243 Reviews. As an aside, this tells me that out of the 1,098 Reviews, there’s only 239 + 243 = 482 Reviews for the ACTUAL product I’m hypothetically interested in. This leads to the following.

    POSSIBLE SOLUTION(?):
    If you now further notice, each review (even without filtering by “Format”) always contains above the review text the following fields:
    – Color
    – Style
    The *Style* is the actual product I’m looking at. So apparently the data is there, and available to be grabbed. It has a dedicated field to it. Could we not just grab ONLY reviews that match the selected STYLE of the product that we’re looking at? In our case, that means ONLY looking at the reviews for specifically the “Trailer Aid Plus” product in the “Style” field above the review text.

You can see this field’s utility in action between Screenshot #1 (https://prnt.sc/lmom1a) and Screenshot #2 (https://prnt.sc/lmonjs). Notice that in #1 – before selecting the option from the “Format” (ie. Style) dropdown – the TOP Review listed is one by “Michael K” – but it’s not even for the product I was looking for! It’s for a variant product – the “Trailer Aid”, NOT the “Trailer Aid Plus”.
    If I switch to specifically the “Trailer Aid Plus” (Yellow) reviews, suddenly I see the Top Review is by the user “Noreladim” (see: Screenshot #2).
    If you really investigated it, you’d see there’s even yet another product randomly being lumped into the review total here – the “Trailer Aid Holder” (which is literally just a metal sleeve to store the actual product in..). But this could be filtered out if we were to simply IGNORE any review that doesn’t match our original product’s actual Style field (“Trailer Aid Plus”) because the Style field for this product is “Trailer Aid Holder” (see, Screenshot #5: http://prntscr.com/lmouhi).

    So – is this not possible?
Is it not possible to simply filter by reviews that match the actual product’s Style (ie. “Format”)? There’s an actual field dedicated to specifying it right there on Amazon. I’d imagine this could be scraped, presumably easily. For what it’s worth, the URLs are actually technically different for each Style (you can see this if you go to the main product page and right-click the different styles > open in new tab; if you compare the different Styles you’d see they have unique URLs), even though they’ll all appear on each alternate product’s page – so Amazon seems to be consciously recognizing that it’s lumping literally different products together (since they each normally, otherwise have their own URLs assigned to them).

    What do you think?
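[Editor’s note: the filtering described in the comment above could be sketched in a few lines – given reviews scraped with their Style field intact, keep only those matching the listing’s own style. The dict layout and field names below are assumptions for illustration; Amazon’s actual review markup may differ.]

```python
# Sketch of the commenter's proposal: keep only reviews whose scraped
# "Style" field matches the style of the listing being viewed.
# The dict layout here is an assumed, simplified data model.

def filter_reviews_by_style(reviews, target_style):
    """Keep only reviews whose 'style' field matches the listing's style."""
    return [r for r in reviews if r.get("style") == target_style]

# Made-up reviews loosely mirroring the Trailer Aid example above:
reviews = [
    {"reviewer": "Michael K", "style": "Trailer Aid", "stars": 5},
    {"reviewer": "Noreladim", "style": "Trailer Aid Plus", "stars": 5},
    {"reviewer": "Anon", "style": "Trailer Aid Holder", "stars": 4},
]

kept = filter_reviews_by_style(reviews, "Trailer Aid Plus")
print([r["reviewer"] for r in kept])  # -> ['Noreladim']
```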

  • Patty Pearson

    I had been wondering how all those different product reviews got into what I was looking at!
    I’m glad I read this ..thank you for all your research.

  • There’s definitely a lot of angles to look at here. The most important part I can see is just making sure the data is available and makes it easy for visitors to make their own decisions.

  • Wow. Just extraordinary.

    Whenever you *are* able to reliably detect such fraudulent listings, is there a way for ReviewMeta to report it directly to Amazon?

    • I’d have to really come up with a creative way to automatically test for this – and I can’t really envision one that would be terribly accurate. The problem is that there are legit reasons to have a TON of product variations – I’ve seen listings for bras, pants, shoes, etc that have hundreds of variations that are 100% legit.

A human can take a quick look at the different product variations and it will be extremely easy to identify when something is amiss. However, getting a computer program to “learn” it can be a huge challenge.

      As far as automated reporting to Amazon goes, this is probably something that I’d have to negotiate directly with them.

      • How about a “too-different” test for the product names/descriptions among the variations? I think *that*’d be close-enough, & quite similar to other lexical analysis you do.

        • That’s essentially the only way to do it – set some limits and see if the words have changed “enough”. The problem is that I could see a ton of exceptions. A seller could tweak the name from “Remote Control Car” to “RC Toy”. Same meaning but 100% different words. Or you could have a bunch of product variations of obscure colors: “34D Midnight Blue”, “28L Charcoal Sapphire -EU SIZING”, etc.

          You could also have some sellers slipping under the radar by using really similar products, or even products that were essentially the same – “Headphone Adapter/Charger for Iphone, Ipad, Ipod Touch” -> “Car Adapter/Charger for Iphone, Ipad, Ipod Touch”. Two different products but could easily trick a computer (and even a human) into believing that they are “close enough”.

I’m still thinking about how to do this one – there’s got to be a way. What confuses me the most is that Amazon is allowing this to happen. I’m afraid once I take the time to develop an elegant solution, Amazon will have closed the loophole entirely.

          • HenkPoley

            To some extent Conceptnet Numberbatch could help with that: http://conceptnet.io/c/en/midnight_blue

            They also have word vector (similarity) data that are not shown on that page.

          • Ah, I see – having a “similar terms” list could help improve the performance, but it still wouldn’t be perfect. Maybe what I could do is automatically scan the list of variations to see if there are any that are substantially different, and then have an option to see the ones that are substantially different. Would help the viewers kinda cull down the information they have to scan manually. Obviously perfect is impossible so we’re shooting for proficient.

          • I suppose the best thing to do is actually contact Amazon & find out how they’d like to proceed, especially if 1 can provide them w/ a large list of potential violators. 🤔

          • I’m just surprised they allow this to happen in the first place. It’s just so blatant.

I just ran across a listing that might explain it:

            https://reviewmeta.com/amazon/B07BRKPL6D

            If you check it, it shows that Amazon also does this improper conflation itself. According to a Woot review (also owned by Amazon), the amp in the combo has problems, so Amazon chooses to combine reviews w/ the 1st-class speakers.

            So, what’s good for the goose….

          • Oh man, this is a much more subtle grouping issue – impressive detective work! Looks like only 3 reviews are for the Amp & Speaker combo (with an average of less than 3 stars) but the other 225 reviews are for just the speakers. To the untrained eye, you’d think the 4.7 star average applies to both variations, but in this case we see a massive discrepancy. Very misleading by Amazon!

          • I think that explains why Amazon doesn’t fix it — it’s a pretty smooth way for them to dump lemons.

            Here’s an idea: Calculate the RM rating for all items (as is done now), compare against the RM rating for *only* the specific item version requested (as much as possible), & flag if significantly different.

This is a great idea and something that I already started thinking about when I was looking through the data on this. It would have to be “statistically significantly” different to get flagged – you wouldn’t want to flag a version that has just one review. I’d have to play around with the data first, but the skeptic in me thinks there would be a lot of false flags. Adding it to the dev list.
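[Editor’s note: as a very rough sketch of that flag – the thresholds below are arbitrary placeholders, not a real statistical significance test, and the numbers are made up to loosely mirror the amp/speaker example discussed above.]

```python
# Rough sketch: compare each variation's average rating against the
# pooled average and flag large gaps. min_reviews and max_gap are
# arbitrary assumptions; a real implementation would need a proper
# significance test, since small variations produce false flags.

def flag_discrepant_variations(variations, min_reviews=3, max_gap=1.0):
    """variations: list of (name, review_count, avg_rating) tuples."""
    total = sum(n for _, n, _ in variations)
    pooled = sum(n * avg for _, n, avg in variations) / total
    return [
        name
        for name, n, avg in variations
        if n >= min_reviews and abs(avg - pooled) > max_gap
    ]

# Made-up numbers mirroring the amp/speaker discussion:
variations = [
    ("Speakers only", 225, 4.7),
    ("Amp & Speaker combo", 3, 2.7),
]
print(flag_discrepant_variations(variations))  # -> ['Amp & Speaker combo']
```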

          • Here’s another example to test this against:

            https://reviewmeta.com/amazon/B07DZ8B175

It’s also somewhat subtle in that the device is grouped w/ its official cases (so not entirely unrelated items) – not quite as sneaky as bundling a good item w/ a lemon.

          • So, now that Amazon has “de-conflated” this example, that prompts another RFE: Maybe include in that “Show Report History” link a count (only, so not much data has to be carried over) of how *many* product variations were detected per report — if that *&* the RM rating change drastically simultaneously, that could be an indicator of hanky-panky, I’d think.

          • Ah, this is a great idea. Would also help sort out the issue of people thinking a bunch of reviews were “deleted” when in reality the product variations were split off into different listings.

          • TPS

            Did you see evidence of Amazon taking action re: variations around 1 month after your original post, as reported in http://www.bobsledmarketing.com/blog/navigating-amazons-new-rules-around-product-variations ?

          • Interesting. Haven’t noticed anything different. Still seeing examples of textbook hijacking. This one was sent in to me just today: https://reviewmeta.com/amazon/B07DRHKLFG

          • TPS

            I find it interesting that Amazon actually mentions a policy change to Amazon retailers (so soon after your official report!) but then doesn’t really have any enforcement of it. To me, that seems like CYA, just in case Something Happens, but actually Business As Usual.

            RM Retailer Reviews can’t come soon enough!

          • TPS

RM now says that given example has 250 reviews for just that spatula set — no variations⁉️ It seems HIGHLY unlikely.…

          • Interesting. What’s the URL?

          • TPS

            It was spatulas (when I posted) @ the same URL as you posted ( https://reviewmeta.com/amazon/B07DRHKLFG ), but now it points to another product⁉️ Curiouser & curiouser.…

          • Great Artiste

            I don’t think Amazon cares, especially if it helps sell more product and adds to their bottom line. In effect, they’re accomplices in the violation of their own TOS. It’s all about the profit for them, customers? Screw ’em.

    • RedReindeer

      TPS asked if there was a way to notify Amazon directly and there is! Many item listings (not sure if it’s all of them) have a small section under the initial description that asks if you want to report ‘incorrect product info…’. I’ve done that several times in the last few months–since I first noticed it. I have not looked back to see if anything was changed, although in one case I went back to a page I’d ‘reported’ about a month before and saw that it still had the incorrect information there, just as it was when I’d first reported it to Amazon. Although, in all fairness, I have to say this option was relatively new at the time and maybe Amazon hadn’t gotten all the bugs out yet. We’ll have to wait and see if time will truly tell.