
Everything posted by Tommy

  1. Tommy

    1 review "one-hit wonders"

    Hello- Thanks for posting. If all the reviewers have 2 reviews, the "Reviewer Participation" test should still raise a red flag.
  2. Tommy

    Help contacting ThePriceTest?

    Hi Ted - Have you tried reaching out to the email they have listed here? https://thepricetest.com/info Can you forward me an example "fake price change" email to [email protected]? Why is it that you believe they are doing something to your device in the background?
  3. Tommy

    should I trust your reports anymore

    Hello- Please see our post on the latest algorithm updates: https://reviewmeta.com/blog/0-unnatural-reviews-august-2019-algo-updates-explained/ As usual, we always recommend reading the actual reports and making a determination for yourself. In my opinion, neither of these two products seems to have reviews that look too bad - in fact, both seem to have had many reviews deleted recently, so they could have been "cleaned up" by Amazon in the meantime.
  4. The source of the app? I'm sure we could make it available. What changes did you want to make?
  5. Tommy

    How good is the system at spotting fraud?

  6. Tommy

    RM vs. Non-Sold Amazon Categories

    Wow, I didn't realize that Amazon started selling full cars now. Or you're just able to "add to list" so you're not even able to buy. Definitely showing these reviews as "unnatural" on RM because they have been solicited by Amazon. I still think that's the most accurate way of showing the reports.
  7. Tommy

    Help contacting ThePriceTest?

    They said it should work now.
  8. Tommy

    DormDoc Reviews

    Thanks for the response. We've linked to this page from your brand page and all products.
  9. Tommy

    Hijackers of Inactive products

    I am asking myself this question all the time. I'm just one guy going off nothing but the publicly available data that Amazon displays on their site, and I'm able to flag a lot of these extremely suspicious reviews. Amazon should have figured this out A LONG TIME ago. Any of their tens of thousands of developers is probably a better coder than I am. I just don't get it. I think you're right about how it hurts both customers and the sellers who are playing by the rules. As far as creating a tool that helps sellers get notified if their listings are hijacked, I think I could put together something. The data is there; it's just a matter of presenting it in a way that's organized and easy to understand. There have already been a few big news stories about this (about a year ago), but my prediction is that once the story is really syndicated throughout the mainstream media, Amazon will magically close all the loopholes. Similar to what happened with incentivized reviews in 2016.
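The notification tool floated above could be sketched roughly as follows. This is a hypothetical Python sketch, not an existing ReviewMeta tool: `fetch_listing` is a stand-in for whatever scraper or API would actually supply the listing data, and the watched fields are just the ones a hijacker typically alters.

```python
# Hypothetical sketch: detect a possible listing hijack by snapshotting the
# publicly visible fields of a product page and reporting any that changed
# since the last check. A non-empty `changed` dict would trigger an alert.

def fetch_listing(asin):
    # Placeholder for a real scraper/API call; returns the fields to watch.
    return {"title": "...", "brand": "...", "buybox_seller": "..."}

def check_for_hijack(asin, last_snapshot):
    current = fetch_listing(asin)
    changed = {field: (last_snapshot.get(field), current[field])
               for field in ("title", "brand", "buybox_seller")
               if last_snapshot.get(field) != current[field]}
    return current, changed
```

Run periodically per ASIN, storing the returned snapshot for the next comparison; how often to poll and how to deliver the alert (email, dashboard) are the open design questions.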
  10. Tommy

    Help contacting ThePriceTest?

    Sure, I'll reach out to them and see if they can provide a fix.
  11. There's still a matter of publishing it through the Google Play store and managing it that I wouldn't necessarily want to "crowdsource".
  12. I don't blame the dev - adding support for multiple sites would eliminate the simplicity of the whole thing. That said, this is an excellent start for the app that we are eventually going to build. Just need to find a contractor who has some experience with publishing Android apps. Hopefully we'll have something soon!
  13. Wow, seems like it wouldn't be hard at all to just use that source code to make something that does exactly this, but for ReviewMeta. Developing Android/iPhone apps isn't something that I'm familiar with, but I think it could be easily figured out since the concept is so darn simple.
  14. Tommy

    The New Amazon Review Scam Tactic

    Wow! This was covered by the Washington Post last year - it's sad that it's still an industry that seems to be flourishing. These sites seem to be pretty blatant. I wonder if they are on Amazon's radar at this point...
  15. I've done a little bit of consulting work helping write reports for sellers who suspect that their competitors are trying to attack them. The problem is that it's always difficult to "prove" that a review is fake, and even then, getting Amazon to respond or take any action is never guaranteed. The other thing I've seen is that some brand owners (not saying that you're doing this) will claim that every negative review of their business or product is fake. Sometimes people believe in their products so much that they can't believe someone wouldn't like them. Last, I've seen brand owners actually start to get a bunch of negative reviews because a competing seller slipped in and started offering a knock-off product under the same listing for a lower price.
  16. Hello Guyr8s- Thanks for writing in. As I mentioned on the "Donate" page (https://reviewmeta.com/donate) I previously had Paypal open for donations, and even started setting up a Patreon account, however part of me never really felt right taking my visitors' money. I know that a lot of successful online businesses run entirely on donations (e.g. Wikipedia) but for some reason, I just feel uncomfortable when people are generous with me when I think there are other people or organizations that need it more. That said, here are a few things to keep in mind: If you want to hide ads, anyone can click the big button at the bottom of the page here: https://reviewmeta.com/donate (If it's not working, just let me know and I'll fix it. The button broke once, so I wouldn't be surprised if it breaks again). Don't feel bad about clicking the button - it's like a fraction of a fraction of a cent of ad revenue. If you want to help the site improve, share your ideas and feedback here on the forum. I'm happy to discuss features and upgrades here (also, trying to spark some discussion here on the forum). A lot of the features added to the site in the last year have started as suggestions from visitors like yourself!
  17. Tommy

    RFE: Filtering Category Pages

    There's been some caching implemented - the category pages take a lot of server resources, so we store a page for a few hours after it has been generated to reduce server load.
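The caching described above can be sketched as a simple time-to-live store. This is an illustrative Python sketch, not ReviewMeta's actual code; the names and the 4-hour TTL are assumptions standing in for "a few hours".

```python
import time

# Minimal TTL cache sketch: keep a rendered category page and reuse it until
# its expiry passes, so the expensive page generation runs at most once per
# TTL window per category.

_cache = {}                 # key -> (rendered_page, expiry_timestamp)
TTL_SECONDS = 4 * 3600      # "a few hours" (assumed value)

def get_category_page(key, render):
    entry = _cache.get(key)
    now = time.time()
    if entry and entry[1] > now:
        return entry[0]     # still fresh: serve the stored copy
    page = render(key)      # expensive: regenerate the page
    _cache[key] = (page, now + TTL_SECONDS)
    return page
```

The trade-off is exactly the one described in the post: visitors may see a page up to a few hours stale, in exchange for a large reduction in server load.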
  18. Tommy

    RFE: Filtering Category Pages

    Thanks for the suggestion! Yes, I see how the category pages can be greatly improved with the simple addition of a few filters. To be honest, I'm not in love with the category pages and want to do a complete overhaul in the coming months. Any more input on using the category pages will help give me ideas on how to re-design them. As for the moderation, I've still got to mess with the settings. I thought that only posts in the "Brand Response" forum would need to go through moderation, not the "Everything Else" forum as well. Edit: Looks like it's automatically going through moderation because it contains a link! I suppose I'll turn that off because most posts will contain links as a reference to stuff on the site.
  19. Is the selector you're referring to on Amazon the one where you pick the "Size" (Queen/King/Full/etc.) and then the "Style" (8-inch/10-inch/etc.)? It would be nice to have that in theory, but unfortunately we haven't been collecting the data necessary for that, and it would be a pretty substantial effort to start. If Amazon had some sort of data-share agreement with us, it would be easier to implement things like this, but there is no such agreement in place, so it's hard to mimic the Amazon display in a lot of aspects. That said, I see your point about only being able to go off the titles, so I'll have to think about how we can display the existing data in a better format.
  20. Ah, I see, so instead of Most Trusted and Least Trusted, (or in addition to), show the "Most Trusted Critical Reviews". Definitely see how this could be a useful feature. Adding it to the list! Thanks for the suggestion!
  21. Tommy

    May I refer to your site and reports in a blog?

    Hey Mark- Thanks for asking! You can use any of the info on the reports, but please don't add words that aren't there, such as "ReviewMeta Certified", "ReviewMeta Verified", or "Guaranteed Fake Review free" or anything of that nature that's not available on the report pages.
  22. Tommy

    That other reviews scoring site

    Interesting comparison here. Let's take a deeper look... Before we begin, it's important to keep in mind that both sites are estimates, and since I didn't create FakeSpot, anything I say about their analysis is just my guess - I don't know more about the inner workings of their site than any other visitor.

    (1) SKG Automatic Bread Machine 2LB
    https://www.amazon.com/dp/B071GKYPQ3
    Amazon: 363 reviews, 4.4 stars, rank 74%
    ReviewMeta: Fail, 157 reviews, 4.3 stars, rank 63%
    FakeSpot: Grade D, Adjusted rating 2.0 stars, 49% low quality reviews.

    For the ReviewMeta report, I'm pretty confident in it after taking a manual look. Looks like there was some suspicious activity which was detected, and some of the reviews were de-valued. A product rating reduction of 4.4 => 4.3 isn't nothing, but it seems that the product has been able to generate a decent amount of trustworthy, honest reviews over time (which seems to be quite common). Looking at the FS report, if I click on the "Trustwerty adjusted rating" it takes me to Trustwerty.com and shows that it's 2.3 stars. (It also prompts me to pay $10 to see the full report, which I'm not doing). FS also says "44.0% Low Quality Reviews Detected". If there are 363 reviews, that means they're estimating 160 Low Quality Reviews. Assuming every single one of these is a 5-star review, let's look at how the review picture changes when we throw them out.

    5-star: 293 => 133
    4-star: 27
    3-star: 11
    2-star: 12
    1-star: 20

    Even if we throw out 160 5-star reviews, the average rating would only drop to 4.2/5. So something isn't quite adding up here. Even if you threw out every single one of the 293 5-star reviews, the average rating would only drop to a 2.6. And that's over 80% of the reviews! I strongly believe that you shouldn't do anything beyond simply devaluing some reviews. The technology looks at reviews and has absolutely no way of knowing anything about the products themselves, so trying to further "punish" a product is a bit of an overstep in my opinion. The bottom line is this: it's unclear how FS is calculating their "Trustwerty rating", but it's clear that they aren't using the same methodology that we are here at ReviewMeta.

    (2) Zojirushi BB-PAC20BA BB-PAC20 Home Bakery Virtuoso Breadmaker with Gluten Free Menu setting (two selectable variations on the same product page)
    https://www.amazon.com/dp/B0067MQM48
    Amazon: 1,776 reviews, 4.5 stars, rank 77%
    ReviewMeta: Pass, 1,520 reviews, 4.5 stars, rank 82%
    FakeSpot: Grade D, Adjusted rating approx. 2.4 [approx. because judged by eye from the filled-in stars], 44% low quality reviews.

    This is an example where FS doesn't like the reviews and RM does. What's tough about this is that RM shows all the work while FS doesn't, so it's hard to see exactly why FS has decided to give it the rating that it does. Looking at the RM report, I don't see any major red flags, and since I didn't create FS, I can't say why they found issue with the reviews.
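The rating arithmetic in the post above can be checked directly. The star distribution below is the one quoted for the SKG product (363 reviews total); note the simple mean of the raw distribution comes out to 4.5, slightly above the 4.4 Amazon displays, presumably because Amazon applies its own review weighting.

```python
# Re-running the post's arithmetic: what happens to the average rating if
# we throw out 160 five-star reviews (44% of 363, all assumed 5-star), or
# even every single five-star review.

dist = {5: 293, 4: 27, 3: 11, 2: 12, 1: 20}  # star -> review count

def avg(d):
    # Simple (unweighted) mean star rating of a distribution.
    return sum(stars * n for stars, n in d.items()) / sum(d.values())

print(round(avg(dist), 1))                    # 4.5

cut_160 = {**dist, 5: dist[5] - 160}          # drop 160 five-star reviews
print(round(avg(cut_160), 1))                 # 4.2

cut_all = {**dist, 5: 0}                      # drop all 293 five-star reviews
print(round(avg(cut_all), 1))                 # 2.6
```

Both figures match the post: discarding the estimated 160 "low quality" reviews only moves the average to 4.2, and even discarding every 5-star review only reaches 2.6, so a 2.0-star adjusted rating can't come from simply removing suspect reviews.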
  23. Tommy

    That other reviews scoring site

    Hey Brec- Welcome to the forum! Yes, this is something I'm definitely interested in discussing. Would love to see your examples. One thing I noticed about the adjusted rating between RM and FS is that RM is ONLY using the existing reviews to recalculate an adjusted rating. So, for example, if a product has ONLY 100 5-star reviews and that's it, it will either get an adjusted rating of 5 stars OR an "insufficient reviews". We're not going to make up lower-star reviews. I've seen products on FS get an adjusted rating of 2 stars when there isn't a single review below 3 stars on that product. They are definitely using a different methodology, and it's not clear what it is.
  24. Hello Louis- Thanks for being the first to take a shot at the response dashboard. I went ahead and removed the marketing images from the bottom of your post so as to comply with our rules. I'm assuming your product is this one? https://reviewmeta.com/amazon/B01G8FK7V2 The first thing I noticed is that the report was over a year old. I clicked the "Update" button (this is available to any visitor, not just me) and it refreshed the report, which shows a PASS now. One thing that I find interesting is that there are 10 unverified reviews, all of which are perfect 5/5 star reviews, while the verified reviews average 4.3 stars. There are also three one-hit-wonder + unverified reviews from the following reviewers: https://www.amazon.com/gp/profile/amzn1.account.AELNM3YLIV6M7FA2OBP5Y3ZPHNWA?pldnSite=1 https://www.amazon.com/gp/profile/amzn1.account.AGVQJFAHDN4BAQH5AML77NWYZI2Q?pldnSite=1 https://www.amazon.com/gp/profile/amzn1.account.AEWNP7HHNZKNLZKCCZ427WMIHREA?pldnSite=1 All three of these users posted their only review (a 5-star unverified-purchase review for your product) around the same time: Jun-18, Jun-20 and Jul-30 of 2016. This is what could have been causing the initial WARN, but now that there are additional reviews, it could have been bumped up to a PASS. Again, I appreciate you going out on a limb and being the first to post a public response.
