
All Activity

  4. Hey Mike - It's always important to remember that both Fakespot and ReviewMeta produce estimates - nobody can tell for certain which reviews are fake and which are real, although Fakespot isn't as upfront about that. Fakespot also doesn't share details about its analysis, so it's impossible to check their work or really know how their secret formula operates. I've seen Fakespot claim that 30% of reviews on Amazon are fake. We think this is way overblown and put that number closer to 7-9%; Amazon itself claims less than 1%. So our understanding is that Fakespot is much more aggressive and is likely producing a lot of false positives. Another thing to remember is that just because a product has fake reviews does not mean it's a bad product. A Fakespot F or a ReviewMeta Fail doesn't necessarily mean the product is counterfeit, defective or faulty. Always go with your gut, and always return products on Amazon if you feel you've been deceived. Here's something we wrote about this topic a few years back, with a good discussion in the comments: https://reviewmeta.com/blog/fakespot-vs-reviewmeta-for-analyzing-reviews/
  5. Hello, I’ve recently started using your app on Amazon and I have a question. Prior to using your app I was using Fakespot, which I have continued to use in addition to yours. Tonight I looked up items that I knew would probably have questionable reviews (e.g. cheap Chinese products with mainly all five-star reviews). I did this with four different items, and each time Fakespot came up with the expected rating of F, meaning that fewer than 10% of the reviews seemed reliable. However, your app, on all four of the items I looked at, did not change the numeric rating at all and said that it deemed all of the reviews legitimate. Even if the algorithms and metrics you use are vastly different from Fakespot's, I would imagine there would be some correlation in results; however, in these four cases I found no correlation whatsoever. As an example, here is one product that comes up with an F rating on Fakespot and a perfectly clean rating on ReviewMeta. Could you take a look and explain why? Example: “USB Hub,atolla USB 3.0 Hub with 3 USB 3.0 Ports and SD/Micro SD Card Reader, Ultra Slim USB Splitter with Individual Power Switches and LEDs” made by Atolla. Thanks, Mike
  6. Yeah, that notice should still be functioning and might be a good way for us to figure out which ones are using the one-tap reviews.
  7. I suppose they're making a minimal effort to protect customers by requiring all these to be "Verified Purchases" — which I think should be true for (almost?) all product categories, anyway. My main reason for bringing it up, though, is just in case RM runs into these beta reviews. I suppose these cases would be covered in the "x reviews reported, but y reviews found" &/or "words per review" tests?
  8. Yes, we are able to cover the costs of the server and a little bit extra. I haven't run the numbers in a while, but I think that I still haven't even recovered the full investment in the site (was losing money the first several years). I'm not doing this to get rich - I thought it would be a fun hobby to put together and now it's grown to a lot more than that. I still don't feel comfortable taking people's money and I'd rather them donate to the ACLU or CPJ as mentioned on the donate page: https://reviewmeta.com/donate
  9. Interesting - I personally believe this is the wrong direction. It makes it easier to flood a product with a bunch of one- or five-star reviews that have zero justification, possibly without customers even being able to see the reviewers behind them. It makes it much harder for anyone to get a sense of whether the reviews are fake or real - maybe that's the whole point? We'll see how this goes. Amazon is still "testing" it at this point, so we'll see if they keep it. My main question in terms of incorporating this change into ReviewMeta is how Amazon will display the 1-tap reviews to customers. Will it just say "400 reviews" but then only display the reviews that have written text? Or will they display all 400 reviews, but show "1-tap review" for each of the one-taps? I think I'll have to write a blog post on why I feel this is the wrong direction.
  10. @Tommy Per TechCrunch, 1-Tap Reviews are in beta. Does RM need to be adjusted for such?
  11. flixflix

    PayPal or Patreon option needed + nut query

    Do you make any money via the links and ads though? Is it enough to cover your costs and time?
  12. Tommy

    1 review "one-hit wonders"

    Hello - Thanks for posting. If all the reviewers have 2 reviews, the "Reviewer Participation" test should still raise a red flag (see the illustrative sketch at the end of this stream for the general idea behind this kind of check).
  13. I've been onto fake reviews for a while now, ever since my mom bought a blood pressure checker on Amazon that turned out to be a piece of junk with thousands of 5-star reviews. That is concerning for public health. People don't know better, and getting false BP readings can drastically change the course of their health/treatment. Anyway, I was looking for a head shaver today and realized that fake reviews are getting harder to spot now. I put a "popular" head shaver into your site and it came back as a pass, but one of the criteria checks whether that is the only review the user has posted. Going through the reviews, most of the reviewers had 2 reviews, easily bypassing that check. Just FYI! Great work you do, by the way - thank you.
  14. Tommy

    Help contacting ThePriceTest?

    Hi Ted - Have you tried reaching out to the email they have listed here? https://thepricetest.com/info Can you forward me an example "fake price change" email to [email protected]? Why do you believe they are doing something to your device in the background?
  15. Ted Walters

    Help contacting ThePriceTest?

    thepricetest.com is a scam!!! There is no way to contact them. They are sending me and others fake price change alerts and apparently doing something to my device in the background. Block them.
  16. Tommy

    should I trust your reports anymore

    Hello - Please see our post on the latest algorithm updates: https://reviewmeta.com/blog/0-unnatural-reviews-august-2019-algo-updates-explained/ As usual, we always recommend reading the actual reports and making a determination for yourself. In my opinion, neither of these two products seems to have reviews that look too bad - in fact, both seem to have had many reviews deleted recently, so they could have been "cleaned up" by Amazon in the meantime.
  17. This is not the first time this has happened. I was looking at car seat cushions, and the first time I looked at this item I think it showed 54% potentially unnatural. But since the last review was in May of 2019, I ran the report again. This time it showed no (zero????) potentially unnatural reviews. This has happened several times, so I don't think I can trust this site anymore. This was just the most obviously ridiculous result. Here is the report: https://reviewmeta.com/amazon/B07N68LW2H This is another one I ran a report on, but this time because an older report didn't show up: https://reviewmeta.com/amazon/B01EBDV9BU
  18. A couple of reasons for open source:
      - It could potentially be included in F-Droid (if appropriately licensed), thereby increasing RM's reach.
      - Users who are devs (I'm not) could help with feature requests and/or bug fixes (e.g. using the app as a pure redirector, allowing links to open in the user's browser of choice, or using the app to open RM links).
  19. The source of the app? I'm sure we could make it available. What changes did you want to make?
  20. ReviewMeta Phone Apps 🎉 Is the source available? That would help expand RM's reach.
  21. Tommy

    How good is the system at spotting fraud?

    https://reviewmeta.com/blog/how-accurate-is-reviewmeta-com/
  22. Hi! I really like this website and find it quite useful given how deceitful sellers can be these days. One thing I really wonder is whether there is any information on how accurate the system's conclusions usually are. For example, if we know 100 reviews or products are fraudulent, how many would be identified as such? And of 100 non-fraudulent ones, how many would be marked as fraudulent? Have there been any tests like that? I would find such information very interesting and useful! Sorry if I missed this information somewhere on the website!! Best, Angela
  23. Tommy

    RM vs. Non-Sold Amazon Categories

    Wow, I didn't realize that Amazon had started selling full cars now. Or rather, you're just able to "add to list", so you're not even able to buy. We're definitely showing these reviews as "unnatural" on RM because they have been solicited by Amazon. I still think that's the most accurate way of presenting the reports.
  24. Tommy

    Help contacting ThePriceTest?

    They said it should work now.
  25. @Tommy For some reason, Amazon actively solicits reviews in product categories it doesn't/can't/won't sell. So RM gives strange results, since none of the reviews are Verified Purchases and the products can't actually be bought. Is this the best way to handle these? Here's an example: https://reviewmeta.com/category/amazon/10677469011
  26. TPS

    Help contacting ThePriceTest?

    Do you know whether there's an update re: this contact?
  27. Tommy

    DormDoc Reviews

    Thanks for the response. We've linked to this page from your brand page and all products.
  28. DormDoc

    DormDoc Reviews

    We at DormDoc have never utilized any form of review solicitation/collection. Our reviews are legitimate. Sincerely, Elaine Harrington, R.Ph. President and General Manager DormDoc LLC San Antonio, Texas https://reviewmeta.com/brand/dormdoc
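
Illustrative sketch for the "Reviewer Participation" discussion above: the snippet below shows the general idea behind flagging products whose reviewers have very thin review histories. It is a hypothetical example only, not ReviewMeta's actual (unpublished) test; the field names, the 2-review cutoff and the 60% threshold are assumptions made purely for illustration.

      # Hypothetical sketch of a "reviewer participation" style check.
      # NOT ReviewMeta's actual algorithm; fields and thresholds are assumed.
      from dataclasses import dataclass
      from typing import List

      @dataclass
      class Review:
          reviewer_id: str
          reviewer_total_reviews: int  # reviewer's lifetime review count (assumed data)
          rating: int

      def low_participation_share(reviews: List[Review], max_history: int = 2) -> float:
          """Fraction of reviews written by reviewers with <= max_history lifetime reviews."""
          if not reviews:
              return 0.0
          thin = sum(1 for r in reviews if r.reviewer_total_reviews <= max_history)
          return thin / len(reviews)

      def flag_reviewer_participation(reviews: List[Review], threshold: float = 0.6) -> bool:
          """Raise a red flag when the low-participation share exceeds the (assumed) threshold."""
          return low_participation_share(reviews) >= threshold

      # Example: three of four reviewers have only 1-2 lifetime reviews -> flagged.
      sample = [Review("a1", 2, 5), Review("a2", 1, 5), Review("a3", 2, 5), Review("a4", 14, 4)]
      print(low_participation_share(sample))      # 0.75
      print(flag_reviewer_participation(sample))  # True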