All Activity

  1. Today
  2. Yesterday
  3. Tommy

    High volume around holidays

    This is already somewhat considered - we always check what the reviewers on high-volume days are saying vs. the rest of the reviews. In this case, we can see that the 9% of reviews written on high-volume days give the product 4.5 stars on average, while everyone else gives it 4.1 stars. In my opinion, that suggests these reviews are somehow biased. They may not be "fake", but there's something fishy going on with them. Either way, you can always click "View/Edit Adjustment", slide the "Reviews on high volume days" dial all the way to the right, hit "Apply New Weighting", and see what the adjusted rating is with those reviews included.
  4. I noticed while doing some Christmas shopping that items that are commonly gifted (in this case I was looking at Roombas) are frequently flagged for reviews created on high-volume days. Looking closer at those high-volume days, most fall in the couple of weeks after Christmas, where you would intuitively expect a bump in the number of reviews for an item that was a popular gift that year. Here is an example page: https://reviewmeta.com/amazon/B07DL4QY5V It might be worth weighting high-volume days in late December or January a little less to take this into consideration; a rough sketch of the idea is below.
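    A rough sketch of the adjustment described in the reply above, combined with the seasonal idea from this post. It is an illustration only: the weights, the holiday window, and the field names are assumptions, not ReviewMeta's actual formula.

        import datetime

        def adjusted_rating(reviews, high_volume_weight=1.0, holiday_relief=0.5):
            """Weighted average star rating.

            reviews: dicts with 'stars', 'high_volume_day' (bool) and 'date' (datetime.date).
            high_volume_weight: the "dial" for reviews on high-volume days;
                                0.0 drops them entirely, 1.0 counts them fully.
            holiday_relief: fraction of that penalty forgiven when the high-volume
                            day falls in the post-Christmas gift window.
            """
            total, weight_sum = 0.0, 0.0
            for r in reviews:
                w = 1.0
                if r["high_volume_day"]:
                    w = high_volume_weight
                    d = r["date"]
                    if (d.month == 12 and d.day >= 20) or d.month == 1:
                        # soften the penalty for the expected post-holiday bump
                        w += (1.0 - w) * holiday_relief
                total += w * r["stars"]
                weight_sum += w
            return total / weight_sum if weight_sum else None

        reviews = [
            {"stars": 5, "high_volume_day": True,  "date": datetime.date(2019, 12, 27)},
            {"stars": 5, "high_volume_day": True,  "date": datetime.date(2019, 7, 16)},
            {"stars": 4, "high_volume_day": False, "date": datetime.date(2019, 11, 2)},
            {"stars": 4, "high_volume_day": False, "date": datetime.date(2019, 3, 8)},
        ]
        print(adjusted_rating(reviews, high_volume_weight=0.0))  # high-volume days excluded
        print(adjusted_rating(reviews, high_volume_weight=1.0))  # dial all the way to the right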
  5. Last week
  6. Earlier
  7. Reagentgrade

    Refresh Report not working on iPhone app

    Nothing happened. It was like it wasn’t actually a link. No action, no box.
  8. Tommy

    Refresh Report not working on iPhone app

    Thanks for the bug report. What happened when you clicked the refresh link on the iOS app? Did you see the box appear with the button?
  9. Today I used the share link on my iPhone. It showed 100% good reviews, but the number of reviews was shown as 21. The report was dated October 27; today is November 13. There were actually 189 reviews. I tried to refresh the report, but the app would not do so. Finally I went to ReviewMeta in Chrome and was able to refresh the report. 71% of the reviews were now suspect.
  10. Nothing happens unless you actually go to the page. We're working on an extension improvement, but it still won't automatically update reports unless you go to the page on ReviewMeta.com.
  11. Yeah, but we don't like to use the word Fake: https://reviewmeta.com/blog/unnatural-reviews-and-why-we-dont-use-the-word-fake/ 0% chance that we're changing the name.
  12. Submit your ideas here! Mine is ReviewReviews. Or ReviewedReviews. Or ReviewsReviewed. Or ReReview. Or, go ahead and argue that ReviewMeta is perfectly adequate! (Staff not allowed! 😁)
  13. For products never-before analyzed, are they added to the analysis queue? For already-analyzed products, is the analysis added to the update queue?
  14. Hey Mike - It's always important to remember that both Fakespot and ReviewMeta are estimates: nobody can actually tell which reviews are fake and which are real as a matter of fact, although Fakespot isn't as upfront about this. Fakespot also doesn't share details about their analysis, so it's impossible to check their work or really know how their secret magic formula works. I've seen Fakespot going around saying that 30% of reviews on Amazon are fake. We think this is way overblown and put that number closer to 7-9%; Amazon itself claims less than 1%. So our understanding is that Fakespot is much more aggressive and is likely displaying a lot of false positives. Another thing to remember is that just because a product has fake reviews does not mean that it's a bad product: a Fakespot F or a ReviewMeta Fail doesn't necessarily mean the product is fake, defective, or faulty. Always go with your gut, and always return products on Amazon if you feel you've been deceived. Here's something we wrote about this topic a few years back, with a good discussion in the comments: https://reviewmeta.com/blog/fakespot-vs-reviewmeta-for-analyzing-reviews/
  15. Hello, I’ve recently started using your app on Amazon and I have a question. Prior to using your app I was using Fakespot, which I have continued to use in addition to yours. Tonight I looked up items that I knew would probably have questionable reviews (e.g. cheap Chinese products that have mainly all five-star reviews). I did this with four different items, and each time Fakespot came up with the expected rating of F, meaning that fewer than 10% of the reviews seemed reliable. However, your app, on all four of the items that I looked at, did not change the numeric rating at all and said that it deemed all of the reviews legitimate. Even if the algorithms and metrics that you use are vastly different from Fakespot’s, I would imagine there to be some correlation in results; however, in these four cases I found no correlation whatsoever. As an example, here is one product that comes up with an F rating on Fakespot and a perfectly clean rating on ReviewMeta. Could you take a look at this and explain to me why? Example: “USB Hub,atolla USB 3.0 Hub with 3 USB 3.0 Ports and SD/Micro SD Card Reader, Ultra Slim USB Splitter with Individual Power Switches and LEDs” made by Atolla. Thanks, Mike
  16. Yeah, that notice should still be functioning and might be a good way for us to figure out which ones are using the one-tap reviews.
  17. I suppose they're making a minimal effort to protect customers by requiring all these to be "Verified Purchases" — which I think should be true for (almost?) all product categories, anyway. My main reason for bringing it up, though, is just in case RM runs into these beta reviews. I suppose these cases would be covered in the "x reviews reported, but y reviews found" &/or "words per review" tests?
  18. Yes, we are able to cover the costs of the server and a little bit extra. I haven't run the numbers in a while, but I think I still haven't recovered the full investment in the site (it was losing money the first several years). I'm not doing this to get rich - I thought it would be a fun hobby to put together, and now it's grown to a lot more than that. I still don't feel comfortable taking people's money, and I'd rather they donate to the ACLU or CPJ as mentioned on the donate page: https://reviewmeta.com/donate
  19. Interesting - I personally believe this is the wrong direction. It makes it easier to flood a product with a bunch of one- or five-star reviews that have zero justification, possibly without even being able to see the reviewers behind them. It makes it much harder for anyone to get a sense of whether the reviews are fake or real - maybe that's the whole point? We'll see how this goes; Amazon is still "testing" it at this point, so we'll see if they keep it. My main question in terms of incorporating this change into ReviewMeta is how they are going to show the 1-tap reviews to the customer. Will it just say "400 reviews" but only display the reviews that have written text? Or will they display all 400 reviews, but just say "1-tap review" for each of the one-taps? I think I'll have to write a blog post on why I feel this is the wrong direction.
  20. @Tommy Per TechCrunch, 1-Tap Reviews are in beta. Does RM need to be adjusted for such?
  21. flixflix

    PayPal or Patreon option needed + nut query

    Do you make any money via the links and ads though? Is it enough to cover your costs and time?
  22. Tommy

    1 review "one-hit wonders"

    Hello - Thanks for posting. If all the reviewers have 2 reviews, the "Reviewer Participation" test should still raise a red flag. A rough sketch of that kind of check follows.
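    For illustration only, this is the general shape of a participation check keyed to each reviewer's total review history rather than to exactly one review. The cutoff, the threshold, and the function name are assumptions, not ReviewMeta's published test.

        def reviewer_participation_flag(review_counts, max_history=2, share_threshold=0.6):
            """Flag a product when most of its reviewers have very thin review histories.

            review_counts: lifetime review count for each reviewer of the product.
            max_history: a reviewer with this many reviews or fewer counts as low-participation.
            share_threshold: flag the product when the low-participation share exceeds this.
            """
            if not review_counts:
                return False
            low = sum(1 for n in review_counts if n <= max_history)
            return low / len(review_counts) > share_threshold

        # Reviewers who each posted exactly two reviews still trip the check:
        print(reviewer_participation_flag([2, 2, 2, 1, 2, 14, 2, 2]))  # True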
  23. I've been onto fake reviews for a while now, ever since my mom bought a blood pressure checker on Amazon that turned out to be a piece of junk with thousands of 5-star reviews. That is concerning for public health: people don't know better, and getting false BP readings can drastically change the course of their health/treatment. Anyways, I was looking for a head shaver today and have realized that the fake reviews are getting harder to spot now. I put a "popular" head shaver into your site and it came back as a pass, but one of the criteria was whether a review is the only one the user has posted. When going through the reviews, most of the reviewers had 2 reviews, easily bypassing that check. Just FYI! Great work you do, by the way - thank you.
  24. Tommy

    Help contacting ThePriceTest?

    Hi Ted - Have you tried reaching out to the email they have listed here? https://thepricetest.com/info Can you forward me an example "fake price change" email to [email protected]? Why is it that you believe they are doing something to your device in the background?
  25. Ted Walters

    Help contacting ThePriceTest?

    thepricetest.com is a scam!!! There is no way to contact them. They are sending me and others fake price changes and apparently doing something to my device in the background. Block them!
  26. Tommy

    should I trust your reports anymore

    Hello - Please see our post on the latest algorithm updates: https://reviewmeta.com/blog/0-unnatural-reviews-august-2019-algo-updates-explained/ As usual, we always recommend reading the actual reports and making a determination for yourself. In my opinion, neither of these two products seems to have reviews that look too bad; in fact, both seem to have had many reviews deleted recently, so they could have been "cleaned up" by Amazon in the meantime.
  27. This is not the first time this has happened. I was looking at car seat cushions, and the first time I looked at this item I think it showed 54% potentially unnatural reviews. But since the last review was in May of 2019, I ran the report again. This time it showed no (zero?!) potentially unnatural reviews. This has happened several times, so I don't think I can trust this site anymore. This was just the most obviously ridiculous result. Here is the review: https://reviewmeta.com/amazon/B07N68LW2H This is another one I ran a report on, but this time because an older report didn't show up: https://reviewmeta.com/amazon/B01EBDV9BU
  28. A couple of reasons for open source: - It could potentially be included in F-Droid (if appropriately licensed), thereby increasing RM's reach. - Users who are devs (I'm not) could help with feature requests &/or bug fixes (e.g. using the app as a pure redirector, letting links open in the user's browser of choice, or using the app to open RM links).
  29. The source of the app? I'm sure we could make it available. What changes did you want to make?