Rather than suggest a specific site for product reviews, I’ll give a short list of “tells” I use to rate a site as biased, corrupt, feeder, independent, sold out, expert, etc. Based on a site’s final “score”, I’ve found this pretty effective at saving hours wading through semi-convincing blarney and – more important – saving me from losing money on substandard products.

  1. If there’s a link to Amazon (or equivalent) to buy the product being reviewed, the site is getting a kickback and the review will be biased. This has infiltrated many of the established review sites as they try to stay profitable, compromising integrity.
  2. Is the site a specialist? Does it focus solely on certain types of product (e.g. high-end audio equipment)? Is it independent (i.e. not owned by some megacorp)? Then it will be worth looking into further.
  3. How much history does the site have? When was it founded? Was it once (or is it still) print media? How many mouths does it have to feed? Truth is, a site that began as print media and contracted to online-only but maintains a staff should be treated as suspect. They’ve every reason to sell biased and corrupt reviews.
  4. Does the site avoid purchase links for the type of product you’re looking into? (Some sites only link to specific affiliates, i.e. sponsors.) Does it avoid Amazon ‘deals’? If so, this is promising.
  5. Is the site well established but not linked into the top pages of Google and not thrown up as a top search result by dint of sponsorship? This is promising.
  6. If the site is selling the product directly, in-house reviews will be biased and “verified purchaser” reviews will likely be corrupted.
  7. Fake-review checkers like Fakespot and ReviewMeta.com (an Amazon review checker) can be useful for extreme cases, but these days they’re easily fooled too. Still, another layer of safeguarding.
  8. If the site is sufficiently general – say it does more than just review products of a certain type – it may be listed on Trustpilot, and this is worth checking. If the site rates badly on Trustpilot, it’s got problems.
  9. What I do is pick a few products I know personally – ideally ones with specific flaws that are obvious to any purchaser but won’t be in the press releases – and see if the reviews hit on those issues. If they do, and there’s similar integrity and fidelity on comparable products (i.e. the site isn’t just a feeder for a certain brand, rubbishing everything else), you may have hit gold!
  10. Is the site up to date? Is it suspiciously recommended on third-party platforms like Quora, often via a cut-and-paste answer to semi-related questions (a keyword sniper bot)?
  11. Does the site frontline articles that try to look like reviews but cover products not yet released to the public? Unless the site has a pre-release physical product, this is a surefire blag and the reviews will be compromised.
  12. Does the site cover only big brand names? This is a bad sign. No product type is populated solely by big household name brands.
  13. Does the site cover a reasonable spectrum of price options for products of the same type? If not, there’s reason to be concerned. Example: a site reviewing action cameras will invariably have the GoPro, but there are options at 1/10th the price – not as good, but with similar basic functions.
  14. No review site can be authentic without dealing with both the highest and lowest prices and, most important, giving you a satisfactory explanation of WHY there is such a difference between top and bottom.
  15. Is the review site classifying based on price rather than rating? This should be looked into because many sites use the budget v best divide to avoid having to directly compare essentially similar products. It’s a tactic you’d expect from a site trying to sell expensive and budget versions without putting off the buyer by telling the rich guy he’s paying more for nothing, or the poor guy he can’t actually afford a decent version of the product.
  16. Does the review site include the name of the individual reviewer of the product? Are there contact details for that reviewer? Does the reviewer include an abridged resume? If not, it’s cause for concern.
  17. Online versions of newspapers sometimes do ‘best in class’ reviews for mainstream products under headlines like “Which is the best blah blah?”. This will likely be biased, compromised and an evolved money spinner for the newspaper.
  18. You’ll be able to tell a fake review if the review/list says very little about the shortcomings and frontline issues of the technology (industry standards), focusing instead on positives.
  19. A genuine enthusiast (or expert) will review with sincere interest in the product type as a whole. This review will put the product in context, make comparisons with other options, and pinpoint pros and cons, some of which will be industry-wide and therefore appear in ALL reviews of the same type.
  20. Does the site rate brand names far above anything generic, no matter what the generic’s spec is? This is a sign of bias and corruption.
  21. Does the site have a summary star rating or percentage? If so, a common check is to look at a popular frontline brand review – like the Jabra Elite 65t earbuds at $125 – and compare it against the site’s review of an average generic equivalent.
  22. Does the review blindly accept the press release’s ‘industry standard’ limits and use language that glosses over (or reassures you about) obvious objective negatives? This is another sign of corruption. For example: the TrustedHoverReviews.com site reviews the best hoverboards of 2019; it covers known brands and a few generics.
  23. Have you found some negatives in the review – enough to make it seem like it could be genuine? This check is a bit more complicated. See, bogus reviews will smokescreen by writing up negatives, but these will only be negatives by comparison with other product options of the same type. It’s a cheap tactic and can actually be a sign of deeper corruption, given the trouble the review is taking to hide its true agenda.
  24. Does the site or the product have customer submitted reviews? Verified purchase began as a good idea but it’s actually a sign the review is more likely to be corrupt as so many products include a chit with the product or send out the product free to a mailing list, offering discounts/deals in return for 5* reviews.
  25. If the site or product has customer-submitted reviews, the only useful info to be gleaned is from the “eloquent midrange”, i.e. 2- and 3-star reviews and longer, considered 4-star reviews. Cut off the 5s and the 1s: the former tend to be biased; the latter tend to be edge cases like a product delivery being screwed up (which tells you nothing about the product itself). Cut off reviews shorter than 2 sentences. Cut off reviews that don’t mention at least three specific features of the product with an opinion – “the battery life was excellent” is OK, but “the phone came with a replacement battery” is not!
  26. Biased or covert-sponsored reviews share some techniques. Most abused is “infectious enthusiasm” where the reviewer moons over tech – scripted by press releases – glossing over shortcomings e.g. “the battery life is a decent 25 seconds” or “sleek, chic, Milan-inspired design contours more than make up for the loss of headphone jack; and of course most have switched to USB-C already.” Real reviews are less bright and bubbly, more demanding and pedantic.
  27. Many sites try to appropriate an expertise they may (through the individual contributors involved) be able to call on. But experts need money same as everyone else, and selling endorsements – even at the level of an obscure tech geek – can be a nice little income. It’s popular with corporate brands too, as it boosts edgy and niche credibility (on which mainstream success MUST be built, if it wants to last).
  28. Genuine enthusiasts, experts, geeks, or everyday serious users (e.g. a parent with a site reviewing baby high-chairs) will quickly J-curve their ratings: very few highest-rank (5-star, 98%+), more high-mid, mid and low-mid ratings and – most important – no fewer lowest-rank (1-star or 0-star, sub-25%) ratings than middling ones. This is the nature of consumerism: for every decent product idea there will be a handful of innovators, a handful of that handful also wed to quality, a majority of copycats or minor variations on the copycat theme, and PLENTY of cheap shit, outright scams, etc. A non-corrupted review site will have a spread of ratings (or review conclusions) matching this range. Bias, corruption, covert sponsorship: there will be an overload at the top/highest end and either an absence of 2-3 stars plus plenty of 1*, or a lack of 1* with some 2* and 3*. These latter will often be the non-brand-name versions.
  29. True review articles will weight more towards criticism than congratulation: the reviewer will be into the type of product or the genre of service, so will be very aware – and keen to share – insight into the shortcomings of the industry as well as the failures of the actual product. Corrupted or dodgy sites won’t have this sort of scope. It doesn’t help the cause of easy profiteering to have normal industry standards exposed as sub-standard, nor does a product sell as well if a trusted review spotlights its pros and cons without favouring the pros 4 to 1.
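To give a feel for the “final score” idea from the opening paragraph, here’s a minimal sketch of turning a few of the tells above into a single number. The tell names, weights, and which tells are included are purely illustrative assumptions of mine, not a fixed standard:

```python
# Hypothetical weights for a handful of the tells above.
# Negative = bad sign, positive = good sign; values are assumptions.
TELLS = {
    "has_affiliate_links": -2,    # tell 1: kickback links to Amazon etc.
    "specialist_independent": +2, # tell 2: focused and not megacorp-owned
    "prerelease_reviews": -2,     # tell 11: "reviews" of unreleased products
    "covers_generics": +1,        # tells 12-13: not just big brand names
    "named_reviewers": +1,        # tell 16: named, contactable reviewers
}

def score_site(observations):
    """observations: dict of tell-name -> bool for one site.
    Returns the sum of weights for every tell that applies."""
    return sum(weight for tell, weight in TELLS.items()
               if observations.get(tell))

# Example: affiliate links drag down an otherwise promising site.
site = {"has_affiliate_links": True, "specialist_independent": True,
        "named_reviewers": True, "covers_generics": False,
        "prerelease_reviews": False}
print(score_site(site))  # -2 + 2 + 1 = 1
```

A real version would cover all the tells and need judgement calls per observation; the point is just that each tell is a cheap boolean check that feeds one running total.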
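The “eloquent midrange” filter from tell 25 is mechanical enough to sketch in code. The sentence-splitting, the feature list, and the “longer 4-star” threshold below are all illustrative assumptions; real feature detection would need NLP rather than keyword matching:

```python
# Hypothetical feature keywords; a real filter would need
# per-product-type vocabularies and proper NLP.
FEATURES = {"battery", "screen", "camera", "build", "sound"}

def eloquent_midrange(reviews):
    """reviews: list of dicts with 'stars' (int 1-5) and 'text' (str).
    Keeps 2-3 star reviews and longer, considered 4-star reviews;
    drops 5s and 1s, very short reviews, and reviews that mention
    fewer than three specific features."""
    kept = []
    for r in reviews:
        stars, text = r["stars"], r["text"]
        # Crude sentence count: split on terminal punctuation.
        sentences = [s for s in text.replace("!", ".").split(".") if s.strip()]
        mentioned = {f for f in FEATURES if f in text.lower()}
        considered_4 = stars == 4 and len(sentences) >= 4
        if (stars in (2, 3) or considered_4) \
                and len(sentences) >= 2 and len(mentioned) >= 3:
            kept.append(r)
    return kept
```

Run over a product’s review page, this leaves only the midrange voices the tell says are worth reading.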
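The J-curve spread check from tell 28 can also be sketched: top ratings should be scarce, and bottom ratings at least as common as middling ones. The thresholds here are assumptions I’ve picked for illustration, not derived from the text:

```python
from collections import Counter

def rating_spread_flags(stars):
    """stars: list of ints 1-5, one per review on the site.
    Returns warning strings; an empty list means the spread
    looks organic (J-curve shaped)."""
    n = len(stars) or 1
    c = Counter(stars)
    flags = []
    # Overload at the top/highest end (threshold is an assumption).
    if c[5] / n > 0.4:
        flags.append("too many 5-star ratings")
    # Genuine spreads have no fewer 1-stars than middling ratings.
    if c[1] < (c[2] + c[3]) / 2:
        flags.append("suspiciously few 1-star ratings")
    return flags

# An organic, bottom-heavy spread raises no flags:
print(rating_spread_flags([5, 4, 4, 3, 3, 3, 2, 2, 2, 1, 1, 1, 1]))  # []
```

The same function run over a covert-sponsored site’s ratings – mostly 5s, almost no 1s – should trip both flags.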

These are just some rough notes on fake-news product reviews and sites that claim to be a one-stop place to find information and comparison tests between ‘rival’ products. Keep an eye on this page for an updated version in the near future with polished, properly categorised points and – later this year – a website and an app with access to an NLP/machine-learning benchmark set (code versions of what’s listed above) that’ll let you subject any site, any review, any media outlet or any selected product to unbiased, uncorrupted scrutiny.

