From fake reviews written to boost rankings to early reviews of unreleased products, the desire to rank well and get there first has led to a drastic decline in the quality of online reviews. We investigate this worrying trend.
Reviews That Aren’t Reviews
A review should be a considered assessment of a final product, written only once someone has had a substantial chance to explore the product and all of its features. Someone forgot to tell that to T3.
It’s impossible to review the Nintendo Wii U at least five months before release, especially when the games on show were beta versions. A busy conference is also hardly an ideal environment in which to test hardware. Michael Rundle sums this up best:
He’s right. Being able to post the first review of a product has become a driving force for online publications. People love to search for “Product Name + Review”, so having a historic page with lots of value is essential. The first review picks up lots of shares and links, and is then in a very strong position when people search.
The Race To Be First
If the T3 article jumped the gun, then PC Advisor jumped the shark with its review of the Wii U, posted in June 2011 – almost a year and a half before release.
At that stage, the Wii U had no games to show, only tech demos, and its unfinished hardware has since changed. Every conclusion drawn in the article is based on an incomplete product and on games that have since been updated.
However, when you search for “Wii U Review” it’s one of the top results, outranking many of the “reviews” written this year – largely because it is a strong historic result. Google has no way of knowing that it’s fluff content designed to pull in traffic. It paints a bad picture of the product and could put people off, even though the review isn’t based on the real retail product.
What About Brand Image?
Given this, it’s unlikely Nintendo would encourage such coverage – you’d certainly assume they would be against reviews of pre-release products.
Well, another Page One result for Wii U Review is… The official Nintendo Magazine!
So the magazine designed to represent Nintendo also published a “review” of unfinished hardware and software. Interestingly, this review is far fairer and more accurate – which, if you’re going to publish a review early, seems like a better approach.
Yes, the Official Nintendo Magazine is editorially impartial, so this probably isn’t a deliberate strategy by the brand. However, it would certainly be worth Nintendo adding a “Wii U reviews” page to their own website to act as a placeholder until real reviews launch.
How Can This Be Stopped?
As the Wii U was shown at a trade show with thousands of people testing it, it would have been very hard for Nintendo to control who saw the console and when they chose to write about it. The film industry, however, seems to have the right idea, using embargoes and private press screenings to tie reviewers into contracts. These commonly work as “if you want to see our movie, you can’t review it until …”.
This works to a degree, helping to ensure that reviews are based on final copies of movies and that they all land at a similar time, generating a buzz around each film.
It’s by no means a definitive solution, though. The New Yorker broke an embargo on The Girl With The Dragon Tattoo, which led to the reviewer being banned from future press screenings. Depending on the contract you sign, breaking an embargo could even lead to a hefty fine, making embargoes a very powerful tool for controlling the media.
In addition, embargoes only apply if you’ve signed the paperwork. A blogger who reviewed an unreleased Hasbro product found himself under threat from lawyers and “representatives” who turned up at his house. However, he had bought the product in a region where it was already on sale (so hadn’t broken a street date), meaning no laws were broken and he was free to review it however he wished.
Reviews Shouldn’t Be Rushed
The other drive behind embargoes is that companies often want reviews to go out as close to release as possible, sometimes even afterwards. This stems from a fear that reviews may be bad and put people off (another consequence of those constant searches for “Product Name + review”), as well as a desire to control reviews so they all land at the same time for maximum hype.
However, the fear that an embargo may be broken tends to mean final products are sent to reviewers very close to the embargo date. Reviewers then have to drop everything, write up their notes as fast as possible and queue their reviews to be first when the embargo lifts.
It’s a horrible approach to reviews
A reviewer should be able to spend as long as they need to form their views, especially if the product takes a long time to fully enjoy (e.g. a book, a video game or anything that takes a considerable time investment to consume). Rushing through something in order to publish the first review isn’t fair on the reviewer or the readers.
That’s why I like the way Kotaku handles reviews. They don’t always have them ready for a game’s release date, waiting until they have had time to try the game fully. Whilst they lose countless page views and advertiser dollars by doing this, it does mean their reviews are trustworthy. They also state in each review exactly how much time they invested in the game.
Google Aggregators Make The Problem Worse
I really like Rotten Tomatoes and GameRankings.com. Instead of reading individual sites for reviews, I generally look at the consensus there, as it’s usually published after release and tends to be based on high-profile real reviews. These two sites both get aggregation right.
Google on the other hand… not so much.
It started with local listings and citations. Getting more reviews on a Google local profile allowed it to perform better in the rankings. So in order to do well in local SEO you had to drive lots of reviews to your profile.
That led to people gaming the system with fake positive reviews for themselves and fake negative reviews for competitors (so their listings would look bad). It’s not exclusively a Google problem: Yelp and TripAdvisor are two high-profile sites often marred by fake reviews.
If not for Google local listings, these sites would both have far fewer fake reviews. It also doesn’t help that Google can’t seem to add up review scores, unfairly favouring positive reviews over negative ones.
Where Do We Go From Here?
First of all, I love reviews and I feel they are an essential way to gain opinion on what to buy next. For five years I wrote reviews for various websites, so to say that we should stop all reviews would be both wrong and hypocritical.
Ultimately, as long as people break embargoes and early reviews continue to rank and drive traffic, the issue will continue. And as long as fake reviews can go undetected, people will keep writing them.
For consumers, my main advice would be to find a website you can trust and stick with it. Think of products you like and don’t like, then find a site that reviews them with views similar to your own. Stick with that reviewer and follow their work. You don’t have to agree with everything they say, but if you can trust that they’ve really tried a product and got to grips with it, that’s a good indicator of how much you can trust them.
Ignore sites like Amazon as a review source. It’s really easy to write fake reviews there (one book got 200 fake reviews in two days, which still remain in place), and it’s hard to tell whether someone who likes a product shares your own tastes.
When looking for review sites, try to choose those that use the full spectrum of a score. If they rate from 1–10, check whether they have reviews at the lower end of the scale. If a website gives everything a good score, it won’t be a reliable source.
Alternatively, scrap review scores altogether and make your own mind up after you have read the content. With the Wii U “reviews” above, it’s hard to get any real insight into the console, and I wouldn’t base a purchasing decision on them. With a real review, it should be clear from the text that the writer has actually given the console a thorough going-over. If a review has little depth, it’s unlikely to be a good guide.
Reviews as a medium are beginning to appear dated, and they hold far less authority than they did in the past. By reading multiple reviews of a product, rather than relying on Google to surface the best one, you can piece together a far more accurate picture of whether a product is right for you. Trust the power of public opinion and your own views, not the top Google result.
Blank gravestone via BigStock
Hi Mike, in my humble opinion I’m not sure I’d agree that reviews are outdated. Perhaps they are seen this way within the SEO industry, but as someone with a small business (tinkering with some SEO in-house) I think reviews are super important – as important to us today as they were years ago.
Having said that, there is certainly a need to make it SUPER clear to the user that the reviews are not biased or fake in any way. We use an external review site, and make that clear, for these very reasons – and yet we still get occasional comments/FB messages from people saying they’re sceptical.
I’ve now written this post https://www.optibacprobiotics.co.uk/resource-centre/faq/category-optibac-probiotic-faqs/285-how-do-we-collect-our-product-reviews.html – and have high hopes that it will solve the problem! Will keep you posted.
This is a really well-written article. It’s really frustrating, and I’d never thought about it! I do love Rotten Tomatoes, but hadn’t heard of Kotaku or GameRankings.com. I am sad to hear that you don’t like Amazon reviews, because that’s actually a major reason I use Amazon to shop – I love the reviews.
The vast majority of what you said seemed pretty obvious to me years ago (e.g., dates of reviews, single reviews versus aggregates, fake reviews).
One thing that actually caught me off-guard was Yelp censoring real reviews. I had read Yelp reviews for several months before I tried to post my first review. I submitted my review with no apparent problems. Months later I was talking to a friend about my review and was told it wasn’t visible. So I submitted a new one; after several days it still wasn’t visible.
When I contacted Yelp to find out what was going on I was told something along the lines of, “if the first review is bad we assume it’s spam, so we don’t post it”. ??? Seriously. In both reviews I had gone into extensive detail about why I gave the rating I did, so if a reader had any doubt about my scoring they could have read why I rated it as I did.
Obviously spam reviews are problematic, but there are much better ways to handle spam. Frankly, I’d rather read through a handful of spam to get to one real review than to have no reviews at all – ESPECIALLY if that one real review details why I should NOT use a given service or product.
Needless to say, since Yelp censors genuine reviews – with no indication that they do so – I don’t use Yelp anymore. This experience highlighted the fact that even mismanaged aggregation review sites damage the potential value of online reviews.
Any review system that becomes trusted will eventually be targeted and perverted
Hmmm, I don’t think SEO is really to blame here. It makes for a catchy headline, but once these products are actually released, the bigger, established news organizations are rewarded with the traffic. A few years ago I would have agreed with you that SEO was too powerful in this space, but Google’s updates over the last 18 months have really curbed its power here.
I’m with you Scott. The changes have really had an impact here.
I’d really like to see Apple fix this in the App Store. The first day a new game comes out, it seems to be flooded with questionable reviews. You have to dig down pretty deep to find people who really used the product and gave honest feedback.
Another problem with online reviews is that they don’t DIE – I was looking up a specific feature on a Kindle recently, and almost every review I found was a few years old & complaining about a bug that’s long since been fixed in the firmware.
Considering how fast things can change in hi-tech products, having old reviews cluttering search pages is really unhelpful – imagine somebody basing their expectations of an iPod on a review written five years ago!
I can’t recall the name of the site, but one of them shows ratings plotted against a trendline, so that you can see if a company had negative reviews for five years but has been getting positive ones for the past two, for instance. This avoids a situation where old, low ratings bring down the average score for someone whose recent ratings are much better.
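The trendline idea in this comment can be sketched as a recency-weighted average, where older reviews count for progressively less. This is a minimal illustration only, assuming reviews arrive as (age-in-days, score) pairs; the function name, half-life value and sample data are hypothetical, not taken from any site mentioned here:

```python
def recency_weighted_average(reviews, half_life_days=365.0):
    """Average review scores, exponentially down-weighting older reviews.

    reviews: list of (age_in_days, score) pairs.
    half_life_days: a review this old counts half as much as a brand-new one.
    """
    if not reviews:
        raise ValueError("no reviews to average")
    weighted_sum = total_weight = 0.0
    for age_days, score in reviews:
        weight = 0.5 ** (age_days / half_life_days)  # exponential decay with age
        weighted_sum += weight * score
        total_weight += weight
    return weighted_sum / total_weight

# A product that reviewed poorly five years ago but well recently:
reviews = [(1800, 1), (1700, 1), (90, 5), (30, 5)]
plain_mean = sum(score for _, score in reviews) / len(reviews)  # 3.0
print(plain_mean, round(recency_weighted_average(reviews), 2))
```

With a one-year half-life the old one-star scores barely register, so the weighted mean sits close to 5 while the plain mean is dragged down to 3 – the half-life is a tuning knob, and a shorter one makes the score react faster to recent change.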
An interesting piece Mike. I’m head of search, social and affiliates at Future Publishing, the publisher of both T3.com and Official Nintendo Magazine, and a partner of Reevoo…
For ‘big’ products like the Wii U where there’s bound to be a lot of searches for them, we do write advance pieces on them like the examples above. The decision to call them “hands-on review” is driven purely from a search perspective – people don’t search for “Wii U preview” (see: https://www.google.com/insights/search/#q=Wii%20U%20review%2CWii%20U%20preview&date=today%2012-m&cmpt=q) and frankly, what’s the point in sending someone to LA for E3 and getting them to write stuff that doesn’t generate any traffic? If articles don’t wash their faces, what’s the point in writing them?
We don’t claim that these are definitive review pieces, and we will either update the article with a complete review when we’ve had time to spend exhaustively going over the product, or we create a complete new article in the /reviews/ section of the relevant site.
We’ve tried some new approaches recently too, such as our ‘as it happens’ review of Diablo 3: https://www.pcgamer.com/2012/05/15/diablo-3-review-as-it-happens/ which was a really interesting experiment to try, with some great feedback from the readers too, and gave people much more insight into how the final review score came about.
Embargoes, by the way, don’t work – there’s always someone who will break them (we don’t), and film studios tend to be hilarious in their lack of understanding about how the internet works. For example: “You must take down this image immediately as it’s a North American exclusive image.” As if no-one in the UK looks at US film websites…
I do though tend to agree that it’s a bit sucky, but you have to go where the traffic is and what people are searching for. Frankly, it all comes down to the economics of running websites which rely on traffic to drive ad impressions, and it means you’ve got to be damn good at SEO just to break even. At some point in the future this way of working may well stop being effective, and at that point we’ll adapt and change to fit the new paradigm, the way we have with any other algorithm change.
I could go on for ages – there are some really interesting QDF factors that go into reviews for example, exact match domains are still ranking really well for reviews even though they’re full of crap, and aggregators killing it in reviews despite creating no original content themselves, but those are a whole different post…
@James That approach seems like a good compromise.
I really like the ‘as it happens’ review. It’s a nice refresh of the standard format and, if people do just want the standard review, the page provides a very obvious link to take them to it.
I should point out it’s not the position of Future Publishing per se, more the position of most publishing companies – you have to get the traffic to your site for it to make economic sense.
It’s the eternal problem of what works in Google though, isn’t it? A title like “[product name]: hands on review” works, but not quite as well as “[product name] review: hands on” in certain circumstances. We’re trying to differentiate quite clearly between the two with the name, though.
We’ve had some interesting times with products being announced and magazine deadlines – do we put a short, incomplete review in the magazine with a full, in-depth review to follow, or do we ignore the product until we’ve had a chance to really look into it six weeks later? In six weeks’ time the main hype over a new product may be over and people are more interested in what they can do with their new shiny toy.
Semantic search certainly leads to some new possibilities and some interesting changes in the way that search could work for ourselves and other large publishers, however giving too much data to the mighty Google means they may just decide to aggregate all the information and kill our traffic in the same way as they have with credit cards…
I’m glad you like the Diablo 3 experiment Mike and Kat – it worked well for search too, the PC Gamer servers struggled nearly as much as Blizzard’s!
You’re right; and often the employee in question is just trying to help, even though they’re going about it completely wrong.
Mixing up service and product feedback is a problem – and not just for the people reading the reviews (or just the average review score). When businesses look at their review data to pull out insights on what they’re doing right or wrong, and which products are making customers happiest, it becomes incredibly difficult if they have to manually pull apart comments about service and product. It’s easy enough to tweak the way you ask for reviews to get the service and product comments in different sections/answers but, unfortunately, not enough businesses do this.
It’s refreshing to hear such a balanced view of reviews! And that you’re all for real customer reviews, too. I’d be interested to hear how you police your reviews to track real vs. fake reviews. Although, I’m not sure if you’d be able to divulge that sort of information without risk of spammers using the information to their advantage!
Mike’s method is the one I adopt, too. It’s just very time consuming. It’s also important to factor in people’s own benchmarks. I recently searched for hotel reviews and it was clear that different people have high/low standards so you need to try to work out what’s the median or which people more closely fit your way of thinking.
Unfortunately you’re right and I can’t give away all our secret sauce. But our starting point is, like I said, only asking real customers for reviews, and we’ve never been shy about talking about this.
Essentially, this works by us linking into businesses’ post-purchase/booking contact with customers. After the customer has had enough time to build up some experience of their purchase, they’re automatically sent an email inviting them to review. That way, we keep it a closed system and we’re able to solicit reviews from all customers.
Completely agree about the different reviewer perspectives/standards. When the review system lets you filter by categories (e.g. family travel/business travel) that’s somewhat helpful, but you do end up weighing up the review with what you can learn about the reviewer, even after this filtering.
This is where the word “preview” needs to be used. You shouldn’t ever review something before its final release.
One genre that gets this right (in my opinion) is games (not consoles). A site like IGN splits coverage into Preview (pre-release) and Review (post-release) and scores them separately. The preview always addresses the potential for change.
Agree with you, Mike! Online reviews have been completely spammed in many cases. I don’t trust them anymore, and I’m not sure I ever will without checking multiple sources to back up the information provided.
There are many other reasons why online reviews have been destroyed. I was browsing new TVs on the John Lew/s website and was shocked to see a reputable brand had allowed an employee / outsourcer to add hundreds of fake reviews to their own website. At first I thought this was for SEO. However, a trip to one of their stores made me realise their true motivation.
I was advised by a shop assistant to check out the online reviews for a particular TV on the John Lew/s website and that they conveniently had a computer in store for people to do just that. And only that. No other websites/review sites could be visited. So they’re doing it for both SEO and to increase sales in-store too.
Naturally, I told the sales rep all this. He was in shock. But I would willingly bet his total commissions for the year that he wouldn’t say a word about it…
That really is surprising – John Lewis has such a good reputation, you think they’d be more careful with it.
From my experience working with review solutions (I work for Reevoo so clearly I have a horse in this race) it’s never a good idea to use the kind of open system that JL use. If you’re not getting spam reviews, or fake reviews from employees or competitors, you’re getting people just generally musing on their prejudices for/against a particular brand or kind of product. I remember when the Kindle first came out and you’d see reviews saying “I’d never buy an ereader”. Not very helpful!
It’s far better to use a system where you only ask actual customers for reviews. As well as more trustworthy reviews, you also tend to get more reviews full-stop – and you’re not only hearing from the people who are furious or delighted. You get the full range of opinion, so the reviews are more useful.
Personally, I also use Mike’s approach and read a range of real-people reviews as well as expert reviews. You get very different perspectives from each, but both have their uses. It’s a shame that some sites are ‘poisoning the well’ for expert reviews in their rush for SEO ‘firsties’. Given how much effort Google is putting into adjusting their algorithms to avoid people gaming them, it’s possible that this will start to backfire, then die out. Here’s hoping.