The Shady Merchants Who Have Been Gaming Amazon’s Review System for Years




This article is from Full Stack Economics, a newsletter about the economy, technology, and public policy. You can click here to subscribe to the newsletter.


My wife wants a can opener for Christmas, so I went on Amazon to find one. I typed “can opener” in the search box and sorted the list by the average customer review. My plan was to get the old-fashioned hand-crank kind. But the first page of results had a bunch of electric can openers with excellent reviews and reasonable prices. I briefly considered getting one. Then I started reading the reviews.





“The garlic was all rotten and moldy,” reads the top review for this can opener. “I threw more out than I used, unfortunately.”





Screenshot via Amazon

“Whole Foods is the best value for buying organic garlic,” another review claimed.


I checked that I hadn’t accidentally clicked on the wrong product page. I really was on the page for an electric can opener. But most of the reviews were for garlic.


The story seemed to be the same for other highly rated electric can openers. Here’s what the first page of results looked like on Wednesday morning:





Timothy B. Lee/Screenshot via Amazon






When I clicked through to individual product pages, I found that most of those hundreds of positive reviews were for products other than can openers:





• “I’ve been looking for a sturdy measuring cup of this size for a long time.”


• “The basket is great and the chocolates were good quality and plentiful.”


• “The tins arrived on time and in good condition.”


• “I was thrilled with this vase.”


• “For the price, I wasn’t expecting much but they are great napkins.”


Some pages had a few recent reviews about can openers, alongside many more about other products. In other cases, I didn’t see a single can opener review.


Apparently, shady merchants gain control of Amazon pages for highly rated items and then swap out the product descriptions. I assume their goal is to game Amazon’s search engine—and to trick customers who rely on a product’s star rating without actually reading the reviews. I can’t say I was shocked by this situation because I’ve encountered it before.


Last year, I was looking for a toy electric drone for my kids. Amazon’s top search result was a $23 drone with 6,400 reviews and an impressive five-star average rating.


Then I read the reviews. “Absolutely love this honey,” one reviewer wrote. “It’s quite different from any supermarket-purchased honey I’ve tried.”


The story was the same for other cheap, highly rated drones. Most of the five-star reviews were obviously for other products, including a box of Christmas cards, a bracelet, and a bottle of vodka.


I wasn’t the first reporter to notice this problem. BuzzFeed’s Nicole Nguyen wrote about the same issue three years ago.


“This iPhone X battery case listing used to be for a leather wallet phone case,” Nguyen wrote. “This iPhone battery case page was formerly a listing for Lightning charging cables. This Wi-Fi router was previously listed as nano computers and has been collecting reviews since 2003. This neck brace was formerly a shower caddy listing. What was formerly a listing for a guitar-string action gauge is now a page for magnetic, glue-free eyelashes.”





A year after I wrote my first piece, and three years after Nguyen wrote hers, Amazon still doesn’t seem to be taking the problem seriously. Amazon didn’t respond to my email seeking comment for this story.


It’s not clear who is doing this. The Amazon pages for these can openers—like the drones I looked at last year—have throwaway “brands” like Ankuwa, Luckkya, LooQoo, and W-Dragon. They’re sold by third-party merchants that are mostly based in China. These merchants have names that are even more inscrutable than the brand names, like ZZSENM, BGroams, VVEgjkpps, and Millkkoo.





When I asked Amazon about the bait-and-switch review problem last year, a spokesperson told me that “we have clear guidelines about when products should be grouped together and we have guardrails in place to prevent products from being incorrectly grouped, either due to human error or abuse.” But those guardrails still don’t seem to be working: abuse continues to run rampant, at least among can openers.


Internet companies and their defenders sometimes insist that this kind of problem is impossible to solve—that the scale of the web makes a certain amount of bad behavior impossible to stop. This argument is somewhat plausible for a company like Google or Facebook that has to sift through billions of pieces of user-created content.




But Amazon isn’t just a tech platform passively hosting and indexing user-created content. It’s a middleman that collects fees for everything customers buy on Amazon. In many cases, Amazon also warehouses and ships the products: The screenshot above, for example, is for Prime-eligible items that mostly ship from Amazon warehouses.


Amazon earns far more revenue from the sale of a can opener than Twitter earns from hosting a tweet—enough revenue that it should be able to easily hire more people to police product pages for obvious fraud.


Indeed, a lot of the fraud here is blatant enough that it should be possible to detect it automatically. Machine learning algorithms are getting sophisticated enough to figure out whether a batch of 100 reviews is mostly talking about can openers or garlic.


A company of Amazon’s size and sophistication should be able to build software that regularly scans product listings and flags the ones whose reviews don’t seem to match the product. The fact that this remains a widespread problem suggests Amazon doesn’t consider solving it a priority.
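To make that concrete, here is a minimal sketch of the kind of check I have in mind, written in Python with scikit-learn. It scores each review against the listing text using TF-IDF cosine similarity and flags the listing when most reviews look unrelated. The sample reviews, the 0.05 threshold, and the function name are illustrative assumptions on my part, not anything Amazon actually runs; a production system would need to be far more robust.

    # A rough mismatch check: compare reviews to the listing text.
    # Assumes scikit-learn is installed; the 0.05 threshold is illustrative.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    def flag_mismatched_listing(listing_text, reviews, threshold=0.05):
        """Return True if most reviews look unrelated to the listing text."""
        vectorizer = TfidfVectorizer(stop_words="english")
        # Fit on the listing plus all reviews so they share one vocabulary.
        matrix = vectorizer.fit_transform([listing_text] + reviews)
        listing_vec, review_vecs = matrix[0], matrix[1:]
        similarities = cosine_similarity(listing_vec, review_vecs)[0]
        unrelated = (similarities < threshold).sum()
        return unrelated > len(reviews) / 2

    reviews = [
        "The garlic was all rotten and moldy.",
        "Whole Foods is the best value for buying organic garlic.",
        "I was thrilled with this vase.",
    ]
    print(flag_mismatched_listing("Electric can opener with smooth edge cutting", reviews))
    # Prints True: most of these reviews have nothing to do with a can opener.

Even something this crude would flag a can opener page buried in garlic reviews; Amazon’s own systems could obviously do much better.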

Source: Slate Magazine
