
How Amazon’s Algorithms Curated a Dystopian Bookstore

Reported by WIRED:

Among the best-selling books in Amazon’s Epidemiology category are several anti-vaccine tomes. One has a confident-looking doctor on the cover, but the author doesn’t have an MD—a quick Google search reveals that he’s a medical journalist with the “ThinkTwice Global Vaccine Institute.” Scrolling through a simple keyword search for “vaccine” in Amazon’s top-level Books section reveals anti-vax literature prominently marked as “#1 Best Seller” in categories ranging from Emergency Pediatrics to History of Medicine to Chemistry. The first pro-vaccine book appears 12th in the list. Bluntly named “Vaccines Did Not Cause Rachel’s Autism,” it’s the only pro-vaccine book on the first page of search results. Its author, the pediatrician Peter Hotez, a professor in the Departments of Pediatrics and Molecular Virology & Microbiology at the Baylor College of Medicine, has tweeted numerous times about the amount of abuse and Amazon review brigading that he’s had to fight since it was released.

Over in Amazon’s Oncology category, a book with a Best Seller label suggests juice as an alternative to chemotherapy. For the term “cancer” overall, coordinated review brigading appears to have ensured that “The Truth About Cancer,” a hodgepodge of claims about, among other things, government conspiracies, enjoys 1,684 reviews and front-page placement. A whopping 96 percent of the reviews are 5 stars—a measure that many Amazon customers use as a proxy for quality. However, a glance at ReviewMeta, a site that aims to help customers assess whether reviews are legitimate, suggests that over 1,000 may be suspicious in terms of time frame, language, and reviewer behavior.
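To make the “time frame” signal concrete, here is a minimal Python sketch of the kind of burst-detection heuristic a review-analysis site might run: it flags a listing when an improbable number of 5-star reviews arrives within a short sliding window. This illustrates the general idea only; it is not ReviewMeta’s actual method, and the window, threshold, and timestamps are invented.

from datetime import datetime, timedelta

# Illustrative heuristic only, not ReviewMeta's actual method: flag a
# listing when an unusually dense burst of 5-star reviews lands inside
# a short sliding time window.

def flag_review_bursts(reviews, window_days=3, threshold=50):
    """reviews: iterable of (timestamp, star_rating) pairs, in any order.
    Returns True if more than `threshold` 5-star reviews fall inside
    any sliding window of `window_days`."""
    five_star = sorted(ts for ts, stars in reviews if stars == 5)
    window = timedelta(days=window_days)
    start = 0
    for end, ts in enumerate(five_star):
        while ts - five_star[start] > window:
            start += 1
        if end - start + 1 > threshold:
            return True
    return False

# Synthetic example: 120 five-star reviews arriving an hour apart.
reviews = [(datetime(2019, 3, 1) + timedelta(hours=i), 5) for i in range(120)]
print(flag_review_bursts(reviews))  # True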

Once relegated to tabloids and web forums, health misinformation and conspiracies have found a new megaphone in the curation engines that power massive platforms like Amazon, Facebook, and Google. Search, trending, and recommendation algorithms can be gamed to make fringe ideas appear mainstream. This is compounded by an asymmetry of passion: truther communities produce prolific amounts of content, giving the algorithms far more of it to serve up … and, it seems, producing real-world consequences.

A recent resurgence of measles outbreaks has the World Health Organization, the Centers for Disease Control and Prevention, and the US Congress questioning the impact of anti-vaccine misinformation on public health. Investigative journalists and academics have been examining one facet of this problem: what curation algorithms are doing to tamp down—or spread—health misinformation online. What they’re finding isn’t encouraging.

Over the past decade or so, we’ve become increasingly reliant on algorithmic curation. In an era of content glut, search results and ranked feeds shape everything from the articles we read and products we buy to the doctors or restaurants we choose. Recommendation engines influence new interests and social group formation. Trending algorithms show us what other people are paying attention to; they have the power to drive social conversations and, occasionally, social movements.

Curation algorithms are largely amoral. They’re engineered to show us things we are statistically likely to want to see, content that people similar to us have found engaging—even if it’s stuff that’s factually unreliable or potentially harmful. On social networks, these algorithms are optimized primarily to drive engagement. On Amazon, they’re intended to drive purchases. Amazon has several varieties of recommendation engine on each product page: “Customers also shopped for” suggestions are distinct from “customers who bought this item also bought.” There are “sponsored” products, which are essentially ads. And there’s “frequently bought together,” a feature that links products across categories (often very useful, occasionally somewhat disturbing). If you manage to leave the platform without purchasing anything, an email may follow a day later suggesting even more products.
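To see why such lists are so sensitive to raw popularity, consider a toy “customers who bought this item also bought” recommender built from nothing more than co-purchase counts. This is an illustrative sketch, not Amazon’s system, and the item names are made up.

from collections import Counter, defaultdict

# Illustrative sketch only: a toy "customers who bought this also bought"
# recommender built from co-purchase counts. Production systems are far
# more sophisticated; this just shows why popularity dominates such lists.

def build_copurchase_index(orders):
    """orders: list of sets of item IDs bought together by one customer."""
    copurchases = defaultdict(Counter)
    for basket in orders:
        for item in basket:
            for other in basket:
                if other != item:
                    copurchases[item][other] += 1
    return copurchases

def also_bought(copurchases, item, k=5):
    """Return the k items most often bought alongside `item`."""
    return [other for other, _ in copurchases[item].most_common(k)]

# Example with invented titles: whatever sells together most often
# rises to the top of the recommendation list.
orders = [
    {"vaccine_book_A", "vaccine_book_B"},
    {"vaccine_book_A", "vaccine_book_B", "juice_cleanse_guide"},
    {"vaccine_book_A", "pediatrics_textbook"},
]
index = build_copurchase_index(orders)
print(also_bought(index, "vaccine_book_A"))
# ['vaccine_book_B', 'juice_cleanse_guide', 'pediatrics_textbook']

Nothing in that signal measures whether the content is accurate; it only measures what sells together.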

Amazon shapes many of our consumption habits. It influences what millions of people buy, watch, read, and listen to each day. It’s the internet’s de facto product search engine—and because of the hundreds of millions of dollars that flow through the site daily, the incentive to game that search engine is high. Making it to the first page of results for a given product can be incredibly lucrative.

Unfortunately, many curation algorithms can be gamed in predictable ways, particularly when popularity is a key input. On Amazon, this often takes the form of dubious accounts coordinating to leave convincing positive (or negative) reviews. Sometimes sellers outright buy or otherwise incentivize review fraud; that’s a violation of Amazon’s terms of service, but enforcement is lax. Sometimes, as with the anti-vax movement and some alternative-health communities, large groups of true believers coordinate to catapult their preferred content into the first page of search results.

Amazon reviews appear to figure prominently in the company’s ranking algorithms. (The company will not confirm this.) Customers consider the number of stars and volume of reviews when deciding which products to buy; they’re seen as a proxy for quality. High ratings can lead to inadvertent free promotion: Amazon’s Prime Streaming video platform launched with a splash page that prominently featured Vaxxed, Andrew Wakefield’s movie devoted to the conspiracy theory that vaccines cause autism.
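A generic popularity ranker makes the incentive clear. The Python sketch below is not Amazon’s formula, which the company won’t disclose; it simply scores a book by its average star rating weighted by review volume, and shows how a thousand coordinated 5-star reviews can vault a middling title past an organically well-reviewed one.

import math

# Illustrative sketch only: a generic popularity ranker, not Amazon's
# (the company does not disclose how reviews feed its rankings).

def popularity_score(ratings):
    """Average star rating weighted by log review volume."""
    if not ratings:
        return 0.0
    return (sum(ratings) / len(ratings)) * math.log1p(len(ratings))

organic_book = [5, 4, 5, 3, 4] * 60      # 300 genuine, mixed reviews
fringe_book = [4, 2, 3] * 20             # 60 genuine, middling reviews
brigaded = fringe_book + [5] * 1000      # plus 1,000 coordinated 5-stars

print(round(popularity_score(organic_book), 2))  # ~23.97
print(round(popularity_score(fringe_book), 2))   # ~12.33
print(round(popularity_score(brigaded), 2))      # ~34.05; now outranks both

Downweighting reviews flagged as suspicious would blunt exactly this effect.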

Perhaps compounding the problem, Amazon allows content creators to select their own categories and keywords. While writing this article, I experimented with the listing tool for Kindle books; the keywords and categories are entirely self-selected.


With a product base as large as Amazon’s, it’s probably a challenge for the company to undertake any kind of review process. This is likely why quackery shows up classified as “Oncology” or “Chemistry.” It’s a small reminder that Amazon isn’t exactly a bookstore or library.

There’s no silver bullet for solving health misinformation online. Guardian journalist Julia Carrie Wong recently summarized the anti-vax social media situation in a tweet: “We have a data void, an enthusiasm gap, bad recommendation algorithms, targeted advertising, coordinated harassment, and echo chambers. It’s a giant messy stew of every problem with social media, with very real consequences.”

It’s a complex and thorny problem, and Amazon is suffering from it as well. There are concerns that tackling this problem could lead to censorship. But there’s a big gulf between outright removing content, or refusing to sell books, and rethinking amplification and categorization. Amazon can start by doing better when it comes to recommending and categorizing pseudoscience that may have a significant impact on a person’s life (or on public health). In recent months, YouTube and Facebook have begun to shift their policies to address health misinformation and conspiratorial communities: YouTube has both demonetized and downranked anti-vax content, and Facebook has made a statement implying that it’s likely to follow suit. Google, to its credit, has long had a policy for Search called “Your Money or Your Life,” which recognizes that when people search for information about highly impactful topics, it has a responsibility to hold those results to a higher standard of care. Users looking to buy health books on Amazon should be afforded the same standard.

No major platform is immune to problems with gameable algorithms. But Amazon in particular—with its massive audience and extraordinary revenue—is remarkable for how little it has changed despite numerous investigations of quackery and review manipulation over the years. It simply ignores the problem and waits for the next press cycle.

Amazon has recently taken incremental steps toward curbing health misinformation, but largely only under significant pressure. The company responded to a letter from US Representative Adam Schiff (D–California) by pulling a few anti-vaccine documentaries from Prime Streaming. That’s a good start, but it’s not enough. The real-world impact of health misinformation makes the stakes too high to ignore. Amazon needs to recognize that its ranking and recommendation engines have far-reaching influence—and that a misinformation pandemic can induce a different kind of virality.


Source: WIRED