Why lying on the internet keeps working

About a month ago, I wrote about a viral herbal medicine book, The Lost Book of Herbal Medicines, which had at the time sold 60,000 copies on TikTok Shop despite appearing to violate some of the app’s policies on health misinformation. The book’s sales were boosted by popular videos from health influencers on the app, some with millions of views, who falsely claimed the once-obscure 2019 book contained natural cures for cancer and other ailments.

Influencers, along with TikTok itself, made money from selling this scam book. I brought all of this to TikTok’s attention, and the videos I flagged to a company spokesperson were removed after a review for violating TikTok’s policies prohibiting health misinformation.

The book itself, however, remained for sale in the Shop, and influencers kept promoting it. I haven’t stopped watching the TikTok Shop promotions for this book, The Lost Book of Herbal Medicines, since then.

“This right here is why they’re trying to ban this book,” a TikTok Shop seller said in a video as he showed off the book’s list of herbal cancer treatments. Later, he urged viewers to click the link in the Shop listing and buy now, because “it probably won’t be around forever because of what’s inside.”

The video received more than 2 million views in two days. Click through the link as instructed and you’ll see that sales of the book have doubled since my article came out: The Lost Book of Herbal Medicines has now sold more than 125,000 copies through TikTok Shop alone. The book’s popularity doesn’t stop there, either: as of June 5, it’s the No. 6 bestseller on Amazon and has been on the Amazon bestseller list for seven weeks running.

The “invisible rulers” of online attention

I was thinking about my experience digging into The Lost Book of Herbal Medicines while reading the new book Invisible Rulers, by Stanford Internet Observatory researcher Renee DiResta. The book examines and contextualizes how misinformation and “customized realities” became so powerful and prominent on the internet. She shows how the “collision of the rumor mill and the propaganda machine” on social media helped form a trinity of influencers, algorithms, and crowds that work symbiotically to catapult pseudo-events, Twitter personalities, and conspiracy theories into prominence, capturing attention while undermining consensus and trust.

DiResta’s book is part history, part analysis, and part memoir, ranging from pre-internet examinations of the psychology of gossip and propaganda to the biggest moments of online conspiracy and harassment of the social media age. Ultimately, DiResta applies what she’s learned in a decade of closely researching online misinformation, manipulation, and abuse to her personal experience of being the target of a series of baseless allegations that, despite their lack of evidence, prompted Rep. Jim Jordan, as chairman of the House Select Subcommittee on the Weaponization of the Federal Government, to launch an investigation.

There’s a really understandable instinct that I think a lot of people have when they read about misinformation or disinformation online: They want to know why it’s happening and who’s to blame, and they want that answer to be easy. Hence the meme-ready arguments about “Russian bots” causing Trump to win the presidential election in 2016. Or the push to deplatform a person who went viral for saying something wrong and harmful. Or the belief that we can content-moderate our way out of online harm altogether.

DiResta’s book explains why these approaches will always fail. Blaming the “algorithm” for a dangerous viral trend may feel satisfying, but algorithms have never worked without human choices. As DiResta writes, “virality is a collective behavior.” Algorithms can boost, nudge, and meddle, but they need user input to do so effectively.

Parables, panic, and prevention

Writing about individual viral rumors, conspiracy theories, and products can sometimes feel like telling parables: The Lost Book of Herbal Medicines becomes a lesson in how just about anything can become a TikTok Shop bestseller, as long as the influencers pushing the product are good enough at it.

Most of these parables in the disinformation space don’t have neat or happy endings. Disinformation journalist Ali Breland, in a recent article for Mother Jones, wrote about how QAnon became “everything.” To do this, Breland begins with Wayfair, the budget furniture retailer that became the center of a moral panic about pedophilia.

This moment in the history of online panic, which also features heavily in DiResta’s book, occurred in the summer of 2020, after many QAnon influencers and hubs of activity were banned from mainstream social media. (Incidentally, I interviewed DiResta at the time for a piece questioning whether that move came too late to have any significant effect on QAnon’s influence.)

Here’s what happened: Someone online noticed that Wayfair was selling expensive cabinets with feminine names. The person drew a few mental dots and connected them: surely, these listings must be coded evidence of a child trafficking ring. The idea caught fire in QAnon spaces and quickly spread beyond those enclaves of paranoia. The baseless idea flooded a real hashtag used to raise awareness of actual human trafficking, which interfered with real investigations.

Breland, in his Mother Jones piece, shows how the central tenets of the QAnon conspiracy theory extended beyond its adherents and stayed there. Now, “[W]e are in an age of obsessive, weird, and pervasive fear of pedophilia — an age where the paranoid thinking of QAnon is no longer associated with the political fringes of middle-aged posters and boomers eventually lost in cyberspace,” he wrote.

Wayfair’s moral panic didn’t trend simply because of bad algorithms; it was proof that the attention QAnon had grabbed earlier had worked. Platforms could ban its hashtags and influencers, but the crowd remained, and we were all, to some extent, in it.

The Lost Book of Herbal Medicines became a bestseller by flowing through some well-worn grooves. The influencers promoting it knew what they could and couldn’t say from a moderation perspective, and when rule breakers were removed, new influencers rose up to earn those commissions. My article and my efforts to bring the trend to TikTok’s attention did nothing to slow demand for this inaccurate book. So what would work?

DiResta’s ideas on this echo conversations that have been happening for some time among disinformation experts. There are some things that platforms absolutely need to do from a moderation perspective, like removing automated trending topics, introducing friction to engagement with some online content, and generally giving users more control over what they see in their feeds, both individually and through their communities. DiResta also notes the importance of education and prevention: a more proactive approach to addressing disinformation that focuses on teaching people to recognize online manipulation tactics and tropes. And transparency.

Would people be more likely to believe there isn’t a broad conspiracy to censor conservatives on social media if there were a public database of the platforms’ moderation actions? Would people be less enthusiastic about buying a book with dubious natural cures if they knew more about the commissions earned by the influencers promoting it? I don’t know. Maybe!

However, I do know this: After a decade of covering online culture and information manipulation, I don’t think I’ve ever seen things as bad as they are now. It’s worth at least trying something.
