The Present Age

New study highlights the virality of hate

And why "dunking" on our political opponents only fuels the problem.

Parker Molloy
Jun 29, 2021
(illustration via Getty Images)

A new research article published in the Proceedings of the National Academy of Sciences (PNAS) doesn’t bode well for efforts to fight political extremism and polarization. The paper’s authors analyzed 2,730,215 Twitter and Facebook posts from news media accounts and members of the U.S. Congress, and concluded that the quickest way to social media success is to attack members of the “out-group.”

Steve Rathje, one of the paper’s authors, has a really great Twitter thread about his work.

He puts it better than I can for obvious reasons:

Steve Rathje (@steverathje2), in a thread posted Jun 23, 2021:

🚨 Now out in @PNASNews 🚨 Analyzing social media posts from news accounts and politicians (n = 2,730,215), we found that the biggest predictor of "virality" (out of all predictors we measured) was whether a social media post was about one's outgroup. pnas.org/content/118/26…

Specifically, each additional word about the opposing party (e.g., “Democrat,” “Leftist,” or “Biden” if the post was coming from a Republican) in a social media post increased the odds of that post being shared by 67%.

Negative and moral-emotional words also slightly increased the odds of a post being shared, positive words slightly decreased the odds, and in-group words had no effect. Out-group words were by far the strongest predictor of virality that we measured.

Posts about the outgroup were almost exclusively negative (see examples below).

Out-group posts were very likely to receive “angry” reactions on Facebook, as well as “haha” reactions (likely indicating mockery), comments, and shares.

Posts about the ingroup received much less overall engagement, although they were slightly more likely to receive “love” and “like” reactions, reflecting in-group favoritism.

In other words, out-group negativity was a stronger driver of virality than in-group positivity. Indeed, the “angry” reaction was the most commonly used reaction out of all six of Facebook’s reactions in our datasets.

This out-group effect was not moderated by political orientation or by social media platform. However, stronger effects were found among politicians than in the media.

These results are troubling in an attention economy where the social media business model is based on keeping us engaged in order to sell advertising. This business model may be creating perverse incentives for polarizing content, rewarding people for "dunking" on the outgroup.

This is all a choice. Facebook and Twitter long ago shifted from chronological to algorithmic timelines.

Facebook knows that its algorithm rewards extreme rhetoric and anger — it just doesn’t care. Last year, The Wall Street Journal uncovered an internal Facebook report from 2018 which found that the company’s “algorithms exploit the human brain’s attraction to divisiveness.” Worse, the report’s authors warned that, left in place, the algorithms would continue to serve up “more and more divisive content in an effort to gain user attention and increase time on the platform.”

It’s easy to blame this on “the human brain’s attraction to divisiveness,” but without the profit-and-ideology-driven infrastructure put in place by social media giants, this problem would not exist — at least in any comparable way.
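The dynamic the report describes is easy to reproduce in miniature. Here’s a toy simulation, nothing resembling Facebook’s actual systems: a minimal engagement-maximizing recommender with made-up engagement rates, where (consistent with the PNAS findings) divisive posts engage more. Left to optimize, it fills the feed with divisiveness.

```python
import random

# Toy epsilon-greedy recommender. All numbers are invented
# for illustration; this is a sketch of the incentive, not a
# model of any real platform.
CONTENT = {"divisive": 0.9, "neutral": 0.5}  # hypothetical engagement rates
shown = {kind: 1 for kind in CONTENT}
engaged = {kind: 1 for kind in CONTENT}

for _ in range(10_000):
    if random.random() < 0.1:  # occasionally explore
        kind = random.choice(list(CONTENT))
    else:                      # otherwise exploit the best performer so far
        kind = max(CONTENT, key=lambda k: engaged[k] / shown[k])
    shown[kind] += 1
    if random.random() < CONTENT[kind]:  # did the user engage?
        engaged[kind] += 1

share = shown["divisive"] / sum(shown.values())
print(f"Share of feed that ended up divisive: {share:.0%}")  # ≈ 95% in a typical run
```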

Facebook CEO Mark Zuckerberg (L) and Facebook Vice President of Global Public Policy Joel Kaplan (R) leave a 2019 meeting with Sen. John Cornyn (R-TX) in Washington, DC. (Photo by Samuel Corum/Getty Images)

Last year, The New York Times reported on a change to Facebook’s news feed that benefited publishers with a high “news ecosystem quality” score, an internal ranking metric. The change grew out of an effort, begun months earlier, to protect against the possibility that then-President Donald Trump would use his social media accounts to contest the results of the election if he lost.

From the Times:

The change was part of the “break glass” plans Facebook had spent months developing for the aftermath of a contested election. It resulted in a spike in visibility for big, mainstream publishers like CNN, The New York Times and NPR, while posts from highly engaged hyperpartisan pages, such as Breitbart and Occupy Democrats, became less visible, the employees said.

It was a vision of what a calmer, less divisive Facebook might look like. Some employees argued the change should become permanent, even if it was unclear how that might affect the amount of time people spent on Facebook. In an employee meeting the week after the election, workers asked whether the “nicer news feed” could stay, said two people who attended.

After the 2016 election, Facebook developed “Project P,” an anti-propaganda initiative that would have cracked down on political misinformation from obscure websites promoting hoax stories (i.e., actual fake news). Facebook Vice President of Global Public Policy Joel Kaplan — who would later make headlines for attending Justice Brett Kavanaugh’s confirmation hearing and throwing a party in Kavanaugh’s honor — reportedly quashed the plan because it would disproportionately affect conservative pages, which were more likely to share the false information.

Facebook has also surveyed users about whether specific posts were “bad for the world” or “good for the world.” What it found, according to the Times, is that hyper-viral posts were more likely to be considered “bad for the world.” So why did Facebook’s algorithm keep promoting the most divisive, anger-inducing content, and why wouldn’t the company do anything about it? Leadership didn’t seem to care. Prioritizing accuracy over rage-bait led to less Facebook use, and when pressed to make decisions about the company’s future, Facebook chose the path that would result in more use, not less.

But the problem isn’t entirely the fault of social media companies, and we really need to acknowledge this.

Ever find yourself quote-tweeting a bad take just to mock it? I have. Constantly. It’s an impulse, fueled both by a subconscious desire to broadcast our own beliefs to the world and by the “sunlight is the best disinfectant” approach to misinformation. Unfortunately, it’s also part of the reason these algorithms favor extreme points of view. As far as the platforms are concerned, a share of approval and a share of disapproval are essentially the same signal. Internally, the calculation is simple: more engagement is good, so the algorithm learns that these are the posts to promote (see the sketch below). In the end, “dunking” on our political adversaries only helps them.
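To make that concrete, here’s a deliberately oversimplified ranking score in the spirit of an engagement-driven feed. The weights and fields are invented for illustration; real ranking systems are far more complicated, but the core blind spot is the same: a mocking share feeds the score exactly like an approving one.

```python
from dataclasses import dataclass

@dataclass
class Post:
    likes: int
    shares: int      # includes approving shares AND mocking quote-tweets
    comments: int
    angry_reactions: int

def engagement_score(post: Post) -> float:
    # Toy scoring: every interaction counts as a positive ranking
    # signal, regardless of the sentiment behind it. Weights invented.
    return (1.0 * post.likes
            + 2.0 * post.shares
            + 1.5 * post.comments
            + 1.0 * post.angry_reactions)

# A widely dunked-on bad take can easily outrank a calm, accurate post.
dunk_bait = Post(likes=40, shares=300, comments=250, angry_reactions=500)
calm_post = Post(likes=400, shares=30, comments=20, angry_reactions=5)
print(engagement_score(dunk_bait) > engagement_score(calm_post))  # True
```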

It’s a tough habit to break, and it’s certainly one that I’m guilty of. But I’m going to try to make a point of not “dunking,” even when I really want to. You know, for the sake of the internet.

4 Comments

Sean Corfield · Jun 30, 2021

This is why I've deactivated my Facebook account twice now, to get away from this, for a few months at a time. Sometimes I've been absolutely horrified at some of the things my "friends" have reposted and I've found it hard to believe they really believe that sentiment... but I know deep down they must believe at least part of it :(

Jane Natoli · Jun 29, 2021

Not dunking is a great habit to get into (not one I've always been super successful with, but I'm trying...). Even in a much smaller microcosm of local politics, dunking has gotten me in trouble, and while it's certainly an easy way to get engagement, what lies beneath is scary.

2 more comments…