One such model, just published by researchers at Northwestern University, incorporates recent, and in some ways counterintuitive, findings by political scientists. One, from a 2018 study by Duke sociologist Christopher Bail, is that when you repeatedly expose people on social media to viewpoints different from their own, it just makes them dig in their heels and reinforces their own viewpoint, rather than swaying them to the other side. (Dr. Bail’s study was conducted on U.S. users of Twitter, but other studies have begun to replicate it, he adds.)
In the past, social-media giants have been accused of only showing us content that agrees with our preconceptions, creating echo chambers or “filter bubbles.” The proposed solution, trumpeted by pundits of every stripe, was to change the social-media algorithms so that they would show us more content from people who disagree with us.
According to David Sabin-Miller and Daniel Abrams, creators of this latest model, exposing us to viewpoints different from our own, in whatever medium we encounter them, might actually be part of the problem. The reason is probably intuitive for anyone who has the misfortune to spend an unhealthy amount of time on Facebook, Instagram, Twitter, YouTube or even cable news. (During the pandemic, that’s more of us than ever.) Because social media and Balkanized TV networks tend to highlight content with the biggest emotional punch—that is, they operate on the principle that if it’s outrageous, it’s contagious—when we’re exposed to a differing view, it often takes an extreme form, one that seems personally noxious.
Mr. Sabin-Miller and Dr. Abrams, both mathematicians, call this effect “repulsion.” In addition to the “pull” of repeatedly seeing viewpoints that reinforce our own inside our online echo chambers, repulsion provides a “push” away from opposing viewpoints, they argue. Importantly, this repulsion appears to be a more powerful force, psychologically, than attraction to our own side of a debate.
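The push-pull dynamic can be sketched in a few lines of code. The toy simulation below is not Mr. Sabin-Miller and Dr. Abrams’s published model; it simply assumes opinions on a scale from -1 to 1, a tolerance inside which a view attracts and outside which it repels, and a repulsion strength set higher than the attraction strength, to show how a “push” stronger than the “pull” can drive a population toward the poles.

```python
import random

# Minimal attraction-repulsion opinion sketch (illustrative only; every
# parameter below is an assumption). Opinions live on [-1, 1]; 0 is the center.
N_AGENTS = 500          # assumed population size
STEPS = 20_000          # assumed number of one-on-one exposures
TOLERANCE = 0.5         # views closer than this attract (assumed)
PULL = 0.02             # strength of attraction toward agreeable views
PUSH = 0.05             # repulsion assumed stronger than attraction

random.seed(1)
opinions = [random.uniform(-1, 1) for _ in range(N_AGENTS)]

for _ in range(STEPS):
    i, j = random.sample(range(N_AGENTS), 2)   # agent i is exposed to agent j's view
    gap = opinions[j] - opinions[i]
    if abs(gap) < TOLERANCE:
        opinions[i] += PULL * gap              # a nearby view pulls i a little closer
    else:
        opinions[i] -= PUSH * gap              # a distant, "noxious" view pushes i away
    opinions[i] = max(-1.0, min(1.0, opinions[i]))  # keep opinions on the scale

# Crude polarization check: how many agents end up near the extremes?
extremes = sum(1 for x in opinions if abs(x) > 0.8)
print(f"{extremes}/{N_AGENTS} agents finish within 0.2 of an extreme")
```

Under these assumptions most simulated agents end up pinned near one pole or the other; dropping the repulsion branch, so that distant views are simply ignored, lets the attraction term pull agents into clusters instead.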
Bad actors on social media—such as Russian agents who have been active in advance of the 2020 election, attempting to divide Americans further—already appear to recognize repulsion as a tool, says Mr. Sabin-Miller. These trolls will assume roles on both sides of an ideological divide, and play dumb to make one side of the debate look foolish, while playing down the extremity of views on the other side.
...
Another model by Vicky Chuqiao Yang, an applied mathematician at the Santa Fe Institute, explored a phenomenon political scientists have previously described: the way political parties have themselves become more polarized over time. Her model buttresses past work suggesting that political parties play to their more extreme constituents because doing so is more strategically advantageous than courting ideological moderates, who often swing to one party or the other.
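To make that strategic trade-off concrete, here is a small, hypothetical vote-counting sketch. It is not Dr. Yang’s model; it simply assumes a bimodal electorate, a rival party fixed on the other side, and turnout that falls off the farther a voter sits from the nearest platform, so that chasing moderates can cost more reliable base votes than it gains.

```python
import random

# Toy illustration of the "play to the base" trade-off (not Dr. Yang's model;
# the electorate, the rival's position, and the turnout rule are all assumptions).
random.seed(1)

# Bimodal electorate: two partisan clusters plus a smaller band of moderates.
voters = ([random.gauss(-0.6, 0.15) for _ in range(400)]    # left-leaning base
          + [random.gauss(0.6, 0.15) for _ in range(400)]   # right-leaning base
          + [random.gauss(0.0, 0.15) for _ in range(200)])  # swing moderates

RIVAL = 0.6  # rival party's platform, held fixed

def expected_votes(platform):
    """Votes for `platform` when each voter picks the nearer party but is
    less likely to turn out the farther away that nearer option is."""
    votes = 0.0
    for v in voters:
        if abs(v - platform) < abs(v - RIVAL):                 # voter prefers us
            votes += max(0.0, 1.0 - 1.5 * abs(v - platform))   # turnout decays with distance
    return votes

for platform in (-0.6, -0.3, 0.0):
    print(f"platform {platform:+.1f}: expected votes {expected_votes(platform):.0f}")
```

With these assumed numbers, the platform parked on the party’s own base collects more expected votes than the centrist one, because the moderates the centrist platform courts are offset by base voters who stay home; with a flatter turnout curve, the calculation can flip back toward the center.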