Whaaat?

On the surface, it seems like social media has the boundless potential to expand our world, connecting us to ideas and people we otherwise would never have found. However, a new study claims just the opposite: Social media actually isolates us, creating and facilitating confirmation biases and echo chambers where old — and sometimes erroneous — information is just regurgitated over and over again.

If it sounds bleak, it’s because it kind of is.

Tell me more!

The findings were published in the Proceedings of the National Academy of Sciences. Using data modeling, a team of researchers from Italy mapped how two types of content spread across Facebook: conspiracy theories and scientific information.

“Our findings show that users mostly tend to select and share content related to a specific narrative and to ignore the rest. In particular, we show that social homogeneity is the primary driver of content diffusion, and one frequent result is the formation of homogeneous, polarized clusters,” the paper concludes.

In other words, you and your friends are all sharing the same stuff, even if it’s bunk, because you think alike and your tightly defined exchange of ideas doesn’t allow anything new or challenging to flow in.

The researchers also write: “Users show a tendency to search for, interpret, and recall information that confirm their pre-existing beliefs.” This is called “confirmation bias,” and study co-author Alessandro Bessi says it’s actually one of the main motivations for sharing content.

So instead of sharing to challenge or inform, social media users are more likely to share an idea already commonly accepted in their social groups for the purpose of reinforcement or agreement. This means misinformation — which is a much more appropriate term for “fake news” — can rattle around unchecked.
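
To make the mechanism concrete, here’s a minimal toy simulation of confirmation-bias-driven sharing. This is my own illustrative sketch, not the researchers’ actual model; the network construction, the TOLERANCE threshold, and every parameter value are invented for the example. Agents hold an opinion between 0 and 1, mostly befriend like-minded agents, and re-share a piece of content only when its “narrative” sits close enough to their own opinion.

```python
import random

random.seed(42)

N = 500          # number of users (invented parameter)
K = 8            # average friends per user (invented parameter)
TOLERANCE = 0.15 # how close content must be to your opinion before you share it

# Each user holds an opinion in [0, 1]. Friendships are homophilous:
# random pairs only connect when their opinions are already similar.
opinions = [random.random() for _ in range(N)]
friends = {i: set() for i in range(N)}
while sum(len(f) for f in friends.values()) < N * K:
    a, b = random.randrange(N), random.randrange(N)
    if a != b and abs(opinions[a] - opinions[b]) < 0.2:
        friends[a].add(b)
        friends[b].add(a)

def cascade(narrative):
    """Seed content with a given 'narrative' value at the most sympathetic
    user, then spread it: a user re-shares only if the content confirms
    their pre-existing opinion (confirmation bias)."""
    seed = min(range(N), key=lambda i: abs(opinions[i] - narrative))
    shared, frontier = {seed}, [seed]
    while frontier:
        user = frontier.pop()
        for friend in friends[user]:
            if friend not in shared and abs(opinions[friend] - narrative) < TOLERANCE:
                shared.add(friend)
                frontier.append(friend)
    return shared

for narrative in (0.1, 0.5, 0.9):
    sharers = cascade(narrative)
    ops = [opinions[i] for i in sharers]
    print(f"narrative={narrative:.1f}: reached {len(sharers):3d} users; "
          f"their opinions span [{min(ops):.2f}, {max(ops):.2f}]")
```

However you seed it, the cascade never escapes a narrow band of opinion, which is exactly the “homogeneous, polarized clusters” pattern the paper describes.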


My 2 cents:

It follows that, in the pursuit of user engagement, viewership, and organic sharing, social media (and plain old media) platform algorithms would begin to create these bubbles of opinions and “facts” that have recently come under greater scrutiny. If the inherent shortcomings of the predictive and reactive algorithms behind our media feeds caught us unaware, that’s partly because we’re only now closing out our first decade under the influence of our Facebook feeds, and partly because of the (naive) theory that humans would be able to discern absolute truth from social media truth, much the way we expect them to discern absolute truth from general media truth.

Recent US elections seem to have shoved this danger into the spotlight, and, as a parting gift of sorts, outgoing President Obama used his farewell address to call out how “splintered media” is fueling mass “groupthink”:

“For too many of us, it’s become safer to retreat into our own bubbles, whether in our neighborhoods or college campuses or places of worship or our social media feeds, surrounded by people who look like us and share the same political outlook and never challenge our assumptions. The rise of naked partisanship, increasing economic and regional stratification, the splintering of our media into a channel for every taste—all this makes this great sorting seem natural, even inevitable. And increasingly, we become so secure in our bubbles that we accept only information, whether true or not, that fits our opinions, instead of basing our opinions on the evidence that’s out there.”

