Deep dive into Meta’s algorithms shows that US political polarization has no easy fix

The four research papers, published in the journals Science and Nature, also reveal the extent of political echo chambers on Facebook, where conservatives and liberals rely on divergent sources of information, interact with opposing groups, and consume markedly different amounts of misinformation.

Algorithms are the automated systems that social media platforms use to suggest content to their users, by making assumptions based on the groups, friends, topics, and headlines a user has clicked on in the past. While they excel at keeping people engaged, the algorithms have been criticized for amplifying misinformation and ideological content that has sharpened the country’s political divisions.

“We found that algorithms are extremely influential in people’s experiences on the platform, and there is significant ideological segregation in political news exposure,” said Talia Jomini Stroud, director of the Center for Media Engagement at the University of Texas at Austin and one of the leaders of the studies. “We also found that popular proposals to change social media algorithms did not influence political attitudes.”

Such deep divides can undermine trust in democracy and democratic institutions and lead to “affective polarization,” in which citizens begin to see each other more as enemies than as legitimate opposition. It is a situation that can boil over into violence, as it did when supporters of then-President Donald Trump attacked the U.S. Capitol on January 6, 2021.

For the analysis, the researchers obtained unprecedented access to Facebook and Instagram data from the 2020 election through a collaboration with Meta, which owns the platforms. The researchers say Meta did not exercise control over their findings.

When they replaced the algorithm with a simple chronological list of friends’ posts—an option Facebook recently made available to its users—it had no measurable impact on polarization. When they turned off Facebook’s sharing option, which allows users to quickly share viral posts, people saw significantly less news from untrustworthy sources and less political news overall, but there were no significant changes in their political attitudes.

Likewise, reducing the content that Facebook users receive from accounts with the same ideological alignment did not have a significant effect on polarization, susceptibility to misinformation, or radical views.

Taken together, the findings suggest that Facebook users seek out content that aligns with their views, and that the algorithms help by “making it easier for people to do what they are inclined to do,” according to David Lazer, a Northeastern University professor who worked on all four papers.

Removing the algorithm entirely dramatically reduced the time users spent on Facebook or Instagram while increasing their time on TikTok, YouTube and other sites, showing just how important these systems are to Meta in the increasingly crowded social media landscape.

Responding to the research papers, Nick Clegg, Meta’s president of global affairs, said the findings showed that “there is little evidence that key features of Meta’s platforms alone cause harmful ‘affective’ polarization or have a significant impact on key political attitudes, beliefs or behaviours”.

Katie Harbath, Facebook’s former director of public policy, said the studies highlighted the need for more research on social media and challenged assumptions about the role social media plays in American democracy. Harbath was not involved in the research.

“People want a simple solution, and what these studies show is that it’s not easy,” said Harbath, a fellow at the Bipartisan Policy Center and CEO of the policy and technology firm Anchor Change. “To me, it reinforces that when it comes to polarization or people’s political beliefs, there’s a lot more to it than social media.”

An organization that has criticized Meta’s role in spreading misinformation about elections and voting called the research “limited,” saying it was only a snapshot taken during an election and did not account for the effects of years of misinformation on social media.

Free Press, a nonprofit that advocates for civil rights in technology and media, called Meta’s use of the research “calculated spin.”

“Meta executives are leveraging limited research as evidence that they should not share the blame for increasing political polarization and violence,” Nora Benavidez, the group’s senior counsel and director of digital justice and civil rights, said in a statement. “Studies that Meta backs, which appear fragmented across very short time frames, should not serve as an excuse for allowing lies to spread.”

The four studies also revealed the extent of ideological differences among Facebook users and the different ways that conservatives and liberals use the platform to receive political news and information.

Conservative Facebook users are more likely to consume content that has been flagged as misinformation by fact-checking sites. They also have more sources to choose from: the analysis found that, among the websites featured in Facebook’s political posts, far more cater to conservatives than to liberals.

Overall, 97% of political news sources on Facebook that fact-checkers identified as spreading misinformation were more popular with conservatives than liberals.

The study authors acknowledged some limitations in their work. While they found that changing Facebook’s algorithms had little impact on polarization, they note that the study covered only a few months during the 2020 election and thus cannot assess the long-term impact the algorithms have had in the years since they were first introduced.

Similarly, they reported that most people get their news and information from a variety of sources, including television, radio, the internet and word of mouth, and that those interactions could also affect their opinions. Many people in the United States blame the media for worsening polarization.

To conduct their analyses, the researchers examined data from millions of Facebook and Instagram users and surveyed individual users who agreed to participate. All identifying information was removed for privacy reasons.

Lazer, the Northeastern University professor, said he was initially skeptical that Meta would give researchers the access they needed, but he was pleasantly surprised. He said the conditions imposed by the company were related to reasonable legal and privacy concerns. More studies resulting from the collaboration will be published in the coming months.

“There is no study like this,” he said of the research published a week ago. “There has been a lot of rhetoric about this, but in many ways the research has been quite limited.”

SOURCE: Associated Press
