Study finds changes in social media feeds shape content but not political views

Tech news
solo | 1 August 2023

Researchers changed social media algorithms for thousands of consenting users to see if it affected their political views during the 2020 US election.

Changes to social media algorithms on Facebook and Instagram didn’t significantly alter users’ political views, but they did shape what users saw on their feeds, according to new in-depth research on the platforms’ impact on political polarization.

The first results come from a multi-university team working with Meta researchers to examine the influence of social media algorithms on what users see.

Co-led by Talia Stroud of the University of Texas at Austin and Joshua Tucker of New York University, the study found that changes to the algorithms, which rank news and search items on social media, affected the amount and kind of information people saw on their feeds.

But this did not lead to any change in their political beliefs.

“We now know how influential algorithms are in shaping people’s on-platform experiences, but we also know that changing algorithms for a few months is unlikely to change people’s political attitudes,” Stroud and Tucker said in a joint statement.

“We don’t know why this is. It could be because the time frame for which the algorithms were changed was not long enough, or that these platforms have already existed for decades, or that while Facebook and Instagram are influential sources of information, they are not people’s only sources,” they said.

The new research, part of the 2020 Facebook and Instagram Election Study (FIES), was published Thursday in a series of four papers in Science and Nature.

Despite Meta’s involvement, academic researchers maintained that they had final authority over writing and research decisions.

Experiment with different social media feeds

Researchers experimented with three changes in the way Facebook and Instagram users viewed content during the 2020 US presidential election.

The experiments, conducted with thousands of consenting US-based users, included blocking re-sharing, changing the feed from an algorithmic to a chronological feed, and reducing exposure to like-minded content.

The researchers found that removing re-shared content, for example, decreased the amount of political news people saw, as well as overall clicks and reactions. They added that it also reduced clicks on partisan news.

Meanwhile, replacing the algorithmically selected feed with a chronological one “significantly reduced” the time users spent on the platform.

In a third experiment, the researchers reduced the amount of content from like-minded sources shown to thousands of consenting users by a third, which they said increased exposure to other sources but did not change users’ ideology.

“These precisely estimated results suggest that although exposure to content from like-minded sources is common on social media, reducing its prevalence during the 2020 US presidential election did not correspondingly reduce polarization in beliefs or attitudes,” the researchers said in the paper published in Nature.

In a paper published in Science, researchers analyzed data from 208 million US Facebook users about news consumption on the social media platform during the election.

They found a large ideological divide between right-wing and left-wing audiences in the US, with “a substantial portion of the news ecosystem consumed exclusively by conservatives”.

“Most of the misinformation, identified by Meta’s third-party fact-checking programs, exists within this homogeneous conservative corner,” the study authors said.

‘Avoiding accountability’

Social media has long been criticised for promoting ideological polarization, but Facebook has disputed its role in it.

In a statement on Thursday, Nick Clegg, Meta’s president of global affairs, wrote that these new studies add to “a growing body of research showing there is little evidence that key features of Meta’s platforms alone cause harmful ‘affective’ polarisation, or have meaningful effects on key political attitudes, beliefs or behaviours”.

Free Press, an American non-profit organization advocating for media reform, said Meta was misrepresenting the studies, noting they were limited and took place over a “narrow time period”.

“Meta officials are taking the limited research as evidence that they should not share blame for growing political polarization and violence,” said Nora Benavidez, Free Press senior counsel and director of digital justice and civil rights.

“This calculated spin of these studies is part of a continuing effort to shirk accountability for the political disinformation that has spread online and is undermining free, fair and safe elections around the world.”
