Thursday, May 7, 2015

Facebook Study Disputes Theory of Political Polarization Among Users



For years, political scientists and other social theorists have fretted about the Internet’s potential to flatten and polarize democratic discourse. Because so much information now comes through digital engines shaped by our own preferences — Facebook, Google and others suggest content based on what consumers previously enjoyed — scholars have theorized that people are building an online echo chamber of their own views.


But in a peer-reviewed study published on Thursday in the journal Science, data scientists at Facebook report that the echo chamber is not as insular as many might fear — at least not on the social network. While independent researchers said the study was important for its scope and size, they noted several significant limitations.


After analyzing how more than 10 million of the most partisan users of the social network navigated the site over a six-month period last year, researchers found that people’s networks of friends and the stories they see are in fact skewed toward their ideological preferences. But that effect is more limited than the worst case that some theorists had predicted, in which people would see almost no information from the other side.


On average, about 23 percent of users’ friends are of an opposing political affiliation, according to the study. On average, almost 29 percent of the news stories displayed by Facebook’s News Feed also appear to present views that conflict with the user’s own ideology.


In addition, researchers found that individuals’ choices about which stories to click on had a larger effect than Facebook’s filtering mechanism in determining whether people encountered news that conflicted with their professed ideology.


“This is the first time we’ve been able to quantify these effects,” Eytan Bakshy, a data scientist at Facebook who led the study, said in an interview. “You would think that if there was an echo chamber, you would not be exposed to any conflicting information, but that’s not the case here.”


Facebook’s findings run counter to a longstanding worry about the potential for digital filtering systems to shape our world. For Facebook, the focus is on the algorithm that the company uses to determine which posts people see, and which they do not, in its News Feed.


Cass R. Sunstein, the Harvard law professor and President Obama’s former “regulatory czar,” worried that such recommendation engines would lead to a tailored version of news and entertainment that might be called “The Daily Me.” Eli Pariser, chief executive of Upworthy and a former director at MoveOn.org, labeled it the “Filter Bubble.” Some Facebook users have said they unfollow friends and acquaintances who post content with which they disagree.


And with political discussions becoming increasingly pitched in the run-up to a presidential election next year in which the Internet will be used as a primary campaign tool, the problem appeared to be getting worse.


“This shows that the effects that I wrote about exist and are significant, but they’re smaller than I would have guessed,” said Mr. Pariser in an interview about Facebook’s study.


Natalie Jomini Stroud, a professor of communications studies at the University of Texas at Austin, who was not involved in the study, said the results were “an important corrective” to the conventional wisdom. “There’s been so much hype about the algorithm and how it might be constraining what people are viewing,” she said.


The study adds to others that debate whether the Internet creates an echo chamber. A Pew Research Center report last year found that the media outlets people name as their prime sources of information about politics and news are strongly correlated with their political views. Another study late last year, published as a working paper by the National Bureau of Economic Research, analyzed Twitter usage during the 2012 election and found that social media often exposed users only to opinions that matched their own.


Dr. Stroud and several other researchers note that the Facebook study has limitations. All of the users studied were of a type: those who have self-identified as liberal or conservative in their profiles. Most of Facebook’s users do not post their political views, and Dr. Stroud cautioned that those users might be either more or less accepting of conflicting political views.


The findings are convenient for Facebook. With more than 1.3 billion users, the social network is effectively the world’s most widely read daily newspaper. About 30 percent of American adults get their news from the social network, according to the Pew Research Center. But its editorial decisions are drafted in a black box, with the company’s opaque News Feed algorithm deciding which of your friends’ posts you see, which you don’t and in what order. Facebook could use the study’s results to show that its secret algorithm is not ruining national discourse.


Facebook said its researchers were allowed wide latitude to pursue their research interests and to present whatever they found.


Facebook also noted that this study was substantively different from one that caused an outcry last year, in which the company’s scientists altered the number of positive and negative posts that some people saw to examine the effects on their mood. This study did not involve an experiment that changed users’ experience of Facebook; researchers analyzed how people use Facebook as it stands today.


For Facebook’s study, researchers first determined the point of view of a given article by looking at whether liberals or conservatives had shared it most. They found unsurprising partisan attitudes about well-known news sources: Fox News stories were shared mainly by conservatives, while articles on the Huffington Post were shared by liberals. The researchers also used a text analysis system to exclude “soft” news from the study — topics like sports and entertainment — and instead focused on national news and world affairs.
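The classification step described above can be illustrated with a minimal sketch — not Facebook’s actual code — that scores an article by the mix of self-identified liberal and conservative users who shared it. The function name and the share counts below are hypothetical.

```python
# Hypothetical sketch of the sharing-based alignment idea described above.
# Score each article by who shared it: -1 = shared only by liberals,
# +1 = shared only by conservatives, 0 = shared equally by both.

def alignment_score(conservative_shares: int, liberal_shares: int) -> float:
    total = conservative_shares + liberal_shares
    if total == 0:
        return 0.0
    return (conservative_shares - liberal_shares) / total

# Illustrative (made-up) share counts for two outlets mentioned in the article.
print(alignment_score(9_000, 1_000))   # 0.8  -> leans conservative (e.g., a Fox News story)
print(alignment_score(1_500, 8_500))   # -0.7 -> leans liberal (e.g., a Huffington Post story)
```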


Then they analyzed the behavior of users, whose identifying details had been taken out, measuring how often their feeds displayed stories that conflicted with their professed ideologies, and how often they clicked on those stories.
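A rough sketch of that second step, under assumed data structures rather than anything published in the paper, would count how many stories in a user’s feed — and how many of the user’s clicks — crossed the user’s stated ideology. The record format and threshold here are purely illustrative.

```python
# Illustrative sketch: measuring exposure to, and clicks on, "cross-cutting"
# stories from anonymized feed records. Field names and the threshold are
# assumptions, not Facebook's actual schema.

from dataclasses import dataclass

@dataclass
class FeedEvent:
    user_ideology: str        # "liberal" or "conservative" (self-reported in profile)
    article_alignment: float  # score from a sharing-based classifier like the one above
    clicked: bool

def cross_cutting(event: FeedEvent, threshold: float = 0.0) -> bool:
    """A story is cross-cutting if its alignment opposes the user's ideology."""
    if event.user_ideology == "liberal":
        return event.article_alignment > threshold
    return event.article_alignment < -threshold

def exposure_and_click_rates(events: list[FeedEvent]) -> tuple[float, float]:
    """Share of feed stories that are cross-cutting, and click rate on those stories."""
    cross = [e for e in events if cross_cutting(e)]
    exposure_rate = len(cross) / len(events) if events else 0.0
    click_rate = sum(e.clicked for e in cross) / len(cross) if cross else 0.0
    return exposure_rate, click_rate
```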


Some academics said Facebook was always tweaking the News Feed and could easily make changes that would create a more sealed echo chamber.


“A small effect today might become a large effect tomorrow,” David Lazer, a political scientist who studies social networks at Northeastern University, wrote in a commentary on the Facebook study also published in Science.


He said the Facebook study suggested that creating an algorithm that shows fewer stories with opposing views might be to the company’s benefit because users might be more satisfied with the News Feed. “The deliberative sky is not yet falling, but the skies are not completely clear either,” Dr. Lazer wrote.


The study — which, in addition to Dr. Bakshy, was written by Solomon Messing and Lada Adamic, also data scientists at Facebook — presented other findings on how people of different political persuasions use the world’s largest social network.


One is that liberals live in a more tightly sealed echo chamber than conservatives, but that conservatives are more selective about what they click on when they see ideologically challenging views. About 22 percent of the news stories that Facebook presents to liberals are of a conservative bent, while about 33 percent of the stories shown to conservatives present a liberal point of view. The difference, researchers said, is that liberal users are connected to fewer friends who share views from the other side.


But liberals were only 6 percent less likely to click on ideologically challenging articles than on ideologically consistent ones that appeared in their feeds. Conservatives were 17 percent less likely to click, meaning they appeared more reluctant to indulge opposing views.


The study also raised — but did not answer — the question of what happens after people click on an article that presents an opposing view: Are they being informed and persuaded by its arguments, or are they dismissing it out of hand?


“People who are really into politics expose themselves to everything — they’re junkies,” said Diana C. Mutz, a political scientist at the University of Pennsylvania. “So they will expose themselves to the other side, but it could be to make fun of it, or to know what they’re saying to better argue against it, or just to yell at the television set.”


A click, in other words, is not necessarily an endorsement, or even a sign of an open mind.


Source: http://bit.ly/1Im1qeK
