Friday, November 18, 2016

Facebook’s problem is more complicated than fake news

Emotion and identity are the real filters.
R. Kelly Garrett | Nov 18 2016

In the wake of Donald Trump’s unexpected victory, many questions have been raised about Facebook’s role in the promotion of inaccurate and highly partisan information during the presidential race and whether this fake news influenced the election’s outcome.
A few have downplayed Facebook’s impact, including CEO Mark Zuckerberg, who said that it is “extremely unlikely” that fake news could have swayed the election. But questions about the social network’s political significance merit more than passing attention.
Do Facebook’s filtering algorithms explain why so many liberals had misplaced confidence in a Clinton victory (echoing the error made by Romney supporters in 2012)? And is the fake news being circulated on Facebook the reason that so many Trump supporters have endorsed demonstrably false statements made by their candidate?
The popular claim that “filter bubbles” are why fake news thrives on Facebook is almost certainly wrong. If the network is encouraging people to believe untruths – and that’s a big if – the problem more likely lies in how the platform interacts with basic human social tendencies. That’s far more difficult to change.
A misinformed public
Facebook’s role in the dissemination of political news is undeniable. In May 2016, 44 percent of Americans said they got news from the social media site. And the prevalence of misinformation disseminated through Facebook is undeniable.
It’s plausible, then, that the amount of fake news on a platform where so many people get their news can help explain why so many Americans are misinformed about politics.
But it’s hard to say how likely this is. I began studying the internet’s role in promoting false beliefs during the 2008 election, turning my attention to social media in 2012. In ongoing research, I’ve found little consistent evidence that social media use promoted acceptance of false claims about the candidates, despite the prevalence of many untruths. Instead, it appears that in 2012, as in 2008, email continued to be a uniquely powerful conduit for lies and conspiracy theories. Social media had no reliably detectable effect on people’s beliefs.
For a moment, however, let’s suppose that 2016 was different from 2012 and 2008. (The election was certainly unique in many other regards.)
If Facebook is promoting a platform in which citizens are less able to discern truth from fiction, it would constitute a serious threat to American democracy. But naming the problem isn’t enough. To fight the flow of misinformation through social media, it’s important to understand why it happens.
Don’t blame filter bubbles
Facebook wants its users to be engaged, not overwhelmed, so it employs proprietary software that filters users’ news feeds and chooses the content that will appear. The risk lies in how this tailoring is done.
There’s ample evidence that people are drawn to news that affirms their political viewpoint. Facebook’s software learns from users’ past actions; it tries to guess which stories they are likely to click or share in the future. Taken to its extreme, this produces a filter bubble, in which users are exposed only to content that reaffirms their biases. The risk, then, is that filter bubbles promote misperceptions by hiding the truth.
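
To see why this kind of tailoring worries people, here is a deliberately crude sketch of engagement-based ranking in Python. It is not Facebook’s actual, proprietary algorithm; the scoring rule, the click history and the story list are all invented for illustration.

from collections import Counter

# A user's past clicks, labelled by the political lean of the stories clicked
# (invented data for illustration only).
past_clicks = ["left", "left", "left", "right"]
lean_counts = Counter(past_clicks)

def predicted_engagement(lean):
    # Share of past clicks that match this lean: a crude guess at how likely
    # the user is to click or share a similar story in the future.
    return lean_counts[lean] / len(past_clicks)

candidate_stories = [
    ("Fact check: the viral claim is false", "left"),
    ("Opinion: the viral claim is obviously true", "right"),
    ("Analysis: what the polls actually say", "left"),
    ("Rally report: supporters cheer the claim", "right"),
]

# Rank candidates by predicted engagement and show only the top of the feed.
ranked = sorted(candidate_stories,
                key=lambda story: predicted_engagement(story[1]),
                reverse=True)
feed = ranked[:2]  # both visible slots go to the lean the user already favors
print(feed)

Even this toy rule, trained on nothing but past clicks, fills the visible feed with stories that match the user’s existing lean. That is the dynamic the filter bubble metaphor captures, and it is why the explanation is so tempting.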
The appeal of this explanation is obvious. It’s easy to understand, so maybe it’ll be easy to fix. Get rid of personalized news feeds, and filter bubbles are no more.
The problem with the filter bubble metaphor is that it assumes people are perfectly insulated from other perspectives. In fact, numerous studies have shown that individuals’ media diets almost always include information and sources that challenge their political attitudes. And a study of Facebook user data found that encounters with cross-cutting information are widespread. In other words, holding false beliefs is unlikely to be explained by people’s lack of contact with more accurate news.
Instead, people’s preexisting political identities profoundly shape their beliefs. So even when faced with the same information, whether it’s a news article or a fact check, people with different political orientations often extract dramatically different meaning.
A thought experiment may help: If you were a Clinton supporter, were you aware that the highly respected prediction site FiveThirtyEight gave Clinton only a 71 percent chance of winning? Those odds are better than a coin flip, but far from a sure thing. I suspect that many Democrats were shocked despite seeing this uncomfortable evidence. Indeed, many had been critical of this projection in the days before the election.
If you voted for Trump, have you ever encountered evidence disputing Trump’s assertion that voter fraud is commonplace in the U.S.? Fact checkers and news organizations have covered this issue extensively, offering robust evidence that the claim is untrue. However, a Trump supporter might be unmoved: In a September 2016 poll, 90 percent of Trump supporters said they didn’t trust fact checkers.
Facebook = angry partisans?
If isolation from the truth really is the main source of inaccurate information, the solution would be obvious: Make the truth more visible.
Unfortunately, the answer isn’t that simple. Which brings us back to the question of Facebook: Are there other aspects of the service that might distort users’ beliefs?
It will be some time before researchers can answer this question confidently, but as someone who has studied the various ways that other internet technologies can lead people to believe false information, I’m prepared to offer a few educated guesses.
There are two things that we already know about Facebook that could encourage the spread of false information.
First, emotions are contagious, and they can spread on Facebook. One large-scale study has shown that small changes in Facebook users’ news feeds can shape the emotions they express in later posts. In that study, the emotional changes were small, but so were the changes in the news feed that caused them. Just imagine how Facebook users respond to widespread accusations of candidates’ corruption, criminal activity and lies. It isn’t surprising that nearly half (49 percent) of all users described political discussion on social media as “angry.”
When it comes to politics, anger is a powerful emotion. It’s been shown to make people more willing to accept partisan falsehoods and more likely to post and share political information, presumably including fake news articles that reinforce their beliefs. If Facebook use makes partisans angry while also exposing them to partisan falsehoods, ensuring the presence of accurate information may not matter much. Republican or Democrat, angry people put their trust in information that makes their side look good.
Second, Facebook seems to reinforce people’s political identity – furthering an already large partisan divide. While Facebook doesn’t shield people from information they disagree with, it certainly makes it easier to find like-minded others. Our social networks tend to include many people who share our values and beliefs. And this may be another way that Facebook is reinforcing politically motivated falsehoods. Beliefs often serve a social function, helping people to define who they are and how they fit in the world. The easier it is for people to see themselves in political terms, the more attached they are to the beliefs that affirm that identity.
These two factors – the way that anger can spread over Facebook’s social networks, and how those networks can make individuals’ political identity more central to who they are – likely explain Facebook users’ inaccurate beliefs more effectively than the so-called filter bubble.
If this is true, then we have a serious challenge ahead of us. Facebook will likely be convinced to change its filtering algorithm to prioritize more accurate information. Google has already undertaken a similar endeavor. And recent reports suggest that Facebook may be taking the problem more seriously than Zuckerberg’s comments suggest.
But this does nothing to address the underlying forces that propagate and reinforce false information: emotions and the people in your social networks. Nor is it obvious that these characteristics of Facebook can or should be “corrected.” A social network devoid of emotion seems like a contradiction, and policing who individuals interact with is not something that our society should embrace.
It may be that Facebook shares some of the blame for some of the lies that circulated this election year – and that they altered the course of the election.
If true, the challenge will be to figure out what we can do about it.
R. Kelly Garrett is Associate Professor of Communication at The Ohio State University. This article was originally published on The Conversation. Read the original article.

When liberals are holding the reins of power we don't hear all that much about truth. When they are losing, it's different. Suddenly truth, facts, objectivity become extremely important, and pundits weep buckets of tears over the masses who are led by their feelings and beliefs rather than objective facts. Lately, for rather obvious reasons, they have been mourning the rise of "post-truth politics". That explains why Oxford Dictionaries has declared "post-truth" the word of the year. I think the term misses the truth of what's going on, but you can read about that here.
Marcus Roberts also has an interesting post on the demographics of the US election outcome. And we have a piece by Ryan Anderson on what Trump can do right now to protect religious freedom. In the longer term there's the possibility of a reversal of Roe v. Wade, the consequences of which, Tim Bradley of the Lozier Institute points out, are not as draconian as some allege.
Well, that's enough to be going on with. Enjoy your weekend!

Carolyn Moynihan
Deputy Editor,

To tell the truth, it’s not a ‘post-truth’ world
By Carolyn Moynihan
At least, not in the way the liberal intelligentsia mean it.
Read the full article
For the Democrats, demography was destiny
By Marcus Roberts
And they were wrong.
Read the full article
Protecting life, not punishing women
By Tim Bradley
Overturning Roe v. Wade does not mean women will be thrown into prison for having an abortion.
Read the full article
Facebook’s problem is more complicated than fake news
By R. Kelly Garrett
Emotion and identity are the real filters.
Read the full article
Make religious freedom great again
By Ryan T. Anderson
Undoing the damage of the Obama administration.
Read the full article
Migrant population in the Middle East doubles in ten years
By Marcus Roberts
Largely thanks to the Syrian conflict
Read the full article
I’m a parent, therefore I am: thoughts on the value of caregiving
By Holly Hamilton-Bleakley
Cartesian-inspired reflections after 16 years as a stay-at-home mom.
Read the full article
The Mind of the Islamic State
By Robert Manne
Extract from a new book that traces the evolution of the jihadist group’s world view.
Read the full article
Jack Reacher: Never Go Back
By Raffaele Chiarulli
Our hero is not exactly a lone wolf, answerable only to his conscience.
Read the full article
Empty home syndrome
By Joanna Roughton
The real threat to our homes is not cyber-war.
Read the full article
Euthanasia fails in South Australia
By Paul Russell
But by the narrowest of margins
Read the full article
Why a fractured nation needs to remember Martin Luther King’s message
By Joshua F.J. Inwood
How can we heal a nation that is divided along race, class and political lines? With love.
Read the full article

MERCATORNET | New Media Foundation
Suite 12A, Level 2, 5 George Street, North Strathfield NSW 2137, Australia | +61 2 8005 8605