WASHINGTON — Amid a torrent of criticism over news that personal information from some 50 million users was accessed by a political data-mining firm linked to the Trump campaign, Facebook is once again scrambling to repair a reputation already damaged by the “fake news” epidemic of the 2016 election season. But a new survey finds the social media giant has its work cut out for it: just 3 in 10 Americans believe Facebook is a responsible company, and even fewer feel it is leaving a positive footprint on the world.
The survey, conducted by the non-profit internet watchdog Digital Citizens Alliance, polled 925 Americans last week, after it was revealed that Cambridge Analytica had obtained Facebook users’ data without their consent. The results showed that many people view the company in a negative light, particularly when it comes to American politics.
The survey showed that just a quarter of Americans think Facebook is a positive development for society, while 6 in 10 believe the company “has damaged American politics and made it more negative by enabling manipulation and falsehoods that polarize people.” From the fallout over fake news circulated across the network to the latest allegations, the 2016 election has been one giant black eye for Facebook.
“Facebook is at a crossroads because of its inability – nearly a year-and-a-half after the election – to get a handle on its divisive effects on society,” says Tom Galvin, Executive Director of Digital Citizens, in a press release. “From spreading fake and manipulative information to becoming a ‘Dark Web-like’ place for illicit commerce, Facebook seems to be losing the trust of the American public. Regulation will not be far behind for social media companies if things don’t change.”
While 54% of respondents felt that Facebook has negatively impacted political discourse, a small minority believe it’s not necessarily at fault for the political scandals. Thirty-one percent of respondents felt that Facebook was “a responsible company because it tries to do the right thing most of the time even if that gets in the way of it making profits.” On the other hand, 39% agreed that Facebook puts its profits ahead of doing the right thing for its users. The remaining 30% were unsure.
So what could Facebook do to right the ship moving forward? Fifty-three percent of those surveyed agreed it’s the network’s responsibility to warn users about potentially misleading stories or remove fake news content altogether. Meanwhile, 42% of respondents felt that anyone under 18 should be barred from the site completely.
The Digital Citizens Alliance argues that users should have more of a say in how their information is used, and that Facebook should do a far better job of monitoring the information spreading throughout its user base.
“Digital platforms have to rise to the occasion and assure internet users that their personal information will be safe, that the content will be legal, safe and not contrived to manipulate,” says Galvin. “In short, they have to demonstrate they will be the positive influence on our society that they aspire to be.”
Perhaps Facebook isn’t all bad when it comes to politics. One study in 2017 found that the site actually plays a major role in reducing government corruption.