“Give everyone the power to share anything with anyone.” I was in the audience when Mark Zuckerberg shared this as one of Facebook’s missions at the company’s F8 developer conference in April of this year. Fast-forward to today: the country’s most divisive election is over and fake news is making its own headlines. The intent and impact of this mission necessitate a closer look.
The statement came in the context of Facebook’s efforts to connect the world and bring the internet to places that currently sit on the far side of a massive digital divide. An estimated four billion people globally don’t have access to the web. For millions who do, Facebook is the internet.
What Facebook didn’t publicly acknowledge until recently is its responsibility as the medium for information and (mis)information. Mark Zuckerberg posted this on the subject:
“Facebook is a new kind of platform different from anything before it. I think of Facebook as a technology company, but I recognize we have a greater responsibility than just building technology that information flows through. While we don’t write the news stories you read and share, we also recognize we’re more than just a distributor of news. We’re a new kind of platform for public discourse — and that means we have a new kind of responsibility to enable people to have the most meaningful conversations, and to build a space where people can be informed.
With any changes we make, we must fight to give all people a voice and resist the path of becoming arbiters of truth ourselves. I believe we can build a more informed community and uphold these principles.”
Facebook sells advertisers the ability to influence buying decisions and behavior. The content people consume on Facebook shapes them in the same way, well beyond shopping. This cannot be discounted.
I believe in Facebook’s mission that a connected world is a better world. I’ve personally benefited from it, and from Facebook’s algorithm. For example, when I was in Vancouver earlier this year, Facebook used our locations to prioritize, in each of our News Feeds, posts from a friend who lives in Singapore and was also visiting Vancouver. Because Facebook made us realize we were both in town at the same time, we grabbed breakfast and caught up nine years after we first met!
While the intention of Facebook’s mission is ambitious and admirable, there is room for improvement in how it is executed. Facebook is taking steps, but more can be done to improve the community, not just on the fake news front but across all content on the platform.
Fake news encompasses news and hoaxes that are deliberately fabricated to get clicks or deceive readers, like Pizzagate. It’s not satire, opinion, or speculation. While fake news accounts for less than 1% of the content people see on the platform, it appears in exactly the same wrapper as content from credible publications like the New York Times. This is problematic given the general lack of media literacy. A recent Stanford study found that students didn’t know how to vet or discern online information: they were easily “fooled by biased sources, ads that resemble news articles and even bogus social media pages.” This goes beyond just fake news.
Facebook is addressing the appearance of suspect news by attaching a warning symbol to stories that have been reported and disputed by third-party fact checkers. It is also making it easier to report content, updating its algorithm based on the behavior of past hoax content, and reducing the financial incentives for spammers to profit from fake news.
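Facebook hasn’t published how these signals actually feed into ranking, so purely as a thought experiment, here is a minimal sketch of how a fact-checker dispute flag could downrank a story without removing it. Every name and weight here (Story, DISPUTED_PENALTY, the verdict strings) is invented for illustration; this is not Facebook’s system.

```python
# Hypothetical sketch: how a third-party fact-check verdict might flag
# and downrank a story in a feed. All names and weights are invented
# for illustration; this is not Facebook's actual ranking code.

from dataclasses import dataclass, field

DISPUTED_PENALTY = 0.2  # assumed multiplier applied to disputed stories


@dataclass
class Story:
    url: str
    base_score: float  # engagement-based relevance score
    verdicts: list = field(default_factory=list)  # e.g. ["disputed"]

    @property
    def disputed(self) -> bool:
        # A story is flagged once any third-party checker disputes it.
        return "disputed" in self.verdicts

    def ranking_score(self) -> float:
        # Disputed stories stay in the feed but are pushed down and
        # shown with a warning, rather than being removed outright.
        return self.base_score * (DISPUTED_PENALTY if self.disputed else 1.0)


stories = [
    Story("https://example.com/real-report", base_score=0.8),
    Story("https://example.com/hoax", base_score=0.9, verdicts=["disputed"]),
]

for s in sorted(stories, key=lambda s: s.ranking_score(), reverse=True):
    warning = " [disputed by third-party fact checkers]" if s.disputed else ""
    print(f"{s.ranking_score():.2f}  {s.url}{warning}")
```

In this toy version the hoax still circulates, but it ranks below the credible story and carries a visible warning, which matches the spirit of flagging over censoring.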
Seeing Facebook work with third-party fact checkers is a great start. More can and should be done, though, to improve the public discourse that takes place on the platform.
I’d like to see industry-wide, third-party algorithm watchdogs for companies like Facebook and Google. No matter how altruistic Facebook’s intentions are, watchdogs can add perspective, diversity of thought, credibility, and balance, proactively and at the core of the service: the News Feed algorithm.
It is in the interest of both the platforms and their communities to get external feedback, because moral decisions go into creating algorithms. People create algorithms; people are not perfect, and neither are their algorithms.
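To make the watchdog idea concrete, here is one hypothetical check an external auditor might run: sample the stories a feed actually surfaced and measure what share of exposure went to sources fact checkers have disputed. The source list, sample, and threshold below are all invented for illustration; a real audit would need far richer data.

```python
# Hypothetical sketch of one test an external algorithm watchdog might
# run: given a sample of stories a feed surfaced, measure what share of
# exposure went to sources disputed by fact checkers. The source list,
# sample, and threshold are invented for illustration.

from collections import Counter

DISPUTED_SOURCES = {"hoax-news.example"}  # assumed fact-checker list
MAX_DISPUTED_SHARE = 0.01                 # target: under 1% of the feed


def disputed_share(surfaced_domains: list[str]) -> float:
    """Fraction of surfaced stories that came from disputed sources."""
    counts = Counter(surfaced_domains)
    disputed = sum(n for domain, n in counts.items() if domain in DISPUTED_SOURCES)
    return disputed / max(sum(counts.values()), 1)


# Audit over a made-up sample of 100 surfaced story domains:
sample = ["nytimes.com"] * 95 + ["hoax-news.example"] * 5
share = disputed_share(sample)
print(f"disputed-source share of feed: {share:.1%}")
if share > MAX_DISPUTED_SHARE:
    print("flag for review: exposure to disputed sources exceeds the target")
```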
It is an enormous responsibility to be the lens through which over a billion people consume content. This problem is not isolated to Facebook, nor should solving it rest solely on one corporation’s shoulders.
Outside of Facebook, there is also a need for digital literacy education, more rigorous journalism practices, and users taking the initiative to report suspicious content and stay better informed. These are topics for future blog posts.
I’d love to hear your feedback, so please share your thoughts!
From a Facebook fan with a healthy dose of skepticism,
-Helen