Facebook is providing grants to various organisations around the world to help them study how the spread of fake news impacts elections.
Universities in the United States and Taiwan aim to analyse how sharing posts and links influences election results, while a French project will focus on the impact of conversations happening on the platform.
Facebook is playing catch-up
With Mark Zuckerberg declaring that “the future is private”, it’s clear that Facebook is keen to rehabilitate its reputation; until recently, it has been on the wrong side of the privacy conversation.
Apple, for example, has used the intense debate around the Cambridge Analytica scandal to set out its own commitment to privacy. In response, Google’s CEO, Sundar Pichai, has stated that:
“privacy cannot be a luxury good” reserved only for “people who can afford to buy premium products and services.”
Facebook has some catching up to do to convince people that it wants to spend the time, money and effort to safeguard its users’ data (as Zuckerberg himself admits). But it’s making a good start with its research funding.
Facebook has built a system that will restrict the type and quantity of data that researchers can access. All user data will be anonymised, and researchers will only be able to analyse the data within the parameters of their research area.
The long road to rebuilding trust
We trust people who do what they say they will, and it’s no different for organisations. It’s easy for brands to make promises, but it’s those who have a public track record of doing what they say they will, living up to their values and being transparent about the areas where they need to improve, that win our trust.
Facebook has taken a good first step on the road to rebuilding its reputation, but its users will be watching what Facebook does with the results of the research. Will the results be published? Will Facebook change fundamental aspects of how it operates if the researchers conclude that doing so is necessary to prevent further exploitation of users?
Saying that you’re committed to doing the right thing is important, and to be welcomed. But it’s the action that Facebook will take as a result of this research that will build real trust.