
I’ve had enough of being the product for an unethical company. Hence I have decided to #deletefacebook (image source: http://www.beebom.com)
Like many of you, I’ve been following the unraveling story of how Cambridge Analytica harvested the data of 50 million Facebook users and used it to build a system to predict and influence voter choices in the 2016 US election. If you haven’t read about this yet, please watch the video below from the Guardian and The Observer media teams, led by the remarkable Carole Cadwalladr.
This story is so extraordinary that it seems more at home in Hollywood than in reality. But it matters to all of us. Over 2.2 billion people use Facebook; that’s just under a third of the world’s population.
We are the Product
I’m not naive; I understand the trade-off involved in using any social network. As media theorist and writer Douglas Rushkoff has put it, companies like Facebook sell us and our data to advertisers: “Ask yourself who is paying for Facebook. Usually the people who are paying are the customers. Advertisers are the ones who are paying. If you don’t know who the customer of the product you are using is, you don’t know what the product is for. We are not the customers of Facebook, we are the product. Facebook is selling us to advertisers.”
I’ve always been okay with that; it was a trade-off I was willing to make. But the reporting around Cambridge Analytica and Facebook’s inaction concern me. As far as I’m concerned, Cambridge Analytica essentially stole, with Facebook’s consent, 50 million user profiles. Facebook’s platform allowed Cambridge Analytica to take, from the 320,000 or so people who used its app, the details of all of their friends on the site. The other 49 million people whose data was taken and misused had no idea what was happening or how their information was being used to manipulate American voters in 2016. And I assume most of them still have no idea, because Facebook never told them.
Does Facebook care about us?
I have no intention of being manipulated online by firms like Cambridge Analytica, and I don’t want them accessing my data without my permission to reach my friends and family. Unfortunately, the best way for me to make sure that doesn’t happen is to not be on Facebook. I know many people who work at Facebook, and they’re good people. But something is wrong at the top of the organization. Facebook knew about the Cambridge Analytica issue as far back as 2015, yet it took the company three years to go public about it. Why?
Mark Zuckerberg may talk about connecting the world, but let’s be honest: Facebook is a business, not a charity. It cares about revenues. And, sadly, that is leading Facebook’s leadership down a dark path, with no regard for me or my rights as a user. To quote the firm’s privacy policy on data collection: “We receive data whenever you visit a game, application, or website that uses Facebook Platform or visit a site with a Facebook feature … sometimes through cookies.”
What does Facebook care about more? Is it revenues or users? To me, the answer is obvious.
Facebook’s Lack of Ethical Leadership
Balancing what is profitable with what is right has never been easy, especially for publicly listed companies, where the expectation is that revenues will grow quarter over quarter. Facebook’s revenues may have grown, but I’ve yet to see any ethical leadership from the company on much of anything. Facebook staggers from scandal to scandal. Take, for example, the story of how advertisers could target audiences by ethnicity, and the later revelation that a brand could target users interested in antisemitic topics. Facebook’s leadership promised action, and little followed.
And then there’s the story of how fake-news producers manipulated the site, most extensively during the 2016 US presidential election. What was Zuckerberg’s response (one that has since come back to haunt Facebook)?
“Personally I think the idea that fake news on Facebook, which is a very small amount of the content, influenced the election in any way — I think is a pretty crazy idea. Voters make decisions based on their lived experience.”
The pattern of behavior hasn’t changed with Cambridge Analytica. Facebook’s executives remained silent for a week. Zuckerberg pledged that 2018 would be the year he “fixed Facebook.” A more pertinent step would be for him to finally admit that he’s out of his depth and hand the company over to a leader who can balance ethics with business.
I’m no longer the Product
There are other reasons why I don’t love Facebook like I used to. Its impact on the media industry, the profession where I began my career, has been disastrous. For all of the above, I’ve decided that enough is enough. I don’t want to be the product any more. What I do want is to send Facebook a message that the company has to change. And because I’m the product, walking away means it can no longer sell my data, including all my likes and posts, to advertisers. I’m still thinking over what this means for my presence on other sites such as Twitter and Instagram (which is owned by Facebook). But by taking a stand alongside others who have stepped away from the site, I hope that we’ll force the company to change for the better. There needs to be respect and protection for us as users, something the company’s leadership has never shown through its actions. I’ve taken the decision to #deletefacebook. Maybe you should too.