The Many Ways In Which Facebook Dropped The Ball

The Cambridge Analytica scandal is but the most recent of Facebook’s scandals. Over the years, they’ve made some seriously stupid decisions.
Jason Snyman
2018-04-11
It is easier to ask for forgiveness than to ask for permission. It's an old saying, and one that Facebook seems to have adopted as standard business practice for years now – maybe even since the beginning. Facebook's developer mantra used to be 'move fast and break things.' Speed of creation was vital, whether the tools and features being built were perfect or not. Little could anybody have known just how many things the social network would break, and how fast.

It's no big secret that social media has changed the way we think, behave and feel – and not for the better. Facebook gave so much thought to creating more, creating bigger and creating right now that it completely missed the plot. In its ruthless pursuit of ubiquity, it gave little thought to the ramifications down the road. How will these creations be used? Can they be weaponised against us? Are they good for us? If not, how can we prevent disaster?

In a sense, Facebook put its faith in us – humanity – and we have proven just how naïve that was. Again and again, the tools created by Facebook have been used to exploit and abuse its 2-billion-strong user base. For too long, the company has sat watching, ignoring and downplaying awful circumstances while continuing to ship products with no proper safeguards. Comfortable in its own negligence.

This approach has led to Facebook's latest scandal – the name on everybody's lips, Cambridge Analytica – which we won't get too bogged down with here. In this article, we look at all the ways Facebook has dropped the ball, and in a follow-up, what to do when you wake up one day and decide: you know what... I'm happier without it.

Facebook And The Endless Barrage Of Stupid Decisions

We're all familiar with the natural effects of social media. Those feelings of envy and insecurity that arise whenever we see our newsfeed flooded with other people's success and happiness. The urge to check, every five minutes, on that witty status you posted this morning. How many likes? How many comments? Some effects, though, are malicious and unnatural in every conceivable way. They have been orchestrated, wittingly or not. Let's have a look at the many ways Facebook has impacted society in a negative manner.

Abuse Of Data

For years, Facebook offered an API that allowed developers to pull profile data on Facebook users and their friends in order to create personalised products. For instance, people who enjoy certain television shows may be partial to a new, similar game – this way, app developers know their target market. The problem is that Facebook had weak enforcement mechanisms for its policies. It had no way of preventing these developers from sharing or selling the acquired data to others. This is exactly what Dr Aleksandr Kogan did with the data harvested from the 270 000 users (and their friends) of his personality quiz app, thisisyourdigitallife. Kogan packaged it all up and sold it to Cambridge Analytica, which illicitly turned the data into psychographic profiles of 87 million people – information that helped the Trump and Brexit campaigns pinpoint their messaging. It's more than likely that other developers have been pulling this same trick for some time, and whether Facebook has been aware of it or not, the fact remains that it should never have been possible at all.

Your Shopping Behaviour Revealed

Back in 2007, Facebook launched Beacon as part of its advertisement system, designed to send data about our activity on external websites back to Facebook. Basically, our browsing and shopping data was handed to Facebook to enable targeted advertising. Then stories about our purchases began to appear in our friends' news feeds without our knowledge – and worse, we couldn't choose which data was sent and which wasn't. Following privacy complaints, tales of marital affairs discovered, Christmas gifts ruined and a class action lawsuit, Facebook CEO Mark Zuckerberg called Beacon a mistake and shut it down.

Fake Newsfeed

Facebook's Newsfeed was designed to show the most relevant content first. It does this by measuring engagement – what people comment on, like, share and watch. The problem is that sensationalist fake news authors understood the ranking system and used it to spread partisan, false articles with catchy headlines far and wide. People who didn't bother researching the story would share it further, and the authors would sit back and rake in the ad revenue. Facebook has only recently admitted that this is a problem, and is now trying to fight the fake news epidemic.

Game Spam

Anyone who used Facebook between 2009 and 2010 will remember what it was like trying to find some kind of genuine interaction or communication. Facebook's Newsfeed and Notifications were a minefield of online gaming ads and spam. This friend needs help in FarmVille. That friend has invited you to join Mafia Wars. It was constant, relentless and infuriating – and only when the annoyances began to drive users away did Facebook finally do something about it.

Share To Public – By Default

Facebook finally gave us updated privacy controls back in 2010. Its 'recommended' settings, though, pushed users to share their posts with the whole world rather than just with their friends. Sharing with Everyone was the default setting, which had to be changed manually. It wasn't until the following year that Zuckerberg was forced to apologise yet again, and to sign a settlement promising not to change our privacy settings without proper notice or opt-in consent.

Racist Exclusion And Targeting

Businesses could previously target Facebook users of a specific 'ethnic affinity' with their ads. The idea may have been to help businesses find customers who, based on their race, might be interested in their products. The tool, however, also allowed for the exclusion of certain ethnic groups. For instance: don't show my housing, loan or job vacancy advertisements to such-and-such people. Facebook has since disabled this kind of targeting and exclusion. The company still sells ads, though, and businesses were able to target user-generated groups and pages. This becomes a problem when ads are aimed at objectionable groups with dubious interests such as Boer killing or Nazism. Political operatives, such as those operating out of Russia, have been known to spread disruptive and divisive memes or fake news in order to turn people against each other. Fear mongering. Facebook has only recently begun hiring more ad moderators and shutting down long-tail, user-generated ad targeting parameters.

Power To Your Stalkers

Like so many of Facebook's intentions, it seemed innocent and rosy at the time. If you force every user to use nothing but their real name, you can reduce anonymous discourtesy and online bullying. For victims of abusive relationships, stalking and hate crimes, though, it was just another way for people to track them down and harass them. Facebook only relaxed its strict 'real names only' policy in 2015 – and only after facing blistering criticism from the transgender community.

Playing With Our Emotions

Facebook has allowed both internal and external researchers to conduct studies on its users. This research was meant to lead to academic discoveries in the field of sociology. Sometimes, though, these researchers haven't been content with merely observing us and have conducted experiments instead. For instance, in 2012, 689 000 users were exposed to predominantly positive or negative posts in their feeds. Facebook then studied their status updates to ascertain whether the emotions were contagious. Upon publication of the research, Facebook was greeted with enormous uproar. Exposing emotionally vulnerable teenagers and people suffering from depression to sad posts – on purpose? Many users weren't too forgiving.

For all of Facebook's faults, it can still be a valuable tool – and even fun. If Mark Zuckerberg could only spend less time apologising and more time fixing the actual problems, who knows? Be that as it may, Facebook and Zuckerberg still have moral obligations to society. Clean up the mess you made. Stop leaving the power tools lying around with the safety off. Some of our readers may want to get rid of Facebook altogether. Check out our follow-up article on how to go about it.