Cambridge Analytica and Facebook Miss One Thing About Your Data
It came as no surprise to many that the data and content we so painstakingly crafted and curated over the past decade were taken and used against us. In the world of free social media networks, what really took us by surprise was what the real cost was and just how deep the data mining went.
Here’s Robinson Meyer’s (of The Atlantic) quick explanation of the Cambridge Analytica issue:
In June 2014, a researcher named Aleksandr Kogan developed a personality-quiz app for Facebook. It was heavily influenced by a similar personality-quiz app made by the Psychometrics Centre, a Cambridge University laboratory where Kogan worked. About 270,000 people installed Kogan’s app on their Facebook account. But as with any Facebook developer at the time, Kogan could access data about those users or their friends. And when Kogan’s app asked for that data, it saved that information into a private database instead of immediately deleting it. Kogan provided that private database, containing information about 50 million Facebook users, to the voter-profiling company Cambridge Analytica. Cambridge Analytica used it to make 30 million “psychographic” profiles about voters.
And that data was then used to craft advertising messages and target people, with political repercussions ranging from the Trump campaign and the Brexit vote all the way to far-flung little islands like my own, Trinidad and Tobago.
But data without context is much like a foreign-language movie with no subtitles: just a big, useless moving picture. While it may be scary to think that your data is being mined and harvested to then be used on you, it gets worse. The data wasn’t the ‘big’ anonymous data we thought it was. Social media firms and their consultants could drill down into it to assemble very detailed profiles on each of us, in a way that would make the pre-9/11 NSA very jealous. Here’s some info on a recent leak from a dating app, which highlights that data wasn’t harvested at an aggregate, overview level but on a very personal one:
For two years or so, users have been able to choose to declare their HIV status on their profile…This feature has been broadly praised as a way of reducing stigma for people living with HIV, and as a spark for useful conversations. It now turns out this information has been shared with two analytics companies, Apptimize and Localytics, to help the network optimise features within the app and the roll-out of new functionalities. Even worse, it was shared alongside other personal data, making users uniquely identifiable.
— Matt Stokes (New Statesman)
You Are Being Controlled
Morals aside, one would not reasonably expect that data shared with one app would be assembled into a profile on you, down to the GPS coordinates you share with the app, and then shared BY the app, or rather its parent company. That’s downright frightening. And if you think these ‘breaches’ or violations are only being committed by third parties and not the social media sites themselves, here’s a refresher from four years ago:
In 2014, over 600,000 users unknowingly participated in a psychological study done by Facebook itself (not an app developer) to test ‘emotional contagion.’ According to The Telegraph article by Harriet Alexander, the study was then published with the following results:
“When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”
Four years later, armed with the information from other such tests conducted on an unwitting public and trillions of gigabytes (petabytes? zettabytes?) of data gathered since then, that data is being turned on all the users of the networks. Keep in mind that Facebook currently owns Instagram as well as WhatsApp. That means Facebook has the ability to drill down into your messages on three networks, your communities, your content, your life. And it can use all of that information to shape your emotions and your decisions.
But there is a fundamental flaw in believing that we can be so easily shaped and shifted. These companies thrive on keeping you online and engaged, and they shape your psychology to do it. But they overlook the fact that you have free will. And there are ways to keep yourself from being consumed:
By spending less time on the grid, you can minimise the behavioural changes that come from being online all the time. Social-media-related depression is a real thing: the result of FOMO, of seeing everyone DOING something you aren’t, the result of an accumulation of things.
Would people be happier if social media never happened? Probably.
By assuming a strong position in your sense of self, by crafting a fastidious image of yourself FOR yourself and not for the masses on social media, you can become immune to the changes and influences of the networks. You have to realise that there is life outside the networks, and that life is the one important bit of data the networks miss. But they won’t miss it for long.
Facebook, for example, is already pressing ahead with patents that would help the company track your activity when you are offline: the pings of nearby wifi routers, your expected routes, the events you indicated interest in. Facebook wants to know what you are doing even when you are not on the network.
There is life outside the box. That’s the one thing these networks all miss. But it’s also the one thing they don’t want you to miss: your life, outside, with real people, in the real world. The longer you give them your attention, the more they can monetise you.
You’re being pimped out by Big Data and the worst part is — you’re just getting f*cked all the time and never getting a dollar for it.