DAVID BELLARD


Why I Quit Facebook

This is a long blog post, so for those that just want the executive summary, here it is: Facebook is a shitty company with a shitty product, and I stopped using it.

I didn’t take a break or deactivate my account; I deleted it permanently, with all the force my index finger could muster on the return key. I wanted my data erased from the Faceborg and out of the hands of the sociopaths in their C-suite offices.

I realize Facebook knows almost everything about me because I foolishly gave them every detail of my life over the past decade, but I don’t trust Facebook in any way, shape or form anymore, and I’m done. For the past five years, it seems that every month brings a new, more nefarious revelation about Facebook, and just as predictably, when they are caught they stonewall, lie, obstruct and spread disinformation about their unethical practices. They routinely violate users’ privacy (intentionally and through security breaches), and they willingly work with the shadiest political operations and authoritarian regimes around the globe to advance regressive agendas. They have more of your personal data than you have yourself, and that’s a potentially very dangerous situation for any person.

I’m exhausted by social media, and I’m convinced Facebook in particular will become something more overtly malicious in the future, whether on its own or in conjunction with some other entity. The harmonious technological utopia the internet was supposed to bring has instead turned out to be a chaotic, dystopian anxiety machine that accelerates tribalism, racism, and nihilism, while rewarding toxic individuals with power and fame. Social media, particularly Facebook and Twitter, is the engine of that machine, and quitting Facebook was a conscious step towards protecting my data and, maybe more importantly, towards better mental health.

It was hard to contemplate quitting something that had become part of my everyday life, but after a few months of reflection, I felt like I had six good reasons to quit…

To understand why Facebook’s data collection is a problem, start with the knowledge that a data point is any piece of information I put online or that is recorded or collected from me. This can be an online search, purchase, subscription, comment, site visit, photo, date, or even a period of inactivity. Now let’s assume that with 50 objective demographic data points – gender, race, eye color, birthday, home address, marital status, and so on – you can know almost everything about an individual’s physical presence in the world. With fewer than 50 data points, an individual can be located and identified out of 6 billion people; intuitively, each yes/no attribute roughly halves the pool of candidates, so a few dozen are enough to single one person out of billions. In fact, researchers at MIT found that 90% of the individuals in a pool of 1.1 million credit card users could be uniquely re-identified from just four of their transactions!

Facebook has, on average, an astonishing 30,000 to 50,000 data points on each one of its 2 billion users! If you only need 50 data points to construct a physical identity, imagine what could be done with 30,000: what you ate, what clothes you wore, who your friends were, what type of person you were attracted to, what made you upset, how frequently you did things, and so on. These data points are a road map of your emotions, thoughts, biases, ethics, politics, beliefs, and all the intangible pieces created by your mind, heart and soul that make you an individual. A clever person with that much information could manipulate you in ways you didn’t even realize were happening. A clever organization with that much information about billions of people has power and leverage over them that is staggering to comprehend.

It’s a dangerous thing for a corporation to know everything about you. Facebook knows the members of your family, your social network, your enemies, your thoughts, opinions, and preferences. They have an entire visual library of your life through the photos you share and they can connect other users in your photos, whether they are your friends or not, through facial recognition technology. They know the events you’ve been to, who you date, and who you voted for. They know when you get up in the morning and when you go to bed. They’ve analyzed and cataloged everything you’ve ever done on their platform, and even everything you do on other websites. With those 30,000 qualitative and quantitative data points, Facebook and their clients can manipulate your emotions, predict your actions, and change or reinforce your beliefs. This is the business of Facebook: knowing and predicting you. It’s the product they sell to companies like Cambridge Analytica and you don’t know what they’re doing with your data or who they’re selling it to.

The basic agreement with Facebook seems straightforward: you can instantly share anything you want with your network and stay connected to everyone around the world, in exchange for agreeing to be marketed to using the information in your profile. Every company does market research and needs to know something about you, or at least about its target audience in general, but Facebook is not simply marketing to you. They’re actually doing intense psychological research and analysis with your data to build an entire psychographic profile of you.

Examples of “Dark Ads” produced by Cambridge Analytica to influence elections in Kenya and the U.K. The image above is from the C4 undercover investigation that exposed Cambridge Analytica’s illegal activities.

Facebook’s position: what they do with this psychographic profile is their business, you see, because you “signed” the agreement and voluntarily input data into their research tool (a.k.a. the feed) every day, and the extent of that research, and how they profit from it, is no business of yours. Obviously, every corporation uses research to improve its business, but Facebook is different and should be more closely scrutinized. It is the world’s largest media platform, and with personal data on a third of the world’s population, it is one of the most powerful corporations in the world. The secrecy around how and why they conduct research has a lot of people raising alarm bells about the lack of oversight, and the lack of transparency around their data collection has some politicians calling for government regulation of the corporation.

A dark ad created for Donald Trump’s campaign by Cambridge Analytica during the 2016 presidential election.

One variation of Cambridge Analytica’s anti-Hillary dark ads that showed up in Facebook feeds during the 2016 election. Steve Bannon, who ran Trump’s 2016 campaign and later served as his chief strategist, was one of the founders of Cambridge Analytica.

An example of a Facebook group and advertisement created by Russian intelligence during the 2016 election. Intelligence agencies around the world use Facebook to conduct covert disinformation and psyop campaigns against users.

There’s good reason for users to be alarmed at the lack of transparency regarding Facebook’s business. A research paper reported by Forbes this year details how Facebook ran psychological experiments on users to determine how emotions can be spread via social networks, published in 2014 as a study of “massive-scale emotional contagion through social networks.” The results of that research are the stuff of dystopian science fiction: Facebook has so much psychological data on its users that it can significantly influence behavior by predicting your emotional reaction to messages. That the research was completed by 2014 means it was likely the basis for how Cambridge Analytica used Facebook to spread disinformation and manipulate voters, many of them in authoritarian countries, over the past five years.

It’s important to know that this is not “marketing” in action, or good business practice. This is highly unethical and dangerous: massive psychological abuse perpetrated by an unaccountable corporation. Not one user knew they were part of an experiment, because Facebook did this without users’ consent. Facebook brushed off this appalling lack of ethics with boilerplate PR speak about the user agreement everyone accepts when creating an account, and they paid no price for it. As the Australian futurist writer Mark Pesce summarized the incident:

“Presumably everything is going on at Facebook as before this revelation, with no indication that any of its business practices have changed. Never transparent about how it applies the information supplied by its two billion monthly users, Facebook had been caught red-handed: exploiting the weak spots of teenagers in their moments of greatest vulnerability, watching, waiting then delivering a targeted message at the moment of maximum impact. In another context, this could be read as something akin to torture or brainwashing. For Facebook, it was just business as usual.”

Regardless of your stance on personal data privacy, the implications of Facebook selling access to millions of users to psyop firms like Cambridge Analytica (or any other firm with enough cash) are startling given the political reality we deal with today. Authoritarian regimes all over the world know social media is the best tool to spread their messages and disinformation, and Facebook is willing to turn a blind eye to this hijacking, or worse, become a willing partner in the rise of this darkness if the price is right.

It’s a given that any website collects your data while you’re on the site, but did you know Facebook collects your activity data from every website that uses the Facebook “share” or “like” buttons? The fact is, it’s impossible to be a “casual” Facebook user – as long as you have a Facebook account, they will monitor almost all of your activity online, even when you are not logged into Facebook!

Facebook’s vague language regarding its off-platform data collection.

The most obvious off-site data collection happens when you sign into a site using your Facebook ID, an option offered by many of the largest online retailers and services. Although the privacy issues and security breaches associated with third-party logins have been widely publicized since 2015, the practice is still popular with users who are unaware of the danger. To its credit, Facebook has made it easy to disconnect or limit the apps connected to your account through third-party login; however, like most of Facebook’s privacy controls, the onus is on you to “opt out” instead of Facebook asking you to “opt in.”

Even if you don’t use Facebook to sign into other sites, Facebook is still following almost all of your online activity. Every website that uses Facebook buttons on its pages is contractually obligated to share data with Facebook, and vice versa. When you visit CNN every morning to read the news, Facebook knows you visited, at what time, which articles you read, and how long you read them. If the next website you visit shares data with Facebook, and it most likely does, Facebook can track your entire online experience in real time. Isn’t that wonderful? There is no way to opt out of this, and the implication is stunning: Facebook can track almost all online activity of 2 billion people every second of the day.

A screenshot of the Onavo app that prompted Apple to ban it and shut down Facebook’s iOS access for a day. There was no indication it was spyware or that it was owned by Facebook.

Facebook’s off-platform data collection does not stop at websites: they have also been caught launching fraudulent apps and collecting sensitive health and personal information from users through apps downloaded to their phones. In February 2019, an investigation by the Wall Street Journal discovered that 11 different apps were sharing sensitive user data, including weight, blood pressure and ovulation status, with Facebook, and that Facebook could access the data even when the user was not signed in to Facebook.

In another scumbag move, Facebook was recently caught paying people, mainly teenagers, to download a spyware VPN disguised as a research app called Onavo, which allowed Facebook to monitor their phones in an attempt to gather data on competitors such as Amazon. Spyware of any type is a direct violation of Apple policy, and the app was blocked after an investigation by techcrunch.com, with Google Play eventually shutting it down as well. But this policing is the exception, not the rule. Tech giants like Apple, Google and Amazon are happy to participate in the data-sharing agreements they have with Facebook, and they also use your data to target-market products to you. The difference is that none of the other tech giants has anywhere near the personal data about you that Facebook has, and so far none of them has been shown to use your data in the highly unethical ways Facebook does. That may change in the future, but right now Facebook seems to be the one willing to cross ethical and moral lines.

These are just a few examples of the things Facebook has been caught doing. Imagine what we don’t know. Bottom line: in exchange for the ability to see what’s going on in other people’s lives, you consent to the recording of your life, on and off the site, to be used in ways you have no control over. That’s an absurdly unbalanced relationship and the product is just not worth it.

I won’t go deep into the history of Facebook and Zuckerberg, but it’s good to remember Facebook started life as a weird “hot or not” revenge website he built at Harvard because ladies were seemingly too creeped out to hang out with him, and he allegedly plagiarized the idea from one (or more) of his techbro friends. He’s settled several lawsuits to that effect, but some of his former colleagues are still spilling the beans about Zuckerberg’s questionable ethics.

Mark Zuckerberg, the Chairman and CEO of Facebook, is worth an estimated $55 billion.

With the ability to manipulate the emotions of 2 billion people around the globe, Facebook is an undisputed leader of the techno-oligarch pack, and that power is concentrated entirely in the hands of one man. He’s ranked number 10 on Forbes’ World’s Most Powerful People list, and he’s only 35 years old. He controls the majority of Facebook’s voting shares, maintains tight control over its operations, faces no meaningful board oversight, and operates with virtually no government regulation of his empire. Through his ownership of Facebook, Instagram and WhatsApp, three of the most popular communication tools on the planet, he has unprecedented power and influence over our emotions and choices. It’s impossible to predict what the world will become in the 50 or 60 years Zuckerberg has left, but it’s safe to assume he’s not going to stop collecting data. And going by his own words, it’s also safe to assume he’s not going to use it to benefit mankind in some altruistic way. As Emily Bell put it in the Guardian, “every time he articulates his plans, it is like reading the nightmarish college application of an accomplished sociopath.”

Both Zuckerberg and his top lieutenant, Facebook COO Sheryl Sandberg, are products of Harvard’s “leadership” industry, and both are proven liars, having been caught time after time lying to government inquiries and the media, all the while having the gall to issue public statements proclaiming their determination to make Facebook safe for its users. After their complicity in the disinformation campaign run by Russian intelligence on their platform was made public, Zuckerberg and Sandberg refused for weeks to accept any responsibility for the site’s lack of content control or their amoral leadership.

Zuckerberg and Sandberg are polarizing figures in our C-suite worshipping culture. Though Sandberg is sometimes lauded as one of the best business leaders in America, her leadership style has been revealed as anything but transparent, authentic or ethical. As observed by Duff McDonald in Vanity Fair’s profile of Sandberg, “A true leader would not have had to write a post defending herself in light of her company’s hiring of a P.R. firm, Definers, that leveraged anti-Semitic conspiracy theories about George Soros to deflect attention from Facebook’s own missteps. A true leader would not have spent five whole days staying silent after The New York Times reported on Cambridge Analytica’s access and exploitation of Facebook user data in March 2018, only to later claim that she and Zuckerberg had previously asked the source of that leak to destroy said data but had failed to confirm that they had done so.”

Facebook was an enthusiastic partner to Cambridge Analytica as it manipulated voters in over 30 countries and 100 elections. In addition to being a willing partner in CA’s global disinformation campaigns and ignoring Russian intelligence psyops, Facebook’s platform was weaponized by ultranationalists in Myanmar to incite violence against the Rohingya minority, and when this was discovered, Facebook did absolutely nothing to stop it. Facebook has been linked to violence in the Philippines, Libya, Germany and India, and it seems Zuckerberg prefers to watch the results rather than act responsibly, because the company continues to do nothing to counteract this weaponization.

The reasons to quit are also related to my emotional and mental health. Facebook used to be one of the first things I looked at every morning on my cellphone, and one of the last things I read at night before going to bed. I actively posted something to my feed at least once a day, usually more. I was so addicted to Facebook that until just a few weeks ago, I visited the site 10 times a day on average. I scrolled through Facebook like I was flipping through channels on cable TV back in the ’90s.

Doing anything 10 times a day is undeniably obsessive behavior, and I have no problem calling my Facebook habit an addiction. It was the app I would reflexively open when I reached for my cell phone, itself an almost involuntary action, and on my laptop I would open a new browser window to visit Facebook while in the middle of reading something else. In the months leading up to leaving Facebook, I struggled to understand what made it so addictive.

Richard Seymour at the Guardian likened Facebook to a slot machine, with users compulsively pulling the feed down like a slot arm to refresh it, hoping to win something new worth responding to. Though several studies suggest Facebook and other social media sites tap into the parts of our brains most associated with addiction and impulse control, whether the compulsion to use social media counts as an addiction is still up for debate.

For me, using Facebook definitely felt addictive, similar to smoking cigarettes. Facebook became part of my life, and the thought of quitting was akin to taking away part of what made “me” me. Sure, I was able to go hiking in the forest for a weekend or travel to places with no internet for days and be fine, but taking a break for a few days is not the same as quitting forever, which is scary, and that fear of quitting is a big reason why compulsive Facebook use is an addiction. There’s a fun, social aspect to it in the beginning, but the longer you do it, the more it becomes a conditioned habit with very little pleasure attached, and quitting is hard. The very idea of quitting comes with real physiological reactions and emotional angst; the question “What will I do if I can’t do that anymore?” is one every addict asks before deciding to quit. The fact that I was asking it about Facebook made the addictive nature of my relationship with Facebook obvious.

Everyone agrees that the Facebook algorithm is terrible, right? Starting around 2012, Facebook ramped up advertising in the feed and began aggressively filtering friends’ posts. Unlike the fun, early days of Facebook, when the feed was chronological and showed all content from your friends, your feed is now curated by an adaptive algorithm that becomes more arcane every year, to the point that even Facebook’s own clients don’t really understand how it works.

Facebook does give you, the user, a limited amount of control over this algorithm, allowing you to pick 30 favored users to “appear first” in the feed, which ostensibly means that posts by these people will always show up. I found this to be false. I missed many posts from my favorite 30 users, and instead regularly saw more posts from people I rarely interacted with. Users can also opt out of paid content, but only after it appears in their feed. Beyond that, almost nothing you see on Facebook is up to you. Yes, you get very easy access to the lives and thoughts of family and friends with a quick click, but that access comes at a price. Is it worth giving Zuckerberg and his empire access to your life, history, and opinions in exchange for an ad every fourth post, wedged between kids’ birthday pics from a high school friend you never talk to in real life?

Frankly, Facebook is not fun at all anymore. Few people actually post anything written by themselves, tell jokes or share an opinion. Many people’s activity is reduced to sharing memes or articles, or worst of all, political content – and sadly I was one of the most prolific political ranters on Facebook. While I don’t regret my political or social positions, I’m definitely not proud of the way I constantly posted news articles and opinion pieces. The absurdity of the political haranguing became obvious to me a few weeks before I quit, when I started wondering why Facebook brought out such a compulsion to push political opinions. Pre-Facebook, what kind of crazy person would read a magazine article, make hundreds of copies of it, then mail them to every friend, coworker, and casual acquaintance they knew? And what kind of crazy person would do that multiple times a day? I’m not like this in real life, but Facebook provided a megaphone I couldn’t resist and I started shouting. This is part and parcel of my addictive relationship with Facebook. Constant political haranguing is annoying and I’m sure a lot of people unfollowed me – if you’re reading this and are one of the people I annoyed with all the political sharing, SORRY!!

In the grand scheme of it all, Facebook didn’t make me sign up back in 2008, and they provided a service which I eagerly used for more than a decade. In pure consumer terms, I used to love the product, but it’s shit now, it has been for years in fact, and it’s made by people I don’t trust. I’m no longer happy with the transaction and I’m going to stop using it - it’s not like they care, or that I need a Facebook profile as a requirement to get a job or participate in the economy in some way… yet. As I said at the top, it should be such a simple thing that hardly warrants a mention, but in a world where almost everyone in my social circle uses it, I felt the need to explain.

Although I quit Facebook, I’ll continue shopping online, using Google searches and maybe even posting photos to Instagram, which I know is owned by Facebook. While Google and Amazon also collect a lot of my data, they don’t have anywhere near the amount of personal data Facebook has, nor have they been caught in such rampant unethical activity. Unless I want to live in the woods Grizzly Adams style, there’s really no way to avoid giving up data online. What I can control is whether I keep using platforms like Facebook that continue to push the ethical boundaries of their product.

ON A PERSONAL NOTE: I want to thank everyone who is on my email list and takes the time to visit my blog and website. I really appreciate you taking the extra step to stay in touch. It’s a reminder of how we communicated in the world before Facebook. I’ll be blogging here more regularly and posting more content to my Vimeo and Soundcloud pages as well, so if you enjoy my work, please visit those pages. If you haven’t already, you can subscribe to the newsletter here.

I hope to keep in touch with you all and see you in the real world sometime soon!