If a friend asked you to try heroin right now, I assume your answer would be “No fucking way,” right? Or at least I hope it would be. And you’d be right to say no, because you know heroin is highly addictive, not to mention destructive to both your physical and mental health.
But what if that same friend asked you to try a drug that gives you a dopamine high similar to heroin’s, without the harsh physical & mental effects?
What if they told you that you were already using it? And, worst of all, what if they knew what it was doing to you and gave it to you anyway?
How would you feel about that friend, then? Would you still trust them?
Ok, so let’s say you smoke, and you call the plug to cop…
What if, during the transaction, said plug somehow found out your full name, phone number, spending habits, family members, favorite brands, and more? And imagine that as soon as you found out they knew, they promised never to use any of it against you or sell it to another dealer, but said they’d hold onto your information for future transactions.
Would you trust them? Why or why not?
For months I’ve been telling friends and family how detrimental technology can be to our mental health, and how invasive it is to our privacy in ways we typically don’t even think about. Social media has its uses, especially for growing businesses and networking, but the constant notifications and endless news feeds are designed to hijack our attention, while the incredibly long user agreements we sign prime us to give up personal data at a rate never seen before! The hard truth is that these products are both incredibly addictive AND predatory. And the harsh reality is that the tech industry has been well aware of this problem for some time and, quite frankly, they don’t care.
So let’s bring back the analogy from earlier:
A. A drug dealer will sell you heroin but typically won’t do it with you. Why is that?
B. Because it’s highly addictive, both physically and mentally, and it has adverse effects that they know will fuck you up (for lack of better phrasing).
C. Therefore, a dealer will sell heroin to you but won’t let himself or his immediate family use it. *RED FLAG*
Now, apply that same mentality to the tech industry, but keep in mind they have a multi-billion-dollar infrastructure.
A. Tim Cook sells Apple devices by the boatload, but he told USA Today that he doesn’t want his nephews to use them. Why? If these are normal, harmless products, it should be okay, right? That’s a big “if.”
B. Maybe he doesn’t want his nephews on iPhones because, deep down, he knows that Apple products are designed to create addictive habits that alter the way your brain works.
C. Therefore, Tim will sell you an iPhone but won’t let his own family use one. *RED FLAG*
Apply that same mentality to social media platforms with the same multi-billion-dollar infrastructure, and you’ll see why the Cambridge Analytica scandal was foreseeable, and why it’s the first (of what I’m sure are numerous) large-scale data scandals that we as consumers should care about.
This particular scandal is problematic because:
A. If we’ve already allowed tech platforms like Facebook to access our call history, text history, pictures, videos, etc., what else do they have access to? How long have they had access? And why do they even need access to begin with?
B. Though these platforms claim not to store any of this information, how can we be sure they destroy it? And by “destroy,” do they mean they permanently delete it, or that it sits on an old server in a trash folder? Or that they hand our data to a third party who then “destroys it” forever? (Yeah, right.)
C. The question of exactly what corporate responsibility entails in this context blurs more and more by the day.
Any entity that buries its actions in long user agreements and backtracks when we catch it red-handed cannot be trusted.
Chamath Palihapitiya was part of Facebook’s executive team and ran growth for quite a while. Recently, when asked about tech and how these platforms are “tearing apart the fabric of how society works,” he said:
“I can’t control them,” Palihapitiya said of his former employer. “I can control my decision, which is that I don’t use that shit. I can control my kids’ decisions, which is that they’re not allowed to use that shit.”
Again, he’s a former Facebook exec, billionaire, philanthropist, etc., yet he doesn’t use the app he helped grow. What’s equally interesting is that Facebook itself admitted in a recent community post that mindlessly scrolling the news feed can lead to feelings of depression and self-esteem issues. Yikes.
Sidebar: who knew that Facebook even conducted studies on how its platform affects our emotions and behavior? More important, how long have they known about the app’s detrimental effects? They would have had to know for a while in order to schedule and conduct a study, analyze the results, and then filter the findings into a long, media-palatable statement for the public. So if they sat on this without telling us, what else do they know that they aren’t telling us?
Sounds paranoid at face value. But it’s a bit odd, wouldn’t you say?
Connect the dots and you’ll start to see a pattern: our innate ability to pay attention to tasks, complete meaningful work, and practice mindfulness is slowly, yet permanently, being fragmented by social media and the constant streams of information available to us. It’s apparent every time our devices hijack us with another ping letting us know that somewhere, someone or something requires our attention.
To make matters worse, they know so much about us already that it’s incredibly easy for them to target us or sell our data to the highest faceless bidder, further perpetuating this predatory cycle.
Heavy, I know. But the solution is simple:
We need to get off our phones, even if just for a while.
Shoutout to the good folks at Grits & Gospel for collaborating with us on this one. More to come.
One.
-SNOBHop