Why worry about online privacy?

Some of my reasons are likely not ones you would share, but the rest should be. When I was 18 I was diagnosed with mild anxiety and mild paranoia. This was caused by childhood trauma that included six years of bullying at a small private school, where there were times when literally everyone in the class was out to get me, and times when there was potentially lethal risk to my safety (choking). While I generally do a good job of maintaining enough skepticism to keep my paranoia in check, it does make me more sensitive to some of these issues than someone without my particular psychology would be.

In 2014 Facebook published a study about whether they could change people’s moods. They found, when they ran the research in 2012, that they could with minor tweaks to the algorithm. But they hadn’t included in their privacy policy that user data could be used in research until right after the study was conducted, and they did not try to solicit any consent from the users involved. When I saw this study, I thought back, and it seemed like that was about the time I stopped enjoying Facebook and started spending less and less time there.

YouTube has been found to radicalize people, turning Bernie Bros into the Alt-Right and fueling the Flat Earth movement. I know I’ve seen my own recommended videos, while never going to those extremes, start down some weird paths. For example, watching history videos led to recommended videos about the Boer War, which led to more history about South Africa, which led to videos about population genetics, and since I knew what was coming next I took a break from YouTube and stopped clicking on even the history videos. Camping videos have done something similar, with the recommendations slowly drifting towards preppers one time and towards van life propaganda another.

Then there’s QAnon, which started on 8chan and then spread largely thanks to Facebook. Much like Flat Earth, it’s a grand conspiracy that encompasses almost all other conspiracy theories. Once people buy into all conspiracy theories at once, they are operating under a vastly different set of facts about the world than everyone else. In other words, they’re creating an alternate reality that is being spread like a virus by the algorithms.

Unfortunately these algorithms are not fully understood even by the developers responsible for them. They rely on machine learning, iterating into better versions of themselves with the goal of keeping you on the website as long as possible so you can be shown more ads. In theory this is benign, but in practice it naturally favors more sensational and extreme content. These companies have enough data on people to create very precise psychological profiles, which the algorithms can use to manipulate people’s moods and serve ever more sensational content to keep them on the site. Without that data, they couldn’t create the profiles; without those profiles, they couldn’t manipulate the person; and without that manipulation, they couldn’t maximize ad revenue.

It would be bad enough if it were just amoral algorithms using our data to manipulate us, but we know that Cambridge Analytica used games and quizzes to harvest data from millions of Facebook users — both the people using the apps and all of their friends. This data was then used by Russia to create personally targeted misinformation campaigns to interfere with the UK’s Brexit referendum and the 2016 US presidential election. This manipulation helped tip the Brexit referendum over to the Leave side and the 2016 US election to a Trump victory. It was done by firing up one side while turning the other side against itself, and in the process polarizing everyone against everyone else. Of course they haven’t stopped: they tried to interfere in the 2018 and 2020 US elections and in various European elections, and they fueled resistance to pandemic mitigation efforts.

When you tie all of this together, the risk is not that our data can be used to better sell us useless crap; it’s that we’ve clearly reached the point where bad actors and algorithms created by greedy corporations can use our data and psychological profiles to radicalize us, create alternate realities, and manipulate voter behavior. It’s textbook brainwashing, and I’m not sure which is scarier: for it to be done by a hostile foreign power, or by a domestic company just wanting to sell more ads.

We need to be aware of these efforts and resist them when they’re employed against us, but we also need to protect our privacy so the algorithms can’t be as precise in how they target us. In other words, if Google or Facebook knows you better than you know yourself, then their algorithms can brainwash you, even inadvertently, far more easily.
