Resisting Change, Automated

Users fear Facebook because it knows them so well — too well, when you get right down to it.

A recent news headline tapped into this fear when it reported that after fewer than 100 likes, Facebook knows you better than your spouse knows you. The headline exaggerated in the way that headlines often do, but it contained more than a grain of truth. What researchers actually found was that Facebook could predict a user’s future decisions and actions more accurately than the user’s closest friends and family members could.

We’ve been building up to this point for years. Five years ago, it startled most of us to learn that Facebook had a good chance of guessing that you were pregnant before you found out yourself. Facebook’s predictive capacity has only expanded since then. It has reached the point where ordinary users have reason to be wary.

These issues are in the news this month with the revelation that Facebook actively participated in throwing two elections in recent years: the EU referendum in the United Kingdom and the 2016 presidential election in the United States. It did so by displaying fake news reports to carefully selected emotionally vulnerable users. Some of the fake news placements may have been paid for by the Russian government, raising the issue of national sovereignty. But what of the issue of individual will? If Facebook knows when your weak moments are and knows how to press your buttons to get you to vote in a way that reflects neither your true opinions nor your best interests, then what other things can it get you to do? If you are on Facebook, is personal freedom no more than an illusion?

Just as shoppers eventually avoid stores that get too good at impulse selling, the more self-aware Facebook users have already taken to avoiding Facebook. Not many people have deleted their profiles, but users in general have become progressively less active on the site. For some, this means opening the app five times a day instead of 20, not such a big change but enough to matter. For millions, though, the slowdown has taken them from checking in on Facebook one or two times per day to one or two times per month. Even before this month’s scandal broke, Facebook was a short step away from losing its hold on a plurality of its users.

There is a cluster of related security concerns surrounding Facebook, and it helps to separate them so that we can take a look at all the risks. This month, the one risk that is getting the most attention is the reputational risk that goes with a company that has such a detailed record of your history. To put it mildly, Facebook is designed to embarrass you. Or to say the same thing in more ominous terms, Facebook could blackmail any one of us. And yes, that includes people like me who have never had a profile on Facebook. Facebook collects data just as systematically on Facebook users’ non-Facebook friends.

Reputational risk is separate from the emotional manipulation, or psyops (military-style psychological operations), that Facebook engages in. I mentioned the possibility that Facebook can control some of your decisions, but there is more to it than that. Part of Facebook’s manipulation of the 2016 U.S. election involved changing people’s emotions. It showed some users news items designed to infuriate them so that they would not be thinking as rationally as they normally would on the day when they voted. It showed other users news items designed to discourage and depress them. The plan was that these users would feel so hopeless and lethargic that they would be less likely to go out to vote on Election Day. There aren’t any good metrics on how successful this campaign was, but Facebook has proved it can do this kind of thing, so there is no reason to assume these efforts came to nothing. Facebook manipulates users’ decisions and their emotions, and it does so in a way that transfers power from individuals to moneyed interests.

All that is sensational and bad, but more insidious risks emerge when you combine Facebook’s knowledge of you with its ability to manipulate you. Possibly the most damaging quality is the way Facebook inherently resists personal change. I consider this to be probably the worst effect Facebook has on the world simply because it affects everyone who interacts with Facebook.

Personal change is difficult to begin with. As soon as you decide to make a change, you meet resistance from your own habits. As you start to overcome this obstacle, you meet resistance from people around you who expect you to follow your old habits. If you are on Facebook, you also meet resistance from Facebook and its advertisers, who expect you to follow all of your old patterns.

And what else could Facebook expect or want? All the data Facebook has collected on each of us is from the past. Its policy is to never delete any information it has on anyone. The older data becomes less and less relevant as time goes by, but all of its data can become worthless if we are able to grow and change in life.

Facebook doesn’t want its data to become worthless, so it has an unavoidable incentive to try to keep us from growing and changing.

Even if Facebook did not have this incentive, any targeted advertising, targeted based on data collected in the past, is a continual reminder of your personal past — and Facebook doesn’t particularly care if this is the recent past or the distant past. Are you a cancer survivor? For years afterward, Facebook will show you advertising designed to persuade you to get more cancer diagnostics and treatment. That’s useful targeting if you in fact need further cancer treatment, but it can be discouraging if Facebook still sees you as “Ms. Cancer” five or ten years after you have put that illness (or mistaken diagnosis) behind you. Have you been through a hard-fought divorce or a harrowing experience with human trafficking? Facebook won’t let you forget that either. For that matter, are you a recovered heroin addict? Facebook won’t hesitate to show you indirect reminders of how easily you could fall back into your old patterns — and remember, Facebook also knows when your weak moments are. In the worst case I can think of, are you in a witness protection program? Facebook will quickly connect your new identity with your old identity and will freely pass that information along to advertisers, which might include the very organizations trying to kill you.

The essence of this effect is that Facebook automates the resistance to change that you already get from the world around you. Worse, on Facebook, this resistance may persist for years, long after the people who know you have accepted you in your new and improved form.

However you are working to better yourself — improve your habits, eat better, lose weight, improve your health, change your job, gain the status that comes with a college degree — Facebook will make that change harder.

The desire to change personally clashes directly with Facebook’s business model. If you want to make yourself into something special, you can’t do that while giving very much of your attention to Facebook.

This has been an issue with Facebook from the very beginning, but it has broken through as a public concern only in the last two years as Facebook’s analytics (notably face recognition) have improved, its presentation has become more intrusive, and its historical record of each user has grown longer.

Users don’t have to know the economic incentives or psychological impacts involved to recognize that Facebook has gotten “creepy” or that going on Facebook “feels like a minefield.” Humans instinctively know when they are being tracked and followed. When something is instinctive, it doesn’t require strategy or deep reflection for observations to turn into action. You might have cut back on your own Facebook logons, vaguely sensing that something is wrong there, without realizing you are doing it.

But Facebook, which knows us better than we know ourselves, knows when we cut back. It’s already trying to adapt its look and feel so that we won’t all go away. It may find some partial successes along the way, but as I mentioned, there isn’t a way to reconcile Facebook’s business model with people who grow and change. For those people, Facebook can never be a constructive presence in their lives. Facebook cannot do so short of abandoning its business model, that is, and that is something I believe it will never do.

If you’re on Facebook, the first step is to realize that you’re not the only one having the problems that you’re having with Facebook. Everyone on Facebook faces the same issues, and you’ll notice that for a good fraction of your Facebook friends, the problems got so bad that they have already dropped off. The second step is to realize that there is no possible adjustment that will solve these problems within the context of Facebook. The embarrassments, the endless conflicts, the emotional manipulation, and most of all the resistance to change are built into the design of the platform. Seeing this can help you change the way you look at Facebook. It is not your friend. It is more like the guy on the corner who used to get you the stuff you were addicted to. If you can deal with Facebook on this level, it will still mess with you, but you’ll probably come out okay. If not, there is no immediate need to jump through all the hoops that Facebook has set up for users who want to delete their accounts. Just stop signing in. Then use your new free time and your newfound autonomy to think about how you want to grow and change.

Fish Nation Information Station | Rick Aster’s World | Rick Aster