It's almost become a joke: Facebook makes a change to its privacy settings that opts you in to a bunch of scary stuff, the entire Internet flips out about it, it rolls back the change, and then a few months or years later, it makes the same or a very similar update, opting you in to it again. It would be funny, if it weren't getting so damned insulting.
Here's the latest. In the wake of its F8 conference the other day, Facebook rolled out a slew of changes aimed at transforming the Web into one giant conduit for Facebook data collection. And, as usual, the lofty discussions of a more "semantically aware" Web are based on the assumption that the Facebook-ized Web in question has access to most of the personal data of, hopefully, everyone in the world.
Let's be clear: I hold few illusions that Facebook's business strategy has ever been about anything other than building up a huge user base and then selling ads to those users. And obviously, the more targeted the ads, the easier it is to get people interested in them. But as the opportunities for data mining and targeting grow, Facebook faces a growing problem: how to get the data, if the users won't share it.
Facebook has created an unprecedented web (if you will) of connected users, with connections to other users who are more than willing to specify, in great detail, their interests, hobbies, and buying habits. The only problem? Those pesky private profiles.
Users tend to want to protect that data, at least a little bit, and at least some of it has to be "public," if it's to be used for the kind of behavioral targeting and, ultimately, ad targeting that really brings in the big bucks. And that is really the only explanation left for why Facebook has now gotten so shrilly insistent on you publicizing virtually every facet of your life. It's not about the user anymore, people (if it ever was).
Among other things, Facebook this week announced new "personalization" changes--the stickiest of them being Instant Personalization, which shares all your publicly available information (name, profile picture, gender, and "Connections," another new way for you to publicize all the things you're interested in) with, right now, three partner sites: Yelp, Pandora, and Docs.com. It's sticky because, as with most of Facebook's annoying new features, it's opt-out.
So, does that mean that I can't use the Pandora, Yelp, or Docs.com applications on my Facebook profile if I don't want to share all this information? Frankly, I can't tell whether I'm blocking them for all time or just blocking them from this one feature. Unclear. In fact, it feels intentionally confusing; I guess that I'd rather share than lose access to Pandora on my profile page, right? Everywhere I turn, Facebook's boxing me into overshare.
Plus, let's say I block Pandora, Yelp, and Docs.com: what about future partners that want to access this treasure trove of information? It's a very, very good bet that I'll have to opt out of those partners individually, as they're signed. It's an exhausting treadmill of privacy protection. I think that Facebook likes it that way: eventually, you'll get too tired to care.
Now, there does seem to be an answer in Facebook's byzantine maze of privacy settings, under the heading, "What your friends can share about you." This lets you control what applications and Web sites can learn about you from your friends. It's unclear how this might affect "Instant Personalization" (see how that pattern of confusion and obfuscation just rolls on?), but wow, hey, look at all the things that are prechecked here!
It's almost quaint that Facebook didn't precheck my relationship status or religious views. So, you're telling me that "Instant Personalization" aside, my friends could be sharing nearly everything I put in my profile or on my wall or anywhere else with any application or Web site they use? Huh. Good to know. Notice how there's no "uncheck all" option here?
Even the stuff you're not opted in to is part of an inexorable pull toward revealing as much as possible to as many people as possible. The newly announced Facebook Connections feature, for example, would like you to replace your boring old text list of likes and dislikes with "Connections."
If you like skiing, you can instead "like" a skiing page, and then, presto-chango, your love of skiing goes from being part of your profile that you can decide to make private to being an interest that is public. Period. And even if you make your connections invisible to your friends, you'll still show up as a connection on those pages. (Note: This feature has only been announced and doesn't appear to be rolled out yet. Hope springs eternal.)
Now, Facebook will argue that it's being as transparent as possible about these changes, as evidenced by its myriad blog posts and announcements whenever it rolls out a new feature. But I think that it knows the truth: most people will ignore those long, confusing posts and eight-section privacy statements. And I think that it also knows that its oh-so-transparent privacy settings are actually so involved, so click-intensive, so confusing, and so multilayered that they're essentially worthless.
Most users, minus those who now predictably freak out about privacy with every little change, will probably ignore their new settings or--and this is just so nefarious and sleazy--make a change that's essentially useless (like opting out of Instant Personalization without blocking applications or even spotting the screen that says your friends are still busy sharing everything you're "protecting").
This all raises the question: why? Why are the settings so confusing and multilayered? How do you benefit from having so much information about you made public in so many different ways? And how is it useful for users to have a Community Page dedicated to "skiing" that might have 100 million fans (oh, wait, they're not fans anymore...now they're, um, likers?), or at least so many that you could never really hope to find a skiing buddy listed there? The answer: it's all about the ads, people.
Don't get me wrong. It might actually be kind of cool to be bopping around the Web and have a list of the things your friends were interested in that's following you wherever you go. The idea of the Web as a living, breathing, social-recommendation engine--there's something to that. It's probably not personally threatening to me in any way to have Pandora know, from my Facebook profile, what kind of music I like. Those are all ideas that could be user-focused, offer tangible user benefits, and be presented in a fun, nonthreatening, opt-in way.
But since Facebook insists on opting me in to these features without my permission, and on opting in all of my friends, and on letting my friends share nearly everything about me by default on the sites and applications they use most (on top of everything they want me to share), it's pretty obvious that user desires are low on Facebook's priority list. What's high on its list is creating a massive data set that can be sliced, diced, and monetized until the cows come home.
There's nothing wrong with making a little money. Heck, there's nothing wrong with making a lot of money. But you should not, Facebook, get to make that money by tricking me into making personal information public, by creating an increasingly baffling web of privacy-violating loopholes, and by opting me in to every new moneymaking scheme you come up with. That's how you lose user trust, and losing user trust is how you lose users.
It might take a long time, and you might be feeling pretty cocky up there on top of the social-networking heap. But trust me: put users first, or you'll find yourself down there in the Friendster pile faster than you can say "Facebook Connect."