by: danah boyd

I've been watching the public outcry over Facebook's Beacon (social ads) program with great interest. For those who managed to miss this, Facebook introduced a new feature called Beacon. Whenever you visit one of its partners' sites, some of your actions are automagically sent to Facebook and published on your News Feed. The full list of actions is unknown, although through experimentation folks have learned that they include writing reviews on Yelp, renting movies from Blockbuster, and buying things on certain sites.

Some partners were listed in the press release. When a Beacon-worthy action takes place, a pop-up appears in the bottom right, allowing you to opt out. If you miss it, you are automatically opted in. There was originally no universal opt-out, although Facebook has now implemented one (privacy - external websites - don't allow any websites). Furthermore, even if you opt out of having that bit blasted to the News Feed, that doesn't stop partners from sending the data to Facebook in the first place.

MoveOn started a petition, bloggers cried foul, and the media did a 180, going from calling Facebook the privacy savior to the privacy destroyer. Amidst the outrage, Facebook was also declared the Grinch when unassuming users failed to opt out and had their gifts broadcast to the recipients, thereby ruining Christmas. Privacy scholar Michael Zimmer also pointed out that the feature was peculiarly named, because beacons give warning when danger is about to take place. Not surprisingly, the company was forced to adjust. Zuckerberg apologized, and additional features were provided to let people manage Beacon. While this appeases some, not all are satiated. StopBadware argues that Facebook does not go far enough, and NYU Law Professor James Grimmelmann argues that Beacon is illegal under the Video Privacy Protection Act.

For all of Facebook's repentance, what really bugs me is that this is the third time Facebook has violated people's sense of privacy in a problematic way. I documented the first incident - the introduction of the News Feed - in an essay called "Facebook's Privacy Trainwreck." In that incident, there were no privacy adjustments until public outcry. The second incident went largely unnoticed. Back in September, Facebook quietly began making public search listings available to search engines. This means that users' primary photos are cached on Google alongside their names and networks. Once again, it was an opt-out structure, and finding the opt-out is tricky: under privacy settings, under search, there is a question of "Which Facebook users can find me in search?" If you choose "everyone," that includes search engines, not just Facebook users. The third incident is Beacon.

In each incident, Facebook pushed the boundaries of privacy a bit further and, when public outcry took place, retreated just a wee bit to make people feel more comfortable. In other words, this is "slippery slope" software development. From what I've learned interviewing teens and college students over the years, they have *no* idea that these changes are taking place (until an incident occurs). Most don't even realize that joining a geographic network makes them visible to thousands if not millions of people. They don't know how to navigate the privacy settings and they don't understand the implications. In other words, defaults are EVERYTHING.

Like most companies, Facebook probably chose the "opt-out" path instead of the "opt-in" path because they knew that most users would not opt in. Even if they thought the feature was purrrfect, most wouldn't opt in because they would never learn of the feature. Who reads the fine print of a website notice? This is exactly why opt-out approaches are dangerous. People don't know what they've been opted into by default. They trust companies, and once they trust those companies, they are at their mercy.

Most lofty bloggers and technologists argue that if people are given the choice, that's good enough. The argument is that people should inform themselves and suffer the consequences if they don't. In other words, no sympathy for "dumb kids." I object to this line of reasoning. Most people do not have the time or inclination to follow the fine print of every institution and website that they participate in, nor do I think that they should be required to. This is not simply a matter of contracts that they sign, but of normative social infrastructure. Companies should be required to do their best to maintain the normative sense of privacy and to require that users opt in to changes that alter that normative sense. In other words, what is the reasonable expectation of privacy on the site, and does this new feature change that? Of course, I also understand that this would piss companies off, because they make lots of money by exploiting everyday users' naiveté and shifting their sense of norms. Still, I think that the default should be "opt-in," and "opt-out" should be used only in situations that protect users (i.e., a feature that would limit users' visibility).

I kinda suspect that Facebook loses very little when there is public outrage. They gain a lot of free press and by taking a step back after taking 10 steps forward, they end up looking like the good guy, even when nine steps forward is still a dreadful end result. This is how "slippery slopes" work and why they are so effective in political circles. Most people will never realize how much of their data has been exposed to so many different companies and people. They will still believe that Facebook is far more private than other social network sites (even though this is patently untrue). And, unless there is a large lawsuit or new legislation introduced, I suspect that Facebook will continue to push the edges when it comes to user privacy.

Lots of companies are looking at Facebook's success and trying to figure out how to duplicate it. Bigger companies are watching to see what they can get away with so that they too can take that path. Issues of privacy are going to get ickier and ickier, especially once we're talking about mobile phones and location-based information. As Alison wrote in her previous post on respecting digital privacy, users are likely to act incautiously by default. Thus, what does it mean that we're solidifying the precedent that "opt-out" is AOK?
