just because we can, doesn't mean we should


by: danah boyd

Learning to moderate desires and balance consequences is a sign of
maturity. I could eat only chocolate for all of my meals, but it
doesn’t mean that I should. If I choose to do so anyhow, I might be
forced to face consequences that I will not like. “Just because I can
doesn’t mean I should” is a decision dilemma and it doesn’t just apply
to personal decisions. On a nation-state level, think about the cold
war. Just because we could nuke Russia doesn’t mean that we should’ve.
But, just like with most selfish children, our nation-state thought
that it would be infinitely fun to sit on the edge of that decision
regardless of the external stress that it caused. We managed to grow up
and grow out of that stage (although I would argue that our current
leadership regressed us back to infancy).

I am worried about the tech industry rhetoric around exposing user
data and connections. This is another case of a decision dilemma
concerning capability and responsibility. I said this ages ago wrt Facebook’s News Feed, but it is once again relevant with Google’s Social Graph API announcement.
In both cases, the sentiment is that this is already public data and
the service is only making access easier and more efficient for the end
user. I totally get where Mark and Brad are coming from on this. I
deeply respect both of them, but I also think that they live in a land
of privilege where the consequences that they face when being exposed
are relatively minor. In other words, they can eat meals of only
chocolate because they aren’t diabetic.

Tim O’Reilly argues that social graph visibility is akin to pain reflex.
Like many in the tech industry, he argues that we have a moral
responsibility to eliminate “security by obscurity” so that people
aren’t shocked when they are suddenly exposed. He thinks that forcing
people to be exposed is a step in the right direction. He draws a
parallel to illness, suggesting that people will develop antibodies to
handle the consequences. I respectfully disagree. Or rather, I think
that this is a valid argument to make from the POV of the extremely
healthy (a.k.a. privileged). As someone who is not so “healthy,” I’m
not jumping up and down at the idea of being in the camp who dies
because the healthy think that infecting society with viruses to see
who survives is a good idea. I’m also not so stoked to prepare for a
situation where a huge chunk of society are chronically ill because of
these experiments. What really bothers me is that the geeks get to make
the decisions without any perspective from those who will be
marginalized in the process.

Being socially exposed is AOK when you hold a lot of privilege, when
people cannot hold meaningful power over you, or when you can route
around such efforts. Such is the life of most of the tech geeks living
in Silicon Valley. But I spend all of my time with teenagers, one of
the most vulnerable populations because of their lack of agency (let
alone rights). Teens are notorious for self-exposure, but they want to
do so in a controlled fashion. Self-exposure is critical for the coming
of age process – it’s how we get a sense of who we are, how others
perceive us, and how we fit into the world. We expose ourselves during
that period in order to understand where the edges are. But we don’t
expose to be put at true risk. Forced exposure puts this population at
a much greater risk, if only because their content is always taken out
of context. Failure to expose them is not a matter of security through
obscurity… it’s about only being visible in context.

As social beings, we are constantly exposing ourselves to the public
eye. We go to restaurants, get on public transport, wander around
shopping centers, etc. One of the costs of fame is that celebrities can
no longer participate in this way. The odd thing about forced exposure
is that it creates a scenario where everyone is a potential celebrity,
forced into approaching every public interaction with the imagined
costs of all future interpretations of that ephemeral situation. This
is not just a matter of illegal acts, but even minor embarrassing ones.
Both have psychological costs. Celebrities become hermits to cope (and
when they break… well, we’ve all seen Britney). Do we really want the
entire society to become hermits to cope with exposure? Hell, we’re
doing that with our anti-terrorist rhetoric and I think it’s fucking up
an entire generation.

Of course, teens are only one of the populations that such exposure
will affect. Think about whistleblowers, women or queer folk in
repressive societies, journalists, etc. The privileged often argue that
society will be changed if all of those oppressed are suddenly visible.
Personally, I don’t think that risking people’s lives is a good way to
test this philosophy. There’s a lot to be said for being “below the
radar” when you’re a marginalized person wanting to make change.
Activists in repressive regimes always network below the radar before
trying to go public en masse. I’m not looking forward to a world where
their networking activities are exposed before they reach critical
mass. Social technologies are super good for activists, but not if
activists are going to constantly be exposed and have to figure out how
to route around the innovators as well as the governments they are
seeking to challenge.

Ad-hoc exposure is not the same as a vaccine. Sure, a vaccine is a
type of exposure, but a very systematically controlled one. No one in
their right mind would decide to expose all of society to a virus just
to see who would survive. Why do we think that’s OK when it comes to
untested social vaccines?

Just because people can profile, stereotype, and label people
doesn’t mean that they should. Just because people can surveil those
around them doesn’t mean that they should. Just because parents can
stalk their children doesn’t mean that they should. So why on earth do
we believe that just because technology can expose people means that it
should?

On a side note, I can’t help but think about the laws around racial
discrimination and hiring. The law basically says that just because you
can profile people (since race is mostly written on the body) doesn’t
mean you should. I can’t help but wonder if we need a legal
intervention in other areas now that technology is taking us down a
dangerous ‘can’ direction.

Original Post: http://www.zephoria.org/thoughts/archives/2008/02/04/just_because_we.html