Sunday, June 29, 2014

Tech culture versus social mores

Facebook manipulated some of its users' news feeds as part of an internal research study. What really got people brassed off — people other than the researchers, anyway — was that Facebook didn't get permission from the affected users.

In the course of discussing the study and the reaction to it, one of the researchers, Adam Kramer, wrote (on Facebook, of course):

Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused.
  1. Nobody thinks the goal was to upset people. To disclaim that non-goal is pointless.
  2. You clearly don't understand why people have concerns, or you would have issued a genuine apology.
  3. Being sorry for the way the research was described is not the same as being sorry for the research itself.
It's rare to see cluelessness about human nature this extreme — except in high technology.

That's why the phenomenon of "social media" has been so delightfully weird for me to watch unfold. Some of the most socially unaware yet high-functioning people in our society are crafting the ways we interact with one another through technology. It's a standing wonder that the idea of "social media" has succeeded so well.

For your benefit, Adam Kramer, let me spell things out unambiguously:

Subjecting people to experimentation without their consent is WRONG. Human beings are not lab rats. Experimenting on people without their knowledge is unethical, even if you don't commit Mengele-inspired atrocities. And before you object, deliberately messing with the data they consume is experimenting on them.

That it never occurred to you your "research" was unethical speaks volumes about your and your colleagues' worrisome ignorance of the norms of human society. That you still seem befuddled by the uproar tells me your authority to conduct "research" must be revoked, and must not be returned until you demonstrate you understand at least the legal rules that govern research on human beings. (It's too much for me to hope you'll actually become acquainted with the subtler, more complex and thoroughly unofficial rules of human behavior.)

Facebook's management is also partly responsible for this fiasco: apparently nobody in charge understood, or cared, how wrong this research was. The appropriate punishment is for users to abandon the service unless the company owns up to what a colossal violation of human decency this "study" was.
