The Facebook Furor

A psychological experiment conducted by Facebook on nearly 700,000 unwitting users has sparked widespread outrage, with users accusing the company of manipulating them to advance its own interests. But such complaints miss the point.

AMSTERDAM – There has been a lot of fuss lately about the psychological experiment that Facebook conducted on nearly 700,000 of its users. In order to gauge how people’s Facebook “News Feeds” affect their moods, the company temporarily implemented a new algorithm to display slightly more positive messages to some users, and slightly gloomier ones to others. As it turns out, people’s posts shifted to reflect the tone of their friends’ posts.

But the furor missed some of the most interesting questions, focusing (as usual) on Facebook’s tone-deafness. Nobody seemed interested in the obvious question of whether the findings reflected a genuine shift in mood, or simply a desire – conscious or unconscious – to fit in.

What has people outraged is the notion that Facebook is manipulating its unwitting users to advance its own agenda, with many citing the secrecy surrounding the research as evidence of the company’s misconduct (though the company published the results with no apparent sense of unease). But, though Facebook’s lack of transparency is certainly disconcerting – as is its deafness to its users’ concerns – these complaints miss the point.
