Thursday, October 30, 2014

The Facebook Furor

AMSTERDAM – There has been a lot of fuss lately about the psychological experiment that Facebook conducted on nearly 700,000 of its users. In order to gauge how people’s Facebook “News Feeds” affect their moods, the company temporarily implemented a new algorithm to display slightly more positive messages to some users, and slightly gloomier ones to others. As it turns out, people’s posts shifted to reflect the tone of their friends’ posts.

But the furor missed some of the most interesting questions, focusing (as usual) on Facebook’s tone-deafness. Nobody seemed interested in the obvious question of whether the findings reflected a genuine shift in mood, or simply a desire – conscious or unconscious – to fit in.

What has people outraged is the notion that Facebook is manipulating its unwitting users to advance its own agenda, with many citing the secrecy surrounding the research as evidence of the company’s misconduct (though the company published the results with no apparent sense of unease). But, though Facebook’s lack of transparency is certainly disconcerting – as is its deafness to its users’ concerns – these complaints miss the point.

Of course Facebook is manipulating its users – just like all companies that use advertising to induce consumers to crave a double cheeseburger, a sexy dress, or a sexy partner. Whether it is done through targeted advertisements based on a search history or billboards on a public highway, the (intended) result is the same.

A century ago, this might have been big news. Today, it is mundane. Yet people continue to react to explicit revelations of such manipulation with shock and outrage.

The bigger problem is the modern paradox of choice. Today, people are constantly presented with choices – and also with the option to avoid them, under the guise of speed or convenience.

While the power to make one’s own choices is appealing in theory, the sheer number of options can be exhausting and disorienting – not least because of the pressure to make the “right” choice. As Barry Schwartz has pointed out, choices are an opportunity for regret. When forces beyond our control make us unhappy, at least we do not feel angry with ourselves for putting ourselves in that position.

The logical response to this pressure is to delegate some decisions to others. But, when reminded of how others are shaping our lives, we become indignant, calling it “creepy” and a violation of our free will. Users let Google filter the deluge of emails they receive daily, but they are incensed when Google weeds out an important message.

Likewise, when Facebook responds to complaints that users cannot keep track of all of their friends’ posts, it develops an algorithm to show users only the most relevant ones. But what qualifies a post as “relevant”? Twitter is now trying to solve the same problem.
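To make the question concrete: a feed’s “relevance” score is, in the end, just a formula that someone chose. Below is a minimal sketch in Python of what such a ranking might look like. Every field name, weight, and decay function in it is a hypothetical assumption for illustration, not Facebook’s or Twitter’s actual logic; the point is that each constant – comments counting double, older posts fading – is an editorial choice made on the user’s behalf.

```python
# A toy "relevance" score for a news feed. Everything here (field names,
# weights, the decay function) is a hypothetical illustration, not
# Facebook's or Twitter's actual ranking logic.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone


@dataclass
class Post:
    author_affinity: float  # hypothetical: how often the viewer interacts with the author (0..1)
    likes: int
    comments: int
    created_at: datetime    # must be timezone-aware to compare with `now`


def relevance(post: Post, now: datetime) -> float:
    """Score a post by engagement and affinity, decayed by age."""
    engagement = post.likes + 2 * post.comments  # assumption: a comment counts double a like
    age_hours = (now - post.created_at).total_seconds() / 3600
    decay = 1.0 / (1.0 + age_hours)              # assumption: older posts fade hyperbolically
    return (1.0 + engagement) * post.author_affinity * decay


def top_posts(posts: list[Post], k: int, now: datetime) -> list[Post]:
    """Return the k most 'relevant' posts, i.e. the feed the user actually sees."""
    return sorted(posts, key=lambda p: relevance(p, now), reverse=True)[:k]


if __name__ == "__main__":
    now = datetime.now(timezone.utc)
    feed = top_posts(
        [
            Post(author_affinity=0.9, likes=3, comments=1, created_at=now - timedelta(hours=2)),
            Post(author_affinity=0.2, likes=50, comments=10, created_at=now - timedelta(hours=30)),
        ],
        k=1,
        now=now,
    )
    print(feed)
```

Note that a recent post from a close friend outranks a far more popular post from a distant acquaintance – exactly the sort of trade-off a user never gets to see, let alone set.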

Some have said that the choices that Facebook makes for its users could endanger its users’ mental health. But so can an overworked high-school teacher who cannot dedicate the needed energy to troubled students; magazines that promote unrealistic body images; sermons from clergy who believe that God does not forgive everyone; or even a stranger acting rudely on a train. All of these actors have their own ideas and motivations, and they are all manipulating our perspectives – and our moods – every day.

Of course, with advertising, the manipulation is particularly overt. But marketers also regularly test users’ emotional reactions to less explicit aspects of their products, from the color of their packaging or their placement in the store to their celebrity spokesperson. And they lure consumers into paying more by offering an over-priced option that makes anything less expensive seem reasonable.

Reading about Kim Kardashian’s new dress may make you want to buy the dress, but does it also make you feel ugly? If you’re a tech entrepreneur, you may aspire to be Marc Andreessen, but reading about him may also make you feel inadequate – especially if you are female.

Facebook inadvertently raised this issue of emotional manipulation and unintended consequences in one context, by being secretive about its research and by not giving people the chance to opt out beforehand. (Now, that would have been a smart choice!)

But, in the end, there is no problem to be solved – or even an issue that concerns Facebook in particular. What we are seeing is a fundamental improvement in our ability to discover the short-term and long-term impact of our actions and those of others. (Global warming, anyone?) As our ability to measure things and detect the impact of any change improves – in short, as we get better at empirical research – we need to consider what kind of responsibility is entailed by the knowledge that we gain.

Comments


  1. Paul Peters

    The numbing down toward passive acceptance of such psychological abuse is highly disturbing. I have four embryonic business plans that could dramatically change this game, and I find it impossible to get seed funding to work them out, because 'hot' money seeks the biggest bang for its buck. It is not that there is no willingness, or that there is a lack of ideals and possible solutions; rather, through sheer quantity, this herd of gatekeepers crowds out most good alternatives from the funds they would usually have access to. Being right is not good enough anymore. The more abuse we accept, the more we numb down and dumb down, and the more we lose our humanity, reducing us to passive bystanders while some marketing team lives our lives.

  2. Ellie Kesselman

    Twitter provides multiple toggles for users to opt out of such relevancy filters. The options are quite granular: for personalized advertisements, for suggestions about users with potentially similar interests, and so forth. Similarly, Twitter users may choose not to have their location displayed, or the history of their locations retained. That is the ethical approach. Allow me to choose between relevance and serendipity!

    I am fully aware of Professor Barry Schwartz's findings about the "paradox of choice". I am a graduate of Swarthmore College; I learned psychology as a student in Professor Schwartz's classes. I don't recall that his coursework on choice and uncertainty would be applicable to the recent Facebook experiment.

  3. Ellie Kesselman

    No. That comparison is incorrect. Merchandise advertising (whether targeted or not) is very different from distorting the non-advertising content viewed by 700,000 individuals as part of a research study conducted without informed consent. I realize that research-subject consent was not legally required of Facebook. But to conduct an experiment on such a large scale, without consideration of the ethical implications, is disturbing. Manipulating the flow of informational content crosses the line into what is more accurately described as propaganda.

  4. Procyon Mukherjee

    Barry Schwartz has pointed out how the enormously funded marketing campaigns for prescription drugs (which seem to make no sense, since we cannot buy the drugs ourselves) are orchestrated to create the very cascade in which emotional isolation gives way to the overwhelming acceptance (almost like a contagion) that any repetitive promotional campaign leaves us with. Why do we bury the results that Facebook actually came up with, from an experiment whose findings hold for any network event or promotion within a network?
