
Special Edition Magazine, Spring 2020: Beyond the Techlash

Dopamine Capitalism

It is no secret that the digital revolution has brought with it many new forms of addiction, as users chase after social-media "likes" and other online stimuli. But less understood is the extent to which most of the tech industry now relies on behavioral manipulation to maximize profits at the expense of our wellbeing.

MENLO PARK – Here in Silicon Valley, it is an open secret that countless corporations and start-ups are working on ways to turn humans into robots they can control. More and more, the industry is focused not so much on technology as on what might be called “personality-disorder marketing.” Technologies are being created and deployed in recognition of the fact that all pleasures are more or less equal to the human brain, whether they originate with a win at the blackjack table, a line of cocaine, or “likes” on social media.

That’s why, over the past few decades, the powerful companies (and, in some cases, governments) that control the Internet have moved from accidentally or unwittingly creating human “robots” to knowingly doing so. Contrary to the usual warnings about artificial intelligence and automation, the biggest near-term threat to humanity is coming not from our machines, but from the people designing them.

Those shaping the current technological era have violated the public trust by choosing business models that are openly amoral or even immoral. Following in the footsteps of the tobacco companies and the casino business, they are consciously creating and fostering addictive behavior in the name of profits. In 2000, the average American spent 9.4 hours per week online; now, some estimates put that figure at 30 hours. And with the arrival of consumer virtual-reality devices and the Internet of Things (IoT), it is easy to imagine that we will soon spend 75% of our waking hours in virtual spaces designed to manipulate our behavior.

To be sure, “programmed” humans are nothing new. Throughout history, legions of soldiers have marched knowingly to their deaths, religious followers have accepted articles of faith without question, and consumers have purchased goods and services that they know they don’t need. In the 1930s, B. F. Skinner, the controversial Harvard psychologist, pioneered the field of behavioral analysis. Skinner believed that the “freedom” everyone treasures is in fact illusory; that everyone is actually controlled by subtle and complex rewards and punishments. This led him to conclude that a “technology of behavior” could be used to improve the human lot. Through a cycle of cue, activity, and reward, he developed “operant conditioning,” a method of shaping subjects’ behavior from which casinos have been profiting ever since.

What is different now is how efficient, pervasive, and sweeping the technologies of human manipulation have become. Institutional “controllers” (to use a term from the industry) have dramatically improved their control processes, perfected their feedback loops, and sharpened their sensing mechanisms, all to collect more information about our bodies, emotions, habits, and brains.

In part, the technology-driven transformation of humans into robots is a feature of the modern age. It has happened very quickly over the course of the last century, starting with Frederick Winslow Taylor’s use of a stopwatch to time assembly-line workers, and evolving at breakneck speed in the last 40 years. Now, with the IoT’s spread, governments and corporations will be able to use an ever-wider array of information and tools – most of which will be invisibly embedded in the world around us – to control individual and collective behavior. Owing to rapid advances in facial recognition, our emotions will increasingly be an open book, subject to new, subtle forms of influence.

Moreover, the basic costs of turning people into robots have fallen significantly. Rather than luring people into casinos or getting them hooked on cigarettes, the leading tech platforms, which already have access to almost everyone who owns a smartphone, need only apply the relevant behavioral science when designing their products and services for addiction.

It doesn’t have to be this way. But first, policymakers, business leaders, and ordinary citizens must recognize the magnitude of the problem. The engine of behavioral manipulation runs on personal information, much of which we hand over willingly. Ask yourself how often you refuse “freeware” like Gmail or Facebook, each of which puts all of your online actions and secrets directly into the hands of distant corporations. Did you know that your purchase of a candy bar at the store yesterday will be cataloged across hundreds of servers and thousands of institutions around the world over the next few weeks?

For now, we remain free. It is our right to choose whether to participate in the manipulation, and to demand more information about what that process entails. Citizens should have complete access to and ownership of the collected records of their personal information. Restoring ownership and control might mean allowing businesses to store credit-card information for a few years, while limiting their retention of a user’s browsing history to no more than a few days. But the controllers also should be required to disclose their techniques of behavioral manipulation; and “opt-out” should be the default user setting. All of this will require government regulation, with strong penalties for any organization that steps out of bounds.

An even more radical proposal would let users not only decide how and when their data are shared, but also monetize them. If an online-game company wants to manipulate you into providing it with behavioral data for eight hours every day, perhaps it should pay you for that service. But we must also exercise self-discipline. We are awash in wonderful new services and social networks that could enhance our lives in the real world, rather than sucking us deeper into virtual holes. Our goal should be to create a business environment in which these are the models that succeed.

By claiming ownership of our data and placing a price on behavioral manipulation, we can alter the fundamental conditions that have given rise to a toxic industry. If we can establish user control and transparency, further positive change could follow: legislation mandating additional cyber-protections; industry-wide codes of ethics; and a generation of tech entrepreneurs that gets back into the business of creating value for society.

We have seen Silicon Valley disrupt a growing train of activities and industries worldwide in a shockingly brief period. Now it needs to get on board.
