Policymakers, scholars, business leaders, and ordinary citizens are all grappling with the far-reaching implications of digital technology, and economists are no exception. In fact, more than most other academic fields, economics urgently needs to revise its assumptions to make sense of the current era.
CAMBRIDGE – Just within the past few decades, digital technology has transformed the global economy and societies worldwide multiple times. In the 1980s, the automation of manufacturing produced waves of outsourcing and offshoring. In 1989, computer scientist Tim Berners-Lee invented the World Wide Web, and beginning around 2007, a confluence of smartphones, 3G/4G, and new algorithms brought much of the world’s population online, where we have been living ever since.
With the new technologies have come global production chains, e-commerce, social media, and the platform economy. And owing to advances in artificial intelligence, genomics, additive manufacturing, the green transition, and advanced materials, an even broader transformation is still on the horizon. During such periods of change, there are always more questions than answers for policymakers and academics alike. Each wave of digital disruption raises new problems for economists, in particular.
Some of these problems are well known, particularly the perceived threat to jobs – an area where David Autor of MIT and the University of Oxford’s Carl Benedikt Frey, among others, have already contributed significant scholarly work. Yet many other issues still need to be addressed. One is economic measurement. Standard definitions and data-gathering processes do not go far enough in capturing digital activities such as cloud computing and contract manufacturing.