WASHINGTON, DC – Over the last decade or so, productivity growth has slowed considerably in most major developed economies, even as impressive advances have been made in areas like computing, mobile telephony, and robotics. All of these advances ostensibly should have boosted productivity; and yet, in the United States, a world leader in technological innovation, business-sector labor productivity growth in 2004-2014 averaged less than half the rate of the previous decade. What is going on?
One theory that has gained a lot of traction lately is that the so-called productivity paradox does not actually exist. Productivity growth only seems to be dropping, the logic goes, because the statistics we use to measure it fail to capture fully the recent gains, especially those from new and higher-quality information and communication technology (ICT). If prices do not reflect quality improvements in new products, price deflators are overstated, and real output is understated.
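The deflator mechanism can be illustrated with a simple, purely hypothetical calculation: if measured inflation overstates true (quality-adjusted) inflation, the same nominal output deflates to a smaller real figure. All numbers below are invented for illustration.

```python
# Illustrative sketch (hypothetical numbers): an overstated price deflator
# understates real output, since real output = nominal output / deflator.

nominal_output = 110.0      # this year's output at current prices (index)
measured_deflator = 1.05    # official deflator: prices up 5%
true_deflator = 1.02        # quality-adjusted deflator: prices up only 2%

real_measured = nominal_output / measured_deflator
real_true = nominal_output / true_deflator

print(round(real_measured, 2))  # measured real output: 104.76
print(round(real_true, 2))      # quality-adjusted real output: 107.84
```

Because the official deflator attributes part of a quality gain to inflation, measured real output (and hence measured productivity) falls short of its true value.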
Moreover, the skeptics point out, standard measures of productivity are based on GDP, which, by definition, includes only output produced. Consumer surplus – which is growing fast, as Internet-based services like Google search and Facebook generate substantial utility for consumers at a market price of close to zero – is ignored.
There is some logic to this argument. Indeed, a recent review of research on productivity by the Brookings Institution and the Chumir Foundation confirmed that gains from new technologies are underestimated, owing to measurement issues relating to both product quality and consumer surplus.