Artificial intelligence is turbocharging individual companies’ stock market valuations, but overall economic productivity is sliding. These are not contradictory trends; they are deeply symptomatic of a transition phase, when stunning new technological advances seep into society and slowly become a general purpose technology, like the internet, electricity or the internal combustion engine. As in all debates, pessimists and optimists abound, and they often talk past each other. Yet the reasons to bet on the optimists are compelling.
Past productivity growth is unable to explain transitory phases when new technology is slowly assimilated into the wider economy. Incumbents are slow to change their ways because of management hubris and a settled habit of going after bigger markets rather than new, small and emerging opportunities. AI has three crucial characteristics that put it in the league of a future GPT: it is pervasive, it improves consistently over time, and it has the ability to spawn complementary innovations. But “never mistake a clear view for a short distance.”
Superhero or small potatoes — most theories on the rise of artificial intelligence (AI) go in these two wild directions.
There’s a third middle-of-the-road argument gaining currency: the bot paradox.
Stunning advances in artificial intelligence technology are visible everywhere except in productivity statistics. Three leading researchers in the field argue that this is because the gap between the prospect of future economic success and the proven statistics of past economic performance is typically at its widest during a period of intense technological transformation, and one such period is now.
Productivity growth in most of the world’s rich countries has been sluggish since the early 2000s, well before the crash of 2008. Even as smartphones, self-driving cars and computers that can beat humans at the world’s most complicated mind games have risen, this key measure of technological progress has remained a puzzle. Economists call it the ‘productivity paradox.’
Back in 1987, MIT economist Robert Solow, that year’s Nobel laureate in economics, told The New York Times: “You can see the computer age everywhere but in the productivity statistics.” We know today how dramatically all that changed in the 1990s.
Could AI in 2018 be journeying through precisely the same continuum that Solow pointed to in the ’80s?
One of the biggest reasons for lags between the immense possibilities of new technology and underwhelming economic performance in the short term is the “curse of knowledge” that afflicts incumbents in business. As organisations grow, new markets seem less attractive to them because they are either trained to play for well understood and accepted margins or don’t have the data to support going after new opportunities. Since data exists in the past tense, new technology adoption slows down in established businesses.
This is an important storyline wedging itself between the two extremes of thinking about AI. The divergence of performance and statistics is far from contradictory; the two trends are “symptomatic of an economy in transition,” say Erik Brynjolfsson and Daniel Rock of the MIT Sloan School of Management and Chad Syverson of the University of Chicago Booth School of Business in a 44-page paper published by the US National Bureau of Economic Research.
“Never mistake a clear view for a short distance,” the authors say as they analyse what they call “compelling” evidence that the implementation and restructuring lags accompanying the ascent of new technologies in society lie at the heart of the paradox.
Brynjolfsson, Rock and Syverson also make the case that AI is a general purpose technology (GPT) with broad implications for economy-wide wealth creation.
The authors argue that the most important economic effects of AI and machine learning will come because they have the three characteristics of GPTs: they are pervasive, they improve consistently over time, and they have the ability to spark complementary innovations.
AI and machine learning, they say, have the capability to become a GPT just like the steam engine, electricity, the internal combustion engine and, more recently, computers. One way in which AI may be superior to, or at least different from, all the earlier GPTs is its ability to improve itself over time based entirely on large sets of data being fed into it.
“Only one self-driving vehicle needs to experience an anomaly for many vehicles to learn from it. Waymo, a subsidiary of Google, has cars driving 25,000 ‘real’ autonomous and about 19 million simulated miles. All of the Waymo cars learn from the joint experience of the others. Similarly, a robot struggling with a task can benefit from sharing data and learnings with other robots that use a compatible knowledge-representation framework. Thinking of AI as a GPT dramatically changes the implications for output and welfare gains.”
Although AI-led systems are surpassing human performance and turbocharging stock prices for many companies intimately tied to these systems, overall productivity growth in America has slid by half over the last 10 years.
Over the long arc of innovation in human history, there have always been lags between the rise of new technologies and their economy-wide assimilation as a ‘general purpose technology’, like the internet of today or, say, the internal combustion engine.
The benefits of AI will not accrue suddenly to economic performance; the winners will be those with the lowest adjustment costs and those who can align complementary skills and systems quickly at the firm level.
So, how do you impute a value to AI technology and the complementary skills that are essential to make it work at capacity? So far, this is at best intangible capital, reflected in the remarkable stock market valuations of some firms.
Yet these valuation problems are not peculiar to AI; they have arisen before in the long continuum of economic progress. Data typically exists in the past tense, while investors of capital are forward looking. At a time of rapid technological shifts, the disagreement between the two sides on a common rubric can be almost impossible to bridge.
“Ironically,” say Brynjolfsson, Rock and Syverson, “the more profound and far-reaching the potential restructuring, the longer the time lag between the initial invention of the technology and its full impact on the economy and society.”
The past decade’s productivity data tell us little about trends for the coming decade. Slowing productivity in the present day does not rule out faster productivity in the future. Also, past productivity growth is unable to explain transitory phases when new technology is slowly assimilated in the wider economy. One big reason why: Looking only at productivity data, we could not have predicted either the lull in the early 1970s or even the upside of information technology in the 1990s.
There are two dominant reasons for the lag between recognising a new technology’s potential and its measurable productivity. First, it takes time to deploy a new technology at scale inside the firm. Second, these investments cannot rev up in isolation: complementary investments are essential to juice them, such as business process realignments, re-skilling, training and perhaps even fixed assets like additional real estate.
Although a convenient parallel, it is not accurate to compare the first wave of computerisation with the present wave of machine learning capabilities. There’s a fundamental shift in the methods being used, say Brynjolfsson et al. In conventional computer programmes, human programmers mapped inputs to outputs, while machine learning systems use ‘categories’ of general algorithms to find relevant mappings on their own after being fed very large sample data sets. Perception and cognition, two essential skills for most types of human work, are central to machine learning systems as distinct from conventional computer programmes.
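The distinction can be made concrete with a toy sketch (illustrative only; the function names and the temperature-conversion task are invented here, not drawn from the paper): a programmed system is handed its rule by a human, while a learning system infers an equivalent rule from example data alone.

```python
def programmed_f_to_c(f):
    # Conventional programming: the human supplies the mapping explicitly.
    return (f - 32) * 5 / 9

def learn_linear_map(xs, ys):
    # 'Learning': fit y = a*x + b to the samples by ordinary least squares,
    # with no human ever writing down the conversion rule.
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - a * mean_x
    return lambda x: a * x + b

# The learner sees only (input, output) pairs, not the rule itself.
samples_f = [32, 50, 68, 86, 104, 212]
samples_c = [programmed_f_to_c(f) for f in samples_f]
learned_f_to_c = learn_linear_map(samples_f, samples_c)
```

On clean linear data the learned mapping recovers the programmed one exactly; real machine learning systems do the same thing with vastly more flexible function classes and vastly larger data sets.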
The best possible use of AI involves new firm level business processes, re-skilling, investment in real estate, computing resources and multiple other expenses. This involves both capital and maintenance costs.
So how long must we wait until AI is pervasive enough for us to see it everywhere? Hindsight tells us it took 25 years after the invention of the integrated circuit for the capital stock of computers to plateau at 5%. Even 10 years prior, it was only half of that. These seemingly insignificant figures help explain today’s productivity lags, which are running in parallel with a time of intense technological shifts.
A good recent example comes from the era of Information Technology and the absorption of computers across the world.
Benefits from firms’ IT investments peaked around the seven year mark. Also, firms typically spent much more on business process redesign and training than on the direct costs of hardware and software. Research shows that “each dollar of IT capital stock is correlated with about $10 of market value.” Firms that have combined IT assets with new business processes have generally done better than those who chose one over the other in isolation.
The combination of IT investments and processes is crucial but not sufficient for productivity gains; there are other ducks to be lined up, such as supply and distribution chains.
The retail industry in the US shows that success for some early adopters of technology and complementary processes can mean other firms get killed off in the bargain, especially those that did not mesh the two investments fast enough.
Usually management hubris is the first step to doom.
Also, industry innovation does not always go in the direction we see 20 years later. In retail, for example, the rise of e-tailing did not push all firms towards e-commerce investments. Many went for outsize superstores and warehouses first, because that’s what they knew how to do; the incumbent mentality pushed them towards the familiar rather than a new technology that promised only a small market at the time.
Online retail became a hot topic in the 1990s as internet penetration in the US rose to nearly 60% of all households by 1999. Yet e-commerce accounted for only 0.2% of total retail sales that year.
Applying lenses from the past to the age of AI, the paper argues that productivity growth driven by a GPT can arrive in multiple waves and each of these waves can initially go in contradictory directions too.
For success though, the old ways of organising work, education and production are “unlikely to remain optimal in the future,” the paper says.
“Theory predicts that the winners will be those with the lowest adjustment costs and that put as many of the right complements in place as possible,” says Brynjolfsson.
But Brynjolfsson himself agrees that only slices of current jobs are suitable for machine learning, not entire processes. It follows that the reality is more complex than the “simple replacement and substitution story emphasised by some.”
“Although economic effects of machine learning are relatively limited today, and we are not facing the imminent ‘end of work’ as is sometimes proclaimed, the implications for the economy and the workforce going forward are profound,” he writes.
We see both this concern and understanding of the long arc reflected in the AI strategies of almost every country that has scrambled to put together thought papers on the subject.
In multiple waves, the steam engine, electricity and computers have transformed how humans live and work. But for that transformation to touch our lives, entire businesses had to be reinvented, and other complementary technologies had to be created to exploit the breakthroughs. That did not happen overnight; it took decades of work.
A leading voice on the other end of the spectrum — Northwestern University economist Robert Gordon — says that today’s AI advances are small in comparison to say, breakthroughs like the electric motor or plumbing. Also, many of the benefits coming from today’s most valuable companies are free and instant. How do you measure this stuff?
Scott Stern of MIT’s Sloan School of Management warns against quick turnaround times on AI explosion in the wider economy: “If I tell you we’re having an innovation explosion, check back with me in 2050 and I’ll show you the impacts. General purpose technologies take a lifetime to reorganize around.”
Brynjolfsson, Erik; Rock, Daniel; Syverson, Chad; Artificial Intelligence and the Modern Productivity Paradox: A Clash of Expectations and Statistics; NBER Working Paper; 2017.
Gordon, Robert; The Rise And Fall Of American Growth; Princeton University Press; 2016.
Raines, Howell; Nobel in Economics To MIT Professor; The New York Times; 22 October 1987.
Rotman, David; The productivity paradox; MIT Technology Review; 18 June 2018.
Syverson, Chad; Challenges to Mismeasurement Explanations for the US Productivity Slowdown; NBER Working Paper; 2017.
The views expressed above belong to the author(s).
Nikhila Natarajan is Senior Programme Manager for Media and Digital Content with ORF America. Her work focuses on the future of jobs and current research in ...