Technology and human welfare | Deloitte UK
Technological change seems to be proceeding at a blistering pace. Fast, cheap and ubiquitous communication and internet access have changed our lives. Sector after sector has been transformed, from advertising, newspapers and travel to books, banking and auction houses.
Our everyday experience testifies to the transformational effects of many new technologies. The puzzle for economists is that they seem to have had far less impact on the efficiency of the economy than they have had on our lives. Measured productivity – effectively the efficiency of production – has improved far more slowly than the capacity of computers.
This is not new. Almost 30 years ago, the Nobel laureate economist Robert Solow observed that, "You can see the computer age everywhere but in the productivity statistics." What has come to be known as the Solow Computer Paradox remains a central debate in economics.
One possible explanation is that it takes a long time for inventions to have their full economic effect. Economic historians often cite the long lag between the pioneering experiments of Thomas Edison in the 1880s and the organisation of US factories around electrical power, some 40 years later. On this theory the future looks bright, with productivity and growth set to rise as earlier waves of innovation have their full effect.
An alternative view is that the notion of unprecedented change is an illusion. Google, Twitter or smartphones appear transformational to us. But their effect on human welfare and growth is dwarfed by earlier innovations, such as electricity, the internal combustion engine and antibiotics. This school of thought, championed by the US economist Robert Gordon, suggests that most of the big, life-changing innovations have already happened. As The Atlantic noted drily in 2012, "Even by the standards of a field known as the dismal science, Northwestern University economist Robert Gordon is a remarkably gloomy thinker".
More optimistically, technology may well be lifting human welfare, but in ways that are not captured by conventional measures of economic activity. On this argument, GDP is unable to measure the full benefit to consumers of today's technology. The unmeasured gain is the "consumer surplus": the difference between what a consumer pays and what they would have been prepared to pay. If you pay £1.20 for a coffee you would have been prepared to pay £2.20 for, your consumer surplus is £1.
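The arithmetic is simple enough to sketch in a few lines. The function name and figures below are purely illustrative, taken from the coffee example above:

```python
def consumer_surplus(willingness_to_pay, price_paid):
    """Consumer surplus: what the buyer would have paid, minus what they actually paid."""
    return willingness_to_pay - price_paid

# The coffee example: willing to pay £2.20, actually pays £1.20.
surplus = consumer_surplus(2.20, 1.20)
print(f"£{surplus:.2f}")  # £1.00
```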
The consumer sector is awash with technology and services which add to the consumer surplus.
Wikipedia is free to consumers but creates welfare gains that are not wholly captured by the charitable donations that fund it. TripAdvisor, Google or Twitter create value for users in excess of the advertising sales needed to fund them. Or consider the Kindle, a device which, for around $100, gives users instant access to a bookshop of hundreds of thousands of titles.
Economists have tried to measure the contribution of technology to the consumer surplus in numerous ways, including measuring the amount of leisure time people spend on-line and calculating the time saved by being able to search on the Internet.
Economists at the University of Michigan found that it took researchers 22 minutes to answer a battery of questions using the University library, questions that could be answered online in seven minutes. This calculation probably understates the value of the internet, in that it is based on access to a university library and ignores the time and cost of getting there. Using the University of Michigan data, Google's Chief Economist Hal Varian estimated that the internet generates a consumer surplus in the US of $65-150 billion per year.
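A back-of-envelope estimate in this spirit scales the minutes saved per search up to a national annual figure. The sketch below is not Varian's actual model; apart from the 22-minute and seven-minute figures from the Michigan study, every input is an assumption of my own, chosen only to show how such a calculation hangs together:

```python
# Stylised back-of-envelope estimate of the internet's consumer surplus.
# Only the 22- and 7-minute figures come from the Michigan study cited in
# the text; all other inputs are illustrative assumptions.
minutes_saved_per_question = 22 - 7       # library time vs. online time (Michigan study)
questions_per_person_per_day = 0.5        # assumed search frequency
value_of_time_per_hour = 15.0             # assumed dollar value of an hour of time
population = 200_000_000                  # rough assumed number of US internet users

hours_saved_per_year = (minutes_saved_per_question / 60
                        * questions_per_person_per_day * 365)
surplus_per_person = hours_saved_per_year * value_of_time_per_hour
total_surplus = surplus_per_person * population
print(f"${total_surplus / 1e9:.0f} billion per year")  # $137 billion per year
```

With these particular assumptions the answer lands inside the $65-150 billion range quoted in the text, but the point of the exercise is how sensitive the total is to each assumed input, which is why published estimates span such a wide range.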
Conventional measures of GDP do, indeed, fail to capture such effects. Yet this has always been the case. New goods and services are taken up by consumers because they make us happier, richer, more effective or healthier. David Landes, in his economic history of the world, describes how Nathan Mayer Rothschild, the richest man on earth at the time, died in 1836 for want of an antibiotic which would cost about $10 today.
This takes us back to the question of impact. Technology is, indeed, adding to the consumer surplus, but it always has. The real question is whether today's inventions are improving our welfare at a faster rate than the inventions of the past.
Most of us are ill-equipped to make such judgements. We suffer from what economists call a recency bias – a tendency to rate the recent past as being more significant than the more distant past. We take for granted past inventions and are captivated by today's breakthroughs.
In 1913 the magazine Scientific American ran a competition inviting readers to write in with a list of "the greatest ten inventions of our time". The rules defined "our time" as the period 1888-1913 and stated that the invention had to be patentable and in commercial use. The most popular choices included reinforced concrete, motion pictures, the aeroplane, x-rays and the car.
The transformational effect of such inventions from a 25-year period in the late nineteenth and early twentieth centuries slightly dampened my excitement about today's inventions. Future generations may be better placed than us to judge the true impact of today's technologies. In assessing the impact of the latest "breakthroughs" we should, perhaps, take the same line as Zhou Enlai when asked, in 1972, about the effect of the French Revolution: "It's too soon to tell."
Ian Stewart is a Partner and Chief Economist at Deloitte where he advises Boards and companies on macroeconomics. Ian devised the Deloitte Survey of Chief Financial Officers and writes a popular weekly economics blog, the Monday Briefing. His previous roles include Chief Economist for Europe at Merrill Lynch, Head of Economics in the Conservative Research Department and Special Adviser to the Secretary of State for Work and Pensions. Ian was educated at the London School of Economics.