The third industrial revolution: a damp squib or a time bomb?

Photo: Montecruz Foto @ flickr.com.

There’s no evidence that the internet revolution has made most of us better off. So was it all hype, or are the good times still just around the corner?

Open a newspaper or browse your favourite news site and you’ll soon find someone telling you how the internet will change everything. Well, it’s certainly changed the way I work and I’m well aware that my daughter’s childhood will be quite different to mine because of it. The internet has destroyed the encyclopaedia industry, cut music industry executives down to size and wiped the smug smile off the faces of London cabbies.

But have the internet and its associated technologies – smartphones, connected devices, social media and so on – made us better off? The internet is not that new. It’s been around for more than 20 years – almost half a working life. In that time, have we seen big advances in prosperity and standards of living? Nope. Since the early 90s, we’ve had a boom, a mini-crash, a mini-boom, the mother of all crashes and a period of stagnation. Not much to write home about.

The big claim – economically – for this “third industrial revolution” was that it would make us much more productive, with gains to set alongside those of the first (steam power, railways) and second (electricity, running water). It just hasn’t happened. Or at least not yet.

Economists measure productivity by looking at “output per hour” – how much stuff we make or get done for each hour we work. However you look at it, the picture’s the same. There’s nothing remarkable about productivity in the internet era. Here’s the chart showing output per hour for the US, the UK and the whole G7 for the last twenty years.

[Chart] Output per hour worked, 1994–2014 (2005 = 100). Source: OECD.

And if you don’t like that one, I have others. Try this one showing rates of G7 productivity growth since 1971, or this one showing output per hour for the UK. Short-term fluctuations aside, a long-term decline sets in around the mid-seventies, with a definite slowdown from the mid-noughties onwards — before the 2008 crash.
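If you’re wondering what “2005=100” means on those charts, the arithmetic is straightforward: divide total output (real GDP) by total hours worked, then rescale the series so the 2005 value reads 100. Here’s a minimal sketch in Python, using made-up numbers rather than the OECD’s actual data:

```python
# Minimal sketch of an "output per hour" index, rebased to 2005 = 100.
# The figures below are illustrative only, not OECD data.

gdp = {2004: 1_900, 2005: 2_000, 2006: 2_050}   # real GDP, arbitrary units
hours = {2004: 49.0, 2005: 50.0, 2006: 50.2}    # billions of hours worked

# Output per hour = real GDP / total hours worked
output_per_hour = {year: gdp[year] / hours[year] for year in gdp}

# Rebase so that 2005 = 100
base = output_per_hour[2005]
index = {year: 100 * value / base for year, value in output_per_hour.items()}

for year, value in sorted(index.items()):
    print(year, round(value, 1))   # e.g. 2004 96.9, 2005 100.0, 2006 102.1
```

Rebasing everything to a common year is what lets the US, UK and G7 lines sit on the same chart even though the economies are wildly different sizes: the chart shows how each has changed relative to its own 2005 level, not how big each one is.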

Back in 1987, when the standalone IBM PC was king, the Nobel prize-winning economist Bob Solow wrote, “You can see the computer age everywhere but in the productivity statistics.” The same goes for the internet today. Neither digital enthusiasts nor economists dispute this. The big argument is about why the productivity boost hasn’t happened and whether it’s ever going to.

It seems to me that there are three possible explanations. First, digital technology isn’t all it’s cracked up to be. Second, it’s still early days and the big gains are yet to come. Third, something else is going on that we don’t yet understand and can’t measure in conventional economic terms.

No big deal

Many mainstream economists simply argue that the web, our smartphones and other digital gizmos haven’t delivered the sort of fundamental boost to labour productivity we got from steam power in the late 18th century, or from electricity, cars and running water between 1880 and 1930. As the novelist and economic investigator John Lanchester puts it, “The lightbulb changed the world; Facebook is just a way of letting people click ‘like’ on photos of cats that resemble Colonel Gaddafi.”

The heavy lifting on this was done by Robert Gordon, an economics professor at Northwestern University, near Chicago. In his 2012 paper, Is US Economic Growth Over?, Gordon argued that we gained much more by removing drudgery from everyday manual tasks than from automating paperwork and speeding up communications. Digital technology has made some earlier inventions more efficient and easier to consume, but it hasn’t given us as many new things that save time or make us more productive at work. An onboard computer may make a washing machine easier to use and more efficient, but it’s the electric motor and running water that make it what it is. Likewise, digital technology has transformed the film industry in many ways, but it was the Lumière brothers’ invention of the moving picture that created a whole new industry in the first place.

What’s more, we’ve already had most of the gains from computer technology, says Gordon. Using computers for large-scale calculation and record-keeping is so 1970s, and word processors were making typists obsolete back in the 1980s. The late 1990s spurt of productivity, as firms established websites and e-commerce services, was largely over by 2005. Gordon doubts that the internet itself is a technological breakthrough on a par with the steam engine. It’s just a live communications network, and we’ve had telephones since the 1880s.

Gordon’s final argument is that some of the digital gains cancel themselves out. Computers crash, people spend more time “consuming” leisure products like movies, music and porn, and we waste creative energy dealing with the negative side effects — as the sci-fi writer Neal Stephenson puts it, “I see the best minds of my generation writing spam filters”. And the lightning-fast growth in computing power is often gobbled up by increasingly complicated and power-hungry software and digital media. “What Intel giveth, Microsoft taketh away,” quips Gordon.

For “digital pessimists” like Gordon, the third industrial revolution simply wasn’t as significant as the first two and didn’t last nearly as long — hence the significant tailing off of productivity in recent years.

You ain’t seen nothing yet

Not surprisingly, digital evangelists – aka people running internet companies – think economists like Gordon have got this all wrong. The economists are looking backwards rather than forwards, they say. Forget that standalone IBM PC: the third industrial revolution isn’t computers, or even tablets and smartphones. It’s the network.

“Everything changed… when the computer married the telephone,” wrote Kevin Kelly in his response to Gordon’s paper. “This is when ordinary people noticed computers. They could get online. Everything went online. Retail changed, production changed, occupations changed.” And, he says, it may take another 80 years for us to feel the full effects.

“What if we’re just a few decades into the new industrial revolution, living in a contemporary version of 1780?” asks Lanchester. After all, Thomas Newcomen invented the steam engine in 1712, but it was another 65 years before Richard Arkwright deployed one in a cotton mill, and Richard Trevithick’s first steam locomotive didn’t appear until 1801 (the French, naturally, claim that Nicolas Cugnot got there first with his 1769 steam wagon, but that – literally – crashed and burned). More than a century elapsed between Benjamin Franklin’s initial experiments with electricity and Thomas Edison’s invention of the light bulb in 1879.

If this is “1780”, we’ll soon find out the hard way that productivity is a double-edged sword: the more productive we are, the more we can earn, but fewer of us are needed to do the work. According to Google CEO Larry Page, millions, if not billions, of us face being made obsolete as advances in robotics and network technology wipe out the remaining skilled or semi-skilled manual jobs, while making deep inroads into “white collar” work. Take just two products from Google’s pipeline: imagine the driverless car destroying all the driving jobs, while Google Translate wipes out translators and interpreters. Then multiply that by a hundred or a thousand.

In 2013, two Oxford dons, Carl Frey and Michael Osborne, analysed over 700 occupations and found that 47% of US jobs were at high risk of being automated within the next 20 years. Many of those at risk are traditional white collar jobs: telemarketers, insurance underwriters, accounts clerks, bank tellers, secretaries and credit analysts. As for most of the remaining skilled manual jobs – machine setters, drivers, and even chefs – forget it.

According to Page, this could lead to a rather frightening world (for most of us) where massive productivity gains lead to severe deflation (falling prices and wages), collapsing property values and a severe shortage of work. Even more frightening is that Page, one of the most powerful men in the world, doesn’t seem bothered at all.

This time it’s different

Photo: Uros Velikovic @ flickr.com

In a way, both these explanations are backward-looking. They look to patterns in the past – the first two industrial revolutions – as a guide to the future. But what if something else is going on? What if, this time, it’s different?

It could be that the gains from the internet aren’t visible yet because, rather than repeating previous cycles, we’re transitioning to a new type of economy – one we don’t yet understand and don’t yet have the tools to measure.

Kevin Kelly reckons labour productivity statistics are the wrong place to look. “I think the real wealth in the future does not come from saving labour but in creating new kinds of things to do. In this sense, long-term wealth depends on making new labour,” he says.

In conventional economics, a productivity “gain” means doing the same thing quicker or more effectively; it can’t really cope with us spending time doing new things which might not result in time saved or more widgets produced. As Kelly puts it, “In short, productivity is for robots. Humans excel at wasting time, experimenting, playing, creating and exploring. None of these fare well under the scrutiny of productivity.”

Channel 4 economics editor Paul Mason picks up on some of this thinking in his new book, PostCapitalism. Mason thinks we’re not seeing the gains we expected because of the specific nature of information technology, which reduces the value of our labour while delivering benefits that are hard to capture and measure in cold hard cash. The result looks like a period of stagnation.

Previewing his book in the Guardian in July, Mason said: “The modern equivalent of the long stagnation of late feudalism is the stalled take-off of the third industrial revolution, where instead of rapidly automating work out of existence, we are reduced to creating… ‘bullshit jobs’ on low pay.” Not surprisingly, the lower pay of these bullshit jobs is a result of their low productivity.

Mason reckons that digital technology is undermining the way the conventional economy works because the main product – information – is abundant and easy to make freely available. Copyright and patents are becoming harder and harder to enforce. You can’t really stop people from copying and sharing data, designs, software and technical know-how any more than you can stop them copying music, videos and other people’s words (as I’ve just done with Mason’s). He points out that many of the star products of the third industrial revolution – Wikipedia, the Linux operating system (used in 97% of the world’s supercomputers), WordPress and the Apache server software, for example – are free, produced under a non-market, open source system. This basically means people doing it for nothing in their spare time (and, increasingly, in their work time — with or without their employers’ blessing).

The growth of “collaborative” methods of production like open source, sharing, swapping and crowdsourcing makes it hard to slap a value on much of the work done in the new economy. Improvements to WordPress and Linux will undoubtedly increase many people’s productivity, but we have no price yardstick to measure their value by – because they’re free. In conventional economics, price is the only indicator of value we have.

As more and more stuff gets done without anyone being paid directly for doing it, Mason says, work is being separated from wages. The time we spend creating websites, working with online information tools, blogging and “creating content” may be productive to us, but how much of it do we actually get paid for? If you book your holiday using HomeExchange.com or offer your gardening skills via a time bank, no money changes hands, so there’s no “product” for economists to measure. But the same service is being provided as when someone pays for a hotel room or hires a gardener.

Using conventional economic tools to measure the impact of these changes is like asking a medieval court official to gauge the likely impact of industrial capitalism by looking at things like crop yields, tithes and the reproductive behaviour of serfs.

Economists are rightly sceptical about things which can’t be measured quantitatively, and some commentators, like Telegraph columnist Iain Martin, have scoffed at Mason’s collaborative production, dismissing it as little more than a hobby. Mainstream economists prefer answers that fit into their existing models, so most tend to agree with Gordon that the internet’s effect on productivity was over-hyped.

But I’m not so sure. Economists are notoriously slow to spot changes which challenge their basic beliefs about how the world works. Hardly any mainstream economists saw the 2008 crash coming. Trapped in models where everyone behaves rationally, they couldn’t envisage banks being so stupid as to invest in things that were worthless. If something big, strange and unmeasurable is happening to our economy, economists will probably be the last people to tell us about it.