Computing, Communications and Cognition: Why Moore's Law Hasn't Delivered Productivity Growth (1976–2025)
In 1987, the Nobel laureate Robert Solow made an observation that would haunt economists for the next four decades. Writing in the New York Times Book Review, he quipped: "You can see the computer age everywhere but in the productivity statistics." It was a devastatingly concise summary of a puzzle that remains, in many ways, unsolved.
By 1987, the personal computer had been on the market for six years. IBM had launched the PC in 1981. Apple had debuted the Macintosh in 1984. Spreadsheets, word processors, and databases were transforming office work. Microprocessors were doubling in power every eighteen months. Yet measured productivity growth in the United States had been declining since the early 1970s and showed no sign of recovering. Computing power was growing exponentially. Productivity growth was flat.
This is Lesson 5 of the Productivity 250 series: the story of a fifty-year era in which the most powerful General Purpose Technologies in human history delivered less measured productivity growth than the steam engine. The paradox is real. The explanation, as we shall see, lies not in the technology but in what we measure and how we account for value.
The Context: After the Golden Age
The previous lesson covered the Golden Age of Productivity (1926-1975), when TFP growth averaged a remarkable 1.8% per year. Suburbanisation, mass consumption, jet travel, and the interstate highway system powered the longest sustained productivity boom in recorded history.
That boom ended abruptly. The 1973 OPEC oil embargo quadrupled oil prices within a matter of months. A second oil shock followed in 1979. Stagflation--the toxic combination of high inflation and low growth--gripped the Western economies. Between 1973 and 1982, US TFP growth fell to roughly 0.5% per year, a collapse that mystified policymakers.
Yet even as macroeconomic chaos unfolded, a technological revolution was quietly beginning. In 1971, Intel released the 4004--the world's first commercial microprocessor. It contained 2,300 transistors on a single chip. In 1965, Gordon Moore had predicted that transistor density would double every year; in 1975 he revised the pace to approximately every two years. He was right. By 2025, a single chip contains over 100 billion transistors. Computing power has increased by a factor of roughly one billion since Moore made his prediction.
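The arithmetic behind these claims is easy to check. A minimal sketch, assuming a clean two-year doubling period (the real cadence varied over the decades):

```python
# Back-of-envelope check of the exponential claims above.
# Assumes a constant two-year doubling period -- a simplification.

def doublings(start_year: int, end_year: int, period_years: float = 2.0) -> float:
    """Number of doublings between two years at a fixed doubling period."""
    return (end_year - start_year) / period_years

# Transistor count: Intel 4004 (1971, ~2,300 transistors) projected to 2025.
projected = 2_300 * 2 ** doublings(1971, 2025)   # 27 doublings -> ~3e11

# Growth factor in computing power since Moore's 1965 paper.
factor = 2 ** doublings(1965, 2025)              # 30 doublings -> ~1.07e9

print(f"Projected transistors in 2025: {projected:.2e}")
print(f"Growth factor since 1965: {factor:.2e}")
```

Twenty-seven doublings carry the 4004's 2,300 transistors into the hundreds of billions, and thirty doublings from 1965 give the "factor of roughly one billion" quoted above (2^30 is about 1.07 billion).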
The C3 era produced the most powerful General Purpose Technologies in history--microprocessors, the internet, smartphones, and artificial intelligence. Yet measured TFP growth averaged just 1.0% per year, little more than half the Golden Age rate. This is the IT productivity paradox, and it remains one of the most important unsolved puzzles in economics.
The General Purpose Technologies
The C3 era--Computing, Communications, Cognition--produced five General Purpose Technologies, each building on its predecessors.
The microprocessor (1971). Intel's 4004, designed for a Japanese calculator company, was the first commercial microprocessor. It could perform 60,000 operations per second. Today's processors perform trillions. The microprocessor didn't just make computers cheaper; it made computation ubiquitous. Within two decades, processors were embedded in cars, appliances, factory equipment, and medical devices. Computing ceased to be a specialist activity and became infrastructure.
The personal computer (1981). IBM's PC and its clones democratised computing. Before the PC, a computer cost hundreds of thousands of pounds and required dedicated staff. After the PC, any office worker could have computing power on their desk. Spreadsheets (VisiCalc in 1979, Lotus 1-2-3 in 1983, Excel in 1985) transformed financial analysis. Word processors replaced typing pools. Databases replaced filing cabinets. The PC didn't just automate tasks; it eliminated entire occupational categories.
The internet (1990s). Tim Berners-Lee's invention of the World Wide Web in 1989 at CERN, combined with the Mosaic browser (1993) and the commercialisation of internet access, created the first global communications network. Email replaced post. Websites replaced catalogues. E-commerce (Amazon, 1994) challenged high-street retail. The internet reduced transaction costs to near zero for information exchange. It also created entirely new categories of economic activity: search (Google, 1998), social networks (Facebook, 2004), and the platform economy.
The smartphone (2007). Steve Jobs launched the iPhone and put a computer, camera, GPS receiver, music player, and communications device in every pocket. Within a decade, there were more smartphones than people with running water. The smartphone created the mobile internet, the app economy, and a radical shift in consumer behaviour. It also produced, for the first time in history, continuous real-time data about human activity at global scale.
Large Language Models (2022). OpenAI's ChatGPT, launched in November 2022, brought artificial intelligence into mainstream use. For the first time, machines could generate human-quality text, translate between languages, write code, and reason about complex problems. LLMs represent the potential fifth GPT of this era--a technology that could augment or replace cognitive labour the way steam engines replaced physical labour.
The electrification analogy is instructive. Paul David's famous 1990 paper demonstrated that electricity was invented in the 1880s, yet factories didn't achieve full productivity gains until the 1920s--a lag of 30-40 years. The reason: factories had to be completely redesigned. Early electric factories simply replaced steam engines with electric motors, keeping the same layout. Only when engineers realised they could distribute small motors throughout a factory--eliminating the central power shaft, redesigning the floor plan, introducing assembly lines--did productivity surge. We may be at a similar inflection point with AI today.
The Productivity Paradox
Despite this extraordinary technological progress, the productivity numbers tell a sobering story.
TFP Growth by Sub-Period
| Period | TFP Growth (Annual) | Key Technologies | Context |
|---|---|---|---|
| 1976-1995 | 0.7% | PCs, early internet | Stagflation, deregulation, offshoring |
| 1995-2005 | 1.5% | Dot-com boom, broadband | Brief productivity acceleration |
| 2005-2019 | 0.5% | Smartphones, cloud, social media | The "great stagnation" |
| 2020-2025 | 1.2% (estimated) | Remote work, AI adoption | Pandemic restructuring + LLMs |
The pattern is striking. The 1995-2005 period--when businesses finally integrated internet technologies into their operations--produced a genuine productivity acceleration. But it was brief. After 2005, despite the smartphone revolution, cloud computing, and social media, TFP growth collapsed to just 0.5% per year. Tyler Cowen popularised the phrase "the great stagnation" for this period, and Robert Gordon, the Northwestern economist, argued that the internet's productivity impact was fundamentally smaller than that of electricity, the internal combustion engine, or indoor plumbing.
Three explanations have been advanced for the paradox.
The measurement problem. GDP statistics were designed for an economy that produces physical goods. How do you measure the productivity gain from free services like Google Search, Wikipedia, or GPS navigation? How do you account for the quality improvement when a smartphone replaces a camera, a map, a calculator, a music player, and a telephone simultaneously? Erik Brynjolfsson and Andrew McAfee at MIT have argued that conventional GDP statistics miss perhaps 40% of the value created by digital technologies because that value accrues to consumers as free services rather than as market transactions.
The J-curve effect. New General Purpose Technologies initially reduce measured productivity. This seems counterintuitive, but the logic is sound. When a transformative technology arrives, firms invest heavily in it (reducing current profits), reorganise their operations (disrupting existing processes), and retrain workers (temporarily lowering output per hour). Only after this painful reorganisation period--which can last a decade or more--do the productivity gains materialise. The J-curve hypothesis, developed by Brynjolfsson, Rock, and Syverson, suggests that the 2005-2019 productivity slowdown was partly the downward stroke of a J-curve, as firms invested in AI, cloud infrastructure, and data capabilities without yet reaping the full returns.
The distribution problem. A significant share of IT investment has gone into activities that redistribute value rather than create it. High-frequency trading, targeted advertising, social media engagement optimisation, and algorithmic pricing are highly sophisticated technologies. They generate enormous returns for the firms that deploy them. But they don't necessarily increase total economic output. They shift profits from one firm to another, or from consumers to producers, without creating new goods or services.
The Balance Sheet Inversion
Perhaps the most consequential transformation of the C3 era is invisible to casual observers. It is the inversion of the corporate balance sheet.
In 1975, according to research by Ocean Tomo, 83% of the market value of S&P 500 companies was attributable to tangible assets: factories, equipment, inventory, real estate, and cash. Intangible assets--patents, brands, software, customer relationships, organisational capability--accounted for just 17%.
By 2020, those proportions had reversed. Intangible assets accounted for 90% of S&P 500 market value. Tangible assets: 10%.
This is not a subtle shift. It is a wholesale inversion of what constitutes corporate value. A company like Apple is worth over $3 trillion. Its physical assets are a fraction of that value; its manufacturing is largely outsourced to contractors such as Foxconn. Its design capability, its brand, its ecosystem of apps and services, its customer loyalty, its supply chain orchestration--these intangible assets are where the value lives.
The Balance Sheet Gap
In 1975, an investor could look at a company's balance sheet and see 83% of its value. By 2020, the same balance sheet revealed only 10%. The gap between market value and book value -- the "intangible premium" -- has become the defining feature of modern corporate finance. Measuring this gap is at the heart of Opagio's Valuator.
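In arithmetic terms, the intangible premium is simply the share of market value not backed by tangible book value. A minimal illustration using the Ocean Tomo proportions quoted above (`intangible_premium` is a name introduced here for illustration, not a standard metric):

```python
def intangible_premium(market_value: float, tangible_book_value: float) -> float:
    """Share of market value not explained by tangible assets on the books."""
    return 1 - tangible_book_value / market_value

# 1975: tangible assets accounted for 83% of S&P 500 market value.
print(intangible_premium(100, 83))   # ~0.17: intangibles were 17% of value

# 2020: tangible assets accounted for just 10%.
print(intangible_premium(100, 10))   # ~0.90: intangibles were 90% of value
```

The same ratio applied to a single company gives the gap an investor faces when the balance sheet shows only a sliver of what the market is pricing.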
Two landmark accounting standards attempted to close this gap, but succeeded only partially.
SFAS 86 (1985). The Financial Accounting Standards Board issued Statement 86, which for the first time required companies to capitalise the costs of developing computer software intended for sale. Before SFAS 86, all software development costs were expensed immediately--treated as current-period costs rather than long-term assets. SFAS 86 allowed companies to put software on the balance sheet once "technological feasibility" was established. This was a first tentative step toward recognising intangible assets, but it applied only to software for external sale, not internal-use software.
IAS 38 (1998). The International Accounting Standards Committee issued IAS 38, the first comprehensive international standard for intangible assets. IAS 38 defined an intangible asset as "an identifiable non-monetary asset without physical substance" and established criteria for recognition: the asset must be identifiable, the entity must control it, and future economic benefits must flow from it. IAS 38 was a conceptual breakthrough. But it contained a fatal limitation: internally generated goodwill, brands, mastheads, customer lists, and similar items could not be recognised as assets. Only acquired intangibles--those purchased through a business combination--could appear on the balance sheet.
IAS 38 created a two-tier system. A brand acquired through an acquisition is a balance sheet asset. The same brand, built organically over decades, is invisible. This means that organic growth--building brands, training employees, developing proprietary processes--is systematically penalised by accounting standards. The most productive investments a company can make are the ones that accounting cannot see. This is the measurement crisis that Opagio exists to address.
Key Figures: The Architects
Gordon Moore (1929-2023)
Moore co-founded Intel in 1968 and in 1965 published the observation that would become "Moore's Law": the number of transistors on a chip doubles at a steady exponential pace--every year in his original 1965 paper, revised to roughly every two years in 1975. This wasn't a physical law but an empirical prediction about the pace of semiconductor innovation. It held for over fifty years, driving the exponential growth in computing power that enabled every technology in this lesson. Moore's insight was that progress in computing was not linear but exponential--a pattern that humans find deeply unintuitive.
Tim Berners-Lee (b. 1955)
A British computer scientist working at CERN, Berners-Lee invented the World Wide Web in 1989. His proposal--a system of interlinked hypertext documents accessible via the internet--was initially dismissed as "vague but exciting" by his supervisor. Crucially, Berners-Lee made the web's protocols (HTTP, HTML, URLs) open and royalty-free. Had he patented the web, the digital revolution would have unfolded very differently. The web is perhaps the most valuable intangible asset ever created, and it appears on nobody's balance sheet.
Steve Jobs (1955-2011)
Jobs didn't invent the smartphone, the personal computer, or the digital music player. He integrated existing technologies into products that redefined consumer expectations. The iPhone (2007) combined a phone, an iPod, and an internet communicator. But Jobs's real innovation was the App Store (2008), which created a platform for third-party developers and generated an entirely new software economy. Jobs understood something that technology companies routinely forget: the value isn't in the hardware. It's in the ecosystem.
The Workshop Floor: Hidden Stories of Productivity
Solow's Quip and the Statistician's Nightmare
Robert Solow won the Nobel Prize in Economics in 1987--the same year he made his famous remark about computers and productivity statistics. The irony was exquisite. Solow had won the Nobel partly for developing the "Solow residual," the growth accounting method that attributes productivity growth to technological progress after accounting for capital and labour inputs.
The Solow residual was designed for an industrial economy. You measure inputs (hours of labour, tonnes of steel, kilowatt-hours of electricity) and outputs (cars produced, tonnes of wheat harvested). The residual--output growth left over after accounting for input growth--is productivity. But what happens when the output is "better search results" or "more convenient communication" or "entertainment consumed during a commute"? These outputs are real. People value them enormously. But they don't appear in GDP statistics because nobody pays for Google Search and nobody accounts for the time saved by GPS navigation.
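The mechanics of the residual can be sketched in a few lines. This assumes a Cobb-Douglas production function with a capital share of roughly 0.3 (a conventional textbook value); the growth figures are hypothetical, chosen only to illustrate the calculation:

```python
def solow_residual(output_growth: float, capital_growth: float,
                   labour_growth: float, alpha: float = 0.3) -> float:
    """TFP growth: output growth minus the share-weighted growth of inputs.

    alpha is capital's share of income; labour gets the remaining 1 - alpha.
    """
    return output_growth - alpha * capital_growth - (1 - alpha) * labour_growth

# Hypothetical year: output up 3%, capital stock up 4%, hours worked up 1%.
tfp = solow_residual(output_growth=0.03, capital_growth=0.04, labour_growth=0.01)
print(f"TFP growth: {tfp:.1%}")  # roughly 1.1% -- the unexplained residual
```

Everything the measured inputs cannot explain lands in this residual--which is exactly why unmeasured outputs such as free search or time saved make measured TFP look smaller than the value people actually receive.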
Solow's quip was prescient. The computers were generating productivity. The statistics just couldn't see it.
The Software That Ate the Balance Sheet
In 1980, the entire US software industry had revenues of roughly $2 billion. By 2020, global enterprise software revenues exceeded $500 billion. Yet for decades, most of this value was invisible to accountants.
Before SFAS 86 in 1985, every dollar spent developing software was expensed immediately. A company could spend $100 million building a software platform that would generate revenue for twenty years, and the entire cost would appear as a current-year expense. The balance sheet showed nothing. The income statement was distorted. Investors had no way to distinguish between a company burning cash on worthless projects and one building an enormously valuable asset.
Even after SFAS 86, the rules applied only to software developed for external sale. Internal-use software--the custom systems that run supply chains, manage customer relationships, and coordinate operations--remained invisible. A company might spend $50 million building a proprietary logistics platform that gave it a decisive competitive advantage. On the balance sheet: nothing.
Today, software is the most valuable asset class in the global economy. It is also the most poorly measured.
Go Deeper: Recommended Reading
| Book | Author | Year | Why Read It |
|---|---|---|---|
| The Rise and Fall of American Growth | Robert Gordon | 2016 | The definitive argument that the digital revolution's productivity impact is smaller than earlier GPTs. Provocative, data-rich, and essential reading for anyone thinking about technology and growth. |
| The Second Machine Age | Erik Brynjolfsson & Andrew McAfee | 2014 | The optimistic counterpoint to Gordon. Argues that digital technologies are transformative but that the gains are poorly measured and unevenly distributed. |
| Capitalism Without Capital | Jonathan Haskel & Stian Westlake | 2018 | The best book on the rise of the intangible economy. Explains why intangible-intensive economies behave differently from industrial ones: scalability, spillovers, sunk costs, and synergies. |
| The Innovation Illusion | Fredrik Erixon & Bjorn Weigel | 2016 | Argues that corporate risk-aversion and regulatory burden, not technology, explain the productivity slowdown. A useful corrective to purely technological explanations. |
| Measuring and Managing Information Risk | Jack Freund & Jack Jones | 2015 | Practical framework for quantifying the value and risk of information assets--the intangible assets that dominate modern balance sheets. |
Connection to Today
The C3 era leaves us with a paradox and a measurement crisis.
The paradox: the most powerful technologies ever invented have delivered less measured productivity growth than railways, electricity, or the assembly line. The measurement crisis: the assets that drive modern productivity--software, data, algorithms, brands, organisational capability--are largely invisible to accounting standards designed for an industrial age.
These two problems are deeply connected. If your measurement system cannot see the most valuable inputs and outputs in the economy, it will systematically understate productivity growth. The IT productivity paradox may not be a paradox at all. It may simply be a measurement failure.
Consider the electrification analogy once more. Electricity was invented in the 1880s. It took 30-40 years for factories to reorganise around it. During that lag period, productivity growth was disappointing. Observers questioned whether electricity mattered. Then, in the 1920s, productivity surged as factories finally adopted the new organisational model that electricity made possible.
We may be living through a similar lag with artificial intelligence. LLMs arrived in 2022. Most organisations are still in the early adoption phase--bolting AI onto existing processes rather than redesigning work around AI capabilities. If the electrification analogy holds, the real productivity gains may still be a decade away.
But this time, there is an additional challenge. When electrification finally delivered its productivity surge, the gains showed up clearly in GDP statistics because the outputs were physical: more cars, more appliances, more houses. When AI delivers its productivity surge, the gains may be invisible--better decisions, faster diagnoses, more creative solutions, reduced friction--because they accrue as quality improvements and time savings rather than as additional physical output.
This is why measurement matters. This is why the gap between market value and book value--the intangible premium--is not merely an accounting curiosity. It is the central challenge of modern economic policy. If we cannot measure what drives growth, we cannot manage it, incentivise it, or invest in it rationally.
Opagio's Valuator exists to close this gap: to make the invisible assets visible, to quantify what accounting standards cannot see, and to give businesses and investors a true picture of where value lives. The Pioneers of 1776 created the first productivity revolution and the first framework for intellectual property. Two hundred and fifty years later, we are still trying to build a measurement system worthy of the economy those Pioneers set in motion.
This is Lesson 5 of the Productivity 250 series. Previous: The Golden Age of Productivity (1926-1975) | Next: Five Big Challenges for the Next 50 Years of Productivity Growth