Innovation appears to be getting harder. At least, that’s the conclusion of Bloom, Jones, Van Reenen, and Webb (2020). Across a host of measures, getting one “unit” of innovation seems to take more and more R&D resources. To take a concrete example, although Moore’s law has held for a remarkable 50 years, maintaining the doubling schedule (twice the transistors every two years) takes twice as many researchers every 14 years. You see similar trends in medical research - over time, more scientists are needed to save the same number of years of life. You see similar trends in agriculture - over time, more scientists are needed to increase crop yields by the same proportion. And you see similar trends in the economy writ large - over time, more researchers are needed to increase total factor productivity by the same proportion. Measured in terms of the number of researchers that can be hired, the resources needed to get the same proportional increase in productivity double every 17 years.
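To make the doubling-time arithmetic concrete, here is a minimal sketch. It converts the headline doubling times above into implied annual growth rates of required research effort; the numbers are back-of-the-envelope illustrations, not figures taken from the paper's dataset:

```python
def annual_growth(doubling_years):
    """Annual growth rate implied by a given doubling time in years."""
    return 2 ** (1 / doubling_years) - 1

# Moore's law: the research effort needed to keep transistor counts
# doubling every two years itself doubles every ~14 years.
moore = annual_growth(14)   # roughly 5% more researchers each year

# Economy-wide TFP: required research effort doubles every ~17 years.
tfp = annual_growth(17)     # roughly 4% more researchers each year

print(f"Moore's law: {moore:.1%}/yr; TFP: {tfp:.1%}/yr")
```

In other words, holding the *rate* of progress constant requires the research workforce to compound at a few percent per year, which is the sense in which ideas are "getting harder to find."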
Like Michael, I also found this article through FS.
Very interesting stuff! Props on how you give us the gist of the papers in not that many words (well summarized), and even reproduce the graphs. I thought that was very helpful.
The topic of ‘the death of the Renaissance man’ interests me very much. If that trend continues, and chances are it will, perhaps we’ll become so specialized, and things will be so complex, that almost all of us will lose the ability to understand structures like the financial system and perhaps even public policy. Already now, for example, very few people have an even remotely accurate causal model of hugely important things like climate change or the ’08 crisis.
Do you have a view on that? Is this worrisome, or not really? Maybe the Enlightenment ideal of intellectual autonomy is terribly outdated and not worth striving for.
Less intellectually, one worry could be that, because (due to hyperspecialization) no one understands all the factors involved, no one is equipped to assess the merits of most decisions (and so our decisions will be no good).
Great article, Matt! It will be interesting to see how funding plays a role in the selection of ideas out of a certain idea pool.
Thought-provoking article Matt. I agree with you as well as the data that incremental innovation is becoming harder. I'm curious to see how we'll tackle the challenge.
My bet is that organizational changes - like crowdsourcing the collective intelligence of hundreds or thousands of people towards solving problems, and assembling a "generalist" all-star team composed of various specialists - can and will make an impact.
Furthermore, as the bulk of available innovation knowledge is unstructured text, natural language processing algorithms might be the superpower humans need to innovate more efficiently.
If two heads are better than one, why not use 1,000 heads and some silicon heads while we are at it?
I agree with most of this with three big caveats.
1. Despite the title, it isn’t really about ideas, or most kinds of knowledge, but about a particular kind of innovation found in team-based academic disciplines.
2. Time scales are critical. Yes, I agree about Moore’s law, but at the beginning there was the huge leap of the invention of the transistor, and we are now seeing another huge leap with quantum computing.
3. I think ideas are getting easier to find, and easier to use and enjoy, and the burden of knowledge is often ignored - but some sectors, most egregiously academic research, as well as S&T research, have become overly bureaucratic.
I found this article through Farnam Street and thought you did a great job of laying out the argument for declining innovation. I don't think it applies to software.
I've been developing software for 50+ years now, and I'm repeatedly stunned by the accelerating increase in personal innovation that developers like me are experiencing, as well as the accelerating rate of collective innovation that is an important reason individuals are able to be increasingly productive.
This may be unique to software, or we might have reached an inflection point that we'll see in other domains.
My rate of innovation has been continuously improved by better tools and techniques, but Google search, along with what I will call "knowledge aggregation" sites and activities, has dramatically changed the way I collect the knowledge I require to solve a problem.
Every line of code is the application of knowledge to solve a problem. So I think it qualifies (on the micro scale) as a kind of innovation. And it's not just lines of code, but delivered, user functionality that matters. What I can do now in a day is FAR more than the best team of engineers that I used to manage could do in weeks.
In the old days: dig it out of a book. Or see if someone in your personal network had the knowledge. Or painfully develop the knowledge by tedious trial-and-error. At the start of my career, I got one trial-and-error cycle per 24 hours, often to discover a typo. Now I get error correction feedback with every keystroke and often can see the results of a completed code change in less than a second.
One of my practices is writing Daily Pages (roughly 750 words) of whatever comes to mind.
Today, inspired by your article, I've written 1200 words (so far) that I expect to turn into a blog post. The above is a quick summary of the first part of the response. I'll also examine what I think are some hidden premises in the arguments that you cite that may explain why software is different, and might predict some other fields where the trend might change.
If you are interested in reading, reviewing, or even collaborating on a post that would reconcile the idea that innovation is declining in some areas with what seems certain to be an accelerating rate of innovation in software, let me know.
Meanwhile, I've subscribed, and I'll be chewing my way through your earlier posts.