Exponential Extrapolation has Never Worked
Timing is Everything.
There is an exponential period, and it might grow faster than you thought!
But the exponential period WILL end, and it will probably end faster than you thought!
Sometimes we do not yet know what the salient constraints are, but we sure as shit know they will come.
It’s all about applicable ranges (you could even call that context). A linear amplifier only approximates linearity within its saturation limits. The amplifier’s case makes it easy to see (and thereby design) where the “context” of the linear relationship lies.
Kurzweil himself points to the END of the exponential phase with the appropriate word “singularity”. Singularity means our model breaks. So he’s saying: intelligence will grow exponentially until the singularity, where something ELSE happens. But his something ELSE is actually the exact OPPOSITE of ELSE: the unabated continuation of the exponential growth, forever. (I am assuming we can all agree on and measure intelligence, by the way, just to illustrate how numbingly insane his argument is.)
it’s ok to laugh when we’re wrong; how else do we grow?
“so what is the Singularity?”
“you’re saying this singularity is the point where this exponential growth breaks down?”
“yes something QUALITATIVELY DIFFERENT happens at that point”
“which is the EXACT SAME EXPONENTIAL growth that preceded the singularity?”
“oh. no, it’s SUPER EXPONENTIAL!”
What are the best singularity arguments?
an actual not-wrong thought
I think Eliezer, Kurzweil, Bostrom land exactly in the camp of “this is likely to be the only way it CANNOT BE”
Eliezer Yudkowsky
The future does not need an intelligence explosion to become fundamentally unpredictable!
Daniel Kokotajlo
No one can predict the future; we are all guessing. But sometimes we can at least identify which guesses CANNOT POSSIBLY BE THE RIGHT GUESS.
The one thought that is less wrong is: the exponential will end. Will it end in 30% of the population dying, as in the Black Death? Will it end in terminal velocity? Will it end in Terminator 2: Judgment Day? (This IS a plausible dystopia, with no need for Skynet or even exponential growth at all.)
Give the Devil his Due
To be charitable: “A sufficiently large or long-lasting logistic curve will remain exponential longer than you can remain alive.” (A corollary to “the market can remain irrational longer than you can remain solvent.”)
Maybe AIs will blast off into space to harvest enough energy to keep growing after exhausting the energy around Earth. Maybe this depletes our sunlight so much that we die. We already have the technology to blot out the sun for decades or centuries (nuclear bombs). The question is: how do we never do that? How do we never run AI inference to the point of using up all the electricity and water? Well, it seems physical constraints show up here. How does AWS build more data centers for the AIs to take over?
We built our nuclear stockpiles exponentially but kept the number of launches at zero, forestalling possible S-curves of nuclear launches which would wipe us out before settling back to a power law.
We reach terminal velocity in a skydive and then pull a parachute to change the velocity again.
We don’t buy exponentially bigger clothes in expectation of growing and growing.
So we know the shape (logistic). The question becomes, what is the timing and size of the logistic growth?
- Can we predict these based on the announcement of a technology? (What is the predictive a priori data? When can we get it?) Could anybody have predicted the 6 wavelets identified in the paper?
- How accurately and how soon can we predict the “characteristic time” and amplitude of the logistic, given some a priori information?
- We can then ask whether we can predict future wavelets at all. Did we know at wavelet 3 that we would have 4, 5, and 6?
- Doomers assume AI would be able to generate arbitrarily many wavelets, hence building an approximate exponential in practice out of multiple stacked logistic curves. So this is the debate, back to the first question: can we say anything about what can or cannot happen in terms of an AI discovering and applying each logistic growth breakthrough? Indeed, if anything could predict the growth of knowledge, then there wouldn’t be any new knowledge, because the prediction IS the new knowledge. Now we are back to: why did we gain any knowledge at all, and why at the rate we did? Was there an identifiable constraint in nature? Too few people? No way to write things down? Mass cooperation to build tools to extend our senses (including mining, building, trading, shipping, designing)?
- Instead of “predicting”, maybe we should “prepare”: be exposed to shocks that benefit you (anti-fragile) instead of sweating the probabilities. Even if they’re right, we should try to prepare. How do we prepare for the possibility of large numbers of wavelets?
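One way to see why the “characteristic time and amplitude” question is so hard: in the early phase, wildly different ceilings fit the data almost equally well. A minimal sketch (hypothetical data and a crude grid search, not the paper’s fitting method):

```python
import math

def logistic(t, K, r, t0):
    """Logistic curve K / (1 + e^(-r (t - t0)))."""
    return K / (1 + math.exp(-r * (t - t0)))

def best_fit_sse(K, data):
    """Grid-search r and t0 for a fixed ceiling K; return the best sum of squared errors."""
    return min(
        sum((logistic(t, K, r / 10, t0) - y) ** 2 for t, y in data)
        for r in range(1, 15)      # r in 0.1 .. 1.4
        for t0 in range(5, 40)     # midpoint candidates
    )

# Hypothetical early-phase observations from a "true" logistic with K = 1000
early = [(t, logistic(t, 1000, 0.5, 15)) for t in range(10)]

# Ceilings differing by 10x all fit the early data to within a few percent:
for K in (500, 1000, 5000):
    print(K, round(best_fit_sse(K, early), 3))
```

The point of the sketch: before the inflection, the data barely constrains the amplitude $K$, so any a priori prediction of the eventual plateau has to come from somewhere other than the curve itself.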
Power laws vs Exponentials
Exponential : $y = ae^{bx}$ (Unsustainable, temporary)
Power Law : $y = ax^b$ (Sustainable, natural)
Nature tends toward power law distributions precisely because they represent equilibrium states that can persist, while exponentials represent disequilibrium that must eventually correct.
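A quick numeric illustration of the “applicable ranges” idea: the coefficients below are arbitrary, chosen only to show that which curve dominates depends entirely on where you look.

```python
import math

def exponential(a, b, x):
    # y = a * e^(b x): explosive, eventually dominates everything
    return a * math.exp(b * x)

def power_law(a, b, x):
    # y = a * x^b: heavy-tailed but sustainable
    return a * x ** b

# Within its range, a steep power law dwarfs a mild exponential...
print(power_law(1, 3, 10) > exponential(1, 0.1, 10))       # cubic wins at x = 10
# ...but any exponential eventually overtakes any power law.
print(exponential(1, 0.1, 1000) > power_law(1, 3, 1000))   # exponential wins at x = 1000
```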
Logistic Curve: $y = K/(1+ae^{-rt})$
Early phase: When $y \ll K$, it approximates exponential growth: $y \approx (K/a)e^{rt}$
Late phase: As $y$ approaches the carrying capacity $K$, the growth rate decays toward zero and $y$ saturates
Transition: The inflection point (at $y = K/2$) where the exponential “breaks” and constraints dominate
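The three phases can be checked numerically. A small sketch with made-up parameters ($K = 1000$, $a = 999$, $r = 0.5$):

```python
import math

K, a, r = 1000.0, 999.0, 0.5  # made-up parameters for illustration

def logistic(t):
    # y = K / (1 + a e^(-r t)); starts near K/(1+a), saturates at K
    return K / (1 + a * math.exp(-r * t))

def early_approx(t):
    # the early-phase exponential: y ~ (K/a) e^(r t), valid while y << K
    return (K / a) * math.exp(r * t)

# Early phase: the exponential approximation tracks the logistic closely.
print(logistic(2), early_approx(2))
# Late phase: the logistic saturates just under K while the "exponential" explodes.
print(logistic(30), early_approx(30))
```

If you only observed the early phase, the two curves would be indistinguishable; that is the whole trap of exponential extrapolation.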
This demonstrates the point: there are no true exponentials, only nested logistic transitions that create exponential-like behavior when viewed at the wrong time scale.
The “bi-logistic” captures the two major technological eras, while the six waves show the detailed mechanism of how each era actually unfolds through discrete breakthrough cycles.
I read “In the next 5-10 years my family and I will rent more AI generated movies on prime than those made without.” I think this is dumb on many levels.
First, rent? Renting movies is already an example of a metric that has become obsolete under a new paradigm (that of the subscription streaming service)! Why will your family rent ANY movies? Why do you think the concept “movie” will survive 5-10 years? (Hint: Lindy implies movies will stick around; we still have 2-hour movies and cinemas even though we also have TikTok and YouTube and genAI video.) “Renting AI generated movies” is too unimaginative; will your family even watch one thing together? Is that what they do now?
Second, exponential extrapolation has never, ever been correct. Exponential patterns are inherently unstable and mark a state of transition from one stable paradigm to another. Moore’s law has roughly held for decades, but every extrapolation has a deadline (Durden: “On a long enough timeline, the survival rate for everyone drops to zero”).
People now pay for book summaries (Blinkist) and listen to podcasts.
I agree the process of storyboarding/prototyping is already much cheaper and faster. So blocking shots, pitching concepts should become easier rapidly.
Markets are not efficient but they can get more efficient.
NOTE: get the actual data behind these samples and then show the extrapolation vs actual.
The actually useful question is not to extrapolate but to guess or act to bring about the new stable state after the phase transition. Or even “how long will this exponential transition phase be? what will happen afterwards? How can I be anti-fragile through it?”
Human height
Terminal Velocity
Stock Market growth
Moore’s Law
From the PLOS ONE analysis linked below: “transistor density evolution of the past decade conforms to a linear trend connoting slow and incremental advances, but also signifying a substantial departure from Moore’s exponential law.”
https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0256245
What we call “Moore’s Law exponential growth” is actually:
- Six logistic curves (individual breakthroughs):
  - First commercial planar transistor (1959)
  - MOSFET technology (1964)
  - Silicon gate technology enabling the 4004 processor (1971)
  - High-density short-channel MOS for the 8086 (1977)
  - Advanced integration allowing complex circuitry like the 80486 (1989)
  - Deep-UV lithography deployment (1990s)
- Grouped into two larger logistic curves (major paradigm shifts)
- Which together create the illusion of sustained exponential growth
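The “illusion of sustained exponential growth” from stacked waves is easy to demonstrate numerically. A sketch with invented parameters (NOT the fitted values from the PLOS paper): six logistic wavelets, each arriving later and roughly doubling the previous ceiling.

```python
import math

def logistic(t, K, r, t0):
    # one "wavelet": a single breakthrough that saturates at K
    return K / (1 + math.exp(-r * (t - t0)))

def stacked(t, waves):
    # sum of wavelets; every component saturates, but the sum keeps climbing
    return sum(logistic(t, K, r, t0) for K, r, t0 in waves)

# Invented parameters for illustration only: ceiling doubles every 6 time units.
waves = [(10 * 2 ** i, 0.8, 5 + 6 * i) for i in range(6)]

# On a log scale, regular doublings read as a straight line, i.e. "exponential",
# even though the whole stack is bounded by the sum of the ceilings (630 here).
for t in range(0, 40, 6):
    print(t, round(math.log10(stacked(t, waves)), 2))
```

Run it and the log-values climb by a near-constant step per interval (exponential-looking), until no new wave arrives and the sum flatlines at the combined ceiling — exactly the doomer crux above: the “exponential” lasts only as long as someone keeps supplying fresh wavelets.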
Amazon survived the dot com bubble. So did PayPal. Investment mania was unsustainable but not everyone knew what the new paradigm would be.
Lindy Effect
follow the trend
exponential trend talk
- amount of horseshit in New York gutters before the automobile
- covid-19 deaths until vaccines and slowdown
  - we did something about this
- nuclear bomb explosions throughout earth
  - could still happen…
- population bomb: mass starvation from overpopulation
  - the green revolution happened instead
- .com companies in 1999
- ice ages happened
- Moore’s law
But AI has not yet done much, has it? We use it a lot…
- dark leisure
- expected productivity gains from computing have not shown up in the statistics since the 1970s
- higher unemployment for entry-level
It does seem like singularities CAN occur. We CAN blow up the world with nuclear weapons. A virus CAN decimate the human population, and HAS (the Black Death!).