Before 2028, will there be enough inference capacity to generate 30T frontier model tokens per day?
40%
chance
In "Situational Awareness: The Decade Ahead", Leopold Aschenbrenner claims:
"Another way of thinking about it is that given inference fleets in 2027, we should be able to generate an entire internet's worth of tokens, every single day."
Resolves YES if, by the end of 2027, there is enough deployed inference capacity to generate 30 trillion tokens in a 24-hour period using a combination of frontier models. "Frontier models" is meant in the sense that GPT-4 is a frontier model today, in mid-2024.
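As a rough sanity check on the resolution threshold, the 30 trillion tokens/day figure can be converted into a sustained fleet throughput. The per-accelerator rate below is a purely hypothetical assumption for illustration, not a figure from the market or the report:

```python
# Back-of-envelope: sustained throughput implied by 30T tokens/day.
TOKENS_PER_DAY = 30e12
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

tokens_per_second = TOKENS_PER_DAY / SECONDS_PER_DAY
print(f"Required fleet throughput: {tokens_per_second:,.0f} tokens/s")
# ~347 million tokens/s, sustained around the clock.

# Hypothetical per-accelerator rate (an assumption, not a sourced number):
# suppose each accelerator sustains 1,000 tokens/s of frontier-model inference.
ASSUMED_TOKENS_PER_SEC_PER_CHIP = 1_000
chips_needed = tokens_per_second / ASSUMED_TOKENS_PER_SEC_PER_CHIP
print(f"Accelerators needed at that rate: {chips_needed:,.0f}")
```

Under that assumed rate, the threshold corresponds to a few hundred thousand accelerators running continuously; the real requirement scales inversely with whatever per-chip throughput frontier models actually achieve in 2027.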
This is one of a series of markets on claims made in Leopold Aschenbrenner's "Situational Awareness" report.
This question is managed and resolved by Manifold.
Related questions
Will an AI achieve >85% performance on the FrontierMath benchmark before 2028?
70% chance
Will a new lab create a top-performing AI frontier model before 2028?
60% chance
Will anyone train a TokenFormer model at scale before 2026?
25% chance
Will an AI achieve >85% performance on the FrontierMath benchmark before 2027?
64% chance
100GW AI training run before 2031?
37% chance
Will OpenAI inference costs fall by 100x over the next 18 months?
32% chance
Will a US Department of Energy high performance computing cluster be used to train a foundation model of 500B or more parameters by January 1st, 2025?
37% chance
$1T AI training cluster before 2031?
56% chance
Before 2028, will any AI model achieve the same or greater benchmarks as o3 high with <= 1 million tokens per question?
69% chance
10GW AI training run before 2029?
43% chance