If we find out in 2024, was o1's Transformer base trained on 10+x as much compute as GPT-4's?
Ṁ1080 · Jan 2
19% chance
This question is managed and resolved by Manifold.
Related questions
How much compute will be used to train GPT-5?
Will there be an OpenAI LLM known as GPT-4.5 by 2033?
72% chance
Will there be an LLM (as good as GPT-4) that was trained with 1/100th the energy consumed to train GPT-4, by 2026?
82% chance
Will xAI release an LLM with BIG-Bench score as good as GPT-4 Turbo before the end of 2024?
55% chance
Will an open source model beat GPT-4 in 2024?
76% chance
In yottaFLOPs (10^24), how much compute will GPT-4 be trained with?
22
Will Inflection AI have a model that is 10X the size of the original GPT-4 at the end of Q1 2025?
14% chance
Will $10,000 worth of AI hardware be able to train a GPT-3 equivalent model in under 1 hour, by EOY 2027?
18% chance
Will GPT-4 be trained (roughly) compute-optimally using the best-known scaling laws at the time?
30% chance
Will it be possible to disentangle most of the features learned by a model comparable to GPT-4 this decade?
39% chance