On January 1, 2027, a Transformer-like model will continue to hold the state-of-the-art position on most benchmarks
62% chance
Tracking external bet: https://www.isattentionallyouneed.com/
This question is managed and resolved by Manifold.
https://blog.rwkv.com/p/eagle-7b-soaring-past-transformers
It even links the wager page among its bragging points:
"All while being an 'Attention-Free Transformer'"
Related questions
Will transformers still be the dominant DL architecture in 2026?
61% chance
Will Transformer based architectures still be SOTA for language modelling by 2026?
70% chance
Will the most capable, public multimodal model at the end of 2027 in my judgement use a transformer-like architecture?
55% chance
A major ML paper demonstrates symbolic-enhanced transformer successor outperforming standard transformers by March 2025
20% chance
Which AI will be the best at the end of 2025?
Will any model get above human level on the Simple Bench benchmark before September 1st, 2025?
69% chance
Will superposition in transformers be mostly solved by 2026?
73% chance
Will an AI achieve >85% performance on the FrontierMath benchmark before 2027?
64% chance
By EOY 2025, will the model with the lowest perplexity on Common Crawl not be based on transformers?
27% chance
When will a non-Transformer model become the top open source LLM?