Will the US implement software export controls for frontier AI models by 2028?
2028 · 75% chance

This market will resolve YES if, by 2028, the US creates a policy to control the export of frontier AI models, defined as models with highly general capabilities (above a certain threshold) or trained with a compute budget above a certain level (e.g., as much compute as $1 billion can buy today). The policy may also restrict API access. Such a policy would aim to limit the proliferation of AI models that pose significant risks, for example by preventing powerful AI technology from falling into the wrong hands.

Luke Muehlhauser from Open Philanthropy suggests this idea in his April 2023 post, "12 tentative ideas for US AI policy." This market idea was proposed by Michael Chen.
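
For intuition on the dollar-denominated threshold mentioned above, here is a minimal sketch that converts a training-compute budget into a raw operation count at an assumed price per petaFLOP-day. The price constant is a hypothetical placeholder, not a figure from the market description or any policy.

```python
# Illustrative only: translating a "compute that $X can buy" threshold into
# a raw operation count. DOLLARS_PER_PETAFLOP_DAY is a hypothetical
# placeholder price, not an actual market rate.

DOLLARS_PER_PETAFLOP_DAY = 100.0          # assumed price (placeholder)
OPS_PER_PETAFLOP_DAY = 1e15 * 86_400      # operations in one petaFLOP-day


def budget_to_ops(budget_dollars: float) -> float:
    """Convert a dollar budget into total operations at the assumed price."""
    petaflop_days = budget_dollars / DOLLARS_PER_PETAFLOP_DAY
    return petaflop_days * OPS_PER_PETAFLOP_DAY


if __name__ == "__main__":
    # The $1 billion figure comes from the market description above.
    print(f"$1B at the assumed price ≈ {budget_to_ops(1e9):.1e} operations")
```

The resulting number depends entirely on the assumed price, which is why the market description leaves the exact threshold open.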


Should resolve YES given this updated policy from the US Bureau of Industry and Security (within the Commerce Department):

[...] BIS is requiring a license to export, reexport, or transfer (in-country) the model weights of any closed-weight AI model—i.e., a model with weights that are not published—that has been trained on more than 10^26 computational operations. [...]

To ensure that the licensing process consistently accounts for the risks associated with the most advanced AI models, BIS has decided to apply a presumption of denial review policy (implemented in § 742.6(a)(12)) to every license application involving the model weights of those models.

From page 34 of the "Framework for Artificial Intelligence Diffusion": https://public-inspection.federalregister.gov/2025-00636.pdf
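
To make the quoted 10^26-operation threshold concrete, here is a minimal sketch using the common ~6 × parameters × tokens approximation for total training operations. The model sizes and token counts below are hypothetical placeholders, not figures from the BIS rule or from any specific model.

```python
# Rough sketch: estimate training compute with the common ~6 * N * D
# approximation (N = parameter count, D = training tokens) and compare it
# to the 10^26-operation threshold quoted from the BIS rule above.
# All run configurations below are hypothetical placeholders.

THRESHOLD_OPS = 1e26  # threshold cited in the BIS framework


def estimated_training_ops(params: float, tokens: float) -> float:
    """Approximate total training operations as 6 * params * tokens."""
    return 6.0 * params * tokens


hypothetical_runs = {
    "70B params, 15T tokens": (70e9, 15e12),
    "400B params, 30T tokens": (400e9, 30e12),
    "1T params, 50T tokens": (1e12, 50e12),
}

for label, (n_params, n_tokens) in hypothetical_runs.items():
    ops = estimated_training_ops(n_params, n_tokens)
    status = "above" if ops > THRESHOLD_OPS else "below"
    print(f"{label}: ~{ops:.1e} ops ({status} the 1e26 threshold)")
```

Under this rough approximation, only the largest hypothetical run crosses the quoted threshold, which gives a sense of the scale the rule targets.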

Great market!
