© 2025 AI Impacts
Yeah, "overhang" has been used to mean 'inference is cheap,' but that usage seems less common recently. See https://www.lesswrong.com/posts/icR53xeAkeuzgzsWP/taboo-compute-overhang. This "mismatch" seems worth noticing because it implies that training run X could cause us to go from _nobody having access to X-level capabilities_ to _anyone with the model weights being able to run lots of X-level inference_, but marginal changes in the "mismatch" don't seem very decision-relevant to me (and cheapness-of-inference in absolute terms seems more important than the mismatch).