Reinterpreting “AI and Compute”
This is a guest post by Ben Garfinkel. We revised it slightly, at his request, on February 9, 2019. A recent OpenAI blog post, “AI and Compute,” showed that the amount of computing power used by the most computationally intensive machine learning projects has been doubling roughly every 3.4 months. The post presents this trend as a reason to prepare for “systems far outside today’s capabilities.” Greg Brockman, the CTO of OpenAI, has also used the trend to argue for the plausibility of “