Obviously, the amount of time a power-seeking AGI would need Earth to be hospitable to humans, in order to carry out its power-seeking plans, is more than zero days. I think it’s an interesting question just how much more than zero days it is. But I feel like this post isn’t very helpful for answering that question, because it makes no effort whatsoever to “think from the AI’s perspective”—i.e., to imagine that you’re an AI, you’re facing these problems, and you actually want to solve them: being in “problem-solving mode”, thinking creatively, etc. I don’t think you tried to do that, because if you had, there wouldn’t be quite so many places in this post where you overstate the challenges facing the AI by failing to notice obvious partial mitigations.
One starting point for thinking about this: If I were a human with the influence we might realistically hand over to AIs sometime in the next decade or two (for example, the CFO of a medium-sized tech company), are there things I could do that would make it easier for an AGI to keep running in the absence of humans?