Agentic AIs are more useful than mere tools, so they will be built; and unlike tools, agents have motivations. The alignment problem is how to give them exactly the right motivations, so that they keep us as pets rather than killing or factory farming us, much as humans keep cats.