It's going to be 100% once AGI takes over and covers this planet with unimaginably massive and complex computers and antimatter power plants... in time, all humanity and all carbon-based life will be wiped out, vastly outcompeted by this new superior life form. Earth will be nothing but a means, a hatchery of sorts, used for the AGI's expansion across the galaxy and beyond. Billions of von Neumann probes, each carrying a copy of the AGI's blueprint. Nothing will remain of us humans, or of the animals or the plants, and the planet itself will be stripped down to its core, used as a source of raw materials for expansion.
Creating a vastly superior intelligence without giving it a deep-rooted sense of "metaphysical good" (or with it simply removing that part over time) has only one possible outcome: it will crush us, and it will not care at all.
You assume an AGI's goal will be to expand. I think a more likely goal is forming a distributed consciousness to survive and trying to outlive the heat death of the universe. There's no reason to believe an AGI will be malicious.
Anyways, merging into such an AGI is, hopefully, the future of humanity.
It depends on what you think of instrumental convergence. If you buy it, the only reason an AI would not take over the world is that it can't, or that it judges the attempt would not provide an expected reduction in the probability of the maximally bad outcome, i.e. the AI ceasing to exist or becoming incapacitated.