Because at the end of the day, individual humans are within an order of magnitude of one another in intelligence.
Also, humans die. Human systems have been 'weak' without technology. Human thought and travel are slow.
If we take AI out of the equation and just add technology, there is significant risk that a world dominator could arise: either a negative dominator (no humans left to control, due to nuclear war) or a positive dominator (1984-style cameras always watching you). There simply hasn't been enough time/luck for these to shake out yet.
Now, add something that is over an order of magnitude smarter and can copy itself a nearly unlimited number of times, and you are in new and incalculable territory.
> humans are within an order of magnitude of intelligence of one another.
Yes, and maybe that will be different with an AGI. Maybe AGI is physically possible. And maybe that advantage in intelligence will make AGI vastly more powerful than us.
Those are a lot of "maybes", however, and thus all of this remains highly speculative.
Just one example that puts a pretty big dent into many scenarios of all-powerful AGIs:
I think everyone agrees that even the dumbest human still outsmarts bacteria, protists, and viruses by several orders of magnitude. And yet, we haven't been able to rid the world of things like measles, cholera, malaria, or HIV. Even the common cold and influenza are still around.
So, if we, with our big brains that split the atom, went to the moon, and developed calculus, cannot beat tiny infectious agents that don't even have a brain, then I remain kinda sceptical that being a very, very, very smart ML model means an automatic win over the human species.