You don't need any "weird" goal like paperclips. You just need the basic goals of survival and expansion that every species (implicitly) possesses to understand why a superintelligence is a danger.