
Nuclear weapons have a very tangible negative outcome which everyone can imagine. Existential AI risk has no meaning right now outside of a blurry vision of something like Skynet. The issues are not alike.


The day AI can be used in war, through robots for example, losing a war might mean total annihilation. That goes beyond the risk of nuclear weapons.


The day in which Arnold Schwarzenegger robots travel back in time to kill Sarah Connor... very risky!



