
Leaving aside the fact that anthropomorphization is a perfectly valid discursive shorthand - yes, exactly. LLMs have an insanely high risk profile; they shouldn't be granted access to anything without a human in the loop.

> any sane engineer won't just let an LLM loose, they'll build guardrails around it

Sure seem to be plenty of insane engineers around these days. And, worse - plenty of them with good marketing teams that can convince non-engineers that their systems are "safe" and "reliable".
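For what it's worth, even the most basic guardrail is cheap to build. Here's a minimal sketch of a human-in-the-loop gate, where every action the LLM proposes must pass a hard allowlist and then explicit human approval before anything runs (all names here are illustrative, not from any real framework):

```python
# Hypothetical allowlist of actions considered safe enough to even ask about.
ALLOWED_ACTIONS = {"read_file", "list_dir"}

def gate(action: str, args: dict, approve) -> bool:
    """Permit an LLM-proposed action only if it is allowlisted AND a human approves.

    `approve` is a callback that presents (action, args) to a person and
    returns their decision; here it's just a stand-in lambda.
    """
    if action not in ALLOWED_ACTIONS:
        return False  # hard block: never even reaches a human
    return bool(approve(action, args))  # human-in-the-loop confirmation

# Allowlisted action with human approval passes; anything else is refused.
assert gate("read_file", {"path": "notes.txt"}, lambda a, kw: True)
assert not gate("read_file", {"path": "notes.txt"}, lambda a, kw: False)
assert not gate("delete_db", {}, lambda a, kw: True)  # blocked by allowlist
```

The point being: the gate is trivial; what's hard is resisting the pressure to remove it when it slows the demo down.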


