Hacker News

Given just how poorly we react to terrorism and sharks, I’m not sure that’s the direction I’d want to be headed in. I’m in favor of automation, but not with Uber’s “move fast and break things” attitude. Responsible, ethical, and careful development of dangerous technology, even if the danger promises to be reduced compared to the norm in some future iteration, is a must.

Keep in mind that some version of “killed by robots” has been in the zeitgeist of fear for decades now. In that sense, “sharks and terrorists” might not be a bad comparison. It will not take much to ruin a good thing (automation) with typically scummy practices. The important thing to remember is that automation doesn’t require the likes of Uber playing fast and loose with it to become a reality.



> Responsible, ethical, and careful development of dangerous technology, even if the danger promises to be reduced compared to the norm in some future iteration, is a must.

Agreed. And I'd like to see an approach similar to an NTSB investigation of an air crash: not to place blame, but to identify root causes and mitigate the chance that they recur.

Air travel is exceptionally safe. Has been for many decades. Yet we still investigate any air crash because any crash means something went wrong, perhaps something that can be corrected.

Self-driving cars need to be treated the same way, at least until well after they have demonstrated themselves to be safer than human drivers.


This is the way to go, and it would help (I think) to ease the calls I’ve seen for holding individual devs responsible. The point is not to collect scalps, but to prevent tragedy and malfeasance. Holding companies responsible, sure, but making some random programmer who missed a bug the scapegoat seems like a way to ensure companies avoid responsibility by foisting it onto the low man on the totem pole; that can’t be allowed to happen with lives on the line. Like an aircraft, self-driving car software is never going to be the work of a single dev. The NTSB isn’t perfect, but as these things go, it's very admirable.

IIRC the NTSB is investigating this case too.


"Responsible" and "ethical" are not words usually associated with Uber, but in this case it seems like it was simply an accident. Computers cannot beat physics: if the woman suddenly stepped in front of a car that was physically unable to brake in time, it’s a tragic accident.

The idea of investigations modeled on air travel is pretty good, and also much easier to execute given that these cars record everything on camera.


With the level of tech commonly posted on this website (and in general), there seem to be many surprising dangers that the average lurker won't know about but the experienced developer will. It seems better to seek out and point out the non-obvious ('be careful about x, y, z') rather than hide such warnings in obscure blog posts, or hope people will pick up on some implicit message (especially since the evidence can sometimes be conflicting). A good example of this is the BIS export warning for crypto (assuming you're in the US): I've talked to several 'experienced' blockchain people who had no idea about it, yet if you browse enough crypto GitHub pages you can eventually find it.


This is commonly called 'attempting to save someone's butt with a reasonable explanation, rather than trying to make a quick buck at their detriment.'


Did you reply to the correct post? This seems unconnected to the topic at hand, but maybe I’m missing something.


I was specifically responding to your comment "Responsible, ethical, and careful development of dangerous technology, even if the danger promises to be reduced compared to the norm in some future iteration, is a must," not to the topic of Uber or self-driving cars specifically.


How is Uber's experiment different from that of the twenty other companies working on the same thing?

