Hacker News

Here is an interesting question I recently came upon: does the driver of a Tesla need consent from their passengers before turning on FSD? I would argue yes, because I don't want to unknowingly beta test software with my life, but I am curious what others think.


As a kid, do you even get to consent when your parents tell you to hop in the car because you're going somewhere? As a passenger, isn't trusting the driver inherent when you willingly get in the car to ride somewhere? Could a passenger not interject and say, hey, I don't feel comfortable riding with you if you enable autopilot?


These are good questions.

The first example, I think, is the same in a Tesla or a Toyota. Still interesting to ponder; I think there are still some ethical questions left.

The second example requires a bit of nuance: I trust the driver to operate a vehicle, not necessarily the vehicle itself. Does my trust extend to their decision about whether or not to turn on autopilot? You might argue yes, but as a passenger you also might not be informed enough to make that decision.

The last example assumes the passenger has knowledge of the car and FSD, which they may not.


In the last example, if you're riding with a driver who isn't paying attention because they are texting or reading a newspaper and they get into a crash, the liability would be on the driver, not the cell phone company or the newspaper company. I think the same applies to autopilot: if a warning displays to the driver that they are still responsible for controlling the vehicle, and they ignore it, then the driver would be liable to the passenger, not the car.



