My guess is that the car is looking for a place to pull over (perhaps because the rider pressed the "pull over" button). In this mode, the car hugs the right side of the road until it can find a spot to stop, which would normally work alright, except in this one particular spot where the "right side of the road" is only this tiny island.
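If that's right, the failure is easy to sketch. Here's a toy model in Python, purely illustrative and nothing to do with Waymo's actual planner: a "follow the right edge until a legal stopping spot appears" search has no exit condition when the right edge is a small closed loop with no legal spot on it.

```python
# Toy sketch of a guessed-at pull-over search; all names are made up.
from dataclasses import dataclass

@dataclass
class CurbSegment:
    can_stop: bool  # e.g. long enough, no hydrant, no crosswalk

def pull_over(right_edge: list[CurbSegment], start: int,
              max_steps: int = 100) -> int | None:
    """Advance along the right edge until a stoppable segment is found.

    Returns the index of the chosen segment, or None if we give up.
    (A real planner with no max_steps would circle forever.)
    """
    i = start
    for _ in range(max_steps):
        if right_edge[i].can_stop:
            return i
        i = (i + 1) % len(right_edge)  # the island's edge wraps around
    return None

# A tiny island where every curb segment is unusable:
island = [CurbSegment(can_stop=False) for _ in range(8)]
print(pull_over(island, start=0))  # None: lap after lap, nowhere to stop
```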
They tried that. In an emergency they just stopped. The problem was they were stopping in the middle of traffic and in the middle of intersections, so it's now a pull-over button. Or just open the door.
The button is giant and blue and impossible to miss, and you can see from the video that it isn't on the screen. The screen in the video is mostly brown with a sliver of blue on the side.
I guess the only option they have is "pull over", which in this case just caused the car to continue circling looking for a safe place to pull over. If they had an actual kill switch, we'd probably be watching another video of some guy on a call to Waymo support while stuck in the middle of a highway.
To be clear, I'm talking specifically about the first line of support at Waymo here. I am not precluding that they have higher levels of control behind layers of authorisation.
Yes, in much of the world there are mandatory passenger-facing emergency brake levers in every carriage of passenger trains. The US is the outlier here.
And yes, passengers should absolutely be able to bring their vehicle to an immediate stop. It's an "emergency brake"! Of course you need an emergency brake in an autonomous car! What exact alternative are you proposing for when you're in an AI-operated car hurtling under the chassis of a white truck that it failed to detect in snow conditions?
It seems like an incredibly obvious and basic legislative requirement for self-driving cars to have some kind of immediate manual brake for emergencies. I'm kind of shocked that that apparently isn't the case now?
Sounds likely, in which case there needs to be more of a "break glass in case of emergency" control, one which gradually lowers the maximum speed cap of the vehicle.
So even if the vision/pathfinding believes there is nowhere to park and nowhere else to turn, it will still coast to a stop in a way that is no less safe than a normal car running out of gas and stalling on the road.
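As a minimal sketch of that idea (hypothetical names and numbers, not any vendor's API): the planner keeps steering as usual, but its commanded speed is clamped by a cap that ramps to zero after the control is used, so the car coasts to a stop regardless of what it believes about parking spots.

```python
# Sketch of a "break glass" speed cap that decays to zero; the 20 s
# ramp and ~30 mph cap are illustrative assumptions.
def emergency_speed_cap(seconds_since_press: float,
                        normal_cap_mps: float = 13.4,  # ~30 mph
                        rampdown_s: float = 20.0) -> float:
    """Max speed allowed t seconds after the emergency control is used."""
    if seconds_since_press <= 0:
        return normal_cap_mps
    remaining = max(0.0, 1.0 - seconds_since_press / rampdown_s)
    return normal_cap_mps * remaining

# The drive loop would clamp its command:
#   speed = min(planned_speed, emergency_speed_cap(t))
# After 20 s the cap is 0 and the car has stopped, even if it
# "found nowhere to park".
for t in (0, 5, 10, 20):
    print(t, round(emergency_speed_cap(t), 2))
```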
It's easy enough to imagine an actual emergency which would necessitate remote or local intervention to stop the car, and the call seems to indicate that they don't have an emergency override or at least not without escalation.
What if there were:
- a medical emergency of the passenger
- a crash up ahead
- a fire up ahead
- an earthquake
- a flood
- a malfunction of the driverless car
- really anything that would make you pull over your actual car to the side of the road for your own safety or urgent needs?
And then you have to ask whether, even with an e-stop button, you are in a less safe situation if you don't have the ability to reach the wheel from the back seat.
I usually think it's a stretch when people compare new tech to old sci-fi stories, but "brilliant machine just runs in circles endlessly and nobody can stop it" is straight out of I, Robot.
So the guy is in the car and he's concerned about catching his flight, but when the operator asks him to do something in the Waymo app, he doesn't want to do it. Could it be that he'd rather keep filming for internet notoriety than stop filming and actually solve his problem?
Could it be that Waymo should just be able to stop the car? It doesn't seem at all ridiculous to you that he is already on the phone with them, and they just read the script telling him to pull out his phone and fire up the app? Just what I want: to ride in a car with a Comcast level of customer service.
>Could it be that Waymo should just be able to stop the car?
That kind of remote control opens up the possibility - maliciously or accidentally, likely or unlikely - for every Waymo in the fleet to abruptly stop, regardless of whether it's safe to do so. That scenario is orders of magnitude worse than "sometimes a Waymo gets lost in a parking lot and it takes a thirty second call to fix it".
You don't think Google is able to stop a Waymo vehicle? They can literally control it remotely. The issue is that the person behind the phone did not have the authorization or access.
Of course not, but I'm open to the idea that they should be able to escalate to someone who can. This is a transitional period with self-driving cars, and it helps to mitigate serious potential safety issues.
Ideally, in the future you would have to expressly press some button to send an OTP to someone you wanted to allow to control your car, with fine-grained permissions.
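Something like this consent flow, sketched in Python with entirely hypothetical names, TTLs, and permission strings: the passenger mints a short-lived one-time code scoped to specific actions, and support can only act within that grant.

```python
# Hedged sketch of passenger-granted, fine-grained remote control.
import secrets
import time

GRANTS: dict[str, dict] = {}  # code -> {"perms": set, "expires": float}

def issue_grant(perms: set[str], ttl_s: int = 300) -> str:
    """Passenger-initiated: mint an OTP scoped to specific permissions."""
    code = secrets.token_hex(4)  # e.g. "9f3a1c2e", read aloud to support
    GRANTS[code] = {"perms": perms, "expires": time.time() + ttl_s}
    return code

def authorize(code: str, action: str) -> bool:
    """Support-side check: grant exists, hasn't expired, covers the action."""
    grant = GRANTS.get(code)
    if grant is None or time.time() > grant["expires"]:
        return False
    return action in grant["perms"]

otp = issue_grant({"pull_over", "unlock_doors"})  # but not "reroute"
print(authorize(otp, "pull_over"))  # True
print(authorize(otp, "reroute"))    # False: outside the grant
```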
Will that future happen? Probably not, and we'll probably see extreme corpgov oversight and mass reduction in individual freedom in return for convenience.
Anyway, I was only responding to OP's claim that support for remote access capabilities could lead to an exploit: there are already remote access capabilities in these cars.
This makes sense. If one thing is bad, another somewhat similar thing is good. Being trapped in a moving vehicle without a driver is not bad because being in a vehicle with a person is bad. Similarly getting hit with a brick is not bad because getting hit with a shovel is bad.
> Could it be that he'd rather keep filming for internet notoriety than stop filming and actually solve his problem?
The guy kept filming for under sixteen seconds after asking about and receiving clarification that she was unable to intervene with the car and needed him to use the app. I like the idea that this somehow indicates bad faith or ineptitude in your estimation.
The point is worth making, but there are some serious drama queen vibes that make it feel pretty overblown. If an Uber is late to the airport, a reasonable person doesn't threaten the driver with covering the cost of their flight.
If an Uber driver caused you to miss a flight by driving around a parking lot in circles at a speed at which you couldn't exit the vehicle, you don't think it would be a reasonable request for the customer to ask Uber to make it right?
Fair enough, there is a difference. But now we are not looking at a missed flight so much as attempted kidnapping or false imprisonment or some other much more serious crime. Which is interesting to think about with the Waymo example, but hard to take seriously in the context of the video, since the rider declines to do what the customer service rep asks them to do (or at least appears to, for the sake of producing additional outrage for their video).
It's interesting how quickly we've gone from discussing an interesting failure mode of autonomous robots that travel in public spaces, and switched to calling it PEBKAC.
This is unironically a great signal that society is willing to accept self-driving cars. Even computer security isn't this good at playing blame-the-user.
"Society" didn't blame the user, I think "society" would have no problem blaming the car. Hacker News isn't representative of society at large.
The signal you're seeing is the tendency of tech people to consider any risk to human safety to be less important than the benefits of the technology itself, and to always blame the human and never the tech.
You also see this manifest in any conversation about the failure modes of AI, in the inevitable knee-jerk response of "humans also do x."
Yes, that could be. And you could focus on being annoyed at this guy's social media behavior, if you like. However, it doesn't mitigate, and is less important than, the problem of the car getting into this state and Waymo not having control over it.
"…the operator asks him to do something in the waymo app he doesn't want to do it."
Why are you trying to offer an excuse for the inexcusable?
The app should have nothing to do with it. Where are the emergency stop and exit controls for the passenger? He should be able to exit the vehicle at any time.
I'm fed up with this bad tech and the fact that governments let Big Tech act irresponsibly and get away with this shit. Why aren't there regulations in place before this tech is allowed loose on an unsuspecting public?
If CEOs of tech companies were held directly responsible with the threat of jail time it'd stop almost instantly.
This poor man was forced to slowly circle around a parking lot eight times! It took over five minutes before they fixed the problem! He almost missed his flight! Outrage, outrage.
Reckon that's a bridge too far even for those miserable reprobates. Besides, it'd only take a couple of them being locked away to scare the shit out of the rest. There's nothing like making an example of a few to bring about better behavior.
The real problem these days is that the culprits who perpetrate this shit are able to hide behind corporate walls—at worst the company gets fined (albeit rarely) but those who are actually responsible escape both freely and anonymously. Laws need to be changed to make employees directly accountable for their actions and to take the consequences.
Unfortunately, introducing such laws is somewhat complicated and would meet with huge resistance for many reasons too involved to list here. Nevertheless, there's one I should mention: sometimes unavoidable mistakes occur (or important facts remain unknown) even after due diligence and rigorous testing. Laws should not hold individuals responsible for force majeure† (so-called acts of God) that they have no control over. Any change in the law would have to allow for this.
That said, we know damned well that that provision/exception is totally irrelevant in most of these cases/customer-unfriendly fuck-ups; clearly they're as guilty as hell.
I'd advocate another change in the law in that shareholders need to be identifiable. We need a way of publicly embarrassing and shaming shareholders when they invest in 'carpetbagger' companies. To avoid shame, investors who invested in good faith would be left with little option but to divest themselves of their shares. In effect, shareholders have to share the responsibility for a company's bad behavior and be seen doing so.
Mind you, I can't see this happening anytime soon, as many shareholders are just too greedy to allow laws like this to come into effect. Unfortunately, the political will just isn't there.
_
† A classic example of where hubris, cocksureness, cost cutting and insufficient engineering rigor combined with force majeure to produce a disaster was the collapse of the Tacoma Narrows Bridge. If engineering rigor had been followed to the letter and penny-pinching avoided, then it's likely the bridge wouldn't have collapsed despite force majeure entering the scene (the aeroelastic flutter that actually destroyed it, often misattributed to Kármán vortex street turbulence, was not well understood at the time).
Yeah, I got a negative impression of the guy. It looped around the parking lot for five minutes. I'm not a fan of self-driving cars, but this doesn't seem like a huge deal.
> Surely the driverless car hasn't locked him inside. Right?
Every (modern) car I’ve been in, driverless or not, will lock the doors once the vehicle is in motion - automakers are not in the habit of letting passengers fall into the road like that. Just imagine the lawsuits…
It's been my (limited) experience that there's a manual override that can still unlock the doors. Can't I slide the lock lever to the unlocked position while the car is in motion?
I'd also submit that in any case where I'd open the door while the vehicle is in motion, I'd have a damned good reason for wanting to do so[0] and would not take kindly to being thwarted.
[0] Usually: backing the plow truck up to a small, flat utility trailer that's invisible through the rear window.
Child safety locks that prevent people from opening locked doors from the inside are standard on four door vehicles for the rear doors, but they can usually be disabled.
My two-door had them disabled by default and it's not uncommon for taxis and rideshares to disable them.
It seems insane not to disable them in a Waymo - you wouldn't be able to escape in an emergency.
I think it's usually locked from the outside. Opening the door from the inside (in most cars I've been in) works.
I would actually be surprised by the opposite (not being able to leave a vehicle because the doors are locked). That sounds very scary in an emergency situation where you have to leave the vehicle.
The locks on my car don’t prevent me from opening the front doors while driving. Only the rear doors have a child safety switch, which stops the doors from opening when locked. But you have to turn that on.
You don’t want a locked door preventing you from exiting the car if you have a crash.
This is wild! Wow! I used to think highly of Waymo, but this could be the worst possible way to handle a situation where a 2-ton object has gone awry.
She should have stopped the car immediately after she became aware of the situation, within the first few seconds of the call. She kept following this dumb scripted conversation as if it was someone calling support because their router won't turn on. What an absolute shit display of incompetence and recklessness.
Yeah, this happened to me at home. I have this lock on the door and I couldn’t get in because it was locked. I called support and they said I needed to use the key on the lock but that SHOULDN’T BE NECESSARY. Completely unsafe that the door wouldn’t unlock. Shelter is a human right and I was trapped outside of my house just because I wouldn’t use some “device” that so-called “support” was trying to get me to use.
Same situation as this guy and the “End Ride” button. It’s actually horrifying.
Ah you’re right. I should’ve used The Adage of The Man With The Door Handle. Unusual story about a guy who was trapped in his home because he had to press down on a device to disengage a lock. Perhaps President Biden will issue an order against these terrible things.
Are door handles evil? Find out this weekend on Hacker News as it explores the concept of clicking “End Trip” on an app dispatched taxi ride that was embarked on by app.
So the car just circles around it indefinitely.