It’s not really an issue. 99.9% of the time the passengers will already be safe and the pedestrian is the one at risk. The only time I see this being an issue is if the car is already out of control, but at that point there’s little anyone can do.
I mean, what’s the situation where a car can’t brake but still has enough control that it HAS to kill a pedestrian in order to save the passengers?
That’s a problem with the software. The passengers in the car were never at risk and the car could have stopped at any time; the issue was that the car didn’t know what was happening. This situation wouldn’t have engaged the autopilot in the way we are discussing.
As an aside, if what you said is true, people at Tesla should be in jail. WTF