• Voroxpete@sh.itjust.works

    I don’t need to provide you with evidence that FSD has caused crashes. There’s plenty; if you can’t find it, you’re not looking.

    As to your point about accident statistics, that’s responding to a different point than the one I was making. I didn’t say that FSD kills people more often than human drivers kill themselves (through dangerous, inattentive, or reckless driving). I just said that it regularly kills people. There’s potentially some hyperbole there; you can quibble over the definition of “regularly” if you want to be a pedant, but I really don’t care.

    The point is that when it does go wrong, it often goes spectacularly wrong, such as this case where a Tesla plowed into a truck, or this thankfully low-speed example of a very confused Tesla driving into oncoming traffic.

    Could a human make these errors? Absolutely. But would you, as a human, want to trust yourself to a vehicle that is capable of making these kinds of errors? Are you happy with the idea of possibly dying because the machine you’re in made one critical error? Perhaps an error that you yourself would not have made under the same circumstances?

    A lot of people will answer “yes” to that, but for me personally any autopilot that requires constant supervision to make sure it doesn’t kill me is more of a negative than a positive. Even if you try to pay attention, automation blindness will inevitably kick in. And really what is even the point of self driving if you have to be paying attention? If it’s not freeing you up to focus on other things then it might as well not be there at all.