A Tesla was in its self-driving mode when it crashed into a parked patrol vehicle responding to a fatal crash in Orange County Thursday morning, police said.

The officer was on traffic control duty, blocking Orangethorpe Avenue in Fullerton for an investigation into a suspected DUI crash that killed a motorcyclist around 9 p.m. Wednesday, when his vehicle was struck.

A Fullerton Police Department spokesperson said the officer was standing outside his vehicle around midnight when he saw a Tesla driving in his direction and not slowing down.

  • @Sludgehammer@lemmy.world
    3 • 11 months ago

    IMHO it’s the flashing lights. I really think they somehow overload the self-driving software, and it starts ignoring changes in driving conditions (like, say, an emergency vehicle parked in the road).

    • Tiefling IRL
      8 • 11 months ago

      To be fair, police lights at night are so bright they give me migraines

    • @Geyser@lemmy.world
      7 • 11 months ago

      I’ll bet you’re right that it’s the lights, but I don’t know about “overload” of anything.

      The problem with camera vision (vs. human vision or LiDAR) is poor dynamic range. Pointing a light at the camera, as happens with emergency vehicle lights, can cause it to dim the whole image to compensate and then fail to see the vehicles. It’s the same thing as when you take a backlit photo and can’t see the people.
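
A toy sketch of the auto-exposure failure mode described in that comment (purely illustrative: the scene values, gain logic, and detection threshold are all invented, not Tesla's actual pipeline):

```python
# Toy model of camera auto-exposure washing out a dark object.
# All numbers here are invented for illustration.

def auto_expose(scene, target_mean=0.4):
    """Scale pixel brightness so the frame's mean hits target_mean,
    then clip to the sensor's 0..1 output range (limited dynamic range)."""
    gain = target_mean / (sum(scene) / len(scene))
    return [min(1.0, p * gain) for p in scene]

def visible(pixel, threshold=0.05):
    # Below this output brightness, assume the detector sees nothing.
    return pixel > threshold

dark_car = 0.10                                        # unlit vehicle at night
night_scene = [0.02] * 98 + [dark_car]                 # quiet dark road
strobe_scene = [0.02] * 98 + [dark_car] + [10.0] * 20  # bright strobes in frame

exposed_dark = auto_expose(night_scene)
exposed_strobe = auto_expose(strobe_scene)

print(visible(exposed_dark[98]))    # → True  (gain brightens the dark car)
print(visible(exposed_strobe[98]))  # → False (strobes drive the gain down)
```

On the dark road the exposure gain brightens the unlit car above the detection threshold; with the strobes in frame, metering drives the gain down and the same car falls below it, which is the "backlit photo" effect the comment describes.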

  • bitwolf
    29 • 11 months ago

    They give so much lenience to Tesla.

    Yet Cruise was kicked out of California because another driver hit a pedestrian, knocked them into the path of the Cruise vehicle, and fled the scene. And that was despite Cruise providing dashcam footage to identify the assailant.

  • @NameTaken@lemmy.world
    3 • 11 months ago

    Ugh, I know people feel strongly about FSD and Tesla. As someone who uses it (and still pays attention, hands on the wheel, when it’s activated): as soon as FSD sees anything resembling emergency lights, it will beep and clearly disengage. I’m not sure, but this person is probably just using Tesla as a scapegoat for their own poor driving. In my experience it forces the driver to take control when emergency lights are recognized, specifically to avoid instances like this.

      • @NameTaken@lemmy.world
        1 • 11 months ago

        Yeah sure if that’s what makes you happy… 👍. Nothing like blinding random people in cars in your spare time.

        • @vxx@lemmy.world
          1 • 11 months ago

          No, not the driver, the faulty sensors and programming that should’ve never been approved for the road.

          • @NameTaken@lemmy.world
            0 • 11 months ago

            Wait so how is it faulty and bad programming if it disengages when emergency vehicles are present? You’d prefer it to stay on in emergency situations?

    • @NotMyOldRedditName@lemmy.world
      4 • 11 months ago

      Assuming something was on, I’m not even convinced it was FSD; it could easily have been AP.

      The media and police get that wrong more often than right, and the article isn’t even specifically naming either one.

        • @NotMyOldRedditName@lemmy.world
          1 • 11 months ago

          I can’t imagine a scenario where they’d be on FSD or AP pressing the accelerator AND looking at their phone.

          It’s one thing to press it because the car is hesitant or something, but that usually means you’ve pressed it because it’s not doing what you want, which means you were watching.

          Him admitting he was on his phone (if truthful) would mean he was pressing the accelerator thus overriding the input AND not paying attention.

          It’s a stretch too far.

          If he lied about the phone to try and blame AP/FSD then that could make sense.

    • Joelk111
      7 • 11 months ago

      Doesn’t Tesla usually look at the logs for a situation like this, so we’ll know shortly?

      • @Matty_r@programming.dev
        22 • 11 months ago

        “As you can see by looking at the logs, the FSD was disengaged 276ms prior to the crash, therefore the driver is at fault” /s

  • @Buffalox@lemmy.world
    25 • 11 months ago

    I just heard from Enron Musk that it crashed into the patrol car way more safely than a human would have done.
    Also according to Enron Musk, Full Self Driving has been working since 2017 and is in such a refined state now that you wouldn’t believe how gracefully it crashed into that patrol car. It was almost like a car ballet, ending in a small, elegant pirouette.

    As Enron Musk recently stated, in a few months we should have Tesla Robo Taxis in the streets, and you will be able to observe these beautiful events regularly yourself.

    Others say that’s ridiculous and that he is just trying to save Enron, but it’s too late for that.

    • @brbposting@sh.itjust.works
      12 • 11 months ago

      All I do at night is open my garage door to let my car out. A few months later, here I am, a millionaire. Thank you, full self driving Roboenron 😍

      • @Buffalox@lemmy.world
        6 • 11 months ago

        Yes, I remember that, and he has repeated every year since that it will be ready next year. But THIS year he changed his tune somewhat and claimed it was a matter of months.
        How is this con man not in jail?

          • @Buffalox@lemmy.world
            1 • 11 months ago

            He is very good at what he does: being a con man.
            But I think step one is being a malignant narcissist, having such an oversized ego and disregard for others that you actually believe you deserve it.

            Apart from that I can’t really explain it, except that IMO the stockholders who gave him that bonus are morons who let themselves be exploited.

  • @IsThisAnAI@lemmy.world
    -41 • 11 months ago

    Jesus you Elon haters can’t help yourself.

    This 👏 isn’t 👏 news. It’s an obsession with a billionaire.

    • Stopthatgirl7OP
      6 • 11 months ago

      I just think it’s funny that it crashed into a cop car. It made me laugh.

    • @Railcar8095@lemm.ee
      9 • 11 months ago

      It’s time to face reality. That stock you bought at 300 is not going to recover any time soon.

      Cut your losses and stop defending your favorite billionaire.

      • @IsThisAnAI@lemmy.world
        -17 • 11 months ago

        Yes. It’s a single crash, with no details, and this happens every day with unassisted driving, LKA, ACC, etc. But since it’s a Tesla, anti-Elon zealots have to post every rage-bait article they can get their hands on.

        It’s an obsession and has nothing to do with technology.

        • @Zorg@lemmy.blahaj.zone
          12 • 11 months ago

          In 59 crashes examined by NHTSA, the agency found that Tesla drivers had enough time, “five or more seconds,” prior to crashing into another object in which to react. In 19 of those crashes, the hazard was visible for 10 or more seconds before the collision. Reviewing crash logs and data provided by Tesla, NHTSA found that drivers failed to brake or steer to avoid the hazard in a majority of the crashes analyzed.

          NHTSA also compared Tesla’s Level 2 (L2) automation features to products available in other companies’ vehicles. Unlike other systems, Autopilot would disengage rather than allow drivers to adjust their steering. This “discourages” drivers from staying involved in the task of driving, NHTSA said.
          “A comparison of Tesla’s design choices to those of L2 peers identified Tesla as an industry outlier in its approach to L2 technology by mismatching a weak driver engagement system with Autopilot’s permissive operating capabilities,” the agency said.
          Even the brand name “Autopilot” is misleading, NHTSA said, conjuring up the idea that drivers are not in control. While other companies use some version of “assist,” “sense,” or “team,” Tesla’s products lure drivers into thinking they are more capable than they are.
          https://www.theverge.com/2024/4/26/24141361/tesla-autopilot-fsd-nhtsa-investigation-report-crash-death

          It is not a single crash. There are assisted-driving systems out there that use pupil tracking to make sure drivers are still paying attention.
          Tesla’s solution is something along the lines of requiring you to rest at least one hand on the steering wheel. And don’t get me started on how they are diluting the concept of “full self driving”…
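
The contrast between those two monitoring styles can be sketched as a toy loop (hypothetical logic and invented thresholds, not any vendor's actual implementation):

```python
# Hypothetical driver-monitoring sketch; thresholds and logic are invented.

def wheel_torque_monitor(samples, nag_after=10):
    """Torque-style check: any weight on the wheel resets the timer,
    even if the driver never looks at the road."""
    idle = 0
    for s in samples:
        idle = 0 if s["hand_on_wheel"] else idle + 1
        if idle >= nag_after:
            return "nag"
    return "ok"

def gaze_monitor(samples, nag_after=3):
    """Pupil-tracking check: only eyes on the road reset the timer."""
    distracted = 0
    for s in samples:
        distracted = 0 if s["eyes_on_road"] else distracted + 1
        if distracted >= nag_after:
            return "nag"
    return "ok"

# A driver resting one hand on the wheel while staring at a phone
# satisfies the torque check indefinitely but fails the gaze check.
phone_driver = [{"hand_on_wheel": True, "eyes_on_road": False}] * 20

print(wheel_torque_monitor(phone_driver))  # → ok
print(gaze_monitor(phone_driver))          # → nag
```

The sketch shows why the comment treats hand-on-wheel detection as the weaker proxy: it measures contact, not attention.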

          But yeah, you’re right, the only reason I’m sceptical of Tesla’s semi-self-driving tech is that I think Elon is an egomaniac little bitch who is incapable of ever admitting he was wrong in even the smallest way.

          • @IsThisAnAI@lemmy.world
            -4 • 11 months ago

            It doesn’t increase the total volume of crashes per mile driven. Humans are shitty drivers; the bar is low. We’ve heard ad nauseam about the name of FSD. It’s a truth-in-advertising issue and an idiotic-driver issue, not a safety issue.