• tsonfeir@lemm.ee

    The more important person to punish is the one who let them do it

  • Nic Cage@lemmy.world

    Considering Tesla Autopilot accounts for 68.25% of all reported US crashes involving driver-assist systems, I agree it’s an experiment.

    Edit: let me clarify: in my opinion, ALL lane-assist-based systems are not ready for public road use. Tesla sucks, but they all suck if they are causing accidents and fatalities.

    • Kage520@lemmy.world

      Woah that sounds really great actually, considering Tesla probably has 10x the autopilot miles driven compared to other manufacturers.

      • Nic Cage@lemmy.world

        Any accident caused by lane-assist technology is too many. I won’t accept the loss of human life for a convenience technology.

          • Nic Cage@lemmy.world

            Are they? Maybe here in the EU they are, but growing up in American Suburbia, a car was a necessity.

            I’m not going to go down your slippery slope of ever-expanding scopes of convenience technology, though.

            • irmoz@reddthat.com

              Suburbs and diffuse urban centers connected by highways are a consequence of cars, not the other way around. The US could have instead opted for public transport and densely packed services, so a full shopping trip doesn’t take you all the way around the state. Here in the UK I can just walk into town, and everything you need is an easy walk from everything else.

              • postmateDumbass@lemmy.world

                In America, walking from one store to another just four storefronts away can be a stroll of over half a mile.

                • AngryCommieKender@lemmy.world

                  That’s caused by minimum parking requirements we don’t need. We could fix the problem by building our cities the way we used to before GM bought up the trolleys and scrapped them to sell more cars.

              • Nic Cage@lemmy.world

                I’m right there with you. I immigrated to the Netherlands and I no longer own a car (well I have a track car, but that’s different). I just bike or take the train everywhere.

            • EvacuateSoul@lemmy.world

              People are always going to adjust their risk tolerance upwards as technology gets safer. Even if all cars were self-driving and perfect, some pedestrian would still push the bounds of physics and step out with no time to stop.

              These drivers aren’t falling asleep or TikToking in the first 30 minutes. They’re being lulled into complacency by a tech that generally does a good job, and they’ve been told by marketing that we’re so close to FSD.

            • jaybone@lemmy.world

              Isn’t every technology a convenience technology?

              We weren’t making fires or using levers to inconvenience ourselves.

              • Nic Cage@lemmy.world

                If you don’t limit the scope of time, yeah, although I’d say yesterday’s convenience tech can become today’s necessary tech.

                I don’t know, the more I think about “convenience technology”, the more I dislike the term.

            • rckclmbr@lemm.ee

              I live in American suburbia and have far more miles on my bike than on my car. And yes, I have kids too. Yes, the zoning sucks, but Americans are also just lazier.

              • Nic Cage@lemmy.world

                You’re one of the fortunate American suburbanites; many are not so lucky. I now live in the Netherlands and either bike or take the train wherever I need to go.

        • nutsack@lemmy.world

          you have to compare it with human drivers for this number to mean anything

        • joshhsoj1902@lemmy.ca

          Your statement only works if you’re also accounting for accidents prevented by lane assist technology. It’s also worth factoring in cases where these technologies were able to make an accident less severe.
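
          A rough sketch of that kind of net accounting, with entirely invented numbers (the severity weights and all the per-million-mile figures are assumptions), just to show the shape of the comparison:

```python
# Hypothetical net-effect accounting for a driver-assist system.
# Every figure is invented for illustration; real values would come from
# crash databases and counterfactual "would this have happened anyway" estimates.
severity_weight = {"minor": 1, "injury": 10, "fatal": 100}

# Incidents per million assisted miles the system contributed to causing
caused = {"minor": 4.0, "injury": 0.6, "fatal": 0.02}
# Incidents per million assisted miles it prevented or downgraded in severity
prevented = {"minor": 6.0, "injury": 1.0, "fatal": 0.03}

harm_caused = sum(severity_weight[k] * v for k, v in caused.items())
harm_prevented = sum(severity_weight[k] * v for k, v in prevented.items())

print(f"weighted harm caused:    {harm_caused:.1f}")
print(f"weighted harm prevented: {harm_prevented:.1f}")
print(f"net effect (positive = net benefit): {harm_prevented - harm_caused:.1f}")
```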

    • Final Remix@lemmy.world

      My parents have that lidar cruise control on their Toyota. It was active (but not engaged) one day when I was driving, and the damned car started freaking out, BRAKE BRAKE BRAKE, thinking I was about to plow into a parked car, just because there was a gentle curve in the road.

        • quaddo@lemmy.world

          “WHOA THERE DUDE! Geez, didn’t you see that paper cup being blown by the wind?? Totally saved your ass.”

    • EvacuateSoul@lemmy.world

      This is one of those times you should realize how misleading statistics can be. Can you think of what might be a more informative measurement if we are actually after the truth?

        • EvacuateSoul@lemmy.world

          Love it, haha. I don’t care about Tesla at all, but including the share of driver-assisted miles driven on Autopilot versus other companies’ tech would be much more revealing. If 90% of those miles were driven on Autopilot, Tesla would actually be outperforming its competitors.

          • whoisearth@lemmy.ca

            How does that make it any more “right” that they’re testing on public roads?

            Will you bend over for Elon when one of his “tests” rams a minivan on a highway, killing a family of 5?

            “Oh but this was one accident out of 5000 test miles driven”

            • EvacuateSoul@lemmy.world

              I am not defending him, just saying it’s wrong to use misleading stats even with a good point.

              • whoisearth@lemmy.ca

                You’re being pedantic, then. The issue isn’t the stats; the fundamental problem is that they should not be beta testing this on public roads. Have you signed any waivers in case one kills or maims you? I know I haven’t.

                • EvacuateSoul@lemmy.world

                  You should go to another part of the comments, then, because over here we were discussing the application of the statistic.

      • Nic Cage@lemmy.world

        Okay mate, why don’t you show us all what the “more informative measurement” is for this?

        • Grimy@lemmy.world

          It would be nice if the above statistic mentioned the ratio of Teslas compared to other cars. If 90% of cars with autopilot are Teslas but they only account for 70% of crashes, that’s a good thing. There’s also a problem with wording: “driver assist” includes a lot more than just a fully self-driving car.

          But the only important statistic is how likely a self-driving car is to get into an accident compared to a human driver.

          People really have to learn to separate the tech from the man. Elon Musk is a piece of shit; that doesn’t mean everything he has his hand in is. Self-driving cars are cool as fuck, and if they aren’t safer than human drivers at the moment, they clearly will be soon.
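
          A toy version of that proportionality check, using the hypothetical 90% / 70% figures above (all of the numbers are assumptions, not real data):

```python
# Toy check: is one brand over- or under-represented in driver-assist crashes
# relative to its share of driver-assist-equipped cars on the road?
# The 0.90 / 0.70 figures are the hypothetical ones from the comment above.
fleet_share = 0.90   # assumed share of driver-assist-equipped cars that are Teslas
crash_share = 0.70   # assumed share of driver-assist crashes involving Teslas

ratio = crash_share / fleet_share
print(f"crash share / fleet share = {ratio:.2f}")
# ratio < 1.0 suggests under-representation, > 1.0 over-representation,
# ignoring how often the systems are actually engaged and on what kinds of roads.
```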

          • Nic Cage@lemmy.world

            Yes, you’re correct. From source:

            The National Highway Traffic Safety Administration (NHTSA) cautioned against using the numbers to compare automakers, saying it did not weight them by the number of vehicles from each manufacturer or how many miles those vehicles traveled.

            I’m trying to find the Tesla:others ratio, but that’s proving a bit difficult.

            A bit of a moot point in my eyes as I consider all 400 accidents unacceptable, but you are right, I shouldn’t use stats just to shit on Tesla.

          • EvacuateSoul@lemmy.world

            Close, but usage matters too. Just owning a car with driver assist doesn’t mean you use it at the same rate. Share of miles driven with assist features would be better.

            Then if you want to get gritty, I guess we could try to quantify how complex the miles were. Dense city miles and construction zones should count more.
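
            A sketch of that normalization, crashes per million assisted miles with an optional difficulty weight; every number below is an invented placeholder:

```python
# Crashes per million assisted miles, optionally weighted by how demanding
# the assisted miles were. All values are invented placeholders.
def crash_rate(crashes: int, assisted_miles: float, difficulty: float = 1.0) -> float:
    """Crashes per million 'effective' miles; difficulty > 1 means harder miles."""
    effective_miles = assisted_miles * difficulty
    return crashes / (effective_miles / 1_000_000)

# brand: (reported crashes, miles driven with assist engaged, typical difficulty)
brands = {
    "brand_a": (280, 3_000_000_000, 0.8),   # mostly easy highway miles
    "brand_b": (40, 400_000_000, 1.3),      # more city and construction-zone miles
}

for name, (crashes, miles, difficulty) in brands.items():
    print(f"{name}: {crash_rate(crashes, miles, difficulty):.3f} crashes per million effective miles")
```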

        • r_se_random@sh.itjust.works

          I guess accidents per thousand/million cars on the road would be more representative.

          Think of it like this: if ~70% of all autonomous-driving cars were Teslas and they had a ~70% contribution to the accident volume, then they’d be about as bad as the competition.

          I’m not saying Tesla’s Autopilot doesn’t have problems, but this particular metric is not the best one for saying how it compares to the competition.

          Personal opinion: no manufacturer has an autopilot capable enough to be on the road.

          • dmention7@lemm.ee

            Another point that rarely seems to be accounted for is what type of miles are being used for comparison.

            Aggregate autopilot crash rates may look good compared to non-autopilot rates, but if autopilot cannot be used in inclement weather, challenging roads, or other risky situations, then the statistic is misleading. (Statistics??? Misleading??? Well, I never…)

  • AutoTL;DR@lemmings.world [bot]

    This is the best summary I could come up with:


    “In late 2021, Lukasz realised that—even as a service technician—he had access to a shockingly wide range of internal data at Tesla,” the group’s prize announcement said.

    Krupski was also featured last month in a New York Times article titled, “Man vs. Musk: A Whistleblower Creates Headaches for Tesla.”

    But Krupski now says that “he was harassed, threatened and eventually fired after complaining about what he considered grave safety problems at his workplace near Oslo,” the NYT report said.

    Krupski “was part of a crew that helped prepare Teslas for buyers but became so frustrated with the company that last year he handed over reams of data from the carmaker’s computer system to Handelsblatt, a German business newspaper,” the report said.

    The data Krupski leaked included lists of employees and personal information, as well as “thousands of accident reports and other internal Tesla communications.”

    Krupski told the NYT that he was interviewed by the NHTSA several times, and has provided information to the US Securities and Exchange Commission about Tesla’s accounting practices.


    The original article contains 705 words, the summary contains 172 words. Saved 76%. I’m a bot and I’m open source!

  • Poggervania@kbin.social

    It’s funny how some of Elongated Muskrat’s testing and experiments involve the subjects dying.

    Monkeys died in the Neuralink experiments, and now humans are dying in these Autopilot tests!

  • Kage520@lemmy.world

    Lemmy as a whole appears to irrationally hate Tesla because of their stupid CEO. I think his penchant for calling what is essentially “advanced autopilot” FULL SELF DRIVING should be illegal. But he’s a car salesman, and for some reason the government is letting him call it that. Be mad at our lawmakers for that. He’s just a shyster, and our lawmakers suck at reining him in.

    Tesla cars themselves are actually really good: very safe cars that don’t roll over because the heavy battery is located so low, very responsive acceleration, and some nice quality-of-life low-hanging fruit in the technology department, like my phone being a key.

    I was told by my Tesla rep when I bought the car not to buy FSD. It’s experimental and probably won’t ever be driving you to your destination safely. The fact that they sell it with a name that implies it will is the problem. And people believe it. That’s incredibly dangerous.

    • NotMyOldRedditName@lemmy.world

      On the same note of blaming the lawmakers:

      There’s a lot of hate about Tesla’s cars not reaching their EPA estimates on the highway.

      The EPA test is the problem. It doesn’t include real-world driving such as sustained 70 mph, and for whatever reason a Tesla often takes a bigger hit at 70 mph than some other cars.

      I don’t doubt Tesla did some ratio optimization on the motors to get better EPA numbers; that’s just playing the game. But please lobby the EPA to change the testing methodology.

      Tests need to better reflect faster driving. Manufacturers should be required to show both numbers, not a combined number, in their advertising materials, and they really need to add some sort of cold-weather test.

      Edit: the fact that there are two different test cycles manufacturers can choose between is also ridiculous. Make it all the same.

      • nxdefiant@startrek.website

        This infuriates me to no end. The EPA could just mandate multiple numbers!

        I want a graph of the car running at 55, 65, 75, and 85 mph on a treadmill, at 0, 25, 50, 75, and 100°F, while maintaining a cabin temp of 72°F.

        I want to know how much battery it used at those temps, simulating catching every red light in a downtown setting, in an hour.

        I want discharge rates for all those temps with it just sitting there for a week, same for a month.

        “Combined blah” is horseshit.
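
        That test matrix is easy to spell out; here is a sketch of just the speed × temperature grid described above (the measurement function is a placeholder I made up, not a real EPA procedure, and the stop-and-go and parked-discharge cases would simply be more rows):

```python
# Sketch of the multi-condition efficiency matrix described above.
# Real numbers would come from instrumented dyno runs; the placeholder
# function just marks where each measurement would go.
from itertools import product

speeds_mph = [55, 65, 75, 85]
ambient_temps_f = [0, 25, 50, 75, 100]
cabin_setpoint_f = 72

def measure_wh_per_mile(speed_mph, ambient_f, cabin_f):
    """Placeholder for an instrumented steady-state consumption measurement."""
    return None  # to be filled in from testing

matrix = {
    (speed, temp): measure_wh_per_mile(speed, temp, cabin_setpoint_f)
    for speed, temp in product(speeds_mph, ambient_temps_f)
}

for (speed, temp), wh_per_mile in sorted(matrix.items()):
    print(f"{speed} mph at {temp}°F (cabin {cabin_setpoint_f}°F): {wh_per_mile} Wh/mi")
```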

    • lud@lemm.ee

      It’s consensual if you buy it, though.

      Calling it a war crime is slightly extreme.

      • spudwart@spudwart.com

        Except the other drivers on the road aren’t all in Teslas, yet they are non-consensually, and possibly even unknowingly, part of this experiment.

      • there1snospoon@ttrpg.network

        If you hit another motorist or pedestrian, it’s no longer consensual.

        War crime is a tad much, sure. Let’s just make it a felony.

  • rsuri@lemmy.world

    Random question I’ve always wondered about in case anyone is more familiar with these systems than me. My understanding is that autopilot relies on optical sensors exclusively. And image recognition tends to rely on getting loads of data to recognize particular objects. But what if there’s an object not in the training data, like a boulder in a weird shape? Can autopilot tell anything is there at all?

    • Captain Janeway@lemmy.world

      Yeah obstructions can be generalized to a road being blocked. Object recognition includes recognizing the shape of an object via curves, shadows, depth, etc. You don’t need to know it’s a boulder to know a large object is in the road.
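
      A very rough illustration of that idea, using a depth map instead of object labels; the thresholds and the region-of-interest choice below are invented, and a real perception stack is far more involved than this:

```python
import numpy as np

# Toy "is something blocking my lane?" check from a depth map.
# No object classification needed: a large near surface in the driving
# corridor is enough to flag an obstruction, boulder-shaped or not.
def obstruction_ahead(depth_m: np.ndarray, max_range_m: float = 25.0,
                      min_blocked_fraction: float = 0.05) -> bool:
    h, w = depth_m.shape
    # lower-central region of the image, roughly where our lane is
    corridor = depth_m[h // 3:, w // 3: 2 * w // 3]
    blocked_fraction = np.mean(corridor < max_range_m)
    return bool(blocked_fraction > min_blocked_fraction)

# Fake 480x640 depth frame: open road, plus an unknown object 12 m ahead.
frame = np.full((480, 640), 80.0)
frame[300:420, 280:360] = 12.0
print(obstruction_ahead(frame))  # True
```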

  • BB69@lemmy.world

    FSD, maybe. But Autopilot operates fine and is no different from what most major manufacturers offer.

    Edit: Lots of people here who have never used Tesla’s or other manufacturers’ lane-keeping systems, I see.

    • Ghostalmedia@lemmy.world

      The last time I tried Autopilot was 4 years ago, so I imagine things have gotten better. That said, on a test drive on a rainy day, auto lane change did some frightening stuff. It thought lanes were clear, learned they weren’t, then violently ripped the car back to the original lane in conditions that were prime for hydroplaning.

      My wife and I were scared shitless, and the woman from Tesla, who was also in the car, tried to reassure us by saying “it’s ok, this is normal.”

      Then we returned the car to the parking lot, and auto park almost took out a kid in an enclosed parking structure.

      I imagine it’s gotten better in 4 years, but how that was street legal baffles my mind.

        • Ghostalmedia@lemmy.world

          Yes, they are. There are two tiers of Autopilot functionality, basic and Enhanced, and this is part of the Enhanced Autopilot tier.

          https://www.tesla.com/support/autopilot

          Tesla refers to those features as “Autopilot,” and this former employee refers to those features as “Autopilot” in his whistleblower claims.

            • Ghostalmedia@lemmy.world

              This is like arguing that an iPhone Pro isn’t an “iPhone,” it’s an “iPhone Pro.”

              Call it whatever you want. This whistleblower, the press, and this comment thread are all referring to unsafe features of Tesla’s L2 automation that are currently available to the public.

                • Ghostalmedia@lemmy.world

                  Enhanced Autopilot is very popular. All the hardware is already installed on the car; it just needs to be unlocked by purchasing the subscription in the app. The Full Self Driving package is also unlockable via a software subscription. FSD will be out of beta soon, but Enhanced Autopilot has been a popular purchase for many years. It’s one of the main reasons people buy a Tesla. It is most definitely not on “practically zero” Teslas.

                  As for “according to whom”: you replied to my comment about my experience with Autopilot. So according to me.

                  Enhanced Autopilot did some frightening stuff during the little time I spent driving a Model 3. I really wanted to like the Model 3 and was expecting to whip out my checkbook, but that test drive scared the shit out of my wife and me. It made some very dangerous lane changes, and the autonomous parking stuff almost hit a kid in a parking lot. The latter is definitely widely reported; I’m not the only person to have experienced that problem.

      • BB69@lemmy.world

        None of what you mentioned is in basic Autopilot. Autopilot is lane keeping and traffic-aware cruise control only.

          • nxdefiant@startrek.website

            If these were called “cruise control”, “adaptive cruise control”, and “Blue Cruise” would it matter if the article said “cruise control” but was referring to “Blue Cruise”?

            Tesla’s names for these things are “Autopilot”, “Enhanced Autopilot”, and “FSD Beta”.

            At the very least, the names matter so that we can all agree we’re talking about the same things.

          • BB69@lemmy.world

            Which is not included with the base vehicle. It’s an extra purchase.

              • BB69@lemmy.world

                Sure, which I consider part of FSD, which almost killed me like 3 times when I had a loaner with it active.

                But that’s not basic autopilot. AP is fine assuming people pay attention.

                • Ghostalmedia@lemmy.world

                  > which I consider part of FSD

                  Well, when Tesla, this former employee / whistleblower, and these journalists refer to “Autopilot,” they’re specifically talking about the software and hardware marketed under the “____ Autopilot” banner that Tesla uses for those features.

                  Some of these more advanced Autopilot features clearly have issues, and it probably stems from the fact that they’re only using cameras and ultrasonic sensors, not lidar.

                  In my experience with a Model 3 and Enhanced Autopilot, when those cameras and sensors were wet, it was pretty clear that things were getting dangerous. It started raining during our test drive, so we had a before/after experience on the same roads. Once everything got obstructed with water, you could see the car’s collision detection struggle to detect other objects. Objects on the center display would erratically pop in and out of view. And this was a showroom car; it wasn’t the first rain of the year, and it was behaving “normally” according to staff.

                  Even if basic autopilot was fine, this left such a sour taste in my mouth that I had no appetite to give that company my money. Almost dying and almost killing a kid were a big “fuck this company” for me.

                • brbposting@sh.itjust.works

                    > AP is fine assuming people pay attention.

                  There’s a human tendency to become complacent after a while, which presents a risk.

                    Can’t wait for safer-than-human self-driving technology, and I know we’ll need to take certain risks to get there, but there are good arguments against “PLEASE remain fully attentive 100% of the time for this technology that will in fact only require full attentiveness in edge cases”. You might be an exception, of course! But Average Meat Driver is going to slip into complacency after many, many miles of perfect autopiloting.

    • AtmaJnana@lemmy.world

      My vehicle can do almost all the same stuff as “Autopilot,” but it turns the autosteering and cruise off if I don’t touch the wheel every 30 seconds. It’s all the same types of sensors, etc. And mine isn’t even a luxury brand, just the higher-end trim package of a budget vehicle.

      edit: actually, it’s just 10 seconds before the warning and another 5 or so before it disables lane-keeping
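
      For what it’s worth, that kind of hands-off timeout is just a tiny state machine; a sketch using the timings from the comment above (everything else here is an assumed simplification):

```python
# Minimal hands-on-wheel timeout logic: warn after 10 s without steering
# input, drop lane keeping about 5 s after the warning (timings from the
# comment above; the rest is an assumed simplification).
WARN_AFTER_S = 10.0
DISENGAGE_AFTER_WARN_S = 5.0

def lane_keep_state(seconds_since_wheel_input: float) -> str:
    if seconds_since_wheel_input < WARN_AFTER_S:
        return "active"
    if seconds_since_wheel_input < WARN_AFTER_S + DISENGAGE_AFTER_WARN_S:
        return "warning"      # chime / "put your hands on the wheel"
    return "disengaged"       # autosteer and cruise hand control back

for t in (3, 11, 16):
    print(t, lane_keep_state(t))
```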

    • inclementimmigrant@lemmy.world

      No.

      I own a Model 3 and a 2022 Palisade with lane assist, and I used to own a Subaru with lane assist.

      The Model 3’s Autosteer, exit-to-exit EAP, and auto lane change are very different from the simple lane assist the other cars offer. Honestly, after using EAP for five years, while I do use AP under specific circumstances, I’ve come to the opinion that it is not ready for prime time and has some major issues, especially the auto lane changing, that should have been worked out before release; I still never use that feature.

      Given my background in embedded software, I honestly think the way they rolled out and advertised these features was reckless.

      • BB69@lemmy.world

        EAP is not basic Autopilot; it’s closer to FSD. Basic Autopilot is on par with most manufacturers’, and I’d argue it’s safer than some with regard to less common lane setups or a lack of clear road lines.

    • Nic Cage@lemmy.world

      Edit: before anyone goes and reads all of this, I’ll sum it up:

      This is a textbook argument from anecdote. They presented their anecdotal evidence and a conclusion implying that, because they’ve done 10k miles, Autopilot accidents shouldn’t be this big of a deal (blown out of proportion). A scalding hot garbage take. I also got a little emotionally off-topic with causes of death, in a fallacious appeal to emotion.

      Ah yes, “I’ve personally done this and I’m the most important, therefore it’s not true. It’s the experts and engineers who are wrong.”

      Besides “everybody look at me!!”, what is your point?

        • Nic Cage@lemmy.world

          And you say that based on personal anecdotes, rather than education and design/testing/systems implementation experience.

          Do you think 10k miles (unknown time frame, but my point stands even at one year) is a singular meaningful data point for the thousands (millions now?) of Autopilot-enabled Teslas?

          What would be the right proportion for it to be blown up to if it were your loved one who was killed?

            • Nic Cage@lemmy.world

              > If it was an “experiment” I would know about it because it would be driving erratically and attempting to kill me but instead it is “pretty good”.

              Mate, that’s not how an experiment lifecycle works at all. The ignorance on display here is palpable.

              > I would be very upset that there was such an irresponsible driver behind the wheel for sure. That’s not Tesla’s fault. You can’t blame Tesla because they added safety features to the car…

              How convenient, you avoided answering my question entirely! And rather than blame the poor implementation of an inaccurate technology disguised as a safety feature (one that has directly caused 17 fatalities, mind you), you blame the user (the victim). Actually, I’d make the argument this isn’t a “safety feature” at all, more of a convenience feature, but I digress.

              > Also there are dozens of other cars with this same technology on the road but everyone wants to pick on Tesla because it makes headlines.

              I’ll gladly shit on all the other manufacturers implementing this technology. It’s all garbage if it poses a threat to a human life.

              All of this is out of scope as my point is your anecdotal experience is not significant and dismissing results due to your own biases is quite ignorant.

                • Nic Cage@lemmy.world

                  Agreed.

                    eye roll I’m so ignorant. My CompE Master’s and my previous work in vision-based AI collision-detection systems have left me woefully unprepared to have educated and qualified experience on this exact topic.

                    > Why do you hate children?

                  I never mentioned children, don’t put words in my mouth. Keep deflecting.

                    > :citation needed:

                  Here you go: https://www.washingtonpost.com/technology/2023/06/10/tesla-autopilot-crashes-elon-musk/

                  Nifty: https://www.tesladeaths.com/

                    > It protects human life.

                  I’m sure the 17+ dead human lives really appreciate this “protection”.

                    Maybe “results” wasn’t the right word; “dismissing the findings of engineers close to the technology” might fit better. But I’m definitely calling the fatalities and accidents caused by Autopilot results of Autopilot, and therefore of Tesla.

                    It’s not my intention to change your mind on this technology, though. I’m just pointing out that your anecdotal experience is not significant, and that dismissing the findings of engineers close to the technology because of your own biases is incredibly ignorant. That mindset is harmful whether you accept it or not.

                  Best of luck and good day.