The final seconds of a fatal Tesla Autopilot crash – A reconstruction (washingtonpost.com)
41 points by thunderbong on Oct 7, 2023 | hide | past | favorite | 100 comments


Looking at the evidence, it would be quite hard to argue that Tesla is not in the wrong. Despite the other commenter focusing on tangential issues like semi side skirts, the fact is that Tesla autopilot has been represented as being capable of handling complex road conditions as well or better than a human. And this is a simple case that demonstrates that it is not capable of detecting an object in front of it, on a flat road in good weather, that it will clearly impact.

The problem is not Tesla experimenting with self driving tech. The problem is Tesla is overstating its capabilities, confusing consumers with mixed messaging between their marketing/CEO statements and reality, and reaping the financial benefits of convincing users they are buying autonomous-capable cars when they are nowhere near autonomous.


I would say the problem is they are experimenting with self-driving tech on the general public. And it's resulting in deaths of people not driving Teslas - heck, read how the truck driver feels post-incident; the dude's shook up.


It's wild that this experimental feature is legal.


Even more wild they haven’t been fined or successfully sued for millions by the families of the dead yet


True. Where is that famous habit of American consumers winning absurd amounts of money in court over corporate faults of wildly varying severity, now that you need it just once to stop an actually severe practice?


> Tesla autopilot has been represented as being capable of handling complex road conditions as well or better than a human

Has it? In question here is Autopilot and not FSD Beta; the car states that Autosteer just keeps you within the lane lines, with this popup required to enable it[0] (and a different popup to enable auto-lane-change, if you pay for Enhanced). Note that rental cars (or at least my Hertz rental) also have it turned off by default to make sure the person driving sees this.

The post also mentions:

> However, the NTSB also cited Banner’s “overreliance on automation,” saying Tesla’s design “permitted disengagement by the driver” and contributed to the crash. Four years later, despite pleas from safety investigators, regulators in Washington have outlined no clear plan to address those shortcomings, allowing the Autopilot experiment to continue to play out on American roads, with little federal intervention.

In relation to Tesla's efforts, they have since implemented active driver monitoring - if you look down at your phone, or even at a passenger, for more than a few seconds with it engaged, it loudly beeps and requires you to look back at the road. If you don't look at the road, or with enough instances of this, it will suspend your Autopilot privileges until you put the car in park and back in drive.

0: https://rr.judge.sh/Starling/a850b5/36G5oYNYqt.jpg


This Consumer Reports review from January this year rates the three top active driving systems, and none of them is Tesla's. Tesla is now in the middle of the commercial offerings. It also calls Tesla's driver monitoring ineffective.

https://reprints.theygsgroup.com/cr/reprints/FordBlueCruise....


> Has it? In question here is Autopilot and not FSD Beta

Per Musk himself[1]: “The data is unequivocal that Autopilot is safer than human driving by a significant margin”

Also, Musk famously overrepresented its capabilities in a promotional video[2].

The worst part about this is that I think Elon genuinely believes that this is correct. And that is a huge problem that creates complacency. He is already benefitting from convincing some portion of the public of this view.

[1] https://www.nytimes.com/2023/01/17/magazine/tesla-autopilot-... [2] https://www.bloomberg.com/news/articles/2023-01-19/elon-musk...


The data backs him up there[0] - note that

> To ensure our statistics are conservative, we count any crash in which Autopilot was deactivated within 5 seconds before impact, and we count all crashes in which the incident alert indicated an airbag or other active restraint deployed.

Focusing on single incidents will pull at your heartstrings, but each incident is one in a sea of data that shows an overall improvement despite isolated heart-breaking incidents as a result of our inability to create a perfect system.

I do agree that his "tweets" are garbage and his cult mentality/god-like demeanor puts a lot of false confidence in people (who are likely a majority of those paying the $200/mo for the beta), not to mention that thousands of people won't consider buying a Tesla due to his involvement. Chances are Tom Zhu will take over eventually.

0: https://digitalassets.tesla.com/tesla-contents/image/upload/... from https://tesla.com/vehiclesafetyreport


The phrasing “Autopilot is safer than human driving” makes it sound like Autopilot is a replacement for human driving, rather than a supplement. That kind of marketing is part of the problem: it’s teaching consumers to rely on Autopilot too much and encouraging users to misunderstand Autopilot’s capabilities and limitations.


You are confusing Autopilot with FSD which are two very different systems with very different capabilities.

Autopilot doesn't even stop at a red light, no Tesla owner thinks it's more capable than human.


> You are confusing Autopilot with FSD which are two very different systems with very different capabilities.

This is exactly the problem: consumers are confusing Autopilot and FSD. The part where liability comes in is that confusion is not accidental, but rather a direct and deliberate product of Tesla’s marketing around these products.


Maybe the CEO of Tesla would feel less empowered to make those claims if the darlings of AI would not stay silent while sharing the stage with him during those wild statements.

https://youtu.be/zhr6fHmCJ6k


I have no opinion on Tesla or their autopilot system, but this is a poorly researched article even for wapo. It's focusing on the wrong thing in this particular case.

Even normal collision avoidance systems cannot easily detect a semi truck sideways, because semi trucks and trailers in NA aren't required to have side skirts and protection bars. This won't happen in the EU because all their trucks have mandated side skirts and bars, so when a car hits one, the occupants are protected by the airbags and crumple zones.

The objective should be to prevent severe bodily harm or loss of life to the occupants in the event of a collision, with/without an avoidance system.

Find out why trucks in NA don't have mandated side skirts. Who's lobbying against it?

If they really want to talk about lack of proper safeguards in autopilot they could've investigated many other reported scenarios like Teslas suddenly steering into the opposing lane, dividers. There are also reports of their system behaving erratically when the road slopes up into the horizon.


> Even normal collision avoidance systems cannot detect a semi truck sideways because semi trucks and trailers in NA aren't required to have side skirts and protection bars.

You know what can detect a semi truck sideways? Human eyes.

Collision detection systems are meant to trigger far later anyway, potentially after it's already too late. Side skirts would be helpful for sure in reducing fatalities in the event of a collision, but better still is avoiding the collision in the first place.

This kind of response is extremely condescending: "this is a poorly researched article because it didn't focus on the aspect of the story that I find most important." It's a perfectly valid approach to focus on the fact that Tesla's marketing of their "fancy cruise control" (their lawyer's words) causes people to trust it with their lives in unsafe ways, even when it's apparently incapable of something as simple as seeing the broad side of a semi truck. If you think the lack of side skirts is such an important issue, write your own article on it, don't cast aspersions on someone else's article.


The lack of side skirt is an important issue, though human eyes are certainly more important.

We do not just rely on just one system to keep us safe on the road, but multiple systems. Road design, trucking regulations, the software or lack thereof, human elements, all contribute to our safety.


Right, but it's completely unfair of OP to say that the article is poorly researched because it doesn't touch on something tangential. Tesla has marketing materials that say "the person in the driver’s seat is only there for legal reasons", while their lawyers in court say it's "basically just fancy cruise control". That's the story wapo chose to tell, and it's a perfectly legitimate story.

I'm sure there are plenty of other things we can learn from this incident, but they're not lazy for not focusing on them in this piece.


I disagree about this being tangential. This article focuses in-depth on a single fatal accident with fancy animations. It's the job of the reporter to explore how other vehicles protect against this and how other regions like EU mitigate loss of life. This is what you call "fact-based" and "unbiased" reporting, that WaPo claims. Otherwise all they would be doing here is exploiting someone's tragedy to drive a pre-ordained conclusion, also known as "opinion".

If the article is really about marketing issues with Autopilot, there are many more avenues to explore, like how Tesla is actually not allowed to call it "Autopilot" in Germany (try to find the word "Germany" in this article).


> Otherwise all they would be doing here is exploiting someone's tragedy to drive a pre-ordained conclusion, also known as "opinion".

They're emphasizing the point that the man's family is trying to make in court. That's hardly exploiting someone else's tragedy, it's telling their story.


Legal reasons include a big popup gating access to the autosteer feature, as well as active driver monitoring to ensure you're watching the road. It's not "only there for legal reasons" and I don't see any marketing material that implies as such [0].

0: https://www.tesla.com/autopilot


It does not focus on the wrong thing. The relevant fact here is that the system failed in a very straightforward and important obstacle-detection-and-avoidance case, and this fact cannot be waved away by supposing that the driver would have survived if the truck had been better designed (and neither can it be waved away by blaming either driver).

The article also focuses on the irresponsible mixed messaging coming from Tesla.


Adding those rules in the US has been tried more than once.

https://www.truckinginfo.com/10138132/underride-legislation-...


It seems to have gone nowhere, I suspect due to lobbying.


You seem to be conflating impact mitigation with obstacle detection


I'm focusing on neither. I'm saying there should be regulations to prevent loss of life even when impact happens, especially when other regions have implemented known solutions.


I see. I missed that because it's quite tangential. WashPo didn't miss the point by not talking about it.


> Even normal collision avoidance systems cannot easily detect a semi truck sideways, because semi trucks and trailers in NA aren't required to have side skirts and protection bars.

This sentence seems to be saying that skirts and protection bars are the reason that self driving cars cannot see these vehicles. Is that really true? Is it a conclusion you've come to on your own, or is it some kind of research or something? It seems like the crux of your whole post, but is absent.


While not what you're addressing, yes skirts and underride bars save lives:

2011: https://youtu.be/bT3G-kcKN70

2013: https://youtu.be/C3MPKLy9qHU

2018: https://youtu.be/WamlGVhGUYs


So by "normal", I meant the radar-based systems in normal cars, also known as AEB. They are usually mounted low on the front fenders or grille. Semi truck trailers are tall, with a huge empty gap between them and the road, just high enough not to be seen by the radar. From the radar's POV it appears like the road is unobstructed.

Obviously this isn't hard to detect with full LIDAR systems in self driving cars because they are on the roof and you have full depth information. This is why I do not mention "self driving cars" in my parent comment.


Do you have a source for this? There’s a gap between semi wheels, sure, but the deck height of a semi trailer in North America is about 48”. If automotive radars weren’t able to detect objects above this height, they would miss a lot of obstacles…

One automotive radar I’m familiar with, the Bosch MRR [1], lists a vertical FOV of ±15 degrees, and a 300 m maximum detection range. Assuming the sensor is mounted perpendicular to the ground at 0.5 m high, it should be able to see the top of a 13’6” semi trailer from ~15m away all the way up to its full range.
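That geometry is easy to sanity-check. A minimal sketch (assuming, as above, a ±15° vertical FOV, a 0.5 m mount height, and flat ground; the result is only as good as those assumptions):

```python
import math

def min_range_to_see(target_height_m, sensor_height_m=0.5, half_fov_deg=15.0):
    # Range beyond which a point at target_height_m falls inside the
    # radar's upward field of view (sensor aimed horizontally, flat ground).
    return (target_height_m - sensor_height_m) / math.tan(math.radians(half_fov_deg))

trailer_top_m = 13.5 * 0.3048  # 13'6" trailer height ≈ 4.11 m
print(round(min_range_to_see(trailer_top_m), 1))  # ≈ 13.5 m
```

So under those assumptions the sensor picks up the top of the trailer from roughly 13-15 m out, consistent with the ~15 m estimate above.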

Teslas don’t have radar anyway, so of course this wouldn’t have helped, but an AEB radar that can’t see the top of a semi truck isn’t very useful.

[1] https://www.bosch-mobility.com/en/solutions/sensors/front-ra...


Radars are usually tuned to ignore them because they don't have the fidelity to differentiate between the side of a trailer and a tunnel, and that leads to phantom braking.


Sure, but that’s a lot different from not being able to see them at all.


Normal collision detection systems don’t try to use cameras. Radar will have no issues in an instance like this if the detection zone is vertical enough


The detection zone is not vertical enough. Radar is tuned to mostly ignore overhead objects, otherwise you'd have phantom emergency braking whenever entering a tunnel or parking garage, or passing under a pedestrian overpass. It wouldn't normally have enough accuracy to distinguish between that and the 1.5m-high side of a semi truck.

Modern tech is changing that, but most collision prevention systems today will not prevent this type of accident. Yet another reason to make guards mandatory: there are thousands of underride crashes and hundreds of deaths yearly; you only hear about the "automated driving" ones because they bring more clicks.
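A toy illustration of that tuning trade-off (entirely my own sketch, not any real AEB stack; the 0.5 m error figure matches the elevation resolution claimed downthread for older 24 GHz units):

```python
def overhead_ignore(measured_bottom_m, ego_height_m=1.5):
    # Toy rejection rule: drop any target whose measured underside
    # appears to clear the top of the vehicle.
    return measured_bottom_m > ego_height_m

ERR = 0.5                  # assumed elevation measurement error
trailer_underside_m = 1.2  # rough clearance under a semi trailer
tunnel_ceiling_m = 4.5     # rough clearance of an underpass

print(overhead_ignore(tunnel_ceiling_m - ERR))     # True: tunnel correctly ignored
print(overhead_ignore(trailer_underside_m + ERR))  # True: trailer wrongly ignored too
```

With half-meter error, a 1.2 m trailer underside can read as 1.7 m - above the car - so the same rule that suppresses phantom braking under tunnels can also suppress braking for a crossed trailer.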


I tried but found nothing to verify your claim that most collision avoidance systems do not sense vertically enough, nor anything discussing them failing against truck underride outside of Tesla.

Can you please provide some sources for your claims?


That was just off the top of my head, but here are some links I found:

https://www.ti.com/lit/wp/spry312/spry312.pdf shows how 24 GHz radar, the most common until recently, has resolution measured in half-meter increments

https://www.kiforte.com/fca_radar-656.html Kia manual showing how the radar cone only covers a height of 1.2 meters in front of the car

These radar sensors project a very wide beam; they can't see much right in front of the car. For example, a car reviewer on YouTube recently put ultrasonic parking sensors up against a camera-based Tesla, and they performed about the same, with the camera slightly better in some situations because it's looking from above - the bumper-mounted radar cannot see objects on the ground right beneath it. Honda also just added a camera to their latest-generation collision avoidance system, to cover a wider range of situations.


A half meter is fine, more than enough to look for 2-3m of "thing" in front of the car.

And that isn't specifying the cone/detection range of the radar - it's just the work area to keep clear while performing an alignment. Otherwise the horizontal range of the sensor would be near useless.

Ultrasonic is not radar, and cars using radar will regularly follow things tens of meters away. Seeing a semi with them is not a problem.

Parking is not collision avoidance, and using cameras for it is dumb. They don't work in fog, snow, or other adverse conditions that radar will.


None of what you're saying contradicts my earlier points. How can radar tell the difference between a 1.5m-high truck undercarriage and a 2.1m-high passage if resolution is .5m? If they could detect the truck, why aren't cars constantly triggering emergency brakes when near an underpass? Why are 10000+ collisions and 100+ deaths like this happening every year when a majority of cars have these systems installed? Why did Honda add cameras if radar is good enough? The answers seem obvious, but you can do your own research.


Maybe because the bottom of a truck trailer is far below an overpass? Lol like multiple meters?

You are making up statistics now

And because cameras improve safety in some situations when used in conjunction with radar?

That point is meaningless in a discussion about Tesla, who removed radar entirely; the point being made here is "cameras alone are not as good as radar", not "radar plus camera is better than radar".


> That point is meaningless in a discussion about Tesla who removed radar entirely

How so? You're trying to say that if they've kept radar this accident could not happen, and I'm showing you this is not necessarily true. Do you have any sources to back up the idea that radar-based collision-avoidance systems can prevent this type of crash? The data says otherwise.

Here is a very extensive article on the subject, and it gives a much higher estimate (400 deaths/year): https://www.pbs.org/wgbh/frontline/article/trucks-underride-...

That was the 'do your own research' part; honestly, there is a ton of data online about everything we discussed here. You cannot expect someone to provide sources for everything they say while all you're offering is opinions.


This particular Tesla was pre 2021 so it should have both radar and cameras.


So after 2021 Tesla got even more irresponsible?


> A Washington Post analysis of federal data found that vehicles guided by Autopilot have been involved in more than 700 crashes, at least 19 of them fatal, since its introduction in 2014

19 fatalities in 8 years... less than 2.5 fatalities per year, compared to 42,795 automobile crash fatalities in just 2022 in the USA. Has the WaPo or anyone analysed the breakdown of those fatalities to see what's causing them and what can be done easily to prevent at least 3 fatalities per year, since that seems to be the bar for going to the effort of such deep and painstaking analysis in the article?

Or would that not get enough clicks(including on HN) to warrant a story even if it eventually leads to saving way more lives?
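For what it's worth, the raw arithmetic being argued here, using only figures quoted in this thread (19 Autopilot-involved fatalities over ~8 years, 42,795 US traffic deaths in 2022, Teslas at under 1% of cars), with no correction for miles driven or road mix:

```python
autopilot_fatalities = 19        # WaPo figure, 2014-2022, Autopilot involved
years = 8
us_crash_deaths_2022 = 42_795    # all US traffic fatalities, 2022
tesla_fleet_share = 0.01         # "<1% of cars" figure cited in this thread

print(autopilot_fatalities / years)              # ≈ 2.4 per year
print(us_crash_deaths_2022 * tesla_fleet_share)  # ≈ 428/year at a naive 1% share
```

Of course, the 19 counts only Autopilot-involved crashes while the 42,795 counts every car on every road, so neither division supports a clean conclusion either way.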


The Teslas aren't driving in nearly the same conditions as those 42k deaths, which, along with many other differences, means that you can't directly compare these numbers. The EV number will probably be lower, but you aren't using the correct data to support your point.


Your data and comparison are bad. Just bad and wrong.

Autopilot deaths vs. all car crash deaths? When Teslas are <1% of cars on the road? And that 19 is just Autopilot, not all crashes involving a Tesla, which is the stat you're comparing to.


If you wanted to save lives, you would focus on the low hanging fruit in the 43K deaths per year, not the 2.5 deaths per year. You could save 5 times more lives by banning all vending machines in the US or by making them safer when they fall than by banning AutoPilot, since they directly kill about 13 people a year by falling on them.

That's not even counting the number of lives potentially saved by AutoPilot, given that human drivers cause 43K deaths a year.


You are still comparing meaningless numbers. If Tesla's assisted driving systems are less safe than the rest, that is where the focus should be, and last I checked 70% of accidents involving driver-assist systems were Teslas, while they are only 1% of cars on the road.

That is a HUGE problem and 100% where regulation should be focusing.

https://amp.theguardian.com/us-news/2022/jun/15/tesla-us-car...


If there are only 2.5 fatalities a year with 70% share then those are great numbers.


Again, you are making wild assumptions, as you have no idea what the number is for the other 30% - it could be 0.5.



I was able to enjoy the interactive animation by canceling the page load before the paywall kicked in.


Wow. Such a simple hack.


> Because of the orientation of Tesla’s cameras, the person said, it was sometimes hard to discern the location of the tractor-trailers. In one view, the truck could appear to be floating 20 feet above the road, like an overpass. In another view, it could appear 25 feet below the ground.

What sort of camera or lenses are they using that causes something to appear 20 feet underground or 20 feet in the sky?


It sounds like it's less about the type of lens that they use and more about the fact they put the cameras in awkward places (presumably to preserve the aesthetic of the car), looking down and looking up in ways that make it hard to judge distance.

It doesn't help that they also chose to avoid using lidar or radar, because Musk was adamant that if human eyes can do it with just visual light then his computers would too.


I googled a bit (and read the Model 3 handbook) but found one source that said Tesla's forward collision warning does use radar.

Anyone more knowledgable know whether that's true?

I'm curious how much it's relying on object recognition via cameras vs. the more traditional radar-based forward collision braking that is standard on "dumb" modern cars.

...or in other words, would a "dumb" modern car still fail in the same scenario?


It did, but they removed it in 2021, and I think disabled it in software on cars that shipped with it.


They disabled it in software, and disabled the hardware on many vehicles taken in for unrelated service.


Wow I guess you don’t own your Tesla if they can randomly disable features


The guy would still be alive if he chose a different car manufacturer


Or if they paid attention while driving.


Paying attention isn't optional in a car without a half-baked autopilot feature.


There are plenty of cars with adaptive cruise control. And autopilot is not half-baked, it works just as well as any of them.


Forgetting to pump your brakes isn't an option in a car without ABS, yet ABS still overall saves lives.


I don't get the point.

Autopilot is "just" adaptive cruise control with lane assist. It is not self-driving. Of course it is going to crash when you are cruising at high speed and a semi suddenly rolls across the street. It also doesn't stop for stop signs or red lights; those are known limitations.

We can proceed to the discussion of whether it is okay to call such a system 'autopilot'. One side will say that autopilot (also in airplanes) means driver's assist, not self-driving. The other side will say that a common person thinks it means self-driving and therefore it is misleading. But I think this terminology was debated to death on Hacker News.


> I don't get the point.

I'm happy to see more reporting like this now that Teslas are increasingly available as short-term rentals (e.g. Hertz's Tesla fleet). Meaning there's now a whole lot of Tesla drivers on the road with only a few days experience driving the machine. Very small chance car rental drivers read the manual or do any research on the car before driving it.

But I agree the issue has been debated to death on HN specifically. That said I think Tesla's PR would vehemently disagree with your description of autopilot being adaptive cruise control with lane assist. (If Tesla actually called it that, everyone would be happy)


Fair, Hertz should explain what Autopilot is to their customers if it's their first time. Same goes for Tesla owners lending their vehicles to friends.


If they don't watch the manual video, they aren't going to figure out how to engage Autopilot. There isn't a button labeled that. The manual also indicates limitations. If the rental agency isn't disabling Autopilot, and at least forcing the renter to agree to the disclaimer when they re-enable it, they are being unwise. They need to clear out phones linked as keys anyway.


> If they don't watch the manual video, they aren't going to figure out how to engage autopilot. There isn't a button labeled that. The manual also indicates limitations.

Wanna guess what the odds are the person has a smart phone and defaults to answering such questions by searching google/youtube for a video clip showing how it's done?

Let's not act like "the manual video" is the only way to find such information...


That is a fair point. Another reason the rental agency would be really wise to disable it after each rental forcing the renter to agree they understand its limitations when they turn it back on.


Autopilot is not advertised or described to consumers as "just" adaptive cruise.


I don't know what meaningless mumbo jumbo Tesla uses to describe the features or what BS Elon tweets, but at the very least Tesla is super clear about one thing:

> Current Autopilot features require active driver supervision and do not make the vehicle autonomous.


Elon has said it is better/safer than a human.


Elon claims a lot of crazy stuff, but do you have the original source in context?

Because Autopilot is not autonomous and no one claims it to be. It cannot, for example, handle intersections or react to traffic lights.

So I guess he either said that about FSD (which would still be questionable, but at least not nonsense) or he meant that drivers using Autopilot assistance have, in general, fewer crashes than drivers who don't use it.


Yea he does, which is the problem. People will hear the claims and believe them, go out and buy a car, then die or kill someone because of it.


So no source?

Again, no Tesla owner thinks autopilot is full autonomy, since it very clearly isn't. I don't know if you have ever used it, but you would find out after driving a few meters.


I'm not digging up something Elon has said many times about Tesla Autopilot.

Also right here, in this case, we have a Tesla owner who seemed to think it was, and is dead.


I doubt Elon said what you think he said.

And regarding the victim – nah, he knew full well, he was just careless. Many drivers are.


Source(s)?


> Autopilot - Future of Driving - Autopilot enables your car to steer, accelerate and brake automatically within its lane under your active supervision, assisting with the most burdensome parts of driving. [...]

https://www.tesla.com/en_eu/models

Reading this, without any other context or knowledge about their capabilities, it does sound like the car can control itself and pilot itself automatically. The name is disingenuous and misleading, so no wonder people think the car can steer, accelerate and brake by itself even though, in cases like this, it did none of those.


It can do all of those things, though. Usually does. Usually is why you are there. I can't imagine anyone who has even just test driven one doing anything but frantically scanning the roadway while somehow white-knuckling with a loose grip. The manual and the somewhat erratic behavior of the 3D model representing what it is aware of are not confidence-inspiring.

The driver-monitoring camera, added since this incident, uses IR illumination at night and disables Autopilot if it decides you aren't looking at the road; it has some false positives.

The only thing I trust it for is stop-and-go traffic on a major highway. It does that reliably. I'm still scanning nervously to avoid a low-speed collision. I've had it automatically engage the brakes once in what it thought was an emergency, when cruise control was off. I saw the person cutting me off and was about to brake with plenty of time. My wife agreed with the car...


"The car is literally driving itself" as on-screen text in one of the advertisements


It really says so much that we allow these systems on our public roads.


No, it does not. Tesla's safety stats clearly show these systems prevent accidents, and in such large volume that making such a statement is itself an argument against safety. Interpret the data, then comment. Not the other way around.

https://www.tesla.com/VehicleSafetyReport


Excuse me if I take Tesla's self reported statistics with a grain of salt.

"...statisticians have pointed out serious analytical flaws, including the fact that the Tesla stats involve newer cars being driven on highways. The government’s general statistics include cars of all ages on highways, rural roads and neighborhood streets. In other words, the comparison is apples and oranges."

https://www.latimes.com/business/story/2022-12-27/tesla-stop...


I’m sorry if I don’t believe anything from the company that has been proven to lie, deceive, deflect and make things up.

Independent source?



That's just blogspam about how amazing and wonderful the other report you linked is. It's somehow even less credible than tesla.


Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith. Read the article before posting. It has links and contains scores for IIHS Crash avoidance & mitigation and Euro NCAP Safety Assist.

https://news.ycombinator.com/newsguidelines.html


How about instead of nitpicking someone's comment you actually try to reply to their valid criticisms of your link? It doesn't actually corroborate Tesla's claims (it is also, as they said, even more slimy and blog-spammy), and the bits you are talking about are just links to its own articles, which again don't really have real metrics or stats on how their autonomous systems fare.


Because my info is simply discarded as fake, and you haven't done any homework on a counter-argument. I've spoon-fed facts that support my argument; you clicked the links but have neither read nor interpreted them, nor made any conclusions based on them.


Not your info, Tesla’s - find a source that isn’t Tesla.


I would be happy to allow these systems on our public roads if they had fewer accidents/fatalities per mile than human drivers.

Google found this claim "According to the report, Tesla cars with Autopilot turned on are much safer than cars driven by people. In Q4 2022, there was only one crash for every 4.85 million miles driven."

Here is an article that discusses the safety of driverless cars. https://arstechnica.com/cars/2023/09/are-self-driving-cars-a...


I don't know if you've paid much attention to the other systems we allow on our public roads, but they're looking at their phones instead of avoiding collisions, or they're aggressively refusing to cooperate instead of collaboratively producing flow and throughput, or they're impaired by substances or medications or diseases or untested capability degeneration, or...


We don't allow these things on public roads, there's laws against texting, alcohol, aggressive driving &c. But Tesla's autopilot with no sense of the world around it is the equivalent of a teenager in his second driving lesson. We don't let those drive.


Some places have some laws against those things, but they are still "allowed on the road" and only "punished" after they are "caught".

In any case, I assure you Tesla's autopilot is vastly beyond a teenager in their "second driving lesson." Driving lessons! How quaint! As if these are obligatory prior to being "allowed on the road" everywhere!

Simply darling.


> there's laws against texting

Not in several US states, including where I live

> alcohol

Most states legally allow BAC up to 0.08, which is too high for someone expected to make split second decisions

> aggressive driving

Maybe there's laws against it but never seen it enforced. Aggressive driving is extremely common here.

And you say we don't allow these things to happen on public roads?


Whatever happened to the oath of the engineer?

I’m not buying that this issue is only about psychopathic business people.

Neither domain nor pay grade leave room for “professional” excuses here.

Oh, and yes, "even" interaction design, including nomenclature/wording, used to fall under engineering consideration and vetting as well.

For a “software” car the interaction on the whole is terribly designed. UI out of immediate view, low contrast, distracting, etc. I was appalled how bad the experience was driving a Tesla over a recent short vacation (2022 Model 3).


Most people don't care about others' lives. Engineers are no different. It takes regulators to keep them from killing people, not an oath.


Was Autopilot (I mean fully autonomous driving) really available at that time?


I'm a WaPo subscriber, and saw this on the front page. I haven't read it, though, because I really don't want to get a thrill from exploring someone else's tragedy.



