
Tesla should be liable by virtue of calling things 'autopilot' or 'full self driving' which aren't.


Something that just flies an aircraft straight and level is still an autopilot, so I don't think it's a totally inaccurate metaphor.

Ironically in these situations a more understated name would probably work better for PR. Headline of "Tesla on cruise control rear ends motorcycle on freeway, killing rider" gives a much stronger impression of it being the driver's fault.


You bring up a valid argument but, unfortunately, it cuts both ways.

Aircraft manufacturers are liable when their vehicles crash due to malfunction.


If the pilot turns on autopilot and then stops paying attention, and then the plane crashes, it's absolutely not the manufacturer's fault.


If the pilot turns on autopilot and looks away for 60 seconds he's using it as intended. This is what autopilot means. Cruise control is not autopilot. The only way your comment would make sense would be if the typical environment for a car were a flat plane of asphalt 100s of km across with all other cars having centrally planned routes that were guaranteed by a third party not to intersect and drivers were trained to turn it off on anything resembling a normal road.


If the pilot turns on AP, looks away, and then 30s later crashes into a small aircraft that didn't have its transponder on, is he relieved of any responsibility? I don't think so, and I'm pretty sure there has to be a pilot at the controls at all times even while AP is on.


The time-scale in which a Tesla driver must react to a failure ranges from around 10 seconds down to under a second, not 30 seconds.

If this happened anywhere that policy and law said it was okay to use autopilot, you would see changes in the policy, fault laid on air traffic control at minimum, and then the scope in which autopilot could be used would be massively restricted.

If Boeing had advertised the feature for use at low altitude in congested airspace and written the training with that in mind, then most likely someone at Boeing would be criminally at fault.


For convenience features advertised as requiring active driver supervision and not making the vehicle autonomous, I think it only really makes sense for the driver to be liable.

Similarly, if some airplane instrument is stated to require regular calibration to stay within acceptable error limits, I don't think the manufacturer would be liable if the instrument starts to drift when those calibrations are not carried out. Or if some crash-contributing decision is made based on assuming a higher accuracy than promised.


Aircraft manufacturers are not responsible for an autopilot crash.


“Straight and level” in a car is cruise control. The expectation for autopilot would be much higher.


This is possibly the dumbest argument against Tesla one could make.

Given autopilot's origins in aviation, it's pretty clear that autopilot was never designed to prevent you from hitting things.


In aviation or maritime use, it's a system you can activate and then stop paying attention to the controls, or do a task like navigation, eating, or going to the bathroom, for timescales on the order of seconds to minutes.

Tesla's 'autopilot' system is cruise control, but with more mental overhead when the human needs to react, because the human is being hypnotised by the lack of stimulus while having to keep a mental model of what the car is planning in their head.

And 'full self driving' is an automated way of making the driver as unprepared for novel stimulus as possible. "You have to sit here and do nothing, but react in milliseconds when I fuck up" is not a task a human can do.

Tesla knows this, and has known it since before they made the features available and gave them misleading names. So when the human inevitably doesn't react in time, in a scenario where it is impossible for them to react in time, Tesla is the one responsible.


To just arbitrarily redefine autopilot for cars as something that can avoid accidents isn't a good argument. You are still supposed to maintain awareness when using autopilot in an airplane or ship, paying attention to things like radar, radio, ADS-B, etc. And in a Tesla, you are supposed to pay attention visually.

Full self driving is still in beta test, with known dangers that you have to accept.

All the accidents that happen with Tesla autopilot use are driver error, full stop. Just like it's driver error to get drunk and drive, or to be looking down at a phone, distracted, and rear end a motorcycle. As someone who rides, I'd rather have roads full of Teslas with FSD, because for every time it fucks up there are vastly more times it avoids an accident, given that you are not going to get rid of drunk, high, or distracted drivers.


I think we could make the argument that autopilot is an unfit system for car travel.

In both air and water, the distances and reaction times involved are much, much longer, at least in the situations where autopilots are used. And they are not really used in situations where manual control is needed.


I've never seen Tesla's "autopilot" avoid an accident that a human driver wouldn't have reacted to sooner. Maybe the standard of driving in the US is very low, or something.


It actually is. Some states are far worse than others. In my state, a learning permit can be obtained at the age of 15. Not exactly a good age for handling grave responsibility.


I mean I grew up in a part of the world where if you can't drive and maintain a tractor and a fair selection of implements at the age of 12 they start phoning special schools and highly qualified educational psychologists, but I get that this is very much the exception. It's weird like that even down south here at 56°N.



