
Moof1904

macrumors 65816
May 20, 2004
1,053
87
The reporting around this event isn't terribly impressive. First of all, despite the headlines and any marketing labels, the Tesla "autopilot" feature is not autonomous driving. It isn't intended to be that, nor is it represented as such. It's simply auto-steering lane control coupled with traffic-aware cruise control. The product documentation is abundantly clear on this. The UI even reminds the driver to "keep hands on the wheel at all times," and, because neither the auto-steering nor the automatic braking is perfect, the driver is advised in the UI to "be prepared to take control at any time." The steering wheel senses the presence of the driver's hands, and the vehicle periodically nags the driver if it senses that those hands aren't on the wheel.
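For the curious, here's a minimal sketch of how that kind of nag loop could work, assuming torque-based hands detection; the threshold, timings, and escalation steps are my own illustrative guesses, not Tesla's actual parameters.

```python
import time

TORQUE_THRESHOLD_NM = 0.1   # steering torque treated as "hands present" (illustrative)
NAG_AFTER_S         = 60.0  # seconds without detected hands before a visual nag (illustrative)
ESCALATE_AFTER_S    = 120.0 # seconds before chime/slowdown (illustrative)

def monitor_hands(read_steering_torque, warn, escalate):
    """Infer hands on the wheel from small torque inputs and escalate alerts."""
    last_hands = time.monotonic()
    while True:
        if abs(read_steering_torque()) > TORQUE_THRESHOLD_NM:
            last_hands = time.monotonic()   # driver input detected, reset the timer
        idle = time.monotonic() - last_hands
        if idle > ESCALATE_AFTER_S:
            escalate()                      # e.g. audible chime, then slow the car
        elif idle > NAG_AFTER_S:
            warn()                          # e.g. "Hold steering wheel" in the UI
        time.sleep(0.1)                     # poll at 10 Hz
```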

In addition, the product documentation points out repeatedly that the auto-steering feature is still in beta and reminds the user, again, to keep both hands on the wheel at all times.

Tesla also advises that the vehicle will not brake for objects that are stationary when the Tesla first detects them. This is to prevent spontaneous, and dangerous, braking if the Tesla rounds a gentle curve with cars parked on the shoulder. If the Tesla responded to stationary objects with its automatic braking, every parked car on the shoulder, median, or in an adjacent lane on a curved road could trigger dangerous, unwarranted braking. Because the 18-wheeler was perpendicular to the Tesla's path of travel, this is quite possibly a factor in the Tesla not applying the brakes when the truck turned directly in front of the vehicle.
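To make that documented limitation concrete, here's a rough sketch of the kind of filter involved. It's my own simplification of how radar-based automatic braking commonly rejects stationary returns, not Tesla's actual logic, and the numbers are invented.

```python
STATIONARY_SPEED = 0.5  # m/s; below this a radar return counts as stationary (illustrative)

class Track:
    """One tracked radar object; remembers whether it was moving when first seen."""
    def __init__(self, track_id, ground_speed):
        self.track_id = track_id
        self.was_moving = ground_speed > STATIONARY_SPEED

def should_brake_for(track, time_to_collision, ttc_threshold=1.5):
    """Brake only for objects that were already moving at first detection.

    Parked cars on the shoulder of a curve first appear as stationary
    returns and are ignored; a trailer crossing perpendicular to the
    radar shows almost no closing speed, so it can land in the same
    "stationary" bucket.
    """
    if not track.was_moving:
        return False  # the documented limitation: ignore initially-stationary objects
    return time_to_collision < ttc_threshold
```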

In short, auto-steering is a driver assistance feature, designed to cut down on fatigue. It is not "autopilot" or autonomous driving in any way. Auto-braking is designed to help reduce the likelihood and severity of a crash. Intentional limitations of auto-braking, designed to prevent false braking on stationary objects, are probably a factor here. This is a known and documented behavior of the automatic braking.

The driver of the Model S elected to disregard all of the warnings in the product documentation and in the UI and was watching a movie on his iPhone at the time of the crash. Compounding this, the driver of the 18-wheeler turned directly into the path of the oncoming Tesla. The truck driver's actions have reportedly prompted the police to open an investigation into vehicular manslaughter charges.

We have a situation where driver assistance features, never designed nor marketed to be autonomous, were, quite foolishly, fully relied upon by a driver, who elected to ignore all product documentation and watch a movie on his phone instead of even looking out of the windshield. At the same time, a truck driver recklessly turned directly into the path of the oncoming Tesla.

While the two humans involved were displaying incomprehensibly reckless judgement, the systems of the Tesla failed to detect the bright surface of the raised truck against the bright sky and automatic braking did not engage.

In 130 million "autopilot"-enabled miles, this is the first recorded instance of the auto-braking failing to detect such an object and causing an injury. To the contrary, online videos show a number of cases where drivers were saved from injury by the automatic braking.
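For scale, the back-of-the-envelope comparison (Tesla's statement cited roughly one US fatality per 94 million vehicle miles; one incident is obviously far too small a sample to draw firm conclusions from):

```python
autopilot_miles_per_fatality = 130e6  # Autopilot-enabled miles cited, one fatality
us_miles_per_fatality        = 94e6   # all-US figure cited in Tesla's statement

ratio = autopilot_miles_per_fatality / us_miles_per_fatality
print(f"Autopilot:  one fatality per {autopilot_miles_per_fatality / 1e6:.0f}M miles")
print(f"US average: one fatality per {us_miles_per_fatality / 1e6:.0f}M miles")
print(f"Autopilot interval is {ratio:.2f}x the US average")  # ~1.38x, n = 1 caveat applies
```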

No visualization technology is perfect. Had someone strung a steel cable or a thin metal barrier across the roadway, I'm sure the vehicle would have failed to see that, too. (And, most likely, the human.) That's why they're driver assistance features, not autonomous driving.

This is, by no means, an autopilot fatality. Get past the headlines.
 

Benjamin Frost

Suspended
May 9, 2015
2,405
5,001
London, England
The reporting around this event isn't terribly impressive. First of all, despite the headlines and any marketing labels, the Tesla "autopilot" feature is not autonomous driving. It isn't intended to be that, nor is it represented as such. It's simply auto-steering lane control coupled with traffic-aware cruise control. The product documentation is abundantly clear on this. The UI even reminds the driver to "keep hands on the wheel at all times," and, because neither the auto-steering nor the automatic braking is perfect, the driver is advised in the UI to "be prepared to take control at any time." The steering wheel senses the presence of the driver's hands, and the vehicle periodically nags the driver if it senses that those hands aren't on the wheel.

In addition, the product documentation points out repeatedly that the auto-steering feature is still in beta and reminds the user, again, to keep both hands on the wheel at all times.

Tesla also advises that the vehicle will not brake for objects that are stationary when the Tesla first detects them. This is to prevent spontaneous, and dangerous, braking if the Tesla rounds a gentle curve with cars parked on the shoulder. If the Tesla responded to stationary objects with its automatic braking, every parked car on the shoulder, median, or in an adjacent lane on a curved road could trigger dangerous, unwarranted braking. Because the 18-wheeler was perpendicular to the Tesla's path of travel, this is quite possibly a factor in the Tesla not applying the brakes when the truck turned directly in front of the vehicle.

In short, auto-steering is a driver assistance feature, designed to cut down on fatigue. It is not "autopilot" or autonomous driving in any way. Auto-braking is designed to help reduce the likelihood and severity of a crash. Intentional limitations of auto-braking, designed to prevent false braking on stationary objects, are probably a factor here. This is a known and documented behavior of the automatic braking.

The driver of the Model S elected to disregard all of the warnings in the product documentation and in the UI and was watching a movie on his iPhone at the time of the crash. Compounding this, the driver of the 18-wheeler turned directly into the path of the oncoming Tesla. The truck driver's actions have reportedly prompted the police to open an investigation into vehicular manslaughter charges.

We have a situation where driver assistance features, never designed nor marketed to be autonomous, were, quite foolishly, fully relied upon by a driver, who elected to ignore all product documentation and watch a movie on his phone instead of even looking out of the windshield. At the same time, a truck driver recklessly turned directly into the path of the oncoming Tesla.

While the two humans involved were displaying incomprehensibly reckless judgement, the systems of the Tesla failed to detect the bright surface of the raised truck against the bright sky and automatic braking did not engage.

In 130 million "autopilot"-enabled miles, this is the first recorded instance of the auto-braking failing to detect such an object and causing an injury. To the contrary, online videos show a number of cases where drivers were saved from injury by the automatic braking.

No visualization technology is perfect. Had someone strung a steel cable or a thin metal barrier across the roadway, I'm sure the vehicle would have failed to see that, too. (And, most likely, the human.) That's why they're driver assistance features, not autonomous driving.

This is, by no means, an autopilot fatality. Get past the headlines.

You miss the point completely.

The auto-steering and cruise control features in combination tempted the driver to watch a video, something that simply would not happen in a car without those features. That is what led to the fatal crash.

So it is correct to say that these semi-autonomous features played a crucial part in the driver's death. Tesla, in encouraging a halfway house, are playing a dangerous game. If you encourage a driver to rest his mind, he is likely to rest it too much. That is what happened here, with fatal consequences. Better to drive without those features and be forced to concentrate on the road ahead.
 
  • Like
Reactions: mw360

mobilehaathi

macrumors G3
Aug 19, 2008
9,368
6,352
The Anthropocene
The reporting around this event isn't terribly impressive. First of all, despite the headlines and any marketing labels, the Tesla "autopilot" feature is not autonomous driving. It isn't intended to be that, nor is it represented as such. It's simply auto-steering lane control coupled with traffic-aware cruise control. The product documentation is abundantly clear on this. The UI even reminds the driver to "keep hands on the wheel at all times," and, because neither the auto-steering nor the automatic braking is perfect, the driver is advised in the UI to "be prepared to take control at any time." The steering wheel senses the presence of the driver's hands, and the vehicle periodically nags the driver if it senses that those hands aren't on the wheel.

In addition, the product documentation points out repeatedly that the auto-steering feature is still in beta and reminds the user, again, to keep both hands on the wheel at all times.

Tesla also advises that the vehicle will not brake for objects that are stationary when the Tesla first detects them. This is to prevent spontaneous, and dangerous, braking if the Tesla rounds a gentle curve with cars parked on the shoulder. If the Tesla responded to stationary objects with its automatic braking, every parked car on the shoulder, median, or in an adjacent lane on a curved road could trigger dangerous, unwarranted braking. Because the 18-wheeler was perpendicular to the Tesla's path of travel, this is quite possibly a factor in the Tesla not applying the brakes when the truck turned directly in front of the vehicle.

In short, auto-steering is a driver assistance feature, designed to cut down on fatigue. It is not "autopilot" or autonomous driving in any way. Auto-braking is designed to help reduce the likelihood and severity of a crash. Intentional limitations of auto-braking, designed to prevent false braking on stationary objects, are probably a factor here. This is a known and documented behavior of the automatic braking.

The driver of the Model S elected to disregard all of the warnings in the product documentation and in the UI and was watching a movie on his iPhone at the time of the crash. Compounding this, the driver of the 18-wheeler turned directly into the path of the oncoming Tesla. The truck driver's actions have reportedly prompted the police to open an investigation into vehicular manslaughter charges.

We have a situation where driver assistance features, never designed nor marketed to be autonomous, were, quite foolishly, fully relied upon by a driver, who elected to ignore all product documentation and watch a movie on his phone instead of even looking out of the windshield. At the same time, a truck driver recklessly turned directly into the path of the oncoming Tesla.

While the two humans involved were displaying incomprehensibly reckless judgement, the systems of the Tesla failed to detect the bright surface of the raised truck against the bright sky and automatic braking did not engage.

In 130 million "autopilot"-enabled miles, this is the first recorded instance of the auto-braking failing to detect such an object and causing an injury. To the contrary, online videos show a number of cases where drivers were saved from injury by the automatic braking.

No visualization technology is perfect. Had someone strung a steel cable or a thin metal barrier across the roadway, I'm sure the vehicle would have failed to see that, too. (And, most likely, the human.) That's why they're driver assistance features, not autonomous driving.

This is, by no means, an autopilot fatality. Get past the headlines.

This is a really excellent summary of what happened, although I disagree slightly that Tesla is entirely blameless. I'd say the vast majority of blame lies with the drivers of the car and the truck, but I think there is a tinge of hubris on Tesla's part for releasing 'beta' driving assist software and marketing it as 'Autopilot.'
 
  • Like
Reactions: satcomer

r3m1

macrumors regular
Apr 7, 2012
220
120
Earth
Last night I saw a Tesla S in the flesh.

It was on Dutch plates here in the UK.

Looks very sexy on the outside.

In the Netherlands, Teslas are used as taxis - just step outside Schiphol airport :)
This is a really excellent summary of what happened, although I disagree slightly that Tesla is entirely blameless. I'd say the vast majority of blame lies with the drivers of the car and the truck, but I think there is a tinge of hubris on Tesla's part for releasing 'beta' driving assist software and marketing it as 'Autopilot.'

Tesla's reply after this sad event was terrible - stating 'oh, but you know, we have x million safe miles'. Who is running their PR?

Say mea culpa - if you put it in a car and people use it, don't run away from your end of the responsibilities.

:(
 

Moof1904

macrumors 65816
May 20, 2004
1,053
87
You miss the point completely.

The auto-steering and cruise control features in combination tempted the driver to watch a video, something that simply would not happen in a car without those features. That is what led to the fatal crash.

So it is correct to say that these semi-autonomous features played a crucial part in the driver's death. Tesla, in encouraging a halfway house, are playing a dangerous game. If you encourage a driver to rest his mind, he is likely to rest it too much. That is what happened here, with fatal consequences. Better to drive without those features and be forced to concentrate on the road ahead.

I couldn't disagree more.

By your logic, if an airline pilot engages autopilot (a true autopilot, a device that far more stridently encourages one to "rest one's mind" than Tesla's auto-steering does) and then takes a nap and misses an oncoming aircraft (an event that an aircraft's autopilot likely can't detect), then the autopilot is to blame for the resulting collision, because its presence compelled the pilot to "rest his mind"?

There is no mechanical device on the planet that can exert telepathic mind control on a human and destroy the ability to exercise free will and display common sense. The role of the Tesla's driver assist features was not misrepresented in the documentation or in the UI. To the contrary, the repeated cautions, warnings, and software feedback throughout the product documentation and UI make the user's responsibility more than abundantly clear.

Nothing absolves the pilot of an aircraft or the driver of a car of the responsibility for operating that craft. Nothing. We have free will, and the driver of that Tesla chose to abandon his responsibilities behind the wheel.

When a user chooses to blatantly disregard the clearly stated instructions and intentions of a device and abandon all common sense to use the device in an appallingly unsafe manner, the blame rests entirely on the human.
 
  • Like
Reactions: lucidmedia and R3k

Benjamin Frost

Suspended
May 9, 2015
2,405
5,001
London, England
I couldn't disagree more.

By your logic, if an airline pilot engages autopilot (a true autopilot, a device that far more stridently encourages one to "rest one's mind" than Tesla's auto-steering does) and then takes a nap and misses an oncoming aircraft (an event that an aircraft's autopilot likely can't detect), then the autopilot is to blame for the resulting collision, because its presence compelled the pilot to "rest his mind"?

There is no mechanical device on the planet that can exert telepathic mind control on a human and destroy the ability to exercise free will and display common sense. The role of the Tesla's driver assist features was not misrepresented in the documentation or in the UI. To the contrary, the repeated cautions, warnings, and software feedback throughout the product documentation and UI make the user's responsibility more than abundantly clear.

Nothing absolves the pilot of an aircraft or the driver of a car of the responsibility for operating that craft. Nothing. We have free will, and the driver of that Tesla chose to abandon his responsibilities behind the wheel.

When a user chooses to blatantly disregard the clearly stated instructions and intentions of a device and abandon all common sense to use the device in an appallingly unsafe manner, the blame rests entirely on the human.

You're making an inappropriate comparison.

A trained pilot is working in a professional environment. A car driver has no such training, and is operating in a very different environment.

Sad though this death was, it is not at all surprising, and it highlights the fundamental flaw of autopilot in a car without professionally trained drivers.
 

ThunderSkunk

macrumors 68040
Dec 31, 2007
3,783
3,990
Milwaukee Area
34,000 auto fatalities per year in the US.
People operating while impaired by medication, distractions, physical & mental limitations, and general ineptitude.
Expecting perfection from an Autopilot system from day 1 is not only unrealistic, it was never realistic for any automobile or transportation technology innovation. Every time any manufacturer tries anything new, there are fatalities. Whether it's seat belts, glass & plastic materials, road surface compounds, tire tread patterns, steering wheel texture, or transmission gear hardness ratings, millions of variables are constantly being tweaked to improve safety on the roads, but each has a quantified fatality rate associated with it.

There will be more accidents. But Autopilot doesn't have to achieve instant perfection. It just has to be more reliable than intoxicated/oblivious suburbanites on antidepressants careening all over the road staring at iPhones, which, if you've got a habit of looking in people's car windows as you drive by, you'll have noticed accounts for about 90% of the people hurling 4,000 lb death machines around out there.

But regarding your point, Benjamin, it is too early to encourage "drivers" to be so passive. I think the green grass on the other side of the development hill, when no one need pay any more attention to the operation of the vehicle than a passenger on a bus does, is when all cars about to collide communicate with each other and make minor corrections to speed and direction, smoothly and precisely missing each other even if only by inches. Ideally, there'd be a network that knows where each vehicle is and where it's going, and a giant beast of a program planning their trajectories efficiently, but I can already hear the authoritarians licking their lips at that prospect. Getting over the hill to that sweet spot is the tricky bit.
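That kind of cooperative avoidance isn't hard to sketch, either: each pair of cars projects its trajectories forward and, if the predicted closest approach is too tight, both nudge their speeds. A toy version, with every threshold invented for illustration:

```python
import math

def closest_approach(p1, v1, p2, v2):
    """Time and distance of closest approach for two constant-velocity cars (2D)."""
    dp = (p2[0] - p1[0], p2[1] - p1[1])
    dv = (v2[0] - v1[0], v2[1] - v1[1])
    dv2 = dv[0] ** 2 + dv[1] ** 2
    t = 0.0 if dv2 == 0 else max(0.0, -(dp[0] * dv[0] + dp[1] * dv[1]) / dv2)
    return t, math.hypot(dp[0] + dv[0] * t, dp[1] + dv[1] * t)

def deconflict(p1, v1, p2, v2, min_miss_m=2.0, horizon_s=10.0):
    """Return speed multipliers for each car; nudge them apart if a near-miss is predicted."""
    t, miss = closest_approach(p1, v1, p2, v2)
    if t < horizon_s and miss < min_miss_m:
        return 0.97, 1.03   # one eases off, the other eases ahead
    return 1.0, 1.0
```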
 
Last edited:

thelookingglass

macrumors 68020
Apr 27, 2005
2,138
631
You're making an inappropriate comparison.

A trained pilot is working in a professional environment. A car driver has no such training, and is operating in a very different environment.

Sad though this death was, it is not at all surprising, and it highlights the fundamental flaw of autopilot in a car without professionally trained drivers.

Yeah, but the point he's trying to make is that the Tesla system is not a fully autonomous system. It's basically glorified cruise control with limited lane-keeping assist. "Autopilot" is a marketing term, but the system prompts the driver multiple times to keep hands on the wheel. And if you've ever had first-hand experience with this system, you'll know that it's best used on highways, and with very, very close supervision. This guy, RIP, was being very cavalier about his reliance on Autopilot. In addition to watching a movie while driving, it's being reported that he was driving at about 85-90 mph.
 
  • Like
Reactions: Benjamin Frost

Benjamin Frost

Suspended
May 9, 2015
2,405
5,001
London, England
Yeah, but the point he's trying to make is that the Tesla system is not a fully autonomous system. It's basically glorified cruise control with limited lane-keeping assist. "Autopilot" is a marketing term, but the system prompts the driver multiple times to keep hands on the wheel. And if you've ever had first-hand experience with this system, you'll know that it's best used on highways, and with very, very close supervision. This guy, RIP, was being very cavalier about his reliance on Autopilot. In addition to watching a movie while driving, it's being reported that he was driving at about 85-90 mph.

Fair enough. I think it is very unwise of Tesla to market it as AutoPilot.

I have cruise control that adapts to the speed of the car in front. It's wonderful, and works very well, but I still have to watch the road like a hawk. There are various situations in which the radar loses the car: going through a tunnel, going round a bend, odd tail lights, etc. It also takes a while to react, about a second or so.
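For the curious, the control law behind that kind of adaptive cruise is simple in principle. Here's a minimal constant-time-gap sketch (the gains and gap are illustrative, not any particular car's tuning); it also shows why losing the radar track just drops you back to plain cruise control:

```python
def acc_command(own_speed, set_speed, lead_dist=None, lead_speed=None,
                time_gap_s=1.8, k_dist=0.2, k_speed=0.5):
    """Return a target acceleration (m/s^2) for a constant-time-gap ACC.

    With no lead-car track (tunnel, bend, odd tail lights), fall back
    to ordinary cruise control toward the set speed.
    """
    if lead_dist is None or lead_speed is None:
        return 0.3 * (set_speed - own_speed)  # plain cruise: close the speed error
    desired_gap = time_gap_s * own_speed      # stay ~1.8 s behind the lead car
    gap_error = lead_dist - desired_gap
    speed_error = lead_speed - own_speed
    return k_dist * gap_error + k_speed * speed_error
```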
 

wesk702

macrumors 68000
Jul 7, 2007
1,809
368
The hood
The level of incorrect information in this thread is ridiculous. The owner was not watching a movie; that was just something the truck driver said, and I'm sure bias has a little something to do with it. Police later went on record saying that they found the DVD player, but it was in fact not running. The car was going extremely fast, and the high trailer of the truck resulted in the sensors not detecting it. That, in combination with the blinding sun that kept the driver from seeing the truck, resulted in the accident. Tesla's Autopilot operated as it should. If any of you had followed the news, you would have seen that the truck driver is now possibly going to face charges for recklessly crossing high-speed traffic. Some truck drivers think that their sheer size alone gives them the right to hog the road. Show some respect for the deceased. He served our country and was NOT watching Harry Potter at the time of his accident.
 
  • Like
Reactions: satcomer

rhp2424

macrumors regular
Jul 23, 2008
122
18
Some truck drivers think that their sheer size alone gives them the right to hog the road.

You are exactly right. It seems a more appropriate* headline would be along the lines of: "Reckless truck driver commits murder as a result of his cavalier attitude".

*based on everything I understand from all the reports I've read
 
  • Like
Reactions: satcomer

Fattytail

macrumors 6502a
Apr 11, 2012
902
242
The level of incorrect information in this thread is ridiculous. The owner was not watching a movie; that was just something the truck driver said, and I'm sure bias has a little something to do with it. Police later went on record saying that they found the DVD player, but it was in fact not running. The car was going extremely fast, and the high trailer of the truck resulted in the sensors not detecting it. That, in combination with the blinding sun that kept the driver from seeing the truck, resulted in the accident. Tesla's Autopilot operated as it should. If any of you had followed the news, you would have seen that the truck driver is now possibly going to face charges for recklessly crossing high-speed traffic. Some truck drivers think that their sheer size alone gives them the right to hog the road. Show some respect for the deceased. He served our country and was NOT watching Harry Potter at the time of his accident.

I agree that we should show respect for the deceased, but frankly we don't know what he was doing at the time of the crash. The reports also say that he didn't slow down one bit, AND a witness (not the truck driver) said he zoomed past her while she was doing 85 mph. This guy also had a bunch of speeding tickets from the last few years. It's entirely possible he trusted Autopilot way too much and wasn't even paying attention. And the crash was in the mid-afternoon, not exactly when the sun is almost level with the road. I find it hard to believe that a driver actually paying attention to traffic wouldn't have been able to see the truck.
 

wesk702

macrumors 68000
Jul 7, 2007
1,809
368
The hood
I agree that we should show respect for the deceased, but frankly we don't know what he was doing at the time of the crash. The reports also say that he didn't slow down one bit, AND a witness (not the truck driver) said he zoomed past her while she was doing 85 mph. This guy also had a bunch of speeding tickets from the last few years. It's entirely possible he trusted Autopilot way too much and wasn't even paying attention. And the crash was in the mid-afternoon, not exactly when the sun is almost level with the road. I find it hard to believe that a driver actually paying attention to traffic wouldn't have been able to see the truck.
On the Tesla Motors Club forum there's a member who lives in the area of the crash. He said that where the accident happened, the setting sun is absolutely blinding. No, I'm not condoning either party here; I'm just trying to get the facts straight. The Mobileye sensors are also incapable of detecting or stopping for perpendicular traffic, which is also why Tesla is rumored to be parting ways with Mobileye and moving to more advanced Bosch sensors.
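A quick way to see why perpendicular traffic is hard for a Doppler-based radar: it measures closing speed along the line of sight, and a crossing trailer has almost none. A toy calculation:

```python
import math

def radial_speed(target_speed_mps, angle_deg):
    """Closing speed a Doppler radar sees: the velocity component along the line of sight."""
    return target_speed_mps * math.cos(math.radians(angle_deg))

print(radial_speed(10, 85))  # truck crossing at 10 m/s, nearly perpendicular: ~0.9 m/s
print(radial_speed(10, 0))   # same truck head-on: 10 m/s, clearly a mover
```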
You are exactly right. It seems a more appropriate* headline would be along the lines of: "Reckless truck driver commits murder as a result of his cavalier attitude".

*based on everything I understand from all the reports I've read
"manslaughter" is appropriate
 

diamond.g

macrumors G4
Mar 20, 2007
11,069
2,421
OBX
On the Tesla Motors Club forum there's a member who lives in the area of the crash. He said that where the accident happened, the setting sun is absolutely blinding. No, I'm not condoning either party here; I'm just trying to get the facts straight. The Mobileye sensors are also incapable of detecting or stopping for perpendicular traffic, which is also why Tesla is rumored to be parting ways with Mobileye and moving to more advanced Bosch sensors.
"manslaughter" is appropriate
Parting with Mobileye isn't a rumor, though they claim they may build the sensors in-house. We will see...
 

satcomer

Suspended
Feb 19, 2008
9,115
1,973
The Finger Lakes Region
You miss the point completely.

The auto-steering and cruise control features in combination tempted the driver to watch a video, something that simply would not happen in a car without those features. That is what lead to the fatal crash.

So it is correct to say that these semi-autonomous features played a crucial part in the driver's death. Tesla, in encouraging a halfway house, are playing a dangerous game. If you encourage a driver to rest his mind, he is likely to rest it too much. That is what happened here, with fatal consequences. Better to drive without those features and be forced to concentrate on the road ahead.

Tesla does remind drivers to keep their hands on the wheel, but YouTubers soon figured out how to defeat this! Tesla's Autopilot will even slow the car down!
 
Last edited: