As you said in your first sentence, "we know nothing about the situation".

Do you think she was trying to die? Or is it possible that she simply made a mistake: misjudged the speed of oncoming traffic, didn't see the car because of a bend in the road or an obstruction to her view... any of a hundred different ways to mess up. People make mistakes, and sometimes they die because of them. We can't control everyone's actions.

We can and should control other things, though. Unless we're talking about a freeway here, road design can and should make it as safe as possible for other users besides just auto drivers. And in the case of autonomous (or semi-autonomous) vehicles, they obviously need to be programmed to "see" and even anticipate the presence of all users of the road.

So yeah, you're right. You can't stop any individual from randomly stepping in front of a car, but good design can make it less likely that someone would do so by accident.

Completely agree. Hopefully this terrible loss serves as a lesson, both for the self-driving car industry and for the general public. It would be far more tragic if this death ended up killing self-driving vehicles. There are going to be more situations like this, in which the sheer physics of stopping a vehicle before it hits a person, animal, or other moving object cannot be overcome, even with the most sophisticated software and safety features.
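For anyone who wants to see that physics concretely, here is a rough back-of-the-envelope sketch. All figures (speed, reaction time, deceleration) are illustrative assumptions, not data from this incident:

```python
def stopping_distance_m(speed_kmh, reaction_time_s, decel_ms2=7.0):
    """Total distance to stop: distance covered during the reaction
    delay plus braking distance at constant deceleration."""
    v = speed_kmh / 3.6  # convert km/h to m/s
    return v * reaction_time_s + v ** 2 / (2 * decel_ms2)

# A computer reacting in ~0.1 s vs. a human at ~1.5 s, both at 60 km/h:
computer = stopping_distance_m(60, 0.1)  # ~21.5 m
human = stopping_distance_m(60, 1.5)     # ~44.8 m
```

Even with a near-instant reaction, the car still needs roughly 20 m of braking distance at 60 km/h; a pedestrian who appears closer than that will be hit no matter how good the software is.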
 
When the family files a lawsuit, is it against Uber, Volvo, the person sitting in the driver's seat, the city of Tempe for allowing these vehicles on the road, and/or the maker of the autonomous software?
Volvo deserves punishment. As a loyal Volvo customer, I chose this brand because they used to take safety seriously. The single fact that they invest in self-driving research takes them to the same place as any other brand in the market: sellouts that no longer believe in what the brand used to stand for.
 
Not sure why people are having a go at the person who died for not using the crosswalk. Come to the UK, where most of the time people cross the road wherever and whenever they can. That wasn't, and certainly shouldn't have been, the cause of the accident. If these autonomous vehicles can't deal with that properly, they will never be allowed.
 
I think the real question is, can AI be programmed with a good-enough model to let it use its strengths (sheer reaction time and speed of processing; inability to be "distracted") to overcome its limitations? And, can AI cars leverage each others' training to become more effective drivers? The promise is great: cars that obey traffic laws, don't get drunk, and can see and react faster than any human.

My thoughts on this, from a developer's point of view: probably. But not right now. And, what most people don't want to hear, probably not for a long time.
 
It will be illegal to drive when you’re 75. Remember this when you march with your protest sign outside of your local City Hall.

Humans driving cars are dangerous.
I disagree; there’s too much culture around driving for it to be completely outlawed. Look what happened with prohibition...
 
I disagree; there’s too much culture around driving for it to be completely outlawed. Look what happened with prohibition...
It’s about public safety. Driving will only be permitted in specific places, but not on public roads. Time will tell.
AI is only as efficient as it's programmed and trained to be. While a computer can solve problems significantly faster than a human can, it is still only as good as its programming. This is a clear distinction that I see people often misunderstand, or simply ignore because it doesn't fit their argument.

If the car is driving down the street and there are children playing in a yard, will the AI consider that at any moment one of them may run into or across the street and slow down in preparation for such an event? If a ball goes flying across the street, will the car slow down to avoid hitting the ball, or will it also consider any number of children running after the ball? I do hesitate to ask that because after seeing some of the sad responses on this thread, I'd assume many will blame the child for running in the street just to avoid hinting that maybe the technology isn't ready yet.
Yes, a self-driving car can identify children playing and slow down.

https://www.ted.com/talks/joseph_re...o_recognize_objects_instantly/up-next#t-27087

I see signs on houses stating “drive like your kids live here”. Clearly, humans speed in places with children.
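On the ball-and-child question a few posts up: detection alone isn't enough; the planner also needs rules that treat some detections as precursors of others. A toy sketch of such a heuristic follows. The class names and speed caps are invented for illustration and are not any vendor's actual policy:

```python
# Map detected object classes to a conservative speed cap (km/h).
# A ball near the road implies a child may chase it, so it is
# treated as seriously as a pedestrian.
SPEED_CAPS = {
    "pedestrian": 20,
    "child": 10,
    "ball": 10,   # anticipate a child following the ball
    "cyclist": 25,
}

def speed_limit_kmh(detections, default=50):
    """Return the lowest cap triggered by any detection near the road."""
    caps = [SPEED_CAPS[d] for d in detections if d in SPEED_CAPS]
    return min(caps, default=default)

speed_limit_kmh(["car", "ball"])  # 10: slow down before any child appears
```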
 
First off, this is a really sad incident and I'm truly sorry for the victim.

But, while this will undoubtedly slow the progress of testing, I don't think it will change the end result. I believe that the whole morality issue of self driving cars will go away once the technology is truly ready. By "ready" I mean: either self driving tech is basically perfect or it won't be implemented at all.

There's no way, even if the rate is way lower than that of human drivers, that anyone will allow commercially available equipment that might accidentally kill someone. It may be hard to imagine now, but either the tech is perfected or it's shelved. Sure, there will be accidents, but they would almost always end up being the fault of an external factor (and the driverless tech could prove it so). Any flaw in the tech would immediately ground the fleet until it was fixed, and would likely be caught long before through extensive testing. (And yes, nothing is "perfect", but actual accidents would be so rare that the mindset would be different; think one accident across the US every few years.)

Keep in mind- we already rely on computers for our lives. Think driverless trains, autopilot on planes, software used to analyze structures that we walk/drive over/under, and even the electronics within cars already that control ignition timing, anti-lock brakes, etc. We don't have the moral questions we do with driverless cars, because we assume these things to be "ready" using the definition above. Driverless cars will be the same, or they won't be a thing at all.

Edit: Just to be clear, I do mean to imply this will be hard. I wouldn't be surprised if the world continues to test driverless cars for 20 years or more before they are "ready". My whole point is that driverless tech has to prove to regulators that it is ready, however long that takes, even if it never happens at all.
 
I call shenanigans.

If you scale the accident rate for autonomous versus human-chauffeured miles driven, I posit you'll *still* find a statistically significant difference in favor of the autonomous vehicles.
Probably, but nobody here has taken the effort to do that, and I wouldn't say it's obvious without seeing data.
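For what it's worth, the per-mile comparison is simple arithmetic once you have mileage figures. The inputs below are illustrative only: the human figure roughly matches the often-cited US average of about 1.2 deaths per 100 million vehicle-miles, while the autonomous mileage is a pure placeholder. That placeholder is exactly why the answer isn't obvious without real data:

```python
def deaths_per_100m_miles(deaths, miles):
    """Normalize a death count to the standard per-100M-miles rate."""
    return deaths / miles * 100_000_000

# Illustrative inputs only -- not actual fleet data:
human = deaths_per_100m_miles(37_000, 3.2e12)       # ~1.16
autonomous = deaths_per_100m_miles(1, 10_000_000)   # 10.0
```

With a placeholder of ten million autonomous test miles, one death yields a *higher* rate than the human baseline, which is why the raw counts alone settle nothing either way.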
It’s about public safety. Driving will only be permitted in specific places, but not on public roads. Time will tell.
There are already millions of human-operated cars in existence that will all function for a very long time. They're going to be grandfathered in at the very least. I don't see a problem anyway. Those who love driving enough to stick with those cars even after they're technically inferior are the best drivers. Most people would opt for the self-driving cars.
 
The car can't "see" -- that's the biggest problem with them. They rely on some sensory data, but primarily on photographic information (pixels) and software to try to figure out what those pixels mean.

Getting vehicles to "see" the world accurately, including all of the countless things a vehicle could encounter, is the whole challenge. If it can "see", telling it what to do is straightforward.

You need to do some homework. In addition to optical sensors that are better than our eyes (full resolution over the entire scene, no need to shift focus, no competition for processing resources, etc.) autonomous cars also use RADAR and LIDAR, giving them the ability to track objects in the scene with centimeter accuracy.

The algorithms that process the torrent of data, and the hardware on which they run, are improving at a very rapid pace. Humans are not improving at all. It's possible that, absent any evolutionary selection pressure for superior hand/eye coordination, rapid reaction time and situational awareness (all advantageous in a predatory environment) we'll actually evolve to be less capable drivers over time.

I don't know how long it will take to happen, but humans driving cars will be but a blip in human history, far shorter than humans riding horses.
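On the sensor point: one reason the camera/radar/lidar combination matters is that independent range estimates can be fused into a single estimate more precise than any one sensor alone. A minimal inverse-variance-weighting sketch, with noise figures invented for illustration:

```python
def fuse(estimates):
    """Fuse independent (range_m, std_dev_m) estimates by
    inverse-variance weighting. The fused standard deviation is
    smaller than that of any single sensor."""
    weights = [1 / s ** 2 for _, s in estimates]
    total = sum(weights)
    fused = sum(r * w for (r, _), w in zip(estimates, weights)) / total
    return fused, (1 / total) ** 0.5

# Camera is noisy at range, lidar is precise, radar in between:
fuse([(40.0, 2.0), (39.5, 0.05), (39.8, 0.5)])
```

The precise lidar reading dominates the result, but the other sensors still tighten it slightly, and they keep working when the lidar is blinded or occluded.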
 
^^^ Agreed. Without enough details it's hard to know what happened, but in the case of Tesla, with sufficient distance the radar is able to detect the obstacle (car, human, bicycle, etc.) and stop safely.

- see video of brave human tester

Tesla fanboys are extremely deluded. I bet they would sue the company over their "tryout" if something happened.
 
Probably, but nobody here has taken the effort to do that, and I wouldn't say it's obvious without seeing data.
There are already millions of human-operated cars in existence that will all function for a very long time. They're going to be grandfathered in at the very least. I don't see a problem anyway. Those who love driving enough to stick with those cars even after they're technically inferior are the best drivers.
I said by the time he was 75. This won’t happen overnight, clearly.
 
Very sad. I have to wonder why the human behind the wheel failed to take over in this situation though. That's the point of testing with a human behind the wheel isn't it?

You just can't take over when you need to react in a fraction of a second. Everybody seems to fail to understand this.
 
Completely agree. Hopefully this terrible loss serves as a lesson, both for the self-driving car industry and for the general public. It would be far more tragic if this death ended up killing self-driving vehicles. There are going to be more situations like this, in which the sheer physics of stopping a vehicle before it hits a person, animal, or other moving object cannot be overcome, even with the most sophisticated software and safety features.
Here's the learning: Self driving cars are inherently dangerous.
 
Depending on how old you are, you may eventually have no choice.
No choice of what? Do you consider being in a self-driving vehicle the same as driving, rather than being in a taxi? What all of you ignore is that it's true you'll have no choice, but not because of age; because of government: human driving will eventually be impossible if this goes on. So the only hope is to stop it. And, mind you, when I reach 80, my wish is not to be in a self-driving car connected to Tim's or Google's money-making machinery through data recording, but to be in a car driven by a person.
 
You just can't take over when you need to react in a fraction of a second. Everybody seems to fail to understand this.

I imagine that it will become clear through the investigation whether a) the autonomous vehicle correctly detected the pedestrian b) whether it reacted correctly c) whether it ever had enough time to react and d) how much time the human driver had to react.
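Points (c) and (d) reduce to simple kinematics once the investigation fixes the detection distance. A sketch, with illustrative figures only, none taken from the actual incident report:

```python
def time_to_impact_s(distance_m, speed_kmh):
    """Seconds until impact if the vehicle does not slow down."""
    return distance_m / (speed_kmh / 3.6)

# If the pedestrian became visible only ~25 m ahead at ~61 km/h:
tti = time_to_impact_s(25, 61)  # ~1.5 s
# A typical human perception-reaction time is around 1.5 s, so under
# these assumed numbers the safety driver would have had essentially
# no time to intervene.
```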
 
Not sure if you're agreeing with me, or if you're confusing deaths with "death rate"... The reason this is getting outsized attention is because of the rate (annual deaths per vehicle count) and because these are test vehicles for an unproven technology.

Sure, this may be a statistically insignificant sample, and in 10 years we may realize that the rate is lower than it appears today, but saying nobody cares about people killed by human drivers is just deflection.

This is an important story for any number of reasons, but foremost among those reasons is that someone died during corporate product testing while the safety of those products is in dispute.
Well, in terms of death rate, I neither agree nor disagree with you. There is a sum total of one death involving a fully autonomous vehicle (and we don't know the full circumstances of that death). For you to infer a "death rate" is so presumptuous that it can't be taken seriously. There was bound to be a death at some point, but if it had happened after double the number of miles driven, would you calculate the rate to be half of what it currently is? You can't do statistics with a sample size of one.

In any event, the death did occur, and we can learn from it ("we" meaning those who have access to the full details), but we won't learn anything from the "death rate".
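To put a number on the sample-size point: with one observed event, an exact Poisson 95% confidence interval on the expected count runs from about 0.025 to 5.57, a roughly 200-fold range, so any "death rate" inferred from it is almost unconstrained. A pure-Python sketch of that interval:

```python
import math

def poisson_interval(k, alpha=0.05):
    """Exact (Garwood) confidence interval for a Poisson mean after
    observing k events, found by bisection on the tail probabilities."""
    def cdf(n, lam):
        # P(X <= n) for X ~ Poisson(lam)
        return sum(math.exp(-lam) * lam ** i / math.factorial(i)
                   for i in range(n + 1))

    def bisect(f, lo, hi):
        # f is decreasing in lam; find its root in [lo, hi]
        for _ in range(100):
            mid = (lo + hi) / 2
            if f(mid) > 0:
                lo = mid
            else:
                hi = mid
        return lo

    # Lower: P(X >= k | lam) = alpha/2; upper: P(X <= k | lam) = alpha/2
    lower = (bisect(lambda l: alpha / 2 - (1 - cdf(k - 1, l)), 0.0, 50.0)
             if k > 0 else 0.0)
    upper = bisect(lambda l: cdf(k, l) - alpha / 2, 0.0, 50.0)
    return lower, upper

poisson_interval(1)  # roughly (0.025, 5.57)
```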

The point that I was making was that at this point of time, the average pedestrian, or even the average person, has virtually zero exposure to autonomous automobiles. You should of course avoid stepping out into the street into oncoming traffic. But it's not because of autonomous traffic. The real danger to you right now is human drivers. And arguably, by the time the exposure rate to autonomous traffic becomes significant, and the number of incidents like this grow to something that is statistically significant, the autonomous traffic will be far more sophisticated, and safer than it is now. Human-driven traffic may also be safer, thanks to some of the autonomous technology finding its way into human-driven vehicles (human driven, but autonomous braking when needed), but it's highly unlikely that humans themselves will be safer.
 
This all sounds great to me. Motorists have this idea in their heads that it's their god-given right to have acres of city space devoted to storing their cars, and that every road has to be optimized for as much car traffic as possible to move as quickly as possible -- safety and environment be damned. And more and more cities are realizing it doesn't have to be that way. Not every city has to be a sea of cars with one or two people in each one.

Besides, every one of those buses you're complaining about moves enough people for several blocks' worth of car traffic, so one would think you'd be on board with this idea.

I am, until flu season. Or the guy with five bags starts changing clothes. Or someone asks me about my relationship with Jesus. Or someone wants to share their passion for their music.

Or the cyclists who want to be seen as cars when it serves them, but pedestrians when that’s better. Or when it is raining and the bike takes up the space of five people and so some people can’t get on because there are so many bikes. Or when it does get crowded and someone gets mad because people touched his bike. Or when they want to get their bike off the bus before everyone else get off.

I can’t change some people, but I can avoid them.
 
Meanwhile (in the US alone) 15 pedestrians will be killed today by negligent human drivers. 15 more will die tomorrow, 15 died yesterday, and 15 die every day. Why does no one care about that?

Why do you assume nobody cares about that? Of course people care. The ratio of deaths from self driving cars, though, to cars with drivers is phenomenal, if you must know.
 
It’s about public safety. Driving will only be permitted in specific places, but not on public roads. Time will tell.
Yes, a self-driving car can identify children playing and slow down.

https://www.ted.com/talks/joseph_re...o_recognize_objects_instantly/up-next#t-27087

I see signs on houses stating “drive like your kids live here”. Clearly, humans speed in places with children.
Let’s hope it’s not my lifetime.
 
Cowboys. This is not like other software that you just throw out into the wild.
 
I'm pretty sure there isn't a single person posting here who hasn't crossed outside of a crosswalk a few times in their lives.
You're probably right. Darwinism is concerned with macro trends, not anecdotal ones. Stepping off the crosswalk doesn't automatically doom you, and staying on the zebra stripes doesn't mean you'll always be safe. In fact, risk-takers can sometimes end up being the "fittest" who survive. Or not. It's the end result that matters. The traits that survive are the "fittest", with no moral value assigned to those traits.

It's just safer to cross at designated crosswalks, and those who routinely jaywalk are presumably more likely to get hit.
 
That's going to make Top Gear (The Grand Tour) a very boring show.
Top Gear is still a show; it's got new presenters and is getting to be as good as it was with the old ones. (I actually think this season of Top Gear is better than The Grand Tour season 2.)
 
Here's the learning: Self driving cars are inherently dangerous.

That's not the learning - not even close.

Even today's cars with pedestrian detection and automated braking aren't going to work 100% of the time, but that doesn't mean they're not worth having and continuing to develop. Human drivers are clearly challenged on multiple levels and are the cause of ~99% of accidents and deaths on the road today. The only way we're going to reduce the number of deaths is to either eliminate vehicles altogether or shift some of the burden away from humans.
 