But there would be fewer vehicles and probably fewer vehicle-hours in total on the road. Why own a car if you can get one by using an app on your mobile phone?
You clearly don't have kids with car seats, or ride in cabs that smell like cherry feces.

I want my own car. Many others will too.
 
  • Like
Reactions: tooloud10 and dk001
Your comment is ABSURD. A person DYING is not "one of the best possible scenarios". It's a completely ridiculous thing to say. The best scenario would be for the car to stop/avoid the person. Training can be done with mannequins or other things that don't DIE when the test fails.

I really don't know what is wrong with you people.

That was one of the complaints California had with Uber's testing: how potential accidents and normal driving were handled (article), plus who was in control and test licensing (article). Another reason why Uber went to AZ to test...
o_O

The software should always put human life first. It can still learn from new situations as they occur.
 

Agreed, but do we know it wasn't the human being behind the wheel overriding the automation, say to avoid a crash with another car that the automation would have chosen over hitting the pedestrian? We should wait for the details.
 
I had to evict a tenant in Portland for not paying her rent (a single mother who could afford to send her daughter to St. Mary's). Won a judgment. A year or two later, after I had moved to another city, I got a check. She had stepped out in front of a TriMet bus just as it came to a stop. Pretty sure she had calculated this move, as she was knocked over but not seriously injured. TriMet settled with her out of court, and I got my check as part of the process. I doubt the Uber accident was intentional, but never underestimate what people will do.

So... the takeaway is what, treat every pedestrian who is injured by a vehicle as a suspected criminal?
 
The software should always put human life first. It can still learn from new situations as they occur.

Right, but then the question becomes: whose human life takes priority?

There are many cases where an instant decision has to be made, and one worry is that cars will be programmed to always protect the occupants first. Sometimes that makes sense; sometimes it's not the best choice.

E.g., a kid darts out onto the highway. Many adults would steer, even at high speed, into a pole or wall or another car in order to avoid the child.

E.g., same scenario, but you have your own kids in your car. Obviously you want to preserve them.

What if the kid who darts into traffic is yours? How bad would you feel if your autonomous car killed her instead of taking a chance and crashing you?

Computers can't make instant judgement calls the way a human can.
 
  • Like
Reactions: dk001 and Bigsk8r
So I'm guessing, being a machine, it wouldn't even bother braking if it calculated that there's no way it could possibly have stopped in time to prevent the accident. It also has to take into account the potential dangers that braking hard or swerving pose to other vehicles on the road.

Well, if you look at how health and safety laws stand here, you have to protect your own life first, above all else. I think the laws for self-driving vehicles will be set up the same way: put the welfare of the vehicle occupants first. There's no use swerving dangerously if it could potentially cause more fatalities. You want to reduce loss of life.

Even firefighters don't just walk needlessly into a situation if it puts their own lives or their crew in imminent danger. They have to factor in all the risks and minimise fatalities.
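A toy sketch of what codifying that "occupants first, minimise total loss of life" rule could look like (entirely illustrative; the maneuver names and risk numbers are invented):

```python
# Purely illustrative sketch of an "occupants first, minimise total
# loss of life" rule, as speculated above. Maneuver names and risk
# numbers are invented for the example.

def choose_maneuver(options):
    """Pick the maneuver with the fewest expected casualties,
    breaking ties in favour of protecting the occupants."""
    return min(options, key=lambda o: (o["expected_casualties"],
                                       o["occupant_risk"]))

options = [
    {"name": "brake hard",  "expected_casualties": 0.4, "occupant_risk": 0.1},
    {"name": "swerve left", "expected_casualties": 0.9, "occupant_risk": 0.6},
    {"name": "hold course", "expected_casualties": 1.0, "occupant_risk": 0.0},
]

print(choose_maneuver(options)["name"])  # -> brake hard
```

The hard part, of course, is estimating those casualty numbers in real time; picking the minimum is the easy bit.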
 
Right, but then the question becomes: whose human life takes priority?

That is the $64 million question. o_O
 
There are definitely circumstances we don't know, such as whether there was another car in the left-hand lane preventing a sudden swerve into that lane. We also don't know if there was a car behind the Uber, although that doesn't seem likely.

Doing some quick measurements and math, Herzberg (the pedestrian) could have "materialized" in view giving the Uber's sensors maybe 2-3 seconds at most before she was in the street, assuming she didn't pause to check for oncoming traffic. There is blocking foliage that would have prevented LiDAR detection, and maybe fouled radar detection (of her bike). That time window would have been barely enough for a human to brake, if they'd seen and recognized her immediately.

My sense is that Uber's scanning systems failed to ID Herzberg when she first appeared, and by the time they did (if they did) it was well and truly too late. And again, we're talking 2, maybe 3 seconds. The human operator didn't even notice her until the collision happened.

I also have a feeling that Uber's driving systems may not have a defensive enough bias. The screening foliage should have prompted the Uber to go slightly slower, not speed (reports have it going 43 in a 40 zone). A slightly slower speed and a bias toward caution around that blind spot might have made a difference.
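For what it's worth, a back-of-the-envelope version of that math (all numbers are my own guesses: 43 mph, a ~1.5 s human perception-reaction time, ~7 m/s² of hard braking on dry asphalt):

```python
# Back-of-the-envelope stopping math for the scenario above.
# Assumed (not from any report): 43 mph, 1.5 s human
# perception-reaction time, 7 m/s^2 hard braking on dry asphalt.

MPH_TO_MS = 0.44704

v = 43 * MPH_TO_MS       # ~19.2 m/s
t_react = 1.5            # s, typical human perception-reaction time
a = 7.0                  # m/s^2, hard braking on dry pavement

d_react = v * t_react    # distance covered before braking even starts
d_brake = v**2 / (2*a)   # distance covered while braking
d_total = d_react + d_brake

print(f"reaction distance: {d_react:.0f} m")
print(f"braking distance:  {d_brake:.0f} m")
print(f"total to stop:     {d_total:.0f} m (~{d_total/0.3048:.0f} ft)")
print(f"covered in 2.5 s:  {v*2.5:.0f} m")
```

On those assumptions a human needs roughly 55 m (~180 ft) to stop, while the car covers about 48 m in 2.5 s, which squares with "barely enough time to brake."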
 
  • Like
Reactions: dk001
There are roughly 300 million cars registered in the US and maybe 100 of them are driverless vehicles, so your conclusion that the rates of pedestrian deaths are the same for both is illogical and absurd. That's why no one cares.
I love how his post was liked by over 70 other people who didn't think it through.
 
Pushing a bicycle laden with plastic shopping bags, a woman abruptly walked from a center median into a lane of traffic and was struck by a self-driving Uber operating in autonomous mode.

“The driver said it was like a flash, the person walked out in front of them,” said Sylvia Moir, police chief in Tempe, Ariz., the location for the first pedestrian fatality involving a self-driving car. “His first alert to the collision was the sound of the collision.”
 
Wow... have we sunk this low? Where the ends justify the means? Human life doesn't matter at all as long as we can have technological progress?

That is...
His point is that from bad things happening, many good things will come about.
Imagine if doctors stopped performing or researching a procedure just because it didn't have a 100% positive result.
 
I was expecting people to intentionally troll driverless cars. Something like throwing eggs to obscure the sensors, or making erratic, stupid, sudden movements like jumping in front of a driverless car to see if it would brake, etc.

Kinda curious now to see what really happened that could cause a fatality. Did the car continue driving over the woman, or was it a really old woman who just crumpled when she was hit by the car?

The human drivers behind the wheel are really stupid; they tend to think that because it's a driverless car they can relax and show off that they aren't holding the steering wheel or something. They should implement the auto-drive function so that you have to place both hands on the steering wheel, and once it detects a heartbeat, like those treadmill machines in the gym, it turns on the driverless feature. At least implement that in the testing phase; that way there is no way to blame the machine rather than the human driver when things like this happen.
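For illustration, a minimal sketch of that kind of interlock (the sensor fields, thresholds, and function names are all made up here):

```python
# Hypothetical hands-on-wheel + heartbeat interlock for enabling
# autonomous mode during testing, as proposed above. Field names
# and thresholds are invented for the sketch.

from dataclasses import dataclass

@dataclass
class WheelSensors:
    left_hand_on: bool
    right_hand_on: bool
    heartbeat_bpm: int  # 0 if no pulse detected through the grips

def autonomous_mode_allowed(s: WheelSensors) -> bool:
    """Allow self-driving only while both hands grip the wheel and a
    plausible human pulse is detected (like gym treadmill grips)."""
    hands_on = s.left_hand_on and s.right_hand_on
    pulse_ok = 40 <= s.heartbeat_bpm <= 200
    return hands_on and pulse_ok

# A safety monitor would poll this every cycle and alarm/hand back
# control the moment it returns False.
print(autonomous_mode_allowed(WheelSensors(True, True, 72)))   # True
print(autonomous_mode_allowed(WheelSensors(True, False, 72)))  # False
```

A real system would also need debouncing, sensor-fault detection, and a safe hand-back procedure, but the gating logic itself is that simple.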
 
I really don’t understand who wants these things. Personally I love driving and hate being a passenger. I guess for long road trips? Having a nap on the way to work?
 
Think about people with disabilities and people who don’t drive. Also driverless cars will one day be safer than human drivers. Humans cause a lot of preventable collisions.
 
An autonomous test vehicle being tested by Uber struck and killed a woman in Tempe, Arizona late Sunday night, marking what appears to be the first pedestrian killed by an autonomous vehicle, reports The New York Times.

The Uber vehicle in question was in an autonomous driving mode with a human safety driver at the wheel, and the woman who was struck was crossing the street outside of a crosswalk, according to local police. No other details on the accident are available at this time.

[Image: One of Apple's autonomous test vehicles]
Uber is cooperating with Tempe police and has suspended all of its self-driving vehicle tests in Tempe, Pittsburgh, San Francisco, and Toronto for the time being. Uber's autonomous vehicles have previously been involved in collisions, as have vehicles from other companies like Tesla, but this is the first pedestrian-related accident that has resulted in a fatality.

This incident will likely have wide-ranging implications for all companies who are testing autonomous vehicles, including Apple, and it could potentially result in more oversight and regulation.

Apple has been testing its autonomous vehicles on public roads in California near its Cupertino headquarters since last year. Apple vehicles, which include a series of Lexus RX450h SUVs equipped with a host of sensors and cameras, have not been involved in any known accidents to date.

To date, most autonomous vehicles in California and Arizona have been using safety drivers behind the wheel who are meant to take over in the event of an emergency, but California in February lifted that rule.

Starting on April 2, companies in California that are testing self-driving vehicles will be able to deploy cars that do not have a driver behind the wheel. Arizona also allows driverless cars to be tested in the state, and Waymo has been testing autonomous driverless minivans in Arizona since November.

Update: Tempe police chief Sylvia Moir told the San Francisco Chronicle that based on a preliminary investigation, it does not appear Uber is at fault in the accident. "It's very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway," she said. She also clarified that the Uber vehicle did not make an attempt to brake.

Moir did say, however, that she will not rule out the potential to file charges against the back-up driver in the vehicle. Tempe police will work with investigators from the National Transportation Safety Board and the National Highway Traffic Safety Administration to further investigate the accident.

Article Link: Self-Driving Uber Car Kills Pedestrian in Arizona, Accident Could Have Implications for Autonomous Vehicle Testing


People who don't follow safety rules put their lives in danger. She took the risk, and this time it cost her her life. This in no way should be the fault of the car, the company, or anyone in the car. The fact that it was a self-driving car should only be a footnote. Is it sad she died? Yes, but it was her own fault. If this had been in a crosswalk and she had been following the rules of the road, then it would be a different story.
 
  • Like
Reactions: Ursadorable
Right, but then the question becomes: whose human life takes priority?
This gets brought up, but I don't think it will be too complicated. Cars will be programmed in a specific manner that the company/government feels is the best-case scenario. I'm sure we'll see laws that dictate whose life takes priority if a car needs to make a decision.

People may disagree with certain outcomes, but unlike with humans, we'll be able to dictate how accidents are handled before they happen.

This is a problem that needs solving, but it's a good thing to be able to have control over.
 
Wow... have we sunk this low? Where the ends justify the means? Human life doesn't matter at all as long as we can have technological progress?

That is...

Yup, humans suck as a species: greedy, cruel, control freaks, you know the rest. One less does not faze me one bit.
 
Your comment is ABSURD. A person DYING is not "one of the best possible scenarios". It's a completely ridiculous thing to say. The best scenario would be for the car to stop/avoid the person. Training can be done with mannequins or other things that don't DIE when the test fails.

I really don't know what is wrong with you people.
My comment was related to the training of these autonomous vehicles. This accident could've involved many people and cars, but it didn't.

Autonomous cars have been avoiding people and objects for years now. That alone is not machine learning, because they're just running the same routines over and over again. They need to be challenged, which is why they are on real roads with cars and people. Unless you've developed some kind of walking, running city full of mannequins and cars that can simulate pedestrian and driving behavior, this is the only way to effectively train autonomous driving. It's unfortunate, but it will save so many more lives in the near future.
 
That is the miss. California came down on Uber and its autonomous program for precisely that reason: lack of defensive capability.

Can we expect everything to be perfect?
Heck, approved safety devices designed to protect us in the event of an accident are in turn killing people.

Not all things can be perfectly compensated for or calculated.

The loss of life is tragic, but we shall learn from it.
 
I think autonomous cars should be held to a much higher standard! It should not have hit the woman, period. A machine can be repaired, but no amount of effort will bring this dead woman back to life. Apparently, it did not even try to stop or evade. Total failure! What if this had been your little child? The technology is not ready, by a long shot. I recommend that when the time comes, these vehicles be limited to restricted-access roads (code word for freeways) only, until they build an extensive driving record, say ten years. City and suburban roads would be off limits. Some would argue that defeats the main reason for having them in the first place. I say, explain it to the dead woman.
 
Can we expect everything to be perfect?

Perfect? No.
That, however, does not excuse deliberately skipping the defined, required actions that others take to lower the risk.
Uber did this. It is also why they are testing in AZ.
 
Apparently, it did not even try to stop or evade.

The video shows that a human probably wouldn't have seen her in time; it was so dark where she crossed.

But yeah, the infrared cameras should've seen her from a hundred feet away, crossing the road. Plenty of time to calculate her movement and slow down or turn.

Something had to have gone wrong in the avoidance code.
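Rough numbers on that (my own assumptions: detection at 100 ft, ~0.5 s of system latency before the brakes bite, 43 mph, ~7 m/s² of braking):

```python
# Rough check: how much could the car have slowed from 100 ft out?
# Assumed (mine, not from the investigation): 43 mph, 0.5 s system
# latency before braking starts, 7 m/s^2 of braking.

from math import sqrt

v0 = 43 * 0.44704        # ~19.2 m/s
d_avail = 100 * 0.3048   # 30.5 m between detection and the pedestrian
latency = 0.5            # s before braking actually begins
a = 7.0                  # m/s^2

d_brake = max(0.0, d_avail - v0 * latency)         # room left to brake in
v_impact = sqrt(max(0.0, v0**2 - 2 * a * d_brake)) # speed at the pedestrian

print(f"impact speed: {v_impact:.1f} m/s (~{v_impact/0.44704:.0f} mph)")
```

Not enough room for a full stop, but roughly 43 mph down to about 20 mph at impact; braking at all should have made a big difference.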
 
  • Like
Reactions: ftaok