This is utter nonsense. People are incredibly inadequate drivers on average, and by definition half of them are even worse than that. The Tesla crash wasn't a case of bad tech: it was bad judgement and idiotic behavior.
That may all be true, but what exactly is the point of a self-driving car if I still have to pay just as much attention as if I was driving myself?

Also, say what you will, but I think if the autopilot cannot distinguish a truck from an overhead sign, that is "bad tech", in this case insufficient sensors. There is a reason why real autonomous cars, like the ones Google has been testing for years, have lidars. And even in simple cases the system often doesn't react very well, as in this example:

 
You keep saying that, but what exactly is so superior about this system?
Even Tesla's current semi-autonomous driving in the $100K Model S and X isn't mature, as the recent incidents show. If you think the Model 3, where they will have to cut a lot of corners to get the cost down, will debut with fully autonomous driving requiring no driver intervention, you'll be just as disappointed as the customers who believed their Model X would be delivered in early 2014, as originally promised by Elon Musk.
Well, I'm no Mercedes fan, but on their web page it looks like the 2017 model is already out? That's a bit further than just "on the roadmap". :p
Thanks for showing your true colors. You're here to inflame and solely to criticise and diminish Tesla's successes to date.

Tesla keeps its promises, even if they sometimes run over time; they don't cut corners. Tesla is leading an industry that is kicking and screaming for Tesla to stop all the disruption.

Mercedes is playing a frantic catch-up game, just like the rest of the industry, and doesn't have even a tenth of the technology Tesla includes.

Your comments are nowhere near accurate or credible. How embarrassing for you.
 
I'm sorry, but given the way Apple has routinely, and in some cases sadistically, beta tested its OS for more than a decade on new adoptees and released product before it was ready, you couldn't pay me a million bucks to climb into an Apple car of ANY kind. Until they start bettering that atrocious record, especially among media people (who've had sound and video editing disasters with almost every new iteration of the OS, and many of whom are STILL on SNOW LEOPARD, believe it or not), they had best think twice before getting into the car biz. This is NOT going to be their "next big thing" savior from the toymaking crash. How about making better state-of-the-art computers and making sure they and the OS work faster and better than anything else? Yeah, there's a concept.

I was right about the toymaking crash as cheaper and sometimes, even better options came along. They'd best be paying attention.

Besides, tube transport is the real way of the future. Not this interim half-measure of automated cars that give some semblance of autonomy. Until you crash.

:apple:
 
I am one to believe there is a middle ground here.

The autonomous vehicle technology has great potential to reduce fatalities due to incompetent and irresponsible driving, as well as be able to serve those who have lost the ability to do so (blind/disabled, etc).

However, I for one would probably NEVER use it, insofar as I am able to drive. This is not like flying, where the risk of collision is much, much less, given current flight-plan and air traffic control standards. And that is the key word: standards. Autonomous vehicle implementations have to be a government-regulated and overseen standard in all vehicles for this to work safely.

Driving is dangerous. You bet your life every time you do so.

It is also a 360-degree operation. Can current technology avoid an oncoming vehicle from a standstill, for example? I have dodged many accidents by moving out of the way of cars/trucks whose drivers just didn't see me or were not paying attention.

So the issue is control. I will trust my own judgement and awareness over a computer's algorithms any day. Hell, I own (and prefer) manual transmission vehicles. I enjoy driving.

So who knows? Maybe we'll get to the point like in the I, Robot movie where we can choose to go into autopilot mode at will.

I am just not willing to put my life (or the lives of my loved ones) on the line to get there.
 
Google said early on that they found people learn to trust the self-driving features far too much, and that you can't have a car drive itself perfectly for weeks/months/years and expect the driver to be fully alert and ready to take over at any time.

For that reason, they said they do not consider the technology ready until the human driver does not ever need to take over.

I have always thought that Tesla rolling out their Autopilot features was dangerous for exactly the reasons Google said. It works well enough that a human driver will not be prepared to take over if necessary.

Edit:

I did some research to try to find the blog post where Google says they don't believe Level 3 autonomy is safe (Level 3 means the human driver has to be ready to take over). Every other major car company has apparently come to the same conclusion: Level 3 gives a false sense of safety, and it isn't realistic to expect a human driver to be attentive enough to take over in an emergency.

Tesla claims Auto-pilot is Level 2, but most others think it falls under Level 3.

http://www.theverge.com/2016/4/27/11518826/volvo-tesla-autopilot-autonomous-self-driving-car

Seems like an obvious but very important observation. I'm probably gonna get a lot of hate for posting this, but to be entirely honest, I usually drive best behind a manual transmission. I sort of hate automatics. Slapstick (manumatic) shifting always feels like a mess, so I don't even bother. I'm sad to see manuals becoming increasingly niche, and I agree with the curmudgeons: everyone should have to learn how to drive a manual transmission vehicle.

Side Thought: I wonder what driver error/crash statistics would look like if manual transmissions were the norm today (in the USA).
 
That may all be true, but what exactly is the point of a self-driving car if I still have to pay just as much attention as if I was driving myself?

Also, say what you will, but I think if the autopilot cannot distinguish a truck from an overhead sign, that is "bad tech", in this case insufficient sensors. There is a reason why real autonomous cars, like the ones Google has been testing for years, have lidars. And even in simple cases the system often doesn't react very well, as in this example:

The point is you are comparing a 90%-accurate system with one half as accurate, by asking the less accurate system how well it thinks it does.

We already know two things: people suck at driving; people are hard wired to overestimate themselves.

The systems are an improvement and have already caused a decrease in lethal accidents.

You're advocating halting those advances because they cause people to trust the autopilot "too much". I think you trust yourself too much, and you're not alone; we are hardwired that way.

The only problem is we can't get an update for that.
 
Manual-transmission drivers understand the engine and torque limitations of their vehicles better. Electric drivers have neither a transmission nor those limitations: it's a whole different ballgame, and your driving habits will change because of it.

I could find and post thousands of clips here showing situations that would have been preventable with semi-autonomous systems.
 
The point is you are comparing a 90%-accurate system with one half as accurate, by asking the less accurate system how well it thinks it does.

We already know two things: people suck at driving; people are hard wired to overestimate themselves.
But wouldn't that speak in favor of systems that only intervene in clear emergency situations (like the emergency braking systems), rather than pseudo-autonomous systems that give the driver the illusion that they don't need to pay as much attention?
The systems are an improvement and have already caused a decrease in lethal accidents.
I think that still has to be proven. I know that Tesla cites the number of accident-free miles driven by their cars in autopilot mode, but there is very likely a significant selection bias. For example, a large portion of car accidents are caused by young, inexperienced drivers, but Tesla drivers likely have a higher average age than the general driving population simply due to the high price of the vehicles. There aren't enough samples yet to really determine what effect these semi-autonomous systems have.
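
To make the selection-bias point concrete, here's a toy back-of-the-envelope simulation (every rate below is invented purely for illustration; none are real statistics):

```python
# Toy illustration of selection bias in per-mile fatality comparisons.
# Every number here is invented for the example; none are real statistics.

fleet_rate   = 1 / 90e6    # hypothetical fleet-wide fatality rate per mile
highway_rate = 1 / 180e6   # hypothetical rate on divided highways (safer per mile)

# Suppose the autopilot is engaged almost exclusively on highways and,
# in this toy model, adds zero safety benefit of its own:
autopilot_miles = 130e6

deaths_autopilot = autopilot_miles * highway_rate  # ~0.72 expected deaths
deaths_fleet_avg = autopilot_miles * fleet_rate    # ~1.44 expected deaths

print(f"Expected deaths (autopilot, highway miles only): {deaths_autopilot:.2f}")
print(f"Expected deaths (same miles at fleet average):   {deaths_fleet_avg:.2f}")

# Comparing the first number against the fleet average makes the system look
# twice as safe even though it contributed nothing here; the difference comes
# entirely from where (and by whom) those miles were driven.
```
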
You're advocating halting those advances because they cause people to trust the autopilot "too much".
I'm actually more in favor of waiting until fully autonomous systems are mature, rather than unleashing the current half-baked solutions on the public.
 
In my experience, the semi-autonomous modes let me relax my concentration, making driving less straining. The constant reporting of cars around me (including cars I could not see myself, something that made me very humble about my own abilities) gave me better options for assessing the potential dangers ahead.

I think paying constant attention deteriorates the concentration. We're not equipped to do that.

I agree with you on the numbers: statistics are hard to interpret. I disagree on the young-driver stats for similar reasons; I suspect older drivers (at least in Europe) pose more danger. But the numbers should be worked out.

The systems we're talking about do two things, by the way: they alert and they prevent collisions (helping you brake when speed is too high, braking when danger is imminent, alerting you when switching lanes if there is someone there, or even someone accelerating into that spot!).

I think driver distraction caused by overestimation of the system is a problem that needs to be addressed without throwing the baby out with the bathwater. The risk there is that we really do overestimate ourselves. The problem with that risk is that overestimation is beneficial to our own mental health.

Google's stance on this scares me. It has an aura of taking away our capabilities, even if there's a possibility we'll actually improve ourselves. Makes me feel like I'm part of a sheep herd.
 
In my experience, the semi-autonomous modes let me relax my concentration, making driving less straining.
That's just the thing: With a semi-autonomous system, you can't really relax. Failure to pay constant attention is what led to the Tesla autopilot accidents.
The constant reporting of cars around me (including cars I could not see myself, something that made me very humble about my own abilities) gave me better options for assessing the potential dangers ahead.
Many non-autonomous cars have these kinds of sensors (blind-spot detection, cross-traffic detection, IR cameras in the dark, etc.).
The systems we're talking about do two things, by the way: they alert and they prevent collisions (helping you brake when speed is too high, braking when danger is imminent, alerting you when switching lanes if there is someone there, or even someone accelerating into that spot!).
Those are safety systems. My current car has them too (well, not the "accelerating into a spot" thing). But autonomous driving is a different animal.
 
Again, you're just basing this on the assumption that your self-assessment is right.

People cannot concentrate for prolonged periods of time. You can't either. If your concentration can relax (obviously not to the point where you start to read a book), you have better focus when it matters.
 
Yep. There's nothing like relying on good old human senses. Those only screw up enough to cause 6,400,000 accidents and kill 30,000 people a year with mature technology. Three accidents and a single fatality by early versions of 2 totally different systems definitely closes the book on all this technology.

Back to horses and buggies everyone!

Well, there are only a few of these things on the road, too (compared to orders of magnitude more normal cars with real human drivers). And all but a few either a) are still being driven by responsible drivers (i.e., ones not wrongly concluding their Teslas have a real 'autopilot') or b) are driving at 20 mph max, like a drunken kid on the first day of driver's ed class.

Have you not watched any of the YouTube videos of Tesla drivers trying out 'autopilot'?

I have absolutely no doubt that the next decade will prove this statement to be inaccurate.

It's not a quantitative thing, it's a qualitative one. It will, no doubt improve... but in the grand scheme of things, that's not saying much. The only way it's going to work well is if *every* vehicle is computer-controlled, and they all communicate, and the sensors used are *WAY* better than the current ones.
 
Autopilot did not cause that accident. It was caused by a trailer making a wrong turn, low-light conditions, a distracted driver, and bad luck. It doesn't stand out statistically by any means, isn't special, and isn't pinpointed as having been caused by Autopilot or Tesla (as in the car making it happen: changing direction, speeding up, allowing high speed on auto, etc.).

The only reason we're talking about this is that all current drivers of automobiles think they can judge the autonomous systems against themselves. It's ridiculous. We continuously switch attention and judge cars on sight: we wrongly guess speeds, can't predict, can't keep track of more than two vehicles, misjudge time itself, and have a response time about a hundred times slower than a computer's. On top of that, the mind will, in an emergency, simply decide for you what you will see or not, and afterwards play you a movie of what just happened. You'll see the movie as truth, and you will fabricate memories of your own actions to improve your behavior and fend off guilt.
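
For a rough sense of scale (assuming a typical human reaction time of about 1.5 seconds versus tens of milliseconds for a computer): at 70 mph a car covers roughly 103 feet per second, so the human reaction alone eats up about 150 feet before anything happens, while a 15-millisecond response covers about a foot and a half.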

So yeah; people are a problem in any case. But the argument against autopilot (people can't handle the responsibility) is really a case against driving. And if you ask me, we should either stop driving and prevent thousands of deaths, or keep driving and progress towards something better. The steps in between... well, they will be something to adjust to. We can do it; we're tool builders. Learn to use the new tools.

Do you know what's really wrong with the whole driver-responsibility discussion? There's no mental-stability test before a driving test. There are no critical-thinking classes required. No Psychology 101... and there won't be, because we can't handle it; we just break down when it comes to our shortcomings and limits.

Whoa, there's the rant! I was going to delete it, but meh. Let's see what happens.
 
Let's investigate the victim? Sure, under-ride guards should be a requirement. But no, the failure is in Tesla's system, unfortunately. There are always going to be cases where vehicles and other obstructions cause issues on the road. The onus will always need to be on the self-driving car to identify all threats and respond appropriately without harm. Avoiding harm to the occupants and to those outside the vehicle must always be the paramount responsibility of the autonomous vehicle. The bar is incredibly high, and it will be possible for Tesla to reach and rise above it; I'm confident of this.

Here's the catch: It's not a self-driving car. It doesn't make any claims to be a self-driving car. You get told that it _helps_ you drive the car. You are still the driver. The driver almost killed himself one week earlier. It's a miracle that he lived that long. He may get a Darwin Award for removing himself from the gene pool.
 
Hahaha, I'll bite. ;)

"we should either stop driving and prevent 1000's of deaths; or keep driving and progress towards something better."

I believe that driving is a necessary skill, and like any other skill, it requires cultivation. The problem is not that people are driving. The problem is that training is non-existent. The burden of it is on the driver. There is no real validation that people are trained to drive, other than a license you get in one day.

An aircraft pilot needs to be certified as such, meaning he has to demonstrate, by an accredited organization, that he has the required education, training, and hours logged. He can't just show up, take a test, and fly away.

By comparison, the roads are the wild west, and the education system currently all but ignores this very basic need for training. I had to study and test for many subjects in school I'll NEVER use. But EVERYONE should know how to properly operate a vehicle, obey road signs, and develop a 360-degree sense of vehicular awareness.

We now want to obviate this skill altogether, and let a machine operate a vehicle.

I feel this is INSANE, unless ALL vehicles are part of a coherent system and use standards regulated by a single organization. And even so, given the inherent danger in driving, just letting a machine do it is crazy. What happens when that machine fails (like machines always do) at 70 MPH? Does the person in the vehicle have the knowledge to deal with the situation?

Therefore, we lose drivers (and innocent passengers and bystanders) not because of human capacity, but because of (a lack of) human ability.

Training is the only proper solution here.

Self-driving cars will solve NOTHING, unless we want to give away the freedom that we have when on the road.

I, for one, agree with the sentiment:

"Those who are willing to give up freedom for obtaining security deserve neither freedom nor security."
 
I'm not in the US. Over here there's significant training involved in getting a license. Manual transmission is the norm.

Having automated systems assist and take over parts of driving is not in any way linked to drivers losing ability.

While I think nearly all drivers overestimate themselves, I also think people are better than what you project them to be.

We let 18-year-olds drive explosion-driven gas containers. Of course ability is needed. Training will be needed. Driving is still a skill.

We're going in circles, so I'm quitting the discussion. I do see your points and I do disagree; that's fine by me. My experience in the Autopiloted car humbled me and made me see the benefits. I hope you'll at least try it (if you haven't already).
 
In case you really didn't quit...

I agree that there would be benefits to automation technology. That said, I would never use the capability in a fire-and-forget scenario unless I had to (example: I'm incapacitated, and can say "car, take me home").

I do not know where you're located, but in most places I've been around the world (I'm ex-military), driving is generally far more severe and aggressive than here in CONUS (the continental USA), so I can see your bias for automation rather than against it.

But my contention is that a fully autopiloted car, without regulation and standards similar to (though way, way more stringent than) what the international community does for air traffic control, is simply too dangerous. I believe it's an all-or-nothing deal: either every car has the tech, or the system is smart enough to react to any situation that places the vehicle and its occupants in danger (like avoiding an oncoming vehicle even from a standstill). I don't see that happening without transponders or sensors built into all vehicles, so that they all communicate with each other constantly.

That is not to say that it can't be done, but that we need it done everywhere.

So, if we can implement the automation without having to relinquish control of our vehicles (there must be an OFF button so the car becomes manual), then I'm all for it.

Note: I LOVE and drive a manual daily, and given the choice, I would not drive anything else. :)
 
:) Manual cars are way cooler than automatics. But in an electric car, a transmission wouldn't make any sense ;-)

PS: I tried to quit.
 
There's nothing like relying on good old human senses. Those only screw up enough to cause 6,400,000 accidents and kill 30,000 people a year with mature technology. Three accidents and a single fatality by early versions of 2 totally different systems definitely closes the book on all this technology.
 
I'm sure competitors are behind making this a mountain out of a molehill but reality is Tesla and Google autonomous vehicles are far safer than idiotic human drivers.
 
The simple point is that the car was not made to drive itself unattended. The driver was supposed to remain aware and active.

That's like suing Chevy or Ford because cruise control didn't slow down or stop when the light turned red or someone stopped in front of you.
 
Please quit comparing this to cruise control. Cruise control allows a driver to be more focused on the road because they no longer have to keep looking down to check their speed and make sure they are obeying the law.

We aren't allowed to watch movies in the front seat because it can distract the driver. We aren't allowed to text or talk on the phone while driving because it can be distracting, etc. This feature increases the chance that the driver will become distracted, and I think it is dangerous for Tesla to be offering it to anyone who wants it at this time. I don't care what kind of warnings you put on it.

I appreciate what they are trying to do, but it should really only be in the hands of professional drivers at this time. I'm really surprised the government hasn't gotten involved and made it so the system has to pass some sort of NHTSA certification before it can be offered to consumers.
 
Usually when you write an article such as this, you do complete research on the topic, especially when you include something a bit defamatory, such as the slam on Tesla. The National Transportation Safety Board determined the accident to be driver error, not the Tesla's fault.

"Joshua Brown, the Tesla driver killed last year while using the semi-autonomous Autopilot mode on his Model S, received several visual and audio warnings to take control of the vehicle before he was killed, according to a report from the National Transportation Safety Board. Despite the warnings, Brown kept his hands off the wheel before colliding with a truck on a Florida highway."
 
Driverless cars do a minuscule fraction of miles compared to driven cars, and always on carefully selected or even private roads.

...or in the case of Tesla's "autopilot" they're used as super-cruise-control for long-distance driving on freeways/motorways which is about the lowest-risk (certainly per mile) form of driving.

I support the self-driving initiative but something is a little worrying and hasty about Tesla's autopilot rollout. Self-drive should either totally work, or totally not.

This. Self-driving isn't ready until it's ready. As soon as you allow drivers to take their hands off the steering wheel, you have to assume that they're going to get out their phone and start updating their Facebook. Because, human beings. Heck, far too many people think they can do that with a regular car!

Yes, it's irrational that self-driving cars are going to be held to stricter safety standards than the safest of human drivers, when simply being better than the typical driver would save lives overall - but when did we start expecting the world to be rational? I have a lot of respect for the way Tesla are pushing the idea of EVs in general, but rolling out "beta" self-driving features to the masses and hiding behind "well, the car did warn them to keep their hands on the wheel" is risking a backlash.

The other backlash will come if autonomous vehicles are "safe" because they drive everywhere at 5mph below the posted speed limit, never pull out until there are no oncoming vehicles in sight and assume that every empty cigarette packet blowing across the road might be a cunningly disguised child.

I'm sure that self-driving cars will be with us eventually, but I suspect that they will be "nearly ready" for a long, long time - or kicked into the long grass the first time one takes out a school bus (regardless of whether or not the driver qualified for a Darwin award).

IMO the "system" that failed wasn't Tesla's, it was the one that allowed the highway-entering truck that cut him off to drive without under-ride guards.

Under-ride guards might be lifesavers if you hit them while braking from 30mph, but this guy drove into the side of the truck at 74mph without even attempting to brake. Even with guards, that's not going to end well.
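
For rough scale (assuming impact energy grows with the square of speed, as in E = ½mv²): (74/30)² ≈ 6, so that car hit the trailer with roughly six times the energy of a 30 mph impact, far more than a guard built around the lower speed could be expected to absorb.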
 
It was caused by a trailer making a wrong turn, low-light conditions, a distracted driver, and bad luck. It doesn't stand out statistically by any means ...

Yes, but things like that happen. If you've driven for many years, you've encountered dozens of them and (if you're posting here) survived.

But, here's the thing... I'm not a statistic. I don't drink and drive. I don't text and drive. I've done some SCCA racing years back, and am relatively good at controlling a car, even in emergency handling. I pay pretty close attention to things happening around me in traffic.

So, I don't think it's really fair to trade away the advantages I've gained (and others like me) over the 'average statistic' (which keeps rising due to increasing negligence and stupidity) in order to save some lives on the whole. A better solution would be to get the idiots off the road. (And move towards AI assists instead of full autonomy.)

Here's the catch: It's not a self-driving car. It doesn't make any claims to be a self-driving car.

Fair enough, though it is called 'Autopilot' and it seems people aren't reading the disclaimers (do they ever?). Maybe a different name, or some educational publicity push, would help. (Which I'm guessing these companies don't want to do, because they are hard-selling the autonomous future to the public right now.)

Yes, it's irrational that self-driving cars are going to be held to stricter safety standards than the safest of human drivers, when simply being better than the typical driver would save lives overall - but when did we start expecting the world to be rational?

If it's lives we're really concerned with (I don't think it is), then there is a LOT we could already do. Car manufacturers and safety standards have really stepped up to the plate. Driver training and law enforcement, on the other hand, haven't.

We give people a minimal amount of training in their mid-teens, and then they are deemed good to go until near-death. And, when I say minimal training, I mean scarily minimal!

Then, many police departments seem to focus on very simplistic stuff like speed-traps, rather than reckless or careless driving. We've only recently begun to take drunk or impaired driving slightly seriously.

And, even if AI gets pretty good, I don't think it's going to solve the problems unless it's 100% AI. I don't want to go there, and I don't think it's realistic to think it will go there any time soon short of draconian policy.

The other backlash will come if autonomous vehicles are "safe" because they drive everywhere at 5mph below the posted speed limit, never pull out until there are no oncoming vehicles in sight and assume that every empty cigarette packet blowing across the road might be a cunningly disguised child.

I'm sure that self-driving cars will be with us eventually, but I suspect that they will be "nearly ready" for a long, long time - or kicked into the long grass the first time one takes out a school bus (regardless of whether or not the driver qualified for a Darwin award).

Exactly. AI is artificial. Programmers need to account for every situation (impossible), OR the system needs to build a big enough database (decision tree) from real-world data collection and correction to account for everything (more possible, but it will take a long time and never be perfect).

How many 'learning experiences' will society put up with in trade for the (dream?) of true autonomous vehicle systems? How many of us really want that anyway? And are you willing to become simply a statistic? (Clearly we are in some cases, air travel for example... but there the safety is pretty darn high, we're not doing it every day, and the benefit tradeoff is huge. I don't see that with cars.)
 