Yeah, your ignorance is excused. I didn't mention any race in my comment. I was referring to a comment that, as an English speaker, you would agree wasn't English. I just wanted to know what it meant, that's all. Is that really a race issue?

Oh, now you see how quickly you understand my words, dude.
 
Those of us who have had graphics driver issues crash our computers over and over, are not encouraged to trust these same companies to write 100% crash free drivers for cars.

If they can't launch OS updates for phones without tons of bugs and issues, why should I expect the car OS to be reliable?
 
If they can't launch OS updates for phones without tons of bugs and issues, why should I expect the car OS to be reliable?
Because compared with a phone, a car is a much harder problem, and yet the development teams in this area are several times smaller.
 
The public is stupid.

I don't think most people truly realize the dangers of having millions of slow-response-time, error-prone, likely distracted, potentially intoxicated, potentially angry or upset, potentially aroused, horrible at multitasking human creatures sticking their foot on a pedal to make a several ton weapon rocket forward at 60-70 miles per hour, all just feet or even inches away from a bunch of other idiots in weapons doing the same thing, with everybody HOPING that we all stay inside of the little white lines of paint that we call lanes.

Anybody even a little open-minded and even a little knowledgeable about how dangerous automobiles are when humans are at the wheel should be able to put their fears of not being in total control behind them and should be very excited for a self-driving future.

Be careful who you call stupid. I'd rather have me, and not a machine, make decisions on how to best avoid said humans and human-made (read: error-prone) machines. You want to simply switch the trust from drivers on the road to those who manufacture, test, and install the self-driving mechanisms in the vehicle you now have no control over.

The problem is not human drivers. The problem is a few humans driving badly. Most people on the road are safe drivers. The jackasses are the ones that drive the accident rate up. When you compare the two, it's not even close: the good drivers on the road far, far outnumber the bad.

I will never fly in a fully-autonomous aircraft, I will not ride in a similar car, I would not allow a robot to perform surgery on me, etc. I want a person at the controls, or at least supervising and ready to take over.

Now, if we're talking about assisting a driver, then that's another story.
 
Top Gear is going to get pretty boring.

I think that happened last year.

The number of people who will want to drive within large cities is already dropping. The real problem isn't developing the technology, it's sorting out the laws and defining responsibility. Even if automatic cars are 1000 times safer it still means someone gets hurt every day, and at least a few killed each month. Who's responsible when no person is driving the car? The auto manufacturer? The programmer? The person who stepped in front of the car on the assumption it would stop for him no matter who had the right of way? And who DOES have the right of way? Do they change the laws so that a human always does, at least in some areas? I'm thinking primarily city driving where the speed limit is likely to be low.
 
I think that happened last year.

The number of people who will want to drive within large cities is already dropping. The real problem isn't developing the technology, it's sorting out the laws and defining responsibility. Even if automatic cars are 1000 times safer it still means someone gets hurt every day, and at least a few killed each month. Who's responsible when no person is driving the car? The auto manufacturer? The programmer? The person who stepped in front of the car on the assumption it would stop for him no matter who had the right of way? And who DOES have the right of way? Do they change the laws so that a human always does, at least in some areas? I'm thinking primarily city driving where the speed limit is likely to be low.

I don't really see the problem, as laws have been dealing with similar cases for a long time. The automatic car is just another in a long line of potentially dangerous tools humans use. Even the horse and buggy presents a parallel case, where the horse is commanded by the driver but is of course using its own brain to make the actual decisions, which are occasionally disastrously wrong.

I don't think the question of whether or not to put the horse in jail stopped anyone from making buggies.
 
I don't really see the problem, as laws have been dealing with similar cases for a long time. The automatic car is just another in a long line of potentially dangerous tools humans use. Even the horse and buggy presents a parallel case, where the horse is commanded by the driver but is of course using its own brain to make the actual decisions, which are occasionally disastrously wrong.

I don't think the question of whether or not to put the horse in jail stopped anyone from making buggies.

The difference is that now we are a much more risk averse and litigious society. No car manufacturer wants the bad press and accompanying lawsuits that any autonomous car will inevitably have. And any accident will receive massive coverage, with talking head experts debating whether the technology is ready and who should be blamed.
 
The difference is that now we are a much more risk averse and litigious society. No car manufacturer wants the bad press and accompanying lawsuits that any autonomous car will inevitably have. And any accident will receive massive coverage, with talking head experts debating whether the technology is ready and who should be blamed.

Sure, more litigation is inevitable, but since that increase has been going on since cars were invented I wouldn't expect it to be a barrier. At least not to the big companies who have the resources to deal with it, and do. The blame question is already well in place-- automated driving is just the tool doing a larger part of the job. Every new technology is expected to be scrutinized.

At least when an automated car screws up it will be unusual and interesting news, not like the boring human-driver carnage we've gotten so blasé about.
 
Sure, more litigation is inevitable, but since that increase has been going on since cars were invented I wouldn't expect it to be a barrier. At least not to the big companies who have the resources to deal with it, and do. The blame question is already well in place-- automated driving is just the tool doing a larger part of the job. Every new technology is expected to be scrutinized.

At least when an automated car screws up it will be unusual and interesting news, not like the boring human-driver carnage we've gotten so blasé about.

I am not trying to make it sound like autonomous cars will turn roads into Death Race 2000 spectacles. But unusual and interesting deaths are always hyped by the press. How many people per year are hurt or killed in skydiving accidents or by shark attacks? Yet every time it happens it's at least local news, and if someone dies it probably goes national. Have that death be caused by an AI car and people will immediately jump to conclusions: it's man-made, so it's not an accident. Once again, these cars may be 100 times safer, but the news, and therefore people, will concentrate on when they fail, not on when they succeed.
 
I am not trying to make it sound like autonomous cars will turn roads into Death Race 2000 spectacles. But unusual and interesting deaths are always hyped by the press. How many people per year are hurt or killed in skydiving accidents or by shark attacks? Yet every time it happens it's at least local news, and if someone dies it probably goes national. Have that death be caused by an AI car and people will immediately jump to conclusions: it's man-made, so it's not an accident. Once again, these cars may be 100 times safer, but the news, and therefore people, will concentrate on when they fail, not on when they succeed.

I agree with the first part: I fully expect AI car deaths to be headline news and to cause fear. But it won't stop the manufacturers or reasonable buyers. Consider airlines: another much safer mode of transportation whose failures get headlines, and yet that hasn't stopped those manufacturers or most passengers. AI cars will most likely follow the same path.
 
Consider airlines: another much safer mode of transportation whose failures get headlines, and yet that hasn't stopped those manufacturers or most passengers.

However, you do wonder whether air travel would have got off the ground (see what I did there) if it had tried to start in the litigious, risk-averse early 21st century rather than the more gung-ho mid 20th. Imagine a world in which Orville Wright had sprained his ankle landing and the whole project was shut down for 18 months for the enquiry...

Two big, public, fatal accidents were enough to end the airship as a mainstream form of public transport.

The danger to autonomous cars is that the impatient investors and entrepreneurs will push things too fast. Elon Musk has achieved great things, but the concept of releasing incomplete self-driving technology to the public was not one of them.
 
The driver-sacrifice problem is so overrated. People posit such hypotheticals, which happen once every ten years, if ever. These are unlikely scenarios that are totally dominated by the overall cost or benefit of the mainstream cases.

Not necessarily. If people know for certain that a fixed algorithm in self-driving cars will cause them to sacrifice the driver under particular circumstances, they can engineer that circumstance. Everyone always thinks of the case where it's a group of adorable blond blue-eyed children, but what if it's a couple of hooded terrorists jumping in front of your car?
 
However, you do wonder whether air travel would have got off the ground (see what I did there) if it had tried to start in the litigious, risk-averse early 21st century rather than the more gung-ho mid 20th. Imagine a world in which Orville Wright had sprained his ankle landing and the whole project was shut down for 18 months for the enquiry...

Two big, public, fatal accidents were enough to end the airship as a mainstream form of public transport.

The danger to autonomous cars is that the impatient investors and entrepreneurs will push things too fast. Elon Musk has achieved great things, but the concept of releasing incomplete self-driving technology to the public was not one of them.

There will always be paths that *should* fail, as in your examples, and fear and litigation are useful there to trim them away. What remains will be better for it.
 
Those of us who have had graphics driver issues crash our computers over and over, are not encouraged to trust these same companies to write 100% crash free drivers for cars.

If they can't launch OS updates for phones without tons of bugs and issues, why should I expect the car OS to be reliable?
Cars already have OSes. They are typically low-level.
 
Have we considered that an AI car can relatively easily be made intelligent enough to avoid those driver-sacrifice cases in the first place? It's a matter of looking a few "moves" ahead, as in chess, and slowing down and/or leaving itself an out. There's no reason a car would *have* to just plow through a hazardous place at high speed.

If certain road spots become bottlenecks because all the cautious AI cars are slowing down there, maybe planners will be forced to actually fix them, put up pedestrian barriers, etc.
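The "looking a few moves ahead" idea above can be sketched in code. This is a deliberately toy illustration, not how any real autonomous-driving stack works: all the function names, numbers, and the simple kinematics here are invented for the example. The point is just that a car which keeps its stopping distance (plus a margin) clear of every known hazard never has to choose whom to hit; it slows down before the dilemma can arise.

```python
# Toy sketch (illustrative only): a car that "leaves itself an out" by
# slowing until its stopping distance, plus a safety margin, is shorter
# than the distance to the nearest hazard. All values are invented.

def stopping_distance(speed_mps: float, decel_mps2: float = 6.0) -> float:
    """Distance needed to brake to a full stop from the current speed."""
    return speed_mps ** 2 / (2 * decel_mps2)

def choose_speed(current_speed: float, hazard_distances: list[float],
                 margin_m: float = 5.0, step_mps: float = 2.0) -> float:
    """Reduce speed until every hazard lies beyond stopping distance + margin."""
    speed = current_speed
    nearest = min(hazard_distances, default=float("inf"))
    while speed > 0 and stopping_distance(speed) + margin_m > nearest:
        speed = max(0.0, speed - step_mps)
    return speed

# Open road: keep cruising speed.
print(choose_speed(20.0, []))      # -> 20.0
# Hazard 20 m ahead: slow until braking room fits inside 20 m.
print(choose_speed(20.0, [20.0]))  # -> 12.0
```

The design choice is the interesting part: the "ethics" never enters the loop, because the planner treats insufficient braking room as a condition to avoid in advance, not a dilemma to resolve at the last moment.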
 