Why not do a search for the safety record of Google's cars? As of May 2015, Google had logged 1.8 million miles driven and been involved in 12 accidents, all human-caused.

Take those cute little Google cars out of their safe, meticulously mapped and controlled, approved testing-zone bubbles, expose them to real and varied traffic, road, and weather conditions, and watch that accident rate rise like a phoenix on steroids.
 
If these toys drive with, say, three car lengths between them and slam on the brakes at anything closer, I'm going to have great fun zig-zagging my way through them to the front of the pack.
 
Say what you will, but ultimately the streamlining of this comes down to government control of our daily lives. Whatever you might think of the tech, here is what will happen at some point:
  • Your every move will be recorded and stored in a company database
  • The government will have the ability to access these records and see where you were every minute of every day.
  • Now that may not sound bad at first, but consider this: you are on disability, and you go and spend time at the movies, say at Downtown Disney at Disneyland. The government reviews the records of where your car was and determines you were at Disneyland going on rides (which you weren't), so you can't possibly be disabled and must be committing insurance fraud. Alternatively, you loan your car to someone else who does go to Disneyland, and they determine the same thing even though you were at home in massive pain the entire time.
  • The government decides they don't like how much you are driving, so they send an override command to your car to take you home and not run for the rest of the day/week/month/year/etc.
  • Someone hacks your car and overrides the destination.
Mark my words, this will happen with self-driving cars, and not in the distant future. These reasons are why the federal government is interested in streamlining the approval process: they see this as a great way to take even more control of our personal lives, and I think we all should be standing up and saying no.
 
The problem is that automobile manufacturers want to eliminate the steering wheel, gas pedal, and brake pedal completely, which poses A LOT of potential issues, including safety, privacy, and a number of other concerns.

I think we are a long way off from cars that do not have a manual override.

(And if I am wrong, I will just ride my motorcycle!)
 
If these toys drive with, say, three car lengths between them and slam on the brakes at anything closer, I'm going to have great fun zig-zagging my way through them to the front of the pack.
Well, depending on your speed, you should be driving with at least three car lengths between you and the car in front of you so you don't rear-end them if they have to slam on their brakes.
I think we are a long way off from cars that do not have a manual override.

(And if I am wrong, I will just ride my motorcycle!)
Not by any means. Google already built them, and the state of California said no: they must have a manual override.
As for motorcycles, either they will become autonomous too, or the government will make them illegal if we don't start standing up to this nonsense.
 
Well, depending on your speed, you should be driving with at least three car lengths between you and the car in front of you so you don't rear-end them if they have to slam on their brakes.

The safety rule of thumb for following distance is actually three seconds, not car lengths.
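As a rough illustration of why the rule is stated in seconds rather than car lengths, here is a small sketch converting a three-second gap into distance at a few speeds. The function name and the example speeds are my own, not from the thread; it assumes constant speed and the exact conversion of 5280 feet per mile.

```python
def three_second_gap_feet(speed_mph: float, gap_seconds: float = 3.0) -> float:
    """Distance in feet covered during `gap_seconds` at a constant `speed_mph`."""
    feet_per_second = speed_mph * 5280 / 3600  # miles/hour -> feet/second
    return feet_per_second * gap_seconds

# The same three-second rule yields very different distances at different speeds:
for mph in (30, 45, 60):
    print(f"{mph} mph -> {three_second_gap_feet(mph):.0f} ft gap")
```

At 60 mph a three-second gap works out to 264 feet, far more than three car lengths, which is the point of stating the rule in time rather than distance.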
 
Even the worst autonomous vehicles drive better than your average American driver.

Will autonomous vehicles have enough sense to crash into a pole instead of hitting a child who runs into the street?

--
Mind you, my wife can't drive any more and such a vehicle would be fantastic for her.

--
If I were a criminal, I'd have a field day with autonomous cars that you could call for a ride.

Just lure it into a vacant lot, close the gate, surround it with humans so it stops, then jack it up so it can't go anywhere. Now you can disable its brain and GPS, and take it away to a chop shop to be broken up and sold :D

I'm telling ya, they're going to have to arm these things! It'll be like the three laws of robotics; the third law being self-survival.
 
I have bittersweet feelings about autonomous vehicles.

On the one hand, I love the tech, and think that it will be the future of transportation. But, on the other hand it makes me sad.

I love driving, and I love cars. Maybe one day, having to give up control of driving to a computerized chauffeur would take all the fun and thrill out of owning a car. It then might be like owning a refrigerator or a toaster, just purely utilitarian. Also, not being able to teach my future grandchildren about cars makes me kind of sad.

You don't have to care. Human operated vehicles will not disappear in our lifetime, and future generations will grow up accustomed to driverless vehicles. You retain all the fun of driving a car, so what does it matter after you've passed on?

In thirty years when it comes time to vote to enact this as a federally permanent measure, I hope you don't become an elder who continually votes against the future because you "want things to always remain the same," even though you'll be dead in a year and it won't affect you in the least.
 
Someday it will be illegal to drive a car with a steering wheel. And a generation will be born that will never learn to drive.
 
Basically, if the vehicle has a steering wheel and brakes, then you are very clearly considered responsible, as you could have overridden the system and taken over. Where it becomes less clear is if there is no way for the driver to intervene.

This seems like a very grey area. My scenario asked what happens if my car makes a mistake, as in does something it wasn't designed to do. In that situation there'd be a case for liability resting with the manufacturer.

Also, many accidents happen in a fraction of a second. I like to think I'm a decent driver. I've driven well over 1 million miles and I've never had an accident that was my fault.

With current vehicles where I am in total control, I know I can react pretty quickly and have avoided accidents many times when other people have done something stupid around me.

I don't think I should be held to the same reaction times in an autonomous car. No matter how hard you try, you are never going to be as engaged as you are when driving, especially on anything over a ten-minute journey.

Autonomous cars will be extremely boring, and even the most dedicated of individuals will drift off into other thoughts about what they're gonna say to the girl they're meeting, or what the score might be at the game today, or will "she" be at the baby shower, I hope not, etc. etc.
 
Before decrying the legislation, I'd like to know what safety standards don't have to be met.

Is it just meant to speed up research, rather than having to wait six months for approval of every minor software patch and hardware change?
 
This really is the (white) elephant in the room. It isn't going to happen, just like Amazon drone deliveries. People are human, and humans make mistakes. Machines aren't human, and machines make mistakes. The difference is that humans are accountable for their actions; machines aren't. So when people are killed by self-driving cars, or drones fly into aircraft or get intercepted en route to their intended destinations, who appears in court? The drone? The car? Ironically, The Onion once ran a spoof story about a military drone being held accountable in the dock; how right they were.
 
Someday it will be illegal to drive a car with a steering wheel. And a generation will be born that will never learn to drive.

And I'll bet you anything that governments will demand ways for police to remotely control such vehicles.

In the future, rebellions will be led by free men who still know how to drive themselves.

I foresee a Hollywood movie based on this concept pretty darned soon :)

Before decrying the legislation, I'd like to know what safety standards don't have to be met.

Is it just meant to speed up research, rather than having to wait six months for approval of every minor software patch and hardware change?

Excellent question. If it's like previous safety-requirement drops, then it's simply stuff like where controls, instruments, and even rear-view mirrors must be placed for humans. Also how far a seat can move, how much line of sight a driver must have from his seat, etc.

A computer with cameras needs none of those old rules.

Without dropping these regulations, each autonomous car would have to keep looking just like a regular car, inside and out... instead of, say, looking more like a living room inside, with no driver's seat at all.
 
This really is the (white) elephant in the room. It isn't going to happen, just like Amazon drone deliveries. People are human, and humans make mistakes. Machines aren't human, and machines make mistakes. The difference is that humans are accountable for their actions; machines aren't. So when people are killed by self-driving cars, or drones fly into aircraft or get intercepted en route to their intended destinations, who appears in court? The drone? The car? Ironically, The Onion once ran a spoof story about a military drone being held accountable in the dock; how right they were.

Who appears in court when the brakes fail in a car? Who appears in court when the engine fails on an airplane?
 
We've introduced some dumb legislation in the past. Add this to the list. SMH. It should be hard as hell to get autonomous vehicles on the road. Making it easy is just stupid and irresponsible. Sweet jeebus.

The majority of these systems are safer than human drivers. And they have every single free-market incentive to be so ... get the (over) regulation out of the way so the US industry can rapidly advance, and compete with other national markets: https://www.technologyreview.com/s/608229/chinas-plan-to-take-over-all-self-driving-cars/
 
I have bittersweet feelings about autonomous vehicles.

On the one hand, I love the tech, and think that it will be the future of transportation. But, on the other hand it makes me sad.

I love driving, and I love cars. Maybe one day, having to give up control of driving to a computerized chauffeur would take all the fun and thrill out of owning a car. It then might be like owning a refrigerator or a toaster, just purely utilitarian. Also, not being able to teach my future grandchildren about cars makes me kind of sad.

Where do you drive that you actually get to enjoy driving? There are times (few and far between) where I find myself being able to enjoy a ride (for whatever reason: fast, slow, beautiful, peaceful, etc.). The rest is just traffic. Traffic, traffic, traffic.

I don't teach my kids to ride a horse (I could), or drive a train, or fly a plane (all things they can learn if they want to).

I for one would welcome an electronic chauffeur ASAP ... (I have better things to do w/ my time)...
Your country never acts in the interest of the people, only in the interest of companies and profit. Are you seriously surprised?

*Unlike* GB ... or now the UK ... (irrelevance much?) .... :rolleyes:
 
Why not do a search for the safety record of Google's cars? As of May 2015, Google had logged 1.8 million miles driven and been involved in 12 accidents, all human-caused. The average human has an accident every 165,000 miles, though this is an old statistic... with distracted driving the numbers are getting worse.

Am I missing something here? 1.8m miles with 12 accidents is one accident every 150,000 miles, which is worse than the human stat.
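The arithmetic behind that objection can be checked in a couple of lines. The variable names are my own; the figures (1.8 million miles, 12 accidents, one human accident per 165,000 miles) come from the posts above.

```python
# Back-of-the-envelope check of the accident-rate comparison in the thread.
google_miles = 1_800_000
google_accidents = 12
human_miles_per_accident = 165_000  # the older statistic quoted above

google_miles_per_accident = google_miles / google_accidents
print(google_miles_per_accident)  # 150000.0 miles per accident
```

So by raw miles per accident, the Google fleet does come out slightly worse than the quoted human figure; the earlier post's counterpoint is that all 12 of those accidents were reportedly caused by human drivers, not by the autonomous system.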
 
I just think of the impacts to jobs. The main goal isn't to replace the casual/recreational driver; the low-hanging fruit is getting companies to replace any vehicle that's driven by a person with an automated car.
 
I for one would welcome an electronic chauffeur ASAP ... (I have better things to do w/ my time)...

The roads in most places aren't good enough for this. And it's this type of attitude (I'm not judging) that is causing Tesla issues right now. People get confidence in it, overestimating its capabilities, and aren't ready at the wheel if something goes wrong.
 
The majority of these systems are safer than human drivers. And they have every single free-market incentive to be so ... get the (over) regulation out of the way so the US industry can rapidly advance, and compete with other national markets: https://www.technologyreview.com/s/608229/chinas-plan-to-take-over-all-self-driving-cars/
If you haven't, you should actually read the article you linked, since the headline and the article's content don't convey the same message. The article's content definitely doesn't convey a need for a sense of urgency from US companies. Baidu is, just this month, starting road testing in restricted areas, while companies in the US (under the current regulations) have logged years of data and millions of miles. Rapid advancement at the expense of safety regulation is a dumb, dumb, dumb idea that history has proven over and over to be... a dumb idea. Besides, rapid advancement won't mean a darn thing if the infrastructure (roads and bridges) remains in a deplorable state. I'd prefer they do it right rather than do it quickly.
 