Saw one of these self-driving cars on the freeway (Apple, I believe?) yesterday and was shocked to see people intentionally driving in a way to antagonize the vehicle. The person behind was inches off its bumper. I guess some people just like to see the world burn.
Like that remote pedo-identifying prick Musk, who claims his automated roadkill devices are actually autonomous artificial intelligences and that investors shouldn't worry too much that his company can't ramp up production of their beloved loss makers. Please excuse my comment; I'm writing this under the influence of Ambien and will shoot myself into space (or in the foot) now. ... Where is my mail-order flamethrower?!
 
Ever heard of a shoulder lane? Most countries have one. It's where the remedial human driver or Apple car should park if inexperienced with merging, rather than stopping on the expressway. It's also where the responsible driver can swerve to avoid a collision with the remedial driver or Apple car that decides to stop on the expressway.
You must not live in a very big city with rush hours. Merging can be a very slow bumper-to-bumper crawl.
You are completely ignoring the role that ML plays in autonomous vehicles, and more broadly in AI.
I have yet to see an impressive AI using machine learning. I hear a lot of hype and watch a lot of demos, and I'm continually underwhelmed. The human brain is just so much more adaptable, reactive, and contextual. A focused, skilled human driver is far better than any self-driving AI at this point, and probably will be for a long time to come.
 
You must not live in a very big city with rush hours. Merging can be a very slow bumper-to-bumper crawl.

I do live in a huge metropolitan area, but I don't make the remedial-driver mistake of failing to give myself sufficient space in front of me to escape. Now you know the purpose of the shoulder lane and of managing escape space, which responsible drivers have known for decades.
 
All humans need to be is more observant. 15 mph is not fast at all, and 2:58 p.m. is the middle of the afternoon, so why would traffic be going so slowly on an expressway anyway? You don't let people in and then give them a 'nudge'.
 
That's the problem with self-driving cars: they are made to follow rules. Hence the Apple autonomous car going less than 1 mph while looking for a gap to merge into another lane. Everyone knows you gotta stick your nose into the other lane and force yourself in there!

I know, it's going to be such fun to watch the multi-mile queue build up behind one of these things in rush hour while it sits at a junction waiting for a nice large gap in traffic before entering the main road.

So far all these tests, whilst impressive, are still a joke compared to the reality of actual real-world, day-to-day driving and the unexpected events a human has to deal with.

I really want driverless cars. I hope that by the time I'm too old to drive safely, there will be such a car for me either to own or, more probably, to call upon to pick me up like a driverless Uber, which is probably the more realistic option for the future for almost everyone.

However, I feel we are a massive amount of time away from this being a reality, not the few more years the hyped news stories would have us believe.

And yes, won't it be fun knowing that a driverless car will always stop for you and always obey the rules and laws of the road in the country it's operating in. How much fun will people have with that, I wonder.
 
With all those sensors, merging should be a piece of cake, versus stopping on the expressway like a bad or inexperienced driver would. Also, doesn't it have collision avoidance like a Tesla to avoid those situations?

Some things can't be avoided. It can have collision avoidance, but the other car may not. There have been many Tesla incidents, and none was Tesla's fault (even if newspapers and news blogs need to imply otherwise for more copies/clicks).
 
I think some people aren't reading the article. The Apple car did nothing wrong; it was the Nissan Leaf that collided with its rear because it was following too closely. So if you were claiming injuries while driving the Nissan Leaf, you're on your own!
Have you read it? The two cars were on different roads, and the Apple car was merging into the path of the other car. Both collided. No blame has been apportioned to either car yet; anything else is pure speculation.
In driving school, you learn that the vehicle that rear-ends another is at fault no matter what. If you hit someone, you get a ticket for following too closely. There are only two ways to rear-end someone: following too closely, or being distracted and not hitting the brakes fast enough.

The article said the Apple car was waiting for a gap in highway traffic while trying to merge. You can't get onto the highway by putting your car inside another car; that's not how physics works. You have to wait for a gap.

People are on their phones too much these days. The driver of the Leaf was probably texting.
Not true. The Apple car was on another road and merging, i.e., it crossed a dashed white line into the path of the other car.
You are required to match the speed of cars on the other road and give way; the Apple car was doing 1 mph and the other car 15 mph.
Sounds like the Apple car's fault to me.
 
Saw one of these self-driving cars on the freeway (Apple, I believe?) yesterday and was shocked to see people intentionally driving in a way to antagonize the vehicle. The person behind was inches off its bumper.
Tailgating an autonomous vehicle seems particularly stupid, since unlike a human driver, it doesn't care.

Even dumber, the best you could possibly manage in that situation is rear-ending the autonomous vehicle, which (like virtually any rear-end collision) will be your fault, and there will be a pile of video and sensor data to prove it. Actually, that probably applies to just about any accident you manage to get into while harassing an autonomous vehicle at this point: you're almost certainly going to be at fault, and there will be plenty of proof.
 
If Apple builds vehicles the way it builds new MacBook Pros, I certainly don't want to be involved in an accident inside one of those vehicles. Actually, I'd rather not drive them at all.
 
In driving school, you learn that the vehicle that rear-ends another is at fault no matter what. If you hit someone, you get a ticket for following too closely. There are only two ways to rear-end someone: following too closely, or being distracted and not hitting the brakes fast enough.
I agree with you almost completely, but if you want to be pedantic about it, it's technically possible for a driver to cut in front of another driver and then slam on the brakes in such a way that the vehicle doing the rear-ending is not at fault. That's an edge case, though, and doesn't sound anything like what happened here.
 
I don't think that's it. I think people today are just negligent and careless toward other people on the roadway, with no due regard for other drivers and traffic laws. I'm not speaking specifically to this accident between the Leaf and the Apple vehicle, but more in relation to your post about people driving to antagonize the Apple vehicle.

I read an article the other day (I failed to find it just now; perhaps someone else saw it?) describing how autonomous cars (or semi-autonomous cars, as the article corrected) can be somewhat of an obstacle in "natural" traffic: humans need to understand that autonomous cars react differently than humans do (to pedestrians crossing the road, etc.). It also made a case for communicating autonomous cars that give a human-readable signal to pedestrians and other drivers. But, referring to your point, it also stressed that some people really want autonomous cars to fail: they jump in front of them to make them brake (i.e., to test their crash-avoidance technology), and so on.
 
The biggest weakness (or greatest strength) here is the human factor.

We are clearly years away from this technology being ready. The Google self-driving car has had its accidents:
https://www.newsledge.com/google-self-driving-car-accidents/
The idea that self-driving cars won't get into accidents once deployed is naive. Even a fully autonomous vehicle will get into accidents and kill bolting pedestrians. Unless we completely seal our roadways, accidents will occur. Every form of transportation has accidents.

Putting a 16-year-old behind the wheel and unleashing them on the general population is widely accepted. Imagine if every year we released an inexperienced AI onto the roads that had a habit of being reckless, used alcohol more than other AIs, and would travel with other AIs, showing off how good a driver it is by driving FASTER and merging dangerously. That's what we have now, and it's incredibly dangerous.

https://www.dosomething.org/us/facts/11-facts-about-teen-driving
 
In driving school, you learn that the vehicle that rear-ends another is at fault no matter what. If you hit someone, you get a ticket for following too closely. There are only two ways to rear-end someone: following too closely, or being distracted and not hitting the brakes fast enough.

The article said the Apple car was waiting for a gap in highway traffic while trying to merge. You can't get onto the highway by putting your car inside another car; that's not how physics works. You have to wait for a gap.

People are on their phones too much these days. The driver of the Leaf was probably texting.
Except in California, where you need front and rear cameras, because it doesn't matter how or where you are hit.
 
The idea that self-driving cars won't get into accidents once deployed is naive. Even a fully autonomous vehicle will get into accidents and kill bolting pedestrians. Unless we completely seal our roadways, accidents will occur. Every form of transportation has accidents.

Putting a 16-year-old behind the wheel and unleashing them on the general population is widely accepted. Imagine if every year we released an inexperienced AI onto the roads that had a habit of being reckless, used alcohol more than other AIs, and would travel with other AIs, showing off how good a driver it is by driving FASTER and merging dangerously. That's what we have now, and it's incredibly dangerous.

https://www.dosomething.org/us/facts/11-facts-about-teen-driving
Agreed, teen driving is a considerable concern. That said, there is also a disturbing percentage of irresponsible adults who drive under the influence of alcohol or drugs. Unfortunately, no matter how strictly the law is enforced, such individuals will continue to offend:
https://www.theguardian.com/uk-news...casualties-in-uk-four-year-high-alcohol-limit

Sorry, we are going a little off topic. Back to the autonomous vehicle: although the human factor does play a part, the concern is to what degree the human is in control and how straightforward it would be to switch to manual. This is essential. The fact is, no matter who is developing the technology, the autonomous car is years away from being available for consumers to purchase, perhaps not even in our lifetime.

 
2018. News flash: car involved in collision with another car. No one killed, nobody injured. Subscribe or sign in for details about damage to paintwork. Now the weather...
 



Apple is testing its self-driving vehicles in a number of Lexus SUVs out on the roads of Cupertino, and on August 24, one of those vehicles was involved in an accident.

Apple is required to disclose autonomous vehicle collisions to the California DMV, and the information on the accident was published on the DMV's website.

[Image: Apple's self-driving Lexus SUV]

According to the accident details, the vehicle in question was in autonomous mode at the time and sustained moderate damage in the crash, but it does not appear that Apple was at fault for the collision. From the accident report:

Apple has been testing its self-driving software in Lexus RX450h SUVs in Cupertino, California and surrounding areas since early 2017, but this is the first time an Apple vehicle has been involved in a crash.

Apple's test vehicles are outfitted with a host of sensors and cameras, and while they are autonomous, each one has a pair of drivers inside. At the current time, Apple is testing its software in more than 60 vehicles.

It's not yet clear what Apple plans to do with its self-driving software, but it could be added to existing cars and there are still rumors suggesting Apple is working on its own Apple-branded vehicle that could come out by 2025.

Apple is also working on a self-driving shuttle service called "PAIL," an acronym for "Palo Alto to Infinite Loop." The shuttle program will transport employees between Apple's offices in Silicon Valley.

Article Link: Apple Autonomous Test Vehicle Involved in Accident on August 24
All that paraphernalia on top of that Lexus would distract anybody and could easily cause a crash. Apple should stop this legal exposure. What happens when three 10-year-old girls burn to death in an Apple vehicle? Apple would have to pay billions in every crash. If Apple wants to enter the market, it should do so after other companies have created a market where at least 1,000 deaths are already accepted.

Of course robot cars are safer. What matters is the perception of juries and the legal precedent that has established limited legal exposure as a proven, routine reality. Until then, Apple sits as a fat cow ready to be plundered in this market. One crying jury member, hugging a blood-soaked, burned teddy bear, could sink Apple.

I love Apple, I've got every Apple product, and I want to protect Apple. It's not Apple's time. Once the market is a boring reality, like a subway system where pushed people die on the tracks with no news coverage, then Apple can purchase the number one or number two player. The value in these self-driving cars is not the individual cars; it is the network of millions of cars created by a trillion-dollar company.
In fact, Apple or any other huge interest getting sued for billions and hurt badly as a result could delay or doom the self-driving market for decades. Let the little companies get the early exposure and play their cards smartly in the courts while establishing "10 robotic deaths are better than 1,000 human-driver deaths" as an accepted reality. After some 1,000 or 3,000 deaths have been processed through self-driving vehicles (far fewer than the 100,000 human-driver deaths that would have been prevented), during which time people will accept the new normal, then Apple could safely enter the market.
 
Saw one of these self-driving cars on the freeway (Apple, I believe?) yesterday and was shocked to see people intentionally driving in a way to antagonize the vehicle. The person behind was inches off its bumper. I guess some people just like to see the world burn.

How did you know that was their intention? They could have been on their Apple phones and just distracted.
 