Tesla, though, is in serious financial trouble and has a CEO who likes to mouth off on Twitter. It'll be interesting to see if they are still around in 3 years.

That won't work. Those modifications you talk about keep rather a lot of people in jobs and drive a big global market; the idea that governments around the globe will simply kill off that market, just for self-driving cars, is rather extremist. It would only serve to drive massive negativity towards them.

You raise a good point; however, with self-driving cars relying 100% on specially located sensors and factory-specific software to make it all work, modifications could wreak havoc on the entire system. There is no way a manufacturer could allow a self-driving car to be modified in terms of engine power and limits, ride height, wheel and tire specs, suspension, brakes, etc. There will have to be a law in place making it illegal to perform modifications on self-driving cars where one change could mess up the entire system. Or, if you make a change, you are 100% liable at that point for a malfunction. Plus, I don't think you can let "Billy Bob's Garage" work on these things. They will have to be serviced by licensed dealers. Remember, if one thing messes up and the car malfunctions at high speed... you're done. There is no chance for human correction... you're toast!

I would expect modifications would still be allowed on normally driven passenger vehicles, but if the push is to go 100% self-driving in the next 20 years or so, a lot of the things people do to their cars today will have to stop.
 
Seeing as human drivers tend to rear-end self-driving cars, I wonder if they'll develop predictive impact detection (car approaching fast from the rear) that will trigger the horn... or something.
 
Autonomous vehicles are dangerous in a mixed environment. Either all vehicles are equipped and using autonomous technology or none are. Deaths have already been attributed to vehicles using a semi-autonomous mode. Keep these autonomous vehicles off the streets and highways until such time as the technology has proven able to handle the infinite variables and situations involved in operating a motor vehicle. I believe that an advanced, bulletproof system would be really great. Just get in, tell the machine where to go, and away it goes... but not yet...


Infinite?

With this kind of silly, impossible standard, no human would be allowed to drive now! LOL
 
How many thousand kilometers did they drive before having this first accident?
They're in alpha and they're already on the way to being better than us humans...

Probably better than you. Not all of us are clueless drivers.
 
From the perspective of a neuroscientist:
humans have horrible attention spans for driving, even when not impaired by alcohol or drugs, even on pretty, sunny days;
EVERYONE thinks they are a better driver than they actually are;
we are just used to the carnage.
 
Well, at least it’s following the autonomous vehicle trend of crashing...

It doesn't state whose fault it was, though. But I know I wouldn't go anywhere near an Uber self-driving car!

Well, I actually think the whole thing's one dumb, stupid idea! There need to be MUCH tighter controls before this tech is allowed anywhere near the roads, but no, it's being shoved onto us no matter what... so all road users are at risk. I believe in the next year or two, cars able to be fully autonomous will be on sale.

Your comment is the epitome of fear-mongering. Autonomous vehicles are already statistically safer than human drivers, and they're only going to get safer. The Apple vehicle wasn't even at fault. But you didn't even bother to read the article; you just instantly went on an anti-autonomous-vehicle rant based on your own unfounded, biased assumptions.

Trend of autonomous vehicles crashing? Putting road users at risk? What on earth have you been smoking that makes you ignorant of the fact that humans crash all the time, putting other human lives at risk?

Humans also learn.

Machines also learn. When an autonomous vehicle makes a mistake, it can be reviewed and corrected, and then the fix can be pushed to every autonomous vehicle in the world. Then no autonomous vehicle will ever make that mistake again. Unlike humans, who continue to repeat each other's mistakes (driving drunk, drugged, or tired, using a mobile phone, not driving for the conditions, being unable to control a vehicle effectively in an emergency, etc.).

“Humans learn” is literally the worst argument against autonomous vehicles.
 
I want to preface this by saying I like a lot of the products… As for an Apple autonomous car, don't hold your breath any time soon, because it's going to arrive about the same time as the Apple television set and the user interface to cable television that Steve Jobs predicted... remember, he said "we have that cracked", or something pretty close to that.

Tesla is further down the road than anybody on this, and they're struggling with it, as are Uber, GM, Ford, Mercedes, Audi, and Volkswagen.

****, I'd be happy if I could just make CarPlay work properly... what a POS that is.
 
Seeing as human drivers tend to rear-end self-driving cars, I wonder if they'll develop predictive impact detection (car approaching fast from the rear) that will trigger the horn... or something.

I bet that could get annoying... if that's not perfect either.
 
Can you buy a tube TV from Best Buy? Nope. One day you won’t be able to buy a car that you drive. It’s inevitable that the market will shift to autonomy and with that, our culture will shift dramatically pertaining to the acceptance and safety of autonomous vehicles. Once autonomous driving is proven to be dramatically safer than manual driving, manual driving will have a large red target on its back for lawmakers.

People who cherish driving will be the "get off my lawn" crowd. While there will be some sort of grandfathering that will allow manual driving, the government will set an end date making manual driving illegal.

Manual driving is one of the biggest killers in our country. Its days are numbered.

Yeah, just like how they banned motorcycling in 1985.

Oh wait
 
That's, unfortunately, becoming commonplace with ALL of the cars known for having self-driving functionality. I have a Tesla S and had a guy pull that on me, on the interstate, heading back from vacation a couple weeks ago.

I was cruising along in my lane when he started to pass me on the left in a Ford Explorer. Suddenly, he veers over sharply, like he wanted to run into my driver side door. I immediately turned to the right and dodged him, going off onto the shoulder. He just got back in his lane and sped away.

There's apparently a crowd who thinks it's amusing to pretend to hit self-driving cars, just to try to force a reaction out of them.

Saw one of these self-driving cars on the freeway (Apple, I believe?) yesterday and was shocked to see people driving in such a way as to intentionally antagonize the vehicle. The person behind was inches off their bumper. I guess some people like to see the world burn.
That sort of thing does happen, once in a while. You also get the people who purposely slam on their brakes unexpectedly, to try to force someone to rear-end them.

More and more people are investing in dashcams that continually record, so they have evidence, whether or not a human witness is around.

If I reverse into your car at, say, 5 mph, just enough to do damage.

Then I pull forward, screaming it's your fault, and take photos of the damage YOU did to my car, and there are no witnesses.

Who gets the blame?
There will definitely have to be changes made. Personally? I'm increasingly of the opinion that self-driving vehicles won't be accepted on the roads (beyond this experimental phase) unless one of two things happens:

Either A) you pass a law requiring ALL cars on the road to be self-driving, eliminating the human factor completely, or B) you create a network so all the self-driving vehicles communicate with each other while traveling, with a sort of "hive intelligence". (Traffic stopped ahead because of an accident or road construction? The first self-driving cars to encounter it communicate it to the others. Certain vehicle driving erratically? The self-driving cars automatically warn the others to keep a distance from it. That sort of thing...)
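To make the "hive intelligence" idea a bit more concrete, here's a rough sketch of what the message passing could look like (Python, purely illustrative; the HazardReport fields and the FleetChannel class are things I made up for the example, not any real V2X standard):

```python
from dataclasses import dataclass
import time

@dataclass
class HazardReport:
    """One car telling the rest of the fleet about something it saw."""
    kind: str        # e.g. "stopped_traffic", "roadworks", "erratic_vehicle"
    lat: float
    lon: float
    reported_at: float
    ttl_seconds: int = 600   # how long the warning stays relevant

class FleetChannel:
    """Stand-in for whatever shared network the cars would actually use."""
    def __init__(self):
        self.reports = []

    def broadcast(self, report: HazardReport):
        self.reports.append(report)

    def active_hazards_near(self, lat, lon, radius_deg=0.05):
        now = time.time()
        return [r for r in self.reports
                if now - r.reported_at < r.ttl_seconds
                and abs(r.lat - lat) < radius_deg
                and abs(r.lon - lon) < radius_deg]

# First car to hit the backup warns everyone behind it:
channel = FleetChannel()
channel.broadcast(HazardReport("stopped_traffic", 37.33, -122.03, time.time()))

# A following car a mile back checks before committing to that route:
print(channel.active_hazards_near(37.34, -122.03))
```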

I hope I'm wrong and one of these big companies will really crack the problem of making a machine drive a car in all situations. But I just fear, for legal reasons alone, it's not going to work out. As one engineer pointed out: if you really could program in all the logic needed for your self-driving car to always choose the optimal outcome to avoid human injury or death, lawyers would sue the manufacturer out of existence in a matter of months. (Imagine a scenario where the car realizes it can't stop in time to avoid a front collision with a stopped car ahead on a major road. The option most sane HUMANS would take is to try to swerve out of its way and hope for the best. The car might do some quick calculations and decide, "No... if I swerve right, I'd likely cause a bigger accident because of several cars coming up on my right. If I swerve left, I go off the road into a brick building. The best choice is to slam into the car in front of me on purpose, since both vehicles have airbags and safety belts, and the car in front will roll as I hit it to absorb some of the impact, unlike the brick building." Try explaining to a typical jury why the car made no effort to get out of the way of the stalled car and hit it straight on... Now imagine this playing out many times, after the company has hundreds of thousands or millions of self-driving cars on the road. They can't afford to keep showing up to court to defend themselves and explain why these things happened each time.)
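For what it's worth, the "quick calculations" wouldn't have to be anything exotic; at its simplest it's just scoring each escape option and picking the least-bad one. A toy sketch of that idea (Python; the options and the harm numbers are completely invented, just to show the shape of the logic, not how any real planner works):

```python
# Toy "least-bad option" chooser. Each option gets an estimated probability of a
# collision and an estimated severity if that collision happens; the car picks
# the option with the lowest expected harm. All numbers invented.
options = {
    "hit_car_ahead": {"p_collision": 1.00, "severity": 3},  # airbags, belts, car ahead rolls with the hit
    "swerve_right":  {"p_collision": 0.70, "severity": 6},  # several cars coming up on the right
    "swerve_left":   {"p_collision": 0.90, "severity": 9},  # off the road into a brick building
}

def expected_harm(opt):
    return opt["p_collision"] * opt["severity"]

best = min(options, key=lambda name: expected_harm(options[name]))
for name, opt in options.items():
    print(f"{name}: expected harm {expected_harm(opt):.2f}")
print("chosen:", best)   # -> hit_car_ahead
```

And that's exactly the problem: the mathematically least-bad choice is the one a jury will find hardest to accept.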


I still can't see Apple making a car. They know they would be putting their neck on the line to do that.
Making systems to sell to others, well, that's also risky, as the car won't get the blame for a death; the computer/software will, as that's what is controlling the car.

It's a very exciting space right now and everyone wants to be there and not be left out, but I can see so many legal issues to come when deaths start happening. We'll need the whole legal system, top to bottom, and the public in general to understand and accept that deaths will just happen, and accept that whilst this is sad, it's safer than humans driving.

It's still hard to explain that to the mother whose 8-year-old daughter's organs are splattered down the road after the Apple car hit and ran over her.

The media will go ape over such an event. It will happen; the question is how we will deal with it.

If you were born into a world where this was the norm, then you would accept it as a price worth paying.

Say none of us had cars, and someone told you cars could be sold to the public, but perhaps 1,000 people a day would die because of it. You'd probably say no, it's too high a price. But as that's the world we grew up in, we just accept it as the norm.

Society will accept this in time, but the transition to that acceptance is going to be hard, I'm sure.
 
I'm 100% certain that in years to come it will be seen as crazy to spend all that money, the second-biggest purchase in most people's lives, on this expensive device that sits outside your home or in your office car park for 98% of the 24-hour day, when you can use one of the massive nationwide fleets of tens of thousands of driverless cars just waiting for you to summon one.

The problem is that there will be massive demand in the morning and again in the late afternoon. To meet that demand, the fleets will have to have tens of millions of cars. Then, to avoid two trips in the morning and two again in the afternoon, the fleets will want to park a car in your driveway overnight and at your employer during the day. Your fares will have to cover nearly the full cost of buying a car, because on 90% of days you will be a car's only rider.
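Rough, back-of-the-envelope version of that argument (Python; every number here is a guess for illustration only):

```python
# How many cars the fleet needs just for the commute peak (all inputs are guesses).
commuters_needing_peak_rides = 50_000_000   # morning rush, nationwide
trips_one_car_can_serve_in_peak = 2         # roughly two back-to-back commutes fit in the window
fleet_size = commuters_needing_peak_rides / trips_one_car_can_serve_in_peak
print(f"cars needed just to cover the morning peak: {fleet_size:,.0f}")

# And why your fares end up covering most of a car anyway.
car_cost = 40_000                            # purchase price of one fleet car
car_life_years = 5
annual_capital_cost = car_cost / car_life_years   # before cleaning, charging, insurance

# If the fleet parks the car at your house overnight and at your office all day,
# then on a typical weekday you are essentially its only rider.
riders_per_car_per_weekday = 1.1
your_share = annual_capital_cost / riders_per_car_per_weekday
print(f"capital cost your fares must cover: ~${your_share:,.0f} per year")
print(f"simply owning the same car:         ~${annual_capital_cost:,.0f} per year")
```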

A benefit to owning a car yourself is that you can store things in it: The kids' school backpacks, your sporting gear, the gift for grandma, and so on.

...

Here's a TV news report about the accident that shows the conditions at that location:
 
Here's a TV news report about the accident that shows the conditions at that location:
Wow.
Big surprise.
Autonomous cars in accidents. Human drivers in other vehicles were the cause of the majority of those accidents. The majority of humans, as a collective, are very poor at driving.
Just like the majority of the posters in this thread fail at comprehension.
This really is shocking.
I mean SHOCKING.
/sarcasm
 
How many thousand kilometers did they drive before having this first accident?
They're in alpha and they're already on the way to being better than us humans...

Did you even read the article? The Apple car wasn't even at fault; it was rear-ended. Which perhaps highlights one of the biggest things that self-driving cars won't solve: actual drivers. Perhaps instead of trying to make fully autonomous cars at this point, they should focus on creating semi-autonomous cars that can do things like detect when the car in front is braking and force you to slow down, detect when it's not safe to change lanes, and communicate with infrastructure to stop cars from entering intersections when the light is about to change so they won't block the flow of traffic, or when you are in the right-hand lane at a 'no right on red', and so on. THEN, when everything is in place and can be upgraded to facilitate fully self-driving cars, upgrade the cars.
 
As I said before, I feel robotic taxis are one very possible outcome.
But there is the rush-hour issue; I can't work out how they will solve it. I mean, we can't all work from home.

People say it's silly, as you can do that today: it's called a taxi.
But that's not the same.

With a self-driving car, you could want to travel 1 mile or 1,000 miles; it's not going to complain.
It's not going to need sick pay, or toilet breaks, or to stop for food, and all the stuff that goes along with employing humans en masse.

If anyone does it, it's going to be on a massive scale, perhaps involving the car makers themselves.
If they can make cars in their factory that go out by themselves and start earning them money, then that may be what happens.

If the maths works out between how much it costs to make the car, how much the upkeep is, and how much money it takes in.

It would be like building fully AI robots, and these robots pay you, the robot maker, back all the money they earn from performing tasks.

Sounds crazy now, I know, but it could happen.

We could move into a world, perhaps many decades from now, perhaps 100 years, where it's just the accepted norm and the local roads around homes are clear of parked cars.

And if you wish to actually drive one for fun, you have to book a session at a special area.

Will that happen? No idea, but it certainly could.

And if you were born into such a world, you would think it was how it SHOULD be.
Especially when your (then) history books told you how many hundreds of people a day died in the past, when humans were driving.
 
Seeing as human drivers tend to rear-end self-driving cars, I wonder if they'll develop predictive impact detection (car approaching fast from the rear) that will trigger the horn... or something.
Only because the virtual drivers are like skittish senior drivers. This technology ain't gonna take over in my driving lifetime. I would say for urban areas we are at least 20 years out, and in the countryside? Try 40.
Can you buy a tube TV from Best Buy? Nope. One day you won’t be able to buy a car that you drive. It’s inevitable that the market will shift to autonomy and with that, our culture will shift dramatically pertaining to the acceptance and safety of autonomous vehicles. Once autonomous driving is proven to be dramatically safer than manual driving, manual driving will have a large red target on its back for lawmakers.

People who cherish driving will be the "get off my lawn" crowd. While there will be some sort of grandfathering that will allow manual driving, the government will set an end date making manual driving illegal.

Manual driving is one of the biggest killers in our country. Its days are numbered.
Perhaps in some of these offshore socialist utopian paradises it will happen a little sooner, but we are generations away from this. Thankfully, I will be long dead by then.
 
Yep, an accident caused by an idiot who doesn't know what a yield sign is. You gotta watch out for them; they will rear-end you because they don't think you'll stop. They expect you to cut off traffic, just like they intended to.
 
If a human is standing in your lane, waving his arms in the air,
should the car stop (so you, the passenger, can see what's up)?
Or should the car simply slow down and drive around this person in the road?

Just think of the various reasons why someone may be doing this.
And what factors YOU as a HUMAN take into account to decide if you stop or not.

Age, sex, clothes, road conditions, time of day, general physical appearance.
(Also, of course, who YOU the driver are: a 300 lb bodybuilder or a weak old lady.)

What should a driverless car do?
 
This is the most relevant the Nissan Leaf has ever been, being involved in a collision with the Apple autonomous vehicle.
Haha, this was the funniest thing ever!
The perfect vehicle for me would be some type of collaboration between Tesla & Apple! I was able to drive a Tesla while on vacation and loved every second of it. I can only dream, and hope to make enough moolah to afford it, haha...
 
Autonomous vehicles are in our future, and we will continue to move toward them. At present they are not perfect. Vehicles with drivers still get into accidents. The big question is which are safer.
 
Tesla, though, is in serious financial trouble and has a CEO who likes to mouth off on Twitter. It'll be interesting to see if they are still around in 3 years.

Tesla doesn't have any financial problems. Their breakeven point for the Model 3 is somewhere between 2,500 and 3,500 per week, which is 32K-45K per quarter. They only delivered 18K last quarter, so they were obviously in the hole. This quarter, they're expected to manage over 50K, so they should have a profit (~4 weeks until they give an official number of deliveries, and ~8 weeks until the quarterly financial earnings report).
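For anyone who wants to check the arithmetic behind those figures, here it is spelled out (Python; the weekly breakeven range and the delivery numbers are the ones claimed above, not something I've verified):

```python
# Weekly breakeven rate -> quarterly deliveries, and where last/this quarter land.
weeks_per_quarter = 13

breakeven_low, breakeven_high = 2_500, 3_500           # Model 3 per week, as claimed above
q_breakeven_low = breakeven_low * weeks_per_quarter     # 32,500
q_breakeven_high = breakeven_high * weeks_per_quarter   # 45,500
print(f"quarterly breakeven range: {q_breakeven_low:,} to {q_breakeven_high:,}")

last_quarter, this_quarter_estimate = 18_000, 50_000
print("last quarter below breakeven:", last_quarter < q_breakeven_low)        # True -> loss
print("this quarter above breakeven:", this_quarter_estimate > q_breakeven_high)  # True -> profit
```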

Even if they did have financial problems, they wouldn't have a problem raising money, and if they did, they'd be bought by Apple or Google, and very little would change other than Tesla's dependency on capital.

There's no universe where Tesla doesn't exist in 3 years.
 
< 1mph merge onto an expressway? o_O
Or was it a turn from a stop?
The intent was to merge onto the expressway. Since there was no safe gap to get in, the car slowed down and was just short of a standstill. Just like when you stop at a stop sign: ten inches before the sign, you are driving at 1 mph.
Apple has deep pockets. I can get you an easy cash payout.
Apple has excellent lawyers; you'll wish you had never bothered them.
You raise a good point; however, with self-driving cars relying 100% on specially located sensors and factory-specific software to make it all work, modifications could wreak havoc on the entire system. There is no way a manufacturer could allow a self-driving car to be modified in terms of engine power and limits, ride height, wheel and tire specs, suspension, brakes, etc. There will have to be a law in place making it illegal to perform modifications on self-driving cars where one change could mess up the entire system. Or, if you make a change, you are 100% liable at that point for a malfunction. Plus, I don't think you can let "Billy Bob's Garage" work on these things. They will have to be serviced by licensed dealers. Remember, if one thing messes up and the car malfunctions at high speed... you're done. There is no chance for human correction... you're toast!

I would expect modifications would still be allowed on normally driven passenger vehicles, but if the push is to go 100% self-driving in the next 20 years or so, a lot of the things people do to their cars today will have to stop.

I'll be a bit more optimistic. Yes, a self-driving car needs to know exactly how the car will behave if, for example, you accelerate with 27.9% of full power. But that doesn't have to be stored in memory with no chance of modification. Let's say you increase your engine power by 50 percent. The self-driving car leaves the garage, immediately spots that the acceleration is much higher than expected, and within seconds adjusts to it, so it will again accelerate exactly as it wants to. Of course, the car may be programmed not to accelerate beyond some limit, and you might not enjoy your extra horsepower because the car isn't driving any faster.
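In spirit it would be something like this: a simple online correction of the car's internal model whenever the measured acceleration diverges from the prediction (Python; a deliberately crude sketch with made-up numbers, not how any production controller actually works):

```python
# Crude online recalibration: the planner asks for an acceleration, measures what
# the (possibly modified) car actually delivers, and scales its model to match.
class ThrottleModel:
    def __init__(self):
        self.gain = 1.0          # expected m/s^2 per unit of throttle

    def throttle_for(self, wanted_accel):
        return wanted_accel / self.gain

    def update(self, commanded_throttle, measured_accel, alpha=0.2):
        observed_gain = measured_accel / commanded_throttle
        # exponential moving average keeps the correction smooth
        self.gain = (1 - alpha) * self.gain + alpha * observed_gain

model = ThrottleModel()
real_gain = 1.5   # owner bolted on 50% more power; the model still thinks 1.0
for step in range(10):
    throttle = model.throttle_for(wanted_accel=2.0)   # planner wants 2 m/s^2
    measured = throttle * real_gain                   # what the modified car actually does
    model.update(throttle, measured)
print(f"learned gain after a few seconds: {model.gain:.2f}")  # converges toward 1.5
```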
There's apparently a crowd who thinks it's amusing to pretend to hit self-driving cars, just to try to force a reaction out of them.
The problem for these people is that your self-driving car has cameras everywhere, and if a stunt like this causes an accident, they'll have to pay for the damages. (Watch what your insurance cover says. For example, in Germany, one of the very few cases where third-party liability doesn't have to pay is when the damage was done intentionally.)
Secondly, the Apple vehicle was rear-ended. I am not aware of any vehicles that are programmed to lurch forward in the event of a potential rear-end collision. I suppose it is possible, but that would be problematic in its own way, for various reasons, as the path ahead would need to be 100% clear.
What would a good driver do? If there is a road full of traffic ahead, slam on the brakes before you get hit, to avoid being pushed into traffic. If the road ahead is empty, accelerate. When in doubt, make sure the guilty party gets damaged and not innocent third parties. (That's nowhere in the traffic rules, but it's in my rules.)
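Spelled out as code, the rule is tiny (Python; the thresholds are invented, just to make the logic explicit):

```python
# The "about to be rear-ended" rule spelled out. Gap in metres; thresholds invented.
def rear_end_response(gap_ahead_m, cross_traffic_ahead, time_to_impact_s):
    if time_to_impact_s > 2.0:
        return "hold"            # no imminent threat, nothing special to do
    if cross_traffic_ahead or gap_ahead_m < 5:
        return "brake hard"      # don't let the hit shove you into other people
    return "accelerate"          # road ahead is clear: open the gap, soften the hit

print(rear_end_response(gap_ahead_m=3,  cross_traffic_ahead=True,  time_to_impact_s=1.0))  # brake hard
print(rear_end_response(gap_ahead_m=50, cross_traffic_ahead=False, time_to_impact_s=1.0))  # accelerate
```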
So, the traffic lights go wrong and are stuck on red during rush hour. The driverless car sits there for the next 3 hours whilst 10 miles of traffic back up behind it, as it can't break the law and go through a red light.
I don't know the US rules, but in other countries _the law_ says that after a certain time, you can assume that the light is defective and go. Very carefully, because you must assume that traffic from the left and right has a continuous green light and doesn't expect to have to stop.
Well, it did say the car was merging into another road at the time.

PS please stay civil

At ONE MILE PER HOUR. That wasn't merging; that was just short of a complete stop. And don't post nonsense that you must KNOW is nonsense; then people will be a lot more polite.
The Apple car wasn't responsible for the crash, but who the heck merges at 1MPH?
Remember that the car was not driven by a human being but by a computer. A computer that knows the exact speed at the time of the accident. One mile per hour is your speed just a tiny moment before your car stands still after hitting the brakes, OR your speed just after you start driving from a standstill. Like if you are hit from behind one tenth of a second after the traffic light turns from red to green.

In the same situation, you wouldn't know your exact speed. This car does.
 