This rumour is simply ridiculous.

Siri can’t even find directions to the local coffee shop reliably; imagine sitting in the car for 30 minutes trying to tell it where to go…


On top of that, making a vehicle without controls is one thing; getting it approved to be put on our roads is another.

Just impossible.

Not even Tesla has a fully reliable Autopilot. I recently went for a drive with a friend who has a Tesla Model 3, and the car nearly missed a turn completely; it also speeds up and slows down abruptly for no reason. On top of that, good luck with traffic lights: in a 30-minute city drive he had to manually accelerate at a green light because the car was coming to a full stop.

This dream of autonomous driving is at least 15 years away in my opinion.
GM, Ford, VW, Tesla, Volvo, Continental, ZF, and plenty of other players in Japan and China are also in the game, and I think all of them would say you are wrong and point to how they are putting their money where their mouths are.

Apple has more R&D capital and arguably as much experience with key elements of autonomy as some of the companies above, if not more.

It may take 15 years, but think of how fast hardware and software are advancing in many industrial segments; you could easily be pessimistic by a factor of two.
 
I still don't believe that Apple will move to actual production. That is an enormous effort that is too far removed from their other product lines.


Also, I wouldn't buy a car without a mechanical steering interface and an accelerator/brake. I actually enjoy driving (not everyone lives in a flat grid of roads or congested cities). I bought my Tesla because it actually drives extremely well and provides the enjoyment of driving a small European sport sedan.
 
This is a dumb rumour. There will be no level 5 autonomous driving for quite a while; 4 years is unrealistic. There are just too many special cases that cannot be handled in any way without a manual override of some sort.

Also, it's likely a product people just don't want. I love the idea, but I wouldn't buy a car without a steering wheel... or at least a side stick!
 
And for a time it will be.

Those cars with legacy driver controls will likely have to install a beacon (to alert the robot cars and the transportation grid that an irrational vehicle is in the area).

Human-piloted cars will pay much higher insurance premiums. Indeed, those hanging on to legacy tech and paying big premiums will be like profit life support for insurance companies.

Later, after the vast majority of the fleet is autonomous, we will see the dismantling of street signs and traffic lights, and carbon-piloted cars will need both a beacon and some kind of display to give their pilot the equivalent of the information provided by today's signage.
Sounds like iHell. No thanks.
 
To those in favour of this nightmare fuel... what does this technology - assuming it actually works - allow you to do? What benefit does it provide? To the average person who drives maybe an hour or two each day... what do they do with this newly freed-up time? They're still in a car, so it's not like they can go for a run or ride a bike to get some fresh air and a workout. The activities available in a car are going to be pretty limited, no? So basically, as somebody else pointed out, it frees up more time to sit on your phone watching Netflix/scrolling Facebook/liquifying your brain watching TikTok? Sadly, I don't see any of those things as a positive!

Now if a technology came along that allowed us to ACTUALLY get that time back in a way where we could get outside and do something with it... then THAT would be worth considering. All this is is a transfer of our attention from the road to the phone... and I see that as a net loss to humanity... a HUGE one! At least when you're driving you are actually looking at the real world, and you might notice some of the beauty of nature... or a piece of unusual architecture... or something that isn't the ever-widening, gaping mouth of the proto-dystopia of the Metaverse.

Drinking. Eating. Screwing. Playing video games. Sleeping. It will be awesome!
 
Well, I was making some jokes in that post.

But every time you fly, probably about 85% of the flight has the plane in autopilot mode. You've been trusting software to make life-and-death decisions for you just about every time you've flown anywhere.

I fully grasp the concerns in this thread. It seems wondrous... especially by 2025 (if that happens). But it seems that we all have only one vision of this: it splattering us as soon as we give it a try. New technology doesn't automatically fail with deadly results in its first releases to the public. How many of us were killed by iPhone 1, iPad 1, Watch 1, M-Series Mini 1? Harmed? Slightly injured?

How many new models of airplanes have you boarded and flown without giving a thought to all the brand-new technologies in play? Brand-new train/subway? Brand-new monorail? Bullet trains?

None of that says this is all a slam dunk. But it's also not a raging disaster that will kill all who try it.
Every time I fly, there's a pilot behind physical controls overseeing the autopilot's operation. I'm not trusting software, I'm trusting someone who busted their ass studying aviation and is able to take over if there is some malfunction.
 
2025 is a hypothetical date. Remember, Apple has been testing these autonomous cars in California for years. The question is how far along they are.
Also, my thought is that they are creating the autonomous system for other car manufacturers to use.
That makes sense, as they would sell more autonomous systems that way.
Time will tell.
 
As someone who is driving Tesla's Full Self-Driving Beta... all I can say is no. I realize it's two different companies but the chances of having a car with no means to take over in unusual/unsafe situations is just unrealistic in this decade. As much as I want it to be true, there's no way, yet.
Same. Hard pass. Re: Tesla's Full Self-Driving beta - I rarely use it now, and it's presently a waste of money. I can go from A to B faster when I control the car. Self-driving cars are still just a beat or three too slow. This is BEFORE we even get to safety. I've had the car in AP on the highway slam on the brakes because it saw phantom objects.

This is a pipe dream for the next 15 years, unless there is an incredible leap forward in processing, as I think that is the bottleneck. This is why Tesla people are hearing rumblings of a new self-driving chip when the third version is still relatively new.
 
When you hear what's actually involved in level 5 autonomous driving, the article should read "for launch in 2055."

Here's a great Lex Fridman podcast with one of the top Waymo engineers. You can learn what makes this problem space so challenging and learn why Apple's 50-vehicle "fleet" has no chance.

"learn why Apple's 50 vehicle "fleet" has no chance"

This is sort of unfair. Apple has more $ than anyone and, more importantly, you have no idea what is actually happening behind the scenes.

C-suite executives know more than we MacRumors readers do.
 
Do people think that any autonomous driving system will ever beat a 100% capable human driver who is not under the influence of any substance, fatigue, etc.? I mean, without the monitoring assistance of some 6G-7G electronic grid overlay... which is, by the way, what Toyota is hinting at.



Personally, I seriously doubt this. Why? Let me put it this way. Billions of brain cells, multiplied by tens of thousands of synapses each, give the individual human brain more instant connections than there are stars in 20 to 40 thousand galaxies. Housed in a roundish ‘cockpit’ capable of swiveling (the human head), equipped with two amazingly effective optic and hearing sensors (eyes and ears) and a hard drive full of constantly updateable learning material, topped off by human intuition, they enable us to make split-second reference to what we see, hear and feel when driving. Hard to beat those, if ever. And this is the first of the FIVE fundamental issues I have with AV systems...
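For what it's worth, that comparison is roughly in the right ballpark. A quick Python back-of-the-envelope, using rough, commonly cited figures that I'm assuming here (they are not from the post above):

neurons = 86e9                  # roughly 86 billion neurons in a human brain
synapses_per_neuron = 1e4       # on the order of 10,000 synapses per neuron
connections = neurons * synapses_per_neuron    # ~8.6e14 synaptic connections

stars_per_galaxy = 2.5e10       # rough average if the many small dwarf galaxies are counted
print(connections / stars_per_galaxy)          # ~34,000 -> tens of thousands of galaxies' worth of stars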
 
I think everyone is going to be very surprised when they discover that Apple's car is not meant for the masses, but is instead purposed for more of a cab-like use in specific urban areas.
 
I'd say the opposite: There is no way that I am going to get into a "fully" self-driving car that still has a steering wheel and pedals, or anything else that could make me responsible for not intervening "in an emergency". If Apple is selling a system as "self-driving", then they should have the confidence to be legally in charge of the vehicle at all times, and responsible for the consequences of any failure. I expect to be able to legally ride in it while drunk, texting, reading a book or not wearing my glasses. I don't expect to need a driving license, and certainly not liability insurance... Or, more precisely, even if I am Mr Responsible and stay sober and attentive at all times, there will be an army of morons who treat self-driving as an opportunity to install a beer fridge and play VR Kandy Crush.

Even if it just suddenly stops and needs to be moved to a place of safety, I don't want to be rudely awakened to find that my first bit of real driving since buying an autonomous car is extricating it from the fast lane of a 12-lane highway from a dead stop, because that's exactly when bad accidents happen. The software has to "fail safe" in that situation. I don't know what the US driving test is like, but here in the UK I'm pretty sure that anything resulting in the examiner having to take over control of the car would probably be a "fail" - and I don't want to be driven around by an entity that couldn't pass a driving test.

Full self-driving isn't ready until it is ready - and as well as a lot of fine-tuning of the tech, that's probably going to take legal changes, and probably infrastructure improvements too. I'm not holding my breath.

Thing is, people talk as if humans are "bad drivers". They're not - they're incredibly good drivers, especially when they obey the rules, stay sober, etc. The problem is that it is a stupidly dangerous activity to expect people to perform with minimal training, as an incidental part of their daily lives, when they have 101 other things to worry about. We wouldn't let someone fly a plane with that negligible level of training and monitoring, yet pilots aren't continuously passing within inches of other planes travelling at speed in the other direction... If you want to make driving safer, create driver-assistance devices that monitor the driver on simple, objective things like speed, proximity and lane keeping (all things that exist - though lacking the crucial 1,000-volt shock to the butt to stop drivers getting complacent and relying on the assistance... but seriously, research into unintended consequences is also essential) and provide a back-up against the driver making a mistake.

What you don't do is put in an FSD system which almost works until something unexpected happens and then expect the human to stay alert (while doing nothing) so that they can intervene at a second's notice - possibly when the self-drive has already dumped them in a sticky situation (...and then try to crowdsource the beta testing... *cough*esla*cough*).
Yeah, trust Apple, so that when you get into an accident and die you can shout from the grave, "You said this would never happen." It is dumb to not have a fail safe in a car (no matter how high-tech they claim it to be) that could let you intervene and save your life if something goes wrong.
 
Yeah, trust Apple, so that when you get into an accident and die you can shout from the grave
I absolutely trust Apple to minimise their own legal liability.

If the car has a steering wheel, they can do that by holding me responsible for intervening in an emergency... and even if I'm "shouting from the grave", remember that it is your future insurance premiums that will be covering the cost of the school bus that my car mistook for an exit ramp.

If the car doesn't have a steering wheel, they'll have to do that by making the car as safe as possible - or not making it at all if they can't make it safe.

...and remember the "me" here isn't personal - it's every other idiot on the road...

It is dumb to not have a fail safe in a car
Expecting the driver to take over in an emergency isn't "fail safe" it is "fail dangerous". If a self-driving car needs to rely on that, then it's not ready for the road.

If you expect the driver to react while the car is moving - think again. Even in the unlikely event that they're sitting there alert, sober and ready to intervene, they don't just have to anticipate the danger - they have to anticipate that the software is going to fail. Sorry, just let me drive while an indefatigable bit of software watches my back, not vice-versa. If there's a significant chance of that happening - or of the software giving up and failing "safe" by stopping in the middle of a highway - the car shouldn't be on the road. End of.

A year after these cars (hypothetically) come on the market, plenty of people will be operating them who haven't driven a yard since someone handed them their license. Nobody is going to accumulate any "road sense" - and you're planning to suddenly wake them up and dump them in the most dangerous situation that most regular drivers will ever face?

I'm not talking about zero danger - of course, it has to be proportional. Even the best driver in the world in their regular manually-driven car could be toast if there's a sufficiently serious and sudden mechanical failure. There will always be a risk - but the acceptable level of risk, and the cost of insuring against it, should belong to the carmaker. That steering wheel (and, maybe, current law) is making you responsible. I'm not taking responsibility for Apple/Tesla/Google's software testing.
 
Do people think that any autonomous driving system will ever beat a 100% capable human driver who is not under the influence of any substance, fatigue, etc.? I mean, without the monitoring assistance of some 6G-7G electronic grid overlay... which is, by the way, what Toyota is hinting at.



Personally, I seriously doubt this. Why? Let me put it this way. Billions of brain cells, multiplied by tens of thousands of synapses each, give the individual human brain more instant connections than there are stars in 20 to 40 thousand galaxies. Housed in a roundish ‘cockpit’ capable of swiveling (the human head), equipped with two amazingly effective optic and hearing sensors (eyes and ears) and a hard drive full of constantly updateable learning material, topped off by human intuition, they enable us to make split-second reference to what we see, hear and feel when driving. Hard to beat those, if ever. And this is the first of the FIVE fundamental issues I have with AV systems...
This phenomenon is noticeable when using Tesla FSD. The human mind is much quicker and more decisive.
 
I absolutely trust Apple to minimise their own legal liability.

If the car has a steering wheel, they can do that by holding me responsible for intervening in an emergency... and even if I'm "shouting from the grave", remember that it is your future insurance premiums that will be covering the cost of the school bus that my car mistook for an exit ramp.

If the car doesn't have a steering wheel, they'll have to do that by making the car as safe as possible - or not making it at all if they can't make it safe.

...and remember the "me" here isn't personal - it's every other idiot on the road...


Expecting the driver to take over in an emergency isn't "fail safe" it is "fail dangerous". If a self-driving car needs to rely on that, then it's not ready for the road.

If you expect the driver to react while the car is moving - think again. Even in the unlikely event that they're sitting there alert, sober and ready to intervene, they don't just have to anticipate the danger - they have to anticipate that the software is going to fail. Sorry, just let me drive while an indefatigable bit of software watches my back, not vice-versa. If there's a significant chance of that happening - or of the software giving up and failing "safe" by stopping in the middle of a highway - the car shouldn't be on the road. End of.

A year after these cars (hypothetically) come on the market, plenty of people will be operating them who haven't driven a yard since someone handed them their license. Nobody is going to accumulate any "road sense" - and you're planning to suddenly wake them up and dump them in the most dangerous situation that most regular drivers will ever face?

I'm not talking about zero danger - of course, it has to be proportional. Even the best driver in the world in their regular manually-driven car could be toast if there's a sufficiently serious and sudden mechanical failure. There will always be a risk - but the acceptable level of risk, and the cost of insuring against it, should belong to the carmaker. That steering wheel (and, maybe, current law) is making you responsible. I'm not taking responsibility for Apple/Tesla/Google's software testing.
Thinking that AI (new at that) in a car would be better able to respond and react to things is laughable at best. But yeah, you can put your life in Apple's hands; I (like many others) would prefer some way to intervene if the need arose. Everyone sees how terrible Siri is, and you really think Apple's AI is going to be so groundbreakingly better that they could do this with no steering wheel or anything? LOL.
 
No steering wheel or pedals - wow! Who’s saving up for the Apple Car already?

Not me, but for sure a few wealthy crazy people will (and there are a lot of them, BTW). How are you going to park in a complex parking garage? How are you going to avoid turning into a street where it's forbidden, as Google/Apple Maps suggest all the time? How are you going to drive where roads aren't perfectly marked (the countryside, ...)?

-Siri, park over there.
-"Parking over there."
-No, no!! The other place!
-"Calling Thomas Lace."
 
Thinking that AI (new at that) in a car would be better able to respond and react to things is laughable at best.
Tech like proximity alerts, speed alerts, auto-stop and lane-keeping is already in production. It can be achieved by sub-systems that "do one thing well", measuring things that are easily measured, and it provides a second line of defence while the human driver concentrates on driving. It doesn't drive your car into a wall that the AI has mistaken for empty space.
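To make the "do one thing well" point concrete, here's a minimal Python sketch of one such single-purpose subsystem - a forward-collision warning based on time-to-collision. The 2.5-second threshold and the function names are my own illustrative assumptions, not anyone's actual implementation:

def time_to_collision(gap_m, closing_speed_mps):
    # Seconds until impact if nothing changes; effectively infinite if we aren't closing.
    if closing_speed_mps <= 0:
        return float("inf")
    return gap_m / closing_speed_mps

def forward_collision_warning(gap_m, closing_speed_mps, threshold_s=2.5):
    # Alert the driver (or trigger an auto-brake stage) when the gap is closing too fast.
    return time_to_collision(gap_m, closing_speed_mps) < threshold_s

print(forward_collision_warning(30.0, 15.0))  # 30 m gap, closing at 15 m/s -> True (2.0 s to impact)

It measures one easily measured thing, does nothing else, and sits alongside the human driver as a second line of defence.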

Everyone sees how terrible Siri is, and you really think Apple's AI is going to be so groundbreakingly better that they could do this with no steering wheel or anything? LOL.
This has nothing to do with Siri, and it's not gonna be sketched out by Tim and Craig on a napkin.

But anyway, you're missing the point. Of course it's hard to do with no steering wheel - and colour me skeptical that it's happening any time soon. What I'm saying is that if an autonomous car still needs a steering wheel because it relies on a driver to keep it safe, then it's not ready for the road. *harrumph*esla *graaaggh*utopilot. (that's a nasty cough...) The steering wheel is a cop-out that lets the carmakers have drivers beta-test their AI at their own risk.

If you sell a self-driving car, then irresponsible people are going to drive it drunk, while texting, sleeping, etc., whatever wise words are written in the guidelines. Responsible people are going to realise that it's hard enough to stay focussed when you're actually engaged in driving - trying to maintain the same level of concentration when you're not driving, just in case you need to intervene, just ain't gonna work.
 
I would much rather see improved driver-assisted driving. Computers can do wonders in braking - anti-lock brakes, for example; even my motorcycle has them, and they have saved my ass more than once. Do you think fighter jet pilots actually fly the plane? They merely steer it where they want to go; the computers are constantly correcting critical functions that a human couldn't possibly manage at the speeds they fly. The only autonomous vehicles I have been in are the trams at the airport, and if anything landed on the dedicated roadway, I don't think they could react to it.
 
This stuff is pretty sci-fi. I'm sure there'll be a lot of hiccups, and unfortunately the talking heads will pounce on it each time there is an accident, even if it turns out that self-driving cars may well be many times safer than regular cars.
 