This is simply not true. Self-driving vehicles have already happened. The problem is that when you load them up with all the features necessary to actually be safe enough without a driver (lidar, advanced mapping, etc.), they are far too expensive and unwieldy. This is why Musk wants to solve the problem using only vision, front radar, and ultrasonic sensors (I'm not 100% convinced that's the best route, but he has a point).

I do agree that Tesla vehicles with the existing Autopilot are very dangerous, mainly because their drivers rely too much on it, as demonstrated by the handful of Autopilot fatalities. However, Tesla now has a massive fleet of vehicles gathering real-world data from all over, and with advancements in continuous machine learning, they actually have a compelling case for reaching their goal of full autonomy, provided they don't go bankrupt. I have a feeling they'll have to add a few more sensors/cameras before they actually get to that point, though.


You missed the most important one: AI.
Watch the middle section of the Tesla Autonomy Day video on YouTube.
They're years, no, decades, ahead of anyone else. Including the armchair quarterbacks here.
 
I disagree that we are many decades away. The flaw in your logic is the assumption that technology development is linear. The advancement of technology is very much non-linear, more like exponential, because technology begets technology. The advances that took the airline industry 80 years will not take the auto industry the same 80 years. Did the airline industry have advanced neural nets and massively parallel processing on dedicated chips at the beginning of those 80 years? Did the airline industry have a fleet of vehicles feeding millions of miles of telemetry data, per day, to an advanced neural net? I'll concede that replacing a car's driver in all conditions is substantially more difficult than the aircraft autopilot you describe, but it will not take many decades to reach the goal.

I will concede that the timeframe may not be 80 years, and I sincerely applaud your recognition of the complexity of replacing a human behind the wheel of a car in all conditions … it is a very complicated problem to solve. I also appreciate your logical and thoughtful post; we don't always see that on this forum.

One of my planks is that too many people don't fully comprehend or appreciate how sophisticated the human brain is and the kind of incredibly complicated processing that goes on when a human is operating an automobile. OK OK OK folks, yes, there are some real morons that sit behind the wheel, but their brains are still doing quite a bit of processing, maybe not always smartly, but processing nonetheless. Couple that with self-serving pundits like Elon Musk who love to spew garbage like "hey everyone, a self driving car is just around the corner," and with the news media and uninformed celebrities telling us that self driving cars will be out "next year," and you have a frenzy drinking up the self driving car Kool-Aid (I think it's now an actual flavor).

Here is just one example of a complicated situation that our brains resolve as a kind of "second nature" for good drivers. Imagine you are driving in a residential neighborhood that you are very familiar with, and let's say you are driving the legal/appropriate speed limit of 25 miles/hour. As you drive along, you notice a few more cars than normal parked along the street, and you can see some party balloons several hundred yards ahead. You also know the neighborhood has a few children, and you deduce that there is a party going on up ahead, maybe a birthday party. You are a smart/experienced driver and might slow down as you approach the home you think is having the party, to maybe 15 miles an hour, like in a school zone. You do this because, in addition to the party balloons and extra cars on the street, you also notice some parents walking small children, let's say about 8 to 10 years old. You also know from experience that kids, especially this young, dart in and out of streets all the time, so even though 25 miles/hour is legal, you are going to be extra safe and slow down. Your brain might process all this in a few seconds. Think about the incredibly complex set of logic rules your brain processed to signal you to slow down even though you were going the legal speed limit. Think about the kind of sensors and software needed to reach the same conclusion.
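Just to make that concrete, here is a toy sketch of those rules in code (the cue names, thresholds, and speeds are all made up for illustration; this is not anyone's actual system). Even this trivial version assumes every cue has already been detected perfectly from sensor data, which is exactly the hard part:

```python
# Purely illustrative: hand-written rules for one specific scenario.
# Detecting these cues reliably from raw sensor data is the real problem.
from dataclasses import dataclass


@dataclass
class SceneCues:
    posted_limit_mph: int
    extra_parked_cars: bool       # more cars along the curb than usual
    balloons_ahead: bool          # party decorations visible down the street
    parents_with_kids: bool       # adults walking small children nearby
    known_residential: bool       # neighborhood known to have children


def choose_speed(cues: SceneCues) -> int:
    """Pick a target speed in mph from a handful of hand-written rules."""
    speed = cues.posted_limit_mph
    party_likely = cues.extra_parked_cars and cues.balloons_ahead
    if cues.known_residential and party_likely:
        speed = min(speed, 20)    # something unusual ahead: ease off
    if cues.parents_with_kids:
        speed = min(speed, 15)    # kids may dart into the street
    return speed


if __name__ == "__main__":
    scene = SceneCues(posted_limit_mph=25, extra_parked_cars=True,
                      balloons_ahead=True, parents_with_kids=True,
                      known_residential=True)
    print(choose_speed(scene))    # -> 15
```

And that covers exactly one scenario, coded by hand after the fact.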

Now on the other side of the equation, I wholeheartedly admit there are plenty of stupid things humans do behind the wheel that a well engineered self driving car won't do, so yes, self driving cars will also save lives. I just wanted to point out some of the really complicated scenarios a self driving car will face in the "real world".

I think there are hundreds if not thousands of examples like this. I am NOT saying a self driving car is impossible; in fact, I believe 100% they will happen eventually. I just continue to say that it's a very complicated problem to solve and the process will be long, filled with setbacks, injury, and death as we march toward artificial intelligence, which is not a matter of if but a matter of when.
 
Apple will develop the self driving tech part along with a car subscription model of some sort.
Need a ride to & from work every day? The ride comes to you on time and drops you off at work.
Then scoots off to another subscriber who needs a ride...

Lots of people can get by without a car, or more than one...
 
If Apple someday releases its own car, it will not be a car as we understand it today.

I think it will be something like a personal transportation device, maybe only for one or two people with some baggage.

Something like the Sinclair C5, but updated to today's design.

ps: But don't expect a full self-driving car (SAE level 5) in the next few years; it will be done, but not in the near future!

ps2: I personally think that everything below SAE level 5 is just playing with people's lives and you are just paying to be a company's test bunny. Companies and CEOs that claim FSD is safe should be legally and criminally responsible if a car is in an accident or kills a person. Saying something is "in beta" is just stupid and wrong, and expecting a human to automatically take control of such a car is just as stupid!

ps3: Plus there are many privacy, hacking, and other concerns that will have to be resolved before people will use an FSD car. The main one for me is who is responsible if such a car is in an accident; am I as the user to blame?
One more thing: the "car" that will be sold by Apple will not be manufactured by Apple; it will probably be built by Magna or a similar company.
 
I didn't fall for anything; I left it pretty ambiguous as to whether they do or do not have higher than usual fatality rates.

If you know of clear statistics, please share. I know fatalities and incidents are tracked overall, but breaking them down by make/model, options like Autopilot, whether those options were engaged, cause/fault, etc. seems almost impossible to do with any statistical significance. Then there's debate over how to measure and compare, whether by miles driven, vehicle years, etc. I'm definitely all ears if you know more, though.


Here's a basic article that contains links to other sources as well to get you going. As pointed out, the wonderful thing is that Teslas are connected, so Tesla gets unparalleled accident data immediately and knows whether the car was in Autopilot mode or not. Concomitantly, there is a huge amount of media coverage for every Tesla accident that results in a fatality. Remember, the real issue is: "Are cars driven on Autopilot more dangerous than cars that are entirely controlled by humans?" The most widely accepted metric, and one gathered by the NHTSA, is the number of fatalities per miles driven.

https://electrek.co/2018/10/04/tesla-first-vehicle-safety-report-media-coverage-crashes/
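For anyone who wants to see what that metric looks like in practice, here's a quick sketch. The fleet figures below are placeholders I made up; the only real number is the rough NHTSA baseline of a bit over one fatality per 100 million vehicle miles for US driving overall:

```python
# Illustrative only -- the fleet numbers are placeholders, not Tesla's or NHTSA's data.
# The point is simply how a per-mile fatality rate is computed and compared.

def fatalities_per_100m_miles(fatalities: int, miles_driven: float) -> float:
    return fatalities / (miles_driven / 100_000_000)

# Hypothetical fleet: 5 fatalities over 1.2 billion miles driven on Autopilot.
autopilot_rate = fatalities_per_100m_miles(5, 1_200_000_000)

# Approximate US human-driving baseline (assumed here, ~1.1-1.2 per 100M miles).
human_rate = 1.15

print(f"Autopilot (hypothetical): {autopilot_rate:.2f} fatalities per 100M miles")
print(f"Human baseline (assumed): {human_rate:.2f} fatalities per 100M miles")
```

Of course the comparison is only as good as the miles and fatality counts you can actually attribute to each mode, which is where the apples-to-apples problem comes in.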
 
I’m not so fond of self-driving cars.
HAL 9000: "I'm sorry Dave, I can't drive that." (drives off the pier)
 
You missed the most important one: AI.
Watch the middle section of the Tesla Autonomy Day video on YouTube.
They're years, no, decades, ahead of anyone else. Including the armchair quarterbacks here.
I mentioned machine learning (which is AI). In my first paragraph I was just talking about hardware sensors.

I watched the whole thing live. I totally agree they are light years ahead of the competition. I still think they may need a few more sensors in the end to achieve full autonomy (level 5 or whatever), but I do think they can definitely do it without lidar. I say this not as some fan or investor, but as an electrical engineer with some experience in robotics and electric vehicles.
Here's a basic article that contains links to other sources as well to get you going. As pointed out, the wonderful thing is that Teslas are connected, so Tesla gets unparalleled accident data immediately and knows whether the car was in Autopilot mode or not. Concomitantly, there is a huge amount of media coverage for every Tesla accident that results in a fatality. Remember, the real issue is: "Are cars driven on Autopilot more dangerous than cars that are entirely controlled by humans?" The most widely accepted metric, and one gathered by the NHTSA, is the number of fatalities per miles driven.

https://electrek.co/2018/10/04/tesla-first-vehicle-safety-report-media-coverage-crashes/
I think I remember seeing that when it was released. However, it's much harder to compare apples to apples, since I believe the aggregate NHTSA numbers include incidents involving motorcycles, pedestrians, etc. Obviously a Tesla with advanced safety features is going to be safer than the average motor vehicle of any type on the road, but I've also seen some research suggesting Teslas have higher fatality rates than other vehicles in their class. At that point, though, it's pretty much splitting hairs. We all know they are quite safe when driven responsibly. :D
 
Machine learning is the current hype word, as cloud, cryptocurrency, etc. were before it.

The whole problem with ML is that you need good data to get good results.

A company can claim that they have trillions of miles driven by their AI, but if somebody does not go through that data and classify it as good, bad, etc., then all of those claimed miles are just PR BS.

And NO, shadow mode isn't the same: if you do not know and check why the human braked, turned, etc., then you cannot use that data for the model.

Garbage in, garbage out!
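
To put the same point in code, here is a minimal sketch (every field name and label below is made up) of why unreviewed fleet miles don't automatically become training data:

```python
# Hypothetical fleet log: only events that were actually reviewed and labeled
# can be used to train a model; the rest is just a big mileage number.
from typing import Iterable, List

VALID_LABELS = {"good_brake", "bad_brake", "good_turn", "bad_turn"}


def usable_examples(logged_events: Iterable[dict]) -> List[dict]:
    """Keep only events whose cause has been reviewed and labeled."""
    return [e for e in logged_events if e.get("label") in VALID_LABELS]


raw_fleet_log = [
    {"miles": 12.0, "event": "brake", "label": "good_brake"},
    {"miles": 8.0,  "event": "brake", "label": None},   # never reviewed
    {"miles": 30.0, "event": "turn",  "label": "bad_turn"},
    {"miles": 55.0, "event": "turn",  "label": None},   # never reviewed
]

training_set = usable_examples(raw_fleet_log)
print(f"{len(training_set)} of {len(raw_fleet_log)} logged events are usable")
# All the miles still count toward the headline number, but half the events
# teach the model nothing.
```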

The problem with FSD is not just the speed of the chip, but the data and software for the AI, and currently we do not know what we will need for FSD (SAE level 5).

Maybe just a Celeron chip will be enough, maybe a special ASIC board, maybe just radar, maybe just a camera, maybe lidar, maybe all of them; until we get FSD (level 5) we just do not know what is needed!

First will probab
 
Don't fall for the fake news. It's not hard to prove. Fatalities are tracked 100%, so there are great statistics/evidence.

That's not how this works. The main technology is software, which will be updated over the air, as Tesla does. Not much else will distinguish cars except design, which is where Apple will rule once again.

LOL.
I'm confused...what's funny?
 
LOL at all of the AI/machine learning people here.

Our current "AI" is just a bunch of glorified if-else statements, which is why all of these systems suck at anything unaccounted for in the if-else. What all of this "AI" is really geared for is big data collection and analysis designed ultimately to keep the populace in check.

Before they do self-driving cars, maybe they should get the phone support prompts right.
Human arrogance is always off the scale. Less than 100 years ago people thought wearing uranium around your neck was beneficial; imagine what people 100 years from now will think of our arrogance (if we don't become sterile from 5G, that is).
 
None of this made an iota of logical sense... apart from the last sentence.
 
I mentioned machine learning (which is AI). In my first paragraph I was just talking about hardware sensors.

I watched the whole thing live. I totally agree they are light years ahead of the competition. I still think they may need a few more sensors in the end to achieve full autonomy (level 5 or whatever), but I do think they can definitely do it without lidar. I say this not as some fan or investor, but as an electrical engineer with some experience in robotics and electric vehicles.
I think I remember seeing that when it was released. However, it's much harder to compare apples to apples, since I believe the aggregate NHTSA numbers include incidents involving motorcycles, pedestrians, etc. Obviously a Tesla with advanced safety features is going to be safer than the average motor vehicle of any type on the road, but I've also seen some research suggesting Teslas have higher fatality rates than other vehicles in their class. At that point, though, it's pretty much splitting hairs. We all know they are quite safe when driven responsibly. :D


People need to keep their eye on the ball, which is whether self-driving cars will be safer than humans driving. Currently, humans driving cars kill about 1.5 million people a year through driving drunk, medical issues, driving distracted, stupidity, and simple human error, and up to 50 million are injured or disabled! Many billions in damage occur. Aside from the empirical data, intuitively and through common sense we also know that cars on autopilot won't have zero deaths or injuries, but they will never come anywhere near the level of carnage that humans cause.
Well, they did try to buy Tesla, so I can see why people think they want to get into that business.


No, there is zero evidence that Apple wanted to buy Tesla. Just unsubstantiated rumors from unidentified sources. Ahh, the sad state of journalism.

The first mistake people make is assuming that Apple's interest is in selling an electric car. Yes, Tesla has a head start, but that's all Tesla has achieved thus far, and there are many, many much larger manufacturers now entering that field to compete with Tesla. Unlike Tesla, they have existing money-making car businesses to leverage for a gradual entry into the tough economics of electric vehicles. Even Musk now admits Tesla is in danger of going bankrupt.

Instead, the reason it's extremely unlikely that Apple ever had an interest in purchasing Tesla's money-losing electric car business is that Apple would have no interest in producing electric cars, for the above reasons. Apple's interest undoubtedly lies in a whole new experience built on their mastery of design and engineering, e.g., producing a car with advanced technology, including some type of autopilot. For that, they don't need Tesla's technology; they can and are developing it on their own. Tesla is just using off-the-shelf tech available to anyone (as evidenced by the fact that nearly every major car company now puts cameras, radar, etc. on nearly every new vehicle to perform basic "self-driving" functions such as adaptive cruise control, lane keeping, forward collision avoidance, automatic braking, etc.).

Further, there's little chance Apple would have any interest in maintaining repair shops, stocking parts, etc. The most likely scenario is that Apple is designing something completely different and will partner with a company, e.g., their soon-to-be-implemented partnership with Volkswagen, to produce the special Apple Car. It's also likely that Apple has no interest in selling this car to consumers, but rather to companies that will use them in fleets for Uber-like services, corporate campuses, shuttles, etc.
 