Autonomous driving is so new that it isn't even here yet, but that isn't stopping it from improving safety:
https://dzone.com/articles/how-autonomous-vehicles-could-improve-road-safety
Since this software and hardware isn't even at Alpha stage, let alone Beta, and it's already having this effect, it's mind-boggling that anyone could think this won't be taking over in the not too distant future. At this point it's just a list of engineering problems to be solved, and not a very long one.
 
Dude, I am very well aware of the systems in my car; I even do software coding for my own car, in the car's original systems. Do you do that? What is "electric timing"? I've never seen a car trying to pace the electrons. That would be cool tech, but I'm not sure why you'd want it. Do YOU even know how engine management systems work at all? It doesn't seem that way (beyond what you just read on Wikipedia).

No, Porsches and BMWs do not have more lines of code; they mostly use off-the-shelf Bosch or Delphi systems like the ME9, just like so many other cars from the cheapest to the most expensive.

Honestly, what are you yapping about? We're not talking general electronics here, and I can assure you that the ECU is by no means the most complicated piece of electronics in a car.

In conclusion: yes mister, I'm very well aware of the intricacies of engine management systems. That, though, is not what this thread is about. It's about systems that assist the driver (you know, the consumer) in operating the vehicle, and more specifically cars that can do it "all by themselves", a.k.a. "autonomous cars".

lol, this conversation is over after my reply. I don't trust you one bit if you think the ECU doesn't offload work from the driver, nor do I think you know what you're talking about (much less that you're coding cars LOL).

By the way, if we want an extreme example, the new Porsche Panamera has millions more lines of code than its predecessor because it's offloading engine control/performance, chassis control/movement, torque vectoring, suspension control, etc., all onto its computers in real time, reacting to sensors left and right. These, my friend, DO offload driving work from the driver (including the ECU/EEMS). The driver does not have to worry about the fuel-to-air ratio to get optimal performance. The driver does not have to worry about the engine choking because fuel rushes in too quickly at a hard press of the gas. And the driver does not have to worry about torque vectoring in tight turns. Are these all necessary for daily driving? Not all the assistance features, but they all play an integral part in making a solid car.
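To make that concrete, here's a toy Python sketch of the kind of closed-loop correction an engine management system runs constantly so the driver never has to think about fueling. The 14.7:1 target is the usual stoichiometric ratio for gasoline, but the function name, controller gains and sensor numbers are all made up for illustration; this is not any manufacturer's actual code.

# Toy illustration only: nudge injector pulse width toward the
# stoichiometric air/fuel ratio (~14.7:1 for gasoline) using a simple
# proportional-integral correction, the way an EMS fueling loop might.

STOICH_AFR = 14.7            # target air/fuel ratio
KP, KI = 0.02, 0.005         # made-up controller gains
integral_error = 0.0

def correct_pulse_width(base_pulse_ms, measured_afr, dt):
    """Return an adjusted injector pulse width so measured AFR tracks the target."""
    global integral_error
    error = measured_afr - STOICH_AFR        # positive = lean, negative = rich
    integral_error += error * dt
    # A lean mixture (too much air) lengthens the pulse, i.e. adds fuel.
    return base_pulse_ms * (1.0 + KP * error + KI * integral_error)

# Example: the O2 sensor reports a lean 15.2:1 mixture on a 10 ms loop
print(correct_pulse_width(2.5, 15.2, 0.01))   # slightly longer than 2.5 ms

The real thing juggles dozens of inputs (airflow, coolant temperature, knock sensors, throttle position and so on), but the point is the same: the computer, not the driver, is doing that work.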

And what I mean by consumer-facing assistance systems is that the ECU/EEMS, chassis control, torque vectoring, etc. are not computerized functions the average driver can turn on or off, the way cruise control or lane departure warning can be. They're an integral part of how modern cars operate, whether or not the average driver notices or knows about them. Cruise control is something the average driver knows about and can turn on or off, which is why it is consumer facing.

The only thing missing now is the "autonomous" part, which to be honest is not that far off.

Your dismissal of the amount of technology that is already in a modern car is quite laughable. Autonomous technology is a challenge in its own right, I agree. But your idea that such tech is far away is even more laughable. Anyway, good day, and have fun "coding" your cars.
 
I love people with great conversational skills! I think you'd do best not to reply; you risk getting even more embarrassed.
I just copy/pasted some stuff from what I've been seeing on my computer:

SAAB COMPUTING PLATFORM IPL


Hardware platform: rev 9 64MB Ram + 32MB Flash

boot from OCP-Board


System RAM Data Bus OK
System RAM Address Bus OK
Q2SD RAM block 0x7B OK
System RAM block 0x7B OK
DSP RAM not OK
Starting Image ...
Found image @ 0x00200000
Jumping to startup @ 0x88008644


Welcome to Neutrino on SAAB

Startup bootimage /etc/rc ...

Application software version: OCP-SW 405 2.1.1 RC18 2004/09/09 17:39:37
Startup ProcMgr
Startup Photon
procman [pid: 28684] Logger activated


But I guess that is not proof enough for you. Feel free to google it though; I have not copied it from someone else. It's from the navi unit of a SAAB 9-3.
Modifying ECU parameters, aka "tuning", is easier to do but requires even more knowledge, about the engine rather than computer programming.

Yeah, the Bosch ME7 that the Panamera uses is a real high-performance system, used by other high-performance cars like the Citroën C3 and the VW/Audi 1.8T-engined cars, among others.

You tried to look smart and come up with some examples, but you came off embarrassed since you don't actually have any knowledge or experience.
 
I just copy/pasted some stuff from what I've been seeing on my computer:
I'm not taking sides here, but what you posted says almost nothing about the amount of code, since 64MB could hold millions of lines of code and 32MB of flash could hold it all if it's compressed. On the other hand, it might only have a few hundred lines, and the hardware was simply cheapest in that configuration because it's sold in high volume. There's just not enough information presented to know without using other data sources.
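Just as a back-of-the-envelope illustration, here's the rough arithmetic in Python. The densities below are assumptions; the real figures depend entirely on the toolchain and on how much of that flash holds data rather than code.

flash_bytes = 32 * 1024 * 1024       # the 32MB of flash mentioned above
bytes_per_compiled_line = 16         # assumed: roughly 4 RISC instructions per source line
bytes_per_source_line = 40           # assumed: average uncompressed C source line

print(flash_bytes // bytes_per_compiled_line)   # ~2.1 million lines as compiled code
print(flash_bytes // bytes_per_source_line)     # ~0.8 million lines stored as plain source

So millions of lines would fit comfortably, but a few hundred would fit just as well; the memory size alone tells us nothing.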

Also, I don't think there's enough evidence to say whether that device is actually assisting with driving the car in the background in any way. It may be (and likely is) only presenting information, which would make it moot for your argument, since it would be basically the same as a cell phone mounted on the dash giving turn-by-turn directions.
 
Why is the forum just so full of testosterone?
You got a problem with that? Do you?? WELL, DO YOU???
;)

It is a tech forum, and tech is and has been predominantly male. Maybe that's changing, but change is often slow. Accept us for who we are, including our flaws.
 
The code I copy/pasted was just to prove that I wasn't talking BS when I said I actually do some programming in my car's internal systems. It's nothing special; the navi actually runs the QNX RTOS as its base system.

I'm not really that full of testosterone; it's the day after the biggest party in Sweden, Midsummer. Testosterone levels are at an all-time low, I think, haha.

I just disliked being called a liar and having people think I was full of crap. :)
 
Pay a ******** of money to get a self driving car? Nah, if I promise my friend a beer or two, he will gladly take me wherever I want in my own car. So, I guess that's cheaper.
 
Pay a ******** of money to get a self driving car?
You posted that on a tech-related site? Really??? You don't think we know how tech prices go? OK, for anyone completely new to anything tech, here's how it works:

  1. New, never before seen item comes out. It's expensive. It becomes popular with people who can afford it.
  2. Somebody comes out with a cheaper version. Even more people buy it.
  3. Economies of scale are reached. Prices come down even more.
  4. Many people buy it.
  5. More economies of scale bring down the price even further.
  6. The product becomes a commodity. Practically everyone has at least one.
Now, as for the expense of self driving, the cost of the software is A) likely to drop quickly with high volume sales and B) likely to be offset by lower insurance rates. Bottom line: by the time you buy a car with self driving capability it probably won't be very expensive. It may actually cost less after adjusting for insurance than a car without it.
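For what it's worth, here's the back-of-the-envelope version in Python. Every number below is invented purely to illustrate the offset; none of them is a real price or premium.

option_price = 5000                 # assumed upfront cost of the self-driving option
insurance_saving_per_year = 600     # assumed annual premium discount for the feature
years_of_ownership = 8

net_cost = option_price - insurance_saving_per_year * years_of_ownership
print(net_cost)    # 200 under these assumptions, so the option nearly pays for itself

Change the assumptions and the conclusion changes, of course, but that's the basic shape of the argument.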
 
Self driving cars have a very long way to go before they reach "stage 6" in the above list, and I don't think cost will be the biggest hurdle to jump.

Firstly, the tech is by no means ready for prime time yet. There's a near-infinite number of situations that the tech has to be able to handle, and it will take years to accumulate the data for the systems to be reliable. This is probably one of the easier steps though; it's all about the quality of the sensors and the skill of the programmers.

The legal aspects. There will have to be changes in traffic law to apply to self driving cars. Who wants a car that, yes, drives on its own, but where the one in the "driver's seat" is responsible for whatever the car does? People may panic in a tight situation; unless the decision made in the heat of the moment is completely wrong or illegal, the driver will not be held responsible for the outcome. A self driving system, on the other hand, will not panic (unless it crashes ;) ) and may therefore be held responsible for what happens. And what car maker wants to be held responsible for death or injury to the occupants of the car or bystanders if the system makes a sketchy decision, or one that is thoroughly analysed after an accident and deemed wrong? Self driving cars will surely keep full logs (or be required to) of all sensor input and every action taken.

The ethics. What will the software be programmed to do in a situation that requires a decision involving the "sacrifice" of human life? Aim for the elderly lady to avoid hitting the child? Assuming it becomes legal in the US to have self driving cars, the company that makes the software driving the car will probably not be able to claim the algorithms are a trade secret when it comes to a court case where blame is to be decided.

Insurance. I'm not so sure it will lower insurance costs anytime soon; just think of the possibilities for blaming the car for an accident! And in other parts of the world, car insurance is not expensive at all. I pay about $500 a year for a pretty new car that I drive a lot every year; older cars are cheaper, like $200 or less.
 
Self driving cars have a very long way to go before it reaches "stage 6" in the above list, and I don't think cost will be the biggest hurdle to jump.
...
That reads like a list of rationalizations:
  1. Readiness: You don't need to outrun the bear, just the slowest in your group. The tech doesn't need to be perfect. It just needs to be better than humans, and it's already better in many cases. Soon it will be better in nearly all cases.
  2. Legal: When it is better in nearly all cases it will be allowed in some jurisdiction, somewhere. Not long after that, its safety record will cause other jurisdictions to feel enormous pressure to allow it just to avoid liability.
  3. Ethics: These decisions are currently made by humans, and they do a poor job of it. Again, you don't need to be faster than the bear.
  4. Insurance: I'm sure there are countries where you can drive without any insurance at all, but those countries aren't going to influence autonomous driving adoption rates. In the western world insurance is an important factor, and as autonomous driving is adopted, human driving will become expensive. Blaming the car will be futile, since it records everything; video and telemetry make much better witnesses than humans.
 
1. In this case, you need to be _a lot_ better than a human to be allowed to market the system. You see, the difference is that a human is inherently flawed, the world is set up accordingly, and the driver is always to blame for mistakes. A machine, on the other hand, has to be more or less perfect or it will not be allowed to make decisions that have direct life-or-death consequences. "Self-flying airplanes" have been around for at least 10 years now, accumulating immense flight hours, yet they are still not allowed to do all the flying, and not without supervision by a qualified pilot, and it's still the pilot in command who is responsible for the aircraft no matter how good the autopilot is.
2. That assumes the safety record will be impeccable. One accident where the autonomous system is even under suspicion and it will be pushed back for a long time.
3. As I said, humans are imperfect and the world works around it. There's no way to know exactly what caused an accident or exactly what decisions a human made. A self driving car's logs will be thoroughly examined in the event of a crash, and that will open up a totally different way of accident investigation. There's no way the courts will let that one slide, nor will the people affected by an accident.
Example: city driving, a child jumps out into the road. A human driver will act on instinct/reflex and perhaps avoid hitting the child, but might hit someone else. The driver is responsible; he or she could have driven slower and avoided the accident, or veered a different way.
If a self driving car is put in the same situation, what will the car decide to do? Who will be responsible for whatever action the car decided to take? Or will the court decide that the child that got hit was to blame, since the self driving car could have done nothing differently? I don't think so.
4. This depends a lot on number 3 and 2.
 
Honestly, your first two points indicate a profound lack of understanding of the basic math involved here. If self driving cars are ten times safer than humans (about where Elon expects them to start out), that means 4,000 deaths per year instead of 40,000. Not impeccable, but no jurisdiction will be able to ignore that. Your fourth point shows that you don't understand business any better than you understand math. The western world is where the money is, and business caters to the money.
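The arithmetic behind that, spelled out in Python (the 40,000 baseline is the approximate annual US figure already cited; the multipliers are just examples):

baseline_deaths = 40000              # approximate annual US road deaths

for safety_multiplier in (2, 10, 100):
    deaths = baseline_deaths // safety_multiplier
    print(safety_multiplier, deaths, baseline_deaths - deaths)

# 2x safer   -> 20,000 deaths, 20,000 lives saved per year
# 10x safer  ->  4,000 deaths, 36,000 lives saved per year
# 100x safer ->    400 deaths, 39,600 lives saved per year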

As for point 3, insurance and the courts will handle the odd case, just as they do now. There will simply be far fewer of them.
 
And being held responsible for 4,000 deaths a year is no problem for companies, you mean? Exactly how is that business model gonna work? Or do you expect laws to change to assume self driving cars are flawless and always without blame? Or that the "driver", who has nothing to do with the car's actions, would still be responsible?
That is not gonna work outside the US. At least in my country, even if you kill yourself in your car, it will be decided who was to blame for the crash, in court if needed. I do not expect that to change anytime soon.

We might end up where driving your own car is regarded as unsafe and even unlawful; I do not refute that, on the contrary. But cars, the traffic situations involved, and the risks to your fellow man while operating a car are a lot more complicated than self driving trains, airplanes and boats. I expect those forms of transportation to become autonomous before cars do (on a large scale), and that is purely down to math! What airline wouldn't want pilotless aircraft? Can you imagine the money involved if a company got software to power autonomous planes approved? That company could easily outperform Apple!

For now, the autonomous part of cars will be just like cruise control was years ago: a luxury assistive system in expensive cars that eventually trickles down to less expensive cars, but the person in the seat behind the wheel will still be in charge and to blame for anything that happens.
How did insurance and the legal system change to accommodate all the other safety features of cars? Seat belts, airbags, ABS, ESP, auto brake. All those features save even more lives every year than autonomous cars will (according to your calculations), but I still don't see older cars without them being heavily taxed or getting prohibitively high insurance costs. Compared to autonomous cars, those systems are vastly less complicated and easy to approve, yet there has been no action to get rid of older cars.

According to Wikipedia, the situation in the US is as follows (the EU has required all those systems since 2015):
In March 2016, the National Highway Traffic Safety Administration (NHTSA) and the Insurance Institute for Highway Safety announced the manufacturers of 99% of U.S. automobiles had agreed to include automatic emergency braking systems as a standard feature on virtually all new cars sold in the U.S. by 2022.

The first cars to introduce auto brake were the likes of Mercedes, in 2002-2003. That means it took 19-20 years to get 99% of the auto makers to implement auto brake on almost all cars, a system quite a lot simpler than autonomous cars, one that should have been a no-brainer to simply require, and yet it takes many years.

So, even if a fully functional and 100% safe autonomous car came out today, we might not see it in "virtually all" cars from 99% of the makers until 2037, and that is not considering the changes needed in the legal systems.
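Put as arithmetic (using the years from this post; the projection simply assumes the same adoption lag repeats, which is of course only an assumption):

auto_brake_introduced = 2003         # Mercedes-era introduction cited above
auto_brake_near_universal = 2022     # NHTSA/IIHS target for ~99% of new cars
lag_years = auto_brake_near_universal - auto_brake_introduced   # 19

hypothetical_safe_av_today = 2017
print(hypothetical_safe_av_today + lag_years)   # 2036, i.e. roughly 2036-2037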
 
And being held responsible for 4000 deaths a year is no problem for companies you mean?
How do you not get this? Insurance companies are already involved in practically all of the 40,000 deaths per year! Cutting that number to 4,000 (and dropping) doesn't make more work for them, it makes less! Statistically, it's virtually certain that at some point an accident will occur where the computer will be at fault, but one accident, or even several, cannot be used to prevent saving 36,000+ lives every year.

Yes, it took a long time for anti-lock brakes to become mainstream, but that's because the car industry likes to milk every new technology for every bit of profit it can. With electric cars coming alongside autonomous driving, traditional car makers now have to compete with tech companies. Tech companies expect change to happen in months, not years, and they're prepared for it. Tesla, for example, doesn't wait an entire year to make changes to its models, and car companies that don't learn to keep up aren't going to be with us long.

You sound very much like the CEO of Palm six months before the original iPhone. The fact is, computer guys ARE going to just walk in, and everything is going to change. Probably much more rapidly than some people would like.
 
Why hasn't it happened in the aircraft industry then? Airlines are actually begging for self-flying aircraft, yet it doesn't happen. Only now is Boeing confirming it will start to test autonomous aircraft.
https://www.theatlantic.com/technology/archive/2016/03/has-the-self-flying-plane-arrived/472005/
https://www.washingtonpost.com/news...st-self-flying-planes/?utm_term=.67b03f312fb1

The analogy with Palm is not relevant; they had no idea and weren't even paying attention.

I'm just saying that the revolutionary part that's missing from autonomous vehicles is the actual "intelligent" part. So far, these systems are all based on algorithms made by programmers, which means that whatever flaws the programmers have, so will the software, and today's AI is absolutely crap at "thinking outside the box".

I think Siri is a good example of exactly how far from "intelligent" computers are. Would you trust Siri to take you anywhere if she were in control of your car? I wouldn't step into such a vehicle.

Amara's law has been in effect for a very long time by now:
https://en.wikipedia.org/wiki/Roy_Amara
 
For the same reason it will happen in the auto industry: the accident rate. Flying is the safest form of travel! I've said it at least a couple of times in this thread: you don't need to be 100% safe, you just need to be safer. It's much harder to be safer than extremely safe airline flying than it is to be safer than incredibly unsafe automobile drivers. After the low-hanging fruit is handled, trains, planes, and even boats won't be far behind.

Palm was a leader in the field! They're incredibly relevant, and they were paying attention, which is why they were whistling past the graveyard, just as car manufacturers are now with Tesla.

Siri isn't designed to drive a car, but the fact that you can even imagine it doing so indicates how far the software has come from nothing just ten years ago.

What you're not getting is that no one is overestimating the near term, because we're no longer in the near term. Self driving has been in development for a long time, starting with turn-by-turn directions years ago and continuing now with features like Autopilot in Tesla cars. It's advancing rapidly, so applying linear growth to estimate its future may well underestimate what happens over the next few years.
 
Just to prove the complexity:
http://www.bbc.com/news/technology-40416606
Something that would not confuse humans. There are quite a few different species of animals on the planet.
Turn-by-turn instructions are still crap as they rely on static data, and the world is far from static.
The argument that the flight industry is already safer does not hold up. Sure, there are fewer aircraft sold, but the system could be retrofitted easily, which is not the case with cars. There is an immense number of aircraft that could be fitted with such a system if approved, and the company with the patents would become extremely rich. It's money, not safety, that drives the development. Tim Cook can yap as much as he likes about being a nice company and doing the right thing; if it came down to serious money, he wouldn't be so kind any more.
Car manufacturers are hoping to make more money from self driving cars as people today are too bored or uninterested in driving; they'd rather take the bus so they can fiddle with their phone. Car sales are stagnating, and they desperately hope this will breathe life into the industry again, but it's moot: cars will go away eventually, regardless of systems and propulsion.

As you can read in the previous article, something as simple as kangaroos jumping can confuse sensors to the level where special programming is needed for the system to be able to detect them. What else would be difficult for the system? Futuristic prams? Disabled people? As I said before, the breakthrough that enables computers to actually "think" is not here, and until then, we will have self driving cars, oh yes, but the driver will still be responsible, and expected to override the system if anything goes wrong or it becomes confused.

Also, that needed breakthrough in software will need another breakthrough, in computer tech:
https://arstechnica.com/information-technology/2016/02/moores-law-really-is-dead-this-time/

I would not assume Tesla, or any car manufacturer, will come up with the system that enables it all. I think Tesla is a little bit too careless and car manufacturers a bit too stupid (management-wise). And Apple seems to be realising just how difficult this is; it went from a whole car to just the autonomous system. Expect it all to dwindle away in 2018-2019.
I think the breakthrough will come in a totally different field and then be applied to this. I'd say it will come from a university, where suddenly "real" AI will be developed: a system that can learn and deduce information.
When that happens, it will be a revelation and a turning point in the history of humanity, and autonomous cars will be just one small part of what can be achieved.
 
Seriously, you're grasping at straws. Of course you can find evidence of car makers (especially ICE makers) failing at this. It's just as easy to find videos of Teslas succeeding wildly, and they still only have Autopilot, not yet fully autonomous capability. Here's one where they avoid deer, pigs, and human drivers. — The pig (wild boar?) manages to run into the car! Search YouTube for "tesla autopilot predicts crash" to see more.

Honestly, I'm not seeing how you don't get this. Yes, there will be money to be made selling systems like this for airplanes, but the level of complexity required for airplanes is far greater, not simply because they travel so much faster, but because it's very difficult to improve on their safety record. And if you can't show that your autonomous flying system is safer than human pilots, you're not going to get your system approved. People driving on the road are absolutely terrible; professional pilots are far safer. When planes crash, it makes the news because it happens so rarely, not because of the number of deaths. If deaths counted, we'd have nightly tallies of car accidents (40,000 / 365 ~= 109 deaths per day!). When was the last airline crash in the U.S.? Do you even remember? I don't.

By the way, if you want to put your money where your mouth is, bet against this guy:
https://electrek.co/2017/04/29/elon-musk-tesla-plan-level-5-full-autonomous-driving/
He thinks he'll have autonomous vehicles sometime in 2019. Lots of people have been sure he was wrong about things like SpaceX, the Roadster, the Model S, and the Model X, and plenty are betting against the Model 3 by shorting his stock. They've all been (or are about to be) shown to be wrong, so I wouldn't be too hasty to bet against him on autonomous driving. I hope Apple succeeds at it too, by the way, but Elon's got a huge lead.
 