
PinkyMacGodess

Suspended
Mar 7, 2007
10,271
6,216
Midwest America.
Computers have a lot of advantages over humans when it comes to driving: they can see everywhere at once, they can see in the dark, they never get distracted by kids arguing in the back seat or urges to show off, they can have much faster reaction times, they can calculate F=ma to much higher precision trying to thread the needle between over and under reacting, they don't get drunk or fall asleep, and they're not likely to confuse the accelerator for the brake in a panic (though Tesla apparently still has some issues with this...).

They can also coordinate much better with other computers than people are ever willing to cooperate with other people, and improve traffic flows.

What computers lack is the ability to reason and adapt. They'll get there, but almost certainly not as soon as the evangelists think. Being able to drive suburban roads in Phoenix or Cupertino is a long way from solving the general problem.

The real justification for self driving cars isn't to give the driver some additional leisure time, it's to improve safety, traffic and energy efficiency. Worthy goals, but it's going to get worse before it gets better. We're also much more forgiving of people than we are of machines-- if I ask a friend for directions and they get it wrong, I react much differently than I do when Siri gets it wrong-- so they're going to have to do much better than people before we fully accept them.

Okay, but until this is Minority Report-type stuff, self-driving cars will always be a potential road hazard. Such a car can brake or swerve thousands of times faster than a human can, so it's like running into someone: you go left, they go right, you go right, they go left... Think TCAS, when one pilot ignores its commanded maneuver. Until ALL cars have that technology, it's a bigger risk.

Like this example: the car hits a curb and stops instantly, because it hit something and that takes precedence. Anyone following that car isn't likely to have the same quick stopping ability, and will have to either run into them or swerve to avoid them, and the latter could potentially involve other vehicles or pedestrians. Reality gets messy... Even driving down a highway could trigger the same cascade. It can become ugly quickly... There will be a lot more deaths and injuries on the way to a 'perfecter' self-driving vehicle.
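That cascade worry can be put in rough numbers. A minimal sketch (all figures below are illustrative assumptions, not measured values): a human follower adds perception-reaction time before braking even begins, so a car that brakes instantly can erase the following gap.

```python
# Rough stopping-distance sketch: distance each car covers before stopping.
# All numbers are illustrative assumptions, not measured values.

def stopping_distance(speed_mps, reaction_s, decel_mps2):
    """Distance covered during reaction time plus braking to a stop.
    Braking distance from v^2 = 2*a*d  ->  d = v^2 / (2*a)."""
    return speed_mps * reaction_s + speed_mps**2 / (2 * decel_mps2)

speed = 30.0  # ~108 km/h, both cars

# Automated lead car: assume near-zero reaction delay.
lead = stopping_distance(speed, reaction_s=0.1, decel_mps2=8.0)

# Human follower: ~1.5 s perception-reaction time is a common rule of thumb.
follower = stopping_distance(speed, reaction_s=1.5, decel_mps2=8.0)

print(f"lead car stops in      {lead:.0f} m")
print(f"human follower needs   {follower:.0f} m")
print(f"extra gap the follower needed: {follower - lead:.0f} m")
```

With these assumed numbers the follower needs roughly 42 m more road than the instantly-braking lead car — gap they often don't have.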

And 'self-flying planes'? People are talking about them, and I sure as hell wouldn't want to be a passenger on one. Nope. But rich people are 'going into space' *COUGH* on self-flying 'spaceships'.
 

PinkyMacGodess

Suspended
Mar 7, 2007
10,271
6,216
Midwest America.

Do-it-yourself sushi! YUCK... (Reminds me of the time my dad had a thumping in the car. He couldn't for the life of him figure it out. It turned out to be loose molding on the passenger door. Too funny when that was identified. A friend had furious thumping in the front of his car, and it turned out to be part of an 'air dam' that had popped loose 'at speed', the dealer said, which prompted his wife to filet him, at the dealer, about 'driving so damn fast'???)
 
Last edited:
  • Haha
Reactions: matrix07

Analog Kid

macrumors 604
Mar 4, 2003
7,804
8,884
Okay, but until this is Minority Report-type stuff, self-driving cars will always be a potential road hazard. Such a car can brake or swerve thousands of times faster than a human can, so it's like running into someone: you go left, they go right, you go right, they go left... Think TCAS, when one pilot ignores its commanded maneuver. Until ALL cars have that technology, it's a bigger risk.

Like this example: the car hits a curb and stops instantly, because it hit something and that takes precedence. Anyone following that car isn't likely to have the same quick stopping ability, and will have to either run into them or swerve to avoid them, and the latter could potentially involve other vehicles or pedestrians. Reality gets messy... Even driving down a highway could trigger the same cascade. It can become ugly quickly... There will be a lot more deaths and injuries on the way to a 'perfecter' self-driving vehicle.

And 'self-flying planes'? People are talking about them, and I sure as hell wouldn't want to be a passenger on one. Nope. But rich people are 'going into space' *COUGH* on self-flying 'spaceships'.

To be fair, everything you mention is true for people, and probably more so. The examples you're providing are based on experiences you've had, which means they almost certainly involved people.

And planes have been largely flying themselves for quite some time-- that's why Tesla called it AutoPilot in their car, not AutoDriver. Ignoring the KAL007 incident and just looking at the GPS-enabled world, how many air crashes have been due to autopilot errors versus human errors or equipment failures? I'd imagine computer guidance could handle sudden equipment failures better than humans.

Again, self driving cars are a long ways off, I believe, but I also think you're overestimating the performance of people.
 

name99

macrumors 68000
Jun 21, 2004
1,831
1,468
Even though it's an incredibly minor incident, hitting the curb seems like something that should be very easy to avoid.

And yet I do it maybe once a year! (Though *usually* not bad enough to hurt the tire or require realignment.)

If Apple (and other smart cars) can reduce that frequency, they're doing better than some humans...
 

PinkyMacGodess

Suspended
Mar 7, 2007
10,271
6,216
Midwest America.
To be fair, everything you mention is true for people, and probably more so. The examples you're providing are based on experiences you've had, which means they almost certainly involved people.

And planes have been largely flying themselves for quite some time-- that's why Tesla called it AutoPilot in their car, not AutoDriver. Ignoring the KAL007 incident and just looking at the GPS-enabled world, how many air crashes have been due to autopilot errors versus human errors or equipment failures? I'd imagine computer guidance could handle sudden equipment failures better than humans.

Again, self driving cars are a long ways off, I believe, but I also think you're overestimating the performance of people.

Autopilot is NOT auto-flight. There are some planes capable of 'auto-land' and 'auto-takeoff', but the majority are not. And even if they are, it's a stupid, lazy pilot that would ever depend on the systems 100% of the time. As a matter of fact, cockpit automation, combined with pilot unfamiliarity or incompetence, has likely caused more crashes than anyone would ever want to imagine. So, do planes 'fly themselves'? No. Can they? No. Are pilots unnecessary? Again, no.

Plus, the dangers of self-driving cars are compounded in a plane, because there are way more vectors of motion than in a car. I would think (and certainly hope) that literal self-flying planes are way off in the future, like five generations of human beings off.

And GPS can't be relied on. That is why so many countries have created, and are creating, their own navigation systems. The DOD said they would 'fuzz' the GPS system in the event of some kind of incident involving, or perpetrated on, America, which caused other countries not to trust 'our' system. Can't blame them, right...

So, anything that encourages the driver to relax their attention from the act of driving is an attention suck that has caused, and will continue to cause and contribute to, accidents.

Overestimating the performance of people? Well, I suppose. There are people downing horse de-worming paste...
 
  • Like
Reactions: steelhauler34

steelhauler34

macrumors 6502
Jul 23, 2019
345
251
We have all the latest and greatest anti-collision technology in our work trucks, all of which would be absolutely vital to full autonomy. I can assure you it's buggy as hell. Too many computer-controlled sensors that can fail in heavy rain, ice, or snow. Full autonomy is a long way off, and no one will ever convince me a computer-controlled vehicle will be better. These researchers are going to discover the same thing in chasing their pipe dream of everyone in an autonomous vehicle.
 
  • Like
Reactions: PinkyMacGodess

PinkyMacGodess

Suspended
Mar 7, 2007
10,271
6,216
Midwest America.
We have all the latest and greatest anti-collision technology in our work trucks, all of which would be absolutely vital to full autonomy. I can assure you it's buggy as hell. Too many computer-controlled sensors that can fail in heavy rain, ice, or snow. Full autonomy is a long way off, and no one will ever convince me a computer-controlled vehicle will be better. These researchers are going to discover the same thing in chasing their pipe dream of everyone in an autonomous vehicle.

Case study: the 737 MAX. Not enough training or instruction, reliance on a single sensor, and making the second, redundant sensor a nearly $60,000 up-charge (knowing that airlines usually never buy the up-charges).

Almost everything on the plane has a redundant system, and if it doesn't, it's usually not critical to flying people. Boeing decided to 'fix' the issues with the MAX by tweaking software, and relying on a SINGLE SENSOR. And the software didn't take into account the altitude of the plane before reacting to the sensor input.

Yeah, coupled with cultural issues and all of it, two planes with real human beings in them crashed. From what I've read, it was the first time something that major had been left to a single sensor, and the first time software could aggressively counter a pilot's inputs to such a degree. All because, apparently, Boeing didn't want to go through the redesign process after putting much larger engines on the plane, which changed some of its flight characteristics. Some say their software 'fix' wasn't needed at all, which makes it an even more horrific tragedy.
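The single-sensor point is really an argument for cross-checking redundant inputs before software is allowed to act. A minimal sketch of that idea (hypothetical names and thresholds, not Boeing's actual MCAS logic): compare two angle-of-attack readings and inhibit automation when they disagree.

```python
# Sketch of a dual-sensor cross-check before automated action.
# Hypothetical names and thresholds -- not Boeing's actual MCAS logic.

DISAGREE_LIMIT_DEG = 5.5  # max allowed AoA spread before we distrust the data

def aoa_command(left_deg, right_deg, stall_threshold_deg=14.0):
    """Return an action only when both sensors agree and both read high."""
    if abs(left_deg - right_deg) > DISAGREE_LIMIT_DEG:
        return "DISAGREE: alert crew, inhibit automatic trim"
    if min(left_deg, right_deg) > stall_threshold_deg:
        return "apply nose-down trim"
    return "no action"

print(aoa_command(4.0, 4.5))    # sensors agree, normal reading
print(aoa_command(22.0, 4.5))   # one sensor stuck high: inhibit, don't trim
print(aoa_command(16.0, 15.0))  # both high and agreeing: act
```

The point of the sketch is the middle case: with only one sensor there is nothing to disagree with, so a stuck vane and a real stall look identical to the software.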


Self-anything is only as good as the assumptions and foresight of the programmers and engineers. And how much they feel they have to cover for...

That, coupled with an amazingly pliant FAA that let Boeing be its own quality control system, is an amazing example of another sad issue: 'agency capture'. The regulated become the regulators, and it's happened in more and more agencies, across whole industries and ideologies. Everyone does it, if they can...
 

s.r.

macrumors newbie
Mar 12, 2009
10
4
Dublin, Ireland
Literally everything about driving is being reactive. Responding to other drivers and maintaining safe speeds and distances. You see congestion ahead? You react by slowing down. You see inclement weather? You react by slowing down. Emergency vehicle on the side of the road? You react by changing lanes.
This goes both ways, actually. You can train an AI to be proactive, but for the time being it's been trained to be mostly reactive. That's where humans have an advantage: because our brain has had more years to evolve, we can seamlessly switch between being proactive (car chases, rally, and other competitive vehicle sports) and reactive (everyday general driving). Everyday driving has proactive moments as well, when we choose our path or driving style to avoid causing possible traffic jams or collisions. An AI could calculate all that as well and offer it to you as an option, but it's not allowed to act on it by itself.
Also, some humans are able to think outside of the box. An AI in theory also could, but it would be limited by the imagination of its creator and current technologies.
I am not an AI specialist, but something tells me we won't have a real human-level AI until we stop applying the "human" type of thinking and cognition. Once an AI has a concept and capability of free will, and room to utilise it, then it will definitely be capable of its own cognitive mechanisms built around its own hardware constraints, and able to perform reactive and proactive actions. Connect it to a supercomputer with a database of all hard knowledge, and that would be a super AI that outperforms humans at everything.
 

pubb

macrumors regular
Mar 13, 2007
150
151
This goes both ways, actually. You can train an AI to be proactive, but for the time being it's been trained to be mostly reactive. That's where humans have an advantage: because our brain has had more years to evolve, we can seamlessly switch between being proactive (car chases, rally, and other competitive vehicle sports) and reactive (everyday general driving). Everyday driving has proactive moments as well, when we choose our path or driving style to avoid causing possible traffic jams or collisions. An AI could calculate all that as well and offer it to you as an option, but it's not allowed to act on it by itself.
Also, some humans are able to think outside of the box. An AI in theory also could, but it would be limited by the imagination of its creator and current technologies.
I am not an AI specialist, but something tells me we won't have a real human-level AI until we stop applying the "human" type of thinking and cognition. Once an AI has a concept and capability of free will, and room to utilise it, then it will definitely be capable of its own cognitive mechanisms built around its own hardware constraints, and able to perform reactive and proactive actions. Connect it to a supercomputer with a database of all hard knowledge, and that would be a super AI that outperforms humans at everything.
What you view as "proactive" is still reactive. For example, you see a possibly drunk driver weaving in and out of lanes, so you choose to avoid that person by turning onto a different road, speeding past them, or just pulling over and avoiding the situation entirely. A self-driving car could respond to "inconsistent" drivers by "proactively" avoiding them in any of a number of different ways. You're still reacting to the circumstances around you.
 
  • Like
Reactions: PinkyMacGodess

PinkyMacGodess

Suspended
Mar 7, 2007
10,271
6,216
Midwest America.
What you view as "proactive" is still reactive. For example, you see a possibly drunk driver weaving in and out of lanes, so you choose to avoid that person by turning onto a different road, speeding past them, or just pulling over and avoiding the situation entirely. A self-driving car could respond to "inconsistent" drivers by "proactively" avoiding them in any of a number of different ways. You're still reacting to the circumstances around you.

Until AI can 'think', and work on 'predicting' the outcome of each possible action, it will always be a flawed experiment and a potential danger to the world at large.

One movie from a few years ago shows, surprisingly, how I try to ride my bike. It's Premium Rush, which is a great 'bicycle movie' for single-speed devotees. The main character, a bike messenger in New York, mentally plays out the potential effects of decisions while he rides through New York traffic. It's a cool part of the movie. To ride in traffic IRL, you really have to think about what the traffic is doing and gauge the possibilities and the potential for bodily damage. It's like chess: thinking several moves into the future, based on each move and an educated guess about what your opponent could do next. But it's far easier to be reactive, like a plane's autopilot: it can read the sensors and make a rough guess at what to do to stabilize the plane. My ground school instructor said that when things get out of hand, turning on the autopilot can either stabilize the flight and buy time in the moment, or turn itself off, signifying that the issues require more input from the pilot(s).
 

duck apple

macrumors regular
Feb 26, 2009
180
49
applelexusselfdriving1.jpg

Why this extremely ugly Toyota? Lexus has other, better-looking cars.
Jobs won't approve it.
 