Not a fair point. The guy in your example was watching a Harry Potter DVD while driving on AP.
So far, Tesla has racked up millions of miles on AP and the fatality rate is less than the average.

My points are that it's buggy, and not really autopilot (as the name might imply). If you watch a few YouTube videos of it, you'll see that it hasn't racked up those millions of miles w/o a LOT of human help!

A set of batteries for a Tesla will run $15,000+ and have to be replaced every 6 years once the subsidies go away. Of course, then it will be too late to choose, as there will not be any low-cost alternatives.

Yea, a lot of this is being rushed into place out of hysteria over 'global warming' and such. But, I still think electric vehicles are a good plan for the future. They *should* be more reliable overall, and last longer. They are capable of incredible performance, and can do things like AWD easily (including much better control of power and traction).

There are a couple of battery technologies that are just in the labs, but sound promising, like aluminum based ones. We'll see.

In other words, politicos would have been slammed if they had started giving subsidies to Apple, who supposedly owes billions of dollars in unpaid taxes as they 'hide' their money overseas...

If we don't want companies getting incentives, tax loopholes, and subsidies, then we need to change the laws and policies. It's baloney to set things up this way and then blame companies for taking advantage of it. Most typical people go to a tax preparer (or use software) that also takes advantage of any tax breaks, etc. And, believe it or not, over half of Americans pay zero federal income tax... and some percentage actually get money back while paying none in.

A lot of the stuff we hear about concerning companies not paying taxes is also related to incentives various governments pay, competing to get companies to locate in their areas to create jobs and such. So, it's not quite as straightforward as the press often makes it.
 
...It sounds like you're saying that people are acting like rational beings on the road, behaving based on what they learned in driving school. As someone who has been driving for the last 15 years, I wish they did :)

No. I brought up that morality is not taught in driving school to point out that morality is not considered relevant to driving.

And it boils down to the problem of blame. If (when) something actually goes wrong with a self-driving car, who is to blame? Is it Apple, who programmed it to behave in a certain manner (it just does what it's told, remember)? Would the programmer be charged with murder over a sloppy algorithm? That last one is a stretch, but yea :)

Note that this is a completely different argument about self-driving cars than the one I was responding to.

How to assign blame in a complex, real-world case when something goes wrong is not any different when self-driving cars are involved. It's certainly a new wrinkle, but there have always been new wrinkles. We'll work out the norms for this like we always have (some combination of litigation, regulation/laws, and extension of existing norms). It's hardly an unworkable problem that will stymie the progress of self-driving cars. Back in my post I said self-driving cars just need to be safer than human-driven ones. That's significant to the question of blame as well: fewer crashes means less blame to spread around, which makes this a relatively minor issue.
 
You should probably look into it and think about it a bit more then. That's more Google's approach. Tesla's is *MUCH* more crude.

Ah, I see. If only I would "think about it more" then I would naturally come to your conclusion that the problem is completely unsolvable. The basic flaw in your argument is that how Google or Tesla are approaching this problem today hardly matters for the future.

The most important fact to keep in mind is that human beings do not carry all the potentially necessary information to complete any complex task in our brains. What we lack in the ability to know everything is compensated for by sensory and reasoning abilities. Yet we are demonstrably very dangerous behind the wheel of a car because of the inherent limits to our knowledge, sensory, and reasoning abilities, as well as highly imperfect physical abilities (all of which vary greatly from individual to individual). Of these, save reasoning, it is easy to see how computation systems could be an improvement. It's also possible to imagine how computational reasoning, within the limited bounds of the problem, can also be better.

Reasoning may be the tougher nut to crack, but then again, it's also easy to see how an automobile's AI would not have to entirely duplicate human reasoning to become net superior to cars piloted entirely by humans. The key is fuzzy logic, one of the main areas of AI research.
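For what it's worth, a fuzzy-logic rule can be sketched in a few lines. This is purely illustrative: the membership functions, thresholds, and the braking rule below are invented for the sake of the example, not taken from any real automotive system.

```python
# Toy sketch of a fuzzy-logic braking rule (illustrative only; the
# membership functions and thresholds here are made up, not from any
# actual self-driving stack).

def close_membership(distance_m: float) -> float:
    """Degree (0..1) to which an obstacle counts as 'close'."""
    if distance_m <= 10:
        return 1.0
    if distance_m >= 50:
        return 0.0
    return (50 - distance_m) / 40  # linear ramp between 10 m and 50 m

def fast_membership(speed_kmh: float) -> float:
    """Degree (0..1) to which the car counts as 'fast'."""
    if speed_kmh <= 30:
        return 0.0
    if speed_kmh >= 120:
        return 1.0
    return (speed_kmh - 30) / 90

def brake_strength(distance_m: float, speed_kmh: float) -> float:
    """Fuzzy rule: IF close AND fast THEN brake hard.
    AND is taken as min(), a common fuzzy conjunction."""
    return min(close_membership(distance_m), fast_membership(speed_kmh))

print(brake_strength(15, 100))  # close-ish and fast -> strong braking
print(brake_strength(60, 100))  # far away -> 0.0, no braking
```

The point of the fuzzy formulation is that the output is graded rather than an on/off branch, which is one reason it comes up in discussions of driving AI.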

I am certain it will be figured out over time. Cars will continue to get smarter in the way they assist human drivers (already much improved only in the last few years). The problem as you've illustrated it is getting humans to accept that maybe they aren't as good at something as a computer could be. That attitude simply illustrates the limits of human reasoning, not the strengths of it.
 
I think, from my perspective, Tim Cook is on borrowed time (in a sense). You see, Apple has been making money hand over fist, but all the products that produced those record profits date back to when Steve ran the company. Only the Apple Watch is something that Cook can put his name on, and even then, it's questionable how successful that is.

Apple is relying on one product for the majority of its profits, and Wall St. never likes to see that, because if that single product falters, then there goes all the money their investors stood to make. Cook does need to innovate and roll out more products that allow a consistent and balanced revenue stream.

If a balanced revenue stream was something Wall St. really cared about, why is Google such a Wall St. darling? All of Google's revenues are from search/advertising. Android, for all the talk about its dominance, is designed to drive people towards Google services. Google has not come out with a product or service that rivals search as a revenue machine.

Sure, Google has projects that it's working on, but so what? The pattern has been for Google to jury-rig some half-baked concept, preview it to a whole bunch of media hype, talk about it in gradually decreasing amounts, stop talking about it altogether after some time, which invites speculation of the project's end, and then finally pull the plug officially. That's what happened with Project Ara, Google Glass, Google+, and Google TV.

Whatever automotive thing Apple was working on was nothing more than rumors. In my opinion, if the rumor is true that Apple has abandoned its car plans for now and has set a late 2017 deadline for autonomous driving, it's a testament to Apple's discipline. It's a good thing that Apple is prepared to can it if it can't compete in a meaningful way.
 
My co-worker who had one of the original Volts she was leasing just got one of the new Volts.


My Prius does this NOW. My understanding is that any braking above a certain speed utilizes the regen, which is why I haven't needed to have the brakes redone in a decade.
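The threshold behavior described here can be sketched roughly as below. The cutoff speed and the all-or-nothing split are made-up placeholders; real blended-braking controllers vary by model and blend the two sources more gradually.

```python
# Sketch of the blended-braking idea described above: above a cutoff
# speed, braking demand goes to the regenerative system; below it,
# the friction brakes take over. Cutoff and split are assumptions.

REGEN_CUTOFF_KMH = 11  # assumed threshold; real values vary by vehicle

def split_braking(speed_kmh: float, demand: float) -> tuple[float, float]:
    """Return (regen_fraction, friction_fraction) of braking demand."""
    if speed_kmh > REGEN_CUTOFF_KMH:
        return demand, 0.0   # regen handles it; brake pads barely wear
    return 0.0, demand       # too slow for useful regen

print(split_braking(60, 0.5))  # highway braking -> all regen
print(split_braking(5, 0.5))   # crawling -> friction brakes
```

Under this kind of scheme, most everyday braking never touches the pads, which is consistent with pads lasting a decade.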

Hence my caveat: long-distance commuters. 25 minutes is annoying, but I think of people like my boss at work. He has a 120-mile commute every day, and he's done this commute for over ten years. He first drove a VW diesel Jetta and now has a Chevy diesel Cruze. But that also makes me think that though an autopilot feature would be useful, it wouldn't be something I'd rely on every day or for every drive. I have cruise control in my vehicle but don't use it every day or on every drive.

I hope you're still getting the brake fluid serviced regularly though?
Yeah, we can build a car. The most unique and user friendly car in the universe. The Apple car.

Nope, we're not going to build a car.

Yeah, let's hire every automotive automation tech who ever lived. Let's pirate people from Tesla. Let's build the Apple car.

Nope, we're not going to build a car.

(repeat above ad nauseam)

Right on, Tim. You certainly know how to steer (pun intended) a company in the right direction.

Or was it the other direction?

They brought Matthew McConaughey in to consult, and then discovered that all their prototypes could only go alright alright alright. Would have had to start over from scratch. Tragic really.
 
No. I brought up that morality is not taught in driving school to point out that morality is not considered relevant to driving.
...
Back in my post I said self-driving cars just need to be safer than human-driven ones.

Morality is built into humans, though it certainly wouldn't hurt to reinforce it in driving school! (And maybe more than once near the beginning of one's life.) If I had to venture a guess, I think we could eliminate the majority of vehicle-related deaths with regular driver training and stricter law enforcement/penalties for bad driving practices.

But morality is not relevant to driving? Give me a break! If there are any laws associated with it, then morality is involved. Why do you think there is a speed limit, or why can't you drive drunk?

Ah, I see. If only I would "think about it more" then I would naturally come to your conclusion that the problem is completely unsolvable. The basic flaw in your argument is that how Google or Tesla are approaching this problem today hardly matters for the future.

No, I'm just saying I recognize a few things about the challenge. I started my career in electronic engineering. I've spent most of my career in computer science. I spent a good bit of time in data operations (i.e.: systems automation), and even worked for an industrial design firm for a while. I'm a car and driving enthusiast, and have some grad-school education in philosophy of mind. That doesn't make me an expert, but does give me a bit more insight than the average person (and likely a broader view than most people working in AI).

And, while I'm not a fortune teller, technology-advancement isn't magic either. Technology won't ever do something that can't be done.

It's also important to understand what I'm saying is a huge challenge, vs what can't be done. I'm admitting that AI tech is going to advance by leaps and bounds. But, it's never going to actually think and reason like we do. You're saying that won't matter to driving, I think, while I'm saying it ultimately will (unless what we think of as driving is radically changed).

The most important fact to keep in mind is that human beings do not carry all the potentially necessary information to complete any complex task in our brains. What we lack in the ability to know everything is compensated for by sensory and reasoning abilities. Yet we are demonstrably very dangerous behind the wheel of a car because of the inherent limits to our knowledge, sensory, and reasoning abilities, as well as highly imperfect physical abilities (all of which vary greatly from individual to individual). Of these, save reasoning, it is easy to see how computation systems could be an improvement. It's also possible to imagine how computational reasoning, within the limited bounds of the problem, can also be better.

And, for the most part, a computer is pretty limited as well. They certainly can't know everything. It would be an incredible challenge to match a human's sensory input, because so much of that is based on sensory interpretation. A computer can take 50-megapixel images all day, and it's quite pointless, even compared to a person with poor vision, if the images can't be interpreted.

Sure, a computer could have a radar sensor that might detect something a human couldn't, and react more quickly IF the proper interpretation of the data, and appropriate reaction, is built into the program. And, for a particular task, that's relatively easy, and a computer will often beat a human.

But, once we start layering and combining situations, it gets complex really quickly. The amount of data a human driver is taking in, analyzing, and reacting to is many orders of magnitude more than the best AI system is currently working with. That doesn't mean that in a very specific situation, the computer might do better. I'm all for *assistive* technology!

Reasoning may be the tougher nut to crack, but then again, it's also easy to see how an automobile's AI would not have to entirely duplicate human reasoning to become net superior to cars piloted entirely by humans. The key is fuzzy logic, one of the main areas of AI research.

This is where I disagree. There's no magic going on here; it's just a more sophisticated form of look-up and branching. When a situation outside the parameters appears, a human could actually figure it out; a computer can't. Again, I'm not saying a human does this perfectly, or always as quickly as necessary to avoid an accident... but the computer won't either, as it's not even possible.

The whole thing about automated 'driving' is that it has to be greatly constrained. Like I think I said in an earlier post... IF all the vehicles communicated (best automated), and IF the roadways are well enough mapped/constrained, and IF the non-predictable obstacles can be controlled or eliminated from the situation well enough, and IF there's enough sensor-tech to overcome things like weather, then it might work fairly well. But, that isn't reality, nor is it driving in a comparable way to what humans currently do.

Could we do the above? Sure, with enough changes and trade-offs to current driving, and a ton of infrastructure change. I used to ride the automated train in Vancouver daily, and it worked pretty well. If that's where the future of cars goes, it would suck IMO, but I guess it would work. If the goal is to save lives, though, I'd rather just work on training, and on eliminating the bad drivers who cause the majority of the accidents.

(For example, I've driven hundreds of thousands of miles, and have only caused one very minor accident when I first started driving, and have only been in one accident that was a true accident, due to extreme weather conditions... the rest have all been the other party's fault and, from what I know, could have been avoided with some training and/or the other driver not doing something they weren't supposed to be doing. Or, to put it another way: the main problem isn't humans' lack of capability to drive well/safely... it's our willingness to do what we're not supposed to be doing.)

I am certain it will be figured out over time. Cars will continue to get smarter in the way they assist human drivers (already much improved only in the last few years). The problem as you've illustrated it is getting humans to accept that maybe they aren't as good at something as a computer could be. That attitude simply illustrates the limits of human reasoning, not the strengths of it.

Cars won't get smarter; they don't think or learn. Programmers will think of ways of employing technology, sensors, and computers in the process of assisting humans, or attempting to 'drive', yes.

My goal is to get humans to realize the strengths and weaknesses of both humans and AI, and properly implement them in reality, instead of the sci-fantasy stuff in the movies, or the dreams of 'futurists' like Musk, Kurzweil, etc.

If a balanced revenue stream was something Wall St. really cared about, why is Google such a Wall St. darling?

Wall Street is little more than a casino with a reputable veneer these days. It's about futures speculation, not investment. It's a huge problem for economic stability, but it certainly shouldn't be looked at as any kind of indicator of how well a company is or isn't doing.
 
Nice! Can we have a Retina MBA with decent CPU/GPU now? Tks!

Or, a Mac Pro or mini that's semi up-to-date and reasonably priced. Or, some QC on the software side. Or, a return to a focus on good UI. And, maybe ditch the stupid B&W 'flat' obsession. Or, make it so that each time I update the OS on a device, I don't have to re-set up all the cloud stuff and tell Apple, over and over, that NO, I don't want all my stuff moved to their flaky cloud... (I could go on all day)
 
No. I brought up that morality is not taught in driving school to point out that morality is not considered relevant to driving.



Note that this is a completely different argument about self-driving cars than the one I was responding to.

How to assign blame in a complex, real-world case when something goes wrong is not any different when self-driving cars are involved. It's certainly a new wrinkle, but there have always been new wrinkles. We'll work out the norms for this like we always have (some combination of litigation, regulation/laws, and extension of existing norms). It's hardly an unworkable problem that will stymie the progress of self-driving cars. Back in my post I said self-driving cars just need to be safer than human-driven ones. That's significant to the question of blame as well: fewer crashes means less blame to spread around, which makes this a relatively minor issue.

To make self-driving cars safer, we would have to treat roads like a closed system, like railroads. That won't happen.

We can just agree to disagree on this one. I don't think AI will be sufficient to act better than humans do in real-world situations; it will merely follow rules better. And I don't think people want that in the end.
 
This reminds of a conversation I had back in the late spring of 1995 right before Microsoft released Windows 95.

I'd be surprised if Apple manage to pull off anything that spectacular. I'd love them to do it, but at the same time I can't rely on them any longer to provide updates in a timely fashion.
 
It's also important to understand what I'm saying is a huge challenge, vs what can't be done. I'm admitting that AI tech is going to advance by leaps and bounds. But, it's never going to actually think and reason like we do. You're saying that won't matter to driving, I think, while I'm saying it ultimately will (unless what we think of as driving is radically changed).

You've broken my argument up into thoughtlets, making it difficult for me to respond in an organized way, so I will go back to this comment, which seems to be the nub of the crux.

My point all along is that computers don't need to "think like we do" to solve a specific set of problems more effectively than humans do. In fact it is better in some ways that they do not, because not only are our cognitive and physical skills massively varied, we tend to massively (and in this case, dangerously) overestimate our abilities. Our skill in piloting hunks of steel at high speeds is actually quite poor, a case that is trivially easy to prove in the aggregate. Yet very few individuals would own up to being unskilled (or even less skilled than average). If you ask anyone why they drive too fast or follow too close, the justification is a rationalization. Humans are good at the dangerous skill of rationalization. Computers don't rationalize, so right away, advantage to computers.

So, far from it "not mattering": if the problems of driving can be broken down into a very specific set of rational problems (which I believe they can), then the number of deaths and injuries from accidents can be vastly reduced, the more the systems can be controlled and the less we control them. I know this concept is bound to offend a lot of people (rationalization, again), but it's true nonetheless.

Solving these problems will not be easy, or it would have been done already. But they also aren't impossible just because they haven't been solved yet. That supercomputer you carry in your pocket today would've been science fiction less than 20 years ago. Just something to keep in mind as we talk about what's possible.
 
The US Geological Survey estimates 13 million tonnes of known reserves. Oddly enough, Ford estimates 39 million tonnes of known reserves. Now, which do you believe? We are using about 0.6 million tonnes per year right now. So the math says 21 to 65 years and then no more lithium, without the cost skyrocketing in order to get it from low-density sources such as seawater or recycling. The cost of recycling is currently over twice the current cost of mining and refinement. Most people call lithium a rare element for a reason. So we are ditching an energy source that has hundreds of years of availability for one with less than 65 years of very optimistic availability, absent an engineering breakthrough. Seems costly and not real smart to me. Once we get to electric cars that are not subsidized, only the rich will be able to afford them. A set of batteries for a Tesla will run $15,000+ and have to be replaced every 6 years once the subsidies go away. Of course, then it will be too late to choose, as there will not be any low-cost alternatives.
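The years-of-supply range quoted in this post follows directly from its own figures (whether those reserve estimates are accurate is a separate question):

```python
# Checking the years-of-supply arithmetic from the post above,
# using only the post's own figures (millions of tonnes).
usgs_reserves = 13.0   # USGS estimate quoted in the post
ford_reserves = 39.0   # Ford estimate quoted in the post
annual_use = 0.6       # current consumption per the post

print(usgs_reserves / annual_use)  # ≈ 21.7 years
print(ford_reserves / annual_use)  # 65.0 years
```

Note this is a static calculation: it assumes flat consumption and no new discoveries, both of which the reply below disputes.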

Your whole theory depends on every scientist and engineer stopping their research into battery chemistry. In ten years, we may not even need lithium.
 
I'm very aware that Apple have always done a ton of research; I bought my first Mac in 1987 and have not switched since. The reality is, their product line is now bigger than it ever used to be, and it makes me wonder if that is the reason they are starting to make mistakes. Apple have had a number of hiccups along their journey, as you would expect, but there have been far more recently.

As for the argument against the relevance of the Mac, quite frankly it's a moot point right now, because they still sell and people still use them. Yes, we are moving into a 'mobile-crazed technology world', but for those that need to do professional work (studio production), the Mac is still an imperative tool, and I can't see that changing anytime soon. For software development, animation, video, etc., the Mac is still the go-to machine and is not replaceable with tablets at this time.

So for those that mention (including Tim Cook) that an iPad can replace your computer, it's important to realise that there's more to some people's work than Word, Excel and Mail!

I never said the Mac wasn't an important tool, but that it's becoming less relevant. Clearly, there needs to be a transitional period as iPad matures. But considering Intel's slow progress and Apple's focus on the future of computing, no one should be surprised by Apple's slower Mac upgrade cycles. An imperceptible 10% increase in performance just isn't that exciting.

As for the expanded product line, Apple is literally 100x bigger than just a decade ago, so that's not surprising. Even the iPod had 4 models with various color options, and the market for those was far smaller.

And considering the significantly larger user base, I think the mistakes are on par with the past, though that's purely my opinion.
 
No. I brought up that morality is not taught in driving school to point out that morality is not considered relevant to driving.



Note that this is a completely different argument about self-driving cars than the one I was responding to.

How to assign blame in a complex, real-world case when something goes wrong is not any different when self-driving cars are involved. It's certainly a new wrinkle, but there have always been new wrinkles. We'll work out the norms for this like we always have (some combination of litigation, regulation/laws, and extension of existing norms). It's hardly an unworkable problem that will stymie the progress of self-driving cars. Back in my post I said self-driving cars just need to be safer than human-driven ones. That's significant to the question of blame as well: fewer crashes means less blame to spread around, which makes this a relatively minor issue.

You sound just like a tech guy, if I may say so.

The problems of self-driving go far beyond the death rate. It would be easy, in theory, to achieve a death rate of 0. Just limit the speed of all cars to 0.5 mph. But safety is probably not the biggest issue of self-driving; rather, it's boring matters, like costs, infrastructure and road user conflicts. Adequate safety would almost certainly be impossible to achieve, incidentally.
 
Your whole theory depends on every scientist and engineer stopping their research into battery chemistry. In ten years, we may not even need lithium.
I'm not holding my breath; it's been 30+ years since the birth of the modern lithium battery, and we haven't found a superior replacement that can beat it on lifespan, weight, energy density, and cost. People in the 50s thought we'd be using atomic batteries by now.
 
Without Jobs, Apple is no longer innovating but merely releasing news to prop up their stock and not following through.
Do you really think that if Steve Jobs were running the show, he'd spend all that money and effort on building an Apple Car?

I think Apple found themselves unable to deal with these companies the way they usually do, i.e., bully them into contracts strongly favoring Apple. Car manufacturing is a multibillion-dollar business; there's no way Apple could wave some money and have suppliers jump, because the scale of business they're talking about pales in comparison to the scale of GM and Toyota.

They were out of their league and outpaced by the likes of Tesla and Google. Then, of course, there was the very un-Apple-like news of infighting and sniping that hurt the project.
 
And all the people they took on to work on this will simply be... well, Apple won't give a hoot so long as they are not there any more.
 
Do you really think that if Steve Jobs were running the show, he'd spend all that money and effort on building an Apple Car?

I think Apple found themselves unable to deal with these companies the way they usually do, i.e., bully them into contracts strongly favoring Apple. Car manufacturing is a multibillion-dollar business; there's no way Apple could wave some money and have suppliers jump, because the scale of business they're talking about pales in comparison to the scale of GM and Toyota.

They were out of their league and outpaced by the likes of Tesla and Google. Then, of course, there was the very un-Apple-like news of infighting and sniping that hurt the project.

This assumes the "Apple Car" rumors were accurate in the first place. I never believed them, so to me, this "news" is about the cancellation of a project that probably never even existed.
 
I'm not holding my breath; it's been 30+ years since the birth of the modern lithium battery, and we haven't found a superior replacement that can beat it on lifespan, weight, energy density, and cost. People in the 50s thought we'd be using atomic batteries by now.

People thought we had hit peak oil 30 years ago too, and people in the 50s thought everything would be nuclear-powered.
 
If you do the math, you can see Tesla has already delivered 53712 through three quarters this year. They plan to deliver 25500 in Q4 which brings it to 79212. I expect them to beat 80K easily. The rumor is that they will announce Autopilot 2.0 this Wednesday (10/19). That will definitely bring in new buyers.

53,712 over three quarters is about 18k per quarter. They might be able to increase production by 50%, but I will believe it when I see it.
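The arithmetic in this exchange can be checked against the posted figures:

```python
# Delivery math from the exchange above, using the posted figures.
ytd = 53712          # deliveries through Q1-Q3 (posted figure)
q4_plan = 25500      # planned Q4 deliveries (posted figure)

print(ytd + q4_plan)   # 79212 total if the Q4 plan holds
print(round(ytd / 3))  # 17904 per quarter so far

# The Q4 plan versus the year-to-date run rate:
print(round(q4_plan / (ytd / 3), 2))  # ≈ 1.42, i.e. a ~42% jump
```

So the planned Q4 number does imply a production increase on the order of 40-50% over the run rate, which is the skeptic's point.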

As far as GM goes, they don't seem to believe in their product if they are only limiting it to 50K units. Building battery capacity doesn't happen overnight.

Alternatively, they don't see demand for more than 50k cars a year.

I recall how Apple got a leg up on the competition by locking down all the flash inventory as the smartphone market was about to take off. That's what Tesla is doing with the gigafactory. This is how a company takes market share.
Except Tesla is just one of many companies that can make batteries. If the demand is there, others will ramp up. It takes more than a gigafactory to take market share in the car business. Tesla lacks the dealer and service network, let alone the charging stations to make long trips viable, which may prove to be bigger constraints than battery production in growing to any size that actually matters in the car industry.
 