I have a serious question that I'd very much appreciate a full range of answers on.

Firstly I LOVE computer equipment getting faster.
It's a personal joy to see things getting quicker and quicker in a noticeable way each year.
I'm sick of Intel offering such small speed bumps on mainstream CPUs over the past few years, ever since Sandy Bridge.

So I LOVE the way Apple keep pushing and pushing.

My BIG question is this.

How much software, and what software, is out there now that actually needs and/or takes advantage of this power?

With a desktop, you see smoother games: what was a jerky framerate becomes a smooth one.
Or you get smoke, shadows, and lighting effects you didn't have in the old games.

Are programmers REALLY taking full advantage of the very latest hardware that Apple are making for us all to enjoy?

Or are devs just stuck back in time, scared to make software (let's be honest, GAMES!) that NEEDS the latest and best from Apple to run well, because they don't wish to lose sales to owners of older phones?

I want the latest and best in hardware, but I hate it when software writers don't take FULL advantage of the hardware.

So please, I ask you. Can you give me an idea of what the situation is here with the latest and greatest that Apple offers us all?
 
You misinterpreted AngerDanger's post. You see, there are actually four cameras, which all merge together to create a coherent and beautiful image.

Though the initial prototype was a little bulky (see below), it's sorted with this new design.

[Image: the bulky four-camera prototype]

Great. So a finger over one lens puts a dark smudge into the pic. Cue the forum posts: smudgegate, blackthingygate, mypicsgate.
 
While the 6S & 6S Plus are exciting,
they're not groundbreaking, unless there's a feature we don't know about yet, which is possible.
 
If there were more cores, then iOS developers would quickly learn to take advantage of them.

Correct multi-core, multithreaded coding turns out to be harder than most programmers (especially self-taught ones) think. The vast majority of app developers either can't do it or write apps that only use one core anyway.
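
To make the pitfall concrete, here's a minimal sketch in Swift using Grand Central Dispatch (the numbers and names are purely illustrative, not from any particular app). The single-core habit is a plain serial loop; the classic multicore bug is every worker adding to one shared total with no synchronisation, which is a data race. Splitting the work into independent chunks and serialising only the tiny merge step keeps it correct:

```swift
import Foundation

let numbers = Array(1...1_000_000)
let cores = ProcessInfo.processInfo.activeProcessorCount
let chunk = (numbers.count + cores - 1) / cores  // ceiling division

let lock = NSLock()
var total = 0

// concurrentPerform fans the iterations out across available cores.
DispatchQueue.concurrentPerform(iterations: cores) { i in
    let start = i * chunk
    let end = min(start + chunk, numbers.count)
    guard start < end else { return }
    // Each worker sums its own slice: purely local, race-free work.
    let partial = numbers[start..<end].reduce(0, +)
    // Only the merge touches shared state, so only it needs the lock.
    lock.lock()
    total += partial
    lock.unlock()
}
print(total) // 500000500000
```

Getting this right, and noticing when the lock rather than the cores becomes the bottleneck, is exactly the part most app code never bothers with.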
 
umm.... don't mention 'wafers'

These upgrades are looking so identical, you soon won't be able to tell them apart.
 
I will admit, I'm pretty shocked (and it sort of tells me something BIG and bad) that not one person has been able to give any reply to my post just a few posts back asking what apps take full advantage of the great hardware Apple has offered us.

To be honest, I was expecting a LOT of replies, with lists of great apps/games that took full advantage of the latest Apple models and the Metal API, and really blew the lid off the quality and performance devs are showing off.

Instead, total silence.

That's pretty worrying, and I will admit it does make me feel that, as I suspected, it's not happening.
Devs are not willing to push the edge too much; they're after mass-market sales across many years of Apple hardware, rather than making the very best of the very best Apple has given us to use.

Honestly... no reply was the last thing I wanted to hear, especially on such a BIG forum as this :(
 

Hi piggie, just want to clarify some things. Long-term geek here who follows CPUs and PC/computer development. This is just my caveat that I am not a CPU engineer, so if someone has better information, I'm happy to hear it.

First:
In regard to Intel's diminishing returns on their higher-end architectures: this is an unfortunate limitation that seems to be cropping up due to the physics of how transistors in CPUs work.

For CPUs to get noticeably faster in the x86 realm, there are generally two methods. The first is a system redesign: putting more transistors in place to do more work, faster, and more kinds of work. However, the limitation here is energy lost as heat. Every time a transistor in a CPU switches, a tiny bit of electrical energy is dissipated as heat. As more transistors are put in place to execute more instructions, they in turn generate even more heat. Intel learned the hard way during the Pentium 4 era that there is only so much heat a computer package can safely emit before it becomes uncomfortable for the user (they became mini heaters!) and you risk damage to components.

To counter the heat issue, to let transistors switch with less wasted energy, and to fit more transistors on an existing package, chipmakers undergo what is commonly known as a process shrink. You have probably seen terms like "14nm" and "10nm" thrown around. Loosely, this refers to the size of the smallest features of a transistor, historically the length of its gate. The transistor's channel is silicon, with the gate separated from it by a thin insulating layer; the smaller these features, the less charge each transistor has to move per switch, so less energy is wasted as heat (cooler transistors).
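
To put a rough number on why shrinks help (this is the standard first-order textbook model, not anything Intel-specific), a chip's switching power is approximately

$$P_{\text{dynamic}} \approx \alpha \, C \, V^2 \, f$$

where $\alpha$ is the fraction of transistors switching at any moment, $C$ is the switched capacitance, $V$ is the supply voltage, and $f$ is the clock frequency. A shrink lowers $C$ per transistor and usually permits a lower $V$, and because voltage enters squared, that's where most of the thermal headroom comes from; chasing raw $f$ instead is exactly what cooked the Pentium 4.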

But we have started running into a physics problem. As the features get smaller and smaller, the insulating barriers become so thin that electrons start leaking straight through them by quantum tunnelling; the transistor effectively "shorts out", which renders it useless. This has caused a dramatic slowdown in the ability to make much smaller transistors. Basically, until we find a replacement for silicon in our CPUs, there is a limit to how small they can go, which in turn puts a limit on how many more transistors we can add without pushing the heat boundaries again.
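
For a sense of why this hits a wall abruptly rather than gradually: basic quantum mechanics says the probability of an electron tunnelling through an insulating barrier falls off roughly exponentially with the barrier's thickness,

$$I_{\text{leak}} \propto e^{-2\kappa t}$$

where $t$ is the barrier thickness and $\kappa$ depends on the barrier material. Read in reverse, every time a shrink thins the insulation, the leakage doesn't grow a little, it multiplies, which is why each successive node has been harder and more expensive to deliver.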

Because of that, the high-end chip makers (especially Intel) have hit a point of diminishing returns. We're just not seeing these tremendous leaps in performance any more, and likely won't for a while, until we change our very theory of computation (quantum computing, for example, replaces the two-state transistor with qubits that can hold a superposition of states).

Second point:
I love what Apple has managed in the CPU space, but it too is highly dependent on the above. The difference, and the reason Apple and the other ARM chip designers are gaining so much performance so fast, is that they come at CPUs from the opposite angle. Where Intel started at the high-power end and has had to become cooler and more efficient, the ARM CPUs started as insanely low-powered, low-energy chips and have worked their way UP. So we are seeing absolutely tremendous leaps all at once as they venture into the higher-heat, higher-power territory that Intel used to be king of.

However, they are still significantly behind in providing extremely powerful CPUs. But then again, that's not their goal: they are meant to be far less complex, far more single-purpose, and far more power-efficient.

If you ran a straight benchmark between the fastest ARM CPU and the fastest x86 CPU, there would be no competition; the x86 CPU at this point would be leaps and bounds more powerful.

Intel, on the other hand, has a dramatic focus on scaling down to compete with ARM; that's where their biggest R&D effort seems to be these days.

Thirdly:
We have absolutely hit a point in tech where CPU architectures are no longer the bottleneck on their platforms; software seems to be the primary driver. Part of this, especially in gaming IMHO, was having the last generation of consoles around for far, far too long. Game devs were mostly programming to the lowest common denominator, the consoles: 10-year-old hardware by the end of it. This was evident when games released even in the last couple of years barely scratched the capability of modern PCs or even mobile devices! Devs weren't all that incentivised to do better, because these really old consoles weren't being replaced and couldn't power the newer, higher-end development we wanted. Thankfully, that has started to change in the last year, and games like GTA V and Dragon Age: Inquisition have really started pushing the boundaries of modern hardware. This should eventually trickle down to everything from console to mobile.


I know, long post, but I really hope that helps answer your questions. :)
 
Honestly... no reply was the last thing I wanted to hear, especially on such a BIG forum as this :(

Piggie, I answered above as soon as I saw it. And you're right: I've had a few legit technical questions go completely unanswered on these forums before.

It's a telling state of affairs that the majority of posters on this site are not really geeks or techies but, here's the evil word, "fanboys" who just want to follow along with whatever Apple is doing (yes, yes, haters too!) and aren't generally interested in the industry as a whole. Many don't understand the fundamental concepts behind computing technology and are just looking to see what Apple is selling. They listen to the keynotes, take the words said as fundamental truths, and never bother asking questions (asking questions like you did is fantastic).
 
Will buy if there is 2GB of RAM. Tired of iOS not being able to handle 2-3 apps open at the same time.

I wrote the same thing last year, but I really wanted the bigger display, so I sold my iPhone 5 and got the 6. This year they'd better put in 2GB of RAM and a triple-core SoC, otherwise I'll stick with my 6.
Since they put 2GB in the Air 2, we can imagine they'll do the same in their next line of products, but you never know...
 
Piggie, I answered above as soon as I saw it. And you're right: I've had a few legit technical questions go completely unanswered on these forums before.

It's a telling state of affairs that the majority of posters on this site are not really geeks or techies but, here's the evil word, "fanboys" who just want to follow along with whatever Apple is doing (yes, yes, haters too!) and aren't generally interested in the industry as a whole. Many don't understand the fundamental concepts behind computing technology and are just looking to see what Apple is selling. They listen to the keynotes, take the words said as fundamental truths, and never bother asking questions (asking questions like you did is fantastic).

Thank you for the reply, and yes, I sadly think you are right.
Perhaps it's different in some other areas of these forums, but the iPhone and iPad sections are perhaps more about "style and coolness" than anything technical, so maybe the wrong group to ask these kinds of questions.

I know Intel etc. have hit some limits on how fast they can push, but I will admit I don't actually believe they are as stuck as we think they are.
Let me put it this way: if AMD suddenly came out with THE best CPU and GPU, unveiled out of the blue, I would guarantee you that, as if by total magic, Intel would be able to counter it.
I do honestly believe that, whilst I know there ARE limits, they also, as a company, wish to make the most money from the least effort possible, and without competition they are just making the most of it.

As you say, low power is where it's at right now, simply because that's where the demand and money are, and the money men follow the money.
The days of tech firms pushing tech simply for tech's sake, because they are tech guys at heart, are, I fear, a distant memory.

I always want the latest and greatest, and have done since I chose the Amiga a few years ago :)
The point is, now and back then too... I want the software to take full advantage of my hardware.

Hence my question: is there any iPhone/iPad software (let's call it games, as it's games that push hardware, or should do!) that runs great on the 6, 6 Plus, and Air 2, but pretty badly on anything lesser?
 
Couldn't really tell you; I love having the latest and greatest for the "coolness" of having something to play with.

But I'm not a mobile gamer. Outside of benchmarking a few, I don't really pay attention to them. I think one reason we might not be truly pushing the CPU/GPU boundaries on i-devices is the nature of the games that lend themselves to touch screens and mobile devices.

They tend to be more puzzle and point-and-click than fast-paced, high-graphics titles.
 

Would it not be nice, however, for Apple to try to do what Microsoft is sort of attempting with their phones?
How about a future where the phone could run OS X as well?

Get home, after using your phone at work, or out and about.

Slip the phone into its charging dock on your desk, with the dock connected to a large 24" or 27" monitor, and the phone flips into OS X mode so you can carry on your work if you wish, as if the phone were your main computer.

The phone could be both.

Apple won't do this as it would damage sales, but it's only a matter of time before it's technically possible.
 

I would love that. Ubuntu tried to crowdfund exactly that and unfortunately failed.

One device for everything is a cool idea, but it would hurt cloud services to an extent; it's a different paradigm entirely. I haven't tried Win10 on my Surface Pro yet to judge just how useful it really is, but I'm looking forward to trying.
 
... Perhaps the source means they reworked the wafer design (by changing the reticle that contains the device die), which added some delay.

Or it really isn't "rework" but tweaks to how the dies are processed, to get the yields up.

Moreover, I remain skeptical of TSMC and Samsung co-producing on a leading-edge process. Their library processes are inherently different, and Apple would essentially be designing two chips simultaneously.

Depends upon the scope of "co-production". Think of the "A9" (the next generation) as a class covering both the iPhone (dual-core) and iPad (triple-core) chips, the way "A8" covers both the A8 and A8X instances. It would make sense if Apple were splitting iPad and iPhone chip production into separate lines that share the same basic micro-architecture design.

Two reasons. One is sheer volume. The iPhone keeps growing bigger each year, and Apple needs more and more wafers from foundries that have other clients clamoring for wafers too. If Apple can pull the iPad demand out, there are more wafers available at a single vendor within the same quota. Take the iPad Mini 3: why didn't it get an A8 along with the iPhone? Did Apple run out of available capacity? Instead, the Mini 3 got the same chip the Mini 2 had.


Second, the iPad doesn't have to be restricted to just being a big iPhone. For example, Apple is adding concurrent tasking and multi-window GUI applications to the iPad in iOS 9. The kinds of app workloads on the iPad may start to diverge from the iPhone, enough that Apple assigns a CPU+GPU development team to track the iPad while the iPhone keeps a more phone-targeted CPU+GPU team (e.g., one app visible at a time). (The iPod, Apple TV, etc. can just get trickle-down iPhone work.)

I stress that this is something that has never been done before for a leading-edge CPU.

For a singular CPU product instance, yes. But families have been split before. ARM's cores are implemented on a couple of different processes, but those are exactly the same design instances.
 
Just to add...

I REALLY wish Apple had more vision for the iPad.

All they have done with it in the last 5 whole years, in reality, is treat it as a giant iPhone, with pretty much the same silly user interface just expanded over a 10" screen.

It's so crazy and sad at the same time.

The bizarre thing is, way back in the beginning Apple made rules for devs that banned them from just taking a phone app, expanding it to fill the 10" screen, and doing little else.
And yet that is EXACTLY what Apple themselves did with the iPad's user interface.

It's a crime that Apple have had so little imagination about what to do with all that 10" screen real estate other than plop a few icons on a sea of wallpaper, and that's it.

Such a waste. And given Apple are supposed to be at the cutting edge of design and innovation, it's so sad that they have done nothing more.
And yet it seems they now wonder why sales are drying up?

Come on Apple. It's been 5 long years. PLEASE do something.
The iPad deserves far far FAR better.
 
5 years ago, to be honest, it was an important step, as it was really the only way to get a usable experience on a tablet.

Previous tablet iterations that attempted to run desktop OSes failed miserably because those OSes weren't at all intuitive on a tablet. One of the selling points of the iPad was that it would function exactly like those popular iPhones: no learning curve, really.

But you're right, that was the extent of the vision, it seems. Since then the iPad hardware has advanced very nicely, but it's still heavily constrained by iOS's limitations. It's still fundamentally a "phone OS" being used on a larger device.

We've seen in the competing camps, Android and Windows, a lot of unique ways of using the larger real estate, with OSes designed for the bigger screens. Many people complain about "Metro" on a desktop, for example (and maybe rightfully so), but if you've ever used Windows 8 on a tablet, the experience is actually quite well thought out. Android, while not perfect, also has some unique tablet-oriented stylistic changes and features, with the forked Android devices getting even more creative (Samsung has some seriously interesting things in some of their tablet lines).

I think this is why a lot of people are waiting eagerly for the iPad Pro, with some hope that Apple has brought something new to the line, more than just iOS.
 
Or it really isn't "rework" but tweaks to how the dies are processed, to get the yields up.

That would be a foundry-initiated endeavor. The piece reads as though Apple requested the change.



Depends upon the scope of "co-production". Think of the "A9" (the next generation) as a class covering both the iPhone (dual-core) and iPad (triple-core) chips, the way "A8" covers both the A8 and A8X instances. It would make sense if Apple were splitting iPad and iPhone chip production into separate lines that share the same basic micro-architecture design.

Two reasons. One is sheer volume. The iPhone keeps growing bigger each year, and Apple needs more and more wafers from foundries that have other clients clamoring for wafers too. If Apple can pull the iPad demand out, there are more wafers available at a single vendor within the same quota. Take the iPad Mini 3: why didn't it get an A8 along with the iPhone? Did Apple run out of available capacity? Instead, the Mini 3 got the same chip the Mini 2 had.

Second, the iPad doesn't have to be restricted to just being a big iPhone. For example, Apple is adding concurrent tasking and multi-window GUI applications to the iPad in iOS 9. The kinds of app workloads on the iPad may start to diverge from the iPhone, enough that Apple assigns a CPU+GPU development team to track the iPad while the iPhone keeps a more phone-targeted CPU+GPU team (e.g., one app visible at a time). (The iPod, Apple TV, etc. can just get trickle-down iPhone work.)

It is possible that they could optimize the iPhone chip for one process, then use the extra months to finalize the same design on another process and live with it being less optimized in the iPad. The schedule delta between the two is quite small, though.



For a singular CPU product instance, yes. But families have been split before. ARM's cores are implemented on a couple of different processes, but those are exactly the same design instances.
That's for the reference design. We're talking about a custom design with an aggressive time to market.
 
5 years ago, to be honest, it was an important step, as it was really the only way to get a usable experience on a tablet.

Previous tablet iterations that attempted to run desktop OSes failed miserably because those OSes weren't at all intuitive on a tablet. One of the selling points of the iPad was that it would function exactly like those popular iPhones: no learning curve, really.

But you're right, that was the extent of the vision, it seems. Since then the iPad hardware has advanced very nicely, but it's still heavily constrained by iOS's limitations. It's still fundamentally a "phone OS" being used on a larger device.

We've seen in the competing camps, Android and Windows, a lot of unique ways of using the larger real estate, with OSes designed for the bigger screens. Many people complain about "Metro" on a desktop, for example (and maybe rightfully so), but if you've ever used Windows 8 on a tablet, the experience is actually quite well thought out. Android, while not perfect, also has some unique tablet-oriented stylistic changes and features, with the forked Android devices getting even more creative (Samsung has some seriously interesting things in some of their tablet lines).

I think this is why a lot of people are waiting eagerly for the iPad Pro, with some hope that Apple has brought something new to the line, more than just iOS.

Yes.

And I have said this in the past.
I do totally understand why, on day one the iPad had to look like a GIANT iPhone.

1: Apple did not know if it was going to be a winner, so there was no point devoting time to a brand-new OS and UI for a potential flop.

2: Apple wanted to make sure users (let's be honest, Apple = a lot of non-technical users) were not scared off by a new, unfamiliar, and confusing interface.

I totally GET and understand both of those points. But as I said, this is distant history (5 years is a LONG time in computing)

And now, 5 years later, other than a bit of polish here and there, it's exactly the same as day 1.
Pretty unforgivable that they have just let it rot a bit and done nothing to make the most of the large 10" screen over the phone screen.

Yes, the iPad Pro.
One would hope big advances are ahead.
My worry is Apple = total lack of vision, and we will see the exact same UI just with more icons.

I'm ready for a change (Been ready for years!)

The joke about how many icons fit in the window on the 10" model was simply a joke, though I know they are making that a little better.
But it really does make you wonder how a group of such talented people can sit around a table, look at an iPhone interface on a 10" screen, look at each other, and not even think, "Perhaps we are not making the most of this."

Perhaps the danger is simply that they are so convinced they are right to leave it alone that nothing less than a dramatic fall in sales will make them contemplate anything more.
 
For a split second I thought you were talking about BlackBerry.
Looks like I picked the wrong week to quit sniffing glue.
 
Are those SoCs not ball-soldered on, but still using pins? I hope so; that would mean they're easily replaceable.
 
Moreover, I remain skeptical of TSMC and Samsung co-producing on a leading-edge process. Their library processes are inherently different, and Apple would essentially be designing two chips simultaneously. I stress that this is something that has never been done before for a leading-edge CPU...

Apple has double-sourced custom chip designs before. They mint enough cash to hire two (or more!) complete back-end design teams, and given a hardware variant of Brooks's Law, two fully staffed tape-outs might actually be safer than throwing all the bodies at just one.
 