Apple will be crippling any A12 gains by giving the iPhone the least amount of system RAM of all rival flagship Android smartphones: the iPhone's 3GB versus the 6GB to 8GB many other smartphones sport. That's totally messed up on Apple's part. Almost no one will see the A12's edge over Qualcomm and Samsung processors in everyday use. Nowadays, social apps are being valued much higher than any hardware Apple can ever hope to offer. Qualcomm will never give in to Apple in terms of processor power, so it will be a war that costs Apple huge amounts of money for marginal gains.

IMO Android has sort of dug its own grave as an OS with this. RAM is one of the cheapest components and Android really isn't optimised. Yes, you rightly say there's little real-world difference between an Android device with 6GB of RAM and an iOS device with 3GB, but that's part of the problem.

Android OEMs largely play to tech specs, so chucking in more RAM is an easy and cheap way to one-up the competition. However, as more Android manufacturers do that, it becomes the norm, and whilst the latest version of Android will sail along on those specs, it will struggle badly on a device with half the RAM. There's less pressure to optimise memory use because the OEMs just aren't shipping low-memory devices.

Throwing more cores and more RAM into a device to improve performance is a fundamentally flawed technique. If you don’t address the way an OS manages resources, you’ve lost the long game. Furthermore, consumers will learn to expect it and even if the OS is substantially improved, they’ll think “oh, only 8 cores? Only 6GB RAM? That’s what the last phone had. It can’t be quicker.”

Yes, as much as a lot of Android users love to boast about specs and Apple users being sheep, a vast number genuinely think that more cores = better performance and more RAM = better phone. That’s just not true. The arrogant users have a passing knowledge of technology which boils down to more is better. OS optimisation with a standardised list of hardware is so incredibly important for performance and frequently understated.

Apple will be crippling any A12 gains by giving the iPhone the least amount of system RAM of all rival flagship Android smartphones: the iPhone's 3GB versus the 6GB to 8GB many other smartphones sport. That's totally messed up on Apple's part. Almost no one will see the A12's edge over Qualcomm and Samsung processors in everyday use. Nowadays, social apps are being valued much higher than any hardware Apple can ever hope to offer. Qualcomm will never give in to Apple in terms of processor power, so it will be a war that costs Apple huge amounts of money for marginal gains.

By the way, I ****ing love DS9 and I ****ing love your username so I’m sorry to so vehemently disagree with you on this point.
 
... mass production on Intel's 10 nm process won't be coming until at least 2019 now.

Also, not sure I understand the question about the GPU. Intel has integrated the GPU on the same chip since their Sandy Bridge chips in 2011, which were made on a 32 nm process.

One reason for the delay is that they can't integrate the GPU on the 10nm CPU - if anybody knows the details, do enlighten us here.
 
I'm more interested in when we'll see advancements in battery technology.

As am I, but I think the factors that will mitigate the wait for dramatic advancements in battery tech are more efficient processors and, with micro-LED arriving in the next two years or so, a much lower power draw from the display, which would stretch battery life even further. But as far as brand-new battery technology as a whole goes, I still think that's three to five years minimum before we see anything like that.
 
Wow. Single digit nanometer processes... The cleverness and industry of humanity never ceases to amaze me.

I never heard about nanometer sizes in the Commodore 64 days. It must have been huge.
Best I can tell, it had feature sizes on the order of 1000 times bigger.

I'm actually having a hard time finding the process size of the MOS 6510 used in the C64 at the time of release. It looks like the VIC-II video controller was 5µm. The previous generation processor, the 6502 was used in the Apple II and it looks like it started with an 8µm process.
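To put the "order of 1000 times bigger" estimate in rough numbers (using the ~5µm and ~8µm figures above, which are themselves estimates), a quick back-of-the-envelope calculation looks like this:

```swift
// Back-of-the-envelope feature-size comparison, all values in nanometres.
// The ~5 µm (VIC-II) and ~8 µm (6502) figures are the rough estimates quoted above.
let vicII: Double = 5_000     // ~5 µm process
let mos6502: Double = 8_000   // ~8 µm process
let tsmc7nm: Double = 7       // TSMC's 7 nm node

print("VIC-II vs 7 nm: \(vicII / tsmc7nm)x")    // ~714x
print("6502 vs 7 nm: \(mos6502 / tsmc7nm)x")    // ~1143x
```

So roughly a 700x to 1100x linear shrink, which squares with the "on the order of 1000 times" guess.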
Lol. They’ve been saying since 90nm that we are near the quantum limit.
I thought the issue at 90nm was about the lithography-- that the process was suddenly going to get very expensive because they couldn't use UV to mask the chips anymore. Advancements in diffraction imaging kept the process alive, if I recall.

Or I could be remembering that all wrong... The point is the same though-- the semiconductor literature has been an endless stream of doom for decades. Sooner or later, I imagine, the predictions will be right.
I'm curious as to why TSMC hasn't been acquired by Apple. Seems like a great way to bring everything in house, reduce costs and protect the technology from competitors.
This would be a really bad business decision. Apple doesn't come close to using the full capacity of TSMC, so the only way to finance the ongoing operations would be for Apple to become a semiconductor supplier to others which is a business model they have no idea how to pursue-- razor thin margins for products with no human interface isn't what Apple's good at.

By designing their own parts but relying on outside fabs they maintain control of the functionality but can always benefit from the lowest cost manufacturing. If TSMC stumbles next year and can't get to 5nm, or if some other technology comes out of nowhere to dominate processor manufacturing, Apple can take advantage of it. Apple's volumes are such that they can get pretty good terms from whoever they buy from.
 
Because, of course, an iPhone is not an “actual” computer. Funny. As they say, “The best camera is the one you have with you” . . So goes it with the computer, and Apple has understood that with mobile computing better than the rest.
Look, I need to update my MacBook for doing actual work, and a phone is no serious replacement in any way.
 
Well, I'll be glad when the notch goes away. We only have it because it is a compromise, and hopefully it'll be gone by the end of next year.
 
Apple will be crippling any A12 gains by giving the iPhone the least amount of system RAM of all rival flagship Android smartphones: the iPhone's 3GB versus the 6GB to 8GB many other smartphones sport. That's totally messed up on Apple's part. Almost no one will see the A12's edge over Qualcomm and Samsung processors in everyday use. Nowadays, social apps are being valued much higher than any hardware Apple can ever hope to offer. Qualcomm will never give in to Apple in terms of processor power, so it will be a war that costs Apple huge amounts of money for marginal gains.
I think you're remembering comparisons of performance on the same system using different amounts of RAM and trying to generalize that to compare two very different approaches to memory management.

Most of this boils down to garbage collection in Java.

It's two different approaches to system design. Android felt that Java and its garbage collected memory management would make it easier for people to develop apps for their products. Apple has been moving from the more traditional developer controlled memory management to better automation of memory management through smarter compilers.
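For what it's worth, here's a minimal sketch of the Apple side of that trade-off: Automatic Reference Counting (ARC) in Swift, where the compiler inserts the retain/release calls and deallocation happens deterministically the moment the last reference goes away, rather than whenever a garbage collector eventually runs. The Tracker class and demo function are purely hypothetical, just to show the timing.

```swift
// Minimal sketch of compiler-automated memory management (ARC) in Swift.
// The compiler inserts the retain/release calls; no garbage-collection pass is needed.
class Tracker {
    let name: String
    init(name: String) { self.name = name }
    deinit { print("\(name) deallocated") }   // runs the instant the last reference goes away
}

func demo() {
    var buffer: Tracker? = Tracker(name: "image buffer")
    print("using \(buffer!.name)")
    buffer = nil   // refcount hits zero -> deinit fires right here, deterministically
    // Under a tracing garbage collector (as on Android/ART), the object would instead
    // linger until a future collection cycle reclaimed it.
}

demo()
```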
 
I'm holding out for 2nm. It really is amazing how small they are getting.
Looking forward to it. Wish the notch was only 7nm though.
There has to be an ultimate limit. 0 isn't possible but, as an engineer myself, I know that there is a limit that is "close enough".
 
Well, I'll be glad when the notch goes away. We only have it because it is a compromise, and hopefully it'll be gone by the end of next year.
What technology do you see on the horizon that would make the notch go away? We're dealing with basic physics here: how do you put cameras/sensors behind a screen (whether LCD, OLED, or micro-LEDs) that needs to display information and emits light? Perhaps the screen will have "missing" pixels - i.e. holes through which a camera/sensor in back can receive incoming light? But how much light can you actually capture even if you poke multi-pixel sized hole(s)?

The way I see it, there are only two alternatives: a pop-up camera/sensor arrangement (I think a manufacturer just announced such a beast this week!) - which comes with a myriad of much worse problems (mechanical parts that can break; additional water penetration points; clumsy to hold in landscape; difficult to protect with a case) - or forget about bezel-less design and go back to having a bezel - albeit much smaller than before.
 
I mean this is getting just ridiculous. Meanwhile, actual computers are neglected for years and years.
I think the reason is that computers are not constrained by space: if the processor needs more power, they can add a larger battery. Phones are constrained by space, and computers already have good-enough processors that work for almost everyone.
Apple will be crippling any A12 gains by giving the iPhone the least amount of system RAM of all rival flagship Android smartphones: the iPhone's 3GB versus the 6GB to 8GB many other smartphones sport. That's totally messed up on Apple's part. Almost no one will see the A12's edge over Qualcomm and Samsung processors in everyday use. Nowadays, social apps are being valued much higher than any hardware Apple can ever hope to offer. Qualcomm will never give in to Apple in terms of processor power, so it will be a war that costs Apple huge amounts of money for marginal gains.

I can't wait to see 32 GB of RAM on Android phones, which will make them super fast. Seriously, my laptop has 64 GB of RAM; I use my phone throughout the day, so it should have more RAM than my laptop.
Sarcasm ...
 
What technology do you see on the horizon that would make the notch go away? We're dealing with basic physics here: how do you put cameras/sensors behind a screen (whether LCD, OLED, or micro-LEDs) that needs to display information and emits light? Perhaps the screen will have "missing" pixels - i.e. holes through which a camera/sensor in back can receive incoming light? But how much light can you actually capture even if you poke multi-pixel sized hole(s)?

The way I see it, there are only two alternatives: a pop-up camera/sensor arrangement (I think a manufacturer just announced such a beast this week!) - which comes with a myriad of much worse problems (mechanical parts that can break; additional water penetration points; clumsy to hold in landscape; difficult to protect with a case) - or forget about bezel-less design and go back to having a bezel - albeit much smaller than before.
I think companies have been working on putting cameras behind screens for a couple years now.
 
Dumb question: what does 7nm mean? Is that the size of one NAND gate in the chip? I understand that smaller makes it faster and needs less power, but everything I read talks about node size.
 
As am I, but I think the factors that will mitigate the wait for dramatic advancements in battery tech are more efficient processors and, with micro-LED arriving in the next two years or so, a much lower power draw from the display, which would stretch battery life even further. But as far as brand-new battery technology as a whole goes, I still think that's three to five years minimum before we see anything like that.
Yes, and microLED is easily three to five years out as well. I don’t think we’ll see significant improvements in battery tech in that time period either.

In both cases I’d be looking more at 5-10 years out.
 
I think companies have been working on putting cameras behind screens for a couple years now.

Telling me that is pretty uninformative. What companies? How do they get around the basic conundrum of needing to receive visible light through a medium (the screen) that isn’t transparent?
 
According to a DigiTimes report last year, TSMC's integrated fan-out wafer-level packaging technology -- which the supplier uses in its 7nm FinFET chip fabrication -- is largely superior to any progress made by Samsung in the same field, which eventually led to Apple's decision to stick with one supplier for all of its processors again this year.

Lol, you guys should really do some research. Samsung is currently the one with the superior process for 7nm and beyond. Not a doubt in my mind, Apple will rely on Samsung's world dominating innovation as usual. TSMC will eventually get to EUV.

Source
 
I am not an expert here (and did not stay in a Holiday Inn last night), but the literature on this topic almost universally talks about 4nm being the ceiling (floor?) for this race. It does not offer changing geometry as a solution to this problem. As far as my limited knowledge of this particular topic goes, below about 4nm there is "bleed", where binary states are lost and/or flowing electrons jump lanes, either of which results in states that will make accurate computing fail.

My (perhaps poor analogy) would be worrying about the size of on and off light switches, so companies keep shrinking and shrinking them. What used to work with a finger tip flipping a plastic "handle" becomes too small for fingers anymore. At some point, we're bending paper clips to insert a point into a hole like we may do to pop a sim card holder out of a phone. And then it gets smaller than that and we're needing something thinner than a paper clip to toggle a light on and off. And then smaller. And then smaller still. Eventually, the size is reduced down so small, you no longer have a reliable way to turn that light on (binary 1) or off (0) without accidentally turning other lights on and off.

Again, perhaps a very poor analogy - but the concept remains the same (as perceived by me). The literature on the topic says that size shrinks to this maximum (minimum) such that reliability of state is in jeopardy. Anyone who knows anything about programming knows that if some rogue program flips some 0s to 1s in some other program's code, that other program is probably going to crash/fail/exhibit wonky behaviors. I read the literature on this topic as saying that below about 4nm this kind of thing is expected... though I thought I saw something about 3nm possibly being the max (min) instead of 4nm. I've seen nothing in support of anything below 3nm being possible.

Based on all that I've seen on this topic, I have zero expectations for 2nm, 1nm and then fractions of 1nm being possible. For example, I don't foresee rumors of the A18 chip in the iPhone XVs being spun as using the new 0.032nm process. Instead- from what I think I read on this topic- the peak at about 3-4nm is followed by "more cores" as the one way forward. However, just as it is with desktops/laptops, there quickly comes a point where more cores somewhat peaks out too (in short, I'm doubting the iPhone XVs rolls out with the new 24-core A18). As with desktop/laptop cores, eventually you have the proposition of more cores vs. most of them sitting around with nothing to do- thus long after dual cores become a common thing, we're still not seeing 50-core or 500-core PCs.

But one more time: I'm no expert on this topic- just trying to share what I think I make out from reading up on this particular topic. I could be entirely misunderstanding the collective sources and/or maybe Pym particles are around the corner so that 4nm can be shrunk to .4nm to be shrunk to .04nm and so on (with full support of electrons perhaps shrunk to some kind of on/off quarks or similar to be shrunk to some kind of undiscovered fractional quark (or magic) or similar).

The supposed 4nm limit is based on a FET structure. That's the underlying assumption. But there are many other ways to do transistors. HBTs are a good candidate, using SiGe, since there's a nice kink in the band gap due to lattice strain.
 
RE: "which is expected to be found in all three upcoming 2018 iPhones."


If AAPL were to offer a 4th new iPhone in Sept, a 5" iPhone 8s Plus @ $699, that would be, IMO, their best-selling new iPhone, & it alone could trigger a new wave of enthusiasm in ALL of mobile.

Specifically, I'm saying the Dual Camera 5.5" iPhone Plus shrunk down to 5".

But just like BMW & Porsche, who decided a decade plus ago to transition away from Hydraulic-assist power steering to Electronic-assist power steering, it appears AAPL may be making almost as BIG of a Blunder.

Touch ID & the Hardware Home Button are excellent product attributes, & are strongly preferred by some ... AAPL would be wise NOT to throw ALL who desire them under the bus ... just look what has happened to BMW's once-loved 3-series sedan !

Does Tim Cook even get it ???

Does the guy who conceived the strategy to get Apple stock and profits to historic heights get it??? You lost the Internet with that statement.
 
One reason for the delay is that they can't integrate the GPU on the 10nm CPU - if anybody knows the details, do enlighten us here.

This is just a guess, but the GPU has previously been outsourced. I believe it's expected for the A12 to be Apple designed. By doing this they can design it around being on the CPU die itself.
 
This is just a guess, but the GPU has previously been outsourced. I believe it's expected for the A12 to be Apple designed. By doing this they can design it around being on the CPU die itself.

I wasn't clear about this on re-reading; I am talking about Intel's delay in 10nm release being attributed to their issues with integrating a GPU in the new 10nm CPUs. Apple seems to have a handle on the iOS devices' CPUs; it is the Mac products that are getting affected by the Intel delay.
 