Who's making 128Gbit chips? I can only find 64Gbit on Micron's website for example: https://www.micron.com/products/dram/ddr4-sdram
My understanding is you're not going to find the low-power DDR4 Apple uses on any of the memory producers' websites, because Apple buys it all.

Somewhat like how several Intel chips Apple uses are not in ARK - because Apple is the only user of them.

When you are at Apple's scale (especially when the iPhone is thrown into the mix) the normal rules go out the window.

Wish I could find the article that talked about the low power DDR4 they use but it's been over a year now.

Speaking of Micron, they are currently tripling the size of plant that's near me just outside of Washington, DC. Wouldn't that be a kick if it had something to do with Apple. Probably not, but it's an amusing thought.
 
So far, from what I have seen, the growing pains are surprisingly minimal - far fewer than I expected, even. They really seem to have hit it out of the park with Rosetta. I mean, you can create a shortcut for Terminal, Get Info on it, tell it to launch as x86, and install Homebrew and other CLI tools, and so far it seems to be pretty transparent.
Rosetta 2 appears to be performing a conversion during install / initial run rather than performing emulation as they had to do with 68K to PPC and PPC to x86.
 
In a lot of ways, the PC industry is tamed and corralled by a fear of someone else doing “today’s stuff, just faster” while you’re working on “truly groundbreaking, but the word ’break’ is in there for a reason”.
Meh - the PC industry is driven by cost. Period. Race to the bottom. Many thought Jobs being too critical when he spoke of a lack of taste; I think he was being too charitable.

You look at the majority of innovation - mice, the GUI, hardware standards like SCSI, SIMMs/DIMMs, USB, native networking (AppleTalk!), integrated WiFi that mere mortals could use (I think people gloss over how groundbreaking AirPort was when it launched), built-in Ethernet (hello iMac - the original!), CD-ROMs then CD-ROM writers/ditching the floppy: they shipped in volume or were pushed hard on the Mac first. Apple didn't invent any of that. But they *mainstreamed* it. Real Artists Ship. It's more than a trite phrase.

I shudder to think what a mundane space computing would be if left up to the soulless high-volume/low-margin PC wizards 🙃

Even if you hate Apple, your computing experience has been enhanced simply because they have been here pushing the envelope. Hell, people got pissed about the Apple II coming pre-assembled and ready for anyone to use, not just hobbyists. Regressive elitism is still alive and well in many open source communities today :p

There's been a lot of water under a lot of bridges in the last 50 years. And I got to experience most of it - I think that's what makes this transition somewhat amazing. For all the past change, the fact that there's still the ability to up-end current expectations and deliver such positive and decisive change brings back some of the thrill from when stuff like this happened every year instead of every 10 years!
 
Rosetta 2 appears to be performing a conversion during install / initial run rather than performing emulation as they had to do with 68K to PPC and PPC to x86.
They translate the code once for static apps. There's a performance hit on first run, then the translated code is cached. For JIT applications like the Java JVM, it's dynamically emulated like the original Rosetta.
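The translate-once-and-cache behavior described above can be sketched as a toy model (plain Python - the "translation" step, cache keying, and inputs here are all made up for illustration, not anything Apple-specific):

```python
import hashlib

translation_cache = {}  # maps binary hash -> "translated" code

def translate(x86_code: str) -> str:
    # Stand-in for the expensive x86-to-ARM translation pass.
    return x86_code.replace("x86", "arm")

def run(x86_code: str) -> str:
    # Key the cache on a content hash so the costly translation
    # happens only on first launch; later runs reuse the cached result.
    key = hashlib.sha256(x86_code.encode()).hexdigest()
    if key not in translation_cache:
        translation_cache[key] = translate(x86_code)  # slow, first run only
    return translation_cache[key]                     # fast thereafter

first = run("mov x86_reg, 1")
second = run("mov x86_reg, 1")  # cache hit: no retranslation
```

A JIT like the JVM defeats this scheme because it generates fresh code at runtime, which is why those workloads fall back to dynamic translation.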

Both, so far, seem to be faster for most apps than the apps running natively on the machines these M1s are replacing.

Not saying native still isn't preferable, but so far Rosetta 2 seems to be pretty freaking amazing for what it's doing for the vast majority of apps. The terminal/homebrew thing still boggles my mind.

And since these are the low-end machines in the Mac universe, if you have a really computationally intensive Pro app, these aren't the machines for you anyway.

It's almost as if they planned it or something :cool:
 
Maybe or maybe not, I’m not able to know how others use their stuff.
Well, macOS doesn’t support having two applications in the foreground at once. Once you click on a different window, that’s now the foreground application. So no one is using the UIs of two separate applications at once.

However - I was responding to your quote below, where you said ‘Does anyone really use more than one application at a time?’

Yes they do. I do daily.
Does anyone really use more than one application at a time?
which was followed with
Having windows “available” isn’t the same as actually typing into one window while touching up a drawing in another while meticulously fine tuning the color in another while scrolling through a webpage in yet another. I’m not even sure the UI handles multiple targets.
You, like everyone, are likely using applications by switching between windows while other applications are running in the background. No one is actively manipulating the UI of multiple applications at once. Well... maybe music producers are? Because they can play their keyboard while using the mouse to modify parameters? I’m not sure if that counts, though.
 
They translate the code once for static apps. There's a performance hit on first run, then the translated code is cached. For JIT applications like the Java JVM, it's dynamically emulated like the original Rosetta.

Both, so far, seem to be faster for most apps than the apps running natively on the machines these M1s are replacing.

Not saying native still isn't preferable, but so far Rosetta 2 seems to be pretty freaking amazing for what it's doing for the vast majority of apps. The terminal/homebrew thing still boggles my mind.

And since these are the low-end machines in the Mac universe, if you have a really computationally intensive Pro app, these aren't the machines for you anyway.

It's almost as if they planned it or something :cool:
Rosetta 2 is essentially transforming x64 applications into native applications, which would explain the higher speeds. There's no (or little) emulation occurring. That said, I expect a true native application to be even better, as it can be optimized in ways Rosetta 2 likely cannot.
 
I don't think they have a choice. The RAM will have to be external. 8GB per chip is as dense as DDR4 gets right now, as far as I know. So to match the iMac's 128GB option alone would need 16 chips, whereas Apple's photos show two chips taking up a huge amount of space on the package. Getting to 1.5TB like the Mac Pro offers? No chance if it's on the chip.
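A quick sanity check on the chip math in that quote (assuming 8GB - i.e. 64Gbit - per DDR4 chip, as stated above):

```python
GB_PER_CHIP = 8  # densest commodity DDR4 assumed in the quote

chips_for_128gb = 128 // GB_PER_CHIP    # iMac's top RAM option
chips_for_1_5tb = 1536 // GB_PER_CHIP   # Mac Pro's 1.5TB option (1.5 * 1024)
```

That works out to 16 chips for 128GB and 192 chips for 1.5TB - which is why packaging that much memory on the chip itself looks like a non-starter.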

Perhaps. Intel made custom chips for Apple (Ok, mainly custom packaging, but there is still overhead with each additional SKU) - I wouldn't be shocked if Apple was working with a memory supplier for higher density chips, if it made sense.

Especially if Apple were footing the bill for the design/tooling. Hmm - sort of like they did with TSMC to propel the advancement of the 5 nanometer process.

Speaking of custom packaging - stack those same 16 RAM chips in a new package? They don't necessarily need to re-invent the universe here.

Do not underestimate the power of Apple having total control. Need some wonky memory controller to talk to 16 chips directly at once? If the price/performance pays off and they want to, Apple can just do it. No more needing to convince a partner to do something that looks exotic today.

And for the high volume/low margin PC boys anything above bare minimum/least common denominator is considered exotic :p

I mean, could you see AMD/Intel trying to convince motherboard makers to invest in some new off-the-wall memory architecture beyond what they see today? Can you see Microsoft trying to herd all those same cats? Now look at who Apple has to convince - different groups within the same organization.

Nope - no advantage there!

I don't think we will see such a "clean slate" Mac design soon. We may never see one. But if it made sense for the right reasons, there's no reason now that we couldn't see something truly unique. If that doesn't get your geek juices flowing then I don't know what could.
 
Well, macOS doesn’t support having two applications in the foreground at once. Once you click on a different window, that’s now the foreground application. So no one is using the UIs of two separate applications at once.


Does anyone really use more than one application at a time?
which was followed with
Having windows “available” isn’t the same as actually typing into one window while touching up a drawing in another while meticulously fine tuning the color in another while scrolling through a webpage in yet another. I’m not even sure the UI handles multiple targets.
You, like everyone, are likely using applications by switching between windows while other applications are running in the background. No one is actively manipulating the UI of multiple applications at once. Well... maybe music producers are? Because they can play their keyboard while using the mouse to modify parameters? I’m not sure if that counts, though.
It’s a bit weird that you don’t understand this concept, but apps can be doing something, user-instigated, without being in the foreground or being interacted with. Hence, using multiple apps at the same time.

I’m OK if you want to argue with it, but as I said, I do this type of thing daily, so it’s a bit futile and rather bizarre that you’re adamantly denying it.

You’re trying to move the goalposts by suggesting that literally manipulating two apps at the same time is using more than one, whereas using one while others are working in the background is not.
Have you never realised how multitasking on a computer has worked since its inception? I’m not sure what you’re trying to argue against or trying to prove here, tbh.
 
This assumes no one else would have pushed innovation.
No one has to the level Apple did/is still doing.

Heck, look at Xerox PARC. Everyone loves to talk about how Apple stole from Xerox - while conveniently ignoring that Xerox didn't know what to do with Smalltalk, the GUI, the mouse, or the Alto systems. They were selling access to ideas, and Apple paid. Hell, Jobs berated Xerox management for literally sitting on the greatest innovation in computing at that time. If anything, Apple liberated the GUI and mouse from Xerox (as well as several key people from PARC to work on the Mac).

You think Microsoft would have done that? IBM? HA!

I lived the 80's and 90's. I saw this stuff first hand, up close and personal, through a series of happy accidents. I routinely attended Comdex and Interop in the late 80's and 90's since I grew up near Vegas, and even stole off to a few Macworlds after friends moved to the Bay Area and I could crash at their pads (was there for the launch of iTools, missed the Intel Macs by a year. D'oh!). I pressed the flesh on the show floors, fought in the NetWare - Windows NT - OS/2 wars during the gawdawful Spindler/Performa Mac era. When it comes to desktop computing, Apple was and still is the catalyst for fundamental change (well, as long as Jobs was in charge in the early days).

And not just desktop computing. Did you ever use Windows Mobile/WinCE? Microsoft's big idea for mobile? Desktop Windows on a mobile device. Warts and all. Fill up over a third of the screen with UI elements that make total sense on a desktop but just clutter up already small and crowded mobile displays. Don't get me started on that damn Start button and the persistent taskbar. Real innovation there.

Who came out with a phone that had a usable mobile UI first (anyone who mentions Symbian - think about your life choices)? Apple, with the iPhone. Yes, Palm was mobile first, but it also didn't go anywhere, and at the end was passed around like a hot potato before going to Compaq to die an ignominious death at the final hands of HP. That, and Palm never had cellular data (at least natively). Ahh, HP - the same bastards that killed off DEC Alpha and Itanium. Not that I'm bitter. Want to kill something? Sell it to HP. We used to think Computer Associates and then Symantec were bad. Ha! HP are the true pros at stifling innovation. Computer Associates and Symantec may have been greedy and charged you up the a$$ as they consolidated power in vertical segments, but at least they still provided some value. All HP did was kill $h!t. But I digress....

At least StrongARM escaped Compaq/HP's grasp to Intel. Who then promptly sat on it and later sold it for a pittance. Sadly, the dolts responsible for that probably aren't still at Intel to feel the proper shame for their lack of vision.

Still want to talk to me about innovation on the PC side of things? Just ask any Windows ARM Surface fan how they feel about the job Apple is doing with their ARM transition. Just have a little distance between you when you approach them. They are a touchy lot 🤣

I shouldn't laugh - maybe the shellacking Apple is delivering and apparently going to double/triple down on may finally light a fire under MS who will then light a fire under someone else. Samsung? Qualcomm? Oh Qualcomm - how fat and lazy on patent revenue you've become. Maybe 5G will snap them out of their patent profit trough. The trough that's going to start getting thinner and thinner. Hunger is a great motivator. Here's to hoping Apple induces many to once again get hungry and have some freaking pride again.

Time to once again snap this ridiculous industry out of its complacency. Innovation - would be a pleasant surprise for once.
 
We will never know as Apple and the PC are the only viable options today.
Amazon would likely disagree with you.

The raspberry Pi foundation would absolutely disagree with you.

Then we can poke the whole Android/Chromebook bear too.

Yours would have been a very accurate assessment in the 90's, but this is not the 90's. Wintel is no longer the dominant computing platform, just another fish in the sea - especially outside of the US and Europe :cool:
 
We will never know as Apple and the PC are the only viable options today.


‘PC’ has an all-encompassing meaning. That alone shows your comment is flawed. There are multiple options from multiple vendors that let you build a ‘PC’.
 
Base model?

My next downgrade is going to be a HUGE upgrade.

Yeah, it looks like I could be “downgrading” from my 2012 i7 27” iMac with 24GB RAM to an M1 Mac mini with 16GB for home use. Sure, it would be nice to be able to upgrade my memory beyond 16GB, but I honestly don’t think it will be an issue that often. I still use a beater (literally dented and abused MBA that I got cheap) 2013 11” MBA with 4GB RAM for light email, web browsing, and my son’s Zoom Taekwondo class - a machine that is clearly underspec’ed.
 
Yeah, it looks like I could be “downgrading” from my 2012 i7 27” iMac with 24GB RAM to an M1 Mac mini with 16GB for home use. Sure, it would be nice to be able to upgrade my memory beyond 16GB, but I honestly don’t think it will be an issue that often. I still use a beater (literally dented and abused MBA that I got cheap) 2013 11” MBA with 4GB RAM for light email, web browsing, and my son’s Zoom Taekwondo class - a machine that is clearly underspec’ed.
Your downgrade will definitely be an upgrade, but if you don’t need it right now, I’d wait a bit longer to see more pro options if that’s what you want.
 
‘PC’ has an all-encompassing meaning. That alone shows your comment is flawed. There are multiple options from multiple vendors that let you build a ‘PC’.
That's right. Unlike the early years of computing, where there were many companies doing their own thing, we are in an era of status quo.
 
Amazon would likely disagree with you.

The raspberry Pi foundation would absolutely disagree with you.

Then we can poke the whole Android/Chromebook bear too.

Yours would have been a very accurate assessment in the 90's, but this is not the 90's. Wintel is no longer the dominant computing platform, just another fish in the sea - especially outside of the US and Europe :cool:
That's why I put the "today" qualifier in my statement.
 
Nope, I will be laughing at the waiting, when you could be experiencing the future now. For $999, I don't need to replace my main driver today; I can get this, enjoy it alongside my work machine, and learn its strengths and weaknesses. Then, if in 6 months Apple releases a machine that I feel - out of experience, not guesswork, not old ways of doing things - that I could use, then cool. I upgrade, sell my M1 MBA for $700, and figure the $300 was a cheap price to pay for so much gained. But sure, wait! Kibitz from the sidelines. It's fun to read.
Ha ha ha, this was more of an instruction to the guy I was replying to who was complaining. My M1 Mac mini arrives tomorrow! 😁
 
Ha ha ha, this was more of an instruction to the guy I was replying to who was complaining. My M1 Mac mini arrives tomorrow! 😁

Oops! My bad. 🤭 Hope you enjoy your M1 Mac mini as much as I'm enjoying my M1 MacBook Air. I am amazed at how fast and cool-running it is. Not so sure about Big Sur though, lol.
 
That's why I put the "today" qualifier in my statement.
Except today there is already far more than Apple/Wintel.

I suspect we are aligned but probably talking past each other slightly - if so no worries.
 
Oops! My bad. 🤭 Hope you enjoy your M1 Mac mini as much as I'm enjoying my M1 MacBook Air. I am amazed at how fast and cool-running it is. Not so sure about Big Sur though, lol.
How is your memory use with Safari? I have a few windows open with a half dozen or so tabs each - and only one window with a few YouTube tabs (which seem to be the worst) - and with my 8GB 2015 MBA I've got an ever-growing swap file, currently at 5.14GB (it was 2GB a few hours ago) :p
 
This assumes no one else would have pushed innovation.
I think that’s the assertion being made. For everyone that tried to push innovation, there was someone else with access to the same customers that could just do “the same thing we’ve always done with tweaks”. The history of the PC is littered with companies like the makers of DR-DOS that tried to do something different but were stymied by someone else that had a strong desire to see the status quo continue with more minor adjustments (or just with THEM in the position of control instead of others :)

Just think, we COULD all be using a system that's basically iterations on the Apple IIGS!
 
How is your memory use with Safari? I have a few windows open with a half dozen or so tabs each - and only one window with a few YouTube tabs (which seem to be the worst) - and with my 8GB 2015 MBA I've got an ever-growing swap file, currently at 5.14GB (it was 2GB a few hours ago) :p

Now that you mention it, Safari was using up a ton of memory (around 6GB) on my iMac with 40GB RAM, for about 20GB of RAM usage total. A lot less on my MBA (around 6+ GB).
 
Because macOS and iPad OS manage memory in totally different ways. If an iPad is low on memory it will evict other apps to free up memory for the app you're actively using. If macOS runs out of memory, it starts swapping to your disk drive instead but keeps all apps open. The iPad approach is not suitable for a Mac and vice-versa.
But I still don't understand: even with all other apps evicted like you state, and with literally nothing else running on the OS, the iPad would have at most 4GB of RAM available for Affinity Photo, whereas the Mac uses more than 4GB of RAM just for Affinity Photo in the same test.
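The eviction-vs-swap contrast in the quoted post can be sketched as a toy model (plain Python; the app names, sizes, and 4GB budget are made-up illustrative numbers, not real iPadOS/macOS behavior):

```python
def ipad_style(apps, active, budget_gb):
    """Evict background apps (oldest first) until everything fits the budget."""
    resident = dict(apps)
    for name in list(resident):
        if sum(resident.values()) <= budget_gb:
            break
        if name != active:
            del resident[name]  # evicted: app must relaunch later
    return resident

def mac_style(apps, budget_gb):
    """Keep every app resident; the overflow goes to swap on disk."""
    total = sum(apps.values())
    swapped_gb = max(0, total - budget_gb)
    return apps, swapped_gb

apps = {"Safari": 3, "Mail": 1, "Affinity Photo": 4}  # hypothetical GB footprints
ipad_resident = ipad_style(apps, active="Affinity Photo", budget_gb=4)
mac_resident, swap_gb = mac_style(apps, budget_gb=4)
```

In the iPad-style run only the active app survives within the 4GB budget; in the Mac-style run all three apps stay open and the 4GB of overflow lands in swap - which matches the ever-growing swap files people describe upthread.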
 