
ipedro

macrumors 603
Original poster
Nov 30, 2004
6,232
8,493
Toronto, ON
After the early Mac Christmas that introduced an all-new iMac, refreshed Mac Minis, and a 13" Retina MacBook Pro, the Mac Pro was notably absent.

Many professionals can comfortably have their performance needs met by a high-end MacBook Pro or the new iMac. But there are those who need multiple large screens, powerful graphics cards, very high-capacity redundant drives, and purpose-built industrial hardware and/or interfaces.

I think Apple could meet this need with a modular Mac Mini. With the introduction of Thunderbolt, interconnected external modules that behave like internal components are now possible.

[Attached image: ministack-1.jpg]


1 - Pick a CPU and memory configuration. The built-in graphics drive the displays through the Thunderbolt ports, and the SSD is dedicated to OS X.
You can add multiples of these to multiply your processing power. OS X takes care of distributing processing to the available cores.
2 - Add dedicated graphics modules, stackable to meet your needs.
3 - Add storage modules with all the redundancy you need.

You can then add any custom-built components that drive factory equipment, assembly robots, medical equipment, optical media production, or any other custom purpose that suits the kind of Mac customers demanding a professional Mac tower.

People who want a Mac Pro want it because it's expandable. But even the Mac Pro is limited by the number of slots in the case. A modular Mac Mini would be nearly unlimited. You could mount them in racks to build a supercomputer, far more affordable than any other supercomputer built today, because this system could be purchased and assembled off the shelf.

People who want a Mac Pro want it because it offers options with the highest specs. The stackable CPU modules would distribute processing to as many cores as you need. Right now, the Mac Pro can be built with up to 12 cores; three Mac Mini stacks at four cores each could match that. Add more for even more processing power.

This would be a good opportunity for Apple to introduce the optical version of Thunderbolt and a locking system to attach each stack to the one below it.
 

Aldaris

macrumors 68000
Sep 7, 2004
1,790
1,247
Salt Lake
I can see something along these 'modular' lines. And it might not be too bad of a concept... Maybe within a current Mac Pro frame, with 'Mini'-like components that could quickly/easily be somewhat hot-swappable (except for key items like the CPU/GPU, etc.)

It could honestly be the xMac/hybrid that many have requested, and be highly configurable for a wide spectrum of users.
 

blanka

macrumors 68000
Jul 30, 2012
1,551
4
Thunderbolt has nowhere near the bandwidth needed for discrete graphics. I think a PCI Express x16 slot can handle the traffic of 10 Thunderbolt ports with ease.
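
Some rough numbers, assuming the original 10 Gbit/s Thunderbolt channel and nominal per-direction PCIe rates (ballpark figures only, not real-world throughput):

Code:
// Back-of-envelope bandwidth comparison -- nominal figures only.
let thunderboltChannelGbps = 10.0   // one Thunderbolt channel, ~10 Gbit/s
let pcie2x16GBps = 8.0              // PCIe 2.0 x16, ~8 GB/s per direction
let pcie3x16GBps = 15.75            // PCIe 3.0 x16, ~15.75 GB/s per direction

// How many 10 Gbit/s Thunderbolt channels each slot is roughly worth:
print(pcie2x16GBps * 8 / thunderboltChannelGbps)   // ~6.4
print(pcie3x16GBps * 8 / thunderboltChannelGbps)   // ~12.6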
 

Neodym

macrumors 68020
Jul 5, 2002
2,433
1,069
Yes, that would be a nice approach, as I suggested e.g. here and here.

I even suggested it to Tim Cook by email (unfortunately without a response), and I think you should do the same - perhaps the thought will gain traction at Apple if more people request such a modular system.
 

DJenkins

macrumors 6502
Apr 22, 2012
274
9
Sydney, Australia
This idea keeps coming up every now and then, and historically Apple would be the company to attempt this re-invention. Even though I think Apple have completely dropped the ball on the pro market, they are the sort of company I admire for previously making bold moves, probably ahead of their time.

From my limited understanding, I can see a few hurdles.

OS X would need to be able to distribute the work to the different boxes. Sure, it can utilize multiple cores now, but it's not the same with separate processors. The way i7 chips are built, you can't even use them in dual-proc configs. For that you need certain Xeons. And for 4-proc machines you need a completely different series of Xeons again. And all of these are on the same motherboard.

It would take something really special about a Thunderbolt connection to get this sort of distributed processing happening.

Second is the cost per unit. Even though you are getting more cores/processors, you still need to pay for a case, some sort of motherboard, and all the guts, etc.

I considered this before building my hackintosh. I thought I would get a second-hand Mac Pro and buy a few Mac Minis to help render my 3D work. But in the end it was way cheaper to get more power out of just one 12-core machine. With each new gen of processors, they are adding more and more cores; 10-core processors are already available. I think the trend is more cores per processor, not more separate processors.

Another thing is heat, all those machines with tiny fans and nowhere for the air to go?!

In my opinion people need to stop pinning their hopes on one single company to deliver their dreams. Look around and see what is the best you can buy at this point in time. Buy it now and free yourself up to go get some work done! If Apple releases a better machine later down the track, there's nothing stopping you from going back to them!
 

JesterJJZ

macrumors 68020
Jul 21, 2004
2,443
808
there's nothing stopping you from going back to them!

Except for your software investments and filing system archives and backups.

Sure, you can crossgrade certain software, some at a cost, still a PITA.
I'd also rather not have MacDrive handling ALL my data on a PC either.

Going cross-platform and back isn't something many people take lightly. Many of us need to commit if we're gonna switch.
 

DJenkins

macrumors 6502
Apr 22, 2012
274
9
Sydney, Australia
Except for your software investments and filing system archives and backups.

Sure, you can crossgrade certain software, some at a cost, still a PITA.
I'd also rather not have MacDrive handling ALL my data on a PC either.

Going cross-platform and back isn't something many people take lightly. Many of us need to commit if we're gonna switch.

Good points indeed. I'm mainly thinking of myself as a single user, or of the places I work at, which have servers with both Macs and PCs attached.

A business totally dependent on one system or the other would be a different story, but I do think crossgrading is gaining ground. I know if you volume license with Adobe you get both Mac & PC serials to interchange as you wish.
 

Umbongo

macrumors 601
Sep 14, 2006
4,934
55
England
I think Apple could meet this need with a modular Mac Mini. With the introduction of Thunderbolt, interconnected external modules that behave like internal components are now possible.

I get why the Mac mini and Thunderbolt encourage this type of thought, but they aren't the solution. It would only be a solution if Apple were to look no further than inside the box and use what they have on hand.

Thunderbolt suffers from all the issues of PCIe connectivity. It's fine for bandwidth (although PCIe 3.0 x16 is half the interconnect bandwidth between two Sandy Bridge-EP Xeons), but not for latency. Creating their own interconnect is possible, but that would be a large investment for a niche product. They are realistically limited by what is happening in the server space.

The mini stack may look good and logical, but it suffers from the heat issue, which, without some change in cooling technology or physics, means this type of system will never be a replacement for a big box.

What you are describing is a farm of computers. Fine for some tasks, but not for workstation use, where low latency and high speeds are key. The software also isn't at the stage yet where it can make use of many cores efficiently.

There are many companies, groups of companies, and academic groups trying to solve the issues that need to be overcome for something like this, and I don't see Apple coming out of nowhere and being the one to solve them. They don't have the interest or the talent.
 

Wardenski

macrumors 6502
Jan 22, 2012
464
5
1 - Pick a CPU and memory configuration. The built-in graphics drive the displays through the Thunderbolt ports, and the SSD is dedicated to OS X.
You can add multiples of these to multiply your processing power. OS X takes care of distributing processing to the available cores.
2 - Add dedicated graphics modules, stackable to meet your needs.
3 - Add storage modules with all the redundancy you need.

You can then add any custom-built components that drive factory equipment, assembly robots, medical equipment, optical media production, or any other custom purpose that suits the kind of Mac customers demanding a professional Mac tower.

I don't think this is an elegant solution. You would need a power adapter for almost every box. GPUs limited to Thunderbolt would not be as good as PCIe x16, and the crosstalk between CPUs would also be limited by Thunderbolt.

Also, this would be a fantastically expensive product.

So no, this product would not be very good!
 

Neodym

macrumors 68020
Jul 5, 2002
2,433
1,069
OS X would need to be able to distribute the work to the different boxes.
Like Grand Central Dispatch? ;)

Sure, it can utilize multiple cores now, but it's not the same with separate processors.
Why not? Okay - latency is higher when having to switch to another core cluster, but basically it's the same principle.
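
A minimal sketch of the idea (Swift syntax, GCD; processTile is just a stand-in for real work): the caller never names a core, GCD spreads the tiles over whatever cores the machine reports, and conceptually the same abstraction could hide extra CPU modules too.

Code:
import Dispatch

// Stand-in for real per-tile work (an image filter, a render pass, ...).
func processTile(_ index: Int) {
    _ = (0..<100_000).reduce(0, +)
}

// The caller only says "here are 64 independent tiles"; GCD decides how to
// spread them over the available cores. No core is ever named explicitly.
DispatchQueue.concurrentPerform(iterations: 64) { tile in
    processTile(tile)
}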

The way i7 chips are built, you can't even use them in dual-proc configs. For that you need certain Xeons.
You can not use them multi-CPU with hardware-supported on-chip interfaces. But you could pool multiple CPUs with multiple cores per individual CPU. The OS would then take care to assign tasks to one CPU within that cluster and distribute the threads of that task only onto the cores of the CPU the parent task is running on.

Think of something along the lines of multiple dies on one chip (which has already been done by others in the past), only here it'd be multiple dies on one PCB.

Maybe not trivial, but not exactly rocket science either...
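
Very roughly, the scheduling could look like this - CPUModule and the round-robin policy below are made up purely for illustration:

Code:
import Dispatch

// Hypothetical "CPU module": one concurrent queue stands in for its cores.
struct CPUModule {
    let id: Int
    let queue: DispatchQueue
}

let modules = (0..<3).map {
    CPUModule(id: $0, queue: DispatchQueue(label: "module-\($0)", attributes: .concurrent))
}

let group = DispatchGroup()

// Assign each whole task to one module (round-robin here), so all of that
// task's threads stay local and only coarse task-level work crosses the link.
for task in 0..<9 {
    let module = modules[task % modules.count]
    module.queue.async(group: group) {
        // the task's internal parallelism runs on this module only
        _ = (0..<100_000).reduce(0, +)
    }
}
group.wait()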

And Xsan may be discontinued in 10.8, as someone wrote, but that doesn't mean it can't be developed further if need be.

Another thing is heat, all those machines with tiny fans and nowhere for the air to go?!
The current Mac mini housings contain CPUs with up to 45W TDP. A Cortex-A9 dual-core has a TDP of only 2W. That would allow for roughly 20 ARM dual-core CPUs = 40 cores within the current thermal envelope of that housing (volumetric issues aside).
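
The arithmetic, using those nominal TDP figures:

Code:
let miniCpuTdpWatts = 45.0      // highest-TDP CPU in the current mini
let a9DualCoreTdpWatts = 2.0    // ballpark for a Cortex-A9 dual-core SoC

let packages = miniCpuTdpWatts / a9DualCoreTdpWatts   // ~22 dual-core packages
let cores = packages * 2                              // ~45 cores, i.e. "roughly 40"
print(packages, cores)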

The mini stack may look good and logical, but it suffers from the heat issue, which, without some change in cooling technology or physics, means this type of system will never be a replacement for a big box.
Think outside the box. As I wrote above, the thermal envelope of the mini housing would allow for roughly 40 ARM cores - maybe more if Apple tweaked the design further towards energy saving.

There are many companies, groups of companies, and academic groups trying to solve the issues that need to be overcome for something like this, and I don't see Apple coming out of nowhere and being the one to solve them. They don't have the interest or the talent.
Ummm - how would you know that they'd lack both interest and talent?

They took the mobile music player market by storm with the iPod.

They completely revamped the (smart)phone market out of the blue, a market which was considered firmly established, with quite a few experienced competitors.

They actually established a tablet market (again with a new product), which several competitors had failed to accomplish for years before.

Same goes for what's now named the "Ultrabook" segment in notebooks (granted - they probably had help from Intel).

Or the All-in-One market, which is slowly being targeted by various competitors.

And competition in the Mac mini type segment is still few and far between - often not able to match Apple's prices - in spite of the mini being quite popular (and the segment therefore potentially attractive to competitors in the face of slowing PC sales in general).

Unless you work inside Apple I find it a bold statement to say that Apple would lack both interest and talent to (successfully!) try new approaches or solutions!

They may have neglected the computer segment lately (perhaps related to the Forstall case), but it seems that they are about to correct that now and in the coming months...

----------

You would need a power adapter for almost every box.
...unless you power the modules from one central PSU module with proprietary internal connectors.

GPUs limited to Thunderbolt would not be as good as PCIe x16, and the crosstalk between CPUs would also be limited by Thunderbolt.
...unless you connect the modules with a dedicated proprietary connector capable of much higher speeds, combined with fine-tuned load distribution. Thunderbolt is not the only high-speed interface available on the market.

Also, this would be a fantastically expensive product.
In other words: A typical Apple product! ;)

So no, this product would not be very good!
I think your mind is stuck on the (consumer) hardware currently available on the market. Think bigger - something like Thunderbolt was also considered impossible not too long ago...
 

Umbongo

macrumors 601
Sep 14, 2006
4,934
55
England
Think outside the box. As I wrote above, the thermal envelope of the mini housing would allow for roughly 40 ARM cores - maybe more if Apple tweaked the design further towards energy saving.

I don't see many Mac Pro users clamoring for that.

Ummm - how would you know that they'd lack both interest and talent?

They haven't shown any interest in this arena. You're talking about servers and workstation replacements. Well, it isn't a workstation replacement for starters, and Apple aren't interested in the enterprise, as evidenced by their behavior. They are a consumer products company. And yes, I can say they don't have the talent, because Apple, like many in Silicon Valley, are starved of engineering personnel, and those doing stuff similar to this aren't at Apple.

At the end of the day, what problem that Apple has does this solve? Forget Mac Pro customers, because this doesn't solve anything for them, but where is Apple's end game?
 

ActionableMango

macrumors G3
Sep 21, 2010
9,612
6,907
This idea horrifies me every time I hear it.

I don't want a bunch of small boxes all with separate power cables and other cables running everywhere, with more boxes and cables for all the external drives, and yet more boxes and cables for the external graphics cards. This does not sound like Apple to me.

Not to mention only being able to take advantage of the very few programs that will be rewritten to take advantage of this architecture, and even then probably only after it's been out for a few years. I also doubt GCD would live up to the hype and work seamlessly without software specifically being rewritten for use with distributed computing. Also, knowing the software companies, very few will bother, and they'll drag their heels doing it.

There are two things I'd like to see far more than this suggestion. (1) A Mac Pro updated for current Intel architecture, current graphics, and current connection options. (2) The fabled xMac.
 

cryingrobot

macrumors regular
Mar 26, 2008
156
0
Modular computing like this is such an obvious direction, but no maker wants to gamble on the engineering and costs to roll out what would be a brand new platform. Maybe Jobs would have done it, but as his legacy grows older, there is less and less difference between Apple and the HPs and the Samsungs.
 

Neodym

macrumors 68020
Jul 5, 2002
2,433
1,069
I don't see many Mac Pro users clamoring for that.
I do. They all want (more) power, flexibility, and no further rise in costs (at least the "xMac part" of Mac Pro users).

They haven't shown any interest in this arena.
Neither did they in e.g. mobile phones, tablets or music players before they entered the market.

You're talking about servers and workstation replacements.
I'm talking about the complete desktop portfolio, including but not limited to workstations such as the Mac Pro. Maybe except the iMac, but including an xMac.

Well it isn't a workstation replacement for starters
How do you define "workstation"?

Apple aren't interested in the enterprise
Maybe their focus is currently more on iDevices, but they have made several decisions to help penetrate the enterprise market, such as support for Outlook, Active Directory, encryption, etc.

And yes, I can say they don't have the talent, because Apple, like many in Silicon Valley, are starved of engineering personnel, and those doing stuff similar to this aren't at Apple.
One more reason for Apple to research different approaches then.

At the end of the day, what problem that Apple has does this solve? Forget Mac Pro customers, because this doesn't solve anything for them, but where is Apple's end game?

- Better market differentiation from other (Intel) boxes
- Loosen dependence on Intel's roadmap
- Bring another innovation
- Stay competitive as ARM gains traction in desktop computing (-> Windows 8/RT)
- Fewer hardware platforms across the portfolio...
- ...while at the same time being able to cater to more customer demands, e.g. the xMac
- Further leveraging of economies of scale (e.g. housings, Axx CPUs)
- Further standardization across the setup (R&D cost, maintenance, warranty)
- Higher performance in massively multithreaded applications (workstation)

This idea horrifies me every time I hear it.

I don't want a bunch of small boxes all with separate power cables and other cables running everywhere, with more boxes and cables for all the external drives, and yet more boxes and cables for the external graphics cards. This does not sound like Apple to me.
Because it isn't! Take a look at this. All internal connectors, and modules of varying size to accommodate powerful graphics cards as well as chipset graphics, rotating storage as well as SSD blades.

You would not need more external cables than with today's solutions.

Not to mention only being able to take advantage of the very few programs that will be rewritten to take advantage of this architecture, and even then probably only after it's been out for a few years.
Sounds familiar - similar concerns were voiced after the transition from PPC to Intel was announced.

I also doubt GCD would live up to the hype and work seamlessly without software specifically being rewritten for use with distributed computing. Also, knowing the software companies, very few will bother, and they'll drag their heels doing it.
The development effort I see Apple having to make is to offer exactly that: break through the paradigm that distributed computing is limited to very specialized use cases requiring highly specialized software. IMHO GCD is a good start in that direction.

There are two things I'd like to see far more than this suggestion. (1) A Mac Pro updated for current Intel architecture, current graphics, and current connection options. (2) The fabled xMac.
Neither of those would take more than a year to develop, let alone three. And neither would fit Tim Cook's wording of "something really great" in the works.

Modular computing like this is such an obvious direction, but no maker wants to gamble on the engineering and costs to roll out what would be a brand new platform. Maybe Jobs would have done it, but as his legacy grows older, there is less and less difference between Apple and the HPs and the Samsungs.
This may well be the litmus test for Apple to demonstrate if they can still innovate without the former company visionary at the helm.
 

ipedro

macrumors 603
Original poster
Nov 30, 2004
6,232
8,493
Toronto, ON
Quote:
Originally Posted by ipedro
Think outside the box. As I wrote above, the thermal envelope of the mini housing would allow for roughly 40 ARM cores - maybe more if Apple tweaked the design further towards energy saving.
I don't see many Mac Pro users clamoring for that.

hmm, you're quoting something that I didn't write.
 

Umbongo

macrumors 601
Sep 14, 2006
4,934
55
England
hmm, you're quoting something that I didn't write.
Yeah, must have hit multi-quote and deleted the wrong bits. Sorry about that.

I do. They all want (more) power, flexibility, and no further rise in costs (at least the "xMac part" of Mac Pro users).

What do you do that you want 40 low-power cores and a switch back to RISC over high-end x86 processors? Because the majority of Mac Pro users are into digital content creation, and parallelism is nowhere near a suitable stage yet; you would lose massive performance by giving up the single-core performance of Intel's processors.
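
Amdahl's law makes the point; the 70% parallel fraction and the 4x per-core speed gap below are assumptions for illustration, not benchmarks:

Code:
// Speedup relative to a single fast (Xeon-class) core.
func speedup(parallelFraction p: Double, cores n: Double, perCoreSpeed s: Double) -> Double {
    return s / ((1 - p) + p / n)
}

let sixFastCores = speedup(parallelFraction: 0.7, cores: 6, perCoreSpeed: 1.0)     // ~2.4x
let fortySlowCores = speedup(parallelFraction: 0.7, cores: 40, perCoreSpeed: 0.25) // ~0.8x, slower overall
print(sixFastCores, fortySlowCores)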

xMac people to me have always seemed to want a consumer desktop that meets their exact needs and costs the same as building it themselves. Usually their requirements are those of a mid-range gaming PC.

Neither did they in e.g. mobile phones, tablets or music players before they entered the market.

I'm talking about the complete desktop portfolio, including but not limited to workstations such as the Mac Pro. Maybe except the iMac, but including an xMac.

I don't see enough market interest for this: the desktop market is shrinking, only power users would have use for more than one little box, and this type of system couldn't compete with the alternatives in terms of performance.

How do you define "workstation"?

The same way manufacturers do: a high-end single system, the front runner of hardware performance. Which these days means Intel Xeon or AMD Opteron with Quadro or FirePro graphics. The exception is the Mac Pro using consumer cards, because Apple control the drivers, certification and support anyway.

Maybe their focus is currently more on iDevices, but they have made several decisions to help penetrate the enterprise market, such as support for Outlook, Active Directory, encryption, etc.

To make it easier for people in enterprise to use their consumer devices. That is different to Apple competing with hardware vendors, which they failed to do before.

- Better market differentiation from other (Intel) boxes
- Loosen dependence on Intel's roadmap
- Bring another innovation
- Stay competitive as ARM gains traction in desktop computing (-> Windows 8/RT)
- Fewer hardware platforms across the portfolio...
- ...while at the same time being able to cater to more customer demands, e.g. the xMac
- Further leveraging of economies of scale (e.g. housings, Axx CPUs)
- Further standardization across the setup (R&D cost, maintenance, warranty)
- Higher performance in massively multithreaded applications (workstation)

You give ARM way too much credit here and expect too much of software to utilize many cores. There is a reason no one has made RISC-based workstations for 3 years, and why people aren't running Photoshop, Maya and Logic on server farms.

Don't get me wrong, I have no issue with the concept. It would make sense if it were feasible. It isn't, though. Maybe in the future, with vast OS and hardware advancements for parallelism and significant ARM advancements, but not in the near future.
 

Moonjumper

macrumors 68030
Jun 20, 2009
2,740
2,908
Lincoln, UK
xMac people to me have always seemed to want a consumer desktop that meets their exact needs and costs the same as building it themselves. Usually their requirements are those of a mid-range gaming PC.

I'm someone who would love an xMac. I would not expect it to be the same cost as building it myself, just somewhat cheaper than the iMac as it doesn't have the display costs. The reasons I like the idea are:

  • Much better thermal constraints so that desktop components can be used throughout
  • Upgradability after purchase
  • Ability to easily replace failed components
  • Choice of monitors, which was very important when the iMacs had extremely glossy screens, but is hopefully not a problem with the new model
  • Be able to keep monitors for use with replacement computers

The Mac Pro is an excellent machine, but it is also very expensive. I don't need server grade components.
 

ActionableMango

macrumors G3
Sep 21, 2010
9,612
6,907
Sounds familiar - similar concerns were voiced after the transition from PPC to Intel was announced.

Apple was being crippled by PPC and everyone knew this had to happen. It was better for everyone to unify under x86. What you are asking for is more similar to the opposite happening, moving from x86 back to PPC.

We've also had dual and multicore processors as standard equipment for years now. And yet the overwhelming majority of applications make poor use of them, even the most processor-intensive applications. It's the reason why the single-CPU hexacore MP beats the dual-proc 8-core MP except in very limited scenarios.

If software companies can't even be bothered to design their software for multicore, which has been a standard in the industry for years, you are expecting way too much for them to jump into a proprietary Apple design. I see zero motivation for anyone to support this.

Add to this mix that Adobe and Apple's friendship is long, long over. Apple started to compete with Adobe products, then shoved Flash up their yahoos. Adobe is far more likely to look at this and laugh than to embrace it with open arms and gusto.

Who knows, maybe you're right. I just really, really hope you're not. It would unquestionably push me into Windows.
 

laserbeam273

macrumors 6502
Sep 7, 2010
424
0
Australia
This may well be the litmus test for Apple to demonstrate if they can still innovate without the former company visionary at the helm.

Sounds awesome, I like your thinking.

I've thought about this before but took it further, for all Macs - imagine if you could just Thunderbolt your MacBook Air to your iMac, and it would automatically carry some of the processor and graphics load. Would be super cool.

We're all thinking about the same thing. But who will bring it to the real world?
https://forums.macrumors.com/threads/1268058/

Maybe, if Apple doesn't do it, a group of computer geniuses could do it. You'd have to get down to the bare bones, probably start with Unix and change the very core of the way the computer works. Would be revolutionary, I reckon. I think the real challenge is to carry out such advanced parallel threading that you can predict the time a single process will take, know what upcoming processes depend on it, and decide whether the time taken to send the process there and back is less than waiting.

Obviously, for blatantly distributable tasks like rendering, this is entirely feasible. But for small, complex tasks like opening a program, or a complex sequence of code, this is really hard. I feel as though it almost calls for a completely new style of programming: modular pieces of code, where A and B are independent processes, C depends on A and B, and the computer can evaluate the complexity of each individual task. It can then decide whether sending A or B to an external processor would be advantageous.
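
Something like this, maybe - all the names and numbers below are made up, it's just the shape of the decision:

Code:
struct SubTask {
    let estimatedSeconds: Double   // predicted local compute time
    let payloadMB: Double          // data that has to travel there and back
}

let linkMBps = 1000.0              // ~10 Gbit/s Thunderbolt-class link
let remoteSpeedup = 2.0            // assume the slave box offers 2x idle capacity

// Offload only when transfer plus remote compute beats doing it locally.
func shouldOffload(_ t: SubTask) -> Bool {
    let transferTime = 2 * t.payloadMB / linkMBps
    let remoteTime = t.estimatedSeconds / remoteSpeedup
    return transferTime + remoteTime < t.estimatedSeconds
}

print(shouldOffload(SubTask(estimatedSeconds: 30, payloadMB: 500)))    // true: big render chunk
print(shouldOffload(SubTask(estimatedSeconds: 0.05, payloadMB: 200)))  // false: the transfer dominates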

Graphics would be tough I feel, but mainly because I don't really know what happens in a GPU.

HDD would be doable, obviously through RAID-style striping. Though it's questionable how you handle it - when you plug in a computer as a "slave", does all its free disk space become part of a big RAID 0 drive?

RAM would be tough, it would really depend on latency. I suppose if you have a very smart machine, it could automatically calculate when it would be beneficial to use it. Perhaps use the local RAM, and if it maxes out and a pageout occurs, send it to the slave machine.

ODD... lol jks.

This is such a fascinating subject though. Just imagine being able to plug two computers together and use their full potential to the very max. It would be incredible. An MBA could become your sole machine, and when you want more power, plug it into your frankenMac stored inside your desk. Xeon processor, Kepler GPU, mad RAID set and 128 GB of RAM... but all you see is your MBA on your desk. Wow.
 

r00tb33r

macrumors newbie
Apr 11, 2012
22
0
It would take a very tall stack of Mac Minis to remotely approach the computational power of Nvidia's top-of-the-line Fermi and Kepler GPUs (GF100, GF110, GK104), as these are all in the teraFLOPS zone.
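
Rough theoretical-peak arithmetic (ballpark figures, not benchmarks):

Code:
let gk104TeraflopsSP = 3.0    // GTX 680-class Kepler, ~3 TFLOPS single precision (peak)
let miniTeraflopsSP = 0.15    // quad-core Ivy Bridge mini with AVX, ~0.15 TFLOPS (peak)

print(gk104TeraflopsSP / miniTeraflopsSP)   // ~20 Minis per high-end GPU, on paper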
 