
MysticCow

Every single mini is Intel Integrated or Intel Iris. Is it because it would carve into iMac sales? Or is there another reason?

Why can't we have a single mini with AMD or Nvidia graphics anymore?
 
In terms of power, you could probably use a Radeon Pro 450/455/460 (the GPUs used in the 15" MBP). They have a TDP of 35 watts, and the MBP looks to have an 85- or 86-watt power supply, while the 2012/2014 Minis have an 85-watt power supply. If you want an Nvidia GTX 10-series card, they start at 75 watts TDP, so you may have to go back to the external brick to use that one.
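To put the arithmetic in one place, here's a rough back-of-the-envelope sketch in Python; the 45 W CPU figure is my own assumption (roughly the TDP of the quad-core chips the 2012 Mini used), and none of these are official Apple numbers:

# Rough power-budget sketch; PSU and GPU TDPs are the figures quoted above,
# the CPU TDP is an assumption, not an Apple spec.
PSU_WATTS = 85   # 2012/2014 Mac mini internal power supply
CPU_TDP = 45     # assumed quad-core mobile CPU, like the 2012 Mini used

for gpu, gpu_tdp in [("Radeon Pro 450/455/460", 35), ("GTX 10-series entry card", 75)]:
    headroom = PSU_WATTS - CPU_TDP - gpu_tdp
    print(f"{gpu}: {headroom:+} W left for RAM/SSD/fan after CPU + GPU TDP")

The Radeon comes out with a few watts to spare, while a 75-watt GTX blows well past the 85-watt budget, which is why the external brick (or a bigger internal supply) would come back.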

I don't know about the space constraints - how difficult would it be to put a dGPU into the current Mini form factor?

The question is: how many people would pay for a GPU in a Mini? I think a lot of people would be satisfied if there were a higher-end Intel iGPU in the Mini and would not be willing to pay the price for a dGPU, so having the option wouldn't be worth it for Apple in terms of the profit generated vs. the additional logistical/engineering complexity. AMD doesn't sell the Radeon Pro 450 dGPUs to consumers, so it's not clear what the cost would be. In the MBP, going from a Radeon 450 with 2GB to a 460 with 4GB is $200 (obviously there's an Apple tax there).

I, for one, would not be willing to pay for a dGPU as an option or as part of the base model (say, if you don't get a dGPU as part of an i5 model but you do as part of an über-expensive i7 model). If a future i7 model only came with a dGPU and they priced it in a similar fashion to the MBP, I would just move my processor-intensive tasks (which, for the most part, I could) to a Linux quad-core mini-form box. Of course, there's a decent chance that there will never be a quad-core Mini (or any Mini?) again, so I may just end up doing that anyway.
 
I have one, albeit the 2011 model.

Yesterday I had to bake the logic board in an oven to reflow the solder on the AMD GPU. If I had my time over I would have gone Intel graphics only.

Same as Osty for me: the Minis I have that failed are those with the discrete GPU. Unless you can reseat or replace them easily, I would rather go with Intel too these days.
 
I don't know about the space constraints - how difficult would it be to put a dGPU into the current Mini form factor?

May I rephrase this question: How difficult would it be to create a Mini form factor capable of holding a standard PCIe graphics card?

Apple is in the unique position among modern computer manufacturers of having both (a) great control over their hardware manufacturing pipeline from motherboard designs all the way through to case construction, and (b) deep expertise in both engineering and styling of these devices. There is no reason Apple couldn't come up with a design that integrates a standard PCIe slot and still looks good and works well. (Heck, the Mac Pro could have housed such a slot if they had wanted.)

Apple has, in my opinion, wasted a great deal of their expertise in only concentrating on the form of their computers. In going for thinness and minimalism, they create designs that are nice to look at, but sacrifice more and more functionality with each revision. What they really need is for someone to be demanding an increase in functionality with each update; I think that's a key element that was lost when Jobs passed.

I think Apple has lost track of the point of using a computer. The folks up at the rarefied heights at the top of the company see these things as beautiful accessories you play with as you socialize with others; they seem to lack the concept that these things are ever used as tools to get work done.
 

Oh, so agree. I owned one iMac and never again will I buy an all-in-one. My 2006 (PowerPC) G5 tower has served me well, and I still use it on occasion for software I didn't want to buy again for my 2012 Mac Mini. I'm a photographer and need decent desktop computing power. They have crippled the Mac Mini since then, along with the MacBook Pro. I'm still toying with the idea of an older refurb MBP, but I can't justify the money for older hardware, even as a refurb.

As others have noted in other threads, I believe the Apple Pro market is dying and I believe it was that segment that helped spur the growth to begin with.
Steve is long gone and now all we have is a bean counter with an empty pipeline, but I guess as long as they roll in the $$$, the BOD will sit on their hands. At some point, I believe it's going to break.
 
I agree with everything except the end. Apple is not sitting on their hands. They just have a different vision of where computing is going and are pursuing it vigorously.

1) Apple sees iCloud as a replacement for local storage. Everything will be in the cloud and only the files in immediate use will be stored locally. This is how iPhones & iPads already operate. There is no need for local backups like Time Machine and related products like Time Capsule.

2) Apple wants to make computers that are as simple & non-technical as possible. That means built-in monitors and memory & storage that can't be changed later. Think laptops and iMacs (or iPhones & iPads) rather than Mac Mini & Mac Pro, neither of which has been changed for three years and possibly never will.

3) If someone wants to have an external device like a monitor or a local backup, there will be an option, though designed & made by a third party even if sold by Apple.

Here's a related set of comments, though I don't vouch for their accuracy:
https://hardware.slashdot.org/comments.pl?sid=10191963&cid=53786433
 
Dedicated graphics? Get real. I cannot stop laughing. Why would you want actual power? Isn't the thinness of your computer enough to make you happy forever? I cannot properly answer your question because I am too busy holding up a piece of paper to my computer and wondering "Why the hell can't my computer be as thin as this paper?". When will Apple fix this?

LOL, just kidding, this is my excitement for a Saturday night. Now going to reread Stephen King's "Thinner". Now THAT is a scary book.
 
Every single mini is Intel Integrated or Intel Iris. Is it because it would carve into iMac sales? Or is there another reason?

Why can't we have a single mini with AMD or Nvidia graphics anymore?

Even in this forum we see issues with the 2011 Mac Minis with the discrete AMD GPU as they age. That's before you get into the 15" MacBook Pros that came with discrete GPUs and the problems they have had there too.

The 13" Macbook Pros come with Iris Graphics versions of the Intel CPUs that are on offer. In some cases these came months after the original Skylake chip came out.

These are the two reasons why there won't be a discrete GPU in future Mac Minis, and with Iris Pro going away within a generation or two, we won't ever be seeing quad-core chips in the Mini.

This isn't necessarily a bad thing as Thunderbolt 3 is there for people who wish to attach a scarily expensive external GPU solution.

The third (unofficial) reason would be that the extra GPU - no matter how efficient - will create more heat, meaning a cooling solution has to be beefed up and the machine would have to be - gasp - thicker.

OK I am being facetious with the last point but some people really do appreciate the silence and a reasonably small form factor.

If Apple were looking to merge a few lines together, an enthusiast machine would have to be a headless 27" iMac, with a more standard CPU with HD 630 graphics but tied in with a proper GPU for compute, which would also be able to shift more pixels. It would be built in China, allowing the selling price to come down to something more reasonable for volume sales.
 
I agree with everything except the end. Apple is not sitting on their hands. They just have a different vision of where computing is going and are pursuing it vigorously.

I've gotta disagree on this one. It isn't that Apple has a different vision of where computing is going; it's that Apple has a different vision of what a computer user is.

Apple now sees the world as made up of people who pay money to consume content. The devices Apple produces are optimal for playing games, or watching movies, or browsing the internet. And that is all that Apple wants them to do; by restricting the ways in which their devices can be used (fewer ports, no expansion slots, soldered-down memory, drives locked in place with security screws), they make life "easier" for their users by ensuring that their users can never do anything complicated.

In short, your role as an Apple consumer is to sit down, shut up, and look at the pretty colors.

A decade ago, Apple was all about making complex devices easy-to-use for consumers. Today, Apple is all about making simple devices for consumers. There's a significant difference between the two approaches.
 
Every single mini is Intel Integrated or Intel Iris. Is it because it would carve into iMac sales? Or is there another reason?

Why can't we have a single mini with AMD or Nvidia graphics anymore?
I don't think there are any technological hurdles to overcome since Apple clearly has manufactured Mac minis with discrete graphics in the past and the basic design of the Mac mini chassis has remained unchanged for years.

The likely explanation is that their view of what Mac mini users need to do with their systems has changed over the years, along with their acceptance of improved integrated graphics performance as adequate for the marketplace.

Whenever Apple makes any decision to include or exclude a particular feature or component, they are juggling a decision of cost, performance, heat, energy, complexity, etc. Adding a discrete GPU to a device like the Mac mini has pros (like increased performance), but also has cons (more heat, more energy, more cost, another point of failure).
 
I've gotta disagree on this one. It isn't that Apple has a different vision of where computing is going; it's that Apple has a different vision of what a computer user is.

I think the difference between what you said and what I said is quite nuanced and effectively leads to the same conclusion: Apple is focused on making very easy-to-use, non-upgradeable computers that are great for consuming content. Some of them can be used for creating content like writing (some bloggers have switched to an iPad Pro instead of a laptop) or drawing. iMacs will do for many prosumer uses. For more demanding use, you can either (a) use cloud-based resources, (b) plug something in through Thunderbolt (e.g., an eGPU), or (c) move to Windows or Linux. The number of those whose needs go beyond prosumer has always been a tiny fraction. Ten to twenty years ago, they were on the Mac and the consumers were on Windows. Now things are reversing.
 
I don't think there are any technological hurdles to overcome since Apple clearly has manufactured Mac minis with discrete graphics in the past and the basic design of the Mac mini chassis has remained unchanged for years.

Yep, totally agree - I don't believe it's an engineering limitation, but a business decision based on their view of the marketplace for small, standalone, headless desktops (vs. laptops and AIOs).

The Mac Mini has always been a bit of a fringe machine for Apple, and even across the entire desktop market. Personally, I love the machines: you can match them up with displays of your choice (including TVs), they take up nearly no space, make terrific HTPCs, are easily transported, and at various points have been pretty stout little computers.

We still own a '12 i7 QC with a 512GB SSD + 1TB HDD and 16GB RAM. The GPU isn't super fast, but the quad core with an SSD makes it extremely effective for non-GPU-intensive tasks, including development (which was its previous duty), and now it's a heck of a mix of media server, general game-room computer, and semi-dedicated Minecraft machine for my little G :D Plus it was a refurb, so I've got AppleCare through 2018, which I picked up for <$50.
 
I agree with everything except the end. Apple is not sitting on their hands. They just have a different vision of where computing is going and are pursuing it vigorously.

1) Apple sees iCloud as a replacement for local storage. Everything will be in the cloud and only the files in immediate use will be stored locally. This is how iPhones & iPads already operate. There is no need for local backups like Time Machine and related products like Time Capsule.

2) Apple wants to make computers that are as simple & non-technical as possible. That means built-in monitors and memory & storage that can't be changed later. Think laptops and iMacs (or iPhones & iPads) rather than Mac Mini & Mac Pro, neither of which has been changed for three years and possibly never will.

3) If someone wants to have an external device like a monitor or a local backup, there will be an option, though designed & made by a third party even if sold by Apple.

Here's a related set of comments, though I don't vouch for their accuracy:
https://hardware.slashdot.org/comments.pl?sid=10191963&cid=53786433

I understand where you are coming from, but I just see things differently from my needs and use.

1. I avoid iCloud, except for some handoff items that make my contacts and notes seamless. I won't trust someone else with my data. Apple is using this as an income stream, i.e. services.

2. Yes, most consumers don't want to understand or have need to upgrade RAM, etc. They are willing to pay the Apple price to have it soldered in. Once again, I will never buy an all-in-one device. My screen went out on my iMac, rendering it useless. My point was that Apple has abandoned the Pro market and the ability to have a powerful desktop computer that doesn't need to be as thin as cardboard.

My point about Apple was that the BOD is sitting on their hands in regard to Tim Cook's supposed 'pipeline' and the promise that 'exciting' things are coming. We have been hearing this for several years now, and IMHO the innovation has stagnated.
 
Just to chip in some stuff:

1) Apple is horizontally structured, so their development team moves from project to project instead of spending time perfecting a single product line. This is great for having a development team focus on what's best for the company. Unfortunately, it usually causes people to view the customer base for each product as the same type with the same sort of needs. This can lead to convergence of products when the customer base is actually diverging on each product line (which is what I think is happening). Net result: you get laptops designed to fit the same sort of customer that's interested in an iPad Pro.

2) Because the base Intel graphics are "good enough" now (able to run 4K/60 Hz plus another monitor) and Iris Pro isn't fast enough to interest customers who want more, Intel is moving away from Iris Pro. Basically, the "in-between" graphics market really doesn't exist.

3) There's a rumor that Intel may cut a deal with AMD to include AMD graphics parts in a multi-chip module for mid-range gaming graphics.

4) ASRock just brought out a mini-STX board with an MXM graphics card slot. This means a near-Mini-sized SFF computer now has a replaceable graphics card option (albeit an expensive one).

That last one should be interesting for Apple, if they can get past the "not invented here" syndrome. An industry-standard, cheap, "thin" mini-STX board could be used for iMacs, and the same board in a non-thin form could be used for a Mini replacement. That would drop material costs dramatically.
 
I understand where you are coming from, but I just see things differently from my needs and use.

Actually, we seem to be in full agreement. What I wrote is not where I'm coming from, but where (I think) Apple is coming from, which is quite different from my needs. My 2011 iMac just failed and, looking at the current options, I can't find any new Mac that I'd like to buy. I'll probably buy a used one until I have more time on my hands to make a full transition to Linux...
 
2. Yes, most consumers don't want to understand or have need to upgrade RAM, etc. They are willing to pay the Apple price to have it soldered in.
My family and friends typically call me about their Mac getting slower, and I put more RAM in for them. Many are in need of an upgrade in the next year or two. When I tell them that the machines they used to buy from Apple now cost twice as much if they want them to last as long as their old ones, they are going to scoff. And I am going to tell them that it's not worth it to go Mac anymore.

That is the reality of many Macs in my circles. They might be the last Macs in several households - especially when I tell them about the horrible anti-consumer moves Apple has been making ;)
 
The mini used to have a CD drive and accessible RAM. It sucks that they are using a design that was meant for those kinds of things and don't take advantage of it. Not that I want a CD drive, but they could pack so many things in this design...
 
Can't take a single sale away from an iMac, MB or MBP
 
I've had bad luck with Apple devices and proper independent graphics.

I also think that integrated graphics are good enough now for most users. Back in 2006, when I had a laptop with Intel GMA 950 graphics, I couldn't even watch YouTube. Intel has made great strides with their graphics since then; Iris and Iris Pro chips are strong enough to compete with some of the entry-level discrete options in games, and you'd be hard pressed to find a device that can't handle pretty much every media-center activity you want it to.
 