I see the Studio+ASD as being more of a replacement for the iMac Pro.
Apple readily admitted they dropped the ball with the trash can Mac and so the iMac Pro was a stop gap solution.

The Studio is a great, more flexible option, and given how the M1 is architected, there really isn't room in the product line for an iMac Pro. A 27" iMac, to be sure, but there's nothing they could really add to make a compelling iMac Pro.
 
So, I've seen this mentioned in this thread multiple times, but it doesn't match my experience at all. While we do use Citrix for one of our corporate apps, we don't run thin clients; we run fully equipped Windows PCs, and they are used as Windows PCs where all local work is done on the machine itself.

We only put Xeons in servers, as you suggest; desktops are either i7s or i5s, unless they need more computing power, and then we go with an i9. A Xeon is a server/workstation chip, not a "PC" chip. There's really no need to put one in a desktop. It's not faster than the cheaper i9.
Why people defend Intel is beyond me, dude. Seriously, they need to adopt ARM and accept that Core is in an engineering dead end. There's nothing left to squeeze out of it.

Or do we have to defend the Motorola 68k while we're at it? Great processor, man! We need more people putting this thing in their machines. Maybe we should resurrect it and tinker with it for 30 years like Intel has done with Pentium and then later Core? Isn't it time to move on?

M1 blows Core i3 out of the water, makes i5 look bad, scares the **** out of i7 because he's next to be surpassed, and i9 is wondering whether anyone buys him.
 
Why people defend Intel is beyond me, dude. Seriously, they need to adopt ARM and accept that Core is in an engineering dead end. There's nothing left to squeeze out of it.
It may be beyond you to consider, but many of us disagree with that sentiment. BTW, the latest Intel and AMD CPUs are significantly faster than the M1. Don't get me wrong, the M1 is a fantastic product, but the M1's success doesn't mean x86 suddenly stopped working.
 
We can’t use the M1 iMac because it only supports 1 external display which causes issues when the machine is passed down to conference rooms and other setups that have 2 big wall mounted TVs.
Yeah, I've heard others make that same complaint. One commenter said his small business was considering switching to being a Mac shop by buying entry-level M1 Airs for everyone, but that won't work because they all use two displays at their desks.

Technologically the M1 should be able to drive more than two displays (which, for an Air or iMac, means more than one external), so I thought this limitation was just an issue specific to their first-gen design. But then the M2 came out with the same limitation, suggesting this is a deliberate product segmentation decision rather than an engineering limitation. If so, at least for the near future, you may always need to step up to the Pro chip (and thus, as you said, a 27" iMac, if they make one) if you want to drive 3 displays total.
 
Apple readily admitted they dropped the ball with the trash can Mac and so the iMac Pro was a stop gap solution.

The Studio is a great, more flexible option, and given how the M1 is architected, there really isn't room in the product line for an iMac Pro. A 27" iMac, to be sure, but there's nothing they could really add to make a compelling iMac Pro.
I think you misunderstood what I meant, which is simply that, in terms of capability and target audience, the Studio+ASD is more akin to the iMac Pro than the iMac.
 
I think you misunderstood what I meant, which is simply that, in terms of capability and target audience, the Studio+ASD is more akin to the iMac Pro than the iMac.
No, I get it. My point is that the iMac Pro was a stopgap measure by Apple, and the Studio is a better all-around option.
 
Sure. But Apple is buying the panels. The price for a 5K panel might have gone up so much that it isn't feasible to create a 27" 5K iMac for anything close to the previous prices. Apple wants the highest-quality displays, and almost no one else in the industry cares. You don't see any 200+ PPI large displays for desktop PCs except for the aforementioned LGs. So maybe Apple decided that a 24" 4.5K panel was cost-effective and a 5K panel wasn't. We don't have enough information.
Sure, but my point was that it's plausible Apple would be able to have these made on the same production line as the 24" panels, at **roughly** (27/24)^2 = 1.27x the cost, which should be cost effective for a 27", assuming that product is priced about that much more.

I.e., just as you wrote this:
It’s entirely possible that an inexpensive source of 5K panels dried up and creating a new 5K 27” iMac would be too expensive to sell in enough quantities to be profitable.
I'm saying it's entirely possible Apple could obtain 27" panels cost-effectively from the same production line on which it's arranged for production of its 24" panels.

And just as you can offer your speculation given current limited info., so can I.
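A quick back-of-envelope sketch of that area-scaling argument. The only assumption here (mine, not an established fact) is that panel cost scales with glass area, so for panels of the same aspect ratio and PPI it scales with the diagonal squared:

```python
# Back-of-envelope sketch of the cost-scaling point above.
# Assumption: panel cost scales with glass area, so for panels with
# the same aspect ratio it scales with diagonal squared. Yield and
# other production factors are ignored here.
def area_cost_ratio(diag_a: float, diag_b: float) -> float:
    """Cost ratio of two same-aspect-ratio panels, with cost ∝ area."""
    return (diag_a / diag_b) ** 2

ratio = area_cost_ratio(27, 24)
print(f'27" vs 24" cost ratio: {ratio:.2f}x')  # ~1.27x
```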
 
Yeah, I've heard others make that same complaint. One commenter said his small business was considering switching to being a Mac shop by buying entry-level M1 Airs for everyone, but that won't work because they all use two displays at their desks.
This is the same problem when Apple only wants to support scaled high-dpi displays. It's likely their opinion that you should be using a single high-dpi external display and not two 1080p (likely) displays. Everyone gets an Apple Studio Display and everyone is happy with a single external monitor.
 
Sure, but my point was that it's plausible Apple would be able to have these made on the same production line as the 24" panels, at **roughly** (27/24)^2 = 1.27x the cost, which should be cost effective for a 27", assuming that product is priced about that much more.

I.e., just as you wrote this:

I'm saying it's entirely possible Apple could produce 27" panels cost-effectively.

And just as you can offer your speculation given current limited info., so can I.
Yield probably makes a difference along with the panel size. So probably more than 1.27x the cost of the 24". Don't forget the LG 24" 4K was $700.00 and the 27" 5K was $1300.00. Almost double the price.
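The yield point can be made concrete with a standard Poisson defect model, Y = exp(-D·A): a larger panel is more likely to contain a defect, so the effective cost per good panel grows faster than raw area. The defect density below is an arbitrary illustrative value, not real production data:

```python
import math

# Illustrating the yield point with a simple Poisson yield model,
# Y = exp(-defect_density * area). The defect density is an assumed,
# arbitrary value -- real fab numbers are not public.
def good_panel_cost_ratio(diag_a, diag_b, defects_per_in2):
    area_factor = 144 / 337            # area = diag^2 * 144/337 for 16:9
    a_area = diag_a**2 * area_factor
    b_area = diag_b**2 * area_factor
    a_yield = math.exp(-defects_per_in2 * a_area)
    b_yield = math.exp(-defects_per_in2 * b_area)
    # Cost per *good* panel scales with area / yield.
    return (a_area / a_yield) / (b_area / b_yield)

print(f"{good_panel_cost_ratio(27, 24, 0.002):.2f}x")  # > 1.27x
```

With any nonzero defect density, the ratio comes out above the bare 1.27x area ratio, which is the direction of the argument above.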
 
This is the same problem when Apple only wants to support scaled high-dpi displays. It's likely their opinion that you should be using a single high-dpi external display and not two 1080p (likely) displays. Everyone gets an Apple Studio Display and everyone is happy with a single external monitor.
That may be, but that's not how many businesses work. The administrative staff at my university is all PC-based, and they all use (and need) 2-3 cheap large externals for their work (one displays one large spreadsheet, a second displays another large spreadsheet, and a third displays the payroll program into which they are entering data). I wouldn't be surprised if you see this in private industry as well.
 
Yield probably makes a difference along with the panel size. So probably more than 1.27x the cost of the 24". Don't forget the LG 24" 4K was $700.00 and the 27" 5K was $1300.00. Almost double the price.
Interesting point. I wonder what the defect rate is for LCD panel production.

It's possible the difference in retail pricing is not directly reflective of production costs. E.g., LG may have thought they'd get more business customers for the 27", and could thus sell it at a higher markup.
 
Interesting point. I wonder what the defect rate is for LCD panel production.

It's possible the difference in retail pricing is not directly reflective of production costs. E.g., LG may have thought they'd get more business customers for the 27", and could thus sell it at a higher markup.
If I remember correctly, there was a company in Europe selling the 5K panels that were rejected for the LG Ultrafine. Company name started with an I (Iyama?). There were many complaints.
 
One guy has one problem with one screen: "Support for 3rd party screens is terrible and broken"
One guy has no problem with three screens: "Support for 3rd party screens is flawless and perfect"

The only way to guarantee a flawless Apple Silicon monitor experience is to either do your research (which you presumably did, or were just lucky) or just buy an Apple monitor and never have to worry about unsupported EDIDs, unsupported color formats, problems with wake / sleep, flickering, and unsupported HDMI versions ever again.
 
I wonder if Apple are trying to avoid having to make custom components such as the TCON (timing controller) that went into the (5k only?) Intel iMacs.

The low-end Mx CPUs may not have enough connectivity to run a 27" 5K display and also provide the requisite number of ports, while the M1 Pro/Max costs too much and would have to be an iMac Pro on price? With the onboard memory driving the 27" display likely to exceed 4 GB, it's easy to see all SKUs of a 5K iMac coming with 16 GB RAM minimum, which raises the price even more.

Remember, an M2 Pro-spec CPU is expected to start with 16 GB RAM, while an M2 Max would start with 32 GB - this would raise the price of a 27" iMac very quickly into Intel iMac Pro territory.

Maybe this is what's stopping Apple from releasing what should be a 27" iMac with an M2 CPU, which has a 24 GB option?

Bear in mind that features like Continuity Camera seem ideal for people running a desktop Mac with (for example) a Dell screen who then want to participate in a video call.
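For scale on the memory point: the raw framebuffer for a 5K panel is tiny compared to 4 GB, so the figure above would have to be about overall graphics/unified-memory headroom (compositing, multiple buffers, app surfaces) rather than per-frame storage. The per-frame arithmetic, as a sketch:

```python
# Raw framebuffer size for a 5K (5120x2880) display at 8-bit RGBA.
# This is per-frame storage only; multiple buffers, compositing, and
# app surfaces are what actually consume multi-GB of unified memory.
width, height, bytes_per_pixel = 5120, 2880, 4
frame_bytes = width * height * bytes_per_pixel
print(f"{frame_bytes / 2**20:.2f} MiB per frame")  # 56.25 MiB
```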
 
Technologically the M1 should be able to drive more than two displays (which, for an Air or iMac, means more than one external), so I thought this limitation was just an issue specific to their first-gen design. But then the M2 came out with the same limitation, suggesting this is a deliberate product segmentation decision rather than an engineering limitation.

What a surprise.
 
That's a corporate computer, not a personal one. Those are often horrible trash, because the buyer (the manager) is disconnected from the user (the worker). Lenovo is just the abandoned business-machine division from IBM, after IBM lost the user market to Apple.
You really don't have a feel for the business market...
 
If I remember correctly, there was a company in Europe selling the 5K panels that were rejected for the LG Ultrafine. Company name started with an I (Iyama?). There were many complaints.
Iiyama [ https://en.wikipedia.org/wiki/Iiyama_(company) ] did produce a 5K panel they used in their own 5K monitor, the $900 Iiyama ProLite XB2779QQS, which was introduced around 2018, but it wasn't very good (https://www.knowyourmobile.com/news/cheapest-5k-monitor/). It was only 6-bit, which limited its color accuracy. And there may have been other issues.

But that doesn't speak to the question of what the defect rate is in the line that produces the panel for Apple's 24" iMac.
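The 6-bit limitation in numbers (basic arithmetic, not a claim about this particular panel's dithering behavior):

```python
# Addressable colors at 6 vs 8 bits per RGB channel (ignoring FRC/
# dithering, which 6-bit panels often use to approximate 8-bit output).
colors_6bit = (2**6) ** 3   # 262,144
colors_8bit = (2**8) ** 3   # 16,777,216
print(colors_6bit, colors_8bit)
```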
 
Why people defend Intel is beyond me, dude. Seriously, they need to adopt ARM and accept that Core is in an engineering dead end. There's nothing left to squeeze out of it.
I'm not defending Intel, but I definitely can't understand people who think they're dead! You're just not matching reality there. x64 is all I see on desktops and laptops these days.

Or do we have to defend the Motorola 68k while we're at it? Great processor, man!
It was a great processor, more advanced and easier to program than the Intels of the time. But the 68K's time is decades past. It kind of morphed into the PowerPC, and that went to the bigger-machine market, not PCs. Our main server is a Power9 machine, actually. :)

M1 blows Core i3 out of the water, makes i5 look bad, scares the **** out of i7 because he's next to be surpassed, and i9 is wondering whether anyone buys him.
We don't buy i3s, and you'd be surprised by the latest i5s. Next to be surpassed? LOL, ain't going to happen anytime soon.
 
That may be, but that's not how many businesses work. The administrative staff at my university is all PC-based, and they all use (and need) 2-3 cheap large externals for their work (one displays one large spreadsheet, a second displays another large spreadsheet, and a third displays the payroll program into which they are entering data). I wouldn't be surprised if you see this in private industry as well.
It's definitely not unusual in private industry.
 
Indeed, once the reality distortion field (RDF) dissipated, it was clear that many GPUs and x86 processors exceed what the M1 Pro/Max/Ultra is capable of. What the M1 series is, is a powerful processor that is incredibly power-efficient. It's also one of the best laptop processors on the market.

The 30- and 40-series Nvidia GPUs are phenomenal, and Intel has regained the crown for best desktop processor - though with AMD's 3D cache, AMD will probably take it back once those are released.
The M1 (Mx, now that the M2 is out?) is a great laptop chip. On the desktop, where power consumption isn't an issue, its all-in-one-ness and lack of I/O ports are a problem. For those of us who are not into gaming or video processing, the built-in graphics on a Ryzen XXXXG is good enough by a large margin.
 
No, I get it. My point is that the iMac Pro was a stopgap measure by Apple, and the Studio is a better all-around option.
I agree completely with you there. That's not the point on which we differ. Rather, it's this:
I definitely can see the Studio + Studio display being the combo that in effect plugs the product hole that the 27" iMac left.
To my mind, the Studio + ASD effectively replaces the iMac Pro, not the iMac.

Yes, some put 128 GB RAM in the 27" iMac, got the highest-end CPU, and used it for prosumer applications (as I myself did). But I suspect I'm in the minority, and most who bought a 27" iMac got lower-end configurations and used it for routine computing. The Studio Max/Ultra are not for routine computing, and thus not a replacement (for most consumers) for the 27" iMac.

Let me make this more concrete. Here's a typical consumer config for a 2020 27" iMac (upgraded from the base config with 512 GB SSD and 16 GB RAM). Even using Apple's RAM upgrade prices, the total cost is only $2100:
Core i5 (3.1 GHz six-core with 4.5 GHz Turbo), 16 GB RAM, 512 GB SSD, Radeon Pro 5300 GPU (4GB GDDR6).

Given this, I don't see how one can view the ASD + Studio as a replacement for the iMac for a typical consumer—$2100 is not even close to the minimum of $3600 you'd need to spend on an ASD + Studio. The ASD + Studio is a good replacement for the iMac Pro (a high-end AIO workstation), not the iMac.

What if you instead paired a Mini with an ASD? A Mini with 16 GB/512 GB is $1100, so an ASD+Mini would be $2700, still quite a bit more than $2100.

Plus a Mini is not going to give you the video connectivity of a 27" iMac. E.g., the old iMac could drive three displays (including its internal), while the Mini can drive only two. I think to replace the old iMac you'd need a hypothetical Pro Mini. Let's estimate a Pro Mini at $1400 ($300 more than the Mini). Add an ASD, and you're at $3000.

The real problem here is not the pricing of the Mini or Studio—it's that of the ASD. It's simply not consumer-priced, and thus any config that includes it is likewise not going to be consumer-priced.
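Tallying the configurations discussed above. The $2100 iMac figure and the hypothetical "Pro Mini" price are from the post; the Studio ($1999) and ASD ($1599) base prices are my assumed US launch prices:

```python
# Totals for the configurations compared above. The Mac Studio ($1999)
# and Studio Display ($1599) base prices are assumptions; the
# "Pro Mini" is hypothetical, per the post.
ASD = 1599
totals = {
    '2020 27" iMac (i5/16GB/512GB)': 2100,
    "Mac Studio + ASD": 1999 + ASD,               # ~$3600
    "Mac mini (16GB/512GB) + ASD": 1100 + ASD,    # ~$2700
    'hypothetical "Pro Mini" + ASD': 1400 + ASD,  # ~$3000
}
for config, price in totals.items():
    print(f"{config}: ${price}")
```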
 
I agree with all of that. The trouble with the (Mac) laptop game, at least where I work, is that it doesn't run Windows apps well, so it's not even considered. That's not true of all corporations, of course, but for smaller, low-margin industries with low IT development budgets, compatibility is a MAJOR influence. The biggest. That's where I work...

If you are willing to run Parallels and only run productivity apps, compatibility should be pretty good, unless they have some detection to prevent you from running that software on a VM, or if they rely on some obscure kernel function that is not well-emulated by a VM.

However, if you run games and you need something coded by EA, your luck may not be so great.

There is an interesting solution if you need productivity software, by the way: buying a compute stick. You could set it up on a separate monitor to run Moonlight or another streaming software, and then connect it to your Mac. This gives the illusion it's all a single system, with the added benefit that you'll be able to run anything incompatible with macOS (Moonlight / Parsec / SteamPlay are like Microsoft's Remote Desktop, but better).

Since those computers are so small, it's really viable to carry them around, and you would essentially sacrifice (almost) nothing.

The downside is this setup could be a bit cumbersome.
 
If you are willing to run Parallels and only run productivity apps, compatibility should be pretty good, unless they have some detection to prevent you from running that software on a VM, or if they rely on some obscure kernel function that is not well-emulated by a VM.
For work, running an unsupported OS and spending more money to do it is not doable. I have tried that combo, and I'd downgrade it from "pretty good" to just "good": most things run, but not all, and performance is passable.

However, if you run games and you need something coded by EA, your luck may not be so great.
Gaming isn't for work, and I don't game on anything but a console, so it really doesn't matter.

There is an interesting solution if you need productivity software, by the way: buying a compute stick. You could set it up on a separate monitor to run Moonlight or another streaming software, and then connect it to your Mac. This gives the illusion it's all a single system, with the added benefit that you'll be able to run anything incompatible with macOS (Moonlight / Parsec / SteamPlay are like Microsoft's Remote Desktop, but better).
I already do that somewhat when I'm on my Mac at home, though I have an i9 desktop that runs Windows that I connect to.
Since those computers are so small, it's really viable to carry them around, and you would essentially sacrifice (almost) nothing.
And SLOWWW. I'll stick with a Windows laptop when traveling.
 