
ronrather · macrumors newbie · Original poster · Oct 18, 2024
It just seems totally INSANE that there's no way to use an iMac 27" 2015 as a display without hardware hacks. In 2025. I don't need high FPS or anything. I mean, the iMac and the MacBook are on the same network, surely there is some solution which works over the local network.
 
For #2, there is the hardware hack that turns it into a 5K screen.

For #1 (not wanting the hack), there are Luna Display and Duet Display, which can use the screen over the network, but responsiveness is reportedly slow.

As to not believing it, Apple would rather you pay the full iMac price again for an Apple Studio Display and also buy a new Mac. Fresh profitability > Customer Utility. 💰💰💰
 
“As to not believing it, Apple would rather you pay the full iMac price again for an Apple Studio Display and also buy a new Mac. Fresh profitability > Customer Utility. 💰💰💰”

Or... it's a technical limitation (which it is). This entire rhetoric of "everything is done to charge more," without any proof, is getting old.
 
You are oh so right. Apple just can't do it... even though they did it before. And pretty much all smart TVs can handle both an OS & apps running from within and fresh brains such as an Apple TV attached to a port that then takes over the screen from the outside. But perhaps the brains who figured TDM out well over a decade ago retired or resigned, and nobody else knows how to do it (or anything comparable)... and can't reverse engineer how TVs do it. It's just got to be technically impossible to make the monitor portion of an all-in-one work as a monitor once the tech guts are arbitrarily "vintaged," even though a good monitor + speakers + camera tends to be good for 2X-3X longer than the Mac guts are allowed to stay macOS-current. Why would money have anything to do with that? It's so much better to just toss all of it and buy anew... not for self-serving profit motives... but just better, because anything else is too difficult for Apple to figure out.

OP & the #2 poster are wrong for wanting such a thing: simple utility, to keep using a perfectly good monitor (+ speakers + camera) with their NEXT Mac purchase(s) vs. throwing it all out because one part has aged out (not by failing, but by the Corp deciding to cut off upgrades). Why want more for our money as consumers? What matters is the Corp. :rolleyes:

IMO, what is "old" is fellow consumers prioritizing the seller over the buyers. The seller is doing just fine (king of the capitalism mountain, richest company in the world, etc.). Buyers getting more, or just longing for more for money already spent, is generally a great thing.
 
@HobeSoundDarryl
This is old noise. Rather than write it all again here I'll just quote the last time I explained the technological innovation timeline that prevented the continuation of TDM in 2014 and later...

And yes, it could have been done from about the introduction of Intel's Thunderbolt 3 technology in 2016-2017 onwards, at a cost of roughly $100+ in TB3 chipsets etc. and about a quarter more logic board PCB area, which means a redesigned case, AFAICS.
Not surprisingly, Apple deferred on that.
 
I wasn't saying TDM and only TDM... I was just using TDM to illustrate to the other guy that Apple could make an iMac monitor function as a standalone monitor. Instead of preserving TDM, Apple could have done many things to preserve the same capability. As is, the ASD, which is basically the former iMac 27" monitor minus the Mac guts, has a video input to use it as a standalone monitor. Conceptually, an iMac Past could have had a short cable out that turns back to the very same kind of input. When iMac tech ages out, disconnect that loop and plug anything into the same port to keep using the monitor section. Apple would much more likely have done this INTERNALLY (and thus invisibly), just like how smart TVs work (including 8K TVs) with an internal computer to show & run their own apps while also accommodating an external computer (like an Apple TV, game consoles, or cable/satellite boxes), but the point is the same: build in one input port that "takes over" when something is connected to it, and then iMac buyers have longer-term monitor utility.

Or make the tech guts replaceable so that an old iMac with many parts still perfectly fine can get a tech brain transplant to be a "new" iMac again.

Instead, the engineering "locks down" how iMac monitors can be used, making them useless in this way once the tech guts age out... creating OP's gripe (shared by countless others). The many hacks that add a little cheap tech to convert iMac screens into 5K monitors show that it doesn't cost much to do. It would be nice for the many iMac owners like OP to be able to squeeze more years out of a great monitor with any kind of switch, port, etc. setup, instead of these being built as "throw the baby out with the bathwater" landfill filler. However, opting to build them so they will be tossed before their usefulness is fully realized is the more profitable decision if many of the same people will turn around and spend full iMac money on an ASD.

I have a mostly retired iMac 27" myself, much older than OP's. The screen is still perfectly fine, illustrating how long the monitor portion can serve buyers. The speakers sound the same as day 1. The camera works the same. But the tech guts are "obsolete," so it sits around doing nothing almost all of the time. I did what the Seller expects: purchased a monitor when I embraced separates. However, I chose a third-party monitor in part to avoid any additional lock-down engineering built into the ASD. Among other things, its own tech guts, based upon iDevice chips, are "on the clock" too, so it's only a matter of time until those are "vintaged" and then "obsolete" while the screen itself will likely still be just fine.
 
“Conceptually, an iMac Past could have had a short cable out that turns back to the very same kind of input. When iMac tech ages out, disconnect that loop and plug anything into the same port to keep using the monitor.”

(For the cable from the logic board to the screen:) That would be an 8-lane eDP cable or, before the display controller/LVDS scaler chip, a timing-sensitive DP cable, and the only protocol capable of delivering that in a single cable is a TB3 cable, which needs a retiming chip and a TB host/device controller chip + PCB logic components at each end.
Two TB docks' worth of space and expense, not to mention the extra design changes to the TB/GPU of any computer using the old iMac as a monitor.

The Apple Studio Display couldn't exist before 2020, and by then the iMac was about to be EOL.
Ranting about it doesn't make it possible… ;-)
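To put rough numbers behind the single-cable point above, here is a back-of-the-envelope sketch in Python (my own arithmetic, assuming standard HBR2 lane rates and 8b/10b encoding, not figures from this thread): the iMac's 8-lane eDP panel link carries more payload than any single DP 1.2 or TB2 cable, and only TB3's 40 Gbps has headroom above it.

```python
# Back-of-the-envelope check (assumptions: HBR2 lane rate of 5.4 Gbps,
# 8b/10b encoding, published nominal link rates; illustrative only).

HBR2_LANE_GBPS = 5.4          # raw line rate per DP/eDP lane at HBR2
ENCODING_EFFICIENCY = 8 / 10  # 8b/10b encoding overhead for DP 1.2 / eDP

# The iMac 5K's internal panel link: 8 eDP lanes.
edp_payload = 8 * HBR2_LANE_GBPS * ENCODING_EFFICIENCY   # ~34.56 Gbps

# Single-cable external options of the era (nominal rates).
dp12_payload = 4 * HBR2_LANE_GBPS * ENCODING_EFFICIENCY  # ~17.28 Gbps
tb2_gbps = 20                                            # one TB2 cable
tb3_gbps = 40                                            # one TB3 cable

print(f"8-lane eDP panel link payload: {edp_payload:.2f} Gbps")
print(f"Single DP 1.2 cable payload:   {dp12_payload:.2f} Gbps")
print(f"Single TB2 cable:              {tb2_gbps} Gbps")
print(f"Single TB3 cable:              {tb3_gbps} Gbps")
# Only TB3, among single cables, sits above the ~34.6 Gbps panel link.
```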
 
“You are oh so right. Apple just can't do it... even though they did it before. Perhaps the brains who figured that out well over a decade ago retired or resigned, and nobody else knows how to do it (or anything comparable). It's just got to be technically impossible to make the monitor portion of an all-in-one work as a monitor when the tech guts are arbitrarily "vintaged," even though a good monitor + speakers + camera tends to be good for 2X-3X longer than the Mac guts are allowed to stay macOS-current.”

You're talking about a completely different display. It's been well known since they pulled this feature that the bandwidth of Thunderbolt at the time wasn't able to support 5K, and the previous Target Display Mode used tricks to get it to work.

But go on, be angry over nothing. Most people never even used this mode to begin with, which is why it's silly to assume that Apple got rid of it to "make someone buy more."
 
“It just seems totally INSANE that there's no way to use an iMac 27" 2015 as a display without hardware hacks. In 2025. I don't need high FPS or anything. I mean, the iMac and the MacBook are on the same network, surely there is some solution which works over the local network.”
Be aware that displays degrade with use over time. A 10-year-old display is not like a new display, even if it still works.
 
“That would be an 8-lane eDP or timing-sensitive DP cable, and the only protocol capable of delivering that in a single cable is a TB3 cable, which needs a retiming chip and a TB host/device controller chip + PCB logic components at each end.”
I think it's just two DisplayPort connections. You can see them with AGDCDiagnose or AllRez; the left half and the right half of the 5K display have different EDIDs.
So the hardware hack would be to cut the two DisplayPort connections between the GPU and the display controller and add a couple of DisplayPort inputs.
It would be like the Dell UP2715K.
They can work from a $30 Thunderbolt to Dual DisplayPort adapter.
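A quick sanity check of the two-halves design (my own numbers, assuming 24-bit color and ignoring blanking overhead): each 2560×2880 tile at 60 Hz needs roughly 10.6 Gbps, comfortably inside one DP 1.2 link's ~17.28 Gbps payload, while the full 5K image does not fit in one.

```python
# Rough per-half bandwidth estimate for a 5K panel driven as two tiles
# (assumes 24 bits per pixel and ignores blanking; illustrative only).

def active_pixel_gbps(width: int, height: int, hz: int, bpp: int = 24) -> float:
    """Raw active-pixel data rate in Gbps."""
    return width * height * hz * bpp / 1e9

half = active_pixel_gbps(2560, 2880, 60)   # one half of 5120x2880
full = active_pixel_gbps(5120, 2880, 60)   # the whole panel

DP12_PAYLOAD_GBPS = 17.28  # 4 lanes x 5.4 Gbps x 8b/10b efficiency

print(f"Per half: {half:.2f} Gbps (fits one DP 1.2 link: {half < DP12_PAYLOAD_GBPS})")
print(f"Full 5K:  {full:.2f} Gbps (fits one DP 1.2 link: {full < DP12_PAYLOAD_GBPS})")
```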

That's a 1440p display, not 5K.
 
“Conceptually, an iMac Past could have had a short cable out that turns back to the very same kind of input. When iMac tech ages out, disconnect that loop and plug anything into the same port to keep using the monitor.”

“That would be an 8-lane eDP or timing-sensitive DP cable, and the only protocol capable of delivering that in a single cable is a TB3 cable, which needs a retiming chip and a TB host/device controller chip + PCB logic components at each end.
Two TB docks' worth of space and expense, not to mention the extra design changes to the TB/GPU of any computer using the old iMac as a monitor.

The Apple Studio Display couldn't exist before 2020, and by then the iMac was about to be EOL.
Ranting about it doesn't make it possible… ;-)”

Conceptually, it's simple and cheap, as the design has already been available in smart TVs. The T-con board attached to the LCD panel has two inputs, switchable between the TV tuner (plus external input ports) and the Android board. Over time, the Android part becomes obsolete and sluggish, but the TV tuner (and external input ports) still work. You just need to push the buttons on the TV remote control.
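As a toy model of the switching behavior described above (purely illustrative Python; real T-con firmware is nothing like this), the panel shows whichever external source is attached and falls back to the internal board otherwise:

```python
# Toy model of a smart-TV style input mux (illustrative only, not real firmware):
# an external input takes over the panel when something is connected,
# otherwise the internal (aging) computer board keeps driving it.
from typing import Optional

class TConMux:
    def __init__(self) -> None:
        self.internal_source = "internal Android/TV board"
        self.external_source: Optional[str] = None  # e.g. "Apple TV on HDMI 1"

    def active_source(self) -> str:
        # Whatever is plugged in wins; fall back to the built-in brains.
        return self.external_source or self.internal_source

mux = TConMux()
print(mux.active_source())                  # -> internal Android/TV board
mux.external_source = "Apple TV on HDMI 1"  # user presses Input on the remote
print(mux.active_source())                  # -> Apple TV on HDMI 1
```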
 
“It just seems totally INSANE that there's no way to use an iMac 27" 2015 as a display without hardware hacks. In 2025. I don't need high FPS or anything. I mean, the iMac and the MacBook are on the same network, surely there is some solution which works over the local network.”

Nowadays, a software solution often means one thing: remote control.
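For what it's worth, macOS's built-in Screen Sharing can be launched over the LAN; a minimal sketch, assuming Screen Sharing is enabled on the iMac and that "imac.local" is its Bonjour hostname (a hypothetical name). Note this is remote control of the iMac, not a true second display for the MacBook:

```python
# Minimal sketch: open macOS Screen Sharing to view another Mac on the LAN.
# Assumes Screen Sharing is enabled on the iMac (System Settings > Sharing)
# and that "imac.local" is its Bonjour hostname (a hypothetical name).
import subprocess

subprocess.run(["open", "vnc://imac.local"], check=True)
```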

 
@HobeSoundDarryl
This is old noise. Rather than write it all again here I'll just quote the last time I explained the technological innovation timeline that prevented the continuation of TDM in 2014 and later...

And yes, it could have been done from about the introduction of Intel's Thunderbolt 3 technology in 2016-2017 onwards, at a cost of roughly $100+ in TB3 chipsets etc. and about a quarter more logic board PCB area, which means a redesigned case, AFAICS.
Not surprisingly, Apple deferred on that.

I checked your linked post, where you wrote:
To have done it in 2014 was impossible, and to do it after about 2015/16 would have required more resources in the source and target computers to have between them about twice the Thunderbolt 3 control circuitry in an iMac Pro, and two TB3 cables connecting them to get the necessary bandwidth....
Asserting something is "impossible" is a very strong claim, so I got curious and looked into this.

The Retina iMac was introduced in 2014 and featured Thunderbolt 2 (max data rate 20 Gbps). By comparison, the 27" 5K Dell UP2715K, also introduced in 2014, was driven with dual DP1.2 cables, and DP1.2 has a max data rate of 17.28 Gbps. [Sometimes you see a specification of 21.6 Gbps, but that includes the 8b/10b encoding overhead.]

So clearly it was possible for a 5K 60 Hz display to be driven by the tech available when the Retina iMac was developed. Indeed, DP1.2 became available in 2010.

Further, unless there's some wrinkle in TB2 that limits its video-specific bandwidth, it additionally could have been driven by the specific TB tech that Apple used at the time, which was TB2 (two cables x 20 Gbps = 40 Gbps). TB2 was introduced by Apple a year earlier, in 2013.

Whether Apple should have implemented this tech in the 5k Retina iMac is a separate question.
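To make the arithmetic explicit (my own back-of-the-envelope figures, assuming 24 bits per pixel and ignoring blanking overhead): 5K60 needs roughly 21 Gbps of raw pixel data, which overflows a single DP 1.2 link or TB2 cable but fits easily in two of either.

```python
# Back-of-the-envelope: does 5K60 fit in the links discussed above?
# (Assumes 24 bits per pixel, ignores blanking/audio overhead; illustrative only.)

pixel_gbps = 5120 * 2880 * 60 * 24 / 1e9   # ~21.23 Gbps of raw pixel data

DP12 = 17.28   # one DP 1.2 link payload (4 x 5.4 Gbps x 8b/10b)
TB2 = 20.0     # one Thunderbolt 2 cable

print(f"5K60 raw pixel data: {pixel_gbps:.2f} Gbps")
print(f"1 x DP 1.2 ({DP12} Gbps) sufficient: {pixel_gbps <= DP12}")      # False
print(f"2 x DP 1.2 ({2*DP12} Gbps) sufficient: {pixel_gbps <= 2*DP12}")  # True
print(f"1 x TB2 ({TB2:.0f} Gbps) sufficient: {pixel_gbps <= TB2}")       # False
print(f"2 x TB2 ({2*TB2:.0f} Gbps) sufficient: {pixel_gbps <= 2*TB2}")   # True
```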
 
Yes! It also seems INSANE to me that I can’t use my 1997 17” Sony Trinitron CRT as a second display without hardware hacks!

You knew when you bought it. The same applies here; it's not like Sony or Apple promised anything at the time of purchase. I personally don't see value in an iMac or any AIO.
 
The Retina iMac was introduced in 2014 and featured Thunderbolt 2 (max data rate 20 Gbps). By comparison, the 27" 5K Dell UP2715K, also introduced in 2014, was driven with dual DP1.2 cables, and DP1.2 has a max data rate of 17.28 Gbps. [Sometimes you see a specification of 21.6 Gbps, but that includes the 8b/10b encoding overhead.]

So clearly it was possible for a 5K 60 Hz display to be driven by the tech available when the Retina iMac was developed. Indeed, DP1.2 became available in 2010.

Further, unless there's some wrinkle in TB2 that limits its video-specific bandwidth, it additionally could have been driven by the specific TB tech that Apple used at the time (TB2 was introduced by Apple a year earlier, in 2013).
Two DP 1.2 connections are required for 5K60. That is greater than the 20 Gbps max of Thunderbolt 2.
Thunderbolt 3 would have been required to implement Thunderbolt Target Display Mode.
Alternatively, Thunderbolt Target Display Mode could maybe have worked using two Thunderbolt 2 connections (but not with the current hardware or software setup).
 
“Two DP 1.2 connections are required for 5K60. That is greater than the 20 Gbps max of Thunderbolt 2.
Thunderbolt 3 would have been required to implement Thunderbolt Target Display Mode.
Alternatively, Thunderbolt Target Display Mode could maybe have worked using two Thunderbolt 2 connections (but not with the current hardware or software setup).”
You misunderstand. I meant that since 2 x DP1.2 offers a sufficient data rate for 5K60, then 2 x TB2 would as well, since its aggregate data rate is even higher. I thought that was obvious, since it should be understood I'm not claiming 1 x 20 >= 2 x 17.28!! Give me some credit! But I've edited my post to make that explicit.
 
I guess I just accepted when I bought my previous 27" iMac that the all-in-one format meant that when the hardware needed upgrading, it would all be redundant.
After two iMacs, which I really liked, I decided to upgrade to a Mac Studio and Studio Display.

Would I have liked to keep the iMac as a display? Possibly.
Did it worry me that I had to replace the whole unit? Not really; I accepted that was the limitation of the system I had.

To be fair, I got a good few years' usage out of both my iMacs, so I did not really begrudge replacing them.
Yes, it was a more expensive upgrade path, and I could have gone cheaper. But I for one am very pleased with both, and I do have a Samsung M80C (27") I use as a monitor for my work PC that occasionally gets used as a second display.
 
“Yes! It also seems INSANE to me that I can't use my 1997 17" Sony Trinitron CRT as a second display without hardware hacks!”
Yeah, that's exactly the same, huh.

Apple sells the 5K Studio Display, which is nearly identical, for C$1,800. It would sure be nice if they had made it so the 5K iMac displays weren't e-waste just because they couldn't be bothered to let you use them as monitors.
 