Yeah, I would also suspect something new. AMD helped them do it for the 5K iMac... however, that required another display controller... which maybe could also exist in a new Cinema Display?

Anyways, yeah, let's all start the 5120 x 2160 wide-screen 5K rumor.

----------



While Skylake would bring 5K to all of Apple's offerings with integrated graphics, it's possible to imagine that 5K support could come earlier through AMD's discrete GPU offerings in the iMac, 15-inch MacBook Pro, and Mac Pro.

Does anyone know historically what Apple has done with this? Back when the LED Cinema Display came out it just supported DisplayPort... how many of Apple's offerings at the time were still on DVI output only?

History!

When the LED Cinema Display was released in October 2008, only the 13" one-off aluminum MacBook (which turned into the 13" MBP) and the late 2008 15" MBP had mDP.

It's safe to say that if we get a 5K monitor from Apple, it's entirely possible that only one or two products currently on sale would be compatible initially.
 
It may be the GPU limit, not the connection.

True... it just seems really strange to support a resolution for a monitor that doesn't exist rather than the full DP 1.3 spec... unless of course Apple is planning on releasing a screen at 5120 x 2160!

It's got the same 2GB of GPU memory as the M290 in the 5K iMac, so I suspect that this chip could also be capable of full 5K... my impression is that the connection was the big limiting factor for GPU 5K support over the past year, rather than horsepower.
 
The chip is new, so there's not much info about it. It is 28nm, which is the same as the 650M and 750M. I'm not expecting a large improvement, but who knows.

----------

Interesting that Apple quotes an extra hour of battery life though.

And before Radeongate there was Nvidiagate.

I think they upped the battery size. It says 99.5 Wh, and I think the previous one was 95 Wh?
 
History!

When the LED Cinema Display was released in October 2008, only the 13" one-off aluminum MacBook (which turned into the 13" MBP) and the late 2008 15" MBP had mDP.

It's safe to say that if we get a 5K monitor from Apple, it's entirely possible that only one or two products currently on sale would be compatible initially.

Interesting! That still covers a large amount of their laptop lineup. Just the 15-inch rMBP w/ discrete graphics is a smaller portion of their market.

In that case, maybe we'll see the MacBook updated with a Skylake Core M and then, bam, a USB-C all-in-one dock solution to a 5K display compatible with the newest MacBooks, and, with a USB-C to Thunderbolt adapter, this 15-inch R9 M370X machine...
 
These new machines are still using the same Thunderbolt 2 controllers as previous generations, so it's just normal DisplayPort 1.2.

The DP 1.2 spec supports up to 17.28 Gbit/s using a 4-lane main link and HBR2. 5120 x 2160, 24 bpp, 60 Hz, SST only uses 16.8943104 Gbit/s. It's not really 5K, just UHD with an ultra-wide "21:9" aspect ratio. It happens to use a hair more bandwidth than 4096 x 2560, 24 bpp, 60 Hz, SST at 16.13670912 Gbit/s, and at over 11 megapixels is probably the highest resolution one can drive with a single DP 1.2 link that has actually been produced.
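For anyone who wants to check those numbers, here's a minimal sketch of the arithmetic in Python. The horizontal/vertical timing totals (5280 x 2222 and 4256 x 2633) are my own assumptions, back-calculated from the figures quoted above in reduced-blanking style; they aren't from any official spec sheet.

```python
# Minimal sketch of the DP 1.2 bandwidth math from the post above.
# The timing totals below are assumptions back-calculated from the quoted
# figures (reduced-blanking style), not official numbers.

HBR2_LANE_GBITPS = 5.4          # per-lane raw rate for HBR2
LANES = 4
DP12_PAYLOAD_GBITPS = HBR2_LANE_GBITPS * LANES * 8 / 10   # 8b/10b -> 17.28 Gbit/s usable

def sst_rate_gbitps(h_total, v_total, refresh_hz, bpp=24):
    """Bandwidth needed for one single-stream-transport mode, blanking included."""
    return h_total * v_total * refresh_hz * bpp / 1e9

# 5120 x 2160 @ 60 Hz, 24 bpp (assumed totals 5280 x 2222)
print(sst_rate_gbitps(5280, 2222, 60))   # ~16.894 Gbit/s -- squeaks in under 17.28
# 4096 x 2560 @ 60 Hz, 24 bpp (assumed totals 4256 x 2633)
print(sst_rate_gbitps(4256, 2633, 60))   # ~16.137 Gbit/s
```

Both land just under the 17.28 Gbit/s payload of a four-lane HBR2 link, which is why a single DP 1.2 connection can (just barely) carry the ultra-wide mode.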
 
These new machines are still using the same Thunderbolt 2 controllers as previous generations, so it's just normal DisplayPort 1.2.

The DP 1.2 spec supports up to 17.28 Gbit/s using a 4-lane main link and HBR2. 5120 x 2160, 24 bpp, 60 Hz, SST only uses 16.8943104 Gbit/s. It's not really 5K, just UHD with an ultra-wide "21:9" aspect ratio. It happens to use a hair more bandwidth than 4096 x 2560, 24 bpp, 60 Hz, SST at 16.13670912 Gbit/s, and at over 11 megapixels is probably the highest resolution one can drive with a single DP 1.2 link that has actually been produced.

Interesting, great analysis. A few questions...

- Why wouldn't 5120 x 2160 be listed as the max on any other spec page at Apple or anywhere else over SST? I haven't seen this listed for any Nvidia or AMD offerings, mobile or desktop.

- Isn't the DP signal just kind of passed through for Thunderbolt? I've read a lot of reports saying DP 1.3 requires Thunderbolt 3, but ASUS has some internal Thunderbolt cards for their motherboards that accept a DP signal from another source and just pass it through exactly as is. I'm inclined to think that Thunderbolt 3 isn't necessary for DP 1.3...

- Is there actually a 5120 x 2160 display on the market?
 
How can you even compare an OS X device to those losers?

Every Windows device halts in 2 years, while Mac OS lives on; even the comparison is proof enough that you know nothing.

If you want gaming performance, buy the cheapest Windows device that fits your requirements, play games on it, re-format when it halts, or just dump it like everyone does, and let mother nature take the hit.

Where do you find inspiration for such nonsense? I have a Dell XPS M1330 from 2007 that is still running like a champ. It's actually running better and faster on Windows 8.1 than it ever ran on Vista or Win 7. You need a serious reality check.
 
  • Like
Reactions: Vanilla35
From Apple's website

[attached screenshot from Apple's website]
 
What's your personal opinion on the jump to AMD? I feel like they are less reliable (personal opinion, of course) and the lack of CUDA cores is a downgrade.

At this point I'm more willing to rely on AMD than Nvidia after my whole fiasco with the 650M and 750M I had in my last two 15" rMBPs. Sure, I did have problems with Radeongate, but the GeForce chips seem more unreliable than AMD's. AMD hasn't given me any other problems in either of my Mac Pros, but then again those are desktop chips, not mobile chips.

(Yes, I went through 3 MacBook Pros in 3 years; all were replaced by Apple due to multiple logic board repairs. Models: 2011 CTO 15", 2012 base r15", late '13 base 750M r15". The 2013 was replaced by a 750M '14 model, but I returned it for store credit and bought a '13 Mac Pro.)

What about the GT 330M in the 2010 MBPs and the GT 650M in the Ivy Bridge rMBPs, which were part of the recall program as well? As far as I recall, these NVIDIA GPUs didn't fail in other laptops besides Apple's, but it didn't take a lawsuit for Apple to launch a replacement program for these chips, because NVIDIA reimbursed them.

The 330Ms weren't recalled to my knowledge, but the 650Ms certainly were. Most of the repairs done on my MBPs were because of discrete GPU problems. This is why I have sworn never to buy another discrete-GPU Mac notebook. Meanwhile, two of the three 13" models within my family are reaching 5 years old and still chugging along (knock on wood).
 
How can you even compare an OS X device to those losers?

Every Windows device halts in 2 years, while Mac OS lives on; even the comparison is proof enough that you know nothing.

If you want gaming performance, buy the cheapest Windows device that fits your requirements, play games on it, re-format when it halts, or just dump it like everyone does, and let mother nature take the hit.

As much as I like the Mac, this is BS. If you're talking about a $500 Inspiron, yes, that thing will be dead in 2 years (personal experience). However, my father bought a $2500 Sony VAIO from 2007 that still runs. When I did CAD work at a coil coating plant in Arkansas, there was an old Dell Latitude laptop that was literally over 10 years old running XP. Pay for quality, you get quality, Mac or PC.
 
How can you even compare an OS X device to those losers?

Every Windows device halts in 2 years, while Mac OS lives on; even the comparison is proof enough that you know nothing.

If you want gaming performance, buy the cheapest Windows device that fits your requirements, play games on it, re-format when it halts, or just dump it like everyone does, and let mother nature take the hit.

This is complete and utter nonsense. I'm posting this on an 11-year-old T42 ThinkPad. A good laptop is a good laptop, regardless of who makes it.
 
Not sure if someone has mentioned this, but it's no surprise. AMD Radeon has been a huge supporter of OpenCL, so the change-up makes total sense. Mac Pros and 5K iMacs with Radeons, and now the 15" MacBook Pro.
 
Spent all day hunting through the web for info on this chip, in an effort to decide if I want to upgrade between now and Skylake.

Seems to actually be a different chip. While the M370 is a rebrand (the HD 8750M, which is similar in performance to the 750M), the M370X is code-named Strato XT, and if it's offering up a 70% boost in GPU performance it might well be worth the jump. I just don't like AMD's drivers.

http://wccftech.com/amd-strato-xt-benchmarks-radeon-mobility-r9-m300-series/

----------

Also, has anyone given any thought to getting FreeSync working? That alone would warrant an upgrade in my eyes!
 
Not sure if someone has mentioned this, but it's no surprise. AMD Radeon has been a huge supporter of OpenCL, so the change-up makes total sense. Mac Pros and 5K iMacs with Radeons, and now the 15" MacBook Pro.

So do AMD cards run OpenCL any better than Nvidia's? I'm asking because Nvidia's CUDA acceleration is notoriously buggy/disastrous in Adobe apps (After Effects, Premiere, etc.), so everyone using those programs has learned to turn off CUDA acceleration and use OpenCL.

Maybe these will actually run the Adobe suite better than the Nvidias?
 
Where do you find inspiration for such nonsense? I have a Dell XPS M1330 from 2007 that is still running like a champ. It's actually running better and faster on Windows 8.1 than it ever ran on Vista or Win 7. You need a serious reality check.

My mother still has a 2007 Toshiba 15" laptop running Windows Vista. It's used as a "guest laptop". With an SSD and the RAM maxed out to 4GB, it would have enough juice for a couple of years of good browsing. A laptop is only dead when it's literally broken. Otherwise, just max it out, replace the battery, and it will last forever.

Well, this is not true for Apple laptops anymore, unless in the future we're able to upgrade the SSDs to something as fast as 2GB/s. That way, swapping the SSD wouldn't be an issue.
 
My mother still has a 2007 Toshiba 15" laptop running Windows Vista. It's used as a "guest laptop". With an SSD and the RAM maxed out to 4GB, it would have enough juice for a couple of years of good browsing. A laptop is only dead when it's literally broken. Otherwise, just max it out, replace the battery, and it will last forever.

Well, this is not true for Apple laptops anymore, unless in the future we're able to upgrade the SSDs to something as fast as 2GB/s. That way, swapping the SSD wouldn't be an issue.

Good point! At one point or another I've upgraded the CPU, RAM, BT card, wifi card, and the drive (several times on that one - all upgrades, no failures).
 
Just ordered this, will benchmark it against my current one when it arrives (Bootcamp W7 of course) and gauge if it's worth it. With the student discount and the resale value it's not that bad an upgrade price.

Also sorry to hear people have been having problems with their 750M's failing, I've been lucky and never had a single issue so far.
 

Attachments

  • image.jpg (49.2 KB)
I just don't like AMD's drivers.

As far as I can remember, AMD drivers were consistently better than Nvidia ones under OS X. OS X benchmarks of AMD cards are closer to their Windows counterparts than those of Nvidia cards.

But still, I would be very surprised if the M370X were even remotely as good as a 950M... then again, it makes no sense to speculate without having any information.
 
Interesting, great analysis. A few questions...

- Why wouldn't 5120 x 2160 be listed as the max on any other spec page at Apple or anywhere else over SST? I haven't seen this listed for any Nvidia or AMD offerings, mobile or desktop.

- Isn't the DP signal just kind of passed through for Thunderbolt? I've read a lot of reports saying DP 1.3 requires Thunderbolt 3, but ASUS has some internal Thunderbolt cards for their motherboards that accept a DP signal from another source and just pass it through exactly as is. I'm inclined to think that Thunderbolt 3 isn't necessary for DP 1.3...

- Is there actually a 5120 x 2160 display on the market?

There is apparently at least one 104.6", curved, 5120 x 2160 panel in production, but it ain't cheap:

http://www.samsung.com/us/video/tvs/UN105S9WAFXZA
http://www.lg.com/us/tvs/lg-105UC9-led-tv

These are TVs though, and lack DisplayPort inputs.

Apple probably has a better idea than we do as to what will become available in the near future, and apparently the hardware, OS, and drivers are all ready for 5120 x 2160 on the specified models. Intel's IGPs clearly aren't up to the task, and if I had to guess, it's probably due to memory bandwidth.

Although Thunderbolt controllers can pass through or output a native DP signal, the output is limited to whatever generation of DP they support. Many Macs were limited to DP 1.1 solely by their Thunderbolt controllers, and even if the Radeon R9 M370X (Litho XT?) were to support DP 1.3, it would still be limited to DP 1.2 by the DSL5520 Thunderbolt 2 controller.
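As a rough back-of-the-envelope aside (my own assumptions, not anything stated in these posts): even capped at DP 1.2 by the DSL5520, the link has room for 5120 x 2160 at 60 Hz, but a full 5K 5120 x 2880 stream needs more than a single DP 1.2 link can carry even before blanking is counted, which fits with the 5K iMac needing the extra display-controller arrangement mentioned earlier in the thread.

```python
# Back-of-the-envelope only: raw pixel rate, blanking ignored (my assumptions).
DP12_PAYLOAD_GBITPS = 17.28   # 4 x 5.4 Gbit/s HBR2 after 8b/10b encoding

def raw_rate_gbitps(width, height, refresh_hz, bpp=24):
    """Raw pixel data rate for a mode, without blanking overhead."""
    return width * height * refresh_hz * bpp / 1e9

for w, h in [(5120, 2160), (5120, 2880)]:
    rate = raw_rate_gbitps(w, h, 60)
    verdict = "fits" if rate < DP12_PAYLOAD_GBITPS else "does NOT fit"
    print(f"{w} x {h} @ 60 Hz: ~{rate:.2f} Gbit/s raw -> {verdict} on one DP 1.2 link")
```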
 
As far as I can remember, AMD drivers were consistently better than Nvidia ones under OS X. OS X benchmarks of AMD cards are closer to their Windows counterparts than those of Nvidia cards.

But still, I would be very surprised if the M370X were even remotely as good as a 950M... then again, it makes no sense to speculate without having any information.

Maybe, but I'm only interested in the dGPU under Bootcamp. Also, we know a little from WCCF and from Apple's info, and they are usually pretty accurate. A 70% gain is considerable, but I'll benchmark it soon enough and find out if it's worth the hit to my wallet.
 
Just ordered this, will benchmark it against my current one when it arrives (Bootcamp W7 of course) and gauge if it's worth it. With the student discount and the resale value it's not that bad an upgrade price.

Also sorry to hear people have been having problems with their 750M's failing, I've been lucky and never had a single issue so far.

Let me know how the benchmarks are. I still have access to my girlfriend's student discount through the end of this year, which is making me contemplate getting this even more (granted, I have an upgraded '09 MP and a 2010 MBP).
 
Let me know how the benchmarks are. I still have access to my girlfriend's student discount through the end of this year, which is making me contemplate getting this even more (granted, I have an upgraded '09 MP and a 2010 MBP).

No problem, any games in particular you wanted looked at that I might have? I'm a World of Tanks addict, so a 70% gain would make the game go from playable to a perfect sweet spot while retaining good visuals. ~31 avg to ~50 avg would be huge for me; I sink so many hours into that game it's getting ridiculous, closing in on unicum stats though.

Do you know if Apple (UK) still includes a 3-year warranty with their student discount now that they've moved to Unidays rather than relying on an Internet connection within a university?
 
Maybe, but I'm only interested in the dGPU under Bootcamp. Also, we know a little from WCCF and from Apple's info, and they are usually pretty accurate. A 70% gain is considerable, but I'll benchmark it soon enough and find out if it's worth the hit to my wallet.

Yep, please share your findings! I would be very interested.
 
Yep, please share your findings! I would be very interested.

Yeah, will do. Let me know if there are any games you want tested that I might have and I'll be sure to compare; it will be cool having them side by side. I'll start a thread in this forum for you all to see next week when the Mac arrives.
 
Yeah, will do. Let me know if there are any games you want tested that I might have and I'll be sure to compare; it will be cool having them side by side. I'll start a thread in this forum for you all to see next week when the Mac arrives.

Skyrim, Mass Effect 3, and if you can.. Witcher 3 please =)
 