1,152 GFLOPS is rather impressive for an integrated GPU. My mid-2012 rMBP does about 650 GFLOPS, and the Xbox One does about 1,300 GFLOPS, although that's the "underpowered" Xbox One and not the PS4. If I were in the market and they could get a quad-core CPU plus this integrated chip into a 13" Pro, I might buy one.

But what I really want is a 5K iMac, which has a dedicated card. How does that work? Do these newer chips come all in one package, so the newer iMacs (or at least the higher-end ones) would have two graphics cards, one integrated and one dedicated? From what I've read the 5K iMac can be a little laggy at times, and I want to avoid that; I have to deal with it on my rMBP sometimes and it's annoying. My work iMac doesn't list two cards, and it's a mid-to-higher-end model.

HAH! I just looked up the GFLOPS of my work iMac, and the GeForce GTX 675MX has a rating of 1,152 GFLOPS. What a coincidence… hmm… that's really weird.
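For anyone wondering where these peak numbers come from, here's the usual back-of-the-envelope math (execution units × clock × FLOPs per unit per clock). The unit counts and clocks below are the commonly quoted spec-sheet figures, treated here as assumptions rather than anything confirmed in the thread:

```python
# Back-of-the-envelope peak FP32 throughput: units * clock (GHz) * FLOPs per unit per clock.
# Unit counts and clocks are commonly quoted spec figures (assumptions), not measurements;
# real-world throughput is always lower than the theoretical peak.

def peak_gflops(units, clock_ghz, flops_per_unit_per_clock):
    return units * clock_ghz * flops_per_unit_per_clock

print(peak_gflops(960, 0.600, 2))   # GeForce GTX 675MX: 960 CUDA cores * 600 MHz * 2 (FMA) = 1152
print(peak_gflops(72, 1.000, 16))   # Skylake-class Iris Pro: 72 EUs * ~1 GHz * 16 FP32 ops = 1152
print(peak_gflops(768, 0.853, 2))   # Xbox One GPU: 768 shaders * 853 MHz * 2 ≈ 1310
print(peak_gflops(384, 0.850, 2))   # GT 650M (mid-2012 rMBP), assuming ~850 MHz boost ≈ 653
```

So the 1,152 coincidence isn't magic; it's just what you get when units × clock happens to line up.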
 
If they bring out a Mac Mini with decent 4K support, and GPU performance good enough for casual gaming (not at 4K obviously) it could convince me to retire my Mac Pro.

But it will probably have soldered RAM, which would put me off a bit.
 
What about

1. 10-bit 4:4:4 4K60
2. 10-bit 4:4:4 3D 4K30
3. 8K

using DisplayPort 1.3?
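For a rough sanity check: DisplayPort 1.3's HBR3 mode is 4 lanes at 8.1 Gbps with 8b/10b coding, which leaves roughly 25.9 Gbps for video. Ignoring blanking overhead and assuming no compression (both simplifications), the raw bandwidth each of those modes needs works out roughly like this:

```python
# Rough DisplayPort 1.3 feasibility check. Blanking intervals (a few percent extra) are
# ignored and no compression is assumed; figures are for illustration only.

DP13_USABLE_GBPS = 4 * 8.1 * (8 / 10)   # HBR3: 4 lanes * 8.1 Gbps, 8b/10b coding -> ~25.9 Gbps

def stream_gbps(width, height, refresh_hz, bits_per_pixel):
    # Active-pixel bandwidth only.
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(stream_gbps(3840, 2160, 60, 30))      # 1. 4K60, 10-bit 4:4:4       -> ~14.9 Gbps: fits
print(2 * stream_gbps(3840, 2160, 30, 30))  # 2. 3D 4K30 (two full views) -> ~14.9 Gbps: fits
print(stream_gbps(7680, 4320, 30, 24))      # 3. 8K30, 8-bit              -> ~23.9 Gbps: borderline
print(stream_gbps(7680, 4320, 60, 24))      # 8K60, 8-bit -> ~47.8 Gbps: needs 4:2:0 or a second link
```

So the first two look comfortable on paper; 8K is where a single DP 1.3 cable starts running out of room.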
 
Is an Apple Thunderbolt Display with Thunderbolt 3 and a built-in USB 3.1 Gen 2 Type-C (reversible) hub around the corner?

Likewise for a new Apple Keyboard with Numeric Keypad and built-in USB 3.1 hub with two ports?

Is that possible now with the new Intel Skylake chips?
That would be great.
 
Pfft...give me my 8-core MacBook Pro running at 5GHz with a 2TB PCIe Flash Drive and 64GB of RAM, and I will be happy. MAKE IT HAPPEN

Okay, in all reality: we just have to be patient. These upgrades are in the works, and we'll get them when the technology gets there. An 8-core MacBook Pro will probably always be out of the question due to power and heat... but perhaps the quad-core Skylake offerings will break the 3GHz barrier? And yes, a 2TB PCIe SSD is possible now... and 64GB of RAM might be out of the question, but I'm sure Apple can make a 32GB MacBook Pro a reality in the near future!
 
In my opinion these multi-protocol cables are a bad idea. All the consumer knows is that they have a port that looks like X, and they get upset when they plug something in and it doesn't work, only to have someone explain that they have an early 2015 MacBook and need a late 2015 model to make it work.
That's a necessary downside to having a multi-protocol port. The alternative is that we keep having multiple kinds of ports. What's better in the long run?
 
I can't wait for this new chipset and the subsequent updates to the Macbook Pro and Apple display lines.

My wild speculation—and hope—is Apple has planned a major upgrade to both. My current hardware is not old by any stretch of the imagination and could last years, but I'm really looking forward to the leap of innovation (I think) Apple has in store for us.
 
Hmm, so a rMB with a USB-C port is kind of an underperforming concept until it gets a TB3 port.

Though I guess many current users won't care. Just like TB and TB2, it's a port most people never used anyway, and accessories cost a fortune.
 
There's a cost to pushing all those extra pixels... slower frame rates, lower battery life, higher (costlier) bandwidth usage.

It's the industry's way of getting us to upgrade. 3D failed, curved screens failed, so now it's all about 4K.

It will happen eventually, but I see no reason why Apple should rush to support 4K just to have another check mark on their spec sheet.
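To put a rough number on that cost, the pixel-count arithmetic alone tells most of the story (nothing measured here, just counting):

```python
# 4K UHD versus 1080p: every extra pixel has to be rendered, refreshed, and (for video)
# encoded and streamed, which is where the frame-rate, battery, and bandwidth cost comes from.

px_1080p = 1920 * 1080    # ~2.07 million pixels per frame
px_4k    = 3840 * 2160    # ~8.29 million pixels per frame

print(px_4k / px_1080p)   # 4.0 -> four times the work per frame versus 1080p
```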

It's a valuable check mark, a valuable spec.
I've had a 4K TV for a couple of weeks now. Granted, it's a higher-end model, but I love the difference, the improvement. It upscales very well, for those wondering about 'content'....

It'd be hard for me to consider new stuff, like the upcoming Apple TV, worthwhile if it didn't support 4K... I think new hardware should be at that level now.
 
I can't wait for this new chipset and the subsequent updates to the Macbook Pro and Apple display lines.

My wild speculation—and hope—is Apple has planned a major upgrade to both. My current hardware is not old by any stretch of the imagination and could last years, but I'm really looking forward to the leap of innovation (I think) Apple has in store for us.

The square Mini DisplayPort will be replaced by the USB-C port.

It's Intel driving this. Apple decides how few ports to incorporate into the designs that take advantage of the architecture.
 
Retina MacBook 2 + this:


http://gizmodo.com/heres-the-box-that-can-turn-your-puny-laptop-into-a-gra-1724958260

Here's The Box That Can Turn a Puny Laptop Into a Graphical Powerhouse

USB Type-C is shaping up to be the holy grail of ports. It can charge your laptop, deliver 4K video, and transfer loads of USB data all over a single cable—all at the same time. What could be better? You’re looking at the answer.

What you see in these pictures is a hub that uses Intel’s Thunderbolt 3, a supercharged version of USB-C with double the bandwidth. What does that actually mean in practice? It’s fast enough that you can actually augment the power of a relatively weak laptop with an external graphics card... yes, while still charging the laptop... driving two 4K monitors... and powering your USB devices all at the same time. Here’s what that looks like:

That’s right: with just a single USB-C Thunderbolt cable plugged into the side of this super thin, super light laptop we spotted at IDF 2015, you get three USB 3.0 ports, two HDMI ports, two DisplayPorts, external audio, and ethernet all at the same time. Plus an extra USB Type-C port for—in this case—attaching a ridiculously-fast external solid state drive.

The best part isn’t the plethora of ports, though: it’s the fact that this sleek box has an external graphics chip inside. In this case, an AMD Radeon R9 M385. Hello, games!

What if you need even more graphical muscle? Say, if you want to plug your thin and light laptop in at night and play some Grand Theft Auto V? Thunderbolt 3 can handle a way bigger external graphics card dock, too. Here’s what it looks like with a full-size AMD R9 200 series graphics card, delivering a respectable framerate in the Unigine Heaven benchmark.

Sadly, all of these Thunderbolt 3 boxes—and the laptop—are just Inventec reference designs, not commercial products yet. Plus, Intel won’t say what they might cost or when they might arrive, though the first real Thunderbolt 3 products will allegedly start hitting the market by the end of the year.

Will manufacturers actually build external graphics solutions with Thunderbolt 3? “Watch this space,” says Navin Shenoy, an Intel executive.

The weak link here is the CPU in the laptop. With that setup, a desktop is so much cheaper and more efficient.

I like the concept; the new Alienware line has external GPU setups. In practice, though, people who want gaming laptops want it all in one package, and the box for the external GPU is huge.
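On the bandwidth side, a rough sketch of why an external box over Thunderbolt 3 still isn't the same as a desktop slot (standard PCIe 3.0 figures used as assumptions, not measurements of this particular dock):

```python
# PCIe 3.0 is 8 GT/s per lane with 128b/130b encoding (~7.9 Gbps usable per lane).
# Thunderbolt 3's 40 Gbps is shared between PCIe tunnelling, DisplayPort, and USB,
# so the external GPU sees roughly an x4-class link at best.

pcie3_lane_gbps = 8.0 * (128 / 130)   # ~7.88 Gbps usable per lane

desktop_slot = 16 * pcie3_lane_gbps   # full-size x16 slot: ~126 Gbps
tb3_link     = 4 * pcie3_lane_gbps    # x4-class link over TB3: ~31.5 Gbps

print(desktop_slot, tb3_link, desktop_slot / tb3_link)   # roughly a 4:1 bandwidth gap
```

Whether that gap matters in practice depends on the game, but combined with a laptop-class CPU it's easy to see why a desktop still wins on price and performance.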
 
AMD needs to get their act together. I've never seen the CPU market so stagnant. Intel isn't even innovating anymore, given these minimal performance increases.
 
I would personally like to see one 4K monitor at 120Hz, or better yet, 240Hz.
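Rough arithmetic on what those refresh rates would need over a single uncompressed 8-bit stream, assuming DP 1.3's ~25.9 Gbps of usable bandwidth and ignoring blanking (simplifications, not official figures):

```python
# Required bandwidth for uncompressed 8-bit RGB at 4K, versus a DP 1.3 link.

DP13_USABLE_GBPS = 4 * 8.1 * (8 / 10)   # ~25.9 Gbps after 8b/10b coding

def stream_gbps(width, height, refresh_hz, bits_per_pixel):
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(stream_gbps(3840, 2160, 120, 24))  # ~23.9 Gbps: just squeezes under the DP 1.3 limit
print(stream_gbps(3840, 2160, 240, 24))  # ~47.8 Gbps: beyond a single DP 1.3 link
```

So 4K120 looks plausible on one cable; 4K240 would need two links, chroma subsampling, or compression.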
 
And the benefit of all those extra pixels is sharper images. Do you feel the same about "retina" or the more recent launches that go beyond what was originally defined as retina (apparently the maximum the human eye can resolve)? Let me guess: no, that's completely different. Yes, pushing more pixels asks more of the underlying hardware... just like Retina (for which we don't seem to find much fault with Apple as they roll those many-more-pixel displays out; in fact, retina or retina+ is often one of the most touted reasons new hardware from Apple is a "must-have upgrade").

Apparently, more robust hardware is about to hit. If Intel is going to build this into stock, core hardware, Apple has to move along. Otherwise, there are going to be tons of cheaper Windows computers displaying 4K on up to three monitors while Apple "takes its time" (even though the very same chips will be inside the next generation of Apple computers).

This 4K thing is dazzling. We will whine to no end about anything where we don't feel Apple is advancing with the latest & greatest... except this one thing, where the status quo is apparently "good enough" (until Apple goes there, at which point none of these positions will turn on Apple for doing so... just as this crowd argued "720p is good enough" but didn't fault Apple when it embraced 1080p). Mysteriously, once Apple adopts it, its gimmicky, useless, "failed" status is magically transformed into gushing praise & "I'm already in line" greatness.

There's also something called the point of diminishing returns, and for most people anything beyond retina is past it.
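To put a rough number on that point of diminishing returns: a common simplification is that the eye resolves about one arcminute, so the pixel density you can actually perceive depends mostly on viewing distance. The arcminute figure and the distances below are assumptions for illustration, not anything measured:

```python
import math

def retina_ppi(viewing_distance_inches, eye_limit_arcmin=1.0):
    # Pixel density at which one pixel spans the assumed resolving angle of the eye;
    # denser than this and extra pixels stop adding visible detail at that distance.
    angle_rad = math.radians(eye_limit_arcmin / 60.0)
    return 1.0 / (viewing_distance_inches * math.tan(angle_rad))

print(retina_ppi(18))    # ~191 ppi needed at laptop distance (~18"); a 15" rMBP is ~220 ppi
print(retina_ppi(120))   # ~29 ppi needed at ~10 ft; a 55" 4K TV is already ~80 ppi
```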

And I never said there's no benefit to 4K. It will most definitely help with larger formats, and when we're talking about video walls, even 4K isn't enough.

Like I said, 4K will eventually be the standard; my company and all our competitors are creating and promoting 4K products, so sheer inertia will ensure that happens. But when you can't even get true 1080p streams, what's the point of rushing to adopt 4K?

As for Apple strategically introducing tech late being called magical, I've never heard that. If such sentiment exists, it's because of their superior implementation of an older idea, like NFC payments or MP3 players. No one said Apple's entry into the LTE market was magical, for example. But that's a perfect example of how Apple's restraint and good timing benefited users, and they'd do well to delay 4K adoption as well, for the reasons I cited earlier.
 
Nice update.

Now, if ASUS/Acer/Dell/BenQ/AOC/LG/Samsung/HP/Sharp/QNIX/et cetera would pay their software engineers to get off their collective asses and write display drivers for the OS X platform...
 
I don't think so. The chip ships about now. Even if you have the specs, doesn't a new motherboard need to be fabricated? A new shell to fit it all? I think we won't see new Skylake laptops until March 2016 at the earliest, maybe even May/June 2016?

Oh dear, the endless waiting. Hahaha, I guess I was just dreaming.
 
But when you can't even get true 1080p streams, what's the point of rushing to adopt 4K?

Again, so typical of the status-quo "1080p is good enough" argument, which mirrors the former "720p is good enough" stance from when Apple still clung to that. Were there ever true 720p streams back then?

There's not a single app in the App Store that can take advantage of the truly unique features of the A9. So why bother with the A9?

There was nothing to hook up to USB 3.1 Type-C, so why bother implementing that hardware "advancement"?

There's nothing to take advantage of Thunderbolt 3, so why bother?

Why bother with Skylake when there's not a lick of software to take advantage of any unique bit of those new chips?

In fact, since pretty much everything works now, why bother with anything new? Just stick with the "as is."

And on and on. Why are we so quick to argue for status quo for this ONE bit of technology but don't lash out with similar rationale against the advancement of just about anything and everything else? Again rhetorical. I asked the same question when the chorus was "720p is good enough". Where did all those in that chorus go after Apple embraced 1080p? That's also rhetorical: I'm guessing they went to the same place that all of the "1080p is good enough" crowd will go as soon as Apple embraces 4K... which appears to be very soon... else the competing computers using the exact same chips will appear to offer the 4K advantage (real or imagined).
 