If they bring out a Mac Mini with decent 4K support and GPU performance good enough for casual gaming (not at 4K, obviously), it could convince me to retire my Mac Pro.

But it will probably have soldered RAM, which would put me off a bit.

In your case, wait for Kaby Lake or Cannonlake. The iGPU should have caught up by then, though there will still be compromises.
 
Pfft. I run two 1080p monitors at 144 Hz each. Everything is buttery smooth. I went back to 60 Hz briefly the other day and instantly noticed the difference; so much choppier! 4K at 144 Hz or gtfo!
 
Again, so typical of the status-quo "1080p is good enough" argument, which mirrors the former "720p is good enough" stance from when Apple still clung to that. Were there ever true 720p streams back then?

There's not a single app in the App Store that can take advantage of the truly unique features of the A9. So why bother with the A9?

There was nothing to hook up to USB-C, so why bother implementing that hardware "advancement"?

There's nothing to take advantage of Thunderbolt 3, so why bother?

Why bother with Skylake when there's not a lick of software to take advantage of any unique bit of those new chips?

In fact, since pretty much everything works now, why bother with anything new? Just stick with the "as is."

And on and on. Why are we so quick to argue for the status quo on this ONE bit of technology, but don't lash out with similar rationale against the advancement of just about anything and everything else? Again, rhetorical. I asked the same question when the chorus was "720p is good enough". Where did all those in that chorus go after Apple embraced 1080p? That's also rhetorical: I'm guessing they went to the same place that all of the "1080p is good enough" crowd will go as soon as Apple embraces 4K... which appears to be very soon... else the competing computers using the exact same chips will appear to offer the 4K advantage (real or imagined).

You're missing the point entirely. It's not about maintaining the status quo; it's about timing. And the irony of your post is that Apple has done more to disrupt the status quo than any of its peers.

And why are you so obsessed with how everyone might lavish praise on Apple once they adopt 4K, or any new technology that others adopted first? Apple gets an unfair amount of criticism as well. Does that bother you as much? It just comes with the territory of being the world's largest and most visible company.
 
Is it just me, or is calling everything USB-C rather confusing? Is USB-C a connector or a transfer protocol? I understand that TB3 contains USB-C, but USB-C doesn't necessarily contain TB3. When I see a connector called USB-C, is it TB3 or just USB-C, or is it just a connector that may not carry USB 3 at all?

Seems like the connector should have a different name, with USB-C reserved for the protocol that is the next generation after USB 3.0.
 
USB-C is a connector type.
There's also a 10 Gb/s variant of USB 3 known as USB 3.1 Gen 2.
And a 40 Gb/s variant of Thunderbolt known as TB3.
And a DisplayPort revision supporting 5K and 8K displays (DP 1.3).

But you don't necessarily get anything more than, e.g., USB 2.0 over a USB-C connector, if the implementer decides to be stingy. Not even 100 W of power.
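
To make the connector-versus-protocol distinction concrete, here's a minimal sketch (the type and property names are mine, purely hypothetical, not any real API) of how two ports can share the exact same USB-C plug yet carry very different things:

```swift
// Hypothetical model: USB-C is only the physical connector; the signals
// and power that travel over it are up to the implementer.
enum LinkProtocol: String {
    case usb2          = "USB 2.0 (480 Mb/s)"
    case usb31gen1     = "USB 3.1 Gen 1 (5 Gb/s)"
    case usb31gen2     = "USB 3.1 Gen 2 (10 Gb/s)"
    case thunderbolt3  = "Thunderbolt 3 (40 Gb/s)"
    case displayPort13 = "DisplayPort 1.3"
}

struct USBCPort {
    let protocols: Set<LinkProtocol> // what the implementer actually wired up
    let maxPowerWatts: Int           // USB-PD tops out at 100 W, but it's optional
}

// A "stingy" port: same plug, USB 2.0 speeds, no fast charging.
let cheapPort = USBCPort(protocols: [.usb2], maxPowerWatts: 15)

// A full Thunderbolt 3 port carries all of it over the same plug.
let tb3Port = USBCPort(protocols: [.usb2, .usb31gen2, .thunderbolt3, .displayPort13],
                       maxPowerWatts: 100)

print(cheapPort.protocols.contains(.thunderbolt3)) // false
print(tb3Port.protocols.contains(.thunderbolt3))   // true
```

Both of those would be advertised as "USB-C" on a spec sheet, which is exactly the source of the confusion.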
 
I don't care about 4K... I want 5K and laptops with external GPUs as an option. Where is DisplayPort 1.3?
 
Apple's specs for the 13-inch rMBP (Broadwell, I believe) state it can support two 4K displays plus the internal, nearly-4K display. It doesn't have a discrete GPU. This seems like a discrepancy between the article, which says Broadwell only allows one 4K display, and Apple's specs. Anyone know what the real story is here?
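
For what it's worth, a rough bandwidth sketch (nominal figures, ignoring blanking overhead; my own arithmetic, not anything from Apple or the article) suggests the spec sheet is plausible: one 4K@60 stream fits within a single DisplayPort 1.2 link, so a machine with two independent DP 1.2 outputs can drive two 4K displays even without a discrete GPU:

```swift
// Nominal payload needed for one 4K@60 stream (ignores blanking intervals).
let pixelsPerFrame  = 3840.0 * 2160.0
let bitsPerPixel    = 24.0   // 8-bit RGB
let framesPerSecond = 60.0

let streamGbps = pixelsPerFrame * bitsPerPixel * framesPerSecond / 1e9
let dp12PayloadGbps = 17.28  // DP 1.2 HBR2, 4 lanes, after 8b/10b coding

print(streamGbps)                    // ~11.9 Gb/s per 4K@60 stream
print(streamGbps < dp12PayloadGbps)  // true: one stream fits per DP 1.2 link
```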
 
All this focus on 4K (or better) in impending hardware, phones (but not iPhones yet) shooting video at 4K, 4K camcorders dropping in price, 4K TVs dropping in price and increasingly overtaking prime real estate at TV-selling stores, H.265 seemingly impending, etc.

Then visit any Apple TV 4 speculation thread and find it packed with some of us putting down 4K as a "gimmick", saying "nobody can see a difference from average seating distances", and on and on. Nutshell sentiment: "1080p is good enough", just as "720p was good enough" when Apple still clung to that as maximum HD (and thus 1080p was the "gimmick" that "nobody could see", etc.).

Glad to see lots of stock hardware bringing the capability to the masses. Wonder if 4K will still be a "gimmick" that "nobody can see" when Apple gets around to implementing it in Apple hardware? Rhetorical: I already know, as I saw how quickly the "720p is good enough" argument evaporated as soon as Apple embraced 1080p. Rinse. Repeat.

Do you think we'll get a 4K iPhone display someday?
 
I must be one of the few who don't see the appeal. From most viewing distances, I cannot discern individual pixels on my 1080p monitor, and that's when I'm wearing my glasses, contacts, pocket protector, etc.

From more than 24", yes. But if you sit closer than that, it's the difference between Retina and non-Retina, and that is a very big deal.
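
That's the usual "retina" arithmetic: a display stops resolving into visible pixels once each pixel subtends less than about one arcminute. A quick sketch (the one-arcminute threshold is the common rule of thumb, not a hard vision limit):

```swift
import Foundation

// PPI needed for a display to read as "retina" at a given viewing distance,
// using the 1-arcminute-per-pixel rule of thumb.
func retinaThresholdPPI(viewingDistanceInches: Double) -> Double {
    let oneArcminute = (1.0 / 60.0) * Double.pi / 180.0        // radians
    let pixelPitch = viewingDistanceInches * tan(oneArcminute) // inches per pixel
    return 1.0 / pixelPitch
}

// A 24" 1080p monitor works out to roughly 92 ppi.
let monitorPPI = sqrt(1920.0 * 1920.0 + 1080.0 * 1080.0) / 24.0

print(retinaThresholdPPI(viewingDistanceInches: 24)) // ~143 ppi needed at 24"
print(retinaThresholdPPI(viewingDistanceInches: 36)) // ~95 ppi needed at 36"
print(monitorPPI)                                    // ~92 ppi
```

So a 1080p monitor of that size only clears the bar from about three feet back, which matches both posts: invisible pixels at typical distances, clearly visible up close.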
 
USB-C is a connector type.
There's also a 10 Gb/s variant of USB 3 known as USB 3.1 Gen 2.
And a 40 Gb/s variant of Thunderbolt known as TB3.
And a DisplayPort revision supporting 5K and 8K displays (DP 1.3).

But you don't necessarily get anything more than, e.g., USB 2.0 over a USB-C connector, if the implementer decides to be stingy. Not even 100 W of power.

Most of the Z170 motherboards that came out support USB 3.1 and a single USB-C connector, and none that I looked at support TB3 over it, as they're all using third-party chips (e.g. ASMedia) for it. http://www.anandtech.com/show/9485/...-asrock-asus-gigabyte-msi-ecs-evga-supermicro

"As mentioned in GIGABYTE’s details above, the Alpine Ridge solution will add around $10 to the cost of the board, which probably translates near $20 to the end-user cost. It is our understanding that the increased speed of the Z170 launch means that there has been supply issues with Alpine Ridge controllers and that there will be more products coming out next month (September) from various manufacturers that will use the controller."

So you're going to get burned if you early-adopt USB-C. Make sure what you get supports TB3, because if it doesn't now, it never will.
 
Do you think we'll get a 4K iPhone display someday?

Maybe? Some competitors are there now. Apparently "retina" (the maximum resolution resolvable by human eyes) was not enough for Apple, as they have since upped the DPI on some iDevices. Plus, an iDevice that shoots 4K probably begs for a 4K screen (even at that tiny size). Since a 4K iDevice also begs for very profitable on-board storage upgrades, I can see it. Looking backwards, Apple updated the Apple TV to 1080p AFTER they had iDevices on the market that could shoot 1080p.

My guess: given this Skylake news, all of the computer makers are going to be hyping 4K soon, so I will be somewhat surprised if the next iPhone's new camera lacks the ability to shoot it. Then again, I was also surprised that Apple seemed about last in adopting 1080p, so who knows?
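
On the storage-upgrade angle, some loose math (the bitrate is my ballpark assumption, not an Apple figure) shows why 4K capture pairs so nicely with selling bigger flash tiers:

```swift
// Rough flash consumption for 4K capture at an assumed bitrate.
let assumedBitrateMbps = 45.0 // ballpark for 4K@30 capture; not an official spec
let minutes = 10.0

let gigabytes = assumedBitrateMbps * 60.0 * minutes / 8.0 / 1000.0
print(gigabytes) // ~3.4 GB for ten minutes of footage
```

At that rate a 16 GB base model fills up after well under an hour of shooting, which is exactly what pushes buyers up the storage tiers.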
 
This is getting confusing. So Thunderbolt 3 uses the USB-C port. Then why continue making USB 3.1 and all that? Let Intel take control of the USB family, then.

You go to a computer store and one computer says it has USB-C. Then there is another computer that says it has Thunderbolt 3, but the port looks exactly like USB-C. Why can't there just be Thunderbolt 3 ports? Might as well kill off the USB name.
 
Typographers can tell the difference between 972 dpi and 300 dpi. The problem of making fonts look good on low-resolution devices, such as 300 dpi laser printers, was solved by "hinting". Of course, that's ink and paper.
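
For scale, a quick conversion of those figures into dot pitch (my own arithmetic, not from the post):

```swift
// Dot pitch at a given print resolution; 25,400 microns per inch is exact.
for dpi in [300.0, 972.0] {
    let dotMicrons = 25_400.0 / dpi
    print("\(Int(dpi)) dpi -> dot pitch of about \(Int(dotMicrons.rounded())) microns")
}
// 300 dpi -> ~85 microns, 972 dpi -> ~26 microns. At 300 dpi the dots are
// coarse enough that font outlines need "hinting" to snap cleanly to the grid.
```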

 
I suppose it's no different than plugging a 3.1 Gen 1 device or a DisplayPort 1.2 device into the Early 2015 MacBook.
 
Just waiting for Apple to make this a reality!

Retina MacBook 2 + this:


http://gizmodo.com/heres-the-box-that-can-turn-your-puny-laptop-into-a-gra-1724958260

Here's The Box That Can Turn a Puny Laptop Into a Graphical Powerhouse

USB Type-C is shaping up to be the holy grail of ports. It can charge your laptop, deliver 4K video, and transfer loads of USB data all over a single cable—all at the same time. What could be better? You’re looking at the answer.

What you see in these pictures is a hub that uses Intel’s Thunderbolt 3, a supercharged version of USB-C with double the bandwidth. What does that actually mean in practice? It’s fast enough that you can actually augment the power of a relatively weak laptop with an external graphics card... yes, while still charging the laptop... driving two 4K monitors... and powering your USB devices all at the same time. Here’s what that looks like:

That’s right: with just a single USB-C Thunderbolt cable plugged into the side of this super thin, super light laptop we spotted at IDF 2015, you get three USB 3.0 ports, two HDMI ports, two DisplayPorts, external audio, and Ethernet all at the same time. Plus an extra USB Type-C port for attaching (in this case) a ridiculously fast external solid-state drive.

The best part isn’t the plethora of ports, though: it’s the fact that this sleek box has an external graphics chip inside. In this case, an AMD Radeon R9 M385. Hello, games!

What if you need even more graphical muscle? Say, if you want to plug your thin and light laptop in at night and play some Grand Theft Auto V? Thunderbolt 3 can handle a way bigger external graphics card dock, too. Here’s what it looks like with a full-size AMD R9 200 series graphics card, delivering a respectable framerate in the Unigine Heaven benchmark.

Sadly, all of these Thunderbolt 3 boxes—and the laptop—are just Inventec reference designs, not commercial products yet. Plus, Intel won’t say what they might cost or when they might arrive, though the first real Thunderbolt 3 products will allegedly start hitting the market by the end of the year.

Will manufacturers actually build external graphics solutions with Thunderbolt 3? “Watch this space,” says Navin Shenoy, an Intel executive.
 