Is it just me, or is calling everything USB-C rather confusing? Is USB-C a connector, or a transfer protocol? I understand that TB3 contains USB-C, but USB-C doesn't necessarily contain TB3. When I see a connector called USB-C, is it TB3 or just USB-C, or is it just a connector that doesn't even carry USB 3 at all?

Seems like it should have a different name for the connector and then call USB-C a protocol that is the next generation after USB 3.0.
It will always be USB-C, since Thunderbolt 3 also uses the USB-C connector. The only thing consumers will have to worry about is whether it has Thunderbolt or not.
 
In June, Intel introduced Thunderbolt 3 with a USB Type-C connector and support for USB 3.1, DisplayPort 1.2 and PCI Express 3.0. The new spec, rumored to launch alongside Intel's next-generation Skylake chips, is capable of driving up to two 4K external displays at 60Hz or a single 5K display at 60Hz running off a single cable.

Article Link: Intel's Skylake Chips Will Drive Three 4K Monitors at 60Hz
In Alt mode, USB-C supports DisplayPort 1.3
 
All this focus on 4K (or better): impending hardware, phones (but not iPhones yet) shooting video at 4K, 4K camcorders dropping in price, 4K TVs dropping in price and increasingly taking over prime real estate at TV-selling stores, H.265 seemingly imminent, etc.

Then, visit any :apple:TV 4 speculation thread and find it packed with some of us putting down 4K as a "gimmick", "nobody can see a difference from average seating distances", and on and on. Nutshell sentiment: "1080p is good enough" just as "720p was good enough" when Apple continued to cling to that as maximum HD (and thus 1080p was the "gimmick", "nobody can see", etc).

Glad to see lots of stock hardware bringing the capability to the masses. Wonder if 4K will still be "gimmick" and "nobody can see" when Apple gets around to implementing it in Apple hardware? Rhetorical: I already know as I saw how quickly the "720p is good enough" argument evaporated as soon as Apple embraced 1080p. Rinse. Repeat.
You are confusing 4K screens with 4K video.

4K is currently still a gimmick with regard to video quality. LinusTechTips recently did a study on this on YouTube and concluded that there was a negligible difference between native 4K video and 1080p video upscaled to 4K when shot from a phone.

People want a 4k display not so much to watch 4k content, but more so that they can work on 1080p video while still having enough room for controls and other apps open side by side.
 
Pfft. I run two 1080p monitors at 144Hz each. Everything is buttery smooth. I briefly went back to 60Hz the other day and instantly noticed the difference, so much choppier! 4K at 144Hz or gtfo!

But...it's still 1080p, 1/4 the resolution of 4k. Blech.

4K at 144Hz doesn't exist. Such a monitor would need 35.83 Gbps of bandwidth. By comparison, your single 1080p at 144Hz is only 8.96 Gbps.

Thunderbolt 3 has a max bandwidth of 40 Gbps! So theoretically it could support a 4K monitor at 144Hz, provided that you have the graphics horsepower to drive such a beast of a monitor.
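
For anyone wondering where those numbers come from, here's a rough back-of-the-envelope sketch (my own, assuming 10 bits per color channel and ignoring blanking/encoding overhead, which seems to be how the figures above were derived):

# Rough uncompressed display bandwidth: width x height x refresh x bits per pixel.
# Assumes 10 bits per channel (30 bpp) and ignores blanking/encoding overhead.
def display_gbps(width, height, hz, bits_per_pixel=30):
    return width * height * hz * bits_per_pixel / 1e9

print(display_gbps(1920, 1080, 144))  # ~8.96 Gbps  (one 1080p monitor at 144Hz)
print(display_gbps(3840, 2160, 144))  # ~35.83 Gbps (4K at 144Hz)
print(display_gbps(3840, 2160, 60))   # ~14.93 Gbps (4K at 60Hz)

Against Thunderbolt 3's quoted 40 Gbps, 4K at 144Hz fits on paper, but only just, and a real link also carries protocol overhead, so the margin is thinner in practice.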
 
Maybe? Some competitors are there now. Apparently "retina" (the maximum resolution resolvable by human eyes) was not enough for Apple, as they have since upped the DPI on some iDevices. Plus, an iDevice that shoots 4K probably begs for a 4K screen (even at that tiny size). Since a 4K iDevice also begs for very profitable on-board storage upgrades, I can see it. Looking backwards, Apple updated :apple:TV to 1080p AFTER they had iDevices on the market that could shoot 1080p.

My guess: given this Skylake news, all of the computer makers are going to be hyping 4K soon, so I will be somewhat surprised if the next iPhone's new camera lacks the ability to shoot it. Then again, I was also surprised that Apple seemed about last in adopting 1080p, so who knows?

Great!!! Can't wait until my eyes feast on that 4K iPhone display.
 
Typographers can tell the difference between 972 dpi and 300 dpi. The problem of making fonts look good on low-resolution devices, such as 300 dpi laser printers, was solved by "hinting". Of course, that's ink and paper.

(Almost) anybody can tell. But, some people care, and, some don't. It may be a genetic difference. I've had people tell me that they just don't care about the difference between VHS tapes and high-quality HD, because, presumably, all they care about is the dialog. I'm the opposite. I'm fine with hearing a Shakespeare play, but, I have to see a Hitchcock film in HD (at least).

But, so far, I'm underwhelmed by Skylake. Although that could change if the single-threaded CPU performance is much improved, as has been hinted.

OK, I guess it does "legitimize" 4K for the mass-market, and, that will help everyone.

Resolution isn't everything, though. I was watching Poirot episodes a while back, and, it was really too bad when they switched from Panaflex 16 to some direct to digital HD. HD color was not very good at first.
 
People want a 4k display not so much to watch 4k content, but more so that they can work on 1080p video while still having enough room for controls and other apps open side by side.

Same thing said about why Apple chose 5K for iMacs: so people can edit 4K video with room for controls, etc.

But let me guess: 5K is perfectly fine and useful for iMacs... even a great reason for people to buy a new iMac.
 
Great!!! Can't wait until my eyes feast on that 4K iPhone display.

If there's ever a reason to argue about resolution overkill, I'd think 4K on a 4.7" or 5.5" screen would be it. But just wait: should Apple do it, we'll gush about the dazzling greatness of "Retina 4K" and how much sharper that screen is than the former "Retina HD" or just plain old "Retina".

We simply find fault with the 4K "gimmick" when it comes to talking about TV screens 10 to 15 times bigger than iPhone screens. At those much larger sizes, it's all a big gimmick... a plot by the evil TV industry to sell us another TV when the technology we already have is perfectly fine "as is". :rolleyes:

Then, Apple will embrace 4K and we'll fall all over ourselves to "shut up and take my money" for all that 4K goodness. See the NFC comments before Apple Pay. Or Samsung watches that could only hold a charge for one day's use. Or bigger-screen phones when Apple still touted 4" screens as "perfect"... and bigger-screen phones when Apple still touted 3.5" as "perfect". And so on.

The thing is this: what in this article is going to be the main focus of computer marketing this year? Thunderbolt 3? USB-C? 4K? Two ports that can connect to just about nothing, or resolution at several times 1080p? Can Apple's computer marketing, using the very same chips, wait until Apple feels like touting 4K? And if Apple is going to talk it up, why not leverage the same spin with the iDevices and rumored new :apple:TV too?

Of course, that said, it felt like Apple dragged their feet on going 1080p when it seemed that pretty much everyone else had already gone there. So why should this be any different?
 
The article is simply wrong about which chips support 4K. I've got a 15" Haswell MacBook Pro that has been running two 4K displays over DisplayPort at 60Hz since late 2014. My Ivy Bridge Mac mini from 2012 can handle 4K at 30Hz.
 
The thing is this: what in this article is going to be the main focus of computer marketing this year? Thunderbolt 3? USB-C? 4K? Two ports that can connect to just about nothing, or resolution at several times 1080p? Can Apple's computer marketing, using the very same chips, wait until Apple feels like touting 4K? And if Apple is going to talk it up, why not leverage the same spin with the iDevices and rumored new :apple:TV too?

Of course, that said, it felt like Apple dragged their feet on going 1080p when it seemed that pretty much everyone else had already gone there. So why should this be any different?

Apple has been the biggest pusher behind better displays and faster I/O.

For many years laptop manufacturers were satisfied with low-res TN panels until Apple started using high-res IPS displays. There are plenty of smartphones with more pixels than the iPhone, but the iPhone 4 was the main driving force. The whole HiDPI craze was forced by Apple, and now HiDPI IPS displays are commonplace. Let's not forget that Apple jumped straight to 5K, which has way more pixels than 4K.

As for I/O, Apple had more engineers behind USB-C than any tech company other than Intel. Apple worked closely with Intel to develop Thunderbolt, and it seems like Thunderbolt 3 will finally fulfill the goal of one cable to rule them all. Say hello to 5K or 2x 4K with a single cable.
 
If there's ever a reason to argue about resolution overkill, I'd think 4K on a 4.7" or 5.5" screen would be it. But just wait: should Apple do it, we'll gush about the dazzling greatness of "Retina 4K" and how much sharper that screen is than the former "Retina HD" or just plain old "Retina".

We simply find fault with the 4K "gimmick" when it comes to talking about TV screens 10 to 15 times bigger than iPhone screens. At those much larger sizes, it's all a big gimmick... a plot by the evil TV industry to sell us another TV when the technology we already have is perfectly fine "as is". :rolleyes:

Then, Apple will embrace 4K and we'll fall all over ourselves to "shut up and take my money" for all that 4K goodness. See the NFC comments before Apple Pay. Or Samsung watches that could only hold a charge for one day's use. Or bigger-screen phones when Apple still touted 4" screens as "perfect"... and bigger-screen phones when Apple still touted 3.5" as "perfect". And so on.

The thing is this: what in this article is going to be the main focus of computer marketing this year? Thunderbolt 3? USB-C? 4K? Two ports that can connect to just about nothing, or resolution at several times 1080p? Can Apple's computer marketing, using the very same chips, wait until Apple feels like touting 4K? And if Apple is going to talk it up, why not leverage the same spin with the iDevices and rumored new :apple:TV too?

Of course, that said, it felt like Apple dragged their feet on going 1080p when it seemed that pretty much everyone else had already gone there. So why should this be any different?

As you guessed correctly, I think 4K is overkill on a 5" smartphone display. I also think the benefit of 4K won't be that noticeable, if at all, on a 40-50" 4K TV at typical viewing distances. You'll need a much larger TV to see the difference, and a lot of people will wonder what the fuss over 4K is about. That's not to say I don't want to see Apple and other companies go full steam ahead with 4K adoption. I do. I just feel that at the moment, 4K is only marginally better than 3D when it comes to benefiting the average consumer, unless you take into account the extra benefit of having "more to work with" when it comes to editing. With that being said, I will definitely consider 4K when it comes to recording special events in my personal life, so my video recording will be "future proof".
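
To put rough numbers on the "typical viewing distances" point, here's a quick sketch (my own, using the common rule of thumb that about 60 pixels per degree is the limit of 20/20 vision, and an assumed 8-foot couch distance and 12-inch phone distance):

import math

# Pixels per degree of visual angle for a 16:9 screen, given its diagonal,
# horizontal resolution and viewing distance (all lengths in inches).
# Roughly 60 px/degree is a common approximation of the 20/20 acuity limit.
def pixels_per_degree(diagonal, horizontal_px, distance):
    width = diagonal * 16 / math.hypot(16, 9)   # screen width from the diagonal
    pitch = width / horizontal_px                # size of one pixel
    deg_per_px = math.degrees(2 * math.atan(pitch / (2 * distance)))
    return 1 / deg_per_px

print(pixels_per_degree(50, 1920, 96))   # ~74 px/deg: 50" 1080p TV at 8 feet
print(pixels_per_degree(50, 3840, 96))   # ~148 px/deg: 50" 4K TV at 8 feet
print(pixels_per_degree(5.5, 1920, 12))  # ~84 px/deg: 5.5" 1080p phone at 12 inches

Both TV figures are already past the ~60 px/deg mark, which lines up with the idea that 4K is hard to appreciate on a 40-50" set at normal distances, and a 1080p phone at arm's length is already beyond it too.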
 
These increases in graphics performance (going by # of gigaflops) are incredible! At this rate, we won't even need those fancy external GPUs.

I can't wait for high-res desktop displays to go mainstream. Funny how we've had rMBPs with more pixels than the Apple Thunderbolt Display all this time. Go to Fry's, and you'll find lame 1920x1080 displays everywhere unless you look in the high-price section. Probably because the phone and laptop markets are taking over. A big, cheap Android phone will have more pixels than an average desktop display... I mean, come on.
 
As you guessed correctly, I think 4K is overkill on a 5" smartphone display. I also think the benefit of 4K won't be that noticeable, if at all, on a 40-50" 4K TV at typical viewing distances. You'll need a much larger TV to see the difference, and a lot of people will wonder what the fuss over 4K is about. That's not to say I don't want to see Apple and other companies go full steam ahead with 4K adoption. I do. I just feel that at the moment, 4K is only marginally better than 3D when it comes to benefiting the average consumer, unless you take into account the extra benefit of having "more to work with" when it comes to editing. With that being said, I will definitely consider 4K when it comes to recording special events in my personal life, so my video recording will be "future proof".
If you don't see a difference between a Full HD and a 4K TV (even on a 50-inch one), you should definitely pay a visit to an eye doctor, my friend...

4K is not a gimmick when speaking about TVs, LCDs and notebooks, but for the iPhone it does not make sense...
 
The article is simply wrong about which chips support 4K. I've got a 15" Haswell MacBook Pro that has been running two 4K displays over DisplayPort at 60Hz since late 2014. My Ivy Bridge Mac mini from 2012 can handle 4K at 30Hz.

I believe your laptop has a dedicated GPU (Nvidia), and this article is discussing Intel's built-in iGPU (HD Graphics).
 
If you don't see a difference between a Full HD and a 4K TV (even on a 50-inch one), you should definitely pay a visit to an eye doctor, my friend...

4K is not a gimmick when speaking about TVs, LCDs and notebooks, but for the iPhone it does not make sense...

I definitely noticed a difference on a 65" 4K TV. Made me a little envious. Definitely not on a Sony 43" 4K TV, which is most likely the model I would get if I were in the market right now to replace my 40" FHD TV in my apartment.
 
Please Apple, start the September event by showing whatever iOS thingies are needed to keep investors and teenz happy, but then provide us with the real stuff we need for work:

-New iMacs with better OpenCL performance.
-Quad-core Mac mini.
-Updated Mac Pro, with updated GPUs and CPUs.
-Please provide us with an OpenCL profiler, so that we can develop OpenCL kernels without guessing by trial and error where the bottleneck is on the GPU.
-Bringing back NVIDIA GPUs would be welcome, mainly because they allow us to use the same machine for CUDA and OpenCL development.

I don't care if you dedicate the whole September event to the thingies stuff, provided you also release the products we need at the same time, even silently.

Thanks!
 
Ha! TBH, I think a 30" 6-8K would be just about perfect. 28" feels like it could use a bit more space, and 32" is a bit too much screen to fit in your field of view.
And then there are people who like a wider FoV. You can always move a few inches further from the screen, but going closer is not so easy (multiple monitor angles, near vision, etc.).

You might be right. USB-C is still a very capable port without Thunderbolt, just not as capable.
USB-C can include DP 1.3; TB3 does not.

What about
1. 10-bit 4:4:4 4K60
2. 10-bit 4:4:4 3D 4K30
3. 8K
using DisplayPort 1.3?
TB3 does not support DP 1.3. TB is designed to be based on existing tech; DP 1.3 was finalized when TB3 was already in design, so TB is always one generation behind.
https://forums.macrumors.com/thread...isplays-at-60hz.1888677/page-10#post-21424209
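
Setting the TB3 question aside for a second, here's a rough payload check against DisplayPort 1.3 itself (my own sketch: HBR3 is 32.4 Gbps raw, about 25.92 Gbps after 8b/10b encoding, blanking overhead ignored):

# DisplayPort 1.3 HBR3: 4 lanes x 8.1 Gbps = 32.4 Gbps raw, ~25.92 Gbps after 8b/10b.
DP13_PAYLOAD_GBPS = 32.4 * 8 / 10

def stream_gbps(width, height, hz, bits_per_pixel):
    return width * height * hz * bits_per_pixel / 1e9

print(DP13_PAYLOAD_GBPS)                # ~25.9 Gbps available
print(stream_gbps(3840, 2160, 60, 30))  # ~14.9 Gbps: 10-bit 4:4:4 4K60 -> fits
print(stream_gbps(3840, 2160, 60, 30))  # same again for 3D 4K30, if 30Hz per eye means ~60 frames/s -> fits
print(stream_gbps(7680, 4320, 60, 24))  # ~47.8 Gbps: 8-bit 4:4:4 8K60 -> does not fit uncompressed
print(stream_gbps(7680, 4320, 30, 24))  # ~23.9 Gbps: 8K30 just squeezes in

So on paper DP 1.3 handles the first two, and 8K only at 30Hz (or 60Hz with 4:2:0 subsampling), regardless of whether Thunderbolt ever carries it.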

I would personally like to see one 4k monitor at 120hz, or better yet, 240hz.
For gaming (or flying a real battle drone) 144Hz to 1000Hz might be nice, but I'd be happy with even 72, 75 (TV here in Europe is still 25/50fps), 90 (3x NTSC), 96 (4x cinema) or 100Hz. Looks like we can't get any incremental progress here...

I can't see me ever needing to drive three monitors simultaneously.
Can you see anybody else needing those 3?

Most of the Z170 motherboards that came out support USB 3.1 and a single USB-C connector, and none that I looked at support TB3 over it, as they're all using third-party chips (e.g. ASMedia) for it.
So you're going to get burned if you early-adopt USB-C. Make sure what you get supports TB3, or it never will.
Isn't TB Intel's monopoly? So there won't be any third-party TB chips? ...And therefore third-party chips won't ever get TB...

In conclusion, it will be very interesting to see how Apple will make these different "cross-advanced" technologies understandable to the "average Mac buyer". Will there be 3 ports in the next MBP model:
  1. 1 legacy TB (mDP),
  2. 1 USB-C w/ support for TB
  3. 1 USB-C w/ support for DP 1.3
I guess that's too complicated "for the rest of us", so Apple will put only USB-C-with-TB ports there and discard support for DP 1.3 until TB4 (2018?).
This way, next year every new computer (w/ the current chipset) on the planet, other than a Mac, will support 2x 5K or 3x 4K. For the next 3 years Macs will support one 5K or 2x 4K, with one exception, the Mac Pro (early 2016), which will have several TB chips.

Or Apple will accept that TB has become a legacy bottleneck and make it obsolete. The fast data pipe would then be just regular PCIe through USB-C's Alternate Mode?

Or could one USB-C port have both TB and DP without the user needing to know which is needed?
If a DP 1.3 connection is plugged in, the route to the GPU goes around the TB chip, and if a TB connection is plugged in, the data goes to the TB chip?
 