"I can't see me ever needing to drive three monitors simultaneously."
For this guy:
http://www.theonion.com/article/coworker-with-two-computer-screens-not-****ing-aro-29151
edit: bleh macrumors doesn't like the link
"It will always be USB-C, since Thunderbolt also has USB-C. The only thing consumers will have to worry about is whether it has Thunderbolt or not."

Is it just me, or is calling everything USB-C rather confusing? Is USB-C a connector, or a transfer protocol? I understand that TB3 includes USB-C, but USB-C doesn't necessarily include TB3. When I see a connector called USB-C, is it TB3, or just USB-C, or is it just a connector that may not carry USB 3 at all?
Seems like the connector should have a different name, with "USB-C" reserved for a protocol that is the next generation after USB 3.0.
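Not from any spec document, just a toy sketch of the distinction being asked about: "USB-C" names the physical connector, while USB 3.1, DisplayPort alt mode and Thunderbolt 3 are protocols that may or may not run over it. The port names and capability sets below are made-up examples.

```python
# Toy illustration only: "USB-C" as a connector shape vs. the protocols that
# may (or may not) run over it. Port names and capability sets are made up.
from dataclasses import dataclass, field


@dataclass
class Port:
    name: str
    connector: str                                  # physical shape, e.g. "USB-C"
    protocols: set = field(default_factory=set)     # what actually runs over it


ports = [
    Port("phone charging port", "USB-C", {"USB 2.0"}),
    Port("laptop data port", "USB-C", {"USB 3.1"}),
    Port("laptop TB3 port", "USB-C", {"USB 3.1", "DisplayPort alt mode", "Thunderbolt 3"}),
]

for p in ports:
    has_tb3 = "yes" if "Thunderbolt 3" in p.protocols else "no"
    print(f"{p.name}: connector = {p.connector}, Thunderbolt 3 = {has_tb3}")
```

All three hypothetical ports share the same connector, which is exactly why labelling everything "USB-C" feels confusing.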
"In Alt mode, USB-C supports DisplayPort 1.3."

In June, Intel introduced Thunderbolt 3 with a USB Type-C connector and support for USB 3.1, DisplayPort 1.2 and PCI Express 3.0. The new spec, rumored to launch alongside Intel's next-generation Skylake chips, is capable of driving up to two 4K external displays at 60Hz or a single 5K display at 60Hz running off a single cable.

Article Link: Intel's Skylake Chips Will Drive Three 4K Monitors at 60Hz
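For a rough sense of why those figures fit down one cable, here is a back-of-envelope sketch of raw pixel-data rates against Thunderbolt 3's nominal 40 Gbit/s. It assumes 8 bits per colour channel and ignores blanking intervals, audio and protocol overhead, so the real requirements are somewhat higher than the printed estimates.

```python
# Back-of-envelope pixel-data rates vs. Thunderbolt 3's nominal 40 Gbit/s.
# Assumes 8 bits per colour channel (24 bpp) and ignores blanking intervals,
# audio and protocol overhead, so real requirements are somewhat higher.

def gbits_per_sec(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

TB3_LINK_GBPS = 40  # nominal Thunderbolt 3 link rate

streams = {
    "one 4K display @ 60Hz": gbits_per_sec(3840, 2160, 60),
    "two 4K displays @ 60Hz": 2 * gbits_per_sec(3840, 2160, 60),
    "one 5K display @ 60Hz": gbits_per_sec(5120, 2880, 60),
}

for label, rate in streams.items():
    verdict = "fits within" if rate < TB3_LINK_GBPS else "exceeds"
    print(f"{label}: ~{rate:.1f} Gbit/s ({verdict} {TB3_LINK_GBPS} Gbit/s)")
```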
"You are confusing 4K screens with 4K video."

All this focus on 4K (or better) in impending hardware, phones (but not iPhones yet) shooting video at 4K, 4K camcorders dropping in price, 4K TVs dropping in price and increasingly overtaking prime real estate at TV-selling stores, H.265 seemingly impending, etc.
Then, visit any Apple TV 4 speculation thread and find it packed with some of us putting down 4K as a "gimmick", "nobody can see a difference from average seating distances", and on and on. Nutshell sentiment: "1080p is good enough", just as "720p was good enough" when Apple continued to cling to that as maximum HD (and thus 1080p was the "gimmick", "nobody can see", etc).
Glad to see lots of stock hardware bringing the capability to the masses. Wonder if 4K will still be "gimmick" and "nobody can see" when Apple gets around to implementing it in Apple hardware? Rhetorical: I already know as I saw how quickly the "720p is good enough" argument evaporated as soon as Apple embraced 1080p. Rinse. Repeat.
Pfft. I run two 1080p monitors at 144Hz each. Everything is buttery smooth. I went back to 60Hz very momentarily the other day and I instantly noticed the difference; so much choppier! 4K at 144Hz or GTFO!
Maybe? Some competitors are there now. Apparently "Retina" (the max resolution resolvable by human eyes) was not enough for Apple, as they have since upped the DPI on some iDevices. Plus, an iDevice that shoots 4K probably begs for a 4K screen (even at that tiny size). Since a 4K iDevice also begs for very profitable on-board storage upgrades, I can see it. Looking backwards, Apple updated the Apple TV to 1080p AFTER they had iDevices on the market that could shoot 1080p.
My guess: given this Skylake news, all of the computer makers are going to be hyping 4K soon, so I will be somewhat surprised if the next iPhone's camera lacks the ability to shoot it. Then again, I was also surprised that Apple seemed about last in adopting 1080p, so who knows?
Typographers can tell the difference between 972 dpi and 300 dpi. The problem of making type look good on low-resolution devices, such as 300 dpi laser printers, was solved by "hinting". Of course, that's ink and paper.
"Wouldn't there be a new Xeon chip based off Skylake?"

The Mac Pro uses Xeon chips.
People want a 4K display not so much to watch 4K content as to work on 1080p video while still having enough room for controls and other apps open side by side.
Great!!! Can't wait until my eyes feast on that 4K iPhone display.
The thing is this: what in this article is going to be the main focus of computer marketing this year? Thunderbolt 3? USB-C? 4K? Two ports that can connect to just about nothing, or resolution at several times 1080p? Can Apple's computer marketing, using the very same chips, wait until Apple feels like touting 4K? And if Apple is going to talk it up, why not leverage the same spin with the iDevices and the rumored new Apple TV too?
Of course, that said, it felt like Apple dragged their feet on going 1080p when it seemed that pretty much everyone else had already gone there. So why should this be any different?
If there's ever a reason to argue about resolution overkill, I'd think 4K on a 4.7" or 5.5" screen would be it. But just wait: should Apple do it, we'll gush about the dazzling greatness of "Retina 4K" and how much sharper that screen is than the former "Retina HD" or just plain old "Retina".
We simply find fault with the 4K "gimmick" when it comes to talking about TV screens 10 to 15 times bigger than iPhone screens. At those much larger sizes, it's all a big gimmick... a plot by the evil TV industry to sell us another TV when the technology we already have is perfectly fine "as is".
Then, Apple will embrace 4K and we'll fall all over ourselves to "shut up and take my money" for all that 4K goodness. See the NFC comments before Apple Pay. Or the Samsung watches that could only hold a charge for one day's use. Or bigger-screen phones when Apple still touted 4" screens as "perfect"... and bigger-screen phones when Apple still touted 3.5" as "perfect". And so on.
"If you don't see a difference between a full HD and a 4K TV (even on a 50-inch one), you should definitely pay a visit to an eye doctor, my friend."

As you guessed correctly, I think 4K is overkill on a 5" smartphone display. I also think the benefit of 4K won't be that noticeable, if at all, on a 40-50" 4K TV at typical viewing distances. You'll need a much larger TV to see the difference, and a lot of people will wonder what the fuss is about 4K. That's not to say I don't want to see Apple and other companies go full steam ahead with 4K adoption. I do. I just feel that, at the moment, 4K is only marginally better than 3D when it comes to benefiting the average consumer. Unless you take into account the extra benefit of having "more to work with" when it comes to editing. With that being said, I will definitely consider 4K when it comes to recording special events in my personal life, so my video recordings will be "future proof".
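The seating-distance claim can be sanity-checked with the common "about 1 arcminute per pixel" rule of thumb for 20/20 vision. It's a simplification (eyesight and content vary), and the 50-inch panel size is simply taken from the quote above.

```python
# Rule-of-thumb check of the seating-distance argument: assume a pixel stops
# being resolvable once it subtends less than ~1 arcminute (roughly 20/20
# vision). This is a simplification; real perception varies by viewer/content.
import math

def max_useful_distance_feet(diagonal_inches, horizontal_pixels, aspect=16 / 9):
    width_inches = diagonal_inches * aspect / math.hypot(aspect, 1)
    pixel_pitch = width_inches / horizontal_pixels          # inches per pixel
    one_arcminute = math.radians(1 / 60)
    return pixel_pitch / math.tan(one_arcminute) / 12       # distance in feet

for label, pixels in [("1080p", 1920), ("4K", 3840)]:
    d = max_useful_distance_feet(50, pixels)
    print(f'50" {label}: individual pixels stop being resolvable beyond ~{d:.1f} ft')
```

By this crude measure, a 50-inch 1080p panel already out-resolves the eye past roughly 6-7 feet, which is where the "you need a much bigger TV or a closer seat" argument comes from.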
The article is simply wrong about which chips support 4K. I've had a 15" Haswell MacBook Pro running two 4K displays over DisplayPort at 60Hz since late 2014. My 2012 Ivy Bridge Mac Mini can handle 4K at 30Hz.
If you don't see a difference between a full HD and a 4K TV (even on a 50-inch one), you should definitely pay a visit to an eye doctor, my friend.
4K is not a gimmick when it comes to TVs, LCDs, and notebooks, but for the iPhone it doesn't make sense.
"...and an affordable eGPU"

A Mac Mini with a quad core and a couple of USB Type-C ports, please.
Wouldn't there be a new Xeon chip based off Skylake?
"And then there are people who like a wider FoV. You can always move a few inches further from the screen, but going closer is not so easy (multiple monitor angles, near vision, etc.)"

Ha! TBH, I think a 30" 6-8K display would be just about perfect. 28" feels like it could use a bit more space, and 32" is a bit too much screen to fit in your field of view.
"USB-C can include DP 1.3; TB3 does not."

You might be right. USB-C is still a very capable port without Thunderbolt, just not as capable.
"TB3 does not support DP 1.3. TB is designed to be based on existing tech. DP 1.3 was finalized when TB3 was already in design. So TB is always one gen behind."

What about:
1. 10-bit 4:4:4 4K60
2. 10-bit 4:4:4 3D 4K30
3. 8K
using DisplayPort 1.3?
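For a rough feasibility check of those three formats: DisplayPort 1.3's HBR3 link is 32.4 Gbit/s raw, around 25.92 Gbit/s of payload after 8b/10b coding. The sketch below counts uncompressed pixel data only (no blanking or protocol overhead) and assumes the 3D case is frame-packed with one full stream per eye, so treat the numbers as optimistic lower bounds.

```python
# Uncompressed pixel-data rates vs. DisplayPort 1.3 (HBR3: 32.4 Gbit/s raw,
# ~25.92 Gbit/s payload after 8b/10b coding). Blanking and protocol overhead
# are ignored, so these figures are optimistic lower bounds.

DP13_PAYLOAD_GBPS = 25.92

def gbits_per_sec(width, height, refresh_hz, bits_per_pixel):
    return width * height * refresh_hz * bits_per_pixel / 1e9

formats = {
    "10-bit 4:4:4 4K @ 60Hz": gbits_per_sec(3840, 2160, 60, 30),
    "10-bit 4:4:4 3D 4K @ 30Hz (two eye views)": 2 * gbits_per_sec(3840, 2160, 30, 30),
    "8-bit 8K @ 30Hz": gbits_per_sec(7680, 4320, 30, 24),
    "8-bit 8K @ 60Hz": gbits_per_sec(7680, 4320, 60, 24),
}

for label, rate in formats.items():
    verdict = "fits" if rate < DP13_PAYLOAD_GBPS else "exceeds"
    print(f"{label}: ~{rate:.1f} Gbit/s ({verdict} ~{DP13_PAYLOAD_GBPS} Gbit/s payload)")
```

By that estimate the first two fit comfortably, while 8K only squeezes in at 30Hz; 8K60 needs chroma subsampling or compression on a single DP 1.3 link.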
"For gaming (or flying a real battle drone) 144Hz-1000Hz might be nice, but I'd be happy with even 72, 75 (TV here in Europe is still 25/50fps), 90 (3x NTSC), 96 (4x cinema) or 100Hz. Looks like we can't get any incremental progress here..."

I would personally like to see one 4K monitor at 120Hz, or better yet, 240Hz.
"Can you see anybody else needing those 3?"

I can't see me ever needing to drive three monitors simultaneously.
"Isn't TB Intel's monopoly? So there won't be any 3rd-party TB chips? ...And therefore 3rd-party chips won't ever get TB..."

Most of the Z170 motherboards that came out support USB 3.1 and a single USB-C connector, and none that I looked at support TB3 over it, as they're all using third-party chips (e.g. ASMedia) for it.
So you're going to get burned if you early-adopt USB-C. Make sure what you get supports TB3, or it never will.