
MacRumors

macrumors bot
Original poster
Apr 12, 2001
63,537
30,847



The Video Electronics Standards Association yesterday formally announced its new DisplayPort 1.4 standard, setting the stage for improved video quality and color for external display connections over both DisplayPort and USB-C connectors.

[Image: DisplayPort over USB-C]

Rather than coming from an increase in raw bandwidth, the improvements in DisplayPort 1.4 are due to better compression: the new standard takes advantage of VESA's Display Stream Compression 1.2 to support High Dynamic Range (HDR) video up to either 8K resolution at 60 Hz or 4K resolution at 120 Hz.
DSC version 1.2 transport enables up to 3:1 compression ratio and has been deemed, through VESA membership testing, to be visually lossless. Together with other new capabilities, this makes the latest version of DP ideally suited for implementation in high-end electronic products demanding premier sound and image quality.
[Image: DisplayPort 1.4 compression diagram]
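To see why compression is needed at all, compare the raw bandwidth of an 8K60 HDR stream with what the DisplayPort link can carry. A back-of-the-envelope sketch in Python (assuming 10-bit RGB and ignoring blanking intervals and other protocol overhead, so the figures are approximate):

```python
# Rough link budget for 8K60 HDR over a DisplayPort HBR3 link.
# Assumptions: 10-bit RGB, blanking and protocol overhead ignored.
width, height = 7680, 4320              # 8K UHD
refresh_hz = 60
bits_per_pixel = 3 * 10                 # three 10-bit color components

raw_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
hbr3_payload_gbps = 4 * 8.1 * 8 / 10    # 4 lanes x 8.1 Gbps, less 8b/10b coding overhead

print(f"uncompressed video: {raw_gbps:.1f} Gbps")           # ~59.7 Gbps
print(f"HBR3 payload:       {hbr3_payload_gbps:.1f} Gbps")  # ~25.9 Gbps
print(f"required ratio:     {raw_gbps / hbr3_payload_gbps:.2f}:1 (within DSC's 3:1)")
```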

In addition to video-related improvements, DisplayPort 1.4 also expands audio capabilities with support for 32 channels, 1536 kHz sample rates, and "all known" audio formats.

The approval of DisplayPort 1.4 comes even though consumers are still awaiting the arrival of devices supporting the previous DisplayPort 1.3 standard. Intel had been expected to support DisplayPort 1.3 in its current Skylake generation of chips, but the company instead opted to offer dual DisplayPort 1.2 support. As we detailed earlier this year, the lack of DisplayPort 1.3 support in Skylake could lead Apple to hold off on releasing a new 5K Thunderbolt Display until next year when chips supporting the standard become available.

Intel hasn't laid out its DisplayPort support plans beyond Skylake, so it's unknown whether the company will first move to DisplayPort 1.3 or if it can jump straight to the new DisplayPort 1.4 standard. Either way, we're unlikely to see Macs supporting DisplayPort 1.4 until 2017 at the earliest.

Article Link: DisplayPort 1.4 to Use 'Lossless' Compression for Higher-Quality 8K Video Over USB-C
 

JonneyGee

macrumors 6502
Jun 8, 2011
358
1,222
Nashville, TN
Intel hasn't laid out its DisplayPort support plans beyond Skylake, so it's unknown whether the company will first move to DisplayPort 1.3 or if it can jump straight to the new DisplayPort 1.4 standard. Either way, we're unlikely to see Macs supporting DisplayPort 1.4 until 2017 at the earliest.

Knowing Intel, we might see 1.4 sometime around 2025. :rolleyes:
 

PatriotInvasion

macrumors 68000
Jul 18, 2010
1,643
1,048
Boston, MA
Officially holding out my next purchase for a newly designed MacBook Pro that can drive a standalone 5K Retina Thunderbolt Display.

However, assuming there will be a new case design for the 5K iMac this fall (4 years since it was last redesigned, and 7 years of the same 16:9 front face look and feel), I may not be able to resist and would get that and continue using an iPad as my portable machine (even though it lacks the ability to do some of the things I prefer doing on the Mac). First world problems to the fullest.
 

Canubis

macrumors 6502
Oct 22, 2008
425
524
Vienna, Austria
Compression is never lossless. It may be imperceptible to the ear or eye, but it is still compressed at some level.
Excuse me, but this is complete nonsense. There is indeed true lossless compression (though I can't say for sure that "lossless" in this article truly means lossless). Just think about zip compression: you compress a text file by zipping it and get the exact same file, with all of its data, back when you unzip it. The same applies to lossless audio and video compression, though the algorithms differ.
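If anyone wants to see the round trip for themselves, here is a minimal Python example using zlib (the same DEFLATE family of compression that zip uses); the sample text is just an illustration:

```python
import zlib

# Highly repetitive text, so the ratio below is unrealistically good - the point is identity.
original = ("Lossless compression returns exactly the bytes you put in. " * 100).encode("utf-8")

compressed = zlib.compress(original, level=9)   # DEFLATE, the same family zip uses
restored = zlib.decompress(compressed)

assert restored == original                     # bit-for-bit identical after the round trip
print(f"{len(original)} bytes -> {len(compressed)} bytes, "
      f"ratio {len(original) / len(compressed):.0f}:1, identical: {restored == original}")
```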
 

oneMadRssn

macrumors 603
Sep 8, 2011
5,978
13,990
Compression is never lossless. It may be imperceptible to the ear or eye, but it is still compressed at some level.

That's not true at all. There is plenty of compression that is lossless.
support High Dynamic Range (HDR) video up to either 8K resolution at 60 Hz

This is why I think it is wise to hold off on investing in major 4K hardware for now. The standard is totally in flux. Why pay $1000+ now for a 4K TV that uses HDMI 1.4 or 2.0 when there are other HDR and 8K standards right on the horizon? It looks about as silly as buying a 720p TV with only component inputs in 2003, when HDMI and 1080p were right on the horizon.
 

cube

Suspended
May 10, 2004
17,011
4,972
Good. Then I will have a 40-48" 3D 8K HDR smart retina "monitor".

Hopefully FreeSync too (TVs also need it).

And 4 input ports is not enough.

And I've had it with the snooping, EULAs, and subpar usability.
 

manu chao

macrumors 604
Jul 30, 2003
7,219
3,031
Compression is never lossless. It may be imperceptible to the ear or eye, but it is still compressed at some level.
And compressing a file in the form of a ZIP 'archive' is also never lossless. If you zip a text file, the text will be shortened by slight rephrasing and the use of shorter words. This may be imperceptible to the reader but it is still compressed at some level.
 

theluggage

macrumors 604
Jul 29, 2011
7,501
7,385
Compression is never lossless. It may be imperceptible to the ear or eye, but it is still compressed at some level.

No - there are plenty of lossless compression algorithms. E.g. the LZW compression used by GIF and some types of TIFF, the DEFLATE compression used by .zip and .png files (using lossy compression on executables and data files is not a good idea), Apple Lossless audio, FLAC audio...

Lossless compression works by finding a more efficient way to pack the data, e.g. replacing frequently occurring patterns with short codes, or 'run length encoding' (if there is a row of 1000 white pixels, just send 'white' and '1000'). Morse code is another example: frequently occurring letters get the shortest codes, unlike ASCII, which uses 8 bits for every single character; Huffman encoding is the algorithmic equivalent.
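A toy sketch of that run-length idea in Python (my own example, not what DisplayPort's DSC actually does), just to show the round trip is exact:

```python
def rle_encode(pixels):
    """Collapse runs of identical values into (value, count) pairs."""
    runs = []
    for value in pixels:
        if runs and runs[-1][0] == value:
            runs[-1][1] += 1
        else:
            runs.append([value, 1])
    return [tuple(run) for run in runs]

def rle_decode(runs):
    """Expand (value, count) pairs back into the original sequence."""
    out = []
    for value, count in runs:
        out.extend([value] * count)
    return out

row = ["white"] * 1000 + ["black"] * 3 + ["white"] * 500
encoded = rle_encode(row)
print(encoded)                      # [('white', 1000), ('black', 3), ('white', 500)]
assert rle_decode(encoded) == row   # exact reconstruction - nothing was lost
```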

With lossless compression, you get back exactly what you put in. You don't get the sort of 100x compression you see with lossy compression, but 2-3x is feasible.

However, "visually lossless" is either a redundancy (if its lossless, of course there's no visual difference) or weasel words.
 

name99

macrumors 68020
Jun 21, 2004
2,188
1,997
And compressing a file in the form of a ZIP 'archive' is also never lossless. If you zip a text file, the text will be shortened by slight rephrasing and the use of shorter words. This may be imperceptible to the reader but it is still compressed at some level.

Jesus. Never in my life would I have believed this is how people actually think.
Dude, I don't want to mock you but you are so wrong, so UTTERLY ignorant about this subject, it's not funny.

Let me simply point out that LOSSLESS compression is a well-defined mathematical field, it is built upon probability theory, like all CS it consists of theorems and algorithms, and it applies to all random processes (think "data streams") regardless of whether they are text, video, sensor data, or anything else.
It has NOTHING to do with "rephrasing" and the use of "shorter words". Your comment is the sort of thing I'd expect EEs to send each other on April Fools' Day as a joke.
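For anyone who wants the concrete version of that probability-theory point: Shannon's source coding theorem says the entropy of the data sets a hard lower bound on how few bits per symbol any lossless code can average. A quick illustration in Python (the sample string is my own):

```python
import math
from collections import Counter

def entropy_bits_per_symbol(data):
    """Shannon entropy H = -sum(p * log2(p)): the lossless lower bound in bits per symbol."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

text = "ABABABABABABACABABABABABAD"   # heavily skewed toward 'A' and 'B'
h = entropy_bits_per_symbol(text)
print(f"entropy: {h:.2f} bits/symbol vs. 8 bits/symbol for naive fixed-width storage")
# No lossless code can average fewer than h bits per symbol on this source;
# Huffman coding gets close, and none of it involves "rephrasing" anything.
```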
 

doelcm82

macrumors 68040
Feb 11, 2012
3,765
2,776
Florida, USA
This is why I think it is wise to hold off on investing in major 4K hardware for now. The standard is totally in flux. Why pay $1000+ now for a 4K TV that uses HDMI 1.4 or 2.0 when there are other HDR and 8K standards right on the horizon? It looks about as silly as buying a 720p TV with only component inputs in 2003, when HDMI and 1080p were right on the horizon.
The benefit to buying now is that you get to use it now.

I have a MacBook Pro (Retina, 13-inch, Mid 2014). It has the power to drive the cheap 4K TV I got at Sam's Club. I'm a lot more productive when I can keep several windows open and visible at the same time. (How did I ever work in VGA resolution?)
 

lssmit02

macrumors 6502
Mar 25, 2004
400
37
Jesus. Never in my life would I have believed this is how people actually think.
Dude, I don't want to mock you but you are so wrong, so UTTERLY ignorant about this subject, it's not funny.

Let me simply point out that LOSSLESS compression is a well-defined mathematical field, it is built upon probability theory, like all CS it consists of theorems and algorithms, and it applies to all random processes (think "data streams") regardless of whether they are text, video, sensor data, or anything else.
It has NOTHING to do with "rephrasing" and the use of "shorter words". Your comment is the sort of thing I'd expect EEs to send each other on April Fools' Day as a joke.
I think you missed the joke.
 