Nothing, of course. Apple's been notorious for being late to the party on industry standards... how long did it take them to adopt USB 3.0?
...Or they're the ones throwing the party, as with Thunderbolt and USB-C.
Jesus. Never in my life would I have believed this is how people actually think.
Dude, I don't want to mock you but you are so wrong, so UTTERLY ignorant about this subject, it's not funny.
Let me simply point out that LOSSLESS compression is a well-defined mathematical field. It is built on probability theory; like all of CS, it consists of theorems and algorithms; and it applies to all random processes (think "data streams") regardless of whether they are text, video, sensor data, or anything else.
It has NOTHING to do with "rephrasing" and use of "shorter words". Your comment is the sort of thing I'd expect EE's to send each other on April Fools' day as a joke.
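For what it's worth, the point is trivial to demonstrate with Python's zlib module, which implements DEFLATE, the same algorithm ZIP archives use: the decompressed output is byte-for-byte identical to the input, with no "rephrasing" anywhere.

```python
import zlib

# DEFLATE (what ZIP uses) is lossless by definition: decompressing
# the compressed stream reproduces the original bytes exactly.
original = b"Lossless compression preserves every byte of the input. " * 100
compressed = zlib.compress(original)
restored = zlib.decompress(compressed)

assert restored == original              # byte-for-byte identical
print(f"{len(original)} bytes -> {len(compressed)} bytes, no loss")
```

The compressed stream is much smaller only because the input is repetitive; the round trip is exact either way.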
Pied Piper has finally come to life.
May have to hold off on my Mac purchase for another year.
And compressing a file in the form of a ZIP 'archive' is also never lossless. If you zip a text file, the text will be shortened by slight rephrasing and the use of shorter words. This may be imperceptible to the reader but it is still compressed at some level.
Oh lord. Not at all true. Don't make statements that are simply guesses. You might want to change your name, because anything you say in the future won't be believed!
Compression is never lossless. It may be imperceptible to the ear or eye, but it is still compressed at some level.
Same for me. And Apple needs to do something about that Thunderbolt Display. It's been too long. They can't wait anymore.

May have to hold off on my Mac purchase for another year.
Looking at the $7k+ Samsung OLED TVs that just achieved HDR via HDMI and wondering with this .... now what? I suppose we could assume an HDMI adapter would work... or can we expect TVs also to move to USB-C and what about HDCP blah blah blah... 8K already?
The hint is in the table: "4:2:0" + "12 bit".

DisplayPort 1.4 uses a *virtually* lossless way to display images, which means lossy. Probably not noticeable, but damn, with a 30-bit 8K multi-thousand-dollar display, I sure would like a lossless image.
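Rough back-of-the-envelope arithmetic (my own figures, ignoring blanking overhead) shows why the table lands on 4:2:0 and DSC: uncompressed 8K60 simply doesn't fit in DP 1.4's effective payload.

```python
# Back-of-the-envelope link budget (assumed figures; blanking ignored).
# DP 1.4 HBR3: 32.4 Gbit/s raw, ~25.92 Gbit/s payload after 8b/10b coding.
HBR3_PAYLOAD_GBPS = 25.92

width, height, refresh = 7680, 4320, 60          # 8K @ 60 Hz
pixels_per_sec = width * height * refresh

def link_gbps(bits_per_pixel):
    return pixels_per_sec * bits_per_pixel / 1e9

# 4:4:4 at 12 bits/component = 36 bits/pixel; 4:2:0 subsampling discards
# 3/4 of the chroma samples, leaving 12 + 2 * 12 / 4 = 18 bits/pixel.
full = link_gbps(36)      # ~71.7 Gbit/s: far over budget
sub  = link_gbps(18)      # ~35.8 Gbit/s: still over budget
dsc  = full / 3           # DSC targets roughly 3:1: ~23.9 Gbit/s fits
print(f"4:4:4: {full:.1f}  4:2:0: {sub:.1f}  DSC ~3:1: {dsc:.1f}  "
      f"(budget {HBR3_PAYLOAD_GBPS} Gbit/s)")
```

Note that 4:2:0 subsampling is itself lossy (it throws away color resolution), which is exactly the commenter's complaint.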
I'm more concerned about latency. Compression usually adds some delay to the signal.
So would you say going from the camera to the cutting room doesn't count? That's lossy as well. I get what you're saying, but every step is lossy. Whether it adds a perceptible loss is what you're worried about, and frankly, you can't tell. Even if you could, there's no way to *not* do it.

The problem with them going with lossy compression is that the video source itself (Blu-ray or DVD) is already lossy compressed. Two lossy compression steps in a row is not a good sign for video quality.
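The generation-loss effect both commenters are circling is easy to see with a toy quantizer (my own illustration, not any real video codec): running samples through two different lossy passes discards more information than either pass alone, and none of it can be recovered.

```python
# Toy generation-loss demo: quantization stands in for lossy encoding.
def lossy_pass(samples, step):
    # Round each value to the nearest multiple of `step` (information lost).
    return [(s + step // 2) // step * step for s in samples]

source     = list(range(0, 100, 7))        # stand-in for pixel values
first_gen  = lossy_pass(source, 4)         # e.g. the Blu-ray encode
second_gen = lossy_pass(first_gen, 6)      # e.g. the display link's codec

err_one = sum(abs(a - b) for a, b in zip(source, first_gen))
err_two = sum(abs(a - b) for a, b in zip(source, second_gen))
print("error after one lossy pass:", err_one)    # 15
print("error after two lossy passes:", err_two)  # 25
```

Whether the accumulated error is *perceptible* is a separate question; the arithmetic only shows that it accumulates.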