Jesus. Never in my life would I have believed this is how people actually think.
Dude, I don't want to mock you, but you are so wrong, so UTTERLY ignorant about this subject, it's not funny.

Let me simply point out that LOSSLESS compression is a well-defined mathematical field. It is built upon probability theory; like all of CS, it consists of theorems and algorithms, and it applies to all random processes (think "data streams") regardless of whether they are text, video, sensor data, or anything else.
It has NOTHING to do with "rephrasing" and the use of "shorter words". Your comment is the sort of thing I'd expect EEs to send each other on April Fools' Day as a joke.

manu chao is absolutely correct.
I zip-compressed his text and got a much shorter phrase that means exactly the same thing.

INPUT
[And compressing a file in the form of a ZIP 'archive' is also never lossless. If you zip a text file, the text will be shortened by slight rephrasing and the use of shorter words. This may be imperceptible to the reader but it is still compressed at some level.]

OUTPUT
[I'm kidding]
 
If this is a compression (software) thing, I'm not sure what's to stop Apple from implementing proprietary compression over TB3 in first-party monitors/machines.
 
May have to hold off on my Mac purchase for another year.

Because then you'll be able to draw a line under whatever new technologies are coming, and you won't want to put it off for another year again. Definitely not.
And compressing a file in the form of a ZIP 'archive' is also never lossless. If you zip a text file, the text will be shortened by slight rephrasing and the use of shorter words. This may be imperceptible to the reader but it is still compressed at some level.

This might be the best post I've ever seen on MacRumors :cool:o_O:rolleyes::eek::D
 
DisplayPort 1.4 uses a *virtually* lossless way to display images, which means lossy. Probably not noticeable, but damn, with a 30-bit 8K multi-thousand-dollar display, I sure would like a lossless image.
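Rough arithmetic shows why something has to give. A sketch that ignores blanking intervals (which only make the shortfall worse): an uncompressed 8K60 stream at 30-bit colour needs far more than DP 1.4's payload bandwidth, which is why DSC or chroma subsampling enters the picture.

[CODE]
# Why DisplayPort 1.4 can't drive 8K60 at 30-bit colour uncompressed.
width, height = 7680, 4320      # 8K UHD
refresh_hz = 60
bits_per_pixel = 30             # 10 bits per RGB channel

needed_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
dp14_payload_gbps = 25.92       # HBR3: 32.4 Gbps raw minus 8b/10b overhead

print(f"needed:    {needed_gbps:.1f} Gbps")    # ~59.7 Gbps
print(f"available: {dp14_payload_gbps} Gbps")  # DSC's ~3:1 closes the gap
[/CODE]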
 
Stop waiting around for Intel, for Pete's sake. Use discrete GPUs and implement the new standards already!!
 
Jesus. Never in my life would I have believed this is how people actually think.
Dude, I don't want to mock you, but you are so wrong, so UTTERLY ignorant about this subject, it's not funny.

Let me simply point out that LOSSLESS compression is a well-defined mathematical field. It is built upon probability theory; like all of CS, it consists of theorems and algorithms, and it applies to all random processes (think "data streams") regardless of whether they are text, video, sensor data, or anything else.
It has NOTHING to do with "rephrasing" and the use of "shorter words". Your comment is the sort of thing I'd expect EEs to send each other on April Fools' Day as a joke.

I don't think it has to be 1 April to make a joke.
 
And compressing a file in the form of a ZIP 'archive' is also never lossless. If you zip a text file, the text will be shortened by slight rephrasing and the use of shorter words. This may be imperceptible to the reader but it is still compressed at some level.

...and as it turns out, this ended up being the win of the day :)
 
Compression is never lossless. It may be imperceptible to the ear or eye, but it is still compressed at some level.
Oh lord. Not at all true. Don't make statements that are simply guesses. You might want to change your name, because anything you say in the future won't be believed!
 
Lossless audio and now lossless video are exactly why Apple should offer higher base storage of 128 GB in all their products and take a hit on the profit from this method of angering their every customer.

My decree: "Apple shall offer no less than 128 GB on all its computing, telephony and AV products"!
 
Looking at the $7k+ Samsung OLED TVs that just achieved HDR via HDMI and wondering, with this... now what? I suppose we could assume an HDMI adapter would work... or can we expect TVs to also move to USB-C? And what about HDCP, blah blah blah... 8K already?
 
It seems to me that all this mucking about must be making Apple (and probably others) think about moving chip manufacturers, surely? Macs have been hampered by the lack of new chips for years now, most significantly missing out entirely last year because of Skylake's tardiness. Then there's no DP 1.3 anyway, and now we have to wait even longer; certainly we will if we want any significant performance increase. If I were Apple, I'd be looking hard at a longer term in which they wouldn't have to rely on Intel turning up with something each year. It seems that Apple have enough resources to keep increasing the performance of the A-series chips pretty significantly each year.
 
Looking at the $7k+ Samsung OLED TVs that just achieved HDR via HDMI and wondering, with this... now what? I suppose we could assume an HDMI adapter would work... or can we expect TVs to also move to USB-C? And what about HDCP, blah blah blah... 8K already?

There is another connector proposed for 8K consumer televisions: superMHL.
 
DisplayPort 1.4 uses a *virtually* lossless way to display images, which means lossy. Probably not noticeable, but damn, with a 30-bit 8K multi-thousand-dollar display, I sure would like a lossless image.
The hint is in the table: "4:2:0" + "12 bit"
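Presumably the hint means this: a quick sketch of the average bits per pixel at 12-bit depth. 4:2:0 keeps full-resolution luma but only a quarter of the chroma samples, so it alone halves the data rate versus full 4:4:4:

[CODE]
# Average bits per pixel at 12-bit depth, by chroma format.
depth = 12
bpp_444 = 3 * depth              # 36 bpp: full-resolution Y, Cb and Cr
bpp_420 = depth + 2 * depth / 4  # 18 bpp: full Y, quarter-resolution Cb/Cr

print(bpp_444, bpp_420)  # 36 18.0 -> 4:2:0 alone halves the bandwidth
[/CODE]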
 
DisplayPort 1.4 uses a *virtually* lossless way to display images, which means lossy. Probably not noticeable, but damn, with a 30-bit 8K multi-thousand-dollar display, I sure would like a lossless image.
I'm more concerned about latency. Compression usually adds some delay to the signal.

Still, it's impressive that USB-C will be able to handle 8K at all. Meanwhile, Lightning needs compression just to handle 1080p.
 
The crappy part about tossing around this "visually lossless" moniker is that it is quite confusing. Audio codecs, we know, can go several ways: you can have FLAC/ALAC (data-lossless), high-bitrate M4A (audibly indistinguishable in ABX tests), and marketing (remember MSFT saying 64 kbps WMA was "CD quality"?).

It's nearly guaranteed that 99% of people have never seen digital video without data compression. The data rates are SO massive that throughput is a bottleneck even on today's fastest personal machines; uncompressed 720p runs on the order of gigabytes per minute. Now, when you say "VISUALLY lossless", you are indicating perception. So like every video format before it, you are talking about a lossy codec, and that's completely fine. Data-lossless video is ULTRA overkill and not necessary. Think about this page... why rewrite every white pixel 30 times a second? It's stupid, power-hungry, expensive in electricity, and makes no difference in the end.
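To check that gigabytes-per-minute figure, the arithmetic is short. A sketch assuming plain 24-bit RGB at 30 fps, with no compression of any kind:

[CODE]
# Uncompressed 720p data rate, back of the envelope.
width, height = 1280, 720
bytes_per_pixel = 3   # 24-bit RGB, no chroma subsampling, no compression
fps = 30

bytes_per_minute = width * height * bytes_per_pixel * fps * 60
print(f"{bytes_per_minute / 1e9:.1f} GB per minute")  # ~5.0 GB/min
[/CODE]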

With all of that said, it is a shame that they used "visually lossless" when they could have used "visually imperceptible" or something clearer like "visually indistinguishable". We're so used to macroblocking on simple TV streams that it makes us all perk up and say, "Really?"

But for anyone waiting on 8K video, please... don't act like it's "just around the corner" and "save your money". It's not. Most cable companies struggle to put out video that doesn't suck at 1080i. If you think 4320p is just moments from reality, can you just get nuclear fusion sorted first? I would wager we'll have a human on Mars before non-blocking 8K video is a common standard.
 
The problem with them going with lossy compression is that the video source itself (Blu-ray or DVD) is already lossily compressed. Two lossy compression steps in a row are not a good sign for video quality.
 
The problem with them going with lossy compression is that the video source itself (Blu-ray or DVD) is already lossily compressed. Two lossy compression steps in a row are not a good sign for video quality.
So would you say going from the camera to the cutting room doesn't count? That's lossy as well. I get what you're saying, but every step is lossy. Whether it adds a perceptible loss is what you're worried about, and frankly, you can't tell. Even if you could, there's no way to *not* do it.
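A toy model makes the point about stacked lossy steps. This is not how any real codec works; it just treats each lossy encode as a quantizer and shows that a second pass with a different step size compounds the error:

[CODE]
# Toy model of generational loss: each lossy "encode" quantizes the signal.
def lossy_encode(samples, step):
    return [round(s / step) * step for s in samples]

source = [0.113, 0.257, 0.391, 0.544, 0.768]  # stand-in for camera output
gen1 = lossy_encode(source, step=0.05)        # e.g. the mastering codec
gen2 = lossy_encode(gen1, step=0.07)          # e.g. compression on the link

err1 = max(abs(a - b) for a, b in zip(source, gen1))
err2 = max(abs(a - b) for a, b in zip(source, gen2))
print(f"worst error after one pass:   {err1:.3f}")  # 0.018
print(f"worst error after two passes: {err2:.3f}")  # 0.043 -- larger, though maybe still imperceptible
[/CODE]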
 
The term 'lossless' (with the quote marks) in this article likely has the same meaning as people calling 320 kbps MP3 'transparent', i.e., the loss is imperceptible to ordinary people with ordinary visual (or auditory) organs.
But they couldn't really say 'transparent' about a video codec :p
 