The term 'Lossless' (with the quote marks) in this article likely has the same meaning as when people describe 320 kbps MP3 as 'transparent', i.e., the loss is imperceptible to ordinary people with ordinary eyes (or ears, in the audio case).
But they couldn't really say 'transparent' on a video codec :p

Well, you have a pretty good point there!
 
And compressing a file in the form of a ZIP 'archive' is also never lossless. If you zip a text file, the text will be shortened by slight rephrasing and the use of shorter words. This may be imperceptible to the reader but it is still compressed at some level.

Say what? You might want to provide a citation for lossy compression of text files.

After 35+ years working as a technical writer, I've yet to see a zipped text file that results in "slight rephrasing and the use of shorter words". It would in no way be "imperceptible"; it's just too easy to compare the zipped/unzipped product with the original. Graphics are a whole different kettle of fish to text.

[edit]

This is why we really need a sarcasm font, for when people forget to append the /sarc tag to posts...
 
Say what? You might want to provide a citation for lossy compression of text files.

After 35+ years working as a technical writer, I've yet to see a zipped text file that results in "slight rephrasing and the use of shorter words". It would in no way be "imperceptible"; it's just too easy to compare the zipped/unzipped product with the original. Graphics are a whole different kettle of fish to text.
That was sarcasm... note his response.
 
Oh no! The first world problem here is us [other users] who have to read your exhaustingly detailed explanation of how you use your devices. That's the real problem! :eek:

#NobodyCares #KiddingButNotReally
Gotta set the tone and show the other users what some people like me want from Apple. Feel free to move on to the next post. #hashtag
 
Compression is never lossless. It may be imperceptible to the ear or eye, but it is still compressed at some level.
what?

textfile example:

aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa


compress it:

a*500

uncompressing it will get you the exact same data.

simple lossless compression explained.
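For the curious, that scheme is run-length encoding, and it round-trips exactly. A minimal Python sketch (the helper names here are made up for illustration):

# Minimal run-length encoding: collapse runs of a repeated
# character into (character, count) pairs, then expand them back.
def rle_compress(text):
    pairs = []
    for ch in text:
        if pairs and pairs[-1][0] == ch:
            pairs[-1][1] += 1
        else:
            pairs.append([ch, 1])
    return pairs

def rle_decompress(pairs):
    return "".join(ch * count for ch, count in pairs)

data = "a" * 500
packed = rle_compress(data)            # [['a', 500]] -- the "a*500" above
assert rle_decompress(packed) == data  # exact round trip: lossless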
 



The Video Electronics Standards Association yesterday formally announced its new DisplayPort 1.4 standard, setting the stage for improved video quality and color for external display connections over both DisplayPort and USB-C connectors.


Rather than an increase in actual bandwidth, the improvements in DisplayPort 1.4 come from better compression, taking advantage of VESA's new Display Stream Compression 1.2 standard to support High Dynamic Range (HDR) video at up to either 8K resolution at 60 Hz or 4K resolution at 120 Hz.
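Some rough math shows how far over budget uncompressed 8K60 HDR is, and why DSC matters. This sketch ignores blanking intervals and protocol overhead, and the 25.92 Gbps payload figure assumes the standard HBR3 link (4 lanes at 8.1 Gbps with 8b/10b encoding), which is not spelled out in the announcement itself:

# Raw pixel rate for 8K HDR at 60 Hz (30 bits per pixel, i.e. 10 bits per channel)
uncompressed_gbps = 7680 * 4320 * 60 * 30 / 1e9   # ~59.7 Gbps of raw pixel data

# Usable DisplayPort HBR3 payload: 4 lanes x 8.1 Gbps, minus 8b/10b coding overhead
hbr3_payload_gbps = 4 * 8.1 * 8 / 10              # 25.92 Gbps

print(uncompressed_gbps / hbr3_payload_gbps)      # ~2.3x over budget -- hence DSC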

In addition to video-related improvements, DisplayPort 1.4 also expands audio capabilities with support for 32 channels, 1536 kHz sample rates, and broader coverage of "all known" audio formats.

The approval of DisplayPort 1.4 comes even though consumers are still awaiting the arrival of devices supporting the previous DisplayPort 1.3 standard. Intel had been expected to support DisplayPort 1.3 in its current Skylake generation of chips, but the company instead opted to offer dual DisplayPort 1.2 support. As we detailed earlier this year, the lack of DisplayPort 1.3 support in Skylake could lead Apple to hold off on releasing a new 5K Thunderbolt Display until next year when chips supporting the standard become available.

Intel hasn't laid out its DisplayPort support plans beyond Skylake, so it's unknown whether the company will first move to DisplayPort 1.3 or if it can jump straight to the new DisplayPort 1.4 standard. Either way, we're unlikely to see Macs supporting DisplayPort 1.4 until 2017 at the earliest.

Article Link: DisplayPort 1.4 to Use 'Lossless' Compression for Higher-Quality 8K Video Over USB-C
I can't see the necessity for 5K monitors at 27", so why should 8K be needed? On huge screens or in cinemas this is just what we want, but anywhere else we'll need refrigerated GPUs the like of which simply don't exist, and cases whose thickness would send Cook and Ive into a depression. This is just Apple dangling the carrot: "keep acting like sheep and sending us all your cash".
 
Sounds like it would be great to have USB-C for the iPhone and iPod Touch. Why not go all the way with USB-C for the whole product range and make it a standard? And please don't tell me they can't because the iPhone needs to be thinner...
Since you brought it up, fwiw, despite all its virtues, USB-C is actually slightly bigger than the Lightning connector.
 
Compression is never lossless. It may be imperceptible to the ear or eye, but it is still compressed at some level.

Aha, https://en.wikipedia.org/wiki/Redundancy_(information_theory)

And compressing a file in the form of a ZIP 'archive' is also never lossless. If you zip a text file, the text will be shortened by slight rephrasing and the use of shorter words. This may be imperceptible to the reader but it is still compressed at some level.

Jesus! Did you figure out how to write that by yourself?

(This message has been compressed and then decompressed; that's why it was "shortened" and "slightly rephrased". In its original form, it was much bigger.)
 
This is great! Now all we need is one cord instead of multiple cords for devices running different OSes. Besides, Apple charges ridiculous prices for its cables.

I wouldn't be surprised if Apple implemented some kind of chip in its cables so they'd only work with its own devices.
 
Sounds like it would be great to have USB-C for the iPhone and iPod Touch. Why not go all the way with USB-C for the whole product range and make it a standard? And please don't tell me they can't because the iPhone needs to be thinner...

YES.

Any move away from proprietary <anything> that Apple makes is a good move.

But I don't think this'll happen on iOS devices. Apple's lock-in is damn near unbreakable.
 
And compressing a file in the form of a ZIP 'archive' is also never lossless. If you zip a text file, the text will be shortened by slight rephrasing and the use of shorter words. This may be imperceptible to the reader but it is still compressed at some level.

Best troll post of 2016. Not malicious, yet completely hilarious. Well played.
 
"visually lossless"
"compression algorithm"

Sounds a bit fishy in terms of fidelity, definition and colour depth in fast-moving scenes, etc.
Then again, we had OnLive a few years back, happily streaming compressed 720p/1080p video games over an ADSL connection, and, progress helping, we might be able to convert on the fly at 8k/60Hz in a matter of microseconds.

Maybe that's how Super-Hi Vision screens (I love that name, so much better than "Ultra HD") will be fed.
Maybe.
 
YES.

Any move away from proprietary <anything> that Apple makes is a good move.

But I don't think this'll happen on iOS devices. Apple's lock-in is damn near unbreakable.
yeah, their proprietary system-on-chips suck big time. /s
 
When has Apple did that?
"done" that :)
Lightning cable works like that. It's got a chip in it. No MFI certification, no access to the chip.
 
yeah, their proprietary system-on-chips suck big time. /s

You got me there. Sure, they're great for iOS devices.

However, if they were to migrate this to OS X machines it would break Boot Camp compatibility, so there's that.

I want Apple to interoperate with non-Apple stuff. This is my biggest pet peeve with the company. It seems like it's one step forward, two steps back with them on this issue.

Incidentally, it's one of the things I found most amusing in the steve.jobs movie.
 
And compressing a file in the form of a ZIP 'archive' is also never lossless. If you zip a text file, the text will be shortened by slight rephrasing and the use of shorter words. This may be imperceptible to the reader but it is still compressed at some level.


You're kidding, right?
 
"done" that :)
Lightning cable works like that. It's got a chip in it. No MFI certification, no access to the chip.

Yes, but Lightning is not a standard cable, it's a totally proprietary cable.
 
May have to hold off on my Mac purchase for another year.

Me too. I have ONE cash wad to blow. I want retina larger than a 15" MacBook, and I DON'T want laptop parts (MacBooks, iMacs, and the Mac Mini), because heat dissipation sucks in such devices and they self-destruct.

I'm really irritated about this. I haven't done serious photography since my 21" CRT died two years ago. The third party 4K screens and Mac high-PPI resolution modes are well known to be problematic (display incompatibility).

http://www.anandtech.com/show/10110...-14-standard-displayport-adds-compression-hdr

According to AnandTech, even DisplayPort 1.3 can't support a 5K 30-bit screen like the 27" iMac has, only 5K 24-bit; 5K 30-bit support is one of the new things DisplayPort 1.4 introduces. So if Apple actually wants to put the 27" iMac's 5K 30-bit screen into an SST Thunderbolt Display, even waiting for Thunderbolt to adopt DisplayPort 1.3 isn't enough; they'll have to wait until a DisplayPort 1.4-compatible Thunderbolt is available.
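The arithmetic behind that claim is easy to check. A rough sketch that ignores blanking and protocol overhead, and assumes the usual 25.92 Gbps HBR3 payload (4 lanes x 8.1 Gbps, 8b/10b coded):

# 5K at 60 Hz: does it fit in a DP 1.3/1.4 link without DSC?
pixels_per_second = 5120 * 2880 * 60
payload_gbps = 4 * 8.1 * 8 / 10              # HBR3 payload: 25.92 Gbps
print(pixels_per_second * 24 / 1e9)          # 24-bit color: ~21.2 Gbps -- fits
print(pixels_per_second * 30 / 1e9)          # 30-bit color: ~26.5 Gbps -- just over budget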

Very annoying. The PC world still doesn't care about high PPI, so those of us who DO care are left waiting for Intel's supporting chipsets to catch up.

Are we addicted to upgrading when current tech is already exceeding our needs as-is?

You must be one of those people who see no value in retina-like pixel density on displays. It's not about motion graphics, and it's not about TV; it's about still image presentation and editing without being forced to work zoomed OUT at "60%" just to see the full image.

I can't see the necessity for 5K monitors at 27", so why should 8K be needed? On huge screens or in cinemas this is just what we want, but anywhere else we'll need refrigerated GPUs the like of which simply don't exist, and cases whose thickness would send Cook and Ive into a depression. This is just Apple dangling the carrot: "keep acting like sheep and sending us all your cash".

See the above text. This has nothing to do with forcing people to upgrade and everything to do with making display dot density match print dot density so there's a 1-to-1 relationship in scale. It's quite annoying to work on photos that are too big for the display you're editing them on. We've tolerated it so far because the technology didn't exist; now it does. Not only can I not go back to non-retina after getting used to retina, I can no longer patiently play the zooming game (zoom out to see the full composition, zoom in to see 1-to-1 pixels for editing).
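For what it's worth, the density math is easy to check. A quick sketch using only the published panel specs (diagonal resolution over diagonal size):

import math

# 27" 5K panel: 5120 x 2880 pixels across a 27-inch diagonal
ppi_5k = math.hypot(5120, 2880) / 27
print(round(ppi_5k))   # ~218 PPI -- double the ~109 PPI of a 27" 2560x1440 display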
 
I have some spare pocket change ($5,652) to spend. I've decided that it's time to finally upgrade my 5-year-old TV to something nice. I'm about to order the $4,999 Sony XBR-75X910C 75-inch 4K TV.

Does this mean I should wait another couple of years for 8K TV? :(



 
I have some spare pocket change ($5,652) to spend. I've decided that it's time to finally upgrade my 5-year-old TV to something nice. I'm about to order the $4,999 Sony XBR-75X910C 75-inch 4K TV.

Does this mean I should wait another couple of years for 8K TV? :(


Well, if I were you, I'd look around... Honestly... do you see a lot of 4k content? Do you have a fast enough connection to stream the little content available in that format?

TV broadcasters have shown some signs of committing to 8k aka Super Hi Vision (the BBC and NHK experimented with it over the last few Olympics), e.g. for sports and some types of shows. 4k, while common in cinemas now, is still very much in flux as a technology (codecs, media support) and very much in its early stage for home use.

I would buy a very nice 60" 2k TV for now, for ~$1,500, and enjoy Blu-ray (still nothing streamed comes close to Blu-ray in terms of PQ and sound today)... and when 8k is around the corner for TV/video (in at least 5 years, I'd say) you can use that other $3,500 (well, with a bit of luck it'll have grown to $4,000 in the bank) on a shiny new 8k set.

I personally was in the market for a 4k TV a few months back (with a little less budget than you, I must say) and realized it was not worth it for me. I can't get fast enough internet where I live in London for decent 4k streaming, and the content availability is ridiculously small. I mostly watch art house / independent cinema from Europe and Asia and am perfectly happy with Blu-ray and streaming the odd flick from Mubi.tv in 720p/1080p over Chromecast. So I got myself a £400 2k projector (Optoma HD141x) and am enjoying 2k films on a 100" base screen for very little money... Of course YMMV!
 
Well, if I were you, I'd look around... Honestly... do you see a lot of 4k content? Do you have a fast enough connection to stream the little content available in that format?

TV broadcasters have shown some signs of committing to 8k aka Super Hi Vision (the BBC and NHK experimented with it over the last few Olympics), e.g. for sports and some types of shows. 4k, while common in cinemas now, is still very much in flux as a technology (codecs, media support) and very much in its early stage for home use.

I would buy a very nice 60" 2k TV for now, for ~$1,500, and enjoy Blu-ray (still nothing streamed comes close to Blu-ray in terms of PQ and sound today)... and when 8k is around the corner for TV/video (in at least 5 years, I'd say) you can use that other $3,500 (well, with a bit of luck it'll have grown to $4,000 in the bank) on a shiny new 8k set.

I personally was in the market for a 4k TV a few months back (with a little less budget than you, I must say) and realized it was not worth it for me. I can't get fast enough internet where I live in London for decent 4k streaming, and the content availability is ridiculously small. I mostly watch art house / independent cinema from Europe and Asia and am perfectly happy with Blu-ray and streaming the odd flick from Mubi.tv in 720p/1080p over Chromecast. So I got myself a £400 2k projector (Optoma HD141x) and am enjoying 2k films on a 100" base screen for very little money... Of course YMMV!


one of the reasons I've been looking for a good 4k TV is that I record all my videos in 4k now (on my Samsung smartphone and my Sony AX100 4k camcorder http://www.amazon.com/Sony-FDR-AX10...=1457383357&sr=1-4&keywords=sony+4k+camcorder )

the 4k videos look nice on my existing 1080p TV, and I want something that will really show off all the hundreds of hours of 4k videos I've shot over the last couple of years.

I took the plunge and ordered the Sony XBR-75X910C 75-inch 4K TV this morning, partly because I was able to get a decent price of $4,100 delivered (after B&H Photo agreed to price-match CDW).
 