And compressing a file in the form of a ZIP 'archive' is also never lossless. If you zip a text file, the text will be shortened by slight rephrasing and the use of shorter words. This may be imperceptible to the reader but it is still compressed at some level.

Nonsense. Here, let me try:

Code:
$ cat > sonnet.txt
Shall I compare thee to a summer's day? 
Thou art more lovely and more temperate. 
Rough winds do shake the darling buds of May, 
And summer's lease hath all too short a date.
^D
$ zip sonnet.zip sonnet.txt
$ rm sonnet.txt
$ unzip sonnet.zip
$ cat sonnet.txt

Hey, you scrubbed up OK!

$

Well, you learn something every day.

Did you hear the one about a well-known photocopier manufacturer who used over-zealous lossy compression on their devices until people found that it was changing the numbers in the small print on spreadsheets?
 
Excuse me, but this is complete nonsense. Indeed there is true lossless compression (though I cannot say for sure that "lossless" in this article truly means lossless). Just think about zip compression: you compress a text file by zipping it and get the exact same file, with all its data, back when unzipping it. That's exactly what happens with lossless audio or video compression too, though the algorithms differ.
Simple, non-technical users might only be confronted with the idea of compression when there is a quality trade-off; that's the only context they hear the term in. To those non-technical people, when "compression" is applied to distinct items (be that data in computer memory or, e.g., a collection of different physical packages) rather than to physical matter (e.g., air, as in compressed air), it generally means reducing the space needed to store those items.

Think of differently shaped packages inside a truck: packing them in a space-efficient manner (i.e., not just piling them in randomly, but combining similarly sized packages) would be lossless compression. If you also trampled on the packages to make them smaller, that would be lossy compression. If that didn't damage the stuff inside the packages, it might be called perceptually lossless compression.
 
And compressing a file in the form of a ZIP 'archive' is also never lossless. If you zip a text file, the text will be shortened by slight rephrasing and the use of shorter words. This may be imperceptible to the reader but it is still compressed at some level.

Not true. ZIP compression is always lossless.

Just because the compression mechanism alters the content in order to compress it doesn't mean that the compression is lossy. All compression methods alter the content, be they lossless or lossy. What lossless compression means is that you can recover the original, in perfect identical form, after decompression.
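
To make that concrete, here's a minimal sketch in Python (using the standard-library zlib module, which implements the same DEFLATE algorithm used inside ZIP archives): the compressed blob is smaller, yet decompression gives back the original bytes exactly.

Code:
import zlib

original = b"Shall I compare thee to a summer's day?\n" * 100

compressed = zlib.compress(original)      # DEFLATE, the method used in .zip files
restored = zlib.decompress(compressed)    # undo the compression

print(len(original), "->", len(compressed))   # far fewer bytes on disk...
assert restored == original                    # ...but the round trip is bit-for-bit exact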
 
Not true. ZIP compression is always lossless.

Just because the compression mechanism alters the content in order to compress it doesn't mean that the compression is lossy. All compression methods alter the content, be they lossless or lossy. What lossless compression means is that you can recover the original, in perfect identical form, after decompression.
I guess I should have added an example like the one theluggage posted above to make my point clearer.
 
Not true. ZIP compression is always lossless.

Just because the compression mechanism alters the content in order to compress it doesn't mean that the compression is lossy. All compression methods alter the content, be they lossless or lossy. What lossless compression means is that you can recover the original, in perfect identical form, after decompression.
Haha wow seriously people. Read what he said carefully, it's so obvious it's a joke.
 
Vicious cycle: will the "new" "better" cable thing ever end?
What is the ETA on this one?
 
Excuse me, but this is complete nonsense. Indeed there is true lossless compression (though I cannot say for sure that "lossless" in this article truly means lossless). Just think about zip compression: you compress a text file by zipping it and get the exact same file, with all its data, back when unzipping it. That's exactly what happens with lossless audio or video compression too, though the algorithms differ.
I don’t agree with that when it comes to audio or video.
Haha wow seriously people. Read what he said carefully, it's so obvious it's a joke.
I have to admit to being a doofus (is that how you spell it?).
I saw his post and thought, ‘WHHAAATTT??’
I then made up a text file of about 60,000 words in a plain text document, zipped it, unzipped it, and then the penny dropped…
 
The approval of DisplayPort 1.4 comes even though consumers are still awaiting the arrival of devices supporting the previous DisplayPort 1.3 standard. Intel had been expected to support DisplayPort 1.3 in its current Skylake generation of chips, but the company instead opted to offer dual DisplayPort 1.2 support. As we detailed earlier this year, the lack of DisplayPort 1.3 support in Skylake could lead Apple to hold off on releasing a new 5K Thunderbolt Display until next year when chips supporting the standard become available.
http://www.anandtech.com/show/10110...-14-standard-displayport-adds-compression-hdr

According to Anandtech, even DisplayPort 1.3 can't support a 5K 30-bit screen like the 27" iMac has, only 5K 24-bit; 5K 30-bit support is one of the new things DisplayPort 1.4 introduces. So if Apple actually wants to put the 27" iMac's 5K 30-bit panel into an SST Thunderbolt Display, even waiting for Thunderbolt to adopt DisplayPort 1.3 isn't enough; they'll have to wait until a DisplayPort 1.4-compatible Thunderbolt is available.
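
Rough numbers make the point. Here's a back-of-the-envelope sketch in Python that ignores blanking intervals and other protocol overhead; DP 1.3's HBR3 link is 4 lanes at 8.1 Gbps, leaving about 25.92 Gbps of usable payload after 8b/10b encoding:

Code:
# Raw pixel payload for a 5K (5120x2880) panel at 60 Hz,
# ignoring blanking intervals and protocol overhead.
width, height, refresh = 5120, 2880, 60

def payload_gbps(bits_per_pixel):
    return width * height * refresh * bits_per_pixel / 1e9

hbr3_usable = 4 * 8.1 * 8 / 10   # 4 lanes x 8.1 Gbps, less 8b/10b overhead ~= 25.92 Gbps

print(f"DP 1.3 usable bandwidth: {hbr3_usable:.2f} Gbps")
print(f"5K @ 60 Hz, 24-bit: {payload_gbps(24):.1f} Gbps")   # ~21.2 Gbps -- fits
print(f"5K @ 60 Hz, 30-bit: {payload_gbps(30):.1f} Gbps")   # ~26.5 Gbps -- doesn't fit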
 
No - there are plenty of lossless compression algorithms. E.g. the LZW compression used by GIF and some types of TIFF, the DEFLATE compression used by ".zip" and .png files (using lossy compression on executables and data files is not a good idea), Apple Lossless audio, FLAC audio...

Lossless compression works by finding a more efficient way to pack the data, e.g. replacing frequently occurring patterns with short codes, or 'run length encoding' (if there is a row of 1000 white pixels, just send 'white' and '1000'). Morse code is another example: frequently occurring letters are given the shortest codes, unlike ASCII, which uses 8 bits for every single character (Huffman encoding is the algorithmic equivalent).
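
As a toy illustration of the run-length idea (just a sketch in Python; real formats like ZIP and PNG use more sophisticated schemes, but the 'lossless' property is the same):

Code:
# Toy run-length encoder/decoder: lossless, because decode(encode(x)) == x.
def rle_encode(data):
    runs, i = [], 0
    while i < len(data):
        j = i
        while j < len(data) and data[j] == data[i]:
            j += 1
        runs.append((data[i], j - i))   # (symbol, run length)
        i = j
    return runs

def rle_decode(runs):
    return "".join(symbol * count for symbol, count in runs)

row = "W" * 1000 + "B" * 3 + "W" * 500     # a row of pixels
print(rle_encode(row))                      # [('W', 1000), ('B', 3), ('W', 500)]
assert rle_decode(rle_encode(row)) == row   # exact reconstruction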

With lossless compression, you get back exactly what you put in. You don't get the sort of 100x compression you see with lossy compression, but 2-3x is feasible.

However, "visually lossless" is either a redundancy (if it's lossless, of course there's no visual difference) or weasel words.

You say lossless. I say lossless.

Let's call the whole thing off!

Too bad I can't post the melody.

But... who cares? It will be better than what we have right now. Nothing to see here, move on.
 
Erm, when was the last time you saw Apple do something as tacky as silk-screening industry logos on aluminium? They put their own, subtle indicators on their device. Go away and make a better mock up, please :D
 
Is 8K the number of pixels or the price of the monitor?
Another question: will a bacterium be able to see the pixels, or will it have to use a microscope?
 
Officially holding out my next purchase for a newly designed MacBook Pro that can drive a standalone 5K Retina Thunderbolt Display.

However, assuming there will be a new case design for the 5K iMac this fall (4 years since it was last redesigned, and 7 years of the same 16:9 front face look and feel), I may not be able to resist and would get that and continue using an iPad as my portable machine (even though it lacks the ability to do some of the things I prefer doing on the Mac). First world problems to the fullest.

Oh no! The first world problem is us [Other users] who have to read your exhaustingly detailed explanation of how you use your devices. That's the real problem here! :eek:

#NobodyCares #KiddingButNotReally
 
PC DOES WHAAAT?!!
Intel had been expected to support DisplayPort 1.3 in its current Skylake generation of chips, but the company instead opted to offer dual DisplayPort 1.2 support.
Please remind me, why were we waiting for Skylake again?
Either way, we're unlikely to see Macs supporting DisplayPort 1.4 until 2017 at the earliest.
Great, screw you 2016. Next year please!
 
The benefit to buying now is that you get to use it now.

I have a MacBook Pro (Retina, 13-inch, Mid 2014). It has the power to drive the cheap 4K TV I got at Sam's Club. I'm a lot more productive when I can keep several windows open and visible at the same time. (How did I ever work in VGA resolution?)

I guess that depends on how often you buy and how much money you want to spend on TVs. I think the majority don't want to spend money on a new main living room TV that often. Thus, for those folks, it's better to buy at a time when there is a definitive, established standard and the only things on the horizon are token updates to that standard that don't change much. For example, 2005-2010 was the optimal time to buy a 1080p TV. Those TVs will likely remain useful to 2020 easily.
 
Nothing, of course. Apple's been notorious for being late to the party on industry standards... how long did it take them to adopt USB 3.0?

It took them as long as it took Intel to put it onto a CPU. That's what's gating most of Apple's innovation on the PC side.
(Which is, IMHO, the primary reason that at some point in the next 5 years they'll switch to ARM. Not because it's cheaper, though it might be, or faster, though again it might be in five years and with a larger-than-mobile power budget; but because Apple can add stuff to the SoC at their own pace, not Intel's. Compare how rapidly they have updated iOS devices, while the best they've been able to do on the Mac side is the slow introduction of USB 3 and the slow introduction of retina.)
 
This is why I think it is wise to hold off on investing in major 4K hardware for now. The standard is totally in flux. Why pay $1000+ now for a 4K TV that uses HDMI 1.4 or 2.0, when there are other HDR and 8K standards right on the horizon? It looks about as silly as buying a 720p TV with only component inputs in 2003, when HDMI and 1080p were right on the horizon.

Agreed, but it's not quite so in flux anymore, as the pieces are almost in place for investment-worthy UHD hardware. 2015 was really in flux, with the industry developing and coalescing around standards like HDR, peak nits, BT 2020, DCI/P3, etc. Of course, most of these new standards are only truly taken advantage of with OLED, which we will also see unleashed this year (in the broader market) at acceptable prices in both TVs and monitors. And of course, to drive that hardware, you'll ideally need either HDMI 2.0a or DP 1.3 and above.

From the sound of it, DP 1.4 isn't a hardware change (port or cable). Perhaps DP 1.3 ports could be upgraded on the driver end to support DP 1.4 encoding?

It's a shame we haven't even seen DP 1.3 yet.
 
I don't have any USB-C devices but can someone tell me how robust this connector is? Lightning feels pretty solid and I would've loved for that connector to be USB-C.

I've had a few micro-USB connectors that got bent or failed if inserted at the wrong angle or something. Not that I do that often, but they just don't seem as solid as Lightning.
 
I don't see OLED taking off unless they can fix its well-publicized blue fade problem, without using odd-shaped pixels that distort the picture to do it.
 