Re: Re: Re: crap-idy-crap crap
Originally posted by beatle888
the future, maybe there are possibilities that we will only see when we get 128 bit.
My point was actually that the human eye is not acute enough to tell apart the colors in a palette that large.
The JPEG format can store images that contain up to 16.7 million colors (termed as "24-bit" or "true-color" since the human eye can not differentiate between 2 colors that are next to each other in a "true-color" spectrum).
... from http://dp3.lib.ndsu.nodak.edu/~nem/archive/
A 64-bit color depth would allow for a palette of 1.844 x 10^19 colors (that's 18,440,000,000,000,000,000 colors), while so-called 'True Color' provides slightly more than 16,000,000 colors.
Even on the 'OLED 2560x2048' displays... there are only 5,242,880 pixels. You would need 3,518,437,208,884 of those monitors to display every available color at once. If you were playing 30 frames/sec video of smooth color blends, with every pixel of every frame showing a new color, your video would play for 3,719 YEARS before it had displayed them all.
If you had a 128-bit color palette, it would take the same movie 6.860277220920572e+22 years to play! (hope I got my math correct ;-))
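In case anyone wants to check my calculator work, here's the same back-of-the-envelope arithmetic as a little Python sketch. The 2560x2048 panel and 30 fps figures are just the assumptions from above; nothing else is assumed.

# Rough check of the palette / playback numbers above.
pixels_per_frame = 2560 * 2048          # the hypothetical OLED panel: 5,242,880 pixels
fps = 30                                # frame rate of the "smooth color blend" video
seconds_per_year = 365 * 24 * 3600

for bits in (24, 64, 128):
    colors = 2 ** bits                  # palette size at this bit depth
    frames = colors / pixels_per_frame  # frames needed if every pixel shows a new color
    years = frames / fps / seconds_per_year
    print(f"{bits:3d}-bit: {colors:.3e} colors, {frames:.3e} full screens, {years:.3e} years")

That prints roughly 3.7e3 years for 64-bit and 6.86e22 years for 128-bit, which matches the numbers above.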
Anywho... 64-bit and 128-bit color palettes are enormous. Too big to be useful to people.
Perhaps if you wanted to create some cinema classic filmed (rendered) entirely in subtle variations of one shade of aqua. ;-)
So... I was basically saying that I REALLY doubt people will be running iMovie on 128-bit video in 3 years. There is no benefit.
Additionally, assume a video stream at 640x480 resolution, 30 frames per second, 128-bit color depth (fairly low res, super high quality). You'd need over 140 MByte/sec of constant bandwidth to stream that video uncompressed. That would require a solid-state drive or striped 15K SCSI drives (and those MIGHT be fast enough).
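Same deal here, the bandwidth figure is just raw pixel math (assuming an uncompressed stream, no overhead):

width, height, fps, bits_per_pixel = 640, 480, 30, 128

bits_per_second = width * height * fps * bits_per_pixel   # raw, uncompressed stream
mbytes_per_second = bits_per_second / 8 / 2**20           # MByte/s (binary megabytes)

print(f"{bits_per_second / 1e6:.0f} Mbit/s  ~  {mbytes_per_second:.1f} MByte/s")

That comes out to about 1180 Mbit/s, or roughly 140.6 MByte/s.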
BTW... if you had a FireWire camera that could do that, it'd require the majority of a FW 1600 bus. FW 800 wouldn't come close, and neither would a Gigabit Ethernet connection.
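Comparing that ~1180 Mbit/s stream against the nominal bus speeds (real-world throughput would be even lower, especially over Ethernet):

stream_mbit = 640 * 480 * 30 * 128 / 1e6   # ~1180 Mbit/s, from the calculation above

buses = {"FireWire 800": 800, "Gigabit Ethernet": 1000, "FireWire 1600": 1600}
for name, capacity in buses.items():
    print(f"{name:17s}: {stream_mbit / capacity:5.0%} of nominal capacity")

So roughly 147% of FW 800, 118% of Gigabit Ethernet, and about 74% of FW 1600.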
... just playing with my calculator.
