I have no idea what movies you've watched, but you can see a huge amount of difference between a BD and a DVD. And by huge, I mean...like...a whole bunch.

From a technical point of view, yes.

But a DVD that has been mastered well from a good source and is played using good upscaling equipment will look better than a BD from a less good source and/or one that has been transferred by less competent people.

Yes, the average BD looks better than the average DVD. Not always so much better that you notice it without specifically looking for it, though, unless you have subtitles enabled. :p
 
From a technical point of view, yes.

But a DVD that has been mastered well from a good source and is played using good upscaling equipment will look better than a BD from a less good source and/or one that has been transferred by less competent people.

Upscaling cannot make up for details that just aren't there in the source. A DVD is 480p. Any extra pixels added by an upscaler will not have any of the details that a 1080p source would have.

Seriously, people who keep saying "upscaled DVDs are just as good as Blu-rays" are just showing some kind of bias against BD for some unknown reason (too many Steve-Os in the morning?).
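For what it's worth, you can see this for yourself without any special gear: take a 1080p still, knock it down to DVD resolution, and scale it back up with a good interpolator. The PSNR will always come out finite, because the detail simply isn't there to recover. A minimal Python sketch, assuming Pillow and numpy are installed (the filename is made up):

[code]
# Round-trip a 1080p still through DVD resolution and measure what survives.
import numpy as np
from PIL import Image

src = Image.open("frame_1080p.png").convert("RGB").resize((1920, 1080))  # hypothetical 1080p frame
dvd = src.resize((720, 480), Image.LANCZOS)            # down to DVD resolution
upscaled = dvd.resize((1920, 1080), Image.LANCZOS)     # back up, like an upscaling player

a = np.asarray(src, dtype=np.float64)
b = np.asarray(upscaled, dtype=np.float64)
mse = np.mean((a - b) ** 2)
psnr = 10 * np.log10(255.0 ** 2 / mse)
print(f"PSNR of upscaled vs. original: {psnr:.1f} dB")  # the lost dB is the lost detail
[/code]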
 
Dude, nobody ever gets what I'm saying. The DVD player is a fine Faroudja unit with a 1080p upscaler. Sure, it's not as good as a lossless 1080p video, but it's better than Blu-ray. Blu-ray uses very lossy compression even though it is a better algorithm.

Wait, H.264 is a better algorithm than MPEG-2, but it's not better? Why am I not getting what you're saying again?

MPEG-2 is lossy, just like H.264. But H.264 is much more efficient than MPEG-2. 480p video stretched (upscaled) to 1080p cannot have the same level of detail that a true 1080p source has. Get that out of your head.
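A rough sanity check on the "compressed too much" idea, using the spec bitrate ceilings rather than any particular disc (so treat these as ballpark numbers): Blu-ray spends about as many bits per pixel as DVD does, and it spends them through a codec that is roughly twice as efficient.

[code]
# Bits per pixel at the maximum video bitrates the two formats allow.
DVD_MAX_VIDEO_BPS = 9.8e6    # MPEG-2, 720x480 at ~30 fps (interlaced)
BD_MAX_VIDEO_BPS = 40.0e6    # H.264 (or VC-1/MPEG-2), 1920x1080 at 24 fps

dvd_bpp = DVD_MAX_VIDEO_BPS / (720 * 480 * 30)
bd_bpp = BD_MAX_VIDEO_BPS / (1920 * 1080 * 24)
print(f"DVD: ~{dvd_bpp:.2f} bits/pixel with MPEG-2")   # ~0.95
print(f"BD:  ~{bd_bpp:.2f} bits/pixel with H.264")     # ~0.80
[/code]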

I actually compared the two on a large screen. The Blu-ray looked horrible compared to the upscaled DVD, believe it or not. And it was a new movie, The Dark Knight.

The Dark Knight on Blu-ray is absolutely a great transfer from a very good source. The DVD has nothing on it. We know you're outright lying now or simply can't admit the truth because it would somehow contradict what you've been led to believe by Steve Jobs.

Hint: Steve never cared about the truth, he only cared that you bought what he was selling. The guy was a salesman: he wasn't out to better the world, he was out to line his pockets with money. The faster you come to terms with this, the faster you can hold a real discussion on the merits of technology rather than repeat the spiel that came out of that man.
 


As reported by Liliputing (via Electronista), Intel is envisioning the high-resolution "Retina" displays pushed by Apple in its iOS devices as the future of PCs, with comments at its Intel Developer Forum in Beijing noting that the company is supporting those plans with its chips.



Specifically, Intel sees handheld and tablet devices targeting resolutions in the range of 300 pixels per inch (ppi), while notebook computers target roughly 250 ppi and all-in-one desktop computers register around 220 ppi.

As noted by 9to5Mac, Intel executive Kirk Skaugen specifically referred to these displays by the "Retina" term coined by Apple at the introduction of the iPhone 4 back in 2010. In his presentation, Skaugen mentioned that Intel's third-generation Core i-Series processors (also known as Ivy Bridge) will support Retina displays if manufacturers choose to offer them. This support is not new, however, as he also noted that the current second-generation Core i-Series chips (Sandy Bridge) also support Retina displays, although Ivy Bridge will mark a significant leap forward in graphics support.

Apple is of course rumored to be working toward releasing Retina-capable Macs, as evidenced by support for the "HiDPI" mode showing up in OS X Lion and Mountain Lion. Rumors have suggested that an updated 15-inch MacBook Pro set to appear in the near future could indeed carry a 2880x1800 screen capable of utilizing HiDPI mode to display sharper content.

Article Link: Intel Looking Toward Retina Display PCs by 2013
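For anyone who wants to check the math on those targets, pixel density follows from resolution and panel diagonal: ppi = sqrt(w^2 + h^2) / diagonal. A quick Python sketch (the 15.4" diagonal for the rumored MacBook Pro panel is an assumption on my part):

[code]
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixel density of a panel from its resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2880, 1800, 15.4)))  # ~220 ppi for the rumored MacBook Pro panel
print(round(ppi(2048, 1536, 9.7)))   # ~264 ppi, the new iPad
print(round(ppi(960, 640, 3.5)))     # ~330 ppi with a nominal 3.5" diagonal (Apple quotes 326)
[/code]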

Pixel density is not color gamut and shadow detail.

And will the mass market want it when cheap $200 24" TN panels are good enough for customers...

----------

Upscaling cannot make up for details that just aren't there in the source. A DVD is 480p. Any extra pixels added by an upscaler will not have any of the details that a 1080p source would have.

Seriously, people who keep saying "upscaled DVDs are just as good as Blu-rays" are just showing some kind of bias against BD for some unknown reason (too many Steve-Os in the morning?).

True - upscaling does not add detail. It can't. 480i is a far cry from 1080p.

Most web sites will not want to conform to the iPad 3 because bandwidth usage will skyrocket. The infrastructure is not ready for that amount of data being sent, and you know costs (unlike cost savings) get passed down to customers. If anybody wants to pay more for web hosting, more web site access fees, and higher broadband (DSL/cable/whatever) costs, then demand Retina everywhere. What is wrong with 72 PPI to begin with? And do note I vastly prefer Blu-ray over DVD. 300 PPI has a lot of benefits, but some pragmatic reality concerns show the iPad 3 being way too early for web use as such. (And ZDNet reported a cookbook app went from 300MB to 700MB, which means your 16GB non-expandable iPad 3 model will be used up in a hurry too. What a surprise... not really a surprise.)


----------

So Retina is the output of an equation that looks like: Retina = PPI / distance from face? The reason it's BS is that designers can't know exactly how far you're holding the object from your eyes.

Unlike Blu-Ray movies, we're not 12 feet away from the iPad.

~110PPI is a nice sweet spot, as I sit here viewing my 27" Apple Cinema Display.

300PPI is gross overkill for a tablet, since one is normally 12" away... not 12'. It's sharper, but if nothing else the iPad3 is too far ahead of its time...
 
Wait, H.264 is a better algorithm than MPEG-2, but it's not better? Why am I not getting what you're saying again?

MPEG-2 is lossy, just like H.264. But H.264 is much more efficient than MPEG-2. 480p video stretched (upscaled) to 1080p cannot have the same level of detail that a true 1080p source has. Get that out of your head.



The Dark Knight on Blu-ray is absolutely a great transfer from a very good source. The DVD has nothing on it. We know you're outright lying now or simply can't admit the truth because it would somehow contradict what you've been led to believe by Steve Jobs.

Hint: Steve never cared about the truth, he only cared that you bought what he was selling. The guy was a salesman: he wasn't out to better the world, he was out to line his pockets with money. The faster you come to terms with this, the faster you can hold a real discussion on the merits of technology rather than repeat the spiel that came out of that man.

Note: I did not buy any Apple products after 2006.

MPEG-2 may be a less efficient algorithm, but the Blu-ray videos are compressed too much. The thing I was wrong about was how simple this is. The DVD upscaler does more than just stretching and pixel prediction; it does a lot of work on the edges. The difference was obvious: the Blu-ray video had jagged edges when stretched out, but the DVD had crisp edges and better overall quality (except for the color).

I actually had no problem with Blu-ray until I tried it. I thought it was going to be the best quality. If I ever hook up the Blu-ray player again and have a Blu-ray and DVD version of the same movie, I'm going to take pics so people can't say "pics or it didn't happen".
 
Upscaling cannot make up for details that just aren't there in the source. A DVD is 480p. Any extra pixels added by an upscaler will not have any of the details that a 1080p source would have.

Seriously, people who keep saying "upscaled DVDs are just as good as Blu-rays" are just showing some kind of bias against BD for some unknown reason (too many Steve-Os in the morning?).

I wonder if the issue isn't the BD/DVD, but the television itself.

If the "sharpness" or "noise" settings (or other "picture enhancement" circuits) are cranked up, it could be too sharp with the BD, and fine with the DVD - since the DVD is fuzzier to start with.
 
If you find that a DVD displays a better image on your TV than a BluRay, go see an optician. :rolleyes:

I have to admit though, if you have a higher-bandwidth SD TV source (576p/720×576) vs. a 720p source, even with some experience you can't tell much of a difference. Of course, this doesn't apply to your 480p XviD or MKV, but to cable/satellite TV broadcasts.
On my TV, the same 576p (PAL) channel via digital cable looks nearly as good as 720p, but the digital terrestrial variant looks as horrible as on analog cable or as an XviD.

But 1080p is just, well, crisper, and you'll always notice a difference compared to both 576p and 720p.
 
Note: I did not buy any Apple products after 2006.

MPEG-2 may be a less efficient algorithm, but the Blu-ray videos are compressed too much.

DVDs are just as compressed, if not more so, due to the less efficient nature of MPEG-2. That's the whole point.

Like Aiden said, the problem is your TV settings.
 
Well, I like that Intel is throwing its support behind retina displays. Although retina displays are not really my highest priority for a computer (although I love it on the iPad and iPhone), it's still a pretty cool feature! :) Now Intel needs to push Thunderbolt forward faster... which they kind of are... but not fast enough for my tastes...!
 
Upscaling cannot make up for details that just aren't there in the source. A DVD is 480p. Any extra pixels added by an upscaler will not have any of the details that a 1080p source would have.

A good upscaler is needed to display standard-definition DVDs on a high-definition display without it looking like crap. It does not and should not invent details. But if one is to compare quality, it is only fair to make each alternative look as good as possible on the equipment used.

You go on about the details and pixels, but my point from the post you partly quoted was that resolution is not everything. Using more pixels and different (arguably better) compression algorithms does not automatically make the quality of the end result better. If you use a bad source and/or tweak the compression incorrectly, all those fancy pixels are not going to save the day.

I prefer a good DVD over a bad BD.

A good BD on the other hand is breathtaking...
 
Excuse my wording, but I intended to say that they won't handle it in terms of reasonable graphics performance.

I understand that the 7000 series and the like have support for it. But how will they perform? That is a different story altogether.

Agreed that there is a big difference between 4K playback or 2D windowing and 4K realtime rendering as in games (really QFHD, at 3840x2160, looks to be the commonplace size). Once 4K screens are common, the burden will be on AMD and Nvidia to catch up ;-)
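Just to put a number on that burden (my arithmetic, nothing from the thread): QFHD is four 1080p screens' worth of pixels to fill every frame.

[code]
qfhd = 3840 * 2160   # 8,294,400 pixels
fhd = 1920 * 1080    # 2,073,600 pixels
print(qfhd / fhd)    # 4.0 -- four times the pixels to shade per frame
[/code]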
 
As impressive as that sounds, unfortunately that's not right. Increasing the resolution x4 means increasing the area, not the physical dimensions, so the actual res would be 5120x2880 pixels; quadrupling both the width and the height would increase the res 16 times. Would be great though, although thinking of it that way, you'd be looking at a 54" screen on your desk!!!

Of course! Ah, my maths wasn't too good when it comes to scaling. :)
 
I am more concerned about the battery life (or the lack thereof). So this is where the space savings from removing the CD drive go? :p
 
DVDs are just as compressed, if not more so, due to the less efficient nature of MPEG-2. That's the whole point.

Like Aiden said, the problem is your TV settings.

I highly doubt it. They have been tweaked carefully, and the Blu-ray and DVD players both used HDMI to connect to the same receiver, just at different inputs.
 
I highly doubt it. They have been tweaked carefully, and the Blu-ray and DVD players both used HDMI to connect to the same receiver, just at different inputs.

Exactly my point. Dark Knight is a superb transfer to Blu-ray. If it's looking worse than the DVD on your TV, something is quite wrong.
 
Thermo, I am going to disagree with you and say that there is a mass market for high dpi displays.

Photographers want them, graphic designers want them, and students who want to be able to read long passages (books, notes, etc.) on their notebooks/laptops/tablets without destroying their eyes want them.

This is one area where I fully support Apple's push to create a critical mass of ownership for a feature.

I agree that the high resolutions are less useful for videographers and certain applications... but it is high time for vector based graphics to step into the limelight a bit more.



----------

Exactly my point. Dark Knight is a superb transfer to Blu-ray. If it's looking worse than the DVD on your TV, something is quite wrong.


Agreed. Check your cables, check the player settings and then check to see whether you are having ground loop issues.
 
Exactly my point. Dark Knight is a superb transfer to Blu-ray. If it's looking worse than the DVD on your TV, something is quite wrong.

If my settings were wrong, it would also make the DVD look bad. Anyway, it's a useless argument at this point. I don't even remember why I mentioned Bluray.
 
If my settings were wrong, it would also make the DVD look bad.

Actually the opposite - if your settings heightened the sharpness to make the fuzzy DVD look better, they could add visual noise to the already sharp BD image.

Since you mentioned "jaggies" on the BD, I suspect this to be the case - a 2Mpixel image should not have more pixelation than a 300Kpixel image. However, "edge enhancement" algorithms running on an already sharp image could have that effect.


Anyway, it's a useless argument at this point. I don't even remember why I mentioned Bluray.

Agreed - let's let this tangent die.
_________________

The biggest advantage for high resolution screens is for photography - with 5 Mpixel photos basically considered to be "low resolution" these days, editing photos from your 10 Mpixel point-and-shoot (or much higher resolution DSLR) on a 1.3 Mpixel display (current MBP 15") is more or less tragic.
 
Unlike Blu-Ray movies, we're not 12 feet away from the iPad.

~110PPI is a nice sweet spot, as I sit here viewing my 27" Apple Cinema Display.

300PPI is gross overkill for a tablet, since one is normally 12" away... not 12'. It's sharper, but if nothing else the iPad3 is too far ahead of its time...

I do like my 110 PPI 2560x1440 27" iMac here too. But I saw that HiDPI and it really does make a difference. I will look forward to a 5120x2880 display for a 27" iMac (or any Mac).

The new iPad is 264 PPI and it is exactly retina at 13" for those with 20/20 vision (defined by visual acuity of one arc-minute). The iPhone 4/4S (at 326 PPI) is retina at exactly 10.5 inches. These seem like reasonable distances.

220 PPI for my iMac would work out great because I am typically about 16-20 inches from iMac display. 220 PPI becomes retina at distances over 15.6 inches.

So my iPad at 13", my iPhone at 10.5", and my iMac at 15.6" (minimum) just seems to fit. I have two of the above, I just need (want; would like) a retina Mac display.

Another advantage is that one can look closer to see more detail, such as looking at a 4K (4096x2160 or 3840x2160) movie inside a window that is about the size of a 1920x1080 window on today's 27" iMac. (Remember, the 27" iMac has 78% greater resolution than a 1080p display.)
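Those distances all drop out of the one-arc-minute rule: the display is "retina" beyond the point where one pixel subtends less than an arc-minute, which works out to roughly 3438 / PPI inches. A small Python sketch (my arithmetic, just reproducing the figures above):

[code]
import math

def retina_distance_inches(ppi):
    """Distance beyond which one pixel subtends less than one arc-minute."""
    one_arcmin = math.radians(1 / 60)
    return (1 / ppi) / math.tan(one_arcmin)

for ppi in (326, 264, 220, 110):
    print(f"{ppi} PPI -> retina beyond ~{retina_distance_inches(ppi):.1f} in")
# 326 -> ~10.5" (iPhone 4/4S), 264 -> ~13.0" (new iPad),
# 220 -> ~15.6" (a would-be retina iMac), 110 -> ~31.3" (today's 27" displays)
[/code]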
 
I don't even remember why I mentioned Bluray.

Because someone brought up 4K and said we should use very expensive media that can't hold much more data than Blu-ray presently (flash-memory-type media), and you wanted to chime in with this little piece:

Yeah, Bluray was a bad idea (and Steve was correct about it). Plus, the quality wasn't all that good because of the very lossy H.264 compression it used. It ended up looking worse on my big screen than a DVD in my good 1080p upscaling DVD player.

It's not really a tangent, it's very much on-topic for a thread about 4K-capable monitors. According to your bit of logic, 4K movies would look worse than your upscaled DVDs...

*sigh*.
 
Resolution independence doesn't work, that's all. x2 is the only way to get higher res displays - clearly more flexible on a laptop screen as the base size is flexible, e.g. could be 1680x1050 (x2) or 1440x900 (x2) but the x2 factor is the only way to make this work.

Why do you think so?
Resolution independence works very well for printers. Very old 7-pin printers had only 60 dpi while modern laser or inkjet printers have thousands, yet they all print a PDF at the same physical size.

It also works well for 3D graphics. The monsters won't be smaller if you play at 1600x1200 than they would at 640x480, they just look better.

There is nothing making this impossible for a normal desktop GUI either. There are just things that have been messed up in the past. So a truly resolution-independent GUI probably means flushing a lot of backwards compatibility down the toilet.

Christian
 
There is nothing making this impossible for a normal desktop GUI either. There are just things that have been messed up in the past. So a truly resolution-independent GUI probably means flushing a lot of backwards compatibility down the toilet.

The problem that doesn't crop up in PDFs and 3D gaming is the desktop's reliance on pixel maps for UI elements (or bitmaps, if you prefer). We use these static pixel-based images to render the elements, and thus any scaling by a fractional factor (greater than 1 but less than 2, say) needs an algorithm that messes up the image in some way (no scaling algorithm is perfect).

Moving to vector based images (like SVG) would solve this of course. KDE has been able to use SVGs since like version 1.x in the 90s, so that's not even an issue. It's just stagnation that keeps us tied down in this instance.
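To make the fractional-scaling problem concrete, here's a toy Python demo (my own example, assuming Pillow and numpy): a 4x4 checkerboard stands in for a pixel-perfect icon. An exact 2x resize turns every pixel into a clean 2x2 block, while a 1.5x resize has to duplicate some rows and columns but not others, and that unevenness is exactly the kind of distortion a scaling algorithm then has to smear over.

[code]
import numpy as np
from PIL import Image

# A 4x4 checkerboard "icon".
icon = Image.fromarray(((np.indices((4, 4)).sum(axis=0) % 2) * 255).astype(np.uint8))

doubled = icon.resize((8, 8), Image.NEAREST)     # integer 2x scale
fractional = icon.resize((6, 6), Image.NEAREST)  # non-integer 1.5x scale

print(np.asarray(doubled) // 255)     # regular 2x2 cells -- pattern preserved
print(np.asarray(fractional) // 255)  # uneven cells -- visibly distorted
[/code]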
 