
KittyKatta

macrumors 65816
Original poster
Feb 24, 2011
1,058
1,212
SoCal
Based on some info from the boards I decided to try ripping my first Blu-ray using MakeMKV and Handbrake, and I'm very proud of myself. :D I experimented with Handbrake's AppleTV 2 preset to make a 720p version and a 1080p version but couldn't see a big difference. Maybe I did something wrong, but it took 9 hours for both versions, so I thought I'd ask around first and get other opinions.

So what's everyone's opinion on ripping 720p vs 1080p? Is it an obvious difference? I'm considering just doing 720p rips for the smaller file size (hard drives are cheap, but iPads are pretty limited), and since I'm not getting the best audio anyway, I still own the Blu-ray for the "ultimate" experience.
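For anyone who wants to script the same head-to-head, here's a minimal sketch driving HandBrakeCLI from Python (assumptions: HandBrakeCLI is on the PATH, the preset names are from the 0.9.x era, and "movie.mkv" stands in for the MakeMKV output; verify the names with HandBrakeCLI --preset-list):

    # Sketch only: encode one MakeMKV rip twice for a 720p-vs-1080p comparison.
    import subprocess

    SOURCE = "movie.mkv"  # stand-in for the MakeMKV output

    # 720p-class encode using the built-in AppleTV 2 preset
    subprocess.run(["HandBrakeCLI", "-i", SOURCE, "-o", "movie-720p.m4v",
                    "--preset", "AppleTV 2"], check=True)

    # 1080p-class encode: High Profile preset with the width capped at 1920
    subprocess.run(["HandBrakeCLI", "-i", SOURCE, "-o", "movie-1080p.m4v",
                    "--preset", "High Profile", "--maxWidth", "1920"], check=True)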

Edit: Just want to clarify that I have an ATV3 and an iPad 3.
 

OptyCT

macrumors 6502
Nov 9, 2008
362
4
It all depends on the Blu-ray source material and the TV you're viewing it on. Since it's taking you 9 hours to convert, I'd suggest sticking with 720p. You won't lose much. The jump from 480p (DVDs) to 720p is dramatic; the jump from 720p to 1080p is much less so.

Personally, I rip my Blu-rays and convert them at 1080p. However, I have a Core i7 iMac and can do this in a reasonable amount of time. If I can simply remux using Subler (when the resulting MKV's video is H.264), I can rip and convert an entire Blu-ray in well under an hour (MakeMKV = 20 minutes, Subler = 15 minutes). For all other video types I use Handbrake, where a 1080p conversion usually takes about as long as the movie itself (i.e., a 2-hour movie takes about 2 hours to convert).
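Subler is a GUI app, but the same kind of lossless rewrap can be sketched from the command line with ffmpeg (an assumption on my part; Subler's handling of chapters and subtitles may differ). The point is -c copy: the streams are copied bit-for-bit rather than re-encoded, which is why it takes minutes instead of hours:

    # Sketch: rewrap an H.264 MKV into an MP4 container without re-encoding.
    # Assumes ffmpeg is installed. If the audio codec isn't MP4-friendly,
    # transcode just the audio instead (e.g. replace "-c", "copy" with
    # "-c:v", "copy", "-c:a", "aac").
    import subprocess

    subprocess.run(["ffmpeg", "-i", "movie.mkv",
                    "-c", "copy",               # no transcode: container change only
                    "-movflags", "+faststart",  # put the index up front for streaming
                    "movie.mp4"], check=True)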
 

KittyKatta

macrumors 65816
Original poster
Feb 24, 2011
1,058
1,212
SoCal
It all depends on the Blu-ray source material and the TV you're viewing it on. Since it's taking you 9 hours to convert, I'd suggest sticking with 720p. You won't lose much. The jump from 480p (DVDs) to 720p is dramatic; the jump from 720p to 1080p is much less so.

Personally, I rip my Blu-rays and convert them at 1080p. However, I have a Core i7 iMac and can do this in a reasonable amount of time. If I can simply remux using Subler (when the resulting MKV's video is H.264), I can rip and convert an entire Blu-ray in well under an hour (MakeMKV = 20 minutes, Subler = 15 minutes). For all other video types I use Handbrake, where a 1080p conversion usually takes about as long as the movie itself (i.e., a 2-hour movie takes about 2 hours to convert).
Sorry for the dumb question but this is kinda new to me:

1) Is remuxing just putting the MKV in an MP4 container without any compression? For example, my Beauty and the Beast MKV was 20GB, so Subler would just make a 20GB M4V? Why would I want a file that big rather than a 3.6GB Handbrake 1080p encode?

2) Graininess has nothing to do with 1080p vs 720p, right? I purchased Mission Impossible 3 in 1080p on iTunes and it looked fantastic, but it has a "grainy" look. Is that because it was shot on film rather than digitally?

3) Does something shot digitally, like Avatar, benefit more from 1080p (or even a huge MKV) than an older film like Back to the Future, where the source is grainy and already looks fine at 720p?

Thanks for any answers and for not making me feel stupid. :D
 

ayzee

macrumors 6502a
Jun 12, 2008
576
35
For future-proofing purposes, put up with the extra file size and stick to 1080p content. I can tell the difference, and the bigger the TV, the more you can tell.

When we move to 4K TVs you will definitely notice the difference.
 

dynaflash

macrumors 68020
Mar 27, 2003
2,119
8
The AppleTV 2 still downscales 1080p to 720p for the display, so I'm not sure how much of a difference you can actually notice.
 

Thiemo

macrumors member
Aug 17, 2008
44
0
The AppleTV 2 still downscales 1080p to 720p for the display, so I'm not sure how much of a difference you can actually notice.

No offense, but what does this mean, anyway? We're talking about the ATV3, which can display 1080p. It doesn't matter if the ATV2 can't.
 

d21mike

macrumors 68040
Jul 11, 2007
3,320
356
Torrance, CA
iTunes movies: I originally bought the HD version (720p, bundled with an SD version), and I've now downloaded the 1080p version (which also downloaded another SD version). What you see in Finder is:

name of movie.m4v
name of movie (HD).m4v
name of movie (1080p).m4v

The first is the SD version, the second the 720p version, and the third the 1080p version.

I add them to iTunes by copying a link (the files are stored on a NAS device).

My Apple TV 2 gets an error if I add just the 1080p version to iTunes; my Apple TV 3 plays it fine. But if I add all three to iTunes (I then see only one entry, with HD/SD shown at the bottom), both my Apple TV 2 and 3 can play it (and only one item is listed in the Apple TV menu).

My question is: is the Apple TV 3 smart enough to play the 1080p version?

It goes further when I sync with my iPad 2, iPad 3, and iPhone 4S. My iPad 3 can play the 1080p version. I haven't tested the iPad 2 (my wife has it), but my iPhone 4S can't play the 1080p version, though it can play the 720p one. I'm not sure I even need the SD version. But if you set the download bit rate in iTunes, it will only download either the 720p or the 1080p version plus the SD version. It would be nice to get both HD versions in my case.

Do you think that every device will select the highest quality it can play?
 

dynaflash

macrumors 68020
Mar 27, 2003
2,119
8
No offense, but what does this mean, anyway? We're talking about the ATV3, which can display 1080p. It doesn't matter if the ATV2 can't.

Hmm, before the edit I could have sworn the OP said ATV 2. If not... my bad. I stand corrected.
 

KittyKatta

macrumors 65816
Original poster
Feb 24, 2011
1,058
1,212
SoCal
Hmm, before the edit I could have sworn the OP said ATV 2. If not... my bad. I stand corrected.

I said "AppleTV2" referring to the handbrake preset. I wasn't clear enough but no big deal. It does make me wonder why they're taking so long for handbrake to give out an AppleTV3 preset. I know you can make one yourself but I know so little about this stuff and dont want to screw things up that I prefer the pros to tell me what to encode to. :D
 

dynaflash

macrumors 68020
Mar 27, 2003
2,119
8
It does make me wonder why it's taking so long for Handbrake to put out an AppleTV 3 preset.

Simple: a new built-in preset requires releasing a new version of HB. We released HB 0.9.6 right before the ATV 3 came out, so a whole new release cycle for one device preset doesn't make sense.
 

KittyKatta

macrumors 65816
Original poster
Feb 24, 2011
1,058
1,212
SoCal
Simple: a new built-in preset requires releasing a new version of HB. We released HB 0.9.6 right before the ATV 3 came out, so a whole new release cycle for one device preset doesn't make sense.

Fair enough. But my 5-hour backup of Back to the Future with the AppleTV 2 720p preset somehow ended up at 4.7GB, so I must've accidentally tweaked something while browsing through the settings. Updated presets help keep newbies like me from making mistakes. :D
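A quick way to catch that kind of accidental tweak is to probe the finished file and confirm its resolution and average bitrate. A sketch, assuming ffprobe (bundled with ffmpeg) is installed and using a hypothetical filename:

    # Sketch: sanity-check what an encode actually produced.
    import json
    import subprocess

    result = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_format", "-show_streams", "movie-720p.m4v"],
        capture_output=True, text=True, check=True)
    info = json.loads(result.stdout)

    video = next(s for s in info["streams"] if s["codec_type"] == "video")
    mbps = int(info["format"]["bit_rate"]) / 1_000_000
    print(f'{video["width"]}x{video["height"]} at {mbps:.1f} Mbps average')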
 

dynaflash

macrumors 68020
Mar 27, 2003
2,119
8
We are looking at a way to update presets without having to release a new version of HB.
 

andymodem

macrumors 6502a
Nov 20, 2008
583
108
Baltimore, MD
I'm torn. I did 1080p and 720p versions of Iron Man 2 and couldn't tell the difference between the two on my 60" plasma from 8 feet away. I'm trying to decide whether it's worth staying with 720p for the smaller file sizes.
 

Jetson

macrumors 6502a
Oct 5, 2003
587
41
There is a significant difference in quality between 720p and 1080p on Apple TV.

There are many examples to cite, but I'll give one.

Compare season 1 (720p) of Star Trek Enterprise with seasons 2-4 (1080p) on iTunes. There is an enormous increase in sharpness and level of detail between the two.
 

Irishman

macrumors 68040
Nov 2, 2006
3,388
842
There is a significant difference in quality between 720p and 1080p on Apple TV.

There are many examples to cite, but I'll give one.

Compare season 1 (720p) of Star Trek Enterprise with seasons 2-4 (1080p) on iTunes. There is an enormous increase in sharpness and level of detail between the two.

Um, incorrect, but I can see how you could be confused. Season 1 of Enterprise aired in standard definition but was upconverted to 720p for HD broadcast, iTunes, and Netflix. The later seasons aired in 1080i and stayed that way for HD broadcasts, iTunes, and Netflix.

Hope that helps.
 

TyroneShoes2

macrumors regular
Aug 17, 2011
133
3
Sorry for the dumb question but this is kinda new to me:...Thanks for any answers and for not making me feel stupid. :D
Let's start here. There is a huge difference between being dumb or stupid and being uninformed. There is no shame in either, but the smartest thing you can do is ask questions (and one of the dumbest is not to ask), so in my book that defines you as definitely not dumb or stupid. Uninformed I can help you with. I get paid (not nearly enough) to handle these issues professionally on a regular basis, so I feel I can give you an answer based more in fact than in opinion.

1) Is remuxing just putting the MKV in an MP4 container without any compression?...
Probably... but not quite, and it depends. "Remuxing" is short for "remultiplexing." Compressed digital transport streams, for instance, can contain multiple programs in the same stream, and although all of the bits interleave into one big pot, the metadata in the packet headers keeps them organized rather than jumbled together.

If you receive a 6.1 channel and a 6.2 subchannel from WXYZ-TV, the broadcaster has multiplexed two programs into the same stream (a "program" means all of the separate elementary streams of audio, video, and metadata associated with a particular media element). Your HDTV receiver can demultiplex, or separate out, 6.1 or 6.2. So remultiplexing, which is actually very uncommon both professionally and in the consumer world, is just recombining two programs or two elementary streams into the same MPTS (multiple-program transport stream), at least in the MPEG transport-stream world. Strictly speaking, that is not in itself a process that requires encoding, decoding, transcoding, or compressing, although any of those things may happen in a particular workflow alongside remuxing.

If you do a Handbrake rip of a consumer DVD, you are probably doing a lot of demuxing. In other words, you keep the main video, the English audio track, etc., and separate out and discard all of the other associated streams, such as secondary audio, descriptive voiceovers, added features, and so on. That is one way to get a smaller file without compromising quality. Handbrake may not call it demuxing per se, but under the hood that is exactly what is happening.
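To make the demuxing concrete, here is the same idea sketched with ffmpeg: keep the main video and one audio track, drop everything else, and re-encode nothing. The stream indexes are assumptions; inspect the actual file with ffprobe before trusting them:

    # Sketch: strip a rip down to main video + first audio track, no re-encode.
    import subprocess

    subprocess.run(["ffmpeg", "-i", "movie.mkv",
                    "-map", "0:v:0",  # first video stream only
                    "-map", "0:a:0",  # first audio stream (often the main track)
                    "-c", "copy",     # copy both streams untouched
                    "movie-main-only.mkv"], check=True)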

MP4 and MKV are both containers, or ways of encapsulating encoded media, normally to achieve compatibility. They are protocols for wrapping media, so in reality the differences between them are slight. It's confusing because codecs such as MPEG-4, MPEG-2, and Cinepak are usually designated by file suffixes, and wrapper or container formats are too. But a wrapper is really just the protocol for how the metadata in the packet headers is described, while the codec defines how the payload, the media itself, is actually constructed.

If I can take a stab at an analogy: think of the wrapper format as the package your iPad came in (change the package and it doesn't affect the iPad), the codec as the iPad itself, and the apps and files on your iPad as the encoded media. What matters is that changing one wrapper to another is usually easy and non-destructive to the payload (you are only changing the protocol for how the payload is described). Changing the codec, on the other hand, chains the source and target codecs together and introduces a generational loss in quality, possibly minor, possibly severe (because you are fundamentally changing how the media is constructed).

So it is important to know exactly what you are doing when repackaging digital files. Are you transcoding? Just rewrapping? Deinterlacing? Everything you do has consequences, from severe to invisible, depending on many variables: what the process itself is actually doing, the original quality level, and the target quality level, at a bare minimum. This is not a simple environment to understand and work in.
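That wrapper/codec distinction maps onto two very different commands. A sketch of both, with assumed filenames and illustrative (not recommended) settings:

    import subprocess

    # Rewrap: change the package, leave the payload alone. Lossless and fast.
    subprocess.run(["ffmpeg", "-i", "in.mkv", "-c", "copy", "out.mp4"],
                   check=True)

    # Transcode: decode and re-encode the payload. Generational loss, and slow.
    # CRF 20 is just an example quality level, not a recommendation.
    subprocess.run(["ffmpeg", "-i", "in.mkv",
                    "-c:v", "libx264", "-crf", "20",
                    "-c:a", "aac", "-b:a", "160k",
                    "out-transcoded.mp4"], check=True)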

2) Graininess has nothing to do with 1080p vs 720p, right? I purchased Mission Impossible 3 in 1080p on iTunes and it looked fantastic, but it has a "grainy" look. Is that because it was shot on film rather than digitally?
1080p vs. 720p will typically have nothing to do with graininess, assuming the encoding is done with a similar level of care within a similar bit budget. But if the original was a grainy print, 720p, by virtue of its lower resolution, will mask that particular artifact better than 1080p or 1080i will.

Film vs. digital is difficult to compare because they are different processes, but neither has a shortcoming that makes the other categorically better. Film can be done in a way that produces excellent masters, and so can digital.

If everything else is held equal (and I can't stress that qualifier enough), 1080i, 1080p, and 720p are perceptually very nearly equivalent. No one in a double-blind test has ever been able to say "oh, that video is 1080i, that other one is 720p, and that third one is 1080p" just from watching playback; the differences are very subtle, and even trained eyes can't come close to doing that.

Generally speaking, 720p has fewer motion artifacts thanks to its higher frame rate and lack of interlace error, while 1080i has fewer static artifacts thanks to its higher resolution. 1080p24 has 1080i's resolution with no interlace error. That would seem to make it better than 1080i, especially when pulldown is involved, but the slower frame rate of 1080p24 is far inferior to 720p60, so it has far more motion artifacting than 720p60, and more than 1080i. You need 1080p60 (which at present is really only available to consumers in gaming) to reap all of the separate, slightly different benefits of 720p60, 1080i30, and 1080p24. "1080p" has sadly become more marketing hype than anything else, and it is all confused by the fact that we buy "1080p" TVs, a panel spec that is completely unrelated to the 1080p display format.

3) Does something shot digitally, like Avatar, benefit more from 1080p (or even a huge MKV) than an older film like Back to the Future, where the source is grainy and already looks fine at 720p?
While the sheer number of variables means there is no empirically correct answer to that exact question, it touches on one of the basic rules of encoding/transcoding/compressing: the better the original, the more you can compress it and still get a similar end result compared with a bad original. So the answer is a heavily qualified yes.

Another basic rule is not to use more algorithm than you need. If a film's telecine transfer is not all that sharp, it really does not make sense to use the higher-resolution target format (1080), because the lower-res format (720) will give the same or even a better result in a smaller file, with the other benefits 720 has over 1080, such as no interlace error and no judder/pulldown issues. A sketch of that idea follows below.
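Here is what "less algorithm" looks like in practice (ffmpeg assumed; the filter and quality settings are illustrative only):

    # Sketch: target 720p when the source doesn't justify 1080p.
    # scale=-2:720 sets the height to 720 and picks an even width that
    # preserves the aspect ratio.
    import subprocess

    subprocess.run(["ffmpeg", "-i", "soft-master.mkv",
                    "-vf", "scale=-2:720",
                    "-c:v", "libx264", "-crf", "20",
                    "-c:a", "copy",   # assumes the audio codec suits the MP4 container
                    "movie-720p.mp4"], check=True)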

As a professional compressionist, I have to understand keenly the particular benefits of all the different tools at my disposal, and know how to choose the set of tools that fits both the condition of the original and the constraints of the target format in each instance. Consumers dipping their toes into this sort of thing need the same level of understanding to achieve pro results.

...my Beauty and the Beast MKV was 20GB, so Subler would just make a 20GB M4V? Why would I want a file that big rather than a 3.6GB Handbrake 1080p encode?
File sizes only indicate quality when comparing similar codecs; the quality of an MPEG-2 file may match that of an MPEG-4 file while the MPEG-4 file is 30-50% smaller at that same quality level. But strictly speaking, changing the file wrapper (MKV to M4V) without changing the codec or compression level preserves the quality exactly. About the only reason to do it is that the target player is incompatible with the original wrapper format.

You may want a 20 GB file simply because you want better quality, but that may not apply if the original is not that good in the first place, or if the differences between the particular source and destination codecs make that quality level impractical at a particular target file size. The situation always dictates the parameters, and knowledge of the tools is what gets the best result with the least compromise. You may instead want the 3.6 GB Handbrake rip just for portability: you can fit more movies on your iPad for that trip to NYC if the files are smaller. Each case is unique, and if you know what you are doing, you will not have to give up much quality.
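The size gap itself is just bitrate arithmetic. A quick worked example using the two files mentioned above, assuming a two-hour runtime for illustration:

    # Average bitrate = file size / duration.
    def avg_mbps(size_gb, hours):
        return size_gb * 8000 / (hours * 3600)  # decimal GB -> megabits per second

    print(f"{avg_mbps(20.0, 2.0):.1f} Mbps")  # 20 GB remux:          ~22.2 Mbps
    print(f"{avg_mbps(3.6, 2.0):.1f} Mbps")   # 3.6 GB Handbrake rip:  ~4.0 Mbps

Whether that roughly five-fold bitrate gap is visible depends on everything above: codec efficiency, source quality, and the screen.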

Experiment. Find what works for your parameters by trying different things; that can leapfrog a steep learning curve if you are really only interested in one particular task, such as rips for an iPad. The more you understand, the more targeted the experimentation can be, but you are limited only by time and motivation.
 

TyroneShoes2

macrumors regular
Aug 17, 2011
133
3
It all depends on the Blu-ray source material and the TV you're viewing it on. Since it's taking you 9 hours to convert, I'd suggest sticking with 720p. You won't lose much. The jump from 480p (DVDs) to 720p is dramatic; the jump from 720p to 1080p is much less so.

Personally, I rip my Blu-rays and convert them at 1080p. However, I have a Core i7 iMac and can do this in a reasonable amount of time. If I can simply remux using Subler (when the resulting MKV's video is H.264), I can rip and convert an entire Blu-ray in well under an hour (MakeMKV = 20 minutes, Subler = 15 minutes). For all other video types I use Handbrake, where a 1080p conversion usually takes about as long as the movie itself (i.e., a 2-hour movie takes about 2 hours to convert).
These are great recommendations.

Just as an FYI for everyone: even pro server setups have difficulty transcoding HD video quickly. To get the best results you need a separate server to host the SQL database and a state-of-the-art server platform like a Dell R710 with 16 Xeon cores, just for starters, and even that might not run faster than real time. The fastest systems, which can do broadcast-quality HD transcoding at 2.5 to 2.8 times real time, have only been available since about NAB this year, and they require a proprietary server that uses OpenCL to harness the available CPUs and GPUs together efficiently enough to approach those speeds.

Pro dedicated transcoding systems just a few years old that worked fine for SD can take up to 6 times real time to transcode even a highly compressed HD program. Simply put, the demands of everyday professional transcoding have greatly outpaced the ability of hardware and software to keep up since the move to HD. It will take a few years and a couple more generations of CPUs to get practical, acceptable results.
 

HobeSoundDarryl

macrumors G5
Generally speaking, 720p has fewer motion artifacts thanks to its higher frame rate and lack of interlace error, while 1080i has fewer static artifacts thanks to its higher resolution. 1080p24 has 1080i's resolution with no interlace error. That would seem to make it better than 1080i, especially when pulldown is involved, but the slower frame rate of 1080p24 is far inferior to 720p60, so it has far more motion artifacting than 720p60, and more than 1080i. You need 1080p60 (which at present is really only available to consumers in gaming) to reap all of the separate, slightly different benefits of 720p60, 1080i30, and 1080p24. "1080p" has sadly become more marketing hype than anything else, and it is all confused by the fact that we buy "1080p" TVs, a panel spec that is completely unrelated to the 1080p display format.

You provided so much good information I almost hate to say anything. BUT it is important to note that while the above is true, it implies something that doesn't mesh with the :apple:TV3. There is no 720p60 on any :apple:TV; it is 720p30. So there are no "fewer artifacts due to its higher frame rate," because 720p30 and 1080p30 are the same frame rate.

The HD standard definitely supports your comments (there is a 720p60 format), but Apple hasn't implemented it in any :apple:TV (I wish Apple would make it play back native 60fps and native 24fps, at both 720p and 1080p... maybe :apple:TV5 or 6?). So if a head-to-head is being made between 720p and 1080p, frame rate won't be a differentiator if all other factors are consistently focused on delivering the best picture.

And OP, just in case your own head-to-head was an animated film ("Beauty and the Beast"), you may want to try something other than animation. Animation can look good without really pushing the HD standards.

And you may want to try the :apple:TV2 preset for 720p and the "High Profile" preset for the :apple:TV3. That way you'll really be pushing the hardware's capabilities. If you have something with a lot of dark scenes (lots of blacks), convert that film and have a look.
 

currahee2100

macrumors regular
Feb 9, 2009
182
74
Ignoring what everyone else said, because a 720p vs. 1080p answer doesn't need a whole book.

Bottom line: you will notice it more the bigger the screen is and the closer you sit. The further away you are and the smaller the screen, the less it matters.

It's not device-specific. You can easily test it by borrowing a friend's Xbox and switching the output between 1080 and 720, if it has that option, or by hooking a computer up to your screen and comparing 1080 (1920x1080) with 720 (1280x720).

Technically you should be able to tell the difference, because a 1080p picture holds more detail, but again, the further away you are, the less you'll notice anything specific.
 

TyroneShoes2

macrumors regular
Aug 17, 2011
133
3
You provided so much good information I almost hate to say anything. BUT it is important to note that while the above is true, it implies something that doesn't mesh with the :apple:TV3. There is no 720p60 on any :apple:TV; it is 720p30. So there are no "fewer artifacts due to its higher frame rate," because 720p30 and 1080p30 are the same frame rate.

The HD standard definitely supports your comments (there is a 720p60 format), but Apple hasn't implemented it in any :apple:TV (I wish Apple would make it play back native 60fps and native 24fps, at both 720p and 1080p... maybe :apple:TV5 or 6?). So if a head-to-head is being made between 720p and 1080p, frame rate won't be a differentiator if all other factors are consistently focused on delivering the best picture...
If you read carefully, you will see no reference to Apple TV anywhere in my posts in this thread. My discussion of the merits of the different formats referred to the broadcast formats, and briefly to consumer rips from and within those formats, all as examples in a quest to help the OP understand the larger questions.

The two common broadcast formats are 1080i30 and 720p60. The actual frame rate of 1080i is 29.97; of 720p, 59.94. Unless specifically noted otherwise, 720p refers to the broadcast standard 720p60 and 1080i to the broadcast standard 1080i30; they always have, and probably always will. There are other, less-implemented flavors, and while they may be increasingly common, in the professional world 720p and 1080i mean the broadcast standards. Since I am from that world, I will freely refer to them as 720p and 1080i and expect most people to understand what I am talking about. I feel no need to qualify everything to the nth degree.

There is indeed an inferior, non-broadcast version of 720, which is 720p30. Certain internet delivery options probably opt for it, at the obvious penalty of doubling the motion artifacts, so I suppose it is germane to this discussion, but it is not what I was speaking of. To get 720p30 from a 720p60 source, half of the visual information available in the broadcast standard is simply thrown away, for starters. If that is what Apple is doing, it explains why 720p is inferior on Apple TV: it's not true 720p, only half of it, or less. It's 720p Lite. But I get broadcast quality OTA and from DirecTV, so it is somewhat out of sight, out of mind to be reminded that there are inferior ways to view TV programs.

My TV can take video at 30 or 60 fps and generate unique intermediate frames from the previous and next frames, effectively raising the unique frame rate to 120, cutting motion artifacts in half or even to a quarter of what they were as broadcast, and removing all pulldown judder. I consider this a significant step toward more accurate televised images, and I use the feature all the time. Some people are so used to pulldown artifacts, having seen them and nothing else their entire lives, that they are actually disturbed when those artifacts are finally removed; they just can't make the transition yet.

But taking something and throwing half of it away, which effectively doubles the motion artifacts, seems like a significant step in exactly the wrong direction. The last time there was this sort of travesty was 30 years ago, when VHS unveiled its brand-spanking-new feature, SLP. The defense rests.

That is one of the many reasons that AppleTV and the rest are not nearly ready for prime time, and why Steve considered it merely a frivolous side project.
 

HobeSoundDarryl

macrumors G5
TyroneShoes, again, no problem with the facts of what you've posted; it's the context that's the issue. The OP of this thread is not asking about this debate relative to the broadcast standard. He's asking relative to what an :apple:TV can send to his HDTV.

And the presets he's using for his testing are geared to producing an :apple:TV file, which will NOT be 60fps. So even if he reads your comments and concludes he should do everything in 720p to reduce motion artifacts, the route he's planning is not compatible with what you are saying (he will not end up with 60fps 720p that he can play on his :apple:TV3).

The OP is obviously new to this and is trying to make a decision with a fair amount of ramifications for his purposes (if he is going to convert, say, 100 or 300 films, this is a big, time-consuming decision). So it's important that he gets input relevant to his stated purpose, which is not broadcast 720p60 vs. broadcast 1080i30 but :apple:TV3 720p30 vs. :apple:TV3 1080p30.

In that matchup the frame rates are identical, so motion artifacting is a non-issue for his purposes. What you posted is true, but that part of your comments is not relevant to his question. You are taking a purist approach to 720p vs. 1080i, while his options are constrained by what the :apple:TV can manage. The :apple:TV's incarnations of 720p and 1080p differ solely in pixel detail, with the latter having more.

OP, if you try a few more movies and still can't see a difference, then the primary benefit of 720p over 1080p is smaller file sizes. To that I'd offer two observations:
  1. Big storage is cheap. Are file sizes really that big of an issue? One fat external drive can hold an awful lot of 1080p files processed through Handbrake (see the quick math below).
  2. Consider your future. Just because you may not see much of a difference on your current HDTV, you might see more of a difference on your next one.
Especially with the latter, it will be a pain to go back and re-encode all of your movies if you later get the kind of screen on which the difference is easy to see.
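To put rough numbers on observation 1 (the drive size is assumed; the file sizes are the two examples from this thread):

    # How many movies fit on one hypothetical 3 TB external drive?
    drive_gb = 3000
    print(drive_gb // 20)       # ~150 straight 20 GB Blu-ray remuxes
    print(int(drive_gb / 3.6))  # ~833 Handbrake 1080p rips at 3.6 GB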
 

Irishman

macrumors 68040
Nov 2, 2006
3,388
842
Ignoring what everyone else said, because a 720p vs. 1080p answer doesn't need a whole book.

Bottom line: you will notice it more the bigger the screen is and the closer you sit. The further away you are and the smaller the screen, the less it matters.

It's not device-specific. You can easily test it by borrowing a friend's Xbox and switching the output between 1080 and 720, if it has that option, or by hooking a computer up to your screen and comparing 1080 (1920x1080) with 720 (1280x720).

Technically you should be able to tell the difference, because a 1080p picture holds more detail, but again, the further away you are, the less you'll notice anything specific.

+1,000,000,000
 

Navdakilla

macrumors 65816
Feb 3, 2011
1,100
13
Canada
Personally, I can see a little bit of difference, but not enough to re-encode my 400 Blu-rays from 720p (from the ATV 2 days) to 1080p. I am re-encoding my favorites and the best ones, but not every one of them.

I'm also planning on getting a 720p projector (Optoma GT750), and from what I hear you can't tell the difference between 720p and 1080p on a 120" screen from 12 feet away. That's good enough for me (I'm on a budget and want 3D).
 

Ratatapa

macrumors 6502a
Apr 3, 2011
665
25
The answer is simple:

When your girlfriend changes something about her appearance (whatever it is), do you notice it?

If yes, keep 1080p; if not, keep 720p.


Problem solved
 