Once again, stop. A GTX 680 can game at the resolution of the retina MB. The HD7970 can do it as well. Hell, their smaller brothers, the 670 and 7950, can also do it. You do not need an SLI'd Titan or a CrossFire'd 7990.

If physics doesn't allow a good GPU, then don't market it as a "Pro" Retina product. Simply don't.

I would not call 20-35 fps "gaming". I call it suffering. I can't "stop" once again, as this would be my first instance of "stopping".
What parts would you have included then to make it "pro"? Keeping in mind the TDP limits set for the chassis, and indeed for all PCs that are >1" thick? Even after that, everyone would just start posting about how stupid hot it gets and "is it safe?" etc...
 
I would not call 20-35 fps "gaming". I call it suffering. I can't "stop" once again, as this would be my first instance of "stopping".
What parts would you have included then to make it "pro"? Keeping in mind the TDP limits set for the chassis, and indeed for all PCs that are >1" thick? Even after that, everyone would just start posting about how stupid hot it gets and "is it safe?" etc...

Your human eye can only distinguish 24 fps to make it "smooth". Why do you need 35 or more? Ever gone to the movies? All in 24 fps.

If you can't include said parts, then don't call it "Pro". Simple.

Also, here is a link to AnandTech's comparison of the GTX 680 with the HD7970. Both GPUs I mentioned.

Click Me

None of the benchies on that link (and that link covers all benchmarks) are below 30 fps, which for the vast majority of gamers is acceptable, although 40 fps is desired.
 
Your human eye can only distinguish 24 fps to make it "smooth". Why do you need 35 or more? Ever gone to the movies? All in 24 fps.

If you can't include said parts, then don't call it "Pro". Simple.

Ok, this shows you are just trying to provoke... That, or you really haven't got a clue.

You should quite easily be able to distinguish that movement is not smooth with 24 fps. That is the reason why HD material is usually 50 or 60 fps, and so are most movies in cinemas. Today 24 fps is mostly used as an effect, to get a "cinematic" look. Or backwards compatibility with old footage. Which is great, as the low frame rate really limited how fast you could move the camera or objects in front of it without breaking the illusion of smooth movement.
 
Ok, this shows you are just trying to provoke... That, or you really haven't got a clue.

You should quite easily be able to distinguish that movement is not smooth with 24 fps. That is the reason why HD material is usually 50 or 60 fps, and so are most movies in cinemas. Today 24 fps is mostly used as an effect, to get a "cinematic" look. Or backwards compatibility with old footage. Which is great, as the low frame rate really limited how fast you could move the camera or objects in front of it without breaking the illusion of smooth movement.

Go read on why Peter Jackson had it hard from fans. Then come back.
 
The main reason an average 30 FPS is still considered barebones in gaming isn't because it's so dramatically worse than 45+, it's because averaging at less than 40 or so implies that, at times, your game is dipping even below 30. Averaging 60 FPS certainly helps guarantee fluidity for your eyes, but it's much more important that it means you're not going to spike down to 15 or 20 during busy moments.
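To put rough, invented numbers on that (purely illustrative, not from any benchmark), here is a tiny Python sketch of how a "30 fps average" can hide individual frames that dip well below 20 fps:

# Illustrative frame times in milliseconds for one second of gameplay.
# The values are made up; the point is only the arithmetic.
frame_times_ms = [25] * 24 + [60] * 4 + [80] * 2   # mostly smooth, a few heavy frames

total_seconds = sum(frame_times_ms) / 1000.0
average_fps = len(frame_times_ms) / total_seconds
worst_fps = 1000.0 / max(frame_times_ms)

print(f"average: {average_fps:.0f} fps, worst frame: {worst_fps:.1f} fps")
# Prints roughly "average: 30 fps, worst frame: 12.5 fps" -- the average looks
# playable, the worst moments do not.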
 
Please try to make sense. Or provide a link that will help you make sense.

There is enough sense in that post. If you don't understand it, then stop arguing.

----------

The main reason an average 30 FPS is still considered barebones in gaming isn't because it's so dramatically worse than 45+, it's because averaging at less than 40 or so implies that, at times, your game is dipping even below 30. Averaging 60 FPS certainly helps guarantee fluidity for your eyes, but it's much more important that it means you're not going to spike down to 15 or 20 during busy moments.

Yes, this is true, but that is why you have minimum frame rates, which on average sit right around the 24 fps mark.
 
The main reason an average 30 FPS is still considered barebones in gaming isn't because it's so dramatically worse than 45+, it's because averaging at less than 40 or so implies that, at times, your game is dipping even below 30. Averaging 60 FPS certainly helps guarantee fluidity for your eyes, but it's much more important that it means you're not going to spike down to 15 or 20 during busy moments.

True. And with movies it would be the same: some scenes require a higher fps to preserve the illusion of movement, but most scenes don't require it. But variable framerates are somewhat problematic with movies, which is why they aren't really comparable to games.


Go read on why Peter Jackson had it hard from fans. Then come back.

I really do not understand what you are referring to. Please provide a link, or stop arguing.

And I'm really intrigued: what does some celebrity gossip about Peter Jackson have to do with, well, anything? Especially with MacBook Pros not being professional enough computers for you, even when they actually are quite widely used professionally? Could you clarify what would constitute a professional computer in your opinion? I don't see the connection to any of the strange points you try to make... But you seem to enjoy arguing with everyone, so have your fun, I won't stand in your way ;)
 
Your human eye can only distinguish 24 fps to make it "smooth". Why do you need 35 or more? Ever gone to the movies? All in 24 fps.

Only if you have proper motion blur in your video. Disjointed animations at 24 fps will look rather jerky. While it should be possible to make a game that looks smooth at 24 fps, it would probably be more computationally intensive than rendering the same game at 60 fps - and surely much more complicated algorithmically. One thing you could do is render a number of subframes and blend them together. Or use some other funky technique, like rendering a fast-moving object multiple times at different animation stages within a single frame.
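As a rough sketch of that subframe-blending idea (an accumulation-buffer style of motion blur; the NumPy code and the toy render_subframe function below are purely illustrative, not any engine's actual API):

import numpy as np

def motion_blurred_frame(render_subframe, t_start, t_end, n_subframes=8):
    # Average several subframes rendered at intermediate animation times
    # within one output frame (a simple accumulation buffer).
    times = np.linspace(t_start, t_end, n_subframes, endpoint=False)
    acc = np.zeros_like(render_subframe(t_start), dtype=np.float64)
    for t in times:
        acc += render_subframe(t)
    return acc / n_subframes

# Toy usage: a bright vertical line sweeping across a tiny image during one 1/24 s frame.
def render_subframe(t):
    img = np.zeros((4, 64, 3))
    x = int(t * 24 * 63) % 64     # position depends on the animation time
    img[:, x, :] = 1.0
    return img

blurred = motion_blurred_frame(render_subframe, t_start=0.0, t_end=1.0 / 24.0)
# The line is now smeared over every pixel it crossed during the frame instead
# of snapping between positions -- but rendering 8 subframes per output frame
# costs roughly as much as rendering the game at 8 x 24 fps, which is the
# "more expensive than just running at 60 fps" part of the argument.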
 
Your human eye can only distinguish 24 fps to make it "smooth". Why do you need 35 or more? Ever gone to the movies? All in 24 fps.

If you can't include said parts, then don't call it "Pro". Simple.

Also, here is a link to AnandTech's comparison of the GTX 680 with the HD7970. Both GPUs I mentioned.

As others have pointed out, you are either just trying to start an unintelligible fight or you don't know tech past your elbow. In no way can you compare a movie at 24 fps to a video game for fluidity. I guess you can, but you come off uninformed.
Show me a laptop with a 7970 or GTX 680 in it? Otherwise it is a tremendously moot point to argue against a rMBP. NO laptops have those desktop GPUs in them. Apparently you're the guy with the $4500.00, 4"-thick mobile LAN party box with the 15 min battery life that gets bested by a $1200.00 desktop. You are arguing two separate worlds schizophrenically.
 
True. And with movies it would be the same: some scenes require a higher fps to preserve the illusion of movement, but most scenes don't require it. But variable framerates are somewhat problematic with movies, which is why they aren't really comparable to games.




I really do not understand what you are referring to. Please provide a link, or stop arguing.

And I'm really intrigued: what does some celebrity gossip about Peter Jackson have to do with, well, anything? Especially with MacBook Pros not being professional enough computers for you, even when they actually are quite widely used professionally? Could you clarify what would constitute a professional computer in your opinion? I don't see the connection to any of the strange points you try to make... But you seem to enjoy arguing with everyone, so have your fun, I won't stand in your way ;)


You stated that movies are all 50-60 fps. They are all shot for 24 fps cinemas. Peter Jackson had it rough since The Hobbit was going to be shot at double the fps (48 fps). Therefore, cinema goers were going to see soap-opera-style motion, which was going to "ruin" the movie. He still shot the movie at 48 fps and displayed it that way in many cinemas, although, just to point out, this was due to the 3D effect rendering. Yet the 48 fps did stick for regular 2D film.

There you have it. Movies run at 24 fps and not higher, since in a movie there is no "processing" going on at the projector, just a reel of stills running past a lamp.

Only if you have proper motion blur in your video. Disjointed animations at 24 fps will look rather jerky. While it should be possible to make a game that looks smooth at 24 fps, it would probably be more computationally intensive than rendering the same game at 60 fps - and surely much more complicated algorithmically. One thing you could do is render a number of subframes and blend them together. Or use some other funky technique, like rendering a fast-moving object multiple times at different animation stages within a single frame.

True. But still, calling 24 fps (the minimum frame rate for many modern GPUs) unplayable is insulting.

As others have pointed out, you are either just trying to start an unintelligible fight or you don't know tech past your elbow. In no way can you compare a movie at 24 fps to a video game for fluidity. I guess you can, but you come off uninformed.
Show me a laptop with a 7970 or GTX 680 in it? Otherwise it is a tremendously moot point to argue against a rMBP. NO laptops have those desktop GPUs in them. Apparently you're the guy with the $4500.00, 4"-thick mobile LAN party box with the 15 min battery life that gets bested by a $1200.00 desktop. You are arguing two separate worlds schizophrenically.


There are none. And yes you are wrong. There are laptops that have had desktop class CPUs and GPUs in them. You just don't know of them. No.

What I am arguing is that the Retina shouldn't be called a "Pro" product, as its performance doesn't do justice to the name and its predecessors. When the PowerBook was around, it was known as a Pro product; same with the classic MacBook Pro before the introduction of the nVidia 9400M. They were "Pro" products due to the performance parts. Now an Intel IGP makes the cut for a "Pro" product. Laughable notion. The 13" MacBook line should be just that, MacBook, not Pro.

If you want the Retina to be called Pro, then back the needy display with equal GPU power behind it.
 
You stated that movies are all 50-60 fps. They are all shot for 24 fps cinemas. Peter Jackson had it rough since The Hobbit was going to be shot at double the fps (48 fps). Therefore, cinema goers were going to see soap-opera-style motion, which was going to "ruin" the movie. He still shot the movie at 48 fps and displayed it that way in many cinemas, although, just to point out, this was due to the 3D effect rendering. Yet the 48 fps did stick for regular 2D film.

There you have it. Movies run at 24 fps and not higher, since in a movie there is no "processing" going on at the projector, just a reel of stills running past a lamp.

Thank you. That clarifies your statement a lot.

I did not state that movies are all 50-60 fps. I said 'HD material is usually 50 or 60 fps, and so are most movies in cinemas.' Of which the latter part is not correct; I'll admit being quite off there. I'm a bit surprised to learn that only under 30% of European cinemas are digital, so prints are still the more common method of distribution here; I had thought the figures were pretty much the other way around... That means that higher frame rates are still quite uncommon in cinemas.

But your statement "movies run at 24 fps" is also not true. Prints distributed to cinemas that have 35mm projectors run at 24 fps. But movies are also shot and distributed digitally, to cinemas and to consumers alike. And those often don't run at 24 fps.

And funnily, with that Peter Jackson reference you just pretty much countered your own statement about people not being able to see more than 24 fps. This is getting ridiculous :D

Good night ;)
 
I think you're taking the brand name too seriously. My last MBP, a 2007 pre-unibody, had its graphics chip fail after 3 years of daily use. But that was a known issue with the Nvidia 8600, and Apple replaced it with no questions asked. It's still working after 6 years of ownership and continued use.

I don't know how long my 15" rMBP will last. I bought mine used from an owner who had used it for 6 months before switching to the smaller 13" rMBP.

And after 6 months, the LG screen shows no sign of IR. I don't go about testing it every week (I did it a few weeks ago when I came across this forum), but in my normal everyday use I haven't noticed anything peculiar. I will be vigilant but not obsessed.

Well, when you pay nearly $2000-3000 for a computer, you expect it to be near perfect.
I returned mine due to blurry 720p video, most websites not supporting the Retina display, etc.
In fact, mine had a faulty screen.
On the other hand, my 2011 MBP shows no sign of problems whatsoever.
 
There are none. And yes you are wrong. There are laptops that have had desktop class CPUs and GPUs in them. You just don't know of them. No.

Wow. Another instant contradiction. Is English the problem?

True. But still, calling 24 fps (the minimum frame rate for many modern GPUs) unplayable is insulting.

It's not unplayable. It's unenjoyable.
I suggest you take your "knowledge" over to AnandTech, Ars Technica, or HardForum and enlighten them with your pearls of wisdom. Some place that has real gaming tech heads (i.e. not Mac forums, sorry Mac people). 60 fps and even 120 fps are the desired targets, generally to go in lockstep with display refresh rates. Better yet, ask a Counter-Strike player if they enjoy gaming at 24-35 fps.
The human eye has a static contrast ratio of about 100:1. Using your logic of 24 fps perception, there would be no reason at all for a display to have anything more than this; anything more would be unnecessary and lost in experience. But this is not the case at all, because of the dynamic nature of the eye and its constant adjustments. Same for gaming: dynamic content requires much more than something static like a movie. And I would rather have 1000:1 static contrast over 100:1, but maybe that is just me.
 
Wow. Another instant contradiction. Is English the problem?



It's not unplayable. It's unenjoyable.
I suggest you take your "knowledge" over to AnandTech, Ars Technica, or HardForum and enlighten them with your pearls of wisdom. Some place that has real gaming tech heads (i.e. not Mac forums, sorry Mac people). 60 fps and even 120 fps are the desired targets, generally to go in lockstep with display refresh rates. Better yet, ask a Counter-Strike player if they enjoy gaming at 24-35 fps.
The human eye has a static contrast ratio of about 100:1. Using your logic of 24 fps perception, there would be no reason at all for a display to have anything more than this; anything more would be unnecessary and lost in experience. But this is not the case at all, because of the dynamic nature of the eye and its constant adjustments. Same for gaming: dynamic content requires much more than something static like a movie. And I would rather have 1000:1 static contrast over 100:1, but maybe that is just me.

My posts aren't confusing, but posting from an iPhone does have its quirks and kinks. Anyway, yes, there are laptops with desktop-class GPUs and CPUs (I believe Alienware made them at one point). Just because you don't know of them doesn't mean they don't exist.


Really? You are now arguing "enjoyable"? This isn't about being enjoyable. I stated it was playable; I didn't mention enjoyable. That is a wholly subjective matter, which, like you so gleefully stated, depends on the user (I enjoy a game at 40 fps and think 100 fps is downright ridiculous, but others will disagree). Also, that 24 fps was the minimum the quoted GPUs do. In other words, you will see 24 fps in heavy scenery. However, those GPUs are very well capable of higher (even into the 100s) fps. I believe the GTX 680 has a benchmark in there of 102 fps at 2560x1600. Talk about muscle.
 