Status
Not open for further replies.
So, am I the first one on the 500th page? ;) As I see it, this thread has the most views and replies on the MacRumors forum. Apple, I think we deserve an update on the 22nd since we are talking so much about it.
 
:rolleyes: Spoken like a true novice. I hope you didn't spill any kool-aid on your keyboard as you wrote that...

Thank you for your convincing counterargument.

Perhaps you could, please, use smaller words in explaining to me: for a computer that caters to people who don't want to bother with upgrading RAM or storage or replacing a battery, who don't need an optical drive or a built-in Ethernet port ... why would Apple decide against a high-performance integrated graphics solution that gives everybody better battery life, and instead use a dGPU that will benefit a few Photoshop plugins and a few people who want to play Skyrim?
 
Excuse me please, what is that "openCL", that is mentioned so often in this thread?

It is an open standard for a toolkit that allows developers to execute general-purpose algorithms (anything other than rendering) on GPUs. It leverages a programming model that emphasizes data parallelism, that is, the massively parallel execution of a single instruction stream independently on a vast number of similar data items. CPUs, on the other hand, execute a single instruction stream on single data items, although they also have comparatively limited support for data parallelism through vector (SIMD) operations.

Since GPUs have massive raw computation capabilities, many algorithms can be significantly improved by implementing them on GPUs. However, the programming model is comparatively limited. Not everything can be done on a GPU efficiently. Stuff like image processing, which operates on large arrays of data (in this case pixels), can usually be done well. Final Cut Pro X is one application that uses OpenCL to accelerate some operations.

CUDA is a proprietary but very similar alternative from nVidia.
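To make the data-parallel idea above concrete, here is a toy sketch in plain Python — not actual OpenCL (which requires a GPU driver and a C-like kernel language), just an illustration of the model. The "kernel" function and the pixel values are made up for the example; on a real GPU, thousands of hardware threads would each run one work-item instead of `map()` walking the list.

```python
# A toy illustration of the data-parallel model OpenCL uses: the same
# "kernel" function is applied independently to every data item.

def brighten_kernel(pixel, gain=1.5):
    """One work-item: runs independently for each pixel, clamped to 255."""
    return min(pixel * gain, 255.0)

pixels = [0, 64, 128, 255]          # a tiny stand-in "image"

# Sequential CPU-style execution: one item after another.
sequential = [brighten_kernel(p) for p in pixels]

# Data-parallel style: conceptually, all items processed at once.
parallel = list(map(brighten_kernel, pixels))

assert sequential == parallel == [0.0, 96.0, 192.0, 255.0]
```

The point of the model: because each work-item is independent (no item reads another's result), the work can be spread across as many GPU cores as exist — which is why image processing, as mentioned above, maps so well to GPUs.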
 
Thank you for your convincing counterargument.

Perhaps you could, please, use smaller words in explaining to me: for a computer that caters to people who don't want to bother with upgrading RAM or storage or replacing a battery, who don't need an optical drive or a built-in Ethernet port ... why would Apple decide against a high-performance integrated graphics solution that gives everybody better battery life, and instead use a dGPU that will benefit a few Photoshop plugins and a few people who want to play Skyrim?

Because the newly updated MacBook Air caters to the target audience who require only an "integrated graphics solution that gives everybody better battery life." It has the long 12-hour battery life that you speak of, Intel HD Graphics 5000, and to top it off it's extremely light and thin.

Counter to what you claim, the "MacBook Pro" is designed for more intensive computer users. However, I can understand your misunderstanding, for I often see many teenagers using MacBook Pros simply to browse Facebook. On the other hand, the "MacBook Pro" has never actually been the most powerful laptop. Instead, it is a sleek, elegant, powerful, and portable machine for those who use intensive applications [not games]. I myself am a graphic designer and have been saving up for the higher-end model.

I hope that this response uses the 'smaller words' you speak of and is sufficiently subjective. :D
 
Oh, not this argument again, where people try to decide who qualifies as "pro" and who doesn't.

Iris Pro handles CUDA just fine - I believe the benchmarks showed it doing better than the GeForce 650M, in fact.

Ohh you silly, silly goose.



On the topic at hand: there are drawbacks and improvements in both configurations. The fact remains, though, that a dGPU is better to have in many situations.
Iris Pro comes out on top in a few, though, and therein lies the issue.
It's hard to know which road Apple will choose, but most would agree that offering a dGPU as a BTO option is a great choice, since that satisfies both camps.
 
Hey guys, I'm about to buy my first Mac, which will probably be the 13-inch, and I just sold my 4th-gen iPad. Is it worth picking up an iPad as well? Do you guys currently own both? I'm just curious to see whether you ditched the iPad once you had a MacBook.
 
Same here. I hope it shows before 2016 though. :eek:
Don't you mean before 2015?

The CUDA argument, detestable or not, is still a valid one. CUDA is far more mature than OpenCL; however, given Intel's embrace of OpenCL on its CPUs, I predict OpenCL will eventually beat out CUDA. For now, though, as a 3D animator who needs quick renders and high-count fluid and particle simulations, there is no substitute for CUDA. On the flip side, not many people need this; while valid, the advantages of CUDA would probably be utilized by <1% of buyers.

I know it is a valid argument, but from what I've read (and don't quote me on this), isn't CUDA only used by a select number of programs, and isn't OpenCL supported by many of those programs as well?

:rolleyes: Spoken like a true novice. I hope you didn't spill any kool-aid on your keyboard as you wrote that...

That post was unnecessary and derogatory, based upon someone being misinformed. What bearing does that have on whether he spilled some sugary drink on his keyboard?
 
Hey guys, I'm about to buy my first Mac, which will probably be the 13-inch, and I just sold my 4th-gen iPad. Is it worth picking up an iPad as well? Do you guys currently own both? I'm just curious to see whether you ditched the iPad once you had a MacBook.

From experience, the increase in productivity or satisfaction from owning both a MacBook and an iPad is meager compared to simply owning a MacBook. Flash, as you know, does not work on the iPad [unless you use those workarounds, which are a pain]. For me, the only actual benefit was when I was in bed and wanted to read the news but didn't want my MacBook in bed [it becomes warm and is much heavier]. Therefore, I sold my iPad but kept my work-supplied iPhone. While the screen on the iPad made for more pleasurable reading, I'm still able to do the same with my iPhone.

In conclusion, I advise you not to buy an iPad alongside a MacBook Pro, and to invest in an iPhone if you do not already have one [but wait for the 6].
 
It is an open standard for a toolkit that allows developers to execute general-purpose algorithms (anything other than rendering) on GPUs. It leverages a programming model that emphasizes data parallelism, that is, the massively parallel execution of a single instruction stream independently on a vast number of similar data items. CPUs, on the other hand, execute a single instruction stream on single data items, although they also have comparatively limited support for data parallelism through vector (SIMD) operations.

Since GPUs have massive raw computation capabilities, many algorithms can be significantly improved by implementing them on GPUs. However, the programming model is comparatively limited. Not everything can be done on a GPU efficiently. Stuff like image processing, which operates on large arrays of data (in this case pixels), can usually be done well. Final Cut Pro X is one application that uses OpenCL to accelerate some operations.

CUDA is a proprietary but very similar alternative from nVidia.

Wow, thank you for explaining so thoroughly. Although I think I now have more questions than I started with... I got some of it, but not in detail.
I'll have to do some reading on a tech site for dummies.
Thanks a lot anyway!

So what does that mean for my decision about the MacBook Pro if I want to run mainly Lightroom and some Photoshop?
 
Thank you for your convincing counterargument.

Perhaps you could, please, use smaller words in explaining to me: for a computer that caters to people who don't want to bother with upgrading RAM or storage or replacing a battery, who don't need an optical drive or a built-in Ethernet port ... why would Apple decide against a high-performance integrated graphics solution that gives everybody better battery life, and instead use a dGPU that will benefit a few Photoshop plugins and a few people who want to play Skyrim?

Apparently his words weren't small enough... a dGPU benefits a lot more than a few Photoshop plugins and video games. We have been through this already. Come on now, chief. XD
 
Hey guys, I'm about to buy my first Mac, which will probably be the 13-inch, and I just sold my 4th-gen iPad. Is it worth picking up an iPad as well? Do you guys currently own both? I'm just curious to see whether you ditched the iPad once you had a MacBook.
I ditched my iPad and stuck with a 13-inch MacBook Pro. It really depends on how extensively you use the internet and certain applications, but I found my MacBook Pro had far more convenient uses than my iPad, although the iPad was obviously better for browsing the web in bed and whatnot. If you're in college, the MBP is easily the better choice. If not, it's up to your own preferences and intended uses. I still have a Galaxy tablet that gets a decent amount of use when I'm too lazy to use/turn on the MacBook.
 
You missed my point -- whether VMware or Parallels, there's no native support in OSX for running Windows programs

As far as I understand, VMware gives you the possibility to load a virtual machine but keep it in the background, such that you don't see the Windows desktop but just the program you are using.
This does not mean that they have implemented native support in OS X for Windows programs.
All Windows program icons (in the Dock, Exposé, etc.) are simply links to a virtual machine (which runs under OS X, and is thus natively supported in the Dock, Exposé, Mission Control, etc.) that, when loaded, shows you just the Windows program in a window.

In any case, I will test it as soon as my rMBP arrives, and I will report my impressions if you are interested.
 
Because the newly updated MacBook Air caters to the target audience who require only an "integrated graphics solution that gives everybody better battery life." It has the long 12-hour battery life that you speak of, Intel HD Graphics 5000, and to top it off it's extremely light and thin.

Counter to what you claim, the "MacBook Pro" is designed for more intensive computer users. However, I can understand your misunderstanding, for I often see many teenagers using MacBook Pros simply to browse Facebook. On the other hand, the "MacBook Pro" has never actually been the most powerful laptop. Instead, it is a sleek, elegant, powerful, and portable machine for those who use intensive applications [not games]. I myself am a graphic designer and have been saving up for the higher-end model.

I hope that this response uses the 'smaller words' you speak of and is sufficiently subjective. :D

First of all, any portable computer will benefit from longer battery life, not just a consumer-marketed product. Second, there's a pretty big difference between Intel HD Graphics 5000 and Iris Pro 5200; I suggest you look it up. The word "Pro" after MacBook signifies that it is better than the Air, but it doesn't necessarily designate it as a professional machine. The Haswell rMBP will be all those things, even with Iris Pro instead of a dGPU. Iris Pro is pretty competitive when benchmarked against the 750M. That last sentence was pretty unnecessary, even if it is a joke.
 
I cannot wait to see the trolls' faces.

Trolls don't really care about being "right" versus "wrong". They simply say whatever they say to get a rise out of people. And, in the case of this thread, they seem to have succeeded.

This is what I'm "worried" about. I know it's difficult to predict, but what is the usual waiting time after you order a BTO MacBook?

Basically, I'll be tempted to buy on day 1, but I'd like to read a couple of reviews first. The question is: will there be a massive virtual queue to get these boys?

I think you'll be fine. Unless you order some new BTO part—e.g., a 1TB SSD—I don't see the BTO waits being significant at all. The big thing playing in your favor here is that this release is so late in the year—much later than most expected or anticipated—so it would be kind of nutty for them not to be ready to go at this point.

How will a BTO dGPU option mess up/conflict with iGPU base only units? Not to mention, everything will still have iGPU's.

Or are you saying that putting a dGPU in a machine validates that an iGPU on its own is inferior to having both? That's pretty much how it is. The new iMac with its Nvidia BTO says that as well.

Neither, really, but closer to the second. Given that the cost of Iris Pro 5200 today is about the same as a dGPU + HD 4600, the marketing campaign around the integrated graphics has to be about how they do a solid job on performance. Making a dGPU a BTO option really complicates the marketing story, especially since the types of dGPUs Apple would even consider (read: those that don't break the power bank) are merely incremental improvements over the current GT 650M. They're not going to want to suggest that base units have undergone a performance reversion, and trying to craft a story that appropriately deals with most consumers (i.e., not your usual MacRumors board poster) that says, "Hey, this iGPU is better than the 650M, but this 750M is better than the iGPU" simply is too complex.

That said, the point you raise about the iMacs is a valid one, so it's possible we'll see that outcome. I for one certainly wouldn't object to it!
 
From experience, the increase of productivity or satisfaction if you own a MacBook and a iPad is meager compared to simply owning a MacBook. Flash, as you know, does not work on the iPad [unless you use those work arounds which are a pain]. For me, the only actual benefit was when I was in bed and wanted to read the news but didn't want my MacBook in bed [it becomes warm and it much heavier]. Therefore, I sold my iPad but kept my work supplied iPhone. While the screen on the iPad facilitated more pleasurable reading, I'm still able to do the same with my iPhone.

Thus in conclusion, I advise you not to buy an iPad with a MacBook Pro and invest in an iPhone if you do not already have one [but wait for the 6].

I have an iPad now and am waiting to purchase my first MacBook. I'm also going to purchase another iPad to replace my 16GB iPad 2.

IMO, I need the iPad for portability. I travel weekly, and the iPad is so convenient on planes, waiting at airports, in hotels, or on the go.
 
What would allow for longer battery life on average?

Iris Pro 5200, or HD 4600 + dGPU (assuming you can use gfxcardstatus to disable the dGPU and increase battery life)?

Would it be the HD 4600 + dGPU combo (with the dGPU disabled)? Or would it be the Iris Pro 5200? Or would they be about equivalent?
 
It's quite simple: the current logic board has a spot for a discrete graphics card, and Apple has no reason not to include one if the performance gains are worth it.
 
The TDP of the HD 4600 and Iris Pro should be the same (both are integrated into the CPU, and both quad-core CPUs run at the same TDP).
The main point here is that the HD 4600 is way cheaper than Iris Pro.
This is the only reason to justify the combo.
But it may also be that an rMBP will carry both Iris Pro and a dGPU, taking advantage of the Iris Pro's eDRAM, which effectively acts as an L4 cache for the processor.

These two CPUs are roughly equivalent:
i7-4960HQ @ 2.6GHz, Turbo Boost 3.8GHz
i7-4900MQ @ 2.8GHz, Turbo Boost 3.8GHz
The first ships with Iris Pro, the second with the HD 4600. What I think is that Apple will let you choose between these two configurations in their upper-end CPUs for the 15".
 
I'm still on my very first MacBook Pro... a late 2008 Core 2 Duo. I'm itching for an upgrade after all this time, ugh. Can't wait for these Haswell machines, and I'll be ordering the top-end Retina the moment they're available.

Are we expecting them during the "iPad-centric" event coming up? I've been keeping out of this thread on purpose because I'm so impatient.
 
I'm still on my very first MacBook Pro... a late 2008 Core 2 Duo.
Wow, the difference from a 5-year-old machine will be massive (I guess).

I'm not only talking about the performance. Consider the non-mechanical drive (super resistant, and PCIe!), the size and weight, the quality of the screen, the battery life...
 