I think antonio is just quoting the second post of this thread... which was made over a year ago...
 
If you think that's bad, here in the UK the 17" costs £2099, which is $3384 at today's exchange rate; even after deducting UK VAT, that's about £434 more than the US price.

It's part and parcel of what we call "rip-off Britain" here. Unless you live in the UK, quit the whining about price. We all know the Macs will be expensive, but what are you going to do? Not buy them?

Anyway - OFF topic.

So, still no rumours about the MacBook Pro redesign then...? Well done, everyone.
 
Not to mention, the screen elements are intended to stay the same size.
Where do you get this Retina requirement from? I think you are confusing an iOS app-scaling issue with Apple's definition of 'Retina'.


A quick check on www.notebookcheck.net ranks the video cards.
Those Notebookcheck rankings are based on 3DMark, which is Windows-based software for gauging the 3D performance of a GPU. What interests me (and I'm acutely aware I'm in a minority) is the GPGPU performance (particularly OpenCL) of the GPU under OS X. The only metric presently available for inferring this is the listed FLOPS (and a small number of OpenCL benchmarks, which suggest Kepler has dire GPGPU performance).

I'm fully aware that the Windows gaming-performance of Kepler (GK107) is an improvement, but it has been made at the expense of GPGPU performance.

For gaming, the move to Nvidia Kepler will be a nice upgrade, but for the handful of people who use software that leverages the GPU for GPGPU work (such as Photoshop, or, in my case, OpenCL code), Kepler (at least in its GK107 form) will be a significant downgrade from the 6770M unless something at GTX 660M level or above is used (or at least available as a BTO option).

Some people will be happy with the move to Nvidia (and the increased gaming performance), and some will be upset (by the decreased GPGPU performance). Personally - being a long-term Nvidia advocate - I was eagerly awaiting the move back to Nvidia (Nvidia have traditionally been very strong on GPGPU) until the early GPGPU benchmarks appeared. Anandtech has a good review of the GK104 (the GK107 is effectively one quarter of a GK104), which shows nice gaming gains, but also the GTX 680 being bested by cheaper AMD GPUs and last-generation Fermi cards in GPGPU/compute workloads (Anandtech - Compute: What You Leave Behind?).
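For reference, the single-precision FLOPS figure listed on spec sheets is usually derived as 2 ops (one fused multiply-add) × shader count × shader clock. A quick sketch of that arithmetic; the shader counts and clocks below are illustrative assumptions, not official specs for any particular card:

```python
# Rough sketch: theoretical single-precision GFLOPS from spec-sheet numbers.
# FLOPS = 2 ops (fused multiply-add) x shader count x shader clock.
# The counts/clocks below are illustrative assumptions, not verified specs.

def peak_gflops(shaders: int, clock_mhz: float) -> float:
    """Theoretical peak single-precision GFLOPS."""
    return 2 * shaders * clock_mhz / 1000.0

# Two hypothetical mobile GPUs for comparison:
gpus = {
    "GPU A (384 shaders @ 900 MHz)": (384, 900.0),
    "GPU B (480 shaders @ 725 MHz)": (480, 725.0),
}

for name, (shaders, clock) in gpus.items():
    print(f"{name}: ~{peak_gflops(shaders, clock):.0f} GFLOPS")
```

Note this is a theoretical peak; real GPGPU throughput depends heavily on the architecture (which is exactly why the Kepler benchmarks are disappointing despite healthy paper FLOPS).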
 

I am by no means an expert on GPU science. But wouldn't the Intel HD 4000 be enough for your non-gaming GPU needs? Perhaps by combining the best of each you could get decent performance without too much compromise? (Just theorizing here)
 
For most people the HD 3000 is more than enough. For large-screen setups the HD 4000 packs a little more punch. In any case, those Intel GPUs do a very good job at 2D work.
What are you doing for which you think a fast GPGPU might be needed? CAD? Photoshop?
 

You are worried about GPGPU, which almost nothing uses? Isn't that generally a gaming thing? GPGPU was made popular by Nvidia with CUDA and PhysX, and PhysX is a gaming feature.

GPGPU is a feature of graphics cards that Nvidia has been pushing as a replacement for the CPU. It has no everyday application. Nvidia must have decided that making great gaming cards is better than giving us something we don't need: GPGPU.

What would the average user use GPGPU for?
 
So you are worried about GPGPU which almost nothing uses?
No, not "nothing". More and more productivity software is being accelerated with OpenCL. This will only increase now that (at Apple's behest) Intel's integrated GPUs are OpenCL-capable. Apple are very keen on GPGPU, and OpenCL specifically, though admittedly the full potential of GPGPU is taking longer to realize than expected.

GPGPU was made popular by nvidia with CUDA and PhysX. PhysX is a gaming feature.
GPGPU != PhysX
PhysX is a small (Windows-only?) subset of GPGPU. What relevance does it have to OS X?

For anyone who does not know a GPGPU is using the graphics card to do the work of a CPU.
Close, but no. GPGPU means using the massively parallel architecture of a GPU to perform tasks for which the CPU is less suitable. In general terms, a CPU is good at sequential integer computation, while a GPU is good at parallel floating-point computation.
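To sketch the programming model: in OpenCL you write a small kernel that runs once per data element, and the GPU executes thousands of those instances simultaneously. Here is a plain-Python emulation of that data-parallel shape (no actual GPU involved; the loop stands in for what the hardware does in parallel):

```python
# Conceptual sketch of the GPGPU data-parallel model (pure Python, no GPU).
# An OpenCL kernel is a function applied independently to every element of a
# large array; the GPU runs many such work-items at the same time.

def saxpy_kernel(i, a, x, y, out):
    """One 'work-item': out[i] = a * x[i] + y[i] (the classic SAXPY)."""
    out[i] = a * x[i] + y[i]

def launch(kernel, global_size, *args):
    """Emulate a kernel launch: one invocation per index.

    On a real GPU these invocations run in parallel, not in a loop -
    which is why floating-point-heavy, element-wise work wins there.
    """
    for i in range(global_size):
        kernel(i, *args)

n = 8
x = [float(i) for i in range(n)]
y = [1.0] * n
out = [0.0] * n
launch(saxpy_kernel, n, 2.0, x, y, out)
print(out)  # each element computed independently: 2*x[i] + y[i]
```

Because every work-item is independent, the GPU can throw hundreds of shader cores at the array at once; the CPU's strength (fast branching, sequential logic) buys it nothing here.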

What would the average user use GPGPU for?
See below.

...and I'm acutely aware I'm in a minority

What are you doing for which you think a fast GPGPU might be needed? cad, ps
Numerical modelling of sedimentary processes and landform evolution. I'm a geologist; one who likes to be mobile, and (by preference) uses OS X.

I am by no means an expert on GPU science. But wouldn't the Intel HD 4000 be enough for your non-gaming GPU needs? Perhaps by combining the best of each you could get decent performance without too much compromise? (Just theorizing here)
...and theorizing is always welcome. The HD 4000 is the first generation of Intel GPU to support OpenCL (specifically 1.1), but it is likely that the HD 4000 would be able to execute OpenCL code no faster than the CPU. GPGPU is about leveraging the power of the GPU for non-graphical purposes.

I maintain that anyone who uses GPU-accelerated productivity software could see a performance regression with the move to Kepler. Those whose primary use for their Mac is gaming will be very happy with the move. There are people in between who will neither benefit nor lose out.

And back on topic...
What I'm hoping for is a quad-core 13" MBP with a discrete GPU. The (recently announced, and suspiciously 'OEM') 3612QM is the first Intel quad-core CPU that'll fit within the 35W that Apple currently specifies for the 13" MBP, and if - as is rumoured - they drop the ODD, they may be able to squeeze in a discrete GPU (one of the low-end Keplers) and a slightly bigger battery. I think this unlikely, so I'll probably go for the top-end 15".
 
I doubted it would be tonight, given that tomorrow is a public holiday in Australia (although this might be hubris, suggesting that Apple would delay a product launch for a small country). Thursday might be the day!
 
Where do you get this Retina requirement from? I think you are confusing an iOS app-scaling issue with Apple's definition of 'Retina'.

The idea of Retina displays is not simply to have a high resolution, but to have a high resolution with the screen elements still retaining the same size. The iPhone 4 and iPad both did this. I'm not sure if you've seen the HiDPI mode in Lion, but it works exactly like this...
 
...but that you have a high resolution with the screen elements still retaining the same size.
Again, where is this from? You appear to have confused the concept of resolution independence with Apple's Retina marketing hype.

Retina is a hardware concept: a sufficiently high DPI (at a normal viewing distance) such that the human eye cannot resolve individual pixels. It has nothing to do with UI elements. I linked the math that Apple provided.
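The math works out roughly like this: take the eye's resolving limit as about one arcminute, and a display qualifies as 'Retina' once a pixel subtends less than that at the stated viewing distance. A quick sketch (the one-arcminute figure and the viewing distances are the commonly cited assumptions, not Apple's exact numbers):

```python
import math

# 'Retina' threshold sketch: the PPI at which one pixel subtends one
# arcminute (a commonly cited limit of human visual acuity) at a given
# viewing distance. Arcminute figure and distances are assumptions.

def retina_ppi(distance_inches: float, arcminutes: float = 1.0) -> float:
    """PPI needed so one pixel subtends `arcminutes` at `distance_inches`."""
    pixel_size = distance_inches * math.tan(math.radians(arcminutes / 60.0))
    return 1.0 / pixel_size

print(f"Phone held at 10 in: ~{retina_ppi(10):.0f} ppi")
print(f"Laptop at 20 in:     ~{retina_ppi(20):.0f} ppi")
```

The key point: because the threshold scales with viewing distance, a laptop needs far fewer PPI than a phone to earn the label, and none of this says anything about UI element sizes.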

The pixel-quadrupling of the iPhone 4 and iPad 3 is an iOS app issue, and Apple has been working on a resolution-independent UI for OS X for a long time.

I'm not averse to Apple pixel-quadrupling the MBP displays, but this has never been a requirement for the 'Retina' label.
 
Have you seen the HiDPI modes in Lion or not? Those modes would be completely unnecessary if Apple could simply up the resolution. Retina displays are not just a hardware concept; they are hardware + software. If Apple could simply increase the resolution of the MacBook Pro, call it Retina, and NOT be scoffed at, they would have done it already. Hell, even Intel states that a Retina 15" screen would be 3840 x 2160.
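For what it's worth, the arithmetic behind HiDPI pixel-doubling: the UI is drawn at the logical resolution, but every point is backed by a 2x2 block of physical pixels, so elements keep their size while gaining sharpness. A quick sketch using an illustrative 15.4" panel (the 1440x900 logical size is an assumption for the example):

```python
import math

# Sketch: pixel density of a HiDPI (pixel-doubled) panel.
# HiDPI renders the UI at the logical resolution but with a 2x2 block of
# physical pixels per point, so elements keep their size but gain sharpness.

def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

logical = (1440, 900)                        # points, as the UI draws them
physical = (logical[0] * 2, logical[1] * 2)  # backing store, 2x per axis
print(physical)                              # (2880, 1800)
print(f"{ppi(*physical, 15.4):.0f} ppi on a 15.4-inch panel")
```

This is exactly why simply "upping the resolution" is not the same thing: quadrupling the pixels while keeping the logical size is what stops UI elements from shrinking.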
 