Sure, it beats the less expensive, higher-performing R7 260X in performance-per-watt, but the difference isn't as big as you make it out to be. In gaming the 260X doesn't go higher than 92W according to Tom's Hardware, and it's still considerably faster.

Anandtech's tests beg to differ. Here, the 260X draws a whopping 40 watts more!

So let's see how AMD's and Nvidia's new architectures scale across the spectrum before we start going nuts over them.

I absolutely agree! Still, speculating is fun ;)
 
Sure, it beats the less expensive, higher-performing R7 260X in performance-per-watt, but the difference isn't as big as you make it out to be. In gaming the 260X doesn't go higher than 92W according to Tom's Hardware, and it's still considerably faster. You should also remember that 60W is still well above what Apple likes to put into its laptops, and AMD hasn't released any Hawaii chips other than top-of-the-line stuff.

So let's see how AMD's and Nvidia's new architectures scale across the spectrum before we start going nuts over them.

I think all we are looking at here is whether they achieved their big headline feature of double the performance per watt. If Maxwell achieves this, it's reasonable to assume that where there is a Maxwell chip, there is a doubling of performance, a halving of power consumption, or some combination of the two.

Anandtech -
"Utilizing a number of techniques NVIDIA set out to double their performance per watt versus Kepler – a design that was already power efficient by desktop GPU standards – and it’s safe to say that they have accomplished this"


I agree, though it will be nice to see the actual mobile chips and how they perform.

edit:
I just poked around and AMD doesn't have a new architecture; they've just rebadged the old chips. I can't actually find anything on a new AMD architecture. If so, this should be a slam dunk for mobile, as a 2x increase in performance per watt is better than the current AMD crop.
 
AMD does have the GCN 1.1 architecture.
The 260X and 290X use the newer architecture, while the 280, 270, 260 and I think the 265 too are still the same old one.
Still, GCN 1.1 is just about good enough to be at Kepler's level; they have nothing that can compete with Maxwell so far.
They might for the 20nm switch, though. Maxwell was initially designed with 20nm in mind and is only on 28nm because 20nm production just takes too long. I still doubt that AMD really has something similar up its sleeve, as that is usually communicated beforehand.
AMD might have been too busy with the console chips, and they have had so many financial troubles that standing up to Intel or Nvidia in R&D is just not in the cards.
durkin said:
Apple is more concerned with battery life than performance, but performance still matters. I think they will keep doing what they're doing now, integrated in low end, then discrete for professionals that need the power. The fact that Nvidia doubled performance with the same power consumption is incredibly promising. Apple doesn't have to sacrifice battery at all, it would even potentially get better with Broadwell enhancements and battery upgrades, and still get massive performance gains. Now if they can just fix those Nvidia drivers...
When it comes to a dGPU they never really have to worry about battery life, only TDP, as they stuff them into such thin notebooks. Any GPU can clock down quite low and power-gate some parts, and if that doesn't help it can be turned off. Battery life is never really the problem, but there is only so much heat the cooling system can handle, and the power supply was never really capable of supplying much more than close to 100W (85W adapter plus battery) for everything.
Currently maximum power draw is somewhere around 80W, which means that without the display the CPU+GPU don't get all that much. If the CPU didn't drop well below its maximum 47W while the dGPU is active, a 30W+ GPU wouldn't even be in the cards. Who cares about the battery at full load? It is going to be dry within a single hour anyway.
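To make the arithmetic behind that budget explicit, here is a tiny back-of-the-envelope sketch in Python; every figure in it is either a rough number from the post above or an outright guess (the 15W for "everything else" in particular), not an Apple specification.

```python
# Back-of-the-envelope check of the laptop power-budget argument above.
# All figures are rough numbers from the discussion or outright guesses,
# not Apple specifications.

SYSTEM_BUDGET_W = 80   # observed maximum draw of the whole machine under load
CPU_TDP_W = 47         # quad-core mobile CPU running at its full TDP
DGPU_W = 30            # a hypothetical 30W-class discrete GPU
REST_W = 15            # display, RAM, SSD, fans, etc. (pure guess)

headroom = SYSTEM_BUDGET_W - (CPU_TDP_W + DGPU_W + REST_W)
print(headroom)  # -> -12: the CPU has to run well below its TDP while the dGPU is busy
```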
 
Anandtech's tests beg to differ. Here, the 260X draws a whopping 40 watts more!

Yes, but that's just in a synthetic benchmark intended to stress a GPU to its limits. I use Furmark whenever I build a new system to see if the cooling works properly, and that's what it's actually intended to do.

At idle the difference is a single measly watt, and in other, closer-to-real-life tests the difference is nowhere near as big. So, like people who complain about women and minorities earning less (who in reality earn just as much once you factor in education, experience and hours), you should really look at the numbers a bit more carefully and not draw conclusions so easily.
 
Yes, but that's just in a synthetic benchmark intended to stress a GPU to its limits. I use Furmark whenever I build a new system to see if the cooling works properly, and that's what it's actually intended to do.

At idle the difference is a single measly watt, and in other, closer-to-real-life tests the difference is nowhere near as big. So, like people who complain about women and minorities earning less (who in reality earn just as much once you factor in education, experience and hours), you should really look at the numbers a bit more carefully and not draw conclusions so easily.

Crysis 3 is a synthetic benchmark? Since when? Also, idle power consumption does not matter for the purpose of this discussion. All modern cards are quite efficient at downclocking themselves when their resources are not needed (not to mention that the dGPU would usually be deactivated anyway in such a scenario on a dual-GPU laptop).

Besides, what interests us here is not the power draw of the system per se - but, rather, heat production under load (obviously, physically the two are the same). The less heat produced by the GPU, the easier it can be integrated into a thermally constrained product like a laptop. The tests show that the new Nvidia GPU will require significantly less power (and thus produce significantly less heat) while providing comparable performance. If you downclock all the tested chips to limit the drawn/dissipated power to a maximum of 40W, the Maxwell will likely be the winner by a large margin. In other words, if you limit the thermal headroom of the tested GPUs, the AMD chips will start to throttle before they can even reach half of their performance capacity, while the Maxwell chips should retain most of their performance profile. Of course, performance does not scale exactly linearly with power draw, but assuming linearity in this case should be good enough.
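As a rough illustration of this thermal-headroom argument, here is a minimal Python sketch that assumes, as the post does, that performance scales roughly linearly with board power; the chip names and numbers are illustrative placeholders, not measured values.

```python
# Minimal sketch of the "fixed thermal budget" argument, assuming performance
# scales linearly with board power. All numbers are illustrative placeholders.

def capped_performance(full_perf, full_power_w, power_cap_w):
    """Estimate the performance retained when the chip is limited to power_cap_w."""
    if full_power_w <= power_cap_w:
        return full_perf                             # fits the budget, no throttling
    return full_perf * (power_cap_w / full_power_w)  # linear-scaling assumption

# Hypothetical 40W budget: an efficient 60W part vs. a hungrier 100W part.
for name, perf, power in [("efficient GPU", 100, 60), ("power-hungry GPU", 105, 100)]:
    print(name, round(capped_performance(perf, power, 40), 1))
# The 60W chip keeps about two thirds of its performance, the 100W chip only 40%.
```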

P.S. FYI, I teach statistics at a world renowned Swiss university. I am not a real 'mathematician' (although I do have a degree in mathematics), but I have to deal with the statistical evaluation of scientific data every day in my job as a researcher and an educator. So I assure you that I probably have at least some basic understanding of 'drawing conclusions', otherwise all these smart people probably wouldn't have hired me in the first place ;)

P.P.S. It's a statistical fact that women are paid less on average - with the same qualifications and work experience. At least here in Switzerland. I didn't do any analysis for the USA.
 
Crysis 3 is a synthetic benchmark? Since when?
If you bothered to read the next sentence you'd see that I was referring to the Furmark numbers...

leman said:
Also, idle power consumption does not matter for the purpose of this discussion.
In desktop use you might be right, but not in laptop use, where the general stress level is a lot closer to idle than to an all-out full burn. In laptop battery life tests the load on the GPU is NOT high, whether it's done by playing video (which is not a heavy task for modern laptop GPUs) or web browsing (same thing).

Let's not forget that AMD hasn't yet released any GPUs based on the new Hawaii architecture other than the R9 290 and 290X, and it's supposed to be a good-sized step up in performance per core and per clock. What this means is that once the laptop chips based on it become available, it too will see a good-sized boost in performance-per-watt.

In case you don't understand what I'm saying, here it is in a very simple form: while the boost does look good, let's keep our pants on and not get over-excited, especially when more power-efficient GPUs from AMD are around the corner.

leman said:
P.S. FYI, I teach statistics at a world renowned Swiss university.
Considering you've so far been blindly staring at power consumption figures, you're either making that up or you should probably consider a change of career.

leman said:
P.P.S. It's a statistical fact that women are paid less on average - with the same qualifications and work experience. At least here in Switzerland. I didn't do any analysis for the USA.
In the U.S. that idea was apparently already debunked in the '80s. Where I live (Finland), feminists used to complain that "a man's euro is a woman's 90 cents", but they stopped after someone had a proper look at the figures they were using and concluded that, all things considered, women and men were paid about the same, and in some female-dominated jobs (like a lot of jobs in the service industry) women actually earned slightly more than the equivalent man.
 
SarcasticJoe, I am still waiting for you to actually start being sarcastic - right now you are more like a, well, 'SlowJoe'. Please don't get me wrong, I am sure that you are an intelligent person and everything, but I find it a bit puzzling that it's so difficult to understand what I am talking about. Am I really so unclear in what I am writing? Let's try once again then :)

Basically, I am not talking about battery life or idle power consumption (these are different issues which we can gladly discuss in a different context), I am talking about GPU performance under a given thermal constraint.

I will try to explain it in a 'stupid' way (not because I have doubts about your intelligence, but simply to make sure that I am understood correctly this time). Say I have a laptop and I want to play some games on it. I don't care about battery life at all, because I am obviously plugged into an outlet; I just want great visuals with good performance. Again, what I am interested in is MAXIMAL performance. The Maxwell chips are able to deliver their maximal performance while consuming (and thus dissipating - first law of thermodynamics) around 40 watts less than the AMD chips. In other words, if I want to use the AMD chip in my system and still get the maximum performance, I need to improve its cooling capacity to take care of this additional heat - which is a huge thing for a laptop. Put yet another way: because the Maxwell GPU produces so little heat under load, I can safely use it in my 'light and thin' laptop, whereas a similar-performing AMD GPU would need a much beefier cooling system to operate without throttling or setting my computer on fire.

Is it clearer now? Again, I am not talking about energy savings or anything like that, but simply about having more performance while maintaining the same thermal envelope. With mobile GPUs, what matters most is how much performance you can get from the 'slow and cool' chips. It is absolutely possible that Maxwell does not scale well at all, and that we will never see an actual high-performing card based on this architecture - still, for the mid-range mobile segment (which the 750M and the like occupy right now), it looks extremely attractive.

Now to your points.

If you bothered to read the next sentence you'd see that I was referring to the Furmark numbers...

If you bothered to read the Anandtech article, you'd see that your statement here is completely nonsensical - because power usage for both Crysis 3 and Furmark is de facto identical. I mean, how can you even bring up the 'but that's a synthetic test' kind of argument when a real-life test gives you exactly the same result?

In desktop use you might be right, but not in laptop use, where the general stress level is a lot closer to idle than to an all-out full burn. In laptop battery life tests the load on the GPU is NOT high, whether it's done by playing video (which is not a heavy task for modern laptop GPUs) or web browsing (same thing).

Not relevant, because nobody was talking about battery life (well, not me at least). Nevertheless, you are absolutely correct of course.

Let's not forget that AMD hasn't yet released any GPUs based on the new Hawaii architecture other than the R9 290 and 290X, and it's supposed to be a good-sized step up in performance per core and per clock. What this means is that once the laptop chips based on it become available, it too will see a good-sized boost in performance-per-watt.

Again, a very valid and correct comment. However, I am not really seeing any substantial increase in efficiency from AMD based on what we know right now. E.g. http://www.anandtech.com/show/7481/the-amd-radeon-r9-290-review/15 - here, the R9 290 actually consumes more power than a Titan! Of course, it might be entirely possible that Hawaii gets much more power-efficient when scaled down. I am entirely open to this. I just don't see any evidence supporting this hypothesis so far. In contrast, Maxwell's dramatic efficiency improvements are clearly observable and measurable.

Considering you've so far been blindly staring at power consumption figures, you're either making that up or you should probably consider a change of career.

Doesn't matter - I just got my permanent position. Suck it up, Swiss taxpayers, now you are stuck with another incompetent foreigner! Muhahahaha!

In the U.S. that idea was apparently already debunked in the '80s. Where I live (Finland), feminists used to complain that "a man's euro is a woman's 90 cents", but they stopped after someone had a proper look at the figures they were using and concluded that, all things considered, women and men were paid about the same, and in some female-dominated jobs (like a lot of jobs in the service industry) women actually earned slightly more than the equivalent man.

Sigh... Is it so difficult to do a little research? Look what I turned up in a few seconds:

http://en.wikipedia.org/wiki/Male–f...n_the_United_States#Gender_pay_gap_statistics

If you read through all of that, you will see that apparently most of the studies conducted in the US have found an unexplained disparity between male and female income - after controlling for a number of confounding factors. If things are different in Finland - congratulations! Unfortunately, discrimination is still quite a thing in the rest of the world.
 
If you bothered to read the next sentence you'd see that I was referring to the Furmark numbers...

Let's lay out the thread of conversation.

He said:
Anandtech's tests beg to differ. Here, the 260X draws a whopping 40 watts more!

So Leman is saying the 260X draws a lot more power (40 watts) than the Maxwell chips.

So you said:
Yes, but that's just in a synthetic benchmark intended to stress a GPU to its limits. I use Furmark whenever I build a new system to see if the cooling works properly, and that's what it's actually intended to do.

The first statement, "that's just in a synthetic benchmark", was not accurate. The Anandtech link he gave you showed power consumption in Crysis 3, which is a real game and not a synthetic benchmark. So he pointed this out and said:

Crysis 3 is a synthetic benchmark? Since when?


I suspect that when you opened the link you may not have scrolled the page, as Furmark was used for testing, but so was Crysis 3.
 
Basically, I am not talking about battery life or idle power consumption (these are different issues which we can gladly discuss in a different context), I am talking about GPU performance under a given thermal constraint.
I'm once again going to have to point out that the AMD R7 260 series (in the vast majority of benchmarks) does actually perform better. This means that a proper comparison should instead be with a lower-end, and thus also less power-hungry, chip. Another thing I'm now forced to repeat myself on is pointing out that AMD's Hawaii chips, which are currently only available in their highest-end consumer GPUs, should also give a fairly good boost in both performance and power consumption.

The reason why the Hawaii chips run so hot is in part because the standard cooler AMD is putting on them is at best subpar and that they're going up against both dual-chip cards and the highest of the high-end cards. In that market space power consumption is not that big an issue, meaning that you shouldn't really draw that many conclusions. As for the price, it's been badly inflated by Bitcoin miners because the 290 and 290X happen to be some of the fastest cards on the market for just this task.

A more realistic comparison than what's been posted so far would be some kind of FPS-per-watt or power-consumed-over-the-span-of-the-task numbers, or just bringing the AMD chips down to the Nvidia chips' level of performance by underclocking them or deactivating cores (which would also reduce power consumption).

As you seem to have made directly mocking the other side's intelligence a theme in this two-person argument, I'm going to once again clarify what I'm arguing about. You're either intentionally misrepresenting my arguments (also known as a "strawman argument") or simply not understanding what I'm saying (I personally hope it's the former, as the latter would not speak particularly well of your reading comprehension skills).

I'm not saying that this isn't a step up in what can be crammed into the thermal envelope, but that when you factor in the actual performance of the newly released GPU and the (higher) performance of the chips it's being compared to, the gains aren't as big as they first seem.

A more realistic comparison would be to do some kind of average-FPS-per-watt-consumed or total-power-consumption-over-test-duration figures, or to bring the AMD chips down to the Nvidia chip's level of performance.
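For what it's worth, the FPS-per-watt comparison suggested here is straightforward once you have estimated GPU-only power figures; the short Python sketch below uses made-up numbers purely to show the shape of the calculation.

```python
# Sketch of the suggested FPS-per-watt comparison. The frame rates and the
# estimated GPU-only power figures below are made up for illustration only.

cards = {
    # name: (average fps in some game, estimated GPU-only watts under load)
    "GTX 750": (45.0, 55.0),
    "R7 260X": (50.0, 95.0),
}

for name, (fps, watts) in cards.items():
    print(f"{name}: {fps / watts:.2f} fps per watt")
```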

leman said:
Sigh... Is it so difficult to do a little research? Look what I turned up in a few seconds:

If you actually bothered to read it properly, you'd also see that it's FULL of figures that do not factor in the things I mentioned that actually cause the perceived income disparity. Hell, the article starts casting doubt on the figures even before the table of contents by pointing out that most of these studies don't factor in education, experience, hours and even the fact that men and women tend to gravitate towards different professions. If you read through the article you find multiple factors that help to explain this difference.

Let's not forget that Wikipedia is hardly the most reputable source, considering that over the years it's been subject to a long string of incidents where various companies and interest groups have been caught editing articles to push their agenda. Recently Wikipedia banned a vast number of accounts on suspicion of this, and they even sued a company that sold the editing of articles as a service, catering to both companies and interest groups.
 
Well, this conversation is getting quite ridiculous so I will probably have to withdraw from the subsequent dialogue (unless there are some qualitative changes)

I'm once again going to have to point out that the AMD R7 260 series (in the vast majority of benchmarks) does actually perform better. This means that a proper comparison should instead be with a lower-end, and thus also less power-hungry, chip. Another thing I'm now forced to repeat myself on is pointing out that AMD's Hawaii chips, which are currently only available in their highest-end consumer GPUs, should also give a fairly good boost in both performance and power consumption.

And here I thought that the AMD R7 260 is the lower-end chip :confused: How much more low-end do you want to go? Look at the R7 250 - it uses just a few watts less under load than the 750, but what a gap in performance!

The reason why the Hawaii chips run so hot is in part because the standard cooler AMD is putting on them is at best subpar and that they're going up against both dual-chip cards and the highest of the high-end cards.

Wait, wait, what? You don't understand the difference between the temperature and power consumption? Now you are actually scaring me.

A more realistic comparison than what's been posted so far would be some kind of FPS-per-watt or power-consumed-over-the-span-of-the-task numbers, or just bringing the AMD chips down to the Nvidia chips' level of performance by underclocking them or deactivating cores (which would also reduce power consumption).

You are apparently so good at statistics, so just do the rough extrapolations yourself... It's not that difficult to reasonably estimate how much power the remaining components would draw under load (Crysis 3). Even if we assume a clearly underestimated 60W, the Nvidia is ahead in power efficiency. If we take the CPU's TDP (130W), the difference in power efficiency is somewhere around 50%.
 
Well, this conversation is getting quite ridiculous so I will probably have to withdraw from the subsequent dialogue (unless there are some qualitative changes)
That sure is a really classy way to pull out of an argument... Rather than admitting defeat or actually winning the argument you just complain about the other side being stupid and that they're not worth replying to.

Whatever you've got to do to end the argument with your ego intact, after I've twice pointed out that I never disagreed with the outset of the thread, just that it's not as big a step forward as the thread starter tried to make it into.

leman said:
And here I thought that the AMD R7 260 is the lower-end chip :confused: How much more low-end do you want to go? Look at the R7 250 - it uses just a few watts less under load than the 750, but what a gap in performance!
While the R7 260X is about $20 less, it still performs better in the vast majority of benchmarks. In other words, for a proper comparison (rather than the pure power consumption figures you've been staring yourself blind at) you'd need to do some kind of power-consumption-to-performance figure or bring the 260X down to the 750's level.

leman said:
Wait, wait, what? You don't understand the difference between the temperature and power consumption? Now you are actually scaring me.
You talked about how the 290 and 290X don't get to their full potential until you crank up the fan speed, and I explained that it's because AMD uses some pretty subpar default coolers (meaning you should wait for the much better aftermarket ones). I also pointed out that top-of-the-range gaming GPUs aren't the best tools for judging power efficiency, because they really do crank the cards up to "11" so that they can get as big an advantage in performance as they can.
 
That sure is a really classy way to pull out of an argument... Rather than admitting defeat or actually winning the argument you just complain about the other side being stupid and that they're not worth replying to.

Sorry, right now the 'argument' between us goes something like this: 'A: French cuisine is very rich in tradition' - 'B: But French fries are not really French cuisine'. Plainly put, we seem to be speaking about completely different things from completely different angles, and to me, such an argument is a waste of time. No idea why you would mention my ego in this context. I just want to avoid needless bickering.

... after I've twice pointed out that I never disagreed with the outset of the thread, just that it's not as big a step forward as the thread starter tried to make it into.

Again - do some calculations and you will see that Maxwell offers somewhere between 25% and 50% better performance per watt than its competitors. It's an INSANE step forward, because it allows us to have 25-50% more performance in the same form factor.

While the R7 260X is about $20 less, it still performs better in the vast majority of benchmarks. In other words, for a proper comparison (rather than the pure power consumption figures you've been staring yourself blind at) you'd need to do some kind of power-consumption-to-performance figure or bring the 260X down to the 750's level.

Come on, what does this have to do with the price? All I am talking about all this time is the performance/power ratio, which can be extrapolated from the benchmarks! If a card performs 20% better while drawing almost twice as much power - do we really need to discuss what it means for power efficiency? It's not like the AMD cards come with an insane factory overclock (where the power usage quickly spikes).

You talked about how the 290 and 290X don't get to their full potential until you crank up the fan speed, and I explained that it's because AMD uses some pretty subpar default coolers (meaning you should wait for the much better aftermarket ones).

Where did I even mention the word 'fan'? I just said that the current end-of-the-line Hawaii GPU requires more power than a Titan while offering less performance - which makes it less power-efficient in my book. I never mentioned temperature, fans or anything like that!

I also pointed out that top-of-the-range gaming GPUs aren't the best tools for judging power efficiency, because they really do crank the cards up to "11" so that they can get as big an advantage in performance as they can.

And I have said the same thing. I have also said that I don't find it very likely that its power efficiency will increase that dramatically when downclocked, unless the factory model really pushes the chip to its physical limits.
 
Sorry, right now the 'argument' between us goes something like this: 'A: French cuisine is very rich in tradition' - 'B: But French fries are not really French cuisine'. Plainly put, we seem to be speaking about completely different things from completely different angles, and to me, such an argument is a waste of time. No idea why you would mention my ego in this context. I just want to avoid needless bickering.
Why did I mention your ego? Go read your own post, because you did pull the "your arguments are *****" card on me and then say you probably wouldn't continue posting. As for the argument continuing, you're the one basically arguing for the sake of arguing; I've pointed out multiple times that the only difference in our viewpoints is how significant the step forward is.

leman said:
Again - do some calculations and you will see that Maxwell offers somewhere between 25% and 50% better performance per watt than its competitors. It's an INSANE step forward, because it allows us to have 25-50% more performance in the same form factor.

Come on, what does this have to do with the price? All I am talking about all this time is the performance/power ratio, which can be extrapolated from the benchmarks! If a card performs 20% better while drawing almost twice as much power - do we really need to discuss what it means for power efficiency? It's not like the AMD cards come with an insane factory overclock (where the power usage quickly spikes).
I don't know where the hell you get the 50% figure...

If you look at the Anandtech review, the biggest difference in power consumption between the 260X and the GTX 750 is in the Crysis 3 benchmark. In it the Nvidia chip draws 178W while the AMD chip draws 222W, a difference of 44W. So how much more does the AMD chip draw in percent? About 24.5% more, well below the 50% figure you clearly just came up with without running any numbers.

Sure, the GTX 750 beats the 260X in a lot of benchmarks, so let's compare it to the 265, which beats it in pretty much every benchmark. In Crysis 3 it draws a whopping 230W, which is 52W, or 29.2%, more than the GTX 750. Hell, it's even less when you factor in performance, as for instance in Battlefield 4 the 265 beats it by almost 40%.

Now do you see where I'm coming from? Out of the numbers you provided (or rather made up for the sake of argument), the actual difference is a lot closer to 25% than 50%.
 
All of you people who have convinced yourselves that Maxwell will "definitely" happen are just begging for some disappointment.
 
If you look at the Anandtech review, the biggest difference in power consumption between the 260X and the GTX 750 is in the Crysis 3 benchmark. In it the Nvidia chip draws 178W while the AMD chip draws 222W, a difference of 44W. So how much more does the AMD chip draw in percent? About 24.5% more, well below the 50% figure you clearly just came up with without running any numbers.

Sure, the GTX 750 beats the 260X in a lot of benchmarks, so let's compare it to the 265, which beats it in pretty much every benchmark. In Crysis 3 it draws a whopping 230W, which is 52W, or 29.2%, more than the GTX 750. Hell, it's even less when you factor in performance, as for instance in Battlefield 4 the 265 beats it by almost 40%.

Now do you see where I'm coming from? Out of the numbers you provided (or rather made up for the sake of argument), the actual difference is a lot closer to 25% than 50%.

Those are total system power consumption figures. It is a bit trickier to work out the power consumption differences for the graphics cards alone, but given the cards are pretty well idle when the systems are idle, the percentages are:

260X: 222 - 74 = 148W
GTX 750: 178 - 73 = 105W

Difference is ~41%.

Then we have to take into consideration the difference in CPU load between idle and full load. Let's say the difference in CPU load is 30W.

260X: 222 - 74 - 30 = 118W
GTX 750: 178 - 73 - 30 = 75W

Difference is ~57%.

All guesswork of course, but much better than taking full system power figures and equating that to graphics card power figures.
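For anyone who wants to redo it, here is the same back-of-the-envelope arithmetic as a short Python sketch; the 30W CPU-load correction is, as stated above, pure guesswork.

```python
# Reproduces the arithmetic above: estimate GPU-only draw by subtracting the
# idle system figure from the loaded one, optionally minus a guessed increase
# in CPU load (the 30W below is pure guesswork, as noted in the post).

def gpu_power(load_system_w, idle_system_w, cpu_load_delta_w=0):
    return load_system_w - idle_system_w - cpu_load_delta_w

r260x, gtx750 = gpu_power(222, 74), gpu_power(178, 73)          # 148W vs 105W
print(round((r260x / gtx750 - 1) * 100))                        # -> 41 (% more)

r260x, gtx750 = gpu_power(222, 74, 30), gpu_power(178, 73, 30)  # 118W vs 75W
print(round((r260x / gtx750 - 1) * 100))                        # -> 57 (% more)
```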
 
Sure, the GTX 750 beats the 260X in a lot of benchmarks, so let's compare it to the 265, which beats it in pretty much every benchmark. In Crysis 3 it draws a whopping 230W, which is 52W, or 29.2%, more than the GTX 750. Hell, it's even less when you factor in performance, as for instance in Battlefield 4 the 265 beats it by almost 40%.

The 265 has a 256-bit bus, the 750 Ti only a 128-bit one. No wonder there's a performance difference. Again, you are comparing based on price and not on hardware specs.

Now do you see where I'm coming from? Out of the numbers you provided (or rather made up for the sake of argument), the actual difference is a lot closer to 25% than 50%.

And this is exactly the reason why I say 'your arguments are ***' - you seem completely ignorant of the fact that the power consumption figures are for the WHOLE SYSTEM, not the GPU alone (how do you imagine a 60W TDP GPU drawing 170 watts in the first place???). The test machine was based on a 6-core Ivy Bridge-E CPU with a TDP of 130W - add RAM, disks and the mainboard to the equation and you easily have those components consuming over 120W under load (as shown by various benchmarks of the 4960X CPU itself). If you are unable to realise this absolutely basic fact, I see no reason to discuss these things with you in the first place. Go and learn something about computers and then talk. Your problem is that you don't even try to understand what I am saying and why. I have doubted myself at every paragraph that I write here; I have re-read and recalculated my arguments half a dozen times by now. In contrast, you just repeat the same nonsense all over again without trying to question yourself. This is embarrassing :(

P.S. What I would add to thunng8's post above is that it's unlikely that the GPU would go over its TDP. It is reasonable to assume that the system sans GPU draws somewhere around 110-130W under load. Under such an assumption, both Maxwell GPUs show very similar performance/power ratios, while the 265 is more than 40% less efficient and the 260X more than 50% less efficient (no wonder - the 265 has to be more efficient because of its wider bus).

----------

All of you people who have convinced yourselves that Maxwell will "definitely" happen are just begging for some disappointment.

You mean 'happen' as in 'happen in a MBP'? You are right of course, just staying with an (adequate) IGP has a lot of attraction for Apple... But I doubt that they will abandon the dGPU if the performance gap between it and whatever Broadwell will deliver is too large...
 
The 265 has a 256-bit bus, the 750 Ti only a 128-bit one. No wonder there's a performance difference. Again, you are comparing based on price and not on hardware specs.
I mentioned the price ONCE, because we all know Apple always uses low-to-mid GPUs in the MacBook Pro - not the fastest available, but the best fit from a thermal perspective. The mid-range chips being the X1600, 8600M, 9600M, 330M, 6750M, 6770M, 650M and 750M, and the low-end ones being the 9400M and 6490M (it's debatable where you should place the Iris Pro).

I usually don't like to speak in terms of "if", but now that you've dug up the width of the memory bus (which is FAR from the only thing that determines performance), I will point out that the Hawaii architecture uses a 512-bit memory bus.

Also, to return to the remark that the 290X consumes more power than the GTX Titan: the Titan is a dual-GPU card, and the only way to beat something like that with a single-GPU card is to have a very fast single-chip card. In other words, you're going to need to overclock it like there's no tomorrow. If you've ever done any overclocking, you'd know that, power-consumption-wise, you're going to encounter some pretty serious diminishing returns once you pass a certain point. What this means is that in comparison to the actual performance gain you're going to see an exponential growth in power consumption.

In other words: Pretty much everything highly overclocked becomes more power efficient once you de-overclock it. While this graph may be that of a CPU, it demonstrates the principle fairly well.

leman said:
And this is exactly the reason why I say 'your arguments are ***' - you seem completely ignorant to the fact that the power consumption figures are for the WHOLE SYSTEM, not the GPU alone
Looks like I missed the part about it being for the whole system (and assumed they'd have done their due diligence and reduced it to just the GPU consumption) in the sleep-deprived state I've been in for the last few days. Still, the belittling style you used to write the response to this mistake makes you really look like an egotistic academic *******.

If missing a few words written in a smaller font, and just assuming the testing site has done its due diligence when it hasn't, is really enough for you to jump on your high horse (for the second time in this thread) and start going on about how your opponent isn't worthy of your time, that doesn't speak very well of you as an opponent in an argument. Especially when you even said you probably weren't going to continue posting the last time you did it.

Now if you really want to get technical on the power consumption figures and start speculating about how much the system itself draws, you should remember that the figure also includes the power loss caused by the PSU. The way they set up the measurement is by having a power meter sitting between the computer and the mains. A PSU is an AC-to-DC converter and, like all power converters, it's not going to be 100% efficient - far from it, actually.

Where we're at right now in terms of PSU efficiency is around 70-90%, with the roughly 70% ones being the cheap ones, the 80% ones being the regular ones, and finally the 90% (or close to it) ones being very expensive and coming with such a high markup that the vast majority of system builders don't bother with them.

Now let's assume they're using a PSU which is about 80% efficient, as that's about the level most system builders use. At maximum load the GTX 750 system uses 176W, the 260X system uses 223W and the 265 system uses 230W. In other words, the 260X system uses 47W and the 265 system 54W more in total.

Let's reduce the PSU's share of that and see how it affects the situation: the 750 system uses about 141W, the 260X about 178W and the 265 about 184W. So how much more do the AMD chips ACTUALLY consume? About 37 and 43W more, which is a whole lot less impressive.
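To spell out that PSU correction, here is a short Python sketch; the 80% efficiency figure is the assumption made in this post, not a measured value.

```python
# Sketch of the PSU-efficiency correction described above: wall power times
# the PSU's efficiency gives the DC power the components actually draw.
# The 80% efficiency figure is an assumption from the post, not a measurement.

PSU_EFFICIENCY = 0.80

def dc_power(wall_watts):
    return wall_watts * PSU_EFFICIENCY

for name, wall in [("GTX 750 system", 176), ("R7 260X system", 223), ("R7 265 system", 230)]:
    print(f"{name}: ~{dc_power(wall):.0f} W on the DC side")
# -> roughly 141W, 178W and 184W, matching the figures in the post
```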

However, this flaw is nothing compared with the more fundamental flaw in the entire idea of comparing the 750 to the 260X and 265!

This comparison is between two architectures: one which has been available for less than a week, and another which has been available for about a YEAR and is basically on its way out the door. Don't be fooled by the new architecture name on the 260X (the 265 is even officially a Pitcairn); it's still basically a rebadged chip.

Doing this power-efficiency comparison in the low-end desktop space is also pretty much meaningless, considering power consumption is not really much of an issue there and Nvidia naturally gets the advantage with Maxwell growing out of their tablet and smartphone line. It's basically like trying to draw conclusions about how economical Honda's and BMW's motorcycles are by comparing a Civic and a 1 Series.

In other words it's a flawed comparison from the ground up and as I've been saying all along, it's not worth getting over-excited about.
 
Sure, it beats the less expensive, higher-performing R7 260X in performance-per-watt, but the difference isn't as big as you make it out to be. In gaming the 260X doesn't go higher than 92W according to Tom's Hardware, and it's still considerably faster.
What the hell are you talking about? Don't make me put Tom's review screenshots here. All of them (with the exception of AC4) show that the 750 Ti is 5% to 20% faster than the 260X. And it still has at least a 50% better perf-per-watt ratio, which IS huge considering we are talking about the same process node (28nm).
 
What the hell are you talking about? Don't make me put Tom's review screenshots here. All of them (with the exception of AC4) show that the 750 Ti is 5% to 20% faster than the 260X. And it still has at least a 50% better perf-per-watt ratio, which IS huge considering we are talking about the same process node (28nm).
Go re-read the post... I'm talking about the regular 750 (which is NOT the same card as the 750 Ti), which is a better comparison target for the 260X. Here are some benchmarks courtesy of Anandtech:

Benchmark 1
Benchmark 2
Benchmark 3
Benchmark 4
Benchmark 5
Benchmark 6

Also, we're comparing two old, rebadged chips on their way out with two brand new ones that haven't been out even a week. This, and the fact that it's in a space where value matters more than power consumption, really makes this comparison almost meaningless.
 
I did re-read the entire thread, tbh, before responding to you. You've been talking about the 750 Ti from the start. Looks like a self-own to me, despite your attempts to mess with the numbers and video card models. Sorry.

And you do miss the main point. This is a forum of MacBook owners. From the perspective of those people who will use Maxwell in their MacBooks and iMacs, Nvidia's new architecture is nothing short of an amazing improvement, since it DOES improve the perf-per-watt ratio by almost 2x over the current graphics Apple is using, which is Kepler. And it doesn't matter to most here if those new cards are mediocre from a perf-per-dollar perspective.
 
And you do miss the main point. This is a forum of MacBook owners. From the perspective of those people who will use Maxwell in their MacBooks and iMacs, Nvidia's new architecture is nothing short of an amazing improvement, since it DOES improve the perf-per-watt ratio by almost 2x over the current graphics Apple is using, which is Kepler. And it doesn't matter to most here if those new cards are mediocre from a perf-per-dollar perspective.

Well said Sir :)
 