The 265 has a 256-bit bus, the 750 Ti only 128-bit. No wonder there's a performance difference. Again, you are comparing based on price and not on hardware specs.
I mentioned the price ONCE, because we all know Apple always uses low-to-mid-range GPUs in the MacBook Pro, the top chips simply not being a good fit from a thermal perspective. The mid-range chips being the X1600, 8600M, 9600M, 330M, 6750M, 6770M, 650M and 750M, and the low-end ones being the 9400M and 6490M (it's debatable where the Iris Pro should be placed).
I usually don't like to speak in terms of "if", but now that you've dug up the width of the memory bus (which is FAR from the only thing that determines performance), I will point out that the Hawaii architecture uses a 512-bit memory bus.
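To put some numbers on the bus width point: peak memory bandwidth depends on both the bus width and the memory clock, so the width alone doesn't tell the whole story. A quick sketch in Python, using the commonly listed effective GDDR5 data rates for these two cards (treat the clock figures as assumptions; check the spec sheets):

```python
# Peak memory bandwidth = (bus width in bytes) x (effective data rate).
# Bus width alone says nothing until you multiply in the memory clock.
def bandwidth_gbps(bus_width_bits: int, effective_mhz: float) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and data rate."""
    return bus_width_bits / 8 * effective_mhz / 1000

# Assumed effective GDDR5 clocks (commonly listed figures):
print(bandwidth_gbps(128, 5400))  # GTX 750 Ti, 128-bit
print(bandwidth_gbps(256, 5600))  # R7 265, 256-bit
```

A wider bus at the same clock doubles the bandwidth, but a narrow bus can partially compensate with faster memory, which is why the width by itself proves little.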
Also, to return to the remark that the 290X consumes more power than the GTX Titan: the Titan is a dual-GPU card, and the only way to beat something like that with a single-GPU card is to have a very fast single chip. In other words, you're going to need to overclock it like there's no tomorrow. If you've ever done any overclocking, you'll know that, power-consumption-wise, you run into some pretty serious diminishing returns once you pass a certain point. In comparison to the actual performance gain, power consumption grows far faster than linearly.
In other words: pretty much anything that's highly overclocked becomes more power efficient once you de-overclock it. While this graph may be of a CPU, it demonstrates the principle fairly well.
leman said:
And this is exactly the reason why I say 'your arguments are ***' - you seem completely ignorant to the fact that the power consumption figures are for the WHOLE SYSTEM, not the GPU alone
Looks like I missed the part about it being for the whole system (and assumed they'd done their due diligence and isolated the GPU's consumption) in the sleep-deprived state I've been in for the last few days. Still, the belittling style of your response to that mistake makes you look like an egotistic academic *******.
If missing a few words written in a smaller font, and assuming the testing site has done its due diligence when it hasn't, is really enough for you to jump on your high horse (for the second time in this thread) and start going on about how your opponent isn't worthy of your time, that doesn't speak very well of you as an opponent in an argument. Especially when you said you probably weren't going to continue posting the last time you did it.
Now, if you really want to get technical about the power consumption figures and start speculating about how much the rest of the system draws, you should remember that the figure also includes the power lost in the PSU. The way they measure is by placing a power meter between the computer and the mains. A PSU is an AC-to-DC converter, and like all power converters it's not 100% efficient; in fact, far from it.
Where we're at right now, PSU efficiency sits around 70-90%: roughly 70% for the cheap units, around 80% for the regular ones, and close to 90% for the very expensive ones, which carry such a high markup that the vast majority of system builders don't bother with them.
Now let's assume they're using a PSU that's about 80% efficient, as that's the level most system builders use. At maximum load the 750 Ti system draws 176 W at the wall, the 260X system 223 W and the 265 system 230 W. In other words, the 260X system draws 47 W and the 265 system 54 W more in total.
Let's strip out the PSU's share and see how that changes things: the 750 Ti system's components draw about 141 W, the 260X's about 178 W and the 265's about 184 W. So how much more do the AMD chips ACTUALLY consume? About 37 and 43 W more, which is a whole lot less impressive.
However, this flaw is nothing compared with the far more fundamental flaw in the entire idea of comparing the 750 to the 260X and 265!
This comparison pits two architectures against each other: one that has been available for less than a week, and another that has been available for about a YEAR and is basically on its way out the door. Don't be fooled by the new architecture name on the 260X (the 265 is even officially a Pitcairn); it's still basically a rebadged chip.
Doing this power efficiency comparison in the low-end desktop space is also pretty much meaningless, considering power consumption isn't much of an issue there and Nvidia naturally gets the advantage with Maxwell growing out of their tablet and smartphone line. It's basically like trying to draw conclusions about how fuel-efficient Honda's and BMW's motorcycles are by comparing a Civic and a 1 Series.
In other words, it's a flawed comparison from the ground up and, as I've been saying all along, not worth getting over-excited about.