Oh and we need someone to come and set this thread straight! People are actually hoping for a dGPU again?!
Hahah yeah, we definitely can't have that now can we
Yeah, this is correct. In the end, computers are smart space heaters. All the energy that goes in has to come out in some form. Since information is not a form of energy, it all goes into heat (except for the small fraction that radiates away as light, sound, and EM waves from Wi-Fi, etc.).
The CPU and surrounding chips communicate by setting the voltage on the interconnecting lines to 0V (to represent a zero) or 1.2V (to represent a one). All, yes all, of the energy needed to do this eventually dissipates as heat. That said, the energy consumed by these interconnect lines is a tiny fraction of 1% of the energy consumed and dissipated by the chips themselves.
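A quick back-of-envelope version of this in Python. The line capacitance is an assumed, illustrative value (a couple of picofarads, a plausible order of magnitude for a short board trace), not a measured figure:

```python
# Energy to charge one interconnect line from 0 V to logic-high:
# E = 1/2 * C * V^2, all of which eventually dissipates as heat.
C = 2e-12   # assumed line capacitance: 2 pF (illustrative, not measured)
V = 1.2     # logic-high voltage, volts

energy_per_transition = 0.5 * C * V**2  # joules per 0 -> 1 toggle
print(energy_per_transition)  # ~1.44e-12 J, about a picojoule per toggle
```

A picojoule per toggle is why the interconnect lines barely register next to the tens of watts the chips themselves dissipate.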
Yeah, but you're suggesting that whatever is consumed at the entry point from the power supply is entirely converted into heat that must be dissipated at once, without leaving anything for the time lag that occurs when information needs to be transferred.
Guys, help me decide, I am in desperate need!
I just purchased a brand new (early 2013 model) MacBook Pro w/ Retina 13'' (256GB SSD) for $1100 off Craigslist. It was factory sealed w/ receipt, and the warranty expires August 2014.
Should I keep this because I got it at such a great deal? Or should I try to resell it for ~$200 profit and put that towards the imminent Haswell refresh rMBP?
Thanks so much!!! (I am sooo tempted to open it!)
Information is a form of energy. How else is it stored?
Heat is kinetic energy... just "accumulated" over a specific volume or area.
And we don't live in a vacuum, so conservation of energy is not perfect. Thus the scenario proposed in the post I quoted is impossible in real life. If you don't believe me, feel free to measure the power draw of a MacBook and then account for the CPU/GPU. See if it makes any sense.
It's a tiny fraction of 1%, but when you accumulate data transfers (note that these CPUs send information millions of times per second), it does add up to a significant amount.
Correct me if I'm wrong, but that's an impossible scenario.
And also, we don't live in a vacuum, so perfect conservation of energy is inherently impossible. Again, please correct me on that (and win a Nobel prize) if I'm wrong.
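To put rough numbers on the "it adds up" point: here is a sketch that scales the per-toggle energy by an assumed bus width and transition rate (both values are illustrative assumptions, not specs of any particular CPU):

```python
# How per-toggle interconnect energy accumulates over time.
C = 2e-12              # assumed line capacitance, farads
V = 1.2                # logic-high voltage, volts
lines = 64             # assumed bus width (illustrative)
toggles_per_sec = 1e9  # assumed transition rate per line (illustrative)

# Average power = energy per toggle * lines * toggle rate
power = 0.5 * C * V**2 * lines * toggles_per_sec  # watts
print(power)  # ~0.09 W: real, but small next to a ~45 W CPU package
```

So both posters can be right: the accumulated interconnect power is measurable, yet still well under 1% of what the package dissipates.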
Why you little... get over to the "I just ordered.." thread..
Congrats!
We will all migrate there eventually.
How should an end user measure/estimate “real world” server power?
It is reasonable that end users would want to know the power their servers are expected to consume when
running "real world" workloads. Estimating server power based on worst-case TDP specifications will
result in overestimating server power. Intel and AMD both agree on this point: "It is of little value to
measure power consumption by only looking at the spec sheets for different components, adding the
totals together, because these generally only report the maximum power consumption."
As such, an ACP-like value seems reasonable at first glance. However, ACP only gives you the power of the
processors when running "real world" applications. It doesn't help estimate the power dissipated by
the other components in the server, such as memory, hard drives, I/O boards, disk controllers, etc.
All is not lost, however, because there is a very accurate way to measure server power under a user's
"real world" conditions. And again, Intel and AMD both agree on this point: "The best way to measure
a server's power consumption is the power meter, an inexpensive tool that is plugged into the wall;
your device, such as a server, can then be plugged into the power meter. The meter displays the wattage
drawn 'at the wall' and allows you to analyze the power consumption under a variety of different
utilization levels."
So if an accurate "real world" power value is needed, simply measure it with a power
meter. Because of normal component power consumption tolerances, it is recommended that more
than one server is measured. Power can vary slightly even between identically configured servers from
the same vendor.
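A minimal sketch of the procedure the excerpt recommends: take "at the wall" meter readings from several servers at a few utilization levels, then average them and look at the spread between units. All server names and wattage readings below are made-up examples:

```python
# "At the wall" power-meter readings, one list per server (made-up data),
# taken at three different utilization levels.
readings_w = {
    "server-1": [312.4, 305.9, 318.2],
    "server-2": [308.1, 301.3, 315.6],
    "server-3": [316.0, 309.7, 321.4],
}

# Average each server's readings, then summarize across the fleet.
per_server_avg = {name: sum(w) / len(w) for name, w in readings_w.items()}
fleet_avg = sum(per_server_avg.values()) / len(per_server_avg)
spread = max(per_server_avg.values()) - min(per_server_avg.values())

print(per_server_avg)
print(f"fleet average: {fleet_avg:.1f} W, spread: {spread:.1f} W")
```

The spread between identically configured units is exactly the unit-to-unit variation the excerpt warns about, and it is why measuring more than one server matters.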
I chickened out and canceled it! Since I'm not doing anything that will really use the 650M and I'm not in a hurry to replace my current MBP, I decided to wait and see what the Haswell model brings to the table. It looks like the Iris Pro will serve my purposes, and it might bring me some extra battery life and cooler temperatures. And the refurb models will probably still be around later as well.
More waiting for me!
Jolly good! Welcome back to the BEST thread on MacRumors!
Thanks! I'll be here waiting with the rest of you.
Including Iris Pro, which makes me wonder where exactly it would fit if the rMBP is to get the 755M.
Maybe these are actually for the iMac, but when was the last time Apple used multiple dGPU providers? It's possible, but considering they'll go with AMD for the Mac Pro, I find it highly unlikely.
Do people really not care about the Mac so much that basically no leaks show up? I mean, I get that the iPhone has the whole "millions of units produced in semi-sketchy factories in China" type deal. But no one even cares enough about the Mac to bother? IMO, it being only 30% of their sales doesn't mean anything; I think it's because there are a lot fewer being produced, since it's a more expensive product. But all the leaks and rumors are from random websites, and they're mostly speculation based on nothing. Not even semi-factually based assumptions come out for the Macs. Besides Geekbench leaks — which, thank god for that haha. How annoying would it be if we didn't even see Iris Pro as a potential GPU?
My thoughts as well; the 755M seems more likely destined for the iMac than for a dGPU rMBP model.
the mac ... being only 30% of their sales doesn't mean anything ....
I think you'll find it's been a few years since the Mac generated 30% of Apple's revenue or profits. I think it's now below 15%. I remember noticing in the last quarterly report that iTunes revenues would soon exceed Mac revenues.
The 755M is basically the 650M; it's the same damn thing.
I'm using my brother's old ThinkPad T500 temporarily.
I've been thinking about creating a "Drying Paint" thread for people to watch during the lulls that occur in this one...
If the Haswell (r)MBP is announced on September 10th, when will it generally be available in retail stores? I am going to be in Mexico until mid-September, and if it is available by then, I would very likely cross the US border and get a 13" rMBP base model. With the € to $ conversion rate and the tax refund, I'll likely save 500 compared to the price in Spain...