I designed CPUs for 10 years at AMD so I know all about overclocking. Of course I was talking about lowering, not raising, voltage. And I'm talking about dynamically changing voltage, not BIOS settings. My question is whether the Intel chips actually reduce voltage at low loads, because that's what was suggested. No AMD chip I ever worked on changed voltage dynamically. It had a recommended min and max voltage, it was set in the BIOS, and that was it (ignoring gamers with special motherboards and software).


I know the mobile CPUs reduce voltage when on battery
 
I designed CPUs for 10 years at AMD so I know all about overclocking.

I don't believe you. If you were really a chip designer at AMD for 10 years, you would know far more than anybody on this board could possibly answer.

You'd also be a member of far more specialist email lists and boards, which would give you much more technical answers, better suited to your needs.
 
I don't believe you. If you were really a chip designer at AMD for 10 years, you would know far more than anybody on this board could possibly answer.

You'd also be a member of far more specialist email lists and boards, which would give you much more technical answers, better suited to your needs.
Ditto.
 
Hopefully there will be a corresponding drop in prices, as there should be, which would be a big deal for Intel processors.
 
I don't believe you. If you were really a chip designer at AMD for 10 years, you would know far more than anybody on this board could possibly answer.

You'd also be a member of far more specialist email lists and boards, which would give you much more technical answers, better suited to your needs.

The fact that I know a ton about AMD chips from the K6-II to the Athlon 64 doesn't mean I know what Intel chips designed after I quit the business do. Anyway, here's some proof. My name is Cliff Maier:

http://ieeexplore.ieee.org/iel3/4/13972/00641683.pdf?arnumber=641683
http://www.spoke.com/info/pArH3f1/CliffMaier
http://ieeexplore.ieee.org/iel5/4/18325/00845192.pdf?arnumber=845192
http://www.ecse.rpi.edu/frisc/theses/MaierThesis/
 
I know the mobile CPUs reduce voltage when on battery

Yeah, I figured that, but I didn't know they reduced voltage based on demand (via SpeedStep). Running under a volt is pretty aggressive.

Yes, they do. The Dell 14z uses an ultra-low-voltage Core 2 Duo processor, and it is advantageous in quietness, heat, and power consumption. That processor is clocked at 1.33 GHz, yes, 1.33.

That's frequency, not voltage.
 
Tock, Tock

2010 will be a big year for Apple. All that good iPhone revenue has pushed them to open more stores. They will increase their share of the diminished high-end market now. I see an MBP refresh and then a Mac Pro refresh in early 2010. Perhaps it's just good timing now that Windows 7 has launched?
 
The fact that I know a ton about AMD chips from the K6-II to the Athlon 64 doesn't mean I know what Intel chips designed after I quit the business do. Anyway, here's some proof. My name is Cliff Maier:

http://ieeexplore.ieee.org/iel3/4/13972/00641683.pdf?arnumber=641683
http://www.spoke.com/info/pArH3f1/CliffMaier
http://ieeexplore.ieee.org/iel5/4/18325/00845192.pdf?arnumber=845192
http://www.ecse.rpi.edu/frisc/theses/MaierThesis/

From what I read, they threw the P4 design in the trash and used Centrino (or whatever the name is) from their Israeli design team. Now all CPUs from laptop to server are Centrino-based.
 
Actually, your "hence" is backwards. It would draw less power, hence be cooler.

UNLESS adamw meant that they run cooler and would hence draw less power using a fan! ;)

Anyway, I'm in the market to replace my 3.5-year-old MacBook, so I'm very happy to hear about this. When do you guys think MBPs will start showing up with these chips in them?
 
...Macworld has been moved to February 9th-13th this year and is struggling to find its place after Apple's decision to stop attending the event.

I'm hoping it might turn out like others that persist after a sponsor departs--persist because it was the spirit of the participants that made the event great, not the sponsor. We'll find out in February.

Yes, we'll miss Steve's keynote and Apple's sprawling product display.

Missing the opportunity to see Steve in action is a huge loss, like visiting New York City and discovering Broadway's closed. But speakers like David Pogue will entertain and inform. And IF Apple announces a new product prior to Macworld, I wouldn't be surprised to see an Apple bigwig on stage showing the faithful the company's next big thing. I didn't go to prior MWSFs knowing I'd see the intro of the first iMac, iPod, or iPhone, but I'm glad I went. Those events made up for the Steve-O keynotes where people left dejected because there hadn't been a "one more thing" to marvel at. Even with Apple's corporate absence, its products will be shown and sold by other vendors on the Moscone floor. And the Apple Store's only three blocks away if I need more hands-on time.

Unchanged is the presence of (maybe not as many?) thousands of Mac fans, enthusiastic about the products they use, eager to see and try new software to exploit those products, to handle and compare new hardware to extend them, and to meet other users to discuss the challenges they've encountered and the cool solutions they've found.

Whether or not the administration has withdrawn its support, if enough of the faithful gather around the bonfire, the pep rally can still be a hell of a party.

Bluto: "Over? Did you say 'over'? Nothing is over until we decide it is! Was it over when the Germans bombed Pearl Harbor? Hell no!"
 
this may be slightly off topic, but:

what would heavily use the graphics card on your MBP when you're not:
a) gaming
b) rendering some video (or doing some heavy audio work? dunno)
c) watching some HD movie on a big-ass TV, streaming it from your laptop

/EDIT: wait: audio? That's just ridiculous now that I think about it. Unless it's using the 1337 NVIDIA CUDA technology on the gfx card (or is that totally crazy?)

- Editing/browsing photos
- Using Quick Look, Cover Flow, or any other fancy UI elements in Finder or iTunes
- Using iChat/Photo Booth (though these aren't super intense)
- Even watching HD video on your computer takes a lot of processing power
 
Reducing voltage would save power, but what makes you think they do that? Typically voltage is constant in a CPU, and is determined by the required noise margin (which is a function of device threshold voltage and wire crosstalk). Voltage cannot be reduced below the point where the noise margin dominates, and is usually fixed above that point.

Does Intel really reduce voltage when reducing the clock?

Absolutely. You need a certain minimum voltage to make the chip work at all. Above that, you need to increase the voltage to make the chip work at higher clock speeds, up to the point where you can't increase it any more because the chip melts.

So yes, both clock speed and voltage are changed at the same time, at least since Core Duo times. And because of that, power consumption grows roughly with the square of clock speed (because higher clock speed requires higher voltage to work).
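
For reference, the standard first-order expression for dynamic CMOS power (a textbook approximation, not a statement of Intel's exact behavior) is:

```latex
P_{\text{dyn}} = \alpha\, C\, V^{2} f
```

where \(\alpha\) is the switching activity, \(C\) the switched capacitance, \(V\) the supply voltage, and \(f\) the clock frequency. If \(V\) has to rise roughly linearly with \(f\), dynamic power scales more like \(f^{3}\), so the square-law claim above is, if anything, conservative.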
 
Absolutely. You need a certain minimum voltage to make the chip work at all. Above that, you need to increase the voltage to make the chip work at higher clock speeds, up to the point where you can't increase it any more because the chip melts.

So yes, both clock speed and voltage are changed at the same time, at least since Core Duo times. And because of that, power consumption grows roughly with the square of clock speed (because higher clock speed requires higher voltage to work).
Cool 'n' Quiet was introduced with Athlon 64/Opteron. SpeedStep has been around since Pentium III and more so on mobile parts.
 
Cool 'n' Quiet was introduced with Athlon 64/Opteron. SpeedStep has been around since Pentium III and more so on mobile parts.

Yeah, but we didn't change the voltage back in those days - we changed frequency using a clock divider but kept the voltage constant, because we needed a minimum slew rate to avoid noise failures.
 
What problems do I see for Apple?

With the launch of Windows 7, which is hyped in the media, has very good reviews, and enjoys very high overall acceptance, PCs have won back a lot of previously lost credibility.

HP has already announced (and others will follow, I'm sure) that they'll be able to sell notebooks with integrated graphics cheaper than before: the C2D and Arrandale CPUs are similar in price, but building the motherboard will be cheaper, since there are fewer chips/parts to incorporate onto the board.

The biggest competitor to the 13" MBP is the MacBook, which will probably be faster in some cases than an Arrandale MBP with no dedicated graphics, since OpenCL (a big selling point for SL) is fully supported by the 9400M. This will most likely be the case with applications that use a lot of processing power, e.g. Photoshop (OpenCL support); for Safari you won't notice the difference in CPU power anyway.

Skipping the Arrandale chips entirely would cause a loss of credibility among Mac users. Even hardcore Apple fans would have quite a difficult time explaining to themselves why they bought a laptop for twice the money of that notebook from the discounter, which is faster and has newer components.

I'm in the market for a new 13" MacBook, but a plain Arrandale with integrated graphics wouldn't do it for me. Not because I can't live with the Intel graphics (I'm sure I wouldn't even notice the difference most of the time), but because the overall value would be worse than it is now (and it's not that good at the moment either, given the $250 premium over the MacBook). I use my MacBook 95% of the time for internet, music, video, etc., so I probably don't need more processing power, and I'm fairly certain I'm not the only one here with that kind of use. That doesn't mean, however, that I'm willing to settle for low-end components in a machine priced like a mid- to high-end one. I think everybody here is willing to pay a bit more for a Mac, me included; I just have a problem with getting ripped off by Apple. And one thing is for sure: they won't make the MBP cheaper than it is now, because then the normal MacBook would be obsolete.

Again, I'll have no problem with the Intel graphics if Apple compensates for the loss of graphics power in other ways. OK, I buried the dream of Blu-ray a while back, but I still think replacing the SuperDrive (again, I'm only talking about the 13" version) with a big HDD (e.g. 640GB) and putting a 64-128GB SSD in place of the normal hard drive would be a good trade-off. Replacing the display with an IPS-type panel (non-glare, pleeeease) would also be something I could agree with.

In the end, only time will tell, but at least in my case Apple will have to make it worth it. (I'm sure they will somehow *praying*)
 
I'm hoping it might turn out like others that persist after a sponsor departs--persist because it was the spirit of the participants that made the event great, not the sponsor. We'll find out in February.

I think it's highly probable that this will be the last Macworld. With the state of the web today, nobody really needs a conference just to view third-party offerings. Without Apple's presence, interest (and media coverage) will fall off rapidly.
 
The fact that I know a ton about AMD chips from the K6-II to the Athlon 64 doesn't mean I know what Intel chips designed after I quit the business do. Anyway, here's some proof.

Thanks for answering, and I accept you're probably who you say you are. It was a bit odd to see someone with your background and contacts come to a noobish board like this and ask the questions you were asking.

So yes, both clock speed and voltage are changed at the same time, at least since Core Duo times. And because of that, power consumption grows roughly with the square of clock speed (because higher clock speed requires higher voltage to work).

Make sense, Chris? That's how 4 cores at 25% clock speed can use less power than 1 core at 100% clock speed.

Of course, there's some extra CPU work needed, due to thread management overheads etc. But because of the square-law rule, it's plausible that 4 cores at 40%, even 50%, clock speed could use less power in total than a single core at 100%, while doing far more work, even after management overheads. And it gets better the more cores you have (within reason).

That's why there's been such a big drive for thread management and developer core awareness in the last few years. It's preparing for the 128-core chip of the future. And the 1024-core chip after that.
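
A quick back-of-envelope version of that argument (a toy model with illustrative numbers, not measurements; `relative_power` is just a made-up helper):

```python
# Toy model of the square-law argument above (illustrative only).
# Assumes dynamic power P ~ V^2 * f, with voltage scaling somewhere
# between "barely" (P ~ f^2) and "linearly" (P ~ f^3). Real chips
# also have leakage and shared uncore power, ignored here.

def relative_power(freq_fraction, cores, exponent):
    """Power relative to one core at 100% clock."""
    return cores * freq_fraction ** exponent

for label, exp in (("square law", 2), ("cube law", 3)):
    one_core = relative_power(1.0, 1, exp)    # 1 core @ 100% -> 1.00
    four_cores = relative_power(0.4, 4, exp)  # 4 cores @ 40%
    print(f"{label}: 1x100% = {one_core:.2f}, 4x40% = {four_cores:.2f}")

# square law: 4x40% = 0.64 -> less total power, ~1.6x the throughput
# cube law:   4x40% = 0.26 -> far less power
```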
 
Interesting.

1. Arrandale is dual core + hyperthreading, which is often better than dual core.
On Intel's latest spin of Hyper-Threading it is almost always better than plain dual core. I say almost because it is always possible to find a benchmark where there is little advantage. The good thing, though, is that it is very hard these days to find an HT CPU losing big time, like the old HT versions did (a quick way to test this yourself is sketched at the end of this post).
2. Arrandale uses the improvements made in Nehalem, so it has higher performance than Core 2 Duo at the same clock speed.
Yes, but it is still dual core on 32nm. I suspect Intel will surprise us with the top clock rate. That top rate might be Turbo Boosted, but I suspect Arrandale will be able to run at that rate for longer periods of time than any other processor. It really looks like 32nm is a very sweet process for Intel.
3. Snow Leopard is all about making better use of multiple cores.

This is my greatest concern about Arrandale: the lack of more real cores (HT gives you logical threads, not physical cores). I could see Apple skipping this processor in the 17" and maybe even the 15" MBP, as SL will not be able to fully leverage the hardware. Apple really needs a professional laptop for people who need raw performance on the road. The MBPs are very good, but they really are not raw-power, transportable-class machines. It is just that a lot of things are done on location these days that could use the power.

In any event, people are missing one important product line here: the Mini. It is the Mini that really stands to gain with Arrandale. A 2.66 GHz Arrandale-based Mini with a decent GPU would really rock, especially with some of ATI's new very-low-power GPUs. Squeeze Light Peak in there and it would become one versatile little box.
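
For what it's worth, here's a crude way to check the HT claim for yourself (a sketch, not a rigorous benchmark; `burn` is hypothetical busy-work, and real HT gains depend heavily on the instruction mix):

```python
# Time the same batch of CPU-bound jobs with increasing worker counts.
# If the jump from 2 to 4 workers still helps on a dual-core + HT chip,
# the extra logical cores are earning their keep.
import time
from multiprocessing import Pool

def burn(n):
    # hypothetical CPU-bound busy-work
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    jobs = [2_000_000] * 8
    for workers in (1, 2, 4):  # one core, both cores, cores + hyperthreads
        start = time.time()
        with Pool(workers) as pool:
            pool.map(burn, jobs)
        print(f"{workers} workers: {time.time() - start:.2f}s")
```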


Dave
 
New MBP

If I buy the new MBP in January,
an i7 Intel CPU + a brand-new NVIDIA GPU
would be enough for me (I think so)
:apple:
 
No, I'm not saying 1.33 GHz is low voltage; I never said that. I said that it has a very low-voltage C2D which runs at 1.33 GHz, meaning that lower-voltage processors generally have lower clock speeds.

Still not getting your point. Of course lower-voltage processors have lower clock speeds. The discussion was about dynamically adjusting voltage in response to load.
 
Going to try to help you a bit here.

Yeah, but we didn't change the voltage back in those days - we changed frequency using a clock divider but kept the voltage constant, because we needed a minimum slew rate to avoid noise failures.

I'm not a chip designer; frankly, I'm an automation tech. But you could save a lot of trouble here if you just printed out the formula for power draw in CMOS electronics. I can't remember it off the top of my head.
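
Looking it up, the textbook first-order form splits total power into a dynamic term and a static (leakage) term (treat this as approximate; it's from a general CMOS text, not any Intel datasheet):

```latex
P_{\text{total}} = \underbrace{\alpha\, C\, V^{2} f}_{\text{dynamic}} + \underbrace{V\, I_{\text{leak}}}_{\text{static}}
```

Only the dynamic term varies with clock rate; the static term depends on voltage and temperature, not frequency.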

In any event, beyond static power draw (leakage), power in CMOS logic varies with clock rate. Apparently many here don't know the difference between a GHz and a volt.

As to DYNAMIC voltage adjustments to a running CPU, I do not believe that is happening with x86-class processors, but I could be wrong, as it has been a very long time since I cared about such things. That being said, it wouldn't be impossible, as modern CPUs can or do control the voltage regulator. From a technical standpoint I suspect that dynamic voltage adjustment would be a big problem for this class of CPU, and frankly, dynamic clock gating and shutting off idle functional blocks would be much bigger gains. Things do change, so I could be wrong, but I'd rather see documentation instead of random claims from people who mix GHz and volts.

As to those claiming that 4 CPUs running at a quarter of the clock rate of a single core draw less power: yes, that is possible, given the same process and CPU design. But there are qualifications here. For one, your static power has to be very low; otherwise a good portion of your power budget goes to keeping those four cores turned on. Then you have to consider efficiency of execution: if the software in question requires a fast CPU, you lose by running on a slow core. In other words, 4 CPUs running at 1/4 the clock rate of a single core does not always equal single-core performance. I see Intel's Turbo Boost as recognizing this. In the end, a multi-core chip is likely to use more power than a single-core chip to complete a quantum of single-threaded work.

Dave
 
Haha, I believe we just got told:
[image]
 