
View Full Version : Apple is granted a patent for increasing the operating frequ...


MacBytes
Nov 15, 2005, 11:46 AM
http://www.macbytes.com/images/bytessig.gif (http://www.macbytes.com)

Category: News and Press Releases
Link: Apple is granted a patent for increasing the operating frequency of an electronic circuit (http://www.macbytes.com/link.php?sid=20051115124612)

Posted on MacBytes.com (http://www.macbytes.com)
Approved by Mudbug

lopresmb
Nov 15, 2005, 01:08 PM
Hmmm, controllable (and more importantly, non-warranty-voiding) overclocking — that sounds pretty cool.

Also, it might be a way to get a leg up on competitors who are using the same processors in Wintel machines.

mdavey
Nov 15, 2005, 01:43 PM
Hmm, could this patent mark the end of quoted CPU clock speeds in computer specifications?

I don't know about the G4 and G5, but many CPUs aren't manufactured with a definitive clock speed in mind - once they come off the production line, each unit or batch is characterised and its clock frequency set with resistors (typically the presence or absence of zero-Ohm jumpers).

There is often no component difference between, say, a 1.25GHz processor and its 1.67GHz counterpart, just that manufacturing tolerances meant that one batch dissipates heat a little better or consumes a little less power than another.

It sounds like this patent will allow CPUs to run at different clock frequencies as long as a CPU temperature limit isn't exceeded. Which might mean that modders could upgrade the fan or heatsink to get better performance, and that sysadmins who locate their equipment in cool server rooms could get better performance. For modders who like to go to extremes (liquid nitrogen cooling, etc), the possibilities are near endless ;)
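To make the idea concrete, here is a toy sketch (not from the patent - the frequencies, temperature limit, and step size are invented for illustration, loosely based on the 1.25GHz/1.67GHz figures above) of a governor that creeps the clock upward while there is thermal headroom and backs off when the die gets too hot:

```python
# Toy model: step the clock up while below a temperature ceiling,
# step it back down toward the baseline when the ceiling is crossed.
# All numbers are hypothetical.

def adjust_clock(current_mhz, temp_c, base_mhz=1250, max_mhz=1670,
                 temp_limit_c=85, step_mhz=50):
    """Return the next clock frequency given the current die temperature."""
    if temp_c >= temp_limit_c:
        # Too hot: back off toward the baseline rate.
        return max(base_mhz, current_mhz - step_mhz)
    if current_mhz < max_mhz:
        # Thermal headroom available: creep upward.
        return min(max_mhz, current_mhz + step_mhz)
    return current_mhz
```

With better cooling (bigger heatsink, colder room), the temperature stays under the limit longer, so a loop calling this settles at a higher frequency - which is exactly why the modder and server-room scenarios would pay off.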

shamino
Nov 15, 2005, 01:48 PM
Software controlled clock speeds?

How is this different from what's done now? Right now, most processors are designed so software can reduce the speed to something less than 100%, as needed to reduce power consumption.

This looks like the exact same thing, with only a change of the documentation. So instead of running at 100%, with reductions during idle time, you're normally running at 90%, with both increases and decreases from the baseline value.

This reminds me of the bit from Spinal Tap, about amplifiers going to eleven.

Qunchuy
Nov 15, 2005, 03:18 PM
How is this different from what's done now? Right now, most processors are designed so software can reduce the speed to something less than 100%, as needed to reduce power consumption.

This looks like the exact same thing, with only a change of the documentation. So instead of running at 100%, with reductions during idle time, you're normally running at 90%, with both increases and decreases from the baseline value.
The difference is in the idea of a "sustainable" clock rate, and the invention is about the ability to exceed that rate for a limited amount of time when necessary. The abstract talks about being able to run at >100% for as long as it can deal with the increased power dissipation. It's not about saving power, it's about running as hard as it can at any given time without destroying itself.
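A toy sketch of that distinction (my own illustration, not anything from the patent text - the clock values and budget size are made up): the chip runs above its sustainable rate only while an accumulated "boost budget" of excess dissipation remains, and the budget refills whenever it drops back to the sustainable rate.

```python
# Toy boost governor: >100% is allowed only for a limited number of
# steps, tracked by a budget that refills while running sustainably.
# All numbers are hypothetical.

def next_clock(budget, want_boost, sustainable=2700, boost=3000,
               budget_max=5):
    """One step of the governor: return (clock_mhz, new_budget).

    budget counts how many more steps the chip may spend above its
    sustainable rate before it must drop back and cool off."""
    if want_boost and budget > 0:
        return boost, budget - 1                      # spend boost budget
    return sustainable, min(budget_max, budget + 1)   # refill while cool
```

The point is that "100%" stays meaningful as the indefinitely sustainable rate; the boost above it is explicitly time-limited by how fast the part can shed the extra heat.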

shamino
Nov 15, 2005, 04:54 PM
The difference is in the idea of a "sustainable" clock rate, and the invention is about the ability to exceed that rate for a limited amount of time when necessary. The abstract talks about being able to run at >100% for as long as it can deal with the increased power dissipation. It's not about saving power, it's about running as hard as it can at any given time without destroying itself.
This is no different from any other software-controlled clock. They're just renaming what "100%" is.

What they're describing can be done right now. Take a 2.7GHz chip and overclock it to 3GHz. Now update the pre-existing Energy Saver code so that it drops down to 2.7 whenever it gets too hot for the cooling system to keep up with, and you're done.

As for what is "sustainable", with enough cooling, almost anything is sustainable. KryoTech made an entire business out of selling massively-overclocked PCs, using refrigerant/compressor-based cooling systems to keep them stable.

phelix_da_kat
Nov 15, 2005, 07:06 PM
I was thinking..

Apple is sometimes "cloak and dagger" about submitting patents, until it is ready to be "used" in a product..

Could this be one of Apple's ways of managing the temperature/performance of their new rumored laptops? :rolleyes:

Some_Big_Spoon
Nov 15, 2005, 08:21 PM
Must spring from all their expertise in overclocking the hell outta the G4 for all these years :cool:

mdavey
Nov 16, 2005, 10:51 AM
Must spring from all their expertise in overclocking the hell outta the G4 for all these years :cool:

Hmm. Any developers know if it is possible to overclock the G4 or G5 processors in the current Apple range, via software? That is, do those resistors provide a 'hint' to the Kernel or do they physically select the clock frequency such that the software has no control over the clock frequency at all?

Perhaps 10.4.3 implements this patent and that is why the fans go mad when the system boots or installs a software update - the OS is trying to complete the task as quickly as possible.

shamino
Nov 16, 2005, 12:08 PM
Perhaps 10.4.3 implements this patent and that is why the fans go mad when the system boots or installs a software update - the OS is trying to complete the task as quickly as possible.
Assuming there is no overheating situation going on at the time, this is probably due to the fan-control process running at too low a priority.

IIRC, the fan-control circuits in a G5 switch into high gear whenever they stop receiving information from the OS. This is a fail-safe mechanism to prevent overheating in the event of a system crash. This is also why they race during bootup (because the fan-control software hasn't started running yet).

If some CPU-intensive task isn't allowing the fan-control software enough time to run, that would also cause the fans to race, slowing down again once the software gets a chance to send another speed-control message to the fans. Ideally, this shouldn't happen, but it is possible because OS X is not a real-time OS.
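That fail-safe pattern is simple enough to sketch (this is my own toy model of the behaviour described above, not Apple's actual firmware - the timeout and duty-cycle numbers are invented): the controller obeys the OS's requested fan speed only while updates keep arriving, and falls back to full speed once they stop.

```python
# Toy watchdog fan controller: run the fans flat out unless the OS
# has checked in recently. Timeout and duty values are hypothetical.

class FanController:
    TIMEOUT_S = 2.0  # how long to wait before assuming the OS is gone

    def __init__(self, now=0.0):
        self.last_update = now
        self.requested = 30   # percent duty cycle requested by the OS

    def os_update(self, duty_percent, now):
        """Called by the fan-control software each time it gets to run."""
        self.last_update = now
        self.requested = duty_percent

    def duty(self, now):
        """Duty cycle the hardware actually drives the fans at."""
        if now - self.last_update > self.TIMEOUT_S:
            # No word from the OS: assume a crash and go to 100%.
            return 100
        return self.requested
```

This also explains the racing-fans symptom: a starved (or not-yet-started) fan-control process looks exactly like a crashed one to the watchdog, so the fans spin up until the next update gets through.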