Honestly, this has to be the single best and most comprehensive answer I have ever received on here. Normally the response is just a baseless assumption marked by overly inflated self-congratulation. Rumors are one thing, and they are great fun, but informed comments based on legitimate questions are a completely different situation. Thanks again!
Leopard does not require Shader Model 3.0, nor will it. Most Apple users don't have a graphics card that supports it. It may well become relevant in 10.6 or 10.7, but it probably still won't be required, just as Shader Model 2.0 isn't absolutely required now.
Not at all. The chipset and CPU are quite distinct from each other; a single chipset can support several generations of vastly different CPUs. The venerable Intel 945, for example, supports Pentium 4, Pentium D, Core Duo, Core 2 Duo, and Core 2 Extreme CPUs (as well as Celeron derivatives). Thinking of them as a single unit places unnecessary and inappropriate limits on the technology.
In short, the chipset supports the operation of the system as a whole. It contains all the limiting logic of the motherboard--what memory types and speeds are supported, the socket/FSB/models of CPU supported--and it packages the onboard ethernet, sound, video, and wireless as applicable to the platform. In most respects, the chipset is the logic board in the traditional Apple sense (i.e. it is "the" computer, minus the RAM and CPU [and graphics card, for systems without onboard graphics]).
The CPU, on the other hand, is just the part that does the calculations. It is responsible for most of the performance software sees (graphics aside); the chipset plays an important but less dramatic role in raw speed. Where the chipset has become the key limiting factor in the modern age is upgradability--onboard graphics can't be upgraded; you can't exceed the type and quantity of RAM it supports; you can only upgrade to CPUs the chipset supports. Exceeding any of these limits requires a new computer.
I wouldn't say that Santa Rosa's major contribution is in the area of graphics. Graphics is certainly the most visible and directly applicable improvement for end users, but Santa Rosa is really the next major step in the comprehensive overhaul of the x86 platform as we know it. We aren't going to be seeing massive speed jumps like we experienced five years ago; the technology we have is mature, and improvements will be incremental barring any major breakthroughs.
This is why we're having multiple cores pushed onto us--it's a way of extending Moore's Law while single-core scaling in the semiconductor industry has essentially stalled. In its defense, home users rarely need even as much power as is available to them now. There's no real necessity pushing the home-user market forward.
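To illustrate why extra cores aren't a free lunch: software only benefits if the work is explicitly split across them. Here's a minimal sketch (my own hypothetical example, not anything Apple ships--the function names are made up) of summing a range by farming chunks out to separate processes:

```python
# Hypothetical sketch: two cores help only if the program divides its
# work between them. A single-threaded sum() gains nothing from a
# second core; this version splits the range into per-worker chunks.
from multiprocessing import Pool

def partial_sum(bounds):
    lo, hi = bounds
    return sum(range(lo, hi))

def parallel_sum(n, workers=2):
    # One chunk per worker; the last chunk absorbs any remainder.
    step = n // workers
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else n)
              for i in range(workers)]
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    print(parallel_sum(1_000_000))
```

The catch, of course, is that most home-user software (email, browsing, word processing) doesn't decompose this neatly, which is exactly why more cores don't translate into an obvious need for home users.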
Updates usually do happen after the "back to school" sales have depleted inventory. This is advantageous for everyone--students get good deals and help clear out extra inventory, allowing for faster shipment of updated models. Students simply don't need the absolute newest and cutting edge to ship before school starts; in the traditional marketing sense, they're budget-oriented shoppers distracted by shiny objects. The unusual student seeking the best and newest would schedule purchases around the end of the first semester to capitalize on the typical release schedule--and those students aren't first-time computer owners.