Hardware Up-to-Date program
For those reading this thread who've bought a Mac since June 8th and are looking to get Snow Leopard, I'd recommend purchasing early, via the options on the page below. The Terms & Conditions state they allow up to 4 weeks for the disc to ship to you, so you're best off getting your order in soon (assuming first come, first served) to reduce that waiting time.
http://www.apple.com/uk/macosx/uptodate/ and
http://www.apple.com/macosx/uptodate/ (presumably there are other country-specific Up-to-Date program links).
It would be nice to be able to buy the OS at a Mac store at the reduced price (just coming in with the money and proof of a qualifying purchase), but that seems unlikely to happen. Still, not too shabby for a 10.x update.
Grand Central Dispatch
PDF from Apple on Grand Central Dispatch here. To recap: GCD is Apple's approach to multicore computing, shifting the responsibility of managing threads and their execution from applications to the OS. We'll see how it pans out - it isn't a panacea. To quote a few sections:
Woven throughout the fabric of Mac OS X version 10.6 Snow Leopard, GCD combines an easy-to-use programming model with highly efficient system services to radically simplify the code needed to make best use of multiple processors. The technologies in GCD improve the performance, efficiency, and responsiveness of Snow Leopard out of the box, and will deliver even greater benefits as more developers adopt them.
Power play: A little reduction in clock speed can go a long way in reducing power consumption. That’s because the relationship between clock speed and power consumption isn’t linear. The numbers vary with specific processor models and manufacturing processes, but reducing the clock speed of a processor by as little as 20 percent can cut the power consumption of the processor by nearly one-half. And you can add a second core to that processor at the reduced clock speed and nearly double the performance while seeing just a tiny increase in overall power consumption.
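A rough back-of-the-envelope for why those numbers work (my own reasoning from the standard CMOS dynamic power model, not from Apple's PDF): dynamic power scales roughly as voltage squared times frequency, and since supply voltage can usually be lowered along with the clock, power ends up scaling close to the cube of frequency. A 20 percent clock reduction then gives 0.8^3 ≈ 0.51 - roughly half the power - and two such cores deliver about 1.6x the throughput of one full-speed core for around 1.02x the power.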
Units of work are described as blocks in a developer's code. Queues are used to organize the blocks based on how the developer believes they need to be executed. This moves away from threads and thread managers. GCD reads the queues created by each application and assigns work from the queues to the threads it is managing. The threads are managed based on the number of cores available and the demands being made at any point in time on the computer. Apple lists the benefits of GCD as improved responsiveness, dynamic scaling, better processor utilization and cleaner code.
Blocks are an extension of C (and Obj-C and C++). A block in code is denoted by a caret at the beginning of a function literal. E.g. to declare a block and assign it to x:
void (^x)(void) = ^{ printf("hello world\n"); };
This turns the variable x into something callable, so that invoking x(); later in the code prints the words hello world.
"What’s really powerful about blocks is that they enable you to wrap much more complex functions—as well as their arguments and data—in a way that can be easily passed around in a program, much as a variable can be easily referenced and passed."
There is mention on the Snow Leopard Refinements page that the Instruments application - part of the Xcode tools supplied with every 10.6 disc - provides capabilities to analyse GCD usage and performance on multicore Macs. Developers using 10A380 discs might be able to analyse Apple's apps to see what their use of GCD is. (Also of note: this will be relevant to iPhone developers too, as the iPhone will go multicore within a year.)
A number versus an animal
Paul Thurrott from winsupersite.com (who is also on TWiT's Windows Weekly with Leo Laporte) has an article on Windows 7 vs Snow Leopard here.
I'd disagree on some points (e.g. the claim that no 10.x update has been a major update). Technologically, are both Windows 7 and Snow Leopard minor, evolutionary updates? Thurrott states that Windows 7 is a major update from a user-experience perspective, while Snow Leopard is not. This is part 1 of a series...
Edit - It is interesting to see him hold his position on Microsoft (that he can't do Mac, that his wife shouldn't, etc.) even while admitting to certain things - if you selectively quoted him, he'd sound like a Mac guy. At least he ain't as bad as Enderle or Dan Lyons.
Intel rebranding, renaming
Intel has given a statement explaining its branding going forward. The Core CPU range will be split into i3, i5 and i7 categories, each a different tier of performance:
i3 - Lower performance
i5 - Mid-range performance
i7 - Maximum performance
But then of course, wouldn't CPUs need to be rebranded each year, as they become slower relative to newly released CPUs?
Lynnfield, the upcoming desktop chip, will fit into either the i5 or i7 tier.
The Clarksfield mobile chip will be an i7.
The Centrino brand is being phased out for CPUs and will in future be used for Wi-Fi and WiMAX products.
VP & Director of Corporate Marketing at Intel Deborah Conrad: ""In the back half of this year you'll begin to see Core i5 and more Core i7s coming to market. Then by the first part of next year you'll begin to see Core i3, and i5, i7. Then the old names will get retired as those products get phased out."
So they'll have:
Celeron for entry-level computing
Pentium for basic computing
Atom for all those devices they can cram it into.
Core i3 desktop and mobile
Core i5 desktop and mobile
Core i7 desktop and mobile
Clarksfield (a slower Lynnfield-derived quad-core based on 45nm Nehalem) will become Core i7.
Some Lynnfield CPUs might end up as Core i7, slower ones as Core i5.
Arrandale (32nm mobile) will initially be Core i3, then Core i5 and Core i7 as well.
Clarkdale (32nm desktop) will be Core i3 & Core i5.
So, in an attempt to simplify, we've got a year or so of both naming schemes on the market. Maybe it's a way to obfuscate the Clarksfield and Arrandale release period, as well as to clear up some confusion for normal customers (a bigger i is better).
The problem is that a faster clock speed isn't necessarily better: you have to take into account the number of cores, whether it's 45nm or 32nm, etc. You've also got to factor in the GPU and how well it works with the CPU now - the digital divide, as this article calls it.