
t0mat0

macrumors 603
Original poster
Aug 29, 2006

John Siracusa's review is **here**

This thread is intended to be a collation of information regarding Snow Leopard (abbreviated to SL), both the standard and Server editions.

Starting from Known knowns: Apple’s Snow Leopard page

UPDATE

When - August 28th
What

“A Quantum Leap”. No solace required. Billed as changing its focus, “taking a break from adding new features” and building on Leopard
  • Delivering “a new generation of core software technologies” to
    - streamline Mac OS X
    - enhance Mac OS X, including improving quality.
  • Reduce the OS footprint
  • Out-of-the-box support for Microsoft Exchange 2007 built into Mail, Address Book, and iCal (using the Exchange Web Services protocol).
  • Grand Central
    - A set of technologies to improve performance
    - Makes “all of OS X multi-core aware”
    - Optimises Mac OS X for “allocating tasks across multiple cores and processors”
    - Helps developers by making it easier to create programs that can effectively use the power of multiple cores and processors (a minimal sketch of what this looks like in code is below, after this list).
  • Extension of 64-bit technology in Mac OS
    - Allowing up to a theoretical 16TB maximum of RAM (No word on what type)
  • QuickTime X
    - Streamlined platform for modern media and internet.
    - Optimised support for modern codecs
    - More efficient media playback
  • Through Safari, delivering faster JavaScript (which should also benefit web apps such as MobileMe)
  • OpenCL (Open Computing Language)
    - A language to help developers use the power of GPUs (graphics processing units) and redirect it for general purpose computing.
    - Broadly akin to GPGPU efforts elsewhere.
    - Please watch this tutorial series for more! http://www.macresearch.org/opencl_episode1
  • ZFS - Not mentioned on the normal SL page, but confirmed for the SL Server edition here
    - Read & write support for the 128-bit ZFS file system
    - Features such as storage pooling, data redundancy, automatic error correction, dynamic volume expansion, snapshots.
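
As a concrete illustration of the Grand Central bullet above, here is a minimal sketch in C using the libdispatch API that Snow Leopard ships. The specifics (queue choice, iteration count, printed text) are my own illustration, not taken from Apple's materials:

```c
/* Minimal Grand Central Dispatch sketch (illustrative only).
   Build on Mac OS X 10.6 or later: cc demo.c -o demo
   Blocks (the ^{...} syntax) are a C extension supported by Apple's compilers. */
#include <dispatch/dispatch.h>
#include <stdio.h>

int main(void) {
    /* A system-managed concurrent queue; GCD decides how many threads to use. */
    dispatch_queue_t queue =
        dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);

    /* dispatch_apply runs the block 8 times, spreading the iterations
       across however many cores the machine has, then waits for all of them. */
    dispatch_apply(8, queue, ^(size_t i) {
        printf("task %zu running\n", i);
    });
    return 0;
}
```

The point of the API shape: the developer describes units of work, and the system (not the app) decides how to spread them over the available cores.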


Who
Apple, obviously. Intel. NVIDIA, and more.

Why
To make money, serve the stockholders, and to create useful Apple products - software and hardware.
Basically, Apple has sweetened the pot to get people to move to OS X through the low upgrade price, and new features.

How Much


The T&Cs say US and Canada only.
$9.95* for a single user if you buy a Mac between June 8 and September (via the Snow Leopard Up-to-Date program)
$29 to upgrade from Leopard for a single user.
$49 for a family pack to upgrade Leopard machines.
* "Covers product plus shipping and handling fee. U.S. customers add appropriate sales tax."

The same program upgrades qualifying Xserves ("MA882LL/A, MB449LL/A, Z0E7, Z0FR, Z0GM"), or an Apple Certified Refurbished Xserve bought from the Apple Online Store on or after June 8, 2009, to Mac OS X Server v10.6 Snow Leopard Unlimited Client for $9.95 plus tax.

UK page here

Happily, these are lower prices than previously thought. I stand corrected, and am happy that Daniel was, at least this time, wrong. Terms and conditions apply; Apple reserves the right to change the terms of this offer at any time without notice, etc.

Amazon is taking pre-orders for Snow Leopard. Upgrade is available for Intel based Macs currently running Leopard. Tiger users can buy a Mac Box Set. From the front page:

- Mac OS X version 10.6 Snow Leopard ($29.00)
- Mac OS X Snow Leopard Family Pack (5-User) ($49.00)
- Mac Box Set - (with Snow Leopard) ($169.00)
- Mac Box Set Family Pack with Snow Leopard (5-User) ($229.00)
- Mac OS X Server version 10.6 Snow Leopard ($499.00)

Note: Purchases through these Amazon links benefit MacRumors financially.

Amazon has also begun offering pre-orders of OS X Snow Leopard family packs, Mac Box Sets bundling Snow Leopard with iLife '09 and iWork '09, and Snow Leopard Server.


Breaking that down, the main topic areas currently are:
  • Level of multi-touch in SL
  • Usage of SL as a basis for the OS of future iPhones and other "electronic devices"
  • Microsoft Exchange 2007 support, and how much other Windows support SL will have
  • “Grand Central” dispatch & threads
    - how it ties in with Intel and their multiple core chips, and also with multiple processors.
    - How it ties in with developing Mac OS X apps, and also iPhone apps
  • How 64-bit technology and a 16TB maximum (along with the other mentioned factors) will affect computing
  • Quicktime X
  • Safari and JavaScript - how this will be deployed, looking at AIR, Silverlight and others (e.g. Prism).
  • OpenCL
    - How it will tie in with SL and Grand Central, and who Apple goes to for the GPU chips/boards for GPGPU use
    - How it will affect computing, alongside Grand Central
    - Its rivals on the scene, and how Intel will play its part.
If anyone's interested in this, please say hello and contribute - I'm all for useful, info-packed threads, with as many sources and as much multimedia thrown in as possible. Unfortunately it's a pain in the RSS to use the MacRumors guide wiki section currently, so a thread it is.

There's a list of sources on page 2 (the word limit is biting).

Some guides for Snow Leopard - here
The state of play currently



As the wiki on Intel's Tick-Tock strategy notes:
Tock - Intel Core microarchitecture of 2006
Tick - Shrink/derivative (i.e. Penryn) 45 nm of 2007
Tock - New Intel microarchitecture (i.e. Nehalem) for 2008
Tick - Shrink/derivative (Westmere) 32 nm for 2009
Tock - Future Intel microarchitecture (i.e. Sandy Bridge) for 2010

So Penryn's successor is expected to be Nehalem. Nehalem will bring back Hyper-Threading, which will crop up soon enough. SL will release after Nehalem, potentially around Westmere, and before Sandy Bridge in 2010.
 


t0mat0

macrumors 603
Original poster
Aug 29, 2006
Does Apple have a plan to greatly expand its position?
Mashing up this source amongst other things.

The shift from single core processors to multiple core processors is "probably the single most disruptive thing that we will have done in the last 20 or 30 years." Microsoft Chief Research & Strategy Officer Craig Mundie

If Apple gets ahead of the market in taking advantage of (multiple) processors with multiple cores, it could expand into the server, supercomputer, gaming and heavy-usage computing fields, whilst bringing the benefits to all consumers. Is Microsoft in such a position?

The Mac is poised for innovation over the next few years on a scale that we haven’t experienced since the initial move to OS X in the previous decade.

With the iPhone 3G a few days from release, this platform is stable and just starting to take off. There are hardware and software complaints, and we'll see whether they'll be taken care of (a flash for the camera, video recording, a front camera, a higher-megapixel CMOS sensor, etc.). Can the iPhone and Touch get away with maintenance and evolution, rather than revolution? Quite possibly.

Implication: Apple’s best hardware and software teams may now have time to work on really interesting Mac stuff.

For processors, it's less a MHz race than before, and more about cores & sockets. Beyond the multi-core multi-processor environment, we also have GPUs and custom chips on the scene too.

What's on the horizon?
Raw performance of the upcoming CPUs, the ability to harness it effectively with things like Grand Central, and the potential for either hybrid CPU/GPUs or GPUs complementing CPUs will bring in a whole lot more processing power. Apple, for its part, wants to make this tech useful and easy to use for consumers, and easy to implement and access for developers. Snow Leopard basically blows the top off the theoretical top-end Mac configuration under Leopard: fully 64-bit, up to 16TB of RAM, and dual-socket Nehalem chips to start off with.

Cast
NVIDIA - The graphics chip leader
Intel - The #1 CPU maker
Apple - The #1 OS ;)
AMD - The #2 chip maker
Microsoft - The #>=2 OS provider

Processors - Intel

Remember the Guinness surfer advert? Same deal. Tick follows tock follows tick follows tock. Like Apple, good things come to those who wait!

2007 - Tick - Shrink/derivative to 45nm (i.e. Penryn (aka Yorkfield))
2008 - Tock - New Intel microarchitecture 45nm (i.e. Nehalem (aka Bloomfield))
2009 - Tick - Shrink/derivative to 32nm (i.e. Westmere (aka ?))
2010 - Tock - Intel microarchitecture 32nm (i.e. Sandy Bridge (aka ?))

Core microarchitecture
(An Intel CPU core roadmap is at the bottom which may help)

In 2006, the Intel Core architecture was unveiled. Core chips had Virtualisation Technology (virtualisation support), Intel 64 (Intel's implementation of x86-64) and SSSE3. They were based around an updated version of the Yonah core and could be considered the latest iteration of the Intel P6 microarchitecture, which traces its history back to the 1995 Pentium Pro.

The Core processor lines - with differing socket use, bus speed, power consumption etc:
  • Merom - for mobile computing
  • Conroe - for desktop computing
  • Woodcrest - for servers/workstations

    Branding:
    Mobile processors - Core 2
    Desktop processors - Core 2
    Low end Core processors - Pentium Dual Core
    Low end Core processors - Celeron
    Servers and workstations - Xeon

    Just to note - confusingly, Intel processors branded as "Intel Core", e.g. the 65nm Yonah processor and its variants, do not use the Intel Core microarchitecture despite the name.

    Core 2
    The Core 2 brand refers to a range of CPUs based on the Intel Core microarchitecture.

    The Core 2 processor lines:
    • C2S - Core 2 Solo: single-core CPU
    • C2D - Core 2 Duo: dual-core CPU
    • C2Q - Core 2 Quad: quad-core CPU
    • C2X - Core 2 Extreme: dual/quad-core CPU

    Branding - (Lower is newer - e.g. Penryn is successor to Merom):
    Laptops
    Merom (65nm) - Core 2 Solo/Core 2 Duo/Core 2 Extreme(Dual)
    Penryn (45nm) - Core 2 Solo/Core 2 Duo/Core 2 Quad/Core 2 Extreme (dual and quad)

    Desktops
    Conroe (65nm) - C2D/C2X (Dual)
    Allendale (65nm) - C2D
    Wolfdale (45nm) - C2D
    Kentsfield (65nm) - C2Q/C2X (Quad)
    Yorkfield (45nm) - C2Q/C2X(Quad)

    Servers and Workstations:
    (Intel brands servers & workstation Core 2 CPUs as Xeon processors. Xeon CPUs generally have more cache than their desktop counterparts in addition to multiprocessing capabilities. (afaik - it takes peons eons, to work out Xeons...))

    Dual-core Xeons:
    5100-series Woodcrest
    5200-series Wolfdale (45nm)
    7100-series Tulsa (65nm)
    7200-series Tigerton
    3000-series Conroe
    3100-series Wolfdale

    Quad-Core Xeons:
    3200-series Kentsfield - Relabelled C2Q processor
    3300-series Yorkfield - Relabelled C2Q processor
    5300-series Clovertown - Consists of 2 Woodcrest chips in one package
    5400-series Harpertown
    7300-series Tigerton - A 4 socket and greater capable

    (The fastest Harpertown is the X5482 and is also sold under the name "C2X QX9775" for use in the Intel SkullTrail system).
    (The Clovertown X5365 is among the fastest processors, performing up to ~38 gigaflops = 0.038 teraflops )

    Future versions:
    Whitefield (cancelled)
    Aliceton
    Dunnington - Last of the Penryn generation - a single die 6 core processor
    Gainestown - Based on Nehalem microarchitecture
    Beckton - 8 or more core Nehalem processor. Based on Nehalem microarchitecture.

    The successor to Penryn is Nehalem, the 32nm shrink of Nehalem is Westmere.
    Release schedule: Penryn chips coming out up to ~Q3 08, Nehalem for late 08

    Intel seems to be holding some cards close to its chest. It has Larrabee (more in other posts), a graphics platform built around x86 cores that can natively execute CPU code - i.e. when not rendering 3-D graphics, it can donate processing cores to general CPU work.
    Intel also has Nehalem coming very soon, then the Sandy Bridge platform in late 2009/early 2010 (which is expected to integrate Larrabee onto a single die with quad-core (or more) processors, leading to improved performance).
    • Before that, the first CPU-plus-graphics parts are set to be released in 2 flavours - both based on the Nehalem CPU architecture, one being a desktop chip (Havendale), the other a notebook chip (Auburndale).
    • Auburndale & Havendale will have 2 Nehalem cores paired with a graphics subsystem. The twin cores will share 4MB of L2 cache and feature an integrated dual-channel memory controller that supports memory configurations up to DDR3-1333 apparently.
    • The graphics subsystem will be initially derived from Intel’s G45 integrated graphics. This indicates that neither Auburndale nor Havendale will be for heavy graphics processing, but will be more of an integrated graphics replacement.
    • According to Intel roadmaps, the new processors are expected to enter the market in H1 2009 - which is slap bang when Snow Leopard is pencilled in for release.

    Intel's Senior VP Patrick Gelsinger says Xeon MP versions of Nehalem will be up to octo-core processors, and will use HT. Most "Enterprise software vendors charge by the socket and not by the number of CPU cores." If this were to change, we could see quad sockets become more widely available, I'd imagine. Until that time, more cores per socket is desired, due to licensing costs... Apple currently only has a dual-socket configuration. Could they go hardcore 4-socket crazy?

    How does Apple sell all that power to customers? What are the compelling reasons, the killer apps? It's about getting people to start using apps that use this. Video, 3G, The internet in your pocket, a supercomputer at your desk.


    Graphics
    The main players of the graphics market are
    • NVIDIA's GeForce
    • AMD's Radeon
    • Intel in 2009.

    Interesting for the discrete market, a side step for the integrated market. So what's NVIDIA doing? Is it getting into the CPU business? Trying to buy VIA - and with it an x86 licence and a line of CPUs? Making an offer to buy AMD? ATI?


    Custom chips:
    Could Apple use either graphics card manufacturer's GPUs for GPGPU use? Or will it make its own chipset, including GPU(s), for performance?
    Apparently, Nehalem could allow Intel to integrate a graphics core into the processor, if it wanted to. So this may very well be where we see the introduction - Larrabee could fit right into this plan.

    Other tech: WiMAX, LTE, WiUSB, eSATA, A-GPS, SSD, USB 3.0, FW1600 (Edit - now out & coming to a Mac near you)/3200...

    Applications for GPGPU
    Snagged from the wiki
    • Computer clusters or a variation of parallel computing (utilizing GPU cluster technology) for highly calculation-intensive tasks:
      - High-performance clusters (HPC) (supercomputing) including distributed computing.
      - Grid computing (a form of distributed computing) (networking many heterogeneous computers to create a virtual computer architecture)
      - Load-balancing clusters (a server farm)
    • Physics-based simulation and physics engines (e.g. Newtonian-style physics models) inc. cloth, hair, fluid flow (liquids, smoke)
    • Segmentation – 2D and 3D
    • CT reconstruction
    • Fast Fourier transform
    • Tone mapping
    • Audio signal processing inc. for digital, analog & speech processing
    • Digital image processing
    • Video Processing
      - Hardware accelerated video decoding and post-processing (Vista has it. Come on Snow Leopard!!)
      - Hardware accelerated video encoding and pre-processing
    • Raytracing
    • Scientific computing - weather, climate forecasting, molecular modelling inc. X-ray crystallography
    • Bioinformatics
    • Computational finance
    • Medical imaging
    • Computer vision
    • Neural networks
    • Cryptography and cryptanalysis

    I'd imagine SIGGRAPH 08 & 09 will be buzzing with this stuff.




    CUDA - Compute Unified Device Architecture.

    Good long read here

    An SDK and API - a C compiler and a set of development tools that let programmers use C to code "algorithms for execution" on the GPU. Developed by NVIDIA, CUDA requires an NVIDIA GPU (G8X upwards, including the GeForce, Quadro & Tesla lines). It gives developers access to the native instruction set and memory of the massively parallel computational elements in CUDA GPUs. The CUDA SDK was initially made public in February 2007. So, put simplistically (as the wiki says), through CUDA the NVIDIA GPUs can be turned into powerful, programmable open architectures like today's CPUs (Central Processing Units).

    What might be helped by this? For the gaming industry, physics calculations - including debris, smoke, fire, fluids. Wiki provides the links to http://www.biomedcentral.com/1471-2105/8/474 and http://www.biomedcentral.com/1471-2105/9/S2/S10 for the acceleration CUDA gives for non-graphical computation in computational biology/other fields.

    Advantages over general-purpose computation on GPUs (GPGPU) using graphics APIs:
    • Uses the standard C language, with some simple extensions
    • Code can write to arbitrary addresses in memory.
    • CUDA exposes a fast shared memory region (16KB in size) that can be shared amongst threads. This can be used as a user-managed cache, enabling higher bandwidth than is possible using texture lookups.
    • Faster downloads and readbacks to and from the GPU
    • Full support for integer and bitwise operations

    Cons:
    • CUDA-enabled GPUs are only available from Nvidia
    • Texture rendering & recursive functions are not supported
    • Deviation from the IEEE 754 standard.
    • Potential bottleneck of Bus bandwidth and latency between the CPU and the GPU.
    • Threads must run in groups of at least 32 threads that execute identical instructions simultaneously. Branches in the program code do not impact performance significantly, provided that each of 32 threads takes the same execution path; the SIMD execution model becomes a significant limitation for any inherently divergent task (e.g., traversing a ray tracing acceleration data structure).

    You can see examples of what CUDA can do here (it's Flash-based).

    Why? From Beyond 3D's article:
    - Neither DirectX nor OpenGL are made with GPGPU as their primary design goals, thus limiting their performance
    - Arbitrary reads and writes to memory while bypassing the caching system (or flushing it) are still not supported in the Direct3D 10 API

    AMD: Streaming Close to the Metal

    The commercial successor to CTM (Close To Metal) is the AMD Stream SDK, released in 2007.
    Like CTM, Stream SDK provides tools for general-purpose access to AMD graphics hardware.

    Differences:
    "The idea behind CTM is that there is efficiency to be gained by giving an experienced programmer more direct control to the underlying hardware.
    CTM is thus "fundamentally [an] assembly language. CUDA on the other hand aims to simplify GPGPU programming by exposing the system via a standard implementation of the C language. At this point in time, the underlying assembly language output (also known as "NVAsc") is not exposed to the application developer.

    "CUDA exposes the NVIDIA G80 architecture through a language extremely close to ANSI C, and extensions to that language to expose some of the GPU-specific functionality. This is in opposition to AMD's CTM, which is an assembly language construct that aims ot be exposed through third party backends. The two are thus not directly comparable at this time."

    Market areas: chipsets, graphics, handhelds, desktops, visualisation, near-real-time and real-time rendering. Example uses: rigid body physics, matrix numerics, wave equation solving, biological sequence matching, finance.

    GPGPU: General-purpose computing on GPUs (graphics processing units)
    From the wiki: Made possible "by adding programmable stages and higher precision arithmetic to the rendering pipelines, which allows software developers to use stream processing on non-graphics data."

    Basically, this expands the purpose of a GPU from just accelerating parts of the graphics pipeline to using it for general-purpose computation, accelerating the computer's non-graphics-related work as well. There are certain restrictions in operation and programming - GPUs are best suited to problems that can be solved using stream processing, i.e. processing things in parallel by running "a single kernel on many records in a stream at once."

    A stream being "a set of records that require similar computation. Streams provide data parallelism."
    Kernels are the functions that are being applied to each element in the stream. e.g. in GPUs, vertices & fragments are the elements in streams, with the kernels to be run on them being vertex & fragment shaders.

    "The most common form for a stream to take in GPGPU is a 2D grid because this fits naturally with the rendering model built into GPUs. Many computations naturally map into grids: matrix algebra, image processing, physically based simulation, and so on."


    Apple's position? OpenCL, on a post below.
 


KingYaba

macrumors 68040
Aug 7, 2005
If Snow Leopard is more than $40.00 I will not pay for it - unless, for whatever reason, performance will increase substantially on my MacBook Pro.
 

t0mat0

macrumors 603
Original poster
Aug 29, 2006
In short: 10.6 = $129, won’t likely set any sales records, not aimed to irritate the PPC user-base, but will be something quite substantial as we’ll see in the years ahead.

A mash up of a series of ongoing articles by Daniel Eran Dilger at roughlydrafted.com

Myths of Snow Leopard 1: PowerPC Support June 16th, 2008

Snow Leopard is going to be the first version of Mac OS X that only runs on Intel Macs. PowerPC Mac users will still be able to use Leopard, which will get updates. The new features in SL are primarily going to impact multicore, multiprocessor machines (OpenCL, Grand Central, and the new 64-bit kernel).

Universal Binaries of the new Mail, Address Book, and iCal could be made, if Exchange support is wanted. Will 3rd party developers keep making Universal Binaries? If it's in their interest - yes (e.g. if they already have their PowerPC code, and it's easy to update the UB).

Of the 27-odd million Mac OS X installed user base, Leopard is on over a third, Tiger on over a third, and the rest on an earlier version, so there is still a user base to sell Universal Binaries to, if that user base buys enough software.

The article purports that with the Universal Binary architecture, developers can target both Intel & PowerPC Macs with minimal extra effort. (Adobe being an exception)

Will SL definitely be Intel only? We have yet to find out - but the signs are it likely will be.

Myths of Snow Leopard 2: 32-bit Support June 17th, 2008

Myth 2 - Apple is dropping support for 32-bit Intel Macs because Snow Leopard is 64-bit.

Apple doesn't have a problem, but Windows does have a 32/64-bit conundrum, as Microsoft has to ship multiple architectures of Windows XP, Vista, and Server:

  • IA-32, for the 32-bit Intel x86 architecture of most PCs (the same as that of the early Intel Macs)
  • IA-64, for the 64-bit Intel EPIC architecture developed for Itanium and largely unused by anyone (now Windows Server only).
  • x64 (aka AMD64), the 64-bit x86 architecture developed by AMD and copied by Intel after the humiliation of IA-64’s failure. It is the mainstream 64-bit PC architecture, and is used in the latest Core 2 Duo Intel Macs.

Windows' 32/64-bit Conundrum:
Its IA-64 & x64 versions of Windows run from different code bases, and offer poor compatibility with a lot of existing 32-bit software. Also, 64-bit versions of Windows don’t run on 32-bit PCs, meaning that the market for developing 64-bit drivers and apps for Windows is artificially small, and can’t get bigger because there are software barriers to adopting 64-bit PCs that are of Microsoft’s own doing. That chicken and egg problem has no solution outside of Microsoft figuring out how to merge its code versions together, which it doesn’t have the time, inclination, or expertise to do.

Microsoft is betting users will upgrade to 64-bit PCs and yet continue to buy and run the old 32-bit version of Windows until it can manage to clean up the sticky bits and deliver a 64-bit, EFI savvy version of Windows for the mass market. (Daniel noting it's unlikely to happen by Windows 7 in 2010, but perhaps Windows 8 in 2013 will deliver what Mac OS X Leopard did in 2007.)

Apple’s solution?
  • Run whatever binary is appropriate to the existing hardware (processor architecture). This has allowed Apple to support both PowerPC & Intel hardware across a user base of ~20-27 million Mac OS X users since 2006.
  • Support the conventions of running either 32 or 64-bit code.
    - If Leopard runs on a 32-bit Mac (e.g. 1st generation Core Duo machines) it runs the 32-bit binary.
    - If Leopard runs on a 64-bit Mac (e.g. the latest Core 2 Duos) it runs the 64-bit binary.
  • Universal Binary approach allows it to ship one edition of Leopard and will similarly enable one edition of Snow Leopard. In order to take advantage of 64-bit processor features, apps need to package a 64-bit version of their executable into their Universal Binary.

    The majority of Leopard system apps delivered by Apple are 32-bit (the only 64-bit apps shipping with Leopard apparently being Xcode, Chess, and httpd (Apache’s daemon)). A move to 64-bit helps apps that need to access large memory spaces, e.g. apps working with large files or data sets. Moving to a 64-bit binary can make some apps run slower, as they have more memory to manage, but overall, shifting the entire OS to 64 bits should deliver a system-wide performance improvement. So, all system apps in Snow Leopard will be 32+64 Universal Binaries, unlocking more of the latent performance available in modern 64-bit Macs.
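
    As a concrete (and purely illustrative) example of how one code base serves both worlds: with Apple's tools the same C source can be compiled into a 32-bit and a 64-bit slice of a single Universal Binary (e.g. cc -arch i386 -arch x86_64 demo.c -o demo), and the OS picks the slice appropriate to the Mac it runs on. The file name and messages below are made up; the __LP64__ macro is what lets code tell, at compile time, which slice it is being built as:

```c
/* Illustrative only: one source file, two architectures.
 * Built with Apple's compiler as a Universal Binary, e.g.:
 *   cc -arch i386 -arch x86_64 demo.c -o demo
 * Mac OS X then runs the 64-bit slice on 64-bit Macs and the
 * 32-bit slice elsewhere. __LP64__ is defined only for 64-bit builds. */
#include <stdio.h>

int main(void) {
#ifdef __LP64__
    printf("running as the 64-bit slice (pointers are %zu bytes)\n", sizeof(void *));
#else
    printf("running as the 32-bit slice (pointers are %zu bytes)\n", sizeof(void *));
#endif
    return 0;
}
```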

    (Tuning these apps to perform as well on 64-bit G5 PowerPC Macs would be a large investment for little upside, benefiting a relatively small number of G5 owners at the expense of diverting resources from optimizing the performance of today’s much faster 64-bit Intel processors. That helps to explain why Snow Leopard is dropping PowerPC support.)

    Snow Leopard is also moving to a 64-bit kernel (from Leopard's 32-bit kernel), an essential step in supporting more than 32GB of RAM. Developers will therefore need to deliver 32+64-bit versions of all their kernel extensions and device drivers. All plugins will also need to provide 64-bit support, including printer drivers. It'll be interesting to see how the creation and introduction of these new drivers is managed compared with Vista...

    By Mid 2009, when Snow Leopard is estimated to come out, Apple will have sold ~8 million more Macs, most of which will benefit from 64-bit software support in Snow Leopard. With 15 million Intel Macs already sold, that would create a ~23 million Intel Mac installed base for which Snow Leopard would be most relevant.
    In comparison:
    - At the release of Tiger, the entire installed base was 16 million Macs.
    - At the release of Leopard, there was a 22 million Mac installed base.
    - Currently there is a 27 million Mac installed base, 12 million of which are PowerPC.

    Thus 64-bit Macs are quickly outnumbering the actively used PowerPC models. The Omni Group reports that 83.5% of its customers actively updating their software are on Intel Macs, compared to just 16.5% on PowerPC Macs - a hint as to why Apple can drop PowerPC support in Snow Leopard. Snow Leopard is for where the state of play will be in mid 2009, and where the puck will be after then.

    64-bit models - Apple's LP64 vs Window's LLP64
    Windows’ 64-bit development model is based on LLP64, which is really a 32-bit model that uses 64-bit addresses.
    Apple's 64-bit LP64 model is not only more broadly compatible but is also more powerful.

    Why did Microsoft choose the LLP64 model? Microsoft essentially hoped to add 64-bit pointers to allow apps to access more RAM while retaining 32-bit integral types for compatibility with all of the legacy operating system constructs in Windows that made assumptions about 32-bit code. However, Microsoft should have been fully aware that 64-bit computing was coming a decade ago, when it was doing its work on porting NT to the 64-bit Alpha and again in its efforts to port Windows to the Itanium IA-64. Why the great compromise of tacking on partial 64-bit support as an afterthought today?
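
    The practical difference between the two models is easy to see from C. The following snippet (my own illustration, not from the article) prints the sizes of the basic types: under LP64 (Mac OS X, Linux, the commercial Unixes) long and pointers are both 8 bytes, whereas under LLP64 (64-bit Windows) long stays at 4 bytes and only pointers and long long grow to 8:

```c
/* Quick check of a platform's 64-bit data model (illustrative sketch).
 * LP64  (Mac OS X, Linux, Unix): int 4, long 8, long long 8, pointer 8.
 * LLP64 (64-bit Windows):        int 4, long 4, long long 8, pointer 8. */
#include <stdio.h>

int main(void) {
    printf("int:       %d bytes\n", (int)sizeof(int));
    printf("long:      %d bytes\n", (int)sizeof(long));
    printf("long long: %d bytes\n", (int)sizeof(long long));
    printf("void *:    %d bytes\n", (int)sizeof(void *));
    return 0;
}
```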

    Whatever the reasons, Microsoft's choice of "an oddball, legacy-limited version of 64-bit computing", whether the "result of malice or just plain incompetence", is the opposite of Apple's.

    Apple's 64-bit LP64 model, used in Leopard and Snow Leopard, is the same one used by the 64-bit versions of Linux, SGI IRIX, and other commercial distributions of Unix. That ensures broad compatibility with the 64-bit applications and libraries already available. Kinda useful.

    In "64-Bit Programming Models" (here, representatives of groups with expertise in 64-bit computing (including Digital, Hewlett-Packard, IBM, Intel, Novell, NCR, the original Santa Cruz Operation, Sunsoft, & X/Open) delivered a joint explanation of why LP64 is a better model for the future of computing than the LLP64 model Microsoft chose.

    The short answer Daniel gives: LP64
    • Supports easier porting of existing code
    • Supports interoperability between 32 and 64-bit computing environments
    • Has industry standard compliance for cross-platform interoperability
    • Has better performance
    • Gives a smoother transition from existing systems.

    The upshot? Microsoft, and developers trying to work with Vista & Windows 7, will struggle with the transition to 64 bits. Apple will be furthering its lead in deploying 64-bit computing to mainstream consumers in a highly interoperable, no-compromise strategy that can backwardly support existing 32-bit hardware. For end users, Snow Leopard will simply make everything faster when running on the latest 64-bit hardware. Apple is hiding a lot of planning and work under the guise that Snow Leopard is just about taking stock and refining Mac OS X.

    With this in mind, the 3rd myth is easier to unravel.
    A message seems to be that Apple has a superior design for 64-bit; however, the 64-bit apps themselves have yet to be created.


    Myths of Snow Leopard 3: Mac Sidelined for iPhone June 19th, 2008

    It's not like the iPhone is getting much attention, is it? With limited comments on Snow Leopard (due mid 2009) there is a myth that Apple is de-emphasizing the Mac as it focuses attention on the iPhone.

    Snow Leopard is marketed currently as “taking a break” from adding major new marketing features. But this is just a supposed lack of new features (maybe just a good case of under-sell, over-deliver). "Software sells systems" ...

    Daniel's angle? "Apple postponed Leopard’s release on the Mac in order to prepare for the iPhone debut, not because it decided Mac sales weren’t important, but because Mac sales were through the roof and didn’t need Leopard to accelerate them."

    In comparison, look at the bad selling, over-selling and fraudulent mis-selling of Vista - e.g. the "Vista Capable" PC problem (currently an ongoing legal case), or XP living on alongside Vista (you can "buy" Vista and then "downgrade" to XP).

    Apple did have record unit sales in 2007. And Apple doesn't make too much money on OS sales; it makes money on systems - hardware. The iPhone might well have been rushed out, and needed more attention, but that hasn't taken Apple's long-term focus off the Mac side of Apple either. With the iPhone released, Leopard sold well, and was, as the author of the article said, "fashionably late".

    iPods and iPhones have helped sales of other Apple products, helped finance the retail store rollout, and widened the potential audience for the Mac. They have driven buyers to the iTunes Store, and to Safari. iPod sales are still high, despite the addition of millions of new iPhones. No need to talk about cannibalisation of sales.

    Also, the iPhone has been reaching out to persuade Windows users to consider the Mac platform. Daniel didn't provide hard numbers, but I'd imagine they aren't too hard to come by - the number of people converted to the Mac is growing, and the hardware sales are of decent-profit-margin devices (e.g. the cornering of the >$1,000 laptop market, and the MP3 player market).

    Another effect is that as consumers and execs become iPhone users, this increases the audience for the upcoming App Store, which in turn feeds into the appeal for developers to work on apps for iPhone, giving them a taste of Obj C, and Apple’s Cocoa development tools.

    The R&D from the iPhone, iPod and iMac have all been "cross pollinating" in terms of technology as well. Knowledge can be passed between the device areas, and integration can be created.

    Apple's development of an entirely new interface paradigm for the iPhone OS may well yet feed back into OS X. I'd think that the iPhone’s UIKit and SDK will benefit the Mac OS X AppKit (e.g. adding the modern convention of properties as a way to simplify the class interfaces for the iPhone, and then adding properties to the desktop AppKit in Leopard).

    Daniel comments that QuickTime X (in Snow Leopard) is another example of repurposing code retooled for the iPhone to provide highly efficient media playback. The extensive work on developing push support for Exchange Server on the iPhone will be included in other ways, as Exchange support baked into Snow Leopard. The work done on MobileMe may well be helpful too, in terms of research and tools, for Snow Leopard Server’s push services.

    Apple’s new Push Notification Service allows iPhone & iPod touch users to set up server-side notification alerts that don’t require any mobile applications to stay running in the background. Along with Bonjour discovery, PNS will keep iPhones wirelessly connected in all sorts of sophisticated ways that third party developers imagine in their applications.

    Why couldn't this be used in Snow Leopard too? The point is, the technologies Apple is working on are flowing back and forth within the business, it seems, helping not only to share and build upon existing ideas, but also to create combined value that is greater than the sum of its parts.

    We'll have to wait for Snow Leopard. But seeing as it's slated to arrive before Vista's successor, and the fact that Apple at any time will be able to drop more hints to the consumer, and beta versions to developers, no-one should be worried Apple is forgetting about the Mac platform. Apple's aim is to do a few things, well.

    Myths of Snow Leopard 4: Exchange is the Only New Feature! June 20th, 2008

    Well, to start, we can just look at the Snow Leopard and Snow Leopard Server pages on Apple.com, and see what is publicly listed as features...

    It helps Apple keep its work under the radar for a bit longer, and simplifies current marketing. Apple it seems has several reasons to promote the idea of "no new features", whilst promising overall improvements in how Mac OS X works under the hood (in a kind of "don't tell me how it works, just show it works" way).

    Apple has the opportunity to improve its code through:
    - code refactoring (Wiki definition: Code refactoring is the process of changing a computer program's code to make it amenable to change, improve its readability, or simplify its structure, while preserving its existing functionality. - Martin Fowler has apparently written in depth about refactoring)
    - Beyond code refactoring in its strictest sense, optimising the code
    - adding new features

    From the sounds of QuickTime X, Apple will be doing a mix of things. It has the opportunity to make 64-bit versions of apps, optimise the apps, add new features, and also pare the app size down.

    (Aside in the article: Bill Gates was a big fan of "new" rather than "better", as can be seen by quotes from him. In an interview with Focus magazine in 1995, he explained why his company cared more about adding new features than refactoring code to fix bugs:

    “The reason we come up with new versions is not to fix bugs,” Gates said. “It’s absolutely not. It’s the stupidest reason to buy a new version I ever heard. When we do a new version we put in lots of new things that people are asking for. And so, in no sense, is stability a reason to move to a new version. It’s never a reason.”

    Ouch. New features were easier to sell than the concept of good software, so Microsoft took the low road. Touting features, keeping schtum about any lack of improvements under the hood.

    Consumers' perception is part of the problem - consumers happily pay for hardware, but hate having to buy software. "They are well aware that the hardware they buy will soon be replaced by a faster model with more RAM at perhaps a lower price, but when it comes to software, every new release that “only” fixes bugs is regarded as something that “should have been” offered for free." It is also typically much harder to track down and eliminate bugs than to simply tack on more new features.

    It is possible to sell quality to the consumer though, and in part Apple can do this by not relying on OS sales for money, and not bothering too much about piracy of the OS - as Apple has a tight rein on the hardware. Apple will be able to inform users as to how Snow Leopard will be a better-quality product, and show the doubting Thomases the proof of the pudding - they'll be able to go into a store and see the improvements, and hear about them in reviews.

    Apple has the luxury of doing such things, as it isn’t facing an immediate need to out-feature Windows Vista. The company has announced that Snow Leopard will involve a lot of code refactoring to tighten up performance, improve reliability, and slim down disk consumption. The only new feature, according to Apple, will be new support for push messaging with Exchange Server. That isn’t exactly accurate however.

    In some respects, many of the new features in Snow Leopard can be regarded as a form of code refactoring because they will only improve how things work, rather than adding extensive new features. But there will also be a lot of new features that are just plain new.

    Apple will be hard at work driving home the point that the "just works" feeling on the iPhones and iPods also extends to the experience when using Macs running OS X 10.5/6.

    Why would someone want an upgrade for something that works reasonably well? Well, with Snow Leopard, the new OS will potentially show a decent performance benefit solely from the OS change, without needing any hardware updates, whilst also raising the performance bar of its hardware significantly. Makes a change from people actually paying more to "downgrade" to XP...

    Through iLife 09 and other applications, Apple can bring in many more features which will link in well with iPods, iPhones, and Macs.


    Myths of Snow Leopard 5: No Carbon! June 24th, 2008

    Is Apple killing Carbon so all apps will be Cocoa only? Not exactly.


    Carbon and Cocoa both compete with and complement each other. Should Carbon be ripped out, or slowly faded out? By shifting Carbon out of the frame, Apple can deliver a cohesive, consistent, and potentially more stable user experience while focusing its development efforts around a single strategy.

    Currently, Carbon apps include iTunes, Final Cut Pro, Photoshop, and a huge assortment of other important apps. Many apps are a mix of both. Whilst pure Cocoa apps can offer a more consistent user interface using less code, and benefit from other features - i.e. they represent better technology - the transition from Carbon to Cocoa isn't an overnight one.


    Cocoa is the modern incarnation of the object-oriented NeXTSTEP Objective-C (Obj-C) frameworks. Carbon is the extension of the classic Mac OS Toolbox; it was developed by Apple in order to pacify the complaints of existing Mac OS software authors during the development of Mac OS X after they rejected the move to Rhapsody, which would have essentially shifted Mac development to Cocoa in one great leap forward.

    It was not feasible to convince developers circa 1997 to write all their software over largely from scratch using a new approach and tools that demanded a significant investment in mastering new concepts. Also, NeXTSTEP’s desktop development tools and frameworks had been sitting in cold storage from around 1994 through 1997 as NeXT worked to repurpose its core technologies into developing web server applications in WebObjects - so developers would have had a hard time using those tools from a cold start.

    Apple needed to overhaul and modernize NeXT’s frameworks just as it needed to bring NEXTSTEP’s core OS foundation up to date with the latest software technology that had been delivered by the BSD development community over that period.

    Existing Mac developers obligated Apple to spend much of its efforts getting Carbon up to speed first before prioritizing updates to the new Cocoa frameworks. A large amount of functional overlap between the two APIs resulted in a hybrid model where most of the shared foundational core of Mac OS X was written in Carbon-like C/C++ libraries, and exposed as modern, object-oriented APIs using a layer of Cocoa frosting.

    In addition to software that had originated on the classic Mac OS and had been ported to native Carbon libraries, Mac OS X can also run POSIX software developed for Unix or Linux. Some of that software has an X Window System user interface (aka X11), which looks rather ugly and out of place on the Mac desktop, but can run just as it does on Linux thanks to integrated X11 support.

    Unix software without any GUI can be given one using Cocoa. That includes huge libraries of highly regarded code from OpenGL routines to the GNU FFmpeg media decoding libraries to BSD firewalls. When Apple developed Safari, it used an off-the-shelf, open source HTML rendering engine from KDE to produce WebKit, which it then wrapped in a Cocoa interface to deliver Safari as a Mac application. That modular design has enabled third parties to port WebKit to Windows, Linux, and even Nokia’s smartphones.

    Apple has also hinted at technology that would allow developers to access Windows DLLs to rapidly port device drivers or other specialized software to the Mac with little effort. The ability to take foreign software, whether open or proprietary, for use in creating native Mac OS X apps offers a look at how Carbon apps can migrate their user interfaces to Cocoa, resulting in user interface consistency and other benefits for users while resulting in less code for developers to maintain.

    Apple last year announced it would only be implementing a 64-bit Cocoa architecture, and not implementing a 64-bit Carbon architecture.
    Developers who need a 64-bit user interface will need to use Cocoa. This line in the sand enables Apple to focus its resources on developing a single object-oriented user interface API for the 64-bit future. Developers such as Adobe and Microsoft will need to either stay in the past or move decisively into the future (See Adobe's CS4 suite, Microsoft Office etc.).

    However, Apple plans to support and maintain the 32-bit Carbon Human Interface Toolbox well into the future, although it will not be adding any significant new features to those APIs. Snow Leopard will lead Carbon developers to Cocoa with carrots rather than just sticks - HICocoaView enables Carbon apps to add Cocoa features as an incremental step; eventually Carbon apps will be required to adopt a Cocoa user interface entirely, and whilst doing so, Apple will encourage developers to consider adopting the Cocoa frameworks for other parts of their apps as well.

    An example is Apple's own Finder in Leopard - largely a Carbon app, but it makes use of HICocoaView to embed Cocoa NSViews, such as when displaying Cover Flow. For Snow Leopard, the Finder will therefore apparently have its entire user interface rebuilt in Cocoa. Apple has indicated that will be happening in Snow Leopard. The company is clearly well aware of the effort needed to move to Cocoa, and is starting to lead by example.

    There are still murky areas that seem to be neither Carbon nor Cocoa (Core Video, Quartz, amongst other things), so there is still work to be done there too. By slowly raising the bar on what amount and which parts of an app should, at minimum, be Cocoa, Apple is clearly pushing developers toward Cocoa. Meanwhile, Apple continues to support legacy code. Office 2004 was written as a PowerPC CFM app, which requires Apple to host it on top of CFMApp, which itself runs on top of Rosetta on Intel Macs. It will continue to work as expected in Snow Leopard. Anyone who likes to say that Apple “doesn’t support legacy” hasn’t looked too hard at the hoops Apple has jumped through so Adobe and Microsoft wouldn’t have to bring their old code into the modern world.


    Myths of Snow Leopard 6: Apple is Out of Ideas! June 27th, 2008

    An article touching on aspects Daniel at roughlydrafted.com has already talked about in previous articles.

    Snow Leopard doesn't indicate Apple is out of ideas for new applications and features - it indicates it's not willing to promote and advertise features and applications it doesn't want to talk about yet.

    Marketing. Jobs and Apple aren't giving away their grand views of the road ahead, unlike Microsoft.

    Another aspect is the strange notion that having a list of new applications and features is better (maybe a hangup from drinking Microsoft Kool Aid) rather than wanting features and applications only on merit - only if they're useful, and worthy enough to be included.

    From what can be read between the lines of the known, confirmed Snow Leopard information thus far, Daniel makes the assertion that Apple has laid out a cohesive strategy for strengthening Snow Leopard’s performance and its suitability for running the next generation of software on the next generation of hardware. As Daniel says, clearly "Apple is being led by engineers, not just clever marketers."

    OpenCL, Grand Central, LLVM, ZFS, CUPS, QuickTime X, to name the advances that have been published - the marketers can get to work after the work is done...

    Apple's continuing investment in enabling technologies seems to be going to pay off again, when Snow Leopard rolls out. "When viewed within context of technology cross pollination with the iPhone, Apple’s Pro Apps, its consumer app suites, and its expanding role in online subscription software, it’s clear Apple is not running short of ideas. As for Snow Leopard, there’s still a lot to be revealed."


    From the comments:
    "OS X made the iPhone possible. The iPhone feeds OS X both financially and in feature demands. OS X matures further still as a desktop and a handheld platform par excellence. What’s not to like!"

    A line of thought coming from the comments and other articles is the possibility of Apple championing portability of the OS with Snow Leopard - e.g. having the OS on an SSD to give performance gains. The possibility of 10.6 being a (mini) "code review".

    "It’s nice to hear, read, and see Apple increasingly described as an engineering firm within the “Halo Effect” realm. The end-user products naturally garner the deserved accolades — design aesthetic, ease of use, ergonomic attention, stability, and the intangible sensory experience — from its consumers."


    Myths of Snow Leopard 7: Free?! July 1st, 2008

    Why $129?

    Selling Snow Leopard for Less Would Make Selling 10.7 at Regular Price Rather Difficult.
    - If Apple sold Snow Leopard at a steep discount as an apology for not adding fluff features, it would deflate the perceived value of Apple’s operating system software.
    - The main group to benefit from Snow Leopard will be owners of recent, 64-bit Macs who are likely to willingly pay full price to fully unlock the power of their existing hardware.
    - Everyone else is just as likely to just wait for Snow Leopard until they buy their next new Mac and are able to take full advantage of its advances.
    - Keeping the retail price of Snow Leopard unchanged wouldn’t help set any new sales records for a reference release of Mac OS X, but would help induce sales of new Macs, because buyers would think of new systems as including an additional $129 of software for free.

    Apple doesn't make much money from OS software sales. Apple, unlike Microsoft
    - does not sell bundled licensing to other hardware makers.
    - is forced to actually deliver a product that is good enough to convince the market to go out of its way to choose to buy it.
    - can't coast on a software licensing model like Microsoft’s (which has allowed MS to continue making money on sales of Windows XP for years despite minimal feature enhancements over the last half decade.)

    - has to work harder to add value and differentiation to the company’s OS software.
    - has to work hard to trumpet the retail interest in Mac OS X at every release

    4Q 2007 - Apple brought in $9.6 billion, almost entirely from Mac and iPod hardware. It "only" earned $170 million from sales of Leopard that quarter.
    1Q 2008 - Retail box OS sales quickly dropped down to $40 million.

    There are no compelling reasons to lower the price of Snow Leopard - Apple doesn't need to induce volume sales to broaden its installed base, and has no direct rival that it has to compete against.

    Apple would rather you buy a new computer, than give away Mac OS X. Most of Snow Leopard’s features announced so far exploit the potential of new and forthcoming hardware. The primary purpose of Mac OS X is to distinguish Mac hardware from PCs. Selling it at retail only helps Apple pull in some extra revenue from users who are not ready to buy new hardware.

    The alternatives to buying a Mac OS X upgrade at retail?
    - Not upgrading at all
    - Buying a new Mac
    - pirating a copy.

    It now makes no sense for Apple to give away its development work - Mac users who aren’t going to upgrade unless the software is nearly free are not worth Apple’s attention. They are likely to just steal it anyway.

    We see it with Microsoft, and it happens with Apple. But Apple doesn't go all WGA on us. Apple doesn't really police Mac OS X licensing with DRM, activation procedures, or spyware because it only sells to premium customers rather than trying to tax the entire PC market.

    The majority of Microsoft’s customers are thieves who would only pay for Windows if they had no choice. A fair percentage of Apple's customers probably use an unlicensed version of the OS too.


    The key benefit Apple has marketed in Snow Leopard so far is Exchange Server support. But there will be more benefits to come, and any current Mac Pro or MBP running a version prior to Snow Leopard will definitely be able to get a decent performance boost from it, if Apple's enhancements bear fruit.

    Exchange - Microsoft charges Mac users $500 (a whopping $350 premium over the regular version) for the version of Office 2008 that includes support for Exchange. Why? Microsoft knows that the organizations who have chosen Exchange are not price sensitive. Those customers already pay absurd licensing costs for its server and client access licenses, so they are likely to also shell out crazy amounts of money for a slightly less awful version of the Entourage Mac email client.

    If Microsoft can get away with charging businesses and education users $500 for Exchange support in Office 2008, Apple will have no problem selling those same customers an overhauled operating system that adds Exchange support for Mail, iCal and Address Book for just $129.

    What about home users who have no need for Exchange? Outside of those that want to buy every new release, that segment of the market is unlikely to buy Snow Leopard. We know this because they largely didn’t pay for Leopard.

    Who Bought Leopard?
    Only a minority of Mac users will actually upgrade at retail. Then a number will upgrade via a non-licensed copy, and a large number will upgrade via hardware purchases.

    (Consider the Leopard launch. Apple’s $170 million in Leopard revenues reported in its debut quarter is only enough to buy 1.3 million copies at retail price. A third of retail packages were family pack versions, meaning Apple actually sold fewer boxes than that at full price. Of course, lots of those retail boxes were sold to retailers at lower wholesale prices and then marked up by the retailer.)

    Apple reported selling 2 million copies of Leopard in the first weekend. It did not continue to report how many additional copies it sold after that initial figure because Apple didn’t want to highlight the fact that most of the people who bought Mac OS X in the quarter did so over the first weekend. That weekend figure also probably included shipments to stores, further padding the number with marketing muscle.

    More recently, the company indicated that of the 27.5 million installed base of Mac OS X users, 37% are running Leopard. That would be 10.1 million Macs running Leopard. Apple has sold roughly 4.6 million new Macs in the last three quarters with Leopard pre-installed. That means “only” 5.5 million Macs have been upgraded to Leopard.

    But Apple didn’t earn something like $709 million by selling 5.5 million boxes for $129 or more. It only reported $210 million in total revenues in Leopard sales over first six months, and has sold less than $40 million worth of Leopard since then. That’s less than $250 million in total retail software sales. Clearly, a lot of retail boxes are getting applied on multiple Macs using the family pack or are simply being installed on multiple Macs contrary to the license agreement. Big surprise: lots of people are stealing Leopard.

    So of the 27.5 million Macs that perhaps could be using Leopard, “only” 37% have been upgraded, and about half of those got Leopard by buying a new Mac. That’s great compared to the percentages of retail software upgrades for Windows, but indicates that setting a lowball price for Snow Leopard wouldn’t have a major impact on new sales; it would only leave money on the table that Apple could otherwise earn from a reasonable charge for its software work.

    User comments:
    "The family pack is one of the best consumer loyalty deals going."

    "It’s true that as a pure feature [Grand Central], it’s not going to queue-up customers at the stores on release day. However, once the pro apps and even iLife are upgraded to use GC, we’ll see some seriously increased demand."

    "In the most recent 4 quarters Apple has sold nearly $2 billion in software. As a percentage of overall revenue it’s only about 6%. I don’t know if I’d say Apple doesn’t make much on software. $2 billion makes them one of the worlds largest software makers."

    "I think that there’s an interesting timing issue here. Right about the time that Mac requests will start to hit corporate IT departments like a tidal wave, IMHO, there will be this new version of the system that is “Exchange Server Compatible.” [echo, echo, echo] Corporate IT managers seem to only recognize the word Exchange on packaging so they will wipe their brow, with their Vista logo towel, and purchase a bunch of licenses for those pesky Mac people. We might see a huge bump in sales if only because it says EXCHANGE on the box."

    "What’s important to remember is this eternal truth of operating systems and hardware: it doesn’t matter how ideal you think either one of them is, or both together: without end-user applications that people want to use and actually do use for their desires and needs, the hardware and the software are only useful for consuming resources in terms of time, money and energy, and contribute essentially nothing to humanity. If the OS is too hard to work with, it won’t gain enough traction: if the hardware is too hard to work with, it won’t gain traction: if the hardware is too expensive to work with for those needing it, it won’t gain traction: if the software is too expensive for people that want/need it, it won’t gain traction. Apple has past examples that demonstrate all of those to some degree or another, and those examples occupy landfills. But, that’s just Apple, and they’re far from the only company with such things in their histories. So, too, was the other company that Jobs brought about in the same field: sure, was great hardware and software, but… well, fortunately, it’s been brought back into the fold to where the unwashed masses can afford to take advantage of it, after a long delay and a lot of work to make it more management-friendly for mere mortals."
 

fluidedge

macrumors 65816
Nov 1, 2007
1,365
16
WOW great thread - is this your masters thesis or something! :p

Will watch this thread with interest. I'm really excited about Snow Leopard - I think taking a step back from adding things like Spaces and Stacks (which are nice, but you don't REALLY need them) and concentrating on making the most stable OS Apple have ever made is a great idea. I will happily pay $129 or more (up to $159) if it means an increase in performance, stability, productivity etc.

great thread
 

t0mat0

macrumors 603
Original poster
Aug 29, 2006
5,473
284
Home
@Big tree - good point! Will edit in the numbers.

I tend to write long - to be honest, it's an area I'm just looking at, but it's so darn hard to find decent information beyond puff pieces that I thought I'd collate what I found, simmer it down a bit, and at some point make it readable!

I'll sort out the above post first, as Intel's Larrabee is interesting but technical, and CUDA, NVIDIA and OpenCL need a decent run-up.

But for now -

Snow Leopard & gaming

id Software CEO Todd Hollenshead's recent interview with Kikizo kinda sounds like there may be an actual shift going on here - get iPhone games sorted, and then look at the specs of a potential Snow Leopard box: it's running Intel, it might have a decent GPU, and Grand Central and other parts of the OS can put that hardware to work.

Apple has a historic problem with gaming and games development - a lack of commitment from Apple, and a lack of following through on promises. Todd Hollenshead, CEO of id Software, and Rage's lead designer, Matt Hooper, are in the interview here

In the interview, Todd agreed that id is in competition with Epic, and from a licensing standpoint he sees them as "quite frankly the industry leader". As the Doom 3 engine [id Tech 4] was underused/under-licensed, you'd imagine id wants to push it.
Gaming is a "technology-driven industry".

id doesn't have a PC art team, a 360 art team and a PS3 art team (and a Mac team) - it's the same assets and media working across all platforms. The benefit? Todd uses Quake Wars as an example: Splash Damage in the UK working on the PC version, Nerve working on the 360 version in Dallas, Activision working on the PS3 version in California. If they can do for the game as a whole what they do for the artwork, they have a winner. (And the Mac benefit: a port of a PC-level game.) Luckily id has John Carmack on the technology side.

Kikizo:
When we last spoke to Gabe Newell, one of the interesting things we discussed was the Mac. His view is that the people in charge of games at Apple change jobs every couple years and that there's no consistency or they don't take it seriously enough. But here we see you have this running perfectly on Mac. Would you agree with his comments on Apple gaming?

Todd: It's a true comment. I think historically, Gabe is absolutely right. The Apple guys will probably frown to hear me say that, but I mean there are facts and there are facts [laughs], and the fact is, that over the years Apple has shown an interest in gaming and then not followed through on it. Certainly our hope is that they are going to follow through. I do think they have made a significant investment... Jobs had a limited amount of time [at his WWDC 07 keynote] and John Carmack isn't the kind of guy who's going to get up on stage just to try and please Steve Jobs. John has his own ideas and he's his own guy, and even the persona of Steve Jobs isn't going to work on John very well, if at all! But if Steve had games on his show; not only did he give time to id, he gave time to EA, and I do think that it demonstrates at least a commitment at a high level to sharing the platform's face, if you will, with games.

But I mean, it is about the follow up. Now Apple was great to work with us; we were in some dialogue and they asked what we thought of having it on Mac, they sent some engineers down and they made a commitment about drivers and how they were going to support this stuff in the future, and I certainly hope they follow through on it, because with the hardware now, you're not having to deal with this weird Power PC architecture; they have Intel chips and all that stuff, and it does make it a whole lot easier for us to work with it. I don't think that they're hamstrung at a performance level - they don't have to create these weird Apple-only benchmarks.

Kikizo: What will the trend in gaming be for the next ten years?

Todd: I still believe that the industry over the next ten years is going to be driven primarily by technology, and I think that it has been since its existence over 25 years or however long you want to say it's been around. The chief innovations have been enabled by the rapid pace of technological progress on the hardware, and then by what engineers like John and others have been able to do on the software side. And I think that is the enabling factor that allows us to do all these things like a higher art form so you're not just moving white blocks around a screen or chasing dots through a maze. All that stuff is fun, but when you talk about emotional aspects of games, or better storytelling, or more interactivity in the environments, just more visual richness, all these things are constraints put on the industry that we work within. So I do see the future being driven by what the pace of technological change is, and when John talks about that stuff - and he's been right for fifteen years so I'm not going to swim against him on stuff like this - when you see what he's done here at the texture level, and being able to make the perfect level - glorious, unique, huge, vast and at the same time incredibly detailed - and then when he talks about the next horizon is geometry, I think you start to talk about things that you've really only been able to do with massive offline render farms, being able to be done and realised in real time. What sort of characters, worlds and interactivity you can develop, is being driven primarily by what technology enables. So it's a technical question, but I think ultimately the answer is that the industry will be driven by that. It will allow the artistic side of the industry to shine through.

Commenting on working together on a recent game (Rage) Hollenshead notes, "Apple was great to work with; we were in some dialogue and they asked what we thought of having it on Mac, they sent some engineers down and they made a commitment about drivers and how they were going to support this stuff in the future."

"With the hardware now, you're not having to deal with this weird Power PC architecture; they have Intel chips and all that stuff, and it does make it a whole lot easier for us to work with. I don't think that they're hamstrung at a performance level."

Is the iPhone turning the corner for gaming on the Mac? Rage runs on id's "id Tech 5" engine which will power Doom 4, and the Mac will have a version.
 

Erasmus

macrumors 68030
Jun 22, 2006
2,756
298
Australia
I hereby award you a PhD in Software Analysis (Apple) from the University of Corrin.
Great thread!
 

richard.mac

macrumors 603
Feb 2, 2007
6,292
4
51.50024, -0.12662
OMG t0mat0! thankyou so much for this! you definitely have a great interest in this as its seen in your writing. well done.

its so long im going to have to read it tomorrow. :sleep:

⬅ check out my avatar hehe :D
 

ihabime

macrumors 6502
Jan 12, 2005
480
0
You really should wrap all the text you borrow in quote tags instead of copying and pasting whole articles from RoughlyDrafted, Wikipedia, etc. Selectively putting some text within quote tags implies that the rest is your work, when clearly it is not.

Posting a link roundup is fine, even selected quotes, but wholesale ripping of entire articles does a disservice to the authors who actually did the research.
 

smogsy

macrumors 6502a
Jan 8, 2008
592
1
Thread of the Month!!

Just Keep Using Them Dated Titles To Split Data So It's Easier to Read :D
 

t0mat0

macrumors 603
Original poster
Aug 29, 2006
5,473
284
Home
Hello all - thanks for the positive feedback :)

Very briefly to respond:
- It's a mashup style. Anything taken verbatim is in quotes, hopefully; I try to copiously reference, copiously source, and copiously go off topic. Anyone who followed the iPhone thread will attest to this ;)
- I take your point ihabime - but as I'm mixing so many sources, it's not practical to use quote tags all the time - it'd look worse than it does already. I'll try adding more inline pictures to break things up, but until MacRumors runs on the Snow Leopard WYSIWYG Wiki system, this is it :) (Anyone know of a better free image host than ImageShack? I'm looking for something that'll keep the pictures up for a while.)
- Is it doing the sources a disservice? Hopefully it isn't taken that way. I understand the level of research that goes on, and hopefully by linking to the works, you can get that too.
- Posts get edited (like this one - I'll think of something to put here. I try not to bump, so posts are more likely long than copious).

A visual overview of Mac - If you know your shapes, you won't even need to enlarge. The past at the top, the present at the bottom.
From the left: PowerMacs/Mac Pros, PowerBooks/MacBook Pros, iMacs, bits and bobs, iPods & iPhones. Mice at the bottom, because they're a great example of design change. It's not the best, but it's a top-to-bottom version of this, with both coming from the fine work at tofslie.com, in particular the groundwork image here: http://tofslie.com/work/apple_evolution.jpg
(No, I didn't ape it from Fake Steve Jobs - I've referenced it previously. If you're pattern watching: Jobs was absent from Apple from 1985 to 1997; my personal favourite part is the mice; and the ACDs are crying out for an update.)

For games, on the visual side, you can manage rendering across two or three cards: e.g. supertiling, where the image to be shown is processed in tiles, as well as split-frame rendering, alternate-frame rendering, etc. See the sketch below.
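Purely as an illustrative sketch (not any vendor's actual API - the function and constants below are made up for the example), dealing tiles of a frame out round-robin to two or three GPUs might look like this in C:

#include <stdio.h>

#define TILES_X  4
#define TILES_Y  4
#define NUM_GPUS 3   /* assumed: a 2- or 3-card setup */

/* Stand-in for handing one tile of the frame to a particular GPU. */
static void render_tile_on_gpu(int gpu, int tx, int ty) {
    printf("GPU %d renders tile (%d,%d)\n", gpu, tx, ty);
}

int main(void) {
    /* Supertiling, roughly: chop the frame into a grid of tiles and
       deal them out round-robin, so each card gets a share of the work.
       Alternate-frame rendering would instead hand out whole frames
       (frame_number % NUM_GPUS). */
    int tile = 0;
    for (int ty = 0; ty < TILES_Y; ty++) {
        for (int tx = 0; tx < TILES_X; tx++) {
            render_tile_on_gpu(tile % NUM_GPUS, tx, ty);
            tile++;
        }
    }
    return 0;
}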


So what is happening on the OS X 10.6 'Snow Leopard' Seeded to WWDC Developers thread?

Pretty much as above really. PPC support issues, cost, usefulness, new under the hood changes, features etc.

Really confirmed at WWDC, the initial round of information on the upcoming OS X 10.6 came out (http://www.apple.com/server/macosx/snowleopard/), and a seed was given to developers at WWDC on June 10th this year, with OrchardSpy giving a build number of 10A96.


With the work on taking advantage of multi-core, multi-processor machines, Apple will then be able, via the consumer upgrade to Snow Leopard, to take better advantage of Penryn and SSE4.1. Hopefully Apple's input via Grand Central, OpenCL etc. will reduce the lag between what the hardware can do and what software - games included - actually uses.

The shocker? If Apple does it well, then Apple could actually, once again, create some of the fastest Vista machines on the planet - a kick in the teeth to Sony, HP, et al. And then they can compare a new Mac running Snow Leopard to one running Leopard, and to a year-old (2008) Mac running Leopard; I'd imagine those figures will be laudable. Seeing as Apple's gone the way of H.264, this should be a mighty boost - it'll make conversion of any non-DRM'd HD, Blu-ray, DVD etc. media to H.264 much, much quicker, and will make the higher settings (multi-pass etc.) more attainable/default. Why do that? iPhone. Pure and simple. They'll be pushing the graphics soon enough to at least 720p, given that NVIDIA has Tegra coming, and the chip in the iPhone now has a successor that does HD - the PowerVR SGX, over the older MBX.

Imagine if upgrading to Snow Leopard was akin to throwing a switch, where your apps suddenly became able to use all your cores, without having to upgrade all your software or buy new versions (that might not be out yet - Adobe...).
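Purely hypothetical, since Apple hasn't published the Grand Central developer API - but a dispatch-style parallel loop in C might look something like the sketch below. The dispatch_apply-style call, the global queue, and the block syntax are my assumptions about the flavour, not confirmed Snow Leopard API:

#include <dispatch/dispatch.h>  /* assumed Grand Central header */
#include <stdio.h>

#define FRAMES 1024

static void encode_frame(size_t i) {
    /* stand-in for real per-frame work (e.g. H.264 encoding) */
    printf("encoded frame %zu\n", i);
}

int main(void) {
    /* Ask the OS for a shared concurrent queue; the system decides
       how many cores to spread the work over. */
    dispatch_queue_t q = dispatch_get_global_queue(0, 0);

    /* Farm the FRAMES iterations out across the available cores
       and wait for them all to finish. */
    dispatch_apply(FRAMES, q, ^(size_t i) {
        encode_frame(i);
    });

    return 0;
}

The point being: the app describes the work per item, and the OS decides how many cores to throw at it.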

What apps?
QuickTime, Photoshop, Final Cut, iLife, iWork, Shake, Logic Studio...
 

Attachments

  • Snow Leopard time line copy2.jpg (293.8 KB)

Jak3

macrumors regular
Jul 11, 2008
160
0
Man, Nehalem is quite an interesting architecture... but I won't worry about it too much and will update my gaming rig with it in 2010 when the really massive amounts of cores start coming out.

Imagine gaming on one of those with the descendant of the NVIDIA GeForce GTX 280 graphics card *drool*

Maybe by then we'll have Octo-SLI :D

If you find this stuff awesome, maybe you'll find this just as awesome:

http://www.guru3d.com/article/geforce-gtx-280-sli-triple-review-test/
 

t0mat0

macrumors 603
Original poster
Aug 29, 2006
5,473
284
Home
Man, Nehalem is quite an interesting architecture... but I won't worry about it too much and will update my gaming rig with it in 2010 when the really massive amounts of cores start coming out.
Imagine gaming on one of those with the descendant of the NVIDIA GeForce GTX 280 graphics card *drool*
Maybe by then we'll have Octo-SLI :D
If you find this stuff awesome, maybe you'll find this just as awesome: http://www.guru3d.com/article/geforce-gtx-280-sli-triple-review-test/

Nehalem versus the hardcore: a Core 2 Extreme QX9770 Yorkfield (quad-core Skulltrail rig) versus a Nehalem rig

Yorkfield is Intel's 2007 tick.
The Nehalem architecture is Intel's 2008 tock.
Westmere (32nm) will be the 2009 tick.

In Snow Leopard's timescale, the QX9770 is just an overpriced Yorkfield released a few months before Snow Leopard, and it'll be made old hat by a late-2008 processor. By 2010, Nehalem itself will be old school.

In the article you give, covering 2-way and then 3-way SLI, Guru3D used a Core 2 Extreme X6800 2.9GHz processor for the 2-way SLI tests, but there was definite CPU bottlenecking going on (having additional cards creates extra CPU demands above and beyond the game's own demands on the CPU). So for the 3-way SLI tests (we're talking $3-4k rigs here) they used a Penryn-based processor that could overclock: an engineering-sample quad-core Core 2 Extreme QX9770.

So: the C2X X6800 for 2-way SLI, and the quad-core QX9770 (45nm Yorkfield) for 3-way. And it rocked - it actually got Crysis running at ~60 frames on the crazy-good visual settings. But, and it's a big but, we're looking to Snow Leopard and beyond:

Indeed, we've already seen those figures: look at the stats for, say, an Intel Skulltrail platform here

Built around server-class hardware, it's high-end enthusiast kit of the "have a power generator in the garden" variety: a mainboard supporting dual sockets, so you can have two quad-core Intel Penryn Xeon processors, SLI graphics, and space for 4 PCI Express x16 slots. (E.g. Guru3D used an Intel Desktop Board D5400XS with the Intel 5400 chipset, two quad-core QX9775s and 2x2GB 800MHz FB-DIMM, with NVIDIA SLI support and also AMD/ATI CrossFire support.)

So that's basically a dual QX9770.
The Skulltrail mainboard is potentially an example of what could be done if you wanted to expand your mainboard to let rip once Snow Leopard raises those bottlenecks: make a board that can take more RAM (FB-DIMM), more sockets, more SLI slots, etc.

As the article says, the Skulltrail rig gives early-90s-supercomputer power for about $4,000. And this isn't even using the GPU for general computation! The stats from the article also show how applications that aren't designed to be multi-core aware bring the performance way down (e.g. FEAR is single-thread/single-core minded, Crysis is 2-core-max minded, COD4 as well). Futuremark 3DMark06 seems to show the potential, and also the kind of performance boost that might be seen when Snow Leopard's multi-core, multi-socket enhancements kick in. Click here for a graph. Encoding, decoding - this will kick ass; it'll turn HD media manipulation into a doddle. Kinda sucks up power though - getting on for 600W when really pushing it...

As the verdict said:

The one bothersome downside however remains, you can have a thousand CPU cores available, yet if your application / game is programmed to utilize only one or two CPU cores .. all other cores will not be used; and that's pretty much the reality for 90% of today's games & software applications..... [Review was in Feb 2008]There's no doubt about it, Skulltrail is the fastest platform on this globe (if you equip it with the right components). Running two high-end QX9775 processors on a 1600 MHz bus is an triumph by itself, extraordinary interesting. Looking purely at the hardware, a Skulltrail system is nothing short of astounding to have in your PC. But in all fairness, there's hardly a situation at this very moment where I can see an application using eight logical CPU cores (looking at it from a consumer point of view, not professional); and until that physically is happening you are dropping a too grand amount of money into a PC with a very small return of your investment.

Factor in the noise, the fact that most people will only really go 2-way, the power consumption and the expense, and the potential to get this kind of power from the next-generation chip at lower cost, power and noise just seems too easy a choice. Even at Guru3D: "So I asked myself this question, would I buy a QX9770 (~1400 USD / 1100 EUR) over the ? And the answer is no." (due to cost, power usage, lack of support, etc.).

Something like Skulltrail is an expensive premium that, fairly soon, will be less powerful than Nehalem rigs at a lower cost.


How Nehalem compares with Skulltrail
 

t0mat0

macrumors 603
Original poster
Aug 29, 2006
5,473
284
Home
Pulled to a new post for clarity.

UPCRC-diagram.gif


Supercomputers, and Apple's forthcoming supercomputers
Guess I'll have to use "Powerful to the core" another time...

SI units, just to be clear (e.g. 1 teraflop = 10^12 FLOPS, floating-point operations per second):
1 million = 1 mega = a thousand thousands = 1,000,000 (1,000 to the power 2)
1 billion = 1 giga = a thousand million = 1,000,000,000 (1,000 to the power 3)
1 trillion = 1 tera = a million million = 1,000,000,000,000 (1,000 to the power 4)
1 quadrillion = 1 peta = a thousand trillion = 1,000,000,000,000,000 (1,000 to the power 5)

#1 of the Top 500 Supercomputers
Now: Blue Gene/L – capable of just under 500 trillion operations per second – 0.478 petaflops (478 teraflops).
Very soon: IBM's new Roadrunner supercomputer which runs at >1.0 petaflops (>1,000 teraflops)

Roadrunner's power is equivalent to the PS3 Folding@home project world record (1 petaflop)

With supercomputers being assembled from standard racks of smaller servers with high-speed interconnects, why couldn't smaller supercomputers use Snow Leopard and off-the-shelf equipment (SAN, Xserve, Snow Leopard Server, etc.)? If you look at the attached extrapolation graph of predicted supercomputer power, it's a straight line on a log graph, and it's been keeping pretty much to that line for a while.

So what might be around the corner?

Intel

Intel's Tera-scale Computing 80-core silicon can do 1 teraflop - and at 1 teraflop it runs at 3.13GHz, using only 24 watts!
So 1,000 Tera-scale chips = 1,000 teraflops (i.e. 1 petaflop) = 1x Roadrunner supercomputer

So, how many Nehalem Snow Leopard Macs does it take to make a petaflop? Like eckers do I have the stats to hand. Feel free to post the flop potential of a dual-socket Nehalem Snow Leopard rig - there's a rough back-of-envelope stab just below!
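To get the ball rolling, here's a back-of-envelope C snippet using my own guessed figures (2 sockets x 4 cores, ~3GHz, 4 double-precision FLOPs per core per clock via SSE - all assumptions, not Intel numbers):

#include <stdio.h>

int main(void) {
    /* All of these are guesses for a dual-socket Nehalem box. */
    double sockets        = 2.0;
    double cores_per_sock = 4.0;
    double ghz            = 3.0;   /* clock speed, in GHz           */
    double flops_per_clk  = 4.0;   /* double-precision FLOPs, SSE   */

    double gflops_per_mac = sockets * cores_per_sock * ghz * flops_per_clk;
    double petaflop       = 1.0e6;                 /* 1 petaflop in GFLOPS */
    double macs_needed    = petaflop / gflops_per_mac;

    printf("Peak per machine : %.0f GFLOPS\n", gflops_per_mac);  /* ~96    */
    printf("Machines/petaflop: %.0f\n", macs_needed);            /* ~10417 */
    return 0;
}

That works out at roughly 96 GFLOPS peak per machine, i.e. somewhere around ten thousand dual-socket boxes per petaflop - before any GPU help.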

Where does the 80 core fit into Intel?

Intel's Tera-scale Computing Research Program (TsCRP) is a worldwide effort. They're looking to scale multi-core architecture to tens to hundreds of cores, embracing a shift to parallel computing.
It includes the 80-core processor project, and the Universal Parallel Computing Research Centers (UPCRCs)

The hardware research vision:
  • Scalable multi-core architectures that integrate streamlined processor cores and accelerators with fast, energy-efficient, modular core-to-core infrastructure e.g. 80-core prototype processor, Tera-scale Emulator, Dynamic Thermal Management, Task Queues.
  • Memory sharing & stacking - e.g. 3D Stacking, Cache Quality of Service.
  • High Bandwidth I/O & communications balancing the compute demands with I/O & network demands within the platform power and cost budgets e.g. High-speed Copper I/O, Silicon Photonics, I/O Accelerators.

The software research vision:
  • Model-based applications e.g. Visual Computing that'll use tera-scale capabilities, i.e. Ray Tracing, Physical Modeling, Media Mining, Enhancement
  • Parallel programming tools e.g. Transactional Memory, Accelerator Exoskeleton, Ct Data-Parallel Programming
  • Thread-aware execution environments that provide real-time performance & power management across cores and scale with increasing thread and core counts. e.g. Many-core Run Time, Tera-scale Virtual Machine

Intel has created 2 Universal Parallel Computing Research Centers (UPCRCs), through working with Microsoft, the University of California (UC) Berkeley & the University of Illinois (UIUC). They're there to look into new ways to program software for multi-core processors. Parallel computing brings together advanced software and processors that have multiple cores or engines, which when combined can handle multiple instructions and tasks simultaneously - but you knew that already :)

They are going to
- tackle the challenges of parallel computing (e.g. programming for processors with multiple cores), thus being able to carry out multiple sets of program instructions at a time
- accelerate developments in mainstream parallel computing, for consumers & businesses in desktop & mobile computing
- advance parallel programming applications, architecture and OS software.

Microsoft & Intel stumped up $20 million over 5 years, UIUC put up $8 million, & UC Berkeley is getting together $7 million. (As many as 20 universities - including MIT and Stanford - competed for the funding.)

Lots of news about the place: EETimes, Microsoft, NYTimes, Wired and so on.

The industry is in a little bit of a panic about how to program multi-core processors, especially heterogeneous ones... To make effective use of multi-core hardware today you need a PhD in computer science. That can't continue if we want to enable heterogeneous CPUs
Chuck Moore, a senior fellow at AMD

(Sidenote: "The academic community has never really recovered from Darpa’s withdrawal" Daniel A. Reed, director of scalable and multicore computing at Microsoft, who will help oversee the new research labs - NY Times
Both Intel and Microsoft executives said the research funds were a partial step toward filling a void left by the Pentagon’s Defense Advanced Research Projects Agency, or Darpa. The agency has increasingly focused during the Bush administration on military and other classified projects, and pure research funds for computing at universities have declined. )

Image of Tera-scale chips from here; others from Chipzilla and the supercomputer place...

Berkeley's project is looking at the >16-core market. So that's stuff for 2010 - not that far off, considering! Unis that didn't get the funding are still looking at the problem. So how the **** is Apple ahead on this? They didn't boast per se; they just hinted that they had everything under control, not to worry, and enjoy Nehalem. Win-Tel is pouring $2 million a year into this. You've got huge academic unis like MIT and Stanford on the case - how could Apple be ahead, and from the noises Apple made, ahead by a fair bit? (Patents rear their head again - if Apple is ahead like it is with the iPhone in some areas, boy is that going to be potential fun for them.)

Power to the people - distributed processing

With an aim to build China's most powerful supercomputer, the ICT, SSC & DIIC Ltd have agreed a joint project to build the fastest supercomputer system in China, to switch on in December 2008. It's the trade-off: if you wait just a few more months, your money will get you a whole lot more boom for your buck.

1,920 nodes of Tyan S4982 quad-socket server motherboards, sprinkled liberally with (presumably) ~7,680 quad-core AMD Opteron processors. They want a peak of ~230 TFlop/s, which would get you into the top 10 worldwide currently.

So why doesn't Apple have a go at helping some computationally challenged, cash-laden organisation, with some Intel chips, whatever dual/quad-socket server motherboards are going, and some Mac OS X Snow Leopard beta juice?

Another possibility, though, is somehow testing the feasibility of using consumers' standard Snow Leopard Mac Pros, a la Folding@home or SETI... With 3/4 of the world's fastest supercomputers using Intel (the rest being mainly IBM POWER processors and AMD Opterons), it isn't too much of an extrapolation, it might seem. I wonder what a tricked-out, maxed-out, blinged-up Mac system with Intel Nehalem or Westmere chips could do on Mac OS X SL...


//Bits
In terms of Nehalem chips, apparently more information will come at Intel's Developer Forum in August (including such terms as high-k metal gates, strained silicon...)
Can compilers mitigate the requirement for in-order operation?
Hybrid SLI and CrossFire support from NVIDIA and AMD?
If the GPU can execute x86 instructions they could find a place in clusters/supercomputers.
The 45nm Penryn chip being used in Nehalem is Gilo - not just a 32nm shrink of Nehalem but a completely integrated gfx core?
G45?
Memory on the chip (One of the strands of research at the Intel parks)


Edit: 22nd July 2008:

Virginia Tech has assembled a Top 100 Mac cluster - 29 Teraflops based on 324 Mac Pros, at the Center for High End Computing Systems (CHECS) within the Virginia Polytechnic Institute and State University.
They're using Mellanox 40Gb/s InfiniBand kit to interconnect the machines. So it's possible to do it with Mac Pros currently.

Roll on Nehalem and Snow Leopard! The first comment on the 9to5mac page was that you could do this cheaper with GPUs (the Fastra project, NVIDIA Tesla, etc.). However, this supercomputer could potentially ramp up its power very easily through sequential upgrades and an OS change. I'd imagine there will be a similar project with Mac Pros that have mainboards capable of more upgrades.



Example of current Apple hardware usage
http://www.apple.com/uk/pro/science/universityofnottingham/

Research into ophthalmology and other aspects of the human body, at the University Hospital, Nottingham -
It has a 5-node Apple Workgroup Cluster for Bioinformatics. (4 compute-only Xserve G5 cluster nodes with 2GB of RAM each, plus an Xserve head node, which will soon be upgraded to hold 4GB of RAM. In addition to this, two Xserve RAIDs give the system a total of 4TB of storage with the hardware linked together by a gigabit Ethernet switch.)

"For the raw power we get from the solution, it’s very good value. Especially with the addition of the Xserve RAID arrays for mass storage, which have been of great benefit to us — ultra-reliable and very easy to set up and to manage user access. I only have to restart the machine when there are system updates, so I consider the cluster to be very low maintenance overall”.

Issues? "Tighe has only once needed help from the AppleCare Premium Service and Support team, and they swiftly diagnosed a failed component and immediately arranged for a replacement to be shipped out to the university."

SL might very well help bring a new level of power to bioinformatics and other labs with computational workloads. My pet example would be X-ray crystallography - what was once left running on an old server can now pretty much be done quicker by a souped-up Mac Pro. With the right tech, any given lab could yomp ahead if it got into Snow Leopard early - once you've got crystals and data, a fair bit of time is spent crunching the numbers through an iterative modelling process. That could be cut down, and even outsourced, soon!
 

t0mat0

macrumors 603
Original poster
Aug 29, 2006
5,473
284
Home
OpenCL and Apple

800px-Cray_2_Arts_et_Metiers_dsc03940.jpg

Oh yay, like my desk isn't cluttered enough as it is ;)

members.png


OpenCL
"OpenCL is going to give each of us the power of a true supercomputer on our desktops."

The Khronos Group, founded in 2000, is dedicated to creating open-standard APIs that enable the authoring and playback of rich media on a wide variety of platforms and devices.

e.g. open-standard APIs in this space:
OpenCL ==> Open Computing Language - a cross-platform computation API
OpenGL ==> Open Graphics Library - a cross-platform computer graphics API
OpenAL ==> Open Audio Library - a cross-platform audio API

OpenCL

OpenCL is a language for GPGPU based on C99, which Apple created with cooperation from others. C99 is a revision of the C standard (ISO/IEC 9899:1999, published in 1999, hence the name) which has had several amendments itself; another revision, "C1x", was started in 2007. Computing standards tend to evolve like this. OpenCL's purpose: to use the power of the GPU for work beyond graphics, as we know.
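The spec hasn't been published yet, so take this as a guess at the flavour rather than confirmed syntax (the __kernel/__global qualifiers and get_global_id are assumptions based on the C99 heritage and existing GPGPU languages like CUDA), but a data-parallel OpenCL kernel ought to look roughly like this:

/* Hypothetical OpenCL-style kernel: one work-item per array element,
   so the GPU (or a multi-core CPU) runs thousands of these in parallel. */
__kernel void vec_add(__global const float *a,
                      __global const float *b,
                      __global float *out)
{
    int i = get_global_id(0);   /* which element is this work-item? */
    out[i] = a[i] + b[i];
}

The idea: you write the per-element work once, and the runtime fans it out across however many GPU (or CPU) cores the machine has.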

The Khronos Group is a consortium focused on establishing open-standard application programming interfaces (APIs). OpenCL was proposed to the Khronos Group to create "royalty-free, open standards for programming heterogeneous data and task parallel computing across GPUs and CPUs".

So the "Compute Working Group" was formed, with Apple, AMD (ATi) & NVIDIA as members. Others in the initial working group now include 3Dlabs, AMD, ARM, Codeplay, Ericsson, Freescale, Graphic Remedy, IBM, Imagination Technologies, Intel, Nokia, NVIDIA, Motorola, QNX, Qualcomm, Samsung, Seaweed, TI, and Umeå University as per macrumors article.

We know about OpenCL from WWDC on the 9th and the Snow Leopard information. Per macrumors, the Compute Working Group is specifically to evaluate & establish Apple's proposed OpenCL spec.

OpenCL, like CUDA, is looking "to enable any application to tap into the vast gigaflops of GPU and CPU resources through an approachable C-based language." Think OpenCL on handheld devices, not just desktops. It's going to be a big part of the architecture for Snow Leopard. A manoeuvre by Apple to get an inside track on an open standard, kinda like with H.264? It means developers aren't tied so much to a specific GPU manufacturer. From the press release:

A widely available open-standard compute programming specification with high-performance, general computation support and robust numerics will complement existing solutions and further liberate GPU-based compute power from the realm of graphics-only applications and provide a multi-vendor, portable interface for coordinating all the many-core GPUs and multi-core CPUs within a system. Such capability will have broad applicability - including a central role in the Khronos API ecosystem by providing a powerful compute front-end to OpenGL and OpenGL ES, and a platform for accelerating tasks such as physics and image processing / recognition.

Who's not in the Compute Working Group? Microsoft. Microsoft not in an open-standards project? Get away...

OpenCL will no doubt show up at SIGGRAPH in LA, 11th-15th August - three weeks' time.

Larrabee has moved to the next page :)
 

patseguin

macrumors 68000
Aug 28, 2003
1,685
503
That's got to be one of the coolest posts I've ever seen.

I have sort of a broad rhetorical question about SL. Is the "average" user really going to be compelled to upgrade to SL? Apple have admittedly put a hold on new features, so what is going to drive someone to go out and spend $129 on the upgrade? I certainly will because of the under-the-hood improvements. If Apple really expect to sell this OS release, I think they need to offer at least a few new features. Dare I mention a new UI? The "illuminous" UI that was rumored before Leopard, perhaps?
 

t0mat0

macrumors 603
Original poster
Aug 29, 2006
5,473
284
Home
10.6 has a very subtle UI change, more in line with the iPhone UI. I don't really see anything too big past that.

I'd imagine that they'd be integrating Core Animation to a greater degree than Leopard, but most of that is pretty subtle. My bet would be for Snow Leopard, during testing, to be just like Leopard: features appearing as the betas go on, a bit like the iPhone SDK. Anything going into the iPhone SDK might well be fair game for Snow Leopard. With the cost of a Hammerhead II chip, and the ease of positioning an antenna on a laptop compared to an iPhone, why not allow integration of GPS, for example? There would be some level of symbiosis and synergy.

I'd imagine those who were using the developer versions of Leopard before it went Gold Master would be able to say whether there were any hidden features that made it into the consumer version but weren't seen until pretty near the release date. Maybe some seasoned developers can lend us their knowledge of what the state of play was for the one-year run-up to Leopard, and how much we knew as the release date got closer...
Maybe a trip through the Wayback Machine to see the front-page posts might be useful.

In terms of UI, would they change it too much? Leopard brought in Spaces. If I was to bet, it would be multi-touch incorporation to a greater degree. I'd imagine we'll see where that's heading by the end of the year, when the MacBooks and MacBook Pros have had their refreshes (Montevina, or even later next year with Nehalem). I'd imagine that Apple would want multi-touch to be incorporated in the hardware sold a year before Snow Leopard is released, otherwise any additional features in this area would be "new hardware customers only", which would suck. I'd imagine that Snow Leopard will be bringing in features that Windows 7 has billed thus far. Could be a spurious guess, but I'd imagine a lot of thunder is going to be stolen come WWDC 2009.
 

PowerFullMac

macrumors 601
Oct 16, 2006
4,000
1
How do you always know so much about everything, t0mat0? :eek:

I am not sure if this update is really worth it TBH, but I will get it for free... Before it comes out... Coz my uncle works for a company that works closely with Apple and gets all the betas and final versions... And gets paid for testing them! I got Leopard a day before it came out thanks to him! :p
 

t0mat0

macrumors 603
Original poster
Aug 29, 2006
5,473
284
Home
Ack, it's just building castles in the sky ;)

Is Snow Leopard worth buying? I'd say the question is, is it worth installing! What performance upgrade could you normally get for $129? (Or for free?) A stick of RAM? An updated CPU or two? I'd imagine that level of performance boost may come from Snow Leopard, the caveat being: the more cores, the bigger the boost on average. You can pirate the software easily, but not the hardware. Case in point: 22nd August 2008 sees ~20 additional countries selling iPhones. I'd imagine a lot of hardware sales (laptops, peripherals - always a money printer) come off the back of each one.

Q3 2008 Financial Results
As Kim noted, Apple's 3rd quarter (Q3) 2008 Financial Results were published July 21st 2008.

- 11 million iPods sold in Q3 2008, up 12% on Q3 2007.
- Largest number of Macs ever shipped in a quarter (Q3 2008). Roughly 50% of the customers were new to the Mac.
- Apple strayed from the "we don't speculate about future products" line, instead talking repeatedly about
  • "Future product transition" which may affect Q4 revenue numbers. So we know it's between now, and the end of Q4.
  • ~"delivering state of the art new products at prices their competitors can't match."
  • "we're busy finishing several more wonderful new products to launch in the coming months."

Arn: "Typically, "product transitions" represent existing product line revisions." The MacBook and MacBook Pro as shown in the Buyer's Guide
MacBook Days Since Update 150 (Avg = 192) as of 25th July 2008
MacBook Pro Days Since Update 150 (Avg = 194) as of 25th July 2008
(Feb 2008 saw Penryn based MB and MBPs released)

It's possible that both the 3rd-gen mid-2007 MBP (which had an NVIDIA GeForce 8600M GT) & the 4th-gen early-2008 MBPs (which have an NVIDIA GeForce 8600M GT) are affected by NVIDIA's problems - reports suggest all G84 & G86 parts are affected. No official word as to which MacBook Pros are affected as yet.

NVIDIA is going to come out with a driver to kick in system fans earlier to alleviate "thermal stress", but failure rates will be higher than normal... I'd imagine Apple will have a no-fuss repair & return or replacement service, with NVIDIA paying the tab (about a sixth of its revenue for Q2 2008).

As a larger point, Apple have done a lot of work already as to
- what will work with Snow Leopard's OpenCL, Grand Central.
- what they'll provide for the iMac, MacBook, MacBook Pro, Mac Pro...

Wonder if NVIDIA has the same fault on other graphics card lines?


Platforms

July 2008 - Intel announced the Centrino 2 (Montevina) platform

The Montevina platform consists of:
- Faster Penryn C2D processors (2.26/2.4GHz 8000 series, 2.53/2.8GHz/3.06GHz) (C2Q coming later in 2008)
- Faster bus speed (1066MHz FSB, up from 800MHz)
- Faster integrated graphics (GMA X4500)
- Option of WiMAX support (Intel got delayed by this)

Apple may follow tradition and not adopt the whole of the Montevina platform, e.g. not support WiMAX.

TG Daily's thoughts? Intel changes the name, but for no real reason: it's a refreshed 45nm Penryn CPU, a new integrated graphics chipset & a wireless chipset.
So the platform is new, but the Penryn architecture isn't. The new Nehalem architecture is scheduled for a Q4 release; it'd be 2009 before notebook Nehalems come in, I'd imagine.

Chipset: 45-series, apparently. More on chipsets when I've found the HP.

Graphics:
Introduction of "Intel switchable graphics" - the ability to switch from between discrete & integrated graphics on the fly, working with both AMD (ATi) & NVIDIA graphics. Only useful to Apple if you had both graphic types in a laptop (e.g. MacBook has been always been integrated, MacBook Pro/Mac Pro discrete)

New wireless 5000 series chipsets:
5300 version: form factor for regular notebook sizes
5100 version: smaller form factor for, you guessed it, more compact notebooks
Both support WiMAX. A resounding howler in Australia from some reports...

Worth holding out for a system that can take or drop in Nehalem if you can - it's not like there aren't people waiting on these boards...

Seems 2009 will be Intel's CPU year, but a contest for graphics, integrated & discrete, when put against AMD's Puma platform and whatever NVIDIA can churn out that doesn't blow if it gets hot...
(Interesting to note that the FCC held up Montevina over WiMAX, alongside integrated graphics chip failures.) It isn't just NVIDIA.
 