Tiger and Core Image
I'll throw my two cents in about how I feel about Tiger and Core Image as a Mac mini owner and a programmer. I just bought a Mac mini about six weeks ago. I will not get Tiger for free. I tried my serial number on the update program website with April 12th as my purchase date, but it didn't work. I'm not upset, though. I knew Tiger was around the corner when I purchased my mini, and I'm happy to have been using it for the past several weeks. In all, I will have had two months of usage while Panther is the latest and greatest. I will probably upgrade to Tiger in the next 2-3 months. I'm not in a hurry; I'll wait to see if there are any critical bugs and upgrade at 10.4.1 or 10.4.2, since ".0" releases can sometimes cause a little pain.
As for Core Image, my mini's GPU is not supported. I had assumed it wouldn't be when I bought my Mac mini. I knew this... well, we all knew this, because the published CI GPU list didn't include the Radeon 9200 when the Mac mini came out. If that pisses you off, you shouldn't have bought a mini.
I don't think CI will be that big of a deal for average users. This is mainly for graphic designers/programmers and gamers. There will be some new graphics capabilities, but I probably won't use them that much as I don't do much image/video editing (just iLife periodically) and I only game on my PC.
I'm a programmer, and I understand what an API is. Jim, your explanation that CI is an API set was helpful; I didn't know exactly what the technology was before. Apple has a way of describing things in "marketese" on their site. If that's the case, it would seem to be roughly equivalent to MS's DirectX stuff. There are GPUs that are optimized for DX8 or DX9, and those perform better when playing games written with DX8/9. In the PC world, if your machine is running like crap on new games, you buy a new GPU that is optimized for DX8/9. I don't use any pro image/video apps on the PC, so I'm not sure if they utilize DX, but they probably do.
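For the curious, here's roughly what calling an API set like that looks like. This is just a sketch in modern Swift (anachronistic, since back then you'd be writing Objective-C, but the shape of the API is the same), and the file path is made up for illustration: you look up a filter by name, hand it an input image and parameters, and ask for the output.

import Foundation
import CoreImage

// The image path is hypothetical; point it at any image on disk.
let url = URL(fileURLWithPath: "/tmp/photo.jpg")
guard let input = CIImage(contentsOf: url) else {
    fatalError("couldn't load image")
}

// Filters are looked up by name and configured with key-value coding.
let blur = CIFilter(name: "CIGaussianBlur")!
blur.setValue(input, forKey: kCIInputImageKey)
blur.setValue(8.0, forKey: kCIInputRadiusKey)

// outputImage is a recipe, not pixels: Core Image defers the real work
// (GPU if supported, CPU fallback otherwise) until something renders it.
let blurred = blur.outputImage!
print(blurred.extent)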
So here is my opinion: CI will probably only get used initially by pro image/video editing apps and graphics-intensive games, just like DirectX. If you want to use those apps or play those games, you need hardware that is capable. That would not be the mini, a budget machine; it would be a PM, or at least an iMac. If, as SilconAddict predicts, many developers start incorporating CI-intensive code into all their apps, they are being stupid. If you are making a simple word-processing program, or some run-of-the-mill application that you want to run on systems of varying specs, then you need to program to the lowest common denominator (there's a sketch of that pattern after the next paragraph). Otherwise you are minimizing the market for your application.
As a programmer, I don't work with graphics much, but when I code Windows apps, if I want my app to run on Win98 I cannot use certain Windows APIs that are only available in WinXP. Sometimes I can code around it, using other APIs to accomplish the same task (usually more work); sometimes the technologies you want are simply not there. Why do you think iTunes for Windows only runs on Windows 2000 and XP? Windows 9x doesn't have services support built in, and iTunes uses services for iPod detection and mounting on Windows. There may be other technologies they are using that are Windows 2000/XP-specific too; I'm not sure. Apple chose not to support the Windows 9x market, which is still fairly large. I have many friends who would like to run iTunes but cannot because their OS is out of date. If they really want to use iTunes, they will upgrade their OS or computer. This is the nature of technology, plain and simple.
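That "code to the lowest common denominator" pattern looks the same everywhere. Here's a hedged sketch in modern Swift; the function names and version numbers are made up, and only the gating pattern is the point: use the newer API when the OS has it, fall back to the older path everywhere else.

import Foundation

// "fancyRender" stands in for any API that exists only on newer OS
// releases; the function and the version numbers are hypothetical.
@available(macOS 13, *)
func fancyRender(_ text: String) { print("new API: \(text)") }

// Older path: does the same job with more work, runs everywhere.
func basicRender(_ text: String) { print("old path: \(text)") }

func render(_ text: String) {
    if #available(macOS 13, *) {
        fancyRender(text)   // use the convenient newer API
    } else {
        basicRender(text)   // code around it on older systems
    }
}

render("hello")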
I don't believe it is fair to blame Apple. With Core Image, they are providing a core technology for their operating system. This is for the benefit of application developers and, in turn, end users. It is up to the development community to use that technology responsibly within their applications. Yes, programmers could start littering their apps with CI calls that would run the mini's CPU into the ground, but that would be irresponsible, and people would stop buying those apps. That is why application developers will only incorporate CI if it is the right thing for their market (graphic design, games, etc.). New technologies have to be forward-looking; it would be stupid to design them for older (current low/mid-range) hardware. You design the technology for the latest hardware because, by the time the technology is adopted and really gets used by developers, the then-latest hardware is mainstream and most people get the benefits.
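Worth noting: on unsupported GPUs Core Image doesn't refuse to run, it falls back to the CPU, which is exactly why careless use would grind the mini down. A minimal sketch of forcing that software path yourself in modern Swift (the option is real; whether you'd ever want to set it is another question):

import CoreImage

// .useSoftwareRenderer forces the CPU path that unsupported GPUs
// get automatically; by default Core Image picks the GPU if it can.
let cpuContext = CIContext(options: [.useSoftwareRenderer: true])
print(cpuContext)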
So if anything, I would fault Apple for building so few computer lines with upgradable GPUs. I really wish Apple had a consumer-level machine that was truly upgradable. Only the PMs have the flexibility of upgrading after purchase. Yes, you can do some things to your iMacs, minis, and 'books after purchase, but not much at all for someone coming from the PC world. I understand that Apple is constantly balancing flexibility and upgradability with the style and design of their machines. I'm glad they finally gave in enough to offer a consumer/budget-level headless Mac, and that was enough of a concession to get me to buy a Mac. (I hate AIOs!) Honestly, I would rather it NOT be "mini" but instead "midi", with an AGP slot, some PCI or even PCI-E slots, more RAM slots, and an extra bay or two. Apple could do this with a stylish case. It doesn't have to be top of the line: put a G4 in it, put an older chipset with a slower bus in there, I don't care. Expandability and flexibility shouldn't come ONLY with bleeding-edge technology.
And I'm not some rich PM user throwing money down for the latest and greatest every few months. I'm not going to get the full benefit of CI, but that's okay. I'll get some benefit now, and when I inevitably get better hardware down the road, I'll get more use out of CI then. This is how technology evolves; it is supposed to be this way.