Well, I stayed away from the original Core Duo because it was obvious that it was a temporary chip. Before it was released, Intel had already set an end-of-life date for it. So, I pretty much knew not to bother with it.
It also didn't make much sense that Apple had been pushing 64-bit processors so hard, and then all of a sudden released all their systems with a 32-bit processor (naturally not mentioning it anywhere).
So, it was such an odd move, and there was no indication that it was a trend that would continue.
Basically, Apple was in a hurry to switch to Intel processors, and there were no 64-bit processors to use at the time. So, Apple jumped and made a move with a temporary solution just to get the ball rolling.
I couldn't see the logic in stepping backwards, so I decided to wait until the chips were actually 64-bit again. Of course, now my needs have changed, and I am waiting for Apple to release a system that meets my needs. So, I'm still holding off for a bit while I wait for them to hopefully introduce a system that fits me.
The biggest horizontal resolution listed for "4K" on Wikipedia is 4096, and the biggest vertical is 2664. So, assuming a 16:9 form factor, a display capable of handling all of these resolutions would have to be at least 4736x2664.
On a side note, what's with the designation "4k"? Sometimes these industry terms seem designed to deliberately confuse...
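For what it's worth, the arithmetic above can be checked with a quick script. This is just a sketch; `min_16x9_panel` and `ceil_div` are names made up for illustration:

```python
def ceil_div(a: int, b: int) -> int:
    """Integer division, rounding up."""
    return -(-a // b)

def min_16x9_panel(max_width: int, max_height: int) -> tuple[int, int]:
    """Smallest 16:9 resolution at least max_width wide and max_height tall."""
    # A 16:9 panel tall enough for max_height must be at least this wide;
    # the panel also has to cover the widest format directly.
    width = max(max_width, ceil_div(max_height * 16, 9))
    height = ceil_div(width * 9, 16)
    return width, height

print(min_16x9_panel(4096, 2664))  # (4736, 2664)
```

So a panel covering both the 4096-wide and 2664-tall "4K" formats does indeed come out to 4736x2664.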
I remember last summer when all the Core 2 Duo waiters were being pooh-poohed by the Core Duo early adopters 'cause 64-bit wouldn't matter for years ahead. Well, so much for that theory. See ya.
Sure is fascinating how "spin" can sell a feature, and deliberate oversight can steer attention away and neutralize the hype driven by the original headlines. 64-bit is back, babeeeee.........
The people who said that were correct for the overwhelming majority of users. Most people who need this software probably weren't buying Core Duo iMacs and Mac minis.
My MacBook Pro is a Core Duo, hence it is a 32-bit chip. I really hope Apple doesn't leave all of us 32-bitters in the dust. I think those specs mentioned above are for Final Cut Extreme.
I would just love to replace my 3-year-old 20" Studio Display with a current new 23" with a price drop, maybe around $699, and I would be all over it.
I bet FCS2/FCP 6 will finally have DVCPROHD support. Also, expect a Blu-ray burner to cost you an arm and a leg. I am just looking forward to the 2.66 quad becoming the standard base model.
DVCProHD has been supported since 5.1.2. We're up to 5.1.4 now.
Whose arm and leg? They are already approaching only $499. That isn't too much for this early in the adoption phase.
I never really cared for the marketing word "Extreme", especially since Taco Bell destroyed it a few years back.
Thanks for the clarification. I did not know that. The Photo Marketing Assn show in Las Vegas next week (March 8-11) appears to be just an Aperture showcase opportunity, and since it's so soon, it is unlikely to be the Leopard announcement time.

Not if you are using the Panasonic P2 cards. DVCPROHD files on P2 cards are all .mxf files, which are not natively supported by FCP 5.x.x yet.
I just hope that Apple is not making us wait until the Developers Conference or whatever month in order to sell the Mac Pro + Leopard. But Apple did announce that Leopard will be available in the spring, though.
I hope Leopard will be revealed during PMA and then the new Mac Pro will be available to order right after NAB.
I'm hoping that FCP 6 will have AVCHD format support (for those prosumer nerds with one of the Sony HD camcorders that you can't edit on the Mac). I also would think that having an HDMI input (not just an output) would be a boon for these same people. Then again, I wouldn't be completely surprised if Apple just ****s over people with Sony hardware. However, Panasonic has also announced hardware using AVCHD, and it seems like a good format for higher-end consumer use.

Seems more likely to be a prominent part of iMovie 7 and Final Cut Express 4's new features, although I agree it should be supported in FCP 6 as well. This format troubles me. I see the compression scheme as inferior to the HDV scheme. Plus, how are you going to archive all that non-tape footage? Fundamentally problematic, to my way of thinking.
I was thinking yesterday about those secret features in Leopard... and somehow the recent article (http://arstechnica.com/news.ars/post/20070227-8931.html) came to mind... what if they are doing some kind of framework (say, CoreSpeed or some marketing-wise term) to make GPU-accelerated functions easy for developers? I guess that maybe even the stock GeForce 7300 that came with my Mac Pro would be able to speed up certain types of operations if used correctly... imagine a framework where the OS decides whether an operation can be performed faster on the GPU or the CPU, and executes it there. Suddenly PS works so much faster in Mac OS X 10.5 than on the same computer running Windows... or Safari opens 3 times faster!!!

I was thinking along the same lines when I read this -- not so much in the GPU as with a Cell coprocessor. NeXT boxes had a DSP chip on the motherboard to accelerate certain operations, so the concept wouldn't be foreign to a lot of the developers.
They have a limited set of graphics cards... they do the drivers... doesn't sound so strange to me... anyway, I don't know how hard it would be to implement... but if they are doing acceleration cards using Cell, maybe they already have some kind of framework in place...
It could help sell some (or a lot of!) equipment to the HPC niche... they already have the Xgrid thing...
Anyway, can't wait for Leopard...
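Purely as a toy sketch of that "OS picks the faster processor" idea, not anything Apple has announced: a dispatch layer that routes each operation to a backend based on problem size. Every name here (Dispatcher, run_on_cpu, run_on_gpu, the threshold) is made up for illustration, and the "GPU" backend is just a stand-in function; a real framework would profile the actual hardware rather than hard-code a cutoff.

```python
def run_on_cpu(data):
    """Plain CPU path: fine for small inputs."""
    return [x * 2 for x in data]

def run_on_gpu(data):
    """Stand-in for GPU offload. In a real system this would pay a
    fixed cost to copy data to VRAM, then compute in parallel."""
    return [x * 2 for x in data]

class Dispatcher:
    # Below this size, the fixed cost of shipping data to the GPU is
    # assumed to outweigh any speedup, so the CPU wins.
    GPU_THRESHOLD = 10_000

    def run(self, data):
        backend = run_on_gpu if len(data) >= self.GPU_THRESHOLD else run_on_cpu
        return backend(data)

dispatcher = Dispatcher()
print(dispatcher.run([1, 2, 3]))  # [2, 4, 6] (small input, CPU path)
```

The appeal of doing this at the OS/framework level is that applications like Photoshop or Safari wouldn't need their own GPU code; they'd just call the framework and get whichever processor is faster.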
I remember last summer when all the Core 2 Duo waiters were being pooh-poohed by the Core Duo early adopters 'cause 64-bit wouldn't matter for years ahead. Well, so much for that theory. See ya.

I can't speak for all the C2D pooh-poohers, but for myself, I was arguing that there was no real advantage to C2D in a laptop before Santa Rosa. I still think that's true. It's a pretty small segment that needs that much memory, and the raw performance of the CPUs isn't much different with the current chipset.
It also didn't make much sense that Apple had been pushing 64-bit processors so hard, and then all of a sudden released all their systems with a 32-bit processor (naturally not mentioning it anywhere). ... I couldn't see the logic in stepping backwards, so I decided to wait until the chips were actually 64-bit again.

All of Apple's laptops used 32-bit processors at the time of the Intel switchover. Those systems lost nothing, since they never had 64-bit-capable CPUs to begin with. The only system that lost 64-bit capability was the iMac, and the iMac only had a 64-bit CPU for a relatively short amount of time.