shamino said:
Viiv consists of:
  • One of the following Intel processors: Pentium D, Core Duo, or Pentium Extreme Edition
  • One of the following Intel core-logic chipsets: 975X Express, 955X Express, 945G Express, 945P Express, 945GT Express, or Mobile 945GM Express
  • One of the following Intel LAN interfaces: PRO/1000PM, PRO/100VE, PRO/100VM
  • TV Tuner card (from a third party)
  • "advanced" video card (unspecified)
  • Intel's High Definition Audio chipset (featuring 7.1 channel surround sound)
  • Remote control (from a third party)
  • Windows XP Media Center
  • A software suite involving video (download, streaming, DVR), music (download and streaming), photo sharing, and gaming.
You have to have Windows XP Media Center to be Viiv compliant! :eek: :eek: :eek:
 
fabsgwu said:
So does this card make up for the lack of AltiVec?

I hope it does; if it doesn't, the new iMac would be slower than the G5 iMac. Apple wouldn't put a slower card in one of their computers and say it's 2 times faster. ;)
 
Hattig said:
I'd guess that the graphics in the iMac are on a separate card - maybe ATI's mobile interface for PCIe, I forget the name - so they can be switched around.

Wait a month or two and there'll be a higher performance 3D iMac with the X1800 (or X1900 when it comes out), for some extra money...
I very much doubt you'll see that anytime soon. I talked to an X1800 engineer today at MacWorld Expo and the two cards are in different classes. The X1600 is the consumer version of the X1800, that alone makes it unlikely to be an option for iMacs, but PowerMacs will of course use them. The X1800 is a 512 MB card for PCI-Express only and is considerably larger and more power-hungry than the X1600. I can also tell you that while the demos I saw (running on a Quad G5 and a 30" Cinema Display) were impressive, there's a lot of bugs left to be worked out in the drivers between ATI and Apple. Antialiasing wasn't even working at all, for example, and I saw a number of artifacts in the rendering. Some of the shaders that will be possible will be beautiful once working, though. Now the X1600 will be able to render a lot of the same effects, but it's not going to have the horsepower to pull it off at an acceptable frame rate in many cases. Bottom line is you'll be more likely to see an upgraded X1600 in the iMacs (X1600 XT? X1600 Pro? X1650?) than an X1800.
 
patseguin said:
It sounds like the top-of-the-line MacBook Pro will outperform my 1st-gen dual 2.0 PM G5 and X800. Is that a fair assumption?
Based on what I know and my questioning of the ATI guy, the two are probably very roughly equivalent for both CPU and GPU (the dual G5 probably still has a slight edge in CPU and system bandwidth). After spending the day at MacWorld, I'm actually thinking about selling my dual 2.05 G5 and replacing it with a new 20" iMac (since I was looking at upgrading the GPU and buying an external 20" LCD anyway). I figure I'll come out about even on the processing speed (although I'll need to run a lot of apps under Rosetta for a while which will slow them down), but gain a nice 1680x1050 LCD, an iSight, a larger hard drive, a faster GPU with more VRAM, iLife '06, built-in Airport Extreme and Bluetooth, Front Row with remote, dual-layer DVD burning, Mighty Mouse, and a much more compact form-factor (much as I love the design of my G5, the thing is a beast). By selling the G5 I can probably have it for about the same cost as just buying a cheap external LCD. I lose the expansion slots, internal modem, and FireWire 800, but since I'm not using those anyway, I don't really care.

One thing I'm wondering is why Apple is using IR for the remote instead of Bluetooth for the new iMac and MacBook Pro. I guess for size/cost/power savings?
 
I'm glad to see good video cards in Macs.

I'm not glad that writing firmware and drivers is still a pain in the butt for ATI/nVidia.
 
HiRez said:
I very much doubt you'll see that anytime soon. I talked to an X1800 engineer today at MacWorld Expo and the two cards are in different classes. The X1600 is the consumer version of the X1800, that alone makes it unlikely to be an option for iMacs, but PowerMacs will of course use them. The X1800 is a 512 MB card for PCI-Express only and is considerably larger and more power-hungry than the X1600. I can also tell you that while the demos I saw (running on a Quad G5 and a 30" Cinema Display) were impressive, there's a lot of bugs left to be worked out in the drivers between ATI and Apple. Antialiasing wasn't even working at all, for example, and I saw a number of artifacts in the rendering. Some of the shaders that will be possible will be beautiful once working, though. Now the X1600 will be able to render a lot of the same effects, but it's not going to have the horsepower to pull it off at an acceptable frame rate in many cases. Bottom line is you'll be more likely to see an upgraded X1600 in the iMacs (X1600 XT? X1600 Pro? X1650?) than an X1800.

How about putting the notebook version of the x1800 in the iMac?
 
shanmui1 said:
How about putting the notebook version of the x1800 in the iMac?
Dunno about that, does it exist yet? I'm curious whether the X1600 in the MacBook Pro is a mobile version of the one in the iMac, or are they the same (in other words, does the X1600 just naturally have a mobile form-factor, with no full-sized desktop counterpart?)
 
HiRez said:
Dunno about that, does it exist yet? I'm curious whether the X1600 in the MacBook Pro is a mobile version of the one in the iMac, or are they the same (in other words, does the X1600 just naturally have a mobile form-factor, with no full-sized desktop counterpart?)

I think so, but I'll check the website right now and give an update.
EDIT: No, I checked the site; they only have it for desktops. If I haven't looked hard enough, check the website yourself: http://www.ati.com/products/RadeonX1800/
 
HiRez said:
Dunno about that, does it exist yet?

A GeForce Go 7800 GTX currently exists, so I'm sure an X1800 is at least planned. Now that thing would be great in there :cool: The 7800 scores some nice numbers.
 
HiRez said:
Dunno about that, does it exist yet? I'm curious whether the X1600 in the MacBook Pro is a mobile version of the one in the iMac, or are they the same (in other words, does the X1600 just naturally have a mobile form-factor, with no full-sized desktop counterpart?)


Well, here is what I read on the X1600/X1800:

http://www.theinquirer.net/?article=28311

http://www.theinquirer.net/?article=28952

The GeForce Go 7800 GTX is only a bit slower in clock and memory speed than the desktop version, with 256 MB instead of 512 MB, but that class of GPU in an iMac...
 
Airforce said:
A GeForce Go 7800 GTX currently exists, so I'm sure an x1800 is atleast planned . Now that thing would be great in there :cool: The 7800 scores some nice numbers.
Hmm, OK. Well, I played WoW on a 17" Intel iMac today, and it was pretty smooth (with some weirdness that I attribute to it being a beta version and the X1600 drivers being immature). Once the drivers are mature, I think the claims of it being close to X800 performance will be fairly accurate. And that's not bad for being built into an iMac or PowerBoo...*cough*...I mean MacBook Pro!
 
shanmui1 said:
Well, here is what I read on the X1600/X1800:
Thanks for the links. It's always tempting to wait for the next GPU, but that's a never-ending game. Based on what I've seen, the X1600 in the Intel iMac makes for a mighty nice modern gaming machine (I would spring for the extra VRAM in the 20"), and this should get better in the coming months as drivers and games get tweaked. Super high-end graphics whores will not be satisfied, but they never are, and they aren't buying iMacs no matter what anyway. I remember witnessing the sucky GeForce "Go" mobility chips in the PowerBooks a few MacWorlds ago and being extremely unimpressed; not so with this year's iMac/MacBook Pro and the X1600.
 
So I have to assume the X1600 in the MacBook Pro is the Mobility version. The question is: is the X1600 found in the new iMacs the same Mobility chip, or is it the full desktop X1600? And is the "desktop X1600" the same as the "X1600 XT"?
 
fabsgwu said:
So does this card make up for the lack of AltiVec?

No, but SSE3 on the 'Core Duo' processor does.

Basically, AltiVec is a 'vector processing' instruction set that performs floating-point operations on a bunch of numbers at once. Intel's original 'MMX' was similar, but only did integer operations. SSE came along, adding floating point, but not very well. AltiVec was released after SSE, so much was made about how vastly superior AltiVec was to SSE. Then Intel released SSE2 (with the original Pentium 4), which was an improvement, but still not quite up to AltiVec. With the 'Prescott' Pentium 4s, and now the 'Core Duo' on the mobile side, they are up to SSE3. SSE3 is meant not only as a vastly improved vector instruction set; it even takes over most of the non-vector floating point. SSE3 is easily an equal to AltiVec, possibly superior.
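
To make the vector-processing comparison concrete, here's a minimal C sketch (just an illustration, not anything Apple or Intel ships) of the basic idea behind both AltiVec and SSE: adding four floats with a single instruction instead of four separate ones. It uses the SSE intrinsics from <xmmintrin.h> and assumes an x86 compiler with SSE enabled; on a G4/G5 the AltiVec equivalent would be vec_add() on vector float operands.

#include <xmmintrin.h>  /* SSE intrinsics (SSE1 is enough for this) */
#include <stdio.h>

int main(void)
{
    float a[4] = { 1.0f, 2.0f, 3.0f, 4.0f };
    float b[4] = { 10.0f, 20.0f, 30.0f, 40.0f };
    float scalar_sum[4], simd_sum[4];
    int i;

    /* Scalar version: one floating-point add per loop iteration. */
    for (i = 0; i < 4; i++)
        scalar_sum[i] = a[i] + b[i];

    /* SIMD version: load 4 floats into a 128-bit register and add
       all four pairs with a single instruction. */
    __m128 va = _mm_loadu_ps(a);
    __m128 vb = _mm_loadu_ps(b);
    __m128 vsum = _mm_add_ps(va, vb);
    _mm_storeu_ps(simd_sum, vsum);

    for (i = 0; i < 4; i++)
        printf("%.1f  %.1f\n", scalar_sum[i], simd_sum[i]);
    return 0;
}

Compile with something like gcc -msse -o simd simd.c; the real payoff shows up when loops like this run over large audio or image buffers rather than just four values.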
 