
View Full Version : Apple's New Graphics Card?


arn
Apr 16, 2002, 08:50 PM
James submitted this interesting Architosh article (http://www.architosh.com/news/2002-04/2002c1-0412-applegraph1.phtml) detailing a possible Apple graphics card in development:

Apple is apparently working on a superfast graphics card with twin-engines. Our sources tell us that the card may or may not be co-designed with Nvidia.

The article fills in whatever details it can... but of note, it reconfirms Apple's apparent interest in the Hollywood/video market... with recent acquisitions telling us more than Apple is saying publicly.

Of note, they report on a Studio Summit which revealed the chief "wants" from the industry... including powerful graphics, rack-mounted units, and dual/quad-processor machines.

Beej
Apr 16, 2002, 08:56 PM
Originally posted by arn
Our sources tell us that the card may or may not be co-designed with Nvidia.
Now there's a quality quote ;)

Sounds cool. An Apple branded video card would have to rock!

Rower_CPU
Apr 16, 2002, 09:00 PM
Sounds like Apple is trying to find a solution on par with nVidia's Quadro line of video cards for PCs.

From the article it looks like there is a lot of support for this type of hardware development from the Hollywood elite...interesting...

Mr. Anderson
Apr 16, 2002, 09:01 PM
If you dig deeper into the article, it's got a lot to say. I'm especially interested in the G5 test boxes. I hadn't heard about this. But also that the development of a dual-chip graphics card signals a solid effort by Apple to support the 3D animation and post-production apps.

A quad G5 with a dual-chip graphics card. Now that would be a nice present for MWNY. I'm not expecting it, but it sounds good.

mac15
Apr 16, 2002, 09:06 PM
Whoa, Nvidia makes the best GPUs.
Now with Apple, that's cool.
And the G5 sounds nice too.

Xapplimatic
Apr 16, 2002, 09:17 PM
Each GPU is fitted out with "heaps of RAM," our sources say, estimating the amount at 128 MB of DDR RAM for each engine (GPU). Besides a curious daughter-card slot on the card, our sources say that they know very little about the rest of the card—other than to say its performance on the test box was nothing short of astonishing.

Wow! If this is Apple's sole creation, I am completely astonished! Even better that it's said to go hand in hand with a HyperTransport motherboard with dual processors (~G5)... If this is true, this year has MUCH in store for Apple and Mac users alike... :)

Mr. Anderson
Apr 16, 2002, 09:23 PM
Originally posted by Xapplimatic
If this is true, this year has MUCH in store for Apple and Mac users alike... :)

Not only that, it could have wider impact. Think if the really fast dual-chip graphics card were only available for the Mac. It would give Apple an advantage they don't already have. This could be huge.

jelloshotsrule
Apr 16, 2002, 09:45 PM
this could be amazing


if only they'd make a card that has two ADCs... I mean, I don't have 2 Apple monitors, but it just seems like there should be one dual card with two ADC ports...

ahh, can't wait to get my 15" in...

dongmin
Apr 16, 2002, 10:51 PM
All this talk of new technology is great but I'm a little concerned about the direction Apple's taking. Apple seems to be really pushing for that high-end video/animation market. Why? I don't doubt that it's a potentially lucrative market, but it's also a very small market, super-niche. A little like what SGI used to be.

Sure, there are some trickle-down effects, like iMovie. But most of these new technologies and acquisitions seem limited to production-house uses. Whatever happened to "a computer for the rest of us"?

How about, instead, focusing on developing a better browser for the Mac? Mac browsers blow in comparison to PC ones. Or introducing some new digital appliances?

SPG
Apr 16, 2002, 11:34 PM
Originally posted by dongmin
All this talk of new technology is great but I'm a little concerned about the direction Apple's taking. Apple seems to be really pushing for that high-end video/animation market. Why? I don't doubt that it's a potentially lucrative market, but it's also a very small market, super-niche. A little like what SGI used to be.

Sure, there are some trickle-down effects, like iMovie. But most of these new technologies and acquisitions seem limited to production-house uses. Whatever happened to "a computer for the rest of us"?

How about, instead, focusing on developing a better browser for the Mac? Mac browsers blow in comparison to PC ones. Or introducing some new digital appliances?

A lot more trickles down than you think: QuickTime, font and display technology, graphics cards for games. So much of what you use was developed from the high-end stuff. The innovation is faster when the tests are harder, and 3D design and video are some of the hardest tests to pass.
The size of the market is bigger than you think when you include FCP and DVDSP, and like it or not, just as the '90s saw the desktop publishing revolution, right now the video revolution is upon us, and I'm glad Apple isn't passing up the opportunity to lead the charge.

The "computer for the rest of us" is called the iMac and it's portable equivalent is the iBook.
Apple make a browser? Isn't that how MS got in trouble?

eirik
Apr 17, 2002, 01:17 AM
A more knowledgeable person than I noted some time ago, in a post regarding video cards, that video RAM per se doesn't do that much for speedy performance. More doesn't necessarily make it better. The poster said that, in games at least, the RAM is used to store textures (?) or something so they don't have to be regenerated.

Whoever that poster was, they made the comment in the context of game performance. I really don't know about 3D and video editing with respect to increased video RAM.

I'm waiting for a major upgrade to the PowerMac line, as well as some kind of PVR peripheral/software package (fully-baked). So, I'm using a Windoze2k by Dell these days.

My point? Last winter, when the price of main memory RAM was so low, I bought the max that this and my other PC could handle. Thus, my Windoze2k box has 768 MB of PC100 SDRAM. In the months since I installed the RAM, I've never seen more than 25% of it being utilized. (I'm just starting to learn Photoshop, so that ought to challenge it much more.)

This all relates to Apple's rumored super GPU in that if having more and more RAM in my Win2K had little if any impact on performance, would the presence of 2 x 128MB of DDR significantly help? Bear in mind when you answer this, do YOU know what specific functions, processes, and threads that video cards ACTUALLY execute; do YOU know what the video memory is ACTUALLY used for? I honestly can't answer that well.

I remember a very informed poster from within the last three months or so indicating that the idea that adding video RAM makes GPUs much faster is something of a myth. Again, this was in the context of game performance. So, if this holds in 3D and video editing too, then this 2 x 128 MB sounds a bit suspicious.

Hey gurus, what do you think?

Eirik

Rower_CPU
Apr 17, 2002, 01:29 AM
eirik-
VRAM as it applies to gaming is very different than for 3D rendering...

From what I understand, the performance of the vid card ties into three things: 1) GPU speed, 2) VRAM speed and size (overall bandwidth, GB/s), and 3) CPU speed...

Now a vid card with 2 x 128 MB of DDR RAM would have twice the theoretical memory bandwidth of a 128 MB DDR RAM card...
I'm sure some of the guys who do 3D work can weigh in on how this benefits them...
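To put rough numbers on that (purely back-of-the-envelope - the 128-bit bus width and 250 MHz DDR clock below are my own guesses, not anything from the article), here's how the theoretical bandwidth math works out, assuming each GPU really does get its own memory pool and bus:

# Theoretical DDR memory bandwidth: bus width x memory clock x 2 (DDR moves data on both clock edges).
def ddr_bandwidth_gb_s(bus_width_bits, mem_clock_mhz):
    bytes_per_transfer = bus_width_bits / 8
    transfers_per_second = mem_clock_mhz * 1_000_000 * 2
    return bytes_per_transfer * transfers_per_second / 1e9

per_gpu = ddr_bandwidth_gb_s(128, 250)        # hypothetical 128-bit bus at 250 MHz DDR
print(f"per GPU:  {per_gpu:.1f} GB/s")        # ~8.0 GB/s
print(f"two GPUs: {2 * per_gpu:.1f} GB/s aggregate, if each has its own bus")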

Xapplimatic
Apr 17, 2002, 01:55 AM
Originally posted by eirik

This all relates to Apple's rumored super GPU in that if having more and more RAM in my Win2K had little if any impact on performance, would the presence of 2 x 128MB of DDR significantly help? Bear in mind when you answer this, do YOU know what specific functions, processes, and threads that video cards ACTUALLY execute; do YOU know what the video memory is ACTUALLY used for? I honestly can't answer that well.

I remember a very informed poster from within the last three months or so indicating that the idea that adding video RAM makes GPUs much faster is something of a myth. Again, this was in the context of game performance. So, if this holds in 3D and video editing too, then this 2 x 128 MB sounds a bit suspicious.


Eirik:

More RAM would make game performance faster (3D gaming). The RAM is used primarily for 3D texture mapping. The more detail, the more memory needed; the higher the resolution, etc. In short, the more the card can store in its own high-bandwidth memory, the less the card has to interface with the computer's slower bus and memory architecture. Less time wasted accessing stuff that could all be cached in the faster on-card memory means the card has more time to spend on things that matter, like the actual mapping and higher frame-rate throughput. Usually more memory = better performance in this particular area.
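Just to give a feel for the scale (the numbers here are illustrative, not from the article): one 512x512 texture at 32 bits per pixel already runs over a megabyte once you count mipmaps, so even a 128 MB pool per GPU fills up with a detailed scene:

# Rough texture memory: width x height x bytes per pixel, plus ~1/3 extra for mipmap levels.
def texture_mb(width, height, bits_per_pixel=32, mipmaps=True):
    base_bytes = width * height * bits_per_pixel // 8
    total_bytes = base_bytes * 4 // 3 if mipmaps else base_bytes
    return total_bytes / 2**20

one_texture = texture_mb(512, 512)                                 # ~1.33 MB with mipmaps
print(f"one 512x512 texture: ~{one_texture:.2f} MB")
print(f"textures that fit in 128 MB: ~{int(128 / one_texture)}")   # ~96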

Dongmin:

As to why Apple should pursue the high-end video effects market and graphics systems... I think that's simple too. It gives Apple a lock on something really prestigious. Not only is Apple probably aware that this puts them in favor with Hollywood and assures them lots of free advertising, it also gives them an edge in designing their regular computer systems as well. Trickle-down hardware design: what is learned at the high end gets applied below.

The high-profile, high-performance needs of video industry leaders also catch the imagination of a very large segment of regular Windoze users --> gamers. If Apple has the absolute best to offer for the highest-performance 3D stuff, PC gamers will start buying Macs to play Quake 3 instead. There are people who spend all their spare dollars just to stay at the top frame rates at the highest resolutions in their favorite games... High-end 3D effects is a highly coveted market, and I think a lucrative one for Apple to tap into, be it Hollywood or the home have-to-have, keeping-up-with-the-Joneses types. It will also spur the Mac gaming market and bolster the number of offerings. It's good all around.

thedude
Apr 17, 2002, 02:54 AM
Finally!!! I just hope that the card is a best-in-breed type deal. I would hate to see people staying away from Maya on OS X because of a sucky vid card. My only concern is price. Currently, 3Dlabs' Wildcat III 6110 is (at last check) $1,800. If Apple decides to package a card in a production G4/G5 with dual procs, the price is going to be through the roof. Right now an SGI Fuel (O2+ machines are less) is priced at 12 grand. If Apple can bring it in for 5-8 grand, SGI is going under for sure. A Unix command line, a GUI that actually works, dual G5s and a kick-ass vid card... wow... And if SGI goes under, I wonder if they'll sell Alias|Wavefront? hmmm.....

freedom
Apr 17, 2002, 02:55 AM
This seems really cool. If Apple releases something like this I will buy a DV cam and start making my own films! I know that I can already do that with an iMac, but think what this would also do for Photoshop performance… Pro machines are the testing ground for consumer products, as said above, and don't you remember all the speculation about the new iMacs? Pro vs. consumer, blah blah blah…

Xapplimatic, you're damn right that attracting Hollywood would mean "free" advertising. Apple will benefit from this! As a kid I always wanted an SGI…

If this also means that a few hardcore PC gamers will convert, that means that some game developers will have to as well… Right?!

tastybrains
Apr 17, 2002, 03:10 AM
You are comparing the RAM on your motherboard with the RAM on your graphics card, which is completely ridiculous.

In the old days, more video RAM meant higher-resolution display capability. This is because your video card maintains a complete record in RAM of every pixel displayed on your screen (the "frame buffer"). For instance, just keeping track of each pixel currently being displayed on a 24-bit 1280x1024 display requires (1280 x 1024 x 24) / (8 x 1024 x 1024) ≈ 3.75 MB — call it about 4 MB.
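For anyone who wants to check that arithmetic, here it is spelled out (keep in mind that double buffering and a Z-buffer, which cards also keep in video RAM, would roughly double or triple these figures):

# Frame buffer size: width x height x bits per pixel, converted to megabytes.
def frame_buffer_mb(width, height, bits_per_pixel):
    return width * height * bits_per_pixel / 8 / 2**20

print(f"1280x1024 @ 24-bit: {frame_buffer_mb(1280, 1024, 24):.2f} MB")   # ~3.75 MB
print(f"1600x1200 @ 32-bit: {frame_buffer_mb(1600, 1200, 32):.2f} MB")   # ~7.32 MB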

Obviously modern graphics cards have way more than enough memory to display almost any reasonable resolution and color depth. Much of the remaining memory is used to store textures (a "texture" is the pattern that fills an object -- the more detailed, the more realistic the result). This can help games to a certain extent, in that they can use much more detailed graphics without a huge performance hit because the textures stay loaded into the video RAM for fast retrieval. Think of this concept as being analogous to CPU cache. It is used for fast retrieval of frequently-used data and as a workspace for the application of various low-level hardware-accelerated tasks.

I am not exactly sure how this would pertain to the final product in high-end animation, since each frame would probably have to be individually rendered with a level of non-repetitive detail that would lessen the benefits of the GPU. Of course, pre-production work (editing, viewing previews, etc.) could be significantly accelerated. As far as gaming is concerned, the primary benefit would be higher texture quality (an absolute requirement for the next generation of game engines, exemplified by the new Doom).

The simple upshot is that new games will be much more photorealistic and still able to be rendered in real time.

Pants
Apr 17, 2002, 03:39 AM
Originally posted by Rower_CPU
eirik-
VRAM as it applies to gaming is very different than for 3D rendering...

From what I understand, the performance of the vid card ties into three things: 1) GPU speed, 2) VRAM speed and size (overall bandwidth, GB/s), and 3) CPU speed...

Now a vid card with 2 x 128 MB of DDR RAM would have twice the theoretical memory bandwidth of a 128 MB DDR RAM card...
I'm sure some of the guys who do 3D work can weigh in on how this benefits them...


Hmm... there's a bit of a law of diminishing returns with vid card memory - more isn't always noticeable. Secondly, the approach taken to 3D (game) rendering in the current crop of cards has been argued to be a little wasteful - the performance of the Kyro II class of cards was as good as that of the contemporary nVidia cards, but at a much lower clock speed, due to its different (tile-based) rendering system. And again, bigger textures are a bit of a 'fudge' - bigger textures don't imply more realistic scenes. Plus plus plus - just bunging on more RAM and a faster chip does not a better vid card make.

I suspect any vid card made for industry use isn't going to be the kind of thing you or I will be able to afford or actually want. After all, how many of us desire a Sun or SGI vid card?

freedom
Apr 17, 2002, 04:59 AM
An industry high-end vid card would be too expensive for most of us… All I want is an ongoing evolution… Sooner or later this technology will trickle down to us "regular users." I really can't use that much power at the moment, but who knows what the future holds?!?

Time for lunch!

I can sometimes appreciate flaming, as long as flamers explain what was wrong and educate those of us with less knowledge at hand!

Macmaniac
Apr 17, 2002, 06:39 AM
This sounds really cool! I hope their new pro line comes with DDR RAM to supplement this new graphics card! Along with a G5 chip of course.

Pentium Killer
Apr 17, 2002, 07:03 AM
And we'd all like such a beast with a dual G5, but I'd also be happy if a single G5 came out this summer. Still, I have my doubts, but who knows (Steve, I assume) ;) ;)

teabgs
Apr 17, 2002, 07:55 AM
The Hollywood market, though specialized and "small," is HUGE in dollar terms. They need AS MUCH POWER AS POSSIBLE. If Apple can come out with this supposed beast of a machine and post-production and animation houses were to buy them, it would give Apple SO MUCH MONEY AND PRESS. It's not like they'd buy just one $5,000 station for each user; they'd be buying so many of them, and far more expensive machines at that.

Plus, they'd upgrade them a lot more often than most users do; this creates a steady, permanent flow of money to Apple. This would be used for R&D and in turn produce faster updates. Even if those updates were only affordable by the Hollywood market, the benefits would trickle down to others when upgrades came about. Hollywood would get newer machines and we'd be able to get the previous machines at a huge discount from their original prices.

This could be one of the greatest feats of apple ever...if played out right AND true.

Side note: How cool would it be at the end of the credits of an animated feature to see "Made On A Mac" with the Apple Logo Underneath?

wymer100
Apr 17, 2002, 08:38 AM
The idea of dual GPUs isn't new. Both 3dfx and ATI have offered dual GPUs, the Voodoo2 (in SLI) and the Rage Fury MAXX respectively. The biggest problem will be making both chips work efficiently and making sure the drivers aren't buggy. Apple may simply be making a dual-GPU card to drive 2 ADC monitors; hard to tell. I am glad that Apple is really addressing the issue of slow chipset performance. The biggest problem with the current crop of G4s is the chipset. The G4 and GPUs are as good as you can get (gaming-wise).

People are correct when they say there are trickle-down effects from pursuing the high-end 3D environment. Developing a DDR bus and faster chipsets costs about the same whether you are doing it for Hollywood or Joe Public. Apple might as well go after the Hollywood market to get additional bang for the buck. Steve Jobs knows how big the 3D market has gotten and will get. Just look at movies like Shrek and Monsters, Inc. Apple is being smart about their approach. At least they are getting feedback from Hollywood and increasing their chance for success. Hollywood isn't going to jump to Apple until the platform is better than their current systems.

Mr. Anderson
Apr 17, 2002, 08:43 AM
Originally posted by Rower_CPU
Now a vid card with 2 x 128 MB of DDR RAM would have twice the theoretical memory bandwidth of a 128 MB DDR RAM card...
I'm sure some of the guys who do 3D work can weigh in on how this benefits them...

As someone who dabbles in 3D animation, I know for a fact this will have a huge impact. The more RAM, the more real-time objects you can preview and use. All of the big apps use OpenGL to display polygons, and are therefore limited by the video card in how many they can actually display. If Apple wants to go Hollywood, they need big machines, capable of handling extremely complex models and scenes. Shrek pushed the limit of this and has pretty much set the standard for future full-length animations.

As for rendering the final output, that's done by the CPU, not the graphics card, so the speed and power of the machine come into play here. Can anyone say quad G5?

Mr. Anderson
Apr 17, 2002, 08:45 AM
Originally posted by teabgs

This could be one of the greatest feats of apple ever...if played out right AND true.

Side note: How cool would it be at the end of the credits of an animated feature to see "Made On A Mac" with the Apple Logo Underneath?

I totally agree, and it would make me proud as a Mac owner to see that at the end of the credits. It's at the end of mine; the problem is I have a much smaller audience, but someday that all might change.

serpicolugnut
Apr 17, 2002, 08:59 AM
The lack of a "serious" 3D graphics card for the Mac is currently only half the problem. The other half is that Carbon 3D applications currently suffer from really bad OpenGL performance. Lightwave and Cinema 4D both have awful OpenGL redraws in the OS X versions. Which is a major shame, because OS X finally gives users of these applications the stability and rendering speed that OS 9 never could. I'm a Lightwave user (with a G4/800 DP and a GF3 card), and the performance is so lame it has me seriously considering adding an AMD box just for Lightwave.

The kicker is that some Carbon applications feature great OpenGL performance, like Quake III. It has better frame rates under OS X than under 9. Maybe it's the individual developers' (NewTek, Maxon) responsibility to optimize their apps' OpenGL performance, but from what I've heard, it needs some serious work under OS X.

iGav
Apr 17, 2002, 09:02 AM
It all sounds very impressive and all......

But I have one question?

Will they be able to squeeze it into the TiBook??

Now that would be impressive!! :p

Or will it be the size of an ICE card??

Mr. Anderson
Apr 17, 2002, 10:03 AM
Originally posted by serpicolugnut
Lightwave and Cinema 4D both have awful OpenGL redraws in the OS X versions.

This totally sucks. I have only tried Lightwave with OS X once and noticed it was a little 'sticky' on the redraws. I thought at the time that I needed to get the 7.0b upgrade from NewTek. But if what you're saying is true, I might have to wait until OS X 10.2 or even further before the OpenGL problem is solved.

Damn.

Not only that, the graph editor tends to screw up in OS 9 on 7.0b because of QuickDraw issues. I was talking with tech support at NewTek about this one and they said to disable all QuickDraw extensions. Are you seeing the same problem? Oh, it's only with some graphics cards - I was running on the TiPB; I don't see the same issues on my desktop G4.

mischief
Apr 17, 2002, 11:34 AM
I've got a hunch that most ported 3D apps are in the same situation as ArchiCAD: non-native or nonexistent AltiVec support. If the thing is set up for Wintel chipsets to do its base calcs, then it'll always stick on the redraw on the Mac.

This is where the OpenGL GUI concept starts looking good. If all the monitor-related tasks are offloaded, the CPU can deal with just the admin and math.

This model doesn't work without the kind of video card we'd get out of this. It would be possible on the current machines, but the rumored G5 MP with one of these would smoke any production desktop currently made.

As to the GPU RAM performance gain: if the program has native support for OpenGL and renders frames live (see any CAD or CGI software), you use up that 32 MB per monitor pretty damn fast. It's more than just textures: light mapping, bump mapping, polygons and split polygons, colours, shading, grouped objects, vectors and, of course, the screen buffer.
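To show how fast that goes, here's a made-up per-monitor budget for a live 3D viewport (every scene figure below is invented for illustration, not taken from any particular app):

# Hypothetical per-monitor VRAM budget for a live 3D viewport.
def buffer_mb(width, height, bits_per_pixel):
    return width * height * bits_per_pixel / 8 / 2**20

color_buffers = 2 * buffer_mb(1280, 1024, 32)   # front + back buffer, ~10 MB
depth_buffer  = buffer_mb(1280, 1024, 32)       # 24-bit depth + 8-bit stencil, ~5 MB
textures      = 16.0                            # cached textures, light maps, bump maps (guess)
geometry      = 4.0                             # vertex data for a complex model (guess)

total = color_buffers + depth_buffer + textures + geometry
print(f"rough total: {total:.1f} MB")           # ~35 MB - already past a 32 MB card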

I want that 256 MB DDR card, please.

This also explains a few other rumors:

I heard one about an Apple G5 test box with an AMD chip in it alongside the G5.

AMD and nVidia have been discussing OS X.

Upshot: what if nVidia does an Apple-designed dual-head card using AMD-built GPU chips?

Picture it:

Choice of:

2, 4 or 8 G5 1.2 GHz CPUs (built under license by IBM)

500 MHz RapidIO bus

4 GB of RAM

Choice of ATA/133 or SCSI-160, two to six HDs

256 MB HyperTransport dual-GPU Apple/nVidia Xforce video card with two 1.2 GHz AMD GPU chips

Workstation cost: $10,000.00 USD.

gbojim
Apr 17, 2002, 12:00 PM
The film and video folks have not been very happy with SGI in recent times. And, through a couple of my clients I had been hearing rumblings about Apple making a strategic push into that market to try to unseat SGI. I consider that to be a good move - a lot of very high margin sales in that market.

The thing that was weird was people were talking about a very high end graphics card - as in $2000+ - becoming available when the new powermacs are released (don't know if that meant G5 or not). I had checked around with current producers of high end vid cards and nobody was willing to admit they were even thinking of working with Apple on this.

But, if Apple is working on this on their own, that starts to tie a lot of things together that I had been hearing about.

This could get really interesting.

Mr. Anderson
Apr 17, 2002, 12:14 PM
Originally posted by mischief
I've got a hunch that most ported 3D apps are in the same situation as ArchiCAD: Non-native or nonexistant AltiVec.

Lightwave is optimized for OS X, at least on the CPU side of things. The OpenGL issue is something I'm looking into.

mischief
Apr 17, 2002, 12:22 PM
Originally posted by dukestreet


Lightwave is optimized for OS X, at least on the CPU side of things. The OpenGL issue is something I'm looking into.

Patched is different from native support. There's always some lag if the program is saying "Go here, no wait, go there!" for every FPU/AltiVec action.

ejm625
Apr 17, 2002, 05:59 PM
Twin GPUs? I have to change my shorts. Time to count my pennies.

Mr. Anderson
Apr 17, 2002, 07:37 PM
I downloaded the latest updates to OS X and Lightwave and I don't have the same lag anymore; it runs smoothly, even better than in OS 9 :D

So I'm happy and I'd love to see a dual card for when I open up the big models. Right now I'm still building my objects, the scenes where I put them all in together won't happen for a while.

Catfish_Man
Apr 17, 2002, 08:36 PM
Originally posted by mischief

This is where the Open-GL GUI concept starts looking good. If all the monitor-related tasks are offloaded the CPU can deal with just the admin and math.


Exactly! That's what I was trying to say in the topic I started about that (3D OS). I think it would be fun to try making an entire graphics system (Quartz II or something) that did everything in OpenGL. 2D graphics would be done as flat 3D objects or textures. It's completely impractical of course, but it would be sorta cool.

thopter
Apr 17, 2002, 09:35 PM
$10,000 Macs!? The day of super-expensive workstations is over. As is that of $10,000 apps. Notice the falling market share of SGI, and the drastic price (and outlying office) pullbacks by the likes of Alias, NewTek, and Electric Image...

Mr. Anderson
Apr 17, 2002, 09:45 PM
Originally posted by thopter
The day of super expensive workstations is over.

If you went to the Apple Store right now you could order yourself a $10K machine. The days are far from over. What's changed is the power you can get at more reasonable prices. There will always be a market for high-end machines, just not a very big one.

thopter
Apr 18, 2002, 06:32 AM
If you went to the Apple Store right now you could order yourself a $10K machine.
Yup, but you'll be able to get a good deal on one soon, because they're not selling very well. I work for an Apple VAR and I haven't seen a Quicksilver in some time now. The Mac market right now is solely iMacs and PowerBooks. Certainly there is pent-up demand for the fabled G5, but if Apple can't or won't deliver it by summer's Macworld, all this hard-earned momentum could really begin to fizzle.
The real key to the success of products like FCP has been their affordability!
If FCP had been intro'd at the same price as other video software, it would have been a non-event. But at a market-busting price it has been a revelation.

mischief
Apr 18, 2002, 10:22 AM
If SGI is selling workstations for $50,000, then an Apple KillerMac G5 at $10K would roast them, provided it had an updated mobo with the above video card. Particularly if Jaguar is all it's supposed to be.

Mr. Anderson
Apr 18, 2002, 11:40 AM
Originally posted by thopter
I work for an Apple VAR and I havn't seen a quicksilver in some time now. The Mac market right now is solely iMAc's and PB's.

That's got to be hurting Apple. I totally agree that Apple has got some good momentum going now, and they can't jeopardize it by not updating the Pro line. Besides that, the TiPBs are starting to get long in the tooth too; they're due for a redo as well.

iMax
Apr 18, 2002, 04:09 PM
As someone who was in middle school not too long ago, I can still remember arguments with my friends about whether or not Macs were better than PCs. I'd always try to explain the better OS, better looks, etc., but it would always come down to this: my friends would say, "Yeah, I guess Macs are better for graphics." That was the only concession I could get out of them. Apple used to have a real reputation for being the best in the graphics arena... they need to have the hardware to reclaim that title. All this talk sounds like they are taking a step in the right direction.

z0gster
Apr 19, 2002, 09:44 PM
I think that maybe the reason they are making the graphics card with 2 GPUs is so that they can do stereoscopic displays. 'Cause stereoscopic displays need 2 GPUs, and there was that rumor like a month ago about Apple using them. That would be really cool...

thopter
Apr 21, 2002, 10:11 AM
So now NewTek (Lightwave) is closing branch offices as well. Which begs a question: if Apple does offer a $10,000 G5 workstation, who is actually in the market to buy one?

thedude
Apr 24, 2002, 12:08 PM
Put it this way: Square Pictures bought a whole slew of SGIs for a hell of a lot more than $10K each. (Granted, they're not doing so hot right now, as in splat!) The money is there, and companies are going to spend for a better product.
With Maya's price drops, people should have more money to spend on the hardware anyway :D .
One issue that nobody has mentioned yet is Apple's policy on UPGRADES... This could be a problem, as larger companies don't want to have to spend money buying brand-new machines every 6 months.

Scottgfx
Jun 16, 2002, 03:57 AM
At work I have Lightwave running on a dual G4/500. The gfx cards are Rage 128s (two displays... the PCI card is a regular Rage 128, the AGP one is a Pro). At home I have a single G4/733 with a Radeon 8500 (single display). Lightwave does feel "sticky" on the dual with the slower gfx cards. Lightwave feels a lot better with the Radeon 8500... except for this: the dual 500 does render a bit faster than the G4/733. This doesn't surprise anyone, does it? :)

Edit: These are both running under OS X 10.1.5

Originally posted by dukestreet
I downloaded the latest updates to OS X and Lightwave and I don't have the same lag anymore; it runs smoothly, even better than in OS 9 :D

So I'm happy and I'd love to see a dual card for when I open up the big models. Right now I'm still building my objects, the scenes where I put them all in together won't happen for a while.

Scottgfx
Jun 16, 2002, 04:18 AM
Originally posted by thopter
So now NewTek (Lightwave) is closing branch offices as well. Which begs a question: if Apple does offer a $10,000 G5 workstation, who is actually in the market to buy one?

FYI: I don't work for NewTek or anywhere near it.

Lightwave has always been developed by a core team of two guys under contract with NewTek. Allen Hastings and Stuart Ferguson are the two main brains behind Lightwave. They have formed a company called Luxology that is an umbrella for Lightwave's development. NewTek still has their own development team in Texas, but the main developers are still in the area where NewTek closed the office. Long story short... not much has changed.

http://www.luxology.net/company/timeline.aspx

ccornish
Jun 17, 2002, 02:35 AM
Check out Matrox and 3Dlabs for their new products.

The Parhelia is very impressive. Quake or UT 2003 on three monitors.

3Dlabs has something up their sleeve as well. Hopefully we will see some fresh new products from new vendors instead of ATI or Nvidia.