Originally posted by agent302
If I read that correctly, that adapter is the same as the $45 Gefen one I linked to; as in, DVI graphics card to VGA monitor, not the other way around. And try to spell my name correctly, please.
********s* :mad:

You did not read it correctly. From the website:
Our DVI adapters provide a high-bandwidth video interface for the host and display devices of today, while addressing the bandwidth requirements of tomorrow. Digital Display Working Group (DDWG*) Digital Visual Interface (DVI) standard guarantee to work with analog or DVI-I flat panels/liquid crystal displays. No other adapter delivers better performance.
This adapter converts a DVI-I analog male plug to a VGA plug and can adapt to any existing system that uses VGA technology.


And watch the name calling...Alpha doesn't pull any punches.
 
Rower_CPU, thanks for the backup.

agent302, you have been warned... Rower has seen me rip people new ones (sometimes multiple ones). :D... It's a hobby of mine.

Hey, eyelikeart... what's your take on this?? Since you are the other top ranking demi-god here. :D
 
Originally posted by Rower_CPU
And watch the name calling...Alpha doesn't pull any punches.

yeah...he won't think twice about kicking it on you...

and if that's not enough....there's plenty more to go around from the rest of us...he he he...

did someone hear the cracking open of a can?? :eek: ;)
 
it's gotten awfully quiet in here all of a sudden?! :eek:

back to new TiBooks...

I still need to talk to my Mac dealer to see if he happens to know anything at all...

but I think I'm definitely going to chunk my Rev. A Ti 500....replace it with a nice whatever's the fastest...he he he

32 MB of VRAM would blow away my meek 8 MB (ugh)...
64 would just....never mind...this is supposed to be a PG website... ;)
 
Originally posted by eyelikeart
it's gotten awfully quiet in here all of a sudden?! :eek:

back to new TiBooks...

I still need to talk to my Mac dealer to see if he happens to know anything at all...

but I think I'm definitely going to chunk my Rev. A Ti 500....replace it with a nice whatever's the fastest...he he he

32 MB of VRAM would blow away my meek 8 MB (ugh)...
64 would just....never mind...this is supposed to be a PG website... ;)

I'm probably going to sell my rev a as well and get a new one...

Hey eyelikeart... you need a mop or a damp towel over there??? LOL
 
re: mops & towels

man...it's been rough these past couple months...

with my trip to see my girlfriend in 2 weeks and the possibility of a new TiBook....it just may get a bit messy around here...he he he :p
 
Originally posted by Rower_CPU


You did not read it correctly. From the website:
Our DVI adapters provide a high-bandwidth video interface for the host and display devices of today, while addressing the bandwidth requirements of tomorrow. Digital Display Working Group (DDWG*) Digital Visual Interface (DVI) standard guarantee to work with analog or DVI-I flat panels/liquid crystal displays. No other adapter delivers better performance.
This adapter converts a DVI-I analog male plug to a VGA plug and can adapt to any existing system that uses VGA technology.


Fine, that may be the case... BUT, it involves a loss of quality throughout the process. You are going from digital in the computer, to analog in the card, back to digital in the monitor. If you start with DVI-I, however, you go from digital to digital to digital, no loss in conversion (and it's still cheaper than your solution). Why spend money on an Apple Display if you're going to have bad quality?


And watch the name calling...Alpha doesn't pull any punches.

Then don't criticize what I say based on my post count. That means nothing in regards to what I know. (And, if you look on the sidebar, you'll notice I've been registered here for 8 months)
 
agent302,

I have attached digital monitors to a VGA connection on computers before without ANY quality loss. The MAIN thing you need to remember is to go with the native resolution of the screen. THAT yields the optimal results on the screen. Go to a lower resolution, and you will probably get a lower-quality image, but it isn't the 'conversion' of the signal, it is the fact that the screen doesn't want to go that low.

I have also used LCD's that were digital, but also had a VGA port on them. I found that the VGA port gave more options, and actually yielded BETTER image quality than going digital. That was with the conversion happening within the screen.

Your blanket statement that using the VGA to DVI adapter "involves a loss of quality throughout the process. " doesn't hold any fluids (water or otherwise).

UNLESS you are going on actual, personal experience, don't make statements like that. I have used many different kinds of monitors and computers (one of the benefits of being in IT/IS) and can tell you that no such loss exists. If by some freak chance it does, it doesn't on any display of good quality. I use mostly ViewSonic and IBM displays, both of which are rated VERY high. Not some cheap, off-brand display that WILL give you all kinds of problems (there is a reason they are cheaper). Except for the very newest Apple LCD's (haven't played with one YET), ViewSonic LCD's are of equal (if not higher) quality. IBM displays rate below ViewSonic, but still higher than many other brands.
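
To put a rough number on the native-resolution point, here is a quick sketch (Python, purely illustrative; the panel and input resolutions below are example values, not any particular display): a fixed-pixel LCD has to map whatever grid it is fed onto its physical pixels, and any non-integer scale factor means interpolation, i.e. softness, regardless of whether the signal arrived over VGA or DVI.

# Illustrative only: why an LCD looks soft away from its native resolution.
# A fixed-pixel panel maps the incoming grid onto its physical pixels; any
# non-integer scale factor forces interpolation (blur), no matter how the
# signal got there (VGA or DVI). Resolutions are hypothetical examples.

NATIVE = (1280, 1024)                      # hypothetical LCD native grid
inputs = [(1280, 1024), (1024, 768), (800, 600)]

for w, h in inputs:
    sx, sy = NATIVE[0] / w, NATIVE[1] / h
    clean = sx == sy and sx.is_integer()
    verdict = "1:1 pixel mapping, sharp" if clean else "interpolated, softer"
    print(f"{w}x{h}: scale {sx:.2f} x {sy:.2f} -> {verdict}")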
 
Originally posted by AlphaTech
agent302,

I have attached digital monitors to a VGA connection on computers before without ANY quality loss. The MAIN thing you need to remember is to go with the native resolution of the screen. THAT yields the optimal results on the screen. Go to a lower resolution, and you will probably get a lower-quality image, but it isn't the 'conversion' of the signal, it is the fact that the screen doesn't want to go that low.

I have also used LCD's that were digital, but also had a VGA port on them. I found that the VGA port gave more options, and actually yielded BETTER image quality than going digital. That was with the conversion happening within the screen.

Your blanket statement that using the VGA to DVI adapter "involves a loss of quality throughout the process. " doesn't hold any fluids (water or otherwise).

UNLESS you are going on actual, personal experience, don't make statements like that. I have used many different kinds of monitors and computers (one of the benefits of being in IT/IS) and can tell you that no such loss exists. If by some freak chance it does, it doesn't on any display of good quality. I use mostly ViewSonic and IBM displays, both of which are rated VERY high. Not some cheap, off-brand display that WILL give you all kinds of problems (there is a reason they are cheaper). Except for the very newest Apple LCD's (haven't played with one YET), ViewSonic LCD's are of equal (if not higher) quality. IBM displays rate below ViewSonic, but still higher than many other brands.

Fair enough.
 
what about the screen resolution in the spring Ti, more than 1024x768?
It would be nice...
 
DVI, VGA etc.

Instead of arguing about DVI, VGA and external monitors, I'd prefer it if Apple included a higher resolution, kick-*ss *built in* screen with the revised TiBook. Not that the present one is bad...
Also, if they change the form factor slightly to fit in a Superdrive, then maybe that would allow the space to more easily fit in a better graphics chip and the DVI port. Of course, then I'd need a separate carrying case for the batteries ;)
 
Originally posted by tacojohn
well- having DVI out would be a great solution for hooking up an ADC display. You would be forced to use the DVIator (which allows you to plug in the display's power). And it's only $150 if you want to use an Apple monitor now, not $300 like it was before. But this does cause a problem- what are you going to do about the VGA projectors that most colleges and other areas use? I know you could use the s-video out, but it's not as high quality....
:

Apple could also put DVI on the machine and include an Apple equivalent to the DVIator. I could easily imagine them doing something like that.
 
Originally posted by lunDisc
what about the screen resolution in the spring Ti, more than 1024x768?
It would be nice...

I'm willing to bet he meant more than 1152 x 768...

people have been whining about that for a while now...

I don't see the point of having a higher resolution on a screen that size...it's already big....I suppose they wanna have to use binoculars when looking at it?! he he he... :p
 
I know people have been asking for higher resolutions on the screens, and if they were available, many of them would use them. At least initially. After a while, you have to decide upon a resolution that is best for you. Maximum resolution on a screen is not for everyone. I never thought I would like the high resolution on the LCD I have, but with the clarity it has, and lack of flicker, it is very nice. I might use a higher resolution, if available, on a new PowerBook G4; then again, I might not.

I can't see Apple putting JUST DVI or ADC for outputs on their PowerBooks. For one, how many people travel to give presentations and have to use a projector that is provided?? We have many people within the company that I work for that either go to other company sites, or to client locations. You don't know what kind of projector you are going to run into. All projectors (made within the last 5-10 years) have a standard VGA connection (or a cable with that connection on it). If Apple were to convert to either DVI or ADC, then they'd better provide the converters or people will be royally pissed off. I can just see it now... a VP goes to TX to give a presentation with his brand new PowerBook G4. He goes to plug into their projector, but it only has VGA input, and he only has either DVI or ADC and he forgot his adapter. I can hear the fallout from that already *taking cover*.

Hell, even Apple's towers still have a VGA connection on them (except for the GF Ti card). The thing with towers, though, is they don't go to other sites or travel for weeks all over the country. 99.95% of them stay where they are put (at least at the same site) unless there is a company move. Adapters on towers are not the issue... it's on the laptops. I remember people praising Apple for putting a standard VGA connection onto the laptops for outputting to a larger screen/projector. They went to the STANDARD, not something on less than 5% of the screens (ADC).

I have also yet to SEE a monitor (CRT) with ONLY a DVI connection. If they do have DVI, then they either have a VGA port as well, or the cable goes to VGA input (have heard about those :D ).

Bottom line, while I see Apple as an innovator, I don't think they will make the mistake of alienating so many people by going to either DVI or ADC only on the PowerBook line. If they add it as a second video out port, that's one thing. But not as the only video output connection.
 
Originally posted by AlphaTech
...Bottom line, while I see Apple as an innovator, I don't think they will make the mistake of alienating so many people by going to either DVI or ADC only on the PowerBook line. If they add it as a second video out port, that's one thing. But not as the only video output connection.

Exactly. Moving to their proprietary connector would be a deadly move, and DVI is just not _standard_ enough, yet.

Multiple display outputs would be interesting, but difficult given the current form factor. What would they have to ditch to fit it in? The s-video out..the modem...?

On the other hand, can you imagine hooking up two external displays and still having the TiBook screen...triple display baby! Could the video card even handle that?
 
Originally posted by Rower_CPU
On the other hand, can you imagine hooking up two external displays and still having the TiBook screen...triple display baby! Could the video card even handle that?

If they're using the new video card, maybe, but more than likely one monitor would be a mirror of the TiPB screen, so you'd only get 2 displays. I'd love it anyway, I use 2 displays at home sometimes and it's just wonderful, all that acreage.:D
 
Originally posted by Rower_CPU
Exactly. Moving to their proprietary connector would be a deadly move, and DVI is just not _standard_ enough, yet.

Multiple display outputs would be interesting, but difficult given the current form factor. What would they have to ditch to fit it in? The s-video out..the modem...?

On the other hand, can you imagine hooking up two external displays and still having the TiBook screen...triple display baby! Could the video card even handle that?

I could imagine plugging a laptop into two displays and having an almost panoramic display. :D That would have to be one kick ass video card though, a mobile dual header, or would it be a triple header??? Have a pair of 23" Apple displays flanking the new PowerBook G4 and have a game playing on all three, giving you almost 180° of viewable space. You would be the UT champ in no time. :D
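
For what it's worth, a rough back-of-the-envelope framebuffer sketch (Python, purely illustrative; it assumes the 23" panels run at 1920x1200 and everything at 32-bit color) suggests the plain desktops alone would eat roughly two-thirds of a 32 MB card, before any game textures or 3D buffers come into it:

# Illustrative only: minimum VRAM just to hold the desktops for the
# triple-display idea. Assumes 32-bit color (4 bytes/pixel) and 1920x1200
# for a 23" display; real cards also need memory for 3D, so this is a floor.

BYTES_PER_PIXEL = 4
displays = {
    "TiBook internal (1152x768)": (1152, 768),
    "23in external #1 (1920x1200)": (1920, 1200),
    "23in external #2 (1920x1200)": (1920, 1200),
}

total_mb = 0.0
for name, (w, h) in displays.items():
    mb = w * h * BYTES_PER_PIXEL / (1024 * 1024)
    total_mb += mb
    print(f"{name}: {mb:.1f} MB framebuffer")

print(f"Total for plain 2D desktops: {total_mb:.1f} MB "
      f"(before any 3D/game buffers) -- tight on a 32 MB card")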
 
SPG, I just have a low threshold for idiots today.

Maybe tomorrow will be better, since it will be Friday, and with improved weather (no rain) I will be able to ride the HD.
 
Originally posted by AlphaTech
SPG, I just have a low threshold for idiots today.

The War On Stupidity And Ignorance takes no prisoners. I totally agree, if you don't do your research or know what you're talking about, don't expect to get away without someone letting you know. People can make a mistake, but a fact like that is too easily turned up.
 
I am not a graphics guy, so help me here, but I like laptops...I have 2

is the graphics card on the TiBook made more for Photoshop, or more for gaming and moving animation?

and do you traditional non-moving graphics people find AGP graphics better than older PCI graphics for Photoshop and Illustrator?

and if so, by a lot?

at the gaming store in the mall, the gamers, many of whom are techies, talk about the cards as if the high-tech industry is gearing them toward motion-related games and moving graphics only...
 