
emw

macrumors G4
Original poster
Aug 2, 2004
So I've searched many of the other threads around here, and I think I understand the technology behind Apple Remote Desktop and VNC, but I have a question about performance.

I've used VNC quite a bit, but have never had the opportunity to test out ARD. With VNC, the performance always seems to be sluggish at best. I'm assuming it wouldn't be much, if any, better with ARD since ARD relies on VNC "technology" underneath, if I understand it correctly.

So my question is this: is it theoretically possible to get real-time performance (as in the local user sees the response from the remote machine as though it were their local machine) with this technology? If so, how? Does it boil down to network speed/performance, or is there a significant technological limitation which makes this impossible?

I assume I could limit the screen resolution or color depth to help with this, but I don't want to do that.
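To put rough numbers on why network speed dominates here, a quick back-of-envelope sketch in Python (the resolution, color depth, and frame rate below are just illustrative assumptions, not measurements of ARD or VNC):

```python
# Raw bandwidth needed to send every pixel of every frame with no
# compression and no incremental updates -- a worst-case ceiling.

def raw_bandwidth_mbps(width, height, bytes_per_pixel, fps):
    """Bit rate in Mbit/s for streaming the full framebuffer each frame."""
    bytes_per_frame = width * height * bytes_per_pixel
    bits_per_second = bytes_per_frame * fps * 8
    return bits_per_second / 1_000_000

# A 1024x768 desktop at 32-bit color, refreshed 30 times per second:
print(raw_bandwidth_mbps(1024, 768, 4, 30))  # ~755 Mbit/s
```

That worst case nearly saturates gigabit ethernet on its own, which is why real implementations lean on compression and on sending only the regions that changed, and why lower color depth or resolution helps so much on slower links.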

Thanks for any insight you can provide on this.

As a point of clarification, both local and remote workstations would be Macs (thanks for the note aristobrat :)).
 
Remotely controlling my home mac mini from work, I've personally found better performance running OSXVnc on the mini and using the TightVNC client on my work PC.
 
On gig ethernet, VNC is almost real-time.

On wireless, it's slow, and over the net it's laggy at best.

I've never used ARD, just OSXVNC and the official VNC viewer for XP.
 
Gig ethernet, huh? I wonder - does anyone know how much data is sent for each screen update? I'm assuming it's not the complete screen data every time.

Of course, some lag may be introduced by data encryption or by compression, not just the data transfer itself.
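It isn't the complete screen every time: the RFB protocol that VNC speaks (RFC 6143) has the client request updates, and an "incremental" flag asks the server to send only the rectangles that changed since the last update. A minimal sketch of that request message in Python, assuming a plain RFC 6143 client (this is just the 10-byte wire format, not a full client):

```python
import struct

def framebuffer_update_request(x, y, width, height, incremental=True):
    """Build an RFB FramebufferUpdateRequest (RFC 6143, section 7.5.3).

    With incremental=1 the server replies with only the changed regions
    of the requested rectangle; with 0 it resends the whole rectangle.
    All fields are big-endian, as the protocol requires.
    """
    return struct.pack(">BBHHHH",          # U8 type, U8 flag, 4x U16
                       3,                   # message type 3 = update request
                       1 if incremental else 0,
                       x, y, width, height)

msg = framebuffer_update_request(0, 0, 1024, 768)
print(len(msg))  # 10 bytes on the wire
print(msg[0])    # message type 3
```

So per update the request itself is tiny; the bulk of the traffic is the server's reply, whose size depends on how much of the screen changed and which encoding (Raw, Hextile, Tight, etc.) was negotiated.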
 
I use OSXVnc a lot, and have relatively good performance, as long as both computers have decent specs.
 