So I've searched many of the other threads around here, and I think I understand the technology behind Apple Remote Desktop and VNC, but I have a question about performance.
I've used VNC quite a bit, but have never had the opportunity to test out ARD. With VNC, the performance always seems to be sluggish at best. I'm assuming it wouldn't be much, if any, better with ARD since ARD relies on VNC "technology" underneath, if I understand it correctly.
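For what it's worth, the VNC lineage is easy to see directly: a Mac with Screen Sharing / Remote Management turned on answers on the standard VNC port with an RFB protocol version banner. Here's a quick Python sketch to check (remote-mac.local is just a made-up hostname standing in for the remote machine):

```python
# Check whether the remote Mac speaks RFB (the protocol behind VNC).
# "remote-mac.local" is a hypothetical hostname; substitute your own.
import socket

HOST = "remote-mac.local"  # hypothetical remote Mac
PORT = 5900                # standard VNC/RFB port used by Screen Sharing

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    banner = sock.recv(12)  # the RFB ProtocolVersion message is exactly 12 bytes
    print(banner)           # expect something like b'RFB 003.008\n'
```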
So my question is this - is it theoretically possible to get real-time performance (as in, the local user sees the response from the remote machine as though it were their local machine) with this technology? If so, how? Does it boil down to network speed/performance, or is there a significant technological limitation that makes this impossible?
I assume I could limit the screen resolution or color depth to help with this, but I don't want to do that.
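To put some rough numbers on why I suspect raw bandwidth is part of the problem, here's the back-of-envelope math I did. All of the resolution, frame-rate, and color-depth figures below are assumptions I picked for illustration, not anything I've measured:

```python
# Back-of-envelope: raw (uncompressed) bandwidth a screen stream would need.
width, height = 1920, 1200   # assumed remote display resolution
fps = 30                     # assumed frame rate that still feels "live"

for bits_per_pixel in (32, 16, 8):  # full color vs. reduced color depths
    raw_bps = width * height * bits_per_pixel * fps
    print(f"{bits_per_pixel}-bit color: {raw_bps / 1e6:,.0f} Mbit/s raw")

# 32-bit color works out to roughly 2,200 Mbit/s of raw pixels - beyond
# even a gigabit LAN - which is presumably why VNC-style protocols send
# only changed screen regions, and why they feel sluggish when a lot of
# pixels change at once.
```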
Thanks for any insight you can provide on this.
As a point of clarification, both local and remote workstations would be Macs (thanks for the note, aristobrat).