I think you missed my point, I was talking about latency, not whether gigabit was necessary for streaming. I just don't like watching the spinning wheel while my movie loads - and this happens with standard definition content. Also, try "rewinding" or scrubbing: the ATV can't keep up and the screen goes black.
This does not happen when I access the same iTunes server with Home Sharing from my Mini or MacBook Air: movies start instantly and scrub smoothly. But I was also talking about the ATV3, which is what I have. It may be a little better on the ATV4 if you use 802.11ac wifi instead of the 100Base-T ethernet.
I understand what you are saying, I just don't see why you would experience such a noticeable difference.
The difference in latency on an Apple TV shouldn't be discernible between 100 and 1000Base-T, or at least I wouldn't think so. Latency does increase with larger packet sizes on 100 vs. 1000, but we are dealing with standards here, so the MTU is likely 1500 bytes. I wouldn't think the latency difference would be any more than 1 ms.
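For what it's worth, the size-dependent part of that difference (serialization delay, i.e. the time to clock one frame onto the wire) can be estimated with some quick arithmetic. This is just a back-of-the-envelope sketch assuming a 1500-byte frame and ignoring queuing, wifi airtime, and everything the ATV itself adds:

```python
FRAME_BYTES = 1500  # standard Ethernet MTU, as noted above

def serialization_ms(frame_bytes: int, link_mbps: int) -> float:
    """Time to clock one frame onto the wire, in milliseconds."""
    return frame_bytes * 8 / (link_mbps * 1_000_000) * 1000

fast_eth = serialization_ms(FRAME_BYTES, 100)    # 100Base-T
gig_eth = serialization_ms(FRAME_BYTES, 1000)    # 1000Base-T

# 100 Mb/s: 0.120 ms per frame, 1 Gb/s: 0.012 ms per frame
print(f"delta per frame: {fast_eth - gig_eth:.3f} ms")  # → delta per frame: 0.108 ms
```

So on paper the per-frame difference is around a tenth of a millisecond per hop, which is why buffering or scrubbing stalls are more likely down to throughput or the ATV3's hardware than to link latency.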
EDIT: Looks like I was wrong. Appears to be ~1.5 ms
Out of curiosity I just pinged a few things on my local network to compare. Pings were sent from an iMac on 802.11ac with good signal strength, through an AirPort Extreme, to the Apple TV (10/100Base-T and 802.11ac wifi) and a NAS (1000Base-T). Ping count was set to 10 for a reasonable sample size.
AppleTV Ethernet 10/100base-t
AppleTV Wifi 802.11ac
NAS 1000base-t
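If anyone wants to repeat the comparison, here's a rough sketch that runs the same 10-packet ping and pulls the average RTT out of the summary line (the hostname in the usage comment is a placeholder; the regex matches the `min/avg/max/...` summary that both macOS and Linux ping print):

```python
import re
import subprocess

def avg_rtt_ms(ping_output: str) -> float:
    """Parse the average RTT from ping's summary line, e.g.
    'round-trip min/avg/max/stddev = 1.234/2.345/6.789/0.912 ms'."""
    m = re.search(r"= [\d.]+/([\d.]+)/", ping_output)
    if m is None:
        raise ValueError("no round-trip summary found in ping output")
    return float(m.group(1))

def ping_avg(host: str, count: int = 10) -> float:
    """Run `ping -c <count> <host>` and return the average RTT in ms."""
    out = subprocess.run(["ping", "-c", str(count), host],
                         capture_output=True, text=True, check=True).stdout
    return avg_rtt_ms(out)

# Usage (hostname is a placeholder for your own device):
# print(ping_avg("appletv.local"))
```

Note that ping only measures ICMP round trips, so it's a fair comparison between links but says nothing about the throughput side of the streaming stalls discussed above.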
Getting off topic a bit, but there isn't much of a difference between ethernet and wifi when it comes to the Apple TV. And with the higher throughput of 802.11ac, wifi might be the better choice for some, assuming you have a stable network.