I think you missed my point.
Even using the exact same settings in the exact same software. Say the studio hands the exact same file to Netflix and to Apple. Films have leader, heads, and tails (those countdown numbers you see in old movies).
If Apple or Netflix trims the intro or outro by even a couple of frames to save bits and bytes transmitted, the whole compression shifts and individual frames end up grouped differently by the encoder.
I'm not saying one is better than the other. I'm just saying your methodology (or lack thereof) would have failed my 6th grade science class. (I say that as a student, not the teacher.) You're comparing single frames from different sources? Different codec settings? A different number of frames?
Even with the same codec settings, do you know whether the files have the exact same number of frames? A different number of frames at the beginning (even black space) could cause frame 1001 of the actual movie to be compressed differently.
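Rough illustration of what I mean (my own numbers, not either service's actual encoder settings; the 48-frame keyframe interval and the function are just hypothetical): with a fixed keyframe interval, trimming a few leading frames shifts where a given movie frame sits relative to the keyframes, or even which group it lands in, so it won't be encoded the same way.

```python
GOP_SIZE = 48  # hypothetical keyframe interval (~2 s at 24 fps)

def gop_position(frame_index: int, leading_frames_trimmed: int = 0) -> tuple[int, int]:
    """Return (group_number, offset_within_group) for a source frame after trimming."""
    shifted = frame_index - leading_frames_trimmed
    return shifted // GOP_SIZE, shifted % GOP_SIZE

# The same source frame (1001) under three hypothetical deliveries:
print(gop_position(1001, leading_frames_trimmed=0))   # -> (20, 41): late in its group
print(gop_position(1001, leading_frames_trimmed=5))   # -> (20, 36): different slot in the group
print(gop_position(1001, leading_frames_trimmed=50))  # -> (19, 39): different group entirely
```

Same pixels in the source, different position in the compression structure, different result on screen.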
I don't care which is better, and it really doesn't matter; your analysis still fails either way.
Your opinion may or may not be valid, but you don't have any science or math to support your rant.