WARNING: continuing the off-topic-ness. Apologies to any offended parties in advance.
MarkCollette said:
I think we both agree that given time it might be best to put the effort in, and that in this specific case there was not a lot of time.
If you're talking about this iPhoto update (see, we're on topic!), then yes, I agree. I was more addressing the patching issue in general.
MarkCollette said:
I don't think you can write off the QA time by having a solid patching program.
Write off, no. Being less cautious is more like it, especially if you make sure to put the patcher itself through the QA wringer beforehand (so that the worst-case QA for any individual patch, as far as the patch mechanism itself is concerned, is a regression test).
MarkCollette said:
With different software versions you sometimes have non-code changes like config files changing, documentation errors being fixed, etc. that might be missed by just looking at code changes.
All of which are just bits, and handled by the binary patching. I'm not talking about code changes; I'm talking about a system that takes the set of files labeled "iPhoto 5.0.3" and the set labeled "iPhoto 5.0.4" and figures out, bit for bit, what's different. That would catch, well, everything.
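Since it's hard to argue about this in the abstract, here's roughly what I mean, as a quick Python sketch (the directory names are obviously made up): walk both release trees and classify every file as added, removed, changed, or untouched, comparing the actual bytes.

    import filecmp
    import os

    def classify_release(old_root, new_root):
        """Compare two release trees bit-for-bit and report what changed."""
        def walk(root):
            return {os.path.relpath(os.path.join(d, name), root)
                    for d, _, names in os.walk(root) for name in names}

        old_files, new_files = walk(old_root), walk(new_root)
        added   = sorted(new_files - old_files)
        removed = sorted(old_files - new_files)
        changed, same = [], []
        for rel in sorted(old_files & new_files):
            identical = filecmp.cmp(os.path.join(old_root, rel),
                                    os.path.join(new_root, rel),
                                    shallow=False)  # shallow=False forces a byte-for-byte compare
            (same if identical else changed).append(rel)
        return added, removed, changed, same

    # Hypothetical paths -- wherever the two releases happen to be unpacked.
    added, removed, changed, same = classify_release("iPhoto-5.0.3", "iPhoto-5.0.4")
    print(len(added), "added,", len(removed), "removed,", len(changed), "changed")

Config files, docs, resources -- it doesn't know or care what the bytes are, which is the whole point.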
MarkCollette said:
And even if you catch all of those, what about files that are generated by the code only on the user's machine, like preferences, caches and indexes. Those might have changed format, or become redundant.
And they won't be handled by full-file patching, anyway. In those cases, it's either a) the installer's job to upgrade or erase them, or b) the app's job to gracefully handle older formats. Either way, it's a separate issue from binary patching.
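(And by "the app's job" I mean something as mundane as checking a format-version field at launch. A sketch, with made-up key names and a made-up migration step:)

    import plistlib

    CURRENT_PREFS_FORMAT = 2   # hypothetical format version the new app writes

    def load_prefs(path):
        """The app tolerating an older prefs format instead of choking on it."""
        with open(path, "rb") as f:
            prefs = plistlib.load(f)
        if prefs.get("FormatVersion", 1) < CURRENT_PREFS_FORMAT:
            # made-up migration: pretend older versions stored a flat album list
            prefs = {"FormatVersion": CURRENT_PREFS_FORMAT,
                     "Albums": prefs.get("Albums", [])}
        return prefs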
MarkCollette said:
Plus we don't know what was causing the photo corruptions. It could have been bizarre interactions with many different subsystems. Or maybe when fixing some little issue, some coder thought "I should rename this method that's called everywhere, to have a more descriptive name".
I *hope* they don't have debugging symbols included in those builds! Even if they do, though, the name change would only show up in one place (the symbol table) and thus only add a few dozen bytes to the binary patch.
As for the many-different-subsystems scenario: as I said, if the binary diff becomes too large (approaching the size of the original file), you just fall back to the full-file method. No harm done, since that's not what the binary diff is meant to help with anyway.
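To put a number on "too large": on the build side I'm picturing something like this. A sketch that assumes a bsdiff-style tool is sitting on the build machine, and the 90% cutoff is a number I just made up:

    import os
    import subprocess

    SHRINK_CUTOFF = 0.9   # arbitrary: a patch has to beat 90% of the new file's size

    def patch_or_full(old_path, new_path, patch_path):
        """Return "patch" if the binary diff is worth shipping, "full" otherwise."""
        subprocess.run(["bsdiff", old_path, new_path, patch_path], check=True)
        if os.path.getsize(patch_path) < SHRINK_CUTOFF * os.path.getsize(new_path):
            return "patch"
        os.remove(patch_path)   # diff is nearly as big as the file; ship the file whole
        return "full"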
Maybe it would help to think of it a different way: the binary patching would basically just be a more efficient file copy than the file copy the installer/Software Update performs *already*. Everything else remains identical; you just get the file copy without having to download the original file in full. It could literally be dropped into place without touching the rest of the update utility. If it were any other way, then it *would* be a QA nightmare and not worth the risk. It's the self-contained, modular, narrow-field-of-focus approach to binary patching that makes it work so well in patching/updating/installing land and keeps the QA and the risks to a minimum.
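The apply side shows why it's so self-contained. Roughly (another sketch, assuming a matching bspatch-style tool and a checksum shipped alongside each patch): rebuild the new file from the copy already on disk, verify it's bit-for-bit what a full download would have produced, and only then swap it into place. If it doesn't verify, throw the result away and fall back to the full download.

    import hashlib
    import os
    import subprocess

    def apply_patch(installed_path, patch_path, expected_sha1):
        """Rebuild the new file locally; swap it in only if it verifies."""
        rebuilt = installed_path + ".patched"
        subprocess.run(["bspatch", installed_path, rebuilt, patch_path], check=True)
        with open(rebuilt, "rb") as f:
            if hashlib.sha1(f.read()).hexdigest() != expected_sha1:
                os.remove(rebuilt)
                raise ValueError("patched file didn't verify; fall back to full download")
        os.replace(rebuilt, installed_path)   # same end state as copying in the full file

Everything downstream of that final swap is exactly the update process that exists today.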
Just to keep this back on track wrt. iPhoto: in this case, yes, getting the patch out the door to fix the corruption issues was paramount. With a binary patching method already in place, they could've released the full version to Software Update first and generated the patches afterwards. The early adopters who *had* to have the update immediately would suffer through the full download, but they'd get it ASAP, which is all they care about. Those who waited would get the smaller binary-diff patch (once it was done generating), which would make them happy, since download time is a (relatively) bigger issue for them.
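Client-side, the "early birds get the full download, everyone later gets the delta" decision is about this complicated (the catalog format and field names here are entirely made up):

    def choose_download(catalog, installed_version):
        """Prefer a delta from our installed version, if the server has built one yet."""
        for delta in catalog.get("deltas", []):
            if delta["from"] == installed_version:
                return delta["url"]          # small download, generated after release day
        return catalog["full"]["url"]        # day-one users just take the whole thing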
...
ah, to hell with it. Just gimme access to the Software Update code and I'll show you. Easier to just demonstrate it and be done with it.
--mcn