This isn't a question I actually need answered, in that I'm not asking for help, but it's not rhetorical either; I am, in all seriousness, curious as a geek about what these installers are doing. CS3's updaters were known for being finicky and slow, sure, but I didn't realize exactly how bad it was until I needed to reinstall CS3 on a new iMac and run all the auto-updaters.
Most of the older ones were similar, but as an example, the Flash Light updater is a ~50MB update to one component of a ~650MB software package.
Now, this installer ran for nearly ten minutes, sucking up about 50% of a CPU on an i7 iMac, and writing ~6MB/s of data the entire time, steadily. Not a thing running besides it and Activity Monitor. It wasn't reading much data--a few KB/s at most, with occasional bursts of a few MB--but the writes were constant for the entire period.
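To put rough numbers on what Activity Monitor showed (these are eyeballed readings, so treat the totals as ballpark figures):

```python
# Back-of-the-envelope arithmetic on the observed I/O.
# All inputs are rough Activity Monitor readings, not precise measurements.
run_time_s = 10 * 60          # ran for nearly ten minutes; call it ten
write_rate_mb_s = 6           # steady ~6 MB/s of writes the whole time
package_mb = 50               # size of the downloaded update
installed_mb = 650            # size of the entire installed product

total_written_mb = run_time_s * write_rate_mb_s
print(f"total written: ~{total_written_mb / 1024:.1f} GB")                       # ~3.5 GB (roughly 3 GB for "nearly" ten minutes)
print(f"vs. the update package:      ~{total_written_mb / package_mb:.0f}x")     # ~70x
print(f"vs. the whole installed app: ~{total_written_mb / installed_mb:.1f}x")   # ~5x
```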
I am 100% serious in wondering how you manage to write an installer that writes roughly 3GB of data to disk from a 50MB package, when the entire thing it's updating is 1/5 that size. Seriously--what is it doing? Is it repeatedly re-writing the same files after changing a few bytes? Is it writing and deleting temporary files over and over again?
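Purely for illustration--this is a made-up sketch of the "re-write the whole file for every small change" theory, not anything recovered from the actual updater, and the file size and hunk count are invented--here's how that pattern would rack up gigabytes of writes from kilobytes of changes:

```python
import os
import tempfile

# Hypothetical illustration only: a patcher that rewrites the entire target
# file once for every small hunk it applies.  Speculation, not Adobe's code.
def naive_patch(path, hunks):
    """Apply each (offset, new_bytes) hunk by rewriting the whole file."""
    bytes_written = 0
    for offset, new_bytes in hunks:
        with open(path, "rb") as f:
            data = bytearray(f.read())
        data[offset:offset + len(new_bytes)] = new_bytes
        with open(path, "wb") as f:        # full rewrite just to change a few bytes
            f.write(data)
        bytes_written += len(data)
    return bytes_written

# A 100 MB component patched in 30 places => ~3 GB written for ~240 bytes of changes.
# (Running this really does write ~3 GB of temporary data, so mind your disk.)
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(os.urandom(100 * 1024 * 1024))
hunks = [(i * 1024, b"patched!") for i in range(30)]
written = naive_patch(tmp.name, hunks)
print(f"wrote {written / 1024**3:.1f} GB to change {len(hunks) * 8} bytes")
os.unlink(tmp.name)
```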
This is different from one of the PageMaker updates of an older vintage, which glacially re-wrote thousands of help files at a rate of about one few-KB file every 5 seconds--several times I assumed it had simply frozen, until I finally figured out it was just horrifically inefficient and took half an hour to install a few MB of files. That, at least, made sense--it was merely slow.
Has anybody done any forensic analysis on one of these abominations, or is anyone even able to hazard a guess?
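For what it's worth, if anyone wants to dig in, the obvious starting point is just watching the updater's write counters over time. Something like the psutil loop below is a sketch of that--note that psutil doesn't expose per-process I/O counters on OS X, so there you'd fall back to fs_usage or dtrace instead, and "Adobe Updater" is a placeholder process name, not necessarily the real one:

```python
import time
import psutil

# Sketch: log an updater's cumulative write bytes every few seconds.
# Assumes a platform where psutil exposes per-process I/O counters
# (Linux/Windows); on OS X use fs_usage or dtrace instead.
TARGET = "Adobe Updater"   # placeholder name, adjust to the real process

def find_target():
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == TARGET:
            return proc
    return None

proc = find_target()
last = 0
while proc is not None and proc.is_running():
    try:
        io = proc.io_counters()
    except psutil.NoSuchProcess:
        break
    delta = io.write_bytes - last
    print(f"{time.strftime('%H:%M:%S')}  total {io.write_bytes / 1e6:8.1f} MB"
          f"  (+{delta / 1e6:.1f} MB in the last 5 s)")
    last = io.write_bytes
    time.sleep(5)
```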