I read the update, and the fact remains: Google intentionally blocked the local content workaround, at least for the moment.
You don't seem to know what the words "fact" or "intentionally" mean.
If I released an unsupported iOS app that relied on an unlisted testing feature in OS X Mavericks, and that feature was removed or rewritten during the developer preview, you wouldn't be dogging Apple for anything. You'd be saying, "Well, what did they expect? It wasn't even an approved feature, and it wasn't made for that purpose!"
This is the same thing, only the developer reverse-engineered Google's proprietary code to enable the workaround. That's not a reliable solution for anything. It's possible that all it took to "break" his code was recompiling the source a slightly different way, standardizing the names of a couple of functions, or merging a couple of redundant code libraries.
We have no idea, but it is transitional, early code, and they said as much right from the start. Google isn't known for malicious coding. It isn't worth the investment of developer time, and it isn't worth the potential backlash. This is a company whose major products are all free.
----------
Chromecast isn't able to receive streamed data directly from any device, so AirPlay would be hard.
Chromecast is actually running a simplified version of Chrome OS, i.e. a Google Chrome browser on a chip that you give commands to.
For example, to play a YouTube clip, the app actually tells the Chromecast, "Visit YouTube and play the clip at this address." No video is sent from your device! That's why it can do Netflix, YouTube, or mirror a Google Chrome tab (it simply visits the page for you), and why AirPlay would be hard and would probably suffer in quality, since it'd need to stream AirPlay through some proxy website. It can easily show anything that Chrome can show, but everything else is hard or impossible. It'll probably never be able to mirror an iPad display.
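The "just tell it a URL" model above can be sketched as a tiny command message. The field names below loosely follow Google's publicly documented Cast media channel (`urn:x-cast:com.google.cast.media`), but take this as an illustrative assumption about the shape of the message, not the exact wire format:

```python
import json

def build_load_command(content_url, content_type="video/mp4"):
    """Sketch of a LOAD-style message a sender app might hand to a Chromecast.

    The point is that the payload is just a pointer: the device is told
    which URL to fetch itself, and no video data travels from the phone.
    Field names are assumptions modeled on the Cast media namespace.
    """
    return {
        "type": "LOAD",                   # ask the receiver to load media
        "media": {
            "contentId": content_url,     # the device fetches this URL directly
            "contentType": content_type,
            "streamType": "BUFFERED",
        },
    }

cmd = build_load_command("https://example.com/clip.mp4")
print(json.dumps(cmd, indent=2))
```

Note how small the command is: the app on your phone sends only this pointer, and the Chromecast pulls the actual stream over its own network connection.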
Chromecast has unofficial PC desktop mirroring support, but I assume this also uses some proxy method: your desktop (grabbed by the local Google Chrome) is sent somewhere on the web, and the Chromecast retrieves it from there. If that method doesn't support audio, it would explain why this experimental feature doesn't mirror audio along with the screen.
Not quite right there. When you cast a Chrome tab, the Chromecast doesn't go and pull that content from the internet directly. That wouldn't be possible on things like websites that require authentication. Your local device (a laptop, for example) is casting directly, sending the audio and video to the Chromecast itself.
I can cast my work email, which won't even let me log in to two browser sessions at once using two different browsers or two windows of the same browser. It's legitimately like screen sharing, and there is experimental support for full screen sharing as well. None of that goes out to the internet and comes back to the Chromecast. It stays on the local network.