I've grown so weary of the never-ending stream of "...possibly gain root access...execute arbitrary code... FIX: improve bounds checking" security issues/resolutions we see documented in security bulletins, changelogs, etc.
I've been in software development for 40+ years. Looking back to the earlier days [puts on dinosaur mask], I used systems such as the HP3000 and Burroughs that had hardware stack architectures which enforced separation of code and data. Executing "arbitrary code" was virtually impossible. And, by default, the HLL compilers automatically injected bounds-checking code everywhere there was an array/buffer access. The hardware facilitated optimized execution of those bounds checks. Back then there was some discernible execution slowdown caused by the bounds checking, but nobody questioned the need for it. Nowadays, with modern hardware, the impact of doing so would be absolutely negligible. And yet, for the most part, it isn't being done.
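To make the point concrete, here is a minimal sketch in C of what a compiler-injected bounds check amounts to. The checked_get helper and its abort-on-failure behavior are my own illustration for this post, not what any particular compiler actually emits:

```c
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical illustration: roughly what an automatic bounds check
 * amounts to on every array read. The helper name is invented here. */
static int checked_get(const int *buf, size_t len, size_t index)
{
    if (index >= len) {                 /* the bounds check itself */
        fprintf(stderr, "index %zu out of range (len %zu)\n", index, len);
        abort();                        /* fail fast instead of reading past the buffer */
    }
    return buf[index];
}

int main(void)
{
    int data[4] = {10, 20, 30, 40};
    printf("%d\n", checked_get(data, 4, 2));   /* OK: prints 30 */
    printf("%d\n", checked_get(data, 4, 9));   /* trapped: aborts, no out-of-bounds read */
    return 0;
}
```

On modern hardware, a comparison and a predictable branch per access is close to free, which is the crux of the complaint: the cost argument against doing this everywhere no longer holds.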
Furthermore, many of the elements of the modern OS and the apps running thereon revolve around interpretive operations such as parsing URL strings and acting upon them. That has to be done carefully and methodically, with robust prerequisite checks to prevent unintended and/or risky operations. But typically those interpretive operations are done hastily and sloppily. I have had to analyze regular-expression patterns found in code that were so long and complex that it was tough to maintain my sanity.
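As a rough illustration of those prerequisite checks, here is a hypothetical C sketch that validates only the parts of a URL the caller actually depends on and rejects everything else. The url_looks_safe function and its accepted character set are assumptions made for the example, not a complete or canonical URL parser:

```c
#include <ctype.h>
#include <stdbool.h>
#include <stdio.h>
#include <string.h>

/* Sketch of explicit prerequisite checks: verify the scheme and host
 * you actually rely on, reject anything unexpected, and skip the
 * sprawling one-line regular expression entirely. */
static bool url_looks_safe(const char *url)
{
    const char *scheme = "https://";
    size_t slen = strlen(scheme);

    if (strncmp(url, scheme, slen) != 0)     /* only accept the expected scheme */
        return false;

    const char *host = url + slen;
    if (*host == '\0')                       /* a scheme with no host is useless */
        return false;

    for (const char *p = host; *p && *p != '/'; p++) {
        /* restrict the host to a conservative character set */
        if (!isalnum((unsigned char)*p) && *p != '.' && *p != '-')
            return false;
    }
    return true;
}

int main(void)
{
    printf("%d\n", url_looks_safe("https://example.com/path"));  /* 1: accepted */
    printf("%d\n", url_looks_safe("javascript:alert(1)"));       /* 0: rejected */
    return 0;
}
```

The specific checks aren't the point; the point is that each precondition is small, explicit, and easy to audit, which a 300-character regular expression rarely is.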
Back in the day, we learned a principle called KISS (keep it simple, stupid), which I'm sure is still taught in school. But with today's aggressive software development schedules, overly ambitious feature sets, and usually poorly managed tech workforce, that principle has been largely forgotten. The reality is that complex problems can usually be solved elegantly with technically straightforward, durable, and maintainable approaches. Devising such an elegant approach requires solid up-front analysis and design by senior-level staff members and will consume (as it should) a substantial portion of the total time required to reach the deployment finish line.

With agile development methodologies, I've found that the overall high-level design never gets enough attention and is seldom refactored. So suboptimal, overcomplicated, and inherently deficient approaches remain in place until an inevitable rewrite, while a seemingly endless parade of tweaks and kludges is needed to eventually produce a functional work product. Regardless of the QA processes involved, that "functional" work product will remain insufficiently robust, overcomplicated, and painful to maintain, having been irreversibly hobbled by the poor design.

And when additional staff is onboarded onto that kind of project (during initial development or maintenance), it takes substantially more time to get them productive, fully embracing the design such as it is, and comprehending the cumulative effect of the parade of tweaks and kludges. While top-tier developers should be able to embrace a solid, elegant design quickly and hit the ground running within a couple of weeks, those same developers can struggle for months on projects that started with a weak design and suffered a long history of agile remediation. To make matters worse, the fringes of projects are often worked on by lower-tier developers who will struggle even more with everything that isn't intuitive (and they shouldn't be blamed for that). The confluence of these factors inevitably results in a jumble of code that only gets more jumbled over time, and that holds true even if the end result visible to the user seems to function properly -- perhaps with some set of bugs fixed -- and looks nice.
Even having come from the dinosaur era, I think C and C++ are rather primitive and arcane. They are fine for low-level OS code and utilities, but there needs to be something far better. I don't know what that something is. And for those with Perl, PHP, Ruby, JavaScript, etc. on their tool belts, I must add that not everything is a web page. And for those wearing Java on their tool belt, well, Java isn't the answer, either.
Compared to the primitive application software and operating systems of days past, modern offerings are much more interactive, well connected, and beautiful. But in many ways, the foundations are much weaker now than in the past. Furthermore, mid- and upper-level development managers are less competent now than in the distant past, at least based on what I have experienced.
I will end my rant now so I don't cause a buffer overflow and accidentally execute arbitrary code on a MacRumors web server.