I’m sure you’ve heard the old maxim, “Those who can’t do, teach.” I think the fact that they’re former devs is telling.
As someone with over 25 years of experience developing enterprise-grade software, I can tell you they’re full of hooey. If that’s truly what they’re trying to communicate, they’re either being deceptive (unlikely), they really don’t have a proper understanding of what it means to work on large projects over years (quite likely), or their propensity for rewriting everything is what got them pushed out of the tech industry and they don’t realize it was a contributing factor (also a possibility).
I also think there’s a perspective shift that needs to happen around the idea of what a ‘major’ project is. It’s unlikely anyone in university will be involved with, or produce, a product that would be considered major in the real world (barring incidental exposure during internships). I’m not trivializing the work university students do, but nothing done at school approaches the scope of a project that requires dozens, scores, or even hundreds or thousands of developers working in concert for years to build a single product. And iterating releases on that product almost never involves total rewrites. Too many lessons would have to be re-learned, too many ‘gotchas’ would have to be rediscovered and solved all over again, far too much code would be rewritten nearly identical to how it was before, and it would take far, far more development effort than massaging what you already have and smoothing out the freshly introduced destabilizing bits.
It would be helpful for people to let go of the idea that a company is being silly for not embarking on a ground-up rewrite of something like a major OS. It’s virtually never done, and for many good reasons. If Apple were to do it, it would take several years at least, and would still require a large number of iterations to ‘get right’. Given that, and the fact that they certainly don’t need to, I don’t see it happening for a long, long time, if ever.