I've been a developer since the mid-1980s, so I have a lot of experience with this kind of thing.
Look at something as simple as the Y2K problem, where programmers used 2 digits for the year, so '00' could mean either 1900 or 2000. Adding just those 2 extra digits took 6+ months of development work for 3-4 programmers. I experienced it, lived it, and helped many companies out with their issues. Of course, the madness the media portrayed of 'planes falling out of the sky' was ridiculous, but there would have been massive failures without our work.
Research any articles about it and you'll realize that adding 2 digits and storing dates as 01-01-2000 instead of 01-01-00 was not the only thing that had to be done.
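To make the ambiguity concrete, here is a minimal, hypothetical C sketch (the struct, the field names, and the 00-49/50-99 "windowing" cutoff are illustrative assumptions, not anyone's actual remediation code): with a two-digit year field, a date in 2000 compares as *earlier* than a date in 1999, and expanding to a full four-digit year fixes the comparison.

```c
#include <stdio.h>

/* Hypothetical legacy record: the year was stored as two digits to save
 * space, so 00 is ambiguous (1900 or 2000?). */
struct legacy_date {
    int day;
    int month;
    int yy;   /* two-digit year: 99 means 1999, 00 means 1900 *or* 2000 */
};

/* Buggy comparison: 00 sorts before 99, so a date in 2000 looks
 * "earlier" than a date in 1999. */
int is_later_buggy(struct legacy_date a, struct legacy_date b) {
    return a.yy > b.yy;
}

/* One common remediation ("windowing"): interpret 00-49 as 2000-2049 and
 * 50-99 as 1950-1999, then compare full four-digit years. */
int full_year(int yy) {
    return (yy < 50) ? 2000 + yy : 1900 + yy;
}

int is_later_fixed(struct legacy_date a, struct legacy_date b) {
    return full_year(a.yy) > full_year(b.yy);
}

int main(void) {
    struct legacy_date renewal = {1, 1, 0};   /* 01-01-2000 stored as 00 */
    struct legacy_date signup  = {1, 1, 99};  /* 01-01-1999 */

    printf("buggy: renewal later than signup? %d\n",
           is_later_buggy(renewal, signup));  /* prints 0 - wrong   */
    printf("fixed: renewal later than signup? %d\n",
           is_later_fixed(renewal, signup));  /* prints 1 - correct */
    return 0;
}
```

Windowing and full date expansion were both common remediation approaches; either way, every comparison, file format, and report that touched the two-digit field had to be found and changed.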
It's possible there might also be a 16-bit issue (the highest number a signed 16-bit value can store is 32,767). If something in the code is stored as a 16-bit number, every reference to it must be found and changed to a 32-bit number to fix this. And all those references must be found first. This is just the tip of the iceberg.
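As a rough illustration of that kind of overflow (a hypothetical C sketch, not taken from any real codebase): a signed 16-bit counter wraps once it passes 32,767, while a 32-bit one keeps counting, but only if every place that stores, passes, or serializes the value is widened along with it.

```c
#include <stdio.h>
#include <stdint.h>
#include <inttypes.h>

int main(void) {
    /* A signed 16-bit counter tops out at 32,767. */
    int16_t small_id = 32767;
    small_id = (int16_t)(small_id + 1);  /* typically wraps to -32768
                                            (implementation-defined
                                            conversion back to int16_t) */
    printf("16-bit id after increment: %d\n", small_id);

    /* Widening the type removes the limit, but the fix only holds if
     * every struct field, function parameter, and on-disk format that
     * carries the value is widened too. */
    int32_t wide_id = 32767;
    wide_id = wide_id + 1;               /* 32768, as expected */
    printf("32-bit id after increment: %" PRId32 "\n", wide_id);
    return 0;
}
```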
I am not sure what is more disingenuous: assuming that server software which went to production 3 years ago was written with 16-bit values, or comparing THAT to a cross-industry problem affecting software written across many platforms, OSes, and languages over 30 years or more.