This kind of logic assumes processors will continue to advance at the same rate they did when Moore's law held true. I don't see that happening for a few reasons.
1) Physics: this one is self-explanatory. One need only read up on the challenges of smaller and smaller processor architectures to see that, unless some new model of computing is invented, processor gains are about to hit a brick wall.
2) Economics: this is a bit trickier to explain. The rapid growth of technology in the past 30-40 years has been funded largely by consumer-level purchases. But the average consumer doesn't even need an advanced processor now, since all they really need is something to watch Netflix on, browse the Internet, and add filters to their pictures. The processors in phones are all they need, and all they will ever need. With the average consumer taken out of the upgrade loop, expect sales of computing devices across the board to take a nose-dive. Funding for research will decrease. This, in combination with the physical limitations above, will mean a much slower upgrade cycle than we're used to.
I don't expect the jump from the 2016 iPhone to the 2026 model to be as large as the jump from the 2013 model to the 2016 one. For consumers this will be fine, but professionals and advanced users will need machines capable of handling more intensive tasks. Those devices will have to come in larger form factors and have software to match, so I expect desktop/laptop models to persist.