In my opinion, you have to look at two different cycles:
a) The hardware cycle
b) The software cycle
Both of them feed back into each other. If you have slow hardware, it does not make sense to develop slow and bloated software, because no one will be able to use it.
But if your software is limited, new hardware looks lacking and uninteresting.
We are just at a stage where speed and, first and foremost, power efficiency make great devices possible.
Now the software has to follow suit.
You might think that we have already been developing software for powerful desktops in the past, so software should not be a problem.
But indeed it is. Software needs to follow new ways of interaction.
Sure, you still want powerful hardware to process your movies, scan through your photo or mail archive, and so on, but most of this can be done remotely.
Nobody ever thought of storing even 1 TB of photos somewhere remote, because their internet connectivity would never allow them to access that data seamlessly.
And this is why software today and over the next 10 years will mean something different than your 3-CD Microsoft Office installation package.
I really believe this is why most people think current companies have reached a limit.
They have the hardware, they mostly have the connectivity, but there are too few attractive options for the end user.
The main reason is that internet providers price bandwidth like gold, meaning that in most places you cannot stream Netflix, YouTube, Amazon, or any other service without some bandwidth-related, glitchy experience.
It's really exciting when it all works (software, hardware, connectivity), but it fails even on your local WiFi, which leaves people disappointed.
Just my 2 cents.
There will be so many connected small devices without any of this BS that currently leads to "My browser is so slow".
Learn networks, learn real programming languages, learn about protocols and data processing.
You'll be fine without the "Web".