Umm, posters are mixing bandwidth (rate) with monthly totals (amount).
A laptop has traditionally consumed more bandwidth while in use, because its faster processor and desktop-class software can pull data down at a much higher rate. Supporting that high transfer rate costs the carrier far more than a trickle of data that merely adds up over a long time.
E.g. a single laptop user is downloading files at 3 Mb/sec. He's using a huge portion (1/3) of the 9 Mb/sec of bandwidth (rate) assigned to an older tower. If he stops after about an hour, that's roughly 1.3 GB of total transfer (3 Mb/sec ≈ 0.375 MB/sec, times 3600 seconds).
OTOH, let's say there are 200 instant-messaging users on the same tower, each sending and receiving 50-char messages twice a minute, 24 hours a day. Together all 200 people are using almost no tower bandwidth because of their trickle rates, but they still end up with nearly 2 GB of total transfer a month.
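If you want to sanity-check those numbers, here's a quick back-of-the-envelope sketch in Python. The assumptions are mine, not a carrier's: 1 byte per character, a 30-day month, and no protocol overhead (overhead is what would push the IM total the rest of the way toward 2 GB):

```python
# Back-of-the-envelope check of the two scenarios above.
# My assumptions: 1 char = 1 byte, a 30-day month, no protocol overhead.

MBIT = 1_000_000  # bits

# Laptop user: high rate, short duration.
laptop_rate_bps = 3 * MBIT        # downloading at 3 Mb/sec
tower_bps = 9 * MBIT              # older tower's total capacity
hour = 3600                       # stops after ~1 hour

laptop_share = laptop_rate_bps / tower_bps            # 1/3 of the tower
laptop_gb = laptop_rate_bps * hour / 8 / 1e9          # ~1.35 GB moved

# 200 IM users: trickle rate, running all month.
users = 200
bytes_per_min = 50 * 2 * 2        # 50-char msgs, twice a minute, sent AND received
im_bps = users * bytes_per_min * 8 / 60               # ~5.3 kbit/s for the whole group
im_share = im_bps / tower_bps                         # ~0.06% of the tower
im_gb = users * bytes_per_min * 60 * 24 * 30 / 1e9    # ~1.7 GB per month

print(f"laptop: {laptop_share:.0%} of tower, {laptop_gb:.2f} GB in an hour")
print(f"IM:     {im_share:.2%} of tower, {im_gb:.2f} GB in a month")
```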
So how do you bill these users fairly? Who requires the most network resources? Clearly the higher-bandwidth users cost many times more in infrastructure than the lower-bandwidth users do, even if the latter rack up bigger totals over a month.
However, carriers don't monitor each user's moment-to-moment rate; they just add up transfer totals. So they have to assume that tethered laptops will be using the more expensive high-rate bandwidth.
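Just to make the mismatch concrete, here's a toy sketch. The rate-based formula below is something I invented purely for illustration; no carrier publishes anything like it:

```python
# Toy comparison: billing on monthly totals (what carriers actually meter)
# vs. a hypothetical charge for the slice of tower capacity you occupy.
# All prices and weights here are invented for illustration.

def bill_by_total(total_gb, price_per_gb=10.0):
    """Total-based billing: you pay per byte moved in a month."""
    return total_gb * price_per_gb

def bill_by_peak_rate(peak_mbps, tower_mbps=9.0, base=5.0, capacity_fee=60.0):
    """Hypothetical rate-based billing: you pay for the fraction
    of the tower's capacity you claim at peak."""
    return base + capacity_fee * (peak_mbps / tower_mbps)

# Laptop user from the example: ~1.35 GB moved, 3 Mb/s peak rate.
print(bill_by_total(1.35), bill_by_peak_rate(3.0))    # 13.5 vs 25.0
# One IM user: ~0.009 GB moved, peak rate effectively zero.
print(bill_by_total(0.009), bill_by_peak_rate(0.0))   # 0.09 vs 5.0
```

Billed on totals, the 200 IM users collectively pay more than the laptop user even though they tie up only a tiny fraction of the capacity he does; a rate-based scheme would flip that, charging him for the third of the tower he actually occupies.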
An added wrinkle these days is that smartphones are getting more and more powerful, and they're using more bandwidth themselves for video etc.
Personally, I'd hate to be in charge of figuring out contract plans.