UVA (where I work) has a disproportionate number of spoiled trust fund country club preppy frat kids so their numbers might not be the best way of measuring the U.S. college scene.
Of course not: it's just one regional datapoint, and so is Princeton University,
whose freshmen blew past 60% Mac adoption back in 2007, putting them well ahead of the National Average.
At the other end of the college pool, we'll find two-year community colleges.
I've not seen any studies - and I intend no malice - but IMO it's safe to say that the student demographics there don't skew as affluent. Statistically, they're simply below the National Average ... to which advocates will respond that they're
'dragging down the average'. Yes, they are, but by that language choice we also have to say that places like Princeton are
'dragging it up'.
What is obvious to any good Statistician is that the composition of the types of colleges surveyed - as well as how they may have been weighted - can make a very significant difference in what gets reported.
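To make the weighting point concrete, here's a quick sketch with entirely made-up adoption rates and survey mixes (every number below is hypothetical, for illustration only):

```python
# Hypothetical adoption rates for two school types (invented numbers).
rates = {"elite_private": 0.60, "community_college": 0.20}

# Survey A oversamples elite schools; Survey B leans the other way.
# Same underlying schools, very different "national" headline.
survey_a = 0.7 * rates["elite_private"] + 0.3 * rates["community_college"]
survey_b = 0.3 * rates["elite_private"] + 0.7 * rates["community_college"]

print(f"Survey A reports: {survey_a:.0%}")  # 48%
print(f"Survey B reports: {survey_b:.0%}")  # 32%
```

Same two datapoints, and a 16-point swing in the reported figure, purely from how the sample was composed.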
Getting back to UVA's information: I used it merely because MR made the annual adoption data conveniently available for illustrating the point I was making.
To reiterate, that point was that it is Invalid Statistics to compare Apples to Oranges.
Here, one fruit was the contribution of just the new incoming freshmen (a single datapoint) and the other fruit was the average across all enrolled students (a four-point moving average), where we know there is a gradient across the dataset.
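A quick sketch shows the mismatch (all cohort rates below are invented for illustration): when adoption rises with each entering class, the freshman figure will always lead the all-student average, so the two numbers simply aren't comparable.

```python
# Hypothetical per-cohort Mac adoption rates at one school.
# The gradient: each entering class adopts at a higher rate.
cohorts = {
    "freshmen":   0.60,  # one fruit: this year's entering class alone
    "sophomores": 0.50,
    "juniors":    0.40,
    "seniors":    0.30,
}

freshman_rate = cohorts["freshmen"]                   # single datapoint
all_students = sum(cohorts.values()) / len(cohorts)   # four-point average

print(f"Freshman adoption:   {freshman_rate:.0%}")  # 60%
print(f"All-student average: {all_students:.0%}")   # 45%
```

A 60% freshman number and a 45% campus-wide number describe the very same school in the same year; quoting one against the other compares apples to oranges.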
Those who see & understand what I'm talking about ... are probably thanking their College Statistics teacher and wondering how Elmer-DeWitt could have gotten such a basic thing so wrong.
-hh