Others have mentioned that you should be downsampling, not upsampling, to conduct this test, and they are correct. That aside, one person does not a sample make; someone has to get lucky occasionally...
This is a separate issue that has nothing to do with bit depth / sample rate. Yes, there is a tendency for 24/96 re-releases of recordings to be produced from superior masterings than the original releases (although ironically HDTracks has been known to use some terrible masters for their releases, see Red Hot Chili Peppers...), but that is no argument for higher bit depths / sampling rates per se. For that very reason I often download 24/96 releases and downsample them to 16/44.1 myself.
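If anyone is curious how that downsampling is done, here is a rough sketch in Python, assuming you have soundfile and scipy installed; the filenames are hypothetical, and a dedicated resampler like SoX will do a more careful job, but this shows the idea (polyphase resample to 44.1 kHz, then TPDF dither before truncating to 16 bits):

```python
# Minimal sketch: 24/96 FLAC -> 16/44.1 FLAC (filenames are hypothetical).
# A dedicated tool like SoX does this better; this just shows the steps.
import numpy as np
import soundfile as sf
from scipy.signal import resample_poly

data, rate = sf.read("hires.flac")          # float samples in [-1, 1]
assert rate == 96000

# 96000 -> 44100 is a ratio of 147/320; resample_poly applies the anti-alias filter.
down = resample_poly(data, 147, 320, axis=0)

# TPDF dither at ~1 LSB of 16-bit before quantizing, so low-level detail
# decorrelates from the quantization error instead of being truncated harshly.
lsb = 1.0 / 32768.0
dither = (np.random.uniform(-0.5, 0.5, down.shape) +
          np.random.uniform(-0.5, 0.5, down.shape)) * lsb
out = np.clip(down + dither, -1.0, 1.0)

sf.write("redbook.flac", out, 44100, subtype="PCM_16")
```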
This is just plain silly. HDMI carries a digital signal. Digital signals are either transmitted in full or they are not; there is no in between. It is true that longer distances require higher-spec cables to carry a signal, but determining whether you need such a cable is a triviality: if the cable isn't up to spec for the distance in question, it simply won't work; if it is, it will. There is absolutely nothing to be gained by investing a single dollar more than the minimum required to get a cable that produces a signal.
There is a lot of subjectivity in audiophile land, but this is one area where there simply is not (better still, this can be demonstrated with math; debunking other areas like amps, analogue cables and DACs requires ABX testing). Anyone who tells you otherwise is selling you snake oil.
Indeed, speakers/headphones may be the only area where there is even any subjectivity at all.
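If you want the math behind the bit-depth point above, the standard figure for an ideal N-bit PCM quantizer driven by a full-scale sine is SNR ≈ 6.02·N + 1.76 dB. A quick sanity check in Python (numbers only, not specific to any product):

```python
# Theoretical SNR of an ideal N-bit PCM quantizer with a full-scale sine:
# SNR = 6.02*N + 1.76 dB.
def pcm_snr_db(bits: int) -> float:
    return 6.02 * bits + 1.76

for bits in (16, 24):
    print(f"{bits}-bit: ~{pcm_snr_db(bits):.0f} dB theoretical SNR")
# 16-bit: ~98 dB, 24-bit: ~146 dB. A quiet listening room sits around 30 dB SPL
# and peak playback rarely exceeds ~110 dB SPL, so even 16-bit's ~98 dB span
# already covers the usable range at the listening seat.
```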
HDTracks isn't doing the mastering. The record labels send the masters to various studios that do the actual mastering and conversion; HDTracks only sells what the record labels give them.
Yes, in some cases all they are getting is a 24/44.1 version, and there is not much difference from the original 16-bit release. Some of their content is actually still in 16-bit form. What I see with HDTracks in SOME cases is that the record label has had the original analog tapes reconverted to 24/96 or higher, and they simply back off a lot of the compression they used on the Redbook CD version, and they sometimes do other things in addition. What exactly, I don't know, but I've compared the 16-bit Redbook to the 24/96+ version and there is a HUGE difference.
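One way to make that kind of comparison less subjective is to look at the dynamics of the two files. A rough sketch, assuming the two releases are saved as hypothetical files redbook.flac and hires.flac; crest factor isn't the whole story, but a heavily limited master shows up clearly:

```python
# Compare crest factor (peak-to-RMS ratio, in dB) of two masters.
# A heavily compressed/limited master will show a noticeably lower value.
import numpy as np
import soundfile as sf

def crest_factor_db(path: str) -> float:
    x, _ = sf.read(path)
    x = x.mean(axis=1) if x.ndim > 1 else x   # fold to mono for a quick look
    peak = np.max(np.abs(x))
    rms = np.sqrt(np.mean(x ** 2))
    return 20 * np.log10(peak / rms)

for name in ("redbook.flac", "hires.flac"):   # hypothetical filenames
    print(name, f"{crest_factor_db(name):.1f} dB crest factor")
```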
Unfortunately, there is no set standard for what's being done. The first question is how the recording was originally made, analog or digital. Then there is the matter of whether they upsample or downsample, whether they use less or more compression, etc.
Some of their recordings are simply converted from DSD (SACD masters) to PCM; that's a fairly simple process, but I don't know how many of the pre-existing albums were archived in DSD to begin with.
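At its core that "fairly simple process" is low-pass filtering and decimating the 1-bit stream. A toy sketch, assuming you have already unpacked the DSD64 bits into a ±1 float array (real .dsf/.dff parsing and the noise-shaping-aware filters a commercial converter would use are left out):

```python
# Toy DSD64 -> 44.1 kHz PCM conversion: the 1-bit 2.8224 MHz stream carries the
# audio at baseband with quantization noise shaped up high, so low-pass
# filtering + decimation (64x, done here in two stages of 8) recovers PCM.
import numpy as np
from scipy.signal import decimate

def dsd64_to_pcm(dsd_bits: np.ndarray) -> np.ndarray:
    """dsd_bits: 1-D array of +1.0 / -1.0 samples at 2,822,400 Hz (assumed)."""
    stage1 = decimate(dsd_bits, 8, ftype="fir")   # 2.8224 MHz -> 352.8 kHz
    stage2 = decimate(stage1, 8, ftype="fir")     # 352.8 kHz  -> 44.1 kHz
    return stage2
```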
Some of the newer digital recordings were originally made at 24/96 or higher, and it makes a lot of sense to offer those recordings both as 16-bit MP3/AAC for the mobile crowd AND as 24-bit versions that are left alone for the home audio enthusiast crowd.
As far as what some of these non-audiophiles say, sometimes they have a valid point and sometimes they don't. The problem in the audio world is the lack of set standards for sound-quality measurements that are valid, repeatable, and used throughout the audio industry.
A lot of the measurements these companies make don't really analyze quality of sound. Looking at a 1 kHz sine wave doesn't tell you much, because music is NOT a constant single-frequency sine wave. Sine waves only exist on a test bench, not in music. With music you have rise time, sustain, decay, harmonic structure, and so on.
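To put that in concrete terms, here is a small sketch contrasting a steady 1 kHz test tone with a music-like transient (a decaying burst). Same nominal frequency, but the burst has an attack, a decay, and energy spread across many frequency bins, which is exactly what a single steady tone never exercises:

```python
# A steady 1 kHz sine vs. a decaying 1 kHz burst (a crude stand-in for a
# plucked note): same nominal frequency, very different envelope and spectrum.
import numpy as np

fs = 44100
t = np.arange(fs) / fs                      # one second of samples
sine = np.sin(2 * np.pi * 1000 * t)
burst = np.exp(-t / 0.05) * np.sin(2 * np.pi * 1000 * t)   # ~50 ms decay

for name, x in (("sine", sine), ("burst", burst)):
    crest = 20 * np.log10(np.max(np.abs(x)) / np.sqrt(np.mean(x ** 2)))
    spectrum = np.abs(np.fft.rfft(x))
    bins = np.sum(spectrum > spectrum.max() / 100)   # bins within 40 dB of peak
    print(f"{name}: crest {crest:.1f} dB, {bins} significant frequency bins")
```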
Now, in terms of HDMI, most people get hung up on just the video portion. With 4K video you have different bit depths and frame rates; the higher bit depths and frame rates require more bandwidth, and certain installations also need longer runs. Long runs at the full 18 Gbps call for better-built cable: the more expensive silver-plated copper or solid silver conductors, gas-injected foam insulation, well-made terminations, and lots of good shielding. The cheap garden-variety HDMI cables simply won't carry 18 Gbps over long distances; they are only rated at 10.2 Gbps and for shorter runs. There is also the CL rating for in-wall installation; not every cheap HDMI cable can be installed in a wall.

In terms of what works and what doesn't: I've seen tests where the cheap cables simply don't perform well when connected to lots of different devices, even at shorter distances; some show too many sparkles. If you don't have high-end equipment it's not as big a deal, because if it works, it works, and if it doesn't, it doesn't. But when it comes to the AUDIO portion of HDMI, higher-end installations run up to 32 channels, whereas entry-level systems are 5.1. There is jitter in the audio portion of HDMI, PERIOD. The more expensive cables will simply have less jitter; that is measurable, and it is audible on higher-end systems. The majority of consumers, I agree, will probably do fine with a cheap cable, but some will have to at least go to Monster's highest-end cable, which is less than $100 (completely reasonable), and then there are those installing more expensive equipment in a much bigger installation who will hear/see a benefit from a more expensive WireWorld or some other brand of higher-end cable. The higher-end crowd uses far more expensive calibration equipment for video and runs their projectors at higher color depths and frame rates than the low-to-mid-range consumer, and in those cases they need the higher bandwidth.
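For the bandwidth side of that, the numbers work out as follows; a sketch of the TMDS math behind the 10.2 Gbps and 18 Gbps figures (these are the standard HDMI 1.4 / 2.0 limits, and the 4K60 timing shown is the common CTA-861 one with blanking included):

```python
# TMDS bandwidth math behind the HDMI 1.4 (10.2 Gbps) and 2.0 (18 Gbps) limits.
# HDMI carries video on 3 TMDS data channels with 8b/10b encoding, so each
# 8-bit symbol costs 10 transmitted bits.
def tmds_gbps(pixel_clock_mhz: float) -> float:
    return pixel_clock_mhz * 3 * 10 / 1000

# 4K60 4:4:4 8-bit uses a 594 MHz pixel clock (4400 x 2250 total pixels
# including blanking, at 60 Hz: 4400 * 2250 * 60 = 594,000,000).
print(f"HDMI 1.4 max (340 MHz clock): {tmds_gbps(340):.1f} Gbps")   # ~10.2
print(f"4K60 8-bit 4:4:4 (594 MHz):   {tmds_gbps(594):.2f} Gbps")   # ~17.82
# Higher bit depths or frame rates push the pixel clock, and therefore the
# required bandwidth, up further, which is why long runs at these rates are
# the demanding case.
```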
I agree there is a lot of subjectivity in audio; you can't simply measure equipment and use technical specs alone to choose a product, they only help narrow the field. You simply have to hear the equipment in your own setting with the music you play, because you are the judge of the sound quality, and what people like and dislike varies from person to person.
I think there is simply a long way to go before they can really determine quality of sound with JUST measurements or a spec. Again, making a generalization about 16-bit vs 24-bit is kind of crazy, since there are AD/DA converters of varying quality with vastly different specs. There are some DACs that can play 16-bit Redbook VERY close to a high-end turntable, where you get the same emotional connection with the music. Why that is, I couldn't tell you. CDs for the longest time were very flat sounding, and that may be partly due to the way the CD was mastered, or the DAC, or both. But DAC designs have gotten much better than what they had 30 years ago.