Unfortunately, we can't conduct the required test. But even lacking the test, to claim my assertions are unsubstantiated is nonsense. They are based on recognized principles of engineering. Now you claim this would make EEs giggle, but I think they would laugh at what you say. It's pretty easy to understand that the socketed RAM has more pins and connection points than does the soldered variety, and it is immediately obvious to anyone with the least understanding of how electrical currents work that the more crap you add on to your connection, the more resistance and voltage are required to pass your signal. There is a greater signal to noise ratio in the socketed variety. Yes, the differences may only be microvolts, but as I already indicated earlier, those accumulate and make a difference in things like standby performance, even if under normal usage you wouldn't notice any difference.
I tried telling this to an actual electrical engineer. The result was giggles. Here's the thing: while it's entirely possible that there are microvolts involved, they are... well, microvolts. Next to the active components involved, they aren't even measurable with most equipment. If you poke around in a PC BIOS, you'll notice that you can adjust RAM voltage, usually in increments of 0.05V. So if your theory that "microvolts" are significant and have a measurable effect were correct, then the RAM-voltage setting in the average PC BIOS, a knob roughly 10,000 times coarser, ought to have correspondingly enormous effects.
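If you want a sense of how lopsided that comparison is, here's a quick back-of-envelope. The 0.05V step is the BIOS figure above; the "a few microvolts" number is just an assumed stand-in for the kind of difference being claimed:

```python
# Back-of-envelope: one BIOS RAM-voltage step vs. the claimed "microvolts".
# The 5 uV figure is an assumed stand-in for the socket-vs-solder difference;
# the 0.05 V step is the usual BIOS increment mentioned above.
bios_step_v = 0.05        # volts per BIOS RAM-voltage increment
claimed_diff_v = 5e-6     # volts, assumed "a few microvolts"

ratio = bios_step_v / claimed_diff_v
print(f"One BIOS step is roughly {ratio:,.0f}x the claimed difference")
# -> One BIOS step is roughly 10,000x the claimed difference
```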
That's... frankly, stupid. Even if we grant that there is a difference in resistance (and for most measurement tools there isn't a detectable one; the resistance of a series of connectors is usually too small for ordinary gear to measure), it's on far too small a scale to matter next to everything else in the circuit.
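To put rough numbers on "too small to measure": connector datasheets tend to quote contact resistance in the tens of milliohms, and an ordinary handheld multimeter can't resolve anything near that. The figures in this sketch are assumptions, not measurements:

```python
# Sketch: what an ordinary multimeter is up against. All values are assumed
# ballpark figures, not measurements of any particular part.
contact_resistance_ohm = 0.02   # assumed ~20 mOhm per socket contact
dmm_resolution_ohm = 0.1        # typical lowest-range resolution of a cheap meter
lead_resistance_ohm = 0.3       # typical resistance of the meter's own test leads

print(f"contact: {contact_resistance_ohm * 1000:.0f} mOhm, "
      f"meter resolution: {dmm_resolution_ohm * 1000:.0f} mOhm, "
      f"test leads alone: {lead_resistance_ohm * 1000:.0f} mOhm")
# The quantity in dispute is smaller than the meter's resolution and an order
# of magnitude smaller than the resistance of the test leads themselves.
```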
And actually, I think you have it backwards. What happens is not that the voltage increases; it's that the current decreases a little. So higher resistance tends to reduce total power consumed. Now, if it reduced power consumption too far, that would prevent things from working. But in reality it doesn't.
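Here's that direction-of-effect argument as a toy model, treating the supply as a fixed voltage and the load plus connector as plain series resistances. Real DRAM is not a resistor and every number here is an assumption; the point is only which way the arithmetic leans:

```python
# Toy model: a fixed supply voltage driving a resistive load, with and without
# a little extra series resistance from a socket. All values are assumptions.
supply_v = 1.5      # volts, a DDR3-ish rail (assumed)
load_ohm = 10.0     # assumed effective load resistance
extra_ohm = 0.02    # assumed ~20 mOhm of extra contact resistance

for label, r_total in [("soldered (no extra R)", load_ohm),
                       ("socketed (extra R)   ", load_ohm + extra_ohm)]:
    current = supply_v / r_total   # Ohm's law: I = V / R
    power = supply_v * current     # total power drawn from the rail
    print(f"{label}: I = {current * 1000:.2f} mA, P = {power * 1000:.2f} mW")
# Adding series resistance lowers both the current and the total power drawn:
# the opposite direction from "more resistance means more voltage is required".
```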
So not only is the difference unmeasurably small; it is in the other direction.
Furthermore, the resistance of many materials varies noticeably with temperature, which means the alleged differences you're talking about are going to be totally swamped by the difference in resistance you see from, say, running with or without a laptop cooler under the machine.
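For a sense of scale: copper's resistance changes by roughly 0.4% per degree Celsius. Here's a sketch of what a modest temperature swing does to an assumed baseline resistance; the baseline and the swing are assumptions, and the temperature coefficient is the standard figure for copper:

```python
# Sketch: resistance drift from an ordinary temperature swing.
# R(T) = R0 * (1 + alpha * dT); the baseline resistance and the swing are
# assumptions, alpha is the approximate temperature coefficient of copper.
alpha_copper = 0.0039   # per degree C
baseline_ohm = 0.05     # assumed ~50 mOhm of copper in the power path
delta_t_c = 20.0        # assumed swing, e.g. with vs. without a laptop cooler

delta_r_ohm = baseline_ohm * alpha_copper * delta_t_c
print(f"~{alpha_copper * delta_t_c * 100:.0f}% drift, about "
      f"{delta_r_ohm * 1000:.1f} mOhm on a {baseline_ohm * 1000:.0f} mOhm baseline")
# -> ~8% drift, about 3.9 mOhm on a 50 mOhm baseline
# Resistance is already a moving target before you ever get to the
# socket-vs-solder difference.
```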
So there are three ways in which your position is laughably incoherent:
1. You're talking about things that are on the wrong scale to even be measurable.
2. There are much larger effects from other things.
3. The effects, if they existed at all, would go in the opposite direction from the one you're claiming.
And again, yes, I checked with an actual electrical engineer.