Well, when I rip movies with HandBrake, I get a file that is usually around 1.3 GB, but I don't notice a difference between the rip quality and the DVD... I do think this effect could have something to do with the compression...
If you are ripping a 7 GB movie down to 1.3 GB, that means you are discarding about 81% of the information that remained after the movie was compressed to 7 GB in the first place, during that transcode. So yes, that is a compression level that will be noticeable compared to the DVD, especially on motion.
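Just to show where that figure comes from, here is the back-of-the-envelope arithmetic, using nothing more precise than the 7 GB and 1.3 GB sizes mentioned above:

```python
# Rough arithmetic for how much of the DVD's (already compressed) data the
# transcode throws away. Sizes are the approximate ones from the post.
dvd_gb = 7.0   # dual-layer DVD feature, already MPEG-2 compressed
rip_gb = 1.3   # typical HandBrake output

discarded = 1 - rip_gb / dvd_gb
print(f"Discarded in the transcode: {discarded:.0%}")  # ~81%
```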
Broadcast TV in HD can discard 99% of the original information (1.485 Gbps down to 14.85 Mbps or less) and still look pretty darned good, but when you compress a file a second time that is already heavily compressed, the two lossy generations compound and the rounding errors accumulate significantly. It's similar to trying to create a 128 kbps MP3 from a CD-quality file. That works pretty well, but if you try to do it starting from a 256 kbps file, the resulting 128k file will not be nearly as good (which is why Apple is trying to get record companies to supply original 192 kHz / 24-bit masters for them to make their 256k library copies from).
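Here is the same arithmetic for the broadcast case, as a rough sketch only: 1.485 Gbps is the uncompressed HD-SDI rate and 14.85 Mbps is a typical broadcast-class bitrate, the same figures as above.

```python
# One heavy compression pass straight from the uncompressed source can still look good...
uncompressed_mbps = 1485.0   # 1.485 Gbps HD-SDI
broadcast_mbps = 14.85       # typical broadcast HD stream

print(f"Broadcast discards about {1 - broadcast_mbps / uncompressed_mbps:.0%}")  # ~99%

# ...but a DVD-to-1.3 GB rip is a *second* lossy pass over material that has
# already lost most of its information once, so the errors of the two
# generations compound instead of starting from a clean source.
dvd_gb, rip_gb = 7.0, 1.3
print(f"Second-generation pass discards another {1 - rip_gb / dvd_gb:.0%}")      # ~81%
```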
One of the things that most folks don't realize is that even a severe compression ratio will not create artifacts at all, as long as the encoder has enough bits for the content.
Also, compression usually only breaks down on motion; still shots, even in heavily compressed video, have few if any artifacts.
Quality also depends on how well the deinterlacing is done when a 1080i or 480i source is converted to 720p. Surprisingly enough, 1080i, 1080p, and 720p all have essentially the same perceived resolution, all else held equal, but if a 1080i source is converted to 720p and the deinterlace is done badly, that image will suffer.
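As a rough illustration of why a sloppy deinterlace costs you resolution, here is a toy NumPy sketch (the function names are mine, not from any real deinterlacer): a "weave" keeps the detail of both fields but combs on motion, while a naive single-field "bob" (plain line doubling) throws away half the vertical samples before you ever get to the 720p scale.

```python
import numpy as np

def split_fields(frame):
    """Split an interlaced frame into its top and bottom fields (even/odd lines)."""
    return frame[0::2], frame[1::2]

def weave(top, bottom):
    """Re-interleave both fields: full vertical detail, but combing artifacts on motion."""
    out = np.empty((top.shape[0] + bottom.shape[0], top.shape[1]), dtype=top.dtype)
    out[0::2], out[1::2] = top, bottom
    return out

def naive_bob(field):
    """Line-double a single field: no combing, but half the vertical detail is simply gone."""
    return np.repeat(field, 2, axis=0)

frame = np.random.randint(0, 256, size=(1080, 1920), dtype=np.uint8)  # stand-in 1080i frame
top, bottom = split_fields(frame)
print(weave(top, bottom).shape)  # (1080, 1920): all 1080 source lines preserved
print(naive_bob(top).shape)      # (1080, 1920) lines, but only 540 of them are unique
```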
1080p will have slightly better perceived resolution on motion (when objects move or the camera pans, trucks, or dollies) than 1080i, but on still images it will look exactly the same as 1080i. Also, most 1080p content is 1080p24, meaning the motion artifacts on 1080p are actually greater than on 1080i: motion will be sharper, but it will be jumpier because 24 frames per second updates far less often than 1080i's 60 fields per second.
And SD these days does look pretty darned good. One of the reasons is that most of the signal chain is now HD, which preserves the quality all the way to the final downconvert. In the past, every step in the chain added fuzziness, noise, and a loss of color, and there are a lot of steps, so the cumulative effect was crappy SD.
The only thing that HD has over SD (other than usually a wider aspect ratio) is increased sharpness. Every other aspect is exactly the same as SD: the color gamut and color space are the same, the audio is the same, the contrast ratio is the same (again, all else held equal).
Also, the distance you view from makes a huge difference. We can't even see full HD resolution from the distances most folks view from in their homes; it's better than SD, but not as good as it would be if they sat closer. You have to sit less than about 7.8 feet from a 1920x1080 image on a 60" screen to fully resolve HD, and most folks sit 10-15 feet from a 50-55" screen. If you are viewing from a distance that keeps you from resolving the sharpness of HD, it really might as well be SD, because sitting that far away removes the single thing that distinguishes HD from SD.
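That 7.8-foot figure follows from the usual assumption that a 20/20 eye resolves about one arcminute, so the farthest distance at which you can still make out individual pixel rows is roughly where one row subtends one arcminute. A quick sketch, assuming 16:9 screen geometry:

```python
import math

def max_resolving_distance_ft(diagonal_in, vertical_pixels, aspect=16 / 9):
    """Farthest viewing distance (feet) at which a ~1 arcminute eye can still
    resolve individual pixel rows on a 16:9 screen of the given diagonal."""
    height_in = diagonal_in / math.sqrt(1 + aspect ** 2)  # screen height from the diagonal
    pixel_in = height_in / vertical_pixels                # height of one pixel row
    one_arcmin = math.radians(1 / 60)
    return pixel_in / math.tan(one_arcmin) / 12           # inches to feet

print(f"{max_resolving_distance_ft(60, 1080):.1f} ft")  # ~7.8 ft for 1080p on a 60-inch screen
print(f"{max_resolving_distance_ft(60, 720):.1f} ft")   # ~11.7 ft for 720p on the same screen
```

Sit farther back than that and the extra pixels simply are not visible, which is the point about HD effectively reverting to SD.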