or translate it into terminology similar to 1600x900
1080i means 1920x1080 interlaced, so of the 1080 lines, only 540 are sent on each refresh. 1080i content can be played on any hi-def device (all the 1366x768 TVs support 1080i), but if you want it scan doubled -- de-interlaced so that all 1080 lines from each pair of refreshes are shown at once -- then you need the same thing you need for 1080p: a display of resolution 1920x1080 or greater (subject to the caveats already discussed).
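If it helps to see the "pair of refreshes" idea concretely, here's a rough sketch of "weave" de-interlacing -- two 540-line fields combined into one 1080-line frame. It's not tied to any particular player or TV; the array names and dummy data are just for illustration.

```python
# Sketch of weave de-interlacing: two 540-line fields
# (even lines from one refresh, odd lines from the next)
# are interleaved into a single 1080-line progressive frame.
import numpy as np

WIDTH, HEIGHT = 1920, 1080

def weave(even_field, odd_field):
    """Interleave two (540, WIDTH) fields into one (1080, WIDTH) frame."""
    assert even_field.shape == odd_field.shape == (HEIGHT // 2, WIDTH)
    frame = np.empty((HEIGHT, WIDTH), dtype=even_field.dtype)
    frame[0::2] = even_field   # lines 0, 2, 4, ... from one refresh
    frame[1::2] = odd_field    # lines 1, 3, 5, ... from the next
    return frame

# Dummy data standing in for two consecutive fields:
even = np.zeros((540, 1920), dtype=np.uint8)
odd = np.full((540, 1920), 255, dtype=np.uint8)
full_frame = weave(even, odd)
print(full_frame.shape)  # (1080, 1920)
```

This is why a 1366x768 set can accept 1080i but can't actually show the reassembled 1080-line frame at full resolution: it has to scale it back down.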
I don't mean this part to be snide at all... I do get snippy, but not at this moment. I am also very new to learning all of this, because I only very recently got a hi-def television.
The upshot... or the short version of the answer... is that it really doesn't matter. When you ask blind judges to pick out which image is playing on the 1080p device and which on the 720p device, or which image looks best, you tend to find that it is very hard to tell the difference. People who've sat in blind A/B comparisons even say that telling a 480p DVD apart from a 1080p Blu-ray is very difficult. TVs tend to distinguish themselves more on things like black levels and contrast.
I'm not saying it's impossible to tell... under the right circumstances, one can. But it's not as worth worrying about as it seems.
The short answer is that it doesn't matter, and any new Mac has a screen sufficient to play any hi-def material adequately. Whenever you have a choice of media source, pick the highest you can manage -- that is, get 1080p if you can, 720p if you cannot get 1080p, and 1080i if you cannot get 720p. If you can't get any of those, even 480p is not nearly as bad as it's made out to be.
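That preference order boils down to something like this toy sketch (the format labels and function name are just made up for the example):

```python
# "Pick the best source you can manage": 1080p, then 720p,
# then 1080i, then 480p -- return the first one available.
PREFERENCE = ["1080p", "720p", "1080i", "480p"]

def best_available(available):
    """Return the most-preferred format actually on offer, or None."""
    for fmt in PREFERENCE:
        if fmt in available:
            return fmt
    return None

print(best_available({"480p", "1080i"}))  # -> 1080i
print(best_available({"720p", "480p"}))   # -> 720p
```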