If I were to open, for instance, a 1080p video on a 1440x900 HiDPI screen, would I be able to view it at full resolution? Would I have to tell the player to display it at half size to fit it on the screen?
Yes, to the first question. (Well, it's possible an app that isn't HiDPI aware might inadvertently prevent the OS from taking advantage of the retina display -- e.g., by using non-HiDPI offscreen buffers or doing its own pixel-level rendering. My guess is that apps using QuickTime for playback with Apple-supplied codecs will definitely take advantage of retina displays automatically.)
In regard to your second question: first, I think most players will fit the video to the screen when first opening the movie. From there you can set it the way you want, e.g., "actual size", "full screen", etc.
In an app that is not retina-display aware, menu items might be labeled confusingly. When the "half size" option is used on a retina display, the video would actually display at full resolution (exactly one physical screen pixel for each source pixel), while the "100%" option would map each source pixel onto a 2x2 block of physical screen pixels. For retina-aware apps, I'd expect them to avoid confusing labels like that, though I'm not sure exactly how. Apple will hopefully set a good standard for that in their QuickTime player.
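To make the pixel math concrete, here's a rough sketch in plain Swift (not any real player's API -- the function, the zoom values, and the 2.0 backing-scale factor are just assumptions for illustration) of how those menu labels would map a 1080p source onto a 2x display:

```swift
// Illustration of the mapping described above: a non-retina-aware player sizes
// its video in logical units, and each logical unit is backed by a 2x2 block
// of physical pixels on a retina display.
struct PixelSize { let width: Int; let height: Int }

/// Physical screen pixels the video would cover for a given zoom setting,
/// where `backingScale` is 2.0 on a retina display and 1.0 otherwise.
func physicalPixels(source: PixelSize, zoom: Double, backingScale: Double) -> PixelSize {
    // The player thinks in logical units: source pixels * zoom.
    let logicalWidth = Double(source.width) * zoom
    let logicalHeight = Double(source.height) * zoom
    // The OS backs each logical unit with backingScale x backingScale physical pixels.
    return PixelSize(width: Int(logicalWidth * backingScale),
                     height: Int(logicalHeight * backingScale))
}

let hd1080 = PixelSize(width: 1920, height: 1080)

// "100%" in a non-retina-aware player: every source pixel becomes a 2x2 block.
print(physicalPixels(source: hd1080, zoom: 1.0, backingScale: 2.0))  // 3840 x 2160

// "Half size" in the same player: exactly one physical pixel per source pixel.
print(physicalPixels(source: hd1080, zoom: 0.5, backingScale: 2.0))  // 1920 x 1080
```

So the label that sounds like a downgrade ("half size") is the one that would actually give you pixel-for-pixel playback on a retina screen.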
There's some well thought-out speculation on here, but I'm still not sure how a retina display would cope with images, such as jpg photos.
The consensus seems to be that if I had a small photo, say 320x200, that currently filled up a quarter of my screen, the pixels would automatically be 'doubled' so the image would still fill up a quarter of the screen and would effectively look exactly the same, as the OS layer displaying the photo would still address the screen in a kind of 2x2 pixels = 1 pixel 'non-retina' mode. That I can go along with.
However, what then happens if my photo is larger than the 'non-retina' size of the screen? If I reduce it to fit on screen, it's still going to be displayed in the same 2x2 mode, when it would now be better to show it in retina mode.
I realise that imaging software could be modified to show smaller images using 2x2 pixels and larger images in retina mode. However, this is the sort of thing that should really be handled at OS level, but it seems too 'messy' a solution for that.
In other words, when images are displayed at 100%, do we really think the OS will choose whether to use retina resolution or not depending on the size of the image?
I think it's going to be simpler than that. Generally, apps aren't going to handle images at a high level any differently than they do now. It's just that when an image is actually rendered, it will use all the available physical pixels to do so.
I think when an app runs on a 2880 x 1800 retina display, for example, it will still think of it as a 1440 x 900 display and behave accordingly. Only low-level OS rendering routines will actually work with the physical "sub-pixels".
So for example, suppose you open a 4000 x 3000 photo on two different macs:
(a) has a 1440 x 900 13" display
(b) has a 2880 x 1800 13" display.
The app thinks of both displays as 1440 x 900.
For both it calls an OS API to create a window of size, say, 1200 x 900 (matching the photo's 4:3 shape) to display the image. But on the retina display the OS knows to actually back that window with 2400 x 1800 physical pixels.
The app will then call an OS API to draw the 4000 x 3000 pixel image into the window. On the non-retina display the OS will scale the image down to 1200 x 900 pixels. But on the retina display the OS will scale the image onto the 2400 x 1800 physical pixels. Since all those pixels take up the same amount of space on the 13" retina display as 1200 x 900 pixels take on the non-retina 13" screen, the image looks a lot sharper on the retina display. Meanwhile, the app doesn't really need to do anything special to take advantage of the retina display. The OS handles it automatically at the lower level.
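Here's a rough sketch of that flow in code -- Swift, using AppKit's real calls for querying the backing scale, but with the window size from the example above and a made-up photo path, so treat it as an illustration of the model rather than a definitive implementation:

```swift
import AppKit

// The app only ever thinks in "logical" 1440 x 900 coordinates.
_ = NSApplication.shared  // initialize the app object before making windows in a script

// Ask for a 1200 x 900 window, exactly as on a non-retina 13" screen.
let window = NSWindow(contentRect: NSRect(x: 0, y: 0, width: 1200, height: 900),
                      styleMask: [.titled, .resizable],
                      backing: .buffered,
                      defer: false)

// Drawing the 4000 x 3000 photo is the same call either way; the OS scales it
// into whatever backing pixels the window actually has.
let imageView = NSImageView(frame: window.contentView!.bounds)
imageView.imageScaling = .scaleProportionallyUpOrDown
imageView.image = NSImage(contentsOfFile: "/path/to/4000x3000-photo.jpg")  // placeholder path
window.contentView?.addSubview(imageView)
window.makeKeyAndOrderFront(nil)

// The physical resolution only shows up if you go looking for it:
print(window.backingScaleFactor)                             // 2.0 on retina, 1.0 otherwise
print(window.convertToBacking(window.contentView!.bounds))   // (0, 0, 2400, 1800) on retina
// NSApplication.shared.run()  // uncomment to keep the window on screen
```

The app never touches the 2400 x 1800 numbers itself; they only exist down in the window's backing store.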
Of course, as you mention at the start of your post, this is speculation. But it makes sense because it's the best way to ensure as many apps as possible take advantage of retina displays with the least amount of effort. Most apps will mainly need to add high-DPI artwork for their own icons and other built-in bitmap graphics.
edit: sorry, that went on really long. I think this is the kind of thing that may be confusing to talk about but will instantly make sense once you start using it.