Sort of. Here's how it works:
The resolution you pick isn't really in pixels anymore; the OS refers to those units as points. Behind the scenes, everything is rendered into a pixel buffer whose size is the point resolution multiplied by a scale factor (2 on Retina Macs), so for the scaled modes on offer that buffer is always at least as large as the panel's native pixel resolution.
So in your case, the screen is set to 2048x1152 points. With the scale factor of 2, that gives a pixel buffer of 4096x2304, which is a bit larger than the panel's native resolution. Apps that support HiDPI rendering, which is most of them these days, draw into that buffer at full pixel resolution for extra sharpness. But because the buffer holds more pixels than the screen physically has, the OS has to scale it down to the native resolution before it reaches the panel. So one pixel in your photo is no longer exactly one pixel on your screen, although everything still fits.
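If you want to see these numbers for yourself, here's a minimal Swift sketch using AppKit's NSScreen API (frame is in points; convertRectToBacking(_:) gives the backing pixel size):

```swift
import AppKit

// Minimal sketch: print the current screen's point resolution, its
// backing scale factor, and the resulting pixel buffer size.
if let screen = NSScreen.main {
    let points = screen.frame.size          // resolution in points, e.g. 2048 x 1152
    let scale  = screen.backingScaleFactor  // 2.0 on Retina Macs
    let buffer = screen.convertRectToBacking(screen.frame).size

    print("Point resolution: \(Int(points.width)) x \(Int(points.height))")
    print("Scale factor:     \(scale)")
    print("Pixel buffer:     \(Int(buffer.width)) x \(Int(buffer.height))")
    // On a 4K panel set to 2048x1152, the buffer comes out as 4096x2304,
    // which the OS then downsamples to the panel's 3840x2160 native grid.
}
```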
The default screen setting is chosen so that the scale factor (2), multiplied by the size in points, equals the screen's native pixel resolution; a 4K panel (3840x2160), for example, defaults to 1920x1080 points. In that mode, an image viewed at 100% has pixels that map 1:1 onto screen pixels. If that sort of accuracy is important to you, such as critical photo editing for professional use, the scaled mode may be an annoyance. Otherwise, I wouldn't worry too much about it.
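You can also check whether your current mode is pixel-perfect by comparing the buffer against the display's modes. This is just a sketch, and it assumes the main display and that the largest pixel size among the reported modes is the panel's native resolution (CGDisplayCopyAllDisplayModes and CGDisplayMode's pixelWidth/pixelHeight are the real CoreGraphics calls):

```swift
import AppKit

// Sketch: is the current mode 1:1 with the panel's physical pixels?
let displayID = CGMainDisplayID()
if let screen = NSScreen.main,
   let modes = CGDisplayCopyAllDisplayModes(displayID, nil) as? [CGDisplayMode] {
    // Assumption: the native resolution is the largest pixel size any mode offers.
    let nativeW = modes.map(\.pixelWidth).max() ?? 0
    let nativeH = modes.map(\.pixelHeight).max() ?? 0

    let buffer = screen.convertRectToBacking(screen.frame).size
    let pixelPerfect = Int(buffer.width) == nativeW && Int(buffer.height) == nativeH

    print("Pixel buffer: \(Int(buffer.width)) x \(Int(buffer.height))")
    print("Native panel: \(nativeW) x \(nativeH)")
    print(pixelPerfect ? "1:1 with screen pixels" : "buffer is rescaled to fit the panel")
}
```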