One point worth mentioning regarding the PenTile pixel layout is the distribution of rods and cones in our eyes. Specifically, there are far more rods (which don’t perceive color) than cones, and the rods themselves are much more sensitive to light. Thus, most of our visual acuity depends not on the color of the light being perceived, but on its intensity/contrast.
I would guess, based on the Apple developer info describing 1125x2436 rendered pixels, that the screen driver can selectively include or exclude “borrowed” red or blue subpixels based on their location relative to the logical pixel. This would allow a physical pixel to be divided in a way not possible on an LCD and, at least when rendering black/white/gray, appear just as sharp to the rods in our eyes as a screen that is truly 458 ppi. For photos, the resolution of color gradients would be lower, as you calculated, since the red/blue components would have to be averaged between two pixels, but that is probably not perceivable by the less sensitive cones in our eyes.
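To make the idea concrete, here is a minimal sketch of that sharing scheme: each logical pixel keeps its own green subpixel, while red and blue are averaged across adjacent pixel pairs, roughly like a PenTile RG-BG layout. The function name and structure are illustrative assumptions, not Apple’s actual driver logic.

```python
# Hypothetical model of subpixel sharing: green at full resolution,
# red/blue averaged ("borrowed") across neighboring pixel pairs.
def render_pentile_row(rgb_row):
    """Map a row of (r, g, b) logical pixels onto shared subpixels.

    Returns (greens, reds, blues): greens at full resolution,
    reds/blues at half resolution (one shared value per pixel pair).
    """
    greens = [g for (_, g, _) in rgb_row]   # luminance-carrying channel, per pixel
    reds, blues = [], []
    for i in range(0, len(rgb_row), 2):     # red/blue shared across each pair
        pair = rgb_row[i:i + 2]
        reds.append(sum(p[0] for p in pair) / len(pair))
        blues.append(sum(p[2] for p in pair) / len(pair))
    return greens, reds, blues

# A black/white/black row: the per-pixel green subpixels keep the edges
# intact, while the shared red/blue values blur across each pair.
row = [(0, 0, 0), (255, 255, 255), (255, 255, 255), (0, 0, 0)]
greens, reds, blues = render_pentile_row(row)
print(greens)  # [0, 255, 255, 0] -> full-resolution contrast preserved
print(reds)    # [127.5, 127.5]  -> color averaged between pixel pairs
```

In this toy model the luminance edge survives at full resolution while chroma is halved, which is the same trade-off as chroma subsampling in image codecs, and it matches the intuition that our eyes resolve contrast more finely than color.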
In other words, I suspect the screen drivers for PenTile OLEDs can selectively turn red/blue subpixels on or off to deliver the full 458 ppi of light contrast that our eyes are most sensitive to, with little practical loss of color resolution, since our eyes are naturally less sharp at interpreting color.