It's a non-standard resolution that essentially nobody but Apple uses. 3840 x 2160 is the industry standard for 4K: double 1080p in each dimension, so four times the pixels. Personally, I'm glad LG switched to that resolution.
On a side note, I wish they had stayed consistent and used vertical resolution for marketing, calling 3840 x 2160 "2K" instead of "4K" (rough numbers after the list below).
1080p = 1K
2160p = 2K
4320p = 4K
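To put quick numbers on that, here's a back-of-the-envelope sketch (nothing official, just arithmetic) showing why "4K" tracks the horizontal count while the vertical-based naming I'd prefer gives the "2K" label:

```python
# Rough pixel math for the common consumer resolutions.
resolutions = {
    "1080p (FHD)": (1920, 1080),
    "2160p (UHD, marketed as 4K)": (3840, 2160),
    "4320p (marketed as 8K)": (7680, 4320),
}

base_w, base_h = resolutions["1080p (FHD)"]
for name, (w, h) in resolutions.items():
    print(f"{name}: {w} x {h}, "
          f"{w / base_w:.0f}x the width and {(w * h) / (base_w * base_h):.0f}x the pixels of 1080p")
```

So UHD is 2x the width and 2x the height of 1080p, which is why it gets sold as "4K" (roughly 4000 pixels across) even though the vertical count only doubled.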
That's not entirely true. While 4096 x 2304 isn't a common standard, it's still a variant of DCI 4K, whose common standardized resolutions include 4096 x 2160 and 4096 x 1716. Nothing wrong with preferring the new resolution, which is indeed the standard for TV (4K UHD), but 4096 x 2160 is the standard for DCI 4K, and I, for one, appreciate having the full resolution for playback of DCI 4K files. (On the other hand, I can imagine the DCI width is annoying when it forces you to upscale your UHD 4K files by a tiny bit for full-screen playback. I'm not saying either resolution is inherently better, just that the old one wasn't chosen randomly.)
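To put rough numbers on the aspect ratios and that "tiny bit" of upscaling (just my arithmetic; the DCI figures are the standard container/flat/scope variants):

```python
# Aspect ratios of the two panels and the standard DCI 4K variants,
# plus how much UHD footage must be upscaled to span the 4096-wide panel.
panels = {
    "old LG/Apple panel": (4096, 2304),
    "new LG panel": (3840, 2160),
}
dci_4k = {
    "DCI 4K full container": (4096, 2160),
    "DCI 4K flat": (3996, 2160),
    "DCI 4K scope": (4096, 1716),
}

for name, (w, h) in {**panels, **dci_4k}.items():
    print(f"{name}: {w} x {h} ({w / h:.3f}:1)")

scale = 4096 / 3840
print(f"UHD (3840 wide) -> 4096 wide: x{scale:.4f} (~{(scale - 1) * 100:.1f}% upscale)")
```

So it's only about a 6.7% stretch either way, which is exactly why I'd call it annoying rather than disqualifying.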
That said, obviously neither display is ideal for working with either variety of 4K footage (DCI or UHD), except as a pure playback monitor, which neither is really intended to be, because for everything other than playback you need room for the UI. So if you're editing a lot of 4K video, you'd ideally want the 5K monitor either way, imo. But the old resolution is indeed a recognized variant of DCI 4K, of which 4096 x 2160 is by far the most common, and I suspect 4096 x 2304 was chosen to match that horizontal resolution at a 16:9 aspect ratio, since 17:9 is uncommon for desktop displays. And for working with DCI 4K content, the higher resolution is preferable.
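Here's a quick sketch of why I say that, counting the pixels left over for an editing UI if you park DCI 4K footage on screen 1:1 (my arithmetic, panel resolutions as discussed above):

```python
# Spare pixels left for editing UI if DCI 4K footage (4096 x 2160) is shown 1:1.
displays = {
    "old 4K panel (4096 x 2304)": (4096, 2304),
    "new 4K panel (3840 x 2160)": (3840, 2160),
    "5K panel (5120 x 2880)": (5120, 2880),
}
dci_w, dci_h = 4096, 2160

for name, (w, h) in displays.items():
    spare_w, spare_h = w - dci_w, h - dci_h
    fits = spare_w >= 0 and spare_h >= 0
    print(f"{name}: fits 1:1 = {fits}, spare pixels = {spare_w} x {spare_h}")
```

The old panel at least shows DCI 4K pixel-for-pixel with a 144-pixel strip left over, the new one can't show it unscaled at all, and the 5K leaves real room for timelines and palettes.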
(The older monitor is also considerably more pixel-dense, but that's another argument entirely.)
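(If you want numbers on the density point, here's a rough sketch; the diagonals are my assumption, 21.5" for the old panel and 23.7" for the new one, so correct me if LG's actual specs differ.)

```python
import math

# Approximate pixel density; the diagonals here are assumed, not quoted from LG.
panels = {
    'old panel, 4096 x 2304 @ 21.5"': (4096, 2304, 21.5),
    'new panel, 3840 x 2160 @ 23.7"': (3840, 2160, 23.7),
}

for name, (w, h, diag) in panels.items():
    ppi = math.hypot(w, h) / diag  # diagonal pixel count / diagonal inches
    print(f"{name}: ~{ppi:.0f} ppi")
```

That works out to roughly 219 ppi versus roughly 186 ppi under those assumptions.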
Fwiw, I think the reason for the discrepancy between naming conventions is that 2K originally referred to film scans (typically 2048 x 1536 if you scanned the whole 4:3 negative). In that world the horizontal resolution is the primary differentiating factor: the vertical dimension changes with aspect ratio, while the horizontal dimension doesn't change... much, at least. 1998 x 1080 is also a variant of DCI 2K, but I believe that's a projection resolution rather than a scanning one. :/ I'm not sure.
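Rough sketch of what I mean about the width staying put while the height moves around (the scan figure is the 4:3 one I mentioned; the DCI 2K numbers are the standard container/flat/scope variants):

```python
# "2K" family: the width stays around 2048 while the height moves with aspect ratio.
two_k_variants = {
    "full 4:3 negative scan": (2048, 1536),
    "DCI 2K full container":  (2048, 1080),
    "DCI 2K flat (1.85:1)":   (1998, 1080),
    "DCI 2K scope (2.39:1)":  (2048, 858),
}

for name, (w, h) in two_k_variants.items():
    print(f"{name}: {w} x {h} ({w / h:.2f}:1)")
```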
I also suspect manufacturers embraced "4K" because it sounds like a lot more than 1080p, despite not really looking much different. So that part is kind of annoying, I agree.
Regardless, there's nothing wrong with you preferring the new resolution, but I and others have our reasons for preferring the old one.
And while, yes, it is a consumer-grade monitor, there are industries that use consumer-grade monitors professionally. A DCI 4K grading monitor might cost $20k, so there will be departments (editors, assistant editors, VFX, graphics, etc.) that can benefit from the full DCI resolution without needing perfect calibration and full gamut.
Just my opinion, of course, but I prefer the old resolution and don't think my reasons are entirely trivial. And for those who aren't editing video, I suspect it doesn't matter one way or the other whether the panel matches DCI 4K or UHD.