Again: a panel needs a DPR of 2 or above and a PPI of roughly 218 at minimum to be considered a pixel-dense screen.
It’s common knowledge among UXers, engineers, and human-computer interaction (HCI) researchers that resolution is overrated as a measure of a screen’s sharpness.
Marketing shorthand like “4K” is easier and cheaper to sell consistently than pixels per inch (PPI), device pixel ratio (DPR), or pixels per degree (PPD): you know, the metrics software actually uses and cares about most when deciding whether a screen counts as pixel-dense.
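For reference, here’s roughly how those three metrics fall out of a panel’s specs (a minimal Python sketch; the 5K 27-inch panel and the 24-inch viewing distance are illustrative numbers I picked, not part of any spec):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Physical pixel density: diagonal resolution in pixels over diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

def dpr(physical_px: int, logical_px: int) -> float:
    """Device pixel ratio: physical pixels per logical (CSS/point) pixel."""
    return physical_px / logical_px

def ppd(density_ppi: float, viewing_distance_in: float) -> float:
    """Pixels per degree of visual angle at a given viewing distance."""
    # One degree of visual angle covers 2 * d * tan(0.5 deg) inches of panel.
    return density_ppi * 2 * viewing_distance_in * math.tan(math.radians(0.5))

# Illustrative numbers: a 5K 27-inch panel rendered at 2x scaling, viewed ~24 in away.
density = ppi(5120, 2880, 27)   # ~218 ppi
ratio   = dpr(5120, 2560)       # 2.0
angular = ppd(density, 24)      # ~91 pixels per degree
print(f"{density:.0f} ppi, DPR {ratio:.1f}, {angular:.0f} ppd")
```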
A "pixel-dense screen" isn't a standard recognized by anyone; it would appear you've made that standard up for yourself. Pinning it to an exact number is even more out of touch since, again, it depends entirely on the application.
Hollywood production cinema at 4k on the big screen is only around 10 ppi -- it must amaze you that anyone could even tolerate watching such a low pixel density.

And many still use ~1080p projectors, since 4k content still looks good on them: downscaling gives effectively 4:4:4 chroma, and the low compression makes for a high-bitrate 1080p, yet that's still only about 5 ppi.
It's the same in print: 300 DPI for high-quality prints, but a high-quality billboard might be 20-40 DPI.
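Rough numbers, assuming a ~35 ft wide cinema screen (real screen widths vary a lot, so treat these as ballpark rather than spec):

```python
# Ballpark horizontal pixel density on a big cinema screen.
# Assumes a ~35 ft wide screen; actual screen widths vary considerably.
SCREEN_WIDTH_IN = 35 * 12

for label, width_px in [("4K DCI (4096 wide)", 4096), ("2K / ~1080p (2048 wide)", 2048)]:
    print(f"{label}: {width_px / SCREEN_WIDTH_IN:.1f} ppi")
# -> roughly 10 ppi for 4K and 5 ppi for 2K, vs ~300 dpi for fine print
#    and 20-40 dpi for a billboard.
```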
5K at 27” and 6K at 32” are the minimum for large panels to catch up to where mobile devices have been for over a decade.
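To put that density gap in numbers (panel resolutions as typically shipped, e.g. 5120×2880 for 27-inch 5K and 6016×3384 for 32-inch 6K; the phone entry is a typical current handset, included for comparison):

```python
import math

panels = {
    "27-in 4K (3840x2160)":     (3840, 2160, 27),
    "27-in 5K (5120x2880)":     (5120, 2880, 27),
    "32-in 4K (3840x2160)":     (3840, 2160, 32),
    "32-in 6K (6016x3384)":     (6016, 3384, 32),
    "6.1-in phone (2556x1179)": (2556, 1179, 6.1),
}

for name, (w, h, diag) in panels.items():
    print(f"{name}: {math.hypot(w, h) / diag:.0f} ppi")
# 27-in 4K ~163 ppi and 32-in 4K ~138 ppi, versus ~218 ppi for 5K at 27 in
# and ~216 ppi for 6K at 32 in; the phone sits around ~460 ppi.
```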
4K resolution beyond 24 inches is ill-equipped to provide high pixel density—it’s the McDonalds of resolution targets to shoot for on a large display.
It’s definitely convenient and prevalent—doesn’t mean it’s good for you!
8K, 16K, and higher are obviously better still, and they’re what it will take for large displays to catch up to the pixel density prevalent on mobile devices today, but you’re right that the costs can be prohibitive for manufacturers and some end users to justify.
Handheld panels need higher pixel density than pretty much anything else since they're viewed the closest -- so why would their density be the benchmark for large panels? By the same logic, 218 ppi would be far too low in a camera EVF.
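Pixel density only means anything at a viewing distance; what the eye actually sees is angular resolution (pixels per degree). A quick comparison, with viewing distances that are assumed typical rather than measured:

```python
import math

def ppd(density_ppi: float, distance_in: float) -> float:
    """Pixels per degree of visual angle at a given viewing distance."""
    return density_ppi * 2 * distance_in * math.tan(math.radians(0.5))

# (ppi, assumed typical viewing distance in inches)
cases = {
    "phone, ~460 ppi at 12 in":           (460, 12),
    "27-in 4K monitor, 163 ppi at 24 in": (163, 24),
    "27-in 5K monitor, 218 ppi at 24 in": (218, 24),
    "cinema screen, ~10 ppi at 40 ft":    (10, 40 * 12),
}
for name, (density, dist) in cases.items():
    print(f"{name}: {ppd(density, dist):.0f} ppd")
# All of these land in a broadly similar ~70-100 ppd band despite wildly
# different ppi -- which is exactly why viewing distance, not raw ppi, is the point.
```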
Higher than 16k? For what? What are you displaying that needs that kind of resolution?
4k is the upper standard for video and, worse yet, it's still mostly 8-bit 4:2:0 chroma subsampling along with compression. Cinema cameras shoot ~4k at 600 Mb/s, yet the final delivered product is around 50 Mb/s, for example. So we have a long way to go before we even saturate 4k video quality. 4k gaming is still the upper standard too, and it requires top-spec GPUs with lots of VRAM and now tricks like DLSS to invent frames.
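To put numbers on that (assuming 3840x2160 at 24 fps; exact figures depend on the codec and cadence, so treat this as back-of-envelope):

```python
# Back-of-envelope bits per pixel for 4k UHD at 24 fps.
width, height, fps = 3840, 2160, 24
pixels_per_second = width * height * fps              # ~199 million px/s

for label, mbps in [("delivery/streaming", 50), ("cinema camera", 600)]:
    bpp = mbps * 1_000_000 / pixels_per_second
    print(f"{label} at {mbps} Mb/s: ~{bpp:.2f} bits per pixel")
# ~0.25 bpp delivered vs ~3 bpp out of the camera -- and uncompressed
# 10-bit 4:2:2 would be 20 bpp (~4 Gb/s), so delivered "4k" is nowhere
# near saturating the format.
```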
Many would argue a slow 60 Hz refresh rate makes any display "McDonalds" quality -- certainly too slow for gaming, but also for video, because it gives 3:2 judder on common 24 fps content, unlike 120 Hz or higher displays. Imagine paying $5k for a 6k monitor that's only 60 Hz. Bit depth, colour space coverage and accuracy, and of course HDR are other significant contributors to apparent quality.
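The 3:2 cadence falls straight out of the refresh math:

```python
# Refreshes available per source frame for 24 fps content.
for hz in (60, 120):
    per_frame = hz / 24
    if per_frame.is_integer():
        print(f"{hz} Hz: {per_frame:.0f} refreshes per frame -> even cadence, no judder")
    else:
        print(f"{hz} Hz: {per_frame} refreshes per frame -> frames alternate "
              f"3 and 2 refreshes (3:2 pulldown judder)")
```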
And if it's otherwise for general computer use, what would you need 8k or 16k for? According to Apple, "retina" is the point beyond which your eyes can't discern any more detail, so why go further? Is your text in Xcode going to be that much sharper? Is it that much sharper than 4k at normal viewing distances? Enough to warrant the 2-3x cost and the loss of high-refresh options?
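For what it's worth, 20/20 vision is usually taken to resolve about 1 arcminute, i.e. ~60 pixels per degree; at an assumed ~24 in desk viewing distance, even 27-inch 4K already clears that bar:

```python
import math

DISTANCE_IN = 24     # assumed desk viewing distance, not a measured figure
RETINA_PPD = 60      # ~1 arcminute per pixel, the usual 20/20 acuity figure

for name, density in [("27-in 4K (163 ppi)", 163), ("27-in 5K (218 ppi)", 218)]:
    ppd = density * 2 * DISTANCE_IN * math.tan(math.radians(0.5))
    verdict = "above" if ppd > RETINA_PPD else "below"
    print(f"{name}: {ppd:.0f} ppd, {verdict} the ~60 ppd threshold")
# 4K ~68 ppd, 5K ~91 ppd: both past the point where 20/20 vision resolves
# individual pixels, which is the crux of the question.
```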