Apple Pulls 4K LG UltraFine Displays From Online Apple Store in U.S.

Again, you are making a straw-man argument of things I never said and then attacking that while completely ignoring what I did say.

There's nothing at all in the post you replied to about laser printers.

As far as displays go: during the heyday of Apple's Cinema Display line there were plenty of commodity PC monitors at 20% of Apple's price, but the Apple monitors were far superior. They were targeted at people doing graphic and video work, and they had superior gamut, color fidelity, and richness. They used IPS panels when the rest of the market was using a very poor version of TN; the viewing angle on those TN panels was so bad that even looking straight on, the edges looked washed out and the colors were inconsistent across the screen as the angle to your eye changed. For people used to commodity monitors at the time, the Apple displays looked magnificent in the store, and nobody questioned whether the 5x markup was worth it. Today cheap monitors are a lot better than they were a decade ago, but there's still a market for high-end monitors. Apple just doesn't want to compete in the high-end computer hardware business.

As far as computers go, that's even easier. Until 2011-2012, Mac hardware was, performance-wise, on par with the best PC hardware at about a 30% Apple markup, which was well worth it for the sleek, well-designed machine. I looked hard at Windows laptops around the time I bought my 2011 MBP, and for the same price I paid for the Mac, it was hard to find a PC to match its power. On top of that, the Mac had a better keyboard, case build, and display. It was a no-brainer to buy the Mac. At the time I remember a PC Mag article saying the best Windows laptop on the market was a MacBook running Windows, even with price factored in. IIRC, the Sandy Bridge CPU in my MacBook wasn't even available to PC vendors yet when I bought it; Apple had an exclusive window with Intel.

Today the latest Apple laptops just don't compare well to Windows machines. You have the gimped keyboards, poor performance, non-removable storage (which makes data recovery impossible if anything else goes wrong with the logic board), emoji bars, etc., all in a computer that costs nearly double what a comparably specced machine did in 2012. My 2011 MBP is by far the best computer I ever owned. My 2017 is well in the running for the worst, and the only Mac (out of about a dozen) I ever had buyer's remorse over. It is also absolutely my last Mac unless Apple does a hard 180 (which I'm not holding my breath for), which I find pretty disappointing.

So, please continue to pretend your nonsense about laser printers isn't building a straw man because you've got no legitimate response to any of this. Laser printers... wow.

You’re still not getting it and how analogies are used. Not surprised. I am enjoying your jibber-jabber!
 
Unfortunately Dell has discontinued the 27" P2715Q. The P2415Q needs a TB3 to TB2 cable.

That's a shame. Are there any monitors on the market aside from the LG that have native Thunderbolt 3 support? I'd love a mid-range monitor that supports Thunderbolt 3 along with a couple of other common ports (HDMI, etc.) for maximum flexibility.
 
You’re still not getting it and how analogies are used. Not surprised. I am enjoying your jibber-jabber!

I don't think you understand the definition of analogy or the purpose of analogy in debate. You're using a straw man, pure and simple.

Let's give you a little more rope: how do you believe switching to a discussion of laser printers is a useful analogy that helps describe the displays and understand the situation? I have pointed out exactly why it is a completely wrong and useless straw man, so why do you believe it's a useful analogy?
 
The LG monitors had issues in the beginning that needed to be patched/fixed. I’m sure Apple wasn’t too happy about that. Apple probably thought LG could do a good job, but they didn’t. So Apple decided to go back to making its own monitors again.

Hopefully this trend continues into software with the revival of Aperture. Somebody needs to give Adobe some competition.

But the iMac screens are made by LG and they are pretty fantastic. o_O
I wonder if they figured out the burn-in/ghosting issues. I was at the Apple Store a few days ago and saw one of the 5K displays connected to the Mac Pro, and it had severe burn-in. Not only could you see the windows/dock/menu bar when you opened something else, you could also clearly see the desktop. It's like someone took a screenshot and permanently overlaid it on the display at 50% opacity. I wonder what causes this kind of phenomenon.

A) That's crazy and unacceptable.
B) I have an LG monitor, not an UltraFine, but a 4K model released at CES 2018, and it has ZERO ghosting/burn-in issues. It's a 27" monitor and I paid NOWHERE near what the $1300 5K 27" UltraFine costs.
True, since it’s impossible to use all those pixels for anything other than editing 4K videos.

Um, watching 4K videos?? ;)
I wish. But instead retina monitors are still extremely rare. Even the LG one is hard to find and still costs as much as it did three years ago.
Sadly, most of the monitor industry is either cheap low-resolution office displays or gaming displays that focus on refresh rate instead of resolution.

Bingo. I had such a tough time trying to find a second monitor to connect to my iMac. I ended up with an LG that's halfway between a gaming monitor and one that has enough resolution and color gamut to properly display large format pics and videos.

(the monitor is an LG 27UL650-W)
 
Most Apple computers don't even have the processing power or RAM to edit 4K video, so I don't know why they are pushing 4K monitors.

I know people have already mentioned how incorrect your post is, so perhaps I'll answer using the language of your avatar: You're wrong, and I know wrong, I'm the best at knowing wrong and you're completely wrong, trust me, ok? Thank you.
 
I know people have already mentioned how incorrect your post is, so perhaps I'll answer using the language of your avatar: You're wrong, and I know wrong, I'm the best at knowing wrong and you're completely wrong, trust me, ok? Thank you.

Stop trolling.
 
The time it takes the computer to process a change, relative to the amount of time it would take to watch the footage that was altered. If you make a change that affects 60 seconds of footage, what is an acceptable amount of time during which you can't use the computer because it is processing your change? Is 3 seconds of downtime acceptable? How about two minutes?

What NLE and hardware are you using? All NLEs are instantaneous while cutting, trimming, playing back, applying color grade, even keying. There is no "process time" when I trim a clip shorter. It's all instantaneous.
 
What NLE and hardware are you using? All NLEs are instantaneous while cutting, trimming, playing back, applying color grade, even keying. There is no "process time" when I trim a clip shorter. It's all instantaneous.

That statement is fundamentally false for anything done on a computer. Perhaps you meant to say it feels instantaneous because you don't appreciate the time it takes to process? I use Premiere and I assure you there is downtime when editing video. I promote iMovie for people who don't want to learn a complex program and that NLE has a lot of downtime when working with clips they took on their iPhone X.
 
That statement is fundamentally false for anything done on a computer. Perhaps you meant to say it feels instantaneous because you don't appreciate the time it takes to process? I use Premiere and I assure you there is downtime when editing video. I promote iMovie for people who don't want to learn a complex program and that NLE has a lot of downtime when working with clips they took on their iPhone X.

Yes, what I mean by instantaneous is I do not perceive any time when I click my mouse to cut or trim. I mean, sure, it takes time for the light from the monitor to reach my eye.

Since you want to argue semantics, what word or phrase would you prefer I use? Real time?

I use Final Cut Pro, Premiere Pro, and DaVinci Resolve, and I see zero "down time". I play native video files in real time without transcoding. It takes time to export the final video, sure, but I do that in the background using Media Encoder with Premiere or Compressor with FCP, so I still don't have "down time" in the NLE.

Maybe you can give me a detailed example of what causes all this "down time" with Premiere Pro, and tell me what your hardware is.
 