In my 20+ years as a videographer, I have never seen a display of arrogant ignorance like this. And it wouldn't be so pathetic if it were just misinformation from someone who just started getting into the world of video, or if it were an opinion, but there is not one single thing in what you wrote that is correct from a technical standpoint. Plus you salted your ignorance with LOL emojis, which makes you look like, well, a certain celebrity/politician famous for making ignorant statements and having no clue what he's talking about.
So let's see:
😂…Never mind that the encoding and bitrate for a BD is dismal compared to even the worst video hosting sites and "the cloud", whatever that even means to you, has NO LIMITS on bitrate!
Probably 99% of the people reading this know that you made an ass of yourself with this statement, since it's common knowledge that Blu-rays have far higher bitrates than anything from even the best streaming services, and 4K Blu-rays have higher bitrates still. A 1080p Blu-ray normally has a variable bitrate from 16 to 50 Mbps. On 4K Blu-rays I have seen bitrates as low as 35 Mbps and as high as 90 Mbps, and that is with the HEVC codec, which has a better compression algorithm than the codec used on most regular Blu-rays, H.264, a great codec in itself.
Streaming services go from terrible to decent when it comes to bitrates, with Disney+ and Apple TV+ having the best picture quality, and it stands to reason that this is accomplished by using higher bitrates. However, it is also widely accepted that a movie on HD Blu-ray will look better than the same movie in 4K on a streaming service. This is because the encoding software and hardware for streaming services have to work at far lower bitrates, so they have to smooth the picture with DNR and other techniques, and a great chunk of the color data has to be thrown out. That's why, when I compare movies I own on Blu-ray to the streaming equivalent, even in my iTunes library (which serves the same encodes as any series or movie on Apple TV+), the Blu-ray has more vibrant colors and far more visible detail, especially in the film grain. Even if the streaming version is in 4K Dolby Vision and I only have the movie on HD Blu-ray, the Blu-ray always wins.
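You don't have to take my word for the numbers; the back-of-the-envelope math is trivial. Here's a quick Python sketch using the ballpark bitrates I just quoted (illustrative figures, not exact specs for any particular disc or title):

```python
# Rough file-size math: size_GB = bitrate_Mbps * seconds / 8 / 1000
# Bitrates are the ballpark figures quoted above, not exact specs.
RUNTIME_S = 2 * 60 * 60  # a typical 2-hour movie

sources = {
    "Netflix HD (early days)": 5.7,   # Mbps, H.264
    "Netflix 4K DV cap": 16,          # Mbps, HEVC
    "1080p Blu-ray (typical)": 35,    # Mbps, H.264
    "4K Blu-ray (high end)": 90,      # Mbps, HEVC
}

for name, mbps in sources.items():
    size_gb = mbps * RUNTIME_S / 8 / 1000
    print(f"{name:24s} {mbps:5.1f} Mbps -> ~{size_gb:5.1f} GB per 2h movie")

# A dual-layer BD-50 holds ~50 GB and a BD-100 ~100 GB, so a disc can
# comfortably carry bitrates that would be ruinous for a streaming CDN.
```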
Sure, "the cloud" has no limits on bitrate in the sense that you can encode a video at 100 Mbps and just upload it to your Google Drive. That doesn't mean it's going to play every frame smoothly. Most likely it won't, because even if you have the best fiber internet connection, smooth playback depends on a lot of factors that I won't go into here.
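Just to illustrate one of those factors: a 100 Mbps file needs a sustained 100 Mbps of delivery, and real connections fluctuate from second to second. A toy Python model with made-up throughput numbers (purely illustrative, not measurements of anything):

```python
# Toy playback-buffer model: the player drains the buffer at the video
# bitrate and fills it at whatever the network delivers each second.
# The throughput samples below are invented purely for illustration.
VIDEO_MBPS = 100.0
throughput_mbps = [120, 110, 95, 60, 80, 130, 70, 100]  # hypothetical samples

buffer_mbit = 0.0
for t, net in enumerate(throughput_mbps):
    buffer_mbit += net - VIDEO_MBPS  # fill minus drain
    if buffer_mbit < 0:
        print(f"second {t}: buffer empty -> stutter/rebuffer")
        buffer_mbit = 0.0
print(f"buffer at end: {buffer_mbit:.0f} Mbit")
```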
However, the streaming services do have their self-imposed bitrate caps. For example, Netflix will only go as high as 16 Mbps (rounding down; I don't remember the exact number), and only for their own titles in 4K Dolby Vision. For the rest it's a horrible mixed bag that can be decent, mediocre, or downright terrible, even for some of their own HD titles, like the great movie "Anon" and some others. Movies in HD that are not produced by them can go as low as 1 Mbps in HEVC. When it first started, Netflix's quality for HD titles wasn't terrible, just mediocre at 5.7 Mbps in H.264, but when HEVC came out, their engineers saw that it had much better compression and decided they could compress the living hell out of everything. The result is that almost everything on Netflix looks like crap now.
I could keep going on and on, but the point here is, you are absolutely WRONG.
As if using the archaic, painfully outdated AVCHD codec was somehow a sign of quality? 😄
This also shows your ignorance to anyone who knows even a little bit about video codecs and formats. AVCHD is not a codec; it's a format that is very useful for consumer applications, and still better than almost everything on streaming services. AVCHD uses the H.264 codec to record video from consumer and prosumer camcorders onto DVD-R or DVD-R DL discs, onto memory cards (usually SDHC cards), or, less commonly, onto HDDs. It usually goes as high as 16 Mbps for consumer cameras and around 25 Mbps for prosumer cameras. It was a great format; I have two camcorders that record great video onto SDHC cards.
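Anyone can verify the format-versus-codec distinction themselves: point ffprobe (part of FFmpeg) at any .MTS clip from an AVCHD card and it will report what codec is actually inside. A minimal Python wrapper, assuming ffprobe is on your PATH; the file path is just the typical AVCHD card layout, yours will differ:

```python
# Ask ffprobe what codec is inside an AVCHD .MTS file. AVCHD is the
# container/format; the video codec it wraps is H.264 ("h264" below).
# Assumes ffprobe (from FFmpeg) is installed and on your PATH.
import subprocess

codec = subprocess.run(
    ["ffprobe", "-v", "error",
     "-select_streams", "v:0",
     "-show_entries", "stream=codec_name",
     "-of", "default=noprint_wrappers=1:nokey=1",
     "PRIVATE/AVCHD/BDMV/STREAM/00000.MTS"],  # typical AVCHD card layout
    capture_output=True, text=True, check=True,
).stdout.strip()

print(f"container: AVCHD (MPEG-TS based), video codec: {codec}")  # -> h264
```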
So AVCHD is not a codec. Again, you are absolutely WRONG.
Burning to BD and especially DVD has been a massive downgrade for 10+ years in comparison.
In this phrase you made the only statement that could be partially true: that usually (not always), burning to DVD has been a massive downgrade in comparison. If you mean in comparison to the usual HD content on streaming services, then generally yes. Although, if you take the worst-encoded movies Netflix has, and you had a high-quality version of one of them on DVD played on a good 4K Blu-ray player with decent upscaling, the DVD would probably look better. But generally speaking, yes, the HD streaming version will look better, because DVD was crap, even worse than the LaserDisc format it replaced. LaserDisc was 480i but analog and uncompressed; DVD was 480i or 480p and used the MPEG-2 codec, usually at around 6 Mbps for the typical movies you could buy or rent at Blockbuster. MPEG-2 was a terrible codec that could only look good at 40 or 50 Mbps.
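The disc math shows why DVD was stuck at those miserable bitrates. Same quick Python arithmetic, using the nominal ~4.7 GB capacity of a single-layer DVD-5:

```python
# Why DVD bitrates were so low: how long a DVD-5 (~4.7 GB) lasts at a
# given video bitrate. minutes = capacity_Mbit / bitrate_Mbps / 60
DVD5_MBIT = 4.7 * 8 * 1000  # ~37,600 Mbit on a single-layer disc

for mbps in (6, 40, 50):
    minutes = DVD5_MBIT / mbps / 60
    print(f"MPEG-2 at {mbps:2d} Mbps -> ~{minutes:4.0f} minutes of video")

# At the 40-50 Mbps where MPEG-2 actually looks good, a DVD-5 holds
# roughly 12-16 minutes; even at 6 Mbps it barely fits a 2-hour movie,
# which is why commercial discs were dual-layer DVD-9s (~8.5 GB).
```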
But as far as your statement that burning to BD has been a massive downgrade, you're just wrong, although given your obvious ignorance about video technology, I can only assume that you burned BDs with some crappy consumer software like CyberLink Power-something, the type of software that uses the worst codec implementations and introduces all kinds of digital artifacts.
Regardless, to say that burning to Blu-ray has been a massive downgrade for over ten years is just blatantly ignorant.
And good luck getting 4K on to one (without paying through the nose, assuming you can even still get the needed tech). But then that's probably too modern anyway?
It's true that authoring a 4K Blu-ray, or even an HD Blu-ray, is difficult; hence my post asking if anyone has come across a decent authoring tool. However, authoring a 4K Blu-ray is not impossible, as long as you have a Mac with Final Cut Pro and a Windows PC, because certain programs you need are free but don't have a Mac version. The Windows PC doesn't have to be new; in fact, my PC is 12 years old. Ideally you also have a good burner in that PC. I have authored 4K Blu-rays of videos I shot myself, so I know it's possible.
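For anyone curious, here is a rough sketch of the encode step in that kind of workflow: after exporting a master from Final Cut Pro, you re-encode it to a high-bitrate HEVC file that the Windows-side muxing/authoring tools can take. This is a Python/FFmpeg sketch with illustrative settings only; the file names and bitrates are placeholders, and I'm not claiming this exact recipe is a certified UHD-BD-compliant spec:

```python
# Illustrative re-encode of an FCP master into a high-bitrate HEVC file
# suitable as input for Blu-ray muxing/authoring tools on Windows.
# Assumes ffmpeg built with libx265 is installed; "master.mov" and the
# bitrates are placeholders, NOT a certified UHD-BD-compliant recipe.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "master.mov",
    "-c:v", "libx265",
    "-preset", "slow",
    "-b:v", "50M",              # average ~50 Mbps, 4K Blu-ray territory
    "-maxrate", "60M",
    "-bufsize", "120M",
    "-pix_fmt", "yuv420p10le",  # 10-bit 4:2:0, as used on UHD discs
    "-c:a", "copy",             # keep the master's audio untouched
    "uhd_for_authoring.mkv",
], check=True)
```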
So I guess you have vinyls made of all your audio and drive a horse-drawn carriage to work?? 🤣
I do have a collection of about 300 vinyl records. As for your smartass question, I drive a Jeep Cherokee Trailhawk.
Blurays? Or any optical medium… reliable??!
Yeah, again, maybe if you pay through the nose and go to inordinate, redonkculous lengths to keep them safe. Totally worth it!
Hilarious.
Absolutely. Verbatim still makes blank optical media of the highest quality. In fact, I have never had a "coaster" with any Verbatim spindle, and I have used a lot of them. I used to get coasters with Memorex, and I thought that was normal, but after the first Memorex BD-Rs I bought in 2008 went bad within three months and I lost a couple of videos I had made, I switched to Verbatim and realized that Memorex is garbage.
And I still have all kinds of Verbatim optical media burned from 2008 until now, and it all works. And I didn't go to any "redonkculous lengths to keep them safe." Of course, I didn't use them as frisbees, put my fingers all over them like most people do, or stick them in boxes in the attic. They were all kept at room temperature.
So once again, you are WRONG. In everything you stated, with your stupid emojis trying to make me look stupid, you made so many ignorant remarks that you ended up looking like an ignorant jerk yourself.
And yes, it was (fortunately) officially removed from everywhere. The age-old unreliable frameworks were just barely kept on life support for as long as possible and were never a part of AV Foundation. And seeing that Apple is a forward-thinking company and maintaining frameworks for (maybe) 164 people is not how they roll… 🤷🏼♂️
Apple is (mostly) a forward-thinking company, at least on most things, save for the fact that you still can't find aptX or LDAC Bluetooth on any of their devices, and that they didn't put a USB-C port on iPhones until Europe forced them to. But in most things they are a great company. That's why I have a Mac Studio M1 Ultra that cost me a fortune, a MacBook Air M2 specced up as high as it can go, an iPhone 14 Pro Max with 1 TB of storage, and many more Apple products.
But I still think it was an asinine move on their part to remove Blu-ray support all of a sudden, without even a dialog saying something like "Beware that upgrading to FCP 10.4.8 will remove Blu-ray disc authoring support." At least I would've had the choice.
But this is the only part of your post that can at least be called a difference of opinion. The rest is just sad.