How old is Mr. Icahn? Perhaps he's simply out of touch with Apple. "Yes, they'll rule everything!" Yeah. TVs? There's no real profit in them, which is why they generally aren't made in the US anymore. Apple was right to drop the idea. They would be better off partnering with someone who already makes them to add built-in ATV, FaceTime, etc.
 
Lol, really?

I thought that idea was DOA years ago. Why make an $8,000 TV (or a $50,000 gold-plated one) nobody wants to buy instead of enhancing a $69 box that can connect millions to Apple's platform?

An actual Apple TV set was never a good idea, period.
 
So, basically an Xbox Kinect... with added functionality.

Our TV entertainment is turning into more of an all-in-one system...



I thought 3D TV and the Kinect both flopped hard in recent years.

But 4K will eventually become the new 1080p, though presumably not for another few years.
 
I'm trying to imagine how they'd have it on display at the retail stores, because surely they'd want to show off how thin they could make a TV.

That seems to be the purpose of this otherwise non-sequitur display in the local Apple store:
 

Attachments

  • IMG_5427.jpg (519.4 KB)
HD took forever to start catching on, though (it first hit the US in '98, reached 25% adoption in 2008, and we are currently at something like 70-80% adoption depending on whose numbers you use), and if companies hadn't simply stopped making SD gear, the transition would've been even more drawn out. Without the federally mandated move from analog to digital broadcasting (which meant everyone needed to buy new broadcasting and receiving gear anyway, so why not go HD), HD wouldn't have happened until streaming, IMO, because it would have been too cost-prohibitive.

With streaming, though, it's just more bits flying across the Internet. In something like five years, YouTube videos went from 240p to 1080p and it all happened in the background. Same for when Netflix went from SD to HD and now to 4K. Little to no disruption on the consumer side of things (unlike the analog-to-digital switch).
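
To put rough numbers on "more bits flying across the Internet," here's a quick back-of-the-envelope sketch in Python. The per-tier bitrates are my own ballpark assumptions for typical streams, not official figures from YouTube or Netflix:

```python
# Ballpark streaming bitrates by tier (Mbps). Illustrative assumptions,
# not official figures from any service.
bitrates_mbps = {
    "240p": 0.4,
    "480p (SD)": 1.5,
    "1080p (HD)": 5.0,
    "4K (UHD)": 16.0,
}

base = bitrates_mbps["240p"]
for tier, mbps in bitrates_mbps.items():
    print(f"{tier:>10}: ~{mbps:4.1f} Mbps ({mbps / base:.0f}x the 240p stream)")
```

The point being: each quality bump is just a bigger number on the same pipe. No new towers, no converter boxes.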




I doubt people will actually notice (just like most people didn't realize they were watching SD on their HD TV), but they'll believe the marketing, so they'll think they notice, and that perception, plus our consumer-centric culture, is all it will take to move product.



The barrier to entry is actually surprisingly low. There are a number of 4K cameras out today, including inexpensive GoPros and DSLRs. 3D requires specialized products, whereas shooting and editing in 4K is just like HD (or even SD), only with bigger file sizes (and sometimes smaller ones, thanks to better compression). I've got a 2009 Mac Pro and I can edit 4K on it in Premiere Pro, so you don't even need a beast of a machine for it. Yeah, I have to edit at 1/2 or sometimes 1/4 res if I'm pulling enough streams, but even at 1/4 res that's still HD quality on playback.

Take a look at Craigslist ads in NY or LA and you'll see a lot of no-budget movies, music videos, TV pilots, etc., shot in 4K and looking to be finished in 4K. Need it or not, it is already on its way to becoming the new norm.

Edit: I was going to counter you on the 4K editing, but you nerfed your own response. And then you received some great follow-up responses from a couple of other users, so there's no need to be redundant.

----------

I don't think any OTA broadcasts will be 4K in the foreseeable future (if ever). Like you mentioned, the government mandate to go to digital broadcasting really opened the door for HD, since it forced broadcasters to replace all of their equipment. 4K broadcasting would require the same thing (as well as new TVs or converter boxes for consumers), which is why I doubt it will happen.

The signal is already digital. What would a converter do for consumers? You can't upres to 4K on a 1080 TV.

----------

I feel we all know the real reasons behind this are pretty clear:

1. Apple does not make anything other than some circuit-board designs, a pretty case to put things in, and the software to tie it together.

Hardware-wise, there is nothing they can bring to the table.

They know they cannot compete on price/quality.

2. More importantly, Apple only really wants to be involved in hardware it can CONTROL, where it defines how the user will use the product.

How many people would want to buy a locked-down TV that probably would not be able to connect to competing services?

It doesn't really matter what apps an Apple TV set's user interface would or wouldn't let you install. You can still connect a Chromecast or Roku to the TV's HDMI ports. And if Apple somehow restricted HDMI connectivity (which would never happen), most people have receivers, Blu-ray players, and gaming systems with external inputs that can be used in place of the TV's inputs. There is no such thing as locking down a TV.

----------

While I don't think a TV set makes sense for Apple, it's not because of profit margins. It's because the future is mobile. Also, I think people have shown they don't want 'smart' TVs. They want a basic TV with great picture quality, with the box they connect to it being the 'smart' part. People like Gene Munster are just looking at Apple's revenue base and thinking Apple will have a hard time significantly growing the top line with a $100 box. But I think they need to look at other areas, like the automobile, for the big-ticket item to drive top-line growth. And who knows what else Apple has in the wearables pipeline, where they've shown they're not afraid to go luxury.

Sorry, but most people would rather watch a movie on their television than on their iPad or iPhone. The future of Apple is its mobile products, if that's what you were intending to say.
 
Samsung UE48HU7500. One of the smaller TVs, but one of the best-reviewed 4K sets on the market (it cost $1,500). I know about the general rule for distance from the couch. However, I have done extensive tests at home with content from different sources, and I can tell you that there is a large difference between 1080p and 4K content of the same type.

The quality of Netflix 4K is actually very good. House of Cards especially was a big change after I activated the 4K option.

So, in general, I know about the principle of distance vs. screen size; however, in my personal, subjective opinion the difference in quality is definitely and easily perceptible from even 12 feet. That might not be due only to pixel size but also to other attributes. It is a great TV.

That's what I'm saying: maybe it is the set and not the 4K. Also, much 1080p content (especially streaming, but sometimes even cable) is so heavily compressed that you may get a better image from the 4K stream for that reason alone.

I'll even wager that if you see a difference at distances greater than the ones I just stated, then it is probably the content itself that makes the main difference and not the set. Try buying demo-quality 4K and 1080p content; you can probably put it on a USB key and play it that way, and I'm telling you that at the distance most people put their couch, there will be no difference if the only difference in the content is resolution.
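
For what it's worth, here's the back-of-the-envelope math behind that distance rule, assuming the textbook ~1 arcminute of resolving power for 20/20 vision. It's a simplification (it ignores compression, contrast, and panel quality, which is exactly why those other factors can still make one set look better):

```python
import math

ARCMINUTE = math.radians(1 / 60)  # conventional 20/20 acuity limit

def pixel_cutoff_ft(diagonal_in, horizontal_px):
    """Distance (feet) beyond which one pixel subtends less than
    1 arcminute, i.e. where extra resolution stops being resolvable."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # 16:9 panel width
    pixel_in = width_in / horizontal_px
    return pixel_in / math.tan(ARCMINUTE) / 12

for label, px in [("1080p", 1920), ("4K", 3840)]:
    d = pixel_cutoff_ft(48, px)
    print(f'48" {label}: pixels stop being resolvable beyond ~{d:.1f} ft')
```

By that crude model, a 48" set stops showing 1080p pixel structure past roughly 6 feet, so at 12 feet the resolution alone shouldn't be visible; whatever difference you see is more likely the panel and the source bitrate.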

Right now, a top-of-the-line 60-inch plasma (too bad they stopped making them last year because the profit margins for makers were too low) will blow run-of-the-mill 4K out of the water at any distance people regularly put their couch. Only OLED sets are really comparable. Anyone who has seen a top-of-the-line plasma, or better yet an OLED, will find their LCD/LED TV to be utter crap! Too bad OLEDs are still too expensive (and their durability is also in question).
 
I thought that idea was DOA years ago. Why make an $8,000 TV (or a $50,000 gold-plated one) nobody wants to buy instead of enhancing a $69 box that can connect millions to Apple's platform?

An actual Apple TV set was never a good idea, period.

But that never stopped analysts from predicting its arrival!
 
The signal is already digital. What would a converter do for consumers? You can't upres to 4K on a 1080 TV.

Over-the-air broadcasting in 4K is not within the current ATSC specs, and the proposed ATSC 3.0 spec (which includes 4K) is not backward compatible with current hardware. So 4K over-the-air broadcasting will require new equipment to send the signal and new equipment to receive it.

Wired (emphasis mine):

Pulling in a 4K signal over the air should also be possible, but it will take years if it happens at all. First, major networks will need to decide to broadcast content in 4K and upgrade their equipment. Then they’ll need to get on the same page regarding next-generation broadcasting technologies.

The most promising of those is ATSC 3.0, a proposed standard for television tuners that would not only allow over-the-air 4K broadcasts, but could also broadcast directly to mobile devices and add interactive elements to broadcast TV. That’s at least a few years out, and not all major networks are fully behind ATSC 3.0. Also, because ATSC 3.0 isn’t backwards-compatible with the ATSC tuners in today’s TVs, you’ll need new hardware.


I was going to counter you on the 4K editing, but you nerfed your own response

Counter away. Here's a dirty little secret about what you assumed was me nerfing my response: it's very common for editors to reduce the playback resolution in the NLE because it increases performance. I did it when I cut SD, I do it when I cut HD, and I've done it when I've cut 4K. Typically I do it when I start adding FX and/or pulling enough streams that the machine won't play back smoothly in real time anymore.

I went through the SD-to-HD shift as a professional and I'm starting to do the same thing from HD to 4K, and the only difference is file size and needing a new monitor if you want to view the footage at 1:1. Sure, gear gets upgraded over time (more powerful computers, faster storage, etc.), but that's not a workflow change any more than going from an iPhone 5 to an iPhone 6 requires a change in how you use a smartphone. Going from tape to file-based required workflow changes, but getting handed a drive full of HD footage isn't functionally different from getting handed a drive full of 4K footage. Even for productions that do an offline/online workflow (i.e., shoot in high res but use low-res proxies for the editing), the same basic workflow applies whether you are working with SD/HD or HD/4K (or, in the case of Gone Girl, 2.5K/6K).

On the production side there's not really anything that drastically changes either. You might need better-quality glass on the front, bigger/faster media to record to, and ideally a 4K monitor, but those aren't workflow changes, just gear changes. Given the advances in onboard compression you might not even need really big or really fast storage, relatively speaking. For example, the GH4 can record 4K at 100Mbps, and 100Mbps is also the bit rate of DVCPro HD (a thin-raster HD codec that's nearly 15 years old).
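
To put that 100Mbps figure in storage terms, here's the quick math. It's a straightforward unit conversion, nothing camera-specific:

```python
def gb_per_hour(mbps):
    """Convert a recording bitrate (megabits/sec) to gigabytes/hour."""
    return mbps / 8 * 3600 / 1000  # Mb/s -> MB/s -> MB/hour -> GB/hour

# The GH4's 4K mode and the ~15-year-old DVCPro HD codec both sit
# around 100 Mbps.
print(f"100 Mbps works out to ~{gb_per_hour(100):.0f} GB per hour of footage")
```

Roughly 45 GB an hour, which any cheap SD card or consumer drive can keep up with. That's why the storage side of 4K isn't the hurdle people assume.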

3D is a whole different story, though. The majority of films shot in 3D are shot with two cameras on a rig (sometimes at a 90-degree angle to one another), so right off the bat you need two cameras instead of one, you need matched lens pairs, you need the rig to hold the cameras, you need to worry about the interocular distance, etc. Once you get to post you have to make sure your NLE can properly handle 3D, you have a specialist adjusting the convergence on a shot-by-shot basis so the desired 3D result is achieved without giving the audience a splitting headache, etc. Finally, of course, on the viewing end you need a 3D TV/projector and 3D glasses for the audience.

There's a reason more and more 3D films are actually being shot in 2D and then converted to 3D in post. It's way easier to shoot in HD, 2K, 4K, 6K, whatever, and then do the 3D conversion at the very end of post than it is to actually shoot in 3D to begin with.
 
But that never stopped analysts from predicting its arrival!

Well, you could try thinking a bit beyond the established paradigms of what TVs are now. Bear with me and read my explanation.

TVs used to be only about the screen, but recently the smart TV layer has become the second most important element of the TV. The software now controls all of the functions, even basic TV settings. But TV producers have little expertise in software development and user workflow. This is evident in basically every TV available on the market. Even the best TVs show great deficiencies in software while delivering good picture quality. As prices for good TV panels come down because of improved production processes and increased competition, the software side becomes more expensive to produce, because consumers want more options and there are more, and more diverse, content providers with dedicated apps. Producers are struggling with software production and testing, which increases costs overall.

So if you look at the overall cost development of TVs, you will see that software and testing are becoming an ever bigger portion of that cost.

A possible solution is to completely divorce the operating system of the TV from the panel. Leave the panel production to those who do it best and the software to the software producers. A great example is actually the Cinema Display. It has a timeless design and will look good in any environment for years and years. All processing and settings are left to the connected device, which works brilliantly. In that case there is no need to replace a screen every x years because it has become obsolete. A good 4K screen would be relevant for the next 5-8 years at least.

Even Apple could command a sizeable margin from the production (well, contracting it out to Foxconn or Sharp) of just a simple screen with little other electronics in the panel: just a power supply, input controllers, and a screen. That would push the bill of materials down enough to offer a screen at a price competitive with producers that still sell smart TVs.

So, it is possible. A smart box such as the Apple TV that manages not only iTunes content but also cable TV input would do the trick, and at a price of a few hundred bucks ($199-299) there are not a lot of consumers who will complain about it needing to be replaced every two to three years.
 
Counter away. Here's a dirty little secret about what you assumed was me nerfing my response: it's very common for editors to reduce the playback resolution in the NLE because it increases performance. I did it when I cut SD, I do it when I cut HD, and I've done it when I've cut 4K. Typically I do it when I start adding FX and/or pulling enough streams that the machine won't play back smoothly in real time anymore.

I went through the SD-to-HD shift as a professional and I'm starting to do the same thing from HD to 4K, and the only difference is file size and needing a new monitor if you want to view the footage at 1:1. Sure, gear gets upgraded over time (more powerful computers, faster storage, etc.), but that's not a workflow change any more than going from an iPhone 5 to an iPhone 6 requires a change in how you use a smartphone. Going from tape to file-based required workflow changes, but getting handed a drive full of HD footage isn't functionally different from getting handed a drive full of 4K footage. Even for productions that do an offline/online workflow (i.e., shoot in high res but use low-res proxies for the editing), the same basic workflow applies whether you are working with SD/HD or HD/4K (or, in the case of Gone Girl, 2.5K/6K).

On the production side there's not really anything that drastically changes either. You might need better-quality glass on the front, bigger/faster media to record to, and ideally a 4K monitor, but those aren't workflow changes, just gear changes. Given the advances in onboard compression you might not even need really big or really fast storage, relatively speaking. For example, the GH4 can record 4K at 100Mbps, and 100Mbps is also the bit rate of DVCPro HD (a thin-raster HD codec that's nearly 15 years old).

It was me who introduced this topic... so let me clarify.

You're exactly right... there's no difference editing 4K vs editing HD if your computers and storage can handle it.

I didn't mean the workflow was different... I meant that the amount of extra data TV shows have to deal with will increase.

I think we've already established that most people won't even see the difference between 4K and HD... so why bother? It just seems like TV shows will be creating and archiving a bunch of bigger files for no apparent reason.

On the other hand... SD to HD was a worthwhile upgrade that people definitely noticed.
 
No compelling reason for Apple to get into this market... at least not yet. I think the big issue with TV isn't picture quality but user interface and content. Those are two things a set-top box can change, and I am assuming Apple will start with that at WWDC. I don't think the TV service will be introduced until the fall... just in time for football and the TV season.

I don't understand this argument. What content do you think is going to be on this new streaming service? It's going to be the same content you can currently get on cable and satellite TV. Apple hasn't got a bunch of new channels to reveal. The only difference will be the shape of the box and the UI, and I couldn't really care less about either.

I don't have cable because I don't need the extra content. I'm fine with what I get from the free-to-air channels. That isn't going to change with the new Apple TV box. The App Store might be useful, although it will only have the same games I can already get on PlayStation or Xbox.

I don't see Apple's entry into the TV streaming market changing anything at all.
 
I didn't mean the workflow was different... I meant that the amount of extra data TV shows have to deal with will increase.

I think we've already established that most people won't even see the difference between 4K and HD... so why bother? It just seems like TV shows will be creating and archiving a bunch of bigger files for no apparent reason.

Ah, I see better where you are coming from.

From a production standpoint, the benefit of shooting in 4K right now is that if you are delivering in HD you get more flexibility for things like reframing and image stabilization, more image data for VFX, etc. For example, I know some reality shows that shoot their interview segments in 4K because they get a close-up, a medium, and a wide shot all from one camera. Or if it's a two-person interview they get a 2-shot plus a single of each interviewee from a single camera. They don't shoot the entire show in 4K because that's currently cost-prohibitive when you are talking about hundreds, if not thousands, of hours of footage per show.
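
If it helps, here's a rough sketch of the punch-in headroom that buys you, assuming UHD (3840x2160) acquisition and a 1920x1080 deliverable. The numbers are just arithmetic, not tied to any particular camera:

```python
# Reframing headroom when acquiring in UHD and delivering in HD.
# Assumes a 3840x2160 source and 1920x1080 delivery (both 16:9).
src_w, src_h = 3840, 2160
out_w, out_h = 1920, 1080

max_punch_in = src_w / out_w  # same ratio vertically
crop_share = (out_w * out_h) / (src_w * src_h)

print(f"Max punch-in without upscaling: {max_punch_in:.1f}x")
print(f"A 1080 crop covers only {crop_share:.0%} of the 4K frame")
```

So a single locked-off 4K interview camera really can yield a wide, a medium, and a 2x close-up, all at full delivery resolution.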

There is also a future-proofing argument: a slightly larger investment now can give the product more legs down the line (which means a longer chance to make money). For example, old TV shows shot on film can be remastered in HD (or even 4K), but TV shows recorded onto SD videotape will always be stuck at that resolution. There's certainly money to be made in offering back-catalog movies and TV shows in glorious, remastered 4K (either directly or via a streaming service).

For some there is a prestige element to using 4K, too, since it's newer and higher resolution and implies a higher budget, better production value, etc. That alone might be enough to hook more people in and thus make the producers more money. It's kinda small right now, but there is certainly a growing vibe in Los Angeles of "Oh, you are *only* shooting in HD?"

Media companies are always looking for ways to one-up each other, so Amazon and Netflix dropping the 4K gauntlet might mean we see the networks starting to offer some shows in 4K via Hulu or CBS All Access. It's a tough place for the networks right now, though, because there's no unified ratings tracker for online viewing yet, so any online views risk cannibalizing the TV audience, which means lower ratings, which means less ad revenue. If Nielsen could get its act together and provide a comprehensive measure of online ratings, I'm sure we'd see the TV networks branching out more into online distribution.

With all this being said, 4K is a very new thing and we're still in the chicken/egg zone where content creators are asking 'Where's the distribution?' and distributors are asking 'Where's the content?'. Someone always has to go first.

On the other hand... SD to HD was a worthwhile upgrade that people definitely noticed.

I don't see it as nearly so black and white. Many people still happily watch SD channels and SD DVDs on their HDTVs. Sure, in an A/B comparison the HD might really jump out at them, but for everyday use they don't care enough to be bothered by the difference. I'd wager that most people ended up with an HDTV because their SD TV died and HD was the only option, or because they believed the marketing and loved how awesome their DVDs looked now that they were 'in' HD. ;)

1080 vs. 720 is another example of marketing leading the way. Given the average living-room viewing distance, HDTV screen size, and the compression used for distribution, most people wouldn't be able to see a difference between 1080 and 720. Yet 720 gets poo-pooed while the marketing term 'Full HD' for 1080 has made its way into common usage.

Those same marketing forces, plus the eventual phasing out of HDTVs, will usher in the 4K era the same way they ushered in the HD era. Not because consumers demanded it, but because consumers were given no other choice. ;)

I wrote this a bit at a time throughout the day so apologies if it seems disjointed.
 
Ah, I see better where you are coming from.

From a production standpoint, the benefit of shooting in 4K right now is that if you are delivering in HD you get more flexibility for things like reframing and image stabilization, more image data for VFX, etc. For example, I know some reality shows that shoot their interview segments in 4K because they get a close-up, a medium, and a wide shot all from one camera. Or if it's a two-person interview they get a 2-shot plus a single of each interviewee from a single camera. They don't shoot the entire show in 4K because that's currently cost-prohibitive when you are talking about hundreds, if not thousands, of hours of footage per show.

There is also a future-proofing argument: a slightly larger investment now can give the product more legs down the line (which means a longer chance to make money). For example, old TV shows shot on film can be remastered in HD (or even 4K), but TV shows recorded onto SD videotape will always be stuck at that resolution. There's certainly money to be made in offering back-catalog movies and TV shows in glorious, remastered 4K (either directly or via a streaming service).

For some there is a prestige element to using 4K, too, since it's newer and higher resolution and implies a higher budget, better production value, etc. That alone might be enough to hook more people in and thus make the producers more money. It's kinda small right now, but there is certainly a growing vibe in Los Angeles of "Oh, you are *only* shooting in HD?"

Media companies are always looking for ways to one-up each other, so Amazon and Netflix dropping the 4K gauntlet might mean we see the networks starting to offer some shows in 4K via Hulu or CBS All Access. It's a tough place for the networks right now, though, because there's no unified ratings tracker for online viewing yet, so any online views risk cannibalizing the TV audience, which means lower ratings, which means less ad revenue. If Nielsen could get its act together and provide a comprehensive measure of online ratings, I'm sure we'd see the TV networks branching out more into online distribution.

With all this being said, 4K is a very new thing and we're still in the chicken/egg zone where content creators are asking 'Where's the distribution?' and distributors are asking 'Where's the content?'. Someone always has to go first.


I don't see it as nearly so black and white. Many people still happily watch SD channels and SD DVDs on their HDTVs. Sure, in an A/B comparison the HD might really jump out at them, but for everyday use they don't care enough to be bothered by the difference. I'd wager that most people ended up with an HDTV because their SD TV died and HD was the only option, or because they believed the marketing and loved how awesome their DVDs looked now that they were 'in' HD. ;)

1080 vs. 720 is another example of marketing leading the way. Given the average living-room viewing distance, HDTV screen size, and the compression used for distribution, most people wouldn't be able to see a difference between 1080 and 720. Yet 720 gets poo-pooed while the marketing term 'Full HD' for 1080 has made its way into common usage.

Those same marketing forces, plus the eventual phasing out of HDTVs, will usher in the 4K era the same way they ushered in the HD era. Not because consumers demanded it, but because consumers were given no other choice. ;)

I wrote this a bit at a time throughout the day so apologies if it seems disjointed.

Future-proofing is a ways off still. The bottleneck is broadcast, and that won't change until a government mandate comes along or streaming becomes the replacement entirely. All of the companies we deliver our weekly episodes to (Dish, ATT, TWC, etc.) still take the majority of their deliveries on Beta tape. Antiquated systems are still in place. And as much as you want to praise Netflix or Amazon for the two and a half shows they have in 4K, the fiber infrastructure isn't in place to deliver it fast enough to much of the U.S.

I agree with you on a couple of points. My company shoots on Red in 2K, 4K, and 6K. We mainly edit in 2K and use 4K for the occasional shot that needs to be manipulated while maintaining quality. I've never heard of anyone reducing playback quality in their NLE; maybe if they're editing effects in AE or something. Speed at the cost of quality?
 
Maybe Jobs was referring to the Apple TV when he said he'd "cracked it." A TV is a TV when all is said and done, and it's all about the content as well as picture quality. Bells and whistles are fine as an attraction but don't hold up too well in the long run (ahem... 3D -cough!-). Ever try using SMART TV on a Samsung? My God, it's slow as hell, and the router was just down the hallway. Now the Apple TV I like (I have two of them), and I am eagerly waiting to see what Apple will announce at WWDC this June (I hope it's a new Apple TV). I am liking the new channels like Tastemade, which, IMHO, offers way better programs than what Food Network is offering these days. Use Plex with Apple TV, and if you have access to download the shows you like or are curious about, it's better than cable.

I think it's funny to complain about Smart TV on a Samsung and then like Apple TV. Apple TV has just about the worst interface and navigation of any equivalent device. Hopefully the new device, when announced, will have a real remote.
 
I think it's funny to complain about Smart TV on a Samsung and then like Apple TV. Apple TV has just about the worst interface and navigation of any equivalent device. Hopefully the new device, when announced, will have a real remote.

I own both a Samsung Smart TV and two Apple TVs. While I agree that the Apple TV needs an update, it is consistent in its use, fast, and easy to understand. The Samsung Smart TV interface is not the worst one around (Philips takes that prize), but it is worse than the ATV in the sense that settings are difficult to find, it is buggy and slow, and it readily forgets settings and the Bluetooth remote connection.

So, yeah, the ATV is bad at managing large libraries and the remote is minimalistic, but it is miles ahead of all smart TV interfaces on the planet in terms of simplicity, speed, and absence of bugs. In addition, the Samsung remotes that came with my TV are just as bad at entering text as the Apple remote.
 
I own both a Samsung Smart TV and two Apple TVs. While I agree that the Apple TV needs an update, it is consistent in its use, fast, and easy to understand. The Samsung Smart TV interface is not the worst one around (Philips takes that prize), but it is worse than the ATV in the sense that settings are difficult to find, it is buggy and slow, and it readily forgets settings and the Bluetooth remote connection.

So, yeah, the ATV is bad at managing large libraries and the remote is minimalistic, but it is miles ahead of all smart TV interfaces on the planet in terms of simplicity, speed, and absence of bugs. In addition, the Samsung remotes that came with my TV are just as bad at entering text as the Apple remote.
We are never going to agree on this. I own two Samsung smart TVs and got rid of my Apple TV. As I said before, the interface on the Apple TV is just about the worst I have seen. My Samsung Smart TV is neither buggy nor slow, and it doesn't forget anything. With regard to entering text on the Samsung, I point at the on-screen keyboard and select the text; I have attached a photo to demonstrate. You just move the cursor with the remote to the next letter. I can also use voice input for just about any function.
 
Attached is a picture of YouTube navigation on the Samsung Smart TV.
 

Attachments

  • IMG_4448.jpg (943.2 KB)
Rollable OLED

Stop trying to solve this prediction with current technology. Think about the future. Apple will not build a TV until we see rollable OLED, which could come as soon as 2017 from LG and/or Samsung. A plastic, rollable 4K screen in sizes up to 60" would initially solve all the problems mentioned. RollRR.com
 
I think it's funny to complain about Smart TV on a Samsung and then like Apple TV. Apple TV has just about the worst interface and navigation of any equivalent device. Hopefully the new device, when announced, will have a real remote.

This is from real-world personal experience using both systems. I even used Plex on both: Plex is a native app on SMART TV, and it's still way slower there than on the ATV, where Plex is accessed via a hack. So yeah, I can complain about SMART TV.

I use Plex to access my media, and if you use folders and separate movies from TV shows, navigating through your media is quite easy. I even separated my movies into different categories like drama, comedy, etc. By contrast, using Apple's own media apps like Television or Movies is a headache to navigate if you have bought a lot of your media through iTunes (first-hand experience from going through my cousin's iTunes collection of movies and shows).
 
Apple shelved plans to introduce a full-blown television set more than a year ago, reports The Wall Street Journal.

So it's news today that Apple shelved these plans "more than a year ago".

Perhaps it will be news a year from now that Apple unshelved those same plans "more than 18 months ago"? :D
 
Future-proofing is a ways off still. The bottleneck is broadcast, and that won't change until a government mandate comes along or streaming becomes the replacement entirely. All of the companies we deliver our weekly episodes to (Dish, ATT, TWC, etc.) still take the majority of their deliveries on Beta tape. Antiquated systems are still in place. And as much as you want to praise Netflix or Amazon for the two and a half shows they have in 4K, the fiber infrastructure isn't in place to deliver it fast enough to much of the U.S.

Hey, it's gotta start somewhere. Besides Netflix and Amazon, Sony is streaming 4K, and I just read that DirecTV started offering some movies in 4K late last year (Comcast says it'll start offering 4K before the end of 2015). Here's a kinda weird thing to think about: DVD became available in the U.S. in 1997 and HD became available in 1998. One became the fastest-adopted consumer tech ever in America, and the other only reached 25% adoption after 10 years on the market.

I agree that there is a bottleneck with OTA broadcast, which is why 4K is going to come via OTT and cable/sat. OTA, for the foreseeable future, will remain maxed out at HD. I also agree that current 4K streams are too big to go mainstream, but throw H.265 into the mix and that will start to change.
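
Ballpark math on the H.265 point: HEVC is commonly cited as needing roughly 40-50% less bitrate than H.264 for similar quality. Both numbers below are assumptions for illustration, not benchmarks:

```python
# Rule-of-thumb HEVC savings applied to an assumed 4K H.264 stream.
h264_4k_mbps = 25.0   # assumed bitrate for a 4K H.264 stream
hevc_savings = 0.45   # midpoint of the oft-cited 40-50% efficiency gain

hevc_4k_mbps = h264_4k_mbps * (1 - hevc_savings)
print(f"~{h264_4k_mbps:.0f} Mbps (H.264) -> ~{hevc_4k_mbps:.1f} Mbps (H.265)")
```

Getting a 4K stream under ~15 Mbps puts it within reach of a lot more home connections, which is the 'start to change' part.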

There will certainly be a 'premium' aspect to 4K initially, where ISPs and cable/sat providers use it as a way to up-sell customers (which isn't too different from when HD started really hitting the mainstream).


Speed at the cost of quality?

Yup. FCP Legend had the RT menu to the left of the timeline, Avid has its yellow/green indicator beneath the timeline, X has Better Quality/Better Performance in the Viewer, and PPro has Full, 1/2, 1/4, etc. settings under the Viewer. I like my playback and shuttling fast and smooth even if that means the image during playback isn't tack sharp.

For example, my grandfather of a Mac Pro and its OWC RAID can play back 4K R3D files okay, but if I FF/REV it turns into a slideshow. Drop the playback down to 1/4 and the FF/REV is smooth again. Plus, at 1/4 I can smoothly play back a green-screen composite (both layers 4K R3D files) with a couple of layers of color correction on top of it without rendering. I'd much rather drop the playback image quality a bit than deal with spotty performance or have to render a lot.
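
For anyone curious what those playback fractions actually work out to, here's the arithmetic, assuming a UHD-sized source for round numbers (PPro-style fractions scale each dimension):

```python
# What fractional playback resolution means for a UHD (3840x2160) source.
# Each fraction halves/quarters both width and height.
src_w, src_h = 3840, 2160
for label, frac in [("Full", 1), ("1/2", 2), ("1/4", 4)]:
    print(f"{label:>4} res: {src_w // frac} x {src_h // frac}")
```

At 1/4 you're scrubbing a 960x540 image, plenty to judge shuttling and timing by, and you flip back to Full for the critical passes.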
 