Disagree. The App Store on the Apple TV is way beyond any of its competitors'. The Shield TV has gaming, but its target customer is primarily someone who wants to stream from their gaming PC.

You aren't talking about the tvOS App Store surely? :D

Is there anything in it that is worth having that isn't available on Roku/Fire TV/Chromecast etc?
 
If you don't see the difference, it means your TV can't show it or your Blu-ray player is lacking.
It's not 'just getting there'. It's getting the signal there as exactly as possible.
CD players all sound different: exact same CD, still different. Video equipment also has an impact on image quality. The less you have in the chain, the better.

Digital signals don’t break down. 1s and 0s get there or they don’t. There can be jitter and timing issues, but again, I’ve yet to meet a single person that can reliably and consistently discern the difference in an A/B test.
Dual HDMI outputs are not offered for performance or quality reasons. The only reason two outputs are desirable is as a concession for people with an older AVR which cannot handle 4K/HDR video streams. Two outputs allow you to send the audio to your legacy AVR/receiver and send the 4K/HDR video directly to a display that supports it.

If you've got an AVR/receiver that supports HDMI 2.0a and HDCP 2.2, then there's no reason to split the outputs, and you can achieve the exact same performance and quality by routing everything through it.

This makes way more sense.
 
So sorry for you rippers. The Apple TV is a streaming device, NOT a media server for you to load your thousands of ripped DVDs onto. No internal or external SSDs.

Did someone say it was supposed to be?

It does work quite well with media storage devices and servers, such as anything running Plex, though.


It's helpful (necessary, even) for HDCP issues that can arise if you have an older receiver that isn't HDCP 2.2 compliant. You can bypass your receiver for the video, running that directly to your 4K display/projector, and run the audio to the receiver, since the audio stream isn't HDCP protected.

Wouldn't a cheap little HDMI splitter work just as well in that very specific niche case, and let the rest of the world not have to pay for (and license) an HDMI connector that will not get used? Monoprice has a selection of them for whatever your specific needs are...

25Mbps is "recommended", but considering these broadband providers rarely deliver close to what you are paying for, I'd say you'd need at least a 50Mbps plan to stream 4K, on the assumption your provider will average 30 or 40Mbps on a 50Mbps plan.

The 25Mbps recommendation is specifically meant to allow room for "congestion" (Netflix recently increased it from 15Mbps when people kept calling to claim they had 15Mbps service and Netflix wouldn't go UHD), so you don't need to count it twice. A reliable 15Mbps is the standard Netflix 4K streaming recommendation, with Vudu (11Mbps), Hulu (13Mbps) and Amazon (15Mbps) being in the same range.

It all depends on how aggressively compressed the service dares to be. I suspect that five years from now we'll be talking about higher recommended bandwidth ratings, as compression eases off and bandwidth availability increases.

Note that Netflix currently says its UHD streams are "up to" 7GB per hour, which works out to roughly 15.5Mbps, right in line with its own 15Mbps figure. So I would imagine the extra headroom in the 25Mbps rating is primarily aimed at fast-start / fast-forward user experience rates.
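
If anyone wants to sanity-check those per-hour figures, the conversion to Mbps is just units. A quick sketch in Python (the rates are the approximate ones quoted above; treat them as ballpark):

def gb_per_hour_to_mbps(gb_per_hour: float) -> float:
    """Convert a streaming rate in GB/hour to Mbps (decimal units: 1 GB = 8,000 Mbit)."""
    return gb_per_hour * 8000 / 3600

print(gb_per_hour_to_mbps(7))  # Netflix UHD "up to" 7 GB/hour -> ~15.6 Mbps
print(gb_per_hour_to_mbps(3))  # Netflix HD 3 GB/hour -> ~6.7 Mbps
print(gb_per_hour_to_mbps(5))  # Vudu UHD ~5 GB/hour -> ~11.1 Mbps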
 
They need a big game (like Destiny 2) to be put on the platform to gain attention. Lots of folks would pay $150 instead of $299 to play it. Or try to get EA to actually put the full version of Madden on it.

That's the whole thing though. The big "Hollywood" game development studios have literally zero incentive to port any games over to tvOS. The market opportunity to justify the effort just doesn't exist.

They may also have signed agreements with Microsoft and/or Sony that restrict which platforms they can develop on.
 
That's the whole thing though. The big "Hollywood" game development studios have literally zero incentive to port any games over to tvOS. The market opportunity to justify the effort just doesn't exist.

They may also have signed agreements with Microsoft and/or Sony that restrict which platforms they can develop on.
Apple has plenty of money to get a big dev house to put an AAA game on the device.
 
Disclaimer: I have two 4K TVs, one with HDR, and I will be buying the 4K Apple TV.

The huge problem with all of this is that most people act like screen resolution is the only issue involved.

I have been watching movies ripped from Blu-ray that are 3GB to 5GB in size for a while, and they have been very enjoyable.

I picked up a couple of Blu-ray players at a surplus sale and hooked them up to make sure they worked. I was all like, "WTF? This looks incredible compared to my ripped movies."

I get so frustrated just trying to find content that is in 4K, and half the time when I find it, I end up having to go to info on my TV to make sure it is actually 4K.

Now, before you call me a 4K hater: as many have said, when you have a 4K, HDR, high-bitrate, uncompressed source that was shot in 4K, 4K is astounding.

Streaming a Netflix movie that was originally shot and edited in 2K and compressed to stream at a reasonable rate? Not so much. Likely not as good as a 1080p Blu-ray.

Netflix claims HD-quality streams at a data rate of 3GB/hour (6.7Mbps) and 4K streams at 7GB/hour (15.5Mbps). At 3-5GB for a full movie, your rips are probably about 30% more compressed than what Netflix aims for. Hard drive space is cheap; I'd rip at a higher data rate. That said, the point is absolutely relevant: people's experience with 4K is absolutely dependent on the quality of the stream they are experiencing. If they are looking at UHD from Vudu (11Mbps, 5GB/hour), they will have a measurably worse impression of the 4K advantage compared to someone streaming 40% more data from Netflix.
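
To put a ripped file in the same terms, divide its size by its runtime. A sketch, assuming a two-hour movie (the runtime is my assumption):

def rip_mbps(size_gb: float, runtime_hours: float) -> float:
    """Effective bitrate in Mbps of a file of size_gb played over runtime_hours."""
    return size_gb * 8000 / (runtime_hours * 3600)

print(rip_mbps(3, 2))  # 3 GB two-hour rip -> ~3.3 Mbps
print(rip_mbps(5, 2))  # 5 GB two-hour rip -> ~5.6 Mbps, still under Netflix HD's ~6.7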

That said, your original DVD source material has a data rate of between 3 and 9.5Mbps (in MPEG-2, which is relatively inefficient), while the Blu-ray is probably between 36Mbps and 54Mbps (in AVC, which is far more efficient than MPEG-2 but less so than HEVC), so a "high quality" DVD, even though SD in resolution, may be only slightly worse than the compressed HD from Netflix in terms of discernible data. It does matter what the source material is, but I usually find "macro blocking" and compression artifacts significantly more distracting than lack of resolution.
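
And the same conversion run in reverse puts those disc rates into GB/hour for comparison (peak rates; real discs average lower):

def mbps_to_gb_per_hour(mbps: float) -> float:
    """Convert a bitrate in Mbps to data volume in GB/hour (decimal units)."""
    return mbps * 3600 / 8000

print(mbps_to_gb_per_hour(9.5))  # DVD peak         -> ~4.3 GB/hour
print(mbps_to_gb_per_hour(36))   # Blu-ray low end  -> ~16.2 GB/hour
print(mbps_to_gb_per_hour(54))   # Blu-ray high end -> ~24.3 GB/hour
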
Jebus, what year is it, Apple? It should have at least 16GB of RAM and a quad-core processor like modern 4K TVs.

If that is what you want, Apple sells that too. It is called a "Mac mini". If I wanted a computer gulping power 24/7 next to my TV, that's what I would use.

That said, I believe you are confusing RAM, which is volatile memory, with flash memory or "on-board storage". I tried to verify your 16GB claim and could not find any references to large amounts of volatile RAM. Example: https://www.sony.com/electronics/televisions/xbr-x850d-series/specifications.

We don't have any indication of how much flash memory the ATV5 will include (so far as I have seen, at least), but the ATV4 had 32GB or 64GB for caching content on-device, far more than "modern 4K TVs" have in general and 2-4 times as much as the otherwise largest-flash standalone streaming player (the Amazon Fire TV and Nexus Player both have 8GB, in "third place"; the nVidia Shield has 16GB). Hell, even the ATV3 had 8GB of flash memory, which puts it right in line with "modern" devices by your metric. (Note that nVidia also sells a Shield Pro with 500GB of onboard storage, but that storage is a spinning hard drive, so not quite the same thing as the others, which is why I did not include it.)

From a volatile-RAM perspective, I couldn't find any data from the main TV manufacturers, but Roku has all of 1GB of RAM in its latest high-end devices. The Amazon Fire TV (2nd gen) has 2GB of RAM. The Shield (standard and Pro) is the only one out there right now with 3GB of RAM. No current stats on the Chromecast Ultra, but the standard Chromecast boasts a whopping half a gigabyte (512MB) of RAM.
 
Apple's PC hardware: small RAM, old Intel CPUs, and old GPUs (often from stupid AMD), for no apparent reason.
Apple's other hardware: surprisingly cutting-edge, top-of-the-line parts.
I don't get it.
It's not the CPU that people care about in these set-top boxes. Sure, the A10X will provide a great, smooth experience. But if the cost of entry is too high, most people are going to opt for a slightly slower but equally competent set-top box.

Right now, even with 4K added to the Apple TV, there's no tent-pole feature that sets the Apple TV apart and warrants its premium price tag over any of the competition. [...]
AirPlay alone is a reason to buy it. There's Chromecast, but it's somehow still a piece of crap after all these years.

But if you don't mind the setup, a regular Mac or Windows PC is by far the best home media device. It plays literally everything, with no limitations and no new UI to learn. I'm kind of done with the Apple TV; I only use it to stream from my Mac (which is still really useful).
 



Apple's upcoming fifth-generation 4K Apple TV will be powered by an A10X Fusion chip and 3GB of RAM, according to details unearthed in the device's firmware.

Developer Steve Troughton-Smith made the claim this morning in a tweet, after going through code in the final software builds that were at the center of a major Apple leak over the weekend.


Apple's current fourth-generation Apple TV, originally released in 2015, runs on an A8 chip coupled with 2GB of RAM. But the boost in performance provided by the A10 series - which also powers the latest iPad Pro models - suggests Apple could see a bigger role for its next set-top box, possibly expanding beyond 4K content.

On the other hand, Troughton-Smith believes Apple may have chosen the A10X Fusion processor to play 4K content at 60 frames per second, which would make sense given that the iPhone X is expected to record 4K video at 60fps.

Even if that is the primary reason for including such a powerful chip in the next Apple TV, users can expect significant performance gains across the board, while tvOS game developers will be rubbing their hands together at the prospect of leveraging the processor's power to create immersive 3D experiences to rival modern console titles.


The 3GB of RAM would bring the extra working memory needed to stream 4K HDR content, which is expected to become available both in the iTunes Store and from third-party content providers, but the additional RAM may also factor into any role the Apple TV has in Apple's future augmented reality plans.

The ARKit developer framework is already turning out to be a major feature of Apple's upcoming iPhone 8 and iPhone X devices, which are set to be announced on Tuesday alongside a new Apple TV and Apple Watch, during a media event at the Steve Jobs Theater in Apple Park, Cupertino. Several details about Apple's iPhone X have already been found in the iOS 11 GM, including information on Face ID setup.

Article Link: 4K Apple TV Could Feature A10X Fusion Chip and 3GB of RAM
I have a really hard time believing that the same hardware the iPad runs on will be able to process the same games that the Xbox One and PS4 ("consoles") run. Playing Battlefield 1 and Titanfall 2 on an A10X?? Don't think that's happening.
 
Yes. The others should note, though, that as of right now, Handbrake will encode to 4K hvc1 HEVC, which will work in Apple's QuickTime in High Sierra.
This is brilliant. Can you list your settings? I downloaded an MKV TV episode which had AC3 and H.265 audio/video streams. I just remuxed the MKV to MP4 using XMedia Recode, keeping the audio/video streams intact, but it did not transfer to my iPad 10.5". Then I re-encoded only the AC3 to AAC, but the file still refused to transfer over. Re-encoding an already-H.265 video stream to a compatible H.265 stream would suck big time.

Hence I would appreciate knowing what exact settings are needed. Apple should allow universal H.265-encoded files, rather than being so specific.
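
For what it's worth, the usual culprit with MKV-sourced HEVC is the container tag: Apple's players want the video tagged hvc1, while most MKV remuxes come out tagged hev1. A minimal sketch of one way to fix that, assuming ffmpeg is installed and with placeholder file names (the video stream is copied, not re-encoded):

import subprocess

def remux_for_apple(src: str, dst: str) -> None:
    """Remux HEVC video into MP4 with the hvc1 tag Apple players expect,
    transcoding the AC3 audio to AAC; the video stream is left untouched."""
    subprocess.run([
        "ffmpeg", "-i", src,
        "-c:v", "copy",     # keep the existing H.265 stream as-is
        "-tag:v", "hvc1",   # retag hev1 -> hvc1 for QuickTime/iOS/tvOS
        "-c:a", "aac",      # AC3 isn't supported everywhere; transcode audio only
        dst,
    ], check=True)

remux_for_apple("episode.mkv", "episode.mp4")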
 
That would be illegal in the U.K.; Virgin are not allowed to favour or throttle traffic. Also, their network is undergoing huge investment and is more than capable of dealing with big increases in data traffic. The last thing they would do is 'cap' connections. Usage limits are almost a thing of the past in the U.K.; providers keep upping the speeds of their packages, not limiting them.

Another thing is that the UK has a pretty competitive ISP industry; people would just switch providers if limits were brought in. It would be a PR disaster for Virgin, and it just does not fit in with their plans.

Virgin are actually the best ISP; I was only using them as an example. However, some of the shoddier ones (Plusnet, BT) would do packet shaping and get away with it, as they make you sign a contract that makes it legal. Most users are not technical enough to realise what they are getting into; they just sign up for the cheapest one with the best adverts.
Most of Europe and Asia.

And you'd be surprised at just how much contention there is in residential areas. Asia is ahead of the rest of the world for this, but in the UK we have a bit of catching up to do.
 
Jebus, what year is it, Apple? It should have at least 16GB of RAM and a quad-core processor like modern 4K TVs.

You don't need 16GB of RAM for a set-top box; 2 to 4GB is enough unless you plan on it being a hardcore gaming device like an Xbox One X / PlayStation 4 Pro.
 
UK shipping for the 64GB has already slipped to 2-3 weeks. 32GB still available for launch delivery.
 
UK shipping for the 64GB has already slipped to 2-3 weeks. 32GB still available for launch delivery.
The Apple Store app totally failed for me, thanks to the concurrent launch with the iPhones and Watches. Why couldn't they have staggered this? I had to order on the web, and delivery slipped to 2-3 weeks.

Ugh. I guess I'll show up in store and then cancel web order if I manage to snag one.
 
Well, the old one is probably more powerful than the Wii U.
Well, the Wii U has a Radeon RV740 GPU, 2GB of RAM, and a quad-core, 3GHz IBM PowerPC-based 45nm CPU called "Espresso". The Wii U is way more powerful than the iPhone 7. The Radeon 700 series has 800 stream processing units: 10 SIMD cores composed of 16 shader cores containing 4 FP MADD/DP ALUs and 1 MADD/transcendental ALU. The GPU has GDDR5 memory, which runs at 900MHz, giving an effective transfer rate of 3.6GHz and memory bandwidth of up to 115GB/s. So no, it is not even close.

But one doesn't need to go that high; 12 GPU cores seems like a good number for now, provided the throughput is somewhat comparable. The previous one was also 12-core, but only marginally good. I personally would design the Apple TV 4K with a Maxwell 10x-series GPU: https://en.wikipedia.org/wiki/Maxwell_(microarchitecture) That is just an awesome GPU in a small package.
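
For what it's worth, that 115GB/s figure is internally consistent if you assume a 256-bit memory bus (my assumption; the post doesn't say): peak bandwidth is just the effective per-pin rate times the bus width.

def mem_bandwidth_gb_s(effective_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate x bus width / 8 bits per byte."""
    return effective_gbps * bus_width_bits / 8

print(mem_bandwidth_gb_s(3.6, 256))  # 3.6 Gbps effective GDDR5 x 256-bit -> 115.2 GB/s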
 