Is it just me, or is each successive beta getting worse and worse?
I suppose they are trying to lower our expectations so iOS matches the desktop ARM GM release that's coming. We can't have a stable GM release now, can we? ;)
 
listen to this with the headphones

That doesn’t answer my question though. Those sorts of demos have been around for decades now; my beige Windows XP computer could do this. All this feature does is tie it to a point in space. So if I put my phone down while audio is playing and walk away, my content would get quieter. Or if I’m watching a video and swap my phone from one side to another, it’ll screw with the audio. How is that a benefit?
 
I had planned to jump in now, but occasionally there's some guy who complains about multiple app crashes a day (yes, on beta 4)... I wonder if it's really that bad? I can take some instability, and as a dev myself I know what "beta" means, but that borders on unusable, and then I'd rather wait.

Most seem to say it's pretty stable for a beta, though, so...
I usually hop on every beta — this one has been the most stable in recent memory straight out of the gate.
 
I still don't get the benefit of spatial audio/head tracking. I can see how it might be a cool tool for games, but what other benefit does it bring?
My understanding is that it will deliver 5.1/Atmos-style surround sound, or something similar, when watching compatible movies and TV shows through Apple TV.
 
The iOS 14 betas have been very stable for me - only 1 random reboot. I'm on the public betas for iOS and the developer beta for watchOS 7. Did anyone else lose the gray (white) color option on the Chronograph Pro watch face? It was labeled as white, but looked gray. After updating today it's no longer an option.
 

Attachments

  • IMG_4004.png (75.8 KB)
My understanding is that it will deliver 5.1/Atmos-style surround sound, or something similar, when watching compatible movies and TV shows through Apple TV.
Ah okay, that makes a bit more sense. I honestly thought that was already available.
 
That doesn’t answer my question though. Those sorts of demos have been around for decades now; my beige Windows XP computer could do this. All this feature does is tie it to a point in space. So if I put my phone down while audio is playing and walk away, my content would get quieter. Or if I’m watching a video and swap my phone from one side to another, it’ll screw with the audio. How is that a benefit?

Imagine this: You’re watching a movie on your phone, which you’re holding straight in front of you. You’re wearing AirPods Pro. The movie has surround sound. In the movie, there’s a talking character off to the right, and you hear them out of your right ear. Now, let’s say that you turn your head 90 degrees to the left in real life, but don’t move your body or your phone. You’ll still be hearing that person out of your right ear, which wouldn’t make sense because from the perspective of you watching the movie, the sound should now be coming from behind your head, which is turned to the left of the movie screen.

This wouldn’t be an issue in a theater, where you can turn your head but the speakers don’t move; they stay in place relative to the screen. But if you’re wearing headphones, the speakers move with your head, so if you move your head while the screen stays in place, the perceived location of the audio is moving independently of the screen, which can distract from the movie.

The idea is that they’re trying to virtually anchor the audio relative to the screen, so that it still seems “correct” if you move your head or your phone.
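The scenario above is basically a coordinate transform: keep the sound source fixed relative to the screen, and recompute where it sits relative to the head whenever the head turns. Here's a toy sketch of that idea in Python. This is not Apple's actual algorithm; the function names, sign conventions, and the simple constant-power pan are all mine, purely for illustration.

```python
import math

def perceived_azimuth(source_az_deg, head_yaw_deg):
    # source_az_deg: direction of the sound relative to the screen
    #   (0 = straight ahead, +90 = to the right of the screen).
    # head_yaw_deg: how far the head has turned LEFT from facing the screen.
    # Turning the head left by X degrees moves a screen-anchored source
    # X degrees to the right in head-relative terms.
    az = (source_az_deg + head_yaw_deg) % 360
    return az - 360 if az > 180 else az  # wrap into (-180, 180]

def stereo_gains(azimuth_deg):
    # Crude constant-power pan: clamp the azimuth to the front hemisphere,
    # map it to a pan position in [-1, 1], then split between channels.
    pan = max(-1.0, min(1.0, azimuth_deg / 90.0))
    theta = (pan + 1) * math.pi / 4          # 0 .. pi/2
    return math.cos(theta), math.sin(theta)  # (left gain, right gain)

# The scenario above: a character speaking 90 degrees to the right of the screen.
print(perceived_azimuth(90, 0))   # head facing the screen -> 90 (right ear)
print(perceived_azimuth(90, 90))  # head turned 90 deg left -> 180 (behind you)
```

Without the head-tracking step, the source angle would stay glued to the head, which is exactly the "still in my right ear after I turn" problem described above.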
 
I had planned to jump in now, but occasionally there's some guy who complains about multiple app crashes a day (yes, on beta 4)... I wonder if it's really that bad? I can take some instability, and as a dev myself I know what "beta" means, but that borders on unusable, and then I'd rather wait.

Most seem to say it's pretty stable for a beta, though, so...

The only problem I've seen so far is that Facebook Messenger crashes on start, simply will not work.
 
Imagine this: You’re watching a movie on your phone, which you’re holding straight in front of you. You’re wearing AirPods Pro. The movie has surround sound. In the movie, there’s a talking character off to the right, and you hear them out of your right ear. Now, let’s say that you turn your head 90 degrees to the left in real life, but don’t move your body or your phone. You’ll still be hearing that person out of your right ear, which wouldn’t make sense because from the perspective of you watching the movie, the sound should now be coming from behind your head, which is turned to the left of the movie screen.

This wouldn’t be an issue in a theater, where you can turn your head but the speakers don’t move; they stay in place relative to the screen. But if you’re wearing headphones, the speakers move with your head, so if you move your head while the screen stays in place, the perceived location of the audio is moving independently of the screen, which can distract from the movie.

The idea is that they’re trying to virtually anchor the audio relative to the screen, so that it still seems “correct” if you move your head or your phone.
But if you turn your head, you’re no longer watching the movie. I get the point you’re making but I guess I don’t see how that’s an improvement.
 
I still don't get the benefit of spatial audio/head tracking. I can see how it might be a cool tool for games, but what other benefit does it bring?
Some people say it's part of the Apple Glasses rollout: that you would use both AirPods and the glasses together for the ultimate experience.

But when it comes to the feature itself, it's more of a new way to enjoy movies (those that are 5.1 or 7.1). There are already headphones with pretty much the same head tracking feature, and compared to regular 2.1, it really does make a big difference when watching a movie. The audio is fuller and closer to what you would hear in a cinema.
 
The last PB was 3—I think they skipped PB2.

Since they’ve been doing public betas, the numbering has been confusing to people, because:

- the first beta released would be called developer beta 1, and there would be no public beta equivalent.

- the second beta would be called developer beta 2 and would also be released shortly thereafter as public beta 1 (because it was the first public beta).

- the third beta would be called developer beta 3 and would also be released shortly thereafter as public beta 2 (because it was the second public beta).

Then people were all confused because if you say “beta 3”, are you talking about developer beta 3 which is the same as public beta 2? Or do you mean public beta 3 which is the same as developer beta 4?

This year there’s less confusion, except that there was no “public beta 1”, as the public betas started with “public beta 2” so the numbers align.
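The mapping described above boils down to simple arithmetic. Here's a hypothetical helper, just to illustrate; the function name and flag are mine, not anything Apple publishes.

```python
def public_beta_number(dev_beta, ios14_scheme=False):
    # Old scheme: dev beta 2 shipped as public beta 1, so PB = DB - 1.
    # iOS 14 scheme: no "public beta 1"; the numbers align, so PB = DB.
    if dev_beta < 2:
        return None  # dev beta 1 never got a public release
    return dev_beta if ios14_scheme else dev_beta - 1

print(public_beta_number(4))                     # old scheme -> 3
print(public_beta_number(4, ios14_scheme=True))  # iOS 14 scheme -> 4
```

So under the old scheme, "beta 3" was ambiguous between developer beta 3 (= public beta 2) and public beta 3 (= developer beta 4); aligning the numbers removes that off-by-one.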
 
The only problem I've seen so far is that Facebook Messenger crashes on start, simply will not work.
Mine crashes when I take a picture or video and send it on there. I can send pictures and videos that were taken out of the app though. Actually, I haven't tried since beta 1 so it might be ok now.
 
When do you think is a good time to jump into this beta? Last year I did it early, in the first build, and it was rough for many builds following. Is iOS 14 faring better now at build 4, or should I just wait?

A good time to jump into any beta is when you don’t care about losing data and completely broken functionality.
 
Some people say it's part of the Apple Glasses rollout: that you would use both AirPods and the glasses together for the ultimate experience.

But when it comes to the feature itself, it's more of a new way to enjoy movies (those that are 5.1 or 7.1). There are already headphones with pretty much the same head tracking feature, and compared to regular 2.1, it really does make a big difference when watching a movie. The audio is fuller and closer to what you would hear in a cinema.
That's not how I recall the presentation at WWDC. They did not mention having to use glasses at all. I may not be remembering this correctly, but when I heard about this feature I was pretty psyched. I want to say it's limited to the AirPods Pro, and the article below says it is. But it also says the current Apple TV 4K will not support this feature. Guess it will only be through an iPhone or iPad. Hope that changes in future Apple TVs if this is true.

I found this summary at WhatHiFi:

During its typically slick, but untypically pre-recorded WWDC 2020 keynote briefing, Apple announced something that really piqued our interest: spatial audio.

Coming (along with a number of other new features) to the AirPods Pros this autumn, spatial audio is designed to deliver surround sound and 3D audio via your headphones. It’s basically Apple’s take on Dolby Atmos for Headphones and Sony’s upcoming PS5 3D Audio.

Apple’s spatial audio has a unique feature, though: it not only tracks your head movement using accelerometers and gyroscopes in the AirPods Pro in order to position the sound accurately, it also tracks the position of the iPhone or iPad that you’re watching on, so that sound is also placed relative to the screen. This means that even if you turn your head or reposition your device, dialogue will still be anchored to the actor on the screen.

Apple spatial audio will be included in a firmware update coming to AirPods Pro this autumn. You’ll also need the new iOS 14 or iPadOS 14, which should be released around the same time. All of these updates will be provided free of charge.

It’s also worth noting that Dolby Atmos is only available on Apple devices launched since 2018. While you don’t need Atmos for spatial audio, the two working together will likely produce the best results.

On the software side of things, as long as an app supports 5.1, 7.1 and/or Atmos, it will work with spatial audio. That already includes apps such as Vudu, HBO Go, Hulu and Amazon Prime Video. Stereo content can also be converted to spatial audio. All a developer needs to do is allow stereo spatialisation via an Apple plug-in.


Here’s something of a surprise: spatial audio is not coming to the Apple TV 4K, or at least not in the first instance. That seems odd to us. True, AirPods are more often connected to a portable device than an Apple TV, but spatial audio seems a perfect way for those people without a surround sound system to get a taste of cinema-style sound while watching on their lounge TV.

Apple’s not explained why the Apple TV 4K isn’t getting spatial audio. The only reason we can think of is that it’s not powerful enough. The A10X chip in the Apple TV 4K dates back to 2017. Could it be that it doesn’t have the processing muscle for spatial audio? It’s possible, but that would also presumably preclude all pre-2018 iPhones and iPads from getting spatial audio. If this proves to be the case, it could be the best indication yet that a new, more powerful Apple TV is on the way.

 
Is anyone else having a problem with the clock widget not updating to the correct time? The last beta did the same thing. It fixes itself on a restart, but later the time will be off again.
 