Since the left and right earbuds are independent and each receives its own Bluetooth signal, each of course has its own output buffer.
What makes matters worse: the two streams can drift far apart in time depending on pairing time and error correction / retransmits.
So as far as I can see the AirPods need some extremely precise NTP-like clock sync, because if the left and right channels are even a few ms apart, really just a few samples apart, the phase shift will seriously disrupt the brain's left/right correlation, ...
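To put "a few samples" in concrete terms, a quick back-of-the-envelope (assuming a standard 44.1kHz stream; I don't know what rate the AirPods actually run at):

```python
# Back-of-the-envelope: how long is "a few samples" at a standard audio rate?
SAMPLE_RATE_HZ = 44_100  # CD-quality rate; the AirPods' actual rate is an assumption here

sample_period_us = 1e6 / SAMPLE_RATE_HZ
print(f"one sample period: {sample_period_us:.1f} us")  # ~22.7 us

for n in (1, 3, 10):
    print(f"{n:2d} samples of offset = {n * sample_period_us:.1f} us")
```

So even a single-sample misalignment is on the order of tens of microseconds.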
(I'm not so much responding to you as continuing the same line of reasoning)
It has to be much better than a few ms. A few ms is good enough to sync audio to video-- it's the difference between seeing someone's mouth move and hearing their voice when you're standing next to them versus when you're standing a few feet away. Sound travels about a foot per millisecond.
Our detectable threshold for interaural time difference, the difference in arrival time between our two ears, is about 10µs (depending on the frequency), because we use that time difference to triangulate where a sound is coming from and our ears are only about a foot apart. If the phase shifts around, it sounds like the low frequencies are moving in space (for lower frequencies, about 1.5kHz and below, we localize sound by phase-- for higher frequencies we do it by amplitude).
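To make that concrete, here's a toy version of the triangulation argument using a simple two-point-receiver model (ITD = d·sin θ / c, ignoring head diffraction; the foot of ear spacing is the figure from above):

```python
import math

# Toy interaural time difference (ITD) model: two point receivers, no head
# diffraction. ITD(theta) = d * sin(theta) / c.
D_EARS_M = 0.3        # ear spacing, "about a foot" per the post above
C_SOUND_M_S = 343.0   # speed of sound in air at ~20C

def itd_us(azimuth_deg: float) -> float:
    """ITD in microseconds for a source at the given azimuth (0 = straight ahead)."""
    return D_EARS_M * math.sin(math.radians(azimuth_deg)) / C_SOUND_M_S * 1e6

print(f"max ITD (source at 90 deg): {itd_us(90):.0f} us")  # ~875 us
# A 10 us detection threshold corresponds to roughly this angular resolution
# for sources near straight ahead:
theta_min_deg = math.degrees(10e-6 * C_SOUND_M_S / D_EARS_M)
print(f"angular resolution near center: ~{theta_min_deg:.2f} deg")  # ~0.66 deg
```

Which is why a 10µs timing error between the two buds doesn't just blur the image-- it reads as the sound physically moving.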
As a reference, GPS is pretty much the best clock any of us has access to, and it is capable of providing timing information to within about 15ns of the system's atomic clocks. For the equipment most of us have, it's probably closer to 50-100ns. So Apple needs to get their time transfer to within a couple of orders of magnitude of GPS.
NTP seems to give best-case timing accuracy of about 1ms, which is way too coarse. BLE synchronization seems capable of getting closer to 10µs, which is just at the upper limit of what's needed, and I'm seeing some speculation that single-digit µs sync is possible with the right software stack.
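For context on why NTP tops out around a millisecond: its two-way exchange can only nail the clock offset if the network path is symmetric, and any asymmetry shows up directly as error. A minimal sketch of the standard offset/delay math, with made-up timestamps:

```python
# Standard NTP-style two-way time transfer.
# t1: client send, t2: server receive, t3: server send, t4: client receive,
# with t2/t3 on the server's clock and t1/t4 on the client's.
def ntp_offset_and_delay(t1: float, t2: float, t3: float, t4: float):
    offset = ((t2 - t1) + (t3 - t4)) / 2  # estimated clock offset
    delay = (t4 - t1) - (t3 - t2)         # round-trip path delay
    return offset, delay

# Hypothetical exchange: 5 ms out, 1 ms back (asymmetric path), zero true offset.
t1 = 0.000
t2 = t1 + 0.005   # outbound leg took 5 ms
t3 = t2 + 0.0001  # server turnaround
t4 = t3 + 0.001   # return leg took 1 ms
offset, delay = ntp_offset_and_delay(t1, t2, t3, t4)
print(f"estimated offset: {offset*1e3:.1f} ms (true offset is 0)")  # 2.0 ms error
# The error is half the path asymmetry, which is why NTP over a jittery link
# can't do much better than ~1 ms, and why the radio itself has to supply the
# timestamps if you want microseconds.
```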
The biggest problem with time sync in a comms system is that the system is usually optimized for communications: the only thing it cares about is the bits, not when they arrived. The radios themselves need to be very well synchronized, but no effort is made to pass that sync information up through the hardware/software stack. Once the bits are stuck into a FIFO without timestamps and you cross clock domains, you can pretty much give up on knowing anything about timing.
That can all be dealt with though if you prioritize timing information in the hardware chain. That's presumably what the W1 provides.
So it seems feasible that with the right stack and good hardware support, Bluetooth should be able to sync across radios to better than 10µs without needing to add additional radios.
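As a sketch of what "prioritizing timing in the hardware chain" could look like (purely illustrative, not a claim about how the W1 actually works): stamp every packet at the radio against a shared timebase, then schedule playout at timestamp plus a fixed latency budget instead of playing whatever is at the head of the FIFO:

```python
from dataclasses import dataclass

# Illustrative only -- not Apple's actual design. The idea: every audio packet
# carries a capture timestamp taken in hardware at the radio, against a clock
# both earbuds keep synchronized; each bud then plays the packet at a fixed
# offset from that timestamp rather than "as soon as it arrives".

PLAYOUT_LATENCY_US = 20_000  # fixed budget; must exceed worst-case delivery jitter

@dataclass
class AudioPacket:
    radio_timestamp_us: int  # stamped at the radio, shared timebase
    samples: bytes

def playout_deadline_us(pkt: AudioPacket) -> int:
    """When this packet's first sample must hit the DAC, in shared-clock time."""
    return pkt.radio_timestamp_us + PLAYOUT_LATENCY_US

def schedule(pkt: AudioPacket, now_us: int) -> int:
    """Return how long to hold the packet; reject it if it missed its slot."""
    wait = playout_deadline_us(pkt) - now_us
    if wait < 0:
        raise ValueError("packet arrived too late: drop it, don't slip the clock")
    return wait
```

The point is that buffering, retransmits, and delivery jitter all disappear into the fixed latency budget; the only thing the two buds have to agree on is the clock.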
That wasn't the only data. Look at the thread title.
My post was going by the original article's claim that the delay is about channel synchronization.
It's hard to think of a manufacturing defect that could cause that which wouldn't stem from a poor design, like relying on matched timing crystals or something.
Now, if the thread title is no longer correct, then of course all related posts are meaningless. (Or someone went into damage control mode.)
Apple has world-class radio engineers-- you don't really think they were naive enough to just rely on open-loop crystal stability, do you?
I don't really give Gruber's comments any more weight than the WSJ's-- they're both based on "I totally know a guy who works there and he told me...". I still think it's a pretty cynical view, though, to assume they never took the fundamental engineering challenge into account.
That said, there's a fine line between what's a manufacturing issue and what's a design issue that didn't show itself until faced with the variances of millions of devices in mass production. Clearly something in their solution didn't scale (I don't think they blew the holiday sales season just to support BT5). This is taking long enough to rectify that it could easily be that they need to spin their chip. One of the critical functions of that chip seems to be maintaining time sync. So one way to harmonize the various rumors would be if there's a marginal circuit in the chip, such as a PLL instability, that manifests itself as a phase delay problem between nodes-- a manufacturing problem that results in a synchronization problem.
I'd be stunned though if this turned out to be a fundamental oversight rather than a bug in an otherwise sound design.