
MacRumors

macrumors bot
Original poster
Apr 12, 2001
63,523
30,814



Audio expert Peter Eastty of Oxford Digital Limited has left the company he co-founded to work for Apple. According to his LinkedIn profile, Eastty joined Apple in September of this year as the Director of SoC (System on Chip) Audio Processing.

[Image: Peter Eastty's LinkedIn profile]
Eastty's career in digital audio spans more than 40 years, including oversight of a pioneering digital audio team at Solid State Logic, a 13-year stint at Sony as a chief consultant engineer, and more than eight years as CTO at Oxford Digital. Most of his work has focused on digital audio and digital signal processing for audio equipment.
"I've worked in the application of digital and computer technology to audio since I graduated from college and I still find the problems fascinating as well as the answers."
Details on Eastty's role as Director of SoC Audio Processing are unknown, but he is likely applying his DSP expertise to improve the audio quality of Apple's hardware. Earlier rumors suggested Apple was working to add support for 24-bit audio files in iOS 8 and create a new 24-bit capable version of its In-Ear headphones, but those reports have not yet panned out.

Eastty is not the only audio pioneer hired by Apple in recent years. Back in 2011, Apple hired THX pioneer Tomlinson Holman to help "provide technical direction for the company in audio," as his LinkedIn profile notes.

(Thanks, Manu!)

Article Link: Digital Audio Veteran Peter Eastty Joins Apple as Director of SoC Audio Processing
 

JAT

macrumors 603
Dec 31, 2001
6,473
124
Mpls, MN
He wasn't the one in charge of applying ridiculous copy protection to Sony discs so they wouldn't even play, was he?
 

jayducharme

macrumors 601
Jun 22, 2006
4,531
5,977
The thick of it
I too am glad that Apple is re-focusing on audio for their products. Their sound quality hasn't been terrible, but there's definitely room for improvement.
 

kagharaht

macrumors 65816
Oct 7, 2007
1,448
978
Maybe Apple can finally have real "Normalization" and "distortion-free" EQ settings. :)
 

vagos

macrumors 6502
Oct 19, 2014
271
1,675
This somehow makes me scared that they plan on killing the headphone jack to make everything thinner.
 

heeloliver

macrumors 6502a
Sep 6, 2014
639
423
Maybe Apple will bring hi-fi audio to the masses. Maybe an Apple-exclusive headphone using the Lightning port, and a new type of streaming service with a much higher bit rate?

I can already hear audiophiles everywhere crying and moaning about it all.
 

blumpy

macrumors member
Oct 6, 2004
45
42
"...and create a new 24-bit capable version of its In-Ear headphones, but those reports have not yet panned out."

All headphones and speakers are analog. The digital-to-analog conversion has to happen before amplification, so there's no such thing as 24-bit capable headphones unless you're talking about transmitting the data digitally to the headphone via Lightning or Bluetooth. This is not a limitation of technology; it is a limitation of human physiology, unless you have a digital input implanted in your brain. So-called "digital" headphones just convert the audio to analog and amplify the signal themselves instead of letting your device do it. There's no gain for the end user in having the headphone do the conversion rather than your iPhone, unless your headphone cable is 100 ft long.

It doesn't matter anyway; most people would not be able to tell the difference between 12-, 16-, or 24-bit sources, or between 44.1, 48, 88.2, or 96 kHz. This is not like the jump in picture quality from SD to HD to 4K to 8K.

This is more like properly encoded H.264 vs. H.265 vs. Apple ProRes at the same pixel count and frame rate. You're not going to be able to tell the difference unless you've dropped a few hundred grand into a home theater setup and know what to look for. And even then there would be no diminished entertainment quality between the various formats.

My point being that 256 kbps AAC is not the weak link in your audio chain; your headphones or speakers are. Even 192 kbps lossy compression far exceeds the reproduction quality of most headphones or speaker systems, and 99.99% of people would never hear the difference unless switching between the same source at two different qualities in a properly designed recording studio.
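For reference, the textbook rule of thumb for quantization noise puts the signal-to-noise ratio of an N-bit source at roughly SNR ≈ 6.02 × N + 1.76 dB: about 74 dB at 12-bit, 98 dB at 16-bit, and 146 dB at 24-bit. Very few playback chains or listening rooms can deliver anything close to 98 dB between noise floor and peak, which is why the extra 24-bit headroom is so hard to hear.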
 

CJM

macrumors 68000
May 7, 2005
1,535
1,054
U.K.
Hmm... With him and Trent on the team, I have some really positive feelings about the future of Apple audio quality.
 

MikeSmoke

macrumors 6502
Mar 26, 2010
300
270
Maryland USA
This guy is part of the elite high-end pro field. Not someone I would expect to be focusing on how to make iPad speakers sound better. I would hope the pro market is in for some future surprises.
 

brelincovers

macrumors newbie
Nov 3, 2014
5
0
iPods and iPhones already support 24-bit/48 kHz audio. I've been listening to hi-fi audio on them for years, and yes, you can tell the difference; you just need proper headphones. I just hope they up it to 96 kHz so I don't have to downconvert the sample rates anymore.
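(For anyone else stuck downconverting: a minimal sketch of a 96 kHz to 48 kHz conversion in Python, assuming numpy and scipy are available; the filenames are hypothetical and the source is assumed to be 16-bit.)

Code:
# Downconvert 96 kHz -> 48 kHz with scipy's polyphase resampler,
# which applies an anti-aliasing low-pass filter internally.
import numpy as np
from scipy.io import wavfile
from scipy.signal import resample_poly

rate, audio = wavfile.read("master_96k.wav")  # hypothetical 96 kHz, 16-bit WAV
out = resample_poly(audio.astype(np.float64), up=1, down=2, axis=0)
wavfile.write("master_48k.wav", rate // 2, np.round(out).astype(np.int16))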
 

milo

macrumors 604
Sep 23, 2003
6,891
522
24-bit playback in iOS probably doesn't require anything special. If it's not supported now, it's probably just switched off, or the hardware doesn't support it. At this point, virtually all audio playback chips have probably supported 24-bit for years. It would be nice for iOS devices to play back 24-bit audio, but it's not like they'd need to bring in an expert just for that.

It doesn't matter anyway; most people would not be able to tell the difference between 12-, 16-, or 24-bit sources, or between 44.1, 48, 88.2, or 96 kHz.

For the most part I agree with your post, although 12-bit audio is pretty bad. Some people may not hear the difference, but "most" seems like a stretch.
 

kyjaotkb

macrumors 6502a
Nov 20, 2009
937
883
London, UK
I guess they meant "a 24-bit capable DAC connecting on one end to the iPhone via Lightning, and on the other end to a new sort of in-ear headphones."

iOS 8 introduced digital audio processing via Lightning (and Lightning doesn't carry analog audio like the old 30-pin connector did).

"...and create a new 24-bit capable version of its In-Ear headphones, but those reports have not yet panned out."

All headphones and speakers are analog. The digital-to-analog conversion has to happen before amplification, so there's no such thing as 24-bit capable headphones unless you're talking about transmitting the data digitally to the headphone via Lightning or Bluetooth. This is not a limitation of technology; it is a limitation of human physiology, unless you have a digital input implanted in your brain. So-called "digital" headphones just convert the audio to analog and amplify the signal themselves instead of letting your device do it. There's no gain for the end user in having the headphone do the conversion rather than your iPhone, unless your headphone cable is 100 ft long.

It doesn't matter anyway; most people would not be able to tell the difference between 12-, 16-, or 24-bit sources, or between 44.1, 48, 88.2, or 96 kHz. This is not like the jump in picture quality from SD to HD to 4K to 8K.

This is more like properly encoded H.264 vs. H.265 vs. Apple ProRes at the same pixel count and frame rate. You're not going to be able to tell the difference unless you've dropped a few hundred grand into a home theater setup and know what to look for. And even then there would be no diminished entertainment quality between the various formats.

My point being that 256 kbps AAC is not the weak link in your audio chain; your headphones or speakers are. Even 192 kbps lossy compression far exceeds the reproduction quality of most headphones or speaker systems, and 99.99% of people would never hear the difference unless switching between the same source at two different qualities in a properly designed recording studio.


----------

Oh...
iPods and iPhones already support 24-bit/48 kHz audio. I've been listening to hi-fi audio on them for years, and yes, you can tell the difference; you just need proper headphones. I just hope they up it to 96 kHz so I don't have to downconvert the sample rates anymore.

They support 24/48 files but can't decode them natively. You'd need an external DAC, which does not exist yet but could, via Lightning.
 

MrCubes

macrumors regular
Dec 21, 2008
195
226
United Kingdom
I really hope this renewed focus on audio quality results in the iPhone finally supporting the aptX codec over Bluetooth. The current quality is fine for podcasts and such, but as soon as I start listening to music I switch to my MBP whenever possible (or connect the headphones via a cable, of course, but that defeats the purpose of having BT headphones!).
I don't know why they still don't have aptX; most high-end competitors do.
 

blumpy

macrumors member
Oct 6, 2004
45
42
24-bit playback in iOS probably doesn't require anything special. If it's not supported now, it's probably just switched off, or the hardware doesn't support it. At this point, virtually all audio playback chips have probably supported 24-bit for years. It would be nice for iOS devices to play back 24-bit audio, but it's not like they'd need to bring in an expert just for that.

For the most part I agree with your post, although 12-bit audio is pretty bad. Some people may not hear the difference, but "most" seems like a stretch.

If you have any pro audio software, or even GarageBand, download ToneBoosters' Time Machine (a free plug-in), drop in some music you know, add the plug-in to the track, and change the bit depth to 12-bit. It sounds grainy from the introduced distortion, but I stand by what I said: most people will not notice. Most non-professionals I know think music sounds good on YouTube.
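(No DAW handy? The same experiment can be roughed out in a few lines of Python, assuming numpy and scipy and a 16-bit WAV file; the filename is made up.)

Code:
# Truncate 16-bit audio to 12-bit resolution to make the quantization grain audible.
import numpy as np
from scipy.io import wavfile

rate, audio = wavfile.read("music.wav")   # hypothetical 16-bit WAV
step = 2 ** (16 - 12)                     # keep the top 12 bits
crushed = (audio // step) * step          # discard the bottom 4 bits
wavfile.write("music_12bit.wav", rate, crushed.astype(np.int16))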
 

Morris

macrumors regular
Dec 19, 2006
179
87
London, Europe
24-bit playback in iOS probably doesn't require anything special. If it's not supported now, it's probably just switched off, or the hardware doesn't support it. At this point, virtually all audio playback chips have probably supported 24-bit for years. It would be nice for iOS devices to play back 24-bit audio, but it's not like they'd need to bring in an expert just for that.

For the most part I agree with your post, although 12-bit audio is pretty bad. Some people may not hear the difference, but "most" seems like a stretch.

I'm not sure whether he just grabbed that figure out of thin air or whether he knows about the experiments, but he may well. The 13th bit is indeed where most people (>90%) lose the ability to hear the difference. I most certainly don't hear it when the reduction is done properly.

Obviously there are plenty of people who claim to hear the difference between 15 and 16 bits, but then there are also plenty of people who send money to someone behind an email account who says they're sitting on a prince's big inheritance they need to shift.

This is an interesting read on the subject: https://xiph.org/~xiphmont/demo/neil-young.html (Bear in mind we're talking about reproduction, i.e. listening to music. For recording and editing, 24/96 is absolutely advisable; for listening, even 16/44.1 is overkill.)
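(For context: plugging N = 13 into the 6-dB-per-bit rule of thumb quoted earlier gives 6.02 × 13 + 1.76 ≈ 80 dB of dynamic range, which squares with the xiph.org article's point that 16/44.1 already exceeds what playback needs.)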
 

chrmjenkins

macrumors 603
Oct 29, 2007
5,325
158
MD
Sounds like they'll finally stop using Cirrus audio codecs and develop their own DSPs and the like in-house. 24-bit audio seems like a certainty going forward. I hope they at least continue to output pure digital streams over the Lightning cable for MFi-capable devices.

"...and create a new 24-bit capable version of its In-Ear headphones, but those reports have not yet panned out."

All headphones and speakers are analog. The digital-to-analog conversion has to happen before amplification, so there's no such thing as 24-bit capable headphones unless you're talking about transmitting the data digitally to the headphone via Lightning or Bluetooth. This is not a limitation of technology; it is a limitation of human physiology, unless you have a digital input implanted in your brain. So-called "digital" headphones just convert the audio to analog and amplify the signal themselves instead of letting your device do it. There's no gain for the end user in having the headphone do the conversion rather than your iPhone, unless your headphone cable is 100 ft long.

It doesn't matter anyway; most people would not be able to tell the difference between 12-, 16-, or 24-bit sources, or between 44.1, 48, 88.2, or 96 kHz. This is not like the jump in picture quality from SD to HD to 4K to 8K.

This is more like properly encoded H.264 vs. H.265 vs. Apple ProRes at the same pixel count and frame rate. You're not going to be able to tell the difference unless you've dropped a few hundred grand into a home theater setup and know what to look for. And even then there would be no diminished entertainment quality between the various formats.

My point being that 256 kbps AAC is not the weak link in your audio chain; your headphones or speakers are. Even 192 kbps lossy compression far exceeds the reproduction quality of most headphones or speaker systems, and 99.99% of people would never hear the difference unless switching between the same source at two different qualities in a properly designed recording studio.

Well said. Everyone will notice if they make a significant amp upgrade, though.
 

blumpy

macrumors member
Oct 6, 2004
45
42
I'm not sure whether he just grabbed that figure out of thin air or whether he knows about the experiments, but he may well. The 13th bit is indeed where most people (>90%) lose the ability to hear the difference. I most certainly don't hear it when the reduction is done properly.

Obviously there are plenty of people who claim to hear the difference between 15 and 16 bits, but then there are also plenty of people who send money to someone behind an email account who says they're sitting on a prince's big inheritance they need to shift.

This is an interesting read on the subject: https://xiph.org/~xiphmont/demo/neil-young.html (Bear in mind we're talking about reproduction, i.e. listening to music. For recording and editing, 24/96 is absolutely advisable; for listening, even 16/44.1 is overkill.)

I omitted studies and figures for the sake of brevity.

96 kHz is not what most professionals use for multi-track recording either. For the master mix from an analog console, 24-bit/96 kHz is king, but the overhead and production limitations outweigh the slight quality improvement. Even film, which has higher audio quality standards than music, doesn't record above 24-bit/48 kHz.

----------

Sounds like they'll finally stop using Cirrus audio codecs and develop their own DSPs and the like in-house. 24-bit audio seems like a certainty going forward. I hope they at least continue to output pure digital streams over the Lightning cable for MFi-capable devices.



Well said. Everyone will notice if they make a significant amp upgrade, though.

Amplification and impedance both play a large role. Headphones sounding different on every device because of different amps and impedances is extremely annoying.
 

SgtPepper12

macrumors 6502a
Feb 1, 2011
697
673
Germany
I really hope for some kind of AU/VST support. There are so many nice things you can do nowadays, and all we're left with is Apple's own (pretty crappy) EQ settings.
 