
leerkeller

macrumors member
Original poster
Nov 1, 2011
96
0
Baltimore, MD
I currently have my audio/MIDI input/output format set to 96 kHz (for low-latency amp-modeling software). When I play a 44.1 kHz song in iTunes, will some form of signal degradation occur in the sample rate conversion? Or could there actually be a benefit? I know I could always open Audio MIDI Setup and manually change the computer's sample rate back down to 44.1 kHz when I'm finished with my amp-modeling software. I use AmpliTube/Guitar Rig very frequently, and every time one of them opens it sets the computer's sample rate to 96 kHz and doesn't return it to 44.1 kHz when it's done; it's just a small annoyance to have to remember to change it back every time I close the modeling software.



I guess what I'm looking for is just some confirmation that it won't make my 44.1 kHz music files sound worse if the computer is running at a different sample rate. Personally, I wasn't able to hear any difference, but if there is one with negative effects, I want to make sure not to let it occur.
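For anyone curious about what that conversion actually involves, here is a minimal sketch (assuming Python with NumPy and SciPy; `resample_poly` is a stand-in for whatever resampler Core Audio really uses, not the actual OS code) of upsampling a 44.1 kHz tone to 96 kHz and comparing it against the same tone generated natively at 96 kHz:

```python
import numpy as np
from scipy.signal import resample_poly

fs_in, fs_out = 44100, 96000          # CD rate -> interface rate
# 96000 / 44100 reduces to 320 / 147, so a polyphase converter applies
up, down = 320, 147

# one second of a 1 kHz test tone at 44.1 kHz
t_in = np.arange(fs_in) / fs_in
x = np.sin(2 * np.pi * 1000 * t_in)

# upsample (hypothetical stand-in for the OS sample rate converter)
y = resample_poly(x, up, down)

# compare against the same tone generated natively at 96 kHz,
# trimming the resampling filter's edge transients
t_out = np.arange(len(y)) / fs_out
ideal = np.sin(2 * np.pi * 1000 * t_out)
trim = 5000
err = np.max(np.abs(y[trim:-trim] - ideal[trim:-trim]))
print(len(y), err)                    # error is tiny for an in-band tone
```

For audible-band content, a well-implemented upsample like this reconstructs the signal essentially transparently; the quality question comes down to how good the particular resampler's filter is, which the sketch above doesn't settle.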