Resolved Display a waveform image from an audio file using Swift

Discussion in 'iOS Programming' started by HoboFlo, Mar 26, 2015.

  1. HoboFlo, Mar 26, 2015
    Last edited: Mar 27, 2015

    HoboFlo macrumors newbie

    Joined:
    Jan 9, 2014
    #1
    The goal is to generate a waveform image for an audio file that the user records. I found this library, but does anyone know of a way to do this with Swift?

    I also remember hearing that you can use Objective-C classes within Swift. Would that be a possible way to get that library working? I'm very new to iOS development, so any guidance is appreciated!
     
  2. solinari6 macrumors member

    Joined:
    Aug 13, 2008
    #2
  3. HoboFlo thread starter macrumors newbie

    Joined:
    Jan 9, 2014
    #3
    Cool thanks for the link. I'll give this a go and post the results of my efforts!
     
  4. HoboFlo, Mar 26, 2015
    Last edited: Mar 26, 2015

    HoboFlo thread starter macrumors newbie

    Joined:
    Jan 9, 2014
    #4
    EDIT: The library I'm trying to use is EZAudio

    Alright, I've spent the last few hours trying to convert the EZAudioWaveformFromFile example to a Swift project, and I definitely need some help. I got the bridge file working, but the code itself is proving to be a challenge for me. Hope someone can give some advice!

    The EZAudio demo project contains the following WaveformFromFileViewController.h file:
    Code:
    //
    //  WaveformFromFileViewController.h
    //  EZAudioWaveformFromFileExample
    //
    //  Created by Syed Haris Ali on 12/15/13.
    //  Copyright (c) 2013 Syed Haris Ali. All rights reserved.
    //
    
    #import <UIKit/UIKit.h>
    
    // Import EZAudio header
    #import "EZAudio.h"
    
    /**
     Here's the default audio file included with the example
     */
    #define kAudioFileDefault [[NSBundle mainBundle] pathForResource:@"simple-drum-beat" ofType:@"wav"]
    
    @interface WaveformFromFileViewController : UIViewController
    
    #pragma mark - Components
    /**
     The EZAudioFile representing the currently selected audio file
     */
    @property (nonatomic,strong) EZAudioFile *audioFile;
    
    /**
     The CoreGraphics based audio plot
     */
    @property (nonatomic,weak) IBOutlet EZAudioPlot *audioPlot;
    
    /**
     A BOOL indicating whether or not we've reached the end of the file
     */
    @property (nonatomic,assign) BOOL eof;
    
    #pragma mark - UI Extras
    /**
     A label to display the current file path with the waveform shown
     */
    @property (nonatomic,weak) IBOutlet UILabel *filePathLabel;
    
    @end
    And WaveformFromFileViewController.m:
    Code:
    //
    //  WaveformFromFileViewController.m
    //  EZAudioWaveformFromFileExample
    //
    //  Created by Syed Haris Ali on 12/15/13.
    //  Copyright (c) 2013 Syed Haris Ali. All rights reserved.
    //
    
    #import "WaveformFromFileViewController.h"
    
    @interface WaveformFromFileViewController (){
      AudioBufferList *readBuffer;
    }
    @end
    
    @implementation WaveformFromFileViewController
    @synthesize audioPlot = _audioPlot;
    @synthesize audioFile = _audioFile;
    @synthesize eof = _eof;
    @synthesize filePathLabel = _filePathLabel;
    
    #pragma mark - Customize the Audio Plot
    -(void)viewDidLoad {
      
      [super viewDidLoad];
      
      /*
       Customizing the audio plot's look
       */
      // Background color
      self.audioPlot.backgroundColor = [UIColor colorWithRed: 0.169 green: 0.643 blue: 0.675 alpha: 1];
      // Waveform color
      self.audioPlot.color           = [UIColor colorWithRed:1.0 green:1.0 blue:1.0 alpha:1.0];
      // Plot type
      self.audioPlot.plotType        = EZPlotTypeBuffer;
      // Fill
      self.audioPlot.shouldFill      = YES;
      // Mirror
      self.audioPlot.shouldMirror    = YES;
      
      /*
       Load in the sample file
       */
      [self openFileWithFilePathURL:[NSURL fileURLWithPath:kAudioFileDefault]];
      
    }
    
    #pragma mark - Action Extensions
    -(void)openFileWithFilePathURL:(NSURL*)filePathURL {
      
      self.audioFile          = [EZAudioFile audioFileWithURL:filePathURL];
      self.eof                = NO;
      self.filePathLabel.text = filePathURL.lastPathComponent;
      
      // Plot the whole waveform
      self.audioPlot.plotType        = EZPlotTypeBuffer;
      self.audioPlot.shouldFill      = YES;
      self.audioPlot.shouldMirror    = YES;
      [self.audioFile getWaveformDataWithCompletionBlock:^(float *waveformData, UInt32 length) {
        [self.audioPlot updateBuffer:waveformData withBufferSize:length];
      }];
      
    }
    
    @end
    
    So far, here is my (probably very poor) attempt at converting to one Swift file:
    Code:
    import UIKit
    
    class ViewController: UIViewController {
        
        var filePath: NSURL? = NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource("simple-drum-beat", ofType: "wav")!)
        
        @IBOutlet weak var audioPlot: EZAudioPlot!
    
        override func viewDidLoad() {
            super.viewDidLoad()
        
            var readBuffer: AudioBufferList
            
            self.audioPlot.backgroundColor = UIColor.redColor()
            self.audioPlot.color = UIColor.blackColor()
            self.audioPlot.shouldFill = true
            self.audioPlot.shouldMirror = true
            self.openFileWithFilePathURL(filePath!)    
        }
        
        func openFileWithFilePathURL(filePathURL:NSURL) {
            var audioFile:EZAudioFile = EZAudioFile(URL: filePathURL)
            var eof: Bool = false
            self.audioPlot.shouldFill = true
            self.audioPlot.shouldMirror = true
        }
    }
    
    But I'm completely lost as to how to handle the rest, namely the...
    Code:
    [self.audioFile getWaveformDataWithCompletionBlock:^(float *waveformData, UInt32 length) {
        [self.audioPlot updateBuffer:waveformData withBufferSize:length];
      }];
    ...at the bottom of WaveformFromFileViewController.m.

    I'm sure my code isn't correct, but is it way off? Am I even remotely on the right track? How should I handle the next part of the code mentioned above? So far there are no errors, and I can start the app with a blank red rectangle where the waveform should be (which makes sense considering my code is incomplete).

    Thanks!
     
  5. solinari6, Mar 26, 2015
    Last edited: Mar 26, 2015

    solinari6 macrumors member

    Joined:
    Aug 13, 2008
  6. 1458279, Mar 26, 2015
    Last edited: Mar 26, 2015

    1458279 Suspended

    1458279

    Joined:
    May 1, 2010
    Location:
    California
    #6
    Code:
    [self.audioFile getWaveformDataWithCompletionBlock:^(float *waveformData, UInt32 length) {
        [self.audioPlot updateBuffer:waveformData withBufferSize:length];
      }];
    Ok, that's a block which starts with the ^
    Looks like it's a block with:
    1. no name
    2. two parameters called waveformData and length
    3. of types float and UInt32
    4. the whole block is passed to self.audioFile as the getWaveformDataWithCompletionBlock: argument
    5. inside the block, the data is handed off to self.audioPlot via updateBuffer:withBufferSize:

    The format is: [someObject someMethodThatTakesABlock:^returnType (parameters) {...}];

    Google: objective c blocks then swift blocks

    I haven't seen that format of a block before; it looks a bit odd to me.
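    In Swift, the counterpart of an Objective-C block is a closure, and a method that takes a completion block can be called with trailing-closure syntax. Here's a minimal pure-Swift sketch of the pattern (getFakeWaveformData is a made-up stand-in that mirrors the shape of the EZAudio call, not its real API):

```swift
// A stand-in for a method that takes a completion block, mirroring the
// shape of EZAudio's getWaveformDataWithCompletionBlock: (the name and
// the sample data here are invented for illustration).
func getFakeWaveformData(completion: ([Float], UInt32) -> Void) {
    let samples: [Float] = [0.0, 0.5, 1.0, 0.5, 0.0]
    completion(samples, UInt32(samples.count))
}

// Swift's trailing-closure syntax: the { ... } after the call replaces
// the ^{ ... } block from the Objective-C version.
getFakeWaveformData { samples, length in
    print("received \(length) samples")
}
```

    The closure's parameter names and types take the place of the block's `^(float *waveformData, UInt32 length)` signature, and Swift infers the types when they're unambiguous.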
     
  7. HoboFlo thread starter macrumors newbie

    Joined:
    Jan 9, 2014
    #7
    Thanks for the breakdown! Looks like I've got a little learning to do about blocks. I'll post my progress soon.
     
  8. 1458279 Suspended

    1458279

    Joined:
    May 1, 2010
    Location:
    California
    #8
    Yea, I looked at those Swift blocks for a bit and closed the page :D I'm already deep into ObjC and really only read a bit about Swift. I'm getting tired of learning so many languages.

    It is nice to know Swift has the power of blocks, I hope it at least has most of the power that ObjC has.

    If you're in for the long haul, I'd dig into blocks; they're used quite a bit.
     
  9. HoboFlo thread starter macrumors newbie

    Joined:
    Jan 9, 2014
    #9
    Alright, so I wimped out of trying to get EZAudio to work and tried the much simpler FDWaveformView.

    I've managed to get it to work for my purposes. For those interested, here's what I did:

    - Created an Xcode project and set the language to Swift.
    - Downloaded a zip of FDWaveformView and dragged FDWaveformView.h and FDWaveformView.m into my project, responding "yes" to create the bridging header.
    - Added
    Code:
    #import "FDWaveformView.h"
    to the bridging header.
    - Dragged their demo "Submarine.aiff" file into my project.
    - Dragged a View into my main.storyboard and set its class to FDWaveformView, then created an outlet in ViewController.swift.

    Here is my final ViewController.swift file. I chopped out 70% of the features that FDWaveformView has to offer because I only need a static image of the waveform:

    Code:
    import UIKit
    
    class ViewController: UIViewController {
    
        @IBOutlet weak var waveform: FDWaveformView!
        
        
        
        override func viewDidLoad() {
            super.viewDidLoad()
            
        let filePath = NSBundle.mainBundle().pathForResource("Submarine", ofType: "aiff")!
        let fileURL = NSURL(fileURLWithPath: filePath)
    
            self.waveform.audioURL = fileURL
            self.waveform.doesAllowScrubbing = false
            self.waveform.doesAllowStretchAndScroll = false
        }
    
        func waveformViewDidRender(waveformView:FDWaveformView) {
            self.waveform.alpha = 1.0
        }
    }
    Here's the super exciting result!
    [Image: the rendered waveform]

    That's all I needed. Now I can integrate it into my app. I appreciate the help; you guys saved me some time.
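    One caveat for anyone reusing the code above: waveformViewDidRender is a delegate callback, and it only fires if the view's delegate is actually set, which my snippet doesn't do. Here's a minimal pure-Swift sketch of that delegate pattern; the type names are stand-ins for illustration, not FDWaveformView's real API:

```swift
// Minimal sketch of the delegate-callback pattern (stand-in types,
// not FDWaveformView's actual API).
protocol WaveformViewDelegate: AnyObject {
    func waveformViewDidRender(_ waveformView: FakeWaveformView)
}

final class FakeWaveformView {
    weak var delegate: WaveformViewDelegate?
    func render() {
        // ...draw the waveform, then notify whoever registered as delegate
        delegate?.waveformViewDidRender(self)
    }
}

final class Listener: WaveformViewDelegate {
    var didRender = false
    func waveformViewDidRender(_ waveformView: FakeWaveformView) {
        didRender = true
    }
}

let view = FakeWaveformView()
let listener = Listener()
view.delegate = listener   // without this line, the callback never fires
view.render()
print(listener.didRender)  // true
```

    The delegate is held weakly so the view doesn't keep its owner alive, which is why the protocol is class-constrained.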
     
  10. 1458279 Suspended

    1458279

    Joined:
    May 1, 2010
    Location:
    California
    #10
    Glad to see you got a solution! Sometimes it's best to change directions and go with a different product.
     
