Monday, March 30, 2015

How to record & play audio using SWIFT?

In this post you will learn how to record a voice in one scene, play that recording back in the next scene, and then add various effects to it using the Swift programming language.

Read this blog post first to create the basic structure required for this project.

Recording Voice

Here is the code snippet to record a voice within an iOS application (i.e. the RecordSoundsViewController.swift file):

import AVFoundation

//Declared globally
var audioRecorder:AVAudioRecorder!

//Inside func recordAudio(sender: UIButton)
//Get the path of the app's Documents directory
let dirPath = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)[0] as String

//Build a unique file name from the current date and time, e.g. 30032015-103045.wav
let currentDateTime = NSDate()
let formatter = NSDateFormatter()
formatter.dateFormat = "ddMMyyyy-HHmmss"
let recordingName = formatter.stringFromDate(currentDateTime)+".wav"
let pathArray = [dirPath, recordingName]
let filePath = NSURL.fileURLWithPathComponents(pathArray)
println(filePath)

//Configure the shared audio session for both playback and recording
var session = AVAudioSession.sharedInstance()
session.setCategory(AVAudioSessionCategoryPlayAndRecord, error: nil)

//Create the recorder and start recording
audioRecorder = AVAudioRecorder(URL: filePath, settings: nil, error: nil)
audioRecorder.delegate = self //Add this line only if the delegate is adopted in the class declaration
audioRecorder.meteringEnabled = true
audioRecorder.prepareToRecord()
audioRecorder.record()

//Inside func stopAudio(sender: UIButton)
audioRecorder.stop()
//Deactivate the audio session once recording has stopped
var audioSession = AVAudioSession.sharedInstance()
audioSession.setActive(false, error: nil)

  • NSSearchPathForDirectoriesInDomains: This function returns the paths of the requested directory; the code above takes the first result as a string.
  • DocumentDirectory: An iOS application can create/read files and folders only within its own Documents directory, so .DocumentDirectory is specified as the location to save the recorded voice.

Segue

A segue provides the ability to transfer data from one scene to another. This is done by adding a segue between the scenes in a storyboard. To add a segue in a storyboard:
  • Ctrl+click on the small view controller icon of the first scene (RecordSoundsViewController) and drag it onto the second scene (PlaySoundsViewController).
  • Select the show option from the popup window.
  • Select the segue and enter a name (identifier) for it in the Attributes Inspector.
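
Once the segue has an identifier, it can also be triggered from code. Here is a minimal sketch, assuming a hypothetical identifier "stopRecording":

//Inside RecordSoundsViewController; "stopRecording" is an assumed identifier -
//use whatever name you entered in the Attributes Inspector
performSegueWithIdentifier("stopRecording", sender: nil)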

Delegate

To delegate means to designate some responsibilities of a main resource to a secondary resource, or to act/respond on behalf of another. In iOS programming you do this by implementing the delegation design pattern.

For example, in the above code you record a voice in one scene and play that voice back from another scene. How does the second scene know that the audio recording is finished in the previous scene? Apple's AVAudioRecorder class has methods to record audio and to report whether that recording has finished. But your custom class has no visibility into those events. To overcome this problem, your custom class has to become a delegate of AVAudioRecorder.

Here are the steps to add the delegate (i.e. RecordSoundsViewController); a code sketch follows the list below.

  1. Add AVAudioRecorderDelegate to your class declaration.
  2. Assign your view controller to the audioRecorder.delegate property. By adding this code your view controller becomes the delegate of the AVAudioRecorder instance. The preferred place to add this line is in recordAudio() of your class.
  3. Implement the audioRecorderDidFinishRecording() method of AVAudioRecorderDelegate in your class. This method is invoked once the recording is finished or the user taps the stop record button.
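
Here is a minimal sketch of the three steps above (the recorder setup from the earlier snippet is elided):

import UIKit
import AVFoundation

//Step 1: adopt AVAudioRecorderDelegate in the class declaration
class RecordSoundsViewController: UIViewController, AVAudioRecorderDelegate {

    var audioRecorder:AVAudioRecorder!

    @IBAction func recordAudio(sender: UIButton) {
        //...recorder setup from the earlier snippet...
        //Step 2: make this view controller the recorder's delegate
        audioRecorder.delegate = self
        audioRecorder.record()
    }

    //Step 3: called by AVAudioRecorder once recording has finished
    func audioRecorderDidFinishRecording(recorder: AVAudioRecorder!, successfully flag: Bool) {
        println("Finished recording: \(flag)")
    }
}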

Model 

  • The model is used to store the recorded voice in a file and capture its properties, like file name, path etc.
  • Create a model named "RecordedAudio" with the following definition.
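
A minimal sketch of the model; filePathUrl is used later to play the file back, and the title property is an assumed convenience for displaying the file name:

import Foundation

class RecordedAudio: NSObject {
    var filePathUrl: NSURL!   //Location of the recorded file on disk
    var title: String!        //Assumed property: a display name for the recording
}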


  • Declare a global variable of type "RecordedAudio" in your view controller class (i.e. RecordSoundsViewController).
  • Modify the audioRecorderDidFinishRecording() method to save the file using the model. A code sketch for this is shown below.
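
A sketch of the modified method, assuming a global recordedAudio variable of type RecordedAudio and the hypothetical segue identifier "stopRecording":

func audioRecorderDidFinishRecording(recorder: AVAudioRecorder!, successfully flag: Bool) {
    if flag {
        //Save the file details in the model
        recordedAudio = RecordedAudio()
        recordedAudio.filePathUrl = recorder.url
        recordedAudio.title = recorder.url.lastPathComponent
        //Move to the next scene, passing the model as the sender
        performSegueWithIdentifier("stopRecording", sender: recordedAudio)
    } else {
        println("Recording was not successful")
    }
}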


Pass Model via Segue

  • You can pass data from one scene to another via a segue. This is done by implementing/overriding the prepareForSegue() method of the UIViewController class in your first scene (i.e. RecordSoundsViewController). The prepareForSegue() method is invoked when the segue is about to be performed.
  • Declare a global variable of type RecordedAudio in the second scene (i.e. PlaySoundsViewController) to keep the data passed from the first scene (i.e. RecordSoundsViewController).
  • A code sketch for the prepareForSegue() method is shown after this list.
  • Play the audio using the model in the second scene by initializing the audio player with the model's file URL (also shown below).
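
A minimal sketch of both pieces, again assuming the hypothetical "stopRecording" identifier:

//In RecordSoundsViewController
override func prepareForSegue(segue: UIStoryboardSegue, sender: AnyObject?) {
    if segue.identifier == "stopRecording" {
        //Hand the model over to the destination view controller
        let playSoundsVC = segue.destinationViewController as PlaySoundsViewController
        let data = sender as RecordedAudio
        playSoundsVC.receivedAudio = data
    }
}

//In PlaySoundsViewController
var receivedAudio: RecordedAudio!
var audioPlayer: AVAudioPlayer!

//In viewDidLoad, initialize the player from the model's file URL
audioPlayer = AVAudioPlayer(contentsOfURL: receivedAudio.filePathUrl, error: nil)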

Adding Effects

Here are the steps we need to implement to add sound effects to an audio file (i.e. PlaySoundsViewController):

  1. Create an object of type AVAudioEngine. The AVAudioEngine class is used to manipulate and play audio files.
  2. Create an object of type AVAudioPlayerNode. The AVAudioPlayerNode class is used to play buffers or segments of an audio file. This is like connecting an MP3 player and actually playing the audio.
  3. Attach the AVAudioPlayerNode object to the AVAudioEngine object.
  4. Create an object of type AVAudioUnitTimePitch. This object allows us to change the pitch of an audio file.
  5. Attach the AVAudioUnitTimePitch object to the AVAudioEngine object.
  6. Connect the AVAudioPlayerNode object to the AVAudioUnitTimePitch object.
  7. Connect the AVAudioUnitTimePitch object to the output (i.e. speakers).
  8. Play the sound.

Here is the code snippet for the steps explained above.

//Declared globally within PlaySoundsViewController
var audioEngine:AVAudioEngine!
var audioFile:AVAudioFile!

//In viewDidLoad
audioEngine = AVAudioEngine()
//Open the recorded file passed in via the model
audioFile = AVAudioFile(forReading: receivedAudio.filePathUrl, error: nil)

//In playChipmunkAudio
playAudioWithVariablePitch(1000)

//New function
func playAudioWithVariablePitch(pitch: Float){
    //Stop any ongoing playback before rebuilding the engine graph
    audioPlayer.stop()
    audioEngine.stop()
    audioEngine.reset()

    //Steps 2 & 3: create the player node and attach it to the engine
    var audioPlayerNode = AVAudioPlayerNode()
    audioEngine.attachNode(audioPlayerNode)

    //Steps 4 & 5: create the pitch effect and attach it to the engine
    var changePitchEffect = AVAudioUnitTimePitch()
    changePitchEffect.pitch = pitch
    audioEngine.attachNode(changePitchEffect)

    //Steps 6 & 7: connect player -> pitch effect -> speakers
    audioEngine.connect(audioPlayerNode, to: changePitchEffect, format: nil)
    audioEngine.connect(changePitchEffect, to: audioEngine.outputNode, format: nil)

    //Step 8: queue up the whole file, start the engine, and play
    audioPlayerNode.scheduleFile(audioFile, atTime: nil, completionHandler: nil)
    audioEngine.startAndReturnError(nil)

    audioPlayerNode.play()
}
