
To obtain an audio frame from AVCaptureSession, you can use the AVCaptureAudioDataOutput class. Here is an example of how you can set it up and receive audio samples:

  1. Create an instance of AVCaptureSession:
let session = AVCaptureSession()
  2. Add an input for the audio device:
guard let audioDevice = AVCaptureDevice.default(for: .audio),
      let audioInput = try? AVCaptureDeviceInput(device: audioDevice) else {
    print("Error: Unable to create audio input")
    return
}
session.addInput(audioInput)
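Before the session can deliver any audio, the app also needs microphone permission (and an `NSMicrophoneUsageDescription` entry in Info.plist). A minimal sketch of requesting it up front:

```swift
import AVFoundation

// Ask for microphone access before configuring the capture session.
// Requires an NSMicrophoneUsageDescription key in Info.plist, or the
// app will crash when capture starts.
AVCaptureDevice.requestAccess(for: .audio) { granted in
    if granted {
        // Safe to configure and start the capture session here
    } else {
        print("Microphone access denied")
    }
}
```

The completion handler runs on an arbitrary queue, so dispatch back to the main queue if you touch UI from it.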
  3. Add an output for the audio samples:
let audioOutput = AVCaptureAudioDataOutput()

if session.canAddOutput(audioOutput) {
    session.addOutput(audioOutput)
    let audioQueue = DispatchQueue(label: "AudioQueue", qos: .background, target: nil)
    audioOutput.setSampleBufferDelegate(self, queue: audioQueue)
} else {
    print("Error: Unable to add audio output to session")
}
  4. Implement the AVCaptureAudioDataOutputSampleBufferDelegate to receive audio samples:
extension YourViewController: AVCaptureAudioDataOutputSampleBufferDelegate {

    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        var audioBufferList = AudioBufferList()
        var blockBuffer: CMBlockBuffer?

        // Copy the sample buffer's AudioBufferList and retain the block
        // buffer that owns the audio memory.
        let status = CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
            sampleBuffer,
            bufferListSizeNeededOut: nil,
            bufferListOut: &audioBufferList,
            bufferListSize: MemoryLayout<AudioBufferList>.size,
            blockBufferAllocator: nil,
            blockBufferMemoryAllocator: nil,
            flags: kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment,
            blockBufferOut: &blockBuffer
        )
        guard status == noErr else {
            print("Error: could not get audio buffer list (\(status))")
            return
        }

        for buffer in UnsafeMutableAudioBufferListPointer(&audioBufferList) {
            // buffer.mData points at buffer.mDataByteSize bytes of samples
            // Process the channel data here
        }

        // No manual release is needed: ARC releases blockBuffer (and with
        // it the retained audio memory) when it goes out of scope.
    }
}
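As an example of what "process the channel data" might look like: for each `AudioBuffer` in the list (call it `buffer` here), you can reinterpret the raw bytes as samples. This sketch assumes the output delivers 16-bit signed linear PCM, which is the usual default on iOS (macOS delivers 32-bit float by default, so check the format description before assuming):

```swift
// Inside the per-buffer loop: view the raw bytes as Int16 samples,
// assuming 16-bit linear PCM output (verify via the sample buffer's
// format description in real code).
if let data = buffer.mData {
    let sampleCount = Int(buffer.mDataByteSize) / MemoryLayout<Int16>.size
    let samples = data.bindMemory(to: Int16.self, capacity: sampleCount)
    var peak: Int16 = 0
    for i in 0..<sampleCount {
        peak = max(peak, abs(samples[i]))  // simple peak-level meter
    }
}
```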

In this implementation, we use the CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer function to copy the sample buffer's AudioBufferList and retain the CMBlockBuffer that owns the audio memory. We then loop through the AudioBufferList (via UnsafeMutableAudioBufferListPointer) to process each buffer's data separately.

Note that you should check the returned OSStatus for errors. In Swift, you do not call CFRelease on the sample buffer (it is unavailable under ARC); the retained block buffer, which keeps the audio memory alive, is released automatically when it goes out of scope.
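One step the snippets above leave out: nothing is delivered until the session is started. `startRunning()` blocks until capture starts (or fails), so call it off the main thread:

```swift
// startRunning() is a blocking call; keep it off the main thread
// so the UI stays responsive while the capture pipeline spins up.
DispatchQueue.global(qos: .userInitiated).async {
    session.startRunning()
}
```

Call `session.stopRunning()` (also off the main thread) when you are done capturing.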