I'm making a metronome using AVAudioEngine, AVAudioPlayerNode, and AVAudioPCMBuffer. The buffer is created like so:
/// URL of the sound file
let soundURL = Bundle.main.url(forResource: <filename>, withExtension: "wav")!
/// Create audio file
let audioFile = try! AVAudioFile(forReading: soundURL)
let audioFormat = audioFile.processingFormat
/// Create the buffer - what value to put for frameCapacity?
if let buffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: ???) {
buffer.frameLength = audioFrameCount
try? audioFile.read(into: buffer)
return buffer
}
What value do I put for frameCapacity in the AVAudioPCMBuffer initializer?
The documentation says frameCapacity should be "The capacity of the buffer in PCM sample frames." What does that mean? Is this a static value or do you grab it from the audio file?
frameCapacity is the maximum number of frames that an AVAudioPCMBuffer can hold. You don't have to use all of the frames. Consumers of AVAudioPCMBuffers should only ever consult frameLength frames, and frameLength <= frameCapacity. A capacity different from the length can be useful if you're processing audio in chunks of N frames and, for whatever reason, you get a short read.
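A minimal sketch of that chunked pattern (process(_:) is a placeholder for your own per-chunk handling, and the 4096-frame chunk size is arbitrary):

let chunkSize: AVAudioFrameCount = 4096
let chunkBuffer = AVAudioPCMBuffer(pcmFormat: audioFile.processingFormat, frameCapacity: chunkSize)!
while audioFile.framePosition < audioFile.length {
    try audioFile.read(into: chunkBuffer, frameCount: chunkSize)
    // On the last pass the read can come up short: chunkBuffer.frameLength
    // drops below chunkSize, while frameCapacity stays at 4096.
    process(chunkBuffer)
}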
But if you're only ever going to store audioFrameCount frames (the length of your file?) in the buffer, then set frameCapacity to that.
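Applied to your snippet (assuming audioFrameCount is the file's length in frames; note that read(into:) sets frameLength for you, so the manual assignment isn't needed):

let audioFrameCount = AVAudioFrameCount(audioFile.length)
if let buffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: audioFrameCount) {
    try? audioFile.read(into: buffer)   // on return, buffer.frameLength == frames actually read
    return buffer
}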