I am currently using a CADisplayLink to grab the current pixel buffer from an AVPlayerItemVideoOutput. The display link callback looks like this:
func displayLinkUpdated(link: CADisplayLink) {
    let time = output.itemTime(forHostTime: CACurrentMediaTime())
    guard output.hasNewPixelBuffer(forItemTime: time),
          let buffer = output.copyPixelBuffer(forItemTime: time, itemTimeForDisplay: nil) else { return }
    ...
}
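For context, here is a minimal sketch of the surrounding setup this callback assumes. The `output` property, the `videoURL` constant, and the class name are my own illustrative placeholders, not code from the original project:

```swift
import AVFoundation
import UIKit

final class PlayerViewController: UIViewController {
    // Illustrative names; the original post only shows the callback below.
    private let player = AVPlayer()
    private let output = AVPlayerItemVideoOutput(pixelBufferAttributes: [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ])
    private let videoURL = URL(fileURLWithPath: "/path/to/video.mp4") // placeholder asset

    override func viewDidLoad() {
        super.viewDidLoad()
        let item = AVPlayerItem(url: videoURL)
        item.add(output) // attach the video output so pixel buffers can be copied out

        player.replaceCurrentItem(with: item)

        // Poll for new pixel buffers once per display refresh.
        let link = CADisplayLink(target: self, selector: #selector(displayLinkUpdated(link:)))
        link.add(to: .main, forMode: .common)
        player.play()
    }

    @objc func displayLinkUpdated(link: CADisplayLink) { /* as shown above */ }
}
```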
I would like to use the same output to also grab a pixel buffer exactly 1 second ahead. Since my video is 30fps, I thought I could use CMTimeMakeWithSeconds to do the following:
func displayLinkUpdated(link: CADisplayLink) {
    let time = output.itemTime(forHostTime: CACurrentMediaTime())
    // The item time exactly 1 second ahead of the current frame.
    let timeB = CMTimeAdd(time, CMTimeMakeWithSeconds(1, preferredTimescale: time.timescale))
    guard output.hasNewPixelBuffer(forItemTime: time),
          let buffer = output.copyPixelBuffer(forItemTime: time, itemTimeForDisplay: nil) else { return }
    guard let bufferB = output.copyPixelBuffer(forItemTime: timeB, itemTimeForDisplay: nil) else { return }
    ...
}
When I attempt this and then perform my ... post-processing, playback becomes incredibly choppy. If I instead set let timeB = time just for the sake of testing, my post-processing and playback are back to normal and smooth.
Is there any explanation for why this approach would not work, or how it could be made more efficient?