AVPlayerLoop not seamlessly looping - Swift 4


My problem is: I am trying to achieve seamless looping (I intend to make my AVPlayer or AVQueuePlayer loop without any delay between playbacks). So, for example, if I record a video and go to its playback, it should loop endlessly without any blips or delays between iterations.

I have written the code below (it's straight from example code, too):

    var playerQQ: AVQueuePlayer!
    var playerLayur: AVPlayerLayer!
    var playerEyetem: AVPlayerItem!
    var playerLooper: AVPlayerLooper!

    func playRecordedVideo(videoURL: URL) {
        playerQQ = AVQueuePlayer()
        playerLayur = AVPlayerLayer(player: playerQQ)
        playerLayur.frame = (camBaseLayer?.bounds)!
        camBaseLayer?.layer.insertSublayer(playerLayur, above: previewLayer)

        playerEyetem = AVPlayerItem(url: videoURL)
        playerLooper = AVPlayerLooper(player: playerQQ, templateItem: playerEyetem)
        playerQQ.play()
    }

The code above does not loop seamlessly; there is a blip between the end of the current pass and the start of the next one. I have searched for the cause online and have not found a solution. I have also tried NSNotification-based approaches, including calling player.seek(to: .zero) when the player finishes playback, but nothing has worked at all.

Any help would be appreciated :)

4 Answers

NoHalfBits (Best Answer)

One thing to keep in mind when looping assets is that audio and video tracks can have different offsets and different durations, resulting in 'blips' when looping. Such small differences are quite common in recorded assets.

Iterating over the tracks and printing their time ranges can help to detect such situations:

    for track in asset.tracks {
        print(track.mediaType)
        CMTimeRangeShow(track.timeRange)
    }

To trim audio and video tracks to equal start times and equal durations, get the common time range of the tracks, and then insert this time range from the original asset into a new AVMutableComposition. Normally, you also want to preserve properties like the orientation of the video track:

    let asset: AVAsset = (your asset initialization here)

    let videoTrack: AVAssetTrack = asset.tracks(withMediaType: .video).first!
    let audioTrack: AVAssetTrack = asset.tracks(withMediaType: .audio).first!

    // calculate the common time range of the audio and video tracks
    let timeRange: CMTimeRange = CMTimeRangeGetIntersection(videoTrack.timeRange, audioTrack.timeRange)

    let composition = AVMutableComposition()
    try composition.insertTimeRange(timeRange, of: asset, at: kCMTimeZero)

    // preserve the orientation of the video track
    composition.tracks(withMediaType: .video).first!.preferredTransform = videoTrack.preferredTransform

Since AVMutableComposition is a subclass of AVAsset, it can be used for AVPlayerLooper-based looping playback, or exporting with AVAssetExportSession.
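A minimal sketch of feeding such a composition to AVPlayerLooper might look like this (`containerLayer` is an assumed stand-in for whatever layer you attach the player layer to, not something from the answer above):

    // A minimal sketch, assuming `composition` is the trimmed composition
    // from above and `containerLayer` is an existing CALayer in your view.
    let loopItem = AVPlayerItem(asset: composition)
    let queuePlayer = AVQueuePlayer()
    playerLooper = AVPlayerLooper(player: queuePlayer, templateItem: loopItem)

    let playerLayer = AVPlayerLayer(player: queuePlayer)
    playerLayer.frame = containerLayer.bounds
    containerLayer.addSublayer(playerLayer)

    queuePlayer.play()

Note that the AVPlayerLooper must be kept alive (e.g. stored in a property, as in the question); if it is deallocated, looping stops.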

I've put a more complete trimming implementation on GitHub: https://github.com/fluthaus/NHBAVAssetTrimming. It's more robust, handles multiple tracks, preserves more properties, and can either be easily integrated into projects or built as a standalone macOS command-line movie-trimming utility.

John Lanzivision

If you want to restart playback when the item plays to the end, try:

    NotificationCenter.default.addObserver(self,
                                           selector: #selector(playerItemDidReachEnd(notification:)),
                                           name: Notification.Name.AVPlayerItemDidPlayToEndTime,
                                           object: avPlayer?.currentItem)

    @objc func playerItemDidReachEnd(notification: Notification) {
        if let playerItem = notification.object as? AVPlayerItem {
            playerItem.seek(to: kCMTimeZero, completionHandler: nil)
        }
    }

If not, I would suggest managing the loop yourself with a timer (fire a Timer every 1/30th of a second or so) and seek with something like this:

    player.seek(to: seekTimeInProgress,
                toleranceBefore: kCMTimeZero,
                toleranceAfter: kCMTimeZero,
                completionHandler: { finished in /* ... */ })

The kCMTimeZero tolerances are extremely important; without them the seek time won't be exact. And finally, I've found there is a load time when restarting videos, depending on the iOS device, the length of the video, and how many you're playing, so if you're still getting that lag after you eliminate the timing issues, you may be forced to account for it in your UX.
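A minimal sketch of that timer-driven approach, assuming `player` is an AVPlayer property with a loaded item (all other names here are illustrative, not from the original answer):

    var loopTimer: Timer?

    func startLoopTimer() {
        // Poll roughly 30 times per second; when the playhead reaches the
        // end, seek back to zero with zero tolerance and resume playback.
        loopTimer = Timer.scheduledTimer(withTimeInterval: 1.0 / 30.0, repeats: true) { [weak self] _ in
            guard let player = self?.player,
                  let item = player.currentItem else { return }
            if CMTimeCompare(player.currentTime(), item.duration) >= 0 {
                player.seek(to: kCMTimeZero,
                            toleranceBefore: kCMTimeZero,
                            toleranceAfter: kCMTimeZero) { _ in
                    player.play()
                }
            }
        }
    }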

AudioBubble

The answer from @NoHalfBits works great, but I also found another solution. I basically got the intersection time range of the video and audio media types from the playerItem's asset. Then I passed that intersection as the timeRange parameter when creating the looper:

    playerLooper = AVPlayerLooper(player: playerQQ, templateItem: playerEyetem, timeRange: intersectionTimeRange)

This will work! To get the time range of each track, set up a for loop over the playerItem's asset's tracks.
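A minimal sketch of computing that intersection, assuming the same playerQQ and playerEyetem as in the question and an asset containing one video and one audio track:

    // Start with the whole asset, then intersect with every track's
    // time range to get the span covered by both audio and video.
    let asset = playerEyetem.asset
    var intersectionTimeRange = CMTimeRange(start: kCMTimeZero, duration: asset.duration)
    for track in asset.tracks {
        intersectionTimeRange = CMTimeRangeGetIntersection(intersectionTimeRange, track.timeRange)
    }
    playerLooper = AVPlayerLooper(player: playerQQ,
                                  templateItem: playerEyetem,
                                  timeRange: intersectionTimeRange)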

Dragon.Yao

It looks like a problem with .mp4 files; convert the .mp4 file to a .mov file. With the .mov file, both AVPlayer and AVQueuePlayer work fine. Here is my code:

    NotificationCenter.default.addObserver(forName: NSNotification.Name.AVPlayerItemDidPlayToEndTime,
                                           object: nil,
                                           queue: nil) { [weak self] (noty) in
        self?.player?.seek(to: CMTime.zero)
        self?.player?.play()
    }

or

    // playerLooper and plyerLayer are stored properties; keeping a strong
    // reference to the looper is required, or looping stops.
    let asset = AVAsset(url: URL(fileURLWithPath: movPath))
    let playerItem = AVPlayerItem(asset: asset)
    let player = AVQueuePlayer(playerItem: playerItem)
    playerLooper = AVPlayerLooper(player: player, templateItem: playerItem)

    plyerLayer.frame = CGRect(x: 0, y: 88, width: kWidth, height: kWidth * 0.75)
    plyerLayer.videoGravity = .resizeAspectFill
    plyerLayer.player = player
    view.layer.addSublayer(plyerLayer)
    player.play()
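For the conversion itself, a minimal sketch using AVAssetExportSession (`mp4URL` and `movURL` are assumed stand-ins, not from the original answer):

    // Remux the .mp4 into a .mov container without re-encoding.
    let mp4Asset = AVAsset(url: mp4URL)
    if let export = AVAssetExportSession(asset: mp4Asset,
                                         presetName: AVAssetExportPresetPassthrough) {
        export.outputURL = movURL
        export.outputFileType = .mov
        export.exportAsynchronously {
            if export.status == .completed {
                // movURL can now be handed to the player code above.
            } else {
                print("Export failed: \(String(describing: export.error))")
            }
        }
    }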