Performance for retrieving PHAssets with moments


I am currently building a gallery of user photos into an app. So far I simply listed all the user's photos in a UICollectionView. Now I would like to add moment clusters as sections, similar to the iOS Photos app.

What I am doing (a bit simplified):

import Photos

// Fetch the top-level moment clusters, then the moments in each cluster,
// and finally one asset fetch result per moment.
let momentClusters = PHCollectionList.fetchMomentLists(with: .momentListCluster, options: options)
momentClusters.enumerateObjects { (momentCluster, _, _) in
    let moments = PHAssetCollection.fetchMoments(inMomentList: momentCluster, options: nil)
    var assetFetchResults: [PHFetchResult<PHAsset>] = []
    moments.enumerateObjects { (moment, _, _) in
        let fetchResult = PHAsset.fetchAssets(in: moment, options: options)
        assetFetchResults.append(fetchResult)
    }
    // Save assetFetchResults somewhere and use it in the UICollectionView data source methods.
}

Turns out this is a lot more time-intensive than what I did before: up to a minute, compared to about 2 seconds, on my iPhone X with a library of about 15k pictures. Obviously, this is unacceptable.

Why is the performance of fetching moments so bad, and how can I improve it? Am I using the API wrong?

I tried loading assets on demand, but it's very difficult, since I then have to work with estimated item counts per moment and reload sections while the user is scrolling. I couldn't get this to work in a way that is satisfactory (smooth scrolling, no noticeable reload).

Any help? How is this API supposed to work? Am I using it wrong?

Update / Part solution

So after playing around, it turns out that the following was a big part of the problem: I was fetching assets using options with a sort descriptor:

let options = PHFetchOptions()
options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]
options.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.image.rawValue)
let assets = PHAsset.fetchAssets(in: moment, options: options)

It seems sorting doesn't allow PhotoKit to make use of indices or caches it has internally. Removing the sortDescriptors speeds up the fetch significantly. It's still slower than before and any further tips are appreciated, but this makes loading times way more bearable.

Note that, without the sort descriptor, assets are returned oldest first, but this is easily fixed manually by indexing into the fetch result in reverse in cellForItemAt: (so the cell at item 0 of a section gets the last asset of that section's fetch result).
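A minimal sketch of that reversed lookup in cellForItemAt, assuming `assetFetchResults` holds one `PHFetchResult<PHAsset>` per section and `PhotoCell` is a hypothetical cell class with an `imageView`:

```swift
import Photos
import UIKit

// Assumed to be populated as in the question, one fetch result per moment/section.
var assetFetchResults: [PHFetchResult<PHAsset>] = []
let imageManager = PHCachingImageManager()

func collectionView(_ collectionView: UICollectionView,
                    cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
    let cell = collectionView.dequeueReusableCell(withReuseIdentifier: "PhotoCell",
                                                 for: indexPath) as! PhotoCell
    let fetchResult = assetFetchResults[indexPath.section]
    // Assets come back oldest-first, so map item 0 to the newest asset.
    let reversedIndex = fetchResult.count - 1 - indexPath.item
    let asset = fetchResult.object(at: reversedIndex)
    imageManager.requestImage(for: asset,
                              targetSize: CGSize(width: 200, height: 200),
                              contentMode: .aspectFill,
                              options: nil) { image, _ in
        cell.imageView.image = image
    }
    return cell
}
```

Using `PHCachingImageManager` here also lets you pre-warm thumbnails later via `startCachingImages` if needed.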

1 Answer

Disclaimer: Performance-related answers are necessarily speculative...

You've described two extremes:

  • Prefetch all assets for all moments before displaying the collection
  • Fetch all assets lazily, use estimated counts and reload

But there are in-between options. For example, you can fetch only the moments up front, then let the collection view drive fetching assets per moment. Instead of waiting until a moment is visible before fetching its contents, adopt the UICollectionViewDataSourcePrefetching protocol to start fetching items before they are due to appear on screen.
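A sketch of that middle ground, assuming a `moments` fetch result and a per-section cache of asset fetch results (the names `assetCache` and `fetchQueue` are illustrative, not part of PhotoKit):

```swift
import Photos
import UIKit

final class GalleryViewController: UICollectionViewController,
                                   UICollectionViewDataSourcePrefetching {
    private var moments: PHFetchResult<PHAssetCollection>!
    // Section index -> lazily fetched assets for that moment.
    private var assetCache: [Int: PHFetchResult<PHAsset>] = [:]
    private let fetchQueue = DispatchQueue(label: "gallery.fetch", qos: .userInitiated)

    override func viewDidLoad() {
        super.viewDidLoad()
        collectionView.prefetchDataSource = self
        moments = PHAssetCollection.fetchMoments(with: nil)
    }

    // Synchronous fallback for sections the prefetcher hasn't reached yet.
    private func assets(forSection section: Int) -> PHFetchResult<PHAsset> {
        if let cached = assetCache[section] { return cached }
        let result = PHAsset.fetchAssets(in: moments.object(at: section), options: nil)
        assetCache[section] = result
        return result
    }

    // Called before items scroll on screen; kick off per-moment fetches early.
    func collectionView(_ collectionView: UICollectionView,
                        prefetchItemsAt indexPaths: [IndexPath]) {
        let sections = Set(indexPaths.map { $0.section })
        for section in sections where assetCache[section] == nil {
            fetchQueue.async { [weak self] in
                guard let self = self else { return }
                let result = PHAsset.fetchAssets(in: self.moments.object(at: section),
                                                 options: nil)
                // Store on the main queue, where the data source reads it.
                DispatchQueue.main.async { self.assetCache[section] = result }
            }
        }
    }
}
```

The cache is only mutated on the main queue so the data source methods never race with the background fetches; a fetch may run twice for the same section in the worst case, which is harmless since `PHFetchResult` is immutable.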