In my app I have different list views that contain some thumbnails. Today I'm starting the refactoring and I want to implement LRU caching. I'm following the Android guidelines, but I'm wondering whether it is better to initialize only one LRU cache for the entire app, or one LRU cache for each list view. I'm afraid of OutOfMemory errors. These are the questions I can't answer by myself:
- Is one LRU cache initialized with the singleton pattern a good idea?
- If memory is low, can the following initialization of the LRU cache lead to an OutOfMemory situation?
@Override
protected void onCreate(Bundle savedInstanceState) {
    ...
    // Get max available VM memory, exceeding this amount will throw an
    // OutOfMemory exception. Stored in kilobytes as LruCache takes an
    // int in its constructor.
    final int maxMemory = (int) (Runtime.getRuntime().maxMemory() / 1024);
    // Use 1/8th of the available memory for this memory cache.
    final int cacheSize = maxMemory / 8;
    mMemoryCache = new LruCache<String, Bitmap>(cacheSize) {
        @Override
        protected int sizeOf(String key, Bitmap bitmap) {
            // The cache size will be measured in kilobytes rather than
            // number of items.
            return bitmap.getByteCount() / 1024;
        }
    };
    ...
}
- If memory is low, is the LRU cache released automatically? I'm wondering if the app will have problems releasing memory when I use an LRU cache (can the app crash because of out of memory?).
- Can a single LRU cache for the entire app be a problem?
- Can more than one LRU cache in the app be a problem?
I had a lot of problems with this, so I can give you some helpful answers here.
The LRU Cache will not automatically release memory. You will need to evict the entries programmatically.
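For example, a minimal sketch of evicting entries yourself, assuming mMemoryCache is the cache from your snippet and imageUrl is some key you put into it:

// Drop a single entry by its key.
mMemoryCache.remove(imageUrl);

// Drop everything, e.g. when the screen that needed the thumbnails goes away.
mMemoryCache.evictAll();

// Or shrink the cache to a smaller budget without clearing it completely.
mMemoryCache.trimToSize(cacheSize / 2);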
The LruCache class is a generic class with a type for the key and a type for the value. I would say you want to have one LRU cache for each object type that you are caching. You are doing what I did: caching Bitmaps and keying with Strings.
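To make that concrete, here is a minimal sketch of a single, app-wide holder for a Bitmap cache; ImageCache is a name I'm inventing here, and the sizing simply mirrors your snippet:

import android.graphics.Bitmap;
import android.util.LruCache;

public final class ImageCache {
    private static LruCache<String, Bitmap> sCache;

    // Lazily create one shared cache, sized to 1/8th of the app's max heap (in KB).
    public static synchronized LruCache<String, Bitmap> get() {
        if (sCache == null) {
            final int maxMemoryKb = (int) (Runtime.getRuntime().maxMemory() / 1024);
            sCache = new LruCache<String, Bitmap>(maxMemoryKb / 8) {
                @Override
                protected int sizeOf(String key, Bitmap bitmap) {
                    return bitmap.getByteCount() / 1024; // measure entries in kilobytes
                }
            };
        }
        return sCache;
    }

    private ImageCache() { }
}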
Just a side note: be careful with bitmap.getByteCount(). Its JavaDoc warns that, starting with KitKat, the result can no longer be used to determine a bitmap's actual memory usage; getAllocationByteCount() is the value to look at for that.

I was using an LRU Cache for Bitmaps. I thought it would be sufficient to override Application.onTrimMemory() and clear the cache whenever that method was called. Also, I did the same thing you did and set the cache size based on some percentage of the heap memory available to the app.
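For reference, the kind of override I mean is roughly this (MyApplication is a placeholder, and it reuses the hypothetical ImageCache holder sketched above):

import android.app.Application;
import android.content.ComponentCallbacks2;

public class MyApplication extends Application {
    @Override
    public void onTrimMemory(int level) {
        super.onTrimMemory(level);
        // On a memory-pressure callback, drop the cached bitmaps.
        if (level >= ComponentCallbacks2.TRIM_MEMORY_RUNNING_LOW) {
            ImageCache.get().evictAll();
        }
    }
}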
But here's what would happen: when my app was in a low-memory state and tried to download an image with insufficient memory, the GC would run, but BitmapFactory would still be unable to allocate memory for the Bitmap. The logs showed that the onTrimMemory() method was being called asynchronously, sometimes almost a full second after the OutOfMemoryError was thrown!

HEY GOOGLE: IF THE SYSTEM CAN'T TELL ME I'M LOW ON MEMORY UNTIL AFTER OutOfMemoryError IS THROWN, HOW IN THE HELL AM I SUPPOSED TO MANAGE MY MEMORY?

Insanity. Sheer, utter insanity.
Here's what I ended up doing: I would catch OutOfMemoryError in a try block, clear the cache there, then retry the image request to the server. The exact thing they tell you not to do. But it ended up fixing my problem; the app is a lot more stable now.
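A rough sketch of that pattern; downloadBitmap() is a hypothetical helper that fetches and decodes the image, and catching OutOfMemoryError like this is exactly the practice the docs warn against, so treat it as a last resort rather than the recommended way:

Bitmap bitmap;
try {
    bitmap = downloadBitmap(url);
} catch (OutOfMemoryError e) {
    // Free the cached bitmaps, then give the request one more chance.
    mMemoryCache.evictAll();
    bitmap = downloadBitmap(url);
}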
So after you implement your LRU cache, make sure you stress-test your app: try to get it into a low-memory situation and see how it behaves. For me this worked best on an emulator. The emulator would have a small 96 MB heap limit, but if it had a high screen resolution the image resources would scale to be pretty big, which made pushing the memory to the max fairly easy.
If you are displaying thumbnails, but you are getting images from a server and they might be larger than your ImageView, make sure you read the article Loading Large Bitmaps Efficiently | Android Developers to learn how to download bitmaps of an appropriate size without wasting memory. You might also experiment with just letting the system do the caching and setting up an HttpResponseCache; a sketch of both ideas follows.
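Here is a sketch of the decode-at-target-size idea from that article, assuming you already have the file path and the target view dimensions:

BitmapFactory.Options options = new BitmapFactory.Options();
options.inJustDecodeBounds = true;       // read only the dimensions, no pixel allocation
BitmapFactory.decodeFile(path, options);

// Pick a power-of-two sample size so the decoded bitmap roughly fits the target view.
int inSampleSize = 1;
while (options.outWidth / (inSampleSize * 2) >= targetWidth
        && options.outHeight / (inSampleSize * 2) >= targetHeight) {
    inSampleSize *= 2;
}

options.inJustDecodeBounds = false;
options.inSampleSize = inSampleSize;
Bitmap thumbnail = BitmapFactory.decodeFile(path, options);

And installing an HttpResponseCache is a one-time call, typically in onCreate(); the directory name and the 10 MB size here are just examples:

try {
    File httpCacheDir = new File(getCacheDir(), "http");
    long httpCacheSize = 10L * 1024 * 1024; // 10 MiB
    HttpResponseCache.install(httpCacheDir, httpCacheSize);
} catch (IOException e) {
    // The app still works without the cache; log and move on.
    Log.i("ImageList", "HTTP response cache installation failed: " + e);
}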
Whatever you end up doing, make sure to stress your app and see how it behaves when there's not much heap left.
And be prepared to deal with a little frustration.