My intention is to analyze the bytecode of a Java program and collect data about the data structures it uses. That data includes the initial capacity and how a particular data structure instance has grown throughout the runtime (the growth rate, according to the growth policy of the Java collection classes).
Can I assume that the capacity is proportional to the memory taken by that particular data structure instance?
For example:
I used RamUsageEstimator from com.carrotsearch.sizeof to get the memory size taken by a particular data structure instance.
import java.util.ArrayList;
import java.util.List;
import com.carrotsearch.sizeof.RamUsageEstimator;

List<Integer> intList = new ArrayList<Integer>(4);
for (int i = 0; i < 5; i++) {
    intList.add(i);
    System.out.println("Size(byte) -> " + RamUsageEstimator.sizeOf(intList)); // deep size after each add
}
I ran this code with an ArrayList with an initial capacity of 4 and added 5 elements to it in a loop. According to the growth policy of the Java ArrayList, it should grow by 50%, which means that when I add the 5th element after the 4th, the new capacity should be 6. I got the following results: Bytes -> 72, 88, 104, 120, 144.
We can clearly see that for the first 4 elements the gap is 16 bytes, and at the 5th element it becomes 24 bytes. So it clearly shows the growth and its rate, right?
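For reference, the growth policy I mean is, as far as I understand the OpenJDK source, roughly the following simplified sketch (it ignores the overflow checks and the minimum-capacity argument of the real ArrayList.grow()):

// Simplified sketch of the OpenJDK ArrayList growth rule: capacity grows by ~50%.
static int grownCapacity(int oldCapacity) {
    return oldCapacity + (oldCapacity >> 1); // e.g. 4 -> 6, 6 -> 9, 9 -> 13
}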
Is it possible to achieve my task this way?
Any answer would be great! Thank you!
When you create a heap dump from a running JVM that has eaten up nearly all of the heap available to it, and use a tool like Eclipse MAT to sum up the sizes of all data structures, the result is usually 10% to 250% larger than the size of the heap itself … the magical RAM increase! Or maybe that's because Java knows how to use a single byte of RAM multiple times …
The reason is much more mundane: the same (child) data structure is referenced by multiple parent data structures, and that is not always visible without digging deep. A good example is the old java.util.Date class … Tools like the already mentioned RamUsageEstimator try to make good guesses about the size; they are not named "Estimator" without reason.
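To illustrate the double counting with the same RamUsageEstimator used in the question (the lists here are made up, just to show the effect):

import java.util.Arrays;
import java.util.Date;
import java.util.List;
import com.carrotsearch.sizeof.RamUsageEstimator;

// Two lists that look independent but share the same child object.
Date shared = new Date();
List<Date> a = Arrays.asList(shared, new Date());
List<Date> b = Arrays.asList(shared, new Date());

// Summing the per-structure sizes counts the shared Date twice; measuring
// both structures together counts it once. That is how per-structure sums
// can exceed the real heap usage.
long summed   = RamUsageEstimator.sizeOf(a) + RamUsageEstimator.sizeOf(b);
long together = RamUsageEstimator.sizeOf(new Object[] { a, b });
System.out.println(summed + " >= " + together);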
For your goal, you should compare the size of your data structures when they hold hundreds or thousands of elements with their size after doubling and tripling that number of elements, to get an idea of the behaviour.
Doing this with one, two, or three elements is fruitless at best and misleading at worst.
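A rough sketch of that comparison, in the same style as the snippet in the question (the base size of 1000 elements and the measurement points N, 2N, 3N are arbitrary choices):

import java.util.ArrayList;
import java.util.List;
import com.carrotsearch.sizeof.RamUsageEstimator;

// Measure at N, 2N and 3N elements instead of at 1..5, so fixed per-object
// overhead (headers, padding) stops dominating the measured differences.
int n = 1000; // arbitrary base size
List<Integer> list = new ArrayList<>();
for (int i = 1; i <= 3 * n; i++) {
    list.add(i);
    if (i % n == 0) {
        System.out.println(i + " elements -> " + RamUsageEstimator.sizeOf(list) + " bytes");
    }
}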