In a JavaScript engine like V8, does an object consume more memory than a function that returns it?


Let's say I have a non-trivial static data structure:

[
  { id: 1, title: 'Long title 1...' },
  { id: 2, title: 'Long title 2...' },
  /* 998 more... */
]

Considering that my app will only infrequently need to map/filter this data (and will, I assume, keep the module in memory), would it consume less memory to (A) export it as a const:

export const rows = [ ... ];

Or (B), export a function that returns the literal:

export function getRows() {
  return [ ... ];
}

Or (C), export a function that calls JSON.parse() on a string literal?

export function getRows() {
  return JSON.parse("[ ... ]");
}

To be clear, the concern is this data's contribution to the RAM consumption of a browser tab over the lifetime of the page. A transient spike in RAM usage during infrequent operations is less of a concern.

I know that in (B) and (C) each invocation may incur some object-construction cost, but that is less concerning than taking up RAM. I also know that (C) defers parsing of the literal, but it's not clear to me whether the parsed representation consumes more RAM than the JSON string literal it came from.
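
For context, the call sites would look something like this (the module names and the lookup below are purely illustrative):

```js
// Hypothetical call sites (module names and the lookup are illustrative).
// With (A), the array lives on the heap for the module's lifetime:
import { rows } from './rows.js';
const titleA = rows.find(r => r.id === 42)?.title;

// With (B) or (C), the array is rebuilt on each call, and nothing pins it
// once the returned value goes out of scope:
import { getRows } from './get-rows.js';
const titleB = getRows().find(r => r.id === 42)?.title;
```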


Answered by Steve Clay

The short answer may be “yes”. In the testing I performed in this repo, the "winner" was (C), with the least long-term memory consumption; (B) consumed roughly twice as much memory, and (A) roughly 4x.

Surely this will depend on the size and shape of the data. In this case:

| Case | Module export | Size on disk (bytes) | Bytes added to "heapUsed" after 30s |
|------|---------------|----------------------|-------------------------------------|
| C | Function using JSON.parse() | 1,400,667 | 1,934,336 |
| B | Function returning literal | 1,379,845 | 4,422,656 |
| A | Const | 1,379,828 | 9,018,368 |

Each module was tested separately: I captured the output of memoryUsage() as a baseline, loaded the data module dynamically, accessed the data, then re-measured memory usage 10s and 30s after loading (without letting the imported resource fall out of scope). The results from 3 runs can be seen in this spreadsheet (the values are deltas in kB from the baseline memoryUsage() call).
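
A rough Node sketch of that measurement loop looks like the following; this is not the exact harness from the repo, and the module path and export names are placeholders:

```js
// Rough sketch of the measurement procedure (not the exact harness from the repo).
const baseline = process.memoryUsage().heapUsed;

// Load the module under test dynamically and keep it referenced so the
// imported resource stays in scope for the whole run.
const mod = await import('./case-c.js'); // placeholder path

// Touch the data once. The returned value itself is not retained here, so for
// (B)/(C) the constructed array is free to be collected afterwards.
console.log('row count:', (mod.getRows ? mod.getRows() : mod.rows).length);

// Re-measure roughly 10s and 30s after loading.
let elapsed = 0;
for (const wait of [10, 20]) {
  await new Promise(resolve => setTimeout(resolve, wait * 1000));
  elapsed += wait;
  const deltaKb = (process.memoryUsage().heapUsed - baseline) / 1024;
  console.log(`heapUsed delta after ~${elapsed}s: ${deltaKb.toFixed(1)} kB`);
}
```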

What I suspect is going on is that a single large JSON string in the source can be stored in contiguous RAM, whereas the other two representations require many more individual objects and, for lack of a better term, expensive "edges" between them. And it's pretty surprising that the constructed variable takes up twice the memory of the compiled representation of a function returning the object as a literal!

But maybe it shouldn't be surprising: I assume the AST of the literal can't be GC-ed regardless of whether it sits in a const or in a function, so while the module is in scope you're always paying that memory cost on top of any variable storage. All else being equal, try to keep as little on the heap, for as short a time, as possible.
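
One way to act on that with option (C), as a sketch (the module name and helper below are hypothetical):

```js
// Illustrative sketch only: build the data while it's needed, then let it go.
import { getRows } from './rows-json.js'; // hypothetical module using JSON.parse()

export function findTitleById(id) {
  // The parsed array exists only for the duration of this call; once it
  // returns, nothing references the array and it can be garbage-collected.
  const row = getRows().find(r => r.id === id);
  return row ? row.title : undefined;
}
```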