How to read big files with FileReader.readAsArrayBuffer()


If I have a big file (e.g. hundreds of MB), does FileReader.readAsArrayBuffer() read the entire file into an ArrayBuffer in memory?

According to MDN, it does.

I have a big .zip file that is several GB in size. I'm concerned about using up all available RAM, e.g. when working on a mobile device. Is there an alternative approach where a file handle is returned and a portion of the file is read as needed?
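(Editor's aside: the "read a portion as needed" approach the question asks about does exist. Blob.slice() returns a new Blob covering just a byte range, and no data is read until that slice is consumed. A minimal sketch, using an in-memory Blob named `someBlob` as a stand-in for a real File:)

```javascript
// Blob.slice() itself copies no data; bytes are only read
// when the sliced Blob is consumed (e.g. via arrayBuffer()).
const someBlob = new Blob(["somedata".repeat(10e5)]); // ~8 MB stand-in for a File
const firstMiB = someBlob.slice(0, 1024 * 1024);      // just the first 1 MiB

firstMiB.arrayBuffer().then((buf) => {
  console.log("Read %s of %s bytes", buf.byteLength, someBlob.size);
});
```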

Accepted answer, by Kaiido:

You can use Blob.stream() for this; it returns a ReadableStream whose reader hands you the file in chunks instead of one big buffer:

const bigBlob = new Blob(["somedata".repeat(10e5)]);
console.log("Will read a %sbytes Blob", bigBlob.size);
const reader = bigBlob.stream().getReader();
let chunks = 0;
reader.read().then(function processChunk({ done, value }) {
  if (done) {
    console.log("Stream complete. Read in %s chunks", chunks);
    return;
  }
  // do whatever with the chunk
  // 'value' is a Uint8Array here
  chunks++;
  return reader.read().then(processChunk);
}).catch(console.error);
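The same streaming read can also be sketched with for-await async iteration over the stream. Note this syntax is not universally supported for ReadableStream (Node.js and recent browsers have it; the manual reader loop above is the portable version):

```javascript
const blob = new Blob(["somedata".repeat(10e5)]); // in-memory stand-in for a big File

async function countChunks(b) {
  let chunks = 0;
  let bytes = 0;
  // Each iteration yields the next chunk of the Blob's data
  for await (const value of b.stream()) {
    // 'value' is a Uint8Array holding the next chunk
    chunks++;
    bytes += value.byteLength;
  }
  return { chunks, bytes };
}

countChunks(blob).then(({ chunks, bytes }) =>
  console.log("Read %s bytes in %s chunks", bytes, chunks)
);
```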