Confusion about sending a 100GB file from front-end to back-end


Background

Web browsers impose memory limits per tab; in some cases the limit can be raised to around 4 GB.

Assume a web app that cannot load files larger than roughly 4 GB into memory. A 100 GB file therefore cannot be read into memory all at once.
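To make the limitation concrete, this is the kind of call I mean: reading the whole File into an ArrayBuffer allocates the entire file in the tab's memory. (The file input selector here is just for illustration, not part of my actual setup.)

// Illustration only: Blob.arrayBuffer() reads the whole file into tab memory,
// so for a ~100 GB file it would exceed the per-tab memory budget.
const input = document.querySelector('input[type="file"]')

input.addEventListener('change', async () => {
   const file = input.files[0]
   const buffer = await file.arrayBuffer() // allocates the entire file in RAM
   console.log(buffer.byteLength)
})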

Approach

I plan to send a 100 GB file from the front end to the back end as follows. The user is prompted to select the 100 GB file on their computer:

<form class="upload">
   <input type="file" name="uploadFile" accept=".stl" required />
   <br/><br/>
   <input type="submit" />
</form>

The file is then sent from the front end to the back end in a POST request using the fetch API:

const uploadForm = document.querySelector('.upload')

uploadForm.addEventListener('submit', function(e) {
   e.preventDefault()

   // The File object is only a handle to the file on disk,
   // not the file contents themselves.
   let file = e.target.uploadFile.files[0]

   // Wrap it in multipart/form-data so the server sees a normal file upload.
   let formData = new FormData()
   formData.append('file', file)

   fetch('http://localhost:3000/upload_files', {
      method: 'POST',
      body: formData
   })
   .then(resp => resp.json())
   .then(data => {
      if (data.errors) {
         alert(data.errors)
      }
      else {
         console.log(data)
      }
   })
   .catch(err => console.error(err))
})
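For completeness, here is roughly what my receiving endpoint could look like. This is a minimal sketch only, assuming a Node/Express back end with multer writing uploads to disk; the stack, the route implementation, and the uploads/ folder are assumptions, since the server code is not the focus here.

// Hypothetical back end: an Express route using multer with disk storage,
// so the upload is streamed to a file rather than buffered in server memory.
const express = require('express')
const multer = require('multer')

const app = express()
// 'uploads/' is an assumed destination folder.
const upload = multer({ dest: 'uploads/' })

// Field name 'file' matches formData.append('file', file) on the front end.
app.post('/upload_files', upload.single('file'), (req, res) => {
   // req.file describes the stored file (path, size, original name, ...)
   res.json({ path: req.file.path, size: req.file.size })
})

app.listen(3000)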

Question

I haven't tested the above approach with an actual 100 GB file yet. Before testing, I wonder whether I'm missing something with this setup. Is anything wrong with it? Is there a standard approach to this kind of thing, or a better one?

How can I avoid loading the 100 GB file into JavaScript memory? How can I send it to the back end without exceeding the browser's memory limits?
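For reference, one pattern I've seen mentioned for very large files is chunked uploading with Blob.slice(), which only ever holds one slice of the file in memory at a time. Below is a minimal sketch of that idea; the /upload_chunk endpoint, its parameters, and the 10 MB chunk size are assumptions, not part of my current setup.

// Hypothetical chunked upload: only one chunk is in memory at a time.
async function uploadInChunks(file) {
   const chunkSize = 10 * 1024 * 1024 // 10 MB per chunk
   const totalChunks = Math.ceil(file.size / chunkSize)

   for (let i = 0; i < totalChunks; i++) {
      // slice() returns a reference to a byte range; bytes are read lazily
      const chunk = file.slice(i * chunkSize, (i + 1) * chunkSize)

      const formData = new FormData()
      formData.append('chunk', chunk)
      formData.append('index', i)
      formData.append('total', totalChunks)
      formData.append('filename', file.name)

      const resp = await fetch('http://localhost:3000/upload_chunk', {
         method: 'POST',
         body: formData
      })
      if (!resp.ok) throw new Error(`Chunk ${i} failed: ${resp.status}`)
   }
}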
