I have this (simplified) code to upload a file in chunks:
// tries, Loaded and total are assumed to be declared elsewhere in the full code:
let tries = 0;
let Loaded = 0;
let total = file.size;
let reader = new FileReader();
let blob = file.slice(0, STEP); // first chunk
reader.readAsDataURL(blob); // onload below is assigned synchronously, before the load event fires
reader.onload = (e) =>
{
let d =
{
container: container,
blob: BlobName,
file: reader.result,
id: id
};
$.ajax({
url: uploadPath,
type: "POST",
data: d,
timeout: 30000
}).done(function(r)
{
if (r.success === "yes")
{
Loaded += e.loaded;
if(Loaded < total)
{
blob = file.slice(Loaded, Loaded + STEP); // getting next chunk
reader.readAsDataURL(blob); // trigger onload for next chunk
}
else
{
// File is completely uploaded
}
}
else
{
if (tries++ > 3)
{
// error management here
}
else
{
// try again
reader.readAsDataURL(blob); // trigger again onload
}
}
}).fail(function (jqXHR, textStatus, errorThrown)
{
if (tries++ > 3)
{
// error management here
}
else
{
// try again
reader.readAsDataURL(blob); // trigger again onload
}
}); // close the .fail() call — the original was missing the ");"
}; // close reader.onload
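For scale (my illustration, not from the question): readAsDataURL base64-encodes every chunk into a JavaScript string roughly 4/3 the size of the raw bytes, and each chunk or retry allocates a fresh multi-megabyte string. A minimal Node-compatible sketch of the inflation:

```javascript
// Sketch: base64 (what readAsDataURL produces) inflates a chunk to
// roughly 4/3 of its raw size before it is even sent over the wire.
// Buffer stands in for a Blob chunk; the ratio is the same in the browser.
const rawSize = 1024 * 1024;               // a 1 MiB chunk
const raw = Buffer.alloc(rawSize);         // stand-in for file.slice(...)
const asBase64 = raw.toString("base64");   // what ends up in reader.result

console.log(rawSize);         // 1048576 raw bytes
console.log(asBase64.length); // 1398104 characters, ~33% larger
```

On top of the inflation, each of those strings is copied again into the form-encoded AJAX body, so peak memory per chunk is a small multiple of STEP, and any strings the GC has not yet reclaimed pile up across chunks.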
This code worked like a charm, even for large files (43 GB).
Today we had to upload a 20 GB file and got a SBOX_FATAL_MEMORY_EXCEEDED crash in Chrome 88.
After a lot of testing and monitoring, we saw Chrome's memory usage grow enormously during this upload.
Further tests showed the same behavior on Edge and Firefox (the upload completed on Firefox, but it still used gigabytes of RAM).
What can I do to fix this terrible memory management?
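Not part of the original question, but a sketch of the usual way around this (reusing the uploadPath, container, BlobName, id and STEP names from the code above): skip FileReader entirely and POST each slice as a raw Blob in a FormData body, so nothing is ever base64-encoded into a JavaScript string. The server must then read a multipart/form-data file part instead of a form field:

```javascript
// Pure helper: [start, end) byte ranges for each chunk of a file.
function chunkRanges(size, step) {
  const ranges = [];
  for (let offset = 0; offset < size; offset += step) {
    ranges.push([offset, Math.min(offset + step, size)]);
  }
  return ranges;
}

// Sketch of the upload loop without FileReader. file.slice() returns a
// lightweight Blob reference; the browser reads its bytes only while
// sending the POST, so memory stays bounded regardless of file size.
async function uploadInChunks(file, uploadPath, meta, STEP) {
  for (const [start, end] of chunkRanges(file.size, STEP)) {
    const form = new FormData();
    form.append("container", meta.container);
    form.append("blob", meta.blobName);
    form.append("id", meta.id);
    form.append("file", file.slice(start, end)); // raw Blob, never a string
    const response = await fetch(uploadPath, { method: "POST", body: form });
    const r = await response.json();
    if (r.success !== "yes") throw new Error("chunk failed at offset " + start);
  }
}
```

Retry logic like the tries counter above can be wrapped around the fetch call; the key change is that no reader.result string ever exists.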
question from:
https://stackoverflow.com/questions/66065824/sbox-fatal-memory-exceeded-when-uploading-a-file-in-chunks