Welcome to OStack Knowledge Sharing Community for programmer and developer-Open, Learning and Share
Welcome To Ask or Share your Answers For Others

javascript - SBOX_FATAL_MEMORY_EXCEEDED when uploading a file in chunks

I have this (simplified) code to upload a file in chunks:

// STEP (chunk size), Loaded, total and tries are assumed to be declared elsewhere
let reader = new FileReader();
let blob = file.slice(0, STEP);
reader.readAsDataURL(blob);
reader.onload = (e) =>
{
    let d =
    {
        container: container,
        blob: BlobName,
        file: reader.result,
        id: id
    };

    $.ajax({
            url: uploadPath,
            type: "POST",
            data: d,
            timeout: 30000
    }).done(function(r)
    {
        if (r.success == "yes")
        {
            Loaded += e.loaded;
            if(Loaded < total)
            {
                blob = file.slice(Loaded, Loaded + STEP);   // getting next chunk
                reader.readAsDataURL(blob);  // trigger onload for next chunk
            }
            else
            {
                // File is completely uploaded
            }
        }
        else
        {
            if (tries++ > 3)
            {
                // error management here
            }
            else
            {
                // try again
                reader.readAsDataURL(blob); // trigger again onload
            }
        }
    }).fail(function (jqXHR, textStatus, errorThrown)
    {
        if (tries++ > 3)
        {
            // error management here
        }
        else
        {
            // try again
            reader.readAsDataURL(blob); // trigger again onload
        }
    });
};

This code worked like a charm, even for large files (43 GB).

Today, we had to upload a large file (20 GB) and got a SBOX_FATAL_MEMORY_EXCEEDED error in Chrome (88).

After extensive testing and monitoring, we noticed Chrome's memory usage growing enormously during this upload.

Further tests showed the same behavior in Edge and Firefox (the upload completed in Firefox, but still consumed gigabytes of RAM).

What can I do to fix this terrible memory management?

question from:https://stackoverflow.com/questions/66065824/sbox-fatal-memory-exceeded-when-uploading-a-file-in-chunks
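It helps to quantify what readAsDataURL() costs per chunk: base64 encodes every 3 bytes as 4 characters, so each chunk exists in memory at least twice over, once as the Blob and once as an even larger string. A rough estimate, with an illustrative helper that is not part of the original code:

```javascript
// readAsDataURL() produces a "data:<mime>;base64," prefix followed by the
// base64 encoding of the chunk, which is ~33% larger than the raw bytes
// (and JS engines may store strings at 2 bytes per character).
function dataUrlLength(byteLength, mimeType = "application/octet-stream") {
  const prefix = `data:${mimeType};base64,`.length;
  const base64Chars = 4 * Math.ceil(byteLength / 3); // padding rounds up
  return prefix + base64Chars;
}

// A 4 MB chunk becomes a string of over 5.5 million characters:
// dataUrlLength(4 * 1024 * 1024)
```

If several of these strings stay reachable at once, memory use grows by multiples of the chunk size.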


1 Answer


It seems that the recursive triggering of the onload event prevents the chunks from being garbage-collected immediately.

The references to the chunks can be set to null to make them eligible for GC:

Before each call to readAsDataURL(), add this:

reader.result = null; // the result itself
d.file = null; // the chunk in the current object sent to the server
reader.readAsDataURL(blob);

With these changes, the upload now works properly and memory usage remains stable throughout.
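For completeness, the problem can also be avoided entirely by never creating a data-URL string: post the raw Blob slice with fetch and FormData instead. This is a sketch, not the fix above, and it assumes the server can read the chunk from a multipart/form-data body rather than a base64 form field; the retry handling from the original code is omitted for brevity:

```javascript
// Sketch only: uploadPath, container, blobName and id mirror the fields in
// the original code; STEP is an illustrative chunk size.
const STEP = 4 * 1024 * 1024; // 4 MB chunks

// Pure helper: byte range [start, end) of the chunk beginning at `loaded`.
function nextChunkRange(loaded, step, total) {
  return [loaded, Math.min(loaded + step, total)];
}

async function uploadInChunks(file, uploadPath, container, blobName, id) {
  let loaded = 0;
  while (loaded < file.size) {
    const [start, end] = nextChunkRange(loaded, STEP, file.size);
    const form = new FormData();
    form.append("container", container);
    form.append("blob", blobName);
    form.append("id", id);
    form.append("file", file.slice(start, end)); // raw bytes, never base64
    const res = await fetch(uploadPath, { method: "POST", body: form });
    if (!res.ok) throw new Error(`chunk upload failed: ${res.status}`);
    loaded = end; // the slice goes out of scope here and can be collected
  }
}
```

The server-side handler would then read the chunk bytes from the multipart body instead of decoding a base64 field.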

