JSON.stringify throws RangeError: Invalid string length for huge objects

As the title implies, I'm trying to stringify a huge JavaScript object with JSON.stringify in my Node.js app. The objects are, again, huge (tens of megabytes), and they don't contain any functions. I need to write the serialized objects to a file. What I'm getting now is this:

RangeError: Invalid string length
  at Object.stringify (native)
  at stringifyResult (/my/file.js:123:45) -> line where I use JSON.stringify
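
For reference, the error can be reproduced with something like this (hypothetical data, not the actual objects from my app); the serialized output only has to exceed V8's maximum string length:

// Hypothetical reproduction: the array itself is cheap (every slot holds the
// same string reference), but JSON.stringify tries to build roughly 2 GB of
// text, which is past V8's string length limit.
const arr = new Array(8000000).fill("x".repeat(256));
JSON.stringify(arr); // RangeError: Invalid string length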

Any idea how to solve that issue?



Solution 1:[1]

I too have seen this unhelpful/misleading Node.js error message, so I filed an issue over at the Node.js GitHub:

RangeError: Invalid string length --- it should really say Out Of Memory
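
If you want to see the limit that's actually being hit, newer Node.js versions expose V8's maximum string length via the buffer module (just a diagnostic, not a fix):

// Prints the longest string V8 will allow; JSON.stringify throws the
// RangeError as soon as its output would exceed this many characters.
const { constants } = require("buffer");
console.log(constants.MAX_STRING_LENGTH); // on the order of 2**29 to 2**30, depending on Node/V8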

Solution 2:[2]

As mentioned by @sandeepanu, there's a great little solution from @madhunimmo if you're trying to stringify a huge array. Just stringify one element at a time:

let out = "[" + yourArray.map(el => JSON.stringify(el)).join(",") + "]";

If you're trying to stringify an object with a very large number of keys/properties, you can use Object.entries() on it first to turn it into an array of key/value pairs:

let out = "[" + Object.entries(yourObject).map(el => JSON.stringify(el)).join(",") + "]";

If that still doesn't work, you'll probably want to use a streaming approach, although you could also slice your array into portions and store them as multiple .jsonl (one object per line) files:

// untested code
let numFiles = 4;
for(let i = 0; i < numFiles; i++) {
  // join with "\n" so each file really is one JSON object per line
  let out = arr.slice((i/numFiles)*arr.length, ((i+1)/numFiles)*arr.length).map(el => JSON.stringify(el)).join("\n");
  // add your code to store/save `out` here
}
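
Filling in that store/save step with Node's synchronous fs API might look something like this (the part-N.jsonl file names are just placeholders):

// untested sketch: write each slice of the array to its own JSON Lines file
const fs = require("fs");

let numFiles = 4;
for (let i = 0; i < numFiles; i++) {
  let out = arr.slice((i / numFiles) * arr.length, ((i + 1) / numFiles) * arr.length)
    .map(el => JSON.stringify(el))
    .join("\n");
  fs.writeFileSync(`part-${i}.jsonl`, out + "\n");
}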

One streaming approach (new, and currently only supported in Chrome, but will likely come to other browsers, and even Deno and Node.js in some form or another) is to use the File System Access API. The code would look something like this:

// untested code (needs a secure context and a user gesture, e.g. a button click)
const dirHandle = await window.showDirectoryPicker();
const fileHandle = await dirHandle.getFileHandle('yourData.jsonl', { create: true });
const writable = await fileHandle.createWritable();
for(let el of yourArray) {
  // one JSON object per line (JSON Lines format)
  await writable.write(JSON.stringify(el)+"\n");
}
await writable.close();
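
In a Node.js app you can get the same one-object-per-line streaming today with fs.createWriteStream; a rough (untested) equivalent:

// untested sketch: stream one JSON object per line to a file in Node.js
const fs = require("fs");

const writable = fs.createWriteStream("yourData.jsonl");
for (const el of yourArray) {
  // write() returns false when the internal buffer fills up; for very large
  // arrays you may want to pause and wait for the 'drain' event
  writable.write(JSON.stringify(el) + "\n");
}
writable.end();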

Solution 3:[3]

I find JSONStream to be a reliable alternative to the native JSON.stringify that works well with large objects. For example:

var fileSystem = require( "fs" );
var JSONStream = require( "JSONStream" );
var records = [
    { id: 1, name: "Terminator" },
    { id: 2, name: "Predator" },
    { id: 3, name: "True Lies" },
    { id: 4, name: "Running Man" },
    { id: 5, name: "Twins" }
    // .... hundreds of thousands of records ....
];

var transformStream = JSONStream.stringify();
var outputStream = fileSystem.createWriteStream( __dirname + "/data.json" );
transformStream.pipe( outputStream );
records.forEach( ( record ) => transformStream.write( record ) );
transformStream.end();

outputStream.on(
    "finish",
    function handleFinish() {
        console.log("Done");
    }
);

Took the sample code from here.

Solution 4:[4]

Here's a simple helper file that can do the job:

const fs = require('fs');
const json = require('big-json');
 
// pojo will be streamed out in JSON chunks and written to the specified file
function makeFile(filename, pojo){

    const stringifyStream = json.createStringifyStream({
        body: pojo
    });

    // Pipe the chunks into a single write stream so they land on disk in order;
    // calling the async fs.appendFile once per chunk gives no ordering guarantee.
    stringifyStream.pipe(fs.createWriteStream(filename));

}

module.exports = {
    makeFile
}
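
Usage could look something like this, assuming the helper above is saved as makeFile.js next to the calling script and myHugePojo is your large object:

// untested sketch: stream a large in-memory object out to disk
const { makeFile } = require("./makeFile");

makeFile("huge-output.json", myHugePojo);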

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1 Scott Stensland
Solution 2 joe
Solution 3 Usman Khawaja
Solution 4 BuffaloDev