Cannot create a string longer than 0x1fffffe8 characters in JSON.parse?

0x1fffffe8 is 536,870,888 characters, just shy of 512 MiB: it is 2^29 - 24, V8’s maximum string length on 64-bit builds of Node.
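You can trip the limit directly (on a 64-bit Node build; the exact bound has varied across V8 versions):

    // 0x1fffffe8 === 536870888
    const ok = 'a'.repeat(0x1fffffe8);        // succeeds: ~512 MiB of one-byte chars
    const boom = 'a'.repeat(0x1fffffe8 + 1);  // RangeError: Invalid string length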

The commenters are correct: you are bumping up against a system limit. I agree with @Pointy that it is most likely Node’s string length limit; fs-extra has nothing to do with the limit.

In any case, you’re going to have to process that JSON in chunks. Below are several ways to do this.

A: Use a SAX-style JSON parser

You have many parser options. To get you started, here are a couple I found on NPM:

  • BFJ has a walk function that does SAX-style parsing. BFJ is archived, but still has millions of weekly downloads.

  • stream-json – its low-level parser emits a SAX-style token stream (see the sketch below)
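For a flavor of the SAX style, here is a minimal sketch using stream-json’s low-level tokenizer (the file name is hypothetical; the token names follow stream-json’s event vocabulary):

    const fs = require('fs');
    const { parser } = require('stream-json');

    // Pipe the raw bytes through the tokenizer; the full document is never
    // materialized as one giant string.
    const tokens = fs.createReadStream('huge.json').pipe(parser());

    tokens.on('data', (token) => {
      // Tokens look like { name: 'startObject' }, { name: 'keyValue', value: 'name' },
      // { name: 'stringValue', value: '...' }, etc.
      if (token.name === 'keyValue') {
        console.log('saw key:', token.value);
      }
    });
    tokens.on('end', () => console.log('done'));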

B: Implement a Node Streams pipeline

Almost certainly, your massive JSON data is an array at the root level. This approach uses a parser that can asynchronously process each element of that array individually, or in batches, whichever makes sense. It is based on the very powerful and flexible Node Streams API.

ℹ️ If your data isn’t a JSON array but a stream of concatenated JSON objects, then it probably conforms to the JSON Streaming protocol. See option D below.

  • JSONStream lets you filter by path or pattern in its streaming parse. It is archived, but still has millions of weekly downloads.

  • BFJ – in addition to the SAX-style walk function mentioned above, it supports selective object-level streaming:

    match returns a readable, object-mode stream and asynchronously parses individual matching items from an input JSON stream.

  • stream-json has a Pick pipeline operator that can pick desired items out of a stream, ignoring the rest; see the sketch after this list. It offers many other operators as well.

  • jsonparse
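As an illustration, here is a sketch of a stream-json pipeline that parses a root-level JSON array and hands you one element at a time (stream-chain is stream-json’s companion package; the file name is hypothetical):

    const fs = require('fs');
    const { chain } = require('stream-chain');
    const { parser } = require('stream-json');
    const { streamArray } = require('stream-json/streamers/StreamArray');

    const pipeline = chain([
      fs.createReadStream('huge.json'),
      parser(),
      streamArray(),   // emits { key: <index>, value: <element> } per array element
    ]);

    let count = 0;
    pipeline.on('data', ({ value }) => {
      count += 1;      // process each element here, one at a time
    });
    pipeline.on('end', () => console.log(`processed ${count} elements`));

If the array is nested rather than at the root (as in the example under option C), insert a pick({ filter: 'data' }) stage from stream-json/filters/Pick before streamArray().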

C: Manual chunking

🚩 This will likely be the most efficient approach, if your data supports it.

This option is like B, except instead of employing a streaming parser, you do the chunking yourself. This is easy if the elements of the JSON data array are very regular, e.g. each element occupies exactly N lines; you can then extract them without parsing the whole document.

For example, if your data looked like this:

    {
      "data": [
        { "name": ...,
          "address": ... },
        { "name": ...,
          "address": ... },
        { "name": ...,
          "address": ... },
        { "name": ...,
          "address": ... }
      ]
    }

Your process would be something like this:

  1. Use a buffered reader to read the file
     (DO NOT synchronously read it all into memory).
  2. Discard the first two lines.
  3. Read the file in chunks, two lines at a time.
  4. If a chunk starts with {, remove any trailing comma and parse the
     individual { "name": ..., "address": ... } record.
  5. If it doesn’t, you have reached the end of the array. Discard the
     rest of the file, or hand it off to some other process if you
     expect some other data there.

The details will depend on your data.
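To make that concrete, here is a sketch of the recipe above using Node’s built-in readline module. It assumes the two-lines-per-record layout shown earlier, with each record being valid JSON on its own (the file name and the handle callback are placeholders):

    const fs = require('fs');
    const readline = require('readline');

    async function processRecords(path, handle) {
      const rl = readline.createInterface({
        input: fs.createReadStream(path),      // buffered; never reads the whole file
        crlfDelay: Infinity,
      });

      let lineNo = 0;
      let firstHalf = null;                    // first line of the current record

      for await (const line of rl) {
        lineNo += 1;
        if (lineNo <= 2) continue;             // step 2: skip `{` and `"data": [`
        const trimmed = line.trim();
        if (firstHalf === null) {
          if (!trimmed.startsWith('{')) break; // step 5: `]` means end of the array
          firstHalf = trimmed;                 // step 3: collect two lines per chunk
        } else {
          // step 4: join the two lines, strip any trailing comma, parse one record
          const record = JSON.parse((firstHalf + ' ' + trimmed).replace(/,\s*$/, ''));
          handle(record);
          firstHalf = null;
        }
      }
      rl.close();
    }

    processRecords('huge.json', (rec) => console.log(rec.name));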

D: Use a JSON Streaming protocol parser

In the JSON Streaming protocol, multiple JSON objects are concatenated into a single stream, often delimited by newlines. If that’s what you have, you should use a parser that supports this protocol.
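For the common line-delimited variant (one JSON object per line, aka NDJSON / JSON Lines), you don’t even need a library; here is a minimal sketch with Node’s built-in readline (the file name is hypothetical):

    const fs = require('fs');
    const readline = require('readline');

    const rl = readline.createInterface({
      input: fs.createReadStream('records.ndjson'),
      crlfDelay: Infinity,
    });

    rl.on('line', (line) => {
      if (line.trim() === '') return;   // tolerate blank lines
      const obj = JSON.parse(line);     // each line is one complete JSON object
      // process obj here
    });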
