Best way to parse a large JSON file?


Moderator: Paul Tuersley

lorewap3
Posts: 1
Joined: October 23rd, 2019, 8:48 am

October 23rd, 2019, 9:08 am

Greetings all,

I'm trying to import a fairly large JSON file to use for the creation of (mostly transform property) keyframes. The JSON file is rather simple, it's laid out like:
{
  "player": "xxx",
  "video": "xxx",
  "tracking": [
    {
      "x": 99,
      "y": 99,
      "opacity": 100,
      "time": 1.532,
      ...
    },
    ...
  ]
}

Almost all of the data lives in the tracking array, which has one entry per frame recording the object's position. I read the file in as a string, then use JSON.parse (https://github.com/douglascrockford/JSON-js) to parse it into an object.
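For context, here's roughly what that step looks like (a minimal sketch with the data inlined; in the real script the string comes from reading the file, and JSON.parse is supplied by json2.js since ExtendScript has no native JSON object):

```javascript
// Illustrative sketch of the parse step. In the actual script the
// string comes from File.read(); it's inlined here for brevity.
var jsonText = '{"player":"xxx","video":"xxx","tracking":[' +
  '{"x":99,"y":99,"opacity":100,"time":1.532},' +
  '{"x":100,"y":98,"opacity":100,"time":1.565}]}';

// json2.js provides JSON.parse in ExtendScript; this is the slow call.
var data = JSON.parse(jsonText);

var tracking = data.tracking;  // the big array: one entry per frame
var first = tracking[0];       // first.x === 99, first.time === 1.532
```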

The issue is that most of these files will be around 15,000 frames long, which comes out to ~3.3 MB. It takes an average of 5-10 minutes to read in all the data, during which AE chugs and shows 'Not responding' as it processes the file. I know 15k is a lot, but my computer is fairly fast, and a simple JSON file with 15k array entries should still be doable.

There has to be a better way to do this. I grabbed the latest copy of json2.js to see if it had been better optimized, but it's still slow. Since I'm the one creating the JSON file, I could break it up into smaller chunks, but I'd like to try improving the parsing process first. At a minimum, it would be nice to do the parsing asynchronously somehow, so at least it doesn't lock up the application completely while it runs.
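One direction I've been considering (just a sketch, names illustrative): split the tracking array into chunks after parsing, so each chunk could be processed in its own pass, e.g. handed to app.scheduleTask() or applied via a single setValuesAtTimes() call per chunk, instead of one long blocking loop:

```javascript
// Hypothetical chunking helper: slice a big array into fixed-size
// pieces so each piece can be processed separately (e.g. one
// app.scheduleTask() call or one setValuesAtTimes() batch per chunk).
function chunkArray(arr, chunkSize) {
  var chunks = [];
  for (var i = 0; i < arr.length; i += chunkSize) {
    chunks.push(arr.slice(i, i + chunkSize));
  }
  return chunks;
}

// Stand-in for the parsed 15,000-frame tracking array.
var tracking = [];
for (var f = 0; f < 15000; f++) {
  tracking.push({ x: f, y: f, opacity: 100, time: f / 29.97 });
}

var chunks = chunkArray(tracking, 1000); // 15 chunks of 1000 frames each
```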

I'm open to any suggestions! Thanks guys!

Will