I need to procedurally edit multiple JSON files
Hey everyone,
I have a bunch of JSON files, and they all look something like this:

[
    {
        "elem1" : -1,
        "elem2" : 16,
    },
    {
        "elem1" : -1,
        "elem2" : 16,
    },
    ... and so on ...
]
I need to read those in Max, but when I import them into [dict] I get the error "cannot read dictionary: 0".
This is because the array has no container. Something like this would do the job:

{ "foo": [
    {
        "elem1" : -1,
        "elem2" : 16,
    },
    ... and so on ...
]}
But the problem is that I have thousands of files, and editing them manually is of course out of the question.
I successfully managed to load the JSONs with this js solution:
https://cycling74.com/forums/sharing-is-fun-example-write-and-read-json-in-javascript
But due to some memory leaks, reading one file after the other gets progressively slower and more memory-consuming (I guess for this reason: https://sites.google.com/a/leapformax.com/tcpigoleap/home/what-was-the-memory-issue ).
Do you know how I can procedurally edit multiple JSON files to format them correctly for [dict]?
Can you suggest any other method for reading JSON files, or some way around this?
(BTW, if the issue is a memory leak due to the Max symbol hash table, then even with a correct JSON format the problem still exists...)
Thanks!
Is it just two lines or so that you need to add to each file? I would simply read the files into a JS File object, use a regexp to modify the appropriate lines, and use the File object to write them back out. That should not bloat the symbol table.
In the past I've done something similar that involved opening an in-file and an out-file, copying lines from in to out, and adding lines where appropriate. If you get stuck, post what you've got and I may be able to offer suggestions.
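Since the change is just a prefix and a suffix around the file's text, the core transformation can be prototyped as plain JavaScript outside Max. This is only a sketch: wrapArray() is a hypothetical helper name, and inside Max you would read the file text with the js File object and write the result back out.

```javascript
// Hypothetical helper: wrap a top-level JSON array in a container object,
// e.g. '[ ... ]' becomes '{ "foo": [ ... ] }', which [dict] can import.
function wrapArray(text, key) {
    return '{ "' + key + '": ' + text.trim() + ' }';
}

// Example: a bare array becomes a dict-friendly object.
var wrapped = wrapArray('[{"elem1": -1, "elem2": 16}]', 'foo');
```

The same idea works line-by-line with an in-file and an out-file, as described above: write the opening text first, copy everything across, then write the closing brace.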
Thanks Rob for the quick reply!
I'm completely new to JavaScript, so a hint is more than welcome!
This is what I'm trying right now, and (of course!) it doesn't work:
function container(src, dst)
{
    var s = new File(src);
    var i, a, c;
    if (s.isopen) {
        var d = new File(dst);
        d.writestring("{");
        d.writestring("foo" :)
        c = s.eof;
        for (i = 0; i < c;) {
            a = s.readbytes(32);
            if (a.length) {
                d.writebytes(a);
                i += a.length;
            } else {
                break;
            }
        }
        d.writestring("}")
        s.close();
        d.close();
        post("done");
    }
}
The message to get the symbol table size in the Max window is ";max size", not ";max button". You really need to create tens of thousands of symbols before you start noticing a slowdown. You can try to do your symbol stuff inside an object to minimize the number of symbols passed around, as mentioned. In C externals you can put strings into a dict and pass it to another external without creating an entry in the hash table. There is also ";max relaunchmax $1" (where $1 is the patcher file to open after relaunch) to programmatically restart with a fresh symbol table.
Max is not good at parsing text. If you take a look at style remover, I used [filein] and [fileout] to be able to work with numbers instead. But maybe with js you can find the best solution for your situation.
Hey 11olsen, thanks for your reply!
My symbol table contains about 300,000 entries, so that could be the reason for the slowdown.
A C external could be a solution, but I'd prefer to try something else before going to Visual Studio :)
I'll investigate [filein] and [fileout] to see if they can help to solve the issue.
Thanks!
I almost managed to make the js code work:
function container(src, dst)
{
    var s = new File(src);
    var i, a, c;
    if (s.isopen) {
        var d = new File(dst, "write", "JSON");
        d.writestring("{");
        d.writestring("foo :")
        c = s.eof;
        for (i = 0; i < c;) {
            a = s.readbytes(32);
            if (a.length) {
                d.writebytes(a);
                i += a.length;
            } else {
                break;
            }
        }
        d.writestring("}")
        s.close();
        d.close();
        post("done");
    }
}
Now the problem is how to insert the word "foo" between quotes in the JSON file.
EDIT: OK, found it. Like this:
d.writestring(' "foo" :');
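The trick here is just JavaScript string quoting: a single-quoted literal can contain double quotes without escaping. A quick standalone illustration (runnable outside Max):

```javascript
// Two equivalent ways to produce the text  "foo" :  with the double
// quotes included in the output string:
var a = ' "foo" :';    // single-quoted literal, double quotes inside
var b = " \"foo\" :";  // double-quoted literal with escaped quotes
```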
This is looking good to me, and it sounds like you sorted it out; but if not, bundle up the source, a patch, and a test file and I can take a look.
Thanks Rob for your support!
In the end, this method also gets slower and slower once the modified JSON files are imported into [dict]...
So I'm trying a different approach:
I convert the original JSON files to CSV (parsing only the stuff I need) with an Ubuntu distro for Windows and the jq utility. With a little script, I run this for each JSON file:

jq -r '.[] | [.src_x, .src_y, .dx, .dy] | @csv' /mnt/c/path/file.json > /mnt/c/path/file.csv
This seems to work fine, and faster. I'm now trying to run this command through the [shell] object and the Windows Linux subsystem, so as to process each JSON file and have the result immediately in Max.
I know this stuff no longer fits the topics of this part of the forum, but it's just to keep you updated.
EDIT: even better with tab-separated values:

jq -r '.[] | [.src_x, .src_y, .dx, .dy] | @tsv' /mnt/c/path/file.json > /mnt/c/path/file.txt
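For reference, the field extraction that this jq filter performs can also be sketched in plain JavaScript. This is only an illustration, not the jq tool itself: toTsv() is a hypothetical helper, and it assumes each file holds a top-level array of objects with src_x, src_y, dx, and dy fields, as in the jq command above.

```javascript
// Mirror of jq's '.[] | [.src_x, .src_y, .dx, .dy] | @tsv':
// convert an array of records to tab-separated lines.
function toTsv(records) {
    return records.map(function (r) {
        return [r.src_x, r.src_y, r.dx, r.dy].join("\t");
    }).join("\n");
}

var tsv = toTsv([{ src_x: 0, src_y: 1, dx: 2, dy: 3 }]);
```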