[sharing is fun] example: write and read JSON in javascript

Peter Nyboer:

I searched around on the forum for this and managed to gather the necessary pieces for a complete example. Thanks to Adam Murray for the pointer to the main ingredient!
The attached zip file has the js and maxpat that demonstrate how to populate an object in javascript with some data (I have it stored in colls initially), turn that into a JSON file, then read that JSON file back to repopulate the object in the js at a later date.
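In outline, the round trip looks something like this (a simplified sketch rather than the exact code in the zip, with JSON.stringify/JSON.parse standing in for the parsing functions):

// hypothetical data to persist
var state = { notes: [60, 64, 67], name: "mypatch" };

function writeState(path) {
    var f = new File(path, "write", "TEXT");
    if (f.isopen) {
        f.eof = 0; // truncate old contents first (see the fix discussed below)
        f.writestring(JSON.stringify(state));
        f.close();
    }
}

function readState(path) {
    var f = new File(path, "read");
    var s = "";
    if (f.isopen) {
        while (f.position < f.eof) {
            s += f.readstring(8000); // readstring takes a maximum character count
        }
        f.close();
        state = JSON.parse(s);
    }
}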
I'm curious if there's a way to break out the actual parsing functions into a separate file that I could include somehow. I would like this parser to be available to several scripts, but I'd rather not cut and paste it into every one!

2297.JSONreadwrite.zip
Luke Hall:

Check out the jsextensions folder, which can hold functions you can call from any other javascript you want.
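For example, a sketch (assuming a file named json_helpers.js dropped into the jsextensions folder; global functions defined there are loaded at startup and become callable from every [js] instance):

// json_helpers.js -- hypothetical helper living in jsextensions
function readJSONFile(path) {
    var f = new File(path, "read");
    var s = "";
    if (f.isopen) {
        while (f.position < f.eof) {
            s += f.readstring(8000);
        }
        f.close();
    }
    return JSON.parse(s); // or call the parser from the zip on older Max versions
}

// then, from any other [js] script:
// var obj = readJSONFile("mysettings.json");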

Peter Nyboer:

Oh, hang, that's too easy! Thanks for the tip.
Peter

oli larkin:

JSON is great. I have done a similar thing here: https://cycling74.com/forums/sharing-max-web-control-v0-1

I hope max6's js engine gets updated. Lots of interesting stuff is going on with js at the moment (e.g. node.js) and it would be cool to be able to #include files using require().

www.olilarkin.co.uk

orange_glass:

Thanks for posting this, it's really helped me out.

Luke Hall:

You can use eval() to include other javascript files if you want, although this comes with the problem of making sure the file you're including contains no errors, because eval() will load it in as-is. Here's an example of some code I load into jsextensions that will make this easier, go take a look.
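Something along these lines (a rough, untested sketch of the idea):

// include() helper -- reads a script with the js File object and eval()s it
function include(path) {
    var f = new File(path, "read");
    var code = "";
    if (f.isopen) {
        while (f.position < f.eof) {
            code += f.readstring(8000);
        }
        f.close();
        eval(code); // the file is evaluated as-is, so any syntax error surfaces here
    } else {
        post("include: couldn't open " + path + "\n");
    }
}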

orange_glass:

One thing I've noticed with the
var fout = new File(path,"write","TEXT");

line is that if my file was once larger than the JSON output currently is, the old extra contents are still left at the bottom of the file.

Is there any way to zero out the file beforehand?

Emmanuel Jourdan:

I think you can set the end of file:

var fout = new File(path, "write", "TEXT");
fout.eof = 0; // truncate: drops whatever was in the file before
// ... write your stuff, e.g. fout.writestring(...)
fout.close();

Peter Nyboer:

Glad you caught that. I think it's fout.eof = 0

Emmanuel Jourdan:

Of course Peter. Thanks for mentioning it, I fixed my previous post.

Peter Nyboer:

And thanks for posting it. I was actually in the midst of suffering the problem when I happened to check back on this post, and there was the solution! I suppose that's a disadvantage of posting a zip file: the bug is embedded in eternity up there. Well, for the sake of the archive, here's a fixed version.

jic:

Thank you for writing this! It has been immensely helpful in a project I've been working on for a while now.

This morning I was testing saving out very large JS objects (around 10,000 key/value pairs at about 10 levels of depth), and it is truncating the JSON file pretty severely (at around 2,000 entries).

I've started an attempt at debugging this but was hoping someone could offer some insight. Thanks.

kavinithiy:

I'm new to JSON, and I need to create a JSON file for the following case.

Once I get input data in HTML, I need to store it in a JSON file.

I'm not that good at javascript.

Can someone please guide me on how to do this?

Thanks in Advance!

Floating Point:

After wasting a whole evening trying to coerce [text] and [jit.text] into saving a large jsonified text (they have 1069- and 2048-character limits respectively), your json_readwrite code worked beautifully -- thanks!

Jay Walker:

Thank you Peter; thanks to you I can now write to a json file. I'm using this to store device parameter settings per clip in a json file. I'd love to know if anyone has examples of a way to remove or reorder json arrays. I know there's write and read, but is there a pop, delete, or .remove example? So far I'm trying to simply replace with "" but I don't think that's the best solution.

orange_glass:

Not sure what anything() is doing here; presumably it's mutating the contents of UI. Can we see what the contents of UI are before being written out to the file?

Jay Walker:

Thank you again Orange_Glass. I deleted my posts to avoid confusion. I'm finding that the extra characters at the end of the JSON file are caused by trying to replace values in an object. If the new values have a lower character count than the current object's values, the file will still end where it did originally and not shrink to the correct character count, leaving extra useless characters.

For example, replacing {"0":{"1":[[60,64,0,127,0,127,0,64]]}} with {"0":{"1":[[60,64]]}} leaves me a fckd up json file like: {"0":{"1":[[60,64]]}} ,0,127,0,127,0,64]]}}

If anyone knows how to first delete the contents of an object and let the file be correctly formatted with the new values, please let me know! Maybe there's a solution in getting the eof calculated and trimming the excess? I'm just not finding much value in using JSON if I can't push new information to existing objects! Thanks!

orange_glass:

I think this is a file-writing issue, not a JSON one. Reading back my message from 2011, it looks like the exact same issue. It's so long ago I can't even remember what I was trying to do, let alone the solution. However, the next couple of replies suggest a fix...

.eof = 0

Try that on the file object before writing your contents. Failing that, I can think of two possible solutions I would explore next: 1) programmatically erase the file beforehand; looking at google there are shell externals out there, or the java object can apparently do this; 2) zero out the contents beforehand: using the file object https://docs.cycling74.com/max7/vignettes/jsfileobject you could readstring the contents of the old file, create a var of spaces of equal character length, write it, then write your JSON, ensuring your filepos is at the start of the file for both writes. Hacky, I know.
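A quick sketch of the .eof approach (untested; writeJSONFile is just a hypothetical wrapper name):

function writeJSONFile(path, obj) {
    var f = new File(path, "write", "TEXT");
    if (f.isopen) {
        f.eof = 0; // throw away the old contents so nothing stale is left at the end
        f.writestring(JSON.stringify(obj));
        f.close();
    } else {
        post("couldn't open " + path + " for writing\n");
    }
}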

Jay Walker:

Hey Orange_Glass, thank you so much; you led me down the right path and I think I have a solution. Shell externals made me think of trying out Node, and the fs module has tons of useful tricks for files.

For anyone interested, this is how I push and update objects to JSON with Node, which is much more the right way to do things! The node script is used only to write to the JSON file; it doesn't even read it. In the plain JS file I read and append to the JSON object, which then goes into the inlet of the Node script:

//------- nodeScript.js -------- //
const maxApi = require('max-api');
const fs = require('fs');
const path = require('path');
const fileFile = 'writeToJson.json';
const filePath = path.join(__dirname, fileFile);

var newObj = new Object();
var styleValue = 5;

// FUNCTION TO WRITE TO FILE
// two top-level key-value pairs live inside the JSON file, named "style" and "PARAMS"
maxApi.addHandler('appendParamData', (inletObject) => {
    var paramObject = JSON.parse(inletObject); // parse inletObject
    newObj.style = styleValue; // create a test object
    newObj.PARAMS = paramObject; // append inletObject
    // write the JSON string to file; using filePath keeps the file next to this script,
    // and fs.writeFile replaces the whole file, so no stale bytes are left behind
    fs.writeFile(filePath, JSON.stringify(newObj), function(err) {
        if (err) throw err;
    });
});
//------- End Of File -------- //

This is the JS that reads the JSON file and appends dynamic data as needed:

//------- deviceParameters.js -------- //
// (some code omitted)
// High Level Commands
function writeParams() {
    getParamValues2();
    writeParamValues();
}

// Get Ableton Parameter Values and create a formatted object
function getParamValues2() {
    // paramObject, trackNumber, and selectedDevice are globals set in the omitted code
    var paramChunkArray = new Array();
    var chosenParam = 0;
    var paramValue = null;
    numOfParams = device.getcount('parameters');

    // create param values object, ex: length = 65
    for (chosenParam = 0; chosenParam < numOfParams; chosenParam++) {
        paramPath = new LiveAPI('live_set tracks ' + trackNumber + ' devices ' + selectedDevice + ' parameters ' + chosenParam);
        paramValue = paramPath.get('value');
        paramObject[chosenParam] = paramValue;
    }
    log("-deviceFunctions: getParamValues2 - paramObject:", paramObject.length, typeof(paramObject), paramObject);

    // split param object into chunks of 8, ignoring first number
    // note: this will create write function [[ [1],[2],[3] ]]
    arrayOfParamValues = chunk(paramObject, 8);

    // create arrays for each group of 8
    for (var i = 0; i < arrayOfParamValues.length; i++) {
        paramObject[i] = '[' + arrayOfParamValues[i] + ']';
        log('paramObject', i, paramObject[i]);
        // In this code, JSON.parse makes them arrays and not strings.
        paramChunkArray.push(JSON.parse(paramObject[i]));
    }
    arrayOfParamValues = paramChunkArray;
}

// Chunks function to split an array into segments (to swap portions of an object if needed)
function chunk(arr, len) {
    var chunks = [],
        i = 1, // start at 1 to skip the first number
        n = arr.length;
    while (i < n) {
        chunks.push(arr.slice(i, i += len));
    }
    log('chunks');
    // log('chunks', chunks, '\n', /*chunks[1],*/ typeof(chunks));
    return chunks;
}

// Send object through outlet for Node to receive it
function writeParamValues() {
    read();
    anything(trackNumber, currentClip, arrayOfParamValues);
    var jase = JSON.stringify(UI);
    outlet(0, jase);
}

// Reads the JSON. ===>MAYBE THE MOST IMPORTANT FUNCTION!!<===
// The last two lines will target a specific key-value pair to continually append new data. Check it out suckas.
function read() {
    memstr = "";
    data = "";
    maxchars = 8000;
    path = p; // p is the JSON file path, set in the omitted code
    var f = new File(path, "read");
    f.open();
    if (f.isopen) {
        while (f.position < f.eof) {
            memstr += f.readstring(maxchars);
        }
        f.close();
    } else {
        post("Error\n");
    }
    // hell yeah
    var UIObject = JSON.parse(memstr);
    UI = UIObject["PARAMS"];
}

// Appends new data to UI object
function anything() {
    var a = arrayfromargs(arguments);
    var id = a[0];
    var property = a[1];
    var data = a[2];
    if (UI == null) { UI = new Object(); }
    if (UI[id] == null) {
        UI[id] = new Object();
    }
    UI[id][property] = data;
}
//------- End Of File #ballin -------- //

orange_glass:

Great to hear it's working!

Matteo Marson:

Hey everyone,

Thanks for sharing!
I'm having an issue with JSON_readwrite.js; I need to read multiple JSON files and parse some elements to fill a jit.matrix. The process starts fast and smooth, but after a while, it gets painfully slow. I noticed that the memory usage keeps increasing. I also tried recompiling the js file before each reading, but it doesn't fix the issue.
I'm an almost complete noob at js, but is there a way to free the memory after each file reading?

Might this be the reason why?


Thanks!