save multiple colls and other data to single file
I have 4 colls in my patch (each of indeterminate length), as well as a few additional data elements, that I would like to save to a single file on disk. I then want to be able to load that file back into my patch to restore the data to those same colls and to the various other data elements.
Suggestions for best way to do this?
very good question. if those colls have a similar structure, you can merge them into a single coll and use fixed offsets to keep the data sets apart (e.g. the second coll's data starts at index 1001)
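To illustrate the offset idea: a coll file stores one entry per line as `index, data;`, so a merged file might look like the sketch below (indices and values are made up), with the first coll occupying indices below 1001 and the second coll starting at 1001:

```
1, 60 64 67;
2, 62 65 69;
1001, 0.5 0.7;
1002, 0.25 0.3;
```

On load, the patch would add or subtract the offset to route each entry back to its logical coll.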
for multiple very different things i have no better idea. the most straightforward approach seems to be writing individual files, but into the same subfolder.
You could use file compression if it is for archiving only.
If you want to be able to read everything back into a single coll, then dump all the colls into one master coll or into a text file. The details depend on the index type you use: sort the entries and insert a separator index between the colls while dumping.
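One possible layout for such a master dump (the separator convention here is just an assumption, not anything Max enforces): each coll's block is preceded by a marker entry, so the loading patch can split the data back out by scanning for the markers:

```
0, ---coll1---;
1, 60 64 67;
2, 62 65 69;
100, ---coll2---;
101, 0.5 0.7;
102, 0.25 0.3;
```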
I do this a ton and use dicts as an intermediary container. I set up one dict per coll and send each dict a pull_from_coll message with the matching coll's name to copy its contents in, then [dict.pack coll1: coll2: coll3: coll4:] combines them into one dict, which I write to disk. To unpack I do the reverse: read the file, [dict.unpack] the sub-dicts, and use push_to_coll to put the data back where it belongs.
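For reference, dicts are saved to disk as JSON, so the packed dict would end up looking roughly like this (coll names and values here are hypothetical; each pulled coll appears as a sub-dictionary keyed by its indices):

```json
{
	"coll1" : { "0" : [ 60, 64, 67 ], "1" : [ 62, 65, 69 ] },
	"coll2" : { "0" : 0.5, "1" : 0.7 },
	"coll3" : { "0" : "foo", "1" : "bar" },
	"coll4" : { "0" : [ 1, 2, 3 ] }
}
```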
[dict] all the way! As Rodrigo said, pull_from_coll and push_to_coll make it easy to store/restore colls into dicts.
Here is an example with two colls and some extra data:
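A saved dict combining two colls with some extra data might look like this on disk (all names and values are hypothetical, just to show the shape):

```json
{
	"coll1" : { "0" : [ 60, 64, 67 ], "1" : [ 62, 65, 69 ] },
	"coll2" : { "0" : 0.5, "1" : 0.7 },
	"tempo" : 120,
	"comment" : "extra non-coll data lives alongside the coll sub-dicts"
}
```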
Thanks to you all for your responses! The dict intermediary container seems like a great way to go; I'll give that a try. Many thanks TFL for your example patch as well.