memory not released when objects are deleted?
After creating a lot of standard objects and deleting them afterwards, while watching the memory used by Max (with Activity Monitor and Xcode Instruments), I'm a bit confused about how the memory management works, especially how memory is released in Max.
If I create a new patch and add 200 metro objects to it, the physical memory used by max of course rises.
But if I delete all of the objects the memory usage doesn’t drop.
If I create more objects after that, the memory usage still rises.
Can someone explain what’s going on?
And is there a way to check whether externals I've built myself manage memory correctly?
There’s a lot of black magic in any memory allocation, particularly so in Max.
First of all, how are you allocating memory? If you're using the getbytes() calls, Max maintains its own private memory pool (in blocks allocated using malloc() in the low-priority thread). The block size was 32kB in Max 4 and earlier (actually, four bytes less than that); I think the blocks are now at least twice that size, but I'd need to check. Anyway, when are these blocks released? Good question. In particular, if 200 metros were allocated in one of those blocks *plus* something else that hasn't called freebytes(), the entire block is still in use even after deallocating the 200 metros. So the block obviously can't be free()'d.
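The effect described above can be sketched in a few lines of C. This is a hypothetical toy pool, not the real getbytes() implementation; block size aside, the point it illustrates is that the underlying malloc()'d block can only be returned to the OS once *every* small allocation carved out of it has been freed:

```c
#include <stdlib.h>

#define BLOCK_SIZE (32 * 1024)  /* Max 4-era block size mentioned above */

/* One pool block: a bump-pointer arena with a live-allocation count. */
typedef struct {
    char  *base;    /* the malloc()'d block */
    size_t offset;  /* bump pointer for new allocations */
    int    live;    /* number of outstanding allocations */
} pool_block;

static pool_block *block_new(void) {
    pool_block *b = malloc(sizeof *b);
    b->base = malloc(BLOCK_SIZE);
    b->offset = 0;
    b->live = 0;
    return b;
}

static void *pool_getbytes(pool_block *b, size_t n) {
    if (b->offset + n > BLOCK_SIZE)
        return NULL;              /* block full */
    void *p = b->base + b->offset;
    b->offset += n;
    b->live++;
    return p;
}

/* Returns 1 only when the whole block could actually be free()'d. */
static int pool_freebytes(pool_block *b, void *p) {
    (void)p;                      /* individual chunks are just marked free */
    if (--b->live == 0) {         /* last allocation gone: release the block */
        free(b->base);
        b->base = NULL;
        return 1;
    }
    return 0;                     /* something else still lives in the block */
}
```

Deallocate 200 "metros" from a block that also holds one other object, and pool_freebytes() keeps returning 0 until that last object goes too, which is exactly the behavior Activity Monitor would show as "memory not released".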
Since you say you are creating Max objects, they will almost certainly be using getbytes(). You can check your own malloc()/free() calls with Xcode Instruments & friends, but with getbytes() you're pretty much on your own. My only advice is to follow a well-defined allocation/deallocation strategy and stick to it. You have to watch out because the Max API has numerous functions that allocate memory as a side effect, and I am not aware of any consistent naming convention that helps you recognize them. If in doubt, check the documentation.
Another thing to mention is that Max uses a memory pool to speed up small allocations. If you allocate and deallocate a large chunk of memory using sysmem_newptr()/sysmem_freeptr(), you should see it in Activity Monitor. For instance, creating a "tapin~ 60000" object increases memory usage by ~20MB (yeah, I'm using Max 6 ;-), and removing the object frees the memory as expected.
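A small sketch of this small-vs-large routing policy, with the tapin~ buffer math thrown in. The 16kB cutoff is an invented number for illustration (Max's real threshold isn't stated here), and the buffer estimate assumes Max 6's 64-bit audio at a 44.1kHz sampling rate, which is my assumption:

```c
#include <stddef.h>

#define LARGE_THRESHOLD (16 * 1024)  /* invented cutoff, for illustration */

typedef enum { FROM_POOL, FROM_OS } alloc_source;

/* Decide where a request of n bytes would be served from: small requests
 * from the internal pool (which may be retained after freeing), large
 * requests straight from the OS allocator (released promptly on free). */
static alloc_source route(size_t n) {
    return n >= LARGE_THRESHOLD ? FROM_OS : FROM_POOL;
}

/* Bytes needed for a delay line of `ms` milliseconds at `sr` Hz with
 * 64-bit samples (Max 6 runs 64-bit audio internally). */
static size_t delay_bytes(double ms, double sr) {
    return (size_t)(ms / 1000.0 * sr) * sizeof(double);
}
```

For "tapin~ 60000", delay_bytes(60000, 44100) comes to about 21.2MB, which lands right in the ballpark of the ~20MB gain and release observed in Activity Monitor; a buffer that size would go straight to the OS allocator, which is why its release is visible immediately.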
So is memory only released for large objects when they are deleted?
Meanwhile I had a look at some more of the standard objects. "filtergraph" or "function", for example, release their memory. But most of the objects I tried didn't. This includes externals I built myself, which consist of only the bare necessities needed so they can be created in Max. If I'm lucky, the memory of my own externals is sometimes partly released (I can only achieve this if I create at least 50 objects and then delete all of them at once).
Is there a way to release the memory for any object after it is deleted, to get back all the memory that was allocated before?
And another thing: am I right that it is not possible to use garbage collection when coding Max externals in Objective-C? I set a user-defined build flag in Xcode, "GCC_ENABLE_OBJC_GC = required", which didn't work. I think the Max console said something like "build module not found", and my objects were then not usable.
Objective-C uses its own memory management system, based on a set of 'memory pools'. Simple types allocate fine and scope out cleanly (they are freed once out of scope), but any complex object or array would need to be allocated something like this (Tim P., or anybody, please correct me):
/* alloc obj, send instantiation msg */
NSMutableArray *myArray = [[NSMutableArray alloc] initWithCapacity:10];
/* return, marked for freeing after use */
return [myArray autorelease];
or some such. autorelease is the standard message to free an object after the scope it was returned to goes away… there are other possibilities. It's the programmer's responsibility to manage it.
You can also use old-school C malloc()/free() calls for the 'base' C types.
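For the plain-C route, a minimal sketch of the paired allocation/deallocation discipline recommended earlier in the thread. The names here (t_myobj, myobj_new, myobj_free) are hypothetical, not from the Max SDK:

```c
#include <stdlib.h>

/* A plain C struct owning a heap buffer, as an external's data might. */
typedef struct {
    long    count;
    double *samples;   /* plain C buffer, owned by this struct */
} t_myobj;

static t_myobj *myobj_new(long count) {
    t_myobj *x = malloc(sizeof *x);
    if (!x)
        return NULL;
    x->count = count;
    x->samples = calloc((size_t)count, sizeof *x->samples);
    if (!x->samples) {
        free(x);       /* don't leak the struct on partial failure */
        return NULL;
    }
    return x;
}

/* Every allocation in myobj_new has exactly one matching free here. */
static void myobj_free(t_myobj *x) {
    if (!x)
        return;
    free(x->samples);
    free(x);
}
```

The point is the symmetry: one constructor, one destructor, and nothing allocated anywhere else — which is also the easiest shape for Instruments to verify leak-free.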
oh, and Max’s ever growing memory pool has been an issue before: the demo i saw at a max night school: roll up a bunch of new unique symbols for max to refer to with a js / java script (labeled s/r pairs, f’instance). look at the memory use grow! now delete the scripted patch… shouldn’t all those "referred to" labels go away from memory? evidently not… now this was more than a major release ago… do we still have the same issue?
minds stuck at work tweeking other’s bad sql code want to know! :-)
AFAIK there is no way to remove symbols from the symbol table once they were added. This has to do with the design of the system itself, as you have no chance to track whether a specific symbol is still being used by a particular object or not…
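The design constraint can be sketched with a toy intern table in the spirit of Max's gensym() — this is my own illustration, not Max's actual implementation. The same string always yields the same node, and because nothing tracks who still holds a node, there is never a safe point at which an entry could be removed:

```c
#include <stdlib.h>
#include <string.h>

/* One interned symbol; nodes live forever once created. */
typedef struct symbol {
    char          *name;
    struct symbol *next;
} symbol;

static symbol *table = NULL;   /* naive linked list; Max hashes, of course */

static symbol *intern(const char *name) {
    for (symbol *s = table; s; s = s->next)
        if (strcmp(s->name, name) == 0)
            return s;          /* already known: hand back the same node */

    symbol *s = malloc(sizeof *s);
    s->name = malloc(strlen(name) + 1);
    strcpy(s->name, name);
    s->next = table;
    table = s;                 /* grows forever: no refcount, no removal */
    return s;
}
```

Since any object anywhere may be holding a pointer to an interned node, freeing one would risk dangling pointers throughout the system — hence the table only grows, exactly as the s/r-pair demo showed.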
re: osx autoref count : :-) ah, the eternal whirlpool of progress…er… or was that a spout… whatever, same as it ever was: different. ;-)
" if it is a small memory allocation, we use our internal pool, and if it is a large allocation, it uses the standard OS memory pool" – Joshua Kit Clayton
OK, and I did some more testing. I kept creating metros until my free memory was smaller than 10MB. Then I deleted the metros and started again, to see whether the memory used before was lost or could be used again. After creating a lot of new metros, it became obvious that memory from the Max memory pool gets released when it is needed for new objects. You can watch the memory being released in Activity Monitor, so the memory isn't lost. My experience was that Max is still stable after that point.
Could someone else try creating objects until memory is near its limit, delete the objects or close the patch (but not Max), and then open a complex patch to see if it works properly?
Testing: [mbp, dual core 2.7i, 4gb]
I tried creating 1000 number boxes (the integer box), deleting them, and creating them again, until my count said that 10000 in total had been created, without a crash. I also tried 200 bpatchers with embedded patches that each contain about 20 small objects. The interesting thing about the bpatcher testing is that even with that many bpatchers, editing and using the objects in Max was flawless.
400 bpatchers of the same type still worked. But as always, once a certain number of objects are present in a patch, Max starts to get slower, even if they only use a small amount of memory/CPU.
My conclusion is that there's maybe no need to worry about Max's memory management, but rather about the memory management I have to do myself.