Two problems: getting a subset of a string, and can't close patcher windows
Hi,
looking for a way to show the playhead position in minutes and seconds outside Live (Android tablet + TouchOSC), I found the Time Display 1.0 device, which is perfect... almost. The format it shows is hhh:mm:ss:msmsms and I would like to strip away the hours (the first three characters and the ":") and the milliseconds (the last three and the ":" before them). So I opened the device in the Max editor and eventually managed to open the subpatcher called "Time", in which I tried to add a regexp filter like this: "regexp .*[:](.*)", with no luck. Any help?
In the Time patcher I added a udpsend to broadcast the time through OSC. The sending works and I can get the time into my TouchOSC layout, but I'd like to strip away the outer characters.
I would also like not to send OSC data continuously, but only when the last message is different from the previous one.
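In plain-code terms, what I'm trying to do amounts to this little Python sketch (just to illustrate the logic; the real thing would of course be Max objects, and the function names here are made up):

```python
import re

last_sent = None  # remember the previous message, like [zl change]

def strip_time(raw):
    """Turn the Time Display string 'hhh:mm:ss:mss' (e.g. '000:03:27:500')
    into just 'mm:ss'."""
    m = re.match(r"\d+:(\d+):(\d+):\d+$", raw)
    return f"{m.group(1)}:{m.group(2)}" if m else None

def maybe_send(raw):
    """Return the mm:ss string only when it differs from the last one sent,
    otherwise None (i.e. send nothing)."""
    global last_sent
    out = strip_time(raw)
    if out is None or out == last_sent:
        return None
    last_sent = out
    return out
```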
Another thing: when I try to open the "Time" patcher window with "Original", nothing happens, so I have to use "New window" instead. Now I have a few small windows floating around and I don't know how to close them.
I watched many videos and read many papers and downloaded a lot of examples or snippets but I couldn't find a solution that works.
Any good link about M4L programming and tutorials?
Thanks.
fabio
Upload the patch, otherwise one can't tell what those "Original"
or new floating windows are...
It should be possible to output the time without hours and ms in the first place,
without any need for regexp.
Then add zl change to send only when the seconds progress.
------
otherwise try this

regexp outputs only the numeric items;
we route items 2 & 3 from the HH:MM:SS.MS list
to sprintf to form MM:SS messages.
If that timer patch is programmed correctly, there should be no need for zl change,
but I would not expect that to be the case.
P.S. If I remember correctly, Live is not capable of reporting real time
correctly if there are any tempo changes involved.
Thank you for taking the time to reply to my post and try to help me.
Before I go on to other things, let me share the patch so you can see what it's like.
Now let me ask a little question on a thing that's bugging me a little, please.
Yesterday, every time I wanted to edit the popup window I had to choose "New window", because "Original" was not available or didn't work. Now, whenever Max is open I'm left with all the popups, and I can't see how to get rid of them: any help on this too, please?

@source audio
About the way to get only h:mm:ss: I tried your suggestion, but it didn't work.
After some trial and error, and a lot of frustration with the Max documentation (both the help and the reference show quite a few things but explain very little, and there are almost no CLEAR examples of how to set up a function), I managed to make it work almost perfectly. Here is how I did it, taking into account that I needed to send all the values as a single string, so that a label in TouchOSC would show the time correctly.
Here is the working snippet (with some "probes" to see what's going on along the signal path):

And here's the TouchOSC layout output (sorry for the quality of the picture...):

Now I would like to format minutes and seconds so that they always show two-digit values: where should I look?
And what if I wanted to filter the milliseconds to show only values at half a second (.0 or .5)?
Thank you very much.
fabio
1 - you did not use sprintf as in the screenshot.
2 - you stated you only want mm:ss?
3 - you messed with the patcher and added it again to the parent patch,
so now you have duplicates of it.
Best would be to trash that device and start from scratch.
A simpler way to get real time is get_current_song_time
or plugsync~,
then translate ticks to HH:MM:SS
using regexp and sprintf,
or use ms and do the math using plain Max objects.
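For reference, the tick-to-clock-time math those objects would implement can be sketched in Python (assuming plugsync~'s raw output where 1.0 = one quarter note, and a constant tempo, since Live misreports real time when the tempo changes):

```python
def ticks_to_hms(raw_quarters, bpm):
    """Convert elapsed quarter notes to (hours, minutes, seconds, ms).
    Assumes a constant tempo throughout the song."""
    total_ms = raw_quarters * 60000.0 / bpm  # one quarter note = 60000/bpm ms
    ms = int(round(total_ms))
    h, rest = divmod(ms, 3600000)
    m, rest = divmod(rest, 60000)
    s, millis = divmod(rest, 1000)
    return h, m, s, millis
```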
Did I understand that correctly - you wrote
you want the ms to display as .0 or .5?
that is what it would look like:
02:33.0
02:33.5
02:34.0
02:34.5
etc
You don't need to display anything in Live, only send the time to OSC.
It can all be done with only a few objects.
1 - you did not use sprintf as in the screenshot.
I gave your hint another look and of course it works.
What was putting me off was that you wrote "we route items 2 & 3 ...", so I thought that the box where I saw "$2 $3" had to be a route object instead of a standard message, even though I couldn't see the word "route" in your screenshot. But then I realized the screenshot didn't have to be 100% syntactically faithful to the real objects: in fact, in the "zl change" box I can't see the dot between the two words, but I learned that it is needed in the real object.
2 - you stated you only want mm:ss?
Yes, but along the way I thought it would be better to also show a single hour field, just in case I ever have to work with videos. I modified your patch accordingly.

3 - you messed with the patcher and added it again to the parent patch,
so now you have duplicates of it.
Here you are right again. In my frustration at not being able to open the popup window in edit mode, I tried things without knowing what I was doing... Now, how do I delete that tab in the parent window?
Did I understand that correctly - you wrote
you want the ms to display as .0 or .5
that is what it would look like:
02:33.0
02:33.5
02:34.0
02:34.5
etc
Yes, you got it right. It's a minor thing, but I think that on the tablet layout it would feel better to see the time change more often than once a second, but not as fast as every tenth of a second. Think of it as a sort of "blinking led" telling you that the transport is running.
To do it easily I would need a sort of "round down" (floor) function where I can define 0.5 as the step of the rounding process...
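What I mean behaves like this Python sketch (the function name is just illustrative):

```python
import math

def floor_to_step(seconds, step=0.5):
    """Round a seconds value DOWN to the nearest multiple of `step`
    ("round for defect", i.e. floor-style rounding)."""
    return math.floor(seconds / step) * step
```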
you don't need to display anything in Live, only send time to OSC.
Can be all done with only few objects.
...
here one example using plugsync~
Thank you very much, that's much simpler. I made a little adaptation to show the single hour digit, and here it is:
Thank you again and have a nice day.
fabio
Glad it worked for you, and that you got rid of that complicated device.
Just a little hint:
sprintf needs %02d etc. only to add leading zeros;
for a single digit, %d or %i is fine.
and
if you leave a space between HH: and MM
you get quotes at the output

sprintf symout %d:%02d:%02d.%d
would produce

If you prefer to have separation around the ":",
then a comment object would be better, as it ignores the quotes.

touchOSC also ignores them, at least the old version does.
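The sprintf format above works the same way as any C-style printf; a quick Python illustration of the same pattern, just to show what the %d and %02d pieces do:

```python
def format_time(h, m, s, tenth):
    """Mimic [sprintf symout %d:%02d:%02d.%d]: plain %d for the single-digit
    hour, %02d to zero-pad minutes and seconds, one digit after the dot."""
    return "%d:%02d:%02d.%d" % (h, m, s, tenth)
```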
Thank you very much again for your explanation. That is gold for my attitude of "while you are at something, try to learn something general and not just how to solve your little need of the moment".
Going back to sending OSC to TouchOSC (Mk1): to do it in Live I'm using the Live Connection Kit Max device "OSC Send" or the "Live Grabber" sender, but with both I can't map the tracks' ARM buttons to them directly. Searching around I found an M4L device called "Arm 8 tracks" that lets me map the track ARM buttons to its own buttons, which are mappable to the OSC senders.
Now that I've learned how to send OSC directly from within a patch, I'm trying to mod Arm 8 tracks by putting the OSC sending functions directly into its patch. Here it is, with my modding on a blue background.
I have some questions, though.
The string I need to send should have this format: OSC address | arm status (0/1).
In my mod it would look like this: /ViviARM 1 (when armed) and /ViviARM 0 (when unarmed).
For now I've hard-coded them in the patcher, but if it's not too time-consuming to learn, I'd like to be able to get the address from a global list stored somewhere, or better, a table or a db where each record has at least 3 fields: track number, track name, OSC address.
That way I could get the track number and/or the OSC address from the track name (or, even better, by selecting it from a list), with some sort of lookup, and I would be sure to use the same track info across all the patches.
Now the questions:
1
Where do you think I can get the track number in that patch? And how?
As you can see, on the first track I "probed" some outlets to see what I am receiving and when, and in the leftmost "probe" the last piece of info is the track number minus 1 (the list is zero-based).
Can you think of a way to get the track name?
2
Do you think it is better to use a single udpsend box, as I did (in fact, I used two of them), instead of having one for each button?
3
Performance-wise, it seems that clicking one of the buttons in the patcher (both in the Max editor and in the Live device) gives me faster feedback on TouchOSC than clicking directly on the arm button of a track: do you see any room for improvement in the code?
4
Do you have an idea of how to build that "global db" of track info and values?
Thank you.
fabio
I realized that I would need to be able to set an offset in seconds to compensate for the gap between the project start time (1.1.1) and the song start time (usually 12 seconds after 1.1.1).
We need to subtract the offset time before the number gets unpacked, and, if I understood the top part of your patch correctly, we have 640 ticks per second: am I right?
I thought that I should first convert the offset seconds into ticks (offset seconds x 640) and then subtract the resulting ticks from the number before it is split into four pieces, so I used an expression box like this: "expr $f1 - 640 * $f2", where $f1 is the time in ticks and $f2 is the offset time in seconds.
Trouble is that, as you can see in the patch, if I use the $f2 operand the box doesn't evaluate the expression, but if I put the seconds number directly into the expr box, like this: "expr $f1 - 640 * 12", it does get evaluated! I would like to understand why it behaves like that, because I will need to "present" the offset field in Live, so I need the $f2 version to work.
A useful thing would be the ability to programmatically get the time value of a locator named "Start", as I always have one at the exact beginning of the song. That way I could use the Start time to set the offset even if I move the locator around.
Thank you.
PS.
I'm really puzzled about the behaviour of the expr box... I'm curious to see what I'm doing wrong...
I don't use Live, find it crap, and have no interest at all in it.
The only reason I ever come in touch with it is to help
some students and customers.
For that reason I am the wrong person to ask about armed tracks and the rest of it.
But since you asked, I would say try to use only 1 Live device,
like that time display device, and also use it to control the arm state of those 8 tracks.
Make your life easier and place those 8 tracks in track slots 1 - 8,
so you don't need to detect anything and can talk plain text to the tracks and touchOSC.
I have this idea that your live set is a static set for stage use, or?
Issuing track names and getting/setting their armed state should be an easy thing to do in a single device,
in case you insist on using names and placing tracks in any order -
but then have fun with it....
But you need a good strategy - is TouchOSC in control of the armed states?
Is the song auto-arming the tracks in the first place?
A single udpsend is a better choice; one can also control the order, flow, speed etc. of the sent messages.
I see you used 192.168.0.255 instead of 192.168.0.99 - trying to broadcast? Sometimes broadcasting has speed issues.
(Sorry, I got the part about the IP wrong - that was from some other patch.)
------
To the offset :
plugsync~ outputs raw ticks in relation 1. = 480 ticks or 1 quarter note.
I don't know what that 640 you mention represents.
To be able to program the offset, you will need the time signature
translated to ticks and multiplied by the number of bars to offset,
then insert (- that much) into the ticks flow.
Live shows it as bars : beats : units and not real time, so I guess your offset is in bars.
plugsync~ has a time signature outlet as a list; unpack it and
multiply the number of beats by the beat length (expr 1. / $f1 * 4).
Let's say the TS is 7/8: 1/8 = 0.5 raw ticks (1. / 8 * 4),
so 7 * 0.5 = 3.5 = bar length in raw ticks.
If you want to offset 6 bars, then multiply 3.5 * 6 = 21.
Insert - 21. into the raw ticks output before the * 480.
All clear ?
---------
But if you want a real-time offset, like 12 seconds,
then you need to translate 12000 ms into ticks
at tempo xy.
Let's take tempo 110:
60000 / 110 = 545.4545454545 = length of 1 quarter note in ms.
Now 12000 / 545.4545454545 = 22, which is your offset in Live's
representation of raw ticks.
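As a side note, 480 ticks per quarter at 80 bpm gives 480 * 80 / 60 = 640 ticks per second, so the 640 figure mentioned earlier probably came from a project at that tempo. The real-time-offset arithmetic above can be sketched in Python like this (a plain illustration of the math, not Max code):

```python
def seconds_to_raw_quarters(offset_seconds, bpm):
    """Translate a real-time offset into plugsync~'s raw units,
    where 1.0 = one quarter note lasting 60000 / bpm milliseconds."""
    return offset_seconds * 1000.0 * bpm / 60000.0

def apply_offset(raw_quarters, offset_seconds, bpm):
    """Subtract the offset from the raw tick flow (before the * 480),
    clipping at zero so the display never goes negative."""
    return max(0.0, raw_quarters - seconds_to_raw_quarters(offset_seconds, bpm))
```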
here a little presentation of both offset types

P.S. I forgot to answer about the "Start" locator.
It is possible to extract its position, which is expressed in elapsed quarter notes.
With a 6/8 bar and the locator placed at bar 11, that means 3 * 10 = 30 quarter notes.
It makes me wonder why you use real time and not bars and beats, which seem to be the only native time-measuring units in Live...
here is your "Start" locator detector
Hi, and thank you again for the time you are spending helping me solve my little problems and, above all, learn new things.
but you need a good strategy - is TouchOSC in control of the armed states?
Is the song auto-arming the tracks in the first place?
No, there is no auto-arming. I'm just building a remote control for the most common needs while tracking.
Yes, in my TouchOSC layout I can control the ARM states by sending a midi cc message to Live, where the scripts I put in a ClyphX user settings file are triggered by the incoming midi cc message.
Some scripts are as simple as "track/Mute" or "track/ARM", but you can get very good control of most of Live's objects. Unfortunately Live per se doesn't give much feedback directly, and that is where OSC is invaluable. Without it I couldn't get the state of the Mute or Arm buttons, so I always had to check the PC screen to be sure things were as expected, and I had to use two buttons for each function: "Mute on" and "Mute off", "Arm on" and "Arm off". That was far from ideal, so thank goodness for OSC, and for Max, which lets us use it within Live.
About track names vs track numbers
I'm relying heavily on track names because ClyphX lets you use them as the target of a few commands without being tied to the track position and without having to remember the position number.
But as my tracks are always in the same positions, inside the patchers I can easily use the numbers.
I must find out how the return tracks are numbered, though...
About the offset
............
If you want to offset 6 bars, then multiply 3.5 * 6 = 21.
Insert - 21. into the raw ticks output before the * 480. All clear?
Well, not really, but I was considering that I could define the offset in beats or bars instead of real time, because my "Start" locator always sits on a bar or beat line, so I could use those values to define it. In any case, thanks to your new Time-OSC with automatic offset calculation, I won't need to set a value manually.
So let me thank you very much again for that little gem. I will keep it like it is.
It makes me wonder why you use real time and not bars and beats, which seem to be the only native time-measuring units in Live...
I need to show the playhead/cursor position in real time because singers often put comments and marks with running-time indications on their lyrics, since they were listening with a player that only showed runtime values.
So, in these last few days, mainly thanks to your help and your work, I have made giant steps on the path of building my remote control. Should you ever come to northern Italy, send me a message and I'll be glad to take you to some nice place (my wife is from Bellagio... the real one, not the fake Vegas one...).
There is still one thing for which I couldn't find a solution, or even any info:
- getting the state of the global recording or of the recording button
Now when I hit the "rec" button on my layout I have to check the tiny button on the PC screen to be sure that recording is ON.
After hours of searching I'm still empty-handed...
Thank you.
fabio
For the feedback of muting and arming through OSC, I ended up modding the "Arm 8 tracks" patch.
Here is the one for the muting of the first 16 tracks:
And this is for arming 8 tracks:
For now I kept them as they were, with the interactive buttons, even though I'm not using them. I will think about whether it's better to remove everything except what is needed to send the status via OSC.
Maybe I will make that version too.
Now I need to understand how to apply the same functions to the return tracks...
Thx.
fabio
There is still one thing for which I couldn't find a solution, or even any info:
- getting the state of the global recording or of the recording button
After some further research I managed to get the status of the global recording. In fact it's as easy as reading a property of the live_set class.
Thank you again.
fabio
That looks more complicated than I thought.
Those arm and mute devices you posted have no idea what track names are.
A track has a slot number, which is human-visible by looking at the track activator number.
The track id has nothing to do with that.
If you want to report the arm state of the track named Guitar, you need its id,
which will remain the same if you move the track to a different slot.
So what is your link between a track name, its slot number, and the report of the arm or mute state?
I don't see it.
In order to do something like that, one has to build a list of track names and their ids
and monitor them for any changes - and keep their states linked to what?
What I am trying to say is: either one builds a remote control which is completely flexible,
which means it reflects the complete Live set state,
or one uses a fixed setup with set tracks, devices etc. and programs a static, efficient control.
I am not sure to understand exactly what your concept is.
If you send the arm and mute states of the first 16 tracks to
touchosc, they will always be related to track numbers, not names.
One option here would be to check the states of the 16 tracks every time
ClyphX executes any script.
like this
and here linking track names & ids
-----
you can use return_tracks instead of tracks to get their ids, states etc.
one builds a remote control which is completely flexible,
That is the primary goal but to do that one must have a good knowledge of how to do it.
A few days ago I knew almost nothing about how to do it and it's only thanks to your patient help that I'm starting to understand something.
If you want to report the arm state of the track named Guitar, you need its id,
which will remain the same if you move the track to a different slot.
Some time ago I decided in favour of a static template with the same tracks in the same positions, to get a link between names and numbers. But sometimes I found it useful to be free to add or remove a track - even only temporarily - and that would break the link between name and number for every track from that one to the right of the set.
Now, as you say the ID stays the same even if you move the track, and I know nothing about how ID assignment works, let me ask you a couple of questions, please.
Should I add a new audio track in position 5 and rename it to "Guitar", it will be given a certain ID:
1 - If I delete the track, save and close Live, reopen it and create the same track in the same position, will it get the same ID as before?
2 - And what if I create it in a different position but with the same name?
The answers to these two questions are key for the route I will decide to follow for the further development of my remote so forgive me if for now I send my reply even if it doesn't cover the rest of your post.
Thank you.
fabio
PS. Just this:
you can use return_tracks instead of tracks to get their ids, states etc.
OK: inside the children "return_tracks", are they enumerated with numbers or letters?
Thank you.
Every track & return track gets a new id when created: the current highest id + 1.
All track types share the same id number list, including the master track.
If you had a set with ONLY 5 tracks, deleted 3 of them and added 3 again, no matter
what names they get, the new tracks will have new, increased ids:
instead of the (theoretical) 1 2 3 4 5 you now have 1 2 6 7 8.
Like in poker.
The ids of return tracks are numeric, same as normal tracks,
even though they show alphabet chars A B C etc. and can't be reordered.
But if you insert a new return track by right-clicking on a return track,
its letter will be "stolen" by the new return track, and all following return tracks,
including the "victim", will get shifted along the alphabet...
-----
If you start a default set with 4 tracks and 2 returns,
you have ids 2 3 4 5 assigned to the tracks, 6 7 to the return tracks and 8 to the master -
at least that is what Live 11 does on my test system.
Now it gets even worse:
if you add a track it will get id 9; delete it and create another with any name - id 10;
then add a return track and it will become id 11.
The worst thing is that there is no way to reassign ids.
Well... there is, but it involves uncompressing the als file, editing it (it is XML format),
saving it and adding the .als extension back.
Live will open it even though it is not recompressed.
If you create and delete a track repeatedly, the highest id number will keep increasing.
After saving the Set, it remains so as long as the Set is open.
On the next open of the Set, the current highest existing id is used to create new tracks.
Sounds complicated ?
Here is an example:
you start a default set with 1 midi, 2 midi, 3 audio, 4 audio, A Reverb, B Delay, Master.
I don't need any midi - remove them:
ids 2 & 3 have disappeared forever!
Also take that rev and delay out.
Now we are left with 2 audio tracks, and their ids are 4 & 5.
Add an audio track, delete it, add again - now the 3rd audio track has id ???
You would say 6, but NO, it is 10, because
the initial set had 7 ids (4 tracks, 2 returns and the master), starting with id 2.
On the first track addition it got id 9 - we deleted it, added another track, and it got id 10.
Can you follow that?
That id is linked to that track for as long as it exists in that Set,
no matter if you rename it or move it to a different track slot.
But... if one saved the Set when it had only the 2 audio tracks,
closed the set and reopened it, a new 3rd track would get an id
one number higher than the one assigned to the master track.
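The id bookkeeping described above can be modelled in a few lines of Python (a sketch of the observed behaviour only, not of Live's actual implementation):

```python
class LiveSetIds:
    """Model of the id behaviour described above: a new track gets
    (highest id assigned this session) + 1; deleted ids are never
    reused; reopening the set resets the counter to the highest
    id that still exists."""

    def __init__(self, existing_ids):
        self.ids = list(existing_ids)
        self.highest = max(self.ids)

    def add_track(self):
        self.highest += 1
        self.ids.append(self.highest)
        return self.highest

    def delete_track(self, track_id):
        self.ids.remove(track_id)   # gone forever within this session

    def reopen(self):
        self.highest = max(self.ids)

# The example above: default set = ids 2..8 (4 tracks, 2 returns, master)
live = LiveSetIds(range(2, 9))
for gone in (2, 3, 6, 7):        # remove both midi tracks and both returns
    live.delete_track(gone)
first = live.add_track()         # gets id 9
live.delete_track(first)
third_audio = live.add_track()   # gets id 10: ids are never reused
```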
----------
Saving the Set without reopening it does not reset the current highest assigned id.
That has to do with the undo functions.
-------
These are decisions made by Ableton from the very beginning, and I guess they will never change,
no matter how inconvenient they are - same as the inability to report real time correctly
no matter what time or tempo changes the song contains, etc.
I had to laugh seeing a post a few days ago from someone on the Ableton developer team asking
"how can we make Live better?"
There is a looong list of requests from users, going back many years, dealing with basic
functions which are a must in any serious DAW, and which remain unanswered...
........................
If you don't want to play this poker game with track slots, names, ids etc.,
your best bet is to create static sets and program
custom control for them.
If you always have tracks named guitar, piano and so on,
keep them always in the same track slots, and take care to add them in the proper order,
because that matters when tracks get their ids assigned.
But wait - Live is NOT capable of creating an empty set without any tracks,
so that one could start in the proper order!?!?!?
And they ask how to make Live better?
--------
Enough jokes - what would you do?
To remain flexible, also in terms of the display on the touchOSC screen,
you need to scan and report all tracks and their state changes, at least.
And that has to be based on ids.
So how do you report the arm state of a track named Kalimba, added
at some point?
Does touchOSC have a fixed number of comments or buttons
which get renamed when the track in a track slot changes?
And if the track Guitar, which was in slot 5, is armed, and you move it to track slot 6,
you have to report both slots to touchOSC, or even ALL slots.
That will add a lot of OSC messages.
P.S. Here is a simple device that lets you follow track names and ids
when tracks get created, deleted or reseated into a different track slot.
Select a track name in the menu to check its id.
You can drop it into any track: midi, audio or master.
In case you want to make that flexible control,
you will need some parts of it...
Thank you again.
I'm having a look at your TR-ID-SHOW device to understand how things work, and if you don't mind I'll check my understanding with you step by step.
1 - First of all, let's make clear (for me) that when we deal with tracks (children of live_set) we are dealing with a list.
The LOM shows this as the way to get to a given track in the list:
Canonical path: live_set tracks N
2 - Although I couldn't find an explanation of what that N stands for, looking at the code I have dealt with, I assumed that the items in the list are enumerated by their position in the set, from left to right, minus 1 (0-based counting).
Understanding that instantly made it clear to me why you stress so much the point of keeping tracks in fixed positions:
since track numbers are linked directly to track positions, if you change the position, the link with a hard-coded number breaks.
Is all that correct so far?
3 - On the other hand, in code, to refer to a given track you have to give the ID that Live assigned to the track when it was created. ID numbers (or addresses) are just coding handles, and unfortunately their values are not predictable, as they change when you delete and recreate a track, even in the same position, so there is no usable link between track number (position - 1) and track ID. But to get a track ID you have to know the track number...
4 - So, let's bring in track names.
I think the first thing everybody does when building a Live set is to assign meaningful names to the tracks. And between different sets, most of the tracks probably have the same names - at least for me it's like that: drums, guitar, voice, etc. That said, I think the easiest way for a user to select a track, whether to perform actions on it or to get feedback from it, is to use its name. And that becomes a must when you have to do it away from the PC and the mouse. Luckily, Live makes the name of a track, as shown in its header, available to the programming environment as a property of the track. Unfortunately, to get the name of a track you need its ID, and to get its ID you need its number: not ideal for flexibility...
In the end, I think the only way to be free from the fixed positioning of the tracks is to build a table with at least three columns - track number, track ID and track name - and then create a sort of Excel vlookup, maybe also making it a reusable patch, with the name as input and the ID and number as two different outputs.
I think this is not too far from what you have done in your patch - building linked name and ID lists in the umenus - but with my near-null knowledge I can't see how to create a table, and even less how to look it up by name to get back the ID and/or the number. I don't even know if this is possible...
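To make concrete what I have in mind, here is a Python sketch of the lookup (all names and numbers invented; I just don't know how to do the equivalent in Max, where I suppose [coll] or [dict] would play this role):

```python
# One record per track; in a M4L device this table would be filled by
# scanning live_set tracks (names and numbers here are invented).
tracks = {
    "Drums":  {"number": 0, "id": 4},
    "Guitar": {"number": 1, "id": 5},
    "Voice":  {"number": 2, "id": 10},
}

def lookup(name):
    """The 'vlookup': track name in, (track number, Live id) out,
    or None when no track has that name."""
    rec = tracks.get(name)
    return (rec["number"], rec["id"]) if rec else None
```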
Hope all this makes sense.
Thank you.
f.
Thanks to your work, maybe we need no table and no lookup...
After I managed to understand a little better how it works, I imported your code into my "Arm 8 tracks" to do some testing, and I noticed that when you select an item in the list of a umenu, the left outlet sends out the position of the item in the list (zero-based) and, because of the way the list is built, that is directly the track number we need to send to trigger our actions. So we can send that track number to a [path live_set tracks $1] message to get the ID of the track for the rest of the code, whether to apply actions or to get property values.
I also saw that you can use the umenu as a toggle, but after a little test I reverted to the standard behaviour, for now.
Another useful thing I saw is that after copying the umenu box several times, each copy works independently of the others.
For now I haven't managed to make the umenus load and display the preset items: they show the first item even though they act on the previously selected track.
I didn't align the wires for now because, in this mess, I think they are easier to follow like that.
Thank you.
f.
I am out for most of the day, so just a short reply:
building that table with track names, positions and ids is no problem at all,
but it is more of a piece of information to look at than a real need.
The only problem I see is that issuing names for ALL
tracks is not an easy thing, at least not the way I understand it,
because one would need to place a device in each track to auto-detect
when it gets renamed etc.
Otherwise one needs to rescan with a central device from time to time to keep up to date.
You ask live.object or an observer to report a track by its number,
which includes the id in the first place; the id is then used to get the name
and all other information like the arm, mute etc. states.
A list of 16 ids can be used to scan 16 tracks for their arm state,
like in the previous example I posted, but that has to be executed either
at a specific interval using metro, or triggered by input from
touchosc, when one expects that something has changed in the set.
I forgot to say that your explanations of tracks, ids etc. are all correct; I am sure you are making it through the LOM jungle, first of all by creating your own devices and test patches.
Another thought - if you have a table which keeps the current track names and their ids, you can get a track id by calling it from touchosc, for example
/guitar/arm 1
That string is split into:
guitar - selects the track from the table by name to get its id,
then pack the id with an "arm 1" message to arm it.
That could avoid the need for that extra midi going to ClyphX, or whatever that scripter is named.
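The splitting and lookup is trivial in ordinary code; a Python sketch of the idea (the table content is invented, and in a patch this would be a regexp or zl-based split feeding the table):

```python
track_ids = {"guitar": 5, "piano": 6}   # name -> Live id (invented values)

def handle_osc(address, value):
    """Split an address like '/guitar/arm' plus a value of 1 into a
    (track_id, property, value) action, or None for unknown tracks."""
    _, name, prop = address.split("/")  # '', 'guitar', 'arm'
    track_id = track_ids.get(name)
    return None if track_id is None else (track_id, prop, value)
```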
I am sure you are making it through the LOM jungle
I know who to thank for that...
So, thank you again for your patience, your attitude and your time.
Even if you'll be notified that I replied to this thread, please ignore it for as long as you like and reply in the next days, or whenever.
No hurry.
Enjoy your time.
I should do the same today, as it should rain this evening, hopefully.
Northern Italy hasn't seen such a hot and dry period for a couple of centuries.
Have a nice day.
fabio.
Hi, during the weekend, to check what I've learned after almost a week of your patient teaching, I did my homework...
And after some frustrations and some eureka moments, and with the help of some snippets of yours, I have almost completed the goals I set myself for now: building a remote control with TouchOSC on an Android tablet, with OSC messages as the medium, to get the best feedback possible from Live.
As I got a LOT of help, the least I can do is spend some time explaining in detail the path I followed, because I think it could be useful for other beginners in Max programming like me, and hopefully it could also save some of the experts' time, as they could point to these explanations when the same basic questions arise for the hundredth time in the forum.
Here's the list of all the tasks I set out to work on:
- getting and setting the track states - for now just MUTE and ARM - regardless of names and/or positions in the set;
- getting and setting the song states: playing, stopped, recording;
- getting and setting the loop state: active, inactive;
- getting and setting the time value (mm:ss) of loop cue points: loop start, loop end, loop length;
- getting the time value (mm:ss) of the general cue points: playback start, last stop, cursor position;
- doing all that with flexibility and modularity in mind.
I created two audio devices: one for the track states (let's call it TS), and the other for the time values and the song states (let's call this TV).
TS (Track States).
I copied your code here to get the lists of tracks and return tracks, and I created an abstraction module to be used for each track: 20 tracks, 3 return tracks and the master.
Each patcher has four inlets:
- toggle arm state (0,1) - Here I connected a clickable and mappable ARM button (live.text)
- track number (position minus 1) (not the ID)
This is taken from a umenu populated by your code (one for each track). Linked to the umenu there is a number box that I use to set a default value, so I don't end up having to choose the track names manually each time I load the device into a set. The default numbers go from 0 to 19 for the tracks and from 0 to 2 for the return tracks.
- external trigger - bangs when I need to read the properties and there have been no changes,
- toggle mute state (0,1) - Here I connected a clickable and mappable MUTE button (live.text)
And six outlets:
- arm state (0,1) - This is linked to the inlet of the ARM button, to change its appearance
- arm state OSC address (track name + ARM + 0 or 1). The string is built automatically from the track name, with spaces replaced by underscores, since OSC addresses don't allow spaces.
- track position as shown by Live (1 based counting)
- track name as shown by Live - Here I had a hard time figuring out how to replace the spaces with underscores. I tried with regexp and it worked, but only if a space existed; otherwise it broke. Unfortunately, Max has a funny way of dealing with strings containing spaces, so I couldn't find a direct way.
Live.observer outputs [Guitar] for the name Guitar, but ["Piano lows"] (with the quotes included) for the name Piano lows.
So I ended up having to do some string parsing, measuring and converting to get the plain name of a track.
- mute state (0,1) - This is linked to the inlet of the MUTE button, to change its appearance
- mute state OSC address (track name + MUTE + 0 or 1, as before)
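The address-building logic described above (strip the quotes live.observer adds around multi-word names, replace spaces with underscores, append the parameter name) can be sketched outside Max. A minimal hypothetical Python equivalent of that logic, not the patcher itself:

```python
import re

def osc_address(track_name: str, param: str, value: int) -> str:
    """Build an OSC message like '/Piano_lows_MUTE 1' from a track name.

    live.observer wraps multi-word names in quotes ("Piano lows"),
    so strip any surrounding quotes first, then replace runs of
    whitespace with underscores (OSC addresses must not contain spaces).
    Works whether or not the name contains a space.
    """
    name = track_name.strip('"')
    name = re.sub(r"\s+", "_", name)
    return f"/{name}_{param} {value}"

print(osc_address("Guitar", "ARM", 1))         # /Guitar_ARM 1
print(osc_address('"Piano lows"', "MUTE", 0))  # /Piano_lows_MUTE 0
```

This is roughly what the regexp-plus-sprintf chain discussed later in the thread does inside the device.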
Inside the module I use the live.observers to get the values I need, and live.object to set them.
In this way I managed to keep the patcher sparse and readable, even though I put 24 modules there, each with its two buttons, a umenu, a number box and a trigger with three outlets, which I set up with message ordering in mind to be sure the values are read in the right order.
For each group of eight tracks I added a "trigger distributor" with eight outlets, to help keep the board clean.
TV (Time Values and general states: play, stop, rec, mute).
Here I created an abstraction (though I didn't really need to...) with many outlets:
- recording state (0,1)
- current time
- Offset before song start - In all my sets there is a Start locator that marks the actual beginning of the song, usually 12 to 20 seconds after 1.1.1. If you have marked the lyrics sheet of a song with some time references here and there while listening to it in a player, the offset lets the display always show 00:00 where you placed the Start locator.
- loop start - With the new auto comping and with loop recording, it becomes fundamental to get the loop bar the right size, move it to the right place, and know where it starts and where it ends.
- loop length
- loop end - calculated as loop_start + loop_length
- loop state (0,1)
- playing state
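The mm:ss values these outlets carry come down to simple arithmetic: a position in beats divided by the tempo gives seconds, and loop end is just start plus length. A rough Python sketch of that calculation, with made-up beat and tempo values for illustration (and assuming a fixed tempo, since Live reportedly can't give correct real time when tempo changes are involved):

```python
def beats_to_mmss(beats: float, bpm: float) -> str:
    """Convert a position in beats to a mm:ss string at a fixed tempo."""
    seconds = beats * 60.0 / bpm
    return f"{int(seconds // 60):02d}:{int(seconds % 60):02d}"

# loop end is just loop start plus loop length (both in beats)
loop_start, loop_length, bpm = 16.0, 32.0, 120.0
loop_end = loop_start + loop_length
print(beats_to_mmss(loop_start, bpm))  # 00:08
print(beats_to_mmss(loop_end, bpm))    # 00:24
```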
Here I copied directly the content of a patcher of yours... Thank you.
Everything seems to work nicely, but I still have to learn about initialisation techniques, because sometimes not everything gets set up when I load a patcher into Live.
Here it is. This is the so-called TS:
And here's the TV:
In the end, the only restriction left with this approach is that when you add or rename a track you have to stick to a "custom thesaurus" of pre-chosen names; otherwise you'll have to change the OSC addresses (messages) in your TouchOSC layout to keep the feedback from breaking.
But the freedom to think of my tracks directly with their names, and to move them around, can make life a little easier. To me, at least.
When you have a chance, please check whether I have done something wrong, or whether it could be done more easily.
And when you have some time, please tell me the important points about patcher initialisation.
Thank you.
fabio.
PS.
In the code inside the abstraction I made for the TS, I see something strange happening: after selecting a different track in the umenu, the ARM branch sends the OSC address with the name of the newly selected track, while the MUTE branch sends the message with the name of the track the umenu showed before the change.
Hovering with the mouse over the "prepend" box that goes to outlet 2, I could see the same new name on both the wire going into the inlet and the one coming from the outlet; but over the "prepend" box that goes to outlet 6, I could see the new name before the box and the old one after it.
But both OSC messages are fired.
Hi Fabio,
great that you made such good progress.
I will try to understand your 2 devices as soon as I get time.
Having a short look at the Track States device, I have a feeling that some
things run inefficiently, at least from a Max point of view.
In addition, all the rules we use to make flow efficient in Max change somewhat
when it runs in Live, because many objects behave a bit differently, and
less efficiently, when run in Live devices.
Also, you can't store and seal a device with leftovers,
like track names populated from some set, and then reinitialise the whole thing on each start ....
Then using a umenu for each track, but selecting the track by a number box, which hasn't even been limited to the track numbers ...
you should remove unused objects, for example this

all it needs is this

It seems like no big deal, but if you overpopulate functions like that 100 times in each device,
then it counts, which would explain your problem with sending the OSC strings
of both the previous and the newly selected track.
Asking live objects and observers to check things many times is inefficient.
If you have the track number and id defined in one track slot,
then you don't ask for its id again; it is already known.
-----
Another thing is the storage of the track menus - the track selections in your device.
Do you want to store the selected tracks as part of the live set?
How would that work in terms of timing?
You first need to populate all the menus, then recall the selections...
and so on ...
Will be back with a better overview when I get time.
P.S.
Using that complicated strlen, strstr etc. chain causes the
OSC message problems.
If you want to replace spaces with underscores,
then use
regexp " " @substitute _

If you want to limit the number of characters in an OSC message,
then strcut with a fixed length can be used between regexp and sprintf.
You should insert print objects to check the output of objects you are not sure about,
and remove them once everything runs as expected....
return tracks and master have no arm state
The master has no mute, and the name Master seems unchangeable?
mute 1 mutes the track; shouldn't the button state and colour match the Track Activator state?
This device serves only as a bridge between the current states and the OSC report back.
ARM and MUTE states should not be stored or automated here!
Best would be to use NON-live GUI elements, if any are needed at all!?!?
and have all automation, parameter mode mappings etc disabled.
It is the job of a track to store or automate its state in the set.
More things I need to know about Live ?
If you want to replace spaces with underscores
then use
regexp " " @substitute _
Yeah, that is the first thing I tried, but I found out that it doesn't work with strings that don't have a space in them.

The character limiting is needed because of the awkward way live.observer deals with strings with and without spaces; otherwise a simple "minus 1" object would have been enough for everybody. It takes into account the fact that, using sprintf with three operands, when the string has fewer words it gets an underscore attached at the end, hence the -1 to remove it. But that was not enough... I wasted the best part of an hour looking for and trying different things...
You should insert print objects to check the output of objects you are not sure about,
and remove them once everything runs as expected....
When I'm programming I usually attach messages and number boxes at every stage of the data path to check whether things go as I expect. Half of the time, they didn't... I find it easier this way than having to look at the console to see the values (the regexp example shows clearly what I mean).
But, apart from a bit of frustration here and there, it has been fun - and a bit rewarding - to have been able to make progress and get things working as I wanted, even if I know they are far from perfect.
Thank you.
f.
return tracks and master have no arm state
The master has no mute, and the name Master seems unchangeable?
Yes, for now I created dedicated abstractions for them (not needed at all for the master), just changing the children in the "path live_set" box. It was already late, and by that time I was happy enough seeing things work.
mute 1 mutes the track; shouldn't the button state and colour match the Track Activator state?
On my side, the "Click" button reflects the state of the activator (the square with the position number) whenever it changes, regardless of which action changed it: a click on the activator, a click on the Click button, or a MIDI message from outside Live.
If it doesn't, try hitting the little cyan button and then try again. This is part of my incomplete understanding of initialisation and data flow.
For instance, to check things while writing this reply, I opened the test live set I had closed last night (working), and the "Initial value" of the number boxes I used as a sort of default preset was not loaded: all the boxes showed zero, and consequently all the umenus showed the first item. I had to remove the device from Live and drag it in again to see the correct default population happen. But...
Clicking on the first button caused the same reaction in the 9th and 17th buttons, even though their umenus were showing different track names.
The relation I see among them is that I coded the first eight (1 to 8) first and then copied the block of code, changing the Initial value of the number boxes to load different tracks at load time. So each of those three buttons is the first of its block of 8.
Anyway, at this point I clicked on my cyan button and everything fell into place, working again as last night. That's the next thing I must tackle...
And you see? The UI as it is now lets me see instantly what is going (or not going) on.
This device serves only as a bridge between the current states and the OSC report back.
ARM and MUTE states should not be stored or automated here!
My development of this tool started from a similar device I found in the repository, which was simple enough to give me the confidence that maybe I could understand Max. For now I left the UI with umenus and live.text buttons so I can easily check what's happening. As soon as I'm sure that everything always works as expected, I will keep only the OSC messaging and strip away all the rest.
Best would be to use NON-live GUI elements, if any are needed at all!?!?
Are these all the objects whose names begin with "live."?
and have all automation, parameter mode mappings etc disabled.
Not sure I understand that: are those settings found in the patcher inspector?
Thanks again a thousand times.
f.
Connect the unmatched regexp outlet for that.
Use the right version if you want _ between the track name and the parameter.

-----
That track report device should NOT store anything.
Track states are stored in the track itself;
if you blindly insert buttons and other elements which are set to be stored, automated etc. by default,
you are storing and recalling their states in the live set.
Why should one do that?
Then - storing umenus ???
Live has to scan the set to get the names of all the tracks and populate the umenus with them,
and at the same time recall what was stored in each menu?
Do you even know what gets stored in that case?
The selected menu item number, or the item symbol - the track name?
No wonder you have problems with it.
Once you decide to use fixed names, you are forced to use them in both Live and TouchOSC.
It makes no sense to send the states of a track named whatever, if TouchOSC
does not have anything that receives messages bound to that name.
You missed the point in my example - linking track names with ids
in order to scan ALL ids every time a report to TouchOSC is needed.
And that is whenever an automation control arrives which changes any of the tracks.
And all that using a single live.object or observer.
There is nothing that needs to be visible or manually controlled.
Here is an example of what I suggested.
It builds a track id - name coll and updates it whenever tracks move, get deleted or added.
When you click on "dump", all tracks report their arm and mute states,
and after that the return tracks report their mute states.
No matter how many tracks.
The idea is that you add a trigger to fire the dump
whenever control messages to arm or mute any track are received from the scripter.
The OSC message gets formed using the track name with spaces replaced by underscores,
and an underscore gets added before ARM or MUTE.
example: Hot Guitar
/Hot_Guitar_ARM 1
Then - storing umenus ???
Live has to scan the set to get the names of all the tracks and populate the umenus with them,
and at the same time recall what was stored in each menu?
After your last slap on the wrist I saw that in fact the umenus etc. were only necessary if I wanted to keep the UI interface. As I don't, I stripped everything down, ending up with an abstracted module, a constant number and a trigger for each track, wiping out the rest. (I haven't figured out exactly how triggering works yet. And to add to the confusion (for me), I suspect there are differences between running in the editor and running in Live. Monitoring the OSC messages with Protokol, I can see that multiple copies of each track state are sent when loading or when dumping. Sometimes just the bang to the number box was enough to make things work - and the OSC messages were sent just once - and sometimes nothing worked without banging the abstraction directly too.)

There is nothing that needs to be visible or manually controlled.
The only thing left in presentation is a dump button...
I opened your last patcher and yeah, that is how one should do it, of course. But I can see that it is beyond my knowledge, so I could never have come up with something like that.
The idea is that you add a trigger to fire the dump whenever control messages to arm or mute any track are received from the scripter.
Not sure how I can do that. Since they're all boolean values, maybe somewhere there is a single binary number combining them all... it would be easy to monitor a single change. Or maybe there is already some function that can do that.
Until now I monitored things track by track, but I don't think you have something like that in mind.
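The "single binary number" idea is workable: pack the boolean states into one integer bitmask and only react when the packed value differs from the previous one (which is essentially what Max's zl change does). A hypothetical Python sketch of that logic:

```python
class ChangeGate:
    """Pass a value through only when it differs from the previous one
    (the 'zl change' idea)."""
    def __init__(self):
        self._last = None

    def push(self, value):
        if value != self._last:
            self._last = value
            return value   # changed: let it through
        return None        # unchanged: suppress

def pack_states(bools):
    """Combine a list of 0/1 states into a single bitmask integer."""
    mask = 0
    for i, b in enumerate(bools):
        mask |= (b & 1) << i
    return mask

gate = ChangeGate()
print(gate.push(pack_states([1, 0, 1])))  # 5 (first value always passes)
print(gate.push(pack_states([1, 0, 1])))  # None (no change, suppressed)
print(gate.push(pack_states([1, 1, 1])))  # 7 (changed, fires again)
```

In the patch this would sit between the combined state and the dump trigger, so the dump only fires on actual changes.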
Thank you again.
f.
Looking around I found a couple of snippets that seem to work. They are probably not complete, and a bit messy, but after making an abstraction with them and putting it into your Track-States patcher, it works: in Protokol I can see the dump every time an arm or mute button changes, even on the returns.
Here it is:
The only drawback for me is that troubleshooting any problem with missing OSC feedback will be a bit more complicated than before: instead of seeing only the last few messages, after each clear I will always have to look for the messages I'm working on in a screen full of all the messages, and with my eyesight the task will be a little harder. Let's see how it goes...
What would be the best approach to map the dump button to a button in TouchOSC, to be able to manually sync the tablet and Live?
It can transmit MIDI and OSC.
Today I tried to do that with two mappable objects: a live.button and a live.text used as a button, sending a MIDI CC 106 on channel 16 (not mapped elsewhere), but I got some strange behaviour. The TouchOSC button stayed lit up, and so did the live. buttons on the presentation screen of the Max devices: MIDI looping? If yes, why? Otherwise, what else? I double-checked, and there is no OSC feedback active on those live. buttons, and I was not using toggle behaviour.
Thank you.
f.
I thought that you send arm and other messages from TouchOSC as MIDI.
All that's needed is to monitor those MIDI messages and fire the dump button.
I would add a short delay, like 100 - 200 ms, to avoid retriggering,
in case you send messages too fast from the tablet.
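The suggested retrigger guard (a short delay so a burst of incoming messages fires the dump only once) can be sketched like this; a hypothetical Python equivalent of putting a delay/onebang in front of the dump button, with an illustrative 200 ms interval:

```python
import time

class Debounce:
    """Fire at most once per `interval` seconds, swallowing retriggers."""
    def __init__(self, interval: float = 0.2):
        self.interval = interval
        self._last_fired = float("-inf")  # so the first trigger always fires

    def trigger(self) -> bool:
        now = time.monotonic()
        if now - self._last_fired >= self.interval:
            self._last_fired = now
            return True   # fire the dump
        return False      # too soon: ignore the retrigger

deb = Debounce(0.2)
print(deb.trigger())  # True  (first trigger fires)
print(deb.trigger())  # False (immediate retrigger swallowed)
```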
Adding a manual sync button is a simple thing.
Use a momentary button with the OSC string /sync, set to send only on press.
In the track device, add a udpreceive with the proper port and listen for the message
/sync 1

Do not use any live buttons - what for?
You'll only get into trouble using them...
-------------
The device you posted is missing that abstraction.

Unless you really need to use abstractions, rather use patchers.
You can easily encapsulate multiple objects into a subpatcher by selecting them
and choosing Edit -> Encapsulate.
Abstractions make sense only if one uses many instances of them, and
one knows how to deal with them.
-----------
There is no troubleshooting needed; it either works or not.
But I would verify on your side that in a set with many tracks all states get sent
and received properly.
Dumping all messages is very fast.
In case some get lost, one could slow down the output from the colls.

Not sure what's going on here...
I added the udpreceive snippet as you indicated, plus a button on the outlet of del to see the bang visually, and I do see it.
The program doesn't stop at 1, nor at 3, nor at 2, and I'm getting a different dump result (wrong) than when I click on the local dump button (correct) or on my added button (correct).
Local dump obtained by clicking on the dump button or on my added button:

Dump obtained through the added udpreceive chain:

I thought it was a simple matter of adding a bang, but it's not...
I feel like I'm in a maze... You think you're on the way out, but it was another blind alley...
Thank you.
f.
I have no idea why that dump message should get repeated again and again.
First, disconnect the bang from the dump message and capture
the OSC input.
Maybe you have some OSC feedback or whatever ...
The screenshot shows that the dump gets retriggered very fast, so it can't get past the first few tracks' reports.
I also have no idea what your setup is - no network details, no idea about those cliphx scripts,
or any other thing that might cause that behaviour.
And honestly, I have no time to troubleshoot that.
You could insert onebang 1
and check if the OSC sync message gets repeated.
