searching for a device displaying parameter value (& name)

jo_leborgne's icon

Hi there,

I'm searching for an M4L device that pops up a window displaying a parameter's value (and name) each time I turn a knob,
just for a few seconds and then auto-closes.

ex : " Filter Frequency 2.46 kHz "

I'm able to make it appear in the status bar with a Python script, but that's too small. I want a bigger window, so that when I'm at my controller I can do fine parameter adjustments.
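(for reference, a minimal sketch of the kind of call that does this, assuming a standard _Framework ControlSurface script; show_parameter_in_status_bar is just a made-up name and p stands for whatever parameter the encoder is mapped to)

# minimal sketch: inside a ControlSurface subclass, show_message()
# writes a short string to Live's status bar
def show_parameter_in_status_bar(self, p):
    # p is a Live DeviceParameter; str(p) gives its display value
    self.show_message(str(p.name) + '   ' + str(p))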

Does it exist? Or can someone help me program it?

jo_leborgne's icon

I've made an attempt with the grab_midi function (and copy & paste from others' patches).

show midi parameter.amxd
application/octet-stream 74.77 KB


It looks like what I need, but there are some problems:
I don't need the forwarding of buttons, only encoders, and it doesn't work for all of them (the ones that are connect_to in the script, for example to volumes).
So I think the best would be to send the messages (strings) from the Python script to the Max device.
Does somebody know how to do that?

Patrick_K's icon

display_paramter_name_value_demo_0.1.amxd
amxd 1.66 MB

Here is an M4L device I put together last night in Max 8.1.9.

This is in response to both this thread and the thread on Ableton's forum.

This is very rough and will need some commenting and polishing, but I want to make sure it meets the essential requirements and that (with a revolting 128 concurrent live.observers) it doesn't cause any performance drag on your system.

EDIT: I updated the attachment to include caveats re: current limitations:

- limited to devices on same track as .amxd
- limited to the parameters of the currently selected device
- popup window blocks mouse-based adjustment of device parameters, so this is a control-surface-only demo

jo_leborgne's icon

show moving parameter.amxd
application/octet-stream 45.96 KB

Patrick_K's icon

Is it possible for you to post the remote script you're using with SHOW MOVING PARAMETER so others can see how it works?

I haven't dealt with remote scripts in well over a decade, so I don't recall whether it's possible to make one that's generalized for almost any controller, but seeing how the script forwards values to this .amxd would be helpful for anyone who comes across this thread in the future.

jo_leborgne's icon

It's a remote script I've made for a BCR2000 that handles volume, sends, and devices,
and there is already a call in it (on start and on every change of device) that lists a lot of devices, so I've added the detection of the Max device to this call:

def set_instrument(self, t, i):
    # working lists used further down in the real script (trimmed here)
    chd = []
    trg = []
    o = []
    a = []
    f = []
    e = []
    x = []
    l = []

    def get_dev_list(devs_list):
        # recursively flatten the device list, descending into racks/chains
        r = []
        for dev in devs_list:
            r.append(dev)
            if dev.can_have_chains:
                for c in dev.chains:
                    r.extend(get_dev_list(c.devices))
        return r

    if t in self.song().tracks:
        for d in get_dev_list(t.devices):
            b = parameter_banks(d)
            # detect the M4L display device by name and keep a reference
            # to its 'dial' parameter
            if 'show' in d.name:
                for y in d.parameters:
                    if 'dial' in y.name:
                        self.show_parameter = y

self.show_parameter at the end is the dial of the Max device
(it's maybe here that I could find another parameter that receives text).

And then, as there was already a listener on the encoders, I've added the sending trick to the Max device's dial:

@subject_slot_group('value')
def on_econtrols_value_changed(self, *a, **k):
    for x in a:
        if x in self.econtrols:
            # for a ButtonMatrixElement, pick out the actual control from
            # the row/column indices passed along with the event
            o = x._orig_buttons[a[2]][a[1]] if isinstance(x, ButtonMatrixElement) else x
            if o.mapped_parameter():
                p = o.mapped_parameter()
                # self.show_message('            ' + str(p.name) + '   ' + str(p) + '   ')
                if p and p.is_enabled and self.show_parameter:
                    # hand the update to the control surface's task queue
                    self._tasks.add(Task.run(lambda: self.show_parameter_value(p)))

def show_parameter_value(self, p):
    # rescale the parameter onto the 0..127 range of the M4L dial
    self.show_parameter.value = ((p.value - (p.min if not p.is_quantized and p.min < 0 else 0)) / float(p.max - p.min)) * 127

Just setting self.show_parameter.value is enough to change the dial value.
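To make the scaling explicit, here is the same normalization pulled out into a standalone sketch (to_dial_range is a made-up name; min, max, value and is_quantized are the standard DeviceParameter properties already used above):

def to_dial_range(p):
    # map a Live DeviceParameter onto the 0..127 range of the M4L live.dial
    # (same formula as show_parameter_value above)
    offset = p.min if (not p.is_quantized and p.min < 0) else 0
    return ((p.value - offset) / float(p.max - p.min)) * 127

# example: a bipolar parameter with min -1.0, max 1.0 sitting at 0.0
# gives (0.0 - (-1.0)) / 2.0 * 127 = 63.5, i.e. the dial's mid position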

And there were also these two lines elsewhere in the script
(to set the listener on the encoders):

self.econtrols = [self._vol_enc, self._dev_enc, self._sse_enc, self._svo_enc, self._sld_par_enc]

self.on_econtrols_value_changed.replace_subjects(self.econtrols)
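(for readers who don't use the framework decorators, the same wiring could probably be done with plain value listeners on each control element; a rough, untested sketch, where _hook_up_encoders and _on_any_encoder are made-up names and the elements are assumed to be simple encoders rather than button matrices)

def _hook_up_encoders(self):
    # assumption: _Framework control elements expose add_value_listener();
    # with identify_sender=True the callback also receives the control that moved
    for enc in self.econtrols:
        enc.add_value_listener(self._on_any_encoder, identify_sender=True)

def _on_any_encoder(self, value, sender):
    p = sender.mapped_parameter()
    if p and p.is_enabled and self.show_parameter:
        self.show_parameter_value(p)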

Patrick_K's icon

Thanks for adding this. Clearly, you know your way around scripts and the more expansive non-public API. It's frustrating that the more limited public API exposed through Max seems to be making this so complicated... the public API seems to prioritize a per-device approach to accessing these parameters that is very much in line with how Ableton Push works, but not yet a great fit for those who need to reach across all devices + mixer on all tracks at once to get names/parameters.

I hope someone out there knows how to capture the parameter name & value info that you're trying to forward to a live.dial within Max.

A very crude suggestion in the meantime: if you already have a script that shows exactly the info you want on-screen, just not large enough, have you considered using a third-party magnification-window app to make the info you're already showing large enough?

If you don't intend to distribute this solution publicly, why not just magnify the portion of the screen that's too small so you can put this need to bed and get back to making music?

It seems there are a few different solutions out there for both Windows and OS X.

jo_leborgne's icon

Thanks for the suggestion, but I'd prefer that the floating window pop up only when I turn an encoder.

In fact there is only one unofficial API (used by Push: the LOM); it's just that I've learned, with Python, how to make lists of devices and parameters and then how to navigate through them. It's like recreating an interface of Live, a custom one, on your control surface.

Patrick_K's icon

I hear you.

Re: the API, it is my understanding that:

1. The public/supported API is all that can be addressed via the live.path, live.object, live.observer, live.remote~, and (for JavaScript) LiveAPI objects in Max. The LOM document contains a complete list of all Live API properties and functions addressable via Max for Live. Anything not in the LOM document is not addressable via Max for Live.

2. The public/supported API is only a subset of the full (but not officially supported) Ableton Live API, which is addressable via Python. Ableton Live is one application with just one API, but only a portion of that API is officially supported and directly addressable via Max for Live.

Is my understanding correct?

jo_leborgne's icon

I'm not sure I understand, but I think everything is in the LOM; I use only that for my Python script.

Patrick_K's icon

Interesting.

There are things in the unofficial programmatically-generated API documentation (as seen on various 3rd party sites) that are not in the LOM documentation, such as the function "get_all_scales_ordered", but the biggest difference I've seen thus far is what your script seems to highlight: listening to changes in all devices across the entire Live Set seems fairly clean/straightforward in Python, whereas doing the same in Max would require procedurally-spinning up complex and potentially massive networks of live.observer modules per-device. It seems the performance hit from having this many live.observer modules might be a non-issue, but implementing a means to do this with potentially thousands of little graphical representations of live.object on-screen is just a mess... regardless of whether they are procedurally-generated and procedurally-removed or not.

Anyway, your solution of using Python to manage the device listening and just send the current modulation(s) to a live.dial in Max is nice and clean, provided someone has means to deliver a Python-derived parameter name and value into Max.

jo_leborgne's icon

I suppose that functions such as "get_all_scales_ordered" are just new and not yet added to the LOM documentation, idk.

For the Python script, I've put the listeners on the encoders, not on all the parameters. I also suppose it's more efficient, as there are fewer encoders (32) than parameters to handle. (There are also listeners on the parameters connected to the encoders, but these listeners move to other parameters when the track/device selection changes.)

The script lists all the devices just to detect and collect the ones I need to handle. There is a listener only on the API's devices list for the selected track, which is triggered only when a device is added or deleted, and then resets/recreates this custom devices list (but there is no listener on those devices).
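(in case someone wants to replicate just that part: the devices list of a track is listenable through the standard LOM, so a sketch of such a listener could look like this; _watch_track, _on_devices_changed and _refresh_device_list are made-up names)

def _watch_track(self, track):
    # fires only when a device is added to or removed from the track,
    # not when parameter values move
    if not track.devices_has_listener(self._on_devices_changed):
        track.add_devices_listener(self._on_devices_changed)

def _on_devices_changed(self):
    # rebuild the custom flattened device list (see get_dev_list above)
    self._refresh_device_list()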

The way of listening to the encoders list seems clean maybe because @subject_slot_group is a tool from the framework, provided by Ableton. In Python you can package everything you use several times as sub-scripts (tools) to be more efficient in your way of scripting; idk if that's possible in Max.


tyler mazaika's icon

"whereas doing the same in Max would require procedurally-spinning up complex and potentially massive networks of live.observer modules per-device. It seems the performance hit from having this many live.observer modules might be a non-issue, but implementing a means to do this with potentially thousands of little graphical representations of live.object on-screen is just a mess... regardless of whether they are procedurally-generated and procedurally-removed or not."

FWIW this would actually be doable using JS for the observer lifetime management and jsui to render control values to the screen (it would be a catastrophe in Max code...). The thing you'd want is to just have _one_ rendering function call which takes x,y origin coordinates for where in the jsui canvas to draw the controller value / name etc. I did something similar for a 'mod matrix' (like the one in Live's Wavetable), and it stores a single image of the last state of all the controls on redraw, so that when another control's value changes it redraws fairly efficiently --- even when there were 1000+ squares rendered.