Using control_surface grab_control is not putting MIDI out where I expect it (Push2)
Hi folks, I’m attempting to override a subset of Push2’s default controls from within a Max for Live device, with minimal success. The documentation for ‘grab_control’ states: “This releases all standard functionality of the control, so that it can be used exclusively via Max for Live.” (https://docs.cycling74.com/max8/vignettes/live_object_model?q=LOM)
But when I grab_control anything, such as Push’s control named Global_Mute_Button, I don’t get any control output within my Max for Live device. Instead, the corresponding MIDI CC/Note message goes to the sequencer inputs (e.g. the “Ableton Push 2 (Live Port)” input; MIDI CC 60 in the case of the Global_Mute_Button). That hardly seems “used exclusively via Max for Live”, and it’s no help for an Audio-type device (which can’t receive that MIDI info).
When I enable “grab_midi” on Push2, I get MIDI output directly from the [live.object] in my device. Output like that from within my device is more or less what I’m expecting to be possible when I use grab_control. (The grab_midi output itself is problematic for me because it doesn’t disable the default functionality of the Global_Mute_Button. And if I grab_control the Global_Mute_Button, then grab_midi stops reporting the MIDI changes corresponding to that control.)
Am I misunderstanding how this is supposed to be able to work?
Thanks!
-Tyler
Just some additional info:
I also tried setting up a ‘MaxForLive’ control surface and did a register_midi_control and grab_control etc., but never achieved useful output.
In Live’s preferences > Link/MIDI, I have both Push2 and ‘MaxforLive’ control surfaces using “Ableton Push 2 (Live Port)” for their Inputs / Outputs.
I know it’s possible with some external Max objects to get MIDI directly from certain ports into my M4L device, but that’s not ideal: it wouldn’t prevent other MIDI tracks from receiving undesirable MIDI CC/Note messages if they take their input from “Ableton Push 2 (Live Port)” or “All Ins”.
Attached is a sample project (and the corresponding .amxd build) that reproduces the problem for me. Per the Facebook user group it _should_ work the way I expect; I'm probably just setting it up incorrectly.
When I drag this device onto a track, the 'Mute', 'Solo' and 'Stop Clip' buttons are grabbed, but when I press those buttons I don't get any output within my device (which I'm expecting to see printed).
Thank you, I appreciate it! So to summarize:
1. grab_control to takeover the control you want
2. get_control to get the id of the same control
3. Use that id to assign a [live.observer] to listen to this control (on the property 'value')
(In truth I had also tried testing with an observer, but hadn't set the property correctly…).
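For reference, here are those three steps as a message sequence (a sketch only; it assumes Push2 is the first control surface, i.e. path "control_surfaces 0", and uses the Global_Mute_Button from earlier in the thread):

```
--- to a [live.object] with path "control_surfaces 0" ---
call grab_control Global_Mute_Button    <- Push releases the button's default behavior
call get_control Global_Mute_Button     <- outputs: control id <N>

--- route that id into a [live.observer] ---
set id <N>        <- point the observer at the grabbed control
property value    <- the observer now reports the button's value on each press/release
```

To hand the button back to Push afterwards, send "call release_control Global_Mute_Button" to the same [live.object].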
Perhaps I'd update the documentation for 'grab_control' to read:
grab_control
Parameter: control
Take ownership of the control. This releases all standard functionality of the control, so that it can be used exclusively via Max for Live.
The control's output can be obtained by observing its 'value' property.
@SKEWBORG:
Thanks for this file, works great.
One question though: how would I address the grabbed element from within Live?
Let's say I want to light up the grabbed Mute button; where would I send that command to?
Thanks!
Send to the control_surface live.object:
call send_midi [ 176 (CC) | 144 (note) ] [ control/pitch number ] [ value ]
The Mute button is CC 60, so "call send_midi 176 60 127" will turn it on to bright red (I think), with different values changing the color.
Here's a rather sloppy diagram of the assignment mappings I pieced together a while back. Green is for MIDI notes, and blue is for CCs. If anything here is wrong you can reverse-engineer it yourself by getting the output from the control_surface while grab_midi is enabled.
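Concretely, both flavors go to the same control_surface [live.object] (a sketch; the Mute CC is from the post above, while the pad note number is hypothetical and should be checked against the diagram):

```
call send_midi 176 60 127   <- CC-type control (Mute button): light it up
call send_midi 176 60 0     <- and back off
call send_midi 144 36 10    <- note-type control (a pad, if it's note 36): the
                               "velocity" byte selects the color index
```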

Hi Tyler, thanks for your effort, but this doesn't make sense to me. We can access all control elements through the API, and if we do it with the script Skewborg posted, it would work both ways without having to go into User Mode, where you don't have a display and so on.
The only problem here is that I have a hard time finding out how to access an element from Live, like lighting up a button, based on his code.
The MIDI CC / note addresses from the diagram above are the same whether you are in User Mode or not, and you can change button colors with the same messages. AFAIK there is no API-based way to set colors.
I think there is. Here's a fully working example: it checks for all available controls and makes them accessible from Live to Push and vice versa, without having to go into User Mode.
The reason why I am asking for Skewborg's help is that his approach is much more lightweight than what I am posting here.
I can't credit anyone because I don't remember where I got it from. Have fun.
Thanks Tyler for all your useful information. I am able to repeat what you have suggested: note buttons pressed on the Push2 appear in Max for Live, and I can set the colours by sending it a MIDI message. The problem I am having is that only the "loud" notes played on the Push2 find their way through to Max for Live. Notes that usually go through fine when Live has control are totally ignored when Max is in control; only velocities from about 50 upwards get any attention. It's as if the default sensitivity controls are being ignored. I wouldn't have thought I'd need to override the sensitivity using sysex commands to the Push2. Anyone else have the same problem, or found workarounds? My rough notes are in the screen grab below.

I have done some more testing and have narrowed down the problem:
The translation from finger velocity (the physical interaction between the human finger and the Push2) to the velocity that appears in Max for Live depends on two things:
1. The Push2 "Pad sensitivity", "Pad gain" and "Pad dynamics" settings. OK, so this is the obvious one and makes sense.
AND
2. Whether the Push2 midi messages are sent to:
2.1 Live, then to Max (the default, i.e. received via the Max [midiin] object)
OR
2.2 Directly to Max (the "call grab_control Button_Matrix" and "call get_control Button_Matrix" messages to [live.object], plus a [live.observer] to fetch which row, column and velocity was pressed).
Funky chicken, hey? So the only workaround I have is a calibration stage: twiddle the "Pad sensitivity", "Pad gain" and "Pad dynamics" Push2 settings until a reasonably smooth response is obtained and the low end is responsive enough, then multiply the incoming velocity by a further gain so the full range can be reached.
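The "further gain" stage can be sketched as a small remap function, e.g. inside a [js] object (a sketch only; `inMin` and the function name are hypothetical — `inMin` is the lowest raw velocity the grabbed path actually delivers, around 50 in the report above):

```javascript
// Rescale raw velocities from the grabbed Button_Matrix so the low end
// becomes usable again. Values at or below inMin map to 1, and the
// remaining range is stretched linearly back onto 1..127.
function rescaleVelocity(raw, inMin) {
  if (raw === 0) return 0;              // note-off / pad release: pass through
  var clamped = Math.max(raw, inMin);   // anything quieter counts as the floor
  var scaled = 1 + ((clamped - inMin) * 126) / (127 - inMin);
  return Math.min(127, Math.round(scaled));
}
```

At calibration time you'd measure inMin once, then run every velocity coming out of the [live.observer] through this before using it.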
So why on earth have the Ableton people done this? Tut tut tut!
Anyone found a better solution?