about the magic "MIDI learn" toggle in Live and m4l UI objects….
so there is, indeed, in Live, this magical "MIDI learn" toggle that lets you assign any MIDI you want (or nearly) to a clicked Live UI button, toggle, dial, or whatever. This way, you can assign, in one click, MIDI control to the m4l UI objects inside your m4l patches.
I was wondering if, this time inside Max and without Live, there was any way at all to reproduce this, i.e. having an m4l API object which you feed a toggle (and maybe a parameter file load/save), and when the toggle activates it, any m4l UI object inside Max would be ready to be MIDI-learnt. It sounds logical and easy to do (for m4l developers), but maybe I'm wrong. I don't know m4l that well, so maybe it's one of those basic things I've missed… anyway, is there something like this?
What is your question exactly? It's easy to create something that learns from incoming MIDI input.
errr, yes it's pretty easy, but in Live it's more than easy, it's "one click". Or did I miss a "one-click" feature built into Max?
Because sure, you can do [ctlin 12 1], but if you want to use another MIDI device mapped in a different fashion, or if you want to remap things, you'll have to rebuild your object. That can eventually be a problem if you have many such controllers, hence my question whether Live's feature is reproducible inside Max. I mean for m4l UI objects, since it works for any m4l UI object when you're inside Live using Max for Live.
So? Nothing, really? :/
(last time i do this here, sorry.)
It’s probably going to involve something like pattrhub/pattrstorage/pattrmarker. The device thing is a bit of a pain, since the channel numbers may change if your MIDI config changes.
There’s not (that I’m aware of) a drop-in solution for this yet. You’d need to be able to separate the MIDI mapping from other pattrstorage systems, and you’d need to be able to monitor for changes in value, to know when to do the mapping. (Pattr is probably a candidate for this)
It's a pain to do, in short. Not impossible, but a pain.
awww. Okay then, thanks, but since there was a built-in way to do it in Max for Live, I believed it could be doable in Max as well :/ I was misled. Thanks anyway…
I created a MIDI/OSC learn system as part of a larger project a while back, and as Peter suggests, it IS a pain to do. If I get some time in the near future I will try to revise and improve that system, which depends on an SQLite db to write the controller/mapping data to. Unfortunately there are a few aspects that are a bit limited and even kludgey. In any case, maybe you should check this: http://cycling74.com/toolbox/midi-learn/ I haven't fully tested it but it seems to work OK in limited trials…
@vichug: the way it works in Live is that it’s already using Ableton’s API by that point.
What you’re looking for in a solution is something like this:
1. You don’t have to add anything that’s not already provided by pattr to your interface objects.
2. Ideally, it should be a one object-ish solution; the more steps you add, the more difficult it becomes to manage and use.
If I were to speculate about a way to do it, I would imagine a poly~ patch containing pattr objects that are bound to the pattr objects in pattrstorage's client list (so each voice is bound to one value). The poly~ patch would have as many voices as controls. The output of the pattr in each poly~ voice would bang a thispoly~, which would then report its voice number via a send to the main patch (so you know which control to assign things to…).
Outside the poly~ patch, you'd have your midiparse object giving you controller numbers, which you can then route to the appropriate destinations. When learn mode is on, it stores the number of the last-changed object together with the last-received controller number in a coll/lookup table (smells like a pack situation to me).
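To make the learn/route logic above concrete, here's a minimal sketch in Python pseudocode rather than an actual Max patch. All class and method names are invented for illustration; in a real patch these roles would be played by midiparse, coll, and sends, as described.

```python
# Hypothetical sketch of a MIDI-learn state machine.
# Not a real Max API -- just the control flow described in the thread.

class MidiLearn:
    def __init__(self):
        self.learn_mode = False
        self.last_touched = None  # id of the last UI control that changed
        self.mapping = {}         # controller number -> control id

    def control_touched(self, control_id):
        """Called when a poly~ voice reports that its pattr value changed."""
        self.last_touched = control_id

    def midi_cc(self, cc_number, value):
        """Called for each incoming controller message (midiparse output)."""
        if self.learn_mode and self.last_touched is not None:
            # pair the last-touched control with the last-received CC
            self.mapping[cc_number] = self.last_touched
            return None
        if cc_number in self.mapping:
            # route to the stored destination, scaled 0..1 as for live.* objects
            return (self.mapping[cc_number], value / 127.0)
        return None


learner = MidiLearn()
learner.learn_mode = True
learner.control_touched(3)   # user wiggles UI control number 3...
learner.midi_cc(12, 64)      # ...then moves CC 12: the pairing is stored
learner.learn_mode = False
print(learner.midi_cc(12, 127))  # -> (3, 1.0)
```

The key design point is the same as in the patch sketch: learn mode only records the *pairing* (last-touched control, last-received CC); normal mode does the routing.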
One catch seems to be that your MIDI values will not be in the same range as your destinations. If you're using the live.* controls, you can divide by 127. and send the result through a "raw float $1" message box.
Outside the poly~ patch, you would probably need to store a coll/dict/etc. that records which voice has which value. I'd rather store a file of "nameOfControl midiValues" pairs than just an indexed value for the first field, since the indices could change if you add pattr controls.
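The name-keyed storage idea can be sketched like this, again in hedged Python pseudocode (file name, control names, and JSON format are all invented for illustration; in Max this would more likely be a coll or dict file). Keying by control name, not index, is what keeps the file valid when pattr clients are added or reordered:

```python
# Persist "nameOfControl -> controller number" pairs by name,
# so adding or reordering controls doesn't invalidate the mapping.
import json

def save_mapping(path, mapping):
    """mapping: control name -> controller number, e.g. {'filterCutoff': 12}"""
    with open(path, "w") as f:
        json.dump(mapping, f, indent=2)

def load_mapping(path):
    with open(path) as f:
        return json.load(f)

save_mapping("midi_map.json", {"filterCutoff": 12, "resonance": 13})
print(load_mapping("midi_map.json"))  # -> {'filterCutoff': 12, 'resonance': 13}
```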
Anyway, those are my thoughts on how you might approach it. Implementations are always a bit trickier than the planning, but it seems feasible.