Can Max control Live without an M4L device inserted on a Live track?

    Jul 24 2011 | 1:45 am
    Hi all,
    My main focus at the moment is controlling Live with custom-made interfaces (using TouchOSC). I have a number of controllers coming in (some OSC, some MIDI): I want to route and process all the incoming control signals and use the M4L API to control Live.
    But I can't seem to get it to work unless the patch is actually a device in Live. It may be obvious (I'm a newbie) that Live can only be controlled through an M4L device, but it seems weird to have to create a track in Live just to remote-control the entire application.
    Is there a way of using the M4L API to control Live without having a device on a track?
    Thank you!! Bas

    • Jul 24 2011 | 5:36 am
      I think you have to put a device on a track. Maybe use the master track?
    • Jul 26 2011 | 3:11 am
      Thanks benj3737, not a bad suggestion. I've done some more digging around and am now pretty sure that Max cannot control Live except through a device on a track.
    • Aug 04 2011 | 2:09 pm
      OK, found the final proof: "live.object only functions inside of Max for Live devices." I wish that had been noted on the help pages, like those for the LiveAPI or the Live Object Model!
    • Aug 04 2011 | 8:43 pm
      You can control Live without Max for Live by using Max/MSP with MIDI: receive the OSC information and convert it to MIDI messages.
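      To make the OSC-to-MIDI idea concrete, here is a minimal standalone sketch in Python (not a Max patch): it parses a single TouchOSC-style OSC float message and maps it to a 3-byte MIDI Control Change. The address-to-CC mapping (`FADER_CC_MAP`) and the function names are assumptions for illustration, not anything from Live or TouchOSC itself.

```python
import struct

# Hypothetical mapping from TouchOSC addresses to MIDI CC numbers.
FADER_CC_MAP = {"/1/fader1": 7, "/1/fader2": 10}

def _read_padded_string(data, offset):
    """Read a null-terminated OSC string, padded to a 4-byte boundary."""
    end = data.index(b"\x00", offset)
    s = data[offset:end].decode("ascii")
    offset = end + 1
    offset += (-offset) % 4  # skip padding nulls
    return s, offset

def osc_to_cc(packet, channel=0):
    """Convert an OSC message carrying one float (0.0-1.0)
    into a 3-byte MIDI Control Change message, or None if unmapped."""
    address, offset = _read_padded_string(packet, 0)
    typetags, offset = _read_padded_string(packet, offset)
    if typetags != ",f" or address not in FADER_CC_MAP:
        return None
    (value,) = struct.unpack(">f", packet[offset:offset + 4])
    cc_value = max(0, min(127, round(value * 127)))
    status = 0xB0 | (channel & 0x0F)  # Control Change on `channel`
    return bytes([status, FADER_CC_MAP[address], cc_value])

def make_osc_float(address, value):
    """Build a one-float OSC packet (handy for testing the converter)."""
    def pad(b):
        return b + b"\x00" * (4 - len(b) % 4)  # terminator + padding
    return pad(address.encode("ascii")) + pad(b",f") + struct.pack(">f", value)
```

      In Max you would of course do the same routing with `udpreceive`, `route`, `scale`, and `ctlout` objects rather than Python; the sketch just shows the byte-level shape of the conversion.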
    • Aug 05 2011 | 3:49 am
      You can also use Max to control Live using the _Framework objects and the Python backend....see ST8's stuff @ ....essentially this is all that M4L is: a specialized MIDI remote script written in Python.
    • Aug 09 2011 | 1:57 pm
      Thanks all. So what do you think would be the most efficient way to do some of the following from my iPad using TouchOSC?
      ***Undo function
      1. iPad - OSC connection - Max for Live - live.object
      2. iPad sending MIDI notes as if it were a Mackie HUI - MIDI connection - Ableton controlled through MIDI/Mackie emulation
      ***Changing the pitch of an audio clip
      1. iPad - OSC connection - Max for Live - live.object
      2. In Live, MIDI-map pitch to a MIDI CC, and send that CC from the iPad over a MIDI connection
      My criteria are:
      1. Low overhead (I use this in live performances, so I can't afford any glitches)
      2. Latency/responsiveness
      3. Ease of design and maintenance
      Regarding the last point, doing everything through API calls from Live will be the easiest way to keep my patches clean (as opposed to adding virtual MIDI connections to provide Mackie HUI emulation), but I'm starting to wonder whether it actually introduces overhead, and whether I'd be better off using remote control scripts and controller emulations where I can.
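      One detail worth checking before choosing the MIDI-mapping route for clip pitch: a CC only has 128 steps, so you have to scale 0-127 onto the transpose range, and the resolution you get depends on that range. A minimal sketch of the scaling, assuming a -48..+48 semitone transpose range (the function name `cc_to_semitones` is made up for illustration):

```python
def cc_to_semitones(cc_value, lo=-48, hi=48):
    """Map a MIDI CC value (0-127) linearly onto a semitone
    transpose range, clamping out-of-range input."""
    cc_value = max(0, min(127, cc_value))
    return lo + (hi - lo) * cc_value / 127.0
```

      With a 96-semitone span, each CC step is about 0.76 semitones, so the MIDI-mapped option is coarser than setting the pitch property directly through live.object; that may or may not matter for your performance use.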
      Thank you for sharing your insights!
    • Aug 14 2011 | 2:17 pm
      Please don't respond here; I'll start a new thread with the same questions. This is actually going off-topic!