Can't open audio in a MIDI device in M4L?
Hello,
I looked into this issue and didn't find a lot of answers that helped.
I'm an electronic music major, and this semester in my electronics ensemble I'm trying to build a device that converts incoming audio to MIDI. Simply put, I want a device that creates MIDI triggers at exponentially accelerating intervals when I blow into a microphone, sort of like unlocking the built-in arpeggiator in Ableton and working in milliseconds. The issue I'm having is that when I open the edit window for a MIDI device and use an [ezadc~] and a [plugin~] object, the device won't stay on.
I'm not sure why this is happening. Is it because I don't currently have an active Max/MSP license?
I'm not new to Max/MSP by a long shot; I've been using the program for years. However, I am brand spanking new to Max for Live. Is this a common issue?
thanks,
PS any pointers to building a patch like this would be helpful as well.
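As a starting point for the patch logic, here is a rough sketch in plain JavaScript of the two pieces described above: an RMS envelope follower that gates on when you blow into the mic, and trigger times spaced by exponentially shrinking intervals. All the constants (threshold, base interval, ratio) are placeholder assumptions, not values from the original post; in Max you would replace the envelope part with something like [average~] or [peakamp~] feeding a threshold test.

```javascript
// Breath-to-trigger sketch. The threshold, base interval, and ratio
// below are arbitrary placeholders -- tune them to taste in the patch.

// RMS level of one block of audio samples.
function rms(block) {
  const sum = block.reduce((acc, s) => acc + s * s, 0);
  return Math.sqrt(sum / block.length);
}

// True when the envelope crosses the threshold upward (a "blow" onset).
function isOnset(prevRms, currRms, threshold) {
  return prevRms < threshold && currRms >= threshold;
}

// Trigger offsets in ms after the onset: baseMs, then baseMs * ratio,
// then baseMs * ratio^2, ... With ratio < 1 the triggers accelerate
// exponentially, like a speeding-up arpeggiator.
function triggerTimes(baseMs, ratio, count) {
  const times = [];
  let t = 0;
  let interval = baseMs;
  for (let i = 0; i < count; i++) {
    t += interval;
    times.push(t);
    interval *= ratio;
  }
  return times;
}

// Example: a loud block right after a quiet one fires an onset,
// then four triggers land at 200, 300, 350, 375 ms (ratio 0.5).
const quiet = [0.01, -0.01, 0.02];
const loud = [0.5, -0.6, 0.55];
console.log(isOnset(rms(quiet), rms(loud), 0.2)); // true
console.log(triggerTimes(200, 0.5, 4)); // [ 200, 300, 350, 375 ]
```

In a patcher, the equivalent of triggerTimes would be a [metro]/[counter] combination whose interval shrinks on each bang, or a list of delays fed to [pipe].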
Demo mode only affects saving files (including clipboard actions). But what do you expect [ezadc~] to do inside a plug-in/M4L device?
Routing audio into a MIDI device requires the [live.routing] object.
Check the help file.
First of all, [dac~], [ezdac~], [adc~], and [ezadc~] do absolutely nothing inside M4L devices. Instead you need to use the [plugin~] and [plugout~] objects (or their mc versions).
Inside an M4L device, Max uses Live's audio and MIDI engine exclusively, so you need to route your signal through Live. For output, Live automatically recognizes when an audio effect has a MIDI out and displays that port as if it were a virtual MIDI port, so you can just select it in the mixer's MIDI From menu. For input routing, your device needs to choose where to take its input from, and [live.routing] and/or the Live API's DeviceIO class let you do this.
Unfortunately, the help file for [live.routing] is a bit lacking, IMO. Ableton's "Building Max Devices" series has an example patcher on routing that explains it better. Even though it doesn't use the [live.routing] object and instead just uses [live.object], the two are functionally the same here. You can get the examples from Live's Available Packs menu.
According to the tutorial, different types of devices have different routing capabilities.
All devices can route audio I/O, but [plugin~] and [plugout~] objects with channel arguments must be present (like [plugin~ 1 2] or [plugin~ 3 4]), and Live needs to recognize that your device is capable of audio routing, so save the patcher after placing them.
For MIDI, though, different device types have different limitations, since an M4L device can only handle one MIDI input port and one MIDI output port. That means an M4L Instrument cannot have a separate MIDI input (it already uses the default MIDI input), and MIDI Effects cannot have MIDI routing at all.
So you would either need to
A. Create an audio effect device, get incoming audio normally from the track input, and use DeviceIO routing to output MIDI (either by just using [midiout] and selecting the device in the mixer, or by using the M4L API to choose where to route the MIDI programmatically).
B. Create a MIDI effect device, get the audio signal via DeviceIO routing, and output MIDI messages normally. In this case, though, you won't be able to record the MIDI data that comes out of the device, at least not on the same track.
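For the programmatic route in option A, here is a heavily hedged sketch in Max's [js] flavor of JavaScript. The DeviceIO path and property names (midi_outputs, available_routing_types, routing_type) follow my reading of the Live 10.1+ Object Model — verify them against the LOM reference for your Live version, and the returned data shape is an assumption too. The routing-name lookup is pure JS; the LiveAPI part is guarded so the helper also runs outside Max.

```javascript
// Pure helper: find the index of a routing type whose display_name
// matches. (Routing types having a display_name/identifier shape is an
// assumption -- check the Live Object Model docs for your version.)
function findRoutingIndex(types, name) {
  return types.findIndex((t) => t.display_name === name);
}

// Inside an M4L device's [js] object, LiveAPI is a global. Everything in
// this branch is a sketch of the DeviceIO class, not tested code.
if (typeof LiveAPI !== "undefined") {
  const io = new LiveAPI("this_device midi_outputs 0");
  const available = JSON.parse(io.get("available_routing_types"))
    .available_routing_types;
  // "My Synth Track" is a hypothetical destination name.
  const idx = findRoutingIndex(available, "My Synth Track");
  if (idx >= 0) {
    io.set("routing_type", { identifier: available[idx].identifier });
  }
}
```

The same lookup could be driven from [live.object] with messages instead of [js]; the point is only that the destination is chosen by matching a display name against the available routing types.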
(Personally, I've also resorted to Node for Max to work around the limitations of Live's MIDI engine. You can do literally whatever you want with Node for Max, even inside an M4L device, as long as it's possible in Node.js.)
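To illustrate that Node for Max route, a minimal sketch of a [node.script] file that receives a trigger message from the patcher and sends raw MIDI note-on bytes back out (which you could pass to [midiout]). The "trigger" handler name is an arbitrary choice for this sketch, and the require is guarded so the note-building helper also runs in plain Node:

```javascript
// Node for Max sketch: patcher sends "trigger <pitch> <velocity>",
// script replies with raw note-on bytes for [midiout].
let Max = null;
try {
  Max = require("max-api"); // present only when launched by [node.script]
} catch (e) {
  // Running outside Max (e.g. plain Node); only the helper is usable.
}

// Pure helper: note-on bytes (status 0x90 + channel, pitch, velocity),
// with pitch/velocity masked into the 0-127 MIDI data range.
function noteOn(channel, pitch, velocity) {
  return [0x90 | (channel & 0x0f), pitch & 0x7f, velocity & 0x7f];
}

if (Max) {
  // "trigger" is a message name chosen for this example, not a Max builtin.
  Max.addHandler("trigger", (pitch, velocity) => {
    Max.outlet(noteOn(0, pitch, velocity)); // bytes back to the patcher
  });
}

console.log(noteOn(0, 60, 100)); // [ 144, 60, 100 ]
```

Max.addHandler and Max.outlet are the standard max-api entry points; everything past receiving the message (timing, scaling breath level to velocity, etc.) is then ordinary Node code.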