Controlling an RME FireFace 800
Sorry that this question is very interface specific:
Does anyone know how to write a Max external that would control the internal routing of the FireFace? I know that the HDSP object written by Jhno does something very similar, but unfortunately it has not been updated to work in OS X or with the newer RME gear.
Being able to control the routing of the Fireface via MaxMSP would provide a near-zero latency controllable routing solution that would be amazing to use in live performance.
If anybody has suggestions, please help. I'm pretty desperate at this point. The TotalMix software that RME includes for this purpose is pretty useless for what I'm trying to do.
Thanks
hi
never tried it, but a "MIDI remote" document came with my FFace 400 -
seems that's what you (and I as well!!) need:
Notes on driver version 2.58 (Firmware 1.46 / 2.63 / 2.47)
The TotalMix included in driver 2.58 now allows control of all faders of all three rows via simple Control Change commands.
The format for the Control Change commands is:
Bx yy zz
x = MIDI channel
yy = control number
zz = value
The first row in TotalMix is addressed by MIDI channels 0 up to 3, the
middle row by channels 4 up to 7 and the bottom row by channels 8 up
to 11.
16 controller numbers are used: 102 up to 117 (= hex 66 to 75).
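A minimal sketch in C of how those pieces could combine into a message, assuming the mapping quoted above (each row spans four MIDI channels, sixteen strips per channel via controllers 102-117); the function name and the strip-to-channel arithmetic are my own guesses, not RME documentation:

```c
#include <stdio.h>
#include <stdint.h>

/* Guessed mapping based on the driver notes quoted above:
   row 0 (top)    -> MIDI channels 0-3
   row 1 (middle) -> MIDI channels 4-7
   row 2 (bottom) -> MIDI channels 8-11
   with controllers 102-117 (0x66-0x75) addressing 16 strips per channel. */
static void totalmix_fader_cc(int row, int strip, int value, uint8_t msg[3])
{
    int midi_channel = row * 4 + strip / 16;      /* which channel of the row's block of 4 */
    int controller   = 102 + strip % 16;          /* 0x66 .. 0x75 */

    msg[0] = (uint8_t)(0xB0 | (midi_channel & 0x0F));  /* Bx: Control Change status */
    msg[1] = (uint8_t)controller;                       /* yy: controller number     */
    msg[2] = (uint8_t)(value & 0x7F);                   /* zz: 7-bit fader value     */
}

int main(void)
{
    uint8_t msg[3];
    totalmix_fader_cc(1, 20, 96, msg);   /* e.g. middle row, 21st strip, fader at roughly 75% */
    printf("B%X %02X %02X\n", msg[0] & 0x0F, msg[1], msg[2]);
    return 0;
}
```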
kasper
--
Kasper T. Toeplitz
noise, composition, bass, computer
http://www.sleazeArt.com
> If anybody has suggestions, please help. I'm pretty desperate at this point. The TotalMix software that RME includes for this purpose is pretty useless for what I'm trying to do.
So what are you trying to do?
Is it not possible to do using the midi control features for the FF?
I have a FF400 and I've recently been automating some things in TotalMix through MIDI from Max. That worked out fine.
I'd be interested to hear what you are trying to do exactly.
regards,
kjg
Consult pages 78-80 of the English RME manual.
TotalMix is controllable via the Mackie protocol. And that can definitely be done from Max, because software for it already exists: LC Xmu by John Pitcairn (who is also present in this forum): http://www.opuslocus.com/lcxmu/
As far as I know most of LC Xmu is done in Max, maybe with some JavaScript and custom objects. LC Xmu emulates a Logic/Mackie Control. I have been using it for several years.
I think diving into such stuff is not a trivial task and might easily exceed the effort and time you are willing to spend. The RME manual describes simple MIDI control, but it is really not much: the monitor, dim and talkback buttons, the phones buttons and the 8 preset buttons.
I am pretty sure that one can operate the faders too, someone told me that a while ago. I will search for information because I am also interested in controlling this interface. I would need the faders and the mute buttons, but in all three rows. If that is not easily possible (which is what I expect) I will use another option, described below:
If you want full control but don't want to develop your own Mackie protocol implementation, you can always use LC Xmu as middleware and feed it with standard controllers from Max.
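For what it's worth, my understanding (please check against the LC Xmu documentation, John will know better) is that Mackie Control faders travel as 14-bit pitch-bend messages, one MIDI channel per fader. A rough sketch of expanding a plain 7-bit CC value coming from Max into that format; the function name is made up:

```c
#include <stdio.h>
#include <stdint.h>

/* Rough sketch, assuming the usual Mackie Control convention of one
   14-bit pitch-bend message per fader, each fader on its own MIDI
   channel. Verify against the LC Xmu docs before relying on it. */
static void cc_to_mcu_fader(int fader_index, int cc_value, uint8_t msg[3])
{
    int value14 = (cc_value & 0x7F) * 16383 / 127;       /* expand 7-bit to 14-bit */

    msg[0] = (uint8_t)(0xE0 | (fader_index & 0x0F));     /* pitch-bend status byte */
    msg[1] = (uint8_t)(value14 & 0x7F);                  /* LSB, lower 7 bits      */
    msg[2] = (uint8_t)((value14 >> 7) & 0x7F);           /* MSB, upper 7 bits      */
}

int main(void)
{
    uint8_t msg[3];
    cc_to_mcu_fader(0, 127, msg);        /* first fader pushed fully up */
    printf("%02X %02X %02X\n", msg[0], msg[1], msg[2]);
    return 0;
}
```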
Hope that helps.
Please let me know if you find out anything useful.
>
> I think diving into such stuff is not a trivial task and might easily exceed the effort and time you are willing to spend. The RME manual describes simple MIDI control, but it is really not much: the monitor, dim and talkback buttons, the phones buttons and the 8 preset buttons.
>
> I am pretty sure that one can operate the faders too, someone told me that a while ago. I will search for information because I am also interested in controlling this interface.
the FF400 uses continuous controllers 102 to 117 to control the faders, where
channel 0-3 controls row 1
channel 4-7 controls row 2
channel 8-11 controls row 3
I'm not sure about the mutes, but you could store the fader value, send a value of zero to "mute", and then restore the stored value to "unmute".
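Something like that store-and-restore trick, sketched in C. The send_cc() helper here just prints the three bytes; in a real patch the equivalent would be a midiout from Max, so treat the names as placeholders:

```c
#include <stdio.h>
#include <stdint.h>

#define NUM_STRIPS 28

/* Placeholder for whatever actually puts the bytes on the MIDI port. */
static void send_cc(uint8_t channel, uint8_t controller, uint8_t value)
{
    printf("B%X %02X %02X\n", channel & 0x0F, controller, value);
}

static uint8_t fader_value[NUM_STRIPS];   /* last value sent per strip    */
static uint8_t saved_value[NUM_STRIPS];   /* value remembered while muted */

/* "Mute": remember the current level, then pull the fader to zero. */
static void mute_strip(int strip, uint8_t channel, uint8_t controller)
{
    saved_value[strip] = fader_value[strip];
    fader_value[strip] = 0;
    send_cc(channel, controller, 0);
}

/* "Unmute": restore the remembered level. */
static void unmute_strip(int strip, uint8_t channel, uint8_t controller)
{
    fader_value[strip] = saved_value[strip];
    send_cc(channel, controller, saved_value[strip]);
}

int main(void)
{
    fader_value[0] = 100;                 /* pretend strip 1 sits at 100 */
    mute_strip(0, 0, 102);
    unmute_strip(0, 0, 102);
    return 0;
}
```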
regards,
kjg
Thanks for the support.
Yes, TotalMix can be controlled via Max; no problems there. And yes it's easy to control output levels with midi cc's.
I've read the manual, talked to RME, and I understand the functionality of TotalMix. I've been able to use the Mackie HUI protocol to control everything as RME intended.
The problem is this: I'm trying to use my FireFace 800 as an effect router for live performance--sort of like a blendable patch bay. To do this, I need to be able to control the levels of multiple inputs and outputs SIMULTANEOUSLY. Unfortunately the levels can only be controlled for one submix at a time. The submix channel can be switched with the HUI protocol, but it makes crossfading effects nearly impossible for me.
Basically TotalMix was intended to function like a mixer, where multiple submixes would rarely need to be controlled at once.
The FireFace 800 has 28 HW inputs, 28 software outs, and 28 HW outputs. That gives a total of (28 + 28) x 28 = 1568 possible routing points. Optimally I'd have a MIDI controller value for each route (there are plenty: 16 MIDI channels x 128 controller numbers = 2048 possible CCs).
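Just to make that arithmetic explicit (the 1568 assumes every hardware input and every software playback channel can feed every hardware output):

```c
#include <stdio.h>

int main(void)
{
    int hw_inputs = 28, sw_playback = 28, hw_outputs = 28;

    int routes = (hw_inputs + sw_playback) * hw_outputs;  /* 56 * 28 = 1568 routing points   */
    int ccs    = 16 * 128;                                /* channels * controllers = 2048   */

    printf("routing points: %d, available CCs: %d\n", routes, ccs);
    return 0;
}
```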
Hope that clears up some things. Thanks again.
Thank you kjg, I just found the description in the new RME manual ;-) It is the same for the Fireface 400 and 800, and you are right of course, the mute buttons are not necessary.
Quote: Peter Ostry wrote on Wed, 30 April 2008 18:02
----------------------------------------------------
> Thank you kjg, I just found the description in the new RME manual ;-) It is the same for the Fireface 400 and 800, and you are right of course, the mute buttons are not necessary.
>
----------------------------------------------------
You are welcome.
Please keep us posted on your findings while trying to gain control over complex routings. I wonder how many CC messages a FF can take per sec before choking...
I've used this cc thing but not for anything complex or with a high data rate.
Speaking of crossfading.. Although technically not a perfect crossfade, I think that something like this might be smooth enough for your purposes?
Right now it is not equal power.. it has a big hole in the middle, but I'm sure that could be fixed using, for example, a coll lookup for the fader values.
It's fading between two sets of 4 channels and when I route all these channels to my headphones (beyerdynamic dt880) there is no zipper noise or similar problems to be heard, no matter how fast I "crossfade".
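If it helps, the "hole in the middle" is the classic symptom of a linear crossfade; an equal-power (sin/cos) curve avoids it. A quick sketch of generating the two fader tables as coll-style lines, scaled to 7-bit CC values - note that TotalMix fader positions are not linear gain, so the exact values would still need tuning by ear:

```c
#include <math.h>
#include <stdio.h>

#define STEPS 128   /* one entry per incoming 7-bit controller value */

int main(void)
{
    const double half_pi = acos(-1.0) / 2.0;

    /* Equal-power crossfade: the two gains follow cos/sin quarter circles,
       so out^2 + in^2 == 1 everywhere and there is no dip in the middle. */
    for (int i = 0; i < STEPS; i++) {
        double x = (double)i / (STEPS - 1);                  /* 0.0 .. 1.0 */
        int fade_out = (int)lround(127.0 * cos(x * half_pi));
        int fade_in  = (int)lround(127.0 * sin(x * half_pi));
        printf("%d, %d %d;\n", i, fade_out, fade_in);        /* coll-style: index, out in; */
    }
    return 0;
}
```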
Good luck!
Quote: devkerr wrote on Wed, 30 April 2008 17:51
----------------------------------------------------
> The problem is this: I'm trying to use my FireFace 800 as an effect router for live performance--sort of like a blendable patch bay.
----------------------------------------------------
Exactly what I want to do as well.
----------------------------------------------------
>To do this, I need to be able to control the levels of multiple inputs and outputs SIMULTANEOUSLY.
----------------------------------------------------
This itself is no problem. I just tried it and can move any fader I want.
----------------------------------------------------
> Unfortunately the levels can only be controlled for one submix at a time.
----------------------------------------------------
Yes, this is a serious limitation. We have 8 presets, therefore we can directly control the feed for 8 different submixes. Still not enough for a live setup.
The question is, how many different feeds do you really need? If you have, let's say, 5 stereo effects boxes which you use in different ways, they can all stay in the same input channels, their audible appearance and level controlled by Max.
If you have more than 5 stereo sources, you can involve the presets, each preset set to a different subgroup. That gives you 8 times as many accessible input channels.
I do not know your setup, but I have an ADA8000 attached, so I get 28 channels at once. Multiplied by the 8 presets I get 28 x 8 different routings. That is already more than I can handle while I play.
However, I do not see the big benefit of many TotalMix subgroups. A live setup doesn't usually have 28 outputs unless you either go through a mixer afterwards or you do a very special thing on stage.
Complicated routings might mean audio routing too. If you have MSP as well, you can think about grabbing some input channels and routing them to other outputs which are actually inputs for hardware boxes. And there is Soundflower, which provides up to 18 additional audio channels that can be used for routing.
I think it is more a matter of organization, depending on the particular setup.
---
My current conclusion is to use one of the following methods:
a) Study your setup again and think of a way to organize your inputs, outputs and presets to match your needs. Yes, I also want to be able to do everything, but theoretical possibilities do not count - only the hardware that physically exists and your personal requirements.
b) Use LC Xmu, feed it with standard controller messages and get the full benefit of the Mackie protocol. This is probably the way I am going, because then I don't have to deal with TotalMix directly but can easily address any function it provides via the Mackie protocol.
c) Pipe all audio into Max/MSP, do the audio routing there and use TotalMix just for what it is built for: as a mixer for an audio interface with more or less fixed routings. This is a good method, but perhaps the most expensive and programming-intensive one.
Btw, I wrote "you" but it applies to me as well. I haven't chosen my method yet.
Quote: kjg wrote on Wed, 30 April 2008 18:42
----------------------------------------------------
> Please keep us posted on your findings while trying to gain control over complex routings. I wonder how many CC messages a FF can take per sec before choking...
----------------------------------------------------
I don't expect a problem there. It swallows several fader movements at once, in addition to the "full display" output to a hardware or software display, which I do not need in this case. FireWire can obviously handle a lot.
----------------------------------------------------
> I've used this cc thing but not for anything complex or with a high data rate.
> Speaking of crossfading.. Although technically not a perfect crossfade, I think that something like this might be smooth enough for your purposes?
----------------------------------------------------
Maybe yes. I haven't tried it yet. But my crossfades are rather between instruments and effects, not across output channels. I will do it by fading the inputs. And that needs to be configurable in Max, probably with a couple of fixed curves and times, depending on the type of sound.
----------------------------------------------------
> It's fading between two sets of 4 channels and when I route all these channels to my headphones (beyerdynamic dt880) there is no zipper noise or similar problems to be heard, no matter how fast I "crossfade".
----------------------------------------------------
Good to know.
> Good luck!
Thank you!
Quote: Peter Ostry wrote on Wed, 30 April 2008 19:59
----------------------------------------------------
> Maybe yes. I haven't tried it yet. But my crossfades are rather between instruments and effects, not across output channels. I will do it by fading the inputs.
if it works on output channels I'm sure it will work on input channels too. I'm guessing they use the same algorithm for all faders...
Just go build that patch. I'm sure you'll succeed in creating a workable solution if you set realistic goals.
regards,
kjg
Quote: kjg wrote on Wed, 30 April 2008 20:08
----------------------------------------------------
> if it works on output channels I'm sure it will work on input channels too. I'm guessing they use the same algorithm for all faders...
I waited for that after I re-read my text. Good catch :-)
I meant a different thing but well, it doesn't matter.
> Just go build that patch. I'm sure you'll succeed in creating a workable solution if you set realistic goals.
Yess Sir!
I'll try and report back.
Quote: kjg wrote on Wed, 30 April 2008 10:42
----------------------------------------------------
> It's fading between two sets of 4 channels and when I route all these channels to my headphones (beyerdynamic dt880) there is no zipper noise or similar problems to be heard, no matter how fast I "crossfade".
>
> Good luck!
----------------------------------------------------
Thanks kjg. I saw similar results--TotalMix is great at fading within a given submix (no artifacts, zippering, or fader lag).
Quote: Peter Ostry wrote on Wed, 30 April 2008 11:39
----------------------------------------------------
>
> However, I do not see the big benefit of many TotalMix subgroups. A live setup doesn't usually have 28 outputs unless you either go through a mixer afterwards or you do a very special thing on stage.
>
----------------------------------------------------
Peter, I don't totally understand. If you are hooking up, say, 4 stereo effect boxes that you wish to be able to configure in any way (change order, parallel, series, etc.), and you want to be able to mix them in and out, then you NEED to be able to simultaneously control their outputs in different submixes. So for 4 effect boxes you need at minimum 5 submixes (4 for effects, 1 for output).
What do you mean when you say "We have 8 presets, therefore we can directly control the feed for 8 different submixes"? How would the presets help?
Quote: Peter Ostry wrote on Wed, 30 April 2008 11:39
----------------------------------------------------
>
> a) Study your setup again and think of a way to organize your inputs, outputs and presets to match your needs. Yes, I also want to be able to do everything, but theoretical possibilities do not count - only the hardware that physically exists and your personal requirements.
>
> b) Use LC Xmu, feed it with standard controller messages and get the full benefit of the Mackie protocol. This is probably the way I am going, because then I don't have to deal with TotalMix directly but can easily address any function it provides via the Mackie protocol.
>
> c) Pipe all audio into Max/MSP, do the audio routing there and use TotalMix just for what it is built for: as a mixer for an audio interface with more or less fixed routings. This is a good method, but perhaps the most expensive and programming-intensive one.
>
>
> Btw, I wrote "you" but it applies to me as well. I haven't chosen my method yet.
>
>
----------------------------------------------------
I'll check out LC Xmu, but I fear it will still have trouble with the submix stuff. Piping audio into MaxMSP is pretty undesirable, just from a latency/CPU usage point of view. Part of the reason I got the FireFace was because of its near-zero latency routing abilities.
Thanks again everyone.
Perhaps I'll have the time to talk to RME about giving us enough information to write an external that would totally control (without submix limitations) the FireFace 400/800. I feel like having that object would help anyone who plays live with an RME interface. I've heard RME may update TotalMix in the near future... maybe it'll all be solved then.
Quote: Peter Ostry wrote on Wed, 30 April 2008 20:13
----------------------------------------------------
> Quote: kjg wrote on Wed, 30 April 2008 20:08
> ----------------------------------------------------
> > if it works on output channels I'm sure it will work on input channels too. I'm guessing they use the same algorithm for all faders...
>
> I waited for that after I re-read my text. Good catch :-)
> I meant a different thing but well, it doesn't matter.
I figured you probably meant something else... What exactly are you trying to do then?
Quote: devkerr wrote on Wed, 30 April 2008 20:31
----------------------------------------------------
> Quote: Peter Ostry wrote on Wed, 30 April 2008 11:39
> ----------------------------------------------------
> > However, I do not see the big benefit of many TotalMix subgroups. A live setup doesn't usually have 28 outputs unless you either go through a mixer afterwards or you do a very special thing on stage.
> ----------------------------------------------------
>
> Peter, I don't totally understand. If you are hooking up, say, 4 stereo effect boxes that you wish to be able to configure in any way (change order, parallel, series, etc.), and you want to be able to mix them in and out, then you NEED to be able to simultaneously control their outputs in different submixes. So for 4 effect boxes you need at minimum 5 submixes (4 for effects, 1 for output).
----------------------------------------------------
Brainstorming again:
Unless you play really weird stuff, you most likely have a traditional effects chain. Not in terms of the effects themselves but their order: tone-shaping effects first, then modulation effects, and room effects at the end. That means a reverb output will usually not go into a distortion box. This is not so much a matter of taste; it has technical reasons.
If you see your different effects chains as parallel vertical rows, you can cable them (through line mixers, a patch panel, whatever) in a logical order. Then a distorted tone just has to decide whether it goes to chorus A or chorus B, or avoids both and goes to the echo. That makes routing in TotalMix easier, because you have a vertical flow. The outputs go to the inputs of possible following devices, from there the signal enters the input stage again at another point, and the game happens a second or a third time. And you can control what goes where by fading the input and output channels.
Second, the subgroups in TotalMix are always functional. The activated channel pair in the lower row is just a helper for us. Max doesn't need this help if you've already set up the basic routing in several subgroups. If you are in subgroup #1, Max can still move an output fader of subgroup #5. We just think that we have to select another subgroup but that is not true. Max doesn't click onto a channel pair but it can still move the faders and change levels.
The above is a bit theoretical. I have not fully checked it out, because the problem was new to me until you brought it up today. But I can see what happens when I control TotalMix from Max, and it looks as if it can change all levels without selecting a subgroup.
----------------------------------------------------
> What do you mean when you say "We have 8 presets, therefore we can directly control the feed for 8 different submixes"? How would the presets help?
----------------------------------------------------
Make a simple test: select subgroup #1, save as preset 1. Select subgroup #2, save as preset 2. Now recall the presets and you will see that the subgroup selection was stored. You can save presets 1-8 with subgroups 1-8 selected, for example.
Because of this, Max can switch the presets and activate 8 different subgroups to work in. Together with clever routing, that should be enough. And if this is not enough for a whole concert, there are still the "Save as" and "Open" functions in the menu to save and load another set of presets. If you can find the time to load another set during a gig, the possibilities are almost unlimited.
----------------------------------------------------
> I'll check out LC Xmu, but I fear it will still have trouble with the submix stuff. Piping audio into MaxMSP is pretty undesirable, just from a latency/CPU usage point of view. Part of the reason I got the FireFace was because of its near-zero latency routing abilities.
----------------------------------------------------
Is the latency really that bad? Even if you use Soundflower to stay in the digital domain? You could mix and switch the Soundflower channels as you like and send just the result to the Fireface outputs.
At least I hope it works like this, because my other project depends on it and I have not tested the latency with Soundflower and MSP yet. That was probably a mistake ...
----------------------------------------------------
> Perhaps I'll have the time to talk to RME about giving us enough information to write an external that would totally control (without submix limitations) the FireFace 400/800. I feel like having that object would help anyone who plays live with an RME interface. I've heard RME may update TotalMix in the near future... maybe it'll all be solved then.
----------------------------------------------------
That would be great, but we'll have it when we see it ;-)
I found your posting over there in the forum while I was looking for this MIDI control thing today. RME didn't answer you with much enthusiasm. They seem to prefer to stay on the "use the Mackie protocol" road. And they are basically right, because they have an existing controller implementation.
I know very few people and companies who understand the importance of these things for today's live performers. Some of these artists are a completely different breed from the classical rock or jazz band that the companies have known for many years. And most of these players have technical problems - ironically, with all the technology around us.
Quote: kjg wrote on Wed, 30 April 2008 20:36
----------------------------------------------------
> I figured you probably meant something else... What exactly are you trying to do then?
----------------------------------------------------
Man, you have difficult questions. I am a youngster, how should I know what I want to do?
But I'll try to explain:
I am working on a good routing and control concept for two projects; both use a DAW as the "musical host" and both are for live performances. One should bring the best sound I can get from real instruments and a lot of effects (acoustic guitar, MIDI guitar and bass, Wind Controller). This is my own, new project and I plan to use everything available - the original sound of the guitars, virtual instruments, software and hardware effects.
The other project is more universal and deals with virtual instruments, a little outboard gear and mainly with Wind Controllers as the MIDI source. The goal is to make a system that isn't too difficult to configure but is very flexible and controllable by foot controllers, keyswitches, the DAW - everything that can deliver MIDI. I have been working on that for quite a while now, together with a Wind Controller player. We got stuck because we used Logic exclusively and reached the ceiling. After looking at other systems I am finally here at Max's home. Happy, and far too late.
The routing I have in mind involves switching between different sound and effect sources and has to blend them dynamically. In addition, the effects and the routing react to the player's performance and the control capabilities of his instrument. An example: a certain breath strength plus additional mouth movements on a Wind Controller can "tremolo" into another channel - and that with layered sounds. That's why I cannot simply tell Max to blend one channel with another: a tiny controller action by the artist can change the channels. I have to handle everything separately, strictly following the MIDI input and - in a later development stage - information from the audio material itself. Then the more traditional fading and panning will come back.
I hope that was an explanation. Please excuse my long posts. I am Austrian and my language is German. I need many lines to say what you could say with a couple of words.
>Not sure what would be the most difficult with a tight deadline:
>to become proficient in the Environment, or to learn Max.
Heh. Peter is already quite proficient in the Logic environment. Unfortunately, since Apple's takeover, the Logic environment is now so full of unfixed bugs and things needing workarounds that it is proving unreliable for complex mission-critical stuff. I think that's why Peter is here. To some degree, it's why I am here too.
If RME would be willing to release some info (under NDA) about controlling the Fireface hardware mixers globally via native C/C++, I'd be willing to take a crack at writing a Max external to talk to the unit, and I have a FF800 I can test on. I'm not so inclined to chase them for the info, but I do have a prior relationship with Matthias Carstens there, so if you're contacting them, maybe that would help.
Quote: jeanfrancois.charles wrote on Thu, 01 May 2008 02:29
----------------------------------------------------
> > ... Wind Controller player. We got stuck because we used Logic exclusively and
> > reached the ceiling.
>
> Your requests must be pretty high. Michael Brecker had a crazy
> live-electronics set-up to play with his EWI, and that was an all-Logic
> system.
Brecker's requirements were different, and the setup, apart from the built-in looper, was in no way crazy but very individual. His patches were basically identical parallel MIDI processing lines that were configured for particular performances, plus a routing matrix. The looper is another story.
> The ceiling with Logic seems pretty high for MIDI processing.
And the number of bugs and the need for workarounds is also pretty high. Furthermore, if you try to route audio in Logic via MIDI, you run from one shortcoming into the next. The development of this very good part of Logic was simply stopped at some point.
> Agreed, that [Brecker's environment] was done by a real specialist of the Logic Environment.
It was done by his keyboard player. At that time, not many people had the idea of replacing a complex hardware setup with software. Well, some had the idea, but they didn't do it.
> ... It surely depends on the
> precise things you want to do.
Exactly. Here is the short version of the list:
1. Acceptance of any controller input, mappable to at least 64 preconfigured virtual instruments including keyswitches, keysplits, transposition, adjustable curves for continuous controllers and dynamics, and message conversion. Input distribution based on MIDI port and channel.
2. At least 4 "routes" of active instruments per preset that can be mixed, muted, crossfaded and sent to effect paths (software and hardware) on demand, all controlled by the artist. Global parameters like transposition can be changed on the fly; the same applies to instrument-specific parameters of the virtual instruments themselves and to plugins in their individual channel strips.
3. MIDI control from controllers to the software and from the software to external units. This includes external multi-effects, amp switching and output to custom MIDI-aware devices.
4. At least 64 presets (better 100+) that store all settings for the minimum of 4 parallel "routes", where each route can consist of several solo or layered sounds. Preset switching has to occur within a few milliseconds. Configurable crossfades between presets, especially for spill-over of delay and reverb. A kind of song list where each song combines several presets would be fine.
5. Ability to process audio and mix audio input with virtual instruments. Audio routing within the DAW and via external effects loops.
6. Big and clear onscreen feedback for the artist.
That's it, basically. And as you see, the requirements are quite different from Brecker's.
I got more than 50% of the above running in Logic. But then the last update broke an essential function, I had time to rethink the system, and I came to the conclusion that with all the upcoming problems in the growing patch I would not be able to fulfil more than 70% of the requirements. I started to look for other ways and finally, as John correctly assumes in his post, I am here.
I have been using Max/MSP for only a few weeks now, but that is enough to tell me that the project can be successful with this software. I am not sure yet what I will do in Logic, what in Max/MSP, and what perhaps in the mixer of the RME interface, which could be a requirement for the setup.
Quote: johnpitcairn wrote on Thu, 01 May 2008 06:23
----------------------------------------------------
> If RME would be willing to release some info (under NDA) about controlling the Fireface hardware mixers globally via native C/C++, I'd be willing to take a crack at writing a Max external to talk to the unit, and I have a FF800 I can test on. I'm not so inclined to chase them for the info, but I do have a prior relationship with Matthias Carstens there, so if you're contacting them, maybe that would help.
----------------------------------------------------
Great offer, John. I will talk to them; at least we speak the same language and I am not as far away as you are. Maybe that helps. I'll let you know what they say.
Quote: Peter Ostry wrote on Thu, 01 May 2008 08:28
----------------------------------------------------
> Quote: johnpitcairn wrote on Thu, 01 May 2008 06:23
> ----------------------------------------------------
> > If RME would be willing to release some info (under NDA) about controlling the Fireface hardware mixers globally via native C/C++, I'd be willing to take a crack at writing a Max external to talk to the unit, and I have a FF800 I can test on. I'm not so inclined to chase them for the info, but I do have a prior relationship with Matthias Carstens there, so if you're contacting them, maybe that would help.
> ----------------------------------------------------
>
> Great offer, John. I will talk to them; at least we speak the same language and I am not as far away as you are. Maybe that helps. I'll let you know what they say.
>
>
>
----------------------------------------------------
Great to hear the enthusiasm. John and Peter, let me know what I can do to help. Thank you!
-Devin
... my reason for wanting such an external is that I'd like to improve the support for the Mackie Control, so the controller and FF800 function MUCH more like a hardware mixer, with familiar pre/post send busses, fader/mute groups and other stuff you can't currently do at all well with the RME MCU support.