RNBO Unity Audio Plugin

Alex Van Gils's icon

Hello everyone, this post is to announce an experimental integration that we are excited to share — an adapter for Unity’s Native Audio Plugin. Using RNBO’s C++ source export, this repository will help you build a plugin which provides an API that facilitates integration of your RNBO device with the Unity game engine.

As described, this is experimental software. Please use at your own risk and know that anything in this repository could change in the future. However, we very much want to hear your thoughts and see what you make.

If you find problems or have feature requests, please track them using the GitHub repository’s issues page.
https://github.com/Cycling74/rnbo.unity.audioplugin

Have fun!

Jan Klug's icon

Experimental.. risk.. fun.. what a nice combination! :D
Thank you so much for this additional extension (or extension of the addition); I have been checking the forum once a week in the hope that this announcement would pop up (as I hadn't yet managed to combine RNBO, Unity, and Android)...
Will dive in right away!

Jan Klug's icon

Hey Alex, I'm building RNBO-generated audio plugins into a multiplayer theatre-VR-installation-performance hybrid, and it's working great so far using Unity and Quest 2.
Thanks for this nice starting point!

I do have some questions, though, that don't fall into the category of problem or feature request. Do you prefer that I post them here, or via GitHub issues as well?

Alex Van Gils's icon

Hi Jan, it’s so cool that you are using this for your project — it sounds awesome.

Please feel free to share your thoughts here, and if it makes sense to move some of them over to the repo as an issue, we can do that later!

Jan Klug's icon

Yes, ever since the announcement of RNBO I had been waiting to be able to use it in our Unity VR project... Good timing!
It's an audiovisual experience for (for now) 4 persons in a shared physical / virtual space, where they can engage and interact sonically. And play with a room-sized sequencer...

I'm mainly using about 6 RNBO synths per person, which I control individually via OSC. As all players are networked and hear/see the others, the result is that on each Quest 2 there are at least 24 synths playing simultaneously on spatialized objects. And that works fine!

Right now they play continuous sounds, but I'd like to make them into a spatially distributed synthesizer, so I would like to send MIDI notes to them from Max, Ableton, or both - or use the spatial sequencer for that.

One question I have is this:
The setup I use now is a hacky modification of the CUSTOM_FILTER example code, via local (and spatialized) Audio Listeners. But I'm unsure how to include the Synth Helper as in this example... Doesn't that reside in the Audio Mixer? How does that combine with the spatialized setup?

Alex Van Gils's icon

Hi Jan, thank you for this! If it becomes reasonable to do so, it would be super cool if you shared your project.

I understand your question. Basically, if you are making a custom filter, as in the CUSTOM_FILTER example, you don't need to use the Helper to get access to methods like SendMIDINoteOn() -- you can call them on the Handle directly. The Helper is really there to help you access an instance of your plugin that has been loaded onto the Audio Mixer.

Here's an example where we send a MIDI CC message to our custom filter called "synth." You can use the same principle to send MIDI notes with SendMIDINoteOn(), etc, as in the MIDI example.

Certainly let me know if I can help further or if this doesn't answer your question.

using UnityEngine;

[RequireComponent(typeof(AudioSource))]
public class OrbPlayer : MonoBehaviour
{
    TestOrbsHandle synth;

    public OrbPlayer() : base() {}
    
    void Start()
    {
        // Instantiate the handle for the exported RNBO device
        synth = new TestOrbsHandle();

        // MIDI channel 1, controller number 1, value 110
        synth.SendMIDICC(1, 1, 110);
    }

    void Update()
    {
        // Forward Unity's per-frame update to the handle
        synth.Update();
    }

    void OnAudioFilterRead(float[] data, int channels)
    {
        // Render the RNBO device into Unity's audio buffer
        if (synth != null)
        {
            synth.Process(data, channels);
        }
    }
}
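
And since you mentioned wanting to send notes: the same pattern applies with SendMIDINoteOn(). Here's a minimal sketch, replacing the Update() above -- I'm assuming the arguments mirror the CC call (channel, note number, velocity), so double-check against the MIDI example for your export:

    void Update()
    {
        synth.Update();

        // Hypothetical trigger: play note 60 at velocity 100 on MIDI channel 1
        // when the space bar is pressed
        if (Input.GetKeyDown(KeyCode.Space))
        {
            synth.SendMIDINoteOn(1, 60, 100);
        }
    }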
Jan Klug's icon

Ah thanks, that de-confuses me on that point, and it works!

Another confusion I have is about updating parameters.
Does that have to happen in each Update() cycle for *all* the parameters, like

void Update()
{
    synth.SetParamValue(param_ratio, ratio);
    // ...and quite a few more like this
    synth.Update();
}

- even when the parameter doesn't change in between?

Alex Van Gils's icon

Hi Jan,

Thank you for the question -- you don't need to set the parameter value every frame when Update() is called. You can call SetParamValue() outside of the Update() method, whenever the value actually changes.

For example,

using UnityEngine;

[RequireComponent(typeof(AudioSource))]
public class DrumFilter : MonoBehaviour
{
    
    RNBODrumHandle drums;

    // Cache the parameter indices once so we don't look them up by string every time
    readonly System.Int32 numerator = (int)RNBODrumHandle.GetParamIndexById("numerator");
    readonly System.Int32 denominator = (int)RNBODrumHandle.GetParamIndexById("denominator");

    public DrumFilter() : base() {}
    
    void Start()
    {
        drums = new RNBODrumHandle();
    }

    void Update()
    {
        drums.Update();

        // Only send new parameter values when something actually changes --
        // here, when the T key is pressed
        if (Input.GetKeyDown(KeyCode.T))
        {
            ChangeSubdivision(Random.Range(1f, 4f), Random.Range(1f, 8f));
        }
    }

    void OnAudioFilterRead(float[] data, int channels) 
    {
        if (drums != null)
        {
            drums.Process(data, channels);
        }    
    }

    void ChangeSubdivision(float numeratorValue, float denominatorValue)
    {
        drums.SetParamValue(numerator, numeratorValue);
        drums.SetParamValue(denominator, denominatorValue);
    }
}
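
The same idea applies to any event source -- an OSC message handler, a collision callback, a UI element, and so on: call SetParamValue() at the moment the value changes, and just keep calling the handle's Update() once per frame as above.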
stefano's icon

hey all! Has anybody succeeded in making this work with Unity 2022? I think the .bundle gets imported incorrectly; the plug-in is not visible as an effect from the mixer.
Everything works fine in Unity 2021.
cheers
Stefano
(edit: solved, see below)

Alex Van Gils's icon

Hi Stefano, thanks for reaching out... would you mind sharing your experience over on the GitHub repo's Issues page?

I would expect the plugin to work with Unity 2022 versions. It would also help if you could share your exact Unity version, and a little more about how you are importing the built package and what you see on the mixer.

Thank you!

stefano's icon

Hi Alex, thanks for answering. It seems I cried wolf too early: I made a couple of new test projects and all went well there. It seems I had something wrong with the first project I experimented with. The only oddity I see is a field in the inspector of the imported .bundle (see image below), but the plugin actually works fine. Sorry for the false alarm.

Alex Van Gils's icon

Thank you for following up, Stefano! I've tracked this for some sleuthing here: https://github.com/Cycling74/rnbo.unity.audioplugin/issues/29

Tonzari's icon

Hello! I'm here just to say that I'm thoroughly enjoying learning RNBO (and Max) for use in Unity!

Ran a few "hello world" tests just to make sure I understood the process and hope to build out some fun prototypes in the coming days.

Is this forum the best place to discuss ideas, make suggestions, and ask general questions? (Any obvious bugs/issues I can take to GitHub, but I'm curious whether you have a preferred channel for general convo.)

Thanks for creating the RNBO -> Unity Repo and docs!

Alex Van Gils's icon

Hi Tonzari! Glad that you are having fun checking this out, and certainly, we can chat about the topic here. Fire away : )

ncdlmn's icon

Hi @Alex. This is great, and I've been having a lot of fun experimenting with this implementation. Thank you very much for the thorough documentation!
I want to be able to use the spatialization option that Unity provides with my plugin. It seems like Jan figured out a way to do it. Is there a more direct way to implement the spatialization? Maybe a way to route the signal around? I'm thinking something along these lines, where you add an empty file to the Audio Source and then multiply an adc~ input by the end of the signal in RNBO (see here: https://github.com/LibPdIntegration/LibPdIntegration).
I haven't tried this yet, but I wanted to ask first and see if this is something that was already addressed. Thanks again!

Alex Van Gils's icon

Hey @NCDLMN, I'm glad you've been having fun! Certainly feel free to share anything you've been hacking on -- it would be cool to see/hear what folks have been doing.

Probably the easiest way to take advantage of Unity's spatialization features is to make a "Custom Filter," as in this example: https://github.com/Cycling74/rnbo.unity.audioplugin/blob/main/docs/CUSTOM_FILTER.md

What's sort of cool about this is that because this requires an Audio Source, you can naturally take advantage of the 3D spatialization algorithms that Unity provides for an Audio Source. And if you change which spatializer you are using (in Unity's settings), that change should apply to this Audio Source as well.
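
To sketch what that can look like in practice (the handle class name below is a placeholder for whatever your export generates; the rest is the same pattern as the custom filter example):

using UnityEngine;

[RequireComponent(typeof(AudioSource))]
public class SpatialSynth : MonoBehaviour
{
    // Placeholder -- use the Handle class generated from your own RNBO export
    MySynthHandle synth;

    void Start()
    {
        synth = new MySynthHandle();

        // Make this Audio Source fully 3D so the spatializer selected in
        // Unity's audio settings is applied to this object's output
        AudioSource source = GetComponent<AudioSource>();
        source.spatialBlend = 1f;
        source.spatialize = true;
    }

    void Update()
    {
        synth.Update();
    }

    void OnAudioFilterRead(float[] data, int channels)
    {
        if (synth != null)
        {
            synth.Process(data, channels);
        }
    }
}

From there you can position the GameObject in the scene, use the Audio Source's usual 3D settings (rolloff, doppler, and so on), and set its Output to whatever mixer group you like.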

Does this point you in the right direction?

A

Alex Van Gils's icon

If you'd like to see this implemented, the "humming orb" from our demo project ( https://cycling74-assets.nyc3.digitaloceanspaces.com/rnbo/unity/example-projects/RNBODrumkit.zip ) uses this strategy.

A

ncdlmn's icon

Hi Alex, thanks for your reply.
Yes, the CUSTOM_FILTER example works great. Just a couple of questions. I have a plugin that requires a buffer (a granular synth), and I'm able to make it work as in the BUFFERS.md example. But I cannot use both the buffer script and the filter script on the same object (I don't get any sound out). It probably has to do with the instance number? But because the filter does not generate the GUI for the plugin, I cannot access it. Does that make sense? The overall question would be: is there a way for the CUSTOM_FILTER example to generate the plugin GUI automatically? Or would I have to program each of the parameters again, as in the TRANSPORT_TEMPO.md example?

Alex Van Gils's icon

Hi NCDLMN,

Thank you for the question -- so, I think I have a sense of what you are trying to do, but I think there might be some slight confusion. If you make a custom filter from your RNBO device, you don't need an Instance Index -- that Instance Index is what we use to distinguish which instance of a RNBO device, loaded on an Audio Mixer, a given component should manipulate.

If you make a filter, as we do with the CUSTOM_FILTER example, we don't load that RNBO device onto a track on the mixer. Instead, the Audio Source associated with your filter has an Output field in the inspector, where you can select your desired destination for this audio.

So you don't need to generate the GUI with the param sliders, instance index, etc. This is all for the benefit of the Audio Mixer. You can just create your filter and set the Output of the Audio Source. If you want to add buffers to your custom filter, you can do so just like the BUFFERS example, and when you create your

[SerializeField] AudioClip Buffer;

you should see that field in the inspector of your filter component.
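
To make that concrete, here's a rough sketch of a single filter script that also exposes a buffer. The handle class name is a placeholder for your granular export, and I've left the actual buffer-loading call as a comment, since that part is exactly what BUFFERS.md walks through:

using UnityEngine;

[RequireComponent(typeof(AudioSource))]
public class GranularFilter : MonoBehaviour
{
    // Placeholder -- use the Handle class generated from your own RNBO export
    GranularHandle granular;

    // This shows up in the Inspector of this filter component;
    // drag your AudioClip onto it there
    [SerializeField] AudioClip Buffer;

    void Start()
    {
        granular = new GranularHandle();

        // Load 'Buffer' into your RNBO buffer here, using the call
        // shown in the BUFFERS example for your export
    }

    void Update()
    {
        granular.Update();
    }

    void OnAudioFilterRead(float[] data, int channels)
    {
        if (granular != null)
        {
            granular.Process(data, channels);
        }
    }
}

The key point is that it's all one script on one GameObject -- no Instance Index and no Audio Mixer slot -- and the Audio Source's Output field decides where the audio goes.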

Does this all make some sense? If not, please feel free to share your script, either here or in an email to support@cycling74.com, and I'll be happy to try to help.

Yours,

A

ncdlmn's icon

Hi Alex, thanks for your response. I was able to add the buffer to the filter script, and now everything is working as I expected.
The main reason I wanted to access the plugin's parameters is that I wanted to modify them in real time and have some GUI feedback. I ended up accessing the parameters and creating the sliders as in the Transport Tempo example. Not really a hard task, but I was wondering if there was an easier way to access the parameters through the script. Anyway...
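For reference, these are the relevant bits from my filter script (the handle class and parameter name are from my own project, plus a using UnityEngine.UI; at the top):

    // UI slider assigned in the Inspector
    [SerializeField] Slider grainSizeSlider;

    // Cache the parameter index once, as in the Transport Tempo example
    readonly System.Int32 grainSize = (int)GranularHandle.GetParamIndexById("grainSize");

    void Start()
    {
        granular = new GranularHandle();

        // Push slider changes straight to the RNBO parameter
        grainSizeSlider.onValueChanged.AddListener(value => granular.SetParamValue(grainSize, value));
    }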
Exporting Max/RNBO patches and using them in Unity brings incredible opportunities, and the real-life applications are endless. I could totally see myself using this workflow more often in my practice.
I will get back to you via email in case I have further questions.
Best,
N

Alex Van Gils's icon

Sounds good, NCDLMN, and you are very welcome! I'm glad this integration has been useful for you, and we'd love to see+hear what you are doing with it.