SwiftRNBO



SwiftRNBO is a translation layer between a Swift-based Apple platform project and C++ code that was exported using Cycling '74 RNBO software. It requires no knowledge of C++ to use and allows any Apple platform developer familiar with the Cycling '74 Max programming environment to build sophisticated and efficient audio applications with the ease of use of Swift and SwiftUI.
Cool project! Can't test it out myself as I don't have RNBO yet, but I faced a similar task with Gen~ code export, so this could be a major advancement in this field. Maybe I can even create apps for Apple Watch now?
Thank you very much!
You actually can just do everything in Gen and put it inside an rnbo~ subpatch. It should work without needing to save the rnbo~ subpatch.
Objective-C++ Wrapper: Alex Nadzharov
does this work for mobile iOS apps?
Totally. The current example is multiplatform: iOS, macOS and tvOS work out of the box. But currently only a SwiftUI example is present.
Did anybody try this? how can you upload the patch to an iPhone? I don’t have RBNO now but this could be a really good reason to jump in.
It is not exactly an "upload", but more of a way to use RNBO in an iOS project. So you still need to create the UI and app logic in Swift. You actually create an application for iOS in Xcode and use the RNBO export as an embedded Audio Unit. Therefore, you need to know iOS development to do that.
Fantastic. Hopefully I get some time in the holidays to dive in!
Thank you for this amazing work! I've managed to get an iOS app working that produces audio output, but I can't get microphone input into the signal chain. I've been looking at RNBOAudioEngine.swift in the example project and uncommented this line in initInput()...
let format = input.inputFormat(forBus: 0)
...which causes the app to prompt me to allow microphone input when I run it on my phone, yet still no audio from the microphone passes into the signal chain.
Any suggestions for what to modify here?
Hello!
This is actually one of the enhancements we are planning to implement:
https://github.com/ceammc/SwiftRNBO/issues/19
If someone manages to do that, please make a pull request linked to this issue.
In the meantime I will try to solve it myself. Should not be that hard.
Thanks! I'll keep my eye out for movement on this. Would be really great to have this functionality.
BRATSCHEMEISTER,
Done! Check out the latest release (v0.4.3).
The problem was that right after connecting the device's input node, we were connecting an audio player node, which was replacing the device's input node. Now they can play together thanks to a couple of AVAudioMixerNodes. The mic is muted by default to avoid feedback.
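For anyone curious what that wiring looks like, here is a rough sketch of the approach described above. The node names (`micMixer`, `inputMixer`, `rnboUnit`) are illustrative placeholders, not the actual SwiftRNBO source:

```swift
import AVFoundation

// Rough sketch of the v0.4.3 approach described above, not the actual
// SwiftRNBO code. `rnboUnit` stands in for the AVAudioUnit created from
// the RNBO export.
func wireInputAndPlayer(engine: AVAudioEngine,
                        playerNode: AVAudioPlayerNode,
                        rnboUnit: AVAudioUnit) {
    let micMixer = AVAudioMixerNode()   // gain/mute stage for the mic
    let inputMixer = AVAudioMixerNode() // sums mic and player signals

    engine.attach(playerNode)
    engine.attach(micMixer)
    engine.attach(inputMixer)

    let inputFormat = engine.inputNode.inputFormat(forBus: 0)

    // Mic and player each reach the RNBO unit through mixers, so
    // connecting one source no longer replaces the other: a mixer
    // node accepts multiple inputs on its next available bus.
    engine.connect(engine.inputNode, to: micMixer, format: inputFormat)
    engine.connect(micMixer, to: inputMixer, format: inputFormat)
    engine.connect(playerNode, to: inputMixer, format: nil)
    engine.connect(inputMixer, to: rnboUnit, format: inputFormat)

    // Muted by default to avoid feedback, as in the release notes.
    micMixer.outputVolume = 0
}
```

The key design point is that AVAudioMixerNode can sum multiple sources, while connecting two sources directly to the same input bus of an ordinary node replaces one with the other.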
Sweet!!! Great catch, Eldar! Thanks for looking into this! :D
Hey Eldar - so I'm getting a kernel error about a sample rate mismatch...
Thread 18: "required condition is false: IsFormatSampleRateAndChannelCountValid(format)")
...when I switch format to match the microphone by uncommenting these first two lines and commenting out the third:
let input = engine.inputNode
let format = input.inputFormat(forBus: 0)
//let format = avAudioUnit!.inputFormat(forBus: 0)
I'm curious - what steps did you take to get the microphone to work? Am I going about this the wrong way?
BRATSCHEMEISTER,
If you make the changes you mentioned, then the check on line 25
if input.outputFormat(forBus: 0).sampleRate == format.sampleRate {
is always true, regardless of whether or not they actually match. That's why you get the error when they don't. So my advice is to just leave them as they are. If you need a super quick fix without changing a lot, like I did in v0.4.3, you can just comment out this line:
engine.connect(playerNode, to: avAudioUnit!, format: audioFile?.processingFormat)
This will leave the mic connected. Just make sure not to start the player or the app will freeze.
If the problem persists, I would kindly ask you to create an issue on github, so we could address it more conveniently.
Yeah, I'm a bit lost, since even if I run the project as is without making any changes, I get this error: "Could not connect input node: sample rate mismatch"
I'll follow up over on Github. Thanks for your patience.
Ah, I got it. I think you meant to say comment out this line:
//engine.connect(playerNode, to: avAudioUnit!, format: audioUnitFormat)
No, actually you don't need to change anything in the last version - I was speaking of the previous one. I assumed you already had a project with it, so by commenting this line you would prevent the connection of the audio player, which was overriding the connection of the input node.
If you are using v0.4.3, the message "Could not connect input node: sample rate mismatch" means what it says: the rates are most likely different! Make sure both your microphone and output device are set to the same sample rate. I can't figure out yet how to force them to switch to the same sample rate when they don't match; that's probably a good enhancement that could be done. Also try it on macOS or in a simulator: I'm not sure how to change the sample rate on iOS.
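For the iOS side, one possible (untested here, just a hedged sketch) approach is to ask AVAudioSession for a preferred sample rate before building the graph. Note that setPreferredSampleRate is only a request; the hardware may settle on something else, so the code should read back the actual rate:

```swift
import AVFoundation

// Hedged sketch: request a sample rate from the iOS audio session.
// The system treats the preferred rate as a hint, not a guarantee.
func requestSampleRate(_ rate: Double) throws -> Double {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, options: [.defaultToSpeaker])
    try session.setPreferredSampleRate(rate)
    try session.setActive(true)
    return session.sampleRate // the rate the hardware actually settled on
}
```

Whatever rate comes back would then be the one to use when connecting the engine's nodes, rather than a hard-coded 44100 or 48000.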
I ran into the sample rate mismatch error when running in the simulator on my computer. At 48k I'd get the error; at 44.1k I wouldn't. Grabbing the hardware format from the microphone solves the issue, so that it runs in the simulator regardless of what my computer's sample rate is set to:
// Use the hardware input format for the input-side connections...
let hardwareInputFormat = engine.inputNode.inputFormat(forBus: 0)
//let audioUnitFormat = avAudioUnit!.inputFormat(forBus: 0)
engine.connect(engine.inputNode, to: avAudioUnit!, format: hardwareInputFormat)
engine.connect(avAudioUnit!, to: engine.mainMixerNode, format: hardwareInputFormat)
// ...and the output node's own format on the output side.
let outputFormat = engine.outputNode.inputFormat(forBus: 0)
engine.connect(engine.mainMixerNode, to: engine.outputNode, format: outputFormat)
In any case, I'm a n00b with Swift and Xcode and fumbling my way through it, so your project has been TREMENDOUSLY helpful. Thank you for putting up with my riffraff!
Glad to hear that it works! Feel free to ask/suggest anything on GitHub, it will be very appreciated.
Hi, I have a question. I already have a project with the JS export, so of course I use HTML, CSS, JS and the JSON file. I only use parameters. However, I'm trying to build the same thing for iPhone and iPad. I found this forum and the GitHub repo, which I've already downloaded, but I don't know the Swift language.
Can someone explain in a bit more detail how to create the UI and how Swift works?
Thanks
Carmen Fregoso,
I recommend you watch these wonderful YouTube tutorial series:
Hacking with Swift by Paul Hudson
SwiftUI Bootcamp by Swiftful Thinking
Tutorials by Sean Allen
There is more, but these are more than enough, unless you are looking for something really specific.
Also asking questions to ChatGPT really helps sometimes. Just check everything it says.)
There are two parts to this: learning Swift, the programming language itself, and SwiftUI, a relatively easy-to-use UI framework made for Swift. Most of these authors have tutorials on both, and sometimes they even combine them into one course.
Hi,
Thanks to your RNBO-Swift layer I can play some sounds by sending MIDI note-on and note-off to my patch export. Unfortunately I cannot change parameters. Is anyone facing the same issue?
Thank you for your work.
Hi, Alexis!
Did you manage to run the example project? There are sliders that should control parameters. If they are working fine, we must look into your specific case.
Best regards,
Eldar
Hi,
thank you so much for answering me so quickly. Thanks to your advice I can now change parameters, but only using the setParameterValueNormalized() function. I still don't understand what is going wrong when I use setParameterValue(). If I find my error, I will let you know.
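In case it helps debugging: assuming RNBO's normalized range is 0...1 (as in the JS export), the usual relationship between the two calls is a linear mapping over the parameter's declared min/max. The helper below is purely illustrative, not part of the SwiftRNBO API; if setParameterValue() seems to do nothing, a value outside [min, max] is a likely culprit:

```swift
// Illustrative only: how a normalized 0...1 value typically maps onto a
// parameter's actual [min, max] range. setParameterValue() would expect
// the denormalized value; setParameterValueNormalized() the 0...1 one.
func denormalize(_ normalized: Double, min: Double, max: Double) -> Double {
    min + normalized * (max - min)
}
```

For example, for a frequency parameter ranging 20...20000, a normalized 0.5 corresponds to an actual value of 10010, not 0.5.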
Hi,
I tried to run the example app on an iPhone, and got the sample rate mismatch message.
Tried the code from se.thorn above, and the error disappeared. But the app doesn't quite work: there is some sound, but it feels rather strange. I currently have two things I want to do: 1) remove the microphone input (hopefully that is easy), and 2) control the volume of the output. All help/suggestions on how to do this are appreciated, and also info on how to do the sample rate thing correctly, unless the code above is what should be used (from googling, it seems the iPhone needs a 48000 sample rate).
Edit: I changed 44100 to 48000 everywhere in the code, and the milliseconds value to 0.02083333333. No immediate crashes, so maybe this is all that is needed?
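If that milliseconds value is the per-sample period, the arithmetic checks out: one sample at 48 kHz lasts 1/48000 s, i.e. about 0.0208333 ms. A quick sanity check (not project code):

```swift
// One sample period at 48 kHz, expressed in milliseconds.
let sampleRate = 48_000.0
let periodMs = 1_000.0 / sampleRate // 0.0208333... ms per sample
```

The corresponding value at 44.1 kHz would be 1000/44100, about 0.0226757 ms, which is presumably what the code contained before the change.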
Year: 2023
Location: Moscow, Russia