Integrating RNBO into an iOS / Objective-C-based project

amonk:

Hey all,
I'm looking to integrate RNBO into an existing iOS application built in Objective-C in Xcode. I am planning to make an Objective-C wrapper around the C++ code. I'm following the instructions for importing RNBO in Xcode:

Include the header files located at rnbo/RNBO.h and rnbo/common/*.h

Add rnbo_source.cpp and rnbo/RNBO.cpp to your compiled sources.
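For the wrapper itself, a minimal sketch of the Objective-C++ approach described above might look like the following. The class and method names here are my own invention; `RNBO::CoreObject`, `prepareToProcess`, and `process` come from the exported C++ API (where `SampleValue` defaults to `double`). The implementation file needs the `.mm` extension so Xcode compiles it as Objective-C++:

```objc
// RNBOEngine.h -- plain Objective-C header, so no C++ leaks into callers
#import <Foundation/Foundation.h>

@interface RNBOEngine : NSObject
- (void)prepareWithSampleRate:(double)sampleRate blockSize:(int)blockSize;
// Mono output for brevity; RNBO expects arrays of non-interleaved channel buffers.
- (void)processInto:(double *)output frames:(int)frames;
@end

// RNBOEngine.mm -- Objective-C++ implementation; .mm lets it include C++ headers
#import "RNBOEngine.h"
#include "RNBO.h"

@implementation RNBOEngine {
    RNBO::CoreObject _core; // the exported patcher
}

- (void)prepareWithSampleRate:(double)sampleRate blockSize:(int)blockSize {
    _core.prepareToProcess(sampleRate, blockSize);
}

- (void)processInto:(double *)output frames:(int)frames {
    double *outputs[1] = { output };
    _core.process((double **)nullptr, 0, outputs, 1, frames);
}
@end
```

This is a sketch, not a drop-in file; adjust the channel counts and sample type to match your export settings.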

However, when I then try to build the project, it appears to be missing files from the 'src' folder, with the error: 'src/RNBO_DynamicSymbolRegistry.h' file not found

Importing the 'src' folder creates (many) additional errors, and following those leads me down a rabbit hole that I don't quite know how to navigate.

Has anyone else encountered this, or figured out how to integrate it? Thanks!

amonk:

I was able to get it to compile. I simply needed to add the entire RNBO folder (recursively) to the Header Search Paths in each of my project's targets.
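For anyone who prefers to keep that setting in a build configuration file, the same fix can be expressed as an xcconfig fragment. The path below is an assumption; point it at wherever your RNBO export actually lives. The trailing `**` is what the "recursive" option in the build settings UI produces:

```
// RNBO.xcconfig -- assign to each target that compiles the RNBO sources.
// The "**" suffix makes Xcode search the folder recursively.
HEADER_SEARCH_PATHS = $(inherited) "$(SRCROOT)/rnbo-export/**"
```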

George Khut:

So does this mean we can build RNBO patches (e.g. synth/sample-player patches) in Max, and have them run on iOS devices? I've worked with developers previously, doing this with Pure Data (libpd), and I'm curious if I can now write my patches in RNBO instead?

Jean-Francois Charles:

George, it's not plug-and-play for iOS as it is for Raspberry Pi: for iOS, you can export audio processing code from RNBO, then you have to integrate it in Xcode and develop a graphical user interface, etc. If you have no experience developing with Xcode, that might be a steep learning curve.

George Khut:

Hello Jean-Francois,
Thanks for your reply. I have previously released an iOS app on the App Store, with assistance from excellent developers who were able to build an iOS app with my PD patch inside it (libpd).

I do appreciate it is certainly not a plug-and-play process, what with TestFlight and the various quirks of the App Store submission and publication process as well.

The PD patch that ran in the iOS app did all the data mapping work, plus interactive sound (sample playback, simple synths).

I used PD to perform heart rate data statistics (measure min-max, set baseline, etc.), and then to map this statistical data to control audio biofeedback (sounds that indicate changes in heart rate), PLUS OpenSoundControl messages back out to the iOS app that generate the visuals (e.g. map average heart rate to diameter and colour of a shape).
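The mapping described here (average heart rate to the diameter of a shape) is essentially a range-scaling operation, and it is easy to do on the app side in plain C++ regardless of where the audio runs. A hedged sketch; the 40-180 BPM input range and 10-200 px output range are invented for illustration:

```cpp
#include <algorithm>

// Linearly map x from [inMin, inMax] to [outMin, outMax], clamping at
// the edges so out-of-range readings don't blow up the visuals.
double mapRange(double x, double inMin, double inMax,
                double outMin, double outMax) {
    double t = std::clamp((x - inMin) / (inMax - inMin), 0.0, 1.0);
    return outMin + t * (outMax - outMin);
}

// Hypothetical example: 40-180 BPM mapped to a 10-200 px diameter.
double bpmToDiameter(double bpm) {
    return mapRange(bpm, 40.0, 180.0, 10.0, 200.0);
}
```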

So… I'm curious whether RNBO is best used mostly for audio applications (are all operations in RNBO performed at audio rate??).

I'm guessing that if all the statistics, mappings, and interpolations (spring, ease, etc.) derived from HR data in my app, which are used to modulate the sounds and visuals, are being done at audio rate (instead of, say, at a given frame rate), that may be quite a heavy CPU load for the iOS device.
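On the CPU-load concern: not everything in RNBO has to run at signal rate (parameters and messages are event-based), and either way a common pattern is to update statistics only once per control interval rather than once per sample. A rough C++ sketch of the idea; the struct names and the 512-frame interval are invented for illustration:

```cpp
#include <algorithm>
#include <limits>

// Running min/max/mean over a stream of heart-rate readings.
struct HeartRateStats {
    double minBpm = std::numeric_limits<double>::max();
    double maxBpm = std::numeric_limits<double>::lowest();
    double sum = 0.0;
    long count = 0;

    void update(double bpm) {
        minBpm = std::min(minBpm, bpm);
        maxBpm = std::max(maxBpm, bpm);
        sum += bpm;
        ++count;
    }
    double mean() const { return count ? sum / count : 0.0; }
};

// Call once per audio frame; returns true only every `interval` frames,
// so with interval = 512 at 44.1 kHz the stats update ~86 times a second
// instead of 44100 times a second.
struct ControlRateTap {
    int interval = 512;
    int counter = 0;
    bool shouldUpdate() {
        if (++counter >= interval) { counter = 0; return true; }
        return false;
    }
};
```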

My PD patch (which I'm sure could be improved) was pretty heavy on the CPU anyway, as you can see in the screencast below; combined with the interactive visuals, the app barely scraped through in terms of graphics frame rate.

I'm now in the process of rebuilding the app to work on iOS 16, so I'm trying to figure out whether I could switch my portion of the 'development' from PD to Max/RNBO/gen~.

Eldar Sadykov:

Hello everybody! Proud to present: https://github.com/ceammc/SwiftRNBO
Objective-C++ Wrapper: Alex Nadzharov
Maybe the OP could adapt this for Objective-C instead of Swift.
If anyone does, please contact us so we can include it.

Ruslan:

Cool, Eldar! That's just what I'm looking for. The only thing I still need to figure out is how to buy RNBO from Russia.

Eldar Sadykov:

Ruslan, we are happy to help! Please stay in touch with me if you have any questions: info@eldarsadykov.com