real time notation
Is there a tool that creates notation out of audio produced in Max? Meaning, if I play and tweak in a patch, is there something that makes real-time notation?
yes, that can be programmed in max. ;)
Curious :P
If you mean musical notation, as in notes on a staff, then look at the Bach package (package manager).
Bach would perhaps be the second part. According to the question, he wants audio-to-MIDI first, and nobody wants to start a 132-page thread ...
However, it would be interesting to know what "notation out of audio" means. I guess notation for pink noise is not in question, so what kind of audio is it and how is it generated?
Maybe not 132 pages :)
just a simple thought experiment
What is the minimum number of different parameters that one needs to extract from the audio?
What will it take to extract them?
Can they be converted to MIDI, or to Bach?
//
2 parameters (possibly a 3rd)
Pitch detection, duration detection, and (dynamics)
1. any number of pitch detectors
2. an event gate and counter designed to open and close when the pitch or attack changes, and spit out milliseconds (or samples)
3. a way to send a single, different bang at different amplitude levels each time the gate opens
MMMax!
edit - audio "produced in Max" will be much easier to manage (and notate) than audio that is performed on an analog instrument and "captured in real time".
(1 - finished; 2 - not difficult in Max, but difficult to capture; 3 - same as 2)
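As a rough sketch of points 2 and 3 above: assuming a pitch tracker (fiddle~ or sigmund~, for instance) sends a "pitch amplitude" pair on every detected attack, the gate/counter idea could be glued together in a [js] object roughly like this (hypothetical glue code, not a finished transcriber):

// event gate sketch for the Max [js] object
// assumption: the left inlet receives "pitch amplitude" lists on every attack
inlets = 1;
outlets = 1;

var lastPitch = -1;   // MIDI pitch of the note currently sounding
var lastAmp = 0;      // amplitude captured when the gate opened
var lastTime = 0;     // onset time of that note, in ms

function list(pitch, amp) {
    var now = new Date().getTime();
    if (lastPitch >= 0) {
        // the gate closes: the previous note gets its duration in milliseconds
        var dur = now - lastTime;
        // output: pitch, rough 0-127 velocity, duration in ms
        outlet(0, lastPitch, Math.round(lastAmp * 127), dur);
    }
    // the gate reopens for the new event
    lastPitch = pitch;
    lastAmp = amp;
    lastTime = now;
}

Each output is one finished event (pitch, velocity, duration), which is the kind of triplet that can later be sent on towards MIDI or bach.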
One can surely outline what parts would be required in theory: transient detection, pitch tracking, making up a rule for how to combine those, then you could next try to do it on different frequency bands, and so on. But there are good reasons why Melodyne does it completely differently (and still can't capture everything), because with 3 different instruments in a highly dynamic piece you'd have no chance with average~ and fiddle~, as they will only produce data salad.
In 2023 I would probably start by splitting the imaginary instrument tracks in the piece using one of the better AI-based unmixing online services. In step 2 you could then analyse those tracks to check their maximums and minimums in terms of level and frequency, to adjust how their history shall be tracked later. And only in step 3 would one think about finding the frequencies, attacks, and lengths of the musical events, which is not difficult at all.
Inserting rest symbols and finding the key and scale system for the notation should be the easiest part then.
But you have to start with page 132 here, or you will never reach page 1.
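As a minimal sketch of the "rule for how to combine" transient detection and pitch tracking mentioned above (the inlet layout and the 80 ms window are assumptions, not recommendations): a pitch value is only passed on if an attack was detected shortly before it, which filters out some of the tracker jitter that produces the data salad.

// assumption: pitch floats arrive at the left inlet, attack bangs at the right
inlets = 2;
outlets = 1;

var lastAttack = 0;    // time of the last detected attack, in ms
var windowMs = 80;     // hypothetical tolerance window

function bang() {
    if (inlet == 1) {
        lastAttack = new Date().getTime();
    }
}

function msg_float(pitch) {
    if (inlet == 0) {
        var now = new Date().getTime();
        if (now - lastAttack < windowMs) {
            outlet(0, pitch);   // pitch confirmed by a recent attack
        }
        // otherwise ignore it as tracker jitter
    }
}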
It depends on whether it's going to be an experiment or a traditional musical application. If the latter, every little thing can be a big thing. Think of transients, pitch deviation after transients, dynamic sample start and length, degree of precision, speed, pitch range and bend correction, etc. There are few people on earth who have actually mastered audio-to-MIDI for real time in a universally usable way. We'll see what AI has to offer in this niche segment.
However, we can't do anything with the question of the thread as long as we don't know details about the sources and expectations. Audio-to-MIDI in general is too little information, and Roman instantly gave the right and only answer: it can be done :-)
Well, let's say I have many instruments in a patch, kind of musique concrète, doing things in Max, so a real-time performance, and I would need some sort of program that would help me notate what is played in Max. It doesn't need to be really detailed, just some sort of score that I could export.
So Max -> score -> live ensemble performance (performed from the score I'd get out).
Something in this direction.
All the samples I'm using are instruments recorded live, one by one, and now I'd like to make different types of mosaics and see what interesting things happen out of this. Therefore I'd need 'something' that would help me notate what is played in Max in real time while I'm tweaking the knobs and parameters.
Hope that's clear enough? haha
I'm more or less a newbie in Max, doing it for the past 2.5 years, and I've never done anything in this direction.
I studied composition and come more from a musical background than a programming one, but I had Max classes there as well. Also, I'm really interested in live audio processing types of patches. So if some of you know a book, tutorial, or patch reference where I could get an idea, that would be awesome.
I know ezdac~ and how to create simple things, but nothing really interesting comes out. Or maybe it does sometimes, but not enough to create a composition for a certain instrument and electronics.
Thank you a lot for your answers. Always nice to see how supportive people get here.
Best
Hello, in the past I did a series of works called "Adaptive studies", in which I developed a real-time transcription system structured as follows:
a pitch detector (from audio to MIDI) - I tried different objects (retune~, fiddle~, sigmund~) and in the end opted for sigmund~; an attack detection and duration calculation system; and a storage system based on the bach library (starting from bach.transcribe), which allows storing, in symbolic form (score fragments), successive snapshots of what an instrument is playing. The system was designed to work on a single instrument, but I also created a version with two instruments (separate microphones and separate transcribers). Obviously the system is not free of errors (pitch detectors in live situations especially are not always reliable), but for my purpose the error was tolerable.
Here is a paper that describes the system (I apologize but it is in Italian)
https://www.riccardodapelo.com/uploads/9/4/8/3/94832014/verso_unopera_adattiva.pdf
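One small piece that always comes up between duration calculation and score fragments is snapping raw millisecond durations to note values. The following is not taken from the system described above, just a hypothetical helper for a [js] object, assuming a fixed tempo sent as a float to the right inlet:

// snaps a duration in ms (left inlet) to the nearest plain note value
inlets = 2;
outlets = 1;

var bpm = 60;   // tempo, set by sending a float to the right inlet

function msg_float(v) {
    if (inlet == 1) { bpm = v; return; }
    var quarter = 60000.0 / bpm;                 // one quarter note in ms
    var grid = [4, 2, 1, 0.5, 0.25, 0.125];      // whole note down to 32nd, in quarters
    var best = grid[0];
    var bestErr = Math.abs(v - grid[0] * quarter);
    for (var i = 1; i < grid.length; i++) {
        var err = Math.abs(v - grid[i] * quarter);
        if (err < bestErr) { best = grid[i]; bestErr = err; }
    }
    outlet(0, best);   // duration in quarter-note units (1 = quarter, 0.5 = eighth, ...)
}

Dotted values and tuplets are deliberately left out here; a real transcriber would need them, plus some tolerance for rubato.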
@Elvis Homam - I'll post some examples soon
for now you can:
Download Bach and doodle
Check out some pitch detectors and doodle
we can work out how to convert things from audio to notation
later
wil~
Hi Elvis.
Here is a short example of audio pitch detection to notation in Bach. (Very basic, needs improvement - next step: add durations + continuous notation.)
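Not the attached example, but as a rough sketch of the conversion step such a patch needs: bach objects think in midicents (MIDI pitch times 100, so middle C is 6000), so a detector's output has to be scaled before it reaches the notation objects. Assuming the detector already reports pitch in MIDI and amplitude between 0. and 1., the glue in a [js] object can be as small as this (check the bach help files for the exact message format the object you use expects):

// assumption: "pitch amplitude" lists in, "midicents velocity" lists out
inlets = 1;
outlets = 1;

function list(midiPitch, amp) {
    var cents = Math.round(midiPitch * 100);   // 60. (middle C) -> 6000
    var velocity = Math.max(1, Math.min(127, Math.round(amp * 127)));
    outlet(0, cents, velocity);
}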