Please Help With Drum Sequencer
Hey guys! So I'm working on a drum sequencer for a project, but I'm running into some problems with it that I'd love some insight on.
First off, let me describe what I'm hoping to do. I want to build a step sequencer that allows drum programming for up to 8 drum sounds and up to 8 patterns of up to 4 bars each, switchable in real time. I'd also like to be able to edit the sequences at a 256th-note resolution. In the end, this will be edited from a hardware controller with a row of 16 buttons, so you'd select the drum sound and pattern you want to edit, the resolution you want to edit at, and the position in the overall pattern that's currently shown on the buttons. On top of all that, I want to be able to edit the velocity of each note individually.
Now let me get into some of the problems I'm having programming this so far. For one, I'm having loads of trouble with the amount of data this seems to need: a single sound of a single pattern at 256th-note resolution already has 1024 steps, so there's a lot of data to store and recall, and I'm having a lot of trouble with this. I've been working primarily with the preset and matrixctrl objects to do it, but I'm running into loads of errors. Also, when working with different resolutions, the stored data for each resolution needs to line up with the others. For example, a note on the first beat at one resolution should also sit on the first beat at every other resolution.
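Just to make the resolution question concrete, the way I picture it (with everything stored once on the finest grid):

4 bars at 256th-note resolution = 4 x 256 = 1024 steps per sound, per pattern
the same 4 bars on a 16th-note grid = 4 x 16 = 64 steps
so button n on the 16th-note grid maps to step n x 16 on the 256th-note grid
(16th step 0 -> 256th step 0, 16th step 1 -> 256th step 16, and so on)

That way a hit placed at one resolution automatically lands in the right place at every other resolution.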
So, with all of this, if anybody could give me some general pointers on how to go about doing this, it would be GREATLY appreciated.
Thanks,
Charlie
hi,
yeah, matrixctrl gets really weird at any size over 16x16. Once you hit that first hurdle, you should generally stop and re-assess the project.
You could consider dumping the data straight into coll, which generally won't mind having a thousand of anything in there. And it'll store a list in each slot, so that could be a list for all 8 drum sounds.
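Just as a rough sketch (one way among several): if each coll slot is one 256th-note step and each slot holds the 8 velocities, one per drum sound (0 = no hit), the coll text file would look something like

0, 112 0 0 96 0 0 0 0;
1, 0 0 0 0 0 0 0 0;
2, 0 0 0 64 0 0 0 0;
3, 0 0 0 0 0 0 0 0;

Sending just the index to coll spits that slot's list back out, and sending the index followed by a new 8-element list overwrites it, so playback is basically counting 0 to 1023 and reading slots as you go. You could keep one coll per pattern, or pack all 8 patterns into one coll with an offset of pattern-number * 1024.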
Or you could use peek~ and poke~ to write the data into a buffer~, which will hold an absurdly high number of values.
Have you done any sort of mockup work for this project? Because the more I think about it, the more absurd 256th-note programming sounds. Like, if I *ever* do anything at that rate, I'm certainly not step-programming each individual note, but rather drawing out rolls.
Thanks for the help! I'm having a lot more success with coll. It seems the hardest part of this project is going to be managing all the data. And yeah, I understand editing 256th notes seems a bit ridiculous, and there's a good chance I'll scrap it and stick with 128th or maybe even 64th notes, but I'm hoping it'll add a really precise level of control. I want to use this to create some really complex rhythmic patterns, and I figured that editing at a 256th-note resolution could help with that.
I think a great UI has to strike a balance between form and function, and it has to be easy to navigate.
IDK, I think micro-edits are great, but doing it this way is akin to menu-diving on a DX7; you might be able to build something complex, but it won't be fun, and precise control ceases to be feasible.
I go on about this because it's a spectrum I spend a lot of time navigating these days, building sequencers and writing specifications for them...
The method I'm attempting right now is building blocks of sequences, and then having a sort of meta-sequencer sequence those.
This allows for a relatively high amount of complexity while keeping the interface manageable.
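Just to sketch the idea (numbers made up): the blocks themselves live in their own storage, and then a tiny "song" coll only says which block comes next and how many times it repeats, e.g.

0, 1 4;
1, 3 2;
2, 1 4;
3, 5 1;

where each slot is (block number, repeat count). The meta-sequencer just walks through this short list, and all the fine-grained step data stays down in the blocks.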
I wonder if it would be possible in Max to build something like the piano roll editor of Live (or other DAWs) where you can switch between different note resolutions and levels of time control by zooming in and out.
just a poly-live.step, really. That'd be really neat, I think.
So the sequencer is close to finished now, thanks to the coll object, which really helped me organize all of my data. I'm running into a bit of a problem with one aspect, though, that someone may be able to shed some light on. I'm trying to add the ability to sync the sequencer with Ableton (I'm running this in Max for Live). I was accessing the current song time from the Live API and processing that to do it, but it isn't running very stably: it tends to trigger at bad times, as if it isn't updating fast enough. So I'm wondering what the best way is to sync a beat in Max for Live to Ableton's clock.
yep. The "transport" object does this.
The object can sync with Ableton?
the whole runtime is synced with Live, if you want to call it that.
In M4L you could also use [plugsync~]. It outputs the transport state of Live at intervals of 64 samples (about 1.5 ms at a 44.1 kHz sample rate). In particular, on its 7th outlet it delivers the current beat as a float, just like current_song_time from the API, but with much better time accuracy.
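If it helps, here's a rough, untested sketch of a [js] object that turns that beat value into a 0-1023 step index for a 4-bar pattern in 4/4 at 256th-note resolution (the pattern length, time signature and the coll lookup downstream are all assumptions from earlier in the thread):

// convert Live's beat position into a 256th-note step index
// assumes 4/4: 1 beat = 1 quarter note = 64 256th notes,
// so a 4-bar pattern = 16 beats = 1024 steps
inlets = 1;
outlets = 1;

function msg_float(beats)
{
    var step = Math.floor(beats * 64) % 1024;
    outlet(0, step); // use this to look up the matching coll slot
}

If the beat comes out of plugsync~ as a signal in your setup, put a snapshot~ (or similar) in front of this so the js object receives floats, and a [change] after it is handy so you only get a new value when the step actually advances.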
Great, that's exactly what I was looking for!
transport has a 480ppq resolution, btw. That's 1/1920th of a bar ;)
IDK, I've never found the need to go any higher. YMMV.