help making customizable UI inside locked patch

    May 07 2009 | 8:46 pm
    I have been testing an idea which will hopefully lead to a patch that lets the user drag the UI elements around, changing both their position and how they interconnect with the other objects in the signal chain.
    For example, a collection of synth modules which can be re-arranged in the signal chain simply by dragging them into a new visual order, inside a locked patch/standalone application.
    I've not yet tried to get the msp connections to automatically connect/disconnect depending on position, but it looks like it might be possible using thispatcher.
    I have the dragging working for normal UI stuff, but thispatcher does not seem to communicate with bpatcher in the same way. The whole idea would be very complicated without using bpatchers for each draggable module, as all of the positions of each element in relation to the others would get messed up.
    Any ideas?
    Here's the patch I made to test it with:
    [the patch contained in the bpatcher should be embedded in the test patch; it's an mlrV-inspired gain+VU I made to test the dragging]

    • May 07 2009 | 9:12 pm
      try the "sendbox" message to [thispatcher] like this... I haven't used "sendbox" much, in fact I just learned about it myself very recently.
      also, [key] and [keyup] don't take arguments...
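      For reference, the "sendbox" approach boils down to assembling one text message for [thispatcher]. A minimal javascript sketch of the string-building (the scripting name "module1" is a placeholder, and the exact verb form and the "patching_rect x y width height" attribute layout are assumptions to check against the helpfile):

      ```javascript
      // Build a [thispatcher] script message that repositions a named box.
      // Assumption: the message takes the form
      // "script sendbox <varname> patching_rect <x> <y> <width> <height>".
      function moveBoxMessage(varname, x, y, w, h) {
        return ["script", "sendbox", varname, "patching_rect", x, y, w, h].join(" ");
      }

      // moveBoxMessage("module1", 10, 20, 120, 60)
      //   → "script sendbox module1 patching_rect 10 20 120 60"
      ```

      In the patch itself this would normally be put together with [sprintf] or a message box rather than [js]; the sketch just shows the shape of the message.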
    • May 07 2009 | 9:30 pm
      Ahh, I didn't know about sendbox; I'm very new to all the possibilities of thispatcher.
      That works well, thanks very much!
      Do you know if it's possible to have MSP connections visible in presentation mode?
    • May 07 2009 | 10:06 pm
      I don't think this is possible, however if you put your connections in a [bpatcher] which isn't in presentation mode you can obviously still see them. The link below is to my UI clickable [matrix~]-esque patchbay which might give you some ideas.
      Note: the first inlet is a "ghost" inlet, so it will always show a Max connection even if an MSP object is linked to the first inlet.
    • May 07 2009 | 11:47 pm
      I think if I want the connections visible, I'll just make the presentation of the patch in patch mode, with unwanted objects hidden.
      Here's my first attempt at getting two objects to connect only when the bpatcher is in a certain area of the screen. I tried just using the mouseover function of [lcd] and [tab] to trigger the connect script, but they don't respond when another object is dragged over them.
      Hopefully someone has tried this before and knows a better way of doing it?
      I had a look at your patchbay, and I don't really understand how it works; I've only been using Max for a couple of months, so I don't understand the sprintf objects and what they're doing.
      Anyway, here's the connect attempt:
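      Since [lcd]/[tab] mouseover won't fire while another box is being dragged over them, one workaround is to do the region test yourself on the rectangles. A hedged javascript sketch of that logic (the {x, y, w, h} rect shape and the names are illustrative, not from the patch):

      ```javascript
      // Axis-aligned rectangle overlap test: true when rects a and b
      // share any area (touching edges count as not overlapping).
      function rectsOverlap(a, b) {
        return a.x < b.x + b.w && b.x < a.x + a.w &&
               a.y < b.y + b.h && b.y < a.y + a.h;
      }
      ```

      If the dragged bpatcher's rect overlaps a target zone, fire the connect script; when it stops overlapping, fire the disconnect.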
    • May 08 2009 | 12:16 am
      The [sprintf] object is just used to format messages to [thispatcher]. I thought you might find it useful if you wanted to go down the modular synth type route of connecting your subpatches, but if you want the connections onscreen to the actual inlets and outlets you will probably want to go with making the connections direct to the objects. I've included the helpfile for the lh.patchbay example anyway just so you can see how it works in a [bpatcher] but it might not be the right tool for you.
      If you want to report whether your mouse is over a [bpatcher] or not, you could give it a scripting name in the object inspector and then use [hover] to trigger connections. A scripting name would also allow you to use the "connect" and "disconnect" messages to [thispatcher] to make and break connections (see the helpfile). It all depends on how you envision this project working. If you could give more detail on how you'd like to move and connect your subpatches it might help. I'm sorry if I'm confusing you more than anything else, but I do think your idea is really interesting. Let me know how it progresses; I'll help if I can.
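      The "connect"/"disconnect" messages mentioned above are plain text scripts, so they can be assembled with [sprintf] or [js]. A small sketch, assuming the helpfile's "script connect <from> <outlet> <to> <inlet>" form and using placeholder object names:

      ```javascript
      // Format a [thispatcher] script to make or break a patch cord
      // between two boxes that have scripting names.
      function connectMessage(fromName, outlet, toName, inlet, makeConnection) {
        var verb = makeConnection ? "connect" : "disconnect";
        return ["script", verb, fromName, outlet, toName, inlet].join(" ");
      }
      ```

      For example, connecting outlet 0 of a box named "osc1" to inlet 1 of "filt1" would produce "script connect osc1 0 filt1 1"; the same call with the flag false produces the matching disconnect.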
    • May 08 2009 | 1:38 am
      Here is another way that could conceivably work. I haven't dealt with making connections but it shouldn't be too difficult from here on in. In any case, what is happening is that after you move a bpatcher its new location is stored in the coll and then from there one could detect which patcher is the right-most, top-most etc...
      Hope this helps.
      Open the "bpMoveTest" file!!
      sorry for the messy patching... it is probably also overly complicated right now.
      The js code isn't mine. I got it from someone on the forum some time ago and can't remember from whom...
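      The coll lookup described above reduces to a small piece of logic: given each module's stored position, pick the extreme one. A sketch in javascript (the data shape, name → [x, y], is an assumption about how the coll is laid out):

      ```javascript
      // Given stored positions like {modA: [x, y], ...}, return the name
      // of the right-most module (largest x). This is the selection that
      // could then drive the signal-chain ordering.
      function rightmost(positions) {
        var best = null;
        for (var name in positions) {
          if (best === null || positions[name][0] > positions[best][0]) best = name;
        }
        return best;
      }
      ```

      The same loop comparing index 1 with `<` would give the top-most patcher, and sorting all entries by x would give the whole left-to-right order of the chain.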
    • May 08 2009 | 1:42 am
      I didn't know [hover]. Thanks!!
    • May 08 2009 | 1:50 am
      I like the way it moves relative to where you click the mouse; I was thinking of piecing together a javascript to do something similar. Here's a simple method I've just pieced together that doesn't rely on [ubutton], though it always moves the object's top-left corner to the mouse position. lh
    • May 08 2009 | 8:40 am
      Thanks for the replies! I guess what I have in mind is something similar to Reactable, where each bpatcher would represent something like an oscillator/filter/ADSR module, but made so that (unlike my impression of Reactable) it's easier to create useful "musical" synthesis, if that makes sense.
      The whole thing would be much easier, albeit less interesting, if I just used static modules and patched them together using [matrixctrl], but I really like the idea of an interactive interface.
      If the idea starts to take shape, I also think it would be great for anyone else to be able to build their own modules and then load them into the application with the same functionality as the default ones. It could be used for exploring synth techniques, processing live audio, sound design, or could be played via MIDI like any other synth.
      And eventually, when I get around to learning some of the possibilities with Jitter, I'd see if I can incorporate some visual interaction with the objects. I'm thinking it won't be possible to have Jitter and bpatchers in the same window, but one idea would be to have a symbolic representation of each module, and of how they react when connected to each other, projected onto a screen so that there is some visual interest for anyone watching. I am going to try and get a friend who is studying art at university involved for input on the visual stuff; it could even be used to process live video, with each audio module corresponding to a Jitter process.
      I think that once the basis of the interconnectivity between the modules works in terms of msp signals, it would be really interesting as an open project, to have lots of people contributing features.
      It could end up being an incredibly complex project, but it should all be possible, and if it works out it could be a really powerful tool. The only open question is whether the synthesis will sound good enough, but I personally think that's down to the programming and choice of objects rather than Max/MSP's limits.
      I really like the non-ubutton method, dragging with the top left corner makes it look better than how I had it.
      Like I said, I'm still pretty new to this program, so it might take a while before the whole thing takes shape, but I'm happy to keep anyone posted who is interested. There will more than likely be some more posts on here when I run into new problems!
    • May 08 2009 | 9:45 am
      Here's a version which keeps the pointer position when you click and drag it. It took a bit of fiddling with the ubutton. If it's a pwindow only, you could make do without the ubutton, but I wanted to make something that could be used for any object.
      I encapsulated the process part, but for some reason it behaved strangely then (yes, I did hit the loadbang again). Not sure what's up, but it could be fixed and then used as a tidy subpatch with any object you want. You would only want one master mousestate, though.
      I really like the Reactable idea; you could definitely have some fun messing around with the objects for this kind of "spatial-interaction" synthesis. Plus you could take it into three dimensions, especially with the [cosm] objects just released: they would manage the sonic space/ambisonics plus collision/distance detection. That would be *amazing* if done well!!
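      The difference between the two drag styles in this thread is just whether the click offset inside the box is remembered. A hedged javascript sketch of the offset-preserving version (names and the {x, y, w, h} rect shape are illustrative; in the patch this would live in [js] or be pieced together from [mousestate] and arithmetic objects):

      ```javascript
      // On mouse-down, record where inside the box the click landed; on
      // every subsequent mouse position, keep that offset so the box
      // doesn't jump its top-left corner to the pointer.
      function makeDrag(clickX, clickY, rect) {
        var dx = clickX - rect.x;
        var dy = clickY - rect.y;
        return function (mouseX, mouseY) {
          return { x: mouseX - dx, y: mouseY - dy, w: rect.w, h: rect.h };
        };
      }
      ```

      Dropping the dx/dy terms gives the simpler variant from earlier in the thread, where the top-left corner snaps straight to the mouse position.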
    • May 08 2009 | 10:23 am
      Those cosm externals look like they could open up some incredible 3D sequencing options, giving each object a path to follow that would bring it into contact with the others over time. I wonder if it would be possible to model magnetic attraction/repulsion. It could be used to do all kinds of things, like influencing a new direction after a collision.
      .....I need to start exploring jitter.........
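      For what it's worth, the magnetic attraction/repulsion idea could start from a simple inverse-square force between module positions. A purely speculative 2D sketch (all names and the force law are my assumptions, nothing from cosm itself):

      ```javascript
      // Force exerted on point a by point b with inverse-square falloff.
      // strength > 0 attracts a toward b; strength < 0 repels.
      function force(a, b, strength) {
        var dx = b.x - a.x, dy = b.y - a.y;
        var d2 = dx * dx + dy * dy;
        if (d2 === 0) return { fx: 0, fy: 0 }; // coincident: no defined direction
        var d = Math.sqrt(d2);
        var mag = strength / d2;               // inverse-square falloff
        return { fx: mag * dx / d, fy: mag * dy / d };
      }
      ```

      Integrating that force into each module's velocity every frame would give the drifting, colliding behaviour described above; flipping the sign on contact would model the bounce into a new direction.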