sig~ 'Order Forcing': how does this work?
Hovering over the left input of the sig~ object gives the hint "(signal) Order forcing". I couldn't find mention of this in the documentation. Can someone explain what this means, and how it works?
It's something that was used in the old days to perform some DSP optimization. It still works, but it's better to forget about it. So no need to worry about that: sig~ converts a float into a signal, that's it.
The reason I ask is that I'm looking for a way to force the order of 'bang' generation when parallel signal cords are terminated with edge~ objects. Do you know if sig~ can help with this? (Or if there's another way.)
bangbang is probably the object you're looking for. Or trigger.
They're no use here, I'm afraid.
See the right-hand part of the patch. This is the kind of situation I'm talking about. I want a robust way of guaranteeing that the edge~ objects generate bangs in a certain order (changing the edge~ objects' horizontal screen order and restarting DSP changes the bang order, but I'm looking for a less hacky way of organising this).
What exactly are you trying to do? The example patch isn't very helpful, as you would just use one [edge~] and then use a [trigger] or [bangbang] object after it to determine the order of printing, as Emmanuel mentioned. If you could elaborate on what you are trying to achieve and why, you might get more help.
lh
Hi thereishopeforus.
I'm not interested in help for a specific patch. Rather, I'd like to understand a general principle.
I'd like to know how to reliably ensure a particular order of edge~ object bangs when those objects share an MSP object as their common ancestor. Failing this, I'd settle for confirmation that what I'm talking about isn't possible.
Thanks.
Ah, in that case I can't help. Perhaps having a look at the SDK will shed some light on this. I assume that with signal connections you are meant to think of everything as happening simultaneously (though obviously this is impossible), so the order is undefined. In practical terms, as in your [edge~] example, it makes absolutely no difference, as the "bang" happens at the control rate, and this is where the [trigger] object and others come in. Maybe someone else can chime in here.
lh
As already stated by the other guys, I would always recommend staying away from using object positioning to determine firing order, and would therefore use trigger for the left side of your patch (event domain); consequently, I would also use trigger to determine the order of the edge~ outputs.
I understand you don't want help for a specific patch, but obviously there is no order when signals are involved, and for events I would settle on always using trigger. Is there a specific reason why you wouldn't want to do that ?
"but obviously there is no order when signals are involved, and for events I would settle on always using trigger. Is there a specific reason why you wouldn't want to do that ?"
Hi monohusche. For events I always use trigger to guarantee correct event order in my projects. The left-hand side of the diagram I linked to is an exception, because it was made specifically to demonstrate how object position affects bang order in both domains.
Event order in general isn't a problem. It's the order in which parallel signal-to-event chains are resolved that I'm looking for more information about; a very specific situation.
yeah understood.
I'm just struggling to see an example where the order of signal processing (not event firing) would matter to you (e.g. give different results based on the respective order).
Or to put it differently: all you care about is in which order your parallel signal-to-event chains fire, and that can be determined with trigger.
maybe you have an example where it makes a difference.
"maybe you have an example where it makes a difference."
I don't have a patch to share that's clear enough to demonstrate it, but here's an example.
You're working on an app like mlr ;) All timing-related stuff is ultimately driven by one sync object.
Every 16th, a pulse is sent to a certain part of the patch, so that it can handle things that should take place every 16th. This pulse is translated to a bang via an edge~ object.
In another part of the patch a pulse fires only once every bar; again, this pulse is translated to a bang via an edge~ object.
The bar pulse and the 16th pulse are both ultimately derived from the sync object.
At the start of a new bar, the bar pulse and the 16th pulse will fire 'at the same time', but you need to be sure that the bar actions will happen before the 16th actions.
Of course you could redesign the patch to do this all differently. But I hope that's a concrete enough example of the kind of situation in which it might be desirable to determine the signal-to-bang order of chains that share the same MSP object as an origin.
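To make the scenario concrete, here is a rough Python analogy (not Max code, and the tick/period numbers are made up): two independent edge~-style detectors driven by the same clock, whose relative firing order at a bar boundary depends only on where they happen to sit in the processing chain, i.e. the DSP-chain order you can't easily control.

```python
# Hypothetical sketch: two independent edge detectors whose relative
# firing order depends only on their position in the processing chain.

def make_edge(name, period, fired):
    """Return a per-tick callback that 'bangs' when its phase wraps."""
    state = {"last": -1}
    def on_tick(tick):
        phase = tick % period
        if phase < state["last"]:          # phase wrapped: rising edge
            fired.append(name)
        state["last"] = phase
    return on_tick

fired = []
# Chain order here is arbitrary -- analogous to object position on screen.
chain = [make_edge("16th", 4, fired), make_edge("bar", 16, fired)]

for tick in range(1, 33):                  # run two 'bars' of ticks
    for node in chain:                     # process in chain order
        node(tick)

print(fired)  # at each bar boundary both fire, with 16th before bar
```

Swapping the two entries in `chain` flips the order at the bar boundaries, which is exactly the position-dependent behaviour being discussed.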
I would centralise this whole function into a subpatch (called Quantiser) which also does the signal-to-event translation (using trigger), and expose the results as remote sends (send 16_bang, send 1_bang).
I believe that this would be the cleanest way from a modularisation perspective. Maybe you want to change the behaviour later to have the 16th event fire BEFORE the bar event; then there is one central place (your quantiser) in which to change that.
The main reason for this design would be that obviously, the pulse is only needed as the source for an event rather than as a signal, so why not expose it only as an event which is being correctly scheduled in relation to other events.
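As a rough sketch of that quantiser idea (a Python analogy under assumed tick counts, not Max code): one central function derives both events from the same clock and dispatches them in a fixed order, bar first, the way [trigger] fixes event order in Max.

```python
# Hypothetical quantiser sketch: one central place does the edge
# detection and dispatches the resulting events in a guaranteed order.

def quantiser(tick, on_bar, on_16th, ticks_per_16th=4, sixteenths_per_bar=4):
    """Translate a tick count into bar/16th events, bar guaranteed first."""
    ticks_per_bar = ticks_per_16th * sixteenths_per_bar
    if tick % ticks_per_bar == 0:
        on_bar()                 # bar action always handled first
    if tick % ticks_per_16th == 0:
        on_16th()                # then the coinciding 16th action

log = []
for tick in range(0, 33):
    quantiser(tick, lambda: log.append("bar"), lambda: log.append("16th"))

print(log)  # every "bar" is immediately followed by its "16th"
```

Reordering the two `if` blocks is the single-place change mentioned above: flip them and the 16th event fires before the bar event everywhere at once.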
I guess that's the whole point. If you want determinism in terms of order, use events and trigger. If you want continuous changes, use signals. I am not sure whether order in the context of signals makes sense.
this document should be interesting: https://cycling74.com/story/2005/5/2/133649/9742
I guess the difference between the audio and event domains is that each event is discrete, which means there is clear determinism in how to traverse the patcher network (depth first).
with audio, there is no such thing unless one wants to call a vector an event
You can prevent double triggers with a simple delay object. If the delay object is set to 0, it will only allow one bang per scheduler tick... (If Scheduler in Audio Interrupt is on, that is at most one bang per signal vector...)
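The behaviour Stefan describes can be modelled roughly like this (a hypothetical Python sketch of the "at most one bang per tick" idea, not how [delay 0] is implemented):

```python
# Hypothetical gate that passes at most one bang per scheduler tick,
# swallowing any further bangs arriving within the same tick.

class OneBangPerTick:
    def __init__(self):
        self.last_tick = None

    def bang(self, tick, out):
        if tick != self.last_tick:   # first bang this tick passes through
            self.last_tick = tick
            out()                    # later bangs in the same tick are dropped

hits = []
gate = OneBangPerTick()
for tick in [0, 0, 0, 1, 1, 2]:
    gate.bang(tick, lambda: hits.append(tick))

print(hits)  # -> [0, 1, 2]
```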
I can't imagine you need more than that...
Stefan
Mono's advice sounds solid, but I may have an example where it makes a difference and centralizing the transition from sample to control rate to a single root still leaves order ambiguities. It involves seq~ and I made a separate post about it: