Doppler effect, some leads needed

Julien Bayle's icon

Hi there,

I'd like some leads for working on my Doppler effect in myUniverse (= a 3D space with a cam and moving objects emitting sound).

My cam is where the ears are.
I still don't know if I'll use a more-than-one-microphone approach. One microphone = one-dimensional panning, which is actually enough.

My objects emit sounds.

0/ Should the Doppler effect be applied on my master output?
I mean, the cam is moving near two other moving objects.
Which frequencies would be shifted?
The answer is: any change in relative distance involves a shift.
What would be the approach?

1/ What would be required to calculate my Doppler shift?
I guess I'll have to make some continuous change in the frequency of my sources.
Is there a general formula that could be applied directly inside my objects to modify the nominal frequency?
Is this the approach to follow?

Any ideas/leads would be appreciated.

best regards,
julien

Floating Point's icon

My approach would be to calculate the distance from your mic(s) to each object at every update (presumably every audio vector) and then delay the audio emitted from each object by the time equivalent to that distance (i.e. delay time = distance / speed of sound). Each sound source would have its own delay.
Use tapin~/tapout~ to vary the delay; changes in delay will cause a perceived pitch shift corresponding to the Doppler effect.
So you don't vary the pitch of the sound sources directly, but only via the changing delay.

Julien Bayle's icon

Hi Terry, and thanks a lot for your answer.
It is totally clear.
I'd call this implementation the "natural" Doppler effect :-)

I'll use SuperCollider as the sound generator, handling synths and voices inside it.
I'll add the delay for each source in SC.
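Something like this minimal, untested sketch is what I have in mind (the SynthDef name, the \dist control and the 344 m/s constant are just placeholders):

(
SynthDef(\dopplerVoice, { |out = 0, freq = 220, dist = 1|
    var d, delayTime, src, sig;
    d = Lag.kr(dist, 0.02);          // smooth the stepwise distance updates (metres)
    delayTime = d / 344;             // delay time = distance / speed of sound (~344 m/s)
    src = Saw.ar(freq, 0.2);         // stand-in for the real voice
    sig = DelayC.ar(src, maxdelaytime: 2, delaytime: delayTime);
    Out.ar(out, sig ! 2);
}).add;
)

// one instance per moving object, updated from Max:
// ~voice = Synth(\dopplerVoice); ~voice.set(\dist, newDistance);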

Indeed, it is totally true that changing the pitch itself wouldn't be ok.

I'll make a post about the mic(s) implementation for spatialization purposes.
I'm still trying to figure out the concept.

thanks a lot again.

Peter McCulloch's icon

Since you're synthesizing, you can probably just change the pitch of the synthesized sound via line~. The doppler patch in the examples is a good starting point. (IIRC there's maybe even a synth version in there)

Since you're already doing the distance calc in your Java code, it's really just a matter of interpolating that over time into the appropriate units. You could optionally do it with delay, but that's probably more expensive, since you can't downsample the interpolation if need be, whereas for the synth you could do that if you were using line~ to change the frequency.

Julien Bayle's icon

Hello Peter,
I got it.

These are two options, then.

About the "it's really just a matter of interpolating that over time into the appropriate units", I guess I missed something.

Currently, I'm firing an event as soon as the position changes.
I don't know the frequency of the updates; it is driven by how often position information pops out of jit.gl.camera.

By interpolating, you mean using line~, I guess.
Indeed, from discrete values, line~ can produce continuous, sample-accurate values.
Is this what you mean?

[attached image: 3993.positionUpdate.PNG]
Peter McCulloch's icon

Hi Julien, I forgot that your sound engine is in SuperCollider, so you'd just do the interpolation there; line~ wouldn't really be needed in this case. For the interpolation time, you could probably just set it to some small value (< 20 ms?) and I imagine it'd be fine. Alternatively, you could derive it from the framerate, but I suspect that level of precision would be overkill.
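For instance, here is a rough sketch of the direct-pitch variant on the SC side (untested; the names are placeholders, 344 m/s is the usual speed-of-sound constant, and the ratio is the standard approximation f' = f * c / (c + dr/dt)):

(
SynthDef(\dopplerPitch, { |out = 0, freq = 220, dist = 1|
    var d, radialVel, ratio, sig;
    d = Lag.kr(dist, 0.02);            // ~20 ms interpolation of the incoming distance
    radialVel = Slope.kr(d);           // dr/dt in metres per second
    ratio = 344 / (344 + radialVel);   // receding -> ratio < 1, approaching -> ratio > 1
    sig = Saw.ar(freq * ratio, 0.2);   // shift the synthesized pitch directly
    Out.ar(out, sig ! 2);
}).add;
)

// ~voice = Synth(\dopplerPitch); then ~voice.set(\dist, ...) whenever Max reports a new distance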

Also, the volume is inversely proportional to the distance^2, so it's handy to have that value floating around...
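In code that's just something like this little sketch (the clamp at 1 is arbitrary; some people use 1/r for the linear gain instead):

~ampForDistance = { |r| 1 / max(r * r, 1) };   // inverse-square gain, clamped near the source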

As far as the tuning goes, there's this from the Doppler example patch:

[embedded Max patch: copy the patch and select New From Clipboard in Max]

Julien Bayle's icon

OK Peter, I got your point.
I guess there is something I missed in your patch, btw.
The pitch ratio would change only when there is a distance variation (= when a movement occurs).

About the distance, I would use it raised to the power of 2.
But indeed, everything is virtual... I mean, I also have some doubts about units in the virtual world.
I can make very little objects and set the max speed to a very low value, OR make huge objects, etc.
I'll post a metaphysical post/question about that in a few minutes :p

About everything I have to do in my sound sources (basically, voices of synths in SuperCollider): I'll put a little module in each synth (rough sketch below).
This little module will be responsible for:
- sound attenuation
- sound spatialization
- Doppler stuff (delay? or direct pitch alteration, as you mentioned)
- sound modifications in case I want to create some specific atmospheric FX (just some filters, or I don't know yet)

The (now famous) objects know their distance to the cam and their angles to the mic, so they can fire & tweak SC synths in real time.
(OMG... poor CPU)

Does this schematic make sense?
It is only a schematic to sketch the global chain at the end of my sound sources (= synths in SC).
I already posted it in https://cycling74.com/forums/strategies-of-spatialization-for-moving-objects-ears about spatialization (for which I still have to figure out the mic(s) and the calculations I'll have to do).

[attached image: 3994.sketchingSpatializationSTuff.PNG]
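In rough SC terms, I imagine the end of each voice looking something like this (an untested sketch; all names, curves and constants are placeholders):

(
SynthDef(\voiceTail, { |out = 0, freq = 220, dist = 1, azimuth = 0|
    var d, amp, cutoff, delayTime, sig;
    d = Lag.kr(dist, 0.02);                        // smoothed distance from the cam (metres)
    amp = 1 / max(d * d, 1);                       // attenuation (inverse square, clamped)
    cutoff = d.linexp(1, 100, 18000, 800);         // "atmospheric" rolloff of the highs
    delayTime = d / 344;                           // Doppler via the changing delay
    sig = Saw.ar(freq, 0.2);                       // the actual synth voice goes here
    sig = LPF.ar(sig, cutoff) * amp;
    sig = DelayC.ar(sig, maxdelaytime: 2, delaytime: delayTime);
    Out.ar(out, PanAz.ar(4, sig, pos: azimuth / pi));   // spatialization over 4 speakers
}).add;
)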
Peter McCulloch's icon

Yeah, I think that's pretty much it.

I use spatialization rather than volume to handle density in my pieces, and it works well. I use a onepole~ filter to roll off the highs as things become more distant.
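That rolloff could look roughly like this on the SC side (untested sketch; the cutoff range is made up and the one-pole coefficient is the standard exp(-2*pi*fc/fs) approximation):

(
SynthDef(\airAbsorb, { |out = 0, freq = 220, dist = 1|
    var d, cutoff, coef, sig;
    d = Lag.kr(dist, 0.02);
    cutoff = d.linexp(1, 100, 18000, 800);     // near = bright, far = dull (made-up range)
    coef = exp(-2pi * cutoff / SampleRate.ir); // one-pole coefficient for that cutoff
    sig = OnePole.ar(Saw.ar(freq, 0.2), coef);
    Out.ar(out, sig ! 2);
}).add;
)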

Julien Bayle's icon

=)

You mean you don't use the 1/r^2 volume decrease?

Peter McCulloch's icon

Should have said "spatialization rather than just volume"... I have amp, filter cutoff, and dry level stored in a lookup table, so I just treat distance as some 0-1 value where 0 is "IN YO FACE!" and 1 is infinity.
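Something like this sketch of the idea (the breakpoint values are made up; the point is just the lookup keyed on the 0-1 distance):

(
// one envelope per parameter, indexed by the normalised distance 0..1
~ampCurve    = Env([1, 0.3, 0], [0.3, 0.7], \sin);
~cutoffCurve = Env([18000, 4000, 600], [0.3, 0.7], \exp);
~dryCurve    = Env([1, 0.5, 0], [0.3, 0.7], \sin);

~lookup = { |d|   // d in 0..1
    [~ampCurve, ~cutoffCurve, ~dryCurve].collect(_.at(d));
};

// ~lookup.value(0.25) -> [amp, cutoff, dry] for that distance
)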

You can also do a global reverb for each output (separate from dry), so really distant sounds can only be in reverb, close sounds can be mostly dry, etc.

These threads have been kind of funny for me (the whole universe thing) as I've been re-reading some Douglas Adams recently...

Julien Bayle's icon

I got that.
About spatialization, are you calculating things with angles in 3D? Or by projecting everything onto the cam plane?
I'm interested.

(and I could really be interested by Douglas Adams too!)

Peter McCulloch's icon

I've always just done it in 2D, either Cartesian or polar, mostly out of laziness... The only thing the angles probably matter for is the spatialization of the position, since the other parameters are usually functions of distance. In 3D you could use the azimuth to do some high-frequency tailoring via a high shelf, but that is more subtle and something that might be worth doing if you find you have extra CPU hanging around. Mostly I just worry about distance, since that's the thing that seems to have the biggest impact on everything.

I use the angles to adjust the dry signal that goes to the various speakers and in combination with the distance to determine where the reverb goes, so that really far off sounds on the left-front are only in the left-front reverb whereas sounds that are really close are in all of the reverbs.
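A rough sketch of that kind of routing (untested; PanAz and the made-up curves are just stand-ins):

(
// four speaker outputs on bus "out", a 4-channel reverb bus on "revBus"
SynthDef(\spatSend, { |out = 0, revBus, freq = 220, dist = 0, azimuth = 0|
    var d, pos, dry, wet, spread, sig;
    d = Lag.kr(dist, 0.02).clip(0, 1);        // normalised distance, 0 = close, 1 = far
    pos = azimuth / pi;                        // PanAz position (2.0 = a full circle)
    dry = (1 - d).squared;                     // close sounds are mostly dry
    wet = d.sqrt;                              // far sounds are mostly reverb
    spread = d.linlin(0, 1, 4, 2);             // close sounds feed all reverbs, far ones only the nearest
    sig = Saw.ar(freq, 0.2);
    Out.ar(out,    PanAz.ar(4, sig * dry, pos));                  // dry, distributed by angle
    Out.ar(revBus, PanAz.ar(4, sig * wet, pos, width: spread));   // reverb sends
}).add;

// one reverb per speaker, reading the 4-channel reverb bus
SynthDef(\quadReverb, { |out = 0, in|
    Out.ar(out, FreeVerb.ar(In.ar(in, 4), mix: 1, room: 0.85));
}).add;
)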

As an alternative approach, you could also treat your speakers as virtual microphones (the old room-within-a-room reverb trick) that are projected from your camera at fixed distances (imagine that your camera is at the center of a square and that the virtual microphones are at the corners). There are a lot of nice things you get from that, but it comes with a higher computational price, since you're now dealing with four distance calculations.
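The geometry could be sketched roughly like this (2D, ignoring the camera's rotation; all numbers are placeholders):

(
~micOffsets = [[1, 1], [1, -1], [-1, 1], [-1, -1]] * 2;   // virtual mics on a square around the cam

~speakerGains = { |camPos, sourcePos|
    ~micOffsets.collect { |offset|
        var micPos = camPos + offset;                      // 2D positions as [x, y]
        var dd = (sourcePos - micPos).squared.sum.sqrt;    // distance from the source to this mic
        1 / max(dd * dd, 1);                               // inverse-square gain for that speaker
    };
};

// ~speakerGains.value([0, 0], [5, 3]) -> four gains; the mic nearest the source gets the largest
)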

Julien Bayle's icon

Peter,
in my case, I'll probably add more atmospheric effects a bit later.
Indeed, I like the basic idea of using the azimuth to control the amount of sound sent to this or that speaker.

I made a very light schematic.

About the distance attenuation:
I can distinguish the two sources (1) & (2) by distance;
(1) will be heard a bit less loud than (2).

About the spatialization / volume distribution to my speakers:
if I take only the basic angle to the source, I cannot distinguish (1) & (2).
And in fact, (1) should be heard a bit louder on the FRONT RIGHT than on the FRONT LEFT & REAR RIGHT, compared to (2).

I guess I'm missing something (a value to measure for sure)

[attached image: 3995.spatialization.PNG]