A core calculating, a lot of objects (=abstractions) living… strategies ?
I have a JAVA core.
It instantiates pre-made abstractions and stores references and all their info at init time (basically when I’m creating, removing, or modifying objects, and when it has to store/load presets).
The bunch of objects almost lives on its own, and each object is able to trigger its own events, like sound triggering etc.
I have a question.
All those objects have parameters.
The JAVA core, in order NOT to be the bottleneck, probably shouldn’t be the thing calculating, for instance, the distances between the camera & all my 3D objects as soon as a movement occurs.
Because each object needs to know the distance between the camera and itself in order to autonomously trigger an event (here, a music note), how would you do that??
1/ distributed calculation
Of course, the basic idea would be to put a distance calculator in each object.
2/ centralized calculation + maxi propagation
The second idea is obviously to trigger the calculation inside the JAVA core once (but VERY often, as soon as a movement occurs), and to propagate ALL the distances from that calculation TO EACH object.
Indeed, in the first case there is no need to propagate anything: the JAVA core only needs to know the initial positions (knowing some objects are moving, but the new positions only matter for the sound calculations).
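As a rough sketch of option 2 (all names here, like SpatialCore and SoundObject, are my own invention, not from the project): the core recomputes squared distances in one pass whenever the cam moves and pushes them to each object, which then decides on its own what to trigger.

```java
// Hypothetical sketch of "centralized calculation + propagation".
// Squared distances only -- no sqrt needed to compare against a threshold.
import java.util.ArrayList;
import java.util.List;

class SoundObject {
    final double x, y, z;   // fixed position for this sketch
    double distSqToCam;     // pushed in by the core

    SoundObject(double x, double y, double z) { this.x = x; this.y = y; this.z = z; }

    void onDistance(double distSq) {
        this.distSqToCam = distSq;
        // here the object would decide autonomously whether to trigger a note
    }
}

public class SpatialCore {
    final List<SoundObject> objects = new ArrayList<>();

    // Called whenever the camera moves: one pass over all objects.
    void cameraMoved(double cx, double cy, double cz) {
        for (SoundObject o : objects) {
            double dx = o.x - cx, dy = o.y - cy, dz = o.z - cz;
            o.onDistance(dx * dx + dy * dy + dz * dz);
        }
    }

    public static void main(String[] args) {
        SpatialCore core = new SpatialCore();
        core.objects.add(new SoundObject(3, 4, 0));
        core.cameraMoved(0, 0, 0);
        System.out.println(core.objects.get(0).distSqToCam); // 25.0
    }
}
```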
What would you advise, you gurus??
How are you doing distance? I’m assuming Euclidean and in 3D? The reason I ask is that that’s potentially a lot of sqrts. If you’re using some threshold to trigger events, I’d express the threshold as distance² rather than taking the square root, e.g.
boolean isBelow = (x-a)*(x-a) + (y-b)*(y-b) + (z-c)*(z-c) < thresholdSquared; // 3 multiplies and 3 adds
boolean isBelow = Math.sqrt((x-a)*(x-a) + (y-b)*(y-b) + (z-c)*(z-c)) < threshold; // ack.
Can you describe a little bit more what your system looks like? Is the threshold for triggering events always the same, or is it different for different objects?
My suspicion is that it’s going to be faster to just go ahead and do the calculation on the objects in your system, provided there’s not an insane number of them (100,000…), since the branching instructions (testing whether something is in range) could end up being just as slow as the distanceSquared itself, which is only three multiplies and three adds. If you did it with centralized calculation you could use one of the linear algebra libraries out there; they’re pretty damn fast these days.
If you end up populating a bigger universe, maybe you have some function that takes into account the maximum velocity of your camera and gathers all the possible in-range objects for the next n frames, though this gets nastier when camera and objects are moving.
Have you done some profiling? In either of your cases, there are going to be function calls between the core and the objects, and that’s probably where a bottleneck could appear, but I’m also a bit rusty as to what optimizations the compiler can make. How is it running now?
thanks for your answers & questions too :)
About the distance calculation: I am not using the square root but indeed distance².
I didn’t optimize that part much before; I used Taylor series with bitwise operators in my Digital Collision iOS app (built with openFrameworks, which is C++) and it works VERY fast & is totally ok.
In the case of myUniverse, I’m afraid that wouldn’t be precise enough for big numbers & distances.
So yes, distance^2 is the way to go I totally agree… and what you are saying reassures me :)
To describe a bit more.
Each type of object does almost the same job.
The interface with the global system is the same (send/receive using the simplest possible messaging system, with broadcast busses etc.).
Some objects move on their own, which means their distances have to be recalculated every time, even if the cam doesn’t move.
Some objects no longer move, which means I can fire the calculation for them only when the cam moves.
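That moving/static split could be sketched as a simple scheduling pass (again, the class and field names are mine, purely for illustration): static objects are skipped whenever the cam is still, self-moving objects are recomputed every tick.

```java
// Sketch of the "static vs self-moving" split described above.
import java.util.ArrayList;
import java.util.List;

public class DistanceScheduler {
    static class Obj {
        double x, y, z;
        final boolean moving;   // set per object at design time
        double distSq = -1;     // -1 = never computed yet

        Obj(double x, double y, double z, boolean moving) {
            this.x = x; this.y = y; this.z = z; this.moving = moving;
        }
    }

    final List<Obj> objects = new ArrayList<>();

    // One pass: static objects are skipped unless the cam moved.
    void tick(double cx, double cy, double cz, boolean camMoved) {
        for (Obj o : objects) {
            if (camMoved || o.moving) {
                double dx = o.x - cx, dy = o.y - cy, dz = o.z - cz;
                o.distSq = dx * dx + dy * dy + dz * dz;
            }
        }
    }

    public static void main(String[] args) {
        DistanceScheduler s = new DistanceScheduler();
        s.objects.add(new Obj(1, 0, 0, false));  // static object
        s.objects.add(new Obj(0, 2, 0, true));   // self-moving object
        s.tick(0, 0, 0, false);                  // cam still: only the mover updates
        System.out.println(s.objects.get(0).distSq + " " + s.objects.get(1).distSq); // -1.0 4.0
    }
}
```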
The number of objects wouldn’t be insane!
If I consider my sound-emitting objects + my sound modifiers & my visual modifiers, I would have… a VERY maximum of 200, no more.
I can’t say for sure, because I’ll be programming/composing inside that universe and I don’t know how far I’ll go.
But say… if I’m insane, it would be 300 maximum.
The trick with the maximum speed seems a bit "dangerous" to me, I guess, especially regarding side-effects.
Actually, I didn’t profile yet.
I don’t have enough objects yet, and I’m currently designing a GUI to create/move/modify my objects in that universe.
I’ll soon have a way to place all the objects on a huge map, and then I can run some tests.
Another important point.
Some objects are circular, some others are more like very long lines.
I’m approximating the roughly cubic or parallelepiped ones by a sphere; indeed, as long as the ratio of the smallest dimension to the greatest is no worse than about 1/3, I’m okay with a sphere.
So.. it means I have 2 cases, directly defined by my objects.
Each object is, by design, in one of both cases.
So in one case the distance is the plain Euclidean point distance; in the other, I have to calculate a segment-to-point distance.
This can be done using this kind of optimization, I guess: http://www.softsurfer.com/Archive/algorithm_0102/ , plus the fact that I’ll avoid sqrt & all the trig stuff (using lookup tables or whatever).
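In that spirit, here is a sketch of a squared point-to-segment distance (my own transcription of the standard projection-and-clamp approach, not code from the linked page): project the point onto the segment, clamp the parameter to [0, 1], and compare squared lengths only, so there is no sqrt and no trig at all.

```java
// Squared distance from point P to segment AB: project, clamp, compare squares.
public class SegmentDistance {
    static double distSqPointSegment(double px, double py, double pz,
                                     double ax, double ay, double az,
                                     double bx, double by, double bz) {
        double abx = bx - ax, aby = by - ay, abz = bz - az;   // segment direction
        double apx = px - ax, apy = py - ay, apz = pz - az;   // A -> P
        double abLenSq = abx * abx + aby * aby + abz * abz;
        // parameter of the projection of P onto the line AB (0 at A, 1 at B)
        double t = abLenSq == 0 ? 0 : (apx * abx + apy * aby + apz * abz) / abLenSq;
        if (t < 0) t = 0; else if (t > 1) t = 1;              // clamp onto the segment
        // vector from P to the closest point on the segment
        double cx = ax + t * abx - px, cy = ay + t * aby - py, cz = az + t * abz - pz;
        return cx * cx + cy * cy + cz * cz;
    }

    public static void main(String[] args) {
        // point (0,1,0) vs segment (-1,0,0)-(1,0,0): closest point is (0,0,0), dist^2 = 1
        System.out.println(distSqPointSegment(0, 1, 0, -1, 0, 0, 1, 0, 0)); // 1.0
    }
}
```

The result compares directly against a squared range threshold, just like the spherical case.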
I’ll go for calculation in my Java core, propagating the results to the objects:
- a cam movement triggers a calculation + propagation to ALL objects
- an object’s own movement triggers a calculation only for that object (a cheap optimization, but it helps)
what do you think about this ?
In case of hardcore behaviors, I could also go for a grid optimization.
Each object belongs to a grid cell.
I know the sound-emitting range of each object.
I trigger the distance calculation ONLY for the cells adjacent to the cam (or one cell border further out etc.).
The problem is that the objects’ dimensions would have to be roughly the same, but by cheating a bit, this could prune the huge calculation tree quite a lot.
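A minimal sketch of that grid pruning, under my own assumptions (cell size picked from the largest emitting range, a hashed cell key, and a 3×3×3 neighbourhood around the cam):

```java
// Hypothetical uniform-grid pruning: bucket objects by cell, then only
// consider objects in the cam's cell and its 26 neighbours.
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class Grid {
    final double cellSize;  // would be chosen from the largest emitting range
    final Map<Long, List<double[]>> cells = new HashMap<>();

    Grid(double cellSize) { this.cellSize = cellSize; }

    // Pack the three cell indices into one long key (fine for a sketch;
    // very distant cells could collide, which a real version should handle).
    long key(double x, double y, double z) {
        long cx = (long) Math.floor(x / cellSize);
        long cy = (long) Math.floor(y / cellSize);
        long cz = (long) Math.floor(z / cellSize);
        return ((cx & 0x1FFFFF) << 42) | ((cy & 0x1FFFFF) << 21) | (cz & 0x1FFFFF);
    }

    void add(double x, double y, double z) {
        cells.computeIfAbsent(key(x, y, z), k -> new ArrayList<>())
             .add(new double[]{x, y, z});
    }

    // Gather candidates in the 3x3x3 block of cells around the camera;
    // only these need a real distance calculation.
    List<double[]> near(double x, double y, double z) {
        List<double[]> out = new ArrayList<>();
        for (int dx = -1; dx <= 1; dx++)
            for (int dy = -1; dy <= 1; dy++)
                for (int dz = -1; dz <= 1; dz++) {
                    List<double[]> c = cells.get(
                        key(x + dx * cellSize, y + dy * cellSize, z + dz * cellSize));
                    if (c != null) out.addAll(c);
                }
        return out;
    }

    public static void main(String[] args) {
        Grid g = new Grid(10);
        g.add(1, 1, 1);     // right next to the cam
        g.add(100, 0, 0);   // far away: pruned before any distance math
        System.out.println(g.near(0, 0, 0).size()); // 1
    }
}
```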
As you wrote, "how is it running ?" is the main question.
Optimizing only for the pleasure of optimizing won’t be made here, at least for that project :)
I’d definitely recommend unit testing. Figure out your worst-case scenario (everything moving) and see how it does with 200 objects. Build a test unit that’s just moving things around; if you can handle your worst case, you’re in business. You also don’t have to update these objects 500 times a second, just enough to keep up with the framerate… If you’re at 30 fps, that’s ~33 ms of time resolution, which is very reasonable sound-wise.
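That worst-case rig could be as simple as the following (object counts and camera path are placeholders; the printed timing will obviously depend on the machine):

```java
// Worst-case timing sketch: 200 objects, all squared distances recomputed
// every "frame", timed over many frames.
import java.util.Random;

public class WorstCaseBench {
    // Sum of squared distances from the cam to every object -- the per-frame cost.
    static double frame(double[][] pos, double cx, double cy, double cz) {
        double sum = 0;
        for (double[] p : pos) {
            double dx = p[0] - cx, dy = p[1] - cy, dz = p[2] - cz;
            sum += dx * dx + dy * dy + dz * dz;
        }
        return sum;
    }

    public static void main(String[] args) {
        int n = 200, frames = 10_000;          // worst case: everything recomputed
        double[][] pos = new double[n][3];
        Random rnd = new Random(42);
        for (double[] p : pos)
            for (int i = 0; i < 3; i++) p[i] = rnd.nextDouble() * 1000;

        double sink = 0;                       // keep the JIT from removing the loop
        long t0 = System.nanoTime();
        for (int f = 0; f < frames; f++)
            sink += frame(pos, f * 0.1, 0, 0); // fake camera path
        long t1 = System.nanoTime();
        System.out.printf("%.2f us per frame (sink=%.0f)%n",
                (t1 - t0) / 1e3 / frames, sink);
    }
}
```

If the per-frame cost comes out well under the ~33 ms frame budget, the centralized approach is safe.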
I’d say just solve it in the most readable, logical fashion, then profile and see where you are. Also, I don’t know if you’ve checked out the Java 3d libraries, but someone may have come up with a smart implementation of a lot of these things already, and that could save a bunch of time.
If anything, I expect it’s going to be the signal processing that could get intensive without a good muting scheme, especially if you have doppler shifts. (though this isn’t a problem if you’re synthesizing the sound…)
About the update: actually, I trigger the distance calculations like this:
- the cam moves: a calculation is triggered for ALL objects, and the results are propagated to all objects from the JAVA core
- an object moves: a calculation is triggered for that object only, requested by it from the JAVA core, which sends back the result
Indeed, I can "limit" the time resolution on the Java side (using a timer thread, I guess) but, as you wrote, I’d only go there if there’s a problem.
The first tests will be quite important to see where I am.
About the Java 3D libraries, I haven’t really checked them out (yet).
The lightweight stuff I made seems solid precisely because it is so light.
About the global storage, I didn’t explain what I did, finally.
I’m using a HashMap to store the top-level class, called myObject.
That one contains the MaxBox reference (to the abstraction) and around 20 variables (= properties, in my case).
Maybe I should rather create/declare attributes on the MaxBox objects themselves… but indeed, my stuff works very well right now, especially in terms of readability and logic.
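For readers following along, that storage layout might look roughly like this (MaxBox is stubbed here; in a real mxj external it would be the Max Java API class of that name, and all the other names are illustrative):

```java
// Rough sketch of the described storage: a HashMap from object name to a
// top-level MyObject holding the Max box reference plus its properties.
import java.util.HashMap;
import java.util.Map;

public class ObjectStore {
    static class MaxBox { }  // stand-in for the real Max patcher-box reference

    static class MyObject {
        final MaxBox box = new MaxBox();
        final Map<String, Object> properties = new HashMap<>(); // ~20 entries in practice
    }

    final Map<String, MyObject> universe = new HashMap<>();

    // Called at init time, or when an object is created in the GUI.
    MyObject create(String name) {
        MyObject o = new MyObject();
        universe.put(name, o);
        return o;
    }

    public static void main(String[] args) {
        ObjectStore store = new ObjectStore();
        MyObject osc = store.create("osc1");
        osc.properties.put("volume", 0.8);
        System.out.println(store.universe.get("osc1").properties.get("volume")); // 0.8
    }
}
```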
Then the only part I didn’t touch yet, the sound.
Intuitively, using an external, powerful engine like SuperCollider seems the safest option.
No need to create threads or thread protection in Max6; everything will be protected by design, I mean, by separating the binaries :p
About the muting scheme: indeed, that will come directly from two things, the distance between cam & object AND the object’s range.
The main rule is: if the cam is outside the limit of that object’s range, I don’t hear it.
There will be an envelope to make the range non-linear, but the one constraint is that the furthest point of the range has a volume of exactly ZERO.
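One way to satisfy that zero-at-the-edge constraint (purely my own sketch, working directly on the squared distance so no sqrt is needed) is a normalized falloff with a shaping exponent for the non-linear part:

```java
// Gain envelope that is 1 at the object and exactly 0 at the range edge.
public class RangeEnvelope {
    // distSq: squared cam<->object distance; rangeSq: squared emitting range;
    // shape: 1 = linear in distance^2, >1 = steeper rolloff near the edge.
    static double gain(double distSq, double rangeSq, double shape) {
        if (distSq >= rangeSq) return 0.0;     // outside the range: muted
        double lin = 1.0 - distSq / rangeSq;   // 1 at the source, 0 at the edge
        return Math.pow(lin, shape);
    }

    public static void main(String[] args) {
        System.out.println(gain(0, 100, 1));    // 1.0 at the source
        System.out.println(gain(50, 100, 1));   // 0.5 halfway (in distance^2 terms)
        System.out.println(gain(100, 100, 1));  // 0.0 at the range edge
    }
}
```

Any other monotonic curve works too, as long as it hits exactly zero at the boundary so objects can be muted past it.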
Doppler shifts will HAVE to occur.
AFAIK, they would come naturally if I use delays correlated with the distance between cam/object AND the relative speed.
I haven’t studied that part yet, and yes, this will only be applied to synthesized sounds.
Do you have any leads on doppler-shift techniques that would work in these cases?
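The delay-based idea sketched above amounts to this: set the delay time to distance divided by the speed of sound, and if the synthesis engine interpolates the delay smoothly as distance changes, the pitch shift falls out for free. A small sketch of the bookkeeping side (the delay line itself would live in the sound engine, e.g. SuperCollider, not in Java):

```java
// Doppler via a distance-dependent delay, plus the textbook doppler
// factor as a cross-check. Units assumed: metres and metres/second.
public class Doppler {
    static final double SPEED_OF_SOUND = 343.0; // m/s in air, approx.

    // Delay in seconds for a given cam<->object distance.
    static double delaySeconds(double distance) {
        return distance / SPEED_OF_SOUND;
    }

    // Frequency ratio for a source with radial velocity v
    // (positive = moving away from the listener).
    static double dopplerFactor(double radialVelocity) {
        return SPEED_OF_SOUND / (SPEED_OF_SOUND + radialVelocity);
    }

    public static void main(String[] args) {
        System.out.println(delaySeconds(343.0)); // 1.0 s of delay at 343 m
        System.out.println(dopplerFactor(0.0));  // 1.0: no shift when static
    }
}
```

With SuperCollider specifically, a smoothly modulated delay UGen fed with this delay time is a common way to get the effect.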
To be honest, I am not very much into max-java, so this might be a stupid idea. But couldn’t you put all of the object positions into a matrix and then calculate the distances with the gen object, resulting directly in a volume (0. – 1.)? That seems very fast to me.
And for big objects you could use some kind of offset in a 5th plane… (distance – radius of object = new distance)
Redistributing the values could happen via a forward object, using the cell coords as indexes.
----------begin_max5_patcher---------- 1581.3oc4atsahiCFG+ZFo4cvJRyM6vTEGmCvp8hYeC16qppLIFp6DbhRLsz NZe2WeHARRIPfFSc0pRIEe7y+7e+4C396u9kINKx1RJc.+I3VvjI+VDxDUXx PlTGvDm03swo3RUBcXjmyV7nyzp33jsbU3k4ETFeInjvR.hTPh4eiB91xcoL aCOkv4ujSzUniC3t53Vlw3L7ZULN+cAEmtKazDUfhR7G9gNMyPI8UUFfd23V GNayZJSTMJa0qNzbLO9AJa08EBqRW6dybE4B34GJe.QyUeRD1diRTVZaVUXP Un+6W+h7o3wz2MwdjxugxIEGgPSq98rIUv4QJ3.HELP9PyqiAJzECp3r0qIL 9aIkVNA19xq.rPcIxDHgVxwrXBHaYkZCvy.wBxTf2U.oTFINaCSUJnyCgHz3 gPE4B7UJMOE.8m0O.cuX.tLMSTT8qmDwi4RA0BLa0YKpPt6BMufHFmywbZFq Q6DgTREjp85V8V6p4cwQjFdgyT7z8sJwbbgnYHFScOggWjppI2CSYOKkxddm hx9ZkjOzPT12Jn7ZRYIdE4.SzP33zTfq5Guo.EkGuIY7fi2jLpwAPWO0Cn9w LSLECOa0pTR+TfJcAd2AZsvYNms9.5GIaIypEFuC4wk2hkSdtLub0F5Il97b 6+ggime+JRUsHC87nnv9E.dFYMFqw7B5VvOwI3bN.1OuDI9dchuLxgFcxAQU icfmZUGiO5DR3hLgq0f9A1E4fGNhNXpvDRMVLDZt0vZ147ftimxAEYCyd0mn hCV.VbBA0kRwnQe3mFhH+q+fOoeqUDF3mR3.TpKjmw7bY.GWyf5U66dR10pv D6FTSqZbILIZJ4IRQoXQfMq7IN377FgOoYljT9wLUYENceXTlNL28gUPdhVW DAUgJ6Aqqj8srYg5M.5qZRgtpG9AMaZht6UwYo5J4Vguy4Sa7FrSJSyh+EIo k83jkSXTVyE91N9DxR7lT988zO0NAKwwj9y9gkESbVUPSxXRCocdkgWWkh9Y 8Z6BZ0lTIggyOT1EaWkuobAtPx6J2Nd6hkmkk1Nt88yhgPXFUHuIbp1j8b2W tz0pCdocko8r8PYbQVZZ6RSG0SGJpDgVHl7LMg+fp3ZgUQFn40cHN6wUBcEo j2IPNdUY2fnbcCyQreb4em3zrEt+XnZoia5.ocDG0QRamIeuUvGYKBmvkQ+a U6X9N5eF8d8g.0Gxxbjx8q2MAcbgz6T6S16FtdfrQH5e.1Ok8XCU3kAU3ofZ f9H9zqRRutR6BpkOSeE7hovpqYwp9Le7QVJV2ZHrBmaFrVsv0+uR0YFRrV4X 0dw5SjXw7thWlhrQlYJKsdE4o1pPPn8A1XRZpoXZ3kwT2goVgQ5SZats5D3k WMEXQlz4JTuClf.Kkq9lhplZk.9yza4MRuMQ6Cq+UtgP5bS5T0yctstMfTBa kXuhlApQFcven0tDfeXHdFZRQZ0ndaTjpNuW4MC3exJ0qsxbKuJvPqDP6Zct F0Hn8IZoL.zTyWYFlp0sZ2.HW6inhJF71l9vPluIcchBzGKUzPXlaOLqwglJ ujN8c3eJiPlfdnYY1lh3ZEV89a.crnDRImx1cXx2tiQcS3CzjjNGaq9DNKkG YZxwZOSGCae9PscUqDZUFu7rPOCi2tHu7LGGlwenl4GsrY1mXiWdrjCz3g1G 4CGrw69I13Oj.6C13Gtmx41G3Grjet04kO5bLcKC6ngZ6g1msOX4N5Srsae9 HGrKR6aIMAmiZe7bxzXc8IjbBSTNwubeLN9gJiCbmN95rMlWZnb7uZrOen70 XdwfOuKIj+w9WpP+M6u6lAG8w7OeBKiVR.90WuJg8.1+0DO9WOzw+RVUwuy+ BpoSW6sdVwytizpo5aFg42dHVeayrsOjCL15.iqNXG8fMLzfrqfqtcEMD6Bd 
0MK3fLKjc1MBu9xqNaGnOC6CnibP8jQW+NR2gXWdslG9ZXWcNNfiXWnqqcMD y5ziGWSSxynLdkCdjq56l.FLS+Hb2mZTUFviGbH.9rZIvHnz1m6KeupYI+f4 ZGvAqf8ttJkwx2cWBqPKzKnoTQ9IChXz3n5OlcUspmN2abkI089h28th+l6I tpTEu8ef8.oJC -----------end_max5_patcher-----------