How do _you_ use accelerometer data?
Jan 6, 2007 at 12:29pm
Thinking aloud ….
I’m wondering if anyone who is already using accelerometers in their
I’ve started thinking about this more as I’ve just bought a Wii
One of the main ways that I use controller data is to control the
My suspicion is that accelerometer output is by definition unusable
So far 2 ways of interpreting the data have occurred to me:
1. Linear(ish) – apply the accel data directly to a parameter;
One interesting visual analogue of this is that effect where you
2. Thresholds – more rapid movements fire off different events (delay
One way I’d thought of using this is to direct the threshold triggers
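As a rough sketch of those two approaches (in Python rather than Max, with the sensor read, scaling and threshold values purely made up):

```python
import time

def read_accel():
    # Placeholder for whatever actually delivers (x, y, z) acceleration values.
    return (0.0, 0.0, 1.0)

THRESHOLD = 1.8  # made-up value; "rapid" movements exceed this

def linearish(value, lo=-3.0, hi=3.0):
    # 1. Linear(ish): clamp and rescale raw acceleration onto a 0-127 parameter.
    value = max(lo, min(hi, value))
    return int((value - lo) / (hi - lo) * 127)

while True:
    x, y, z = read_accel()
    cutoff = linearish(x)          # continuous control of some parameter
    if abs(y) > THRESHOLD:         # 2. Thresholds: rapid movement fires an event
        print("fire event (delay, sample trigger, ...)")
    print("parameter value:", cutoff)
    time.sleep(0.02)               # poll at roughly 50 Hz
```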
Anyway, that’s as far as I’ve gotten with this. What thoughts/
Jan 6, 2007 at 2:12pm
About a year ago, I had to make a decision as to what motion-tracking system I wanted to invest in. I was very interested in inertia sensors, but lacked the funds to buy any, so I got into video-tracking instead, which has led to some interesting work. Since a friend of mine bought a Wii remote for his family for Christmas, I went and dug up the articles I had collected about measuring inertia and how to use the measurements. I haven't tried it yet, but here are a few notes.
First off, here is a great article:
Secondly, inertia can be measured to update the actual position of the object in space. The basic premise is that any 3D object has six variables defining its position in space: the x, y and z coordinates, and the rotation of the object about each of those three axes (pitch, yaw and roll). Added to this is acceleration in any of these directions. In reality, it is the acceleration we are interested in: integrated over time it draws curves of position, and it can serve to convert the motion of the object being measured into data that is relevant for your purpose.
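(Just to make the integration step concrete, not from the article itself: velocity is the running sum of acceleration and position the running sum of velocity, which is also why sensor noise makes the position estimate drift.)

```python
def integrate(accels, dt=0.01):
    """Naively double-integrate acceleration samples into a position curve."""
    velocity, position, trajectory = 0.0, 0.0, []
    for a in accels:
        velocity += a * dt          # v += a*dt
        position += velocity * dt   # x += v*dt
        trajectory.append(position)
    return trajectory

# Constant 1 m/s^2 for one second ends up roughly 0.5 m away.
print(integrate([1.0] * 100)[-1])
```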
In the case of music, it is simply a matter of mapping available data to desired data. A short list of possible musical data would be along the lines of:
These four categories are basically what are reflected in the structure of the MIDI protocol.
Although the list is endless, an example of what one could do with measurements of acceleration is to trigger a note whenever acceleration in the x axis is detected, set the pitch of the note with position in the y axis, VARY the pitch with variations in the position in the y axis, control the volume with variations in the z axis and, just to be tricky, use the rotation in the x axis to modulate some aspect of the timbre.
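Something like this, very loosely (the sensor reader, MIDI sender, ranges and thresholds here are all invented for the sake of the sketch):

```python
def scale(value, in_lo, in_hi, out_lo, out_hi):
    # Clamp, then rescale a value from one range to another.
    value = max(in_lo, min(in_hi, value))
    return out_lo + (value - in_lo) / (in_hi - in_lo) * (out_hi - out_lo)

def map_motion(accel_x, pos_y, delta_z, rot_x, send_midi):
    if abs(accel_x) > 0.5:                                    # note trigger on x acceleration
        pitch = int(scale(pos_y, -1.0, 1.0, 36, 96))          # pitch from y position
        volume = int(scale(abs(delta_z), 0.0, 2.0, 1, 127))   # volume from z variation
        send_midi("note_on", pitch, volume)
    timbre = int(scale(rot_x, -90.0, 90.0, 0, 127))           # x rotation modulates timbre
    send_midi("control_change", 74, timbre)                   # CC 74 is just an example

# Stand-in for a real MIDI output:
map_motion(0.8, 0.25, 1.2, 30.0, lambda *msg: print(msg))
```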
Using video-based motion tracking, I do the same, and have a matrix set up to easily choose what motions to map to what aspect of the program. I hope that when I find the time to experiment with accelerometers, I will find latency and precision superior to that which I can attain with video.
Hope this helps.
Jan 6, 2007 at 2:59pm
David Stevens wrote:
Jan 6, 2007 at 3:11pm
Jan 6, 2007 at 3:41pm
Something like that is what I was trying out yesterday – the bit that
On 6 Jan 2007, at 14:59, Andreas Wetterberg wrote:
> Well, I would reckon that if you include TIME as a factor in these
Jan 6, 2007 at 3:57pm
thanks for an interesting response – lots of food for thought there.
I’ve been working with various sensor setups for a while, and I
Some of the attractive things about the Wii remote are that there are
On 6 Jan 2007, at 14:12, Dayton wrote:
Jan 6, 2007 at 7:55pm
Quote: david stevens wrote on Sat, 06 January 2007 08:57
Light is definitely the most touchy aspect in video-based tracking. Since I work with dancers and most of our performances are conceived for theaters, we have the opportunity to specify minutely how we want the light to be, but it requires a lot of work. Shadows are a big problem.
I just did some research into the Wii remote, and I am VERY pleased that I did. I had no idea how cheap it was. It is based on the ADXL330 chip, which has relatively high accuracy for the phenomenally low price. I don't know how fast the poll rate of the A/D chip is, but it must at least be higher than the frame rate of the games it is used with. At the very minimum I would guess that this could be 40 ms, although any worthwhile chip could give you 20 ms or less. (This translates into latency, although in a somewhat unpredictable manner. Still: loads better than video.)
It is still not clear to me what you get as output to the computer, but it seems likely that you not only get the x, y and z axes (by the way, the x and y axes are measured with 3x more resolution than the z axis), but that the remote also computes pitch, yaw and roll before sending the information. If this is true, then it is very useful.
In any case, a periodic recalibration of the x, y and z translations (zeroing the positions) would allow the performer to move about naturally and still have full control over the sound (and/or video) within his or her own personal coordinate system, while the rotations, being measured relative to the gravitational pull, recalibrate themselves automatically in much the same way that our inner ear keeps us balanced. The absolute position can be tracked in two ways: by having a marked position, so the performer KNOWS where he is and can trigger the appropriate recalibration, and, at the same time, by saving the cumulative position information in order to compare.
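A minimal sketch of that zeroing idea (the names and numbers are illustrative only): store the performer's current position as an offset at the marked spot, and express everything after that relative to it.

```python
class ZeroCalibrator:
    def __init__(self):
        self.offset = (0.0, 0.0, 0.0)

    def recalibrate(self, x, y, z):
        # Called at the marked position: make the current pose the new origin.
        self.offset = (x, y, z)

    def relative(self, x, y, z):
        # Express a position in the performer's own coordinate system.
        ox, oy, oz = self.offset
        return (x - ox, y - oy, z - oz)

cal = ZeroCalibrator()
cal.recalibrate(0.4, 1.2, -0.1)       # performer stands on the marked spot
print(cal.relative(0.6, 1.0, -0.1))   # subsequent motion in their own frame
```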
I must buy one of these things; I'm burning to try it out. A year ago, I outlined my ideas to my father, who was an aerospace engineer for McDonnell Douglas from the 60s until a couple of years ago, and he explained to me the basic principles of tracking used in aerospace applications, both near the earth and outside of the earth's gravitation. Calibration is an important aspect of this type of work, and he described it as "a series of approximations".
Controlling music with such information is pretty straightforward, but leaves a lot of questions to be answered personally. What really seems interesting are the applications to OpenGL. Where's my wallet…
Jan 6, 2007 at 11:50pm
Jan 10, 2007 at 11:11pm
Jan 11, 2007 at 9:18am
Jan 11, 2007 at 11:28am
Alright. I let it get the better of me and went out and bought one of these things.
Here are some of my own observations after the first few hours and a fair amount of research:
The Wiimote is very sensitive, displaying the sort of problems which commonly arise in working with hyper-instruments: they can seem nearly as complicated and sensitive as conventional instruments, so that a true mastery of the device would entail training similar to that necessary for mastery of a violin or other instrument. The main difference is that a programmer can simplify things, emulating a more perfect control of the device.
The most effective measurements possible are (using OpenGL terminology) rotations about the z-axis and x-axis. Acceleration in any direction can be measured, but extracting useful position numbers from that data with the typical formulae proves extremely inaccurate due to latency jitter and drift.
Measuring rotation about the y-axis is not possible without the sensor bar, which would make the Wiimote only useful in a fixed coordinate system with a line-of-sight connection to the bar, and (except when using self-constructed LED arrays) only within 1 to 5 meters of the bar.
From the examples provided with GlovePIE, it seems that the most effective preparation of the data is to use it as impulses in desired directions, and the amount of impulse (degree of rotation or amount of acceleration) can map to the speed of movement in that direction. Using [slide] and [accum] would be a good way to accomplish this. Zero-position recalibration in periods of non-activity might be important as well.
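In Python terms, the [slide]-then-[accum] idea looks roughly like this (the smoothing factor and the idle reset are arbitrary choices, not anything GlovePIE or the Wiimote prescribes):

```python
class ImpulseToPosition:
    def __init__(self, slide=0.2):
        self.slide = slide      # smoothing factor, analogous to [slide]
        self.smoothed = 0.0
        self.position = 0.0     # running total, analogous to [accum]

    def step(self, impulse, active=True):
        # Smooth the incoming impulse, then accumulate it, so a bigger
        # impulse (more rotation or acceleration) means faster movement.
        self.smoothed += (impulse - self.smoothed) * self.slide
        if active:
            self.position += self.smoothed
        else:
            self.position = 0.0  # zero-recalibration during non-activity
        return self.position

conv = ImpulseToPosition()
for tilt in (0, 10, 10, 10, 0, 0):
    print(round(conv.step(tilt), 2))
```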
For those who might need it, here is the information which GlovePIE provides about movement:
Possible Wiimote measurements without the sensor bar, using GlovePIE.
The rotations are distinctly different from those in OpenGL. In OpenGL, an axis works like a roasting spit, so that rotation about the x-axis means that the top of the object tilts forward and back (like a man bowing). I don't know why it appears differently in GlovePIE. Here is a bit of the information provided with GlovePIE, annotated where necessary:
• Pitch (rotation about the z-axis, corresponding to rotation about the x-axis in OpenGL): -90 (pointing at floor), 0 (parallel to floor), +90 (pointing at ceiling)
The sensor bar is just a bunch of infrared lights which are always on. You can make your own fake sensor bar with candles, Christmas tree lights, or infrared remote controls with a button held down; or you can order a wireless sensor bar off the internet, or build your own.
You can read the position of the infra-red dots that the Wiimote can see with:
You can tell whether an infra-red dot can be seen with Wiimote.dot1vis to Wiimote.dot4vis
You can tell the size of a dot (between 0 and 15) with Wiimote.dot1size to Wiimote.dot4size
Have fun; I am.