how to motion track dancers

Oct 1, 2007 at 3:15am


Hi everyone,

I’m very new to Jitter, and I want to track the motion of 3 dancers and bring that data into Jitter so that I can analyze relationships. This is an architecture thesis, and I’m looking at the architecture of performance as an interactive engagement. Basically, I’m “remixing” Edgard Varèse’s Poème électronique for the Philips Pavilion (at the 1958 Brussels World’s Fair) with modern technologies.

I have looked through the message boards and have tried to Google for what equipment I need, but I’m not sure what systems work best for this. Does anyone have any recommendations for a camera? Or can I use a series of colored markers and then simply track them in Jitter? Is OptiTrack the way to go, or is that overkill?

I’m at a loss, so if anyone could point me in the right direction, that would be great.

thanks in advance,

Doron

#33908
Oct 1, 2007 at 1:04pm

Have a look at the cv.jit objects.

http://www.iamas.ac.jp/~jovan02/cv/

They are built on the OpenCV library.
Very useful for tracking, shape recognition, and more.
It will take some time to learn how to use them if you are new to Jitter, but they are really worth the effort.
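The cv.jit externals live inside Max patches, so they can’t be quoted here as text, but the colored-marker idea from the original question can be sketched outside Jitter. A minimal sketch in plain Python/NumPy (the function name, tolerance, and synthetic frame are all hypothetical, purely for illustration of the thresholding-and-centroid approach):

```python
import numpy as np

def track_marker(frame, target_rgb, tol=30):
    """Return the (x, y) centroid of pixels within `tol` of `target_rgb`,
    or None if the marker is not visible in the frame."""
    # Per-pixel distance from the target colour (max channel difference)
    diff = np.abs(frame.astype(int) - np.array(target_rgb)).max(axis=2)
    ys, xs = np.nonzero(diff <= tol)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Synthetic 240x320 RGB frame with a red marker blob centred near (100, 60)
frame = np.zeros((240, 320, 3), dtype=np.uint8)
frame[55:66, 95:106] = (255, 0, 0)

print(track_marker(frame, (255, 0, 0)))  # → (100.0, 60.0)
```

With three dancers you would run one such threshold per marker colour per frame; cv.jit provides the equivalent building blocks (thresholding, blob detection, centroids) as Jitter objects.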

good luck

#113605
Oct 1, 2007 at 3:54pm

Thanks Karrrlo

I came across that last night after posting. Between that and Eric Singer’s Cyclops, I think I have a notion of how to progress… now it’s just a matter of learning Jitter and Cyclops, connecting the hardware… oh, and constructing a thesis out of all this :)

Are there any tutorials you know of that use the cv.jit objects?

I just came across EyesWeb, and I’ll give that a look as well…

#113606
Oct 5, 2007 at 7:22am

Sorry for the late answer. A place you may want to have a look at is the Musical Gestures project at the University of Oslo, Norway.

http://www.hf.uio.no/imv/forskning/forskningsprosjekter/musicalgestures/

It might be a good place to look for your thesis research and inspiration. There are some patches and applications for motion analysis to download under Software > Musical Gestures Toolbox.

good luck


#113607
Oct 5, 2007 at 7:50am

Thanks Karl, this looks sweet.


#113608
Oct 5, 2007 at 8:26pm

You are welcome.
It is a well-documented research project; I hope they will pursue it.
Best

#113609
Oct 5, 2007 at 8:35pm

These tools look fantastic. I am looking at what kind of cameras to use for this project. If I’m tracking 3 dancers in xyz, it seems ideal with these systems to use two cameras: one pointing down (to capture the xy plane) and one facing forward (to capture the xz plane). Systems that rely on sensors seem to be prohibitively expensive.
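The fusion step in the two-camera setup described above can be sketched very simply: the overhead view yields (x, y), the front view yields (x, z), and the shared x axis gives a sanity check that both cameras are seeing the same dancer. A hypothetical sketch (function name and tolerance are illustrative, not from any of the tools mentioned):

```python
def fuse_views(top_xy, front_xz, x_tolerance=10.0):
    """Combine an overhead (x, y) fix and a front-facing (x, z) fix
    into a single (x, y, z) position. Returns None when the two x
    estimates disagree beyond the tolerance (likely different dancers)."""
    (x_top, y), (x_front, z) = top_xy, front_xz
    if abs(x_top - x_front) > x_tolerance:
        return None
    # Average the redundant x measurement from the two cameras
    return ((x_top + x_front) / 2.0, y, z)

print(fuse_views((120.0, 80.0), (122.0, 45.0)))  # → (121.0, 80.0, 45.0)
```

In practice you would also need to calibrate both cameras into a common coordinate system first; this sketch assumes that has already been done.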

thanks for the great advice again!

#113610
Oct 6, 2007 at 6:04am

It’s being integrated into the Jamoma project as we speak.

Check it out: http://www.jamoma.org

/*j


#113611
