Forums > Jitter

how to motion track dancers

October 1, 2007 | 3:15 am

Hi everyone,

I’m very new to Jitter, and I want to track the motion of 3 dancers and bring that data into Jitter so that I can analyze relationships. This is an architecture thesis, and I’m looking at the architecture of performance as an interactive engagement. Basically, I’m "remixing" Edgard Varèse’s Poème électronique for the Philips Pavilion (at the 1958 Brussels World’s Fair) with modern technologies.

I have looked through the message boards and tried to Google for what equipment I need, but I’m not sure what systems work best for this. Does anyone have any recommendations for a camera? Or could I use a series of colored markers and simply track them in Jitter? Is OptiTrack the way to go, or is that overkill?

I’m at a loss, so if anyone could point me in the right direction, that would be great.

thanks in advance,

Doron


October 1, 2007 | 1:04 pm

Have a look at the cv.jit objects.

http://www.iamas.ac.jp/~jovan02/cv/

They are built on OpenCV.
Very useful for tracking, shape recognition, and more.
It will take some time to learn how to use them if you are new to Jitter, but they are really worth the effort.
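For a sense of what the tracking step boils down to: the heart of colored-marker tracking is just "threshold, then centroid", which cv.jit objects handle per frame inside Max. As an illustration only, here is a minimal numpy stand-in for that operation (the RGB bounds are hypothetical values you would tune for your actual markers):

```python
import numpy as np

def track_marker(frame, lo, hi):
    """Return the (x, y) centroid of pixels whose RGB values fall
    inside the [lo, hi] bounds, or None if no pixel matches.
    frame: H x W x 3 uint8 array; lo, hi: length-3 RGB bounds."""
    mask = np.all((frame >= lo) & (frame <= hi), axis=-1)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None  # marker not visible in this frame
    return (float(xs.mean()), float(ys.mean()))

# Synthetic 100x100 frame with a 10x10 red marker
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[60:70, 40:50] = (255, 0, 0)  # rows are y, columns are x
print(track_marker(frame, (200, 0, 0), (255, 50, 50)))  # → (44.5, 64.5)
```

With three dancers you would run this once per marker color per frame; in practice you would threshold in a more lighting-robust color space than raw RGB, which is one of the things the cv.jit objects take care of for you.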

good luck


October 1, 2007 | 3:54 pm

Thanks Karrrlo

I came across that last night after posting. Between that and Eric Singer’s Cyclops, I think I have a notion of how to progress… now it’s just a matter of learning Jitter, Cyclops, connecting the hardware… oh, and constructing a thesis out of all this :)

Are there any tutorials you know of that use the cv.jit objects?

I just came across EyesWeb, and I’ll give that a look as well…


October 5, 2007 | 7:22 am

Sorry for the late answer. A place you may want to have a look at is the Musical Gesture Project at the University of Oslo, Norway.

http://www.hf.uio.no/imv/forskning/forskningsprosjekter/musicalgestures/

It might be a good place to look for your thesis research and for inspiration. There are some patches and applications for motion analysis to download under Software > Musical Gesture Toolbox.

good luck



October 5, 2007 | 7:50 am

Thanks Karl, this looks sweet.



October 5, 2007 | 8:26 pm

You are welcome.
It is a well-documented project; I hope they will pursue it.
best


October 5, 2007 | 8:35 pm

These things look fantastic. I am looking into what kind of cameras to use for this project. If I’m tracking 3 dancers in xyz, it seems ideal for these systems to use two cameras: one pointing down (to capture the xy plane) and one facing forward (to capture the xz plane). Systems that rely on sensors seem prohibitively expensive.
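Fusing the two views described above is simple once each camera reports a 2D position: the overhead camera supplies x and y, the front camera supplies z, and the shared x axis appears in both. A hypothetical sketch, assuming both cameras have already been calibrated to the same scale and origin:

```python
def fuse_views(overhead_xy, front_xz):
    """Combine an overhead camera's (x, y) reading with a front
    camera's (x, z) reading into one (x, y, z) point. Averaging
    the shared x axis is one simple way to smooth disagreement
    between the two cameras; you could also trust one camera's
    x outright."""
    x1, y = overhead_xy
    x2, z = front_xz
    return ((x1 + x2) / 2.0, y, z)

print(fuse_views((1.0, 2.0), (3.0, 0.5)))  # → (2.0, 2.0, 0.5)
```

Run per dancer per frame, this turns the two cv.jit tracking streams into one xyz coordinate per dancer; a real setup would also need to handle the cameras momentarily losing a marker.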

thanks for the great advice again!


October 6, 2007 | 6:04 am

It’s being integrated into the Jamoma project as we speak.

Check it out: http://www.jamoma.org

/*j


