Forums > Jitter

motion tracking down a narrow hallway

March 24, 2008 | 5:53 pm

Greetings all,

I have a unique challenge that has been keeping me up at night, with no solid solution yet.
Since there are usually myriad ways to achieve the same goal,
I'm looking for some advice on tracking motion down a 16-meter-long hallway.
The idea is to generate colored "blobs" with motion trails on a low-resolution LED wall with a literal resolution of 160×10 pixels.

The "blobs" will need to follow multiple objects (i.e., if two people
are walking toward each other down the hallway, each gets a unique color "blob").
Since the hallway is narrow and the ceiling is low, I believe we will need multiple sources of range data to track motion down the hall accurately.
The main question I have is: what would be the best method for the tracking? IR, ultrasound, or multiple cameras?

So far the best idea we have come up with is to use multiple
ultrasound sensors, pinged sequentially, to pool
height data for input into Jitter matrices,
then compare frames of data and slide the displayed objects based on movement across the sonar array.

We were looking at using 16 of the MaxSonar sensors (one per meter)…
basicx.com/Products/sonar/sonar.htm
but I'm afraid the area they look at may be too narrow,
so that if someone stops between sensors the range data will return to its default.
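For what it's worth, that frame-comparison idea could be sketched in plain Python (the 2.5 m ceiling height, detection threshold, and all sensor values below are made up for illustration):

```python
# Hypothetical sketch of the sonar-array idea: 16 range readings
# (one per meter) compared frame-to-frame so a blob "slides" with
# the person walking under the array. All constants are assumptions.

CEILING_CM = 250          # assumed hallway ceiling height
PERSON_THRESHOLD = 30     # cm below ceiling that counts as a person

def detect_people(readings):
    """Return indices of sensors whose range is shortened by a body."""
    return [i for i, r in enumerate(readings)
            if r < CEILING_CM - PERSON_THRESHOLD]

def track(prev, curr):
    """Pair each current detection with the nearest previous one."""
    moves = []
    candidates = detect_people(prev)
    for c in detect_people(curr):
        if candidates:
            p = min(candidates, key=lambda i: abs(i - c))
            moves.append((p, c))   # (old sensor index, new sensor index)
    return moves

prev = [250] * 16
prev[4] = 180             # someone was under sensor 4 last frame
curr = [250] * 16
curr[5] = 180             # now they are under sensor 5
print(track(prev, curr))  # -> [(4, 5)]
```

A real version would also need debouncing and the sequential-ping timing, but the frame-to-frame pairing is the core of the "slide objects" idea.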

If anyone has recommendations I would love to hear them.
Thanks in advance.
-Jason



MIB
March 24, 2008 | 5:56 pm

Do you know about the cv.jit externals? That might be a good place to start.



ico
March 24, 2008 | 6:38 pm

Why not simply use a camera (IR if it needs to work in low light),
subdivide the viewport into smaller chunks that reflect corridor segments, and
track activity per chunk by subtracting frames, and/or do some averaging
in case the noisiness of the video feed proves to be a problem?
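Sketched in Python/NumPy rather than Jitter (the frame size matches the 160×10 LED wall for convenience, and the 16-chunk split — one per meter — is an assumption), the per-chunk differencing looks like this:

```python
# Sketch of per-segment activity tracking: split the frame into
# vertical chunks and take the mean absolute frame difference in
# each. Averaging over a whole chunk also smooths camera noise.

import numpy as np

N_CHUNKS = 16  # assumed: one chunk per meter of the 16 m hallway

def chunk_activity(prev_frame, curr_frame, n_chunks=N_CHUNKS):
    """Mean absolute difference per horizontal chunk of the frame."""
    diff = np.abs(curr_frame.astype(int) - prev_frame.astype(int))
    chunks = np.array_split(diff, n_chunks, axis=1)  # split along width
    return [c.mean() for c in chunks]

prev = np.zeros((10, 160), dtype=np.uint8)
curr = prev.copy()
curr[:, 50:60] = 255                   # motion in columns 50-59
activity = chunk_activity(prev, curr)
print(activity.index(max(activity)))   # -> 5 (sixth chunk lit up)
```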

Best wishes,

Ivica Ico Bukvic, D.M.A.
Composition, Music Technology, CCTAD, CHCI, CS and Art (by courtesy)
Director, DISIS Interactive Sound & Intermedia Studio
Virginia Tech
Dept. of Music – 0240
Blacksburg, VA 24061
(540) 231-6139
(540) 231-5034 (fax)
ico@vt.edu

http://www.music.vt.edu/faculty/bukvic/



March 24, 2008 | 10:14 pm

That's an interesting idea; however, I'm not sure I would be able to position the camera high enough to track two objects at once and still position the LED blob at the same depth down the hall as the object being tracked.
In the method you described, how would you deal with two people passing in the hallway? If one person were obscured by another body?

-jason


March 24, 2008 | 10:38 pm

If the floor is free, I would tile pressure sensors under the carpet.

http://www.greyworld.org/#the_layer_/v1




ico
March 24, 2008 | 10:45 pm


I haven’t been keeping up with the latest emails, so I can only presume this is
in reply to mine. If so, this could be addressed with a combination of things, e.g.
partitioning the screen further in combination with the cv.jit blob tracking someone
mentioned here already. That still would not solve partially occluded
passers-by, especially if your corridor is not very tall.

Hope this helps!

Best wishes,

Ivica Ico Bukvic, D.M.A.


March 25, 2008 | 1:50 am

Put the cameras on the ceiling, looking straight down at the floor to
give you an overhead view; then no one is going to obscure anyone
else, unless they are levitating. Use more than one camera if you
have to. A wide-angle lens might help. Use the srcrect attribute of
the grab object to ignore the walls and only track the floor.
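As a rough Python/NumPy stand-in for that pipeline (cropping to the floor region, as srcrect would, then taking the centroid of foreground pixels; all dimensions and coordinates here are made up):

```python
# Sketch of the overhead-camera approach: crop to the floor region
# (the equivalent of srcrect), threshold, and find the blob centroid.

import numpy as np

def floor_crop(frame, left, top, right, bottom):
    """Ignore the walls: keep only the floor region of the frame."""
    return frame[top:bottom, left:right]

def centroid(mask):
    """(x, y) centre of the foreground pixels, or None if empty."""
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None
    return (xs.mean(), ys.mean())

frame = np.zeros((120, 160), dtype=np.uint8)
frame[40:50, 70:80] = 255          # a person seen from above
floor = floor_crop(frame, 10, 10, 150, 110)
print(centroid(floor > 0))         # blob position within the cropped floor
```

The x coordinate of the centroid maps directly onto the 160-pixel-long LED wall.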


March 25, 2008 | 7:53 pm

We used an array of motion sensors taken from outdoor motion-sensing lights from Walmart for a similar hallway idea. This did, however, require some custom hardware to read the output from the hacked motion sensors.


March 26, 2008 | 12:44 am

I’ve been entertaining the thought of using multiple cameras,
stitching them together with "jit.glue", and using "cv.jit.blobs.sort",
then tracking colors to generate particle-effect emitter coordinates, waves, etc.
However, I'm still working out how to keep the sorted colors consistent during collision events, like when people pass in the hall and two blobs then become the same color.
This will become a problem if I use wide-angle lenses and the perspective is anything less than straight overhead.
Is there any way to avoid this?
Also, is there a way to limit the number of colors that are assigned,
say a maximum of 4?
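One hedged way to keep colors stable through a crossing, sketched in Python: match each new blob to the nearest blob from the previous frame (nearest-neighbor on x position) and reuse its color, capping the palette at four. This is only an illustration of the matching idea, not how cv.jit.blobs.sort actually works internally.

```python
# Nearest-neighbour identity assignment with a fixed 4-colour palette.
# prev_blobs: list of (x, color) from the last frame;
# curr_positions: x positions detected in the current frame.

MAX_COLORS = 4

def assign_colors(prev_blobs, curr_positions):
    used = {color for _, color in prev_blobs}
    free = [c for c in range(MAX_COLORS) if c not in used]
    remaining = list(prev_blobs)
    out = []
    for x in curr_positions:
        if remaining:
            # inherit the colour of the closest blob in the last frame
            i = min(range(len(remaining)),
                    key=lambda j: abs(remaining[j][0] - x))
            out.append((x, remaining.pop(i)[1]))
        elif free:
            out.append((x, free.pop(0)))  # newcomer gets a free colour
    return out

prev = [(20, 0), (120, 1)]        # two people, colours 0 and 1
curr = [118, 25]                  # they kept walking toward each other
print(assign_colors(prev, curr))  # -> [(118, 1), (25, 0)]
```

Each blob keeps its colour through the pass because matching is by distance, not detection order; with an overhead view and the hallway's near-1D geometry, x position alone is often enough to disambiguate.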

On the flip side, a friend of mine is willing to program a microcontroller for the MaxSonars. I'm now leaning toward the camera
option but am still not convinced that it is the right tool for the job.


March 26, 2008 | 6:09 pm

What do you plan on doing with the data? I always create systems that acquire more data than needed… mappings are so important.


March 26, 2008 | 6:14 pm

If you go with cameras and with cv.jit.blobs.sort, then Jamie
Jewett’s previous post on this topic might be some help. You
wouldn’t have to color-track from the help patch (I’m assuming that’s
why you’re color tracking); just get the x,y values.
http://www.cycling74.com/forums/index.php?t=msg&goto=118099&rid=0&S=59ebff90c7334d210b3fffba33cdbc1d&srch=cv.jit.blobs.sort#msg_118099


