motion tracking down a narrow hallway

Mar 24, 2008 at 5:53pm

Greetings all,

I have a unique challenge that has been keeping me up at night
with no solid solution as of yet.
Since there are usually myriad ways to achieve the same goal,
I'm looking for some advice on tracking motion down a 16-meter-long hallway.
The idea is to generate color “blobs” with motion trails on a low resolution LED wall with a literal resolution of 160×10.

The “blobs” will need to follow multiple objects (i.e., if two people
are walking towards each other down the hallway, each has a unique color “blob”).
Since the hallway is narrow and the ceiling is low, I believe we will need multiple sources of range data to track the motion down the hall “accurately”.
The main question I have is what would be the best method for the tracking: IR, ultrasound, or multiple cameras?

So far the best idea we have come up with is to use multiple
ultrasound sensors, pinged sequentially, to pool
height data as input to Jitter matrices,
then compare successive frames of data and slide the objects based on movement across the sonar array.
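
Roughly what I have in mind, written out as a little Python sketch rather than a patch (all the numbers are invented; the real readings would come from whatever reads the sonars):

NUM_SENSORS = 16          # one sonar per meter along the 16 m hallway
CEILING_CM = 240          # assumed ceiling height; readings near this mean "nobody there"
PERSON_MARGIN_CM = 40     # anything this much closer than the ceiling counts as a person

def occupancy(frame_cm):
    """One frame of 16 range readings -> list of booleans, one per meter-cell."""
    return [r < (CEILING_CM - PERSON_MARGIN_CM) for r in frame_cm]

def motion(prev_cm, curr_cm):
    """Compare two frames: +1 where someone appeared, -1 where they left, 0 otherwise."""
    return [int(c) - int(p) for p, c in zip(occupancy(prev_cm), occupancy(curr_cm))]

# fake data: a person under sensor 3 moves to sensor 4 between frames
prev = [CEILING_CM] * NUM_SENSORS
curr = [CEILING_CM] * NUM_SENSORS
prev[3] = 170
curr[4] = 172

print(occupancy(curr))     # which meter-cells are occupied now
print(motion(prev, curr))  # -1 at cell 3, +1 at cell 4 -> slide that blob up the hall

The same frame-to-frame comparison would tell us which way to slide each blob on the LED wall.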

We were looking at using 16 of the MaxSonar sensors (1 per meter)…
basicx.com/Products/sonar/sonar.htm
…but I'm afraid the area each one covers may be too narrow,
so that if someone stops between sensors the range data will return to its default.

If anyone has recommendations I would love to hear them.
Thanks in advance.
-Jason

#36488
Mar 24, 2008 at 5:56pm

Do you know about the cv.jit externals? This might be a good place to start.

#125288
Mar 24, 2008 at 6:38pm

Why not simply use a camera (IR if required for low light),
subdivide the viewport into smaller chunks reflecting corridor segments, and
track activity per chunk by subtracting frames and/or doing some averaging
should the noisiness of the video feed prove to be a problem?
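
Something along these lines, as a very rough OpenCV/Python sketch (the camera index, chunk count, and smoothing weight are placeholders, not a finished implementation):

# Rough sketch of "one motion value per corridor chunk", assuming the hallway
# runs left-to-right in the camera image.
import cv2
import numpy as np

NUM_CHUNKS = 16   # one chunk per meter of hallway
ALPHA = 0.1       # running-average weight, to ride out video noise

cap = cv2.VideoCapture(0)          # placeholder camera index
ok, frame = cap.read()
if not ok:
    raise SystemExit("no camera frame")
avg = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
    diff = cv2.absdiff(gray, avg)             # frame subtraction against the running average
    cv2.accumulateWeighted(gray, avg, ALPHA)  # then fold the new frame into the average
    # split the difference image into vertical strips, one per corridor segment
    chunks = np.array_split(diff, NUM_CHUNKS, axis=1)
    activity = [float(c.mean()) for c in chunks]   # one motion value per meter
    print(" ".join("%5.1f" % a for a in activity))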

Best wishes,

Ivica Ico Bukvic, D.M.A.
Composition, Music Technology, CCTAD, CHCI, CS and Art (by courtesy)
Director, DISIS Interactive Sound & Intermedia Studio
Virginia Tech
Dept. of Music – 0240
Blacksburg, VA 24061
(540) 231-6139
(540) 231-5034 (fax)
ico@vt.edu

http://www.music.vt.edu/faculty/bukvic/


#125289
Mar 24, 2008 at 10:14pm

That's an interesting idea; however, I'm not so sure I would be able to position the camera high enough to track two objects at once and still place the LED blob at the same depth down the hall as the object being tracked.
In the method you described, how would you deal with two people passing in the hallway, or one person being obscured by another body?

-jason

#125290
Mar 24, 2008 at 10:38pm

If the floor is free, I would tile pressure sensors under the carpet.

http://www.greyworld.org/#the_layer_/v1
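
Very roughly, something like this (Python just to show the mapping; read_tiles() is a made-up stand-in for whatever hardware interface you end up with):

# One pressure switch per meter of floor, mapped straight onto columns of the
# 160x10 LED wall (10 LED columns per meter of hallway).
NUM_TILES = 16
LED_COLS = 160

def read_tiles():
    # placeholder: pretend someone is standing on tile 3
    return [False] * 3 + [True] + [False] * 12

def tiles_to_led_columns(tiles):
    """Each pressed tile lights the LED columns above that meter of floor."""
    cols_per_tile = LED_COLS // NUM_TILES
    lit = []
    for i, pressed in enumerate(tiles):
        if pressed:
            lit.extend(range(i * cols_per_tile, (i + 1) * cols_per_tile))
    return lit

print(tiles_to_led_columns(read_tiles()))   # columns 30..39 light up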


#125291
Mar 24, 2008 at 10:45pm


I haven’t been keeping up with the latest emails, so I can only presume this is
in reply to my email. If so, this could be a combination of things, e.g.
partitioning the screen further in combination with the cv.jit blob tracking
someone mentioned here already. That still would not solve partially occluded
passers-by, especially if your corridor is not very tall.

Hope this helps!

Best wishes,

Ivica Ico Bukvic, D.M.A.
Virginia Tech

#125292
Mar 25, 2008 at 1:50am

Put the cameras on the ceiling looking straight down at the floor to
give you an overhead view; then no one is going to obscure anyone
else, unless they are levitating. Use more than one camera if you
have to. A wide-angle lens might help. Use the srcrect attribute of
the grab object to ignore the walls and only track the floor.
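
In non-Jitter terms the idea is just: crop each overhead camera to its floor region and butt the strips together. A rough Python/OpenCV sketch, where the camera indices and crop rectangles are placeholders you would measure on site:

import cv2
import numpy as np

cam_a = cv2.VideoCapture(0)      # camera over the first half of the hall
cam_b = cv2.VideoCapture(1)      # camera over the second half

FLOOR_A = (100, 60, 440, 360)    # x, y, w, h of the floor as seen by camera A
FLOOR_B = (90, 70, 450, 350)     # same for camera B

def floor_view(cap, rect):
    """Grab one frame and keep only the floor rectangle (the srcrect equivalent)."""
    ok, frame = cap.read()
    if not ok:
        return None
    x, y, w, h = rect
    return frame[y:y + h, x:x + w]

a = floor_view(cam_a, FLOOR_A)
b = floor_view(cam_b, FLOOR_B)
if a is not None and b is not None:
    h = min(a.shape[0], b.shape[0])   # bring both strips to a common height
    a = cv2.resize(a, (a.shape[1], h))
    b = cv2.resize(b, (b.shape[1], h))
    hallway = np.hstack([a, b])       # the glued, walls-free overhead view
    cv2.imwrite("hallway_overhead.png", hallway)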


#125293
Mar 25, 2008 at 7:53pm

We used an array of motion sensors taken from outdoor motion-sensing lights from Walmart for a similar hallway idea… this did, however, require some custom hardware to take the output from the hacked motion sensors.

#125294
Mar 26, 2008 at 12:44am

I’ve been entertaining the thought of using multiple cameras,
putting them together with “jit.glue” and using “cv.jit.blobs.sort”,
then tracking colors to generate particle-effect emitter coordinates, waves, etc.
However, I’m still working out how to keep the sorted colors during collision events, like when people pass in the hall and two blobs then become the same color.
This will become a problem if I use wide-angle lenses and the perspective is anything less than straight overhead.
Is there any way to avoid this?
Also, is there a way to limit the number of colors that are assigned,
say a maximum of 4?

On the flip side, a friend of mine is willing to program a microcontroller for the MaxSonars. I’m now leaning towards the camera
option but am still not convinced that this is the right tool for the job.

#125295
Mar 26, 2008 at 6:09pm

What do you plan on doing with the data? I always create systems that acquire more data than needed… mappings are so important.

#125296
Mar 26, 2008 at 6:14pm

if you go with cameras and with cv.jit.blobs.sort, then Jamie
Jewett’s previous post on this topic might be some help. You
wouldn’t have to color track from the help patch (I’m assuming that’s
why you’re color tracking) – just get the x,y values.
http://www.cycling74.com/forums/index.php?t=msg&goto=118099&rid=0&S=59ebff90c7334d210b3fffba33cdbc1d&srch=cv.jit.blobs.sort#msg_118099
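
For instance, once you have per-frame x,y centroids, something as simple as nearest-neighbor matching can keep identities (and therefore colors) stable and cap them at four. A toy Python sketch of the idea (the centroid numbers in the example calls are invented; in a patch they would come from the cv.jit blob data):

MAX_BLOBS = 4
COLORS = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 0)]

tracks = {}      # track id -> last known (x, y)
next_id = 0

def update(centroids, max_jump=40.0):
    """Match new centroids to existing tracks by distance; new ids only if under the cap."""
    global next_id
    assigned = {}
    unused = set(tracks)
    for cx, cy in centroids:
        best, best_d = None, max_jump
        for tid in unused:
            tx, ty = tracks[tid]
            d = ((cx - tx) ** 2 + (cy - ty) ** 2) ** 0.5
            if d < best_d:
                best, best_d = tid, d
        if best is None and len(tracks) < MAX_BLOBS:
            best = next_id           # brand-new person, brand-new id
            next_id += 1
        if best is not None:
            tracks[best] = (cx, cy)
            assigned[best] = (cx, cy)
            unused.discard(best)
    # note: stale tracks are never dropped here; a real patch would time them out
    return {tid: COLORS[tid % MAX_BLOBS] for tid in assigned}

print(update([(10, 5), (150, 5)]))    # two people enter: ids 0 and 1
print(update([(14, 5), (146, 5)]))    # they walk toward each other and keep their colors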

