eye recognition

Matth's icon
Olivier Pasquet's icon

Have a look at the Machine Perception Toolbox, which does eye-blink detection:

I started an external called jit.op.sarko but I never had the time to finish it. GRRrrr

best,

O./////

Dieter_Laser's icon

maybe you could use an IR (infrared) array and track the reflecting light from the eyes with a Wii-controller

maybe...

yair reshef's icon

"cvEyeTracker is eyetracking software aimed at doing simple dark-pupil eye
tracking from a firewire camera that monitors the eye. The software also
supports input from a second camera to provide a frame of reference of where
the eye is looking."

they modified a Unibrain FireWire camera so that the sensor is separated from the camera body, a design Unibrain later commercialised.

yair reshef
On Thu, Mar 6, 2008 at 9:47 PM, Dieter_Laser wrote:

>
> maybe you could use an IR (infrared) array and track the reflecting light
> from the eyes with a Wii-controller
>
> maybe...
>

Jean-Marc Pelletier's icon

We have been doing some work here (IAMAS) with blink detection and I've devised a number of patches for head tracking, eye detection and blink detection that work fairly well using only standard cv.jit objects. I've included one of the better patches below. It works well with dark eyes, but I think blue-eyed people might be its weakness. Unfortunately, being in Japan, I don't have access to a whole lot of blue eyes for testing!

cv.jit.faces is for face detection; once a face is detected, it is better to switch to another method for tracking. Here, using the bounding box returned by cv.jit.faces, I make a rough guess as to the eyes' positions. I then initialize cv.jit.track with features in that area and track the face that way. The eyes are a feature-rich region of the face and are easy to track reliably.
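For readers who don't use Max, the rough-guess step can be sketched in plain Python. The fractional offsets below are illustrative assumptions based on typical facial proportions, not values taken from the patch:

```python
def guess_eye_regions(face_x, face_y, face_w, face_h):
    """Given a face bounding box, return rough (x, y) centres for the
    left and right eye regions, using typical facial proportions."""
    eye_y = face_y + int(0.38 * face_h)    # eyes sit roughly 38% down the face
    left_x = face_x + int(0.30 * face_w)   # ~30% in from the left edge
    right_x = face_x + int(0.70 * face_w)  # ~70% in from the left edge
    return (left_x, eye_y), (right_x, eye_y)
```

These guessed centres only need to be good enough to seed the feature tracker; the later refinement steps correct the remaining error.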

There is another big advantage to identifying features: the eye region is surrounded by very uniform regions: the forehead, the cheeks and the nose ridge. This means that if I have a list of features roughly centered around where the eyes should be, calculating the orientation of this distribution will give me the orientation of the head.

From this information, I can make a much better guess as to the eyes' positions.
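The orientation-from-features idea amounts to principal component analysis on the tracked points. A minimal NumPy sketch (my own illustration, not cv.jit code) of estimating the in-plane head roll from a cloud of eye-region features:

```python
import numpy as np

def head_orientation(points):
    """Estimate in-plane head roll from tracked feature points.
    points: sequence of (x, y) features clustered around the eyes.
    Returns the angle (radians) of the principal axis of the cloud."""
    pts = np.asarray(points, dtype=float)
    centred = pts - pts.mean(axis=0)          # centre the distribution
    cov = np.cov(centred.T)                   # 2x2 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)    # eigenvalues in ascending order
    major = eigvecs[:, np.argmax(eigvals)]    # direction of largest spread
    return np.arctan2(major[1], major[0])
```

Because the features cluster along the eye line, the major axis of their covariance roughly follows the tilt of the head.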

I further refine this guess by using the new external cv.jit.shift to "home in" on the darkest area in a submatrix that contains an eye. In most cases, this accurately finds the pupil.
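The general idea behind this refinement is a mean-shift-style iteration toward a darkness-weighted centroid. A hedged Python sketch of that idea (my own illustration; this is not the cv.jit.shift implementation):

```python
import numpy as np

def shift_to_darkest(gray, cx, cy, win=8, iters=10):
    """Iteratively move a window centre toward the darkness-weighted
    centroid of the pixels it covers -- a mean-shift-style refinement
    toward the pupil in a grayscale eye patch."""
    h, w = gray.shape
    for _ in range(iters):
        x0, x1 = max(0, cx - win), min(w, cx + win + 1)
        y0, y1 = max(0, cy - win), min(h, cy + win + 1)
        patch = gray[y0:y1, x0:x1].astype(float)
        weights = patch.max() - patch + 1e-9   # darker pixels weigh more
        ys, xs = np.mgrid[y0:y1, x0:x1]        # pixel coordinate grids
        total = weights.sum()
        nx = int(round((xs * weights).sum() / total))
        ny = int(round((ys * weights).sum() / total))
        if (nx, ny) == (cx, cy):               # converged
            break
        cx, cy = nx, ny
    return cx, cy
```

Starting from the rough eye guess, each iteration pulls the window toward the dark pupil, which dominates the weighting.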

In test conditions, we've had eye and blink detection ratios of close to 100%. However, when actual interactive pieces were deployed, performance dropped to around 80%. Odd eyeglasses and haircuts accounted for a small part of the problem. However, the major issue was that extra processing running on the same computer to play video and sound slowed down the frame rate to the point where it was too slow to capture rapid blinks.

Here is the patch; it's rather big and not all that well documented, but it should work. (Built-in iSights on Mac laptops work especially well in good lighting.)

Warning: there's an important loadbang that needs to fire, so if you just copy and paste it, it probably won't work. Save and re-open the patch.

Jean-Marc

Max Patch
Copy patch and select New From Clipboard in Max.


cap10subtext's icon

JM: this patch is great, but I find it jumps to my eyebrows instead of focusing on the pupil after about 7 seconds or so. Refreshing seems to fix it but maybe there's a more elegant solution? Thanks for sharing.

Jean-Marc Pelletier's icon

Quote: cap10subtext wrote on Mon, 10 March 2008 11:39
----------------------------------------------------
> JM: this patch is great, but I find it jumps to my eyebrows instead of focusing on the pupil after about 7 seconds or so. Refreshing seems to fix it but maybe there's a more elegant solution? Thanks for sharing.
----------------------------------------------------

Play with the threshold value; set it as low as possible. When I last tried it, the default value (35, I think) was a bit too high. This is likely what causes it to latch onto the eyebrows instead.

Also, it helps to refresh periodically. I didn't include this function in this patch because the timing is likely to be application-dependent. You might want to reset every five or ten seconds, or after a blink, or when the person is still...

Jean-Marc

flim's icon

hello!

sorry to dig this up..!

I was just wondering whether it would take much modification to detect a second pair of eyes :) i.e. having 2 people in the picture..

would it suffice to simply duplicate everything from [p find_face] onwards, adapting the [value] objects' names accordingly and bringing them together in [p draw_line]..?

even a "yes" or a "no" at this point would be of great help before I start messing around for nothing (my speciality)..

cheers!

-jonas

Max Patch
Copy patch and select New From Clipboard in Max.

STPHNMNSLW's icon

that patch is so cool!! jean-marc you are such a cool guy!... you might be interested in this: http://text20.net/node/14

lewis lepton's icon

this is a great patch, although...

i have to take my glasses off to get it to work, hahaha. i have the worst eyesight, so i can't actually see what is happening without them. but still cool nonetheless.

lewis edwards
------
smokingbunny.co.uk

pid's icon

lewis, you must have missed the "@spectacles $1" attribute.... must be your eyesight...

flim's icon

sorry for asking before having made any practical attempts, but I'm slightly dizzy from stripping down this patch, after quite some hours.. I'll continue tomorrow, with or without your help :)

until then, if someone can support me in the following assumption, I'd be quite happy to know if I'm on the right track:

in order to actually visualize the detection of 2 pairs of eyes, I think I've found the key in [p find_eyes] (tucked inside [p find_face])

am I possibly right..? slicing what [p find_eyes] gives me shouldn't be the problem. I'll just have to get it to extract "eye data" equal to the number of faces detected.

increasing cv.jit.track's npoints (the object upstream of [p find_eyes]) hasn't helped much either, so it seems I have to concentrate on this one subpatcher alone.

I'll let you know if I've succeeded. tomorrow. :) until then

all the best

-jonas

flim's icon

done :) for anybody who's interested in detecting as many eyes as possible (though I haven't actually probed its limits; this modification only handles 2 pairs of eyes), here you go

thanks again, jean-marc, for this incredible starting point

Max Patch
Copy patch and select New From Clipboard in Max.

flim's icon

sorry, a slight revision in [p draw_line] (the clear messages were quite dispensable..):

Max Patch
Copy patch and select New From Clipboard in Max.

akee-rf's icon

Did anyone manage to get the blinking information?
I tried with jit.3m but ...

I'm really looking forward to blink detection, if someone has something to share!!

madjax's icon

It's been a while, so maybe this is due to a Max version change, or a problem with an international keyboard or something, but when I pasted JMP's code into Max, I had to find and replace all the "–" (ASCII extended code 150dec, not [minus]) with "-" (minus), and all the "#1", "#2", etc. with "$1", "$2".

After fixing these obvious glitches, I was still having trouble making it work, so I started reverse-engineering it and have discovered these fixes... so far:

The sub-patch "two_eyes" would not pass a signal through the jit.submatrix object until I added another inlet and fed it the matrix from the jit.rgb2luma object. I also had to add "@dim 50 33" to the jit.submatrix objects to get them to be the right size.

The sub-patch "draw_pupil" needed some editing to the message box feeding the jit.lcd object: change it to "frgb 0 255 0 255, framerect $1 $2 $3 $4".

The sub-patch "draw_line": there is something wrong with the final message feeding the jit.lcd object. The message has a packed list of 6 ints being fed to it, but it is only sending "frgb 255 0 0". I haven't figured out what those 6 values are for. paintarc? ("paintarc $1 $2 $3 $4 $5 $6" superimposes a half circle roughly placed over the pupil.)

And finally, the jit.pwindows at the very bottom of the patch, which I believe should show an isolated view of the detected pupil portion of the video stream, won't display anything unless you feed them a matrix from somewhere (probably from the left outlets of the top two pwindows). You also have to set the dimensions of the jit.submatrix objects. I can't tell whether these should be the same as the dimensions in the "two_eyes" sub-patch or whether they should be dynamically updated from the "eyedim" value object.

AND NOW SOME QUESTIONS:

Has anybody used this for gaze tracking? Do you have a calibration patch with final screen coordinates that you would like to share?

Is this even worth it, or should I use external eye tracking software and bring the data into max?

Eye-Tracker.maxpat
Max Patch
madjax's icon

I found a complete version with everything working on a different thread. I am posting it here too, in case anybody finds this thread and wants it.

Eyetracker2.maxpat
Max Patch
whorl's icon

Awesome patches, but gaze detection is another matter. You need external hardware if you are interested in that.

The Pupil and The EyeWriter projects are both excellent and reasonably affordable open-source options:

The Pupil lets you track gaze with a head-mounted camera, essentially letting you move around while it tracks your gaze, while the EyeWriter requires you to keep a relatively fixed head position.

jaclynr's icon

Hi,

Does anyone know how to detect eyes without detecting the face first? I am new to Max, but am creating an art piece that needs to know if someone's eyes are looking through two eye-sized holes.

Thanks a lot!

Laura Bloom's icon

I am also trying to detect eyes and build algorithms from their movements. Is there a function that helps detect eyes in a live stream? I am also doing this for an art project.

Ian Bloomfield's icon

whenever I try to run this patch, it crashes with an error.

Am I doing something wrong?

Om  Shidhaye's icon

I too am facing the same issue as Ian Bloomfield.