anyone used MindSet yet??
Just came across the MindSet from NeuroSky (www.neurosky.com) and then found a Max external for it here: http://github.com/qdot/np_mindset/
Just wondering if anyone has used the MindSet and would be willing to share some insights. How reproducible is the data, or is it just a bunch of noise? Is it worth the $200?
Sounds like an interesting controller if it works.
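For reference, the external linked above presumably just parses the ThinkGear serial stream the headset sends over its Bluetooth serial port. Here is a rough, untested Python sketch of that kind of parsing, in case anyone wants to eyeball the attention/meditation numbers without building the external -- the port name, baud rate and data codes are from memory of NeuroSky's docs, so treat them as assumptions and check them:

# Untested sketch: read Attention/Meditation values from the MindSet's
# Bluetooth serial port. Port name, baud rate and data codes are assumptions.
import serial  # pyserial

PORT = "/dev/tty.MindSet-DevA"   # made-up port name; yours will differ
BAUD = 57600                     # assumed ThinkGear baud rate

def packets(ser):
    """Yield (attention, meditation) pairs as they arrive."""
    while True:
        # every packet starts with two 0xAA sync bytes
        if ser.read(1) != b"\xaa" or ser.read(1) != b"\xaa":
            continue
        plength = ser.read(1)[0]
        if plength >= 170:                  # invalid payload length, resync
            continue
        payload = ser.read(plength)
        checksum = ser.read(1)[0]
        if ((~sum(payload)) & 0xFF) != checksum:
            continue                        # corrupted packet, skip it
        attention = meditation = None
        i = 0
        while i + 1 < len(payload):
            code = payload[i]
            if code == 0x04:                # ATTENTION eSense, 0-100
                attention = payload[i + 1]
                i += 2
            elif code == 0x05:              # MEDITATION eSense, 0-100
                meditation = payload[i + 1]
                i += 2
            elif code >= 0x80:              # multi-byte rows carry a length byte
                i += 2 + payload[i + 1]
            else:                           # other single-byte rows (signal quality, etc.)
                i += 2
        if attention is not None or meditation is not None:
            yield attention, meditation

with serial.Serial(PORT, BAUD) as ser:
    for att, med in packets(ser):
        print("attention:", att, "meditation:", med)

Logging a few minutes of those numbers while you relax and then concentrate should answer the "is it just noise" question pretty quickly.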
Hi, there's been a bit of discussion about this. I built an external on the promise that NeuroSky was going to send me a MindSet, then they decided not to certify it in Canada. :P So I haven't even been able to test mine. Then I found the one on GitHub, but I haven't tried that either. Not to sound bitter, but I've been dying to get my hands on one.
Anyways, here's the post:
I don't know if the Mindset is really going to be all that great of a controller. Basically it reads your brain waves and then triggers events when they change. It does not read your thoughts. Have you ever tried changing your brain waves? You have to sit and concentrate. Not very responsive.
go beyond thought :)
"Basically it reads your brain waves
and then triggers events when they change. It does not read your
thoughts. Have you ever tried changing your brain waves?
You have to sit and concentrate. Not very responsive."
Dude, just hook up a [change] and control a bang with my mind, wtf? This $%@# is next level, man, to boldly go where no man has gone before! When I have cash I will get one. :)
It would also be nice if it worked both ways :) That would make it a real HID :)
This one looks more robust, especially with the facial gesture recognition. Also it seems like its brainwave elements are more detailed, but maybe that's just how the device looks :)
No mention of controlling via Max, but you can use its built-in commands that send keystrokes, to some effect.
Why they don't let you easily access the brainwave data streams (without using the API and programming your own commands), I don't know... that's what we really need if we're going to go nuts with this stuff...!
I love the idea of various brain frequency levels controlling audiovisual effects, and seeing what happens when you (say) watch a scary bit in a movie, or someone pinches you, or you think of a happy place... or best of all, just focus on what's coming out (the audio and/or visuals) and see what kinds of crazy feedback you can get...
stack overflow in your brain.
Ah, poking around some more, it looks like the Emotiv data could be accessed using [hi] or [udpreceive]:
Awesome, especially as it might allow one to buy only the headset rather than the pricier SDK kits. Though I imagine those are very powerful, we can roll our own analysis/visualizations in a patch instead... :)
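If the Emotiv tools can hand you a number at all, the Max side really only needs a [udpreceive]. Here's a rough, untested Python sketch of the kind of UDP/OSC bridge I mean -- the read_emotiv_value() function is a made-up stand-in, not a real Emotiv API call, and the OSC address and port are arbitrary:

# Untested sketch: pack a float into a minimal OSC message and send it over UDP
# so [udpreceive 7400] in Max can pick it up. read_emotiv_value() is a stand-in.
import random
import socket
import struct
import time

HOST, PORT = "127.0.0.1", 7400      # match the argument you give [udpreceive]

def osc_message(address, value):
    """Build a minimal OSC packet carrying one float argument."""
    def pad(b):
        # OSC strings are null-terminated and padded to a multiple of 4 bytes
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

def read_emotiv_value():
    """Stand-in for whatever value the Emotiv SDK actually exposes."""
    return random.random()          # fake data so the sketch runs on its own

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
while True:
    sock.sendto(osc_message("/emotiv/excitement", read_emotiv_value()), (HOST, PORT))
    time.sleep(0.05)                # roughly 20 messages a second

On the Max side that should come out of [udpreceive 7400] as "/emotiv/excitement 0.42"-style messages you can [route] and map from there.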
Nice... can you imagine all the possibilities with this thing? I wonder what the response will be when dreaming or doing shrooms compared to normal readings.
So yeah, I would love to get into this and experiment.
Frid
Wow, that Emotiv looks pretty tempting. But after viewing these videos, it looks like it is not very responsive.
By the time you do a filter sweep with this thing, the crowd will have left already.
Now, did I see some funny gestures there? :) And Dave doesn't really seem into it at first. But I must say I actually hadn't expected this amount of control.
"By the time you do a filter sweep with this thing,
the crowd will have left already."
:) Yeah, especially when doing the sweep move.
Yes, it makes for a very theatrical performance.
Sorry, been away.
Some very interesting stuff. Just imagine having 5 or 6 of them planted on audience members... ever so much fun could be had. Wish I could get my hands on one to play with. But it's just a tad expensive for giggles :)
Been looking at the Emotiv a lot, and I think it's got some good potential. You establish a "neutral" level as a baseline, then you "train" various thoughts to become triggers. So you think of something very hard, it learns that signal signature, and then if you get close to that signature again, it triggers (conceptually something like the little sketch after this post). Seems to work pretty well, considering that the idea sounded totally impossible given the brain's complexity...
The videos with people using their hands are misleading, you don't need to do that... it's just so you feel more like a Jedi. And no, these aren't the droids you're looking for.
Aside from the thinking elements, the facial recognition is apparently right on, so at the very least you could control a ton of things with winking, frowning, smirking, etc... also great for the audience :)
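To make the training idea concrete, here's the little sketch mentioned above -- a toy Python version of the general scheme as I understand it. This is not the Emotiv SDK; the band-power features, the 14-channel fake data and the threshold are all invented for illustration:

# Toy illustration of "train a thought, then trigger on it": record a neutral
# baseline and a trained-thought template, then fire when a new window of EEG
# looks more like the template than the baseline. Everything here is made up.
import numpy as np

def band_features(eeg_window):
    """Crude per-channel power features from a (channels x samples) window."""
    return np.log1p(np.mean(np.square(eeg_window), axis=1))

class ThoughtTrigger:
    def __init__(self, threshold=5.0):
        self.neutral = None        # baseline feature vector
        self.trained = None        # "thought" feature vector
        self.threshold = threshold

    def train_neutral(self, windows):
        self.neutral = np.mean([band_features(w) for w in windows], axis=0)

    def train_thought(self, windows):
        self.trained = np.mean([band_features(w) for w in windows], axis=0)

    def check(self, eeg_window):
        """True when the window looks more like the trained thought than neutral."""
        f = band_features(eeg_window)
        d_thought = np.linalg.norm(f - self.trained)
        d_neutral = np.linalg.norm(f - self.neutral)
        return d_thought < d_neutral and d_thought < self.threshold

# usage with fake data, just to show the shape of the thing
rng = np.random.default_rng(0)
trigger = ThoughtTrigger(threshold=5.0)
trigger.train_neutral([rng.normal(size=(14, 128)) for _ in range(20)])
trigger.train_thought([rng.normal(0.5, size=(14, 128)) for _ in range(20)])
print(trigger.check(rng.normal(0.5, size=(14, 128))))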
"The videos with people using their hands are misleading, you don't need to do that..."
Just imagine Dave standing there, crossing his arms, doing nothing, and this guy telling you: "Look at Dave controlling the stones on the screen with his mind." That probably isn't great advertising. My guess is that that's the reason for the funky moves: to have some visual reference other than the screen.
When I get one (in a long time probably, no cash) I'll go do the moves anyway. Just to improve my dancing skills.
Frid
Wow, just a thought, but what if you hooked it up to a dog or something? Have the dog trigger MIDI notes in Max :) Or do your own Pavlov experiment: hook up the dog, and when it thinks a happy thought it gets fed. I wonder how long it'll take for the dog to figure it out. Don't worry, I do not own a dog :) I own a plant.
Hey gang --
I've been collaborating with Dave Sulzer (neurophysiology researcher/colleague here at Columbia, but better known to the music world as "Dave Soldier") for a couple of years on EEG-based music generation. Some of our work can be seen in the links here:
We've been using the infusionsystems/i-cubex sensors, but we've had a few problems with them. They're really expensive (especially for what you get), and the bluetooth nonsense is just plain annoying. I just purchased a NeuroSky MindSet system a few weeks ago, but haven't had time to delve into it. I was also interested in the Emotiv system, but at the time I had a budget for ordering things, their website said "We are currently sold out of this product. We are taking orders for shipping in May". And in mid-May they weren't shipping yet...
A few observations: the signal coming in (at least from the i-cubex, and according to Dave probably from just about all of these devices) is quite noisy, but Noise Can Be Fun. Don't expect that you will get really localized brainwave info; it's just not gonna happen. However, you *can* work with what you get; just realize that you're going to see an overview of total brain activity unless you can spend $50k (or more) on some serious medical hardware. (There's a toy smoothing sketch at the end of this post for one way to tame that kind of signal.)
And even with the serious medical hardware, if you imagine you can "think" a particular thought and then have it somehow happen, you're _way_ beyond the current state-of-the-art. Well, maybe not _way_ beyond, but such a system -- especially if it is robust and consistent -- could probably get you some papers published in those prestigious journals necessary to gain you tenure at a US research university.
The biggest problems we've had in performance (aside from the annoying bluetooth implementation) result from electrical shorts between the sensors due to sweating. About halfway through most of our performances the audience starts to laugh because one (or more) of the performers suddenly appears brain-dead. Time to wipe the brow!
Also, it turns out that moving hands and limbs and other body parts actually *does* affect the EEG signals. Your brain forms the action that triggers the movements. Placing sensors on optimally located parts of the skull can help isolate some of this activity. Some of Dave's research involves the evoked differences between doing an action and simply thinking hard about doing it. Fun stuff!
I'm looking forward to trying out the NeuroSky hardware. Their support people have been really good so far. I'd be interested in hearing what people discover about the Emotiv hardware if anyone can actually get some. I'm also not sure Max/MSP support exists for it at present, but I could be wrong about that.
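And since the noise question keeps coming up, here's the toy smoothing sketch I mentioned above -- just to illustrate taming a jumpy reading before mapping it onto a parameter. The 0-100 input range and the filter-cutoff mapping are arbitrary examples, not what we actually use:

# Toy sketch: exponentially smooth a noisy 0-100 reading, then map it to a
# 200-2000 Hz filter cutoff. The ranges and the mapping are arbitrary.
def make_smoother(alpha=0.05):
    """Exponential moving average; smaller alpha = slower, smoother output."""
    state = {"y": None}
    def smooth(x):
        state["y"] = x if state["y"] is None else (1 - alpha) * state["y"] + alpha * x
        return state["y"]
    return smooth

def to_cutoff(value):
    """Map a 0-100 'activity' reading onto a 200-2000 Hz cutoff."""
    v = max(0.0, min(100.0, value))
    return 200.0 + (v / 100.0) * 1800.0

smooth = make_smoother(alpha=0.1)
# feed in successive noisy readings and the cutoff drifts instead of jumping
for reading in [80, 10, 95, 5, 90, 15]:
    print(round(to_cutoff(smooth(reading)), 1))

The same idea works fine inside Max with [slide] or [line], of course; the point is just that the raw numbers are too jumpy to use directly.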