
Crossword of Sound – Crossole

Sang Won Lee
Apr. 30
Atlanta, GA

Crossole is a meta-instrument that lets you control music at a high level. The word “Crossole” is a portmanteau of “crossword” and “so-lee (소리),” which means sound in Korean; literally, Crossole is a crossword of sound. The chord progression of a piece is visually presented as a set of virtual blocks that eventually come to resemble a crossword puzzle. With the aid of Kinect sensing technology, you can either build music at a high level with gestures that assemble the score (blocks), or play note by note by stepping into the low level (grid).

Because a Crossole player builds a musical structure from a set of blocks and plays music by moving a cursor within them, you have to think about the musical structure before the notes, melody, or sounds. Variations on a progression can easily be made by selecting a different route. Crossole also lets you record your route; once you finish recording, Crossole can play the music back by traversing the recorded route in the background. In the meantime, the player can map hand gestures to other interesting sounds.


Sang Won Lee
Avinash Sastry
Sertan Şentürk
Anosh Daruwalla

Feel free to contact us to play Crossole or share the code.

How was MAX used?

Max receives OSC messages from the C++ module (openFrameworks + Kinect + OpenNI + OSCeleton). Based on the received OSC message, Max generates a sequence of MIDI notes of a broken chord at a specified tempo. The key point is that the tempo and the note sequence are decoupled: you can change the set of MIDI notes to be played regardless of what is currently playing. Depending on the message type, the Max patch either applies the new sequence the moment the message arrives or switches to it when the next note is due. Max also generates random MIDI notes at random intervals. The MIDI notes are sent to Ableton. The Max patch is mostly the work of Anosh Daruwalla.

Crossword of Sound – Crossole

Mar 17, 2013 at 6:19am

Amazing project :) Well done! I love the variety of sounds that you are able to create with the different components of the installation.

I was wondering, though: how did you manage to get the swiping along the axes to work? I can see that when you swipe your hand up and down or left to right you are able to choose different parameters. Also, how did you create the graphics for the grid display?

Thanks in advance :)

Great stuff!


Mar 18, 2013 at 7:51am

Thanks for the generous comments. The short answer is that all gestures and graphics are implemented using the Microsoft Kinect and openFrameworks. The swipe gesture is simply based on the hand position crossing a certain threshold in one of four directions. For the long answer, plus the motivations behind the project, there’s a paper about this project you can read.

Mar 18, 2013 at 9:20am

Thank you for your reply, and for the information. I will look at the paper on the project.
Just out of interest, what operating system did you use to build, code, and run Crossole?

Thanks again.

Mar 18, 2013 at 11:27am

It was built and performed on OS X 10.6.x, and used several APIs for Kinect input and OSC generation.

Mar 19, 2013 at 3:52am

Thank you for the information :). Crossole is a really fantastic project.

I am creating a project that also uses the Kinect; however, I’m using Java and Processing instead of openFrameworks, as I am not very confident with C++.
Also, is OSCeleton a tracking programme?

Thanks and sorry for all the questions.

Mar 19, 2013 at 10:28am

Thanks!! I think Java/Processing will work without a problem; you can switch later if the program feels slow. Yes, I used OSCeleton. If you have an iOS device, you can download an app called “echobo” and see the block interface used in Crossole implemented on a mobile phone for a completely different purpose.

Good luck on your project and let me know if you post a video. :)

Mar 19, 2013 at 1:51pm

Great :) thanks for all the help!

With regards to the swipe gestures, would you mind sharing the code? No problems if you can’t :)

Thanks again.

Mar 19, 2013 at 5:30pm

Not a problem. It will be extremely hard to read, though. The code is at
I would start from GestureDetect() in Visual.cpp.

Mar 20, 2013 at 4:42am

Thanks for that :) You are right, though, it is hard to read!
One last question (I promise): how did you get the Kinect to understand where you are standing? I see in your video that if you stand in different parts of the stage, the screen displays different visuals. I have been looking for a way to do this for months, but I can’t figure anything out.

Thanks again for your help :)

Mar 20, 2013 at 11:08am

Well, that’s easy, though I wonder if it is exactly what you need. It just checks whether the z coordinate (depth) of the torso is within a threshold. :)
If you approach the camera, the visual changes to the grid sequencer view.

Mar 20, 2013 at 3:00pm

How do you assign a z coordinate? I thought the Kinect could only track x and y.

Mar 20, 2013 at 3:06pm

The Kinect does measure depth (which is its core difference from other cameras). OSCeleton will send you an OSC message with the xyz coordinates of your joints; see “message with the coordinates of each skeleton joint” in the OSCeleton documentation.

Mar 25, 2013 at 3:08am

Thank you very much for all your help :)

