
Crossword of Sound – Crossole

Sang Won Lee
Apr. 30
Atlanta, GA

Crossole is a meta-instrument that lets you control music at a high level. The word “Crossole” is a portmanteau of “crossword” and “so-lee (소리),” which in Korean means sound; literally, Crossole is a crossword of sound. The chord progression of a piece is presented visually as a set of virtual blocks that eventually come to resemble a crossword puzzle. With the aid of Kinect sensing technology, you can either build music at the high level with gestures that assemble the score (the blocks), or play note by note by stepping into the low level (the grid).

Because a Crossole player builds the musical structure from a set of blocks and then plays the music by moving a cursor within those blocks, you have to think about the musical structure before the notes, melody, or sounds. Variations in the progression can easily be made by selecting a different route. Crossole also lets you record your route: once you finish the recording, Crossole can play the music by traversing the recorded route in the background while the player maps hand gestures to other interesting sounds.

GTCMT http://gtcmt.gatech.edu

Sang Won Lee http://www.sangwonlee.com
Avinash Sastry http://avinashsastry.com
Sertan Şentürk http://sertansenturk.com/
Anosh Daruwalla http://gtcmt.gatech.edu/?p=4751

Feel free to contact us if you would like to play Crossole or share the code.

How was MAX used?

Max receives OSC messages from the C++ module (openFrameworks + Kinect + OpenNI + OSCeleton). Based on the received OSC messages, Max generates a sequence of broken-chord MIDI notes at the specified tempo. The key point is the decoupling of the tempo from the sequence: you can change the set of MIDI notes to be played regardless of what is currently playing. Depending on the message type, the Max patch either applies the new sequence the moment the message arrives or switches to it at the point where the next note will be played. Max also helps generate random MIDI notes at random intervals. The MIDI notes are sent to Ableton. The Max patch was mostly written by Anosh Daruwalla.
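
As a rough illustration of this tempo/sequence decoupling (a C++ sketch rather than the actual Max patch; the Arpeggiator class and its method names are purely illustrative), the clock below keeps ticking at its own rate while the broken-chord sequence it indexes can be swapped at any moment:

```cpp
#include <cstdio>
#include <vector>

class Arpeggiator {
public:
    // Replace the broken-chord sequence; takes effect on the next tick,
    // independently of the tempo clock.
    void setSequence(std::vector<int> midiNotes) {
        pending = std::move(midiNotes);
        hasPending = true;
    }

    // Called once per clock tick (e.g. every 16th note at the current tempo).
    int tick() {
        if (hasPending) {              // swap in the new chord without touching the clock
            current = pending;
            hasPending = false;
            step = 0;
        }
        if (current.empty()) return -1;
        int note = current[step];
        step = (step + 1) % current.size();
        return note;                   // in the real system this would go out as MIDI to Ableton
    }

private:
    std::vector<int> current, pending;
    bool hasPending = false;
    std::size_t step = 0;
};

int main() {
    Arpeggiator arp;
    arp.setSequence({60, 64, 67});     // C major broken chord
    for (int i = 0; i < 4; ++i) std::printf("%d ", arp.tick());   // 60 64 67 60
    arp.setSequence({57, 60, 64});     // switch to A minor mid-phrase
    for (int i = 0; i < 4; ++i) std::printf("%d ", arp.tick());   // 57 60 64 57
    std::printf("\n");
    return 0;
}
```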

Crossword of Sound – Crossole

Mar 17, 2013 at 6:19am

Amazing project :) Well done! I love the variety of sounds that you are able to create with the different components of the installation.

I was wondering, though: how did you manage to get the swiping along the axes to work? I can see that when you swipe your hand up and down or left to right you are able to choose different parameters. Also, how did you create the graphics for the grid display?

Thanks in advance :)

Great stuff!

Marco

#262870
Mar 18, 2013 at 7:51am

Thanks for the generous comments. The short answer: all gestures and graphics are implemented using the Microsoft Kinect and openFrameworks. The swipe gestures are simply based on the hand position crossing a certain threshold in one of four directions. For the long answer, plus the motivation behind the project, there’s a paper about it you can read: http://scholar.google.com/scholar?hl=en&q=crossole+&btnG=&as_sdt=1%2C23&as_sdtp=
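
As a rough C++ sketch of that threshold test (the actual implementation is in GestureDetect() in Visual.cpp of the repository linked later in this thread; the struct names and the torso-relative reference point here are assumptions on my part):

```cpp
#include <cstdio>

enum class Swipe { None, Left, Right, Up, Down };

struct Vec3 { float x, y, z; };

// Report which threshold, if any, the hand has crossed relative to the torso.
Swipe detectSwipe(const Vec3& hand, const Vec3& torso, float threshold = 0.25f) {
    float dx = hand.x - torso.x;
    float dy = hand.y - torso.y;
    if (dx >  threshold) return Swipe::Right;
    if (dx < -threshold) return Swipe::Left;
    if (dy >  threshold) return Swipe::Up;
    if (dy < -threshold) return Swipe::Down;
    return Swipe::None;
}

int main() {
    Vec3 torso = {0.5f, 0.5f, 2.0f};
    Vec3 hand  = {0.9f, 0.5f, 2.0f};   // hand well off to one side of the torso
    std::printf("%d\n", static_cast<int>(detectSwipe(hand, torso)));  // prints 2 (Right)
    return 0;
}
```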

#262871
Mar 18, 2013 at 9:20am

Thank you for your reply and for the information. I will look at the paper on the project.
Just out of interest, which operating system did you use to build, code, and run Crossole on?

Thanks again.

#262872
Mar 18, 2013 at 11:27am

It was built and performed on OS X 10.6.x, using several APIs for the Kinect and for OSC generation.

#262873
Mar 19, 2013 at 3:52am

Thank you for the information :). Crossole is a really fantastic project.

I am creating a project which also uses the Kinect; however, I’m using Java and Processing instead of openFrameworks, as I am not very confident with C++.
Also, is OSCeleton a tracking programme?

Thanks and sorry for all the questions.

#262874
Mar 19, 2013 at 10:28am

Thanks! I think Java/Processing will do the job without a problem; you can switch later if the program feels slow. Yes, I used OSCeleton. If you have an iOS device, you can download an app called “echobo” and see the block interface used in Crossole implemented on a mobile phone for a completely different purpose.

Good luck on your project and let me know if you post a video. :)

#262875
Mar 19, 2013 at 1:51pm

Great :) thanks for all the help!

With regard to the swipe gestures, would you mind sharing the code? No problem if you can’t :)

Thanks again.

#262876
Mar 19, 2013 at 5:30pm

Not a problem. It will be extremely hard to read, though. The code is at https://github.com/panavrin/crossole. I would start from GestureDetect() in Visual.cpp.

#262877
Mar 20, 2013 at 4:42am

Thanks for that :) You are right, though, it is hard to read!
One last question (I promise): how did you get the Kinect to understand where you are standing? I see in your video that if you stand in different parts of the stage, the screen displays different visuals. I have been looking for a way to do this for months but I can’t figure anything out.

Thanks again for your help :)

#262878
Mar 20, 2013 at 11:08am

Well, that’s easy, though I wonder if it’s exactly what you need: it simply checks whether the z coordinate (depth) of the torso is within a threshold. :)
If you approach the camera, the visual switches to the grid sequencer view.
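
A minimal sketch of that check, assuming the torso z value comes in from OSCeleton (the function name and threshold value here are made up for illustration):

```cpp
enum class View { ChordBlocks, GridSequencer };

// Step toward the camera (smaller depth) and the grid sequencer view is shown;
// otherwise the chord-block view stays on screen.
View chooseView(float torsoZ, float nearThreshold = 1.2f /* illustrative value */) {
    return (torsoZ < nearThreshold) ? View::GridSequencer : View::ChordBlocks;
}
```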

#262879
Mar 20, 2013 at 3:00pm

How do you assign a z coordinate? I thought the Kinect could only track X and Y.

#262880
Mar 20, 2013 at 3:06pm

The Kinect does measure depth (which is its core difference from other cameras). OSCeleton will send you OSC messages with the xyz coordinates of your joints. See https://github.com/Sensebloom/OSCeleton#joint-message—message-with-the-coordinates-of-each-skeleton-joint
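
For reference, here is a sketch of reading those joint messages inside an openFrameworks app with the ofxOsc addon. The “sifff” argument layout (joint name, user ID, x, y, z) follows the OSCeleton README linked above, and port 7110 is only OSCeleton’s default, so adjust both for your setup:

```cpp
#include <string>
#include "ofxOsc.h"

class JointListener {
public:
    void setup() { receiver.setup(7110); }   // OSCeleton's default port

    void update() {
        while (receiver.hasWaitingMessages()) {
            ofxOscMessage m;
            receiver.getNextMessage(&m);
            if (m.getAddress() == "/joint") {
                std::string joint = m.getArgAsString(0);  // e.g. "torso", "r_hand"
                int user = m.getArgAsInt32(1);            // skeleton/user id (unused here)
                (void)user;
                float x = m.getArgAsFloat(2);
                float y = m.getArgAsFloat(3);
                float z = m.getArgAsFloat(4);             // depth; used for the view switch
                if (joint == "torso") torsoZ = z;
                (void)x; (void)y;
            }
        }
    }

    float torsoZ = 0;

private:
    ofxOscReceiver receiver;
};
```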

#262881
Mar 25, 2013 at 3:08am

Thank you very much for all your help :)

#262882
