Content You Need: ml.star


    Today, we released Benjamin D. Smith’s exciting new Machine Learning package for Max, ml.star. Machine Learning is a hot topic these days, but it can be tough to figure out where to get started if you aren’t a researcher in the field. For the beginners among us, there are some good example patchers to be found in the package along with lots of documentation. I asked Ben to describe the package in his own words and let all of us know why we should be using it.
    Here’s what Ben had to say:
    For me, machine learning is all about exploration and discovering new possibilities in my music and art. Machine learning encourages flexible, non-linear approaches that can be surprising and go to unexpected, but connected, places. Solving musical problems with fixed, logical patches doesn’t always allow us to go beyond them, to find new things, new sounds, new ways of looking at our work. Machine learning, on the other hand, is great at enabling emergent behaviors and connections, and at finding spaces we can’t anticipate when we first start in on a new patch or work.
    This package came about through my personal obsession with human and computer co-performance. The notion that both are agents working together to enact greater musical collaborations is endlessly fascinating to me. But I came up against challenge after challenge: my patches could only do as much as I intentionally coded in, they couldn’t grow, they couldn’t handle truly unexpected input (or they crashed), and they couldn’t learn how I played and find new things to create. I quickly came to realize that the next step in this direction had to be smarter algorithms, i.e. machine learning models.
    If you’ve never used machine learning, or worked with a neural net of your own, why would you want to?
    I think the true power of ML for Max users is twofold:
    • It can tell us useful things about data streams, such as MIDI data from a keyboard or controller, or audio features from an acoustic musician
    • It can create natural but really complex mappings, allowing us to easily control large patches, synths, and generators without extreme expertise or cognitive load.
    Machine learning models are primarily designed to classify things. In images, they can identify whether an animal is visible and what kind of animal it is. In music, we can identify chords, or keys, or harmonic movement, or melodic or timbral gestures! And when it “hears” something it doesn’t recognize, it can tell us how related it is to what it already knows (if I play a cluster on my keyboard, it will tell me how close it is to all the chords I’ve already told it about). This information can then be plugged in to drive all your favorite processes and patches! In this simple example, we could map the “C-major-ness” of the cluster to the tint of a video, and the “Eb-minor-ness” to a posterization effect. Now playing clusters or improvising on the keyboard manipulates the imagery in a repeatable, but complex, fashion. Of course, chords could be replaced with physical gestures from a handheld device or audio input from a microphone; we only have to patch a different data source into the ML objects.
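    To make that mapping idea concrete, here is a minimal sketch in plain Python (not the ml.star objects themselves; the chord templates, function names, and the tint/posterize targets are made up for illustration). It scores a played cluster against a couple of chord templates and uses those similarity scores as control values, in the spirit of the “C-major-ness” and “Eb-minor-ness” mapping described above.

```python
# Conceptual sketch only -- plain Python, not the ml.star API.
# Score an incoming pitch cluster against known chord templates and use the
# similarities as control values (e.g. video tint, posterization amount).
import numpy as np

# 12-bin pitch-class templates for the chords we have "taught" the system.
CHORDS = {
    "C major":  {0, 4, 7},
    "Eb minor": {3, 6, 10},
}

def pitch_class_vector(midi_notes):
    """Fold MIDI note numbers into a normalized 12-bin chroma vector."""
    v = np.zeros(12)
    for n in midi_notes:
        v[n % 12] += 1.0
    return v / (np.linalg.norm(v) or 1.0)

def chord_similarities(midi_notes):
    """Cosine similarity between the played cluster and each known chord."""
    played = pitch_class_vector(midi_notes)
    return {
        name: float(np.dot(played, pitch_class_vector(sorted(pcs))))
        for name, pcs in CHORDS.items()
    }

# A cluster played on the keyboard: C, E, G plus one "wrong" note.
sims = chord_similarities([60, 64, 67, 61])
tint = sims["C major"]        # drive a video tint with "C-major-ness"
posterize = sims["Eb minor"]  # drive a posterization effect with "Eb-minor-ness"
print(sims, tint, posterize)
```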
    And what if we don’t know the chords or gestures that will be used in a performance? No problem: the ML objects can identify them on the fly (hence, “unsupervised”) and create mappings as they go. This is really exciting for human-computer music making, where the machine can find gestures (i.e. patterns in our data) and point them out to us. And we don’t have to decide what gestures we’re going to use and code them in manually! We just play. The dream of having a patch that listens to everything you play, learns all of your music, and grows with you is within reach! Find ml.star in the Max 7 Package Manager (File > Show Package Manager) and start exploring.
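    For the on-the-fly case, here is a rough sketch of what unsupervised gesture discovery can look like, again in plain Python rather than the ml.star objects: a deliberately naive nearest-centroid clusterer with a made-up distance threshold stands in for the real models, just to show the “new input either matches a known gesture or becomes a new one” idea.

```python
# Minimal illustration of on-the-fly ("unsupervised") gesture discovery over a
# stream of feature vectors (controller data, audio features, ...).
# Naive online clustering for illustration only -- not the ml.star algorithms.
import numpy as np

class OnlineGestureFinder:
    def __init__(self, distance_threshold=0.5):
        self.threshold = distance_threshold
        self.centroids = []  # one running centroid per discovered gesture
        self.counts = []

    def observe(self, features):
        """Return the index of the matching gesture, creating one if needed."""
        x = np.asarray(features, dtype=float)
        if self.centroids:
            dists = [np.linalg.norm(x - c) for c in self.centroids]
            best = int(np.argmin(dists))
            if dists[best] < self.threshold:
                # Update the matched centroid with a running mean.
                self.counts[best] += 1
                self.centroids[best] += (x - self.centroids[best]) / self.counts[best]
                return best
        # Nothing close enough: treat this as a new gesture.
        self.centroids.append(x.copy())
        self.counts.append(1)
        return len(self.centroids) - 1

finder = OnlineGestureFinder(distance_threshold=0.5)
for frame in ([0.1, 0.2], [0.12, 0.18], [0.9, 0.8]):
    print("gesture id:", finder.observe(frame))  # prints 0, 0, 1
```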

    • Sep 27 2017 | 7:07 pm
      Hi. A new door opens to us... Thank you for that. This is going to be incredible, for music and for creating new pictures.
    • Sep 28 2017 | 9:20 am
      Really impressive. Thank you very much.
      The only issue I observed is an immediate crash when I try to open the ml.spatial example. I've copied an OS X crash report to the attached text file.
    • Sep 28 2017 | 8:34 pm
      Thanks for the crash report, yaniki–I'll dig into it!
    • Sep 29 2017 | 10:45 am
      I'm glad I can help a bit. BTW, all other patches included in the package are rock-stable.
    • Sep 29 2017 | 4:21 pm
      ml.spatial crashes every time I try to open the example file.
    • Oct 03 2017 | 3:38 pm
      The ml.spatial help file crash appears to be a font issue. If someone who had the crash could take this file and test it (confirm a crash or not) I'd greatly appreciate it!
    • Oct 03 2017 | 8:14 pm
      It still crashes.
      cheers
    • Oct 03 2017 | 10:25 pm
      No crash, and the original patch is OK too. OS X 10.11.6, Max 7.3.4 64-bit, MacBookPro11,1. Nice work, thanks! zz
    • Oct 04 2017 | 6:13 am
      Forgot to mention: it crashes with OS X 10.12.6, Max 7.3.4.
    • Oct 04 2017 | 5:01 pm
      It seems everyone has the 'ML Fun' patches working, so at least the library is good to go! This appears to be both a file- and system-specific issue, as the ml.spatial object works fine in the other patches. I'll try tearing the help file apart and rebuilding it from scratch.
    • Oct 06 2017 | 12:20 pm
      Dear Ben
      Sorry to be annoying, but the patch you attached triggers the same behaviour (immediate crash) on my computer as the original one.
      Here is the latest crash report:
    • Oct 08 2017 | 4:24 pm
      @Yaniki: thanks for sticking with this, here's a completely rebuilt patch. It really seems to be some font/text glitch, so hopefully this fresh build will work for you. Let me know!
    • Oct 09 2017 | 10:45 am
      Dear Ben
      Thanks for your positive attitude and fast reaction.
      Unfortunately, this new patch is still crashing (report attached). I have no time now to dig into the problem more intensively, but I'll try to do some tests on different computers during the upcoming week.
    • Oct 09 2017 | 3:38 pm
      Hi Ben, no crash here. I've noticed that in the "one hot" tab, the attrui is not updated at load (0.1) to match the 2nd decay argument in the box (1.). Hope this helps.
    • Oct 09 2017 | 10:51 pm
      @Ben
      I know it's a strange idea, but can you post a screenshot of the problematic patch? On my computer, when I create an instance of [ml.spatial], nothing happens (I mean: no crash), so this is probably something related to the patch itself. Maybe it's possible to investigate the problem by adding elements one after another, treating the screenshot as a "score" ;-)