Deadline EXTENDED to July 9 -- Musical Metacreation Workshop (MUME2013)

    Jun 29 2013 | 8:51 am
    Call for Participation -- please distribute widely
    =============================
    Musical Metacreation 2013
    DEADLINE EXTENSION -- Submissions Now Due July 9
    =============================
    ((( MUME 2013 ))) 2nd International Workshop on Musical Metacreation
    Held at the Ninth Annual AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment (AIIDE'13)
    Northeastern University, Boston, Massachusetts, USA
    October 14-15, 2013
    News: New Deadline for Paper and Demo Submissions: *** July 9, 2013 ***
    New Info for Interested Industry Presenters:
    We are delighted to announce the 2nd International Workshop on Musical Metacreation (MUME2013) to be held October 14 and 15, 2013, in conjunction with the Ninth Annual AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment (AIIDE'13). MUME2013 builds on the enthusiastic response and participation we received for the inaugural workshop in 2012, which received 31 submissions, 17 of which were accepted (a 55% acceptance rate). This year the workshop has expanded to 2 days.
    Thanks to continued progress in artistic and scientific research, a new possibility has emerged in our musical relationship with technology: Generative Music, or Musical Metacreation, the design and use of computer music systems that are "creative on their own". Metacreation involves using tools and techniques from artificial intelligence, artificial life, and machine learning, themselves often inspired by the cognitive and life sciences. Musical Metacreation suggests exciting new opportunities for creative music making: discovery and exploration of novel musical styles and content, collaboration between human performers and creative software "partners", and design of systems in gaming and entertainment that dynamically generate or modify music.
    MUME brings together artists, practitioners and researchers interested in developing systems that autonomously (or interactively) recognize, learn, represent, compose, complete, accompany, or interpret music. As such, we welcome contributions to the theory or practice of generative music systems and their applications in new media, digital art, and entertainment at large. Join us at MUME2013 and take part in this exciting, growing community!
    Topics
    ======
    We encourage paper and demo submissions on topics including the following:
    * Novel representations of musical information
    * Systems for autonomous or interactive music composition
    * Systems for automatic generation of expressive musical interpretation
    * Systems for learning or modelling music style and structure
    * Systems for intelligently remixing or recombining musical material
    * Advances or applications of AI, machine learning, and statistical techniques for musical purposes
    * Advances or applications of evolutionary computing or agent and multiagent-based systems for musical purposes
    * Computational models of human musical creativity
    * Techniques and systems for supporting human musical creativity
    * Online musical systems (i.e. systems with a real-time element)
    * Adaptive and generative music in video games
    * Methodologies for, and studies reporting on, evaluation of musical metacreations
    * Emerging musical styles and approaches to music production and performance involving the use of AI systems
    * Applications of musical metacreation for digital entertainment: sound design, soundtracks, interactive art, etc.
    Format and Submissions
    ======================
    The workshop will be a two-day event including:
    * Presentations of FULL TECHNICAL PAPERS (8 pages maximum)
    * Presentations of POSITION PAPERS and TECHNICAL IN-PROGRESS WORK (5 pages maximum)
    * Presentations of DEMONSTRATIONS (3 pages maximum)
    * One or more PANEL SESSIONS (potential topics include international and networked collaborations, evaluation methodologies, generative music in art vs. games)
    * Presentations by INDUSTRY PARTNERS
    Workshop papers will be published in a Technical Report by AAAI Press and will be archived in the AAAI digital library. Submissions should be made in AAAI, 2-column format; see instructions here:
    We also invite companies involved in Musical Metacreation and its application to present their work and challenges to the MUME community. Each industrial partner selected will be given a timeslot to present or demo during the workshop. Interested industry representatives can find more information here:
    For complete details on attendance, submissions and formatting, please visit the workshop website: *** ***
    Important Dates
    ===============
    Submission deadline: July 9, 2013
    Notification date: August 6, 2013
    Accepted author CRC due to AAAI Press: August 14, 2013
    Workshop dates: October 14-15, 2013
    Workshop Organizers
    ===================
    Dr. Philippe Pasquier (Workshop Chair)
    School of Interactive Arts and Technology (SIAT)
    Simon Fraser University, Vancouver, Canada

    Dr. Arne Eigenfeldt
    School for the Contemporary Arts
    Simon Fraser University, Vancouver, Canada

    Dr. Oliver Bown
    Design Lab, Faculty of Architecture, Design and Planning
    The University of Sydney, Australia

    Graeme McCaig (Administration & Publicity Assistant)
    School of Interactive Arts and Technology (SIAT)
    Simon Fraser University, Vancouver, Canada
    Program Committee
    =================
    Gérard Assayag - IRCAM - France
    Al Biles - Rochester Institute of Technology - USA
    Tim Blackwell - Department of Computing, Goldsmiths College, University of London - UK
    Alan Blackwell - Cambridge University - UK
    Oliver Bown - The University of Sydney - Australia
    Andrew Brown - Queensland Conservatorium, Griffith University - Australia
    Jamie Bullock - Integra Lab, Birmingham Conservatoire - UK
    Karen Collins - University of Waterloo - Canada
    Nick Collins - University of Sussex - UK
    Darrell Conklin - University of the Basque Country - Spain
    Arne Eigenfeldt - Simon Fraser University - Canada
    Jason Freeman - Georgia Institute of Technology - USA
    Guy Garnett - University of Illinois - USA
    Toby Gifford - Griffith University - Australia
    Luke Harrald - Elder Conservatorium of Music, The University of Adelaide - Australia
    Bill Hsu - Department of Computer Science, San Francisco State University - USA
    Robert Keller - Harvey Mudd College - USA
    Nyssim Lefford - Audio Technology, Luleå University of Technology - Sweden
    George Lewis - Department of Music, Columbia University - USA
    Aengus Martin - Faculty of Engineering, The University of Sydney - Australia
    James Maxwell - Simon Fraser University - Canada
    Graeme McCaig - School of Interactive Arts and Technology, Simon Fraser University - Canada
    Jon McCormack - Centre for Electronic Media Art, Monash University - Australia
    James McDermott - Complex and Adaptive Systems Laboratory, University College Dublin - Ireland
    Alex McLean - ICSRiM, University of Leeds - UK
    Kia Ng - ICSRiM, University of Leeds - UK
    Philippe Pasquier - School of Interactive Arts and Technology, Simon Fraser University - Canada
    Marcus Pearce - Queen Mary, University of London - UK
    Robert Rowe - New York University - USA
    Benjamin Smith - Case Western Reserve University - USA
    Richard Stevens - Leeds Metropolitan University - UK
    Michael Sweet - Berklee College of Music - USA
    Peter Todd - Indiana University - USA
    Dan Ventura - Brigham Young University - USA
Ivan Zavada - Conservatorium of Music, The University of Sydney - Australia