Scientific organizers: Christina Anagnostopoulou, Elaine Chew, Elizabeth Margulis, Anja Volk

Scientific Advisory Committee: Emilios Cambouropoulos, Ching-Hua Chuan

Description and aims of the workshop: The aim of this workshop was to bring together experts on music similarity from Computer Science and the Musical Sciences to discuss overarching, cross-disciplinary strategies for the theoretical and computational modeling of music similarity. The dramatic increase in the digitization of music calls for the development of computational methods in Music Information Research, where similarity poses serious challenges because of its context dependence. For music scientists who study similarity in listening, composition/improvisation, and the analysis of scores/performances, the complexity of the musical material and the lack of established formal models for the music domain pose challenges that cannot be met by one discipline alone, hence our multidisciplinary approach. The following paragraphs summarize participants’ responses to the post-workshop feedback form.

Expected tangible outcome: (1) A goal of the workshop was to develop a roadmap document on music similarity research, providing an overview of achievements, current challenges, and future short- and long-term goals in modeling music similarity. (2) Networking at the workshop has led to numerous expressions of intent to submit future joint grant and workshop proposals, such as EU COST Action and Horizon 2020 framework ICT 16 Big Data proposals. (3) A special issue of the Journal of New Music Research will be dedicated to outcomes of the discussions at the Lorentz workshop; numerous attendees have formed partnerships to co-author articles in the special issue.

Developments that constitute (even the beginning of) a scientific breakthrough: Participants appreciated learning about the state of the art in music similarity from multiple unexpected perspectives, and mentioned breakthroughs, or progress towards discoveries, in the following areas: [1] Definition of similarity: the concept of similarity as a category rather than a measure; [2] Scope of similarity: recognition that similarity extends beyond musical features to include affect, physical motion, and other multimodal elements; [3] Evaluation of similarity: the current paradigm of testing algorithms/systems against ground truth was increasingly challenged, especially given the importance of user context. Finally, one participant wrote, “We need a flagship program The Human Ear Project (cf. The Human Brain Project).”

Notable “Aha” moments: Learning about others’ work in a wide range of disciplines produced a host of “aha” moments, including [1] the value of alternative views: (a) realizing that a categorical definition of similarity is necessary and complementary to distance-based definitions; (b) that similarity and contrast are essentially related; (c) that acoustic and symbolic approaches are complementary and necessary; (d) that the concept of priming might offer a way to explain ‘context’; [2] the extent of academic exclusivity: (a) information-processing models of similarity dominate research in the area; (b) meaningful evaluation of similarity assessment is lacking; (c) discussions of embodiment and performance analysis were highly interesting; and [3] the inter-connectedness of the disciplines: (a) similarity extends to timbre, motion, affect, and other modes; (b) computer science and psychology methods of assessing the quality of similarity measures are closely related; (c) researchers in other fields also think about music similarity, and in a variety of different ways.

Experience of format of workshop: The workshop had a limited number of plenary sessions, each with two speakers, interspersed with an equal number of group brainstorming activities based on participant-chosen topics. Most participants found the format excellent and enjoyed the balance of lectures and working groups, as well as the blend of experienced and early-career researchers. While many applauded the inclusive nature of the “big tent” approach, and appreciated the high-level, abstract nature of many of the discussions, some yearned for clearer goals and greater depth (perhaps over fewer topics), and for concrete results such as a catalog of definitions, methods, and evaluations.

Other comments, suggestions and/or criticism: Many participants applauded the organization by both the scientific team and the Lorentz Center. All found the center a most enjoyable venue, and were extremely impressed with the running of the Center and its openness to humanistic research in tandem with its focus on scientific research. Participants very much appreciated the conceptualization of the workshop—the mixture of people with technical and musicological/psychological backgrounds—and found the networking aspect especially worthwhile. Several wrote that the workshop should be repeated whenever possible. The only very minor quibbles were that signage to the Lorentz Center could be improved, and that there was not sufficient time to enjoy Leiden. One participant wrote, tongue in cheek, that the number of coffee machines could be halved to increase the number of informal contacts. On a more serious note, the workshop highlighted the challenges of “interdisciplinary research, and the importance of such workshops so that we can all be in agreement (or at least be aware of non-agreement) regarding the science of the [music similarity] field.”