MusGO: A Community-Driven Framework For Assessing Openness in Music-Generative AI
Citation: Roser Batlle-Roca, Laura Ibáñez-Martínez, Xavier Serra, Emilia Gómez, Martín Rocamora. MusGO: A Community-Driven Framework For Assessing Openness in Music-Generative AI.
Internet Archive Scholar (search for fulltext): MusGO: A Community-Driven Framework For Assessing Openness in Music-Generative AI
Wikidata (metadata): Q135644933
Download: https://www.arxiv.org/abs/2507.03599
Summary
Adapts an evidence-based openness audit from LLMs to music-generative AI. Drawing on a survey of 110 MIR community members, the authors distill a domain-specific framework, MusGO, with 13 categories: 8 essential ones graded closed / partial / open (source code, training data, model weights, code documentation, training procedure, evaluation procedure, research paper, licensing) and 5 desirable ones assessed as binary (model card, datasheet, package, user-oriented application, supplementary material page). The paper publishes a public leaderboard and an open repository so that assessments, and the evidence behind them, can be scrutinized and updated. Applying MusGO to 16 music-generative models shows that training procedure is often the most open category and training data the most closed (only one model reaches "fully open" there), that openness in code, weights, documentation, and licensing tends to co-occur, and that datasheets are rare. A weighting scheme that doubles the three most-valued essential categories is used to order models, but it is expressly not offered as a single-score definition of openness.
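To make the ranking arithmetic concrete, the sketch below combines graded essential categories with binary desirable ones and doubles the weight of the three most-valued essentials. The numeric mapping (closed/partial/open as 0/0.5/1), the one-point value of desirable items, and the particular double-weighted categories are illustrative assumptions, not the paper's exact scheme.

```python
# Minimal sketch of a MusGO-style openness score (assumed scoring, not the
# paper's definition). Category names follow the summary above.

ESSENTIAL = [
    "source code", "training data", "model weights", "code documentation",
    "training procedure", "evaluation procedure", "research paper", "licensing",
]
DESIRABLE = [
    "model card", "datasheet", "package",
    "user-oriented application", "supplementary material page",
]

GRADE_VALUE = {"closed": 0.0, "partial": 0.5, "open": 1.0}  # assumed mapping


def openness_score(essential_grades: dict, desirable_flags: dict,
                   top_valued: set) -> float:
    """Weighted sum: essential grades (doubled for the most-valued categories)
    plus one point per fulfilled desirable category."""
    score = 0.0
    for cat in ESSENTIAL:
        weight = 2.0 if cat in top_valued else 1.0
        score += weight * GRADE_VALUE[essential_grades[cat]]
    for cat in DESIRABLE:
        score += 1.0 if desirable_flags.get(cat, False) else 0.0
    return score


# Hypothetical model: open code and weights, closed training data.
example = openness_score(
    essential_grades={
        "source code": "open", "training data": "closed",
        "model weights": "open", "code documentation": "partial",
        "training procedure": "open", "evaluation procedure": "partial",
        "research paper": "open", "licensing": "open",
    },
    desirable_flags={"model card": True, "package": True},
    top_valued={"source code", "training data", "model weights"},  # assumed
)
print(example)  # 10.0 under these assumptions
```

Because the paper explicitly declines to define openness by a single score, any such aggregation should be read as one possible ordering over the per-category evidence, not as the framework itself.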
Theoretical and Practical Relevance
Provides a sector-specific, evidence-verifiable yardstick for "open model" claims in music, useful for audits, funding decisions, and platform policy, together with a community workflow (leaderboard plus repository) that can catch open-washing and track improvements over time. By treating training-data disclosure pragmatically (counting detailed disclosure of data sources as "fully open" when sharing the raw data is unlawful) and by adding evaluation procedure and user-facing artifacts, MusGO shows how openness can be operationalized under IP constraints while still enabling scrutiny and reproducibility, complementing general schemes (e.g., MOF and OSAID) with music-specific criteria.