Category Archives: MIR Research
Bachelor Thesis on the Multimodal AMUSE Extension
Clara Pingel’s bachelor thesis, titled „Erweiterung von AMUSE zur Verarbeitung mehrerer Modalitäten“ (Extension of AMUSE to Process Multiple Modalities, PDF in German), focuses on the further development of the AMUSE framework. The goal of the thesis was to extend the … Read more
Posted in Theses
Leave a comment
Paper describing Artificial Audio Multitracks (AAM) dataset published in EURASIP Journal on Audio, Speech, and Music Processing
F. Ostermann, I. Vatolkin, and M. Ebeling: AAM: a Dataset of Artificial Audio Multitracks for Diverse Music Information Retrieval Tasks. EURASIP Journal on Audio, Speech, and Music Processing, 13, 2023. Zenodo link: https://doi.org/10.5281/zenodo.5794629 Abstract: We present a new dataset of … Read more
Posted in MIR Research, Publications
Leave a comment
Two papers accepted for EvoMUSART
(1) I. Vatolkin, M. Gotham, N. Nápoles López, and F. Ostermann: Musical Genre Recognition based on Deep Descriptors of Harmony, Instrumentation, and Segments. Accepted for Proceedings of the 12th International Conference on Artificial Intelligence in Music, Sound, Art and Design … Read more
Posted in MIR Research, Publications
Leave a comment
Paper on multi-modal music classification accepted for ISMIR
I. Vatolkin and C. McKay: Stability of Symbolic Feature Group Importance in the Context of Multi-Modal Music Classification. Accepted for Proceedings of the 23rd International Society for Music Information Retrieval Conference (ISMIR). Abstract: Multi-modal music classification creates supervised models trained … Read more
Posted in MIR Research, Publications
Leave a comment
Paper on multi-modal music classification using six modalities published in TISMIR
I. Vatolkin and C. McKay: Multi-Objective Investigation of Six Feature Source Types for Multi-Modal Music Classification. Transactions of the International Society for Music Information Retrieval, 5(1), pp. 1–19, 2022. Abstract: Every type of musical data (audio, symbolic, lyrics, etc.) has its … Read more
Posted in MIR Research, Publications
Leave a comment
Paper on EAR Drummer accepted for TISMIR special collection on AI and Musical Creativity
F. Ostermann, I. Vatolkin, and G. Rudolph: Evaluating Creativity in Automatic Reactive Accompaniment of Jazz Improvisation. Abstract: Music-generating computer programs can support jazz musicians and students during performance and practice, for instance by providing accompaniment for solo improvisation. However, … Read more
Posted in Publications
Leave a comment
Paper on multi-modal music classification accepted for Entropy
The following paper was accepted for Entropy: B. Wilkes, I. Vatolkin, and H. Müller: Statistical and Visual Analysis of Audio, Text, and Image Features for Multi-Modal Music Genre Recognition.
Posted in Publications
Leave a comment
Job offer for a student assistant
To assist with the software project „Music Informatics“, a position as a student assistant (8 hours per week) is offered at the Chair of Algorithm Engineering, Department of Computer Science, TU Dortmund. Please see the full description (in German).
Posted in AMUSE & MIR Software, General, Teaching Activities
Leave a comment
AMUSE paper accepted for SIGIR
The following paper was accepted for the SIGIR conference: I. Vatolkin, P. Ginsel, and G. Rudolph: Advancements in the Music Information Retrieval Framework AMUSE over the Last Decade. Before the presentation at SIGIR, we will update the user manual (the current … Read more
Posted in AMUSE & MIR Software, Publications
Leave a comment
AMUSE Repository Moved
The repository of the Advanced MUSic Explorer has moved to: https://github.com/AdvancedMUSicExplorer/AMUSE
Posted in AMUSE & MIR Software, MIR Research
Leave a comment