Music Classification
Automatic classification of musical properties

A piece of music is described not only by its title, artist, or album: it also has individual characteristics such as tempo, instrumentation, genre, or even the mood it creates. IDMT Music Classification extracts these properties automatically from the music recording, opening up versatile ways to use the resulting data.
Software solution for improved music search and recommendation
Based on automatic music annotation, Fraunhofer IDMT has developed a software solution that allows searching for similar musical pieces. SoundsLike not only makes it easier to find specific music tracks in a database; it can also handle search queries built from individual criteria, or automatically generate music recommendations from individual user preferences such as the current mood.
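As a rough illustration of how such a similarity search could work (this is a hedged sketch, not the SoundsLike API, and the feature vectors are hypothetical), tracks can be compared by ranking their automatically extracted feature vectors with cosine similarity:

```python
from math import sqrt

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def most_similar(query, database, k=3):
    """Rank tracks in `database` (title -> feature vector) by similarity to `query`."""
    ranked = sorted(database.items(),
                    key=lambda item: cosine_similarity(query, item[1]),
                    reverse=True)
    return [title for title, _ in ranked[:k]]

# Hypothetical feature vectors, e.g. (tempo, valence, arousal) scaled to [0, 1]
database = {
    "Track A": [0.90, 0.80, 0.90],   # fast, upbeat
    "Track B": [0.20, 0.30, 0.20],   # slow, mellow
    "Track C": [0.85, 0.75, 0.80],   # close to Track A
}
print(most_similar([0.9, 0.8, 0.85], database, k=2))
```

In a real system the vectors would come from the audio analysis itself and would be far higher-dimensional, but the ranking principle is the same: tracks whose feature vectors point in a similar direction are returned first.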
Metadata Categories
- Genre (Classical, Country, Electronica, Jazz, Latin, Pop, Rap, Rock, Soul, Speech)
- Emotion (Anxious, Depressed, Exuberant, Content)
- Valence (High, Low)
- Arousal (High, Low)
- Perceived Tempo (Fast, Mid-Fast, Mid, Mid-Slow, Slow)
- Beats per Minute (e.g., 120 bpm)
- Key (C to B)
- Texture (Hard, Soft)
- Instrumental Density (Full, Sparse)
- Distortion (Clean, Gained, Overdrive, XXL)
- Dynamic (Continuous, Changing)
- Percussive (Nonpercussive, Percussive)
- Synthetic (Acoustic, Electroacoustic, Synthetic)
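The categories above are controlled vocabularies, so an annotation result can be stored as simple structured metadata and validated against the allowed values. A minimal sketch of such a record follows; the field and class names are illustrative, not IDMT's actual schema, and only a subset of the categories is shown:

```python
from dataclasses import dataclass

# Allowed values, taken from the category list above (subset)
GENRES = {"Classical", "Country", "Electronica", "Jazz", "Latin",
          "Pop", "Rap", "Rock", "Soul", "Speech"}
EMOTIONS = {"Anxious", "Depressed", "Exuberant", "Content"}
TEMPO_CLASSES = {"Fast", "Mid-Fast", "Mid", "Mid-Slow", "Slow"}

@dataclass
class TrackAnnotation:
    """One track's automatically extracted metadata (illustrative subset)."""
    title: str
    genre: str
    emotion: str
    perceived_tempo: str
    bpm: int

    def __post_init__(self):
        # Reject values outside the controlled vocabularies above
        if self.genre not in GENRES:
            raise ValueError(f"unknown genre: {self.genre}")
        if self.emotion not in EMOTIONS:
            raise ValueError(f"unknown emotion: {self.emotion}")
        if self.perceived_tempo not in TEMPO_CLASSES:
            raise ValueError(f"unknown tempo class: {self.perceived_tempo}")

track = TrackAnnotation("Example", "Jazz", "Content", "Mid", 120)
print(track.genre, track.bpm)
```

Keeping the values in fixed vocabularies like this is what makes the metadata usable for faceted search and recommendation, since queries such as "Jazz, Content, around 120 bpm" reduce to exact matches on these fields.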