Show HN: I built an AI music tagger for DJs, fusing metadata, audio DSP, and ML

Built this for myself. I DJ, throw parties in Portland, and my digital library is 10k tracks and unmanageable. Tagging everything was a nonstarter.

Project started as a Python script that fed track metadata into ChatGPT. Worked surprisingly well for something so basic. Friends kept asking me to run their libraries through it, and eventually I built a real app around it.

The core insight: no single signal is enough. Metadata lookups miss obscure tracks. Audio analysis alone lacks context. LLMs alone hallucinate. So the app combines them:

* Metadata cross-referenced against a 200M-record Discogs-based DB for known releases
* A Mac companion app for managing library / tagging / syncing to Rekordbox
* Python DSP that analyzes audio directly (BPM, harmonic content, energy)
* A trained ML model for obscure tracks the main databases don't cover
* Multiple AI models handling different classification dimensions
* Four tag dimensions: genre (50+ sub-genres), region, era, vibe
* Syncs to Rekordbox's My Tag system so you can filter on CDJs mid-set
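The post doesn't show internals, but the "no single signal is enough" idea maps naturally onto weighted confidence voting: each source (metadata DB hit, audio model, LLM) proposes a genre with a confidence, and a per-source trust weight decides the winner. A minimal sketch, with all names, weights, and the example data invented for illustration:

```python
# Hypothetical sketch of multi-signal fusion: each source returns
# (genre, confidence) guesses, and a weighted vote picks the tag.
# Source names and trust weights are assumptions, not the app's actual values.
from collections import defaultdict

# A Discogs-style DB match is strong evidence; an LLM guess from
# metadata alone is weaker.
SOURCE_WEIGHTS = {"metadata_db": 1.0, "audio_model": 0.8, "llm": 0.5}

def fuse_genre_votes(predictions):
    """predictions: list of (source, genre, confidence) tuples."""
    scores = defaultdict(float)
    for source, genre, conf in predictions:
        # Unknown sources get a low default weight.
        scores[genre] += SOURCE_WEIGHTS.get(source, 0.3) * conf
    # Highest combined score wins; returns None if no votes at all.
    return max(scores, key=scores.get) if scores else None

# Example: the DB and audio model agree, outvoting the LLM.
votes = [
    ("metadata_db", "dub techno", 0.9),
    ("audio_model", "dub techno", 0.7),
    ("llm", "deep house", 0.8),
]
print(fuse_genre_votes(votes))  # -> dub techno
```

The nice property of this shape is that a missing source (e.g. no DB hit for an obscure track) simply contributes no votes, and the remaining signals still produce a tag.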

Genre, region, and era are solid. Vibe is still a work in progress: classifying how a track functions in a set (peak time vs. after hours vs. hypnotic) is fundamentally different from classifying what it is. Decent, but not where I want it yet. No audio ever leaves the machine, only metadata. Credit-based pricing, no subscriptions, priced quite reasonably, I think.

Would love feedback / questions!


Comments URL: https://news.ycombinator.com/item?id=47780619

Points: 1

# Comments: 0
