João Amaro Lagedo
software engineer

Ficino

macOS menu-bar app that comments on your Apple Music tracks.

A macOS menu-bar app that delivers AI-generated commentary on Apple Music tracks as they play, running entirely on-device via Apple Intelligence. Named after Marsilio Ficino, the Renaissance philosopher who held that music bridges the physical and the divine.

What it does

When a track changes, Ficino catches the com.apple.Music.playerInfo distributed notification, queries MusicKit and Genius in parallel for genres, editorial notes, and sample data, assembles a structured prompt, and sends it to Apple's on-device 3B FoundationModels session. A floating panel slides in from the top-right with album art and a short paragraph about the song — credits, context, the story behind the take. Inference runs entirely on-device: the prompt and the generated commentary never leave the machine.
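The track-change hook can be sketched in a few lines. This is an illustrative sketch, not Ficino's source: the userInfo keys ("Name", "Artist", "Album", "Player State") are what Music.app has historically shipped in this notification rather than documented API, and `NowPlaying` / `parsePlayerInfo` are names invented here.

```swift
import Foundation

// Illustrative sketch, not Ficino's code: listen for Apple Music's
// distributed track-change notification and pull out the fields a
// prompt builder would need.
struct NowPlaying {
    let title: String
    let artist: String
    let album: String
}

// Pure function, so the parsing is testable without a running Music.app.
func parsePlayerInfo(_ userInfo: [AnyHashable: Any]) -> NowPlaying? {
    guard (userInfo["Player State"] as? String) == "Playing",
          let title = userInfo["Name"] as? String else { return nil }
    return NowPlaying(title: title,
                      artist: userInfo["Artist"] as? String ?? "",
                      album: userInfo["Album"] as? String ?? "")
}

// Wiring it up (macOS): Music.app posts this on every player state change.
func startObserving(_ handler: @escaping (NowPlaying) -> Void) {
    DistributedNotificationCenter.default().addObserver(
        forName: Notification.Name("com.apple.Music.playerInfo"),
        object: nil, queue: .main
    ) { note in
        if let nowPlaying = parsePlayerInfo(note.userInfo ?? [:]) {
            handler(nowPlaying)
        }
    }
}
```

Keeping the parse step a pure function means the messy part — a dictionary of loosely typed keys — can be unit-tested with fixture dictionaries instead of a live player.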

Stack

  • App: Swift 6, SwiftUI, FoundationModels, MusicKit, custom NSPanel, zero external dependencies.
  • ML workspace: Python with uv, an eval pipeline that mirrors the Swift PromptBuilder exactly, Anthropic's Batch API for synthetic training data, LoRA fine-tuning via Apple's adapter toolkit.

What's interesting

The on-device 3B model is small enough to hallucinate. Out of the box it prepended "Sure! Here is…" to half its replies, invented vocalists, and mistook album names for tracks. A LoRA adapter trained on 3,000 synthetic examples — distilled from real MusicKit + Genius metadata, generated by Claude Haiku — eliminated every failure pattern across an 81-track evaluation set. Zero preambles, zero hallucinations, zero misattributions. Training cost: roughly two hours on one H100, under ten dollars.

The pipeline is the interesting object. Swift's prompt builder and Python's eval loop are kept in lockstep by a small CLI binary (FMPromptRunner) that both sides invoke — the model behaviour the user sees in production is the same model behaviour the eval scores.
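One way to keep two implementations honest is to make the builder a pure function from metadata to prompt text, so the Swift and Python sides can be diffed output-for-output through the shared CLI. A hypothetical shape — the field names and template here are invented, not Ficino's actual PromptBuilder format:

```swift
// Hypothetical sketch of the parity contract: one deterministic function
// from track metadata to prompt text. Same inputs, byte-identical output,
// on both sides of the pipeline.
struct TrackMetadata {
    let title: String
    let artist: String
    let genres: [String]
    let editorialNotes: String?
}

func buildPrompt(_ m: TrackMetadata) -> String {
    var lines = [
        "Track: \(m.title)",
        "Artist: \(m.artist)",
        "Genres: \(m.genres.joined(separator: ", "))",
    ]
    if let notes = m.editorialNotes {
        lines.append("Editorial notes: \(notes)")
    }
    lines.append("Write one short paragraph of commentary. No preamble.")
    return lines.joined(separator: "\n")
}
```

With the builder deterministic, the Python eval loop can shell out to the CLI binary and string-compare its output against the mirror, turning "do the prompts match?" into a cheap regression test.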

Source on github.com/jlagedo/Ficino.

View source →