- Phase 1: Populate BPM in `sample_metadata.db` (283/511 samples from filenames)
- Phase 2: DB-aware sample selection (`_pick_best_db`) with BPM ±5 and key matching
- Phase 3: Auto-warp samples to project tempo via `warp_clip_to_bpm`
- Phase 4: Connect `pattern_library` engines (`BassPatterns`, `ChordProgressions`, `MelodyGenerator`)
- Phase 5: Harmonic coherence — detect key from the drum loop and transpose MIDI
- Phase 6: SentimientoLatino2025 + reggaeton3 integrated — 616 samples, 19 clean categories

New files:
- `engines/bpm_key_parser.py` — robust BPM + key parser for filenames
- `engines/populate_bpm_from_filenames.py` — DB population script
- `engines/recategorize_samples.py` — category normalization (19 categories)

Modified:
- `score_renderer.py` — DB selection, auto-warp, engine patterns, key detection, 18 categories
- `ai_loop.py` — `SYSTEM_PROMPT` with full category list
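The filename-based BPM/key extraction in `engines/bpm_key_parser.py` can be sketched roughly as follows. The regexes, function name, and normalization rules here are simplified assumptions for illustration, not the project's actual implementation:

```python
import re

# Assumed filename shapes: "Perc_Loop_95bpm_Amin.wav", "Clap_Dmaj_100bpm.aif".
# BPM is a 2-3 digit number followed by "bpm"; key is an uppercase root A-G,
# optional #/b, and an optional maj/min/m quality.
BPM_RE = re.compile(r"(\d{2,3})\s*bpm", re.IGNORECASE)
KEY_RE = re.compile(r"(?:^|[_\-\s])([A-G](?:#|b)?)(maj|min|m)?(?=[_\-\s.]|$)")

def parse_bpm_key(filename: str):
    """Return (bpm, key) parsed from a sample filename; None for missing parts."""
    bpm_match = BPM_RE.search(filename)
    bpm = int(bpm_match.group(1)) if bpm_match else None

    key = None
    key_match = KEY_RE.search(filename)
    if key_match:
        root, quality = key_match.group(1), (key_match.group(2) or "")
        # Simplification: treat a bare root (no quality suffix) as major.
        key = root + ("min" if quality.lower() in ("m", "min") else "maj")
    return bpm, key
```

A parser like this cannot recover metadata for samples whose names carry neither token, which is consistent with only 283 of 511 samples being populated from filenames.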
# CLAUDE.md - AbletonMCP_AI v3.2
Canonical project context for AI agents. Read this BEFORE doing any work.
## CRITICAL RULES

- NEVER touch `libreria/` or `librerias/` - user's sample library.
- NEVER delete project files - overwrite only.
- NEVER create debug `.md` files in the project root - all go in `AbletonMCP_AI/docs/`.
- ALWAYS compile after changes: `python -m py_compile "<file_path>"`.
- ALWAYS restart Ableton after changes to `__init__.py`.
- STRICT SESSION VIEW ONLY - Arrangement View is discarded for production.
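The compile rule above is easy to automate across the whole tree; a minimal sketch (the `check_compiles` helper is illustrative, not part of the project):

```python
import pathlib
import py_compile

def check_compiles(root: str) -> list:
    """Byte-compile every .py file under `root`; return error messages for failures."""
    errors = []
    for path in pathlib.Path(root).rglob("*.py"):
        try:
            py_compile.compile(str(path), doraise=True)
        except py_compile.PyCompileError as exc:
            errors.append(f"{path}: {exc.msg}")
    return errors
```

An empty return list means every file passed the same check that `python -m py_compile` performs.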
## Architecture

```
AbletonMCP_AI/
├── __init__.py           # Remote Script (all-in-one API)
├── docs/                 # Sprints & SYSTEM_SCORE_RENDER.md
└── mcp_server/
    ├── server.py         # MCP Server (130+ tools)
    ├── score_engine.py   # [NEW] Pure Python song data model
    ├── score_renderer.py # [NEW] Session View renderer
    ├── ai_loop.py        # [NEW] Autonomous production loop
    └── scores/           # [NEW] JSON song storage
```
## Primary Workflow (Score → Render)

The preferred way to produce music is the Compose-then-Render pipeline:

- Compose: use `compose_from_template` or the incremental `new_score` + `compose_*` tools.
- Review: use `get_score` to see the JSON structure.
- Save: use `save_score` to persist the song in `mcp_server/scores/`.
- Render: use `render_score` to inject the JSON into Ableton's Session View.
- Batch: use `render_all_scores` to produce multiple songs at once.
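The JSON persisted by the save step might look like the sketch below. Every field name here is an illustrative guess at what a section/clip score could contain, not the project's actual `SongScore` schema (see `docs/SYSTEM_SCORE_RENDER.md` for the real one):

```python
import json

# Hypothetical score shape: sections become Session View scenes, clip
# definitions become clip slots. Field names are assumptions for illustration.
score = {
    "title": "demo_track",
    "bpm": 95,
    "key": "Amin",
    "sections": [
        {"name": "intro", "bars": 8,
         "clips": [{"track": "Drums", "pattern": "dembow"}]},
        {"name": "verse", "bars": 16,
         "clips": [{"track": "Bass", "pattern": "bass_root"}]},
    ],
}

# A plain-JSON round trip is all that persistence to mcp_server/scores/ needs.
serialized = json.dumps(score, indent=2)
```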
## How It Works

- Ableton starts a TCP server (port 9877).
- MCP tools build a `SongScore` object in memory.
- The renderer translates JSON sections to Scenes and definitions to Clip Slots.
- Patterns (Dembow, Bass, etc.) are resolved server-side into MIDI notes.
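The TCP transport mentioned above can be illustrated with a minimal client. The port (9877) comes from this document, but the command payload and `send_command` helper are hypothetical examples, not the Remote Script's actual protocol:

```python
import json
import socket

def send_command(command: dict, host: str = "127.0.0.1",
                 port: int = 9877, timeout: float = 5.0) -> dict:
    """Send one JSON command to the Remote Script's TCP server, return the parsed reply."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(json.dumps(command).encode("utf-8"))
        # Simplification: assumes the reply fits in one recv; real code should loop.
        reply = sock.recv(65536)
    return json.loads(reply.decode("utf-8"))

# Example (hypothetical command name):
# send_command({"type": "create_scene", "name": "intro"})
```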
## Workflow

- Kimi codes fast, implements features.
- Qwen verifies, compiles, debugs, creates the next sprint.
- Refer to `docs/SYSTEM_SCORE_RENDER.md` for full technical details.