# CLAUDE.md - AbletonMCP_AI v3.2

> **Canonical project context** for AI agents.
> Read this BEFORE doing any work.

## CRITICAL RULES

1. **NEVER touch `libreria/` or `librerias/`** - User's sample library.
2. **NEVER delete project files** - Overwrite only.
3. **NEVER create debug .md files in project root** - All go in `AbletonMCP_AI/docs/`.
4. **ALWAYS compile after changes**: `python -m py_compile "<file>"`
5. **ALWAYS restart Ableton** after changes to `__init__.py`.
6. **STRICT SESSION VIEW ONLY** - Arrangement View is not used for production.

## Architecture

```
AbletonMCP_AI/
├── __init__.py            # Remote Script (all-in-one API)
├── docs/                  # Sprints & SYSTEM_SCORE_RENDER.md
└── mcp_server/
    ├── server.py          # MCP server (130+ tools)
    ├── score_engine.py    # [NEW] Pure Python song data model
    ├── score_renderer.py  # [NEW] Session View renderer
    ├── ai_loop.py         # [NEW] Autonomous production loop
    └── scores/            # [NEW] JSON song storage
```

## Primary Workflow (Score → Render)

The preferred way to produce music is the **Compose-then-Render** pipeline:

1. **Compose**: Use `compose_from_template` or the incremental `new_score` + `compose_*` tools.
2. **Review**: Use `get_score` to inspect the JSON structure.
3. **Save**: Use `save_score` to persist the song in `mcp_server/scores/`.
4. **Render**: Use `render_score` to inject the JSON into Ableton's Session View.
5. **Batch**: Use `render_all_scores` to produce multiple songs at once.

## How It Works

1. **Ableton** starts a TCP server (port 9877).
2. **MCP tools** build a `SongScore` object in memory.
3. The **renderer** translates JSON sections into **Scenes** and clip definitions into **Clip Slots**.
4. **Patterns** (Dembow, Bass, etc.) are resolved server-side into MIDI notes.

## Agent Roles

- **Kimi** codes fast, implements features.
- **Qwen** verifies, compiles, debugs, and creates the next sprint.
- Refer to `docs/SYSTEM_SCORE_RENDER.md` for full technical details.
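To make the Compose-then-Render flow concrete, here is a minimal sketch of the kind of JSON score the pipeline passes around. This is an illustrative approximation only: the real data model lives in `mcp_server/score_engine.py`, and the field names (`title`, `bpm`, `sections`, `patterns`) and pattern identifiers used below are assumptions, not the actual schema.

```python
import json

def new_score(title, bpm):
    # Hypothetical minimal score structure; the real SongScore
    # model is defined in mcp_server/score_engine.py.
    return {"title": title, "bpm": bpm, "sections": []}

def add_section(score, name, bars, patterns):
    # At render time, each section maps to one Scene and each
    # pattern entry to a clip in a Clip Slot (per "How It Works").
    score["sections"].append({"name": name, "bars": bars, "patterns": patterns})
    return score

# Build a two-section score; pattern names are illustrative.
score = new_score("Demo", 95)
add_section(score, "Intro", 8, {"drums": "dembow"})
add_section(score, "Verse", 16, {"drums": "dembow", "bass": "bass_root"})

# save_score would persist something like this under mcp_server/scores/.
payload = json.dumps(score, indent=2)
print(payload)
```

Keeping the score as plain JSON is what allows `render_all_scores` to batch-produce songs: each file in `mcp_server/scores/` is self-contained and can be rendered into Session View independently.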