# TOOLS.md — Optimizer

## Shared Resources

- **Atomizer repo:** `/home/papa/repos/Atomizer/` (read-only)
- **Obsidian vault:** `/home/papa/obsidian-vault/` (read-only)
- **Job queue:** `/home/papa/atomizer/job-queue/` (read-write)

## Skills

- `atomizer-protocols` — Company protocols (load every session)
- `atomizer-company` — Company identity + LAC critical lessons

## Key References

- QUICK_REF: `/home/papa/repos/Atomizer/docs/QUICK_REF.md`
- Extractors: `/home/papa/repos/Atomizer/docs/generated/EXTRACTOR_CHEATSHEET.md`
- LAC optimization memory: `/home/papa/repos/Atomizer/knowledge_base/lac/optimization_memory/`
- Session insights: `/home/papa/repos/Atomizer/knowledge_base/lac/session_insights/`

## Algorithm Reference

| Algorithm | Best For | Budget | Key Settings |
|-----------|----------|--------|--------------|
| CMA-ES | Continuous, noisy | 100+ | sigma0, popsize |
| Bayesian (GP-EI) | Expensive evals | <50 | n_initial, acquisition |
| NSGA-II | Multi-objective | 200+ | pop_size, crossover |
| Nelder-Mead | Local refinement | <20 | initial_simplex |
| TPE | Mixed continuous/discrete | 50+ | n_startup_trials |

## LAC Critical Lessons (always remember)

1. CMA-ES doesn't evaluate x0 first → enqueue a baseline trial explicitly
2. Surrogate + L-BFGS = fake-optima danger
3. Relative WFE: use extract_relative()
4. Never kill NX directly → use NXSessionManager.close_nx_if_allowed()
5. Always copy working studies → never rewrite from scratch

## Orchestration Skill

- Script: `/home/papa/atomizer/workspaces/shared/skills/orchestrate/orchestrate.sh`
- Required caller flag: `--caller optimizer`
- Allowed targets: webster, study-builder, secretary
- Optional channel context: `--channel-context --channel-messages `
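A minimal invocation sketch for the orchestration skill. Only `--caller optimizer` and the script path are documented above; how the target agent is passed is an assumption (shown here as a positional argument), and the snippet only assembles and prints the command line rather than running the script:

```shell
# Sketch only — assumes the target is given positionally; verify against
# the script's actual usage before running for real.
ORCH=/home/papa/atomizer/workspaces/shared/skills/orchestrate/orchestrate.sh

TARGET=study-builder   # must be one of: webster, study-builder, secretary
CMD="$ORCH --caller optimizer $TARGET"

# Print rather than execute, so the sketch is safe to run anywhere.
echo "$CMD"
```

The `--channel-context` / `--channel-messages` flags are omitted here because their argument forms are not documented above.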