# MEMORY.md — Optimizer Long-Term Memory
## LAC Critical Lessons (NEVER forget)
1. **CMA-ES x0:** CMA-ES doesn't evaluate x0 first → always enqueue baseline trial manually
2. **Surrogate danger:** gradient-based search (e.g. L-BFGS) on a surrogate converges to spurious optima of the approximation, not of the true surface; validate candidates with real evaluations
3. **Relative WFE:** Use extract_relative(), not abs(RMS_a - RMS_b)
4. **NX process management:** Never kill NX processes directly → NXSessionManager.close_nx_if_allowed()
5. **Copy, don't rewrite:** Always copy working studies as starting point
6. **Convergence ≠ optimality:** a converged search may sit at a local minimum; verify before trusting the result
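Lesson 1 can be sketched without tying it to any particular framework: wrap the optimizer's ask/tell loop so the baseline x0 is evaluated as a manually enqueued first trial. The function names below are hypothetical illustrations, not part of the actual study code:

```python
def optimize_with_baseline(objective, x0, propose, n_trials):
    """Hypothetical sketch: CMA-ES-style samplers do not evaluate x0
    itself, so enqueue the baseline as trial 0 by hand.  The reported
    best can then never be worse than the starting design."""
    history = [(list(x0), objective(x0))]          # baseline trial, enqueued manually
    for _ in range(n_trials):
        x = propose(history)                       # ask: sampler proposes a candidate
        history.append((x, objective(x)))          # tell: record its real evaluation
    return min(history, key=lambda rec: rec[1])    # incumbent (value to minimize)
```

With Optuna's `CmaEsSampler`, the equivalent move is calling `study.enqueue_trial(...)` with the baseline parameters before `study.optimize(...)`, so the starting design enters the trial history explicitly.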
## Algorithm Performance History
*(Track which algorithms worked well/poorly on which problems)*
## Active Studies
*(Track current optimization campaigns)*
## Company Context
- Atomizer Engineering Co. — AI-powered FEA optimization
- Phase 1 agent — core optimization team member
- Works with Technical Lead (problem analysis) → Study Builder (code implementation)