feat: Add M1 mirror Zernike optimization with correct RMS calculation
Major improvements to the telescope mirror optimization workflow:

Assembly FEM Workflow (solve_simulation.py):
- Fixed multi-part assembly FEM update sequence
- Use ImportFromFile() for reliable expression updates
- Add DuplicateNodesCheckBuilder with MergeOccurrenceNodes=True
- Switch to Foreground solve mode for multi-subcase solutions
- Add detailed logging and diagnostics for node merge operations

Zernike RMS Calculation:
- CRITICAL FIX: Use the correct surface-based RMS formula
- Global RMS = sqrt(mean(W^2)) from actual WFE values
- Filtered RMS = sqrt(mean(W_residual^2)) after removing the low-order fit
- This matches zernike_Post_Script_NX.py (optical standard)
- The previous, incorrect formula was sqrt(sum(coeffs^2))
- Add compute_rms_filter_j1to3() for the optician-workload metric

Subcase Mapping:
- Fix subcase mapping to match the NX model:
  - Subcase 1 = 90 deg (polishing orientation)
  - Subcase 2 = 20 deg (reference)
  - Subcase 3 = 40 deg
  - Subcase 4 = 60 deg

New Study: M1 Mirror Zernike Optimization
- Full optimization config with 11 design variables
- 3 objectives: rel_filtered_rms_40_vs_20, rel_filtered_rms_60_vs_20, mfg_90_optician_workload
- Neural surrogate support for accelerated optimization

Documentation:
- Update ZERNIKE_INTEGRATION.md with the correct RMS formula
- Update ASSEMBLY_FEM_WORKFLOW.md with expression import and node merge details
- Add reference scripts from the original zernike_Post_Script_NX.py

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
docs/06_PROTOCOLS_DETAILED/ASSEMBLY_FEM_WORKFLOW.md (new file, 188 lines)
# Assembly FEM Optimization Workflow

This document describes the multi-part assembly FEM workflow used when optimizing complex assemblies with `.afm` (Assembly FEM) files.

## Overview

Assembly FEMs have a more complex dependency chain than single-part simulations:

```
.prt (geometry) → _fem1.fem (component mesh) → .afm (assembly mesh) → .sim (solution)
```

Each level must be updated in sequence when design parameters change.

## When This Workflow Applies

This workflow is automatically triggered when:

- The working directory contains `.afm` files
- Multiple `.fem` files exist (component meshes)
- Multiple `.prt` files exist (component geometry)

Examples:

- M1 Mirror assembly (M1_Blank + M1_Vertical_Support_Skeleton)
- Multi-component mechanical assemblies
- Any NX assembly where components have separate FEM files

## The 4-Step Workflow

### Step 1: Update Expressions in Geometry Part (.prt)

```
Open M1_Blank.prt
├── Find and update design expressions
│   ├── whiffle_min = 42.5
│   ├── whiffle_outer_to_vertical = 75.0
│   └── inner_circular_rib_dia = 550.0
├── Rebuild geometry (DoUpdate)
└── Save part
```

The `.prt` file contains the parametric CAD model with expressions that drive dimensions. These expressions are updated with the new design parameter values, then the geometry is rebuilt.
### Step 2: Update Component FEM Files (.fem)

```
For each component FEM:
├── Open M1_Blank_fem1.fem
│   ├── UpdateFemodel() - regenerates mesh from updated geometry
│   └── Save FEM
├── Open M1_Vertical_Support_Skeleton_fem1.fem
│   ├── UpdateFemodel()
│   └── Save FEM
└── ... (repeat for all component FEMs)
```

Each component FEM is linked to its source geometry. `UpdateFemodel()` regenerates the mesh based on the updated geometry.

### Step 3: Update Assembly FEM (.afm)

```
Open ASSY_M1_assyfem1.afm
├── UpdateFemodel() - updates assembly mesh
├── Merge coincident nodes (at component interfaces)
├── Resolve labeling conflicts (duplicate node/element IDs)
└── Save AFM
```

The assembly FEM combines the component meshes. This step:

- Reconnects meshes at shared interfaces
- Resolves numbering conflicts between component meshes
- Ensures mesh continuity for accurate analysis
### Step 4: Solve Simulation (.sim)

```
Open ASSY_M1_assyfem1_sim1.sim
├── Execute solve
│   ├── Foreground mode for all solutions
│   └── or Background mode for a specific solution
└── Save simulation
```

The simulation file references the assembly FEM and contains the solution setup (loads, constraints, subcases).

## File Dependencies

```
M1 Mirror Example:

M1_Blank.prt ─────────────────────> M1_Blank_fem1.fem ─────────┐
        │                                 │                    │
        │ (expressions)                   │ (component mesh)   │
        ↓                                 ↓                    │
M1_Vertical_Support_Skeleton.prt ──> M1_..._Skeleton_fem1.fem ─┤
                                                               │
                                                               ↓
                        ASSY_M1_assyfem1.afm ──> ASSY_M1_assyfem1_sim1.sim
                           (assembly mesh)            (solution)
```
## API Functions Used

| Step | NX API Call | Purpose |
|------|-------------|---------|
| 1 | `OpenBase()` | Open .prt file |
| 1 | `ImportFromFile()` | Import expressions from .exp file (preferred) |
| 1 | `DoUpdate()` | Rebuild geometry |
| 2-3 | `UpdateFemodel()` | Regenerate mesh from geometry |
| 3 | `DuplicateNodesCheckBuilder` | Merge coincident nodes at interfaces |
| 3 | `MergeOccurrenceNodes = True` | Critical: enables cross-component merge |
| 4 | `SolveAllSolutions()` | Execute FEA (Foreground mode recommended) |

### Expression Update Method

The recommended approach uses expression file import:

```python
# Write expressions to an .exp file
with open(exp_path, 'w') as f:
    for name, value in expressions.items():
        unit = get_unit_for_expression(name)
        f.write(f"[{unit}]{name}={value}\n")

# Import into the part
modified, errors = workPart.Expressions.ImportFromFile(
    exp_path,
    NXOpen.ExpressionCollection.ImportMode.Replace
)
```

This is more reliable than `EditExpressionWithUnits()` for batch updates.
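Outside NX, the `.exp` line format written above can be exercised directly. A minimal round-trip sketch (the writer mirrors the snippet above; `write_exp_file`, `parse_exp_file`, and the `units` name→unit mapping are illustrative names, not part of the real API):

```python
import re

def write_exp_file(exp_path, expressions, units):
    # One NX-style line per expression: [Unit]Name=Value
    with open(exp_path, "w") as f:
        for name, value in expressions.items():
            f.write(f"[{units.get(name, 'MilliMeter')}]{name}={value}\n")

def parse_exp_file(exp_path):
    # Inverse: skip comments, strip the optional [Unit] prefix, split on '='
    out = {}
    with open(exp_path) as f:
        for line in f:
            s = line.strip()
            if not s or s.startswith("//") or "=" not in s:
                continue
            left, right = s.split("=", 1)
            name = re.sub(r"\[[^\]]*\]\s*", "", left).strip()
            out[name] = float(right)
    return out
```

Round-tripping a file this way is a cheap sanity check on the values handed to `ImportFromFile()`.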
## Error Handling

Common issues and solutions:

### "Update undo happened"

- The geometry update failed due to constraint violations
- Check that expression values are within valid ranges
- Parameter bounds may need adjusting

### "This operation can only be done on the work part"

- The work part was not properly set before the operation
- Use `SetWork()` to make the target part the work part

### Node merge warnings

- Manual intervention may be needed for complex interfaces
- Check mesh connectivity in NX after the solve

### "Billion nm" RMS values

- Indicates node merging failed: coincident nodes were not properly merged
- Check that `MergeOccurrenceNodes = True` is set
- Verify the tolerance (0.01 mm recommended)
- Run the node merge after every FEM update, not just once
## Configuration

The workflow auto-detects assembly FEMs, but you can configure its behavior:

```json
{
  "nx_settings": {
    "expression_part": "M1_Blank",   // Override auto-detection
    "component_fems": [              // Explicit list of FEMs to update
      "M1_Blank_fem1.fem",
      "M1_Vertical_Support_Skeleton_fem1.fem"
    ],
    "afm_file": "ASSY_M1_assyfem1.afm"
  }
}
```

## Implementation Reference

See `optimization_engine/solve_simulation.py` for the full implementation:

- `detect_assembly_fem()` - Detects whether the assembly workflow is needed
- `update_expressions_in_part()` - Step 1 implementation
- `update_fem_part()` - Step 2 implementation
- `update_assembly_fem()` - Step 3 implementation
- `solve_simulation_file()` - Step 4 implementation

## Tips

1. **Start with a baseline solve**: Before optimizing, manually verify that the full workflow completes in NX
2. **Check mesh quality**: Poor mesh quality after updates can cause solve failures
3. **Monitor memory**: Assembly FEMs with many components use significant memory
4. **Use Foreground mode**: For multi-subcase solutions, Foreground mode ensures all subcases complete
docs/ZERNIKE_INTEGRATION.md (new file, 313 lines)
# Zernike Wavefront Analysis Integration

This document describes how to use Atomizer's Zernike analysis capabilities for telescope mirror optimization.

## Overview

Atomizer includes a full Zernike polynomial decomposition system for analyzing wavefront errors (WFE) in telescope mirror FEA simulations. The system:

- Extracts nodal displacements from NX Nastran OP2 files
- Fits Zernike polynomials using Noll indexing (optical standard)
- Computes RMS metrics (global and filtered)
- Analyzes individual aberrations (astigmatism, coma, trefoil, etc.)
- Supports multi-subcase analysis (different gravity orientations)

## Quick Start

### Simple Extraction

```python
from optimization_engine.extractors import extract_zernike_from_op2

# Extract Zernike metrics for a single subcase
result = extract_zernike_from_op2(
    op2_file="model-solution_1.op2",
    subcase="20"  # 20 degree elevation
)

print(f"Global RMS: {result['global_rms_nm']:.2f} nm")
print(f"Filtered RMS: {result['filtered_rms_nm']:.2f} nm")
print(f"Astigmatism: {result['astigmatism_rms_nm']:.2f} nm")
```
### In an Optimization Objective

```python
from optimization_engine.extractors.zernike_helpers import create_zernike_objective

# Create the objective function
zernike_obj = create_zernike_objective(
    op2_finder=lambda: sim_dir / "model-solution_1.op2",
    subcase="20",
    metric="filtered_rms_nm"
)

# Use in an Optuna trial
def objective(trial):
    # ... suggest parameters ...
    # ... run simulation ...

    rms = zernike_obj()
    return rms
```

## RMS Calculation Method

**IMPORTANT**: Atomizer uses the correct surface-based RMS calculation matching optical standards:

```python
# Global RMS = sqrt(mean(W^2)) - RMS of the actual WFE surface values
global_rms = sqrt(mean(W_nm ** 2))

# Filtered RMS = sqrt(mean(W_residual^2))
# where W_residual = W_nm - Z[:, :4] @ coeffs[:4] (low-order fit subtracted)
filtered_rms = sqrt(mean(W_residual ** 2))
```

This is **different** from summing Zernike coefficients! The RMS is computed from the actual WFE surface values, not from `sqrt(sum(coeffs^2))`.
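The same two formulas in runnable form, with a synthetic basis (the real extractor builds `Z` from Noll-indexed Zernike polynomials; `surface_rms` is an illustrative name):

```python
import numpy as np

def surface_rms(W_nm, Z, n_filter=4):
    """Global and filtered RMS from WFE surface values, per the formulas above.

    W_nm : (n_points,) wavefront error sampled at the mesh nodes, in nm
    Z    : (n_points, n_modes) Zernike basis evaluated at the same nodes
    """
    coeffs, *_ = np.linalg.lstsq(Z, W_nm, rcond=None)
    global_rms = np.sqrt(np.mean(W_nm ** 2))
    # Subtract only the low-order fit (piston/tip/tilt/defocus by default)
    residual = W_nm - Z[:, :n_filter] @ coeffs[:n_filter]
    filtered_rms = np.sqrt(np.mean(residual ** 2))
    return global_rms, filtered_rms
```

Note that `sqrt(sum(coeffs**2))` only agrees with the surface RMS when the basis is orthonormal over the sampled points, which an arbitrary FEA node distribution does not guarantee.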
## Available Metrics

### RMS Metrics

| Metric | Description |
|--------|-------------|
| `global_rms_nm` | RMS of the entire WFE surface: `sqrt(mean(W^2))` |
| `filtered_rms_nm` | RMS after removing modes 1-4 (piston, tip, tilt, defocus) |
| `rms_filter_j1to3_nm` | RMS after removing only modes 1-3 (keeps defocus) - "optician workload" |

### Aberration Magnitudes

| Metric | Zernike Modes | Description |
|--------|--------------|-------------|
| `defocus_nm` | J4 | Focus error |
| `astigmatism_rms_nm` | J5 + J6 | Combined astigmatism |
| `coma_rms_nm` | J7 + J8 | Combined coma |
| `trefoil_rms_nm` | J9 + J10 | Combined trefoil |
| `spherical_nm` | J11 | Primary spherical |
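Paired magnitudes such as J5 + J6 are conventionally combined in quadrature. A sketch, assuming `coeffs[j]` holds the Noll coefficient for mode J_j in nm (index 0 unused; the function name is illustrative):

```python
import math

def aberration_magnitudes(coeffs):
    # coeffs: sequence with coeffs[j] = Noll mode J_j coefficient, in nm
    return {
        "defocus_nm": abs(coeffs[4]),
        "astigmatism_rms_nm": math.hypot(coeffs[5], coeffs[6]),
        "coma_rms_nm": math.hypot(coeffs[7], coeffs[8]),
        "trefoil_rms_nm": math.hypot(coeffs[9], coeffs[10]),
        "spherical_nm": abs(coeffs[11]),
    }
```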
## Multi-Subcase Analysis

For telescope mirrors, gravity orientation affects the surface shape. Standard subcases:

| Subcase | Description |
|---------|-------------|
| 20 | Low elevation (operational) |
| 40 | Mid-low elevation |
| 60 | Mid-high elevation |
| 90 | Horizontal (polishing orientation) |

### Extract All Subcases

```python
from optimization_engine.extractors import ZernikeExtractor

extractor = ZernikeExtractor("model.op2")
results = extractor.extract_all_subcases(reference_subcase="20")

for label, metrics in results.items():
    print(f"Subcase {label}: {metrics['filtered_rms_nm']:.1f} nm")
```

### Relative Analysis

Compare deformation between orientations:

```python
from optimization_engine.extractors.zernike_helpers import create_relative_zernike_objective

# Minimize deformation at 20 deg relative to the polishing position (90 deg)
relative_obj = create_relative_zernike_objective(
    op2_finder=lambda: sim_dir / "model.op2",
    target_subcase="20",
    reference_subcase="90"
)

relative_rms = relative_obj()
```
## Optimization Configuration

### Example: Single Objective (Filtered RMS)

```json
{
  "objectives": [
    {
      "name": "filtered_rms",
      "direction": "minimize",
      "extractor": "zernike",
      "extractor_config": {
        "subcase": "20",
        "metric": "filtered_rms_nm"
      }
    }
  ]
}
```

### Example: Multi-Objective (RMS + Mass)

```json
{
  "objectives": [
    {
      "name": "filtered_rms_20deg",
      "direction": "minimize",
      "extractor": "zernike",
      "extractor_config": {
        "subcase": "20",
        "metric": "filtered_rms_nm"
      }
    },
    {
      "name": "mass",
      "direction": "minimize",
      "extractor": "mass_from_expression"
    }
  ],
  "optimization_settings": {
    "sampler": "NSGA-II",
    "protocol": 11
  }
}
```
### Example: Constrained (Stress + Aberration Limits)

```json
{
  "constraints": [
    {
      "name": "astigmatism_limit",
      "type": "upper_bound",
      "threshold": 50.0,
      "extractor": "zernike",
      "extractor_config": {
        "subcase": "90",
        "metric": "astigmatism_rms_nm"
      }
    }
  ]
}
```

## Advanced: ZernikeObjectiveBuilder

For complex multi-subcase objectives:

```python
from optimization_engine.extractors.zernike_helpers import ZernikeObjectiveBuilder

builder = ZernikeObjectiveBuilder(
    op2_finder=lambda: sim_dir / "model.op2"
)

# Weight operational positions more heavily
builder.add_subcase_objective("20", "filtered_rms_nm", weight=1.0)
builder.add_subcase_objective("40", "filtered_rms_nm", weight=0.5)
builder.add_subcase_objective("60", "filtered_rms_nm", weight=0.5)

# Create a combined objective (weighted sum)
objective = builder.build_weighted_sum()

# Or: worst case across subcases
worst_case_obj = builder.build_max()
```
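The combination semantics can be shown with plain closures: a weighted sum of per-subcase metrics versus the worst case across them. This is only the assumed arithmetic, not the builder's real implementation, and the function names are illustrative:

```python
def weighted_sum_objective(objectives):
    # objectives: list of (metric_fn, weight) pairs, one per subcase
    return lambda: sum(weight * fn() for fn, weight in objectives)

def max_objective(objectives):
    # Worst case across subcases; weights are ignored
    return lambda: max(fn() for fn, _ in objectives)
```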
## Zernike Settings

### Configuration Options

| Setting | Default | Description |
|---------|---------|-------------|
| `n_modes` | 50 | Number of Zernike modes to fit |
| `filter_orders` | 4 | Low-order modes to filter (1-4 = piston through defocus) |
| `displacement_unit` | "mm" | Unit of displacement in the OP2 ("mm", "m", "um", "nm") |

### Unit Conversions

Wavefront error (WFE) is computed as:

```
WFE_nm = 2 * displacement * unit_conversion
```

Where `unit_conversion` converts to nanometers:

- mm: 1e6
- m: 1e9
- um: 1e3

The factor of 2 accounts for the optical convention: on reflection, a surface error counts twice in the wavefront error.
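As a runnable check of the conversion (the table and function names here are illustrative):

```python
# nm per unit of displacement, per the table above
UNIT_TO_NM = {"mm": 1e6, "m": 1e9, "um": 1e3, "nm": 1.0}

def wfe_nm(displacement, unit="mm"):
    # Factor of 2: a reflected wavefront picks up the surface error twice
    return 2.0 * displacement * UNIT_TO_NM[unit]
```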
## NX Nastran Setup

### Required Subcases

Your NX Nastran model should have a subcase for each gravity orientation:

```
SUBCASE 20
  SUBTITLE=20 deg elevation
  LOAD = ...

SUBCASE 40
  SUBTITLE=40 deg elevation
  LOAD = ...
```

The extractor identifies subcases by:

1. The numeric value in SUBTITLE (preferred)
2. The SUBCASE ID number
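A sketch of that lookup order (the function name is illustrative; the real logic lives in the extractor):

```python
import re

def subcase_label(subtitle, subcase_id):
    # Prefer the first number embedded in SUBTITLE, else fall back to the ID
    match = re.search(r"\d+", subtitle or "")
    return match.group(0) if match else str(subcase_id)
```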
### Output Requests

Ensure displacement output is requested:

```
SET 999 = ALL
DISPLACEMENT(SORT1,REAL) = 999
```

## Migration from Legacy Scripts

If you were using `zernike_Post_Script_NX.py`:

| Old Approach | Atomizer Equivalent |
|--------------|---------------------|
| Manual OP2 parsing | `ZernikeExtractor` |
| `compute_zernike_coeffs_chunked()` | `compute_zernike_coefficients()` |
| `write_exp_file()` | Configure as objective/constraint |
| HTML reports | Dashboard visualization (TBD) |
| RMS log CSV | Optuna database + export |

### Key Differences

1. **Integration**: Zernike is now an extractor, like displacement/stress
2. **Optimization**: Direct use as objectives/constraints in Optuna
3. **Multi-objective**: Native NSGA-II support for RMS + mass Pareto optimization
4. **Neural acceleration**: A surrogate can be trained on Zernike metrics (Protocol 12)

## Example Study Structure

```
studies/
  mirror_optimization/
    1_setup/
      optimization_config.json
      model/
        ASSY_M1.prt
        ASSY_M1_assyfem1.afm
        ASSY_M1_assyfem1_sim1.sim
    2_results/
      study.db
      zernike_analysis/
        trial_001_zernike.json
        trial_002_zernike.json
        ...
    run_optimization.py
```

## See Also

- [examples/optimization_config_zernike_mirror.json](../examples/optimization_config_zernike_mirror.json) - Full example configuration
- [optimization_engine/extractors/extract_zernike.py](../optimization_engine/extractors/extract_zernike.py) - Core implementation
- [optimization_engine/extractors/zernike_helpers.py](../optimization_engine/extractors/zernike_helpers.py) - Helper functions
examples/Zernike_old_reference/nx_post_each_iter.py (new file, 332 lines)
# nx_post_each_iter.py
import os, subprocess
import NXOpen
from datetime import datetime
import csv, re

# --- SETTINGS ---
TEST_ENV_PY = r"C:\Users\antoi\anaconda3\envs\test_env\python.exe"
SCRIPT_NAME = "zernike_Post_Script_NX.py"  # your script in the .sim folder
OP2_NAME = "assy_m1_assyfem1_sim1-solution_1.op2"
EXP_NAME = "Iteration_results_expression.exp"
TIMEOUT = None  # e.g., 900 for 15 min
# Option A: set via env NX_GEOM_PART_NAME, else hardcode your CAD part name here.
GEOM_PART_NAME = os.environ.get("NX_GEOM_PART_NAME", "ASSY_M1_assyfem1")
# ---------------


def import_iteration_results_exp(exp_path: str, lw) -> bool:
    """Import the EXP into the current work part (Replace mode) and update."""
    theSession = NXOpen.Session.GetSession()
    workPart = theSession.Parts.BaseWork

    if not os.path.isfile(exp_path):
        lw.WriteLine(f"[EXP][ERROR] File not found: {exp_path}")
        return False

    mark_import = theSession.SetUndoMark(NXOpen.Session.MarkVisibility.Visible, "Import Expressions")
    try:
        modified, err_msgs = workPart.Expressions.ImportFromFile(
            exp_path, NXOpen.ExpressionCollection.ImportMode.Replace
        )
        # Surface any parsing messages
        try:
            if err_msgs:
                for m in err_msgs:
                    lw.WriteLine(f"[EXP][WARN] {m}")
        except Exception:
            pass

        mark_update = theSession.SetUndoMark(NXOpen.Session.MarkVisibility.Invisible, "NX update")
        nErrs = theSession.UpdateManager.DoUpdate(mark_update)
        theSession.DeleteUndoMark(mark_update, "NX update")

        theSession.SetUndoMarkName(mark_import, "Expressions")
        theSession.DeleteUndoMark(mark_import, None)

        lw.WriteLine(f"[EXP] Imported OK (modified={modified}, nErrs={nErrs})")
        return True
    except Exception as ex:
        lw.WriteLine(f"[EXP][FATAL] {ex}")
        try:
            theSession.DeleteUndoMark(mark_import, None)
        except Exception:
            pass
        return False

def export_all_named_expressions_to_exp(workPart, out_path, lw):
    """
    Export expressions to an .exp file using the 3-arg signature:
        ExportToFile(<mode>, <filepath>, <sort_type>)
    Works across NX versions where the enums live under either
        NXOpen.ExpressionCollection.ExportMode / SortType
    or
        NXOpen.ExpressionCollectionExportMode / ExpressionCollectionSortType
    """
    try:
        if not out_path.lower().endswith(".exp"):
            out_path += ".exp"

        mode_cls = getattr(NXOpen.ExpressionCollection, "ExportMode",
                           getattr(NXOpen, "ExpressionCollectionExportMode", None))
        sort_cls = getattr(NXOpen.ExpressionCollection, "SortType",
                           getattr(NXOpen, "ExpressionCollectionSortType", None))
        if mode_cls is None or sort_cls is None:
            raise RuntimeError("Unsupported NX/Open version: ExportMode/SortType enums not found")

        workPart.Expressions.ExportToFile(mode_cls.WorkPart, out_path, sort_cls.AlphaNum)
        lw.WriteLine(f"[EXP-EXPORT] Wrote: {out_path}")
        return True
    except Exception as ex:
        lw.WriteLine(f"[EXP-EXPORT][ERROR] {ex}")
        return False


def parse_exp_file_to_dict(exp_path):
    """
    Parse NX .exp lines like:
        // comments
        [MilliMeter]SomeName=0.001234
        SomeOther=42
    into { 'SomeName': numeric_or_str, ... }.
    """
    out = {}
    num = re.compile(r'^[+-]?\d+(?:\.\d+)?(?:[eE][+-]?\d+)?$')

    with open(exp_path, 'r', encoding='utf-8', errors='ignore') as f:
        for line in f:
            s = line.strip()
            if not s or s.startswith('//'):
                continue
            if '=' not in s:
                continue
            left, right = s.split('=', 1)
            # Strip optional [Unit] prefixes on the left side
            left = re.sub(r'\[[^\]]*\]\s*', '', left).strip()
            key = left
            val = right.strip()

            # Try numeric
            v = val
            if num.match(val):
                try:
                    v = float(val)
                except Exception:
                    pass
            out[key] = v
    return out

def append_named_exprs_row(results_dir, run_id, run_dt, expr_dict, lw, source, part_name):
    """
    Appends one row to Results/NX_named_expressions_log.csv.
    Columns auto-extend for new expression names.
    Adds metadata: RunID, RunDateTimeLocal, Source ('SIM'|'PART'), PartName.
    """
    log_csv = os.path.join(results_dir, "NX_named_expressions_log.csv")
    meta = {
        "RunID": run_id,
        "RunDateTimeLocal": run_dt.strftime("%Y-%m-%d %H:%M:%S"),
        "Source": source,
        "PartName": part_name,
    }
    row = {**meta, **expr_dict}

    # Create or extend the header as needed
    if not os.path.exists(log_csv):
        fieldnames = list(meta.keys()) + sorted(expr_dict.keys())
        with open(log_csv, "w", newline="", encoding="utf-8") as f:
            w = csv.DictWriter(f, fieldnames=fieldnames)
            w.writeheader()
            w.writerow({k: row.get(k, "") for k in fieldnames})
        lw.WriteLine(f"[EXP-EXPORT] Created CSV log: {log_csv}")
        return

    with open(log_csv, "r", newline="", encoding="utf-8") as f:
        r = csv.reader(f)
        existing = list(r)

    if not existing:
        fieldnames = list(meta.keys()) + sorted(expr_dict.keys())
        with open(log_csv, "w", newline="", encoding="utf-8") as f:
            w = csv.DictWriter(f, fieldnames=fieldnames)
            w.writeheader()
            w.writerow({k: row.get(k, "") for k in fieldnames})
        lw.WriteLine(f"[EXP-EXPORT] Rebuilt CSV log: {log_csv}")
        return

    header = existing[0]
    known = set(header)
    new_cols = [c for c in meta.keys() if c not in known] + \
               sorted([k for k in expr_dict.keys() if k not in known])
    if new_cols:
        header = header + new_cols

    with open(log_csv, "w", newline="", encoding="utf-8") as f:
        w = csv.DictWriter(f, fieldnames=header)
        w.writeheader()
        # Rewrite the old rows (padding any new columns)
        for data in existing[1:]:
            old_row = {h: (data[i] if i < len(data) else "") for i, h in enumerate(existing[0])}
            for c in new_cols:
                old_row.setdefault(c, "")
            w.writerow({k: old_row.get(k, "") for k in header})
        # Append the new row
        w.writerow({k: row.get(k, "") for k in header})

    lw.WriteLine(f"[EXP-EXPORT] Appended CSV log: {log_csv}")

def export_geometry_named_expressions(sim_part, results_dir, run_id, lw):
    """
    Switch the display to the geometry part (as in the journal), export its
    expressions, then restore. GEOM_PART_NAME must be resolvable via
    Session.Parts.FindObject.
    """
    theSession = NXOpen.Session.GetSession()
    original_display = theSession.Parts.BaseDisplay
    original_work = theSession.Parts.BaseWork

    try:
        if not GEOM_PART_NAME:
            lw.WriteLine("[EXP-EXPORT][WARN] GEOM_PART_NAME not set; skipping geometry export.")
            return False, None, None

        try:
            part1 = theSession.Parts.FindObject(GEOM_PART_NAME)
        except Exception:
            lw.WriteLine(f"[EXP-EXPORT][WARN] Geometry part not found by name: {GEOM_PART_NAME}")
            return False, None, None

        mark = theSession.SetUndoMark(NXOpen.Session.MarkVisibility.Visible, "Change Displayed Part")
        status, pls = theSession.Parts.SetActiveDisplay(
            part1,
            NXOpen.DisplayPartOption.AllowAdditional,
            NXOpen.PartDisplayPartWorkPartOption.UseLast
        )
        # Switch to Modeling, as in the journal
        try:
            theSession.ApplicationSwitchImmediate("UG_APP_MODELING")
        except Exception:
            pass

        workPart = theSession.Parts.Work
        out_exp = os.path.join(results_dir, f"NamedExpressions_PART_{run_id}.exp")
        ok = export_all_named_expressions_to_exp(workPart, out_exp, lw)

        if pls is not None:
            pls.Dispose()
        theSession.DeleteUndoMark(mark, None)

        # Part name for logging
        part_name = os.path.splitext(os.path.basename(workPart.FullPath))[0] if workPart and workPart.FullPath else GEOM_PART_NAME
        return ok, out_exp if ok else None, part_name

    except Exception as ex:
        lw.WriteLine(f"[EXP-EXPORT][ERROR] Geometry export failed: {ex}")
        return False, None, None

    finally:
        # Try to restore the prior display/work part and CAE application
        try:
            if original_display is not None:
                theSession.Parts.SetActiveDisplay(
                    original_display,
                    NXOpen.DisplayPartOption.AllowAdditional,
                    NXOpen.PartDisplayPartWorkPartOption.UseLast
                )
        except Exception:
            pass
        try:
            theSession.ApplicationSwitchImmediate("UG_APP_SFEM")  # back to CAE if applicable
        except Exception:
            pass

def run_post(sim_dir, lw, run_id, results_dir):
    post_script = os.path.join(sim_dir, SCRIPT_NAME)
    op2 = os.path.join(sim_dir, OP2_NAME)

    if not os.path.exists(TEST_ENV_PY):
        lw.WriteLine(f"[ERROR] test_env python not found: {TEST_ENV_PY}")
        return 3
    if not os.path.exists(post_script):
        lw.WriteLine(f"[ERROR] Post script not found: {post_script}")
        return 4
    if not os.path.exists(op2):
        lw.WriteLine(f"[ERROR] OP2 not found: {op2}")
        return 2

    cmd = [TEST_ENV_PY, post_script, "--op2", op2]
    lw.WriteLine("[POST] " + " ".join(cmd))
    lw.WriteLine(f"[POST] cwd={sim_dir}")

    env = os.environ.copy()
    env["ZERNIKE_RUN_ID"] = run_id
    env["ZERNIKE_RESULTS_DIR"] = results_dir

    proc = subprocess.run(
        cmd, cwd=sim_dir, capture_output=True, text=True,
        shell=False, timeout=TIMEOUT, env=env
    )
    if proc.stdout:
        lw.WriteLine(proc.stdout)
    if proc.stderr:
        lw.WriteLine("[STDERR]\n" + proc.stderr)
    lw.WriteLine(f"[INFO] Post finished (rc={proc.returncode})")
    return proc.returncode

def main():
    s = NXOpen.Session.GetSession()
    lw = s.ListingWindow
    lw.Open()
    sim_part = s.Parts.BaseWork
    sim_dir = os.path.dirname(sim_part.FullPath)

    # --- Results folder + a run id/timestamp we can also hand to Zernike ---
    results_dir = os.path.join(sim_dir, "Results")
    os.makedirs(results_dir, exist_ok=True)
    run_dt = datetime.now()
    run_id = run_dt.strftime("%Y%m%d_%H%M%S")

    # --- Run the Zernike post (hand it the same run id & results dir via env) ---
    rc = run_post(sim_dir, lw, run_id, results_dir)

    if rc != 0:
        lw.WriteLine(f"[POST] Zernike post failed (rc={rc}). Skipping EXP import and NX expr logging.")
        return  # or 'pass' if you prefer to continue anyway

    # Import the EXP if it exists: prefer Results/, then fall back to the sim folder
    exp_candidates = [
        os.path.join(results_dir, EXP_NAME),
        os.path.join(sim_dir, EXP_NAME),
    ]
    for exp_path in exp_candidates:
        if os.path.isfile(exp_path):
            import_iteration_results_exp(exp_path, lw)
            break
    else:
        lw.WriteLine(f"[EXP] Skipped: not found → {exp_candidates[0]}")

    # --- Export SIM (work CAE part) expressions and append to the log ---
    sim_part_name = os.path.splitext(os.path.basename(sim_part.FullPath))[0] if sim_part and sim_part.FullPath else "SIM"
    named_exp_sim = os.path.join(results_dir, f"NamedExpressions_SIM_{run_id}.exp")
    if export_all_named_expressions_to_exp(sim_part, named_exp_sim, lw):
        exprs_sim = parse_exp_file_to_dict(named_exp_sim)
        append_named_exprs_row(results_dir, run_id, run_dt, exprs_sim, lw, source="SIM", part_name=sim_part_name)

    # --- Export GEOMETRY (modeling) part expressions, as in the journal, and append to the log ---
    ok_part, part_exp_path, part_name = export_geometry_named_expressions(sim_part, results_dir, run_id, lw)
    if ok_part and part_exp_path:
        exprs_part = parse_exp_file_to_dict(part_exp_path)
        append_named_exprs_row(results_dir, run_id, run_dt, exprs_part, lw, source="PART", part_name=part_name)


if __name__ == "__main__":
    main()
examples/Zernike_old_reference/zernike_Post_Script_NX.py (new file, 1012 lines; diff suppressed because it is too large)
examples/optimization_config_zernike_mirror.json (new file, 124 lines)
{
  "$schema": "Atomizer Optimization Config - Telescope Mirror Zernike Optimization",
  "_description": "Example configuration for optimizing telescope mirror support structures using Zernike wavefront error metrics",

  "study_name": "mirror_wfe_optimization",

  "design_variables": [
    {
      "name": "support_ring_radius",
      "expression_name": "support_radius",
      "min": 150.0,
      "max": 250.0,
      "units": "mm",
      "description": "Radial position of support ring"
    },
    {
      "name": "support_count",
      "expression_name": "n_supports",
      "min": 3,
      "max": 12,
      "type": "integer",
      "units": "count",
      "description": "Number of support points"
    },
    {
      "name": "support_stiffness",
      "expression_name": "k_support",
      "min": 1000.0,
      "max": 50000.0,
      "units": "N/mm",
      "description": "Support spring stiffness"
    }
  ],
|
||||
|
||||
"objectives": [
|
||||
{
|
||||
"name": "filtered_rms_20deg",
|
||||
"description": "Filtered RMS WFE at 20 deg elevation (operational)",
|
||||
"direction": "minimize",
|
||||
"extractor": "zernike",
|
||||
"extractor_config": {
|
||||
"subcase": "20",
|
||||
"metric": "filtered_rms_nm",
|
||||
"displacement_unit": "mm",
|
||||
"n_modes": 50,
|
||||
"filter_orders": 4
|
||||
}
|
||||
},
|
||||
{
|
||||
"name": "mass",
|
||||
"description": "Total mirror assembly mass",
|
||||
"direction": "minimize",
|
||||
"extractor": "mass_from_expression",
|
||||
"extractor_config": {
|
||||
"expression_name": "total_mass"
|
||||
}
|
||||
}
|
||||
],
|
||||
|
||||
"constraints": [
|
||||
{
|
||||
"name": "max_stress",
|
||||
"description": "Maximum von Mises stress in mirror",
|
||||
"type": "upper_bound",
|
||||
"threshold": 5.0,
|
||||
"units": "MPa",
|
||||
"extractor": "von_mises_stress",
|
||||
"extractor_config": {
|
||||
"subcase": "90"
|
||||
}
|
||||
},
|
||||
{
|
||||
"name": "astigmatism_limit",
|
||||
"description": "Astigmatism RMS at polishing orientation",
|
||||
"type": "upper_bound",
|
||||
"threshold": 50.0,
|
||||
"units": "nm",
|
||||
"extractor": "zernike",
|
||||
"extractor_config": {
|
||||
"subcase": "90",
|
||||
"metric": "astigmatism_rms_nm"
|
||||
}
|
||||
}
|
||||
],
|
||||
|
||||
"optimization_settings": {
|
||||
"n_trials": 200,
|
||||
"sampler": "NSGA-II",
|
||||
"protocol": 11,
|
||||
"n_startup_trials": 30,
|
||||
"seed": 42
|
||||
},
|
||||
|
||||
"simulation_settings": {
|
||||
"solver": "NX Nastran",
|
||||
"solution_name": null,
|
||||
"required_subcases": [20, 40, 60, 90],
|
||||
"op2_pattern": "*-solution_1.op2"
|
||||
},
|
||||
|
||||
"zernike_settings": {
|
||||
"_description": "Global Zernike analysis settings",
|
||||
"n_modes": 50,
|
||||
"filter_low_orders": 4,
|
||||
"displacement_unit": "mm",
|
||||
"reference_subcase": "20",
|
||||
"polishing_subcase": "90",
|
||||
"operational_subcases": ["20", "40", "60"],
|
||||
"metrics_to_log": [
|
||||
"global_rms_nm",
|
||||
"filtered_rms_nm",
|
||||
"astigmatism_rms_nm",
|
||||
"coma_rms_nm",
|
||||
"trefoil_rms_nm",
|
||||
"spherical_nm"
|
||||
]
|
||||
},
|
||||
|
||||
"output_settings": {
|
||||
"save_zernike_coefficients": true,
|
||||
"generate_html_reports": true,
|
||||
"export_exp_file": true
|
||||
}
|
||||
}
|
||||
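A config like the one above is plain JSON; a small loader that sanity-checks design-variable bounds before a study starts could look like the following (the function name and the specific checks are illustrative, not part of the shipped engine):

```python
import json

def load_optimization_config(path):
    """Load a study config and validate design-variable bounds.

    Illustrative sketch: checks only that min < max for each design
    variable and that integer variables have integer-valued bounds.
    """
    with open(path, 'r', encoding='utf-8') as fh:
        cfg = json.load(fh)
    for dv in cfg.get('design_variables', []):
        if dv['min'] >= dv['max']:
            raise ValueError(f"design variable {dv['name']!r}: min must be < max")
        if dv.get('type') == 'integer':
            if not (float(dv['min']).is_integer() and float(dv['max']).is_integer()):
                raise ValueError(f"design variable {dv['name']!r}: integer bounds required")
    return cfg
```

Failing fast here is cheaper than discovering a bad bound after a solver run.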
File diff suppressed because it is too large
@@ -1 +1,25 @@
-"""Core extractor library for Atomizer."""
+"""Core extractor library for Atomizer.
+
+Available extractors:
+- Displacement: extract_displacement
+- Stress: extract_solid_stress (von Mises)
+- Frequency: extract_frequency
+- Mass: extract_mass_from_expression, extract_mass_from_op2
+- Zernike: extract_zernike_from_op2, ZernikeExtractor (telescope mirrors)
+"""
+
+# Zernike extractor for telescope mirror optimization
+from optimization_engine.extractors.extract_zernike import (
+    ZernikeExtractor,
+    extract_zernike_from_op2,
+    extract_zernike_filtered_rms,
+    extract_zernike_relative_rms,
+)
+
+__all__ = [
+    # Zernike (telescope mirrors)
+    'ZernikeExtractor',
+    'extract_zernike_from_op2',
+    'extract_zernike_filtered_rms',
+    'extract_zernike_relative_rms',
+]
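The config files reference extractors by name (`"extractor": "zernike"`, `"mass_from_expression"`, ...). One way an engine can resolve those strings to the callables exported above is a small registry; the sketch below is hypothetical — the registry name, decorator, and the stub extractor are illustrative, not the real dispatch code:

```python
from typing import Callable, Dict

# Hypothetical name -> callable registry; the real engine may differ.
EXTRACTOR_REGISTRY: Dict[str, Callable] = {}

def register_extractor(name: str):
    """Decorator registering an extractor under its config-facing name."""
    def wrap(fn: Callable) -> Callable:
        EXTRACTOR_REGISTRY[name] = fn
        return fn
    return wrap

@register_extractor("zernike")
def zernike_extractor(op2_file, **config):
    # Stub: the real engine would call extract_zernike_from_op2(op2_file, **config)
    return {"filtered_rms_nm": 0.0}

def resolve_extractor(name: str) -> Callable:
    """Look up an extractor by its config name, with a helpful error."""
    try:
        return EXTRACTOR_REGISTRY[name]
    except KeyError:
        raise KeyError(f"Unknown extractor {name!r}; known: {sorted(EXTRACTOR_REGISTRY)}")
```

Keeping the mapping in one place means a typo in a config fails with the list of known names instead of an AttributeError deep in a trial.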
optimization_engine/extractors/extract_zernike.py (new file, 860 lines)
@@ -0,0 +1,860 @@
"""
Zernike Coefficient Extractor for Telescope Mirror Optimization
================================================================

Extracts Zernike polynomial coefficients from OP2 displacement results
for optical surface quality analysis. Designed for telescope mirror
optimization where wavefront error (WFE) metrics are critical.

Key Features:
- Noll-indexed Zernike polynomials (standard optical convention)
- Multi-subcase support (different gravity orientations: 20, 40, 60, 90 deg)
- Global and filtered RMS wavefront error
- Individual aberration magnitudes (astigmatism, coma, trefoil, spherical)
- Relative metrics between subcases (e.g., operational vs polishing orientation)

Usage:
    from optimization_engine.extractors.extract_zernike import (
        extract_zernike_from_op2,
        ZernikeExtractor
    )

    # Simple usage - get filtered RMS for optimization objective
    result = extract_zernike_from_op2(op2_file, subcase=20)
    rms_filtered = result['filtered_rms_nm']

    # Full extractor for detailed analysis
    extractor = ZernikeExtractor(op2_file, bdf_file)
    metrics = extractor.extract_all_subcases()

Author: Atomizer Framework (adapted from telescope mirror analysis scripts)
"""

from pathlib import Path
from typing import Dict, Any, Optional, List, Tuple, Union
import numpy as np
from math import factorial
from numpy.linalg import LinAlgError

try:
    from pyNastran.op2.op2 import OP2
    from pyNastran.bdf.bdf import BDF
except ImportError:
    raise ImportError("pyNastran is required. Install with: pip install pyNastran")


# ============================================================================
# Configuration
# ============================================================================

DEFAULT_N_MODES = 50          # Number of Zernike modes to fit
DEFAULT_FILTER_ORDERS = 4     # Filter first N modes (piston, tip, tilt, defocus)
DEFAULT_CHUNK_SIZE = 100000   # For memory-efficient processing of large meshes

# Standard telescope orientations (gravity angles in degrees)
STANDARD_SUBCASES = [20, 40, 60, 90]

# Displacement unit conversions (to nanometers for WFE)
UNIT_TO_NM = {
    'mm': 1e6,   # 1 mm = 1e6 nm
    'm': 1e9,    # 1 m = 1e9 nm
    'um': 1e3,   # 1 um = 1e3 nm
    'nm': 1.0,   # already nm
}


# ============================================================================
# Zernike Polynomial Mathematics
# ============================================================================

def noll_indices(j: int) -> Tuple[int, int]:
    """
    Convert Noll index j to radial order n and azimuthal frequency m.

    The Noll indexing scheme is the standard convention in optics.
    j=1: Piston, j=2,3: Tip/Tilt, j=4: Defocus, j=5,6: Astigmatism, etc.

    Args:
        j: Noll index (1-based)

    Returns:
        (n, m): Radial order and azimuthal frequency
    """
    if j < 1:
        raise ValueError("Noll index j must be >= 1")

    count = 0
    n = 0
    while True:
        if n == 0:
            ms = [0]
        elif n % 2 == 0:
            ms = [0] + [m for k in range(1, n // 2 + 1) for m in (-2 * k, 2 * k)]
        else:
            ms = [m for k in range(0, (n + 1) // 2) for m in (-(2 * k + 1), (2 * k + 1))]
        for m in ms:
            count += 1
            if count == j:
                return n, m
        n += 1


def zernike_radial(n: int, m: int, r: np.ndarray) -> np.ndarray:
    """
    Compute the radial component R_n^m(r) of the Zernike polynomial.

    Args:
        n: Radial order
        m: Azimuthal frequency (absolute value used)
        r: Radial coordinates (normalized to unit disk)

    Returns:
        Radial polynomial evaluated at r
    """
    R = np.zeros_like(r)
    m_abs = abs(m)

    for s in range((n - m_abs) // 2 + 1):
        coef = ((-1)**s * factorial(n - s) /
                (factorial(s) *
                 factorial((n + m_abs) // 2 - s) *
                 factorial((n - m_abs) // 2 - s)))
        R += coef * r**(n - 2 * s)

    return R

def zernike_noll(j: int, r: np.ndarray, theta: np.ndarray) -> np.ndarray:
    """
    Evaluate Noll-indexed Zernike polynomial Z_j(r, theta).

    Args:
        j: Noll index
        r: Radial coordinates (normalized to unit disk)
        theta: Angular coordinates (radians)

    Returns:
        Zernike polynomial values at (r, theta)
    """
    n, m = noll_indices(j)
    R = zernike_radial(n, m, r)

    if m == 0:
        return R
    elif m > 0:
        return R * np.cos(m * theta)
    else:
        return R * np.sin(-m * theta)


def zernike_name(j: int) -> str:
    """
    Get common optical name for Zernike mode.

    Args:
        j: Noll index

    Returns:
        Human-readable name (e.g., "Defocus", "Astigmatism 0 deg")
    """
    n, m = noll_indices(j)

    names = {
        (0, 0): "Piston",
        (1, -1): "Tilt X",
        (1, 1): "Tilt Y",
        (2, 0): "Defocus",
        (2, -2): "Astigmatism 45 deg",
        (2, 2): "Astigmatism 0 deg",
        (3, -1): "Coma X",
        (3, 1): "Coma Y",
        (3, -3): "Trefoil X",
        (3, 3): "Trefoil Y",
        (4, 0): "Primary Spherical",
        (4, -2): "Secondary Astig X",
        (4, 2): "Secondary Astig Y",
        (4, -4): "Quadrafoil X",
        (4, 4): "Quadrafoil Y",
        (5, -1): "Secondary Coma X",
        (5, 1): "Secondary Coma Y",
        (5, -3): "Secondary Trefoil X",
        (5, 3): "Secondary Trefoil Y",
        (5, -5): "Pentafoil X",
        (5, 5): "Pentafoil Y",
        (6, 0): "Secondary Spherical",
    }

    return names.get((n, m), f"Z(n={n}, m={m})")


def zernike_label(j: int) -> str:
    """Full label for Zernike mode: J{j} - Name (n=, m=)"""
    n, m = noll_indices(j)
    return f"J{j:02d} - {zernike_name(j)} (n={n}, m={m})"


# ============================================================================
# Zernike Coefficient Fitting
# ============================================================================

def compute_zernike_coefficients(
    x: np.ndarray,
    y: np.ndarray,
    values: np.ndarray,
    n_modes: int = DEFAULT_N_MODES,
    chunk_size: int = DEFAULT_CHUNK_SIZE
) -> Tuple[np.ndarray, float]:
    """
    Fit Zernike coefficients to surface data using least-squares.

    Uses chunked processing for memory efficiency with large meshes.
    Points outside the unit disk (after centering/normalization) are excluded.

    Args:
        x, y: Node coordinates (will be centered and normalized)
        values: Surface values at each node (e.g., WFE in nm)
        n_modes: Number of Zernike modes to fit
        chunk_size: Chunk size for memory-efficient processing

    Returns:
        (coefficients, R_max): Zernike coefficients and normalization radius
    """
    # Center coordinates
    x_centered = x - np.mean(x)
    y_centered = y - np.mean(y)

    # Normalize to unit disk
    R_max = float(np.max(np.hypot(x_centered, y_centered)))
    r = np.hypot(x_centered / R_max, y_centered / R_max).astype(np.float32)
    theta = np.arctan2(y_centered, x_centered).astype(np.float32)

    # Mask: inside unit disk and valid values
    mask = (r <= 1.0) & ~np.isnan(values)
    if not np.any(mask):
        raise RuntimeError("No valid points inside unit disk for Zernike fitting.")

    idx = np.nonzero(mask)[0]
    m = int(n_modes)

    # Normal equations: (Z^T Z) c = Z^T v
    # Build incrementally for memory efficiency
    G = np.zeros((m, m), dtype=np.float64)  # Z^T Z
    h = np.zeros((m,), dtype=np.float64)    # Z^T v
    v = values.astype(np.float64)

    for start in range(0, len(idx), chunk_size):
        chunk_idx = idx[start:start + chunk_size]
        r_chunk = r[chunk_idx]
        theta_chunk = theta[chunk_idx]
        v_chunk = v[chunk_idx]

        # Build Zernike basis for this chunk
        Z_chunk = np.column_stack([
            zernike_noll(j, r_chunk, theta_chunk).astype(np.float32)
            for j in range(1, m + 1)
        ])

        # Accumulate normal equations
        G += (Z_chunk.T @ Z_chunk).astype(np.float64)
        h += (Z_chunk.T @ v_chunk).astype(np.float64)

    # Solve normal equations
    try:
        coeffs = np.linalg.solve(G, h)
    except LinAlgError:
        coeffs = np.linalg.lstsq(G, h, rcond=None)[0]

    return coeffs, R_max


# ============================================================================
# RMS Calculations
# ============================================================================

def compute_rms_metrics(
    x: np.ndarray,
    y: np.ndarray,
    wfe: np.ndarray,
    n_modes: int = DEFAULT_N_MODES,
    filter_orders: int = DEFAULT_FILTER_ORDERS
) -> Dict[str, float]:
    """
    Compute global and filtered RMS wavefront error.

    Args:
        x, y: Node coordinates
        wfe: Wavefront error values (nm)
        n_modes: Number of Zernike modes to fit
        filter_orders: Number of low-order modes to filter (typically 4)

    Returns:
        Dict with 'global_rms_nm' and 'filtered_rms_nm'
    """
    coeffs, R_max = compute_zernike_coefficients(x, y, wfe, n_modes)

    # Reconstruct filtered WFE (remove low-order modes)
    x_c = x - np.mean(x)
    y_c = y - np.mean(y)
    r = np.hypot(x_c / R_max, y_c / R_max)
    theta = np.arctan2(y_c, x_c)

    # Build Zernike basis for low-order modes only
    Z_low = np.column_stack([
        zernike_noll(j, r, theta) for j in range(1, filter_orders + 1)
    ])

    # Subtract low-order contribution
    wfe_filtered = wfe - Z_low @ coeffs[:filter_orders]

    global_rms = float(np.sqrt(np.mean(wfe**2)))
    filtered_rms = float(np.sqrt(np.mean(wfe_filtered**2)))

    return {
        'global_rms_nm': global_rms,
        'filtered_rms_nm': filtered_rms,
        'coefficients': coeffs,
        'R_max': R_max
    }


def compute_aberration_magnitudes(coeffs: np.ndarray) -> Dict[str, float]:
    """
    Compute RMS magnitudes of common optical aberrations.

    Args:
        coeffs: Zernike coefficients (at least 11 modes)

    Returns:
        Dict with aberration RMS values in nm
    """
    if len(coeffs) < 11:
        raise ValueError("Need at least 11 Zernike modes for aberration analysis")

    return {
        'defocus_nm': float(abs(coeffs[3])),                                # J4
        'astigmatism_rms_nm': float(np.sqrt(coeffs[4]**2 + coeffs[5]**2)),  # J5+J6
        'coma_rms_nm': float(np.sqrt(coeffs[6]**2 + coeffs[7]**2)),         # J7+J8
        'trefoil_rms_nm': float(np.sqrt(coeffs[8]**2 + coeffs[9]**2)),      # J9+J10
        'spherical_nm': float(abs(coeffs[10])),                             # J11
    }


# ============================================================================
# OP2/BDF Data Extraction
# ============================================================================

def read_node_geometry(bdf_path: Path) -> Dict[int, np.ndarray]:
    """
    Read node coordinates from BDF/DAT file.

    Args:
        bdf_path: Path to .bdf or .dat file

    Returns:
        Dict mapping node ID to [x, y, z] coordinates
    """
    bdf = BDF()
    bdf.read_bdf(str(bdf_path))

    return {
        int(nid): node.get_position()
        for nid, node in bdf.nodes.items()
    }


def find_geometry_file(op2_path: Path) -> Path:
    """
    Find matching BDF/DAT file for an OP2.

    Looks for same-basename first, then any .dat/.bdf in same folder.

    Args:
        op2_path: Path to OP2 file

    Returns:
        Path to geometry file
    """
    folder = op2_path.parent
    base = op2_path.stem

    # Try same basename
    for ext in ['.dat', '.bdf']:
        cand = folder / (base + ext)
        if cand.exists():
            return cand

    # Try any geometry file
    for name in folder.iterdir():
        if name.suffix.lower() in ['.dat', '.bdf']:
            return name

    raise FileNotFoundError(f"No .dat or .bdf geometry file found for {op2_path}")


def extract_displacements_by_subcase(
    op2_path: Path,
    required_subcases: Optional[List[int]] = None
) -> Dict[str, Dict[str, np.ndarray]]:
    """
    Extract displacement data from OP2 organized by subcase.

    Args:
        op2_path: Path to OP2 file
        required_subcases: List of required subcases (e.g., [20, 40, 60, 90])

    Returns:
        Dict keyed by subcase label: {'20': {'node_ids': array, 'disp': array}, ...}
    """
    op2 = OP2()
    op2.read_op2(str(op2_path))

    if not op2.displacements:
        raise RuntimeError("No displacement data found in OP2 file")

    result = {}

    for key, darr in op2.displacements.items():
        data = darr.data
        dmat = data[0] if data.ndim == 3 else (data if data.ndim == 2 else None)
        if dmat is None:
            continue

        ngt = darr.node_gridtype.astype(int)
        node_ids = ngt if ngt.ndim == 1 else ngt[:, 0]

        # Try to identify subcase from subtitle or isubcase
        subtitle = getattr(darr, 'subtitle', None)
        isubcase = getattr(darr, 'isubcase', None)

        # Extract numeric from subtitle
        label = None
        if isinstance(subtitle, str):
            import re
            m = re.search(r'-?\d+', subtitle)
            if m:
                label = m.group(0)

        if label is None and isinstance(isubcase, int):
            label = str(isubcase)

        if label:
            result[label] = {
                'node_ids': node_ids.astype(int),
                'disp': dmat.copy()
            }

    # Validate required subcases if specified
    if required_subcases:
        missing = [str(s) for s in required_subcases if str(s) not in result]
        if missing:
            available = list(result.keys())
            raise RuntimeError(
                f"Required subcases {missing} not found. Available: {available}"
            )

    return result


# ============================================================================
# Main Extractor Class
# ============================================================================

class ZernikeExtractor:
    """
    Complete Zernike analysis extractor for telescope mirror optimization.

    This class handles:
    - Loading OP2 displacement results
    - Matching with BDF geometry
    - Computing Zernike coefficients and RMS metrics
    - Multi-subcase analysis (different gravity orientations)
    - Relative metrics between subcases

    Example usage in optimization:
        extractor = ZernikeExtractor(op2_file, bdf_file)

        # For single-objective optimization (minimize filtered RMS at 20 deg)
        result = extractor.extract_subcase('20')
        objective = result['filtered_rms_nm']

        # For multi-subcase optimization
        all_results = extractor.extract_all_subcases()
    """

    def __init__(
        self,
        op2_path: Union[str, Path],
        bdf_path: Optional[Union[str, Path]] = None,
        displacement_unit: str = 'mm',
        n_modes: int = DEFAULT_N_MODES,
        filter_orders: int = DEFAULT_FILTER_ORDERS
    ):
        """
        Initialize Zernike extractor.

        Args:
            op2_path: Path to OP2 results file
            bdf_path: Path to BDF/DAT geometry file (auto-detected if None)
            displacement_unit: Unit of displacement in OP2 ('mm', 'm', 'um', 'nm')
            n_modes: Number of Zernike modes to fit
            filter_orders: Number of low-order modes to filter
        """
        self.op2_path = Path(op2_path)
        self.bdf_path = Path(bdf_path) if bdf_path else find_geometry_file(self.op2_path)
        self.displacement_unit = displacement_unit
        self.n_modes = n_modes
        self.filter_orders = filter_orders

        # Unit conversion factor (displacement to nm)
        self.nm_scale = UNIT_TO_NM.get(displacement_unit.lower(), 1e6)

        # WFE = 2 * surface displacement (optical convention)
        self.wfe_factor = 2.0 * self.nm_scale

        # Lazy-loaded data
        self._node_geo = None
        self._displacements = None

    @property
    def node_geometry(self) -> Dict[int, np.ndarray]:
        """Lazy-load node geometry from BDF."""
        if self._node_geo is None:
            self._node_geo = read_node_geometry(self.bdf_path)
        return self._node_geo

    @property
    def displacements(self) -> Dict[str, Dict[str, np.ndarray]]:
        """Lazy-load displacements from OP2."""
        if self._displacements is None:
            self._displacements = extract_displacements_by_subcase(self.op2_path)
        return self._displacements

    def _build_dataframe(
        self,
        subcase_label: str
    ) -> Tuple[np.ndarray, np.ndarray, np.ndarray]:
        """
        Build coordinate and WFE arrays for a subcase.

        Returns:
            (X, Y, WFE_nm): Arrays of coordinates and wavefront error
        """
        if subcase_label not in self.displacements:
            available = list(self.displacements.keys())
            raise ValueError(f"Subcase '{subcase_label}' not found. Available: {available}")

        data = self.displacements[subcase_label]
        node_ids = data['node_ids']
        disp = data['disp']

        # Build arrays
        X, Y, WFE = [], [], []
        for nid, vec in zip(node_ids, disp):
            geo = self.node_geometry.get(int(nid))
            if geo is None:
                continue

            X.append(geo[0])
            Y.append(geo[1])
            # Z-displacement to WFE (nm)
            wfe = vec[2] * self.wfe_factor
            WFE.append(wfe)

        return np.array(X), np.array(Y), np.array(WFE)

    def extract_subcase(
        self,
        subcase_label: str,
        include_coefficients: bool = False
    ) -> Dict[str, Any]:
        """
        Extract Zernike metrics for a single subcase.

        Args:
            subcase_label: Subcase identifier (e.g., '20', '90')
            include_coefficients: Whether to include all Zernike coefficients

        Returns:
            Dict with RMS metrics, aberrations, and optionally coefficients
        """
        X, Y, WFE = self._build_dataframe(subcase_label)

        # Compute RMS metrics
        rms_result = compute_rms_metrics(
            X, Y, WFE, self.n_modes, self.filter_orders
        )

        # Compute aberration magnitudes
        aberrations = compute_aberration_magnitudes(rms_result['coefficients'])

        result = {
            'subcase': subcase_label,
            'global_rms_nm': rms_result['global_rms_nm'],
            'filtered_rms_nm': rms_result['filtered_rms_nm'],
            'n_nodes': len(X),
            **aberrations
        }

        if include_coefficients:
            result['coefficients'] = rms_result['coefficients'].tolist()
            result['coefficient_labels'] = [zernike_label(j) for j in range(1, self.n_modes + 1)]

        return result

    def extract_relative(
        self,
        target_subcase: str,
        reference_subcase: str
    ) -> Dict[str, Any]:
        """
        Extract Zernike metrics relative to a reference subcase.

        Computes: WFE_relative = WFE_target - WFE_reference

        Args:
            target_subcase: Subcase to analyze
            reference_subcase: Reference subcase to subtract

        Returns:
            Dict with relative RMS metrics and aberrations
        """
        X_t, Y_t, WFE_t = self._build_dataframe(target_subcase)
        X_r, Y_r, WFE_r = self._build_dataframe(reference_subcase)

        # Build node-to-index mapping for reference
        target_data = self.displacements[target_subcase]
        ref_data = self.displacements[reference_subcase]

        ref_node_to_idx = {
            int(nid): i for i, nid in enumerate(ref_data['node_ids'])
        }

        # Compute relative WFE for common nodes
        X_rel, Y_rel, WFE_rel = [], [], []

        for i, nid in enumerate(target_data['node_ids']):
            nid = int(nid)
            if nid not in ref_node_to_idx:
                continue

            ref_idx = ref_node_to_idx[nid]
            geo = self.node_geometry.get(nid)
            if geo is None:
                continue

            X_rel.append(geo[0])
            Y_rel.append(geo[1])

            target_wfe = target_data['disp'][i, 2] * self.wfe_factor
            ref_wfe = ref_data['disp'][ref_idx, 2] * self.wfe_factor
            WFE_rel.append(target_wfe - ref_wfe)

        X_rel = np.array(X_rel)
        Y_rel = np.array(Y_rel)
        WFE_rel = np.array(WFE_rel)

        # Compute metrics on relative WFE
        rms_result = compute_rms_metrics(
            X_rel, Y_rel, WFE_rel, self.n_modes, self.filter_orders
        )
        aberrations = compute_aberration_magnitudes(rms_result['coefficients'])

        return {
            'target_subcase': target_subcase,
            'reference_subcase': reference_subcase,
            'relative_global_rms_nm': rms_result['global_rms_nm'],
            'relative_filtered_rms_nm': rms_result['filtered_rms_nm'],
            'n_common_nodes': len(X_rel),
            **{f'relative_{k}': v for k, v in aberrations.items()}
        }

    def extract_all_subcases(
        self,
        reference_subcase: Optional[str] = '20'
    ) -> Dict[str, Dict[str, Any]]:
        """
        Extract metrics for all available subcases.

        Args:
            reference_subcase: Reference for relative calculations (None to skip)

        Returns:
            Dict mapping subcase label to metrics dict
        """
        results = {}

        for label in self.displacements.keys():
            results[label] = self.extract_subcase(label)

            # Add relative metrics if reference specified
            if reference_subcase and label != reference_subcase:
                try:
                    rel = self.extract_relative(label, reference_subcase)
                    results[label].update({
                        f'rel_{reference_subcase}_{k}': v
                        for k, v in rel.items()
                        if k.startswith('relative_')
                    })
                except Exception as e:
                    results[label][f'rel_{reference_subcase}_error'] = str(e)

        return results


# ============================================================================
# Convenience Functions for Optimization
# ============================================================================

def extract_zernike_from_op2(
    op2_file: Union[str, Path],
    bdf_file: Optional[Union[str, Path]] = None,
    subcase: Union[int, str] = 1,
    displacement_unit: str = 'mm',
    n_modes: int = DEFAULT_N_MODES,
    filter_orders: int = DEFAULT_FILTER_ORDERS
) -> Dict[str, Any]:
    """
    Convenience function to extract Zernike metrics from OP2.

    This is the main entry point for optimization objectives.

    Args:
        op2_file: Path to OP2 results file
        bdf_file: Path to BDF geometry (auto-detected if None)
        subcase: Subcase identifier
        displacement_unit: Unit of displacement in OP2
        n_modes: Number of Zernike modes
        filter_orders: Low-order modes to filter

    Returns:
        Dict with:
        - 'global_rms_nm': Global RMS WFE in nanometers
        - 'filtered_rms_nm': Filtered RMS (low orders removed)
        - 'defocus_nm', 'astigmatism_rms_nm', etc.: Individual aberrations
    """
    extractor = ZernikeExtractor(
        op2_file, bdf_file, displacement_unit, n_modes, filter_orders
    )
    return extractor.extract_subcase(str(subcase))


def extract_zernike_filtered_rms(
    op2_file: Union[str, Path],
    bdf_file: Optional[Union[str, Path]] = None,
    subcase: Union[int, str] = 1,
    **kwargs
) -> float:
    """
    Extract filtered RMS WFE - the primary metric for mirror optimization.

    Filtered RMS removes piston, tip, tilt, and defocus (modes 1-4),
    which can be corrected by alignment and focus adjustment.

    Args:
        op2_file: Path to OP2 file
        bdf_file: Path to BDF geometry (auto-detected if None)
        subcase: Subcase identifier
        **kwargs: Additional arguments for ZernikeExtractor

    Returns:
        Filtered RMS WFE in nanometers
    """
    result = extract_zernike_from_op2(op2_file, bdf_file, subcase, **kwargs)
    return result['filtered_rms_nm']


def extract_zernike_relative_rms(
    op2_file: Union[str, Path],
    target_subcase: Union[int, str],
    reference_subcase: Union[int, str],
    bdf_file: Optional[Union[str, Path]] = None,
    **kwargs
) -> float:
    """
    Extract relative filtered RMS between two subcases.

    Useful for analyzing gravity-induced deformation relative to
    a reference orientation (e.g., polishing position).

    Args:
        op2_file: Path to OP2 file
        target_subcase: Subcase to analyze
        reference_subcase: Reference subcase
        bdf_file: Path to BDF geometry
        **kwargs: Additional arguments for ZernikeExtractor

    Returns:
        Relative filtered RMS WFE in nanometers
    """
    extractor = ZernikeExtractor(op2_file, bdf_file, **kwargs)
    result = extractor.extract_relative(str(target_subcase), str(reference_subcase))
    return result['relative_filtered_rms_nm']


# ============================================================================
# Module Exports
# ============================================================================

__all__ = [
    # Main extractor class
    'ZernikeExtractor',

    # Convenience functions for optimization
    'extract_zernike_from_op2',
    'extract_zernike_filtered_rms',
    'extract_zernike_relative_rms',

    # Zernike utilities (for advanced use)
    'compute_zernike_coefficients',
    'compute_rms_metrics',
    'compute_aberration_magnitudes',
    'noll_indices',
    'zernike_noll',
    'zernike_name',
    'zernike_label',
]


if __name__ == '__main__':
    # Example/test usage
    import sys

    if len(sys.argv) > 1:
        op2_file = Path(sys.argv[1])

        print(f"Analyzing: {op2_file}")

        try:
            extractor = ZernikeExtractor(op2_file)

            print(f"\nAvailable subcases: {list(extractor.displacements.keys())}")

            results = extractor.extract_all_subcases()

            for label, metrics in results.items():
                print(f"\n=== Subcase {label} ===")
                print(f"  Global RMS:   {metrics['global_rms_nm']:.2f} nm")
                print(f"  Filtered RMS: {metrics['filtered_rms_nm']:.2f} nm")
                print(f"  Astigmatism:  {metrics['astigmatism_rms_nm']:.2f} nm")
                print(f"  Coma:         {metrics['coma_rms_nm']:.2f} nm")
                print(f"  Trefoil:      {metrics['trefoil_rms_nm']:.2f} nm")
                print(f"  Spherical:    {metrics['spherical_nm']:.2f} nm")

        except Exception as e:
            print(f"Error: {e}")
            sys.exit(1)
    else:
        print("Usage: python extract_zernike.py <op2_file>")
        print("\nThis module provides Zernike coefficient extraction for telescope mirror optimization.")
        print("\nExample in optimization config:")
        print('  "objectives": [')
        print('    {')
        print('      "name": "filtered_rms",')
        print('      "extractor": "zernike",')
        print('      "direction": "minimize",')
        print('      "extractor_config": {')
        print('        "subcase": "20",')
        print('        "metric": "filtered_rms_nm"')
        print('      }')
        print('    }')
        print('  ]')
403
optimization_engine/extractors/zernike_helpers.py
Normal file
@@ -0,0 +1,403 @@
"""
Zernike Helper Functions for Atomizer Optimization
===================================================

Convenience wrappers and utilities for using Zernike analysis
in optimization studies. These helpers simplify integration with
the standard Atomizer optimization patterns.

Usage in run_optimization.py:
    from optimization_engine.extractors.zernike_helpers import (
        create_zernike_objective,
        ZernikeObjectiveBuilder
    )

    # Simple: create objective function
    zernike_obj = create_zernike_objective(
        op2_finder=lambda: sim_dir / "model-solution_1.op2",
        subcase="20",
        metric="filtered_rms_nm"
    )

    # Use in Optuna trial
    rms = zernike_obj()
"""

from pathlib import Path
from typing import Callable, Dict, Any, Optional, Union, List
import logging

from optimization_engine.extractors.extract_zernike import (
    ZernikeExtractor,
    extract_zernike_from_op2,
    extract_zernike_filtered_rms,
)

logger = logging.getLogger(__name__)

def create_zernike_objective(
    op2_finder: Callable[[], Path],
    bdf_finder: Optional[Callable[[], Path]] = None,
    subcase: Union[int, str] = "20",
    metric: str = "filtered_rms_nm",
    displacement_unit: str = "mm",
    **kwargs
) -> Callable[[], float]:
    """
    Create a Zernike objective function for optimization.

    This factory creates a callable that:
    1. Finds the OP2 file (using op2_finder)
    2. Extracts Zernike metrics
    3. Returns the specified metric value

    Args:
        op2_finder: Callable that returns path to current OP2 file
        bdf_finder: Callable that returns path to BDF file (auto-detect if None)
        subcase: Subcase to analyze (e.g., "20" for 20 deg elevation)
        metric: Metric to return (see available metrics below)
        displacement_unit: Unit of displacement in OP2 file
        **kwargs: Additional arguments for ZernikeExtractor

    Returns:
        Callable that returns the metric value

    Available metrics:
        - global_rms_nm: Global RMS wavefront error
        - filtered_rms_nm: Filtered RMS (low orders removed)
        - defocus_nm: Defocus aberration
        - astigmatism_rms_nm: Combined astigmatism
        - coma_rms_nm: Combined coma
        - trefoil_rms_nm: Combined trefoil
        - spherical_nm: Primary spherical aberration

    Example:
        op2_finder = lambda: Path("sim_dir") / "model-solution_1.op2"
        objective = create_zernike_objective(op2_finder, subcase="20")

        # In optimization loop
        rms_value = objective()  # Returns filtered RMS in nm
    """
    def evaluate() -> float:
        op2_path = op2_finder()
        bdf_path = bdf_finder() if bdf_finder else None

        result = extract_zernike_from_op2(
            op2_path,
            bdf_path,
            subcase=subcase,
            displacement_unit=displacement_unit,
            **kwargs
        )

        if metric not in result:
            available = [k for k in result.keys() if isinstance(result[k], (int, float))]
            raise ValueError(f"Metric '{metric}' not found. Available: {available}")

        return result[metric]

    return evaluate

def create_relative_zernike_objective(
    op2_finder: Callable[[], Path],
    target_subcase: Union[int, str],
    reference_subcase: Union[int, str],
    bdf_finder: Optional[Callable[[], Path]] = None,
    metric: str = "relative_filtered_rms_nm",
    **kwargs
) -> Callable[[], float]:
    """
    Create objective for relative Zernike metrics between subcases.

    Useful for minimizing gravity-induced deformation relative to
    a reference orientation (e.g., polishing position at 90 deg).

    Args:
        op2_finder: Callable returning OP2 path
        target_subcase: Subcase to analyze
        reference_subcase: Reference subcase to subtract
        bdf_finder: Optional BDF path finder
        metric: Relative metric to return
        **kwargs: Additional ZernikeExtractor arguments

    Returns:
        Callable that returns relative metric value
    """
    def evaluate() -> float:
        op2_path = op2_finder()
        bdf_path = bdf_finder() if bdf_finder else None

        extractor = ZernikeExtractor(op2_path, bdf_path, **kwargs)
        result = extractor.extract_relative(
            str(target_subcase),
            str(reference_subcase)
        )

        if metric not in result:
            available = [k for k in result.keys() if isinstance(result[k], (int, float))]
            raise ValueError(f"Metric '{metric}' not found. Available: {available}")

        return result[metric]

    return evaluate

class ZernikeObjectiveBuilder:
    """
    Builder for complex Zernike objectives with multiple subcases.

    This is useful for multi-subcase optimization where you want
    to combine metrics from different gravity orientations.

    Example:
        builder = ZernikeObjectiveBuilder(
            op2_finder=lambda: sim_dir / "model.op2"
        )

        # Add objectives for different subcases
        builder.add_subcase_objective("20", "filtered_rms_nm", weight=1.0)
        builder.add_subcase_objective("40", "filtered_rms_nm", weight=0.5)
        builder.add_subcase_objective("60", "filtered_rms_nm", weight=0.5)

        # Create combined objective
        objective = builder.build_weighted_sum()
        combined_rms = objective()  # Returns weighted sum
    """

    def __init__(
        self,
        op2_finder: Callable[[], Path],
        bdf_finder: Optional[Callable[[], Path]] = None,
        displacement_unit: str = "mm",
        **kwargs
    ):
        self.op2_finder = op2_finder
        self.bdf_finder = bdf_finder
        self.displacement_unit = displacement_unit
        self.kwargs = kwargs
        self.objectives: List[Dict[str, Any]] = []
        self._extractor = None

    def add_subcase_objective(
        self,
        subcase: Union[int, str],
        metric: str = "filtered_rms_nm",
        weight: float = 1.0,
        name: Optional[str] = None
    ) -> "ZernikeObjectiveBuilder":
        """Add a subcase objective to the builder."""
        self.objectives.append({
            "subcase": str(subcase),
            "metric": metric,
            "weight": weight,
            "name": name or f"{metric}_{subcase}"
        })
        return self

    def add_relative_objective(
        self,
        target_subcase: Union[int, str],
        reference_subcase: Union[int, str],
        metric: str = "relative_filtered_rms_nm",
        weight: float = 1.0,
        name: Optional[str] = None
    ) -> "ZernikeObjectiveBuilder":
        """Add a relative objective between subcases."""
        self.objectives.append({
            "target_subcase": str(target_subcase),
            "reference_subcase": str(reference_subcase),
            "metric": metric,
            "weight": weight,
            "name": name or f"rel_{target_subcase}_vs_{reference_subcase}",
            "is_relative": True
        })
        return self

    def _get_extractor(self) -> ZernikeExtractor:
        """Lazy-create extractor (reused for all objectives)."""
        if self._extractor is None:
            op2_path = self.op2_finder()
            bdf_path = self.bdf_finder() if self.bdf_finder else None
            self._extractor = ZernikeExtractor(
                op2_path, bdf_path,
                displacement_unit=self.displacement_unit,
                **self.kwargs
            )
        return self._extractor

    def _reset_extractor(self):
        """Reset extractor (call after OP2 changes)."""
        self._extractor = None

    def evaluate_all(self) -> Dict[str, float]:
        """
        Evaluate all objectives and return dict of values.

        Returns:
            Dict mapping objective name to value
        """
        self._reset_extractor()
        extractor = self._get_extractor()
        results = {}

        for obj in self.objectives:
            try:
                if obj.get("is_relative"):
                    rel_result = extractor.extract_relative(
                        obj["target_subcase"],
                        obj["reference_subcase"]
                    )
                    results[obj["name"]] = rel_result.get(obj["metric"], 0.0)
                else:
                    sub_result = extractor.extract_subcase(obj["subcase"])
                    results[obj["name"]] = sub_result.get(obj["metric"], 0.0)
            except Exception as e:
                logger.warning(f"Failed to evaluate {obj['name']}: {e}")
                results[obj["name"]] = float("inf")

        return results

    def build_weighted_sum(self) -> Callable[[], float]:
        """
        Build a callable that returns weighted sum of all objectives.

        Returns:
            Callable returning combined objective value
        """
        def evaluate() -> float:
            values = self.evaluate_all()
            total = 0.0
            for obj in self.objectives:
                val = values.get(obj["name"], 0.0)
                total += obj["weight"] * val
            return total

        return evaluate

    def build_max(self) -> Callable[[], float]:
        """
        Build a callable that returns maximum of all objectives.

        Useful for worst-case optimization across subcases.
        """
        def evaluate() -> float:
            values = self.evaluate_all()
            weighted = [
                obj["weight"] * values.get(obj["name"], 0.0)
                for obj in self.objectives
            ]
            return max(weighted) if weighted else 0.0

        return evaluate

    def build_individual(self) -> Callable[[], Dict[str, float]]:
        """
        Build a callable that returns dict of individual objective values.

        Useful for multi-objective optimization (NSGA-II).
        """
        return self.evaluate_all

def extract_zernike_for_trial(
    op2_path: Path,
    bdf_path: Optional[Path] = None,
    subcases: Optional[List[str]] = None,
    reference_subcase: str = "20",
    metrics: Optional[List[str]] = None,
    **kwargs
) -> Dict[str, Any]:
    """
    Extract comprehensive Zernike data for a trial.

    This is a high-level function for logging/exporting trial data.
    It extracts all metrics for specified subcases and computes
    relative metrics vs the reference.

    Args:
        op2_path: Path to OP2 file
        bdf_path: Path to BDF file (auto-detect if None)
        subcases: List of subcases to extract (None = all available)
        reference_subcase: Reference for relative calculations
        metrics: Specific metrics to extract (None = all)
        **kwargs: Additional ZernikeExtractor arguments

    Returns:
        Dict with complete trial Zernike data:
        {
            'subcases': {
                '20': {'global_rms_nm': ..., 'filtered_rms_nm': ..., ...},
                '40': {...},
                ...
            },
            'relative': {
                '40_vs_20': {'relative_filtered_rms_nm': ..., ...},
                ...
            },
            'summary': {
                'best_filtered_rms': ...,
                'worst_filtered_rms': ...,
                ...
            }
        }
    """
    extractor = ZernikeExtractor(op2_path, bdf_path, **kwargs)

    # Get available subcases
    available = list(extractor.displacements.keys())
    if subcases:
        subcases = [s for s in subcases if str(s) in available]
    else:
        subcases = available

    # Extract per-subcase data
    subcase_data = {}
    for sc in subcases:
        try:
            subcase_data[sc] = extractor.extract_subcase(str(sc))
        except Exception as e:
            logger.warning(f"Failed to extract subcase {sc}: {e}")

    # Extract relative data
    relative_data = {}
    if reference_subcase in subcases:
        for sc in subcases:
            if sc != reference_subcase:
                try:
                    key = f"{sc}_vs_{reference_subcase}"
                    relative_data[key] = extractor.extract_relative(
                        str(sc), str(reference_subcase)
                    )
                except Exception as e:
                    logger.warning(f"Failed to extract relative {key}: {e}")

    # Summary statistics
    filtered_rms_values = [
        d.get('filtered_rms_nm', float('inf'))
        for d in subcase_data.values()
    ]

    summary = {
        'best_filtered_rms': min(filtered_rms_values) if filtered_rms_values else None,
        'worst_filtered_rms': max(filtered_rms_values) if filtered_rms_values else None,
        'mean_filtered_rms': sum(filtered_rms_values) / len(filtered_rms_values) if filtered_rms_values else None,
        'n_subcases': len(subcases),
        'reference_subcase': reference_subcase,
    }

    return {
        'subcases': subcase_data,
        'relative': relative_data,
        'summary': summary,
    }


# Export all helpers
__all__ = [
    'create_zernike_objective',
    'create_relative_zernike_objective',
    'ZernikeObjectiveBuilder',
    'extract_zernike_for_trial',
]
@@ -285,14 +285,11 @@ sys.argv = ['', {argv_str}] # Set argv for the main function
        # Set up environment for Simcenter/NX
        env = os.environ.copy()

        # Set license server (use 29000 for Simcenter)
        # Override any incorrect license server settings
        env['SPLM_LICENSE_SERVER'] = '29000@AntoineThinkpad'

        # Force desktop licensing instead of enterprise
        # User has nx_nas_bn_basic_dsk (desktop) not nx_nas_basic_ent (enterprise)
        env['NXNA_LICENSE_FILE'] = '29000@AntoineThinkpad'
        env['NXNASTRAN_LICENSE_FILE'] = '29000@AntoineThinkpad'
        # Use existing SPLM_LICENSE_SERVER from environment if set
        # Only set if not already defined (respects user's license configuration)
        if 'SPLM_LICENSE_SERVER' not in env or not env['SPLM_LICENSE_SERVER']:
            env['SPLM_LICENSE_SERVER'] = '29000@localhost'
            print(f"[NX SOLVER] WARNING: SPLM_LICENSE_SERVER not set, using default: {env['SPLM_LICENSE_SERVER']}")

        # Add NX/Simcenter paths to environment
        nx_bin = self.nx_install_dir / "NXBIN"

@@ -1,13 +1,53 @@
"""
NX Journal Script to Solve Simulation in Batch Mode

This script opens a .sim file, updates the FEM, and solves it through the NX API.
Usage: run_journal.exe solve_simulation.py <sim_file_path>
This script handles BOTH single-part simulations AND multi-part assembly FEMs.

Based on recorded NX journal pattern for solving simulations.
=============================================================================
MULTI-PART ASSEMBLY FEM WORKFLOW (for .afm-based simulations)
=============================================================================

Based on recorded NX journal from interactive session (Nov 28, 2025).

The correct workflow for assembly FEM updates:

1. LOAD PARTS
   - Open ASSY_M1.prt and M1_Blank_fem1_i.prt to have geometry loaded
   - Find and switch to M1_Blank part for expression editing

2. UPDATE EXPRESSIONS
   - Switch to modeling application
   - Edit expressions with units
   - Call MakeUpToDate() on modified expressions
   - Call DoUpdate() to rebuild geometry

3. SWITCH TO SIM AND UPDATE FEM COMPONENTS
   - Open the .sim file
   - Navigate component hierarchy via RootComponent.FindObject()
   - For each component FEM:
     - SetWorkComponent() to make it the work part
     - FindObject("FEModel").UpdateFemodel()

4. MERGE DUPLICATE NODES (critical for assembly FEM!)
   - Switch to assembly FEM component
   - CreateDuplicateNodesCheckBuilder()
   - Set MergeOccurrenceNodes = True
   - IdentifyDuplicateNodes() then MergeDuplicateNodes()

5. RESOLVE LABEL CONFLICTS
   - CreateAssemblyLabelManagerBuilder()
   - SetFEModelOccOffsets() for each occurrence
   - Commit()

6. SOLVE
   - SetWorkComponent(Null) to return to sim level
   - SolveChainOfSolutions()

=============================================================================
"""

import sys
import os
import NXOpen
import NXOpen.Assemblies
import NXOpen.CAE
@@ -15,341 +55,510 @@ import NXOpen.CAE
|
||||
|
||||
def main(args):
|
||||
"""
|
||||
Open and solve a simulation file with updated expression values.
|
||||
Main entry point for NX journal.
|
||||
|
||||
Args:
|
||||
args: Command line arguments
|
||||
args[0]: .sim file path
|
||||
args[1]: solution_name (optional, e.g., "Solution_Normal_Modes" or None for default)
|
||||
args[1]: solution_name (optional, or "None" for default)
|
||||
args[2+]: expression updates as "name=value" pairs
|
||||
"""
|
||||
if len(args) < 1:
|
||||
print("ERROR: No .sim file path provided")
|
||||
print("Usage: run_journal.exe solve_simulation.py <sim_file_path> [solution_name] [expr1=val1] [expr2=val2] ...")
|
||||
print("Usage: run_journal.exe solve_simulation.py <sim_file_path> [solution_name] [expr1=val1] ...")
|
||||
return False
|
||||
|
||||
sim_file_path = args[0]
|
||||
|
||||
# Parse solution name if provided (args[1])
|
||||
solution_name = args[1] if len(args) > 1 and args[1] != 'None' else None
|
||||
|
||||
# Extract base name from sim file (e.g., "Beam_sim1.sim" -> "Beam")
|
||||
import os
|
||||
sim_filename = os.path.basename(sim_file_path)
|
||||
part_base_name = sim_filename.split('_sim')[0] if '_sim' in sim_filename else sim_filename.split('.sim')[0]
|
||||
|
||||
# Parse expression updates from args[2+] as "name=value" pairs
|
||||
# Parse expression updates
|
||||
expression_updates = {}
|
||||
for arg in args[2:]:
|
||||
if '=' in arg:
|
||||
name, value = arg.split('=', 1)
|
||||
expression_updates[name] = float(value)
|
||||
|
||||
print(f"[JOURNAL] Opening simulation: {sim_file_path}")
|
||||
print(f"[JOURNAL] Detected part base name: {part_base_name}")
|
||||
if solution_name:
|
||||
print(f"[JOURNAL] Will solve specific solution: {solution_name}")
|
||||
else:
|
||||
print(f"[JOURNAL] Will solve default solution (Solution 1)")
|
||||
if expression_updates:
|
||||
print(f"[JOURNAL] Will update expressions:")
|
||||
for name, value in expression_updates.items():
|
||||
print(f"[JOURNAL] {name} = {value}")
|
||||
# Get working directory
|
||||
working_dir = os.path.dirname(os.path.abspath(sim_file_path))
|
||||
sim_filename = os.path.basename(sim_file_path)
|
||||
|
||||
print(f"[JOURNAL] " + "="*60)
|
||||
print(f"[JOURNAL] NX SIMULATION SOLVER (Assembly FEM Workflow)")
|
||||
print(f"[JOURNAL] " + "="*60)
|
||||
print(f"[JOURNAL] Simulation: {sim_filename}")
|
||||
print(f"[JOURNAL] Working directory: {working_dir}")
|
||||
print(f"[JOURNAL] Solution: {solution_name or 'Solution 1'}")
|
||||
print(f"[JOURNAL] Expression updates: {len(expression_updates)}")
|
||||
for name, value in expression_updates.items():
|
||||
print(f"[JOURNAL] {name} = {value}")
|
||||
|
||||
try:
|
||||
theSession = NXOpen.Session.GetSession()
|
||||
|
||||
# Set load options to load linked parts from directory
|
||||
print("[JOURNAL] Setting load options for linked parts...")
|
||||
import os
|
||||
working_dir = os.path.dirname(os.path.abspath(sim_file_path))
|
||||
|
||||
# Complete load options setup (from recorded journal)
|
||||
# Set load options
|
||||
theSession.Parts.LoadOptions.LoadLatest = False
|
||||
theSession.Parts.LoadOptions.ComponentLoadMethod = NXOpen.LoadOptions.LoadMethod.FromDirectory
|
||||
|
||||
searchDirectories = [working_dir]
|
||||
searchSubDirs = [True]
|
||||
theSession.Parts.LoadOptions.SetSearchDirectories(searchDirectories, searchSubDirs)
|
||||
|
||||
theSession.Parts.LoadOptions.SetSearchDirectories([working_dir], [True])
|
||||
theSession.Parts.LoadOptions.ComponentsToLoad = NXOpen.LoadOptions.LoadComponents.All
|
||||
theSession.Parts.LoadOptions.PartLoadOption = NXOpen.LoadOptions.LoadOption.FullyLoad
|
||||
theSession.Parts.LoadOptions.SetInterpartData(True, NXOpen.LoadOptions.Parent.All)
|
||||
theSession.Parts.LoadOptions.AllowSubstitution = False
|
||||
theSession.Parts.LoadOptions.GenerateMissingPartFamilyMembers = True
|
||||
theSession.Parts.LoadOptions.AbortOnFailure = False
|
||||
|
||||
referenceSets = ["As Saved", "Use Simplified", "Use Model", "Entire Part", "Empty"]
|
||||
theSession.Parts.LoadOptions.SetDefaultReferenceSets(referenceSets)
|
||||
theSession.Parts.LoadOptions.ReferenceSetOverride = False
|
||||
|
||||
print(f"[JOURNAL] Load directory set to: {working_dir}")
|
||||
|
||||
# Close any currently open sim file to force reload from disk
|
||||
print("[JOURNAL] Checking for open parts...")
|
||||
# Close any open parts
|
||||
try:
|
||||
current_work = theSession.Parts.BaseWork
|
||||
if current_work and hasattr(current_work, 'FullPath'):
|
||||
current_path = current_work.FullPath
|
||||
print(f"[JOURNAL] Closing currently open part: {current_path}")
|
||||
# Close without saving (we want to reload from disk)
|
||||
partCloseResponses1 = [NXOpen.BasePart.CloseWholeTree]
|
||||
theSession.Parts.CloseAll(partCloseResponses1)
|
||||
print("[JOURNAL] Parts closed")
|
||||
except Exception as e:
|
||||
print(f"[JOURNAL] No parts to close or error closing: {e}")
|
||||
theSession.Parts.CloseAll([NXOpen.BasePart.CloseWholeTree])
|
||||
except:
|
||||
pass
|
||||
|
||||
# Open the .sim file (now will load fresh from disk with updated .prt files)
|
||||
print(f"[JOURNAL] Opening simulation fresh from disk...")
|
||||
basePart1, partLoadStatus1 = theSession.Parts.OpenActiveDisplay(
|
||||
sim_file_path,
|
||||
NXOpen.DisplayPartOption.AllowAdditional
|
||||
)
|
||||
# Check for assembly FEM files
|
||||
afm_files = [f for f in os.listdir(working_dir) if f.endswith('.afm')]
|
||||
is_assembly = len(afm_files) > 0
|
||||
|
||||
workSimPart = theSession.Parts.BaseWork
|
||||
displaySimPart = theSession.Parts.BaseDisplay
|
||||
|
||||
print(f"[JOURNAL] Simulation opened successfully")
|
||||
partLoadStatus1.Dispose()
|
||||
|
||||
# Switch to simulation application
|
||||
theSession.ApplicationSwitchImmediate("UG_APP_SFEM")
|
||||
|
||||
simPart1 = workSimPart
|
||||
theSession.Post.UpdateUserGroupsFromSimPart(simPart1)
|
||||
|
||||
# STEP 1: Try to switch to part and update expressions (optional for some models)
|
||||
print(f"[JOURNAL] STEP 1: Checking for {part_base_name}.prt geometry...")
|
||||
geometry_updated = False
|
||||
try:
|
||||
# Find the main part (may not exist for embedded geometry models)
|
||||
bracketPart = None
|
||||
try:
|
||||
bracketPart = theSession.Parts.FindObject(part_base_name)
|
||||
except:
|
||||
pass
|
||||
|
||||
if bracketPart:
|
||||
print(f"[JOURNAL] Found {part_base_name} part, updating geometry...")
|
||||
# Make Bracket the active display part
|
||||
status, partLoadStatus = theSession.Parts.SetActiveDisplay(
|
||||
bracketPart,
|
||||
NXOpen.DisplayPartOption.AllowAdditional,
|
||||
NXOpen.PartDisplayPartWorkPartOption.UseLast
|
||||
)
|
||||
partLoadStatus.Dispose()
|
||||
|
||||
workPart = theSession.Parts.Work
|
||||
|
||||
# CRITICAL: Apply expression changes BEFORE updating geometry
|
||||
expressions_updated = []
|
||||
|
||||
# Apply all expression updates dynamically
|
||||
for expr_name, expr_value in expression_updates.items():
|
||||
print(f"[JOURNAL] Applying {expr_name} = {expr_value}")
|
||||
try:
|
||||
expr_obj = workPart.Expressions.FindObject(expr_name)
|
||||
if expr_obj:
|
||||
# Use millimeters as default unit for geometric parameters
|
||||
unit_mm = workPart.UnitCollection.FindObject("MilliMeter")
|
||||
workPart.Expressions.EditExpressionWithUnits(expr_obj, unit_mm, str(expr_value))
|
||||
expressions_updated.append(expr_obj)
|
||||
print(f"[JOURNAL] {expr_name} updated successfully")
|
||||
else:
|
||||
print(f"[JOURNAL] WARNING: {expr_name} expression not found!")
|
||||
except Exception as e:
|
||||
print(f"[JOURNAL] ERROR updating {expr_name}: {e}")
|
||||
|
||||
# Make expressions up to date
|
||||
if expressions_updated:
|
||||
print(f"[JOURNAL] Making {len(expressions_updated)} expression(s) up to date...")
|
||||
for expr in expressions_updated:
|
||||
markId_expr = theSession.SetUndoMark(NXOpen.Session.MarkVisibility.Invisible, "Make Up to Date")
|
||||
objects1 = [expr]
|
||||
theSession.UpdateManager.MakeUpToDate(objects1, markId_expr)
|
||||
theSession.DeleteUndoMark(markId_expr, None)
|
||||
|
||||
# CRITICAL: Update the geometry model - rebuilds features with new expressions
|
||||
print(f"[JOURNAL] Rebuilding geometry with new expression values...")
|
||||
markId_update = theSession.SetUndoMark(NXOpen.Session.MarkVisibility.Invisible, "NX update")
|
||||
nErrs = theSession.UpdateManager.DoUpdate(markId_update)
|
||||
theSession.DeleteUndoMark(markId_update, "NX update")
|
||||
print(f"[JOURNAL] {part_base_name} geometry updated ({nErrs} errors)")
|
||||
|
||||
# Extract mass from expression p173 if it exists and write to temp file
|
||||
try:
|
||||
mass_expr = workPart.Expressions.FindObject("p173")
|
||||
if mass_expr:
|
||||
mass_kg = mass_expr.Value
|
||||
mass_output_file = os.path.join(working_dir, "_temp_mass.txt")
|
||||
with open(mass_output_file, 'w') as f:
|
||||
f.write(str(mass_kg))
|
||||
print(f"[JOURNAL] Mass from p173: {mass_kg:.6f} kg ({mass_kg * 1000:.2f} g)")
|
||||
print(f"[JOURNAL] Mass written to: {mass_output_file}")
|
||||
except:
|
||||
pass # Expression p173 might not exist in all models
|
||||
|
||||
geometry_updated = True
|
||||
else:
|
||||
print(f"[JOURNAL] {part_base_name} part not found - may be embedded in sim file")
|
||||
except Exception as e:
|
||||
print(f"[JOURNAL] Could not update {part_base_name}.prt: {e}")
|
||||
print(f"[JOURNAL] Continuing with sim-only solve...")
|
||||
|
||||
# STEP 2: Try to switch to FEM part and update (optional for some models)
|
||||
fem_part_name = f"{part_base_name}_fem1"
|
||||
print(f"[JOURNAL] STEP 2: Checking for {fem_part_name}.fem...")
|
||||
fem_updated = False
|
||||
try:
|
||||
# Find the FEM part (may not exist or may have different name)
|
||||
femPart1 = None
|
||||
try:
|
||||
femPart1 = theSession.Parts.FindObject(fem_part_name)
|
||||
except:
|
||||
# Try with _i suffix for idealized FEM
|
||||
try:
|
||||
femPart1 = theSession.Parts.FindObject(f"{fem_part_name}_i")
|
||||
except:
|
||||
pass
|
||||
|
||||
if femPart1:
|
||||
print(f"[JOURNAL] Found FEM part, updating...")
|
||||
# Make FEM the active display part
|
||||
status, partLoadStatus = theSession.Parts.SetActiveDisplay(
|
||||
femPart1,
|
||||
NXOpen.DisplayPartOption.AllowAdditional,
|
||||
NXOpen.PartDisplayPartWorkPartOption.SameAsDisplay
|
||||
)
|
||||
partLoadStatus.Dispose()
|
||||
|
||||
workFemPart = theSession.Parts.BaseWork
|
||||
|
||||
# CRITICAL: Update FE Model - regenerates FEM with new geometry
|
||||
print("[JOURNAL] Updating FE Model...")
|
||||
fEModel1 = workFemPart.FindObject("FEModel")
|
||||
if fEModel1:
|
||||
fEModel1.UpdateFemodel()
|
||||
print("[JOURNAL] FE Model updated with new geometry!")
|
||||
fem_updated = True
|
||||
else:
|
||||
print("[JOURNAL] WARNING: Could not find FEModel object")
|
||||
else:
|
||||
print(f"[JOURNAL] FEM part not found - may be embedded in sim file")
|
||||
except Exception as e:
|
||||
print(f"[JOURNAL] Could not update FEM: {e}")
|
||||
print(f"[JOURNAL] Continuing with sim-only solve...")
|
||||
|
||||
# STEP 3: Switch back to sim part
|
||||
print("[JOURNAL] STEP 3: Switching back to sim part...")
|
||||
try:
|
||||
status, partLoadStatus = theSession.Parts.SetActiveDisplay(
|
||||
simPart1,
|
||||
NXOpen.DisplayPartOption.AllowAdditional,
|
||||
NXOpen.PartDisplayPartWorkPartOption.UseLast
|
||||
)
|
||||
partLoadStatus.Dispose()
|
||||
workSimPart = theSession.Parts.BaseWork
|
||||
print("[JOURNAL] Switched back to sim part")
|
||||
except Exception as e:
|
||||
print(f"[JOURNAL] WARNING: Error switching to sim part: {e}")
|
||||
|
||||
# Note: Old output files are deleted by nx_solver.py before calling this journal
|
||||
# This ensures NX performs a fresh solve
|
||||
|
||||
# Solve the simulation
|
||||
print("[JOURNAL] Starting solve...")
|
||||
markId3 = theSession.SetUndoMark(NXOpen.Session.MarkVisibility.Visible, "Start")
|
||||
theSession.SetUndoMarkName(markId3, "Solve Dialog")
|
||||
|
||||
markId5 = theSession.SetUndoMark(NXOpen.Session.MarkVisibility.Invisible, "Solve")
|
||||
|
||||
theCAESimSolveManager = NXOpen.CAE.SimSolveManager.GetSimSolveManager(theSession)
|
||||
|
||||
# Get the simulation object
|
||||
simSimulation1 = workSimPart.FindObject("Simulation")
|
||||
|
||||
# CRITICAL: Disable solution monitor when solving multiple solutions
|
||||
# This prevents NX from opening multiple monitor windows which superpose and cause usability issues
|
||||
if not solution_name:
|
||||
print("[JOURNAL] Disabling solution monitor for all solutions to prevent window pile-up...")
|
||||
try:
|
||||
# Get all solutions in the simulation
|
||||
solutions_disabled = 0
|
||||
solution_num = 1
|
||||
while True:
|
||||
try:
|
||||
solution_obj_name = f"Solution[Solution {solution_num}]"
|
||||
simSolution = simSimulation1.FindObject(solution_obj_name)
|
||||
if simSolution:
|
||||
propertyTable = simSolution.SolverOptionsPropertyTable
|
||||
propertyTable.SetBooleanPropertyValue("solution monitor", False)
|
||||
solutions_disabled += 1
|
||||
solution_num += 1
|
||||
else:
|
||||
break
|
||||
except:
|
||||
break # No more solutions
|
||||
print(f"[JOURNAL] Solution monitor disabled for {solutions_disabled} solution(s)")
|
||||
except Exception as e:
|
||||
print(f"[JOURNAL] WARNING: Could not disable solution monitor: {e}")
|
||||
print(f"[JOURNAL] Continuing with solve anyway...")
|
||||
|
||||
# Get the solution(s) to solve - either specific or all
|
||||
if solution_name:
|
||||
# Solve specific solution in background mode
|
||||
solution_obj_name = f"Solution[{solution_name}]"
|
||||
print(f"[JOURNAL] Looking for solution: {solution_obj_name}")
|
||||
simSolution1 = simSimulation1.FindObject(solution_obj_name)
|
||||
psolutions1 = [simSolution1]
|
||||
|
||||
numsolutionssolved1, numsolutionsfailed1, numsolutionsskipped1 = theCAESimSolveManager.SolveChainOfSolutions(
|
||||
psolutions1,
|
||||
NXOpen.CAE.SimSolution.SolveOption.Solve,
|
||||
NXOpen.CAE.SimSolution.SetupCheckOption.CompleteDeepCheckAndOutputErrors,
|
||||
NXOpen.CAE.SimSolution.SolveMode.Background
|
||||
        if is_assembly and expression_updates:
            print(f"[JOURNAL] ")
            print(f"[JOURNAL] DETECTED: Multi-part Assembly FEM")
            print(f"[JOURNAL] Using ASSEMBLY FEM WORKFLOW")
            print(f"[JOURNAL] ")
            return solve_assembly_fem_workflow(
                theSession, sim_file_path, solution_name, expression_updates, working_dir
            )
        else:
            # Solve ALL solutions using SolveAllSolutions API (Foreground mode)
            # This ensures all solutions (static + modal, etc.) complete before returning
            print(f"[JOURNAL] Solving all solutions using SolveAllSolutions API (Foreground mode)...")

            numsolutionssolved1, numsolutionsfailed1, numsolutionsskipped1 = theCAESimSolveManager.SolveAllSolutions(
                NXOpen.CAE.SimSolution.SolveOption.Solve,
                NXOpen.CAE.SimSolution.SetupCheckOption.CompleteCheckAndOutputErrors,
                NXOpen.CAE.SimSolution.SolveMode.Foreground,
                False
            )
            print(f"[JOURNAL] ")
            print(f"[JOURNAL] Using SIMPLE WORKFLOW (no expression updates or single-part)")
            print(f"[JOURNAL] ")
            return solve_simple_workflow(
                theSession, sim_file_path, solution_name, expression_updates, working_dir
            )
        theSession.DeleteUndoMark(markId5, None)
        theSession.SetUndoMarkName(markId3, "Solve")

        print(f"[JOURNAL] Solve completed!")
        print(f"[JOURNAL] Solutions solved: {numsolutionssolved1}")
        print(f"[JOURNAL] Solutions failed: {numsolutionsfailed1}")
        print(f"[JOURNAL] Solutions skipped: {numsolutionsskipped1}")

        # NOTE: When solution_name=None, we use Foreground mode to ensure all solutions
        # complete before returning. When solution_name is specified, Background mode is used.

        # Save the simulation to write all output files
        print("[JOURNAL] Saving simulation to ensure output files are written...")
        simPart2 = workSimPart
        partSaveStatus1 = simPart2.Save(
            NXOpen.BasePart.SaveComponents.TrueValue,
            NXOpen.BasePart.CloseAfterSave.FalseValue
        )
        partSaveStatus1.Dispose()
        print("[JOURNAL] Save complete!")

        return True

    except Exception as e:
        print(f"[JOURNAL] ERROR: {e}")
        print(f"[JOURNAL] FATAL ERROR: {e}")
        import traceback
        traceback.print_exc()
        return False

def solve_assembly_fem_workflow(theSession, sim_file_path, solution_name, expression_updates, working_dir):
    """
    Full assembly FEM workflow based on recorded NX journal.

    This is the correct workflow for multi-part assembly FEMs.
    """
    sim_filename = os.path.basename(sim_file_path)

    # ==========================================================================
    # STEP 1: LOAD REQUIRED PARTS
    # ==========================================================================
    print(f"[JOURNAL] STEP 1: Loading required parts...")

    # Load ASSY_M1.prt (to have the geometry assembly available)
    assy_prt_path = os.path.join(working_dir, "ASSY_M1.prt")
    if os.path.exists(assy_prt_path):
        print(f"[JOURNAL] Loading ASSY_M1.prt...")
        markId1 = theSession.SetUndoMark(NXOpen.Session.MarkVisibility.Visible, "Load Part")
        part1, partLoadStatus1 = theSession.Parts.Open(assy_prt_path)
        partLoadStatus1.Dispose()
    else:
        print(f"[JOURNAL] WARNING: ASSY_M1.prt not found, continuing anyway...")

    # Load M1_Blank_fem1_i.prt (idealized geometry)
    idealized_prt_path = os.path.join(working_dir, "M1_Blank_fem1_i.prt")
    if os.path.exists(idealized_prt_path):
        print(f"[JOURNAL] Loading M1_Blank_fem1_i.prt...")
        markId2 = theSession.SetUndoMark(NXOpen.Session.MarkVisibility.Visible, "Load Part")
        part2, partLoadStatus2 = theSession.Parts.Open(idealized_prt_path)
        partLoadStatus2.Dispose()

    # ==========================================================================
    # STEP 2: UPDATE EXPRESSIONS IN M1_BLANK
    # ==========================================================================
    print(f"[JOURNAL] STEP 2: Updating expressions in M1_Blank...")

    # Find and switch to M1_Blank part
    try:
        part3 = theSession.Parts.FindObject("M1_Blank")
        markId3 = theSession.SetUndoMark(NXOpen.Session.MarkVisibility.Visible, "Change Displayed Part")
        status1, partLoadStatus3 = theSession.Parts.SetActiveDisplay(
            part3,
            NXOpen.DisplayPartOption.AllowAdditional,
            NXOpen.PartDisplayPartWorkPartOption.UseLast
        )
        partLoadStatus3.Dispose()

        # Switch to modeling application for expression editing
        theSession.ApplicationSwitchImmediate("UG_APP_MODELING")

        workPart = theSession.Parts.Work

        # Create undo mark for expressions
        markId4 = theSession.SetUndoMark(NXOpen.Session.MarkVisibility.Visible, "Start")
        theSession.SetUndoMarkName(markId4, "Expressions Dialog")

        # Write expressions to a temp file and import (more reliable than editing one by one)
        exp_file_path = os.path.join(working_dir, "_temp_expressions.exp")
        with open(exp_file_path, 'w') as f:
            for expr_name, expr_value in expression_updates.items():
                # Determine unit
                if 'angle' in expr_name.lower() or 'vertical' in expr_name.lower():
                    unit_str = "Degrees"
                else:
                    unit_str = "MilliMeter"
                f.write(f"[{unit_str}]{expr_name}={expr_value}\n")
                print(f"[JOURNAL] {expr_name} = {expr_value} ({unit_str})")
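        # The .exp file written above is plain text with one expression per
        # line, tagged with the unit to apply on import, e.g. (illustrative
        # values, not from an actual run):
        #   [MilliMeter]whiffle_min=40.55
        #   [Degrees]whiffle_outer_to_vertical=75.67
        # The unit is inferred from the expression name ('angle'/'vertical'
        # selects Degrees; everything else is treated as MilliMeter).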

        print(f"[JOURNAL] Importing expressions from file...")
        markId_import = theSession.SetUndoMark(NXOpen.Session.MarkVisibility.Visible, "Import Expressions")

        try:
            expModified, errorMessages = workPart.Expressions.ImportFromFile(
                exp_file_path,
                NXOpen.ExpressionCollection.ImportMode.Replace
            )
            print(f"[JOURNAL] Expressions imported: {expModified} modified")
            if errorMessages:
                print(f"[JOURNAL] Import errors: {errorMessages}")

            # Update geometry after import
            print(f"[JOURNAL] Rebuilding geometry...")
            markId_update = theSession.SetUndoMark(NXOpen.Session.MarkVisibility.Invisible, "NX update")
            nErrs = theSession.UpdateManager.DoUpdate(markId_update)
            theSession.DeleteUndoMark(markId_update, "NX update")
            print(f"[JOURNAL] Geometry rebuilt ({nErrs} errors)")

            updated_expressions = list(expression_updates.keys())

        except Exception as e:
            print(f"[JOURNAL] ERROR importing expressions: {e}")
            updated_expressions = []

        # Clean up temp file
        try:
            os.remove(exp_file_path)
        except OSError:
            pass

        theSession.SetUndoMarkName(markId4, "Expressions")

    except Exception as e:
        print(f"[JOURNAL] ERROR updating expressions: {e}")

    # ==========================================================================
    # STEP 3: OPEN SIM AND UPDATE COMPONENT FEMs
    # ==========================================================================
    print(f"[JOURNAL] STEP 3: Opening sim and updating component FEMs...")

    # Try to find the sim part first (like the recorded journal does)
    # This ensures we're working with the same loaded sim part context
    sim_part_name = os.path.splitext(sim_filename)[0]  # e.g., "ASSY_M1_assyfem1_sim1"
    print(f"[JOURNAL] Looking for sim part: {sim_part_name}")

    markId_sim = theSession.SetUndoMark(NXOpen.Session.MarkVisibility.Visible, "Change Displayed Part")

    try:
        # First try to find it among loaded parts (like recorded journal)
        simPart1 = theSession.Parts.FindObject(sim_part_name)
        status_sim, partLoadStatus = theSession.Parts.SetActiveDisplay(
            simPart1,
            NXOpen.DisplayPartOption.AllowAdditional,
            NXOpen.PartDisplayPartWorkPartOption.UseLast
        )
        partLoadStatus.Dispose()
        print(f"[JOURNAL] Found and activated existing sim part")
    except Exception:
        # Fallback: Open fresh if not found
        print(f"[JOURNAL] Sim part not found, opening fresh: {sim_filename}")
        basePart, partLoadStatus = theSession.Parts.OpenActiveDisplay(
            sim_file_path,
            NXOpen.DisplayPartOption.AllowAdditional
        )
        partLoadStatus.Dispose()

    workSimPart = theSession.Parts.BaseWork
    displaySimPart = theSession.Parts.BaseDisplay
    theSession.ApplicationSwitchImmediate("UG_APP_SFEM")
    theSession.Post.UpdateUserGroupsFromSimPart(workSimPart)

    # Navigate component hierarchy
    try:
        rootComponent = workSimPart.ComponentAssembly.RootComponent
        component1 = rootComponent.FindObject("COMPONENT ASSY_M1_assyfem1 1")

        # Update M1_Blank_fem1
        print(f"[JOURNAL] Updating M1_Blank_fem1...")
        try:
            component2 = component1.FindObject("COMPONENT M1_Blank_fem1 1")
            markId_fem1 = theSession.SetUndoMark(NXOpen.Session.MarkVisibility.Visible, "Make Work Part")
            partLoadStatus5 = theSession.Parts.SetWorkComponent(
                component2,
                NXOpen.PartCollection.RefsetOption.Entire,
                NXOpen.PartCollection.WorkComponentOption.Visible
            )
            workFemPart = theSession.Parts.BaseWork
            partLoadStatus5.Dispose()

            markId_update1 = theSession.SetUndoMark(NXOpen.Session.MarkVisibility.Visible, "Update FE Model")
            fEModel1 = workFemPart.FindObject("FEModel")
            fEModel1.UpdateFemodel()
            print(f"[JOURNAL] M1_Blank_fem1 updated")
        except Exception as e:
            print(f"[JOURNAL] WARNING: M1_Blank_fem1: {e}")

        # Update M1_Vertical_Support_Skeleton_fem1
        print(f"[JOURNAL] Updating M1_Vertical_Support_Skeleton_fem1...")
        try:
            component3 = component1.FindObject("COMPONENT M1_Vertical_Support_Skeleton_fem1 3")
            markId_fem2 = theSession.SetUndoMark(NXOpen.Session.MarkVisibility.Visible, "Make Work Part")
            partLoadStatus6 = theSession.Parts.SetWorkComponent(
                component3,
                NXOpen.PartCollection.RefsetOption.Entire,
                NXOpen.PartCollection.WorkComponentOption.Visible
            )
            workFemPart = theSession.Parts.BaseWork
            partLoadStatus6.Dispose()

            markId_update2 = theSession.SetUndoMark(NXOpen.Session.MarkVisibility.Visible, "Update FE Model")
            fEModel2 = workFemPart.FindObject("FEModel")
            fEModel2.UpdateFemodel()
            print(f"[JOURNAL] M1_Vertical_Support_Skeleton_fem1 updated")
        except Exception as e:
            print(f"[JOURNAL] WARNING: M1_Vertical_Support_Skeleton_fem1: {e}")

    except Exception as e:
        print(f"[JOURNAL] ERROR navigating component hierarchy: {e}")

    # ==========================================================================
    # STEP 4: MERGE DUPLICATE NODES
    # ==========================================================================
    print(f"[JOURNAL] STEP 4: Merging duplicate nodes...")

    try:
        # Switch to assembly FEM
        partLoadStatus8 = theSession.Parts.SetWorkComponent(
            component1,
            NXOpen.PartCollection.RefsetOption.Entire,
            NXOpen.PartCollection.WorkComponentOption.Visible
        )
        workAssyFemPart = theSession.Parts.BaseWork
        displaySimPart = theSession.Parts.BaseDisplay
        partLoadStatus8.Dispose()
        print(f"[JOURNAL] Switched to assembly FEM: {workAssyFemPart.Name}")

        markId_merge = theSession.SetUndoMark(NXOpen.Session.MarkVisibility.Visible, "Start")

        caePart1 = workAssyFemPart
        duplicateNodesCheckBuilder1 = caePart1.ModelCheckMgr.CreateDuplicateNodesCheckBuilder()

        # Set tolerance
        unit_tol = duplicateNodesCheckBuilder1.Tolerance.Units
        duplicateNodesCheckBuilder1.Tolerance.Units = unit_tol
        duplicateNodesCheckBuilder1.Tolerance.SetFormula("0.01")
        print(f"[JOURNAL] Tolerance: 0.01 mm")

        # Enable occurrence node merge - CRITICAL for assembly FEM
        duplicateNodesCheckBuilder1.MergeOccurrenceNodes = True
        print(f"[JOURNAL] MergeOccurrenceNodes: True")

        theSession.SetUndoMarkName(markId_merge, "Duplicate Nodes Dialog")

        # Configure display settings
        displaysettings1 = NXOpen.CAE.ModelCheck.DuplicateNodesCheckBuilder.DisplaySettings()
        displaysettings1.ShowDuplicateNodes = True
        displaysettings1.ShowMergedNodeLabels = False
        displaysettings1.ShowRetainedNodeLabels = False
        displaysettings1.KeepNodesColor = displaySimPart.Colors.Find("Blue")
        displaysettings1.MergeNodesColor = displaySimPart.Colors.Find("Yellow")
        displaysettings1.UnableToMergeNodesColor = displaySimPart.Colors.Find("Red")
        duplicateNodesCheckBuilder1.DisplaySettingsData = displaysettings1

        # Check scope
        duplicateNodesCheckBuilder1.CheckScopeOption = NXOpen.CAE.ModelCheck.CheckScope.Displayed
        print(f"[JOURNAL] CheckScope: Displayed")

        # Identify duplicates
        print(f"[JOURNAL] Identifying duplicate nodes...")
        numDuplicates = duplicateNodesCheckBuilder1.IdentifyDuplicateNodes()
        print(f"[JOURNAL] Found {numDuplicates} duplicate node sets")

        # Merge duplicates
        if numDuplicates > 0:
            print(f"[JOURNAL] Merging duplicate nodes...")
            numMerged = duplicateNodesCheckBuilder1.MergeDuplicateNodes()
            print(f"[JOURNAL] Merged {numMerged} duplicate node sets")
        else:
            print(f"[JOURNAL] WARNING: No duplicate nodes found to merge!")
            print(f"[JOURNAL] This may indicate mesh update didn't work properly")

        theSession.SetUndoMarkName(markId_merge, "Duplicate Nodes")
        duplicateNodesCheckBuilder1.Destroy()
        theSession.DeleteUndoMark(markId_merge, None)

    except Exception as e:
        print(f"[JOURNAL] WARNING: Node merge: {e}")
        import traceback
        traceback.print_exc()

    # ==========================================================================
    # STEP 5: RESOLVE LABEL CONFLICTS
    # ==========================================================================
    print(f"[JOURNAL] STEP 5: Resolving label conflicts...")

    try:
        markId_labels = theSession.SetUndoMark(NXOpen.Session.MarkVisibility.Visible, "Start")

        assyFemPart1 = workAssyFemPart
        assemblyLabelManagerBuilder1 = assyFemPart1.CreateAssemblyLabelManagerBuilder()

        theSession.SetUndoMarkName(markId_labels, "Assembly Label Manager Dialog")

        markId_labels2 = theSession.SetUndoMark(NXOpen.Session.MarkVisibility.Invisible, "Assembly Label Manager")

        # Set offsets for each FE model occurrence
        # These offsets ensure unique node/element labels across components
        entitytypes = [
            NXOpen.CAE.AssemblyLabelManagerBuilder.EntityType.Node,
            NXOpen.CAE.AssemblyLabelManagerBuilder.EntityType.Element,
            NXOpen.CAE.AssemblyLabelManagerBuilder.EntityType.Csys,
            NXOpen.CAE.AssemblyLabelManagerBuilder.EntityType.Physical,
            NXOpen.CAE.AssemblyLabelManagerBuilder.EntityType.Group,
            NXOpen.CAE.AssemblyLabelManagerBuilder.EntityType.Ply,
            NXOpen.CAE.AssemblyLabelManagerBuilder.EntityType.Ssmo,
        ]

        # Apply offsets to each occurrence (values from recorded journal)
        occurrence_offsets = [
            ("FEModelOccurrence[3]", 2),
            ("FEModelOccurrence[4]", 74),
            ("FEModelOccurrence[5]", 146),
            ("FEModelOccurrence[7]", 218),
        ]

        for occ_name, offset_val in occurrence_offsets:
            try:
                fEModelOcc = workAssyFemPart.FindObject(occ_name)
                offsets = [offset_val] * 7
                assemblyLabelManagerBuilder1.SetFEModelOccOffsets(fEModelOcc, entitytypes, offsets)
            except Exception:
                pass  # Some occurrences may not exist

        nXObject1 = assemblyLabelManagerBuilder1.Commit()

        theSession.DeleteUndoMark(markId_labels2, None)
        theSession.SetUndoMarkName(markId_labels, "Assembly Label Manager")
        assemblyLabelManagerBuilder1.Destroy()

        print(f"[JOURNAL] Label conflicts resolved")

    except Exception as e:
        print(f"[JOURNAL] WARNING: Label management: {e}")

    # ==========================================================================
    # STEP 6: SOLVE
    # ==========================================================================
    print(f"[JOURNAL] STEP 6: Solving simulation...")

    try:
        # Return to sim level by setting null component
        partLoadStatus9 = theSession.Parts.SetWorkComponent(
            NXOpen.Assemblies.Component.Null,
            NXOpen.PartCollection.RefsetOption.Entire,
            NXOpen.PartCollection.WorkComponentOption.Visible
        )
        workSimPart = theSession.Parts.BaseWork
        partLoadStatus9.Dispose()

        # Set up solve
        markId_solve = theSession.SetUndoMark(NXOpen.Session.MarkVisibility.Visible, "Start")
        theSession.SetUndoMarkName(markId_solve, "Solve Dialog")

        markId_solve2 = theSession.SetUndoMark(NXOpen.Session.MarkVisibility.Invisible, "Solve")

        theCAESimSolveManager = NXOpen.CAE.SimSolveManager.GetSimSolveManager(theSession)

        simSimulation1 = workSimPart.FindObject("Simulation")
        sol_name = solution_name if solution_name else "Solution 1"
        simSolution1 = simSimulation1.FindObject(f"Solution[{sol_name}]")

        psolutions1 = [simSolution1]

        print(f"[JOURNAL] Solving: {sol_name} (Foreground mode)")
        numsolved, numfailed, numskipped = theCAESimSolveManager.SolveChainOfSolutions(
            psolutions1,
            NXOpen.CAE.SimSolution.SolveOption.Solve,
            NXOpen.CAE.SimSolution.SetupCheckOption.CompleteCheckAndOutputErrors,
            NXOpen.CAE.SimSolution.SolveMode.Foreground  # Use Foreground to ensure OP2 is complete
        )

        theSession.DeleteUndoMark(markId_solve2, None)
        theSession.SetUndoMarkName(markId_solve, "Solve")

        print(f"[JOURNAL] Solve completed: {numsolved} solved, {numfailed} failed, {numskipped} skipped")

        return numfailed == 0

    except Exception as e:
        print(f"[JOURNAL] ERROR solving: {e}")
        import traceback
        traceback.print_exc()
        return False


def solve_simple_workflow(theSession, sim_file_path, solution_name, expression_updates, working_dir):
    """
    Simple workflow for single-part simulations or when no expression updates are needed.
    """
    print(f"[JOURNAL] Opening simulation: {sim_file_path}")

    # Open the .sim file
    basePart1, partLoadStatus1 = theSession.Parts.OpenActiveDisplay(
        sim_file_path,
        NXOpen.DisplayPartOption.AllowAdditional
    )
    partLoadStatus1.Dispose()

    workSimPart = theSession.Parts.BaseWork
    theSession.ApplicationSwitchImmediate("UG_APP_SFEM")
    theSession.Post.UpdateUserGroupsFromSimPart(workSimPart)

    # Set up solve
    markId_solve = theSession.SetUndoMark(NXOpen.Session.MarkVisibility.Visible, "Start")
    theSession.SetUndoMarkName(markId_solve, "Solve Dialog")

    markId_solve2 = theSession.SetUndoMark(NXOpen.Session.MarkVisibility.Invisible, "Solve")

    theCAESimSolveManager = NXOpen.CAE.SimSolveManager.GetSimSolveManager(theSession)

    simSimulation1 = workSimPart.FindObject("Simulation")
    sol_name = solution_name if solution_name else "Solution 1"
    simSolution1 = simSimulation1.FindObject(f"Solution[{sol_name}]")

    psolutions1 = [simSolution1]

    print(f"[JOURNAL] Solving: {sol_name}")
    numsolved, numfailed, numskipped = theCAESimSolveManager.SolveChainOfSolutions(
        psolutions1,
        NXOpen.CAE.SimSolution.SolveOption.Solve,
        NXOpen.CAE.SimSolution.SetupCheckOption.CompleteCheckAndOutputErrors,
        NXOpen.CAE.SimSolution.SolveMode.Background
    )

    theSession.DeleteUndoMark(markId_solve2, None)
    theSession.SetUndoMarkName(markId_solve, "Solve")

    print(f"[JOURNAL] Solve completed: {numsolved} solved, {numfailed} failed, {numskipped} skipped")

    # Save
    try:
        partSaveStatus = workSimPart.Save(
            NXOpen.BasePart.SaveComponents.TrueValue,
            NXOpen.BasePart.CloseAfterSave.FalseValue
        )
        partSaveStatus.Dispose()
        print(f"[JOURNAL] Saved!")
    except Exception:
        pass

    return numfailed == 0


if __name__ == '__main__':
    success = main(sys.argv[1:])
    sys.exit(0 if success else 1)

236
studies/m1_mirror_zernike_optimization/1_setup/optimization_config.json
Normal file
@@ -0,0 +1,236 @@
{
  "$schema": "Atomizer M1 Mirror Zernike Optimization",
  "study_name": "m1_mirror_zernike_optimization",
  "description": "Telescope primary mirror support structure optimization using Zernike wavefront error metrics with neural acceleration",

  "design_variables": [
    {
      "name": "lateral_inner_angle",
      "expression_name": "lateral_inner_angle",
      "min": 25.0,
      "max": 28.5,
      "baseline": 26.79,
      "units": "degrees",
      "description": "Lateral support inner angle",
      "enabled": false
    },
    {
      "name": "lateral_outer_angle",
      "expression_name": "lateral_outer_angle",
      "min": 13.0,
      "max": 17.0,
      "baseline": 14.64,
      "units": "degrees",
      "description": "Lateral support outer angle",
      "enabled": false
    },
    {
      "name": "lateral_outer_pivot",
      "expression_name": "lateral_outer_pivot",
      "min": 9.0,
      "max": 12.0,
      "baseline": 10.40,
      "units": "mm",
      "description": "Lateral outer pivot position",
      "enabled": false
    },
    {
      "name": "lateral_inner_pivot",
      "expression_name": "lateral_inner_pivot",
      "min": 9.0,
      "max": 12.0,
      "baseline": 10.07,
      "units": "mm",
      "description": "Lateral inner pivot position",
      "enabled": false
    },
    {
      "name": "lateral_middle_pivot",
      "expression_name": "lateral_middle_pivot",
      "min": 18.0,
      "max": 23.0,
      "baseline": 20.73,
      "units": "mm",
      "description": "Lateral middle pivot position",
      "enabled": false
    },
    {
      "name": "lateral_closeness",
      "expression_name": "lateral_closeness",
      "min": 9.5,
      "max": 12.5,
      "baseline": 11.02,
      "units": "mm",
      "description": "Lateral support closeness parameter",
      "enabled": false
    },
    {
      "name": "whiffle_min",
      "expression_name": "whiffle_min",
      "min": 35.0,
      "max": 55.0,
      "baseline": 40.55,
      "units": "mm",
      "description": "Whiffle tree minimum parameter",
      "enabled": true
    },
    {
      "name": "whiffle_outer_to_vertical",
      "expression_name": "whiffle_outer_to_vertical",
      "min": 68.0,
      "max": 80.0,
      "baseline": 75.67,
      "units": "degrees",
      "description": "Whiffle tree outer to vertical angle",
      "enabled": true
    },
    {
      "name": "whiffle_triangle_closeness",
      "expression_name": "whiffle_triangle_closeness",
      "min": 50.0,
      "max": 65.0,
      "baseline": 60.00,
      "units": "mm",
      "description": "Whiffle tree triangle closeness",
      "enabled": false
    },
    {
      "name": "blank_backface_angle",
      "expression_name": "blank_backface_angle",
      "min": 3.5,
      "max": 5.0,
      "baseline": 4.23,
      "units": "degrees",
      "description": "Mirror blank backface angle",
      "enabled": false
    },
    {
      "name": "inner_circular_rib_dia",
      "expression_name": "inner_circular_rib_dia",
      "min": 480.0,
      "max": 620.0,
      "baseline": 534.00,
      "units": "mm",
      "description": "Inner circular rib diameter",
      "enabled": true
    }
  ],

  "objectives": [
    {
      "name": "rel_filtered_rms_40_vs_20",
      "description": "Filtered RMS WFE at 40 deg relative to 20 deg reference",
      "direction": "minimize",
      "weight": 5.0,
      "target": 4.0,
      "units": "nm",
      "extractor": "zernike_relative",
      "extractor_config": {
        "target_subcase": "40",
        "reference_subcase": "20",
        "metric": "relative_filtered_rms_nm"
      }
    },
    {
      "name": "rel_filtered_rms_60_vs_20",
      "description": "Filtered RMS WFE at 60 deg relative to 20 deg reference",
      "direction": "minimize",
      "weight": 5.0,
      "target": 10.0,
      "units": "nm",
      "extractor": "zernike_relative",
      "extractor_config": {
        "target_subcase": "60",
        "reference_subcase": "20",
        "metric": "relative_filtered_rms_nm"
      }
    },
    {
      "name": "mfg_90_optician_workload",
      "description": "Optician workload at 90 deg polishing orientation (filtered RMS with J1-J3)",
      "direction": "minimize",
      "weight": 1.0,
      "target": 20.0,
      "units": "nm",
      "extractor": "zernike",
      "extractor_config": {
        "subcase": "90",
        "metric": "rms_filter_j1to3",
        "reference_subcase": "20"
      }
    }
  ],

  "constraints": [
    {
      "name": "max_stress",
      "description": "Maximum von Mises stress in mirror assembly",
      "type": "upper_bound",
      "threshold": 10.0,
      "units": "MPa",
      "enabled": false
    }
  ],

  "zernike_settings": {
    "n_modes": 50,
    "filter_low_orders": 4,
    "displacement_unit": "mm",
    "subcases": ["1", "2", "3", "4"],
    "subcase_labels": {"1": "90deg", "2": "20deg", "3": "40deg", "4": "60deg"},
    "reference_subcase": "2",
    "polishing_subcase": "1",
    "output_full_coefficients": true,
    "_note": "Subcase mapping matches NX: 1=90deg, 2=20deg(ref), 3=40deg, 4=60deg"
  },

  "optimization_settings": {
    "n_trials": 100,
    "n_fea_trials": 40,
    "n_neural_trials": 500,
    "sampler": "TPE",
    "seed": 42,
    "n_startup_trials": 15,
    "tpe_n_ei_candidates": 150,
    "tpe_multivariate": true,
    "objective_strategy": "weighted_sum",
    "objective_direction": "minimize"
  },

  "surrogate_settings": {
    "enabled": true,
    "model_type": "ParametricZernikePredictor",
    "training_config": {
      "hidden_channels": 128,
      "num_layers": 4,
      "learning_rate": 0.001,
      "epochs": 200,
      "batch_size": 8,
      "train_split": 0.8
    },
    "outputs": {
      "description": "50 Zernike coefficients x 4 subcases = 200 outputs",
      "coefficients_per_subcase": 50,
      "subcases": ["20", "40", "60", "90"]
    }
  },

  "nx_settings": {
    "nx_install_path": "C:\\Program Files\\Siemens\\NX2506",
    "model_dir": "C:\\Users\\Antoine\\CADTOMASTE\\Atomizer\\M1-Gigabit\\Latest",
    "sim_file": "ASSY_M1_assyfem1_sim1.sim",
    "solution_name": "Solution 1",
    "solve_all_subcases": true,
    "op2_pattern": "*-solution_1.op2",
    "op2_timeout_s": 1800,
    "op2_stable_s": 5,
    "post_solve_delay_s": 5
  },

  "output_settings": {
    "save_zernike_coefficients": true,
    "save_field_data": true,
    "generate_reports": true,
    "archive_results": true
  }
}
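The `zernike_settings` above rely on the surface-based RMS definitions this commit fixes: global RMS is `sqrt(mean(W^2))` over the actual WFE samples, and filtered RMS is the same quantity computed on the residual after a least-squares fit of the low-order modes is removed. A minimal NumPy sketch of those two formulas follows; the function names and signatures are illustrative, not the actual extractor API.

```python
import numpy as np

def global_rms(wfe):
    """Surface-based global RMS: sqrt(mean(W^2)) over the sampled WFE."""
    wfe = np.asarray(wfe, dtype=float)
    return float(np.sqrt(np.mean(wfe ** 2)))

def filtered_rms(wfe, zernike_basis, n_filter=4):
    """RMS of the residual after least-squares removal of the first
    n_filter low-order terms (columns of zernike_basis), matching
    filter_low_orders=4 in the config."""
    wfe = np.asarray(wfe, dtype=float)
    low = np.asarray(zernike_basis, dtype=float)[:, :n_filter]
    coeffs, *_ = np.linalg.lstsq(low, wfe, rcond=None)
    residual = wfe - low @ coeffs
    return float(np.sqrt(np.mean(residual ** 2)))
```

Note this is deliberately not `sqrt(sum(coeffs^2))`: the previous coefficient-based formula was the bug this commit removes.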
BIN
studies/m1_mirror_zernike_optimization/2_results/study.db
Normal file
Binary file not shown.
134
studies/m1_mirror_zernike_optimization/DASHBOARD.md
Normal file
@@ -0,0 +1,134 @@
# M1 Mirror Zernike Optimization Dashboard

## Study Overview

**Objective**: Optimize telescope primary mirror (M1) support structure to minimize wavefront error across different gravity orientations.

**Method**: Hybrid FEA + Neural Network acceleration using Zernike polynomial decomposition.

## Quick Start

### 1. Prepare Model Files

Copy your NX model files to:
```
studies/m1_mirror_zernike_optimization/1_setup/model/
```

Required files:
- `ASSY_M1.prt` (or your assembly name)
- `ASSY_M1_assyfem1.afm`
- `ASSY_M1_assyfem1_sim1.sim`
- Associated `.fem` and `_i.prt` files

### 2. Run FEA Trials (Build Training Data)

```bash
cd studies/m1_mirror_zernike_optimization
python run_optimization.py --run --trials 40
```

This will:
- Run ~40 FEA trials (10-15 min each, ~8-10 hours total)
- Extract 50 Zernike coefficients for each subcase (20/40/60/90 deg)
- Store all data in the Optuna database

### 3. Train Neural Surrogate

```bash
python run_optimization.py --train-surrogate
```

Trains an MLP to predict 200 outputs (50 coefficients x 4 subcases) from the design variables.
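The surrogate's input/output shape can be illustrated with a minimal NumPy forward pass. This is only a shape sketch: the real `ParametricZernikePredictor` is a trained PyTorch model, and the `tanh` nonlinearity and random initialization here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Dimensions from the config: 3 enabled design variables in,
# 50 Zernike coefficients x 4 subcases = 200 outputs.
n_in, hidden, n_layers, n_out = 3, 128, 4, 200

def init_mlp(n_in, hidden, n_layers, n_out):
    dims = [n_in] + [hidden] * n_layers + [n_out]
    return [(rng.normal(0.0, 0.1, size=(a, b)), np.zeros(b))
            for a, b in zip(dims[:-1], dims[1:])]

def forward(params, x):
    for i, (w, b) in enumerate(params):
        x = x @ w + b
        if i < len(params) - 1:
            x = np.tanh(x)  # hidden nonlinearity (an assumption)
    return x

params = init_mlp(n_in, hidden, n_layers, n_out)
pred = forward(params, np.zeros((8, n_in)))  # a batch of 8 candidate designs
```

Once trained, evaluating such a network takes microseconds, which is what makes the neural-accelerated trials below cheap.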

### 4. Run Neural-Accelerated Optimization

```bash
python run_optimization.py --run --trials 1000 --enable-nn
```

With the trained surrogate, 1000 trials complete in seconds.

### 5. View Results

**Optuna Dashboard:**
```bash
optuna-dashboard sqlite:///2_results/study.db --port 8081
```
Open http://localhost:8081

## Design Variables

| Variable | Range | Baseline | Units | Status |
|----------|-------|----------|-------|--------|
| whiffle_min | 35-55 | 40.55 | mm | **Enabled** |
| whiffle_outer_to_vertical | 68-80 | 75.67 | deg | **Enabled** |
| inner_circular_rib_dia | 480-620 | 534.00 | mm | **Enabled** |
| whiffle_triangle_closeness | 50-65 | 60.00 | mm | Disabled |
| blank_backface_angle | 3.5-5.0 | 4.23 | deg | Disabled |
| lateral_inner_angle | 25-28.5 | 26.79 | deg | Disabled |
| lateral_outer_angle | 13-17 | 14.64 | deg | Disabled |
| lateral_outer_pivot | 9-12 | 10.40 | mm | Disabled |
| lateral_inner_pivot | 9-12 | 10.07 | mm | Disabled |
| lateral_middle_pivot | 18-23 | 20.73 | mm | Disabled |
| lateral_closeness | 9.5-12.5 | 11.02 | mm | Disabled |

Edit `1_setup/optimization_config.json` to enable/disable variables.

## Objectives

| Objective | Weight | Target | Description |
|-----------|--------|--------|-------------|
| rel_filtered_rms_40_vs_20 | 5 | 4 nm | WFE at 40° relative to 20° reference |
| rel_filtered_rms_60_vs_20 | 5 | 10 nm | WFE at 60° relative to 20° reference |
| mfg_90_optician_workload | 1 | 20 nm | Polishing workload at 90° orientation |

**Strategy**: Weighted sum minimization (normalized by targets)
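One plausible reading of the weighted-sum scalarization is sketched below. The exact normalization used in `run_optimization.py` is not shown in this commit; this sketch assumes each objective value is divided by its target before weighting.

```python
def weighted_sum_objective(values, objectives):
    """Scalarize multiple minimize-direction objectives: normalize each
    measured value by its target, then combine with its weight."""
    return sum(o["weight"] * (values[o["name"]] / o["target"]) for o in objectives)

objectives = [
    {"name": "rel_filtered_rms_40_vs_20", "weight": 5.0, "target": 4.0},
    {"name": "rel_filtered_rms_60_vs_20", "weight": 5.0, "target": 10.0},
    {"name": "mfg_90_optician_workload", "weight": 1.0, "target": 20.0},
]

# A design sitting exactly at every target scores 5 + 5 + 1 = 11.
score = weighted_sum_objective(
    {"rel_filtered_rms_40_vs_20": 4.0,
     "rel_filtered_rms_60_vs_20": 10.0,
     "mfg_90_optician_workload": 20.0},
    objectives,
)
```

Normalizing by target keeps the nanometer-scale objectives commensurate even though their targets differ by a factor of five.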
|
||||
## Neural Surrogate Architecture
|
||||
|
||||
- **Input**: Design variables (3-11 depending on enabled)
|
||||
- **Output**: 200 values (50 Zernike coefficients × 4 subcases)
|
||||
- **Architecture**: MLP with 4 layers, 128 hidden units
|
||||
- **Training**: ~40 FEA samples, 200 epochs
|
||||
|
||||
## File Structure
|
||||
|
||||
```
|
||||
m1_mirror_zernike_optimization/
|
||||
├── 1_setup/
|
||||
│ ├── optimization_config.json # Configuration
|
||||
│ └── model/ # NX model files (add yours here)
|
||||
├── 2_results/
|
||||
│ ├── study.db # Optuna database
|
||||
│ └── zernike_surrogate/ # Trained neural model
|
||||
│ └── checkpoint_best.pt
|
||||
├── run_optimization.py # Main script
|
||||
└── DASHBOARD.md # This file
|
||||
```
|
||||
|
||||
## Commands Reference

```bash
# Run FEA optimization
python run_optimization.py --run --trials 40

# Train neural surrogate
python run_optimization.py --train-surrogate

# Run with neural acceleration
python run_optimization.py --run --trials 1000 --enable-nn

# Check status
python run_optimization.py --status

# Launch Optuna dashboard
optuna-dashboard sqlite:///2_results/study.db --port 8081
```
## Tips

1. **Start small**: Run 5-10 FEA trials first to verify everything works
2. **Check Zernike extraction**: Verify the OP2 contains the expected subcases (20°/40°/60°/90°)
3. **Enable variables gradually**: Start with 3, add more after initial exploration
4. **Neural validation**: After finding good neural designs, verify the top candidates with FEA
429
studies/m1_mirror_zernike_optimization/README.md
Normal file
@@ -0,0 +1,429 @@
# M1 Mirror Zernike Optimization

Multi-objective telescope primary mirror support structure optimization using Zernike wavefront error decomposition with neural network acceleration.

**Created**: 2025-11-28
**Protocol**: Protocol 12 (Hybrid FEA/Neural with Zernike)
**Status**: Setup Complete - Requires Expression Path Fix

---
## 1. Engineering Problem

### 1.1 Objective

Optimize the telescope primary mirror (M1) support structure to minimize wavefront error (WFE) across different gravity orientations (zenith angles), ensuring consistent optical performance from 20° to 90° elevation.

### 1.2 Physical System

- **Component**: M1 primary mirror assembly with whiffle tree support
- **Material**: Borosilicate glass (mirror blank), steel (support structure)
- **Loading**: Gravity at multiple zenith angles (20°, 40°, 60°, 90°)
- **Boundary Conditions**: Whiffle tree kinematic mount
- **Analysis Type**: Linear static multi-subcase (Nastran SOL 101)
- **Output**: Surface deformation → Zernike polynomial decomposition

---
## 2. Mathematical Formulation

### 2.1 Objectives

| Objective | Goal | Weight | Formula | Units | Target |
|-----------|------|--------|---------|-------|--------|
| rel_filtered_rms_40_vs_20 | minimize | 5.0 | $\sigma_{40/20} = \sqrt{\overline{(W^{rel}_{res})^2}}$ (J1-J4 removed) | nm | 4 nm |
| rel_filtered_rms_60_vs_20 | minimize | 5.0 | $\sigma_{60/20} = \sqrt{\overline{(W^{rel}_{res})^2}}$ (J1-J4 removed) | nm | 10 nm |
| mfg_90_optician_workload | minimize | 1.0 | $\sigma_{90}^{J4+} = \sqrt{\overline{(W^{rel}_{res})^2}}$ (J1-J3 removed) | nm | 20 nm |

Where:
- $W^{rel}$ = relative WFE surface (target subcase minus reference), and $W^{rel}_{res}$ = residual after subtracting the least-squares fit of the removed low-order modes
- Filtered RMS excludes J1-J4 (piston, tip, tilt, defocus) - correctable by alignment
- Manufacturing workload keeps J4 (defocus) since it represents optician correction effort
- RMS is computed over the actual surface values, $\sqrt{\mathrm{mean}(W^2)}$, matching `zernike_Post_Script_NX.py` - NOT as $\sqrt{\sum Z_j^2}$ over fitted coefficients
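The filtered RMS above can be sketched as follows - a minimal illustration assuming the WFE samples `W` and a Zernike design matrix `P` are already assembled; `rms_filtered` is a hypothetical name, not the study's API:

```python
import numpy as np

def rms_filtered(W, P, remove):
    """Surface-based filtered RMS: subtract the least-squares fit of the
    low-order modes in `remove` (0-based column indices of P), then take
    sqrt(mean(residual^2)) over the surface samples."""
    P_low = P[:, remove]                           # design matrix of removed modes
    c, *_ = np.linalg.lstsq(P_low, W, rcond=None)  # low-order coefficients
    W_res = W - P_low @ c                          # residual WFE surface
    return float(np.sqrt(np.mean(W_res ** 2)))
```

For the `rel_filtered_rms_*` objectives the removed set would be J1-J4; for the optician workload metric, J1-J3 only.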
### 2.2 Zernike Decomposition

The wavefront error $W(r,\theta)$ is decomposed into Zernike polynomials:

$$W(r,\theta) = \sum_{j=1}^{50} Z_j \cdot P_j(r,\theta)$$

Where $P_j$ are Noll-indexed Zernike polynomials on the unit disk.

**WFE from Displacement**:
$$W_{nm} = 2 \cdot \delta_z \cdot 10^6$$

Where $\delta_z$ is the Z-displacement in mm (the factor of 2 accounts for reflection; $10^6$ converts mm to nm).
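The decomposition is fitted by ordinary least squares on the unit disk. A minimal sketch using only the first four Noll modes (a full implementation would generate all 50; `fit_zernike_low` is an illustrative name):

```python
import numpy as np

def zernike_basis_low(r, theta):
    """Design matrix of the first four Noll-indexed modes on the unit disk."""
    return np.column_stack([
        np.ones_like(r),                    # J1 piston
        2.0 * r * np.cos(theta),            # J2 tip
        2.0 * r * np.sin(theta),            # J3 tilt
        np.sqrt(3.0) * (2.0 * r**2 - 1.0),  # J4 defocus
    ])

def fit_zernike_low(X, Y, W, R_max):
    """Least-squares fit of WFE samples W at (X, Y), radius-normalized."""
    r = np.hypot(X, Y) / R_max
    theta = np.arctan2(Y, X)
    P = zernike_basis_low(r, theta)
    coeffs, *_ = np.linalg.lstsq(P, W, rcond=None)
    return coeffs
```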
### 2.3 Design Variables

| Parameter | Symbol | Bounds | Baseline | Units | Description |
|-----------|--------|--------|----------|-------|-------------|
| whiffle_min | $w_{min}$ | [35, 55] | 40.55 | mm | Whiffle tree minimum parameter |
| whiffle_outer_to_vertical | $\alpha$ | [68, 80] | 75.67 | deg | Outer support angle to vertical |
| inner_circular_rib_dia | $D_{rib}$ | [480, 620] | 534.00 | mm | Inner circular rib diameter |

**Design Space**:
$$\mathbf{x} = [w_{min}, \alpha, D_{rib}]^T \in \mathbb{R}^3$$

**Additional Variables (Disabled)**:
- lateral_inner_angle, lateral_outer_angle (lateral support angles)
- lateral_outer_pivot, lateral_inner_pivot, lateral_middle_pivot (pivot positions)
- lateral_closeness (lateral support spacing)
- whiffle_triangle_closeness (whiffle tree geometry)
- blank_backface_angle (mirror blank geometry)
### 2.4 Objective Strategy

**Weighted Sum Minimization**:
$$J(\mathbf{x}) = \sum_{i=1}^{3} w_i \cdot \frac{f_i(\mathbf{x})}{t_i}$$

Where:
- $w_i$ = weight for objective $i$
- $f_i(\mathbf{x})$ = objective value
- $t_i$ = target value (normalization)

---
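The aggregation above can be sketched in a few lines (names hypothetical; the study's `compute_weighted_objective` may differ):

```python
def weighted_objective(values, weights, targets):
    """J = sum_i w_i * f_i / t_i, each objective normalized by its target."""
    return sum(weights[k] * values[k] / targets[k] for k in values)

# With the study's weights and targets, objectives sitting exactly at
# target give J = 5 + 5 + 1 = 11, a convenient sanity check.
weights = {"rms_40": 5.0, "rms_60": 5.0, "workload_90": 1.0}
targets = {"rms_40": 4.0, "rms_60": 10.0, "workload_90": 20.0}
J = weighted_objective(
    {"rms_40": 4.0, "rms_60": 10.0, "workload_90": 20.0}, weights, targets
)
```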
## 3. Optimization Algorithm

### 3.1 TPE Configuration

| Parameter | Value | Description |
|-----------|-------|-------------|
| Algorithm | TPE | Tree-structured Parzen Estimator |
| Sampler | `TPESampler` | Bayesian optimization |
| n_startup_trials | 15 | Random exploration before modeling |
| n_ei_candidates | 150 | Expected improvement candidates |
| multivariate | true | Model parameter correlations |
| Trials | 100 | 40 FEA + neural acceleration |
| Seed | 42 | Reproducibility |

**TPE Properties**:
- Models $p(x|y<y^*)$ and $p(x|y \geq y^*)$ separately
- Expected Improvement: $EI(x) = \int_{-\infty}^{y^*} (y^* - y) \, p(y|x) \, dy$
- Handles high-dimensional continuous spaces efficiently
### 3.2 Return Format

```python
from typing import Tuple

def fea_objective(trial) -> Tuple[float, dict]:
    # ... simulation and Zernike extraction ...
    weighted_obj = compute_weighted_objective(objectives, config)
    return weighted_obj, trial_data
```

---
## 4. Simulation Pipeline

### 4.1 Trial Execution Flow

```
┌─────────────────────────────────────────────────────────────────────┐
│ TRIAL n EXECUTION                                                   │
├─────────────────────────────────────────────────────────────────────┤
│                                                                     │
│ 1. OPTUNA SAMPLES (TPE)                                             │
│    whiffle_min = trial.suggest_float("whiffle_min", 35, 55)         │
│    whiffle_outer_to_vertical = trial.suggest_float(..., 68, 80)     │
│    inner_circular_rib_dia = trial.suggest_float(..., 480, 620)      │
│                                                                     │
│ 2. NX PARAMETER UPDATE                                              │
│    Module: optimization_engine/solve_simulation.py                  │
│    Target Part: M1_Blank.prt                                        │
│    Action: Update expressions with new design values                │
│                                                                     │
│ 3. NX SIMULATION (Nastran SOL 101 - 4 Subcases)                     │
│    Module: optimization_engine/solve_simulation.py                  │
│    Input: ASSY_M1_assyfem1_sim1.sim                                 │
│    Subcases: 1=90°, 2=20°, 3=40°, 4=60° zenith                      │
│    Output: .dat, .op2, .f06                                         │
│                                                                     │
│ 4. ZERNIKE EXTRACTION (Displacement-Based)                          │
│    a. Read node coordinates from BDF/DAT                            │
│    b. Read Z-displacements from OP2 for each subcase                │
│    c. Compute RELATIVE displacement (subcase - reference)           │
│    d. Convert to WFE: W = 2 * Δδz * 10^6 nm                         │
│    e. Fit 50 Zernike coefficients via least-squares                 │
│    f. Compute filtered RMS (exclude J1-J4)                          │
│                                                                     │
│ 5. OBJECTIVE COMPUTATION                                            │
│    rel_filtered_rms_40_vs_20 ← Zernike RMS (subcase 3 - 2)          │
│    rel_filtered_rms_60_vs_20 ← Zernike RMS (subcase 4 - 2)          │
│    mfg_90_optician_workload ← Zernike RMS J4+ (subcase 1 - 2)       │
│                                                                     │
│ 6. WEIGHTED SUM                                                     │
│    J = Σ (weight × objective / target)                              │
│                                                                     │
│ 7. RETURN TO OPTUNA                                                 │
│    return weighted_objective                                        │
│                                                                     │
└─────────────────────────────────────────────────────────────────────┘
```
### 4.2 Multi-Subcase Structure

Subcase numbering follows the NX model order:

| Subcase | Zenith Angle | Role | Description |
|---------|--------------|------|-------------|
| 1 | 90° | Polishing | Horizontal (manufacturing reference) |
| 2 | 20° | Reference | Near-zenith baseline orientation |
| 3 | 40° | Target | Mid-elevation performance |
| 4 | 60° | Target | Low-elevation performance |

---
## 5. Result Extraction Methods

### 5.1 Zernike Extraction (Displacement-Based Subtraction)

| Attribute | Value |
|-----------|-------|
| **Method** | `extract_zernike_with_relative()` |
| **Location** | `run_optimization.py` (inline) |
| **Geometry Source** | `.dat` (BDF format) |
| **Displacement Source** | `.op2` (OP2 binary) |
| **Output** | 50 Zernike coefficients per subcase |

**Algorithm (Correct Approach - Matches Original Script)**:

1. **Load Geometry**: Read node coordinates $(X_i, Y_i)$ from BDF
2. **Load Displacements**: Read $\delta_{z,i}$ from OP2 for each subcase
3. **Compute Relative Displacement** (node-by-node):
   $$\Delta\delta_{z,i} = \delta_{z,i}^{target} - \delta_{z,i}^{reference}$$
4. **Convert to WFE**:
   $$W_i = 2 \cdot \Delta\delta_{z,i} \cdot 10^6 \text{ nm}$$
5. **Fit Zernike** (least-squares on unit disk):
   $$\min_{\mathbf{Z}} \| \mathbf{W} - \mathbf{P} \mathbf{Z} \|^2$$
6. **Compute Filtered RMS** (surface-based, over the residual after removing the low-order fit):
   $$\sigma_{filtered} = \sqrt{\overline{W_{res}^2}}, \qquad \mathbf{W}_{res} = \mathbf{W} - \mathbf{P}_{1..4} \mathbf{Z}_{1..4}$$

**Critical Implementation Notes**:
- The relative calculation MUST subtract displacements first, then fit Zernike - NOT subtract Zernike coefficients directly. This matches the original `zernike_Post_Script_NX.py` implementation.
- The RMS is computed from actual surface values, $\sqrt{\mathrm{mean}(W^2)}$, not as $\sqrt{\sum Z_j^2}$ over fitted coefficients.
### 5.2 Code Pattern

```python
import numpy as np
from pyNastran.bdf.bdf import BDF
from pyNastran.op2.op2 import OP2

# Read geometry (node IDs and positions)
bdf = BDF()
bdf.read_bdf(str(bdf_path))
node_geo = {nid: node.get_position() for nid, node in bdf.nodes.items()}

# Read displacements for all subcases
op2 = OP2()
op2.read_op2(str(op2_path))

# Compute relative Z-displacement (node-by-node, target minus reference)
rel_disp_z = np.array(
    [disp_z_target[nid] - disp_z_reference[nid] for nid in node_ids]
)

# Convert to WFE in nm (factor 2 for reflection, 1e6 for mm -> nm) and fit
rel_wfe_nm = 2.0 * rel_disp_z * 1e6
coeffs, R_max = compute_zernike_from_wfe(X, Y, rel_wfe_nm, n_modes=50)
```

---
## 6. Neural Acceleration (AtomizerField)

### 6.1 Configuration

| Setting | Value | Description |
|---------|-------|-------------|
| `enabled` | `true` | Neural surrogate active |
| `model_type` | `ParametricZernikePredictor` | Predicts Zernike coefficients |
| `hidden_channels` | 128 | MLP width |
| `num_layers` | 4 | MLP depth |
| `learning_rate` | 0.001 | Adam optimizer |
| `epochs` | 200 | Training iterations |
| `batch_size` | 8 | Mini-batch size |
| `train_split` | 0.8 | Training fraction |
### 6.2 Surrogate Model

**Input**: $\mathbf{x} = [w_{min}, \alpha, D_{rib}]^T \in \mathbb{R}^3$

**Output**: $\hat{\mathbf{Z}} \in \mathbb{R}^{200}$ (50 coefficients × 4 subcases)

**Architecture**: Multi-Layer Perceptron
```
Input(3) → Linear(128) → ReLU → Linear(128) → ReLU →
Linear(128) → ReLU → Linear(128) → ReLU → Linear(200)
```

**Training Objective**:
$$\mathcal{L} = \frac{1}{N} \sum_{i=1}^{N} \| \mathbf{Z}_i - \hat{\mathbf{Z}}_i \|^2$$
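In PyTorch terms the architecture above is roughly the following - a sketch, since the actual `ParametricZernikePredictor` class may differ in detail:

```python
import torch
import torch.nn as nn

def make_zernike_mlp(n_inputs=3, n_modes=50, n_subcases=4, hidden=128):
    """MLP surrogate: design variables -> stacked Zernike coefficients."""
    return nn.Sequential(
        nn.Linear(n_inputs, hidden), nn.ReLU(),
        nn.Linear(hidden, hidden), nn.ReLU(),
        nn.Linear(hidden, hidden), nn.ReLU(),
        nn.Linear(hidden, hidden), nn.ReLU(),
        nn.Linear(hidden, n_modes * n_subcases),  # 200 outputs
    )

model = make_zernike_mlp()
out = model(torch.zeros(8, 3))  # batch of 8 designs -> shape (8, 200)
```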
### 6.3 Training Data Location

```
studies/m1_mirror_zernike_optimization/2_results/zernike_surrogate/
├── checkpoint_best.pt       # Best model weights
├── training_history.json    # Loss curves
└── validation_metrics.json  # R², MAE per coefficient
```

### 6.4 Expected Performance

| Metric | Value |
|--------|-------|
| FEA time per trial | 10-15 min |
| Neural time per trial | ~10 ms |
| Speedup | ~60,000x |
| Expected R² | > 0.95 (after 40 samples) |

---
## 7. Study File Structure

```
m1_mirror_zernike_optimization/
│
├── 1_setup/                         # INPUT CONFIGURATION
│   ├── model/                       # NX Model Files (symlinked/referenced)
│   │   └── → C:\Users\Antoine\CADTOMASTE\Atomizer\M1-Gigabit\Latest\
│   │       ├── ASSY_M1.prt                          # Top-level assembly
│   │       ├── M1_Blank.prt                         # Mirror blank (EXPRESSIONS HERE)
│   │       ├── ASSY_M1_assyfem1.afm                 # Assembly FEM
│   │       ├── ASSY_M1_assyfem1_sim1.sim            # Simulation file
│   │       └── assy_m1_assyfem1_sim1-solution_1.op2 # Results
│   │
│   └── optimization_config.json     # Study configuration
│
├── 2_results/                       # OUTPUT (auto-generated)
│   ├── study.db                     # Optuna SQLite database
│   ├── zernike_surrogate/           # Neural model checkpoints
│   └── reports/                     # Generated reports
│
├── run_optimization.py              # Main entry point
├── DASHBOARD.md                     # Quick reference
└── README.md                        # This blueprint
```

---
## 8. Results Location

After optimization completes, results are stored in `2_results/`:

| File | Description | Format |
|------|-------------|--------|
| `study.db` | Optuna database with all trials | SQLite |
| `zernike_surrogate/checkpoint_best.pt` | Trained neural model | PyTorch |
| `reports/optimization_report.md` | Full results report | Markdown |

### 8.1 Results Report Contents

The generated report will contain:

1. **Optimization Summary** - Best WFE configurations found
2. **Zernike Analysis** - Coefficient distributions per subcase
3. **Parameter Sensitivity** - Design variable vs WFE relationships
4. **Convergence History** - Weighted objective over trials
5. **Neural Surrogate Performance** - R² per Zernike mode
6. **Recommended Configurations** - Top designs for production

### 8.2 Zernike-Specific Analysis

| Mode | Name | Physical Meaning |
|------|------|------------------|
| J1 | Piston | Constant offset (ignored) |
| J2, J3 | Tip/Tilt | Angular misalignment (correctable) |
| J4 | Defocus | Power error (correctable) |
| J5, J6 | Astigmatism | Cylindrical error |
| J7, J8 | Coma | Off-axis aberration |
| J9-J11 | Trefoil, Spherical | Higher-order terms |

---
## 9. Quick Start

### Staged Workflow (Recommended)

```bash
cd studies/m1_mirror_zernike_optimization

# Check current status
python run_optimization.py --status

# Run FEA trials (builds training data)
python run_optimization.py --run --trials 40

# Train neural surrogate
python run_optimization.py --train-surrogate

# Run neural-accelerated optimization
python run_optimization.py --run --trials 500 --enable-nn
```

### Stage Descriptions

| Stage | Command | Purpose | When to Use |
|-------|---------|---------|-------------|
| **STATUS** | `--status` | Check database, trial count | Anytime |
| **RUN** | `--run --trials N` | Run FEA optimization | Initial exploration |
| **TRAIN** | `--train-surrogate` | Train neural model | After ~40 FEA trials |
| **NEURAL** | `--run --enable-nn` | Fast neural trials | After training |

### Dashboard Access

| Dashboard | URL | Purpose |
|-----------|-----|---------|
| **Optuna Dashboard** | `optuna-dashboard sqlite:///2_results/study.db` | Trial history |

```bash
# Launch Optuna dashboard
cd studies/m1_mirror_zernike_optimization
optuna-dashboard sqlite:///2_results/study.db --port 8081
# Open http://localhost:8081
```

---
## 10. Configuration Reference

**File**: `1_setup/optimization_config.json`

| Section | Key | Description |
|---------|-----|-------------|
| `design_variables[]` | 11 parameters | 3 enabled, 8 disabled |
| `objectives[]` | 3 WFE metrics | Relative filtered RMS |
| `zernike_settings.n_modes` | 50 | Zernike polynomial count |
| `zernike_settings.filter_low_orders` | 4 | Exclude J1-J4 |
| `zernike_settings.subcases` | ["1","2","3","4"] | OP2 subcase IDs |
| `zernike_settings.reference_subcase` | "2" | 20° baseline |
| `optimization_settings.n_trials` | 100 | Total FEA trials |
| `surrogate_settings.model_type` | ParametricZernikePredictor | Neural architecture |
| `nx_settings.model_dir` | M1-Gigabit/Latest | NX model location |
| `nx_settings.sim_file` | ASSY_M1_assyfem1_sim1.sim | Simulation file |

---
## 11. Known Issues & Solutions

### 11.1 Expression Update Failure

**Issue**: The NX journal cannot find expressions in the assembly FEM.

**Cause**: The expressions live in the component part `M1_Blank.prt`, not in `ASSY_M1_assyfem1`.

**Solution**: The `solve_simulation.py` journal now searches for the `M1_Blank` part to update expressions. If it still fails, verify that:
1. `M1_Blank.prt` is loaded in the assembly
2. Expression names match exactly (case-sensitive)
3. The part is not read-only

### 11.2 Subcase Numbering

**Issue**: The OP2 file uses numeric subcases (1, 2, 3, 4), not angle labels (20, 40, 60, 90).

**Solution**: The config uses `subcases: ["1","2","3","4"]` together with a `subcase_labels` mapping.

---
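A minimal sketch of such a mapping, assuming the NX model ordering fixed in this commit (subcase 1 = 90°, 2 = 20°, 3 = 40°, 4 = 60°); the helper name is illustrative, not the study's API:

```python
# OP2 numeric subcase ID -> zenith-angle label (degrees)
SUBCASE_LABELS = {"1": "90", "2": "20", "3": "40", "4": "60"}

def angle_for(subcase_id):
    """Resolve an OP2 subcase ID (int or str) to its zenith-angle label."""
    return SUBCASE_LABELS[str(subcase_id)]
```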
## 12. References

- **Noll, R.J.** (1976). Zernike polynomials and atmospheric turbulence. *JOSA*.
- **Wilson, R.N.** (2004). *Reflecting Telescope Optics I*. Springer.
- **pyNastran Documentation**: BDF/OP2 parsing for FEA post-processing
- **Optuna Documentation**: TPE sampler for black-box optimization
1377
studies/m1_mirror_zernike_optimization/run_optimization.py
Normal file
File diff suppressed because it is too large