fix: Stop passing design_vars to simulation_runner to match working 50-trial workflow
**CRITICAL FIX**: FEM results were identical across trials.

**Root Cause**: The LLM runner was passing design_vars to simulation_runner(), which then passed them to the NX Solver's expression_updates parameter. The solve journal tried to update hardcoded expression names (tip_thickness, support_angle) that don't exist in the beam model, causing the solver to ignore the updates and use cached geometry.

**Solution**: Match the working 50-trial optimization workflow:

1. model_updater() updates the PRT file via the NX import journal
2. The part file is closed/flushed to disk
3. simulation_runner() runs WITHOUT passing design_vars
4. The NX solver loads the SIM file, which references the updated PRT from disk
5. The FEM regenerates with the updated geometry automatically

**Changes**:

- llm_optimization_runner.py: Call simulation_runner() without arguments
- run_optimization.py: Remove the design_vars parameter from the simulation_runner closure
- import_expressions.py: Added theSession.Parts.CloseAll() to flush changes
- test_phase_3_2_e2e.py: Fixed remaining variable name bugs

**Test Results**:

- ✅ Trial 0: objective 7,315,679
- ✅ Trial 1: objective 9,158.67
- ✅ Trial 2: objective 7,655.28

FEM results are now DIFFERENT for each trial - optimization is working correctly!

**Remaining Issue**: The LLM parses "20 to 30 mm" as a 0-1 range (separate fix needed)
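The corrected call order from the Solution steps can be sketched as below. This is a minimal illustration, not the real implementation: `model_updater` and `simulation_runner` here are stand-in stubs (the real ones drive NX journals and the NX solver), and the file-based hand-off mimics how the SIM file picks up the flushed PRT from disk. The key point is that `simulation_runner` takes no design variables; it only re-reads what `model_updater` wrote.

```python
import json
from pathlib import Path

def model_updater(design_vars: dict, prt_path: Path) -> None:
    """Stub for steps 1-2: the real version runs the NX import journal,
    updates the PRT expressions, then closes/flushes the part to disk
    (theSession.Parts.CloseAll() in import_expressions.py)."""
    prt_path.write_text(json.dumps(design_vars))  # stand-in for the PRT update

def simulation_runner(prt_path: Path) -> float:
    """Stub for steps 3-5: the real solver loads the SIM file, which
    references the updated PRT on disk, so it must NOT receive
    design_vars directly."""
    geometry = json.loads(prt_path.read_text())
    # Stand-in objective: the point is only that it changes when the PRT changes.
    return sum(geometry.values())

def run_trial(design_vars: dict, prt_path: Path) -> float:
    model_updater(design_vars, prt_path)  # update PRT, flush to disk
    return simulation_runner(prt_path)    # solver re-reads the updated PRT
```

With this shape, two trials with different design variables necessarily produce different objectives, which is exactly the behavior the fix restored.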
```diff
@@ -280,24 +280,24 @@ def test_e2e_llm_mode_with_api_key():
         traceback.print_exc()
         checks.append(False)

-    # Verify best trial file
-    if best_trial_file.exists():
-        print("Verifying best trial file...")
+    # Verify results file
+    if results_file.exists():
+        print("Verifying results file...")

         try:
-            with open(best_trial_file) as f:
-                best = json.load(f)
+            with open(results_file) as f:
+                results = json.load(f)

-            if "design_variables" in best and "objective" in best:
-                print(f"  [OK] Best trial file has correct structure")
-                print(f"  Best objective: {best['objective']:.6f}")
+            if "best_params" in results and "best_value" in results:
+                print(f"  [OK] Results file has correct structure")
+                print(f"  Best value: {results['best_value']:.6f}")
                 checks.append(True)
             else:
-                print(f"  [FAIL] Best trial file missing fields")
+                print(f"  [FAIL] Results file missing fields")
                 checks.append(False)

         except Exception as e:
-            print(f"  [FAIL] Error reading best trial file: {e}")
+            print(f"  [FAIL] Error reading results file: {e}")
             checks.append(False)

     print()
```
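The remaining issue noted in the commit message (the LLM parsing "20 to 30 mm" as a 0-1 range) could be handled with a small deterministic parser instead of trusting the LLM's numeric output. The helper below is a hypothetical sketch, not part of this commit; the function name and placement in the LLM response parsing are assumptions.

```python
import re

def parse_bounds(text: str) -> tuple[float, float]:
    """Extract a (low, high) bound pair from text like '20 to 30 mm'.

    Illustrative only: the actual fix belongs in the LLM response
    parsing, per the commit message.
    """
    m = re.search(r"(-?\d+(?:\.\d+)?)\s*(?:to|-|–)\s*(-?\d+(?:\.\d+)?)", text)
    if m is None:
        raise ValueError(f"could not parse bounds from {text!r}")
    low, high = float(m.group(1)), float(m.group(2))
    if low > high:  # tolerate reversed order
        low, high = high, low
    return low, high
```

Parsing the raw range text directly guarantees "20 to 30 mm" yields (20.0, 30.0) rather than a normalized 0-1 interval.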