Dissipative Learning • Causal Mismatch Detection • Adaptive Energy Management
Science Question: A data center runs LLM inference workloads. Known factors (GPU utilization, batch size, model parameters, cooling efficiency) predict baseline power consumption. But actual energy draw consistently exceeds predictions. What causal factors are we missing?
Simulation Controls
Dashboard metrics (initial values): Learning Cycles: 0 | Avg Certainty: 0.00 | Current Power: 0 kW | Predicted: 0 kW | Error: 0 kW | RMS Error: 0 kW
System Status: Ready - Click "Run Single Cycle" to begin
Methodology:
Dissipative learning with a sigmoid-modulated learning rate η = 1/(1 + e^(10(Z − 0.5))), where Z is the model's average certainty: as certainty rises, the learning rate falls toward zero.
The model learns the known factors (GPU utilization, batch size, model parameters, cooling efficiency).
Hidden factor: memory bandwidth saturation adds a further 50–80 kW of overhead not captured by the known factors.
Curiosity fires when Z > 0.80 but the prediction error remains above 10 kW — high confidence with persistent error signals a missing causal factor.
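The two rules above — the sigmoid-modulated learning rate and the curiosity trigger — can be sketched as follows. This is a minimal illustration, not the simulation's actual implementation; the parameter names (k, z0, and the thresholds) are assumptions taken directly from the formulas stated in the methodology.

```python
import math

def learning_rate(z, k=10.0, z0=0.5):
    """Sigmoid-modulated learning rate: eta = 1 / (1 + e^(k*(Z - z0))).

    Low certainty (Z near 0) gives eta near 1 (learn aggressively);
    high certainty (Z near 1) gives eta near 0 (learning dissipates).
    """
    return 1.0 / (1.0 + math.exp(k * (z - z0)))

def curiosity_fires(z, error_kw, z_threshold=0.80, error_threshold_kw=10.0):
    """Curiosity trigger: the model is confident (Z > 0.80) yet the
    prediction error stays above 10 kW, suggesting a hidden causal
    factor (here, memory bandwidth saturation) the model cannot explain.
    """
    return z > z_threshold and error_kw > error_threshold_kw
```

At Z = 0.5 the learning rate is exactly 0.5 (the sigmoid's midpoint), and a confident model (Z = 0.9) facing a 60 kW residual would fire curiosity, flagging the unmodeled memory-bandwidth overhead.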