// bridge system
Identity Bridge

Math.max(0, x) = Activation functions (ReLU)

// Bridge #14
The connection

ReLU is literally Math.max(0, x): the same function, same math, just a different name. It is the strongest bridge in the system. Clip negative values to zero; pass positive values through unchanged.
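A minimal sketch of the bridge in TypeScript: the `relu` and `reluGrad` helper names below are illustrative, not from any library. The derivative is included because it is what backpropagation actually uses: 1 where the input was positive, 0 where it was clipped.

```typescript
// ReLU really is Math.max(0, x), applied elementwise.
const relu = (x: number): number => Math.max(0, x);

// Its derivative: 1 for positive inputs, 0 for clipped (negative) inputs.
// (The derivative at exactly 0 is undefined; by convention we use 0 here.)
const reluGrad = (x: number): number => (x > 0 ? 1 : 0);

// Applying it to a "layer" of pre-activations is just Array.map():
const preActivations = [-2, -0.5, 0, 1.5, 3];
const activated = preActivations.map(relu);     // [0, 0, 0, 1.5, 3]
const gradients = preActivations.map(reluGrad); // [0, 0, 0, 1, 1]
```

The same `Array.map()` call that transforms a list in frontend code is, here, an elementwise tensor operation over a layer's outputs.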

Why "Identity"?

Identity bridges link concepts that are the same in both domains; the code is identical, or nearly so.

Frontend concept: Math.max(0, x)
ML concept: Activation functions (ReLU)