Simulation stats readout: Generation, Cars Alive, Best Distance.

Sigmoid Activation

Hidden neurons apply the sigmoid function 1/(1+e^-x), producing continuous values strictly between 0.0 and 1.0: degrees of confidence rather than binary on/off.
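As a minimal sketch of that activation (the function name is illustrative, not from the demo's source):

```python
import math

def sigmoid(x: float) -> float:
    # Squashes any real input into the open interval (0, 1).
    # Large positive x -> near 1.0; large negative x -> near 0.0; 0 -> 0.5.
    return 1.0 / (1.0 + math.exp(-x))
```

Because the output never reaches exactly 0 or 1, a neuron always expresses a graded level of activation rather than a hard switch.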

Softmax + Temperature

The output layer applies softmax to convert raw scores into a probability distribution over actions. Temperature controls randomness: a low temperature sharpens the distribution toward the top choice (near-deterministic), while a high temperature flattens it toward uniform (more exploratory).
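A small sketch of temperature-scaled softmax, assuming the logits are divided by the temperature before normalizing (the standard formulation; the function name is illustrative):

```python
import math

def softmax(logits, temperature=1.0):
    # Divide by temperature first: T < 1 sharpens the distribution,
    # T > 1 flattens it toward uniform.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]
```

With logits [2.0, 1.0, 0.1], a temperature of 0.05 puts nearly all probability on the first action, while a temperature of 100 spreads it almost evenly across all three.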

Backpropagation

In backprop mode, the network measures its error with a loss function and adjusts its weights via gradient descent, nudging each weight in the direction that reduces the loss. This is the same training mechanism behind LLMs.
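The loop can be illustrated on a single sigmoid neuron with a squared-error loss; this is a simplified sketch, not the demo's actual network, and `train_step` is a hypothetical helper:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_step(w, b, x, target, lr=0.1):
    # Forward pass: compute the prediction and its error.
    y = sigmoid(w * x + b)
    loss = (y - target) ** 2
    # Backward pass: chain rule through loss -> sigmoid -> weights.
    dloss_dy = 2.0 * (y - target)
    dy_dz = y * (1.0 - y)          # derivative of sigmoid
    grad_w = dloss_dy * dy_dz * x
    grad_b = dloss_dy * dy_dz
    # Gradient descent: step each parameter against its gradient.
    return w - lr * grad_w, b - lr * grad_b, loss
```

Repeating `train_step` drives the loss down; full backpropagation applies this same chain-rule bookkeeping layer by layer.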

Neural Network Visualizer

Output readout: live probabilities (%) for the four actions: Forward, Left, Right, Reverse.

Training Mode

Evolution Settings

Network Legend: Active Neuron, Inactive Neuron, Positive Weight, Negative Weight.