Continuous-time recurrent neural network implementation
The default continuous-time recurrent neural network (CTRNN) implementation
in neat-python
is modeled as a system of ordinary differential equations, with neuron potentials as the dependent variables.
\(\tau_i \frac{d y_i}{dt} = -y_i + f_i\left(\beta_i + \sum\limits_{j \in A_i} w_{ij} y_j\right)\)
Where:
- \(\tau_i\) is the time constant of neuron \(i\).
- \(y_i\) is the potential of neuron \(i\).
- \(f_i\) is the activation function of neuron \(i\).
- \(\beta_i\) is the bias of neuron \(i\).
- \(A_i\) is the set of indices of neurons that provide input to neuron \(i\).
- \(w_{ij}\) is the weight of the connection from neuron \(j\) to neuron \(i\).
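For concreteness, here is a minimal sketch of how the right-hand side of this equation can be evaluated in plain Python. The `Neuron` data structure and the `compute_derivatives` helper are illustrative only, not part of the neat-python API:

```python
import math
from dataclasses import dataclass, field
from typing import Callable, List, Tuple

@dataclass
class Neuron:
    """Hypothetical CTRNN node used only to illustrate the equation above."""
    time_constant: float                                 # tau_i
    bias: float                                          # beta_i
    activation: Callable[[float], float] = math.tanh     # f_i
    inputs: List[Tuple[int, float]] = field(default_factory=list)  # (j, w_ij) pairs

def compute_derivatives(neurons: List[Neuron], potentials: List[float]) -> List[float]:
    """Evaluate dy_i/dt = (-y_i + f_i(beta_i + sum_j w_ij * y_j)) / tau_i for each neuron."""
    derivatives = []
    for i, n in enumerate(neurons):
        weighted_sum = sum(w * potentials[j] for j, w in n.inputs)
        derivatives.append(
            (-potentials[i] + n.activation(n.bias + weighted_sum)) / n.time_constant
        )
    return derivatives
```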
The time evolution of the network is computed using the forward Euler method:
\(y_i(t+\Delta t) = y_i(t) + \Delta t \frac{d y_i}{dt}\)
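Continuing the sketch above, a single forward Euler step advances every potential by \(\Delta t\) times its derivative, and repeated steps integrate the network over time. The `euler_step` helper and the two-neuron example below are illustrative; in neat-python itself the integration is handled internally by its CTRNN network class.

```python
def euler_step(neurons: List[Neuron], potentials: List[float], dt: float) -> List[float]:
    """One explicit (forward) Euler step: y_i(t + dt) = y_i(t) + dt * dy_i/dt."""
    derivatives = compute_derivatives(neurons, potentials)
    return [y + dt * dy for y, dy in zip(potentials, derivatives)]

# Example: two mutually connected neurons integrated for one simulated second.
neurons = [
    Neuron(time_constant=1.0, bias=0.0, inputs=[(1, 0.5)]),
    Neuron(time_constant=0.5, bias=-0.2, inputs=[(0, -1.0)]),
]
potentials = [0.0, 0.0]
dt = 0.01
for _ in range(100):
    potentials = euler_step(neurons, potentials, dt)
```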