Enhancing behavioral response by merging visual orientation and direction selectivity using spiking neurons in a robotic learning context

This paper proposes a spiking neuron model that controls a virtual robot by integrating visual direction- and orientation-selectivity features. The experiment shows that merging these two neural circuits speeds up the behavioral decision when tested in an operant conditioning learning context. Specifically, the positive effect was observed when the stimuli were orthogonal to their motion.

Note: the following comprises all supplementary materials available for the article.

SNN architecture

Since the experimental protocol consists of moving full-length black lines in front of the robot, it was not necessary to include every synaptic link and orientation/direction neuron. The remaining neurons and synapses would only be needed in a context where shorter lines are displayed.
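To illustrate the principle behind the architecture, the following is a minimal leaky integrate-and-fire (LIF) sketch of how converging an orientation-selective and a direction-selective channel onto one decision neuron can speed up its response. All names and parameter values here are illustrative assumptions, not the values or the neuron model used in the article's SNN.

```python
def lif_spikes(current, v_rest=-65.0, v_thresh=-50.0, v_reset=-65.0,
               tau_m=10.0, dt=1.0):
    """Integrate an input-current trace (one sample per ms); return spike times."""
    v = v_rest
    spikes = []
    for t, i_in in enumerate(current):
        # Leaky integration: dv/dt = ((v_rest - v) + I) / tau_m
        v += dt * ((v_rest - v) + i_in) / tau_m
        if v >= v_thresh:
            spikes.append(t)  # threshold crossed: emit a spike and reset
            v = v_reset
    return spikes

# Both afferents fire together every 2 ms, as for a full-length line whose
# orientation and motion direction drive their respective circuits at once.
steps = 200
w_ori, w_dir = 30.0, 30.0  # hypothetical synaptic weights
afferent = [1.0 if t % 2 == 0 else 0.0 for t in range(steps)]

single = [w_ori * s for s in afferent]            # orientation channel only
merged = [(w_ori + w_dir) * s for s in afferent]  # both channels converge

first_single = lif_spikes(single)[0]
first_merged = lif_spikes(merged)[0]
# The merged drive reaches threshold sooner and fires more often, mirroring
# the faster behavioral decision reported in the paper.
```

Under these assumed parameters, the merged input crosses threshold within a few milliseconds, whereas the single channel needs tens of milliseconds of temporal summation; the qualitative effect, not the exact timing, is the point of the sketch.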

Initial table of neural components

- Synaptic values
- Neural values
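A possible way to organize the two tables above in code is sketched below; every component name and numeric value is a placeholder chosen for illustration, not data taken from the article.

```python
# Hypothetical layout for the neural and synaptic parameter tables.
neural_values = {
    # neuron id: (resting potential mV, threshold mV, membrane time constant ms)
    "orientation_0deg": (-65.0, -50.0, 10.0),
    "direction_right":  (-65.0, -50.0, 10.0),
    "decision":         (-65.0, -50.0, 10.0),
}

synaptic_values = {
    # (presynaptic neuron, postsynaptic neuron): weight
    ("orientation_0deg", "decision"): 30.0,
    ("direction_right", "decision"):  30.0,
}
```

Keeping the synaptic table keyed on (pre, post) pairs makes it straightforward to add the remaining links later, should shorter lines require the full set of orientation/direction neurons.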