Modelling fast forms of visual neural plasticity using a modified second-order motion energy model
CONTILLO, Adriano;
2014
Abstract
The Adelson-Bergen motion energy sensor is well established as the leading model of low-level visual motion sensing in human vision. However, the standard model cannot predict adaptation effects in motion perception. A previous paper by Pavan et al. (Journal of Vision 10:1–17, 2013) presented an extension to the model which uses a first-order RC gain-control circuit (leaky integrator) to implement adaptation effects which can span many seconds, and showed that the extended model's output is consistent with psychophysical data on the classic motion after-effect. Recent psychophysical research has reported adaptation over much shorter time periods, spanning just a few hundred milliseconds. The present paper further extends the sensor model to implement rapid adaptation, by adding a second-order RC circuit which causes the sensor to require a finite amount of time to react to a sudden change in stimulation. The output of the new sensor accounts accurately for psychophysical data on rapid forms of facilitation (rapid visual motion priming, rVMP) and suppression (rapid motion after-effect, rMAE). Changes in natural scene content occur over multiple time scales, and multi-stage leaky integrators of the kind proposed here offer a computational scheme for modelling adaptation over multiple time scales.
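As a rough illustration of the building blocks named in the abstract (a first-order leaky integrator used for gain control, and a second-order RC stage built from two such integrators in cascade), a minimal discrete-time sketch is given below. All function names, parameter values, and the divisive form of the gain control are assumptions made for illustration; they are not the published implementation of the extended Adelson-Bergen sensor.

```python
# Illustrative sketch only: generic first-order leaky integrator (RC low-pass),
# a cascade of two such stages (second-order RC), and a divisive gain control
# driven by the integrator state. Parameter values are placeholders.
import numpy as np

def leaky_integrator(signal, dt, tau):
    """First-order RC stage: y' = (x - y) / tau, integrated with Euler steps."""
    y = np.zeros_like(signal, dtype=float)
    for t in range(1, len(signal)):
        y[t] = y[t - 1] + (dt / tau) * (signal[t] - y[t - 1])
    return y

def second_order_stage(signal, dt, tau1, tau2):
    """Two leaky integrators in cascade: the output needs a finite amount of
    time to react to a sudden change in its input (sluggish onset)."""
    return leaky_integrator(leaky_integrator(signal, dt, tau1), dt, tau2)

def gain_controlled_energy(energy, dt, tau_adapt):
    """Divisive gain control driven by a leaky integral of past energy, so
    prolonged stimulation progressively attenuates the response (adaptation)."""
    adapt_state = leaky_integrator(energy, dt, tau_adapt)
    return energy / (1.0 + adapt_state)

if __name__ == "__main__":
    dt = 0.001                          # 1 ms time step
    t = np.arange(0.0, 2.0, dt)
    energy = (t > 0.5).astype(float)    # step of motion energy at t = 0.5 s
    slow = gain_controlled_energy(energy, dt, tau_adapt=1.0)       # seconds-scale adaptation
    fast = second_order_stage(energy, dt, tau1=0.05, tau2=0.05)    # ~100 ms reaction lag
    print(slow[-1], fast[-1])
```

In a sketch of this kind, the slow first-order stage models adaptation spanning many seconds (classic motion after-effect), while the faster second-order cascade models effects over a few hundred milliseconds (rVMP, rMAE); stacking stages with different time constants gives one way to cover multiple time scales.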