08.11.2017: Francesca Mastrogiuseppe


Linking connectivity, dynamics and computations in recurrent neural networks


Synaptic connectivity determines the dynamics and computations performed by neural circuits. Due to the highly recurrent nature of circuitry in cortical networks, the relationship between connectivity, dynamics and computations is complex, and understanding it requires theoretical models.
Classical models of recurrent networks are based on connectivity that is either fully random or highly structured, e.g., clustered. Experimental measurements, in contrast, show that cortical connectivity lies somewhere between these two extremes. Moreover, a number of functional approaches suggest that a minimal amount of structure in the connectivity is sufficient to implement a large range of computations.
Based on these observations, here we develop a theory of recurrent networks with a connectivity consisting of a combination of a random part and a minimal, low-dimensional structure. We show that in such networks, the dynamics are low-dimensional and can be directly inferred from connectivity using a geometrical approach. We exploit this understanding to determine the minimal connectivity structures required to implement specific computations. We find that the dynamical range and computational capacity of a network increase quickly with the dimensionality of the structure in the connectivity. Our simplified theoretical framework captures and connects a number of outstanding experimental observations, in particular the fact that neural representations are high-dimensional and distributed, while dynamics are low-dimensional, with a dimensionality that increases with task complexity.
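The setup described above can be sketched numerically: a rate network whose connectivity is the sum of a full-rank random matrix and a minimal, rank-one structure. This is an illustrative sketch, not the speaker's code; all parameter values (network size N, random strength g, the Gaussian structure vectors m and n, and the choice of making n proportional to m so that the structure supports a nonzero fixed point) are assumptions made for the example.

```python
import numpy as np

# Illustrative sketch of a recurrent network with connectivity
# J = random part + rank-one structure, as described in the abstract.
# Parameter values below are assumptions chosen for demonstration.
rng = np.random.default_rng(0)
N = 500                       # number of units (illustrative)
g = 0.8                       # strength of the random part (g < 1)
m = rng.standard_normal(N)    # "output" structure vector
n = 2.2 * m                   # "input" structure vector; the overlap
                              # with m is what drives structured activity

# Connectivity: full-rank random matrix plus minimal (rank-one) structure
J = g * rng.standard_normal((N, N)) / np.sqrt(N) + np.outer(m, n) / N

# Rate dynamics dx/dt = -x + J * tanh(x), integrated with forward Euler
dt, steps = 0.1, 2000
x = rng.standard_normal(N)
kappa = []                    # projection of the activity onto m
for _ in range(steps):
    x = x + dt * (-x + J @ np.tanh(x))
    kappa.append(m @ np.tanh(x) / N)

# Although the network is high-dimensional, the structured part of the
# dynamics is captured by the single scalar kappa, which settles to a
# fixed value: the dynamics are effectively low-dimensional.
print(kappa[-1])
```

Projecting the population activity onto the structure vector m compresses the N-dimensional trajectory into one scalar, illustrating the abstract's point that dynamics can be low-dimensional and read off geometrically from the connectivity even when individual-neuron representations are high-dimensional and distributed.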