Thu 30 July 2015

CNS 2015 - Day 1

Posted by ankur in Research (421 words, approximately a 2 minute read)


The notes have not been proofread. Please do your research before you pick anything from this post. It is meant to be a rough sketch of everything that I heard and noted at the conference. Since quite a bit of this is new to me, it is bound to be inaccurate, unspecific, and possibly even incorrectly quoted.

Day 0 Keynote - "Birdsong"

  1. Adrienne Fairhall
  2. Birds learn their songs by trial and error.
  3. The Zebra Finch has a single song.
  4. STDP may require sustained depolarisation or bursting to occur (the baseline pairwise rule is sketched after this list).
  5. The structure of the basal ganglia is pretty conserved across all mammals.
  6. EI -> attractor -> stability
  7. The dopamine effect on the avalanche distribution in the basal ganglia is U-shaped; therefore, both too much and too little give negative results.
  8. Q: Why do you need variability for learning? (Structured variability)
  9. Q: How do we isolate the variability that was "good"?
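As referenced in point 4, here is a minimal sketch of the textbook pairwise STDP window for comparison. The amplitudes and time constants are illustrative, not from the talk:

    import numpy as np

    A_plus, A_minus = 0.01, 0.012     # potentiation/depression amplitudes (illustrative)
    tau_plus, tau_minus = 20.0, 20.0  # decay time constants (ms, illustrative)

    def stdp_dw(dt_ms):
        """Weight change for one pre/post spike pair; dt_ms = t_post - t_pre."""
        if dt_ms > 0:                 # pre before post: potentiation
            return A_plus * np.exp(-dt_ms / tau_plus)
        return -A_minus * np.exp(dt_ms / tau_minus)  # post before pre: depression

    print(stdp_dw(10.0), stdp_dw(-10.0))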

Day 1 Keynote - Wilson-Cowan equations

  1. Jack Cowan
  2. Wilson-Cowan equations.
  3. Attractor dynamics in neural systems.
  4. Exhibit various stable behaviours
  5. Oscillations before settling to a fixed point (see the sketch after this list)
    1. Stable forms.
    2. In the Vogels self-organising model
    3. CITE: paper in press
  6. Near a phase transition, no need to look at details of single neurons - you're not missing anything by ignoring single neuron details.
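For reference, the Wilson-Cowan equations couple excitatory and inhibitory population rates: \(\tau \dot{E} = -E + S(w_{EE}E - w_{EI}I + P)\) and \(\tau \dot{I} = -I + S(w_{IE}E - w_{II}I + Q)\), with \(S\) a sigmoid. A minimal Euler-integration sketch of point 5 - all parameters are made up, just to show the dynamics:

    import numpy as np

    def S(x):
        """Sigmoidal population gain function."""
        return 1.0 / (1.0 + np.exp(-x))

    w_ee, w_ei, w_ie, w_ii = 12.0, 10.0, 10.0, 2.0  # coupling strengths (made up)
    P, Q = 1.0, -2.0                                # external drives (made up)
    tau, dt = 10.0, 0.1                             # time constant (ms), Euler step

    E, I = 0.1, 0.1
    for _ in range(5000):
        dE = (-E + S(w_ee * E - w_ei * I + P)) / tau
        dI = (-I + S(w_ie * E - w_ii * I + Q)) / tau
        E, I = E + dt * dE, I + dt * dI

    print(E, I)  # depending on parameters, the rates oscillate or settle to a fixed point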

Limits of scalability of cortical network models

  1. Sacha van Albada
  2. The mechanism at \(N \rightarrow \infty\) is not the same as the mechanism at finite size.
  3. Inappropriate scaling can also cause the network to become unstable -> for example, cause large oscillations.
  4. Asynchronous irregular state; therefore, Gaussian inputs are assumed.
  5. The LIF model is like the rate model with white noise added to the outputs.
  6. So, while scaling, you have to maintain the effective connectivity and also maintain the mean activities (see the downscaling sketch after this list).
  7. Important to simulate at natural scale to verify.
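A minimal sketch of the downscaling recipe that point 6 alludes to: scaling in-degrees by a factor \(\gamma\) and weights by \(1/\sqrt{\gamma}\) preserves the variance of the input, and a compensating DC drive restores its mean. All numbers here are made up:

    import numpy as np

    K = 10000    # in-degree at natural scale
    J = 0.1      # synaptic weight (mV)
    nu = 8.0     # presynaptic firing rate (spikes/s)
    gamma = 0.1  # downscaling factor: K' = gamma * K

    K_s = gamma * K
    J_s = J / np.sqrt(gamma)          # preserves the input variance: K' * J'^2 = K * J^2
    dc = K * J * nu - K_s * J_s * nu  # DC drive that restores the mean input

    print(J_s, dc)

If I understood correctly, matching the mean and variance like this does not automatically preserve the effective connectivity - hence point 7 about verifying at the natural scale.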

Complex synapses as efficient memory systems

  1. Markus Benna.
  2. Dense coding.
  3. The SNR (signal-to-noise ratio) of the memory trace is also used.
  4. The synaptic weight distribution gets wider and wider - a diffusion process.
  5. Good synaptic memory model:
    1. Work with tightly bounded weights
    2. Online learning
    3. High SNR.
    4. Not too complicated
    5. Long lifetime
    6. CITE: Amit and Fusi 1994
  6. Cascade model of complex synapse
  7. Need a balance of LTP/LTD - otherwise the distribution gets squished against one of the boundaries (see the random-walk sketch after this list).
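The diffusion picture in points 4 and 7 can be seen with a toy bounded random walk over many synapses. This is not the speaker's actual model, just the intuition:

    import numpy as np

    rng = np.random.default_rng(0)
    n_syn, n_events = 10000, 1000
    p_ltp = 0.5  # set to e.g. 0.6 to see weights pile up against the upper bound
    w = np.zeros(n_syn)
    for _ in range(n_events):
        # each plasticity event steps the weight up (LTP) or down (LTD)
        step = np.where(rng.random(n_syn) < p_ltp, 0.01, -0.01)
        w = np.clip(w + step, -1.0, 1.0)  # hard bounds on the weights

    print(w.std())  # the distribution widens diffusively until the bounds bite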

Self-organisation of computation in neural systems by interaction between homoeostatic and synaptic plasticity

  1. Sakyasingha Dasgupta
  2. Cell assembly properties
    1. Pattern completion
    2. Input-output (I-O) association
    3. Persistent activity
  3. Synaptic scaling is about 50-100 times slower than the synaptic plasticity process (a toy two-timescale sketch follows this list).
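A toy sketch of the timescale separation in point 3: fast, fluctuating plasticity stabilised by synaptic scaling that is about 100 times slower. The rate model here is made up:

    import numpy as np

    rng = np.random.default_rng(1)
    tau_fast, tau_slow = 1.0, 100.0  # scaling ~100x slower, as per the note
    r_target, dt = 5.0, 0.01
    w = 2.0
    for _ in range(200000):
        r = 2.0 * w                                # toy firing rate
        hebb = rng.normal(0.0, 1.0) / tau_fast     # fast, fluctuating plasticity
        scale = w * (r_target - r) / tau_slow      # slow multiplicative scaling
        w += dt * (hebb + scale)

    print(w, 2.0 * w)  # the rate hovers near the target despite the fast updates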

A model for spatially periodic firing in the hippocampal formation based on interacting excitatory and inhibitory plasticity

  1. Simon Weber
  2. If inhibition is not precise enough, you get periodic firing (a difference-of-Gaussians intuition is sketched after this list).
  3. Model of grid and place cells
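The periodic-firing point can be illustrated with a difference-of-Gaussians intuition: if the inhibitory spatial tuning is broader (less precise) than the excitatory one, the effective input kernel is a Mexican hat, so a nonzero spatial frequency dominates. The widths below are illustrative, not from the talk:

    import numpy as np

    x = np.linspace(-50, 50, 2001)  # position (cm)
    sig_e, sig_i = 3.0, 9.0         # excitatory vs broader inhibitory tuning widths

    def g(s):
        """Normalised Gaussian tuning curve of width s."""
        return np.exp(-x**2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))

    kernel = g(sig_e) - g(sig_i)    # effective excitation-minus-inhibition kernel

    spec = np.abs(np.fft.rfft(kernel))
    freqs = np.fft.rfftfreq(x.size, d=x[1] - x[0])
    print(1.0 / freqs[np.argmax(spec)])  # dominant spatial period (cm)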
