Fri 14 August 2015

CNS 2015 Day 2 and 3

Posted by ankur in Research (334 words, approximately a 2 minute read)


The notes have not been proofread. Please do your research before you pick anything from this post. It is meant to be a rough sketch of everything that I heard and noted at the conference. Since quite a bit of this is new to me, it is bound to be inaccurate, unspecific, and possibly even incorrectly quoted.

Day 2 - Collective information storage by stochastic model of structural plasticity

  1. If an animal is learning, the rates of spine formation and destruction are much higher.
  2. Model of structural plasticity
    1. Neural activity.
    2. Synaptic weights.
    3. Network structure.
    4. Weight is directly proportional to spine volume
    5. Spine volume is directly proportional to spine stability
    6. Stochastic
    7. P(removal), P(formation)
    8. Calibrated using experimental data.
    9. Post-synaptic correlation stabilises synaptic weights.
    10. Synapses between two neurons don't know about each other.
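A toy sketch of this kind of stochastic structural plasticity model (my own illustration, not the speaker's calibrated model; the probabilities and the weight-growth rule are placeholders):

```python
import random

random.seed(42)


def step(synapses, corr, p_form=0.1, base_p_remove=0.2):
    """One stochastic update of a set of synaptic contacts.

    synapses: dict contact_id -> weight (taken as proportional to spine
    volume, and larger volume as more stable). corr: pre/post correlation
    in [0, 1]; higher correlation lowers the removal probability and grows
    the spine (toy assumptions).
    """
    for cid in list(synapses):
        # larger spines are more stable; correlation stabilises them further
        p_remove = base_p_remove * (1 - corr) / (1 + synapses[cid])
        if random.random() < p_remove:
            del synapses[cid]              # spine pruned
        else:
            synapses[cid] += 0.1 * corr    # correlated activity grows the spine
    # new small spines form stochastically
    if random.random() < p_form:
        synapses[max(synapses, default=0) + 1] = 0.1
    return synapses


# with strongly correlated activity, spines tend to survive and grow
s = {1: 0.5, 2: 0.5}
for _ in range(100):
    step(s, corr=0.9)
```

The two P(removal)/P(formation) knobs are exactly what the talk said gets calibrated against experimental spine-turnover data.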

Limited range correlations, when modulated by firing rate, can substantially improve neural population coding

  1. Noisy population coding problem.
  2. Retina displays rate dependent correlations that strongly enhance population codes.
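The "noisy population coding problem" here is the classic point that shared (correlated) noise does not average away across neurons; a minimal illustration of why correlations matter for a population code (my own sketch, not from the talk):

```python
import random
import statistics

random.seed(0)


def readout_variance(n_neurons, shared_frac, n_trials=2000, signal=1.0):
    """Variance of the population-averaged estimate of a signal when
    shared_frac of each neuron's unit noise variance is common to the
    whole population (and therefore survives averaging)."""
    estimates = []
    for _ in range(n_trials):
        common = random.gauss(0, shared_frac ** 0.5)
        responses = [
            signal + common + random.gauss(0, (1 - shared_frac) ** 0.5)
            for _ in range(n_neurons)
        ]
        estimates.append(sum(responses) / n_neurons)
    return statistics.variance(estimates)


# independent noise shrinks roughly as 1/N; shared noise sets a floor
v_independent = readout_variance(100, shared_frac=0.0)
v_correlated = readout_variance(100, shared_frac=0.3)
```

Whether correlations hurt or help depends on their structure relative to the signal; the talk's claim is that the retina's rate-dependent correlations fall on the helpful side.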

Day 3 - Gerstner Keynote

  1. Model scales
    1. Population rate model (Wilson-Cowan) <- coarse graining <- phenomenological model (LIF) <- simplification <- biophysical (Hodgkin-Huxley).
    2. CITE: Harris & Shepherd 2015 - populations of neurons and classification.
    3. Parameter extraction
    4. Adaptation
    5. Generalised linear model (GLM) or spike response model (SRM).
    6. CITE: Gerstner and Naud 2009
  2. Steps:
    1. Systematic optimisation of parameters
      1. Predict membrane potential -> quadratic error function
      2. We have potentials, now optimise spike timings.
      3. Very quick process
      4. Spikes and thresholds affect the functioning of the neuron over a window of about 10 seconds.
    2. Quantifying spike timing - 90% predictability
    3. CITE: Mensi et al. J. Neurophysiology 2011
    4. Allen Institute - high throughput work
    5. CITE: Naud and Gerstner 2012 - PLOS computational biology.
    6. Fluctuations are good because they ensure that multiple solutions are exhibited
    7. Finite size issue
    8. Power spectrum
    9. Schwalger et al. 2014 + poster at CNS
    10. For differential equations -> find fixed points -> linearise -> then other analysis.
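The last step (find the fixed points of the rate equations, then linearise around them) can be shown on a one-dimensional Wilson-Cowan-style rate equation, dr/dt = -r + f(w*r + I). This is my own toy example, not code from the talk:

```python
import math


def f(x):
    """Sigmoidal gain function."""
    return 1.0 / (1.0 + math.exp(-x))


def fixed_point(w, I, r0=0.5, iters=200):
    """Find a fixed point of dr/dt = -r + f(w*r + I), i.e. a rate r*
    with r* = f(w*r* + I), by fixed-point iteration."""
    r = r0
    for _ in range(iters):
        r = f(w * r + I)
    return r


def is_stable(w, I, r, eps=1e-6):
    """Linearise around r: the fixed point is stable iff
    d/dr [-r + f(w*r + I)] < 0, i.e. w * f'(w*r + I) < 1."""
    fprime = (f(w * r + I + eps) - f(w * r + I - eps)) / (2 * eps)
    return w * fprime < 1.0


r_star = fixed_point(w=0.5, I=0.0)
```

With weak recurrence (w = 0.5) the slope condition w * f'(...) < 1 holds, so the single fixed point is stable; stronger recurrence can create multiple fixed points, which is where the "other analysis" comes in.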


  1. Critical state -> functions of network are most efficient
  2. Lots of evidence from models, but only recently has it received experimental support - Poster 221
  3. Disadvantages of criticality
    1. Fine tuning
    2. Slightly sub-critical is better.
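The usual toy model behind these criticality claims is a branching process with branching ratio sigma: sigma = 1 is the critical point, and slightly sub-critical (sigma just below 1) keeps activity bounded. A quick sketch (mine, not from the posters):

```python
import random
import statistics

random.seed(1)


def avalanche_size(sigma, cap=10_000):
    """Total size of one avalanche in a branching process: each active
    unit spawns Binomial(2, sigma/2) offspring, so the mean number of
    offspring (the branching ratio) is sigma."""
    active, size = 1, 0
    while active and size < cap:
        size += active
        active = sum(
            1 for _ in range(2 * active) if random.random() < sigma / 2
        )
    return size


def mean_size(sigma, trials=3000):
    return statistics.mean(avalanche_size(sigma) for _ in range(trials))


# deeply sub-critical activity dies out fast; near-critical avalanches
# are much larger (expected mean size is 1 / (1 - sigma))
m_sub = mean_size(0.5)
m_near = mean_size(0.95)
```

The fine-tuning disadvantage is visible here: the mean avalanche size diverges as sigma approaches 1, so sitting exactly at criticality is fragile, while slightly sub-critical retains large dynamic range without runaway activity.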
