Some old news

2015 Mehran Ahmadlou received the NIN PhD Brain Award for his article in Nature Communications.

2014 Christiaan Levelt and I organized the NIN International Interneuron Summer School, which took place June 29 - July 3, 2014 in Amsterdam.

2013 Rob Williams, Rupert Overall and I organized the first neuroinformatics jamboree, a cool short course on neuroinformatics, neurogenomics and brain disease. With about 40 people we spent 6 days mining the huge pile of high-quality online neuroscience data and produced draft papers, now published in Frontiers in Neuroscience.

2012 (Levelt lab). A newspaper article described our finding that mice (and perhaps you too) lose more inhibitory connections between neurons when an eye is closed for a period of time.
Two inhibitory synapses disappearing from a neuronal dendrite

2010 (Levelt lab). Monocular deprivation in the mouse is a model for human amblyopia (lazy eye) and the paradigmatic in vivo model for experience-dependent plasticity. We are particularly interested in the molecular pathways underlying the changes. In the elderly and in people with a lazy eye, vision can be poor while the eyes themselves are in fine condition. We discovered that deficient BDNF signaling in the cerebral cortex can be the cause of these vision impairments, and that the relationship between contrast and acuity is predicted by the normalization model (Nature Neuroscience, 2010). A national newspaper reported on our research: Scherp gezichtsvermogen hangt mede af van contrast ("Sharp vision partly depends on contrast", in Dutch).
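For intuition, here is a minimal Python sketch of the general idea: a divisive-normalization (Naka-Rushton type) contrast-response function combined with a fixed detection threshold predicts that acuity improves with contrast. All function names, the attenuation profile and the parameter values are illustrative assumptions of mine, not the model fit from the paper.

```python
import numpy as np

def contrast_response(c, r_max=1.0, c50=0.2, n=2.0):
    """Divisive-normalization contrast-response function:
    R(c) = R_max * c**n / (c**n + c50**n). Parameters are illustrative."""
    return r_max * c**n / (c**n + c50**n)

def acuity(contrast, spatial_freqs, threshold=0.1, sf_zero=0.6):
    """Highest spatial frequency (cyc/deg) still detected at a given contrast.

    Assumes (hypothetically) that responses fall off linearly with spatial
    frequency, reaching zero at sf_zero, and that a grating is 'seen' when
    the response exceeds a fixed threshold."""
    attenuation = np.clip(1.0 - spatial_freqs / sf_zero, 0.0, None)
    responses = contrast_response(contrast) * attenuation
    detected = spatial_freqs[responses > threshold]
    return detected.max() if detected.size else 0.0

sfs = np.linspace(0.05, 0.6, 100)   # cycles per degree, mouse-like range
for c in (0.1, 0.3, 1.0):
    print(f"contrast {c:.1f}: acuity ~ {acuity(c, sfs):.2f} cyc/deg")
```

Running this toy model shows the qualitative prediction: the measurable acuity rises as stimulus contrast increases.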


2009 (Levelt lab). To study changes in the early visual pathway in mouse models, we also set up mouse ERG recordings. With this technique we took part in the discovery that the TRPM1 channel is the transduction channel in rod bipolar cells of the mouse retina. This subsequently led to the identification of TRPM1 as a locus of heritable night blindness in humans.

2008. Amsterdam (Levelt lab). The intrinsic signal is the change in blood flow and oxygenation caused by local brain activity. With the intrinsic signal we can study the horizontal organization of the cortex at a resolution of about 0.1 mm. By studying population activity with non-invasive intrinsic signal imaging (e.g. Heimel et al. 2006) we measured the large heritability of the amount of experience-dependent plasticity and identified a possible quantitative trait locus for ocular dominance plasticity.
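As a rough illustration of the kind of readout involved, here is a minimal Python sketch of computing an ocular dominance index (ODI) from two intrinsic-signal response maps; the shift of this index after monocular deprivation is a common measure of the amount of plasticity. Variable names, the threshold and the toy data are assumptions, not the lab's analysis code.

```python
import numpy as np

def ocular_dominance_index(contra_map, ipsi_map, response_threshold=0.0):
    """Mean ODI = (contra - ipsi) / (contra + ipsi) over responsive pixels,
    ranging from -1 (ipsilateral eye dominates) to +1 (contralateral)."""
    contra = np.asarray(contra_map, dtype=float)
    ipsi = np.asarray(ipsi_map, dtype=float)
    total = contra + ipsi
    responsive = total > response_threshold      # keep pixels that respond at all
    odi = (contra[responsive] - ipsi[responsive]) / total[responsive]
    return odi.mean()

# Toy example with fake response amplitudes; a deprived-eye shift would
# show up as a change in the mean ODI between imaging sessions.
rng = np.random.default_rng(0)
contra = rng.uniform(0.5, 1.0, size=(100, 100))
ipsi = rng.uniform(0.2, 0.6, size=(100, 100))
print(f"mean ODI ~ {ocular_dominance_index(contra, ipsi):.2f}")
```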

2005. Boston (Nelson lab). The gray squirrel is a diurnal, highly visual animal with good eyes, a cone-dominated retina and a well-developed visual system. We provided evidence for the homology of the X and Y neuronal cell classes in the lateral geniculate nucleus of carnivores and rodents with the P and M cell classes in the primate LGN, respectively (Van Hooser et al. 2003). The squirrel has a larger visual cortex than some animals that do have an orientation map, yet it lacks such an organization (Van Hooser et al. 2005). A laminar organization of response properties is present in V1, in particular in the proportion of simple and complex cells, direction selectivity and cone opponency (Heimel et al. 2005).
Image courtesy of Steve Van Hooser

2001. London (Coolen group). In the Minority Game all players aim to make the minority decision. It models some aspects of a market in which all traders try to make a rational decision based on the same public information. In the limit of a large number of players, one can use statistical mechanics to predict the fluctuations in the market. In the graph, the market volatility is plotted on the y-axis against the dimension of the information space on which the players base their decisions. If there is little relevant information, the behaviour of the market becomes non-ergodic (i.e. dependent on the initial conditions).
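For readers who want to play with the model, here is a minimal Python simulation sketch of the standard Challet-Zhang Minority Game (using random external information rather than the true game history, a common simplification), estimating the volatility sigma^2/N as a function of the memory length m, where 2^m is the dimension of the information space. This is a toy simulation, not the analytical generating-functional treatment used in the paper.

```python
import numpy as np

def minority_game(n_players=101, m=3, s=2, t_steps=5000, seed=0):
    """Run one Minority Game and return the volatility sigma^2 / N."""
    rng = np.random.default_rng(seed)
    p = 2 ** m                                   # size of the information space
    # each player holds s fixed random strategies: a +/-1 decision per state
    strategies = rng.choice([-1, 1], size=(n_players, s, p))
    scores = np.zeros((n_players, s))
    history = rng.integers(p)                    # current public information
    attendance = []
    for _ in range(t_steps):
        best = scores.argmax(axis=1)             # each player plays their best-scoring strategy
        actions = strategies[np.arange(n_players), best, history]
        a_t = actions.sum()                      # aggregate decision A(t); minority side is -sign(A)
        attendance.append(a_t)
        # strategies that would have placed the player in the minority gain a point
        scores -= strategies[:, :, history] * np.sign(a_t)
        history = rng.integers(p)                # fresh random information each round
    a = np.array(attendance)
    return a.var() / n_players

for m in range(1, 8):
    alpha = 2 ** m / 101
    print(f"m={m}, alpha={alpha:.2f}, sigma^2/N = {minority_game(m=m):.2f}")
```

Sweeping m reproduces the qualitative picture in the graph: large fluctuations when the information space is small relative to the number of players, a minimum of the volatility near the critical point, and a slow approach to the random-trading value for large alpha.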


2000. London (Coolen group). Neural networks are algorithms for pattern recognition and classification inspired by models of the brain. The dynamics of learning a finite set of training examples can be computed using the Martin-Siggia-Rose generating functional formalism. The graph shows simulation and analysis for the perceptron learning rule with different training set sizes, where alpha is the ratio of the number of patterns to the number of neurons. The top lines are generalization errors; the bottom lines (in reverse order) are training errors.
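As a complement to the analytical treatment, here is a minimal Python simulation sketch of the setting behind the graph: a student perceptron learns p = alpha * N random patterns labelled by a teacher perceptron using the classic perceptron learning rule, and we report the training error on the stored patterns and the generalization error on new patterns. The network sizes, numbers of sweeps and function names are illustrative choices of mine, not those used for the published curves.

```python
import numpy as np

def perceptron_errors(n=200, alpha=1.0, sweeps=50, seed=0):
    """Train a student perceptron on alpha*n teacher-labelled random patterns
    and return (training error, generalization error)."""
    rng = np.random.default_rng(seed)
    p = int(alpha * n)
    teacher = rng.standard_normal(n)
    patterns = rng.choice([-1.0, 1.0], size=(p, n))
    labels = np.sign(patterns @ teacher)
    student = np.zeros(n)
    for _ in range(sweeps):
        for mu in rng.permutation(p):            # one sweep through the training set
            if np.sign(patterns[mu] @ student) != labels[mu]:
                student += labels[mu] * patterns[mu] / n   # perceptron rule update
    train_err = np.mean(np.sign(patterns @ student) != labels)
    # generalization error = probability of disagreeing with the teacher on a
    # new random pattern = arccos(student-teacher overlap) / pi
    overlap = (student @ teacher) / (np.linalg.norm(student) * np.linalg.norm(teacher))
    gen_err = np.arccos(np.clip(overlap, -1.0, 1.0)) / np.pi
    return train_err, gen_err

for alpha in (0.5, 1.0, 2.0, 4.0):
    tr, ge = perceptron_errors(alpha=alpha)
    print(f"alpha={alpha}: training error {tr:.3f}, generalization error {ge:.3f}")
```

As in the figure, the training error stays below the generalization error, and both the gap and the generalization error shrink as alpha (the relative size of the training set) grows.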