
Cosyne 2007 Workshops


February 26-27, 2007

The Canyons, Utah


Workshop Title

Emerging information-theoretic measures and methods in neuroscience

Organizer(s)

Michael Gastpar (UC Berkeley): gastpar@eecs.berkeley.edu
Jonathan Victor (Cornell): jdvicto@med.cornell.edu

Abstract

Direct application of information-theoretic tools to laboratory measurements of stimulus-response relationships has resulted in a number of important insights. However, these approaches often require very large amounts of data (especially for multineuronal analyses), and are thus of limited practicality in vertebrate systems, especially the central nervous system. Moreover, there are sound theoretical reasons for using an information-theoretic approach even when the neurons under study do not behave "optimally."
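
As context for the data-requirement problem noted above, here is a minimal sketch (in Python, using only NumPy) of the naive "plug-in" mutual-information estimator; the function name, the toy stimuli, and the Poisson responses are illustrative assumptions, not material from the workshop. With few trials per (stimulus, response) bin, the simple counting step yields an upward-biased estimate, which is exactly the limited-data issue that motivates the refined estimators discussed at the workshop.

    # Minimal sketch of the naive "plug-in" mutual-information estimator.
    # Illustrative only: names and toy data are assumptions, not workshop code.
    import numpy as np

    def plugin_mutual_information(stimuli, responses):
        """Estimate I(S;R) in bits from paired discrete observations.

        The joint distribution is estimated by simple counting, so the
        estimate is biased upward when the number of trials is small
        relative to the number of (stimulus, response) bins.
        """
        stimuli = np.asarray(stimuli)
        responses = np.asarray(responses)
        s_vals, s_idx = np.unique(stimuli, return_inverse=True)
        r_vals, r_idx = np.unique(responses, return_inverse=True)
        # Empirical joint distribution over (stimulus, response) bins.
        joint = np.zeros((len(s_vals), len(r_vals)))
        for i, j in zip(s_idx, r_idx):
            joint[i, j] += 1
        joint /= joint.sum()
        p_s = joint.sum(axis=1, keepdims=True)   # marginal over stimuli
        p_r = joint.sum(axis=0, keepdims=True)   # marginal over responses
        nonzero = joint > 0
        return float(np.sum(joint[nonzero] *
                            np.log2(joint[nonzero] / (p_s @ p_r)[nonzero])))

    # Toy usage: two stimuli, Poisson-like spike counts. With only a few
    # hundred trials, the naive estimate tends to overshoot the true value.
    rng = np.random.default_rng(0)
    stims = rng.integers(0, 2, size=200)
    counts = rng.poisson(lam=2 + 3 * stims)
    print(plugin_mutual_information(stims, counts))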

In response to these issues, several research groups have, over the past few years, developed a second generation of information-theoretic tools. The goal of the proposed workshop is to provide an in-depth snapshot of the status of these investigations in some of their most exciting aspects, including:

 1. Notions of optimality of information representations in neurons
 2. Correlation- and information-based measures of redundancy in populations of neurons, and the implications of limited data
 3. Refined methods and approaches to estimate mutual information from measurement data, with a particular focus on populations of neurons
 4. Use of information-theoretic tools as a means to characterize the nature of the neural code, rather than the quantity of information carried

Speakers

Toby Berger (Cornell/U. Virginia) Energy-efficient recursive estimation by variable-threshold neurons
Michael Berry (Princeton University) Correlated Neural Populations in the Retina
Dmitri Chklovskii (Cold Spring Harbor) Optimal Information Storage in Noisy Synapses under Resource Constraints
Adrienne Fairhall (U. Washington) Model evaluation using information
David Field (Cornell) Measuring the information content and dimensionality of complex signals: An example of natural scenes and proximity distributions
Michael Gastpar (UC Berkeley) Scaling Information Measures for Population Codes
William Levy (U. Virginia) The interaction between timeliness and information in determining the energetic cost of the action potential of unmyelinated nerves
Liam Paninski (Columbia University) Model-based methods for stimulus decoding, information estimation, and information-theoretic optimal stimulus design
Jonathan Pillow (University College London) Neural characterization using an information-theoretic generalization of spike-triggered average and covariance analysis
Jonathon Shlens (Salk Institute) Exploring the network structure of primate retina using maximum entropy methods
Naftali Tishby (The Hebrew University) Optimal adaptation and predictive information
Jonathan Victor (Cornell) Why it is difficult to calculate information, and why there are so many approaches
