Bernstein-Tutorials on Computational Neuroscience


The Bernstein-Tutorials on various topics in Computational Neuroscience will be held on Saturday, July 18th, prior to the main meeting, in the Berlin-Brandenburgische Akademie der Wissenschaften. The aim of the 10 tutorials is to provide students with a comprehensive introduction to various topics in Computational Neuroscience, as well as to give advanced researchers an overview of the state of the art in each topic area. The tutorials will be held in a morning or an afternoon session, giving participants the opportunity to attend two different tutorials. All attendees will be provided with a DVD containing scripts and slides from all tutorials. A short list of the topics and speakers follows; for more detailed information, please see the abstracts at the bottom of this page.

1. Spike train analysis (3 hours). Jutta Kretzberg (Oldenburg, Germany)
2. Neural basis of probabilistic computations (3 hours). Sophie Denève (Paris, France)
3. Tools and Methods in Psychophysics (3 hours). Felix Wichmann (Berlin, Germany)
4. Activity-dependent Synaptic Plasticity and Neuronal Adaptation (3 hours). Lars Schwabe (Rostock, Germany)
5. Neural Control Engineering -- The Emerging Intersection of Control Theory and Neuroscience (6 hours). Steven Schiff (Penn State Univ, USA)
6. Probabilistic Models of Natural Stimuli and Neural Populations (6 hours). Matthias Bethge (Tuebingen, Germany)
7. Neural Coding (3 hours). Peter Latham (London, UK)
8. Reinforcement learning -- a tool for cracking the neural codes of behavioral learning (3 hours). Kenji Doya (Okinawa, Japan)
9. Large-Scale Neuronal Network Models: Principles and Practice (3 hours). Hans Ekkehard Plesser (Oslo, Norway)
10. Analysis methods for functional neuroimaging data (3 hours). Stefan Kiebel (Leipzig, Germany)

Organizational support provided by the National Network for Computational Neuroscience

Abstracts of the Bernstein-Tutorials

1. Spike train analysis (3 hours). Jutta Kretzberg (Oldenburg, Germany)
This tutorial will focus on the analysis of neuronal population activity as it is obtained in extracellular multi-electrode recordings. After a short review of basic concepts of spike train analysis (e.g. receptive field measurement, tuning curves, PSTH), I will introduce two more advanced methods to estimate stimulus properties based on neuronal responses: Bayesian stimulus reconstruction and metric-based clustering. For all analysis methods covered in the tutorial, Matlab routines will be provided and applied to multi-electrode recordings from the retina.

2. Neural basis of probabilistic computations (3 hours). Sophie Denève (Paris, France)
Our sensory input is noisy and ambiguous, and the consequences of our actions are not completely predictable. For all these reasons, perception and behavioural choices require probabilistic inference. We will consider how neurons and neural populations could compute, represent and exploit uncertainties and probabilities. We will show that spike trains of integrate-and-fire neurons provide a natural basis for representing probabilistic evidence in a perpetually changing world. This leads us to reconsider the nature of signal and noise in the variable responses of cortical neurons.

3. Tools and Methods in Psychophysics (3 hours). Felix Wichmann (Berlin, Germany)
The tutorial will cover some of the essentials of modern psychophysical methods and tools: signal detection theory and (proper) forced-choice paradigms; the method of constant stimuli versus adaptive procedures; psychometric function estimation and Monte-Carlo-based goodness-of-fit assessment; and the limits of currently available display technology in visual psychophysics.

4. Activity-dependent Synaptic Plasticity and Neuronal Adaptation (3 hours). Lars Schwabe (Rostock, Germany)
Summary: In this tutorial we consider rules and mechanisms for synaptic plasticity, normalization and competition, as well as recent spike-based learning rules (supervised and reinforcement-based). In addition, we consider short-term adaptation at the synapse and single-cell level, as well as their functional consequences for network dynamics and sensory coding. After the tutorial, participants will be able to model both phenomena at different levels of description and utilize them in their own modeling studies.
Abstract: The goal of this tutorial is to give an overview of mathematical models of activity-dependent synaptic plasticity and neuronal adaptation, as well as their functional consequences for the representation and processing of sensory information. Activity-dependent synaptic plasticity is the mechanism that governs the build-up, maintenance, and change of the connections between neurons. Hence, it probably plays a key role in development, learning and memory. Neuronal adaptation refers to the change in responsiveness after constant stimulation, and it may correspond to adaptation at the single-cell or synapse level. It is believed to be the mechanism underlying perceptual phenomena such as perceptual aftereffects. After the tutorial, participants will be able to model both phenomena at different levels of description. In particular, they will be able to utilize these models in their own research in order to further explore the consequences of these two mechanisms for network dynamics and the functions realized by such networks. All models are motivated and illustrated with examples (taken mainly, but not exclusively, from the visual system). Recent advances and open questions are discussed.
Topics:
- Brief recap of the relevant biophysics
- Rules and corresponding mechanisms for plasticity, normalization and competition
- Consequences for activity-dependent development
- Recent spike-based learning rules (supervised and reinforcement-based)
- Input-driven vs. output-driven adaptation: single-cell vs. synaptic mechanisms
- Consequences for network dynamics and sensory coding

5. Neural Control Engineering -- The Emerging Intersection of Control Theory and Neuroscience (6 hours). Steven Schiff (Penn State Univ, USA)
Abstract: With the advent of model-based ensemble techniques to track and control nonlinear systems in real time, the intersection between formal control theory and computational neuroscience is emerging as a powerful new area for exploration. This tutorial will explore how common models from computational neuroscience can be placed within a control-theoretic framework, using a variety of cellular and network modeling frameworks. The route to real-time feedback control systems will be explained with algorithm and code examples. A detailed discussion of formalizing model inadequacy will be covered. Applications to rhythmic hippocampal oscillations, seizures, Parkinson's disease, and cortical wave formation will be discussed.
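To give a flavour of the model-based tracking techniques mentioned above, here is a minimal one-dimensional linear Kalman filter sketch in plain Python. This is not code from the tutorial; the random-walk dynamics and all noise parameters are illustrative assumptions.

```python
# Minimal 1-D linear Kalman filter tracking a scalar state x from noisy
# observations. Illustrative sketch only; all parameter values are made up.

def kalman_1d(observations, q=0.01, r=1.0, x0=0.0, p0=1.0):
    """Filter a scalar random-walk state.
    q: process-noise variance, r: observation-noise variance,
    x0/p0: initial state estimate and its variance."""
    x, p = x0, p0
    estimates = []
    for z in observations:
        # Predict: random-walk dynamics x_k = x_{k-1} + noise
        p = p + q
        # Update: blend the prediction with the observation z
        k = p / (p + r)          # Kalman gain in (0, 1)
        x = x + k * (z - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

# Noisy observations of a true state near 5.0; the estimate climbs from
# the prior x0 = 0 toward the true value as evidence accumulates.
obs = [4.8, 5.3, 4.9, 5.1, 5.2, 4.7, 5.0, 5.1]
est = kalman_1d(obs)
```

The same predict/update pattern carries over to the nonlinear (e.g. unscented) filters covered in the tutorial, where the prediction step integrates a neuronal model instead of a random walk.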
Topics:
- Linear Kalman Filtering
- Nonlinear Kalman Filtering
- The Hodgkin-Huxley Equations
- The FitzHugh-Nagumo Equations
- The Bridge from Kalman Filtering to Neuronal Dynamics
- Spatiotemporal Neural Dynamics
- Parkinson's Disease
- Controlling Neuronal Dynamics with Electrical Stimulation
- Empirical Spatiotemporal Models
- Brain-Machine Interfaces
- All Models are Wrong: Formalizing Model Inadequacy
- A View Towards Future Applications

6. Probabilistic Models of Natural Stimuli and Neural Populations (6 hours). Matthias Bethge (Tuebingen, Germany)
Both natural stimuli and neural recordings can exhibit complex statistical structure. Therefore, flexible statistical models are needed to capture this complexity in a quantitative manner. In particular, probabilistic methods provide a principled framework for comparing and evaluating different models. In the morning session of the tutorial, we will discuss model classes for describing the statistical structure of natural images and their relevance for sensory coding. The afternoon session will consist of a self-contained introduction to probabilistic models of spiking neurons. Our focus will be on the generalized linear model framework and related model classes. In each session we will aim to point out relationships and commonalities between the different approaches.
Forenoon: Probabilistic Models of Natural Stimuli
Afternoon: Probabilistic Models of Neural Populations
Further information will be posted on the following webpage: http://www.kyb.tuebingen.mpg.de/bethge/tutorials/cns2009/

7. Neural Coding (3 hours). Peter Latham (London, UK)
We study neural coding because we want to answer the question: "What are spike trains telling us?" Although the answer is still "we don't know," considerable progress has been made over the last several decades, and we now know much more than we did even ten years ago. In this tutorial, I will describe standard and not-so-standard methods for answering this question, what we have found, and where we are going.

8. Reinforcement learning -- a tool for cracking the neural codes of behavioral learning (3 hours). Kenji Doya (Okinawa, Japan)
The theory of reinforcement learning evolved in the field of machine learning, based on the intuition of animal learning from reward and punishment. The framework is now used for quantitatively modeling the choice behaviors of rats, monkeys, and humans, and for identifying neural correlates of decision making and action learning. This tutorial will present the mathematical basics of reinforcement learning and some case studies of how it is used for the analysis of behavior, neural firing, and brain imaging data.

9. Large-Scale Neuronal Network Models: Principles and Practice (3 hours). Hans Ekkehard Plesser (Oslo, Norway)
Simulations are widely used to study the dynamics of neuronal networks, but computational neuroscientists seldom reflect on the modeling process: How do we move from our understanding of experimental findings about neuroanatomy and -physiology, first to mental models of neuronal networks, and then to simulations performed by computer? Do our computer simulations really simulate the models we have built in our minds? How well do we succeed in describing our simulated models to our colleagues when writing papers? We will discuss these topics in the first part of the tutorial, building on theoretical work on modeling in physics and ecology, as well as examples from the neuroscience literature. In the second part of the tutorial, we will discuss how to describe large-scale neuronal networks well in scientific publications. In particular, we will discuss the advantages of high-level descriptions of neuronal networks, using the Topology Module of the NEST simulator as an example. Participants are invited to bring their laptops for hands-on experiments. Bootable live DVDs with a complete NEST installation will be provided.
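The elementary building block of such network models is typically a point neuron like the leaky integrate-and-fire unit. A plain-Python sketch of one such neuron is shown below (this is not NEST code, and all time constants, voltages, and the input current are illustrative values only):

```python
# Leaky integrate-and-fire neuron with forward-Euler integration.
# Illustrative sketch, not NEST code; all constants below are made up.

def simulate_lif(i_ext, dt=0.1, t_max=100.0, tau_m=10.0, v_rest=-70.0,
                 v_thresh=-55.0, v_reset=-75.0, r_m=10.0):
    """Return spike times (ms) for a constant input current i_ext (nA)."""
    v = v_rest
    spikes = []
    for step in range(int(t_max / dt)):
        # Membrane equation: tau_m * dV/dt = -(V - V_rest) + R_m * I
        dv = (-(v - v_rest) + r_m * i_ext) / tau_m
        v += dv * dt
        if v >= v_thresh:             # threshold crossing -> emit a spike
            spikes.append(step * dt)
            v = v_reset               # reset the membrane potential
    return spikes

# Suprathreshold drive: R_m * I = 20 mV exceeds the 15 mV gap to threshold,
# so the neuron fires regularly.
spikes = simulate_lif(i_ext=2.0)
```

A large-scale model couples thousands of such units via synaptic input terms; high-level descriptions, as advocated in the tutorial, specify the populations and connectivity rules rather than the individual loops.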

10. Analysis methods for functional neuroimaging data (3 hours). Stefan Kiebel (Leipzig, Germany)
In systems neuroscience, researchers acquire data using techniques such as functional magnetic resonance imaging (fMRI) and magneto- and electroencephalography (M/EEG). These data provide indirect evidence of neuronal activity, acquired from the whole brain. This tutorial will first cover the standard analyses for fMRI and M/EEG, which are used for locating brain responses in space and time. In the second part of the tutorial, we will go beyond these standard analyses and cover recent developments for inferring effective connectivity, i.e., interactions among brain areas. In particular, the tutorial will focus on Dynamic Causal Modelling, a state-space approach to modelling activity caused by neuronal networks; we will go through the relevant details and illustrate the approach using some example studies. In the last part of the tutorial, we will motivate the use of Bayesian model selection in neuroimaging studies and show that this kind of inference is particularly relevant for M/EEG studies.
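As a toy illustration of the model-selection idea (far simpler than Dynamic Causal Modelling itself), two candidate models of the same data can be compared via the Bayesian Information Criterion, a common approximation to the log model evidence. Everything below — the data, the noise level, and the two Gaussian models — is an illustrative assumption:

```python
import math

# Toy Bayesian model selection via BIC: compare a "no effect" model
# (mean fixed at 0) against an "effect" model (mean fitted to the data)
# for a small synthetic data set. Illustrative only; DCM scores far
# richer state-space models in the same spirit.

def gauss_loglik(data, mu, sigma):
    """Log-likelihood of data under a Normal(mu, sigma) model."""
    return sum(-0.5 * math.log(2 * math.pi * sigma ** 2)
               - (x - mu) ** 2 / (2 * sigma ** 2) for x in data)

def bic(loglik, n_params, n_obs):
    """Bayesian Information Criterion: lower values indicate a better
    trade-off between fit and complexity."""
    return n_params * math.log(n_obs) - 2 * loglik

data = [1.8, 2.1, 2.4, 1.9, 2.2, 2.0]   # clearly non-zero responses
sigma = 0.5                              # noise level assumed known

mu_hat = sum(data) / len(data)           # fitted mean of the "effect" model
bic_null = bic(gauss_loglik(data, 0.0, sigma), 0, len(data))
bic_effect = bic(gauss_loglik(data, mu_hat, sigma), 1, len(data))
# On these data the "effect" model wins: bic_effect < bic_null.
```

The extra parameter of the "effect" model is penalized by the `n_params * log(n_obs)` term, so the comparison rewards fit only when the data genuinely support it — the same trade-off that Bayesian model selection formalizes for neuroimaging models.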