Adam's talk on deep learning

May 8, 2018

Adam is visiting from MIT to give a talk on deep learning and neuroscience — looking forward to catching up.


Towards an integration of deep learning and neuroscience

Dr. Adam Marblestone

Chief Strategy Officer, Kernel

Research affiliate, Synthetic Neurobiology group at MIT

Cambridge, USA


UZH-Irchel, lecture hall Y55 H12

Monday, 28.05.18, 16:00 – 17:00


Abstract: Neuroscience has focused on the detailed implementation of computation, studying neural codes, dynamics and circuits. In machine learning, however, artificial neural networks tend to eschew precisely designed codes, dynamics or circuits in favor of brute force optimization of a cost function, often using simple and relatively uniform initial architectures. I will discuss the hypotheses that (1) the brain optimizes cost functions, (2) the cost functions are diverse and differ across brain locations and over development, and (3) optimization operates within a pre-structured architecture matched to the computational problems posed by behavior. I will also suggest directions by which neuroscience could seek to refine and test these hypotheses, and in particular, a recent speculative proposal for how to incorporate neural attractors/assemblies/ensembles into a view of the thalamo-cortical system that incorporates both deep training and "symbolic" memory operations.