
Page 1

Self-Organized Recurrent Neural Learning for Language Processing

www.reservoir-computing.org

April 1, 2009 - March 31, 2012
State from June 2009

Page 2

The task

[Figure: writing/speech source → feature stream → AI machine; images from www.georgholzer.at, introspectreangel.wordpress.com, coli.uni-saarland.de/~steiner/, compuskills.com.cy]

• Speech and handwriting recognition are essentially the same problem

• Humans can do it, but only after years of learning: thus, a very difficult problem

• No human-level AI solution is in sight

Page 3

Mission

Establish neurodynamical architectures as a viable alternative to statistical methods for speech and handwriting recognition.

State-of-the-art

• Speech recognition is treated as a statistical data analysis problem

• This leads to data-driven, feedforward "serial" learning and representation techniques, chiefly hidden Markov models (HMMs); a sketch of the HMM forward pass follows below

• Performance appears to asymptote well below human performance
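To make the "serial" nature of HMM processing concrete, here is a minimal sketch of the forward algorithm in Python. The two-state, three-symbol model and all probabilities are invented for illustration; they are not taken from the slides.

```python
import numpy as np

# Hypothetical 2-state, 3-symbol HMM; all numbers are illustrative only.
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])        # state transition probabilities
B = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.3, 0.6]])   # per-state emission probabilities
pi = np.array([0.5, 0.5])         # initial state distribution

def forward(obs):
    """Forward algorithm: likelihood of an observation sequence."""
    alpha = pi * B[:, obs[0]]     # belief over states after the first symbol
    for o in obs[1:]:
        # One strictly left-to-right step: the entire past is compressed
        # into a single belief vector, with no recurrent feedback.
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

print(forward([0, 1, 2, 2]))      # P(sequence | model)
```

Each step folds the belief vector once through the transition matrix, strictly left to right; this feedforward character is the contrast the ORGANIC alternative below draws against recurrent dynamics.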

ORGANIC alternative

• Speech recognition is an achievement of human brains

• This leads to neural computation and cognitive neuroscience modelling with recurrent dynamics (cyclic top-down and bottom-up paths)

• Potential to come closer to human performance

(Figure sources: Rabiner 1990, a classical speech recognition tutorial; Dominey et al. 1995)

Page 4

Basic paradigm: reservoir computing (RC)

• Also known as Echo State Networks (ESNs) and Liquid State Machines (LSMs)

• Introduced around 2000, now an established paradigm in computational neuroscience and machine learning

• RC makes training of recurrent neural networks practically feasible for the first time: a major enabling technology

• RC is biologically plausible

• The consortium comprises pioneers and leading investigators of the RC field

Principle of RC (a minimal code sketch follows the list):

• Use a large, fixed, random recurrent network as an excitable medium

• Excite it with the input signal

• Read out the desired output through trainable output weights (shown in red in the original figure), the only part of the network that is trained
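The following is a minimal echo state network sketch of this principle in plain NumPy. The reservoir size, spectral radius, ridge regularizer, and the toy sine-prediction task are all illustrative choices, not values from the project.

```python
import numpy as np

rng = np.random.default_rng(42)
n_in, n_res = 1, 200                 # sizes chosen for illustration only

# Fixed random weights: neither W_in nor W is ever trained.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1

def collect_states(inputs):
    """Drive the fixed reservoir with an input sequence, record its states."""
    x = np.zeros(n_res)
    states = np.empty((len(inputs), n_res))
    for t, u in enumerate(inputs):
        x = np.tanh(W @ x + W_in @ u)             # excitable-medium dynamics
        states[t] = x
    return states

# Toy task: one-step-ahead prediction of a sine wave.
ts = np.arange(0, 60, 0.1)
u = np.sin(ts)[:, None]
y = np.sin(ts + 0.1)[:, None]

X = collect_states(u)
washout = 100                                     # discard initial transient
A, b = X[washout:], y[washout:]

# Only the linear readout is trained, here by ridge regression.
W_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(n_res), A.T @ b)
print("train MSE:", np.mean((A @ W_out - b) ** 2))
```

Scaling W to a spectral radius below 1 is the usual heuristic for the echo state property; training reduces to a single linear regression on the collected states, which is what makes RC practically feasible.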

Page 5

Scientific objectives

Basic blueprints: Design and proof-of-principle tests of fundamental architecture layouts for hierarchical neural systems that can learn multi-scale sequence tasks.

Reservoir adaptation: Investigate mechanisms of unsupervised adaptation of reservoirs (a simplified sketch follows after this list).

Spiking vs. non-spiking neurons, role of noise: Clarify the functional implications of spiking vs. non-spiking neurons and the role of noise.

Single-shot model extension, lifelong learning capability: Develop learning mechanisms that allow a learning system to be extended in “single-shot” learning episodes, enabling lifelong learning.

Working memory and grammatical processing: Extend the basic paradigm by a neural index-addressable working memory.

Interactive systems: Extend the adaptive capabilities of human-robot cooperative interaction systems with on-line and lifelong learning.

Integration of dynamical mechanisms: Integrate biologically inspired mechanisms of learning, optimization, adaptation and stabilization into coherent architectures.
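To illustrate what unsupervised reservoir adaptation might look like, here is a simplified homeostatic rule that nudges each neuron's gain and bias toward target output statistics. This is a stand-in sketch under invented constants, not the project's actual adaptation algorithm, which the slides do not specify.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100
W = rng.normal(0.0, 1.0 / np.sqrt(n), (n, n))   # fixed recurrent weights
a = np.ones(n)                                   # per-neuron gain (adapted)
b = np.zeros(n)                                  # per-neuron bias (adapted)
x = np.zeros(n)

# Target output statistics and learning rate: invented values.
target_mean, target_var, eta = 0.0, 0.05, 1e-3

mean_est = np.zeros(n)                           # running output statistics
var_est = np.full(n, target_var)

for t in range(5000):
    u = np.sin(0.1 * t)                          # arbitrary driving signal
    x = np.tanh(a * (W @ x + u) + b)
    mean_est = 0.99 * mean_est + 0.01 * x
    var_est = 0.99 * var_est + 0.01 * (x - mean_est) ** 2
    # Homeostasis: no teacher signal is used, only each neuron's own
    # output statistics (hence "unsupervised" adaptation).
    b -= eta * (mean_est - target_mean)          # bias toward target mean
    a += eta * (target_var - var_est)            # gain toward target variance
```

Published variants of this idea (e.g. intrinsic plasticity rules) derive the updates from an explicit target output distribution; the point here is only that the reservoir tunes itself from its own activity, without labels.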

Page 6

Community service and dissemination objectives

High-performing, well-formalized core Engine: Collaborative development of a well-formalized and high-performing core Engine, which will be made publicly accessible.

Comply with FP6 unification initiatives: Ensure that the Engine integrates with the standards set in the FACETS FP6 IP, and integrate it with other existing code.

Benchmark repository: Create a database with temporal, multi-scale benchmark data sets which can be used as an international touchstone for comparing algorithms.

Page 7

Consortium

Institution | Group | Research

Jacobs University Bremen | Machine Learning (Herbert Jaeger) | Recurrent neural networks, nonlinear dynamics, pattern recognition

Technical University Graz | Computational Neuroscience (Wolfgang Maass) | Spiking neurodynamics, generic neural microcircuits, reinforcement learning

INSERM Lyon | Human and Robot Interactive Cognitive Systems Team (Peter F. Dominey) | Cognitive neuroscience, human cortical sequence processing and speech recognition

Universiteit Gent | Reservoir Computing Lab (Benjamin Schrauwen) | Reservoir computing applications, algorithm design

Universiteit Gent | Speech Processing Group (Jean-Pierre Martens) | Speech recognition methods research and application development

Planet intelligent systems GmbH | Research and Development (Welf Wustlich) | Text and handwriting recognition solutions, address recognition

Page 8

Workpackages and collaboration scheme