
Network Models

LECTURE 6

I. Introduction − Basic concepts of neural networks

II. Realistic neural networks − Homogeneous excitatory and inhibitory populations − The olfactory bulb − Persistent neural activity

What is an (Artificial) Neural Network?

• Inspired by real neural systems

• Having a network structure consisting of artificial neurons (nodes) and neuronal connections (weights)

• A general methodology for function approximation

• It may be used either to gain an understanding of biological neural networks or to solve artificial intelligence problems

What do neural systems look like?

• Neuron: the fundamental signalling/computational unit

• Synapses: the connections between neurons

• Layer: neurons are organized into layers

• Extremely complex: around $10^{11}$ neurons in the brain, each with about $10^4$ connections

Imagine 6 business cards put together (the thickness of the cortical sheet). Flattened out: − Human: as large as a dinner napkin; − Monkey: about the size of a business-letter envelope; − Rat: a stamp. There are about $10^5$ neurons within 1 mm².

Cortical columns: − In the neocortex, neurons lie in 6 vertical layers and are highly coupled within cylindrical columns about 0.1 to 1 mm in diameter (e.g., ocular dominance columns and orientation maps in V1)


Three main classes of interconnections (e.g. visual system):

− Feedforward connections bring input to a given region from another region located at an earlier stage along a particular processing pathway

− Recurrent synapses interconnect neurons within a particular region that are considered to be at the same stage along the processing pathway

− Top-down connections carry signals back from areas located at later stages.

Compare neural systems with computers

• A different style of computation: parallel distributed processing

• A universal computational architecture: the same structure carries out many different functions

• It can learn new knowledge and is therefore adaptive

The brain is superior to modern computers in many respects (e.g., face recognition and speech).

The idea of artificial neural networks

• A single neuron’s function is simple; specific functions arise from the network structure

• ANNs are increasingly engineering-driven nowadays, and their biological roots are gradually fading. The key to ANNs lies in the design and training of suitable network structures

• The connection styles: feed-forward and recurrent

1. Nodes --- neurons (artificial neurons, performing a linear or non-linear mapping)
2. Weights --- synapses
3. Mimics the network structure of neural systems

Feedforward and recurrent networks

An example of one-layer feed-forward neural network

[Diagram: inputs $x_1, x_2, \ldots, x_n$ weighted by $w_1, w_2, \ldots, w_n$, a bias $b$, an activation function $f$, and the output $y$ ('+' or '-')]

$$y = f\Big(\sum_{i=1}^{n} w_i x_i + b\Big)$$

Bias $b$ is an external parameter that can be modeled by adding an extra input.
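As a minimal sketch of this computation, the Python snippet below evaluates a single artificial neuron; the particular weights, inputs, and the sign-type output ('+' or '-') are illustrative choices, not values from the lecture.

```python
import numpy as np

def neuron_output(x, w, b, f=np.sign):
    """Single artificial neuron: y = f(sum_i w_i * x_i + b).

    Here f defaults to a sign function, giving a '+' or '-' style output;
    any other activation function can be passed in instead.
    """
    return f(np.dot(w, x) + b)

# Illustrative values (not from the lecture)
x = np.array([0.5, -1.0, 2.0])   # inputs x_1 ... x_n
w = np.array([0.8, 0.2, -0.5])   # weights w_1 ... w_n
b = 0.1                          # bias, equivalent to an extra input fixed at 1
print(neuron_output(x, w, b))    # -> -1.0 for these numbers
```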

Activation function g

• Linear-threshold function:

$$g(V) = \begin{cases} 0 & \text{if } V - V_{th} < 0 \\ a\,(V - V_{th}) & \text{if } V - V_{th} \ge 0 \end{cases}$$

• Sigmoid function:

$$g(V) = \frac{1}{1 + e^{-\beta V}}$$

[Plots: the sigmoid function $g(V) = 1/(1 + e^{-\beta V})$ for $\beta$ = 1.0, 5.0, and 0.1, over $V \in [-6, 6]$ (and $[-25, 25]$ for $\beta$ = 0.1); larger $\beta$ gives a steeper transition from 0 to 1]
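A small sketch of the two activation functions in Python; the threshold $V_{th}$, the gain $a$, and the sample points are assumed placeholder values.

```python
import numpy as np

def linear_threshold(V, V_th=0.0, a=1.0):
    """g(V) = 0 for V < V_th, and a * (V - V_th) otherwise."""
    return np.where(V < V_th, 0.0, a * (V - V_th))

def sigmoid(V, beta=1.0):
    """g(V) = 1 / (1 + exp(-beta * V)); larger beta means a steeper transition."""
    return 1.0 / (1.0 + np.exp(-beta * V))

V = np.linspace(-6.0, 6.0, 7)        # sample points, as in the plots above
print(linear_threshold(V))
print(sigmoid(V, beta=1.0))
print(sigmoid(V, beta=5.0))          # close to a step function
print(sigmoid(V, beta=0.1))          # almost linear over this range
```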

An example of 3-layer feed-forward network

[Diagram: input layer $x_1, x_2, \ldots, x_n$, hidden layers, and output layer $y_1, y_2, \ldots, y_m$]

$$y_j = f\Big(\sum_{i=1}^{n} w_{ji} x_i + b_j\Big)$$
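A minimal sketch of the forward pass through such a layered network; the layer sizes and the random weights are illustrative, and the sigmoid is used as the activation function f.

```python
import numpy as np

def sigmoid(V):
    return 1.0 / (1.0 + np.exp(-V))

def forward(x, weights, biases, f=sigmoid):
    """Propagate the input through successive layers: a = f(W @ a + b) at each layer."""
    a = x
    for W, b in zip(weights, biases):
        a = f(W @ a + b)
    return a

rng = np.random.default_rng(0)
n, n_hidden, m = 4, 6, 2                      # illustrative layer sizes
weights = [rng.normal(size=(n_hidden, n)),    # input -> hidden
           rng.normal(size=(m, n_hidden))]    # hidden -> output
biases = [np.zeros(n_hidden), np.zeros(m)]
print(forward(rng.normal(size=n), weights, biases))   # outputs y_1 ... y_m
```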

Neural Network (NN) for function approximation

NN transforms input into output. In other words, a NN model implements a function

NN has a set of adjustable parameters (weights, activation thresholds, etc.)

Network parameters (in particular, network weights) are free to be adjusted in order to achieve desired functions

Type of learning used depends on task at hand

• Supervised learning: have a teacher

• Unsupervised learning: no teacher

• Reinforcement learning: no detailed instruction, only the final reward is available

Possible uses of artificial neural networks (Do you remember the applications of neurocomputing?)

• Classification: given an input, decide its class index
• Regression: given an input, decide the corresponding continuous output value

Pattern recognition (face recognition, radar systems, object recognition), sequence recognition (gesture, speech, handwritten text recognition), medical diagnosis, financial applications, data mining, stock market analysis, weather forecasting

• I will mainly focus on realistic neural networks in this course

• And Prof. Shi Zhongzhi will cover artificial neural networks and …

I. Introduction − Basic concepts of neural networks

II. Realistic neural networks − Homogeneous excitatory and inhibitory populations − The olfactory bulb − Persistent neural activity − ……

Homogeneous excitatory and inhibitory populations (i.e., the Wilson-Cowan model)

• It describes the dynamics of interacting excitatory and inhibitory neuronal populations

• Usually two neurons (excitatory/inhibitory)
• Explains the source of oscillations in neural systems
• A typical recurrent network

[Diagram: excitatory (E) and inhibitory (I) populations coupled through the weights $w_{EE}$, $w_{EI}$, $w_{IE}$, $w_{II}$, each receiving an external input (Input E, Input I)]

• Here g(x) is an activation function or the steady-state firing rate as a function of input

• g(x) is the linear-threshold function:

$$g(x) = \begin{cases} 0 & \text{if } x \le 0 \\ x & \text{if } x > 0 \end{cases}$$

$$\tau_E \frac{dv_E}{dt} = -v_E + g(w_{EE} v_E + w_{EI} v_I + I_E)$$

$$\tau_I \frac{dv_I}{dt} = -v_I + g(w_{IE} v_E + w_{II} v_I + I_I)$$

Compare with the single-neuron membrane equation:

$$\tau_m \frac{dV}{dt} = E_L - V + R_m I_e$$

Here $\tau_E$ = 10 ms.

1. Find the fixed points by setting:

$$\frac{dv_E}{dt} = 0, \qquad \frac{dv_I}{dt} = 0$$

For the specific model

$$\frac{dv_E}{dt} = \big[-v_E + g(1.25\,v_E - v_I + 10)\big]/\tau_E$$

$$\frac{dv_I}{dt} = \big[-v_I + g(v_E - 10)\big]/\tau_I$$

(with $\tau_I$ as the adjustable parameter), the fixed point is

$v_E$ = 26.67 Hz, $v_I$ = 16.67 Hz
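In the region where the argument of g stays positive, g acts as the identity and the fixed point follows from a 2x2 linear system; a quick check in Python (the rearrangement below is mine, not from the lecture):

```python
import numpy as np

# Fixed-point conditions with g in its linear range:
#   v_E = 1.25*v_E - v_I + 10   ->   -0.25*v_E + v_I = 10
#   v_I = v_E - 10              ->   -v_E + v_I = -10
A = np.array([[-0.25, 1.0],
              [-1.0,  1.0]])
b = np.array([10.0, -10.0])
v_E, v_I = np.linalg.solve(A, b)
print(v_E, v_I)   # ~26.67 Hz and ~16.67 Hz, matching the values above
```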

2. Identify stability by using phase plane or 2-D phase space portrait

In the phase plane, a point $(x(t), y(t))$ moves in a small time step $h$ to $(x(t+h), y(t+h)) = (x(t) + h\,dx/dt,\; y(t) + h\,dy/dt)$; the signs of the derivatives (e.g., the regions where $dv_I/dt < 0$ or $dv_I/dt > 0$) indicate the direction of the flow.

Adjusted parameter $\tau_I$ = 30 ms: the fixed point is stable and the oscillations are damped.

(Dayan and Abbott, 2001)

Adjusted parameter $\tau_I$ = 50 ms: the fixed point is unstable and the activity settles onto a limit cycle.

(Dayan and Abbott, 2001)

Hopf bifurcation

As a parameter is changed, a fixed point of a dynamical system loses stability and a limit cycle emerges
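A minimal Euler-integration sketch of the model above, using the quoted parameters ($\tau_E$ = 10 ms, $\tau_I$ adjustable); the initial rates, step size, and simulation length are my own choices. With $\tau_I$ = 30 ms the trajectory spirals into the fixed point, while with $\tau_I$ = 50 ms it ends up on a limit cycle, illustrating the Hopf bifurcation.

```python
import numpy as np

def g(x):
    """Linear-threshold activation: g(x) = max(x, 0)."""
    return np.maximum(x, 0.0)

def simulate(tau_I, tau_E=10.0, dt=0.1, T=1000.0):
    """Euler integration of the E-I rate model (time in ms, rates in Hz)."""
    v_E, v_I = 10.0, 5.0                 # arbitrary initial rates
    trace = []
    for _ in range(int(T / dt)):
        dv_E = (-v_E + g(1.25 * v_E - v_I + 10.0)) / tau_E
        dv_I = (-v_I + g(v_E - 10.0)) / tau_I
        v_E, v_I = v_E + dt * dv_E, v_I + dt * dv_I
        trace.append(v_E)
    return np.array(trace)

print(simulate(tau_I=30.0)[-5:])   # settles near the fixed point at ~26.67 Hz
print(simulate(tau_I=50.0)[-5:])   # keeps oscillating around it (limit cycle)
```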

I. Introduction − Basic concepts of neural networks

II. Realistic neural networks − Homogeneous excitatory and inhibitory populations − The olfactory bulb − Persistent neural activity − ……

The olfactory bulb

[Circuit diagram: excitatory mitral cells and inhibitory granule cells coupled to each other, each population receiving its own external input (Input$_E$, Input$_I$)]

$$\tau_E \frac{d\mathbf{v}_E}{dt} = -\mathbf{v}_E + \mathbf{F}_E(W_{EE} \mathbf{v}_E + W_{EI} \mathbf{v}_I + \mathrm{Input}_E)$$

$$\tau_I \frac{d\mathbf{v}_I}{dt} = -\mathbf{v}_I + \mathbf{F}_I(W_{IE} \mathbf{v}_E + W_{II} \mathbf{v}_I + \mathrm{Input}_I)$$

where $\tau_E = \tau_I$ = 6.7 ms, $W_{EE} = W_{II} = 0$, and $m = n = 10$.

• A recurrent network
• A ?-D phase space is needed
• The input Input$_E$ must induce a transition between fixed-point and oscillatory activity (see the sketch below)
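A structural sketch, in Python, of how these vector equations could be integrated for m = n = 10 mitral and granule cells; the coupling matrices and the odor-dependent input below are random placeholders rather than the values of the original model, so this shows the bookkeeping, not the actual oscillatory bulb dynamics.

```python
import numpy as np

rng = np.random.default_rng(1)
m = n = 10                            # 10 mitral (E) and 10 granule (I) cells
tau = 6.7                             # ms, as quoted above
W_EI = -rng.uniform(0.0, 1.0, (m, n)) # granule -> mitral inhibition (placeholder, negative)
W_IE = rng.uniform(0.0, 1.0, (n, m))  # mitral -> granule excitation (placeholder)
# W_EE = W_II = 0: no coupling within each population

def F(x):
    """Placeholder rate function (half-wave rectification)."""
    return np.maximum(x, 0.0)

def step(v_E, v_I, inp_E, inp_I, dt=0.1):
    dv_E = (-v_E + F(W_EI @ v_I + inp_E)) / tau
    dv_I = (-v_I + F(W_IE @ v_E + inp_I)) / tau
    return v_E + dt * dv_E, v_I + dt * dv_I

v_E, v_I = np.zeros(m), np.zeros(n)
odor_input = rng.uniform(0.0, 5.0, m)      # placeholder odor-dependent input to mitral cells
for _ in range(2000):                      # 200 ms of a sniff cycle
    v_E, v_I = step(v_E, v_I, odor_input, np.zeros(n))
print(np.round(v_E, 2))
```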

Activities of four of ten mitral and granule cells during a single sniff cycle for two different odors

(Dayan and Abbott, 2001)

(Dayan and Abbott, 2001)

I. Introduction − Basic concepts of neural networks

II. Realistic neural networks − Homogeneous excitatory and inhibitory populations − The olfactory bulb − Persistent neural activity

Persistent neural activity

• It refers to a sustained change in action potential discharge that long outlasts a stimulus

• It is found in a diverse set of brain regions and organisms and several in vitro systems, suggesting that it can be considered a universal form of circuit dynamics

Several examples of persistent neural activity

• Oculomotor neural integrator cells in an awake behaving goldfish

• Monkey prefrontal cortical cells

• Head direction cells in rat

An oculomotor neural integrator cell in an awake behaving goldfish

(Major and Tank 2004)

Generation mechanism of persistent neural activity during memory-guided saccade task in the prefrontal cortex

Each of the 120 excitatory units obeys a rate equation of the form

$$\tau_E \frac{dV_i^E}{dt} = -V_i^E + \sum_j w_{ij}^{EE}\, r_j^E - w^{EI} r^I + b + s_i$$

where $s_i$ is the transient cue input and the recurrent excitatory weights decay with the difference between the preferred cue positions $p_i$ and $p_j$ of the two units:

$$w_{ij}^{EE} = w_{\max} \exp\!\left(-\frac{(p_i - p_j)^2}{2\sigma^2}\right)$$

(120 units; cue/distractor inputs last 300 ms)

[Simulation (Gruber et al. 2006): persistent activity; a distractor input is presented for 300 ms]

Activation function: $r_j^E = g(V_j^E) = \dfrac{1}{1 + e^{-V_j^E}}$
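A heavily simplified sketch of this kind of network: 120 rate units whose recurrent excitation falls off with the distance between preferred positions, plus uniform inhibition. All numerical values are hand-picked placeholders rather than the parameters of the model above; the point is that a transient cue leaves behind a localized bump of persistent activity, consistent with the take-home messages that follow.

```python
import numpy as np

N = 120                                  # number of excitatory units, as in the slide
tau, dt = 20.0, 1.0                      # ms; placeholder time constant and Euler step
pos = np.arange(N) / N                   # preferred cue positions on a ring
d = np.abs(pos[:, None] - pos[None, :])
d = np.minimum(d, 1.0 - d)               # circular distance between preferred positions

# Gaussian recurrent excitation plus uniform inhibition (placeholder strengths).
sigma, J_E, J_I = 0.08, 0.15, 0.0325
W = J_E * np.exp(-d**2 / (2.0 * sigma**2)) - J_I

def g(V):
    """Placeholder sigmoid rate function with threshold near 1."""
    return 1.0 / (1.0 + np.exp(-(V - 1.0) / 0.1))

V = np.zeros(N)
cue = np.where(d[60] < 0.08, 2.0, 0.0)   # transient cue centered on unit 60

for t in range(1000):                    # 1000 ms of simulated time
    I_ext = cue if t < 150 else 0.0      # the cue is removed after 150 ms
    V += dt * (-V + W @ g(V) + I_ext) / tau

print(np.round(g(V)[54:67], 2))          # high rates: a persistent bump around unit 60
print(np.round(g(V)[:13], 2))            # near zero: units far from the cue stay quiet
```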

From this PFC network, we learn:

1. The persistent PFC activity is a network effect associated with recurrent excitation

2. The network has a line (or ring) attractor

Homework

1. What is a Hopf bifurcation of a dynamical system?

2. Construct a Wilson-Cowan network model with one excitatory and one inhibitory neuron, and explore how the parameters relate to the dynamical behavior.

3. Think about the possible computational neural mechanisms underlying persistent activity in neural systems.
