
Page 1: Perceptron Networks and Vector Notation

Perceptron Networks and Vector Notation

CS/PY 231 Lab Presentation # 3 January 31, 2005 Mount Union College

Page 2: Perceptron Networks and Vector Notation

A Multiple Perceptron Network for computing the XOR function

We found that a single perceptron could not compute the XOR function

Solution (a working sketch follows this list):
• set up one perceptron to detect if x1 = 1 and x2 = 0
• set up another perceptron for x1 = 0 and x2 = 1
• feed the outputs of these two perceptrons into a third one that produces an output of 1 if either input is a 1
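A minimal sketch of such a three-perceptron XOR network in Python. The specific weights and thresholds here are hand-picked assumptions for illustration, not values taken from the lab itself.

    def perceptron(inputs, weights, threshold):
        # Fire (output 1) if the weighted sum of the inputs reaches the threshold
        total = sum(x * w for x, w in zip(inputs, weights))
        return 1 if total >= threshold else 0

    def xor_network(x1, x2):
        # Hidden perceptron 1: detects x1 = 1 and x2 = 0
        h1 = perceptron((x1, x2), (1, -1), threshold=1)
        # Hidden perceptron 2: detects x1 = 0 and x2 = 1
        h2 = perceptron((x1, x2), (-1, 1), threshold=1)
        # Output perceptron: fires if either hidden unit fired (an OR of h1, h2)
        return perceptron((h1, h2), (1, 1), threshold=1)

    for x1 in (0, 1):
        for x2 in (0, 1):
            print(x1, x2, "->", xor_network(x1, x2))   # prints 0, 1, 1, 0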

Page 3: Perceptron Networks and Vector Notation

A Nightmare!

Even for this simple example, choosing the weights that cause a network to compute the desired output takes skill and lots of patience

Much more difficult than programming a conventional computer:

• OR function: if x1 + x2 ≥ 1, output 1; otherwise output 0

• XOR function: if x1 + x2 = 1, output 1; otherwise output 0
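For contrast, these two rules are trivial to state as ordinary conditional code; a throwaway Python sketch, included purely for illustration:

    def or_gate(x1, x2):
        # OR: output 1 if at least one input is 1
        return 1 if x1 + x2 >= 1 else 0

    def xor_gate(x1, x2):
        # XOR: output 1 if exactly one input is 1
        return 1 if x1 + x2 == 1 else 0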

Page 4: Perceptron Networks and Vector Notation

There must be a better way….

These labs and demos were designed to show that manually adjusting weights is tedious and difficult

This is not what happens in nature – no creature says, “Hmmm, what weight should I choose for this neural connection?”

Formal training methods exist that allow networks to learn by updating weights automatically (explored next week)

Page 5: Perceptron Networks and Vector Notation

Expanding to More Inputs

artificial neurons may have many more than two input connections

calculation performed is the same: multiply each input by the weight of the connection, and find the sum of all of these products

notation can become unwieldy:
• sum = x1·w1 + x2·w2 + x3·w3 + … + x100·w100
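The same calculation for an arbitrary number of inputs, sketched in Python; the 100-input size matches the slide, but the actual values below are made up for illustration:

    # Hypothetical values for x1..x100 and w1..w100 (made up for illustration)
    inputs  = list(range(1, 101))   # x1 = 1, x2 = 2, ..., x100 = 100
    weights = [1.0] * 100           # every connection weighted 1.0

    # Same calculation as the slide: multiply each input by its weight, sum the products
    total = sum(x * w for x, w in zip(inputs, weights))
    print(total)   # 1 + 2 + ... + 100 = 5050.0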

Page 6: Perceptron Networks and Vector Notation

Some Mathematical Notation

Most references (e.g., Plunkett & Elman text) use mathematical summation notation

Sums of large numbers of terms are represented by the symbol Sigma (Σ)
– the previous sum is denoted as: Σ (k = 1 to 100) xk·wk
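The same expression in standard LaTeX notation, included only as a typeset reference for the plain-text form above:

    \sum_{k=1}^{100} x_k \cdot w_k \;=\; x_1 w_1 + x_2 w_2 + \cdots + x_{100} w_{100}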

Page 7: Perceptron Networks and Vector Notation

Summation Notation Basics

Terms are described once, generally
Index variable shows the range of possible values
Example:

Σ (k = 3 to 5) k / (k - 1) = 3/2 + 4/3 + 5/4
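A one-line check of this example in Python, purely illustrative:

    # Σ (k = 3 to 5) of k/(k-1); range(3, 6) covers k = 3, 4, 5
    total = sum(k / (k - 1) for k in range(3, 6))
    print(total)   # 3/2 + 4/3 + 5/4 ≈ 4.083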

Page 8: Perceptron Networks and Vector Notation

Summation Notation Example

Write the following sum using Sigma notation:
3·x0 + 4·x1 + 5·x2 + 6·x3 + 7·x4 + 8·x5 + 9·x6 + 10·x7

Answer: Σ (k = 0 to 7) (k + 3)·xk
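A small script to confirm the two forms agree; the test values for x0..x7 are made up:

    x = [2, 5, 1, 0, 3, 7, 4, 6]    # hypothetical values for x0 .. x7
    sigma_form   = sum((k + 3) * x[k] for k in range(8))
    written_out  = 3*x[0] + 4*x[1] + 5*x[2] + 6*x[3] + 7*x[4] + 8*x[5] + 9*x[6] + 10*x[7]
    print(sigma_form == written_out)   # True: the two expressions are the same sum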

Page 9: Perceptron Networks and Vector Notation

Vector Notation

The most compact way to specify values for inputs and weights when we have many connections

The ORDER in which values are specified is important

Example: if w1 = 3.5, w2 = 1.74, and w3 = 18.2, we say that the weight vector w = (3.5, 1.74, 18.2)
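In code, an ordered vector maps naturally onto a tuple or list; a tiny sketch (not part of the slide) using the slide's weight values:

    # Order matters: index 0 holds w1, index 1 holds w2, index 2 holds w3
    w = (3.5, 1.74, 18.2)
    print(w[0], w[1], w[2])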

Page 10: Perceptron Networks and Vector Notation

Vector Operations

Vector Addition: adding two vectors means adding the values from the same position in each vector – the result is a new vector

Example: (9.2, 0, 17) + (1, 2, 3) = (10.2, 2, 20)

Vector Subtraction: subtract corresponding values

(9.2, 0, 17) - (1, 2, 3) = (8.2, -2, 14)
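A minimal sketch of these element-wise operations in Python; the function names are my own, not from the lab:

    def vec_add(a, b):
        # Add the values from the same position in each vector
        return tuple(x + y for x, y in zip(a, b))

    def vec_sub(a, b):
        # Subtract corresponding values
        return tuple(x - y for x, y in zip(a, b))

    print(vec_add((9.2, 0, 17), (1, 2, 3)))   # (10.2, 2, 20)
    print(vec_sub((9.2, 0, 17), (1, 2, 3)))   # (8.2, -2, 14)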

Page 11: Perceptron Networks and Vector Notation

Vector Operations

Dot Product: mathematical name for what a perceptron does

x · m = x1·m1 + x2·m2 + x3·m3 + … + xlast·mlast

Result of a Dot Product is a single number

example: (9.2, 0, 17) · (1, 2, 3) = 9.2 + 0 + 51 = 60.2
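And the dot product itself, the calculation a perceptron performs on its inputs and weights; a sketch with a function name of my choosing:

    def dot(a, b):
        # Multiply corresponding entries and sum the products: the result is a single number
        return sum(x * y for x, y in zip(a, b))

    print(dot((9.2, 0, 17), (1, 2, 3)))   # 9.2 + 0 + 51 = 60.2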

Page 12: Perceptron Networks and Vector Notation

Perceptron Networks and Vector Notation

CS/PY 231 Lab Presentation # 3 January 31, 2005 Mount Union College