Perceptron Networks and Vector Notation

CS/PY 231 Lab Presentation # 3 January 31, 2005 Mount Union College

A Multiple Perceptron Network for computing the XOR function

We found that a single perceptron could not compute the XOR function

Solution (sketched in code below):

• set up one perceptron to detect if x1 = 1 and x2 = 0

• set up another perceptron for x1 = 0 and x2 = 1

• feed the outputs of these two perceptrons into a third one that produces an output of 1 if either input is a 1
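One possible set of weights and thresholds for this three-perceptron arrangement is sketched below in Python; the specific values (weights of 1 and -1, thresholds of 0.5) are illustrative assumptions, not values given in the slides.

    def perceptron(inputs, weights, threshold):
        # Weighted sum of the inputs; output 1 if it reaches the threshold.
        total = sum(x * w for x, w in zip(inputs, weights))
        return 1 if total >= threshold else 0

    def xor_network(x1, x2):
        a = perceptron((x1, x2), (1, -1), 0.5)   # fires only when x1 = 1 and x2 = 0
        b = perceptron((x1, x2), (-1, 1), 0.5)   # fires only when x1 = 0 and x2 = 1
        return perceptron((a, b), (1, 1), 0.5)   # fires if either a or b fired

    for x1 in (0, 1):
        for x2 in (0, 1):
            print(x1, x2, xor_network(x1, x2))   # last column prints 0, 1, 1, 0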

A Nightmare!

Even for this simple example, choosing the weights that cause a network to compute the desired output takes skill and lots of patience

Much more difficult than programming a conventional computer:

• OR function: if x1 + x2 ≥ 1, output 1; otherwise output 0

• XOR function: if x1 + x2 = 1, output 1; otherwise output 0
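For contrast, a minimal sketch of these two rules as conventional code (Python, assuming the inputs x1 and x2 are each 0 or 1):

    def logical_or(x1, x2):
        # Output 1 if at least one input is 1.
        return 1 if x1 + x2 >= 1 else 0

    def logical_xor(x1, x2):
        # Output 1 if exactly one input is 1.
        return 1 if x1 + x2 == 1 else 0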

There must be a better way….

These labs and demos were designed to show that manually adjusting weights is tedious and difficult

This is not what happens in nature: no creature says, “Hmmm, what weight should I choose for this neural connection?”

Formal training methods exist that allow networks to learn by updating weights automatically (explored next week)

Expanding to More Inputs

artificial neurons may have many more than two input connections

calculation performed is the same: multiply each input by the weight of the connection, and find the sum of all of these products

notation can become unwieldy:

• sum = x1·w1 + x2·w2 + x3·w3 + … + x100·w100 (see the code sketch below)
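In code the calculation stays the same no matter how many connections there are; a minimal sketch, using placeholder values for the 100 inputs and weights:

    def weighted_sum(x, w):
        # Multiply each input by its connection weight and add up the products.
        return sum(xk * wk for xk, wk in zip(x, w))

    # Placeholder example with 100 connections: every input 1.0, every weight 0.5.
    x = [1.0] * 100
    w = [0.5] * 100
    print(weighted_sum(x, w))   # 50.0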

Some Mathematical Notation

Most references (e.g., Plunkett & Elman text) use mathematical summation notation

Sums of large numbers of terms are represented by the symbol Sigma (Σ)

The previous sum is denoted as: Σ (k = 1 to 100) of xk·wk

Summation Notation Basics

Terms are described once, generally

Index variable shows range of possible values

Example: Σ (k = 3 to 5) of k / (k - 1) = 3/2 + 4/3 + 5/4
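A quick numeric check of this example (a sketch, not from the slides):

    terms = [k / (k - 1) for k in range(3, 6)]   # k = 3, 4, 5 gives 3/2, 4/3, 5/4
    print(sum(terms))                            # 49/12, approximately 4.083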

Summation Notation Example

Write the following sum using Sigma notation:

3·x0 + 4·x1 + 5·x2 + 6·x3 + 7·x4 + 8·x5 + 9·x6 + 10·x7

Answer: Σ (k = 0 to 7) of (k + 3)·xk
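A quick check that the Sigma form matches the written-out sum, using placeholder values for x0 through x7 (not values from the slides):

    x = [0, 1, 2, 3, 4, 5, 6, 7]   # placeholder values for x0 … x7
    written_out = 3*x[0] + 4*x[1] + 5*x[2] + 6*x[3] + 7*x[4] + 8*x[5] + 9*x[6] + 10*x[7]
    sigma_form = sum((k + 3) * x[k] for k in range(8))
    print(written_out, sigma_form)   # both are 224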

Vector Notation

The most compact way to specify values for inputs and weights when we have many connections

The ORDER in which values are specified is important

Example: if w1 = 3.5, w2 = 1.74, and w3 = 18.2, we say that the weight vector w = (3.5, 1.74, 18.2)

Vector Operations

Vector Addition: adding two vectors means adding the values from the same position in each vector; the result is a new vector

Example: (9.2, 0, 17) + (1, 2, 3) = (10.2, 2, 20)

Vector Subtraction: subtract corresponding values

(9.2, 0, 17) - (1, 2, 3) = (8.2, -2, 14)
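A minimal sketch of these element-wise operations in Python, representing vectors as tuples:

    def vector_add(a, b):
        # Add the values from the same position in each vector.
        return tuple(ai + bi for ai, bi in zip(a, b))

    def vector_sub(a, b):
        # Subtract corresponding values.
        return tuple(ai - bi for ai, bi in zip(a, b))

    print(vector_add((9.2, 0, 17), (1, 2, 3)))   # (10.2, 2, 20)
    print(vector_sub((9.2, 0, 17), (1, 2, 3)))   # (8.2, -2, 14)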

Vector Operations

Dot Product: mathematical name for what a perceptron does

x · m = x1·m1 + x2·m2 + x3·m3 + … + xlast·mlast

Result of a Dot Product is a single number

example: (9.2, 0, 17) · (1, 2, 3) = 9.2 + 0 + 51 = 60.2
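A minimal sketch of the dot product, matching the example above:

    def dot(x, m):
        # Multiply corresponding values and add the products; the result is a single number.
        return sum(xi * mi for xi, mi in zip(x, m))

    print(dot((9.2, 0, 17), (1, 2, 3)))   # 9.2·1 + 0·2 + 17·3 = 60.2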
