Knowledge Representation: Spaces, Trees, Features
Announcements
● Optional section 1: Introduction to Matlab
– Tonight, 8:00 pm
● Problem Set 1 is available
The best statistical graphic ever?
Image removed due to copyright considerations. Please see: Tufte, Edward. The Visual Display of Quantitative Information. Cheshire CT: Graphics Press, 2001. ISBN: 0961392142.
The worst statistical graphic ever?
Image removed due to copyright considerations. Please see: Tufte, Edward. The Visual Display of Quantitative Information. Cheshire CT: Graphics Press, 2001. ISBN: 0961392142.
Knowledge Representation
● A good representation should:
– be parsimonious
– pick out important features
– make common operations easy
– make less common operations possible
Mental Representations
● Pick a domain: say, animals
● Consider everything you know about that domain.
● How is all of that knowledge organized?
– a list of facts?
– a collection of facts and rules?
– a database of statements in first-order logic?
Two Questions
1. How can a scientist figure out the structure of people's mental representations?
2. How do people acquire their representations?
World Scientist Learner
Q: How can a scientist figure out the structure of people's mental representations?
A: Ask them for similarity ratings
Scientist Learner
objects
objects
Q: How do people acquire their mental representations?
A: They build them from raw features – features that come for free
Learner World
objects
raw features
Outline
● Spatial Representations
– Multidimensional scaling
– Principal component analysis
● Tree representations
– Additive trees
– Hierarchical agglomerative clustering
● Feature representations
– Additive clustering
Multidimensional scaling (MDS)
Image removed due to copyright considerations.
Marr’s three levels
● Level 1: Computational theory
– What is the goal of the computation, and what is the logic by which it is carried out?
● Level 2: Representation and algorithm
– How is information represented and processed to achieve the computational goal?
● Level 3: Hardware implementation
– How is the computation realized in physical or biological hardware?
MDS: Computational Theory
d_ij : distance in a low-dimensional space
δ_ij : human dissimilarity ratings
● Classical MDS: d_ij ≈ δ_ij (the ratings are treated as Euclidean distances)
● Metric MDS: d_ij ≈ f(δ_ij) for some fixed transformation f of the ratings (e.g. linear)
● Non-metric MDS: rank order of the d_ij should match rank order of the δ_ij
MDS: Computational Theory
● Cost function
– Classical MDS: cost = Σ_{i,j} (δ_ij − d_ij)²
MDS: Algorithm
● Minimize the cost function using standard methods (solve an eigenproblem if possible; if not, use gradient-based methods)
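For the classical case, the eigenproblem route can be sketched directly. This is a NumPy illustration of the standard double-centering construction, not code from the course; the function name and example distances are invented.

```python
import numpy as np

def classical_mds(D, k=2):
    """Classical MDS: embed objects in k dimensions from a distance matrix D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)           # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:k]         # keep the top-k eigenpairs
    scale = np.sqrt(np.clip(vals[idx], 0, None))
    return vecs[:, idx] * scale              # one row of coordinates per object

# Distances between three collinear points at positions 0, 3, 5
D = np.array([[0., 3., 5.],
              [3., 0., 2.],
              [5., 2., 0.]])
X = classical_mds(D, k=1)
```

On exact distances like these, the one-dimensional embedding recovers the spacing exactly; on real dissimilarity ratings, the eigenvalues beyond the chosen dimensionality absorb the misfit.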
Choosing the dimensionality
● Elbow method
Image removed due to copyright considerations.
Colours
Image removed due to copyright considerations.
Phonemes
Image removed due to copyright considerations.
What MDS achieves
● Sometimes discovers meaningful dimensions
● Are the dimensions qualitatively new? Does MDS solve Fodor's problem?

What MDS doesn't achieve
● Solution (usually) invariant under rotation of the axes
● The algorithm doesn't know what the axes mean. We look at the low-dimensional plots and find meaning in them.
ideonomy.mit.edu
Image removed due to copyright considerations. Please See: http://ideonomy.mit.edu/slides/16things.html
Patrick Gunkel
Two Questions
1. How can a scientist figure out the structure of people's mental representations?
2. How do people acquire their representations?
World Scientist Learner
Principal Components Analysis (PCA)
[Diagram: PCA maps the objects × raw-features matrix to an objects × new-features matrix]
PCA
Image removed due to copyright considerations.
PCA
Image removed due to copyright considerations.
PCA
Image removed due to copyright considerations.
PCA
● Computational Theory
– find a low-dimensional subspace that preserves as much of the variance as possible
● Algorithm
– based on the Singular Value Decomposition (SVD)
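A minimal sketch of that algorithm in NumPy (illustration only; the function name and random data are invented here):

```python
import numpy as np

def pca(X, k=2):
    """PCA via SVD: project the rows of X onto the top-k principal components."""
    Xc = X - X.mean(axis=0)                        # center each raw feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                           # new features = projections

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))    # 100 objects, 5 raw features
Z = pca(X, k=2)                  # 100 objects, 2 new features
```

The new features are uncorrelated by construction, and the first component carries at least as much variance as the second.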
[Diagram: SVD — the objects × raw-features matrix ≈ (objects × new-features matrix) × (new-features × raw-features matrix)]
Image removed due to copyright considerations.
[Diagram: SVD of an objects × features matrix, truncated by keeping only the largest singular values (e.g. 0.8 and 0.5)]
PCA and MDS
PCA on a raw feature matrix = Classical MDS on Euclidean distances between feature vectors
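This equivalence is easy to check numerically. A NumPy sketch, with both constructions following the textbook recipes (the random data are invented; coordinates agree up to the sign of each axis):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 4))
Xc = X - X.mean(axis=0)

# PCA coordinates: project centered data onto the top-2 right singular vectors
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
pca_coords = Xc @ Vt[:2].T

# Classical MDS on the Euclidean distance matrix of the same data
D = np.linalg.norm(Xc[:, None, :] - Xc[None, :, :], axis=2)
n = len(X)
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (D ** 2) @ J                  # double-centered Gram matrix
vals, vecs = np.linalg.eigh(B)
idx = np.argsort(vals)[::-1][:2]
mds_coords = vecs[:, idx] * np.sqrt(vals[idx])
```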
Applications: Politics
politicians × roll-call votes ≈ co-ordinates of politicians in ideology space
US Senate, 2003
(Stephen Weis) Courtesy of Stephen Weis. Used with permission.
US Senate, 1990
(Stephen Weis) Courtesy of Stephen Weis. Used with permission.
Applications: Personality
people × answers to questions on a personality test ≈ co-ordinates of people in personality space
● The Big 5
– Openness
– Conscientiousness
– Extraversion
– Agreeableness
– Neuroticism
Applications: Face Recognition
images × pixel values ≈ co-ordinates of images in face space
Original faces
Image removed due to copyright considerations.
Principal Components
Image removed due to copyright considerations.
Face Recognition
● PCA has been discussed as a model of human perception – not just an engineering solution
– Hancock, Burton and Bruce (1996). Face processing: human perception and principal components analysis
Latent Semantic Analysis (LSA)
documents × log word frequencies ≈ co-ordinates of documents in semantic space
● New documents can be located in semantic space
● Similarity between documents is the angle between their vectors in semantic space
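A toy sketch of this pipeline (the documents, words, and counts below are all invented for illustration; real LSA uses a large corpus):

```python
import numpy as np

# Toy document × word count matrix (rows: documents, columns: words).
counts = np.array([
    [4, 3, 0, 0],   # a document about cats
    [3, 4, 1, 0],   # a document about pets
    [0, 0, 4, 5],   # a document about finance
    [0, 1, 5, 4],   # a document about markets
], dtype=float)

X = np.log1p(counts)                           # log word frequencies
U, S, Vt = np.linalg.svd(X, full_matrices=False)
docs = U[:, :2] * S[:2]                        # documents in 2-D semantic space

def cosine(a, b):
    """Similarity = cosine of the angle between two document vectors."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
```

The two pet documents end up much closer in angle than a pet document and a finance document.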
LSA: Applications
● Essay grading
● Synonym test
LSA as a cognitive theory
● Do brains really carry out SVD?
– Irrelevant: the proposal is at the level of computational theory
● A solution to Fodor's problem?
– Are the dimensions that LSA finds really new?
Compare with the Bruner reading
“striped and three borders”: conjunctive concept
Figure by MIT OCW.
● Bruner Reading:
– Raw features: texture (striped, black), shape (cross, circle), number
– Disjunctive and conjunctive combinations allowed
● LSA:
– Raw features: words
– Linear combinations of raw features allowed (new dimensions are linear combinations of the raw features)
LSA as a cognitive theory
● Do brains really carry out SVD?
– Irrelevant: the proposal is at the level of computational theory
● A solution to Fodor's problem?
– Are the dimensions that LSA finds really new?
● What the heck do the dimensions even mean?
Non-Negative Matrix Factorization
● PCA: objects × raw features ≈ a product of factor matrices whose entries can be negative
● NMF (Lee and Seung): objects × raw features ≈ a product of factor matrices whose entries must be non-negative
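The multiplicative-update algorithm of Lee and Seung can be sketched as follows. This is a minimal NumPy illustration; the epsilon, iteration count, and example matrix are ad hoc choices, not values from the paper.

```python
import numpy as np

def nmf(V, k, iters=500, seed=0):
    """Lee & Seung multiplicative updates for V ≈ W @ H, all entries >= 0."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, k)) + 0.1      # positive init keeps updates non-negative
    H = rng.random((k, m)) + 0.1
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-9)   # epsilon avoids 0/0
        W *= (V @ H.T) / (W @ H @ H.T + 1e-9)
    return W, H

# A small non-negative matrix with two obvious "parts"
V = np.array([[1., 1., 0., 0.],
              [1., 1., 0., 0.],
              [0., 0., 1., 1.]])
W, H = nmf(V, k=2)
```

Because the updates only ever multiply by non-negative ratios, the factors stay non-negative throughout, which is what makes the recovered dimensions read as additive "parts".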
Dimensions found by NMF
Please see: Lee, D. D., and H. S. Seung. "Algorithms for non-negative matrix factorization." Advances in Neural Information Processing 13. Proc. NIPS*2000, MIT Press, 2001.
See also Tom Griffiths' work on finding topics in text
Image removed due to copyright considerations.
Outline
● Spatial Representations
– Multidimensional scaling
– Principal component analysis
● Tree representations
– Additive trees
– Hierarchical agglomerative clustering
● Feature representations
– Additive clustering
Tree Representations
Image removed due to copyright considerations.
Tree Representations
● Library of Congress system
● Q335.R86
Q: Science
  Q1-Q385: General Science
    Q300-336: Cybernetics
      Q331-Q335: Artificial Intelligence
        Q335.R86: Russell & Norvig, AIMA
Tree Representations
A 5-year-old's ontology vs. a 7-year-old's ontology
Keil, Frank C. Concepts, Kinds, and Cognitive Development. Cambridge, MA: MIT Press, 1989.
Tree Representations
● We find hierarchical representations very natural. Why?

BUT

● Hierarchical representations are not always obvious. The work of Linnaeus was a real breakthrough.
Today:
● Trees with objects located only at leaf nodes
ADDTREE (Sattath and Tversky)
● Input: a dissimilarity matrix
● Output: an unrooted binary tree
● Computational Theory
– d_ij : distance in a tree
– δ_ij : human dissimilarity ratings
– Want d_ij ≈ δ_ij
● Algorithm:
– search the space of trees using heuristics
ADDTREE: example
Image removed due to copyright considerations.
ADDTREE
● Tree-distance is a metric
● Can think of a tree as a space with an unusual kind of geometry
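Tree-distance here means the total edge length along the unique path between two nodes. A small sketch in pure Python (the tree topology and edge lengths are invented for illustration):

```python
from collections import deque

def tree_distance(adj, a, b):
    """Path length between nodes a and b in an edge-weighted tree.
    adj maps node -> list of (neighbour, edge_length) pairs."""
    dist = {a: 0.0}
    queue = deque([a])
    while queue:
        u = queue.popleft()
        if u == b:
            return dist[u]          # paths in a tree are unique
        for v, w in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + w
                queue.append(v)
    raise ValueError("nodes not connected")

# A tiny unrooted binary tree: internal nodes x, y join leaves A-D
adj = {
    "A": [("x", 1.0)], "B": [("x", 2.0)],
    "C": [("y", 1.5)], "D": [("y", 0.5)],
    "x": [("A", 1.0), ("B", 2.0), ("y", 3.0)],
    "y": [("C", 1.5), ("D", 0.5), ("x", 3.0)],
}
```

Because every pair of nodes has exactly one connecting path, these path lengths automatically satisfy the metric axioms, which is the sense in which a tree is a space with an unusual geometry.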
Hierarchical Clustering
● Input: a dissimilarity matrix
● Output: a rooted binary tree
● Computational Theory
– ? (but see Kamvar, Klein and Manning, 2002)
● Algorithm:
– Begin with one group per object
– Merge the two closest groups
– Continue until only one group remains
Hierarchical Clustering
[Diagram: points A-F in the plane, merged bottom-up into nested groups]
How close are two groups?
● Single-link clustering: distance between groups = smallest pairwise distance
● Complete-link clustering: distance between groups = largest pairwise distance
● Average-link clustering: distance between groups = average pairwise distance
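The merge loop, parameterized by these linkage rules, can be sketched in a few lines of pure Python (illustrative, not efficient; the function name and example distances are invented):

```python
def agglomerate(D, link="single"):
    """Hierarchical agglomerative clustering on a dissimilarity matrix D.
    Returns the merge history as (group_a, group_b, distance) tuples."""
    reducer = {"single": min,
               "complete": max,
               "average": lambda xs: sum(xs) / len(xs)}[link]
    groups = [[i] for i in range(len(D))]
    merges = []
    while len(groups) > 1:
        best = None
        for a in range(len(groups)):           # find the two closest groups
            for b in range(a + 1, len(groups)):
                d = reducer([D[i][j] for i in groups[a] for j in groups[b]])
                if best is None or d < best[0]:
                    best = (d, a, b)
        d, a, b = best
        merges.append((groups[a], groups[b], d))
        groups = [g for i, g in enumerate(groups)
                  if i not in (a, b)] + [groups[a] + groups[b]]
    return merges

# Three points on a line at positions 0, 1, 5
D = [[0, 1, 5],
     [1, 0, 4],
     [5, 4, 0]]
merges = agglomerate(D, link="single")
```

With single-link, point 2 joins the {0, 1} group at the smaller of its two distances (4); with complete-link it would join at the larger (5).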
Hierarchical Clustering: Example
Tree-building as feature discovery
[Diagram: a tree over animals whose internal nodes correspond to discovered features such as "cetacean" and "primate"]
Outline
● Spatial Representations
– Multidimensional scaling
– Principal component analysis
● Tree representations
– Additive trees
– Hierarchical agglomerative clustering
● Feature representations
– Additive clustering (WARNING: additive clustering is not about trees)
Additive Clustering
● Representation: an object is a collection of discrete features
– e.g. Elephant = {grey, wrinkly, has_trunk, is_animal, ...}
● Additive clustering is about discovering features from similarity data
Additive clustering
s_ij = Σ_k w_k f_ik f_jk

s_ij : similarity of stimuli i, j
w_k : weight of cluster k
f_ik : membership of stimulus i in cluster k (1 if stimulus i is in cluster k, 0 otherwise)

Equivalent to similarity as a weighted sum of common features (Tversky, 1977).
Additive clustering
s_ij ≈ Σ_k w_k f_ik f_jk

[Diagram: the objects × objects similarity matrix ≈ F W Fᵀ, where F is the objects × features membership matrix and W is the diagonal matrix of cluster weights]
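Reading the formula as a matrix product makes the forward model one line of NumPy (the feature matrix and weights below are invented for illustration):

```python
import numpy as np

# Hypothetical membership matrix F (stimuli × clusters) and cluster weights w;
# the model predicts similarity as a weighted count of shared clusters.
F = np.array([[1, 0, 1],
              [1, 1, 0],
              [0, 1, 1]])          # f_ik: is stimulus i in cluster k?
w = np.array([0.5, 0.3, 0.2])      # w_k: weight of cluster k

S = F @ np.diag(w) @ F.T           # s_ij = sum_k w_k f_ik f_jk
```

Stimuli 0 and 1 share only cluster 0, so their predicted similarity is 0.5; stimuli 0 and 2 share only cluster 2, giving 0.2. Fitting additive clustering means running this model in reverse: searching for F and w that reproduce an observed similarity matrix.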
Additive clustering for the integers 0-9:
s_ij = Σ_k w_k f_ik f_jk
Rank  Weight  Stimuli in cluster (0-9)  Interpretation
1     .444    * * *                     powers of two
2     .345    * * *                     small numbers
3     .331    * * *                     multiples of three
4     .291    * * * *                   large numbers
5     .255    * * * * *                 middle numbers
6     .216    * * * * *                 odd numbers
7     .214    * * * *                   smallish numbers
8     .172    * * * * *                 largish numbers
General Questions
● We've seen several types of representations. How do you pick the right representation for a domain?
– related to the statistical problem of model selection
– to be discussed later
Next Week
● More complex representations