TRANSCRIPT
Introduction
Machine Learning CS 7641,CSE/ISYE 6740, Fall 2015
Le Song
What is machine learning (ML)?
Study of algorithms that improve their performance at some task with experience
Common to industrial-scale problems
13 million Wikipedia pages
800 million users
6 billion photos
24 hours of video uploaded per minute
> 1 trillion webpages
340 million tweets per day
Data from 2014
Increasingly relevant to science problems
Genome, Cell, Protein, Brain, Galaxy, Weather, Self-driving car, Robot, Finance, Sustainability, Design, Music
Syllabus
Covers the most commonly used machine learning algorithms in sufficient detail to understand their mechanisms.
Organization
Unsupervised learning (data exploration)
Learning without labels or without optimizing for predictive task
Supervised learning (predictive models)
Learning with labels, focusing on predictive performance
Complex models (dealing with nonlinearity, combining models, etc.)
Nonlinearity, complex dependency, real-world applications
Basics and breadth (guest lectures and seminars)
Syllabus: unsupervised learning
Learning without labels or without optimizing for predictive task
Clustering vectorial data
K-means
Hierarchical clustering
Clustering networks
Spectral algorithm
Dimensionality reduction
Principal component analysis
Dimensionality reduction for manifolds
Locally linear embedding
Density estimation
Feature selection
Novelty/abnormality detection
Organizing Images
Image Databases
What are the desired outcomes? What are the inputs (data)? What are the learning paradigms?
Find community in social networks
Visualize Image Relations
Each image has thousands or millions of pixels.
Shape of data
Feature selection
Novelty/abnormality detection
Find abnormal objects
Syllabus: supervised learning
Learning with labels, focusing on predictive performance
Classification
Nearest neighbor classifier
Naïve Bayes classifier
Logistic regression
Support vector machine
Combined classifiers
Boosting
Regression
Ridge regression
Cross-validation
Image classification
Face Detection
Weather Prediction
Predict numeric values: 40 °F; Wind: NE at 14 km/h; Humidity: 83%
Understanding brain activity
Nonlinear classifier
Nonlinear Decision Boundaries
Linear SVM Decision Boundaries
Kernel methods, Neural Networks
Syllabus: advanced topics, complex models
Nonlinearity, complex dependency, real-world applications
Kernel methods
Hidden Markov models
Graphical models
Topic modeling
Social network analysis
Collaborative filtering
Handwritten digit recognition/text annotation
Inter-character dependency
Inter-word dependency
Aoccdrnig to a sudty at Cmabrigde Uinervtisy, it deosn’t mttaer in waht oredr the ltteers in a wrod are, the olny iprmoetnt tihng is taht the frist and lsat ltteer be at the rghit pclae. The rset can be a ttoal mses and you can sitll raed it wouthit a porbelm. Tihs is bcuseae the huamn mnid deos not raed ervey lteter by istlef, but the wrod as a wlohe.
Speech recognition
“Machine Learning is the preferred method for speech recognition …”
Audio signals → Text
Model: Hidden Markov Models
[Slide shows two nearly identical walls of DNA sequence (roughly 4,600 bases each); hidden inside the first copy, the letters "whereisthegene" are embedded within the sequence.]
Bioinformatics
Where is the gene?
Spam Filtering
Webpage classification
Company homepage vs. University homepage
Organizing documents
Reading, digesting, and categorizing a vast text database is too much for humans!
We want:
Quantifying social influence
Product Recommendation
Robot Control
Now cars can find their own way!
Basics/Prerequisites
Probabilities
Distributions, densities, marginalization, conditioning
Statistics
Mean, variance, maximum likelihood estimation
Linear algebra
Vector, matrix, multiplication, inversion, eigen-decomposition
Algorithms and Programming
MATLAB, basic data structures, computational complexity
Convex optimization
Basics will be covered during lecture
Machine learning for apartment hunting
Suppose you are moving to Atlanta
And you want to find the most reasonably priced apartment satisfying your needs:
square footage, # of bedrooms, distance to campus, …
Living area (ft²)   # bedrooms   Rent ($)
230                 1             600
506                 2            1000
433                 2            1100
109                 1             500
…
150                 1               ?
270                 1.5             ?
Linear Regression Model
Assume $y$ is a linear function of $x$ (features) plus noise $\epsilon$:
$$y = \theta_0 + \theta_1 x_1 + \dots + \theta_n x_n + \epsilon$$
where $\epsilon$ is Gaussian noise, $\epsilon \sim N(0, \sigma^2)$.
Let $\theta = (\theta_0, \theta_1, \dots, \theta_n)^\top$, and augment the data by one dimension:
$$x \leftarrow (1, x)^\top$$
Then $y = \theta^\top x + \epsilon$.
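As a quick illustration of the augmented representation (a minimal NumPy sketch for the apartment data above; the array names are hypothetical, and the course itself uses MATLAB):

```python
import numpy as np

# Apartment data from the table above: (living area, # bedrooms) -> rent.
features = np.array([[230, 1],
                     [506, 2],
                     [433, 2],
                     [109, 1]], dtype=float)
rent = np.array([600., 1000., 1100., 500.])

# Augment each x by one dimension: x <- (1, x)^T, so theta_0 acts as an intercept.
X = np.column_stack([np.ones(len(features)), features])  # shape (m, n+1)

# Under the model, predictions are y_hat = X @ theta for any parameter vector.
theta = np.zeros(X.shape[1])
y_hat = X @ theta
```

Folding the intercept into $\theta$ this way lets every later formula treat the model as a single inner product $\theta^\top x$.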
Least mean square method
Given $m$ data points, find $\theta$ that minimizes the mean square error:
$$\hat{\theta} = \arg\min_\theta L(\theta) = \frac{1}{m} \sum_{i=1}^{m} \left( y_i - \theta^\top x_i \right)^2$$
Set the gradient to 0 and solve for the parameter:
$$\frac{\partial L(\theta)}{\partial \theta} = -\frac{2}{m} \sum_{i=1}^{m} \left( y_i - \theta^\top x_i \right) x_i = 0$$
$$\Leftrightarrow\; -\frac{2}{m} \sum_{i=1}^{m} y_i x_i + \frac{2}{m} \left( \sum_{i=1}^{m} x_i x_i^\top \right) \theta = 0$$
Matrix version of the gradient
Define $X = (x_1, x_2, \dots, x_m)$ and $y = (y_1, y_2, \dots, y_m)^\top$; the gradient becomes
$$\frac{\partial L(\theta)}{\partial \theta} = -\frac{2}{m} X y + \frac{2}{m} X X^\top \theta$$
$$\Rightarrow\; \theta = (X X^\top)^{-1} X y$$
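The closed-form solution can be sketched on the apartment data (NumPy for illustration, keeping the slides' convention that the $x_i$ are columns of $X$; the predicted rents are not asserted here since the slide gives none):

```python
import numpy as np

# Fit the apartment data with the closed-form solution theta = (X X^T)^{-1} X y.
x_raw = np.array([[230, 1], [506, 2], [433, 2], [109, 1]], dtype=float)
y = np.array([600., 1000., 1100., 500.])

X = np.vstack([np.ones(len(x_raw)), x_raw.T])   # shape (n+1, m), x_i as columns
theta = np.linalg.solve(X @ X.T, X @ y)         # solves (X X^T) theta = X y

# Predict rent for the unlabeled rows of the table: 150 ft2 / 1 bd, 270 ft2 / 1.5 bd.
queries = np.vstack([np.ones(2), np.array([[150, 1], [270, 1.5]]).T])
predictions = theta @ queries
```

Using `np.linalg.solve` on the normal equations avoids forming the explicit inverse, which is both cheaper and numerically safer.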
The matrix inversion in $\theta = (X X^\top)^{-1} X y$ is expensive to compute.
Gradient descent:
$$\theta^{t+1} \leftarrow \theta^{t} + \frac{\alpha}{m} \sum_{i=1}^{m} \left( y_i - (\theta^{t})^\top x_i \right) x_i$$
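A hedged sketch of this update rule on small synthetic data (the step size $\alpha = 0.5$ and the rescaled features are assumptions, not from the slides; raw square-footage features would need a much smaller step size):

```python
import numpy as np

# Gradient descent for the same least-squares objective on synthetic data.
rng = np.random.default_rng(0)
m = 100
x = rng.uniform(0.0, 1.0, size=m)
y = 1.0 + 2.0 * x + rng.normal(0.0, 0.1, size=m)

X = np.vstack([np.ones(m), x])       # x_i as columns, augmented with a 1
theta = np.zeros(2)
alpha = 0.5                          # step size (assumed)

for _ in range(20000):
    residual = y - theta @ X         # y_i - theta^T x_i for every i
    theta = theta + (alpha / m) * (X @ residual)

# For comparison: the closed-form solution from the previous slide.
theta_closed = np.linalg.solve(X @ X.T, X @ y)
```

After enough iterations the iterate matches the closed-form $\theta$, since the objective is a convex quadratic with a unique minimizer.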
Probabilistic Interpretation of LMS
Assume $y$ is linear in $x$ plus noise $\epsilon$: $y = \theta^\top x + \epsilon$.
Assume $\epsilon$ follows a Gaussian $N(0, \sigma^2)$:
$$p(y_i \mid x_i; \theta) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left( -\frac{(y_i - \theta^\top x_i)^2}{2\sigma^2} \right)$$
By the independence assumption, the likelihood is
$$L(\theta) = \prod_{i=1}^{m} p(y_i \mid x_i; \theta) = \left( \frac{1}{\sqrt{2\pi}\,\sigma} \right)^m \exp\left( -\frac{\sum_{i=1}^{m} (y_i - \theta^\top x_i)^2}{2\sigma^2} \right)$$
Probabilistic Interpretation of LMS, cont.
Hence the log-likelihood is:
$$\log L(\theta) = m \log \frac{1}{\sqrt{2\pi}\,\sigma} - \frac{1}{2\sigma^2} \sum_{i=1}^{m} \left( y_i - \theta^\top x_i \right)^2$$
LMS is equivalent to MLE of $\theta$!
$$\text{LMS: } \frac{1}{m} \sum_{i=1}^{m} \left( y_i - \theta^\top x_i \right)^2$$
How do we make this work on real data?
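One way to sanity-check the LMS/MLE equivalence numerically (a sketch with synthetic data; $\sigma$ is assumed known, and the data-generating parameters are made up for illustration):

```python
import numpy as np

# Check that the least-squares theta also minimizes the Gaussian
# negative log-likelihood (i.e., maximizes the likelihood).
rng = np.random.default_rng(1)
m, sigma = 50, 0.3
x = rng.uniform(-1.0, 1.0, size=m)
y = 0.5 - 1.5 * x + rng.normal(0.0, sigma, size=m)
X = np.vstack([np.ones(m), x])

def neg_log_likelihood(theta):
    r = y - theta @ X
    return m * np.log(np.sqrt(2 * np.pi) * sigma) + (r @ r) / (2 * sigma ** 2)

theta_lms = np.linalg.solve(X @ X.T, X @ y)   # least mean square solution

# Any perturbation of theta_lms should not decrease the negative log-likelihood.
for _ in range(100):
    perturbed = theta_lms + rng.normal(0.0, 0.5, size=2)
    assert neg_log_likelihood(perturbed) >= neg_log_likelihood(theta_lms)
```

This holds because the negative log-likelihood is the squared-error sum up to a constant and a positive scale, so the two objectives share the same minimizer regardless of $\sigma$.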
Textbooks
No single book covers everything in machine learning
Pattern Recognition and Machine Learning, Chris Bishop
Written by a leading industrial researcher
Presented from a probabilistic and graphical-model view
The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Trevor Hastie, Robert Tibshirani, Jerome Friedman
Written by leading statisticians
Presented from a statistical view
Machine Learning: a Probabilistic Perspective, Kevin Murphy
Written by a former university professor, now at Google
More details on derivations; good for self-study
Grading
4 assignments (50%)
Approximately 1 assignment every 5 lectures
More next slide
Midterm exam (20%)
Final exam (20%)
Background test (5%)
More in 2 slides
Participation (5%)
Guest lectures, seminar attendance, and class feedback
Assignments
Homework should be submitted before the deadline set in T-Square.
No late submissions will be accepted via email.
We strongly encourage you to use LaTeX for your submission.
Any kind of academic misconduct is subject to an F grade as well as a report to the Dean of Students.
All answers and code should be prepared by yourself.
If you refer to any material, it should be properly cited.
Write on your homework the name of anyone with whom you collaborated.
Background test
Background test in the second lecture
Checks your basic knowledge of probability and statistics, linear algebra, and MATLAB
Aims to assess your readiness for the class
< 40%: the class may be too difficult; consider taking it next time
40–79%: brush up on your basic knowledge
In both cases, you need to attend a mandatory review session
The team
Instructor: Le Song, Klaus 1340
TA: Bo Xie, Hanjun Dai, Ashwin Shenoi, Rashit Tevali
Guest Lecturer: TBD
Links:
TA contacts: [email protected]
Discussion group:
http://piazza.com/gatech/fall2015/cse6740cs7641isye6740/home
More information: http://www.cc.gatech.edu/~lsong/teaching/CSE6740fall15.html