Design Principles of Neural Networks
Human brains as metaphors of statistical models
Biological analogies
The visual cortex of mammals
Multiple sensing channels
Memory and attention
Machine learning instantiations
Deep convolutional neural networks
Multimodal neural networks
LSTMs and GRUs
Learning Mechanism: Correction of Mistakes
Nature used a single tool to get to today's success: the correction of mistakes.
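As a concrete illustration (not from the slides), here is a minimal sketch of learning by mistake correction: a single linear neuron compares its prediction with the target and nudges its weights in the direction that shrinks the error, i.e. plain gradient descent on a toy regression problem. All names and sizes are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch: learning as repeated correction of mistakes.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                  # toy inputs
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=100)    # noisy targets

w = np.zeros(3)
lr = 0.1
for _ in range(200):
    y_hat = X @ w                              # prediction
    error = y_hat - y                          # the "mistake"
    grad = X.T @ error / len(y)                # how the mistake depends on each weight
    w -= lr * grad                             # correct the mistake

print(w)                                       # approaches true_w
```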
Learning To Leverage Context
Memory in Recurrent Architectures: LSTM (Long Short-Term Memory Network)
Input x, output y, context c (memory).
[Figure: LSTM cell unrolled over time steps t, showing the forget gate, memorization gate, output gate, and the concatenation of the input with the context.]
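A minimal sketch of a single LSTM step under the standard gate formulation implied by the figure (forget gate, memorization/input gate, output gate, concatenation of the previous output with the input); the weight names and dimensions are assumptions for illustration only.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM step: W maps the concatenated [h_prev, x] to the gate
    pre-activations; c is the context (memory), h is the output y."""
    z = np.concatenate([h_prev, x])            # concatenation of context and input
    f = sigmoid(W["f"] @ z + b["f"])           # forget gate: what to erase from c
    i = sigmoid(W["i"] @ z + b["i"])           # memorization gate: what to write to c
    g = np.tanh(W["g"] @ z + b["g"])           # candidate memory content
    o = sigmoid(W["o"] @ z + b["o"])           # output gate: what to expose as y
    c = f * c_prev + i * g                     # updated context (memory)
    h = o * np.tanh(c)                         # output y at this time step
    return h, c

# Unroll over a few time steps with random weights (sizes are illustrative).
dim_x, dim_h = 4, 8
rng = np.random.default_rng(0)
W = {k: rng.normal(scale=0.1, size=(dim_h, dim_h + dim_x)) for k in "figo"}
b = {k: np.zeros(dim_h) for k in "figo"}
h = c = np.zeros(dim_h)
for x in rng.normal(size=(5, dim_x)):
    h, c = lstm_step(x, h, c, W, b)
```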
Why Is Context Important?
In language, most grammars are not context-free.
End-to-end translation, Alex Graves
Memory is necessary for localization
Latest experiment in asynchronous deep RL: LSTMs for maze running
Memory comes at a cost: a lot of RAM or VRAM is necessary
The distributed brain at the edge
Distributed RL is reminiscent of the philosophical omega point of knowledge
Multiple Input Control
Multiplexing Inputs
[Figure: input-multiplexing network. Four input streams (radar, front camera, rear camera, odometry) each pass through stacked Conv → Max → ReLU blocks and a fully connected layer; the branch outputs are concatenated and fed through two fully connected layers and a softmax.]
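A hedged PyTorch sketch of such an input-multiplexing network, assuming image-like radar and camera inputs, a low-dimensional odometry vector, and a softmax classification head; the channel counts, layer sizes, and number of output classes are illustrative assumptions, not values taken from the slide.

```python
import torch
import torch.nn as nn

class ConvBranch(nn.Module):
    """One per-sensor branch: stacked Conv -> Max -> ReLU, then a fully connected layer."""
    def __init__(self, in_channels, out_features=64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1), nn.MaxPool2d(2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.MaxPool2d(2), nn.ReLU(),
        )
        self.fc = nn.Sequential(nn.Flatten(), nn.LazyLinear(out_features), nn.ReLU())

    def forward(self, x):
        return self.fc(self.features(x))

class MultiplexedNet(nn.Module):
    """Radar, front camera, and rear camera each get a conv branch; odometry gets a
    small fully connected branch. Branch outputs are concatenated, then passed
    through two fully connected layers and a softmax over control classes."""
    def __init__(self, n_classes=5):
        super().__init__()
        self.radar = ConvBranch(in_channels=1)
        self.front_cam = ConvBranch(in_channels=3)
        self.rear_cam = ConvBranch(in_channels=3)
        self.odometry = nn.Sequential(nn.Linear(6, 64), nn.ReLU())
        self.head = nn.Sequential(
            nn.Linear(4 * 64, 128), nn.ReLU(),    # concatenated output -> fully connected
            nn.Linear(128, n_classes),            # second fully connected layer
            nn.Softmax(dim=1),                    # softmax over output classes
        )

    def forward(self, radar, front, rear, odo):
        z = torch.cat([self.radar(radar), self.front_cam(front),
                       self.rear_cam(rear), self.odometry(odo)], dim=1)
        return self.head(z)

# Example with dummy inputs (shapes are assumptions, not from the slide):
net = MultiplexedNet()
probs = net(torch.randn(2, 1, 64, 64), torch.randn(2, 3, 64, 64),
            torch.randn(2, 3, 64, 64), torch.randn(2, 6))
```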