Learning Important Features Through Propagating Activation Differences
Avanti Shrikumar, Peyton Greenside, Anshul Kundaje
Stanford University
ICML, 2017. Presenter: Ritambhara Singh
Avanti Shrikumar, Peyton Greenside, Anshul Kundaje (Stanford University). Learning Important Features Through Propagating Activation Differences. ICML, 2017. Presenter: Ritambhara Singh. Slide 1 / 39
Outline
1 Introduction
    Motivation
    Background
    State-of-the-art
    Drawbacks
2 Proposed Approach
    DeepLIFT Method
    Defining Reference
    Solution
    Multipliers and Chain Rule
    Separating positive and negative contribution
    Rules for assigning contributions
3 Results
    MNIST digit classification
    DNA sequence classification
Motivation
Interpretability of neural networks: assign an importance score to each input for a given output.
Importance is defined in terms of differences from a ‘reference’ state.
Propagates an importance signal even where the gradient is zero.
Gives separate consideration to positive and negative contributions.
Interpretation of Neural Networks
State-of-the-art
Perturbation-based forward-propagation approaches: Zeiler and Fergus (2013), Zhou and Troyanskaya (2015).
Backpropagation-based approaches: saliency maps (Simonyan et al., 2013), Guided Backpropagation (Springenberg et al., 2014).
Saturation problem
Thresholding Problem
y = max(0, x − 10)
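The thresholding problem can be made concrete. The sketch below (illustrative values, not from the slides) contrasts the gradient of y = max(0, x − 10), which is zero below the threshold and identical everywhere above it, with a difference-from-reference score that reflects how far the input is from the reference:

```python
# Minimal sketch of the thresholding problem for y = max(0, x - 10).
# The reference input x_ref = 0 is an illustrative choice.

def y(x):
    return max(0.0, x - 10.0)

def grad_y(x):
    # The gradient is 0 below the threshold and 1 above it.
    return 0.0 if x < 10.0 else 1.0

def diff_from_reference(x, x_ref=0.0):
    # DeepLIFT-style score: change in output per unit change in input,
    # measured against the reference rather than infinitesimally.
    return (y(x) - y(x_ref)) / (x - x_ref)

print(grad_y(5.0))                    # 0.0: gradient is blind below the threshold
print(grad_y(15.0), grad_y(1000.0))   # 1.0 1.0: identical far above it
print(diff_from_reference(15.0))      # ~0.33: accounts for the crossed threshold
```

The gradient at x = 15 and x = 1000 is the same, while the difference-from-reference score distinguishes inputs by how much of the path from reference to input lies past the kink.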
Philosophy
Explains the difference in output from some ‘reference’ output in terms of the difference in input from some ‘reference’ input.
Summation-to-delta property:

    Σ_{i=1}^{n} C_{Δx_i Δt} = Δt    (1)

Blame Δt on Δx_1, Δx_2, ...
C_{Δx_i Δt} can be non-zero even when ∂t/∂x_i is zero.
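For a linear neuron, the summation-to-delta property of Eq. (1) can be verified directly: each input's contribution is its weight times its difference-from-reference, and these sum exactly to Δt. The weights, bias, and inputs below are invented for illustration:

```python
# Sketch of summation-to-delta (Eq. 1) for a linear neuron
# t = sum_i w_i * x_i + b, with contributions C_{Δx_i Δt} = w_i * Δx_i.
# All numbers here are made up for illustration.

w = [2.0, -1.0, 0.5]
b = 3.0
x_ref = [0.0, 0.0, 0.0]   # reference input
x     = [1.0, 4.0, -2.0]  # actual input

def t(inputs):
    return sum(wi * xi for wi, xi in zip(w, inputs)) + b

delta_t = t(x) - t(x_ref)
contribs = [wi * (xi - ri) for wi, xi, ri in zip(w, x, x_ref)]

# The per-input contributions sum exactly to the change in output.
assert abs(sum(contribs) - delta_t) < 1e-12
print(contribs, delta_t)  # [2.0, -4.0, -1.0] -3.0
```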
Defining Reference
Given a neuron x with inputs i_1, i_2, ... such that x = f(i_1, i_2, ...)
Given reference activations i_1^0, i_2^0, ... of the inputs:

    x^0 = f(i_1^0, i_2^0, ...)    (2)

Choose a reference input and propagate activations through the net.
A good reference relies on domain knowledge: “What am I interested in measuring differences against?”
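Eq. (2) amounts to one extra forward pass: push the chosen reference input through the network and record every neuron's activation. The two-layer network below is hypothetical, and the all-zeros reference is just one common choice (e.g. an all-black image for MNIST); the right reference is domain-specific:

```python
# Sketch of Eq. 2: reference activations are obtained by forward-propagating
# the reference input. The tiny network and its weights are invented.
import math

def forward(i1, i2):
    h = max(0.0, 1.5 * i1 - 0.5 * i2)        # hidden ReLU unit
    x = 1.0 / (1.0 + math.exp(-(h - 1.0)))   # sigmoid output neuron
    return x

i1_ref, i2_ref = 0.0, 0.0          # chosen reference input
x_ref = forward(i1_ref, i2_ref)    # x^0 = f(i_1^0, i_2^0)
x_act = forward(2.0, 1.0)          # activation on the actual input

delta_x = x_act - x_ref            # the difference-from-reference to explain
print(x_ref, x_act, delta_x)
```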
Saturation Problem
Thresholding Problem
Multipliers
    m_{Δx Δt} = C_{Δx Δt} / Δx    (3)

The multiplier is the contribution of Δx to Δt divided by Δx.
Compare: the partial derivative ∂t/∂x is the infinitesimal contribution of ∂x to ∂t, divided by ∂x.
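The contrast with the partial derivative can be sketched on a single-input function (illustrative choice, reusing the thresholding example): the multiplier is a finite-difference slope over the whole jump from reference to input, while the derivative is a local slope at the input:

```python
# Sketch of Eq. 3: the multiplier m_{Δx Δt} is the slope measured from the
# reference to the input, not the local gradient. Values are illustrative.

def t(x):
    return max(0.0, x - 10.0)

x_ref, x = 0.0, 20.0
C = t(x) - t(x_ref)        # contribution of Δx to Δt (single-input case)
m = C / (x - x_ref)        # m_{Δx Δt} = C_{Δx Δt} / Δx

grad_at_x = 1.0            # ∂t/∂x at x = 20, above the threshold
print(m, grad_at_x)        # 0.5 1.0: the multiplier averages over the flat part
```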
Chain Rule
    m_{Δx_i Δz} = Σ_j m_{Δx_i Δy_j} m_{Δy_j Δz}    (4)

Can be computed efficiently via backpropagation.
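Because multipliers compose like partial derivatives, they can be accumulated layer by layer in a single backward pass. The sketch below uses one hidden layer of two linear units with invented weights; for a linear unit the multiplier is simply the weight:

```python
# Sketch of Eq. 4: composing multipliers across a hidden layer.
# x -> y_j with weights a_j, y_j -> z with weights b_j; all units linear,
# so m_{Δx Δy_j} = a_j and m_{Δy_j Δz} = b_j. Weights are illustrative.

a = [2.0, -3.0]
b = [0.5, 1.0]

m_x_z = sum(aj * bj for aj, bj in zip(a, b))  # Σ_j m_{Δx Δy_j} m_{Δy_j Δz}

# Direct check: z(x) = Σ_j b_j * (a_j * x), so Δz/Δx must equal m_x_z.
def z(x):
    return sum(bj * (aj * x) for aj, bj in zip(a, b))

x_ref, x = 0.0, 4.0
assert abs((z(x) - z(x_ref)) / (x - x_ref) - m_x_z) < 1e-12
print(m_x_z)  # -2.0
```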
Separating positive and negative contribution
In some cases, it is important to treat positive and negative contributions differently.
Introduce Δx_i^+ and Δx_i^-, such that:

    Δx_i = Δx_i^+ + Δx_i^-
    C_{Δx_i Δt} = C_{Δx_i^+ Δt} + C_{Δx_i^- Δt}
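For a linear neuron, one simple way to realize this split is by the sign of each term w_i Δx_i: the positive terms make up the positive part of Δt and the negative terms the negative part, and the two parts still sum to the total. The weights and differences below are illustrative:

```python
# Sketch of the positive/negative split for a linear neuron t = Σ w_i x_i.
# Each term w_i * Δx_i is routed to the positive or negative part by sign.
# Weights and Δx values are made up for illustration.

w  = [1.0, -2.0, 3.0]
dx = [0.5, 0.5, -1.0]   # Δx_i relative to the reference

terms = [wi * dxi for wi, dxi in zip(w, dx)]
dt_pos = sum(c for c in terms if c > 0)   # positive contributions
dt_neg = sum(c for c in terms if c < 0)   # negative contributions

# The split is exact: the two parts recombine to the total Δt.
assert abs((dt_pos + dt_neg) - sum(terms)) < 1e-12
print(dt_pos, dt_neg)  # 0.5 -4.0
```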
Linear Rule
Rescale Rule
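The slide's figure is not reproduced in this transcript. As defined in the DeepLIFT paper, the Rescale rule for a nonlinearity y = f(x) assigns the whole of Δy to Δx, giving the multiplier m_{Δx Δy} = Δy/Δx, with a fallback to the gradient when Δx is near zero to avoid numerical instability. A sketch:

```python
# Sketch of the Rescale rule for a nonlinearity y = f(x):
# m_{Δx Δy} = Δy / Δx, with a gradient fallback when Δx ~ 0.

def relu(x):
    return max(0.0, x)

def rescale_multiplier(x, x_ref, f=relu, grad=lambda x: float(x > 0), eps=1e-7):
    dx = x - x_ref
    if abs(dx) < eps:
        return grad(x)                 # avoid dividing by a tiny Δx
    return (f(x) - f(x_ref)) / dx      # m_{Δx Δy} = Δy / Δx

m_flat = rescale_multiplier(-2.0, 0.0)   # zero: ReLU is flat on this segment
print(rescale_multiplier(3.0, 0.0))      # 1.0: entirely on the linear segment
print(rescale_multiplier(3.0, -1.0))     # 0.75: the jump crosses the kink
```

Unlike the raw gradient, the multiplier for the input crossing the kink is a fraction, reflecting that only part of the jump from reference to input produced output change.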
Where it works
Where it fails: “min” (AND) relation
RevealCancel Rule
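The slide's figure is not reproduced in this transcript. As given in the DeepLIFT paper, the RevealCancel rule computes the positive part Δy+ of a nonlinearity's output change as the average effect of adding Δx+ before and after Δx- (a Shapley-style average over the two orderings), and symmetrically for Δy-. This lets cancelling inputs reveal each other, capturing min/AND behaviour that Rescale misses. A sketch:

```python
# Sketch of the RevealCancel rule for a nonlinearity y = f(x): average the
# impact of Δx+ with and without Δx- already applied, and vice versa.

def reveal_cancel(f, x_ref, dx_pos, dx_neg):
    y_ref = f(x_ref)
    dy_pos = 0.5 * ((f(x_ref + dx_pos) - y_ref) +
                    (f(x_ref + dx_neg + dx_pos) - f(x_ref + dx_neg)))
    dy_neg = 0.5 * ((f(x_ref + dx_neg) - y_ref) +
                    (f(x_ref + dx_pos + dx_neg) - f(x_ref + dx_pos)))
    return dy_pos, dy_neg

relu = lambda x: max(0.0, x)

# With Δx+ = 4 and Δx- = -4 through a ReLU (reference 0), Δy is zero, so
# Rescale would report nothing; RevealCancel exposes the cancellation:
dy_pos, dy_neg = reveal_cancel(relu, 0.0, 4.0, -4.0)
print(dy_pos, dy_neg)  # 2.0 -2.0
```

Note that dy_pos + dy_neg still equals the total Δy, so summation-to-delta is preserved.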
Solution: “min” (AND) relation
MNIST digit classification
DNA sequence classification
Summary
Novel approach for computing importance scores based on differences from a ‘reference’.
Using difference-from-reference allows information to propagate even when the gradient is zero.
Separates contributions from positive and negative terms.
Video at: https://www.youtube.com/watch?v=v8cxYjNZAXc&index=1&list=PLJLjQOkqSRTP3cLB2cOOi_bQFw6KPGKML
Slides at: https://drive.google.com/file/d/0B15F_QN41VQXbkVkcTVQYTVQNVE/view

Future Directions
Applying DeepLIFT to RNNs
Computing the ‘reference’ empirically from data