Malware Triage: Using Open Data to Help Develop Robust IOCs
What is an IOC
Indicators of Compromise (IOCs) are forensic artifacts of an intrusion that can be identified on a host or network.
openioc.org (http://openioc.org/resources/An_Introduction_to_OpenIOC.pdf)
In Practice…
"Is APTx attacking us? I saw this frightening article in CISO Monthly magazine…"
"WTF?? Ransomware just infected half of accounting??"
"Wait, didn't I just remove this same trojan from Dave's workstation last week?"
"Hey look, I just hooked that IOC feed to our IDS…"
"OMG! Our IDS just blocked traffic to all of our developers!"
A Possible Solution: Triage
Inputs: Suspicious URL, Suspicious E-mail, Intel feed
Security Event → Is it malicious? → What is it exploiting? → Do we have exposure? → Incident!
A Better Solution: Triage + IOCs = Automation!
Inputs: Suspicious URL, Suspicious E-mail, Intel feed
Security Event → Filter knowns (IOCs) → Is it malicious? → What is it exploiting? → Do we have exposure? → Incident!
Root cause analysis of incidents feeds new IOCs back into the "filter knowns" step.
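The "filter knowns" step above can be sketched in a few lines. This is a minimal illustration, not a real SIEM integration: the `"indicator"` field name and the event shape are hypothetical.

```python
def filter_knowns(events, known_iocs):
    """Split incoming security events into known (IOC hit) and unknown.

    events: iterable of dicts with a hypothetical "indicator" field
    known_iocs: set of indicator values built from prior root cause analysis
    """
    known, unknown = [], []
    for event in events:
        if event.get("indicator") in known_iocs:
            known.append(event)    # auto-handled; no analyst time spent
        else:
            unknown.append(event)  # goes on to manual triage
    return known, unknown
```

Events that hit a known IOC are dispositioned automatically, so analysts only triage the unknowns.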
The Problem With AV
Artemis!1A5E05B1B9E1
Artemis!262BC0AE2FB0
Artemis!04296F13925B
Artemis!110C43F8A337
Artemis!9BFC61456261
Artemis!9BE792AC4667
Malware Specific IOCs
IOCs: Indicators of Compromise.
Malware Specific IOCs: forensic artifacts resulting from the presence or execution of malware.
AV Signatures
Robust IOCs
[Chart: Effectiveness of IOC plotted against the lifetime of a malware family and the diversity of malware. A brittle IOC loses effectiveness quickly; a robust IOC stays effective.]
Most Robust Is… Multiple Samples + Code Review + Comparative Analysis!!
Reverse engineering with IDA and a debugger!!
The Key is Comparative Analysis
[Diagram: the Primary Sample is linked to Sample #1 and Sample #2 through a shared pivot (attribute).]
This is one of the most important slides in the presentation.
Building Robust Indicators
Analysis (Triage) → Identify Pivots → Discovery (Mining Open Data) → Comparative Analysis → Develop IOC → Test (Validate)
Analysis: Triage
• Is it malicious?
• Can we identify the malware family?
• Collect static attributes
• Collect dynamic attributes
Static Attributes
Hmm… Something isn’t right, there are no file properties for this executable?
I’m totally legit!
Static Attributes: Metadata
• Compiler Artifacts
• EXIF Data
Easily modified, so they can make poor indicators.
For sample discovery, though, they can work as primary indicators.
Static Attributes Identified
• Initial Sample was packed with UPX
• Contains no file or version metadata
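Static attributes like these can be collected with nothing but the standard library. A minimal sketch: the UPX check is a crude byte-marker heuristic (stock UPX leaves `UPX0`/`UPX1` section names and a `UPX!` banner in the file), not a real PE section-table parse.

```python
import hashlib

def static_attributes(data: bytes) -> dict:
    """Collect basic static attributes from raw file bytes."""
    return {
        "md5": hashlib.md5(data).hexdigest(),
        "sha256": hashlib.sha256(data).hexdigest(),
        # Heuristic only: markers left behind by an unmodified UPX packer.
        "upx_packed": any(m in data for m in (b"UPX0", b"UPX1", b"UPX!")),
    }
```

A production triage pipeline would use a proper PE parser instead; this sketch just shows how cheaply hash and packer attributes fall out of a sample.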
Packer / Crypter vs. Static Attributes
A packed sample presents only a packer stub and an obfuscated payload in place of the real payload, so it yields low quality static attributes.
Packer / Crypter Weakness: Runtime!
[Diagram: a packed PE vs. an unpacked PE at runtime. Once executing, both make the same sequence of Windows API calls (steps 1-5), so runtime behaviour looks the same regardless of packing.]
Sandbox Magic
[Diagram: a sandbox process monitor hooks the running PE's Windows API calls, capturing Network, Filesystem, Registry, Process, Synchronization, and Services activity.]
Dynamic Attributes
• In-Memory Strings
• Process Handles / Mutex
• Accessed / Created Files
• Registry Keys
• Network Traffic
Level Up Your Analysis With Some Light Debugging
Quickly trace the sample in a debugger to deobfuscate strings and gain CONTEXT.
Try Windbg!
Dynamic Attributes Identified
• Creates Mutex: QKitMan2016_1
• Creates Registry Key: HKEY_CURRENT_USER\SOFTWARE\QKitMan2016
• Requests its IP from IPReq using an HTTP GET request
• Posts the IP as a payload to LiveJournal account qkitman1010
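Dynamic attributes like the mutex and registry key above translate directly into a host IOC. A minimal sketch, assuming a sandbox report represented as a plain dict (the field names `"mutexes"` and `"registry_keys"` are illustrative, not a real sandbox schema):

```python
# IOC built from the dynamic attributes observed above.
QKITMAN_IOC = {
    "mutexes": {"QKitMan2016_1"},
    "registry_keys": {r"HKEY_CURRENT_USER\SOFTWARE\QKitMan2016"},
}

def ioc_hits(report: dict, ioc: dict) -> dict:
    """Return which IOC fields matched a sandbox report, and on what values."""
    return {
        field: sorted(set(report.get(field, ())) & values)
        for field, values in ioc.items()
        if set(report.get(field, ())) & values
    }
```

In practice the same attributes would be expressed in a real IOC format and evaluated by an endpoint agent; the dict form just makes the matching logic visible.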
Tricks For Searching Online Sandboxes
• Shared Virtualization Infrastructure
• Shared Templates
• Hunting using computed Values
Comparative Analysis
Identify common characteristics
Common Properties
Common Behaviour
This is a key section even though there aren’t a lot of slides.
Comparison Checklist

Attribute        Initial Sample   Pivot Sample A   Pivot Sample B
Strings          X                X
Exif Data
Imphash
Memory Strings   X                X                X
Mutex            X                X
File Names       X
Registry Keys    X                X                X
Network Traffic  X                X
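The checklist above boils down to intersecting attribute sets across samples: anything present in all of them is a candidate pivot for an IOC. A minimal sketch, with each sample represented as a dict mapping attribute type to a set of observed values:

```python
def common_attributes(samples):
    """Return attribute values shared by every sample in the list.

    samples: list of dicts mapping attribute type -> set of values.
    """
    if not samples:
        return {}
    # Only attribute types collected for every sample can be compared.
    fields = set(samples[0])
    for s in samples[1:]:
        fields &= set(s)
    common = {}
    for field in fields:
        shared = set.intersection(*(s[field] for s in samples))
        if shared:
            common[field] = shared  # candidate pivots for a robust IOC
    return common
```

Attributes that survive the intersection across many diverse samples are exactly the ones that make robust, rather than brittle, indicators.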
Level Up Your Comparative Analysis With Some Light Disassembly
Comparative analysis works at the byte code level as well!
The opcodes of the string building algorithm are identical. Don’t forget to use wildcards for variable bytes (0D)!!
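The wildcard idea can be sketched as a simple byte-pattern scanner, where `None` stands for a variable byte such as the 0D operand mentioned above. This is an illustration of the technique, not a real signature engine:

```python
def find_pattern(data: bytes, pattern) -> int:
    """Return the first offset where pattern matches in data, or -1.

    pattern: sequence of ints (exact byte values) and None (wildcard byte).
    """
    n = len(pattern)
    for i in range(len(data) - n + 1):
        if all(p is None or data[i + j] == p for j, p in enumerate(pattern)):
            return i
    return -1
```

Signature languages such as YARA express the same thing as hex strings with `??` wildcards, which is what you would use in practice.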
IOC Formats
We Really, Really Aren’t Talking About Formats!
There are tons of links to great free IOC training on our site : )
Known Bad
Indicator types dictate how they are tested.
Run indicators against a repository of known bad samples. Validate! Update when required.
Known Good
Test indicators against a repository of known good samples. Validate! Resolve issues.
Try testing IOCs against your corporate “golden image(s)”.
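Both tests can be folded into one evaluation step. A sketch, assuming an indicator is just a predicate over a sample: misses on known bad mean the indicator is too narrow, and hits on known good (such as your golden images) mean it is too broad.

```python
def evaluate_indicator(matcher, known_bad, known_good):
    """Score an indicator against labelled corpora.

    matcher: callable(sample) -> bool
    Returns (missed known-bad samples, false positives on known good).
    """
    missed = [s for s in known_bad if not matcher(s)]
    false_positives = [s for s in known_good if matcher(s)]
    return missed, false_positives
```

Re-running this whenever either corpus grows is what "remember to continuously test your IOCs" looks like in code.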
Test Automation
Waiting for samples to hit your organization before testing indicators
vs.
Automating discovery before your organization is affected
Key Takeaways
Triage + IOCs = Automation!
Robust IOCs can be built without the need for a debugger or disassembly.
Comparative analysis is key!
Open data can be leveraged to collect related samples. Try OAPivot…
Remember to continuously test your IOCs.
Image Attribution
• Noun Project - Molecules by Zoë Austin
• Noun Project - Funnel by Vaibhav Radhakrishnan
• Noun Project - Shield by AFY Studio
• Noun Project - Checklist by Arthur Shlain
• Noun Project - Rocket Man by LuisPrado
• Noun Project - Head-Desk by Karthik Srinivas
• Noun Project - Checked Database by Arthur Shlain
• Noun Project - Kevin Augustine LO
• Noun Project - Compilation by Richard Slater
• Noun Project - Database Warning by ProSymbols
• Noun Project - Flow Chart by Richard Schumann
• Noun Project - Debug by Lemon Liu
• Noun Project - Network by Creative Stall
• Noun Project - File Settings by ProSymbols