IP Pier solution
TRANSCRIPT
Intro
56% of Internet traffic is generated by bots
95% of site breaches and infections are automated
300% annual increase in DDoS attacks at the Application layer
30% annual increase in the total number of DDoS attacks
Average DDoS attack: 9.7 Gb/s and 19 Mpps
Major attacks have grown beyond 600 Gb/s
Growing number of users behind NAT and proxies
Growing number of mobile users
Mass migration from HTTP to HTTPS
PCI DSS requirement prohibiting transfer of SSL certificates to third parties
CAPTCHA is no longer effective
Intro 2
It is necessary to block ALL bot queries
New paradigms of breaches
High reliability of the service
Wide channels for protection from L3&4 DDoS attacks
Protection from DDoS attacks at the Application layer
Capability to detect singular bot queries
Protection from bots without blocking IP addresses
Capability to filter HTTPS without traffic disclosure
Requirements for security systems
In protection
Active Bot Protection (ABP)
Protection from DDoS at layer 7
Protection from DDoS at layer 3
Protection of HTTPS
Detection of bots without CAPTCHA
WAF
Zero Day
White and black lists
Increase in site availability
Site boost (caching, optimization, SPDY)
Site balancing (for multiple platforms including)
Optimization (for a mobile client through traffic compression)
Site monitoring and statistics
IPv6
Always Online
Custom error pages
Capacities of Cloud
Cloud fail safety:
2 Tb/s – capacity of communication channels from different operators
2N backup of all Cloud components
General working principles of the scrubbing cloud
Cloud connection:
Change of the A DNS record
BGP network announcement (not less than /24)
(Diagram: ISP 1, ISP 2 … ISP N, each serving a client's platform)
Basic protection principles
border packet filter
hardware packet filter
software packet filter
stateful analyzer
Application layer verification
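The stages above form a pipeline in which each filter discards what it can before passing traffic to the next, more expensive stage. A toy sketch of that idea, assuming all names, signatures, and checks are illustrative stand-ins rather than the vendor's implementation:

```python
# Hypothetical sketch of a staged filtering pipeline: a packet is accepted
# only if every stage, from the cheap border filter to L7 verification,
# passes it. All checks here are toy placeholders.

from dataclasses import dataclass

@dataclass
class Packet:
    src_ip: str
    size: int
    payload: bytes

def border_filter(pkt):      # coarse ACL at the network border
    return pkt.src_ip not in {"203.0.113.7"}       # example blacklist entry

def hardware_filter(pkt):    # line-rate sanity checks, e.g. packet size
    return 0 < pkt.size <= 1500

def software_filter(pkt):    # signature checks done in software
    return b"\x00\x00" * 16 not in pkt.payload     # toy signature

def stateful_analyzer(pkt):  # placeholder for per-flow state tracking
    return True

def app_layer_verification(pkt):  # placeholder for Application-layer checks
    return True

STAGES = [border_filter, hardware_filter, software_filter,
          stateful_analyzer, app_layer_verification]

def accept(pkt: Packet) -> bool:
    """A packet is accepted only if every stage passes it."""
    return all(stage(pkt) for stage in STAGES)
```

Ordering the stages from cheapest to most expensive means the bulk of attack traffic is dropped before it ever reaches the costly Application-layer verification.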
Implementation features:
Detection of some attacks via L3&4 traffic analysis using original math algorithms
Active interaction with bots
Automated control of security levels
Different security levels for different URLs can be applied simultaneously
Interaction with bots within 0.2 – 64 Kb of traffic
Counter-bot system (we make an attack resource-intensive and economically unsound)
Active Bot Protection (basic principles)
Benefits for the client:
Protection from DDoS at the Application layer
Protection from scanning
Protection from automated replication
Protection from spam bots in comments and forums
No need to use CAPTCHA
Protection from the very first query for HTTPS, both with and without traffic disclosure
Active Bot Protection for client
Operation modes of the complex:
✓ Filtration at the Application layer disabled.
✓ "DDoS protection" – we analyze every query but do not interfere with user-application interaction until a user looks suspicious to us. This is the most common mode, suitable for most sites. If any suspicion arises about a user's legitimacy, then, before proxying their queries, we enable additional verification mechanisms and watch the reaction. If everything is fine, we allow the query.
✓ "Active Bot Protection" – in this mode we test every user regardless of prior activity. It is used when maximum protection is required, even against a single bot query; analytics stays enabled throughout. The verification methods applied are selected based on personal account settings and the user's activity. This approach can rid a site of bots entirely.
Operation principles of traffic filtration at Application layer
HTTPS traffic filtration (with disclosure)
The SSL certificate with its key is transferred; traffic is disclosed.
Benefits:
Requires no integration with the security system (except certificate transfer)
Easy setup
Drawbacks:
Certificate transfer is necessary
PCI DSS requirements are not met
HTTPS filtration (without traffic disclosure, with log transfer)
Access logs are transferred for analysis and for registering bots in blacklists.
Benefits:
Certificate transfer is not required
PCI DSS requirements are met
Drawbacks:
Integration with the security system is necessary
Time lags in protection activation
Impossible to block sessions, only IP addresses
HTTPS filtration (without traffic disclosure, with token)
The user is redirected to the security system for verification and granted a token; after that, the user is not subject to verification for a certain period of time.
Benefits:
Certificate transfer is not required
PCI DSS requirements are met
No time lags in protection activation
Sessions are blocked, not IPs
Drawbacks:
Integration with the security system is necessary
During the token's validity period, an attack using that token is possible
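A common way to implement such a token is a signed, time-limited value that the site can check without contacting the security system again. The sketch below uses an HMAC-signed token; the secret, token format, and validity period are assumptions, not the vendor's actual scheme:

```python
# Sketch of a time-limited verification token: issued once after the user
# passes verification, then checked locally until it expires. Token format
# and parameters are illustrative assumptions.

import hashlib
import hmac

SECRET = b"shared-secret"   # known only to the security system (example value)
VALIDITY = 600              # seconds a token stays valid (example value)

def issue_token(session_id: str, now: float) -> str:
    expires = int(now) + VALIDITY
    msg = f"{session_id}:{expires}".encode()
    sig = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return f"{session_id}:{expires}:{sig}"

def verify_token(token: str, now: float) -> bool:
    try:
        session_id, expires, sig = token.rsplit(":", 2)
    except ValueError:
        return False                      # malformed token
    msg = f"{session_id}:{expires}".encode()
    good = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    # constant-time comparison plus expiry check
    return hmac.compare_digest(sig, good) and now < int(expires)
```

Note how this mirrors the drawback listed above: the token is valid for its whole lifetime, so within that window a stolen token can be replayed by an attacker.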
HTTPS filtration (without traffic disclosure, with validation service)
Information provided by the client: URL, IP, t, UA
If we reckon the user is legitimate, the query is served; if we reckon that additional verification is required, the user is sent through it first.
Benefits:
Certificate transfer is not required
PCI DSS requirements are met
No time lags in protection activation
Sessions are blocked, not IPs
Drawbacks:
Integration with the security system is necessary
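In this scheme the client's server asks the validation service for a verdict on each query, passing exactly the fields the slide lists (URL, IP, timestamp, User-Agent). A minimal sketch, assuming the interface and the toy verdict logic are illustrative stand-ins:

```python
# Hypothetical validation-service flow: the client sends (URL, IP, t, UA)
# and acts on the verdict. The verdict logic here is a toy stand-in for
# the security system's real analytics.

from typing import NamedTuple

class RequestInfo(NamedTuple):
    url: str
    ip: str
    t: float        # request timestamp
    ua: str         # User-Agent header

KNOWN_BAD_IPS = {"203.0.113.7"}   # example data

def validate(info: RequestInfo) -> str:
    """Return 'serve' if the user looks legitimate, otherwise 'verify'
    to route them through additional verification."""
    if info.ip in KNOWN_BAD_IPS or not info.ua:
        return "verify"
    return "serve"
```

Because the decision is made per query against session-level data, the security system can block individual sessions rather than whole IP addresses, which matters for users behind NAT.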
Protection from manual breaches (WAF)
WAF capabilities:
Protection from SQL injections
Protection from cross-site scripting
Protection from illegal resource access
Protection from remote file inclusions
The system has self-learning mechanisms
Custom rules can be added
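At its simplest, a WAF rule is a pattern matched against request data. The sketch below illustrates signature-style checks for three of the attack classes listed above; real WAFs (including self-learning ones) go far beyond such regexes, so treat these patterns as simplified assumptions:

```python
# Minimal signature-style WAF check covering SQL injection, XSS, and
# remote file inclusion. The regexes are deliberately simplified examples,
# not production rules.

import re

RULES = {
    "sql_injection": re.compile(r"(?i)\b(union\s+select|or\s+1=1|drop\s+table)\b"),
    "xss":           re.compile(r"(?i)<script\b|javascript:"),
    "rfi":           re.compile(r"(?i)=https?://"),
}

def inspect(query_string: str) -> list[str]:
    """Return the names of all rules the query string triggers."""
    return [name for name, rx in RULES.items() if rx.search(query_string)]
```

A custom-rules facility, as mentioned on the slide, would amount to letting the client extend the `RULES` table with their own patterns.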
Balancing for multiple platforms
Balancing modes:
Round robin
With weight ratio
Active-passive
(Diagram: traffic distributed across platform 1 … platform N)
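The three balancing modes can be sketched in a few lines; the platform names and helper functions below are illustrative assumptions:

```python
# Toy sketches of the three balancing modes: round robin, weighted
# rotation, and active-passive failover.

import itertools

def round_robin(platforms):
    """Cycle through platforms in order."""
    return itertools.cycle(platforms)

def weighted(platforms_with_weights):
    """Simple weighted rotation: repeat each platform `weight` times."""
    expanded = [p for p, w in platforms_with_weights for _ in range(w)]
    return itertools.cycle(expanded)

def active_passive(active, passive, active_healthy):
    """All traffic goes to the active platform; fail over when it is down."""
    return active if active_healthy else passive
```

For example, `weighted([("platform-1", 2), ("platform-2", 1)])` sends two queries to platform-1 for every one sent to platform-2.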
Caching
The complex is capable of: caching queried URLs for a set period of time.
This enables a client to:
Reduce channel load
Reduce hardware load
Smooth out the "Habr effect" (sudden traffic spikes from a popular link)
Always Online
The complex is capable of: storing static copies of a client's site and updating them at set intervals.
This enables a client to:
Provide users with the static part of the site if the client's infrastructure fails
Retain clients
Improve rating in search engines
Competitors