Transcript
Page 1: Apache Kafka Security

© Hortonworks Inc. 2014

Apache Kafka Security

SSL, Kerberos & Authorization

Manikumar Reddy, Hortonworks (@omkreddy)

Page 2: Apache Kafka Security


Kafka Security Authors

Sriharsha Chintalapani – Apache Kafka Committer, Apache Storm Committer & PMC

Parth Brahmbhatt – Apache Kafka Contributor, Apache Storm Committer & PMC

Page 3: Apache Kafka Security


Why Kafka Security?

• Kafka is becoming a centralized data bus connecting external data sources to the Hadoop ecosystem.

• There are many requests and discussions on the Kafka mailing lists about adding security.

Page 4: Apache Kafka Security


Kafka Security - Overview

• Wire encryption and authentication via SSL
• Role-based authentication via SASL (Kerberos, Plaintext)
• Authorizer to add fine-grained access controls to Kafka topics, per user and per host.

Page 5: Apache Kafka Security


Authentication

• Brokers support listening for connections on multiple ports
  • Plaintext (no wire encryption/no authentication)
  • SSL (wire encryption/authentication)
  • SASL (Kerberos/plain text authentication)
  • SSL + SASL (SSL for wire encryption + SASL for authentication)

Ex: listeners=PLAINTEXT://host.name:port,SSL://host.name:port
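For illustration, a single broker can expose several of these protocols side by side by listing one endpoint per protocol (host name and port numbers below are placeholders):

  listeners=PLAINTEXT://host.name:9092,SSL://host.name:9093,SASL_PLAINTEXT://host.name:9094,SASL_SSL://host.name:9095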

Page 6: Apache Kafka Security


Kafka Security – SSL

• Kafka SSL / SASL requirements
  • No user-level API changes to clients
  • Retain length-encoded Kafka protocols
  • Client must authenticate before sending/receiving requests
• Kafka Channel
  • Instead of using a socket channel directly, we added KafkaChannel, which consists of a TransportLayer and an Authenticator.
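As a rough illustration of that composition (a hypothetical, simplified sketch, not Kafka's actual classes), the channel delegates the wire-level handshake to a transport layer and the identity check to an authenticator before any Kafka request flows:

  // Hypothetical sketch, not Kafka's real API: a channel composing a
  // transport layer (plaintext or SSL) with an authenticator (no-op, SSL or SASL).
  interface TransportLayer {
      void handshake() throws java.io.IOException;     // no-op for plaintext, SSL handshake for SSL
      int read(java.nio.ByteBuffer dst) throws java.io.IOException;
      int write(java.nio.ByteBuffer src) throws java.io.IOException;
  }

  interface Authenticator {
      void authenticate() throws java.io.IOException;  // no-op, SSL cert check, or SASL exchange
      java.security.Principal principal();             // identity later used by the authorizer
  }

  final class Channel {
      private final TransportLayer transport;
      private final Authenticator authenticator;

      Channel(TransportLayer transport, Authenticator authenticator) {
          this.transport = transport;
          this.authenticator = authenticator;
      }

      // Only after the handshake and authentication succeed does the channel carry requests.
      void prepare() throws java.io.IOException {
          transport.handshake();
          authenticator.authenticate();
      }
  }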

Page 7: Apache Kafka Security


Kafka Networking

(Diagram: on the Kafka server, each connection is wrapped in a KafkaChannel, which uses its TransportLayer for the handshake and its Authenticator to authenticate the client.)

Page 8: Apache Kafka Security


Kafka Security – SSL

Page 9: Apache Kafka Security


Kafka Security – SSL

• Principal Builder
  • By default, the SSL user name will be of the form "CN=hostname,OU=organizationunit,O=organization,L=location,ST=state,C=country".
  • The X509Certificate has a lot more information about a client's identity.
  • PrincipalBuilder provides an interface to plug in a custom PrincipalBuilder that has access to the X509Certificate and can construct a user identity out of it (see the sketch below).
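As an illustrative sketch of what such a builder might do (the plug-in interface itself differs across Kafka versions, so only the certificate-parsing part is shown using standard JDK APIs; the class name here is made up):

  import java.security.cert.X509Certificate;
  import javax.naming.ldap.LdapName;
  import javax.naming.ldap.Rdn;

  public class CnExtractor {
      // Returns the CN from the certificate's subject DN, e.g. "client1" from "CN=client1,OU=...".
      public static String commonName(X509Certificate cert) throws Exception {
          String dn = cert.getSubjectX500Principal().getName();
          for (Rdn rdn : new LdapName(dn).getRdns()) {
              if ("CN".equalsIgnoreCase(rdn.getType())) {
                  return rdn.getValue().toString();
              }
          }
          return dn; // fall back to the full DN if no CN is present
      }
  }

A custom PrincipalBuilder could use logic like this to return only the CN (or any other certificate field) as the user name seen by the authorizer.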

Page 10: Apache Kafka Security


Kafka Security – SSL

• Broker configs (keystore/truststore creation sketched below):
  listeners=SSL://host.name:port
  ssl.keystore.location=/var/private/ssl/kafka.server.keystore.jks
  ssl.keystore.password=test1234
  ssl.key.password=test1234
  ssl.truststore.location=/var/private/ssl/kafka.server.truststore.jks
  ssl.truststore.password=test1234
  security.inter.broker.protocol=SSL
  ssl.client.auth=true
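The keystore and truststore referenced above have to be created first. One hedged example using keytool (alias, validity and file names are placeholders, and the CA-signing steps are omitted; see the SSL documentation linked later for the full workflow):

  # Generate the broker's key pair in a keystore
  keytool -keystore kafka.server.keystore.jks -alias localhost -validity 365 -genkey -keyalg RSA

  # Import the CA certificate used to sign broker/client certs into the truststore
  keytool -keystore kafka.server.truststore.jks -alias CARoot -import -file ca-cert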

Page 11: Apache Kafka Security


Kafka Security – SSL

• Client configs (see the console-producer example below):
  security.protocol=SSL
  ssl.truststore.location=/var/private/ssl/kafka.client.truststore.jks
  ssl.truststore.password=test1234
  ssl.keystore.location=/var/private/ssl/kafka.client.keystore.jks
  ssl.keystore.password=test1234
  ssl.key.password=test1234
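To sanity-check the client side, these properties can be placed in a file and passed to the console tools; a hedged example (host, port and file name are placeholders, and flag names should be verified against your Kafka version):

  bin/kafka-console-producer.sh --broker-list host.name:9093 --topic test --producer.config client-ssl.properties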

Page 12: Apache Kafka Security


Kafka Security – SASL

• Simple Authentication and Security Layer, or SASL
  • Provides flexibility in choosing mechanisms
  • Challenge/response protocols
  • Mechanisms: GSSAPI/Kerberos, clear text username/password, DIGEST-MD5

• JAAS login
  • Before client and server can handshake, they need to authenticate with Kerberos or another identity provider.
  • JAAS provides a pluggable way of providing user credentials. One can easily add LDAP or another mechanism just by changing a config file.

• Kafka supports GSSAPI/Kerberos and clear text username/password

Page 13: Apache Kafka Security


Kafka Security – SASL

(Sequence diagram: the client opens a connection, the broker returns the list of enabled SASL mechanisms, the client sends its selected mechanism plus SASL data, the broker evaluates and responds, further SASL data is exchanged as needed, and finally the client is authenticated.)

Page 14: Apache Kafka Security


Kafka Security – SASL

• Prepare JAAS config file

KafkaServer {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  serviceName="kafka"
  keyTab="/vagrant/keytabs/kafka1.keytab"
  principal="kafka/[email protected]";
};

KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  serviceName="kafka"
  keyTab="/vagrant/keytabs/client1.keytab"
  principal="client/[email protected]";
};

• Pass the JAAS config file as a JVM parameter: -Djava.security.auth.login.config
• security.inter.broker.protocol=SASL_PLAINTEXT
• security.protocol=SASL_PLAINTEXT
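As an illustration (the file path is a placeholder), the JAAS file is usually handed to the broker JVM through KAFKA_OPTS:

  export KAFKA_OPTS="-Djava.security.auth.login.config=/etc/kafka/kafka_jaas.conf"
  bin/kafka-server-start.sh config/server.properties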

Page 15: Apache Kafka Security


Kafka Security – SASL

• Kerberos principal name
  • {username}/{hostname}@{REALM}
  • Ex: kafka/[email protected]
  • {username} part taken as the default principal
• sasl.kerberos.principal.to.local.rules – customize the principal name mapping (example below)
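A hedged example of such a rule (the realm is a placeholder; the syntax follows the Hadoop-style auth_to_local format, so verify it against the documentation for your version). It maps any two-component principal in TEST.COM, such as kafka/[email protected], to its first component:

  sasl.kerberos.principal.to.local.rules=RULE:[2:$1@$0](.*@TEST.COM)s/@.*//,DEFAULT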

Page 16: Apache Kafka Security


Kafka Security – Resources

• SSL: http://kafka.apache.org/documentation.html#security_ssl
• SASL: http://kafka.apache.org/documentation.html#security_sasl
• Vagrant setup
  • SASL: https://github.com/harshach/kafka-vagrant/tree/master/
  • SSL: https://github.com/harshach/kafka-vagrant/tree/ssl/

Page 17: Apache Kafka Security


Authorizer

• Controls who can do what
• Pluggable
• ACL-based approach

Page 18: Apache Kafka Security


ACL

• Alice is Allowed to Read from Orders-topic from Host-1

Principal   Permission   Operation   Resource   Host
Alice       Allow        Read        Orders     Host-1
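This ACL could be created with the CLI covered later; as a sketch (the ZooKeeper address is a placeholder):

  bin/kafka-acls.sh --authorizer-properties zookeeper.connect=localhost:2181 --add --allow-principal User:Alice --allow-host Host-1 --operation Read --topic Orders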

Page 19: Apache Kafka Security


Principal

• PrincipalType:Name
• Supported types: User
• Extensible so users can add their own types
• Wildcard: User:*

Page 20: Apache Kafka Security


Operations and Resources

• Operation
  • Read, Write, Create, Delete, Describe, ClusterAction, All

• Resource
  • ResourceType:ResourceName
  • Topic, Cluster and ConsumerGroup
  • Wildcard resource ResourceType:*
  • Topic -> Read, Write, Describe
  • ConsumerGroup -> Read
  • Cluster -> Create, ClusterAction

Page 21: Apache Kafka Security


Permissions

• Allow and Deny
• Anyone without an explicit Allow ACL is denied
• Deny works as negation
• Deny takes precedence over Allow ACLs

Page 22: Apache Kafka Security


Hosts

• Allows the authorizer to provide firewall-type security even in a non-secure environment.

• * as wildcard.

Page 23: Apache Kafka Security


Configuration

• Authorizer class
• Super users
• Authorizer properties
• Default behavior for resources with no ACLs
  – allow.everyone.if.no.acl.found=false
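Put together, a minimal broker-side authorizer configuration might look like this (the super user name is a placeholder):

  authorizer.class.name=kafka.security.auth.SimpleAclAuthorizer
  super.users=User:kafka
  allow.everyone.if.no.acl.found=false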

Page 24: Apache Kafka Security


SimpleAclAuthorizer

• Out-of-the-box authorizer implementation.
• Stores all of its ACLs in ZooKeeper.
• Built-in ACL cache to avoid a performance penalty.
• Provides an authorizer audit log.

Page 25: Apache Kafka Security


(Sequence diagram: at startup the broker configures the Authorizer, which reads the ACLs from ZooKeeper and loads its cache; for each client request the broker calls authorize, the Authorizer checks for an ACL match or super user, and the request is allowed or denied.)

Page 26: Apache Kafka Security


CLI

• Add, remove and list ACLs
• Convenience options:

  – Producer:
    bin/kafka-acls.sh --authorizer-properties zookeeper.connect=localhost:2181 --add --allow-principal User:Bob --producer --topic Test-topic

  – Consumer:
    bin/kafka-acls.sh --authorizer-properties zookeeper.connect=localhost:2181 --add --allow-principal User:Bob --consumer --topic test-topic --group Group-1
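Existing ACLs can be listed or removed the same way, for example:

  bin/kafka-acls.sh --authorizer-properties zookeeper.connect=localhost:2181 --list --topic Test-topic

  bin/kafka-acls.sh --authorizer-properties zookeeper.connect=localhost:2181 --remove --allow-principal User:Bob --producer --topic Test-topic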

Page 27: Apache Kafka Security


Ranger Policy

Page 28: Apache Kafka Security


Ranger Auditing

Page 29: Apache Kafka Security


Securing Zookeeper

• Kafka’s metadata store, ACLs
• Create and Delete interact directly with ZooKeeper
• ZooKeeper has its own security mechanism that supports SASL and MD5-DIGEST for establishing identity, plus ACL-based authorization

• Set zookeeper.set.acl=true
• ZK paths are writable by brokers and readable by all (see the migration note below)
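For clusters that already have data in ZooKeeper, Kafka ships a migration tool that rewrites the ACLs on existing paths; a hedged example (the connect string is a placeholder, and the flags should be checked against your version):

  bin/zookeeper-security-migration.sh --zookeeper.acl=secure --zookeeper.connect=localhost:2181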

Page 30: Apache Kafka Security


Client JAAS

Client {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  serviceName="zookeeper"
  keyTab="/vagrant/keytabs/kafka.keytab"
  principal="kafka/[email protected]";
};

Page 31: Apache Kafka Security


Future

• KIP-4 (Admin API): move everything to the server side, no direct interactions with ZooKeeper
• Group support
• Pluggable auditor

Page 32: Apache Kafka Security


Apache Kafka 0.10.0.0

• New client library, Kafka Streams
• New timestamp field for messages
• Balancing replicas across racks
• Authentication using SASL/PLAIN
• New consumer configuration parameter 'max.poll.records' (example below)
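As an illustration of the last item, a minimal new-consumer sketch that caps each poll at 100 records (broker address, topic and group id are placeholders):

  import java.util.Collections;
  import java.util.Properties;
  import org.apache.kafka.clients.consumer.ConsumerRecord;
  import org.apache.kafka.clients.consumer.ConsumerRecords;
  import org.apache.kafka.clients.consumer.KafkaConsumer;

  public class MaxPollExample {
      public static void main(String[] args) {
          Properties props = new Properties();
          props.put("bootstrap.servers", "host.name:9092");
          props.put("group.id", "example-group");
          props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
          props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
          props.put("max.poll.records", "100"); // new in 0.10.0.0: bounds records returned per poll()

          try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
              consumer.subscribe(Collections.singletonList("test-topic"));
              ConsumerRecords<String, String> records = consumer.poll(1000); // at most 100 records
              for (ConsumerRecord<String, String> record : records) {
                  System.out.printf("%s: %s%n", record.key(), record.value());
              }
          }
      }
  }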

Page 33: Apache Kafka Security


Summary

• SSL for wire encryption
• SASL for authentication
• Authorization
• Secure ZooKeeper

Thanks to the community for participation.

Page 34: Apache Kafka Security


Page 35: Apache Kafka Security


Kafka Networking

Page 36: Apache Kafka Security


Kafka Networking

http://www.slideshare.net/jjkoshy/troubleshooting-kafkas-socket-server-from-incident-to-resolution

Page 37: Apache Kafka Security


Kafka Networking

Page 38: Apache Kafka Security


Kafka Security – SSL

• SSLTransportLayer
  • Before sending any application data, both client and server need to go through the SSL handshake
  • SSLTransportLayer uses SSLEngine to establish a non-blocking handshake
  • SSLEngine provides a state machine to go through the several steps of the SSL handshake (see the sketch below)
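A simplified sketch of driving that state machine with the JDK's SSLEngine (buffer sizing, partial reads and error handling omitted; not Kafka's actual implementation):

  import java.io.IOException;
  import java.nio.ByteBuffer;
  import java.nio.channels.SocketChannel;
  import javax.net.ssl.SSLEngine;
  import javax.net.ssl.SSLEngineResult.HandshakeStatus;

  final class HandshakeSketch {
      static void handshake(SSLEngine engine, SocketChannel channel, ByteBuffer netIn,
                            ByteBuffer netOut, ByteBuffer appIn, ByteBuffer appOut) throws IOException {
          engine.beginHandshake();
          HandshakeStatus status = engine.getHandshakeStatus();
          while (status != HandshakeStatus.FINISHED && status != HandshakeStatus.NOT_HANDSHAKING) {
              switch (status) {
                  case NEED_WRAP: {            // produce handshake bytes and send them to the peer
                      netOut.clear();
                      engine.wrap(appOut, netOut);
                      netOut.flip();
                      while (netOut.hasRemaining()) {
                          channel.write(netOut);
                      }
                      break;
                  }
                  case NEED_UNWRAP: {          // read handshake bytes from the peer and feed the engine
                      channel.read(netIn);
                      netIn.flip();
                      engine.unwrap(netIn, appIn);
                      netIn.compact();
                      break;
                  }
                  case NEED_TASK: {            // run delegated (potentially long-running) tasks
                      Runnable task;
                      while ((task = engine.getDelegatedTask()) != null) {
                          task.run();
                      }
                      break;
                  }
                  default:
                      break;
              }
              status = engine.getHandshakeStatus();
          }
      }
  }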

Page 39: Apache Kafka Security


Kafka Security – SSL

• SSLTransportLayer
  • SocketChannel read
    • Returns encrypted data
    • Decrypts the data and returns the length of the data for the Kafka protocols
  • SocketChannel write
    • Writes encrypted data onto the channel
    • A regular SocketChannel returns the length of the data written to the socket.
    • In the case of SSL, since we encrypt the data, we can't return the exact length written to the socket, which would be more than the actual data.
    • It is important to keep track of the length of application data written to the network; this tells us whether the data was successfully written so we can move on to the next request.

