Kafka Connect Security

The Kafka Connect framework is a robust abstraction over Kafka producers and consumers, with built-in high availability and distributed operation. It runs as a set of JVM applications that manage the underlying server resources, so in most cases developers and architects do not need to write much custom code to integrate Kafka with external systems.

A quick byte about Kafka Connect Security

To get started with connectors, head to Confluent Hub. It offers a large catalogue to pick from (like a super store for connectors!).

Bringing in data from external systems, or between departments within an organisation, usually has to comply with some standard security protocol. And even where it is not strictly required, you would not want your data flowing over the wire in plaintext, waiting to be intercepted and accessed by unauthorized entities.

To be enterprise ready, Kafka Connect supports multiple security standards, so you do not have to put in extra effort to remain compliant. Here we will discuss basic Kafka Connect security to begin with.

SSL encryption

Secure Sockets Layer (SSL) is a security protocol for the transport layer, and the predecessor of Transport Layer Security (TLS). In the SSL record protocol, data is divided into fragments; each fragment is compressed, a Message Authentication Code (MAC) generated by an algorithm such as SHA (Secure Hash Algorithm) or MD5 (Message Digest) is appended, the result is encrypted, and finally the SSL header is added. Read all about Confluent Kafka security configuration.
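The record-protection steps above (compress, append a MAC, encrypt) can be sketched with Python's standard libraries. This is a simplified illustration of the idea, not the real TLS record protocol; the key and message are made up, and the final encryption step is omitted.

```python
import hashlib
import hmac
import zlib

def protect_fragment(fragment: bytes, mac_key: bytes) -> bytes:
    """Toy sketch of SSL record protection: compress the fragment,
    then append an HMAC over the compressed bytes.
    (A real cipher suite would then encrypt the result.)"""
    compressed = zlib.compress(fragment)
    mac = hmac.new(mac_key, compressed, hashlib.sha256).digest()
    return compressed + mac

protected = protect_fragment(b"hello kafka", b"secret-mac-key")
# The last 32 bytes are the SHA-256 HMAC appended to the fragment.
```

Any tampering with the compressed bytes in transit would make the receiver's recomputed MAC disagree with the appended one, which is exactly what the MAC step buys you.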

Kafka Connect comprises the following components:

  1. Worker – basically a JVM process that abstracts the server resources.
  2. Connector – simply a configuration that establishes a channel to copy data to and from external systems.
  3. Tasks – multiple instances of a connector's copy logic, so transfers can run in parallel.
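To make the connector/task distinction concrete, here is a hypothetical connector registration payload. The FileStreamSource connector class ships with Kafka, but the connector name, file, and topic values are assumptions for illustration:

```python
import json

# Hypothetical source connector registration payload.
# "tasks.max" caps how many parallel tasks the worker may
# spawn for this connector's copy work.
connector = {
    "name": "demo-file-source",  # assumed connector name
    "config": {
        "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
        "tasks.max": "2",          # up to 2 parallel copy tasks
        "file": "/tmp/demo.txt",   # assumed input file
        "topic": "demo-topic",     # assumed target topic
    },
}
payload = json.dumps(connector)
```

A payload like this is what you would POST to a worker's REST API to create the connector; the worker then decides how many tasks (up to `tasks.max`) to actually start.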


Since a worker can host both sink and source connectors, the corresponding client settings need the appropriate entity prefix (producer. for sources, consumer. for sinks). Below are some Kafka Connect SSL configuration examples.

Kafka Connect worker configuration for metadata topics

security.protocol=SSL
ssl.truststore.location=<somesecurepath>/kafka.client.truststore.jks
ssl.truststore.password=<somesecret>
ssl.keystore.location=<somesecurepath>/kafka.client.keystore.jks
ssl.keystore.password=<somesecret>
ssl.key.password=<somesecret>
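A common startup failure is a worker pointing at keystore or truststore files that do not exist on the host. A small pre-flight check like the following can catch that before the JVM does; the property names match the configuration above, but the paths here are assumptions:

```python
from pathlib import Path

def missing_stores(props: dict) -> list:
    """Return the SSL store paths from a config dict that
    do not exist on disk."""
    return [path for path in props.values() if not Path(path).exists()]

# Assumed example paths for a worker's SSL stores:
ssl_props = {
    "ssl.truststore.location": "/var/private/ssl/kafka.client.truststore.jks",
    "ssl.keystore.location": "/var/private/ssl/kafka.client.keystore.jks",
}

problems = missing_stores(ssl_props)
if problems:
    print("Missing SSL stores:", problems)
```

Running this as part of a deployment script is cheaper than decoding a JVM keystore stack trace after the worker fails to start.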

For source connectors:

producer.security.protocol=SSL
producer.ssl.truststore.location=<somesecurepath>/kafka.client.truststore.jks
producer.ssl.truststore.password=<somesecret>
producer.ssl.keystore.location=<somesecurepath>/kafka.client.keystore.jks
producer.ssl.keystore.password=<somesecret>
producer.ssl.key.password=<somesecret>


For sink connectors:

consumer.security.protocol=SSL
consumer.ssl.truststore.location=<somesecurepath>/kafka.client.truststore.jks
consumer.ssl.truststore.password=<somesecret>
consumer.ssl.keystore.location=<somesecurepath>/kafka.client.keystore.jks
consumer.ssl.keystore.password=<somesecret>
consumer.ssl.key.password=<somesecret>
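Since the same SSL settings are repeated under the producer. and consumer. prefixes, both blocks can be derived from one base config when you generate worker properties programmatically. This is a convenience sketch, not part of Kafka Connect itself:

```python
def prefixed(base: dict, prefix: str) -> dict:
    """Copy a base set of client properties under a connector-side prefix."""
    return {f"{prefix}.{key}": value for key, value in base.items()}

# Base SSL settings; the truststore path is an assumed example.
base_ssl = {
    "security.protocol": "SSL",
    "ssl.truststore.location": "/var/private/ssl/kafka.client.truststore.jks",
}

source_props = prefixed(base_ssl, "producer")  # for source connectors
sink_props = prefixed(base_ssl, "consumer")    # for sink connectors
```

Generating both blocks from one dictionary keeps the source and sink sides from silently drifting apart.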

Kafka Connect workers expose their functionality via a REST API, which is used to register, delete, pause, resume, and validate connectors, among other things. (Almost everything, and that's awesome!) Since these endpoints administer the whole cluster, they too can be secured with HTTPS/SSL, although this adds some administration overhead.

listeners=https://hostname:9000
rest.advertised.listener=https
rest.advertised.host.name=<localhost>
rest.advertised.host.port=8083
ssl.client.auth=requested
ssl.truststore.location=<somesecurepath>/kafka.client.truststore.jks
ssl.truststore.password=<somesecret>
ssl.keystore.location=<somesecurepath>/kafka.client.keystore.jks
ssl.keystore.password=<somesecret>
ssl.key.password=<somesecret>
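Once the REST listener is on HTTPS, administrative calls have to go over TLS as well. The /connectors/{name}/pause endpoint used below is a real Connect REST API, but the host, port, and connector name are assumptions; a sketch with Python's standard urllib:

```python
import ssl
import urllib.request

def pause_request(host: str, name: str) -> urllib.request.Request:
    """Build a PUT request for Connect's pause endpoint over HTTPS."""
    return urllib.request.Request(
        f"https://{host}/connectors/{name}/pause", method="PUT"
    )

req = pause_request("connect.example.com:8083", "demo-file-source")
# To actually send it, supply an SSL context that trusts the
# worker's certificate, e.g.:
#   ctx = ssl.create_default_context(cafile="/path/to/ca.pem")
#   urllib.request.urlopen(req, context=ctx)
```

The same pattern applies to the other administrative endpoints (resume, delete, validate): only the URL path and HTTP method change.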

SASL Plain authentication

SASL PLAIN is the simplest authentication mechanism: easy to implement, it provides bare-minimum enterprise-grade security for both clients and services. It is a username/password mechanism that should be used together with SSL/TLS (TLS being the successor of SSL); combined, they provide both encryption and authentication. (Isn't it cool!) PLAIN should not be confused with PLAINTEXT, which means no SSL at all. Using SSL with SASL PLAIN is highly recommended, since otherwise credentials travel over the wire unencrypted.

Below are the steps for enabling clients to communicate via SASL SSL PLAIN mechanism.

producer.sasl.mechanism=PLAIN   # in case of a producer client
consumer.sasl.mechanism=PLAIN   # in case of a consumer client
# Configure SASL_SSL if SSL encryption is enabled, otherwise SASL_PLAINTEXT
security.protocol=SASL_SSL


Then provide the JAAS login module configuration (used for automated authentication). With the sasl.jaas.config property it can be embedded inline, instead of keeping a separate JAAS file:

sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
username="username" \
password="some-secret";
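The sasl.jaas.config value is a single logical line with embedded quotes and a trailing semicolon, which is easy to get wrong when generating configs. A small helper sketch (the credentials here are placeholders, and real passwords should come from a secret store, not string literals):

```python
def jaas_plain(username: str, password: str) -> str:
    """Render the inline JAAS entry for the PLAIN login module."""
    return (
        "org.apache.kafka.common.security.plain.PlainLoginModule required "
        f'username="{username}" password="{password}";'
    )

entry = jaas_plain("connect-user", "some-secret")  # assumed credentials
```

Generating the entry this way avoids the classic mistakes: smart quotes pasted from a web page, a missing closing quote, or a forgotten terminating semicolon.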

Other SASL mechanisms, such as SCRAM, GSSAPI (Kerberos), and OAUTHBEARER, are also available for securing Kafka.

