In previous articles in this series, we covered Kerberos, LDAP, and PAM authentication for Kafka. In this article, we will look at how to configure a Kafka cluster and its clients to use TLS client authentication.
The examples shown here highlight the authentication-related properties in bold to distinguish them from the other security properties that are also required. TLS is assumed to be enabled for the Apache Kafka cluster, as it should be for every secure cluster.
We will use kafka-console-consumer for all the examples below. All the concepts and configurations apply to other applications as well.
TLS client authentication is another authentication method supported by Kafka. It allows clients to authenticate to the cluster by presenting their own TLS client certificates.
Certificate management and keystore generation are out of scope for this article, but these are standard TLS practices. Note that certificate keystores are just as sensitive as Kerberos keytabs and should be treated as such. Keystore file permissions should always be restrictive so that the keystores are not compromised, and keystores should never be shared. Each client should get its own certificate.
To configure a Kafka client to authenticate using a TLS certificate, you must set the following kinds of Kafka client properties: the SSL security protocol, the TLS keystore holding the client certificate and key, and the TLS truststore holding the CA certificate used to verify the brokers. This configuration provides both TLS (SSL) authentication and data encryption.
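As a sketch, assuming a keystore generated for the client and the truststore path used later in this article (the keystore path and passwords are placeholders to be replaced with your own), the client properties look like this:

```properties
# Uses SSL security protocol
security.protocol=SSL

# TLS keystore holding the client certificate and private key (placeholder path/passwords)
ssl.keystore.location=/path/to/client-keystore.jks
ssl.keystore.password=changeit
ssl.key.password=changeit

# TLS truststore containing the CA certificate of the brokers' certificate issuer
ssl.truststore.location=/opt/cloudera/security/jks/truststore.jks
```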
When the Kafka service is installed, TLS authentication is not enabled for the Kafka brokers by default, but it is quite easy to configure through Cloudera Manager.
By default, in a secure cluster, Kafka has a single listener configured for SASL_SSL authentication. To enable TLS authentication, we need to create an additional listener on a different port to handle the SSL protocol. This is configured through the Kafka brokers' listeners property. When setting this property, we must also take care to keep listing the original SASL_SSL listener, to ensure clients can still authenticate through Kerberos and LDAP, if they are using those.
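For example, a broker that keeps the original SASL_SSL listener and adds an SSL listener on a separate port might be configured as follows (the hostname and ports are illustrative, matching the example ports used elsewhere in this article):

```properties
listeners=SASL_SSL://host-1.example.com:9093,SSL://host-1.example.com:9094
```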
Besides this, to use TLS client authentication, we must ensure that brokers and clients trust each other's certificates. In the previous example, we configured a truststore for the client containing the certificate of the CA that issued the broker certificates (the ssl.truststore.location property). Now, if the CA that issued the client certificates is different from the CA that issued the broker certificates, you must also add the clients' CA certificate to the brokers' truststore.
We recommend that client certificates (and broker certificates) be issued by a private CA that you own and control. Never add certificates from CAs you do not control (especially public CAs) to the cluster truststore.
In a Cloudera Data Platform (CDP) deployment, TLS is enabled consistently across all the clusters and services that share the same environment. An environment has a common Shared Data Experience (SDX) layer containing the security and governance context shared by all the environment's clusters, and TLS certificates can be issued and managed by the FreeIPA service embedded in SDX.
In Cloudera Manager, click Kafka > Instances > Kafka Broker (click on an individual broker) > Configuration. An alert will be displayed; you can ignore it by clicking "Continue editing role instances".
Set the following properties for the Kafka broker (using your own broker's fully qualified hostname) and save the configuration. We are setting two different properties in this safety valve at the same time: listeners and ssl.principal.mapping.rules. Note the different protocol and port for each listener in the listeners value.
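A sketch of what these two safety-valve entries could look like, using the illustrative hostname and ports from this article and the principal mapping rule discussed later (adjust both to your environment):

```properties
listeners=SASL_SSL://host-1.example.com:9093,SSL://host-1.example.com:9094
ssl.principal.mapping.rules=RULE:^[Cc][Nn]=([a-zA-Z0-9.]*).*$/$1/L,DEFAULT
```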
Repeat the process for all the other brokers.
Now set the following at the service level: click Kafka > Configuration, then select "Required" for the configuration below, and save your changes:
As mentioned above, Kafka needs to trust the certificates issued to your clients. If those certificates were signed by a different CA than the one that signed the Kafka brokers' certificates, you need to add the client certificates' CA to the Kafka truststore. You can find the location of the truststore in the following property in Cloudera Manager:
Run the following command (as root) to add the CA certificate to the truststore. The keytool invocation below is a sketch; the certificate file path and the alias should match your own client CA certificate:

```shell
keytool -importcert \
  -keystore /opt/cloudera/security/jks/truststore.jks \
  -storetype JKS \
  -alias ldap-ca \
  -file /path/to/client-ca-cert.pem
```
Click Kafka > Actions > Restart to restart the Kafka service and make the changes take effect.
The security protocol used for inter-broker communication is controlled by Kafka's security.inter.broker.protocol property. Cloudera Manager sets this property to INFERRED by default.
With this setting, Cloudera Manager sets the security.inter.broker.protocol property according to the following logic:

If Kerberos or LDAP authentication is in use:
- If TLS is enabled, set it to SASL_SSL
- If TLS is not enabled, set it to SASL_PLAINTEXT

Otherwise:
- If TLS is enabled, set it to SSL
- If TLS is not enabled, set it to PLAINTEXT
If you define multiple listeners with different security protocols and the inferred inter-broker protocol is not the one you want to use, you can override it explicitly with the property shown above.
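For instance, to pin inter-broker traffic to the SASL-over-TLS listener regardless of what would otherwise be inferred, you could set:

```properties
security.inter.broker.protocol=SASL_SSL
```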
When a client authenticates using a TLS keystore, Kafka assumes by default that the client's username is the certificate's subject name, which is usually a Distinguished Name like the following (a hypothetical example): CN=alice,OU=Engineering,O=Example,C=US
Working with these long names is cumbersome. Security policies and group mappings are usually defined in terms of the user's short name (alice) rather than the full distinguished name. We therefore need to configure Kafka to translate the certificate's subject into a short name that we can use as the user's unique identifier.
If you are using Kafka 2.4.0 (*) or later, you can do this by setting the ssl.principal.mapping.rules parameter with the necessary mapping rules. For older versions, you can provide a custom principal builder. Creating a custom principal builder is beyond the scope of this document.
A rule uses a regular expression to match the certificate's subject name and applies a transformation to the match. There can be multiple rules, separated by commas. The last rule is usually the DEFAULT rule, which simply uses the full subject name.
For example, consider the following setting: `ssl.principal.mapping.rules=RULE:^[Cc][Nn]=([a-zA-Z0-9.]*).*$/$1/L,DEFAULT`
The configuration above has two rules, which are processed in order:
The first rule that matches the certificate's subject name is used, and later rules are ignored. The DEFAULT rule is a catch-all: it always matches and performs no replacement, and it is used if none of the previous rules match.
The regular expression of the first rule above (^[Cc][Nn]=([a-zA-Z0-9.]*).*$) matches subject names that start with CN= (or cn=, Cn=, cN=), followed by the user's short name (which should contain only the characters a-zA-Z0-9.), followed by any other characters. It replaces the matched string with the user's short name, which is the group captured in parentheses and referenced as $1 in the second part of the rule.
The L at the end of the rule converts the resulting string to lowercase. You can see more details and examples of mapping rules in the official Kafka documentation.
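To illustrate what the rule does, the same transformation can be approximated with sed and tr (the subject DN below is hypothetical; Kafka applies the rule internally, so this is only a demonstration of the regex and the lowercase step):

```shell
subject='CN=Alice,OU=Engineering,O=Example,C=US'  # hypothetical certificate subject
# Apply the first rule's regex, keeping only the captured short name, then lowercase it
echo "$subject" | sed -E 's/^[Cc][Nn]=([a-zA-Z0-9.]*).*$/\1/' | tr '[:upper:]' '[:lower:]'
# prints: alice
```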
A Certificate Revocation List (CRL) is a list of digital certificates that have been revoked by the issuing Certificate Authority (CA) before their scheduled expiration date and should no longer be trusted. CRLs are an important feature of TLS authentication: they ensure that compromised client certificates can be marked as revoked, so that Kafka brokers deny connections from clients that use them.
Although Kafka does not directly support the use of CRLs (see KAFKA-3700), the Java framework provides this option. Revocation checks can be performed through CRL Distribution Points (CRLDP) or through the Online Certificate Status Protocol (OCSP). With either method, you must first ensure that the Certificate Authority (CA) is correctly configured for certificate revocation checking with that method, and that the certificates contain the necessary information for it. CA configuration and the generation of certificates with the correct attributes are beyond the scope of this document.
To enable certificate revocation checking for Kafka, perform the following steps:
To enable revocation checking through CRLDP:
a. In Cloudera Manager, go to Kafka > Configuration and search for the Additional Broker Java Options property.
b. Append the following value to the end of the property:
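These are JDK system properties rather than Kafka settings; to the best of our knowledge, the standard pair for CRLDP-based revocation checking is the following (verify against your JDK's documentation):

```
-Dcom.sun.security.enableCRLDP=true -Dcom.sun.net.ssl.checkRevocation=true
```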
To enable revocation checking through OCSP:
a. In addition to the properties above for CRLDP, also append the following values to the end of the same property:
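For OCSP, the JDK switch is the `ocsp.enable` security property, which (unlike the system properties above) is read from a Java security properties file rather than a plain -D flag. A sketch of one common approach, with an illustrative file path that you should adapt and verify against your JDK's documentation:

```
# Contents of an extra security properties file, passed to the JVM via:
#   -Djava.security.properties=/path/to/extra.security
ocsp.enable=true
```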
After making any of the above changes, all Kafka services must be restarted. If the CA and the certificates are not correctly configured for CRLDP and/or OCSP support, the service may fail to start.
Even when certificate revocation checking is not enabled, access to Kafka resources with a compromised certificate can still be stopped by ensuring that all authorization policies that apply to that certificate's principal (through Ranger, Sentry, or ACLs) are revoked and/or set to deny.
Here is an example of the Kafka console consumer using TLS authentication to read from a topic. Note that when connecting to the cluster, we point the bootstrap server at the SSL listener's port (9094) rather than the default 9093. The client configuration file name below is a placeholder for the properties file shown earlier:

```shell
kafka-console-consumer \
  --bootstrap-server host-1.example.com:9094 \
  --topic test \
  --consumer.config /path/to/client-tls.properties
```
Note: the client configuration above contains sensitive credentials. When storing this configuration in a file, make sure the file permissions are set so that only the file's owner can read it.
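For example, on Linux (the filename is a placeholder):

```shell
# Create the client config file (placeholder name) and restrict it to owner read/write only
touch client-tls.properties
chmod 600 client-tls.properties
stat -c '%a' client-tls.properties
# prints: 600
```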
We will review all of these authentication methods across this blog series; they give you flexible ways to configure your Kafka clusters to integrate with the authentication mechanisms of your environment.
We will continue to explore other authentication alternatives in the next article in this series. In the meantime, if you are interested in Cloudera's Kafka offering, please download this white paper.
(*) The ssl.principal.mapping.rules property has been available since Kafka 2.2.0, but it cannot handle spaces in the certificate's distinguished name (KAFKA-8860). Kafka 2.4.0 is a safer starting point.
Original author: Andre Araujo
Original article: https://blog.cloudera.com/how-to-configure-clients-to-connect-to-apache-kafka-clusters-securely-part-4-tls-client-authentication/
This article was republished via the WeChat official account Big Data Grocery Store (bigdataGrocery).