How to configure clients to connect to Apache Kafka clusters securely, Part 4: TLS client authentication

Big data grocery store 2021-02-23 13:28:15


In the previous articles in this series, we discussed Kerberos, LDAP, and PAM authentication for Kafka. In this article we will look at how to configure a Kafka cluster and clients to use TLS client authentication.

The examples shown here highlight the authentication-related properties in bold to differentiate them from the other required security properties, as in the example below. TLS is assumed to be enabled for the Apache Kafka cluster, as it should be for every secure cluster.


security.protocol=SSL
ssl.truststore.location=/opt/cloudera/security/jks/truststore.jks

We will use kafka-console-consumer for all the examples below. All the concepts and configurations also apply to other applications.

TLS client authentication

TLS client authentication is another authentication method supported by Kafka. It allows clients to connect to the cluster using their own TLS client certificates to authenticate.

Certificate management and keystore generation are outside the scope of this article, but these are standard TLS practices. Note that certificate keystores are as sensitive as a Kerberos keytab and should be treated as such. Keystore permissions should always be restrictive, so that they are not compromised, and keystores should not be shared. Each client should get its own certificate.

The following Kafka client properties must be set to configure a Kafka client to authenticate using a TLS certificate:

# Uses SSL security protocol
security.protocol=SSL
ssl.keystore.location=./alice-keystore.jks
ssl.keystore.password=supersecret1
ssl.key.password=supersecret1
# TLS truststore
ssl.truststore.location=/opt/cloudera/security/jks/truststore.jks

The configuration above uses TLS (SSL) for both authentication and data encryption.

Enabling TLS authentication on the Kafka brokers

When the Kafka service is installed, TLS authentication is not enabled for the Kafka brokers by default, but it is fairly easy to configure through Cloudera Manager.

By default, in a secure cluster, Kafka has a single listener configured for handling SASL_SSL authentication. To enable TLS authentication we need to create an additional listener, on a different port, to handle the SSL protocol. This is configured through the Kafka brokers' listeners property. When setting this property we also need to take care to list the original SASL_SSL listener, to ensure that clients can still authenticate through Kerberos and LDAP, if those are in use; a minimal sketch follows below.
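For illustration only (the host name is an assumption; port 9093 is the original SASL_SSL listener and 9094 is the new SSL listener used in the example at the end of this article), a broker's listeners value could look like this:

listeners=SASL_SSL://host-1.example.com:9093,SSL://host-1.example.com:9094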

Besides this, to use TLS client authentication we must ensure that the brokers and clients trust each other's certificates. In the previous example we configured a truststore for the client containing the certificate of the brokers' certificate issuer (the ssl.truststore.location property). Now, if the CA that issues the client certificates is different from the CA that issues the broker certificates, you must also add the clients' CA certificate to the brokers' truststore.

We recommend that client certificates (and broker certificates) be issued by a private CA that you own and control. Do not add CA certificates you do not control (especially public CAs) to the cluster truststore.

In a Cloudera Data Platform (CDP) deployment, TLS is enabled consistently across all the clusters and services sharing the same environment. The environment has a common Shared Data Experience (SDX) layer containing the shared security and governance context for all the environment clusters, and TLS certificates can be issued and managed by the FreeIPA service embedded in SDX.

  1. In Cloudera Manager, click Kafka > Instances > Kafka Broker (click on an individual broker) > Configuration. An alert will be displayed, which you can ignore by clicking "Continue editing role instance".

  2. Set the following properties for the Kafka broker (using your own broker's host name) and save the configuration. We are setting two different properties in this safety valve at the same time: listeners and ssl.principal.mapping.rules. Note the different protocol and port of each listener in listeners; example values are shown below.
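The original article shows these values in a screenshot. As a reconstruction under the same assumptions as the sketch above (your broker's host name goes in place of host-1.example.com; the mapping rule is explained in the Principal name mapping section below), the safety valve would contain:

listeners=SASL_SSL://host-1.example.com:9093,SSL://host-1.example.com:9094
ssl.principal.mapping.rules=RULE:^.*[Cc][Nn]=([a-zA-Z0-9.]*).*$/$1/L,DEFAULT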

  3. Repeat the steps above for all the other brokers.

  4. Now set the following at the service level: click Kafka > Configuration, select "required" for the configuration shown below, and save your changes:
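The original article shows this setting in a screenshot; the underlying broker property (the Cloudera Manager label is assumed to be SSL Client Authentication) is:

ssl.client.auth=required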

  5. As mentioned above, Kafka needs to trust the certificates issued to your clients. If those certificates were signed by a different CA than the one that signed the Kafka brokers' certificates, you need to add the client certificates' CA to the Kafka truststore. You can find the location of the truststore in the following property in Cloudera Manager:
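The screenshot showing the property is missing here; assuming the truststore used throughout this article, the location would be:

ssl.truststore.location=/opt/cloudera/security/jks/truststore.jks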

  6. Run the following command (as root) to add the CA certificate to the truststore:

keytool \
  -importcert \
  -keystore /opt/cloudera/security/jks/truststore.jks \
  -storetype JKS \
  -alias ldap-ca \
  -file /path/to/ca-cert.pem

 

  7. Click Kafka > Actions > Restart to restart the Kafka service and make the changes take effect.

Securing the inter-broker protocol

The security protocol used for inter-broker communication is controlled by the Kafka property security.inter.broker.protocol. Cloudera Manager sets this property to INFERRED by default.

With this configuration, CM sets the security.inter.broker.protocol property according to the following logic:

  • If Kerberos or LDAP authentication is being used:

    • If TLS is enabled, set it to SASL_SSL

    • If TLS is not enabled, set it to SASL_PLAINTEXT

  • Otherwise:

    • If TLS is enabled, set it to SSL

    • If TLS is not enabled, set it to PLAINTEXT

If you define multiple listeners using different security protocols and the inferred inter-broker protocol is not the one you want to use, you can override it with the property shown above, for example:
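A hypothetical override (pick the protocol of whichever listener the brokers should use among themselves):

security.inter.broker.protocol=SASL_SSL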

Principal name mapping

When a client authenticates using a TLS keystore, by default Kafka assumes that the client's username is the certificate's subject name, which is usually a Distinguished Name such as the following:

cn=alice,cn=groups,cn=accounts,dc=hadoopsecurity,dc=local

Dealing with these long names is cumbersome. Security policies and group mappings are usually defined in terms of the user's short name (alice) rather than the full distinguished name. We therefore need to configure Kafka to translate the certificate's subject into a short name that we can use as the user's unique identifier.

If you are using Kafka 2.4.0 (*) or later, this can be done by setting the ssl.principal.mapping.rules parameter with the necessary mapping rules. For older versions, you can provide a custom principal builder. Creating a custom builder is beyond the scope of this document, but you can find a good example here.

The rules use regular expressions to match the certificate's subject name and apply a transformation to the match. There can be multiple rules, separated by commas. The last rule is usually the DEFAULT rule, which simply uses the full subject name.

For example, consider the following setting:

ssl.principal.mapping.rules=RULE:^.*[Cc][Nn]=([a-zA-Z0-9.]*).*$/$1/L,DEFAULT

The configuration above contains two rules, which are processed in order:

RULE:^[Cc][Nn]=([a-zA-Z0-9.]*).*$/$1/L
DEFAULT

The first rule that matches the certificate's subject name is used, and the later ones are ignored. The DEFAULT rule is a catch-all: it always matches and performs no replacement, in case none of the previous rules matched.

The regular expression of the first rule above (^[Cc][Nn]=([a-zA-Z0-9.]*).*$) matches subject names that start with CN= (or cn=, Cn=, cN=), followed by the user's short name (which should contain only the characters a-zA-Z0-9.), followed by any other characters. It replaces the matched string with the user's short name, which is the content of the capture group in parentheses, referenced in the second part of the rule as $1. You can see it in action, and play with the regular expression and examples, here.

The L at the end of the rule converts the resulting string to lowercase. You can find more details and rule examples in the official Kafka documentation. A rough illustration of the first rule follows below.
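This shell sketch is not how Kafka implements the rules; it only mimics what the first rule does to the example subject name above:

$ echo "cn=alice,cn=groups,cn=accounts,dc=hadoopsecurity,dc=local" \
    | sed -E 's/^[Cc][Nn]=([a-zA-Z0-9.]*).*$/\1/' \
    | tr '[:upper:]' '[:lower:]'
alice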

Certificate revocation list

A Certificate Revocation List (CRL) is a list of digital certificates that have been revoked by the issuing Certificate Authority (CA) before their scheduled expiration date and should no longer be trusted. CRLs are an important feature of TLS authentication, ensuring that compromised client certificates can be marked as expired so that the Kafka brokers deny connections from clients using them.

Although Kafka does not directly support the use of CRLs (see KAFKA-3700), the Java framework provides this option. Revocation checks can be performed through a CRL Distribution Point (CRLDP) or through the Online Certificate Status Protocol (OCSP). To use either method, you must first ensure that the Certificate Authority (CA) is correctly configured for certificate revocation checking through one of these methods and that the certificates contain the necessary information for it. The configuration of the CA and the generation of certificates with the correct attributes are outside the scope of this document.

To enable certificate revocation checks for Kafka, perform the following steps:

To enable revocation checks through CRLDP

a. In Cloudera Manager, go to Kafka > Configuration and search for the Additional Broker Java Options property.

b. Append the following values to the end of the property:

-Dcom.sun.security.enableCRLDP=true
-Dcom.sun.net.ssl.checkRevocation=true


To enable revocation checks through OCSP

a. In addition to the CRLDP properties above, also append the following value to the end of the same property:

-Djava.security.properties=<(echo "ocsp.enable=true")

After either of the changes above, all the Kafka services must be restarted. If the CA and the certificates are not correctly configured for CRLDP and/or OCSP support, the services may fail to start.

Even when certificate revocation checking is not enabled, access to Kafka resources by compromised certificates can still be stopped by ensuring that all the authorization policies that apply to those certificates (through Ranger, Sentry, or ACLs) are revoked and/or set to deny; see the sketch below.
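A hypothetical sketch using Kafka's ACL CLI (in CDP clusters authorization is typically managed through Ranger instead; admin-client.properties and the principal name are assumptions, the latter based on the short-name mapping configured earlier):

$ kafka-acls \
    --bootstrap-server host-1.example.com:9094 \
    --command-config ./admin-client.properties \
    --add \
    --deny-principal User:alice \
    --operation All \
    --topic '*'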

Example

The following is an example of using the Kafka console consumer to read from a topic using TLS authentication. Note that, when connecting to the cluster, we provide a bootstrap server that uses the port of the SSL listener (9094) rather than the default 9093.

$ cat tls-client.properties
security.protocol=SSL
ssl.keystore.location=./alice-keystore.jks
ssl.keystore.password=supersecret1
ssl.key.password=supersecret1
ssl.truststore.location=/opt/cloudera/security/jks/truststore.jks

$ kafka-console-consumer \
    --bootstrap-server host-1.example.com:9094 \
    --topic test \
    --consumer.config ./tls-client.properties

Note: the client configuration above contains sensitive credentials. When storing this configuration in a file, make sure the file permissions are set so that only the file's owner can read it, for example:
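A minimal sketch; the keystore file deserves the same restrictive permissions:

$ chmod 600 tls-client.properties alice-keystore.jks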

There's more!

All the authentication methods we review in this blog series give you flexible ways to configure your Kafka clusters to integrate with the authentication mechanisms of your environment.

We will continue to explore other authentication alternatives in the next articles in this series. Meanwhile, if you are interested in Cloudera's Kafka offering, please download this white paper.

(*) The ssl.principal.mapping.rules property has been available since Kafka 2.2.0, but that version cannot handle spaces in certificate distinguished names (KAFKA-8860). Kafka 2.4.0 is a more robust starting point.

Original author: Andre Araujo

Original article: https://blog.cloudera.com/how-to-configure-clients-to-connect-to-apache-kafka-clusters-securely-part-4-tls-client-authentication/




