

Tip

Please do not maliciously crawl or hotlink the resources on this site. Thank you for your support!



Note

Related documents:

UI Button
colorblue
newWindowtrue
sizesmall
displayblock
iconlink
titleLink

Official security guide:

UI Button
colorblue
newWindowtrue
sizesmall
displayblock
iconlink
titleLink
urlhttps://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-common/SecureMode.html

Java client example:

UI Button
colorblue
newWindowtrue
sizesmall
displayblock
iconlink
titleLink
urlhttps://community.cloudera.com/t5/Community-Articles/A-Secure-HDFS-Client-Example/ta-p/247424



Info
iconfalse

Table of Contents


CDH 5.13.0 Kerberos

This guide enables Kerberos on CDH 5.13.0.

First, install Kerberos:

UI Button
colorblue
newWindowtrue
sizesmall
iconlink
titlekerberos install
urlhttps://wiki.shileizcc.com/confluence/display/zd/Kerberos

Prepare CDH5:

UI Button
colorblue
newWindowtrue
sizesmall
iconlink
titleCDH5 install
urlhttps://wiki.shileizcc.com/confluence/display/HAD/CDH+5.13.0+install

The Kerberos service is configured as follows:

Code Block
languagebash
titlekdc.conf
collapsetrue
[kdcdefaults]
 kdc_ports = 88
 kdc_tcp_ports = 88

[realms]
 KERBEROS.OPS.SHILEIZCC-OPS.COM = {
  #master_key_type = aes256-cts
  acl_file = /var/kerberos/krb5kdc/kadm5.acl
  dict_file = /usr/share/dict/words
  max_renewable_life = 7d
  max_life = 1d
  admin_keytab = /var/kerberos/krb5kdc/kadm5.keytab
  supported_enctypes = aes256-cts:normal aes128-cts:normal des3-hmac-sha1:normal arcfour-hmac:normal des-hmac-sha1:normal des-cbc-md5:normal des-cbc-crc:normal rc4-hmac:normal
  default_principal_flags = +renewable, +forwardable
 }
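
The acl_file referenced above controls which principals may administer the KDC. A minimal sketch that grants full privileges to every */admin principal in this realm (adjust to your own policy):

Code Block
languagebash
titlekadm5.acl
*/admin@KERBEROS.OPS.SHILEIZCC-OPS.COM *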
Code Block
languagebash
titlekrb5.conf
collapsetrue
[logging]
 default = FILE:/var/log/krb5libs.log
 kdc = FILE:/var/log/krb5kdc.log
 admin_server = FILE:/var/log/kadmind.log
[libdefaults]
 default_realm = KERBEROS.OPS.SHILEIZCC-OPS.COM
 dns_lookup_kdc = false
 dns_lookup_realm = false
 ticket_lifetime = 24h
 renew_lifetime = 7d
 forwardable = true
 default_tgs_enctypes = rc4-hmac
 default_tkt_enctypes = rc4-hmac
 permitted_enctypes = rc4-hmac
 clockskew = 120
 udp_preference_limit = 1
[realms]
 KERBEROS.OPS.SHILEIZCC-OPS.COM = {
 kdc = kerberos.ops.shileizcc-ops.com
 admin_server = kerberos.ops.shileizcc-ops.com
 }
[domain_realm]
 .kerberos.ops.shileizcc-ops.com = KERBEROS.OPS.SHILEIZCC-OPS.COM
 kerberos.ops.shileizcc-ops.com = KERBEROS.OPS.SHILEIZCC-OPS.COM
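
For reference (the Kerberos install guide linked above covers this in full), a minimal sketch of creating the KDC database for this realm and starting the services on a systemd-based host:

Code Block
languagebash
$ kdb5_util create -s -r KERBEROS.OPS.SHILEIZCC-OPS.COM   # create the KDC database and stash the master key
$ systemctl start krb5kdc kadmin                          # start the KDC and the admin server
$ systemctl enable krb5kdc kadmin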

Deployment

Install the Kerberos client packages on all client machines:

Code Block
languagebash
$ yum install -y krb5-workstation krb5-libs krb5-auth-dialog

Next, distribute the krb5.conf configuration to every client, as in the sketch below.
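
A sketch of pushing the file from the KDC host to the three client machines used below (the hostnames are this cluster's; substitute your own):

Code Block
languagebash
$ for host in master00{1..3}.k8s.shileizcc.com; do scp /etc/krb5.conf ${host}:/etc/krb5.conf; done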

Three machines are used here. Now prepare the Kerberos configuration items in CDH:

Select Enable Kerberos:

Click Next:

Configure the Kerberos server address and the KDC address information. (Note: use the des3-cbc-sha1 encryption type; the default rc4-hmac is no longer supported by Kerberos.)

Click Next:

Create the Kerberos credential (password: cloudera-scm):

Code Block
languagebash
kadmin:  addprinc cloudera-scm/admin
WARNING: no policy specified for cloudera-scm/admin@HADOOP.COM; defaulting to no policy
Enter password for principal "cloudera-scm/admin@HADOOP.COM": 
Re-enter password for principal "cloudera-scm/admin@HADOOP.COM": 
Principal "cloudera-scm/admin@HADOOP.COM" created.

Configure CDH using the newly created account:

Click Next:

The Kerberos principal scope; customize it if needed:

Info

 This means the following principals are generated automatically:

Code Block
languagebash
kadmin.local:  listprincs
HTTP/master001.k8s.shileizcc.com@HADOOP.COM
HTTP/master002.k8s.shileizcc.com@HADOOP.COM
HTTP/master003.k8s.shileizcc.com@HADOOP.COM
cloudera-scm/admin@HADOOP.COM
hbase/master001.k8s.shileizcc.com@HADOOP.COM
hbase/master002.k8s.shileizcc.com@HADOOP.COM
hbase/master003.k8s.shileizcc.com@HADOOP.COM
hdfs/master001.k8s.shileizcc.com@HADOOP.COM
hdfs/master002.k8s.shileizcc.com@HADOOP.COM
hdfs/master003.k8s.shileizcc.com@HADOOP.COM
hive/master001.k8s.shileizcc.com@HADOOP.COM
hue/master001.k8s.shileizcc.com@HADOOP.COM
mapred/master001.k8s.shileizcc.com@HADOOP.COM
oozie/master001.k8s.shileizcc.com@HADOOP.COM
yarn/master001.k8s.shileizcc.com@HADOOP.COM
yarn/master002.k8s.shileizcc.com@HADOOP.COM
yarn/master003.k8s.shileizcc.com@HADOOP.COM
zookeeper/master001.k8s.shileizcc.com@HADOOP.COM
...


Restart the cluster:

Once startup completes, Kerberos is configured successfully.

You can verify that the generated credential is usable as follows:

Code Block
languagebash
$ kinit -kt /var/run/cloudera-scm-agent/process/`ls -lrt /var/run/cloudera-scm-agent/process/ | awk '{print $9}' |grep NAMENODE| tail -1`/hdfs.keytab hdfs/master001.k8s.shileizcc.com@HADOOP.COM
$ /opt/cloudera/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/bin/hadoop fs -ls /
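
If the kinit succeeded, klist shows the cached TGT (sample output; timestamps will match your environment):

Code Block
languagebash
$ klist
Ticket cache: FILE:/tmp/krb5cc_0
Default principal: hdfs/master001.k8s.shileizcc.com@HADOOP.COM

Valid starting       Expires              Service principal
09/26/2019 12:10:00  09/27/2019 12:10:00  krbtgt/HADOOP.COM@HADOOP.COM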

Without a valid ticket, the command fails as follows:

Code Block
languagebash
$  hadoop fs -ls /
19/09/25 17:37:09 WARN security.UserGroupInformation: PriviledgedActionException as:root (auth:KERBEROS) cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
19/09/25 17:37:09 WARN ipc.Client: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
19/09/25 17:37:09 WARN security.UserGroupInformation: PriviledgedActionException as:root (auth:KERBEROS) cause:java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
ls: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "master001.k8s.shileizcc.com/10.100.21.93"; destination host is: "master001.k8s.shileizcc.com":8020; 
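
The root cause of this error is an empty credential cache; running klist without a ticket confirms it:

Code Block
languagebash
$ klist
klist: No credentials cache found (filename: /tmp/krb5cc_0)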

Verification Analysis

Investigation shows that the hadoop client tool reads the local configuration files during execution, as the following debug output demonstrates:

Code Block
languagebash
$ HADOOP_ROOT_LOGGER=DEBUG,console /opt/cloudera/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/bin/hadoop fs -ls hdfs://10.100.21.93:8020/
19/09/26 12:12:27 DEBUG util.Shell: setsid exited with exit code 0
19/09/26 12:12:27 DEBUG conf.Configuration: parsing URL jar:file:/opt/cloudera/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/jars/hadoop-common-2.6.0-cdh5.13.0.jar!/core-default.xml
19/09/26 12:12:27 DEBUG conf.Configuration: parsing input stream sun.net.www.protocol.jar.JarURLConnection$JarURLInputStream@193bf6c8
19/09/26 12:12:27 DEBUG conf.Configuration: parsing URL file:/etc/hadoop/conf.cloudera.yarn/core-site.xml
19/09/26 12:12:27 DEBUG conf.Configuration: parsing input stream java.io.BufferedInputStream@37f92637
19/09/26 12:12:27 DEBUG core.Tracer: sampler.classes = ; loaded no samplers
19/09/26 12:12:27 DEBUG core.Tracer: span.receiver.classes = ; loaded no span receivers
19/09/26 12:12:27 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, about=, value=[Rate of successful kerberos logins and latency (milliseconds)], always=false, type=DEFAULT, sampleName=Ops)
19/09/26 12:12:27 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, about=, value=[Rate of failed kerberos logins and latency (milliseconds)], always=false, type=DEFAULT, sampleName=Ops)
19/09/26 12:12:27 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, about=, value=[GetGroups], always=false, type=DEFAULT, sampleName=Ops)
19/09/26 12:12:27 DEBUG lib.MutableMetricsFactory: field private org.apache.hadoop.metrics2.lib.MutableGaugeLong org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailuresTotal with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, about=, value=[Renewal failures since startup], always=false, type=DEFAULT, sampleName=Ops)
19/09/26 12:12:27 DEBUG lib.MutableMetricsFactory: field private org.apache.hadoop.metrics2.lib.MutableGaugeInt org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailures with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, about=, value=[Renewal failures since last successful login], always=false, type=DEFAULT, sampleName=Ops)
19/09/26 12:12:27 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
19/09/26 12:12:27 DEBUG security.SecurityUtil: Setting hadoop.security.token.service.use_ip to true
19/09/26 12:12:28 DEBUG security.Groups:  Creating new Groups object
19/09/26 12:12:28 DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping; cacheTimeout=300000; warningDeltaMs=5000
19/09/26 12:12:28 DEBUG security.UserGroupInformation: hadoop login
19/09/26 12:12:28 DEBUG security.UserGroupInformation: hadoop login commit
19/09/26 12:12:28 DEBUG security.UserGroupInformation: using kerberos user:hdfs/master001.k8s.shileizcc.com@HADOOP.COM
19/09/26 12:12:28 DEBUG security.UserGroupInformation: Using user: "hdfs/master001.k8s.shileizcc.com@HADOOP.COM" with name hdfs/master001.k8s.shileizcc.com@HADOOP.COM
19/09/26 12:12:28 DEBUG security.UserGroupInformation: User entry: "hdfs/master001.k8s.shileizcc.com@HADOOP.COM"
19/09/26 12:12:28 DEBUG security.UserGroupInformation: UGI loginUser:hdfs/master001.k8s.shileizcc.com@HADOOP.COM (auth:KERBEROS)
19/09/26 12:12:28 DEBUG security.UserGroupInformation: Current time is 1569471148234
19/09/26 12:12:28 DEBUG security.UserGroupInformation: Next refresh is 1569476883000
19/09/26 12:12:28 DEBUG core.Tracer: sampler.classes = ; loaded no samplers
19/09/26 12:12:28 DEBUG core.Tracer: span.receiver.classes = ; loaded no span receivers
19/09/26 12:12:28 DEBUG hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
19/09/26 12:12:28 DEBUG hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
19/09/26 12:12:28 DEBUG hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
19/09/26 12:12:28 DEBUG hdfs.BlockReaderLocal: dfs.domain.socket.path = /var/run/hdfs-sockets/dn
19/09/26 12:12:28 DEBUG retry.RetryUtils: multipleLinearRandomRetry = null
19/09/26 12:12:28 DEBUG ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@6fa3703
19/09/26 12:12:28 DEBUG ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@5dd1bc1f
19/09/26 12:12:29 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
19/09/26 12:12:29 DEBUG util.NativeCodeLoader: Loaded the native-hadoop library
19/09/26 12:12:29 DEBUG unix.DomainSocketWatcher: org.apache.hadoop.net.unix.DomainSocketWatcher$2@2621c1f0: starting with interruptCheckPeriodMs = 60000
19/09/26 12:12:29 DEBUG util.PerformanceAdvisory: Both short-circuit local reads and UNIX domain socket are disabled.
19/09/26 12:12:29 DEBUG sasl.DataTransferSaslUtil: DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
19/09/26 12:12:29 DEBUG ipc.Client: The ping interval is 60000 ms.
19/09/26 12:12:29 DEBUG ipc.Client: Connecting to /10.100.21.93:8020
19/09/26 12:12:29 DEBUG security.UserGroupInformation: PrivilegedAction as:hdfs/master001.k8s.shileizcc.com@HADOOP.COM (auth:KERBEROS) from:org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:756)
19/09/26 12:12:29 DEBUG security.SaslRpcClient: Sending sasl message state: NEGOTIATE

19/09/26 12:12:29 DEBUG security.SaslRpcClient: Get token info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.token.TokenInfo(value=class org.apache.hadoop.hdfs.security.token.delegation.DelegationTokenSelector)
19/09/26 12:12:29 DEBUG security.SaslRpcClient: Get kerberos info proto:interface org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB info:@org.apache.hadoop.security.KerberosInfo(clientPrincipal=, serverPrincipal=dfs.namenode.kerberos.principal)
19/09/26 12:12:29 DEBUG security.SaslRpcClient: RPC Server's Kerberos principal name for protocol=org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolPB is hdfs/master001.k8s.shileizcc.com@HADOOP.COM
19/09/26 12:12:29 DEBUG security.SaslRpcClient: Creating SASL GSSAPI(KERBEROS)  client to authenticate to service at master001.k8s.shileizcc.com
19/09/26 12:12:29 DEBUG security.SaslRpcClient: Use KERBEROS authentication for protocol ClientNamenodeProtocolPB
19/09/26 12:12:29 DEBUG security.SaslRpcClient: Sending sasl message state: INITIATE
token: "`\202\002\251\006\t*\206H\206\367\022\001\002\002\001\000n\202\002\2300\202\002\224\240\003\002\001\005\241\003\002\001\016\242\a\003\005\000 \000\000\000\243\202\001\225a\202\001\2210\202\001\215\240\003\002\001\005\241\031\033\027HADOOP.COM\242-0+\240\003\002\001\000\241$0\"\033\004hdfs\033\032master001.k8s.shileizcc.com\243\202\001:0\202\0016\240\003\002\001\020\241\003\002\001\002\242\202\001(\004\202\001$\371w\315\345(`\245\340\312\364x3\020y\0166\330^\2101u\315ks\364\344(\244}\333\357\252\243\016@Y5\027\035\002\247\331\355\0348\300.\223\204S\276\357\030\336\222QJ\320\301\021\334\350\310ET\373\260\t17;\\w4q\360@\004\232\026S\207FOjN~\265\002@X\244h\352K\363\263\340\243\321\310\311B6\r\v[R\000\363\252ew\255lgV\241\254\301\276\274\244pCV\201\343\376.\210\237\346D\216\003.\030a*\203\2134\235\0235k\357\352R\246RSuP\'D\213\263\037R\036\220\020\203Hd\366\003\016\340\314,:\370:\267Z[\367e\315Q\'(5;\356\f\321\030E:\213\354\217\207\234\315iq>\216>\251K\346\326e\243\375\326\'i\322\225\357B\a\323\372\\,\340\265\336\252\277\266F1q\265\332\260\311\240\251\335AN$2\327#H{2\300\025\376\034\246\214\206\252\256\305\335\331!(w\215\377\277\250\027D\024\2578K\232-yO`\225\301\353\0240\326J\024\035o\227\023{\006\244\201\3450\201\342\240\003\002\001\021\242\201\332\004\201\327T\201%e\rh\016j\243\223\262\343\3526\304\330\022!\361\244\275&[B)*n\350m[\203j\004fMGQ+\220\033x\261\326\367\375\'\377Yb4\234\324/\323GF\344$\321\213\305\371Y\002.}\2502<\263\376\214a\203\316\251\235p\216\245\n\3059o\r0\257\022*\361d\256\243\211\311m\0206x\345k\233h\370!\237i=\323*M\331\3473\262\247\271\033\273\260\314{\216\324\002%A\214\2624ur\370\344\324\314\375\323x\3156,\275\233\352\302\276\357^B\t~y\216i\207\364\317?\322\367eH\241r\343\372\023\205$\324\324\227\222\025;\020\305\353uxp`\034sZxiz`\325\335\226\017`\2628$\300(\244\300\365\023wg\025a\306\020\267!:"
auths {
  method: "KERBEROS"
  mechanism: "GSSAPI"
  protocol: "hdfs"
  serverId: "master001.k8s.shileizcc.com"
}

19/09/26 12:12:29 DEBUG security.SaslRpcClient: Sending sasl message state: RESPONSE
token: ""

19/09/26 12:12:29 DEBUG security.SaslRpcClient: Sending sasl message state: RESPONSE
token: "\005\004\000\377\000\f\000\000\000\000\000\000 \274g\320\001\001\000\000z\277\333\351\247\265\357\356\315\265$+"

19/09/26 12:12:29 DEBUG ipc.Client: Negotiated QOP is :auth
19/09/26 12:12:29 DEBUG ipc.Client: IPC Client (2113405974) connection to /10.100.21.93:8020 from hdfs/master001.k8s.shileizcc.com@HADOOP.COM: starting, having connections 1
19/09/26 12:12:29 DEBUG ipc.Client: IPC Client (2113405974) connection to /10.100.21.93:8020 from hdfs/master001.k8s.shileizcc.com@HADOOP.COM sending #0 org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo
19/09/26 12:12:29 DEBUG ipc.Client: IPC Client (2113405974) connection to /10.100.21.93:8020 from hdfs/master001.k8s.shileizcc.com@HADOOP.COM got value #0
19/09/26 12:12:29 DEBUG ipc.ProtobufRpcEngine: Call: getFileInfo took 248ms
19/09/26 12:12:29 DEBUG ipc.Client: IPC Client (2113405974) connection to /10.100.21.93:8020 from hdfs/master001.k8s.shileizcc.com@HADOOP.COM sending #1 org.apache.hadoop.hdfs.protocol.ClientProtocol.getListing
19/09/26 12:12:29 DEBUG ipc.Client: IPC Client (2113405974) connection to /10.100.21.93:8020 from hdfs/master001.k8s.shileizcc.com@HADOOP.COM got value #1
19/09/26 12:12:29 DEBUG ipc.ProtobufRpcEngine: Call: getListing took 2ms
Found 3 items
drwx------   - hbase hbase               0 2019-09-25 17:32 hdfs://10.100.21.93:8020/hbase
drwxrwxrwt   - hdfs  supergroup          0 2019-09-25 17:11 hdfs://10.100.21.93:8020/tmp
drwxr-xr-x   - hdfs  supergroup          0 2019-09-25 17:11 hdfs://10.100.21.93:8020/user
19/09/26 12:12:29 DEBUG ipc.Client: stopping client from cache: org.apache.hadoop.ipc.Client@5dd1bc1f
19/09/26 12:12:29 DEBUG ipc.Client: removing client from cache: org.apache.hadoop.ipc.Client@5dd1bc1f
19/09/26 12:12:29 DEBUG ipc.Client: stopping actual client because no more references remain: org.apache.hadoop.ipc.Client@5dd1bc1f
19/09/26 12:12:29 DEBUG ipc.Client: Stopping client
19/09/26 12:12:29 DEBUG ipc.Client: IPC Client (2113405974) connection to /10.100.21.93:8020 from hdfs/master001.k8s.shileizcc.com@HADOOP.COM: closed
19/09/26 12:12:29 DEBUG ipc.Client: IPC Client (2113405974) connection to /10.100.21.93:8020 from hdfs/master001.k8s.shileizcc.com@HADOOP.COM: stopped, remaining connections 0

In other words, Kerberos authentication relies on the configuration files to obtain access information. If the configuration file being read lacks the following settings (shown commented out below):

Code Block
languagebash
title/etc/hadoop/conf.cloudera.yarn/core-site.xml
...
<!--
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>

<property>
  <name>hadoop.security.authorization</name>
  <value>true</value>
</property>
-->
...

then execution fails as follows:

Code Block
languagebash
$ bin/hadoop fs -ls hdfs://10.100.21.93:8020/                                 
                [UnixLoginModule]: succeeded importing info: 
                        uid = 0
                        gid = 0
                        supp gid = 0
                        supp gid = 993
                [UnixLoginModule]: added UnixPrincipal,
                                UnixNumericUserPrincipal,
                                UnixNumericGroupPrincipal(s),
                         to Subject
ls: SIMPLE authentication is not enabled.  Available:[TOKEN, KERBEROS]
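
Two quick checks and a workaround at this point (a sketch): hdfs getconf prints the value the client actually resolves, and HADOOP_CONF_DIR can point the client at the Cloudera-deployed configuration directory seen in the debug trace above:

Code Block
languagebash
$ hdfs getconf -confKey hadoop.security.authentication    # prints "simple" while the setting is missing
$ export HADOOP_CONF_DIR=/etc/hadoop/conf.cloudera.yarn   # use the config directory from the debug trace
$ bin/hadoop fs -ls hdfs://10.100.21.93:8020/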

In addition, hdfs-site.xml must be configured before Kerberos works properly:

Code Block
languagebash
title/etc/hadoop/conf.cloudera.yarn/hdfs-site.xml
...
  <property>
    <name>dfs.namenode.kerberos.principal</name>
    <value>hdfs/_HOST@HADOOP.COM</value>
  </property>
...
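
The _HOST placeholder is expanded to the NameNode's fully qualified hostname at runtime, so the same file works on every node. You can print the configured value as a sanity check (a sketch):

Code Block
languagebash
$ hdfs getconf -confKey dfs.namenode.kerberos.principal   # prints hdfs/_HOST@HADOOP.COM; _HOST is expanded at connect time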

Without this setting, the error is as follows:

Code Block
languagebash
$ bin/hadoop fs -ls /
                 [UnixLoginModule]: succeeded importing info: 
                        uid = 0
                        gid = 0
                        supp gid = 0
                        supp gid = 993
Debug is  true storeKey false useTicketCache true useKeyTab false doNotPrompt true ticketCache is null isInitiator true KeyTab is null refreshKrb5Config is false principal is null tryFirstPass is false useFirstPass is false storePass is false clearPass is false
Acquire TGT from Cache
Principal is root/10.100.96.91@HADOOP.COM
                [UnixLoginModule]: added UnixPrincipal,
                                UnixNumericUserPrincipal,
                                UnixNumericGroupPrincipal(s),
                         to Subject
Commit Succeeded 

19/09/26 13:21:18 WARN ipc.Client: Exception encountered while connecting to the server : java.lang.IllegalArgumentException: Failed to specify server's Kerberos principal name
ls: Failed on local exception: java.io.IOException: java.lang.IllegalArgumentException: Failed to specify server's Kerberos principal name; Host Details : local host is: "BJ-YZ-Falcon-96-91/10.100.96.91"; destination host is: "master001.k8s.shileizcc.com":8020; 

After destroying the credential (kdestroy), the error is as follows:

Code Block
languagebash
$ kdestroy 
$ bin/hadoop fs -ls / 
Debug is  true storeKey false useTicketCache true useKeyTab false doNotPrompt true ticketCache is null isInitiator true KeyTab is null refreshKrb5Config is false principal is null tryFirstPass is false useFirstPass is false storePass is false clearPass is false
19/09/26 13:22:10 WARN ipc.Client: Exception encountered while connecting to the server : java.lang.IllegalArgumentException: Failed to specify server's Kerberos principal name
ls: Failed on local exception: java.io.IOException: java.lang.IllegalArgumentException: Failed to specify server's Kerberos principal name; Host Details : local host is: "BJ-YZ-Falcon-96-91/10.100.96.91"; destination host is: "master001.k8s.shileizcc.com":8020; 

Testing shows that the client does not depend on the Kerberos kinit binary itself; it depends on the local credential cache file /tmp/krb5cc_0 that kinit writes. In practice, the client keeps working even after all Kerberos client packages are removed.
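
You can see which cache file is in use (the numeric suffix of /tmp/krb5cc_0 is the uid, so 0 assumes root); a quick sketch:

Code Block
languagebash
$ ls -l /tmp/krb5cc_$(id -u)   # the default FILE credential cache for the current user
$ klist                        # shows the principal and tickets held in that cache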

Info

However, the client host must still have the /etc/krb5.conf file. In short, using Kerberos requires both a credential and the configuration file; neither can be omitted.


Document created on , last updated on , current document status 

Status
colourGreen
titleRelease
 , current page version 
Status
colourBlue
titlev1.3.1
 .