Hi! The compute engine here is Spark on K8s, and the data lives on a remote Hadoop cluster with Kerberos enabled. Reading and writing Hive tables through the spark-sql client already works. Kyuubi is now deployed on K8s as the JDBC server, configured per the Kerberos documentation in kyuubi-defaults.conf
spark-defaults.conf
Via
Then I found this document, which says there are two ways (mutually exclusive) to access a Kerberos-secured Hadoop cluster:
Judging from the Spark configuration, we are using the second approach. But from the error, it appears that Kyuubi
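For context, the two mutually exclusive modes generally correspond to spark-submit options like the following. This is a hedged sketch: the principal, keytab path, and user name are placeholders, not values from this thread.

```shell
# Mode 1: proxy-user mode -- the service authenticates with its own credentials
# and impersonates the end user. Incompatible with --principal/--keytab.
spark-submit --proxy-user alice ...

# Mode 2: keytab login -- the application logs in with its own principal and
# keytab and manages ticket renewal itself.
spark-submit \
  --principal spark/node1@EXAMPLE.COM \
  --keytab /etc/security/keytabs/spark.keytab \
  ...
```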
-
Generally, we encourage using proxy-user mode instead of providing a superuser keytab to the Spark application directly, for security reasons. If the keytab is provided to the application, users can access all of its resources, including the keytab itself, through the Scala/Spark API; in other words, handing a superuser keytab to a Spark application means exposing that keytab to anyone. In practice, I know of two approaches that have been adopted widely.
You should choose one of the two.
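A minimal sketch of the proxy-user route, under assumptions not stated in the thread: the Kyuubi service principal is `kyuubi/_HOST@EXAMPLE.COM`, and hostnames, realm, and user names are placeholders. The Hadoop-side impersonation grants go in core-site.xml, Kyuubi keeps its own service keytab login, and the client connects with a Kerberized JDBC URL.

```shell
# 1) On the Hadoop side (core-site.xml), allow the kyuubi service user to
#    impersonate end users; "*" is illustrative, scope it down in production:
#      hadoop.proxyuser.kyuubi.groups = *
#      hadoop.proxyuser.kyuubi.hosts  = *
#
# 2) In kyuubi-defaults.conf, keep the service's own keytab login:
#      kyuubi.kinit.principal = kyuubi/_HOST@EXAMPLE.COM
#      kyuubi.kinit.keytab    = /etc/security/keytabs/kyuubi.keytab
#
# 3) The end user authenticates with their own ticket and connects; Kyuubi
#    launches the engine on behalf of the authenticated client user:
kinit alice@EXAMPLE.COM
beeline -u "jdbc:hive2://kyuubi.example.com:10009/default;principal=kyuubi/_HOST@EXAMPLE.COM"
```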
-
I agree with @pan3793, and as a quick workaround, you can pass the principal and keytab through the JDBC URL. For example:
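The concrete URL was lost from this reply; the following is a hedged reconstruction, assuming Spark 3's `spark.kerberos.principal` / `spark.kerberos.keytab` configs are passed in the configuration section of Kyuubi's HiveServer2-compatible JDBC URL (`jdbc:hive2://host:port/db;sessionVars?confVars`). Host, realm, and paths are placeholders.

```shell
# Spark engine confs after "?" are applied when Kyuubi launches the engine;
# the keytab path must be readable where the engine is submitted.
beeline -u "jdbc:hive2://kyuubi.example.com:10009/default;principal=kyuubi/_HOST@EXAMPLE.COM?spark.kerberos.principal=alice@EXAMPLE.COM;spark.kerberos.keytab=/path/to/alice.keytab"
```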