[Flink-KAFKA-KEYTAB] Kafkaconsumer error Kerberos


Vijayendra Yadav
Hi Team,


My Kafka cluster is Kerberos-secured and SSL-enabled.

I am running my Flink streaming job in yarn-cluster mode on EMR 5.31.

I have tried to pass the keytab/principal in the following two ways:

1) Passing them as properties on the flink run command:

/usr/lib/flink/bin/flink run \
  -yt ${app_install_path}/conf/ \
  -Dsecurity.kerberos.login.use-ticket-cache=false \
  -yDsecurity.kerberos.login.use-ticket-cache=false \
  -Dsecurity.kerberos.login.keytab=${app_install_path}/conf/keytab \
  -yDsecurity.kerberos.login.keytab=${app_install_path}/conf/keytab \
  -Djava.security.krb5.conf=${app_install_path}/conf/krb5.conf \
  -yDjava.security.krb5.conf=${app_install_path}/conf/krb5.conf \
  -Dsecurity.kerberos.login.principal=[hidden email] \
  -yDsecurity.kerberos.login.principal=[hidden email] \
  -Dsecurity.kerberos.login.contexts=Client,KafkaClient \
  -yDsecurity.kerberos.login.contexts=Client,KafkaClient

Here I get the following error; it seems the keytab was not shipped to the runtime environment and could not be found:

org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
Caused by: java.lang.IllegalArgumentException: Could not find a 'KafkaClient' entry in the JAAS configuration. System property 'java.security.auth.login.config'

2) Setting them in the Flink config file /usr/lib/flink/conf/flink-conf.yaml:

security.kerberos.login.use-ticket-cache: false
security.kerberos.login.keytab: ${app_install_path}/conf/keytab
security.kerberos.login.principal: [hidden email]
security.kerberos.login.contexts: Client,KafkaClient

Here I get the following error:

org.apache.kafka.common.KafkaException: Failed to construct kafka consumer
Caused by: org.apache.kafka.common.KafkaException: java.lang.IllegalArgumentException: No serviceName defined in either JAAS or Kafka config
 

Could you please help identify the probable causes and their resolution?

Regards,
Vijay

Re: [Flink-KAFKA-KEYTAB] Kafkaconsumer error Kerberos

Dawid Wysakowicz-2

Hi,

As far as I know, approach 2) is the supported way of setting up Kerberos authentication in Flink. In the second approach, have you tried setting `sasl.kerberos.service.name` in the configuration of your KafkaConsumer/Producer [1]? I think this might be the issue.

Best,

Dawid

[1] https://ci.apache.org/projects/flink/flink-docs-stable/dev/connectors/kafka.html#enabling-kerberos-authentication
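For illustration, a minimal sketch of the Kafka client properties involved (the values are assumptions for a typical SASL_SSL + GSSAPI setup, not taken from the poster's environment; the service name must match the first component of the broker's Kerberos principal, commonly "kafka"):

```
# consumer properties (sketch; values are placeholders)
security.protocol=SASL_SSL
sasl.mechanism=GSSAPI
# The property the "No serviceName defined" error refers to:
sasl.kerberos.service.name=kafka
```

These go into the Properties object handed to the Kafka consumer/producer.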



Re: [Flink-KAFKA-KEYTAB] Kafkaconsumer error Kerberos

Vijayendra Yadav
Dawid, I was able to resolve the keytab issue by passing the service name, but now I am facing a KRB5 issue:

Caused by: org.apache.kafka.common.errors.SaslAuthenticationException: Failed to create SaslClient with mechanism GSSAPI
Caused by: javax.security.sasl.SaslException: Failure to initialize security context [Caused by GSSException: Invalid name provided (Mechanism level: KrbException: Cannot locate default realm)]

I passed the KRB5 path from the yaml conf file like this:

env.java.opts.jobmanager: -Djava.security.krb5.conf=/path/krb5.conf
env.java.opts.taskmanager: -Djava.security.krb5.conf=/path/krb5.conf

How can I resolve this? Is there another way to pass the KRB5 config?

I also tried option #1, the -D parameter on the flink run command.

Regards,
Vijay




Re: [Flink-KAFKA-KEYTAB] Kafkaconsumer error Kerberos

Vijayendra Yadav
Any inputs?



Re: [Flink-KAFKA-KEYTAB] Kafkaconsumer error Kerberos

Yangze Guo
Hi,

When deploying Flink on YARN, you can ship krb5.conf with the "--ship" option. Note that this option currently only supports shipping folders.

Best,
Yangze Guo
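As a sketch of the ship option (paths and the jar name are placeholders; on Flink's YARN CLI, -yt is the short form of --yarnship):

```
flink run -m yarn-cluster \
  -yt /path/to/conf \    # ships the whole folder into each YARN container's working directory
  your-job.jar           # hypothetical job jar
```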


Re: [Flink-KAFKA-KEYTAB] Kafkaconsumer error Kerberos

Vijayendra Yadav
Hi Yangze,

I tried the following; maybe I am missing something.
-yt,--yarnship <arg>

Run:
/usr/lib/flink/bin/flink run -m yarn-cluster \
  -yt ${app_install_path}/conf

My krb5.conf is in ${app_install_path}/conf on the master node (local build path).

When this folder is shipped to YARN, how should I reference krb5.conf in the run command?

I tried:  -yD java.security.krb5.conf=./krb5.conf

That didn't work. Please suggest: can the file be referenced with a relative path like ./krb5.conf, or what am I misinterpreting?

Note: when we manually placed krb5.conf on all cluster nodes as /etc/krb5.conf, it worked. But I am trying to make it available as a JVM property instead.

Regards,
Vijay



Re: [Flink-KAFKA-KEYTAB] Kafkaconsumer error Kerberos

rmetzger0
Hi Vijayendra,

I'm not sure -yD is the right argument as you've posted it: it is meant for Flink configuration keys, not for JVM properties.

With the Flink configuration "env.java.opts", you should be able to pass JVM properties.
This should work: -yD env.java.opts="-Djava.security.krb5.conf=./krb5.conf"

You can validate whether this setting reached the JobManager/TaskManager JVM by checking the logs through the Flink Web UI: there is a "JVM Options:" section at the top of each log file.

If it still doesn't work, I would also recommend validating that the files really end up on the machines as expected: find a host running a Flink TaskManager, get the Flink working directory (from the logs again), then ssh into the machine and check that the files are where you expect them.
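As a sketch of that log check (run against a log file saved from the web UI; "sample.log" here is a stand-in with hypothetical content, not a real TaskManager log):

```shell
# Create a stand-in for a TaskManager log downloaded from the web UI.
cat > sample.log <<'EOF'
 INFO  [] - JVM Options:
 INFO  [] -    -Xmx1g
 INFO  [] -    -Djava.security.krb5.conf=./conf/krb5.conf
EOF

# Check whether the krb5 property appears in the "JVM Options:" section.
if grep -q 'java.security.krb5.conf' sample.log; then
  echo "krb5 property reached the JVM"
else
  echo "krb5 property missing"
fi
```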


I hope this helps,
Robert


Re: [Flink-KAFKA-KEYTAB] Kafkaconsumer error Kerberos

Vijayendra Yadav
Thanks for your valuable inputs Robert, it helped me solve the issue.
I tried -yD from flink run as you suggested, along with many other combinations, but that didn't work out.

It finally worked when I set it in flink-conf.yaml with a relative path, like below:

env.java.opts.jobmanager: -Djava.security.krb5.conf=./conf/krb5.conf
env.java.opts.taskmanager: -Djava.security.krb5.conf=./conf/krb5.conf
env.java.opts: -Djava.security.krb5.conf=./conf/krb5.conf
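Pulling the thread together, a sketch of the combination that worked (paths and principal are placeholders; the relative ./conf path resolves because folders shipped with -yt land in each YARN container's working directory):

```
# flink-conf.yaml (sketch)
security.kerberos.login.use-ticket-cache: false
security.kerberos.login.keytab: /path/to/conf/keytab
security.kerberos.login.principal: <principal>
security.kerberos.login.contexts: Client,KafkaClient
env.java.opts: -Djava.security.krb5.conf=./conf/krb5.conf
```

combined with `sasl.kerberos.service.name` set in the Kafka consumer properties and the conf folder shipped via -yt.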

Regards,
Vijay


Re: [Flink-KAFKA-KEYTAB] Kafkaconsumer error Kerberos

tunm4
I also met this problem. Could you share your solution with me?
Thank you so much!