Flink 1.8.1 HDFS 2.6.5 issue

Flink 1.8.1 HDFS 2.6.5 issue

V N, Suchithra (Nokia - IN/Bangalore)

Hi,

 

I am trying to execute Wordcount.jar on Flink 1.8.1 with Hadoop version 2.6.5. HDFS is enabled with Kerberos and SSL. While writing output to HDFS, the job fails with the exception below. Please let me know if you have any suggestions for debugging this issue.

 

Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
        at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
        at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:265)
        ... 21 more
Caused by: java.io.IOException: DataStreamer Exception:
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:695)
Caused by: java.lang.NullPointerException
        at org.apache.hadoop.crypto.CryptoInputStream.<init>(CryptoInputStream.java:132)
        at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil.createStreamPair(DataTransferSaslUtil.java:345)
        at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.doSaslHandshake(SaslDataTransferClient.java:489)
        at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.getEncryptedStreams(SaslDataTransferClient.java:298)
        at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.send(SaslDataTransferClient.java:241)
        at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.checkTrustAndSend(SaslDataTransferClient.java:210)
        at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.socketSend(SaslDataTransferClient.java:182)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1409)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1357)
        at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587)

 

Regards,

Suchithra


Re: Flink 1.8.1 HDFS 2.6.5 issue

Dian Fu
It seems that the CryptoCodec is null from the exception stack trace. This may occur when "hadoop.security.crypto.codec.classes.aes.ctr.nopadding" is misconfigured. You could change the log level to "DEBUG" and it will show more detailed information about why CryptoCodec is null.
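To apply that suggestion, the Hadoop client loggers can be raised to DEBUG in the `log4j.properties` that Flink ships in its `conf/` directory. This is a sketch; the logger names are the standard Hadoop packages from the stack trace above, but the exact file location depends on your deployment:

```properties
# In Flink's conf/log4j.properties: raise HDFS client logging to DEBUG
log4j.logger.org.apache.hadoop.hdfs=DEBUG
log4j.logger.org.apache.hadoop.hdfs.protocol.datatransfer.sasl=DEBUG
log4j.logger.org.apache.hadoop.crypto=DEBUG
```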

On Oct 28, 2019, at 7:14 PM, V N, Suchithra (Nokia - IN/Bangalore) <[hidden email]> wrote:



RE: Flink 1.8.1 HDFS 2.6.5 issue

V N, Suchithra (Nokia - IN/Bangalore)

Hi,

From the debug logs I can see the entries below in the taskmanager. Please have a look.

 

org.apache.hadoop.ipc.ProtobufRpcEngine Call: addBlock took 372ms

org.apache.hadoop.hdfs.DFSClient pipeline = 10.76.113.216:1044

org.apache.hadoop.hdfs.DFSClient Connecting to datanode 10.76.113.216:1044

org.apache.hadoop.hdfs.DFSClient Send buf size 131072

o.a.h.h.p.d.sasl.SaslDataTransferClient SASL client doing encrypted handshake for addr = /10.76.113.216, datanodeId = 10.76.113.216:1044

o.a.h.h.p.d.sasl.SaslDataTransferClient Client using encryption algorithm 3des

o.a.h.h.p.d.sasl.DataTransferSaslUtil Verifying QOP, requested QOP = [auth-conf], negotiated QOP = auth-conf

o.a.h.h.p.d.sasl.DataTransferSaslUtil Creating IOStreamPair of CryptoInputStream and CryptoOutputStream.

o.apache.hadoop.util.PerformanceAdvisory No crypto codec classes with cipher suite configured.

org.apache.hadoop.hdfs.DFSClient DataStreamer Exception

java.lang.NullPointerException: null
              at org.apache.hadoop.crypto.CryptoInputStream.<init>(CryptoInputStream.java:132)
              at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil.createStreamPair(DataTransferSaslUtil.java:345)
              at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.doSaslHandshake(SaslDataTransferClient.java:489)
              at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.getEncryptedStreams(SaslDataTransferClient.java:298)
              at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.send(SaslDataTransferClient.java:241)
              at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.checkTrustAndSend(SaslDataTransferClient.java:210)
              at org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient.socketSend(SaslDataTransferClient.java:182)
              at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1409)
              at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1357)
              at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:587)
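The "No crypto codec classes with cipher suite configured." line followed immediately by the NPE points at the failure mode: the codec lookup returns null instead of raising an error, and the stream constructor dereferences it. A minimal pure-Java sketch of that pattern, with no Hadoop dependency; the class and method names here are illustrative stand-ins, not the real Hadoop API:

```java
import java.util.Collections;
import java.util.Map;

public class NullCodecSketch {
    // Stand-in for CryptoCodec.getInstance: yields null, not an exception,
    // when no codec class is configured for the cipher suite.
    static String lookupCodec(Map<String, String> conf) {
        return conf.get("hadoop.security.crypto.codec.classes.aes.ctr.nopadding");
    }

    // Stand-in for the CryptoInputStream constructor, which dereferences
    // the codec it is handed without a null check.
    static String openStream(String codec) {
        return "CryptoInputStream using " + codec.trim(); // NPE if codec == null
    }

    public static void main(String[] args) {
        // The misconfigured case from the debug logs: no codec class set.
        Map<String, String> misconfigured = Collections.emptyMap();
        try {
            openStream(lookupCodec(misconfigured));
        } catch (NullPointerException e) {
            System.out.println("NPE, as in the reported stack trace");
        }

        // With the property set, the lookup succeeds and no NPE occurs.
        Map<String, String> fixed = Collections.singletonMap(
            "hadoop.security.crypto.codec.classes.aes.ctr.nopadding",
            "org.apache.hadoop.crypto.JceAesCtrCryptoCodec");
        System.out.println(openStream(lookupCodec(fixed)));
    }
}
```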

 

Regards,

Suchithra

 

From: Dian Fu <[hidden email]>
Sent: Monday, October 28, 2019 5:40 PM
To: V N, Suchithra (Nokia - IN/Bangalore) <[hidden email]>
Cc: [hidden email]
Subject: Re: Flink 1.8.1 HDFS 2.6.5 issue

 



Re: Flink 1.8.1 HDFS 2.6.5 issue

Dian Fu
I guess this is a bug in Hadoop 2.6.5 that has been fixed in Hadoop 2.8.0 [1]. You can work around it by explicitly setting the configuration "hadoop.security.crypto.codec.classes.aes.ctr.nopadding" to "org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec, org.apache.hadoop.crypto.JceAesCtrCryptoCodec".
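For illustration, that workaround would typically go into the client-side `core-site.xml`; this is a sketch (the file location depends on your deployment), with the property name and value exactly as quoted above:

```xml
<property>
  <name>hadoop.security.crypto.codec.classes.aes.ctr.nopadding</name>
  <value>org.apache.hadoop.crypto.OpensslAesCtrCryptoCodec, org.apache.hadoop.crypto.JceAesCtrCryptoCodec</value>
</property>
```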


On Oct 28, 2019, at 8:59 PM, V N, Suchithra (Nokia - IN/Bangalore) <[hidden email]> wrote:



RE: Flink 1.8.1 HDFS 2.6.5 issue

V N, Suchithra (Nokia - IN/Bangalore)

Thanks for the information. Without setting this parameter explicitly, is there any possibility that it could work intermittently?

 

From: Dian Fu <[hidden email]>
Sent: Tuesday, October 29, 2019 7:12 AM
To: V N, Suchithra (Nokia - IN/Bangalore) <[hidden email]>
Cc: [hidden email]
Subject: Re: Flink 1.8.1 HDFS 2.6.5 issue

 



Re: Flink 1.8.1 HDFS 2.6.5 issue

Dian Fu
You could also disable the security feature of the Hadoop cluster or upgrade the Hadoop version; I'm not sure whether that is acceptable for you, as it requires more changes. Setting the configuration is the smallest change I can think of to solve this issue, as it will not affect other users of the Hadoop cluster. You could also turn to the Hadoop community to see if they can provide some help, as this is actually a Hadoop problem.

On Oct 29, 2019, at 2:25 PM, V N, Suchithra (Nokia - IN/Bangalore) <[hidden email]> wrote:
