HBase Connector failed when deployed to YARN

hai

Hello:

    I am new to Flink, and I copied the official HBase connector example from the source file

flink/flink-connectors/flink-hbase/src/test/java/org/apache/flink/addons/hbase/example/HBaseWriteExample.java

and ran it on a YARN cluster with the command:

bin/flink run -m yarn-cluster -yn 2 -c {class-path-prefix}.HBaseWriteExample {my-application}.jar

What I got is:

------------------------------------------------------------
 The program finished with the following exception:

org.apache.flink.client.program.ProgramInvocationException: The main method caused an error.
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:545)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:419)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:339)
at org.apache.flink.client.CliFrontend.executeProgram(CliFrontend.java:831)
at org.apache.flink.client.CliFrontend.run(CliFrontend.java:256)
at org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:1073)
at org.apache.flink.client.CliFrontend$2.call(CliFrontend.java:1120)
at org.apache.flink.client.CliFrontend$2.call(CliFrontend.java:1117)
at org.apache.flink.runtime.security.HadoopSecurityContext$1.run(HadoopSecurityContext.java:43)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698)
at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:40)
at org.apache.flink.client.CliFrontend.main(CliFrontend.java:1117)
Caused by: java.lang.RuntimeException: Could not load the TypeInformation for the class 'org.apache.hadoop.io.Writable'. You may be missing the 'flink-hadoop-compatibility' dependency.
at org.apache.flink.api.java.typeutils.TypeExtractor.createHadoopWritableTypeInfo(TypeExtractor.java:2025)
at org.apache.flink.api.java.typeutils.TypeExtractor.privateGetForClass(TypeExtractor.java:1649)
at org.apache.flink.api.java.typeutils.TypeExtractor.privateGetForClass(TypeExtractor.java:1591)
at org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfoWithTypeHierarchy(TypeExtractor.java:778)
at org.apache.flink.api.java.typeutils.TypeExtractor.createSubTypesInfo(TypeExtractor.java:998)
at org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfoWithTypeHierarchy(TypeExtractor.java:679)
at org.apache.flink.api.java.typeutils.TypeExtractor.createTypeInfoFromInputs(TypeExtractor.java:791)
at org.apache.flink.api.java.typeutils.TypeExtractor.privateCreateTypeInfo(TypeExtractor.java:621)
at org.apache.flink.api.java.typeutils.TypeExtractor.getUnaryOperatorReturnType(TypeExtractor.java:425)
at org.apache.flink.api.java.typeutils.TypeExtractor.getUnaryOperatorReturnType(TypeExtractor.java:349)
at org.apache.flink.api.java.typeutils.TypeExtractor.getMapReturnTypes(TypeExtractor.java:164)
at org.apache.flink.api.java.DataSet.map(DataSet.java:215)
at com.luckyfish.flink.java.HBaseWriteExample.main(HBaseWriteExample.java:75)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:528)
... 13 more

What should I do to deal with this exception?

Many Thanks

Re: HBase Connector failed when deployed to YARN

hai

And my pom.xml dependencies are:


<dependencies>

    <!-- Scala -->
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-compiler</artifactId>
        <version>${scala.version}</version>
    </dependency>

    <!-- SLF4J & Log4j & Kafka-Appender & Flume-Appender -->
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-api</artifactId>
        <version>1.7.21</version>
    </dependency>

    <!-- Logback 1.1.1 -->
    <dependency>
        <groupId>ch.qos.logback</groupId>
        <artifactId>logback-core</artifactId>
        <version>1.1.1</version>
    </dependency>
    <dependency>
        <groupId>ch.qos.logback</groupId>
        <artifactId>logback-classic</artifactId>
        <version>1.1.1</version>
    </dependency>

    <!-- Flink -->
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-scala_${scala.binary.version}</artifactId>
        <version>${flink.version}</version>
        <scope>compile</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-streaming-scala_${scala.binary.version}</artifactId>
        <version>${flink.version}</version>
        <scope>compile</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-runtime-web_${scala.binary.version}</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-hbase_${scala.binary.version}</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-hadoop-compatibility_${scala.binary.version}</artifactId>
        <version>${flink.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-mapreduce-client-core</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>cglib</groupId>
        <artifactId>cglib</artifactId>
        <version>2.2.2</version>
    </dependency>

    <!-- Hadoop -->
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>${hadoop.version}</version>
    </dependency>

</dependencies>
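
A quick way to double-check that this dependency actually ends up in the submitted fat jar (a sketch; my-application.jar stands in for the real artifact name) is to list the jar contents and look for the Writable type-info class that flink-hadoop-compatibility provides:

# List the jar contents; if nothing matches, the dependency was not bundled.
jar tf my-application.jar | grep WritableTypeInfo

Note that even when the class is bundled, the error can still occur, presumably because the TypeExtractor loads the class through Flink's own classloader rather than the user-code classloader (see the issue linked in the reply below).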



Re: HBase Connector failed when deployed to YARN

Yun Tang
Hi

I believe this is the same problem as the one reported in https://issues.apache.org/jira/browse/FLINK-12163; the current workaround is to put the flink-hadoop-compatibility jar under FLINK_HOME/lib.
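
For example (a sketch: the exact jar name depends on your Flink and Scala versions, assumed here to be 1.7.2 and 2.11; adjust to match your cluster):

# Copy the compatibility jar into Flink's lib/ directory so the framework
# classloader can see it, then resubmit the job.
cp flink-hadoop-compatibility_2.11-1.7.2.jar $FLINK_HOME/lib/

For a yarn-cluster deployment the client ships the contents of FLINK_HOME/lib to the containers, so placing the jar there on the submitting machine should be sufficient.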

Best
Yun Tang


Re: HBase Connector failed when deployed to YARN

hai

Hi, Tang:


Thanks for your reply. Will this issue be fixed soon? I don't think putting the flink-hadoop-compatibility jar under FLINK_HOME/lib is an elegant solution.


Regards



Re: HBase Connector failed when deployed to YARN

Fabian Hueske
Hi,

The Jira issue is still unassigned.
Would you be up to work on a fix?

Best, Fabian


Re: HBase Connector failed when deployed to YARN

hai

Hi Fabian:


OK, I am glad to do that.


Regards



Re: HBase Connector failed when deployed to YARN

Fabian Hueske
That's great!
Thank you.

Let me know if you have any questions.

Fabian
