Error while using catalog in .yaml file

Yebgenya Lazarkhosrouabadi

Hello,

 

I’m trying to use the HiveCatalog in Flink 1.9. I modified the YAML file like this:

 

 

catalogs:
  - name: mynewhive
    type: hive
    hive-conf-dir: /home/user/Downloads/apache-hive-1.2.2-bin/conf
    default-database: myhive

 

 

But when I run ./sql-client.sh embedded, I get this error:

 

Exception in thread "main" org.apache.flink.table.client.SqlClientException: The configured environment is invalid. Please check your environment files again.
    at org.apache.flink.table.client.SqlClient.validateEnvironment(SqlClient.java:147)
    at org.apache.flink.table.client.SqlClient.start(SqlClient.java:99)
    at org.apache.flink.table.client.SqlClient.main(SqlClient.java:194)
Caused by: org.apache.flink.table.client.gateway.SqlExecutionException: Could not create execution context.
    at org.apache.flink.table.client.gateway.local.LocalExecutor.getOrCreateExecutionContext(LocalExecutor.java:553)
    at org.apache.flink.table.client.gateway.local.LocalExecutor.validateSession(LocalExecutor.java:373)
    at org.apache.flink.table.client.SqlClient.validateEnvironment(SqlClient.java:144)
    ... 2 more
Caused by: org.apache.flink.table.api.NoMatchingTableFactoryException: Could not find a suitable table factory for 'org.apache.flink.table.factories.CatalogFactory' in the classpath.

Reason: No context matches.

The following properties are requested:
default-database=myhive
hive-conf-dir=/home/bernadette/Downloads/apache-hive-1.2.2-bin/conf
type=hive

The following factories have been considered:
org.apache.flink.table.catalog.GenericInMemoryCatalogFactory
org.apache.flink.table.sources.CsvBatchTableSourceFactory
org.apache.flink.table.sources.CsvAppendTableSourceFactory
org.apache.flink.table.sinks.CsvBatchTableSinkFactory
org.apache.flink.table.sinks.CsvAppendTableSinkFactory
org.apache.flink.table.planner.StreamPlannerFactory
org.apache.flink.table.executor.StreamExecutorFactory
org.apache.flink.table.planner.delegation.BlinkPlannerFactory
org.apache.flink.table.planner.delegation.BlinkExecutorFactory
    at org.apache.flink.table.factories.TableFactoryService.filterByContext(TableFactoryService.java:283)
    at org.apache.flink.table.factories.TableFactoryService.filter(TableFactoryService.java:191)
    at org.apache.flink.table.factories.TableFactoryService.findSingleInternal(TableFactoryService.java:144)
    at org.apache.flink.table.factories.TableFactoryService.find(TableFactoryService.java:114)
    at org.apache.flink.table.client.gateway.local.ExecutionContext.createCatalog(ExecutionContext.java:258)
    at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$new$0(ExecutionContext.java:136)
    at java.util.HashMap.forEach(HashMap.java:1289)
    at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:135)
    at org.apache.flink.table.client.gateway.local.LocalExecutor.getOrCreateExecutionContext(LocalExecutor.java:549)
    ... 4 more

 

 

 

How can I get rid of this error?

 

Best regards

Yebgenya Lazar

NOTE: This is a confidential message intended only for the addressee. Copying this message or making it available to third parties is not permitted. If you have received this message in error, please notify me by e-mail or at the telephone number given above.
Re: Error while using catalog in .yaml file

phoenixjiangnan
Put the flink-connector-hive jar on the classpath.
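For Flink 1.9 that would mean something like the following sketch. The jar names and paths are assumptions (they depend on your Scala version and on where you obtained the connector), and as the follow-up in this thread shows, Hive's own jars end up being needed as well:

```shell
# Sketch: copy the Hive connector (and Hive's own jars) into Flink's lib/
# directory -- the place the SQL Client picks dependencies up from.
# All paths below are hypothetical; adjust them to your installation.
FLINK_HOME="${FLINK_HOME:-/opt/flink-1.9.0}"

for jar in \
    flink-connector-hive_2.11-1.9.0.jar \
    hive-exec-1.2.2.jar; do
  if [ -f "$jar" ]; then
    cp "$jar" "$FLINK_HOME/lib/"
  else
    echo "adjust path first: $jar"
  fi
done
```

The SQL Client must be restarted after the jars are in place, since the classpath is only scanned at startup.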



Re: Error while using catalog in .yaml file

Yebgenya Lazarkhosrouabadi
In reply to this post by Yebgenya Lazarkhosrouabadi

Hello,

 

I built Flink from source and now have the flink-connector-hive jar. I copied it into Flink’s lib directory, but running ./sql-client.sh embedded still fails, now with this error:

 

Exception in thread "main" org.apache.flink.table.client.SqlClientException: The configured environment is invalid. Please check your environment files again.
    at org.apache.flink.table.client.SqlClient.validateEnvironment(SqlClient.java:147)
    at org.apache.flink.table.client.SqlClient.start(SqlClient.java:99)
    at org.apache.flink.table.client.SqlClient.main(SqlClient.java:194)
Caused by: org.apache.flink.table.client.gateway.SqlExecutionException: Could not create execution context.
    at org.apache.flink.table.client.gateway.local.LocalExecutor.getOrCreateExecutionContext(LocalExecutor.java:553)
    at org.apache.flink.table.client.gateway.local.LocalExecutor.validateSession(LocalExecutor.java:373)
    at org.apache.flink.table.client.SqlClient.validateEnvironment(SqlClient.java:144)
    ... 2 more
Caused by: java.lang.NoClassDefFoundError: org/apache/hive/common/util/HiveVersionInfo
    at org.apache.flink.table.catalog.hive.client.HiveShimLoader.getHiveVersion(HiveShimLoader.java:58)
    at org.apache.flink.table.catalog.hive.factories.HiveCatalogFactory.createCatalog(HiveCatalogFactory.java:82)
    at org.apache.flink.table.client.gateway.local.ExecutionContext.createCatalog(ExecutionContext.java:259)
    at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$new$0(ExecutionContext.java:136)
    at java.util.HashMap.forEach(HashMap.java:1289)
    at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:135)
    at org.apache.flink.table.client.gateway.local.LocalExecutor.getOrCreateExecutionContext(LocalExecutor.java:549)
    ... 4 more
Caused by: java.lang.ClassNotFoundException: org.apache.hive.common.util.HiveVersionInfo
    at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

 

 

 

 

The log file shows this classpath:

 

2019-08-28 20:06:42,278 INFO  org.apache.flink.runtime.taskexecutor.TaskManagerRunner       -  Classpath: :/home/user/Dokumente/flink-1.9.0/flink-dist/target/flink-1.9.0-bin/flink-1.9.0/lib/flink-connector-hive_2.11-1.9.0.jar:/home/user/Dokumente/flink-1.9.0/flink-dist/target/flink-1.9.0-bin/flink-1.9.0/lib/flink-dist_2.11-1.9.0.jar:/home/user/Dokumente/flink-1.9.0/flink-dist/target/flink-1.9.0-bin/flink-1.9.0/lib/flink-table_2.11-1.9.0.jar:/home/user/Dokumente/flink-1.9.0/flink-dist/target/flink-1.9.0-bin/flink-1.9.0/lib/flink-table-blink_2.11-1.9.0.jar:/home/user/Dokumente/flink-1.9.0/flink-dist/target/flink-1.9.0-bin/flink-1.9.0/lib/log4j-1.2.17.jar:/home/user/Dokumente/flink-1.9.0/flink-dist/target/flink-1.9.0-bin/flink-1.9.0/lib/slf4j-log4j12-1.7.15.jar::/usr/local/hadoop/hadoop-2.7.7/etc/hadoop:
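The classpath above contains flink-connector-hive but no Hive jars at all, and the missing class org.apache.hive.common.util.HiveVersionInfo ships with Hive itself (in hive-common, also bundled into hive-exec), not with the Flink connector. A sketch for locating which jar in the Hive distribution provides it (the path is an assumption; point HIVE_LIB at your installation):

```shell
# Sketch: search the Hive distribution's lib/ directory for the jar that
# contains the class reported by the NoClassDefFoundError.
HIVE_LIB="${HIVE_LIB:-/home/user/Downloads/apache-hive-1.2.2-bin/lib}"
CLASS='org/apache/hive/common/util/HiveVersionInfo.class'

found=""
for jar in "$HIVE_LIB"/*.jar; do
  [ -f "$jar" ] || continue
  if unzip -l "$jar" 2>/dev/null | grep -q "$CLASS"; then
    echo "contains HiveVersionInfo: $jar"
    found="yes"
  fi
done
[ -n "$found" ] || echo "no matching jar under $HIVE_LIB (adjust the path)"
```

Whichever jar turns up would then also need to be copied into Flink's lib directory, next to the connector jar.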

 

 

The catalog configuration in sql-client-defaults.yaml is now:

 

catalogs:
  - name: mynewhive
    type: hive
    property-version: 1
    hive-conf-dir: /home/bernadette/Downloads/apache-hive-1.2.2-bin/conf
    hive-version: 1.2.1

 

 

The error disappears when I remove these lines from the YAML file.
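For reference, once the Hive jars are on the classpath, a catalog entry combining the properties from both attempts might look like this. This is a sketch only: the paths and version values are the ones mentioned in this thread and must match the actual installation.

```yaml
catalogs:
  - name: mynewhive
    type: hive
    property-version: 1
    hive-conf-dir: /home/bernadette/Downloads/apache-hive-1.2.2-bin/conf
    hive-version: 1.2.1
    default-database: myhive
```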

 

 

I look forward to hearing from you.

 

Regards

Yebgenya Lazar

 

 

From: Bowen Li <[hidden email]>
Sent: Monday, 26 August 2019, 22:45
To: Yebgenya Lazarkhosrouabadi <[hidden email]>
Cc: [hidden email]
Subject: Re: Error while using catalog in .yaml file

Put the flink-connector-hive jar on the classpath.

 

 

 

