Compile flink file with external jar

Compile flink file with external jar

Giacomo Licari
Hi guys,

I'm developing a dataflow client whose backend, exported as a jar, allows users to convert dataflows into a generated Flink .java file. The generated file naturally uses classes from that jar, such as MapFunction and DataSet.

My question is: 
How can I compile the generated file and pack it into a jar so that I can execute it with Flink?

I have tried several methods like:

  • javac -cp backend.jar:. GeneratedClass.java
  • javac -cp ".:backend.jar" GeneratedClass.java
  • javac -cp ".:./backend.jar" GeneratedClass.java

with no success.

I receive errors like:
GeneratedClass.java:2: error: package org.apache.flink.api.common.functions does not exist
GeneratedClass.java:3: error: package org.apache.flink.api.java does not exist


Thanks a lot,

Giacomo

Re: Compile flink file with external jar

rmetzger0
Hi,

You have to make sure that the Flink classes are on your classpath.
Either add the flink-dist jar from the binary distribution to your classpath, or use Maven to build backend.jar as a fat jar.
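
For example, roughly along these lines (a sketch only: the flink-dist jar name and path are placeholders for wherever the binary distribution is unpacked, and GeneratedClass is assumed to define a main method):

  • javac -cp ".:backend.jar:/path/to/flink/lib/flink-dist-<version>.jar" GeneratedClass.java
  • jar cf generated-job.jar GeneratedClass*.class
  • ./bin/flink run -c GeneratedClass generated-job.jar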

Why are you generating a java class from your dataflows?
Isn't it easier to just call the Flink APIs when parsing your flow description language?

Robert


Re: Compile flink file with external jar

Giacomo Licari
Hi Robert,
Thanks for the reply.
I receive a JSON document from my client interface which contains the dataflow description.
I then parse that JSON, and the parser produces a string containing the Flink code. Since the user can modify the dataflow, the description can change every time "Execute Dataflow" is called.

Now I need to execute the generated code string, so my first thought was to write it out to a .java file.
What would you suggest? Is it possible to execute the code string directly from the backend.jar I have?


Re: Compile flink file with external jar

Fabian Hueske-2

Hi Giacomo,

You could use Janino [1] to compile the code string directly into a class and execute it. The program does not need to be shipped to the cluster as long as all user functions are contained in the jar.

Cheers, Fabian

[1] http://unkrig.de/w/Janino
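
For reference, a minimal sketch of this approach with Janino's SimpleCompiler (the wrapper class, the generated class name "GeneratedJob", and the assumption that the generated source defines a main method are illustrative placeholders, not taken from this thread):

    import org.codehaus.janino.SimpleCompiler;

    public class GeneratedCodeRunner {

        // Compiles the generated Java source string in memory and invokes its main method.
        public static void run(String javaSource) throws Exception {
            SimpleCompiler compiler = new SimpleCompiler();
            // Use the backend's class loader so the Flink classes packed into
            // backend.jar (MapFunction, DataSet, ...) are visible to the generated code.
            compiler.setParentClassLoader(GeneratedCodeRunner.class.getClassLoader());
            compiler.cook(javaSource);

            // Load the compiled class and run its main method, which is expected
            // to build and execute the Flink program.
            Class<?> jobClass = compiler.getClassLoader().loadClass("GeneratedJob");
            jobClass.getMethod("main", String[].class)
                    .invoke(null, (Object) new String[0]);
        }
    }

This keeps everything in memory, so there is no need to write the generated string out to a .java file first.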

Re: Compile flink file with external jar

Giacomo Licari
Thanks a lot, Fabian,
I will try it.

Cheers,
Giacomo

Re: Compile flink file with external jar

Giacomo Licari
Hi guys,
Janino works perfectly!

It allows you to create entire classes on the fly and use them.

Thank you very much.

Cheers,
Giacomo
