Re: problem with build from source flink 1.11

Re: problem with build from source flink 1.11

Felipe Lolas
Seems fixed!

I was only replacing flink-dist.jar. Replacing all of the compiled jars
from flink-1.11.0-bin fixed the issue.
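For anyone hitting the same thing, a rough sketch of the kind of copy involved (the paths are assumptions on my part; adjust them to your checkout and to wherever the cluster's Flink distribution lives):

# The output of `mvn clean install` ends up under flink-dist/target (assumed layout):
BUILT_DIST=flink/flink-dist/target/flink-1.11.0-bin/flink-1.11.0
DEPLOYED=/opt/flink-1.11.0   # hypothetical install directory used by the cluster

# Copy *all* jars from lib/ (and plugins/ or opt/ if you changed anything there),
# not only flink-dist_2.11-1.11.0.jar:
cp "$BUILT_DIST"/lib/*.jar "$DEPLOYED"/lib/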

Thanks!

On July 27, 2020 at 4:28, Felipe Lolas <[hidden email]> wrote:

Hi!! Timo and Chesnay:

Thanks for helping!!!

Here is the full stack trace:
2020-07-27 05:27:38,661 INFO  org.apache.flink.runtime.executiongraph.ExecutionGraph       [] - Job insert-into_default_catalog.default_database.print_table (ca40bd10a729f5cad56a7db6bef17a6f) switched from state FAILING to FAILED.
org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:116) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:78) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:192) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:185) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:179) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:503) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:386) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_112]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_112]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:284) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:199) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26) [flink-dist_2.11-1.11.0.jar:1.11.0]
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21) [flink-dist_2.11-1.11.0.jar:1.11.0]
at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123) [flink-dist_2.11-1.11.0.jar:1.11.0]
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21) [flink-dist_2.11-1.11.0.jar:1.11.0]
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170) [flink-dist_2.11-1.11.0.jar:1.11.0]
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171) [flink-dist_2.11-1.11.0.jar:1.11.0]
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171) [flink-dist_2.11-1.11.0.jar:1.11.0]
at akka.actor.Actor$class.aroundReceive(Actor.scala:517) [flink-dist_2.11-1.11.0.jar:1.11.0]
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225) [flink-dist_2.11-1.11.0.jar:1.11.0]
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592) [flink-dist_2.11-1.11.0.jar:1.11.0]
at akka.actor.ActorCell.invoke(ActorCell.scala:561) [flink-dist_2.11-1.11.0.jar:1.11.0]
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258) [flink-dist_2.11-1.11.0.jar:1.11.0]
at akka.dispatch.Mailbox.run(Mailbox.scala:225) [flink-dist_2.11-1.11.0.jar:1.11.0]
at akka.dispatch.Mailbox.exec(Mailbox.scala:235) [flink-dist_2.11-1.11.0.jar:1.11.0]
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) [flink-dist_2.11-1.11.0.jar:1.11.0]
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339) [flink-dist_2.11-1.11.0.jar:1.11.0]
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) [flink-dist_2.11-1.11.0.jar:1.11.0]
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) [flink-dist_2.11-1.11.0.jar:1.11.0]
Caused by: java.lang.NoSuchMethodError: java.nio.ByteBuffer.position(I)Ljava/nio/ByteBuffer;
at org.apache.flink.core.memory.DataOutputSerializer.wrapAsByteBuffer(DataOutputSerializer.java:65) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
at org.apache.flink.runtime.io.network.api.serialization.SpanningRecordSerializer.<init>(SpanningRecordSerializer.java:50) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
at org.apache.flink.runtime.io.network.api.writer.RecordWriter.<init>(RecordWriter.java:98) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
at org.apache.flink.runtime.io.network.api.writer.ChannelSelectorRecordWriter.<init>(ChannelSelectorRecordWriter.java:50) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
at org.apache.flink.runtime.io.network.api.writer.RecordWriterBuilder.build(RecordWriterBuilder.java:53) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
at org.apache.flink.streaming.runtime.tasks.StreamTask.createRecordWriter(StreamTask.java:1159) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
at org.apache.flink.streaming.runtime.tasks.StreamTask.createRecordWriters(StreamTask.java:1124) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
at org.apache.flink.streaming.runtime.tasks.StreamTask.createRecordWriterDelegate(StreamTask.java:1102) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
at org.apache.flink.streaming.runtime.tasks.StreamTask.<init>(StreamTask.java:278) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
at org.apache.flink.streaming.runtime.tasks.StreamTask.<init>(StreamTask.java:265) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
at org.apache.flink.streaming.runtime.tasks.SourceStreamTask.<init>(SourceStreamTask.java:72) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
at org.apache.flink.streaming.runtime.tasks.SourceStreamTask.<init>(SourceStreamTask.java:68) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_112]
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_112]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_112]
at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_112]
at org.apache.flink.runtime.taskmanager.Task.loadAndInstantiateInvokable(Task.java:1372) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:699) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:546) ~[flink-dist_2.11-1.11.0.jar:1.11.0]
at java.lang.Thread.run(Thread.java:745) ~[?:1.8.0_112]

Maven
Apache Maven 3.2.5 (12a6b3acb947671f09b81f49094c53f426d8cea1; 2014-12-14T14:29:23-03:00)
Maven home: /Users/flolas/Downloads/apache-maven-3.2.5
Java version: 1.8.0_202, vendor: Oracle Corporation
Java home: /Library/Java/JavaVirtualMachines/jdk1.8.0_202.jdk/Contents/Home/jre
Default locale: es_CL, platform encoding: US-ASCII
OS name: "mac os x", version: "10.14.5", arch: "x86_64", family: "mac"

This happens only with my self-built dist. Using the 1.11 release from the Apache mirror works fine.
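If it helps with narrowing it down, one thing that can be compared between the two dists is which ByteBuffer.position descriptor the bytecode of the failing class actually references (just a sketch; ./my-build and ./release are placeholders for the two unpacked distributions, and javap from any JDK should do):

javap -c -p -classpath ./my-build/lib/flink-dist_2.11-1.11.0.jar \
    org.apache.flink.core.memory.DataOutputSerializer | grep position
javap -c -p -classpath ./release/lib/flink-dist_2.11-1.11.0.jar \
    org.apache.flink.core.memory.DataOutputSerializer | grep position

A call site against java/nio/ByteBuffer.position:(I)Ljava/nio/ByteBuffer; means the class was compiled against JDK 9+ APIs and will fail with exactly this NoSuchMethodError on a JDK 8 runtime, while JDK-8-compiled code references java/nio/Buffer.position instead.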



Thanks!!

Cheers,
Felipe L.


On July 27, 2020 at 2:34, Chesnay Schepler <[hidden email]> wrote:

@Timo Maven 3.2.5 is the recommended Maven version for building Flink.

@Felipe Can you provide us with the full stack trace? This could be a library
issue with regard to JDK compatibility.

On 27/07/2020 15:23, Timo Walther wrote:
Hi Felipe,

are you sure that Maven and the TaskManagers are using the JDK version
that you mentioned?

Usually, a `mvn clean install` in the `.../flink/` directory should
succeed without any problems. Also, your Maven version seems pretty
old; I'm using Apache Maven 3.6.3, for example.

The NoSuchMethodError indicates that there is some version mismatch.
It seems that this version mismatch is related to your JDK version.
Maybe your task managers run a different version?
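For example, a quick way to compare both sides is the standard version checks, run where each process actually executes (build machine vs. the YARN/TaskManager hosts):

# on the machine where you build:
mvn -version        # prints the JDK that Maven itself runs on (see the "Java version" line)
# on the cluster nodes / container environment:
java -version
echo $JAVA_HOME

If those report different major versions, a NoSuchMethodError like this one is a plausible outcome.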

Let me know if this helped.

Regards,
Timo


On 27.07.20 12:09, Felipe Lolas wrote:
Hi,

I'm Felipe; I just started learning Flink a few weeks ago (I'm moving Spark
Streaming workloads).

I'm currently testing some changes in flink-yarn, but when using
my self-built flink-dist.jar, the job on the TaskManager fails because of:
java.lang.NoSuchMethodError:
java.nio.ByteBuffer.position(I)Ljava/nio/ByteBuffer;

*env (build)*
flink-1.11.0
Maven 3.2.5
JDK 1.8
macOS

*command (run in flink-parent)*

mvn clean install -DskipTests -Dfast

*env (runtime)*
YARN application mode
CDH 6.2.1

Can anyone help me?

Thank you!
Cheers,
Felipe L




Re: problem with build from source flink 1.11

Timo Walther
Great to hear. Thanks for letting us know.

Regards,
Timo

On 27.07.20 17:58, Felipe Lolas wrote:

> Seems fixed!
>
> I was only replacing flink-dist.jar. Replacing all of the compiled
> jars from flink-1.11.0-bin fixed the issue.
>
> Thanks!