Flink ML 1.0.0 - Saving and Loading Models to Score a Single Feature Vector


Sourigna Phetsarath
Flinksters,

Is there an example of saving a Trained Model, loading a Trained Model and then scoring one or more feature vectors using Flink ML?

All of the examples I've seen have shown only sequential fit and predict.

Thank you.

-Gna
--

Gna Phetsarath
System Architect // AOL Platforms // Data Services // Applied Research Chapter
770 Broadway, 5th Floor, New York, NY 10003
o: 212.402.4871 // m: 917.373.7363
vvmr: 8890237 
aim: sphetsarath20 t: @sourigna



Re: Flink ML 1.0.0 - Saving and Loading Models to Score a Single Feature Vector

Simone Robutti
To my knowledge there is nothing like that. PMML is not supported in any form and there's no custom saving format yet. If you really need a quick and dirty solution, it's not that hard to serialize the model into a file. 





Re: Flink ML 1.0.0 - Saving and Loading Models to Score a Single Feature Vector

Till Rohrmann

Hi Gna,

There are no utilities for that yet, but you can do it manually. In the end, a model is simply a Flink DataSet which you can serialize to some file. Upon reading this DataSet back, you simply have to give it to your algorithm to be used as the model. The following code snippet illustrates this approach:

mlr.fit(inputDS, parameters)

// write model to disk using the SerializedOutputFormat
mlr.weightsOption.get.write(new SerializedOutputFormat[WeightVector], "path")

// read the serialized model from disk
val model = env.readFile(new SerializedInputFormat[WeightVector], "path")

// set the read model for the MLR algorithm
mlr.weightsOption = Some(model)
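
The original question also asked about scoring. As a rough sketch that is not from the thread, assuming the restored model above, an ExecutionEnvironment env and the Scala API imports, with made-up feature values:

import org.apache.flink.api.scala._
import org.apache.flink.ml.math.DenseVector

// wrap the single feature vector in a one-element DataSet
val singleVector = env.fromElements(DenseVector(1.0, 2.0, 3.0))

// predict pairs each input vector with its predicted value
val prediction = mlr.predict(singleVector)
prediction.print()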

Cheers,
Till







Re: Flink ML 1.0.0 - Saving and Loading Models to Score a Single Feature Vector

Sourigna Phetsarath
Till,

Thank you for your reply.  

Having this issue though: WeightVector does not extend IOReadableWritable:

public class SerializedOutputFormat<T extends IOReadableWritable> 

case class WeightVector(weights: Vector, intercept: Double) extends Serializable {}


However, I will use the approach to write out the weights as text.
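
A minimal sketch of that plan, not from the thread; the fitted mlr instance, the ExecutionEnvironment env and the file name are placeholders:

import org.apache.flink.api.scala._
import org.apache.flink.ml.common.WeightVector
import org.apache.flink.ml.math.DenseVector

// write: one line per weight vector, formatted as "intercept,w0,w1,..."
mlr.weightsOption.get
  .map { wv =>
    val ws = (0 until wv.weights.size).map(i => wv.weights(i))
    (wv.intercept +: ws).mkString(",")
  }
  .writeAsText("weights.txt")
env.execute("write weights as text")

// read: parse each line back into a WeightVector
val restored = env.readTextFile("weights.txt").map { line =>
  val values = line.split(",").map(_.toDouble)
  WeightVector(DenseVector(values.tail), values.head)
}
mlr.weightsOption = Some(restored)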



--

Gna Phetsarath
System Architect // AOL Platforms // Data Services // Applied Research Chapter
770 Broadway, 5th Floor, New York, NY 10003
o: 212.402.4871 // m: 917.373.7363
vvmr: 8890237 
aim: sphetsarath20 t: @sourigna



Re: Flink ML 1.0.0 - Saving and Loading Models to Score a Single Feature Vector

Suneel Marthi
You may want to use the FlinkMLTools.persist() methods, which use the TypeSerializer input/output formats and don't enforce IOReadableWritable.
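
A rough sketch of this suggestion, not from the thread; it assumes FlinkMLTools.persist(dataSet, path) writes the DataSet with a TypeSerializer-based format and returns a DataSet backed by the written files, and the mlr instance and the path are placeholders:

import org.apache.flink.api.scala._
import org.apache.flink.ml.common.{FlinkMLTools, WeightVector}

val weights: DataSet[WeightVector] = mlr.weightsOption.get

// persist the model DataSet to disk and get back a DataSet that reads from that path
val persistedWeights = FlinkMLTools.persist(weights, "hdfs:///models/mlr-weights")

// the persisted weights can be plugged straight back into the predictor
mlr.weightsOption = Some(persistedWeights)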







Re: Flink ML 1.0.0 - Saving and Loading Models to Score a Single Feature Vector

Till Rohrmann

Yes, Suneel is completely right. If the data does not implement IOReadableWritable, it is probably easier to use the TypeSerializerOutputFormat. What you need here to serialize the data is a TypeSerializer. You can obtain it the following way:

val model = mlr.weightsOption.get

val weightVectorTypeInfo = TypeInformation.of(classOf[WeightVector])
val weightVectorSerializer = weightVectorTypeInfo.createSerializer(new ExecutionConfig())
val outputFormat = new TypeSerializerOutputFormat[WeightVector]
outputFormat.setSerializer(weightVectorSerializer)

model.write(outputFormat, "path")
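
For the read side, a rough sketch that is not part of the mail above, reusing the same weightVectorTypeInfo and assuming an ExecutionEnvironment env:

val inputFormat = new TypeSerializerInputFormat[WeightVector](weightVectorTypeInfo)
val restoredModel = env.readFile(inputFormat, "path")
mlr.weightsOption = Some(restoredModel)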

Cheers,
Till







Re: Flink ML 1.0.0 - Saving and Loading Models to Score a Single Feature Vector

Behrouz Derakhshan
Is there a reason the Predictor or Estimator classes don't have read and write methods for saving and retrieving the model? I couldn't find a Jira issue for it. Does it make sense to create one?

BR,
Behrouz







Re: Flink ML 1.0.0 - Saving and Loading Models to Score a Single Feature Vector

Trevor Grant
I'm just about to open an issue / PR solution for 'warm-starts'.

Once this is in, we could just add a setter for the weight vector (and whatever iteration you're on if you're going to do more partial fits).

Then all you need to save is your weight vector (and iteration number).



Trevor Grant
Data Scientist

"Fortunate is he, who is able to know the causes of things."  -Virgil









Re: Flink ML 1.0.0 - Saving and Loading Models to Score a Single Feature Vector

KirstiLaurila
How should this be done for the recommendation engine (that is, ALS; example here: https://ci.apache.org/projects/flink/flink-docs-release-1.0/apis/batch/libs/ml/als.html)?

 I am able to run the example with my example data but cannot get anything written to any file (user or item matrices).

Basically, I have tried something like this

...
 als.fit(inputTraining, parameters)

 val userMatrix = als.factorsOption.get._1
 
// this works perfectly
 userMatrix.first(5).print

// this doesn't do anything: no message in the log, no error, and no file written
userMatrix.writeAsText("path")


I also tried to apply a similar approach to this:

val model = mlr.weightsOption.get

val weightVectorTypeInfo = TypeInformation.of(classOf[WeightVector])
val weightVectorSerializer = weightVectorTypeInfo.createSerializer(new ExecutionConfig())
val outputFormat = new TypeSerializerOutputFormat[WeightVector]
outputFormat.setSerializer(weightVectorSerializer)

model.write(outputFormat, "path")

but with no success. Could someone help me with this to get my model saved?


Best,
Kirsti



Re: Flink ML 1.0.0 - Saving and Loading Models to Score a Single Feature Vector

Till Rohrmann

Hi Kirsti,

I think you attached some images to your mail which show the code. Unfortunately, this is not supported by the mailing list, so maybe you could resend what you've already tried as plain text.

In order to access the ALS model, you can do the following:

val als = ALS()

als.fit(input)

val (userFactorsOpt, itemFactorsOpt) = als.factorsOption

val factorsTypeInfo = TypeInformation.of(classOf[Factors])
val factorsSerializer = factorsTypeInfo.createSerializer(new ExecutionConfig())
val outputFormat = new TypeSerializerOutputFormat[Factors]

userFactorsOpt match {
    case Some(userFactors) => userFactors.write(outputFormat, "user_path")
    case None =>
}

itemFactorsOpt match {
    case Some(itemFactors) => itemFactors.write(outputFormat, "item_path")
    case None =>
}

Cheers,
Till




Re: Flink ML 1.0.0 - Saving and Loading Models to Score a Single Feature Vector

KirstiLaurila
Hi,

Those parts were examples of what I had tried. I tried your suggestions, but still no success. Additionally, there were some problems:


val (userFactorsOpt, itemFactorsOpt) = als.factorsOption

If I had just this, userFactorsOpt and itemFactorsOpt did not have a write method, so I added get there, i.e.

val (userFactorsOpt, itemFactorsOpt) = als.factorsOption.get


val factorsTypeInfo = TypeInformation.of(classOf[Factors])
val factorsSerializer = factorsTypeInfo.createSerializer(new ExecutionConfig())
val outputFormat = new TypeSerializerOutputFormat[Factors]


Here, factorsSerializer was not used at all, so I guess this line was missing:

    outputFormat.setSerializer(factorsSerializer)


userFactorsOpt match {
    case Some(userFactors) => userFactors.write(outputFormat, "user_path")
    case None =>
}


This doesn't compile because of the following error:

Error:(71, 12) constructor cannot be instantiated to expected type;
 found   : Some[A]
 required: org.apache.flink.api.scala.DataSet[org.apache.flink.ml.recommendation.ALS.Factors]
      case Some(userFactors) => userFactorsOpt.write(outputFormat, "path_to_my_file")

However, I also tried it without the match case, i.e.

    userFactorsOpt.write(outputFormat, "path")
   
but nothing was written anywhere.


Re: Flink ML 1.0.0 - Saving and Loading Models to Score a Single Feature Vector

Till Rohrmann

Sorry, I had a mistake in my example code. I thought the model would be stored as (Option[DataSet[Factors]], Option[DataSet[Factors]]), but instead it's stored as Option[(DataSet[Factors], DataSet[Factors])].

So the code should be

val als = ALS()

als.fit(input)

val alsModelOpt = als.factorsOption

val factorsTypeInfo = TypeInformation.of(classOf[Factors])
val factorsSerializer = factorsTypeInfo.createSerializer(new ExecutionConfig())
val outputFormat = new TypeSerializerOutputFormat[Factors]
outputFormat.setSerializer(factorsSerializer)

alsModelOpt match {
    case Some((userFactors, itemFactors)) => 
        userFactors.write(outputFormat, "user_path")
        itemFactors.write(outputFormat, "item_path")
    case None =>
}

if I’m not mistaken.

If you don’t see any output, then it might be the case that your model is empty. Could you check that? You could for example simply call print on the model DataSet.

Do you call env.execute at the end of your program? If you don’t do that, then the job is not executed.
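
A tiny sketch of that point, not from the mail itself, with a made-up job name and the factors and outputFormat from the snippet above:

// write(...) only declares sinks; nothing hits disk until the job runs
userFactors.write(outputFormat, "user_path")
itemFactors.write(outputFormat, "item_path")

// print() executes eagerly, which is why it showed results without this call
env.execute("persist ALS factors")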

Cheers,
Till




Re: Flink ML 1.0.0 - Saving and Loading Models to Score a Single Feature Vector

KirstiLaurila
Now I got this working in the cloud (not locally, but that's OK), so thanks a lot. The next problem is how to read these written files back and add them to the ALS.

I guess it is something like

   val als = ALS()
   als.factorsOption = Option(users,items)

but I don't see how I could read in the data I have written with the previous example. I tried with:

    val users  = env.readFile(new SerializedInputFormat[Factors], "path")

but I guess I need to somehow use TypeSerializerInputFormat[Factors], though I couldn't get this working.

Best,
Kirsti

Re: Flink ML 1.0.0 - Saving and Loading Models to Score a Single Feature Vector

KirstiLaurila
Answering myself in case someone has similar problems: already saved matrices can be read and used in ALS like this:

   
    // Set up the ALS learner
    val als = ALS()

    val users = env.readFile(new TypeSerializerInputFormat[Factors](createTypeInformation[Factors]), "user_path")
    val items = env.readFile(new TypeSerializerInputFormat[Factors](createTypeInformation[Factors]), "item_path")

    als.factorsOption = Option((users, items))

After this, one can use als for prediction.
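
With the factors restored, a short usage sketch that is not from the thread, with made-up user/item ids, assuming the Scala API imports and the standard ALS batch prediction on (user, item) pairs:

    // score a single (userId, itemId) pair with the restored factors
    val testPairs = env.fromElements((1, 42))
    val predictedRatings = als.predict(testPairs) // DataSet[(Int, Int, Double)]
    predictedRatings.print()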