First of all, to give some background on the deprecation: we no longer want to
depend on TypeInformation for the types in the Table API, because it couples
the on-wire representation with the logical types of the SQL API. The goal is
to use DataType exclusively in the Table API (including for the Schema).
Moreover, we aim to make the SQL types the source of truth. That's why it's
usually the TableSchema that gets translated into the Avro schema, rather
than the other direction.
Nevertheless, I do see a use for deriving a DDL statement from an Avro
schema. I created a ticket for it:
https://issues.apache.org/jira/browse/FLINK-18158

What you could do for now is leverage
org.apache.flink.formats.avro.typeutils.AvroSchemaConverter#convertToSchema(org.apache.flink.table.types.logical.LogicalType)
from the master branch to write the reverse operation, from Schema to DataType.
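Just to illustrate the kind of reverse mapping I mean: below is a minimal,
self-contained sketch that maps Avro primitive type names to SQL type names.
This is not Flink code; the class name AvroToSqlTypeSketch and the method
toSqlType are hypothetical. A real implementation would walk an
org.apache.avro.Schema recursively (records, arrays, maps, unions, logical
types) and build Flink DataTypes instead of strings.

```java
import java.util.Map;

// Hypothetical sketch (not part of Flink): maps Avro primitive type names
// to their SQL type counterparts, to show the shape of a Schema -> DataType
// converter. Complex types and nullability (Avro unions with "null") are
// intentionally out of scope here.
public class AvroToSqlTypeSketch {

    private static final Map<String, String> PRIMITIVES = Map.of(
            "string", "STRING",
            "int", "INT",
            "long", "BIGINT",
            "float", "FLOAT",
            "double", "DOUBLE",
            "boolean", "BOOLEAN",
            "bytes", "BYTES");

    public static String toSqlType(String avroType) {
        String sql = PRIMITIVES.get(avroType);
        if (sql == null) {
            throw new IllegalArgumentException("Unsupported Avro type: " + avroType);
        }
        return sql;
    }

    public static void main(String[] args) {
        System.out.println(toSqlType("long"));   // prints BIGINT
        System.out.println(toSqlType("string")); // prints STRING
    }
}
```

In a real converter you would keep this mapping symmetric with what
AvroSchemaConverter#convertToSchema produces, so that a round trip
DataType -> Avro schema -> DataType is lossless for the supported types.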
Just as an additional comment: AvroSchemaConverter is an internal class
without any stability guarantees for its methods.
Best
Dawid
On 04/06/2020 14:53, Ramana Uppala wrote:
> Hi,
>
> In Flink 1.9, we have the option to create the TableSchema from TypeInformation. We have used the below.
>
> TypeInformation typeInfo = AvroSchemaConverter.convertToTypeInfo(schema);
> TableSchema tableSchema = TableSchema.fromTypeInfo(typeInfo);
>
> However, TableSchema's fromTypeInfo method is deprecated in Flink 1.10. Is there any other utility to create a TableSchema from an Avro schema in Flink 1.10?