Flink consume Kafka with schema registry


Flink consume Kafka with schema registry

Lijun Ye
Hi,

I have run into a problem: the data in Kafka is Avro-formatted and produced
with a schema registry server.
I found that it is not easy to consume this topic, since the provided Kafka
source does not support this format, and I do not want to write a new Kafka
source. Is there any way to use the provided Kafka source to consume a topic
whose records are Avro-encoded against a schema registry?

Thanks

Re: Flink consume Kafka with schema registry

朱广彬
I have had the same problem these days.

I ended up writing a custom Avro serialization/deserialization schema to
support the schema registry.

The root cause is that, at serialization time, an Avro record written with
the schema registry differs from an "original" Avro record written without
it: the former prepends a 5-byte header to the actual record bytes, made up
of 1 magic byte and a 4-byte schema id, which is the unique id registered in
the Kafka schema registry.

I think Apache Flink should consider this case and support both plain Avro
and schema-registry-formatted Avro.

Any plan for this?
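
For reference, the sketch below shows the rough shape of such a registry-aware
deserialization schema. The class name and the fixed reader schema are only
illustrative; a complete implementation would look the writer schema up in the
registry using the id taken from the header.

import java.io.IOException;
import java.nio.ByteBuffer;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.flink.api.common.serialization.DeserializationSchema;
import org.apache.flink.api.common.typeinfo.TypeInformation;

// Illustrative sketch: strips the 5-byte registry header (1 magic byte +
// 4-byte schema id) and decodes the remaining bytes as Avro with a fixed
// reader schema.
public class RegistryAwareAvroDeserializationSchema
        implements DeserializationSchema<GenericRecord> {

    private final String readerSchemaString;
    private transient GenericDatumReader<GenericRecord> datumReader;

    public RegistryAwareAvroDeserializationSchema(String readerSchemaString) {
        this.readerSchemaString = readerSchemaString;
    }

    @Override
    public GenericRecord deserialize(byte[] message) throws IOException {
        if (datumReader == null) {
            Schema readerSchema = new Schema.Parser().parse(readerSchemaString);
            datumReader = new GenericDatumReader<>(readerSchema);
        }
        ByteBuffer buffer = ByteBuffer.wrap(message);
        byte magic = buffer.get();       // magic byte, always 0x0
        int schemaId = buffer.getInt();  // unique id registered in the schema registry
        // Skip the 5-byte header and decode the remaining record bytes.
        BinaryDecoder decoder = DecoderFactory.get()
                .binaryDecoder(message, 5, message.length - 5, null);
        return datumReader.read(null, decoder);
    }

    @Override
    public boolean isEndOfStream(GenericRecord nextElement) {
        return false;
    }

    @Override
    public TypeInformation<GenericRecord> getProducedType() {
        return TypeInformation.of(GenericRecord.class);
    }
}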

Re: Flink consume Kafka with schema registry

Lijun Ye
Hi,

Couldn't agree more; it would be great if this were supported, because we need it.

Re: Flink consume Kafka with schema registry

Lijun Ye
Hi,

Try this:
https://ci.apache.org/projects/flink/flink-docs-release-1.8/dev/connectors/kafka.html
I have found that it contains a schema registry part.
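
The schema registry part mentioned there appears to be
ConfluentRegistryAvroDeserializationSchema from the flink-avro-confluent-registry
module, which takes care of the 5-byte header and the schema lookup. A rough
sketch of wiring it into the provided Kafka source (the topic name, addresses,
and reader schema below are placeholders):

import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.flink.formats.avro.registry.confluent.ConfluentRegistryAvroDeserializationSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

public class ReadRegistryAvroTopic {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Reader schema for the topic; in practice this matches the schema
        // registered in the schema registry.
        Schema readerSchema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"Event\",\"fields\":["
                        + "{\"name\":\"id\",\"type\":\"string\"}]}");

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // placeholder
        props.setProperty("group.id", "flink-consumer");          // placeholder

        // The deserializer reads the 5-byte header and resolves writer
        // schemas against the registry at the given URL.
        FlinkKafkaConsumer<GenericRecord> consumer = new FlinkKafkaConsumer<>(
                "my-topic",                                        // placeholder topic
                ConfluentRegistryAvroDeserializationSchema.forGeneric(
                        readerSchema, "http://localhost:8081"),    // placeholder registry URL
                props);

        env.addSource(consumer).print();
        env.execute("Consume Avro with schema registry");
    }
}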
