
DotNet - Confluent.SchemaRegistry.Serdes.AvroDeserializer throws “Confluent.Kafka.ConsumeException” exception.

See original GitHub issue

Description

When creating a schema (registered in the Schema Registry) that contains an array sub-schema, and using a C# keyword in the schema namespace, we are unable to deserialize the Kafka message using AvroDeserializer<T>(…).

The producer successfully publishes the message using AvroSerializer<T>(…).

When consuming with Confluent.SchemaRegistry.Serdes.AvroDeserializer<T>, we receive the exception: Confluent.Kafka.ConsumeException: Local: Value deserialization error ---> Avro.AvroException: Unable to find type com.company.example.@event.SubObject in all loaded assemblies in field SubObjects.

How to reproduce

Create a schema, ensuring the namespace contains the word “event” (a C# keyword). Schema example:

    {
      "type": "record",
      "name": "NewConstructionAddressEvent",
      "namespace": "com.company.sub.event",
      "doc": "@author: Smith, @description: Avro Schema for an address",
      "fields": [
        {
          "name": "eventId",
          "type": { "type": "string", "avro.java.string": "String" },
          "doc": "@required: true, @description: unique id (UUID version 4 and variant 2) for an event, @examples: d15f36fe-ab1e-4d5c-9a04-a1827ac0c330"
        },
        {
          "name": "eventType",
          "type": { "type": "string", "avro.java.string": "String" },
          "doc": "@required: true, @description: operation type for event, @examples: created|updated|deleted"
        },
        {
          "name": "constructionAddressId",
          "type": { "type": "string", "avro.java.string": "String" },
          "doc": "@required: true, @description: unique nds id for a construction address object, @examples: 35051923"
        },
        {
          "name": "units",
          "type": [
            "null",
            {
              "type": "array",
              "items": {
                "type": "record",
                "name": "Unit",
                "fields": [
                  {
                    "name": "unitNumber",
                    "type": [ "null", { "type": "string", "avro.java.string": "String" } ],
                    "doc": "@required: false, @description: a specific unit number for an individual unit within a multi-dwelling unit, @examples: 1|101",
                    "default": null
                  },
                  {
                    "name": "type",
                    "type": [ "null", { "type": "string", "avro.java.string": "String" } ],
                    "doc": "@required: false, @description: the type of the unit, @examples: Apartment|Building",
                    "default": null
                  },
                  {
                    "name": "story",
                    "type": [ "null", { "type": "string", "avro.java.string": "String" } ],
                    "doc": "@required: false, @description: the story or floor number for the unit, @examples: 1|2|3",
                    "default": null
                  },
                  {
                    "name": "fiberCount",
                    "type": [ "null", { "type": "string", "avro.java.string": "String" } ],
                    "doc": "@required: false, @description: the number of fibers available at the unit, @examples: 1|4",
                    "default": null
                  }
                ]
              }
            }
          ],
          "doc": "@required: false, @description: unit numbers will be available for multi-dwelling unit - demand points, @examples: unit number details",
          "default": null
        },
        {
          "name": "constructionIndicator",
          "type": { "type": "string", "avro.java.string": "String" },
          "doc": "@required: true, @description: construction stages (yes means in construction stage and no means in completed stage), @examples: yes|no"
        }
      ]
    }

Generate the associated C# code files using avrogen.
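
For reference, running avrogen against the saved schema (e.g. avrogen -s NewConstructionAddressEvent.avsc . with Apache.Avro.Tools; the file name here is mine) emits classes shaped roughly like the following abbreviated sketch, not the verbatim output. Note that avrogen escapes the event namespace segment as @event because event is a C# keyword, while the CLR namespace it compiles to is still com.company.sub.event:

    // Abbreviated sketch of the avrogen output (hypothetical, not verbatim);
    // the generated ISpecificRecord members (Schema, Get, Put) are elided.
    namespace com.company.sub.@event
    {
        using System.Collections.Generic;

        public partial class Unit
        {
            public string unitNumber { get; set; }
            public string type { get; set; }
            public string story { get; set; }
            public string fiberCount { get; set; }
        }

        public partial class NewConstructionAddressEvent
        {
            public string eventId { get; set; }
            public string eventType { get; set; }
            public string constructionAddressId { get; set; }
            public IList<Unit> units { get; set; }
            public string constructionIndicator { get; set; }
        }
    }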

Now, execute the consumer code (after publishing):

    // Required namespaces (Confluent.Kafka 1.x):
    //   using Confluent.Kafka;
    //   using Confluent.Kafka.SyncOverAsync;
    //   using Confluent.SchemaRegistry;
    //   using Confluent.SchemaRegistry.Serdes;
    public void Consume()
    {
        var schemaRegistryConfig = new SchemaRegistryConfig
        {
            SchemaRegistryUrl = schemaRegistryUrl,
            // optional schema registry client properties:
            SchemaRegistryRequestTimeoutMs = schemaRegistryRequestTimeoutMs,
            SchemaRegistryMaxCachedSchemas = schemaRegistryMaxCachedSchemas
        };

        var consumerConfig = new ConsumerConfig
        {
            BootstrapServers = bootstrapServers,
            AutoOffsetReset = AutoOffsetReset.Latest,
            GroupId = groupID
        };

        using (var schemaRegistry = new CachedSchemaRegistryClient(schemaRegistryConfig))
        using (var consumer =
            new ConsumerBuilder<string, NewConstructionAddressEvent>(consumerConfig)
                // AvroDeserializer is async-only; AsSyncOverAsync() adapts it to the
                // synchronous IDeserializer<T> that the ConsumerBuilder expects.
                .SetKeyDeserializer(new AvroDeserializer<string>(schemaRegistry).AsSyncOverAsync())
                .SetValueDeserializer(new AvroDeserializer<NewConstructionAddressEvent>(schemaRegistry).AsSyncOverAsync())
                .SetErrorHandler((_, e) => logger.Error($"Error: {e.Reason}"))
                .Build())
        {
            try
            {
                logger.Info("Starting consumer.Subscribe.");
                consumer.Subscribe(topicName);

                while (true)
                {
                    try
                    {
                        logger.Info("Starting: consumer.Consume");
                        var consumeResult = consumer.Consume(Executor.ApplicationCancelToken.Token);

                        string k = consumeResult.Key;
                        logger.Info($"BusMessage: {consumeResult.Message}, constructionAddressId: {consumeResult.Value.constructionAddressId}");
                    }
                    catch (OperationCanceledException)
                    {
                        logger.Info("OperationCancelled for consumer.Consume");
                        break;
                    }
                    catch (ConsumeException e)
                    {
                        // The AvroException surfaces here as e.InnerException.
                        logger.Error(e, $"Consume error: {e.Error.Reason}");
                        break;
                    }
                }
            }
            catch (Exception ex)
            {
                logger.Error(ex, $"Consume error: {ex.Message}");
            }
            finally
            {
                consumer.Close();
            }
        }
    }

The consumer can read the Kafka message as a “GenericRecord” successfully, but consuming with the SpecificRecord deserializer shown below:

            new ConsumerBuilder<string, NewConstructionAddressEvent>(consumerConfig)
                .SetKeyDeserializer(new AvroDeserializer<string>(schemaRegistry).AsSyncOverAsync())
                .SetValueDeserializer(new AvroDeserializer<NewConstructionAddressEvent>(schemaRegistry).AsSyncOverAsync())
                .SetErrorHandler((_, e) => logger.Error($"Error: {e.Reason}"))
                .Build()

results in the following exception:

    Confluent.Kafka.ConsumeException: Local: Value deserialization error ---> Avro.AvroException:
    Unable to find type com.company.example.@event.units in all loaded assemblies in field emails
       at Avro.Specific.SpecificDefaultReader.ReadRecord(Object reuse, RecordSchema writerSchema, Schema readerSchema, Decoder dec)
       at Avro.Generic.DefaultReader.Read[T](T reuse, Decoder decoder)
       at Confluent.SchemaRegistry.Serdes.SpecificDeserializerImpl`1.Deserialize(String topic, Byte[] array)
       at Confluent.SchemaRegistry.Serdes.AvroDeserializer`1.DeserializeAsync(ReadOnlyMemory`1 data, Boolean isNull, SerializationContext context)
       at Confluent.Kafka.SyncOverAsync.SyncOverAsyncDeserializer`1.Deserialize(ReadOnlySpan`1 data, Boolean isNull, SerializationContext context)
       at Confluent.Kafka.Consumer`2.ConsumeImpl[K,V](Int32 millisecondsTimeout, IDeserializer`1 keyDeserializer, IDeserializer`1 valueDeserializer)
       --- End of inner exception stack trace ---
       at Confluent.Kafka.Consumer`2.ConsumeImpl[K,V](Int32 millisecondsTimeout, IDeserializer`1 keyDeserializer, IDeserializer`1 valueDeserializer)
       at Confluent.Kafka.Consumer`2.Consume(CancellationToken cancellationToken)
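
For contrast, a minimal sketch of the GenericRecord path that does work (a sketch rather than my exact code; it assumes the same schemaRegistryConfig/consumerConfig objects as above, and GenericRecord comes from the Avro.Generic namespace):

    using (var schemaRegistry = new CachedSchemaRegistryClient(schemaRegistryConfig))
    using (var consumer =
        new ConsumerBuilder<string, GenericRecord>(consumerConfig)
            .SetValueDeserializer(new AvroDeserializer<GenericRecord>(schemaRegistry).AsSyncOverAsync())
            .Build())
    {
        consumer.Subscribe(topicName);
        var result = consumer.Consume();
        // Fields are looked up by name, so no generated CLR types are involved
        // and the C# keyword in the namespace never matters:
        var constructionAddressId = (string)result.Value["constructionAddressId"];
    }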

I believe that, for .NET/C#, AvroDeserializer<T> fails whenever the schema namespace contains a C# keyword (e.g. event). Presumably avrogen escapes the keyword in the generated source as @event, while the actual CLR namespace contains no @, so the runtime type lookup using the @-escaped spelling (visible in the exception message above) never matches a loaded type.
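
A standalone sketch of the suspected mismatch (my assumption about the failure mode, not the library's actual lookup code):

    // The '@' in '@event' is only a C# source-level escape; the CLR namespace is
    // 'com.company.sub.event', so a lookup using the escaped spelling finds nothing.
    using System;

    namespace com.company.sub.@event { public class Unit { } }

    public static class KeywordNamespaceRepro
    {
        public static void Main()
        {
            // Succeeds: the CLR name of the namespace has no '@'.
            Console.WriteLine(Type.GetType("com.company.sub.event.Unit") != null);   // True
            // Fails: roughly the spelling the Avro runtime ends up searching for.
            Console.WriteLine(Type.GetType("com.company.sub.@event.Unit") != null);  // False
        }
    }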

NuGet Versions:

  • Confluent.Kafka = 1.1.0
  • Confluent.Kafka.Avro = 0.11.6
  • Confluent.SchemaRegistry = 1.1.0
  • Confluent.SchemaRegistry.Serdes = 1.1.0

Operating system/Client configuration: Windows, .NET Core 2.2, C#

I think the key to reproducing this issue is to create an Avro schema whose namespace includes the word “event”, and which includes a “sub-schema” that is an array of records (see the “Schema example” above). Then create a producer (using AvroSerializer, sketched below) and a consumer (using AvroDeserializer).
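
For completeness, the producer side looks roughly like this (a sketch rather than my exact code; it reuses the schemaRegistryConfig, bootstrapServers and topicName placeholders from the consumer snippet, and must run inside an async method):

    var producerConfig = new ProducerConfig { BootstrapServers = bootstrapServers };

    using (var schemaRegistry = new CachedSchemaRegistryClient(schemaRegistryConfig))
    using (var producer =
        new ProducerBuilder<string, NewConstructionAddressEvent>(producerConfig)
            .SetKeySerializer(new AvroSerializer<string>(schemaRegistry))
            .SetValueSerializer(new AvroSerializer<NewConstructionAddressEvent>(schemaRegistry))
            .Build())
    {
        var value = new NewConstructionAddressEvent
        {
            eventId = Guid.NewGuid().ToString(),
            eventType = "created",
            constructionAddressId = "35051923",
            constructionIndicator = "yes"
        };
        // AutoRegisterSchemas defaults to true, so the first call registers the schema.
        await producer.ProduceAsync(topicName,
            new Message<string, NewConstructionAddressEvent> { Key = value.eventId, Value = value });
    }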

Issue Analytics

  • State: open
  • Created: 4 years ago
  • Comments: 21 (7 by maintainers)

Top GitHub Comments

4 reactions
Rolice commented, Nov 26, 2019

Thanks for the information!

What I do not understand is the versioning between the two available sets. Does ConsumerBuilder come from 0.11.x? Is there a replacement?

I am experimenting now with .NET core and Kafka.

What I am trying to do now is something like:

.SetValueDeserializer(new Confluent.SchemaRegistry.Serdes.AvroDeserializer<OrderEvent<Order>>(schemaConfig))

The method SetValueDeserializer expects Confluent.Kafka.IDeserializer. Maybe I am doing something wrong, but I am unable to find up-to-date documentation and examples (especially about the transition from 0.11.x to 1.0.0).
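
For reference, in the 1.x API the ConsumerBuilder replaces the 0.11.x consumer constructors, AvroDeserializer takes a schema registry client rather than the config object, and AsSyncOverAsync() (from the Confluent.Kafka.SyncOverAsync namespace) bridges the async deserializer to the synchronous IDeserializer<T> that SetValueDeserializer expects. A sketch, assuming OrderEvent is a concrete avrogen-generated class:

    // Sketch for the Confluent.Kafka 1.x API (OrderEvent assumed to be an
    // avrogen-generated specific record, not a generic OrderEvent<Order>):
    using Confluent.Kafka;
    using Confluent.Kafka.SyncOverAsync;
    using Confluent.SchemaRegistry;
    using Confluent.SchemaRegistry.Serdes;

    // AvroDeserializer takes a schema registry *client*, not the config itself:
    var schemaRegistry = new CachedSchemaRegistryClient(schemaRegistryConfig);

    var consumer = new ConsumerBuilder<string, OrderEvent>(consumerConfig)
        // AsSyncOverAsync() turns the async AvroDeserializer into the
        // IDeserializer<T> that SetValueDeserializer expects:
        .SetValueDeserializer(new AvroDeserializer<OrderEvent>(schemaRegistry).AsSyncOverAsync())
        .Build();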

1 reaction
camtechnet commented, Aug 3, 2021

Hi @vkhose, can you please confirm that this is the issue raised against Apache.Avro as https://issues.apache.org/jira/browse/AVRO-2888, just to be able to keep an eye on it. Thanks

Read more comments on GitHub >

Top Results From Across the Web

  • Class AvroDeserializer<T> | Confluent.Kafka: An implementation of ISchemaRegistryClient used for communication with Confluent Schema Registry. …
  • C# confluent kafka problem with avro serialization: It seems schema registry needs a static field in your class … will work without your workaround, which just ignores the schema registry. …
  • Confluent.SchemaRegistry.Serdes.Avro 2.2.0: Provides an Avro Serializer and Deserializer for use with Confluent.Kafka with Confluent Schema Registry integration.
  • confluent-kafka-dotnet: The three “Serdes” packages provide serializers and deserializers for Avro, Protobuf and JSON with Confluent Schema Registry integration. …
  • Apache Kafka for .NET Developers - YouTube: This video will introduce the Schema Registry Client and show how it can be attached to a … #dotnet #csharp #apachekafka …
