StackOverflow Exception - Specific Avro Messages - Apache Avro 1.10.0
Description
When upgrading from Confluent.Kafka 1.3.0 to 1.5.0, we also upgraded our separate Avro message projects from Confluent.Kafka.Avro (1.7.7.7) to Apache.Avro 1.10.0. At the time I noted that the minimum version of the dependency should be 1.9.2, but went with the latest here. This has been fine so far, but for this specific message I noticed StackOverflow exceptions in the consumer.
For a specific Avro type we have:
{
  "namespace": "[namespace]",
  "type": "record",
  "doc": "[doc]",
  "name": "[message]",
  "fields": [
    { "name": "Field1", "type": "string" },
    { "name": "Field2", "type": "string" },
    { "name": "Field3", "type": "string" }
  ]
}
One thing to note is that the strings in the fields will be quite large. I can try to get exact sizes if that helps.
When specifically targeting Apache.Avro 1.10.0, the following happens:
Avro.dll!Avro.IO.BinaryDecoder.ReadString() Unknown
Avro.dll!Avro.Generic.DefaultReader.Read<System.__Canon>(Avro.Schema.Type tag, Avro.Schema readerSchema, Avro.Generic.Reader<System.__Canon> reader) Unknown
Avro.dll!Avro.Generic.DefaultReader.Read(object reuse, Avro.Schema writerSchema, Avro.Schema readerSchema, Avro.IO.Decoder d) Unknown
Avro.dll!Avro.Specific.SpecificDefaultReader.ReadRecord(object reuse, Avro.RecordSchema writerSchema, Avro.Schema readerSchema, Avro.IO.Decoder dec) Unknown
Avro.dll!Avro.Generic.DefaultReader.Read(object reuse, Avro.Schema writerSchema, Avro.Schema readerSchema, Avro.IO.Decoder d) Unknown
Avro.dll!Avro.Generic.DefaultReader.Read<>( reuse, Avro.IO.Decoder decoder) Unknown
Avro.dll!Avro.Specific.SpecificReader<>.Read( reuse, Avro.IO.Decoder dec) Unknown
Confluent.SchemaRegistry.Serdes.Avro.dll!Confluent.SchemaRegistry.Serdes.SpecificDeserializerImpl<>.Deserialize(string topic, byte[] array) Unknown
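To take the Kafka layer out of the picture, a pure Apache.Avro round trip along these lines should exercise the same SpecificDatumReader/BinaryDecoder path. This is only a sketch: LargeStringMessage stands in for the avrogen-generated class for the (redacted) schema above, and the string sizes are guesses.

using System;
using System.IO;
using Avro.IO;
using Avro.Specific;

class RoundTripRepro
{
    static void Main()
    {
        // LargeStringMessage is a placeholder for the avrogen-generated specific class;
        // _SCHEMA is the static schema field avrogen emits on generated classes.
        var schema = LargeStringMessage._SCHEMA;

        var message = new LargeStringMessage
        {
            Field1 = new string('a', 10_000_000), // sizes are guesses, not from the real payload
            Field2 = new string('b', 10_000_000),
            Field3 = new string('c', 10_000_000)
        };

        using var stream = new MemoryStream();
        new SpecificDatumWriter<LargeStringMessage>(schema)
            .Write(message, new BinaryEncoder(stream));

        stream.Position = 0;
        var reader = new SpecificDatumReader<LargeStringMessage>(schema, schema);
        // With Apache.Avro 1.10.0, this read is where the StackOverflow exception reported
        // above would be expected to surface (BinaryDecoder.ReadString at the top of the stack).
        var roundTripped = reader.Read(null, new BinaryDecoder(stream));

        Console.WriteLine(roundTripped.Field1.Length);
    }
}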
How to reproduce
- Create a consumer with default settings. I used the Apache Avro example from the repo as my basis (a sketch along those lines follows this list).
- Create an Avro message with large string fields.
- Reference Apache.Avro 1.10.0 rather than 1.9.2.
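For reference, a minimal consumer sketch along the lines of that example. The class name LargeStringMessage, the topic name, and the broker/Schema Registry addresses are all assumptions, since the real schema is redacted above.

using System;
using System.Threading;
using Confluent.Kafka;
using Confluent.Kafka.SyncOverAsync;
using Confluent.SchemaRegistry;
using Confluent.SchemaRegistry.Serdes;

class ConsumerRepro
{
    static void Main()
    {
        var consumerConfig = new ConsumerConfig
        {
            BootstrapServers = "localhost:9092",   // assumed
            GroupId = "large-string-repro",        // assumed
            AutoOffsetReset = AutoOffsetReset.Earliest
        };
        var schemaRegistryConfig = new SchemaRegistryConfig { Url = "localhost:8081" }; // assumed

        using var schemaRegistry = new CachedSchemaRegistryClient(schemaRegistryConfig);
        // LargeStringMessage is a placeholder for the avrogen-generated specific class.
        using var consumer = new ConsumerBuilder<Ignore, LargeStringMessage>(consumerConfig)
            .SetValueDeserializer(new AvroDeserializer<LargeStringMessage>(schemaRegistry).AsSyncOverAsync())
            .Build();

        consumer.Subscribe("large-string-topic");  // assumed

        while (true)
        {
            // With Apache.Avro 1.10.0, the StackOverflow exception surfaces inside this call,
            // in the specific deserializer stack shown above.
            var result = consumer.Consume(CancellationToken.None);
            Console.WriteLine($"Field1 length: {result.Message.Value.Field1.Length}");
        }
    }
}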
Should we be using the latest Apache.Avro version? I have a lot of more complex Avro messages targeting this version that are working fine so far. The only thing I can think of is the large strings.
Checklist
Please provide the following information:
- A complete (i.e. we can run it), minimal program demonstrating the problem. No need to supply a project file.
- [1.5.0] Confluent.Kafka nuget version.
- Apache Kafka version.
- Client configuration.
- [Windows+Linux] Operating system.
- Provide logs (with “debug” : “…” as necessary in configuration).
- Provide broker log excerpts.
- [] Critical issue.
Top GitHub Comments
Currently there are two open bugs related to our investigation:
Let’s hope for a soonish fix for the first one at least 🤞
FYI: version 1.10.2 is out 🎉 the large strings issue is fixed: https://github.com/apache/avro/releases/tag/release-1.10.2