
Global-id value is 65034875748

See original GitHub issue

I need help solving this problem. I am creating a sink connector from a Kafka topic so that the data is written to BigQuery. My producer already writes messages to the topic in Avro format and registers the schema in the Apicurio Registry, which is deployed on the same cluster as my Kafka (I am using Strimzi to manage the Kafka cluster). But when the messages go through the converter, the schema is not found, and the following error message is always emitted:

RESTEASY003870: Unable to extract parameter from http request: javax.ws.rs.PathParam("globalId") value is '65034875748'

I would like the schema to be found automatically using the TopicIdStrategy, where the lookup is done by appending the suffix -value or -key to the name of the consumed topic, since that is how it is already registered in the Apicurio Registry.
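For reference, the lookup strategy can usually be set on the converter via the Apicurio serde properties, using the same `value.converter.` prefix as the registry URL. A minimal sketch (the property name and strategy class assume the Apicurio Registry 2.x serdes; verify against the version you are running):

```yaml
# Sketch only: assumes Apicurio Registry 2.x serde property names.
# Maps <topic>-value / <topic>-key to the artifact ID on lookup.
value.converter: io.apicurio.registry.utils.converter.AvroConverter
value.converter.apicurio.registry.url: http://my-apicurio-registry-apicurio-registry.kafka.svc.cluster.local:8080
value.converter.apicurio.registry.artifact-resolver-strategy: io.apicurio.registry.serde.strategy.TopicIdStrategy
```

Note that on the consume side the Apicurio deserializer normally resolves the schema by the globalId embedded in each record's header, so a mismatch in that header can still fail regardless of the strategy.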


My KafkaConnector configuration YAML file:

apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaConnector
metadata:
  name: schema-registry-sink-connector
  labels:
    strimzi.io/cluster: kafka-connect-schema-registry-component
spec:
  class: 'com.wepay.kafka.connect.bigquery.BigQuerySinkConnector'
  tasksMax: 1
  config:
    tasks.max: 1
    
    #consumer configs
    topics: 'tb.avro.schema'
    consumer.override.auto.offset.reset: "latest"
    

    #kafka converter configs
    value.converter.schemas.enable: true
    key.converter.schemas.enable: false
    value.converter: "io.apicurio.registry.utils.converter.AvroConverter" 
    value.converter.apicurio.registry.url: "http://my-apicurio-registry-apicurio-registry.kafka.svc.cluster.local:8080"
    key.converter: org.apache.kafka.connect.storage.StringConverter


    #bquery configs
    project: caramel-box
    defaultDataset: 'db_product'
    keyfile: 'svc-account-bquery.json'
    autoCreateTables: false  
    deleteEnabled: false
    upsertEnabled: false
    schemaRetriever: "com.wepay.kafka.connect.bigquery.retrieve.IdentitySchemaRetriever"
    autoUpdateSchemas: false
    sanitizeTopics: false
    bigQueryPartitionDecorator: false
   

The error message I'm getting:

org.apache.kafka.connect.errors.ConnectException: Tolerance exceeded in error handler
    at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:206)
    at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execute(RetryWithToleranceOperator.java:132)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.convertAndTransformRecord(WorkerSinkTask.java:516)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.convertMessages(WorkerSinkTask.java:493)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:332)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:234)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:203)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:188)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:243)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: io.apicurio.registry.rest.client.exception.RestClientException: RESTEASY003870: Unable to extract parameter from http request: javax.ws.rs.PathParam("globalId") value is '65034875748'
    at io.apicurio.registry.rest.client.impl.ErrorHandler.handleErrorResponse(ErrorHandler.java:64)
    at io.apicurio.rest.client.handler.BodyHandler.lambda$toSupplierOfType$1(BodyHandler.java:46)
    at io.apicurio.rest.client.JdkHttpClient.sendRequest(JdkHttpClient.java:202)
    at io.apicurio.registry.rest.client.impl.RegistryClientImpl.getContentByGlobalId(RegistryClientImpl.java:293)
    at io.apicurio.registry.resolver.AbstractSchemaResolver.lambda$resolveSchemaByGlobalId$1(AbstractSchemaResolver.java:183)
    at io.apicurio.registry.resolver.ERCache.lambda$getValue$0(ERCache.java:132)
    at io.apicurio.registry.resolver.ERCache.retry(ERCache.java:171)
    at io.apicurio.registry.resolver.ERCache.getValue(ERCache.java:131)
    at io.apicurio.registry.resolver.ERCache.getByGlobalId(ERCache.java:111)
    at io.apicurio.registry.resolver.AbstractSchemaResolver.resolveSchemaByGlobalId(AbstractSchemaResolver.java:178)
    at io.apicurio.registry.resolver.DefaultSchemaResolver.resolveSchemaByArtifactReference(DefaultSchemaResolver.java:148)
    at io.apicurio.registry.serde.AbstractKafkaDeserializer.resolve(AbstractKafkaDeserializer.java:147)
    at io.apicurio.registry.serde.AbstractKafkaDeserializer.deserialize(AbstractKafkaDeserializer.java:104)
    at io.apicurio.registry.utils.converter.SerdeBasedConverter.toConnectData(SerdeBasedConverter.java:129)
    at org.apache.kafka.connect.storage.Converter.toConnectData(Converter.java:87)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.lambda$convertAndTransformRecord$4(WorkerSinkTask.java:516)
    at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndRetry(RetryWithToleranceOperator.java:156)
    at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:190)
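A suspiciously large globalId like this is characteristic of a wire-format mismatch: the Confluent format prefixes each record with a magic byte plus a 4-byte schema id, while Apicurio's default format uses a magic byte plus an 8-byte globalId. The sketch below illustrates how reading a 4-byte header as an 8-byte one produces a nonsense id; the byte layouts are my assumptions about the respective default formats, not taken from this issue:

```python
import struct

def parse_confluent_header(payload: bytes):
    # Confluent wire format: magic byte 0x0 + 4-byte big-endian schema id
    magic, schema_id = struct.unpack(">bI", payload[:5])
    return magic, schema_id

def parse_apicurio_header(payload: bytes):
    # Apicurio default wire format: magic byte 0x0 + 8-byte big-endian globalId
    magic, global_id = struct.unpack(">bq", payload[:9])
    return magic, global_id

# A record written with a 4-byte id but read as an 8-byte globalId swallows
# the first four bytes of the Avro body into the id, yielding a bogus value.
payload = b"\x00" + struct.pack(">I", 42) + b"\x12\x34\x56\x78rest-of-avro"
print(parse_confluent_header(payload))  # (0, 42)
print(parse_apicurio_header(payload))   # (0, <huge bogus id>)
```

This is only an illustration of the failure mode; the actual root cause reported below was an incompatibility in Apicurio's Confluent-compatibility (ccompat) API.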

Issue Analytics

  • State: closed
  • Created: a year ago
  • Comments: 8 (5 by maintainers)

Top GitHub Comments

1 reaction
carlesarnal commented, Jun 6, 2022

Which version of the Confluent libraries are you using? As you can see here, the references field is present in the class and is being returned by the server, so I think you might be using an old version of the Confluent library.

0 reactions
carlesarnal commented, Oct 7, 2022

As Eric pointed out, this was caused by #2636 (an incompatibility in the ccompat API), which is now fixed, so I'm closing this one as solved.


