Connector throws exception when using AppendFilter

Hi,

The connector fails when the AppendFilter is used to add a new column. I am using the latest version, 1.6.1, as a Docker image from Docker Hub. Removing the curly brackets from the value, or setting a hardcoded string instead, results in the same exception.

Config:

    "filters"= 'AppendFileName, Row',
    
    "filters.AppendFileName.type"='io.streamthoughts.kafka.connect.filepulse.filter.AppendFilter',
    "filters.AppendFileName.field" = 'FILE_NAME',
    "filters.AppendFileName.value" = '{{ $metadata.name }}',

The exception is:

java.lang.ClassCastException: class io.streamthoughts.kafka.connect.filepulse.expression.SubstitutionExpression cannot be cast to class io.streamthoughts.kafka.connect.filepulse.expression.PropertyExpression (io.streamthoughts.kafka.connect.filepulse.expression.SubstitutionExpression and io.streamthoughts.kafka.connect.filepulse.expression.PropertyExpression are in unnamed module of loader org.apache.kafka.connect.runtime.isolation.PluginClassLoader @5bbf8daa)
        at io.streamthoughts.kafka.connect.filepulse.filter.AppendFilter.mayEvaluateWriteExpression(AppendFilter.java:113)
        at io.streamthoughts.kafka.connect.filepulse.filter.AppendFilter.apply(AppendFilter.java:81)
        at io.streamthoughts.kafka.connect.filepulse.filter.AbstractMergeRecordFilter.apply(AbstractMergeRecordFilter.java:51)
        at io.streamthoughts.kafka.connect.filepulse.filter.DefaultRecordFilterPipeline$FilterNode.apply(DefaultRecordFilterPipeline.java:159)
        at io.streamthoughts.kafka.connect.filepulse.filter.DefaultRecordFilterPipeline.apply(DefaultRecordFilterPipeline.java:131)
        at io.streamthoughts.kafka.connect.filepulse.filter.DefaultRecordFilterPipeline.apply(DefaultRecordFilterPipeline.java:99)
        at io.streamthoughts.kafka.connect.filepulse.source.DefaultFileRecordsPollingConsumer.next(DefaultFileRecordsPollingConsumer.java:169)
        at io.streamthoughts.kafka.connect.filepulse.source.FilePulseSourceTask.poll(FilePulseSourceTask.java:127)
        at org.apache.kafka.connect.runtime.WorkerSourceTask.poll(WorkerSourceTask.java:289)
        at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:256)
        at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:185)
        at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:235)
        at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
        at java.base/java.lang.Thread.run(Thread.java:834)
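The cast in the stack trace points at the likely cause: the {{ ... }} template appears to be parsed as a SubstitutionExpression, while AppendFilter's write path expects a PropertyExpression, and the two are sibling types rather than one extending the other. A stripped-down sketch of that failure mode, using hypothetical stand-in types rather than the real File Pulse classes:

    // Hypothetical stand-in types; the real File Pulse classes differ,
    // but the failing cast follows the same shape.
    interface Expression {}

    class SubstitutionExpression implements Expression {}  // what "{{ $metadata.name }}" parses to
    class PropertyExpression implements Expression {}      // what the write path expects

    public class CastDemo {
        public static void main(String[] args) {
            Expression parsed = new SubstitutionExpression();
            // Siblings under a common interface: this compiles,
            // but throws java.lang.ClassCastException at runtime.
            PropertyExpression write = (PropertyExpression) parsed;
        }
    }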

The full config is below:

CREATE SOURCE CONNECTOR "oss.file" WITH (
    "connector.class" = 'io.streamthoughts.kafka.connect.filepulse.source.FilePulseSourceConnector',
    "tasks.max" = 1,
    "fs.scan.directory.path" = '/data/adr/input',
    "fs.scan.interval.ms" = 10000,
    "fs.scan.filters" = 'io.streamthoughts.kafka.connect.filepulse.scanner.local.filter.RegexFileListFilter',
    "file.filter.regex.pattern" = '^(.*?)',
    "topic" = 'oss.raw',
    "skip.headers" = 1,

    "internal.kafka.reporter.bootstrap.servers" = 'broker:29092',
    "internal.kafka.reporter.topic" = 'connect-file-pulse-status',
    "internal.kafka.reporter.id" = 'connect-file-pulse',
    "fs.cleanup.policy.class" = 'io.streamthoughts.kafka.connect.filepulse.clean.MoveCleanupPolicy',
    "cleaner.output.failed.path" = '/data/adr/error',
    "cleaner.output.succeed.path" = '/data/adr/succeed',

    "filters" = 'AppendFileName, Row',

    "filters.AppendFileName.type" = 'io.streamthoughts.kafka.connect.filepulse.filter.AppendFilter',
    "filters.AppendFileName.field" = 'FILE_NAME',
    "filters.AppendFileName.value" = '{{ $metadata.name }}',

    "filters.Row.extractColumnName" = 'headers',
    "filters.Row.separator" = ';',
    "filters.Row.trimColumn" = true,
    "filters.Row.type" = 'io.streamthoughts.kafka.connect.filepulse.filter.DelimitedRowFilter');

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 11 (4 by maintainers)

Top GitHub Comments

1 reaction
fhussonnois commented, May 5, 2022

Ok, that’s an issue. I will provide a fix for that. In the meantime, I think you can try to access $metadata.stringURI
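For reference, a hedged sketch of what that suggestion might look like in the config above ($metadata.stringURI is taken from the comment; whether it avoids the cast error in 1.6.1 is not confirmed here):

    -- only the value line changes; the rest of the filter config stays as-is
    "filters.AppendFileName.value" = '{{ $metadata.stringURI }}',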

1 reaction
fhussonnois commented, May 5, 2022

@bradydean, try to change the filters.path.field property value from path to $value.path.
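As a sketch, that change would look like the following (filters.path.field belongs to a commenter's config that is not shown in this issue, so the surrounding lines are assumed):

    -- before (assumed):
    "filters.path.field" = 'path',
    -- after, addressing the field through the record-value scope:
    "filters.path.field" = '$value.path',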
