
Failed to Find Class 500 Error

See original GitHub issue

I am trying to add this connector plugin to my current Kafka Connect instance. I'm using OpenShift and am running Kafka Connect with the Docker image below (it's basically an older version of confluentinc/cp-docker-images, except that it includes the jar files from the previously mentioned GitHub repository and an upgraded HDFS connector jar):

https://hub.docker.com/r/chenchik/custom-connect-hdfs/

I run Kafka Connect with a large command that sets a lot of environment variables. From what I understand, plugins should go in /etc/kafka-connect/jars; once they're in there, they should work.
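
For reference, the directory and the settings the worker actually sees can be checked from inside the pod with something like the sketch below (the pod name is a placeholder, not taken from the issue):

oc exec <connect-pod> -- ls -l /etc/kafka-connect/jars
oc exec <connect-pod> -- sh -c "env | grep -iE 'classpath|plugin'"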

To install this plugin into my Kafka Connect instance, I cloned it from GitHub and ran:

mvn clean package

Then I took all the files in target/ and copied them into the /etc/kafka-connect/jars directory of the container running Kafka Connect. I didn't change any environment variables after that.
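
The build-and-copy step looks roughly like this (a sketch only; the pod name and jar file name are placeholders, and the connector's dependency jars need to come along as well, which is what the maintainer's answer further down addresses):

mvn clean package
oc cp target/kafka-connect-spooldir-<version>.jar <connect-pod>:/etc/kafka-connect/jars/kafka-connect-spooldir-<version>.jar
# The worker only scans for plugins at startup, so restart it afterwards.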

When I try to activate the connector using the REST API, I issue a POST request with this payload:

{
    "name": "csv-json-1",
    "config": {
        "connector.class": "com.github.jcustenborder.kafka.connect.spooldir.SpoolDirCsvSourceConnector",
        "tasks.max": "1",
        "finished.path": "/csv-json/results/",
        "input.file.pattern": ".*csv",
        "error.path": "/csv-json/errors/",
        "topic": "danila-csv-json",
        "input.path": "/csv-json/input/",
        "key.schema": "com.github.jcustenborder.kafka.connect.spooldir.CsvSchemaGenerator",
        "value.schema": "com.github.jcustenborder.kafka.connect.spooldir.CsvSchemaGenerator"
    }
}

The response I get every time is:

{
    "error_code": 500,
    "message": "Failed to find any class that implements Connector and which name matches com.github.jcustenborder.kafka.connect.spooldir.SpoolDirCsvSourceConnector, available connectors are: org.apache.kafka.connect.source.SourceConnector, org.apache.kafka.connect.tools.MockSourceConnector, org.apache.kafka.connect.file.FileStreamSinkConnector, io.confluent.connect.hdfs.tools.SchemaSourceConnector, org.apache.kafka.connect.tools.VerifiableSourceConnector, io.confluent.connect.s3.S3SinkConnector, org.apache.kafka.connect.file.FileStreamSourceConnector, org.apache.kafka.connect.tools.VerifiableSinkConnector, io.confluent.connect.jdbc.JdbcSinkConnector, io.confluent.connect.jdbc.JdbcSourceConnector, io.confluent.connect.elasticsearch.ElasticsearchSinkConnector, org.apache.kafka.connect.sink.SinkConnector, io.confluent.connect.storage.tools.SchemaSourceConnector, org.apache.kafka.connect.tools.MockConnector, org.apache.kafka.connect.tools.MockSinkConnector, org.apache.kafka.connect.tools.SchemaSourceConnector, io.confluent.connect.hdfs.HdfsSinkConnector"
}

If I issue a GET request to /connector-plugins, the connector is not listed.
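
That check is a one-liner against the Connect REST API (the host is a placeholder; 8083 is the default REST port):

curl -s http://<connect-host>:8083/connector-plugins
# SpoolDirCsvSourceConnector should appear in this list once the plugin has been picked up.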

I also cannot find any logs inside the container that explain what's going on. The only relevant log message I get is from the log the container provides to OpenShift. This is the only entry that pops up:

[2017-08-01 22:22:35,635] INFO 172.17.0.1 - - [01/Aug/2017:22:22:15 +0000] "POST /connectors HTTP/1.1" 500 1081 20544 (org.apache.kafka.connect.runtime.rest.RestServer)

Any idea what I can do to resolve this issue?

Issue Analytics

  • State: closed
  • Created 6 years ago
  • Comments: 11 (6 by maintainers)

Top GitHub Comments

2 reactions
jcustenborder commented, Jun 14, 2020

@MohammadChalaki All good buddy. When you run mvn clean package, one of the outputs is a folder called target/kafka-connect-target; under it are all of the dependencies needed to run the connector. You need to take the contents of this folder and copy them to /opt/kafka-connect/kafka-connect-spooldir. If your worker properties file has plugin.path=/opt/kafka-connect, the worker will look at each subfolder of that folder and load it as a plugin. This requires a restart. Another way to do this is to install via the Confluent Hub, which will do it for you. I do this a lot when I build Docker containers for Kafka Connect. Check this Dockerfile out. Docker isn't required for Connect, FYI; this is just an example of how to use the Confluent Hub.
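
In shell terms, the steps described above look roughly like this (paths follow the comment; the last line assumes the Confluent Hub client is installed):

mvn clean package
mkdir -p /opt/kafka-connect/kafka-connect-spooldir
cp -r target/kafka-connect-target/. /opt/kafka-connect/kafka-connect-spooldir/
# worker properties: plugin.path=/opt/kafka-connect, then restart the worker

# Or let the Confluent Hub client handle the copy and path wiring:
confluent-hub install --no-prompt jcustenborder/kafka-connect-spooldir:latest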

1 reaction
jcustenborder commented, Jan 27, 2021

@SachinKSunny You need to make sure that every worker in your Connect cluster has the plugin installed, and that you restart all of the workers after installing it. Verify that the plugin.path in your config matches where the connectors are installed. For example, when I do this in Docker, I set my plugin.path to include where the plugins are installed.
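
For the Docker case mentioned here, the cp-kafka-connect images translate CONNECT_* environment variables into worker properties, so plugin.path is usually set along these lines (paths and image tag are illustrative, and the other required CONNECT_* settings are omitted):

docker run -d \
  -v "$(pwd)/plugins:/opt/kafka-connect" \
  -e CONNECT_PLUGIN_PATH="/usr/share/java,/opt/kafka-connect" \
  confluentinc/cp-kafka-connect:latest
# Repeat on every worker in the cluster, then restart them all.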

Read more comments on GitHub >

