Failed to Find Class 500 Error
I am trying to add this plugin connector to my current instance of Kafka Connect. I'm using OpenShift and am running Kafka Connect from the Docker image below (it's basically an older version of confluentinc/cp-docker-images, except it has the jar files from the previously mentioned GitHub repo and the HDFS connector jar is upgraded):
https://hub.docker.com/r/chenchik/custom-connect-hdfs/
I run Kafka Connect with a huge command that sets a ton of environment variables. From what I understand, plugins should go in /etc/kafka-connect/jars; once they're in there, they should work.
To install this plugin into my Kafka Connect instance, I cloned the repo from GitHub and ran:
mvn clean package
Then I copied all the files from /target into the /etc/kafka-connect/jars directory of the container running Kafka Connect. I didn't change any environment variables after that.
When I try to activate the connector using the REST API, I issue a POST request with this payload:
{
  "name": "csv-json-1",
  "config": {
    "connector.class": "com.github.jcustenborder.kafka.connect.spooldir.SpoolDirCsvSourceConnector",
    "tasks.max": "1",
    "finished.path": "/csv-json/results/",
    "input.file.pattern": ".*csv",
    "error.path": "/csv-json/errors/",
    "topic": "danila-csv-json",
    "input.path": "/csv-json/input/",
    "key.schema": "com.github.jcustenborder.kafka.connect.spooldir.CsvSchemaGenerator",
    "value.schema": "com.github.jcustenborder.kafka.connect.spooldir.CsvSchemaGenerator"
  }
}
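For reference, a request like the one above can be submitted with curl. The worker address (localhost:8083) below is an assumption, not taken from the original post; substitute your own Connect REST endpoint:

```shell
# Save the connector config to a file, then POST it to the Connect REST API.
# The worker address is an assumption; replace it with your actual endpoint.
cat > csv-json-1.json <<'EOF'
{
  "name": "csv-json-1",
  "config": {
    "connector.class": "com.github.jcustenborder.kafka.connect.spooldir.SpoolDirCsvSourceConnector",
    "tasks.max": "1",
    "finished.path": "/csv-json/results/",
    "input.file.pattern": ".*csv",
    "error.path": "/csv-json/errors/",
    "topic": "danila-csv-json",
    "input.path": "/csv-json/input/",
    "key.schema": "com.github.jcustenborder.kafka.connect.spooldir.CsvSchemaGenerator",
    "value.schema": "com.github.jcustenborder.kafka.connect.spooldir.CsvSchemaGenerator"
  }
}
EOF

curl -X POST -H "Content-Type: application/json" \
     --data @csv-json-1.json \
     http://localhost:8083/connectors \
  || echo "request failed (is the worker reachable?)"
```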
The response I get every time is:
{ "error_code": 500, "message": "Failed to find any class that implements Connector and which name matches com.github.jcustenborder.kafka.connect.spooldir.SpoolDirCsvSourceConnector, available connectors are: org.apache.kafka.connect.source.SourceConnector, org.apache.kafka.connect.tools.MockSourceConnector, org.apache.kafka.connect.file.FileStreamSinkConnector, io.confluent.connect.hdfs.tools.SchemaSourceConnector, org.apache.kafka.connect.tools.VerifiableSourceConnector, io.confluent.connect.s3.S3SinkConnector, org.apache.kafka.connect.file.FileStreamSourceConnector, org.apache.kafka.connect.tools.VerifiableSinkConnector, io.confluent.connect.jdbc.JdbcSinkConnector, io.confluent.connect.jdbc.JdbcSourceConnector, io.confluent.connect.elasticsearch.ElasticsearchSinkConnector, org.apache.kafka.connect.sink.SinkConnector, io.confluent.connect.storage.tools.SchemaSourceConnector, org.apache.kafka.connect.tools.MockConnector, org.apache.kafka.connect.tools.MockSinkConnector, org.apache.kafka.connect.tools.SchemaSourceConnector, io.confluent.connect.hdfs.HdfsSinkConnector" }
If I issue a GET request to /connector-plugins, it is not listed.
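A quick way to check whether a given worker actually loaded the plugin is to grep the /connector-plugins response for the connector class (again assuming localhost:8083 for the REST endpoint):

```shell
# Grep the plugin list for the connector class; prints a message if the
# class is absent or the worker is unreachable.
curl -s http://localhost:8083/connector-plugins \
  | grep -o 'SpoolDirCsvSourceConnector' \
  || echo "plugin not listed on this worker"
```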
I also cannot find any logs inside the container that explain what's going on. The only log message I get is from the log the container provides to OpenShift. This is the only entry that pops up:
[2017-08-01 22:22:35,635] INFO 172.17.0.1 - - [01/Aug/2017:22:22:15 +0000] "POST /connectors HTTP/1.1" 500 1081 20544 (org.apache.kafka.connect.runtime.rest.RestServer)
Any idea on what I can do to resolve this issue?
Issue Analytics
- Created 6 years ago
- Comments: 11 (6 by maintainers)
Top GitHub Comments
@MohammadChalaki All good buddy. When you run
mvn clean package
one of the outputs is a folder called target/kafka-connect-target. Under this it has all of the dependencies needed to run the connector. You need to take the contents of this folder and copy them to /opt/kafka-connect/kafka-connect-spooldir. If your worker properties file has plugin.path=/opt/kafka-connect, the worker will look at each subfolder in this folder and load it as a plugin. This requires a restart. Another way to do this is to install via Confluent Hub, which will do it for you. I do this a lot when I build Docker containers for Kafka Connect. Check this Dockerfile out. Docker isn't required for Connect, FYI; this is just an example of how to use the Confluent Hub.

@SachinKSunny You need to make sure that all of the workers in your Connect cluster get the plugin installed, and that you restart all of the workers after installing a plugin. Verify that the plugin.path in your config matches where the connectors are installed. For example, when I do this in Docker I set my plugin.path to include where the plugins are installed.
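The layout described above can be sketched as follows. This is a simulation using /tmp and a placeholder jar, not a real install; in practice the files come from target/kafka-connect-target after the Maven build:

```shell
# Simulated plugin layout; /tmp stands in for the real filesystem root.
PLUGIN_ROOT=/tmp/opt/kafka-connect
mkdir -p "$PLUGIN_ROOT/kafka-connect-spooldir"

# In a real install you would copy target/kafka-connect-target/* here instead
# of creating a placeholder file:
touch "$PLUGIN_ROOT/kafka-connect-spooldir/kafka-connect-spooldir.jar"

# The worker properties file would then contain:
#   plugin.path=/opt/kafka-connect
# The worker scans each subfolder of plugin.path and loads it as a plugin,
# so the worker must be restarted after new jars are copied in.
ls "$PLUGIN_ROOT"
```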