Adding Plugin Gives HTTP 500 Error
I am trying to add a plugin connector from GitHub:
https://github.com/jcustenborder/kafka-connect-spooldir
I’m using OpenShift and am running Kafka Connect with the Docker image below (it’s basically confluentinc/cp-docker-images, except it includes the jar files from the GitHub repository above and an upgraded HDFS connector jar):
https://hub.docker.com/r/chenchik/custom-connect-hdfs/
I run Kafka Connect with a long command that sets many environment variables. From what I understand, the plugins should be in /etc/kafka-connect/jars, so the most important part of this command is probably -e CONNECT_PLUGIN_PATH="/etc/kafka-connect/jars", which is where I put my jars from the GitHub repository after running mvn clean package.
The full command:
oc new-app chenchik/custom-connect-hdfs:latest \
  -e CONNECT_BOOTSTRAP_SERVERS=--------------:9092 \
  -e CONNECT_GROUP_ID="connect_---" \
  -e CONNECT_CONFIG_STORAGE_TOPIC="----" \
  -e CONNECT_OFFSET_STORAGE_TOPIC="----" \
  -e CONNECT_STATUS_STORAGE_TOPIC="----" \
  -e CONNECT_KEY_CONVERTER="org.apache.kafka.connect.storage.StringConverter" \
  -e CONNECT_VALUE_CONVERTER="org.apache.kafka.connect.storage.StringConverter" \
  -e CONNECT_INTERNAL_KEY_CONVERTER="org.apache.kafka.connect.json.JsonConverter" \
  -e CONNECT_INTERNAL_VALUE_CONVERTER="org.apache.kafka.connect.json.JsonConverter" \
  -e CONNECT_REST_ADVERTISED_HOST_NAME="connect" \
  -e HADOOP_USER_NAME="hdfs" \
  -e CONNECT_KEY_CONVERTER_SCHEMAS_ENABLE="false" \
  -e CONNECT_VALUES_CONVERTER_SCHEMAS_ENABLE="false" \
  -e CONNECT_INTERNAL_KEY_CONVERTER_SCHEMAS_ENABLE="false" \
  -e CONNECT_INTERNAL_VALUE_CONVERTER_SCHEMAS_ENABLE="false" \
  -e CONNECT_AUTO_OFFSET_RESET="latest" \
  -e AUTO_OFFSET_RESET="latest" \
  -e TERM=xterm \
  -e KAFKA_HEAP_OPTS="-Xms512m -Xmx1g" \
  -e CONNECT_KAFKA_HEAP_OPTS="-Xms512m -Xmx1g" \
  -e CONNECT_PLUGIN_PATH="/etc/kafka-connect/jars" \
  -e CLASSPATH="/etc/kafka-connect/jars" \
  --name=c
I’ve also tried setting CONNECT_PLUGIN_PATH to simply /plugins and to /usr/local/share/kafka/plugins/.
In order to create the connector, I use the REST API and issue a POST request with this payload:
{
  "name": "csv-json-1",
  "config": {
    "connector.class": "com.github.jcustenborder.kafka.connect.spooldir.SpoolDirCsvSourceConnector",
    "tasks.max": "1",
    "finished.path": "/csv-json/results/",
    "input.file.pattern": ".*csv",
    "error.path": "/csv-json/errors/",
    "topic": "csv-json",
    "input.path": "/csv-json/input/",
    "key.schema": "com.github.jcustenborder.kafka.connect.spooldir.CsvSchemaGenerator",
    "value.schema": "com.github.jcustenborder.kafka.connect.spooldir.CsvSchemaGenerator"
  }
}
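For reference, this is roughly how such a payload gets submitted (a sketch; the file path is illustrative, and localhost:8083 assumes the default Connect REST port):

```shell
# Sketch: save a trimmed-down version of the payload and POST it to the
# Connect REST API (host/port are assumptions; adjust for your deployment).
cat > /tmp/csv-json-1.json <<'EOF'
{
  "name": "csv-json-1",
  "config": {
    "connector.class": "com.github.jcustenborder.kafka.connect.spooldir.SpoolDirCsvSourceConnector",
    "tasks.max": "1",
    "topic": "csv-json",
    "input.path": "/csv-json/input/"
  }
}
EOF
# curl -s -X POST -H "Content-Type: application/json" \
#   --data @/tmp/csv-json-1.json http://localhost:8083/connectors
```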
The response I get every time is:
{
  "error_code": 500,
  "message": "Failed to find any class that implements Connector and which name matches com.github.jcustenborder.kafka.connect.spooldir.SpoolDirCsvSourceConnector, available connectors are: org.apache.kafka.connect.source.SourceConnector, org.apache.kafka.connect.tools.MockSourceConnector, org.apache.kafka.connect.file.FileStreamSinkConnector, io.confluent.connect.hdfs.tools.SchemaSourceConnector, org.apache.kafka.connect.tools.VerifiableSourceConnector, io.confluent.connect.s3.S3SinkConnector, org.apache.kafka.connect.file.FileStreamSourceConnector, org.apache.kafka.connect.tools.VerifiableSinkConnector, io.confluent.connect.jdbc.JdbcSinkConnector, io.confluent.connect.jdbc.JdbcSourceConnector, io.confluent.connect.elasticsearch.ElasticsearchSinkConnector, org.apache.kafka.connect.sink.SinkConnector, io.confluent.connect.storage.tools.SchemaSourceConnector, org.apache.kafka.connect.tools.MockConnector, org.apache.kafka.connect.tools.MockSinkConnector, org.apache.kafka.connect.tools.SchemaSourceConnector, io.confluent.connect.hdfs.HdfsSinkConnector"
}
If I issue a GET request to /connector-plugins, it is not listed.
I also cannot seem to find any logs inside the container that explain what’s going on. The only kind of log message I get is from the log the container provides to OpenShift. This is the entry that pops up:
[2017-08-01 22:22:35,635] INFO 172.17.0.1 - - [01/Aug/2017:22:22:15 +0000] "POST /connectors HTTP/1.1" 500 1081 20544 (org.apache.kafka.connect.runtime.rest.RestServer)
What can I do to resolve this issue?
Issue Analytics
- Created 6 years ago
- Comments: 9 (4 by maintainers)
Top GitHub Comments
First, there are two ways to load plugins as of the newest release. Previously the only way was to put the jars on the classpath; that’s what putting them under /etc/kafka-connect/jars does. You should not need to set any additional environment variables to make that work.
As of Kafka 0.11 and CP 3.3, you can also load plugins via the new plugin.path setting, which is what the environment variable CONNECT_PLUGIN_PATH controls. In that case the structure is a bit different: inside that directory, we’d expect to find a directory for each plugin, and within each of those directories we’d expect to find the jars.
Is there anything else in that log before that entry? The relevant parts will probably be earlier, during the time when plugins are being loaded and the classpath is being scanned.
Also, you might want to increase the log4j log level to get more debug output.
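To make the plugin.path layout concrete, here is a sketch (the directory names are illustrative; only the parent/subdirectory structure matters):

```shell
# Sketch: plugin.path points at a PARENT directory; each plugin lives in
# its own subdirectory, which holds that plugin's jars.
PLUGIN_ROOT=/tmp/kafka-connect-plugins   # what CONNECT_PLUGIN_PATH would point to
mkdir -p "$PLUGIN_ROOT/kafka-connect-spooldir"
# The jars produced by `mvn clean package` would be copied into the
# plugin's subdirectory, e.g.:
# cp target/kafka-connect-spooldir-*.jar "$PLUGIN_ROOT/kafka-connect-spooldir/"
ls "$PLUGIN_ROOT"
```

For the log level, the Confluent Docker images template their log4j configuration from environment variables, so setting CONNECT_LOG4J_ROOT_LOGLEVEL=DEBUG when starting the container is one way to get the extra output (assuming a cp-kafka-connect-based image).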
Hi all, I have a similar problem creating a new connector and could use some help. The steps I followed were:
where I added to the kafka-connect image (i) a CONNECT_PLUGIN_PATH environment variable pointing to “/etc/kakfa-connect/jars”, and (ii) a volume from my host directory holding the jar file to “/etc/kakfa-connect/jars”. 5. I then tried to create a new connector of this class using the Kafka Connect REST API as follows:
curl -X POST -H "Content-Type: application/json" --data '{"name": "quickstart-json-dir", "config": {"connector.class":"com.github.jcustenborder.kafka.connect.spooldir.SpoolDirJsonSourceConnector", "tasks.max":"1", "topic":"quickstart-data-6", "finished.path": "/data/results", "input.file.pattern": ".*json", "input.path": "/data/input", "error.path": "/data/error"}}' http://localhost:8083/connectors
The response is Error 500 Request Failed; see the attached PNG for more details.
I’m pretty sure there is neither a connection issue nor a server problem, since I can create a connector of another class without any problem; e.g., the following works like a charm (connector created, and I can check that its status is running):
curl -X POST -H "Content-Type: application/json" --data '{"name": "quickstart-file-source-json-5", "config": {"connector.class":"org.apache.kafka.connect.file.FileStreamSourceConnector", "tasks.max":"1", "topic":"quickstart-data-5", "file": "/data/test-4.json"}}' http://localhost:8083/connectors
I tend to believe that although I set CONNECT_PLUGIN_PATH in the docker-compose file, the connector class cannot be loaded and found.
Should I set plugin.path another way? There is a properties file in the kafka-connect container called /etc/kafka/connect-standalone.properties, but how can I edit it before starting the container?
Could it be another issue? Thanks in advance!
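Whichever loading mechanism is used, one quick diagnostic is the worker’s /connector-plugins endpoint, which is the ground truth for the classes it can instantiate. A sketch (a canned response stands in for the live call, and localhost:8083 is an assumed endpoint):

```shell
# Sketch: filter the /connector-plugins response for the SpoolDir classes.
# In a live setup the response would come from:
#   curl -s http://localhost:8083/connector-plugins
# An empty match means the jars were never picked up from plugin.path.
response='[{"class":"org.apache.kafka.connect.file.FileStreamSourceConnector"}]'
echo "$response" | grep -o 'SpoolDir[A-Za-z]*Connector' || echo "SpoolDir plugin not loaded"
```

If nothing matches, that is consistent with the 500 above: the POST fails at class lookup, before any connector logic runs.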