
Adding Plugin Gives HTTP 500 Error

See original GitHub issue

I am trying to add a connector plugin from GitHub:

https://github.com/jcustenborder/kafka-connect-spooldir

I’m using OpenShift and running Kafka Connect with the Docker image below (it’s essentially confluentinc/cp-docker-images, except it includes the jar files from the GitHub repository above and an upgraded HDFS connector jar):

https://hub.docker.com/r/chenchik/custom-connect-hdfs/

I run Kafka Connect with a huge command that sets a ton of environment variables. From what I understand, the plugins should live in /etc/kafka-connect/jars, so the most important part of the command is probably: -e CONNECT_PLUGIN_PATH="/etc/kafka-connect/jars"

which is where I put the jars built from the GitHub repository after running mvn clean package.

huge command:

oc new-app chenchik/custom-connect-hdfs:latest \
  -e CONNECT_BOOTSTRAP_SERVERS=--------------:9092 \
  -e CONNECT_GROUP_ID="connect_---" \
  -e CONNECT_CONFIG_STORAGE_TOPIC="----" \
  -e CONNECT_OFFSET_STORAGE_TOPIC="----" \
  -e CONNECT_STATUS_STORAGE_TOPIC="----" \
  -e CONNECT_KEY_CONVERTER="org.apache.kafka.connect.storage.StringConverter" \
  -e CONNECT_VALUE_CONVERTER="org.apache.kafka.connect.storage.StringConverter" \
  -e CONNECT_INTERNAL_KEY_CONVERTER="org.apache.kafka.connect.json.JsonConverter" \
  -e CONNECT_INTERNAL_VALUE_CONVERTER="org.apache.kafka.connect.json.JsonConverter" \
  -e CONNECT_REST_ADVERTISED_HOST_NAME="connect" \
  -e HADOOP_USER_NAME="hdfs" \
  -e CONNECT_KEY_CONVERTER_SCHEMAS_ENABLE="false" \
  -e CONNECT_VALUES_CONVERTER_SCHEMAS_ENABLE="false" \
  -e CONNECT_INTERNAL_KEY_CONVERTER_SCHEMAS_ENABLE="false" \
  -e CONNECT_INTERNAL_VALUE_CONVERTER_SCHEMAS_ENABLE="false" \
  -e CONNECT_AUTO_OFFSET_RESET="latest" \
  -e AUTO_OFFSET_RESET="latest" \
  -e TERM=xterm \
  -e KAFKA_HEAP_OPTS="-Xms512m -Xmx1g" \
  -e CONNECT_KAFKA_HEAP_OPTS="-Xms512m -Xmx1g" \
  -e CONNECT_PLUGIN_PATH="/etc/kafka-connect/jars" \
  -e CLASSPATH="/etc/kafka-connect/jars" \
  --name=c

I’ve also tried setting CONNECT_PLUGIN_PATH to simply /plugins and to /usr/local/share/kafka/plugins/.

In order to create the connector, I use the REST API and issue a POST request with this payload:

{
    "name": "csv-json-1",
    "config": {
        "connector.class": "com.github.jcustenborder.kafka.connect.spooldir.SpoolDirCsvSourceConnector",
        "tasks.max": "1",
        "finished.path": "/csv-json/results/",
        "input.file.pattern": ".*csv",
        "error.path": "/csv-json/errors/",
        "topic": "csv-json",
        "input.path": "/csv-json/input/",
        "key.schema": "com.github.jcustenborder.kafka.connect.spooldir.CsvSchemaGenerator",
        "value.schema": "com.github.jcustenborder.kafka.connect.spooldir.CsvSchemaGenerator"
    }
}
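One way to issue that POST is to save the payload to a file and sanity-check it before sending. This is a sketch; the localhost:8083 endpoint is an assumption about where the Connect REST API is exposed.

```shell
# Save the connector config and verify it is valid JSON before POSTing.
cat > /tmp/csv-json-1.json <<'EOF'
{
    "name": "csv-json-1",
    "config": {
        "connector.class": "com.github.jcustenborder.kafka.connect.spooldir.SpoolDirCsvSourceConnector",
        "tasks.max": "1",
        "finished.path": "/csv-json/results/",
        "input.file.pattern": ".*csv",
        "error.path": "/csv-json/errors/",
        "topic": "csv-json",
        "input.path": "/csv-json/input/",
        "key.schema": "com.github.jcustenborder.kafka.connect.spooldir.CsvSchemaGenerator",
        "value.schema": "com.github.jcustenborder.kafka.connect.spooldir.CsvSchemaGenerator"
    }
}
EOF
python3 -m json.tool < /tmp/csv-json-1.json > /dev/null && echo "payload OK"
# Then post it (host/port assumed):
# curl -X POST -H "Content-Type: application/json" \
#   --data @/tmp/csv-json-1.json http://localhost:8083/connectors
```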

The response I get every time is:

{
    "error_code": 500,
    "message": "Failed to find any class that implements Connector and which name matches com.github.jcustenborder.kafka.connect.spooldir.SpoolDirCsvSourceConnector, available connectors are: org.apache.kafka.connect.source.SourceConnector, org.apache.kafka.connect.tools.MockSourceConnector, org.apache.kafka.connect.file.FileStreamSinkConnector, io.confluent.connect.hdfs.tools.SchemaSourceConnector, org.apache.kafka.connect.tools.VerifiableSourceConnector, io.confluent.connect.s3.S3SinkConnector, org.apache.kafka.connect.file.FileStreamSourceConnector, org.apache.kafka.connect.tools.VerifiableSinkConnector, io.confluent.connect.jdbc.JdbcSinkConnector, io.confluent.connect.jdbc.JdbcSourceConnector, io.confluent.connect.elasticsearch.ElasticsearchSinkConnector, org.apache.kafka.connect.sink.SinkConnector, io.confluent.connect.storage.tools.SchemaSourceConnector, org.apache.kafka.connect.tools.MockConnector, org.apache.kafka.connect.tools.MockSinkConnector, org.apache.kafka.connect.tools.SchemaSourceConnector, io.confluent.connect.hdfs.HdfsSinkConnector"
}

If I issue a GET request to /connector-plugins, the connector is not listed.
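The "available connectors" list in the 500 response is itself the authoritative record of what the worker loaded, so one quick check is to grep it for the class you expect. A sketch (the response text here is abridged from the error below; the curl in the comment assumes the REST API on localhost:8083):

```shell
# Save the 500 response body and grep the "available connectors" list
# for the spooldir class; if it is absent there, the jar was never
# picked up at startup. (Abridged sample of the real response.)
cat > /tmp/connect-error.json <<'EOF'
{"error_code":500,"message":"Failed to find any class that implements Connector and which name matches com.github.jcustenborder.kafka.connect.spooldir.SpoolDirCsvSourceConnector, available connectors are: org.apache.kafka.connect.file.FileStreamSourceConnector, io.confluent.connect.hdfs.HdfsSinkConnector"}
EOF
if grep -q 'available connectors are: .*spooldir' /tmp/connect-error.json; then
  echo "spooldir was loaded"
else
  echo "spooldir missing from plugin scan"
fi
# Live equivalent against a running worker:
# curl -s http://localhost:8083/connector-plugins
```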

I also cannot find any logs inside the container that explain what’s going on. The only relevant log message comes from the log the container provides to OpenShift. This is the entry that pops up:

[2017-08-01 22:22:35,635] INFO 172.17.0.1 - - [01/Aug/2017:22:22:15 +0000] "POST /connectors HTTP/1.1" 500 1081 20544 (org.apache.kafka.connect.runtime.rest.RestServer)

What can I do to resolve this issue?

Issue Analytics

  • State: open
  • Created: 6 years ago
  • Comments: 9 (4 by maintainers)

Top GitHub Comments

1 reaction
ewencp commented, Aug 1, 2017

First, there are two ways to load plugins as of the newest release. Previously, the only way was to put the jars on the classpath; that’s what placing them under /etc/kafka-connect/jars does. You should not need to set any additional environment variables to make that work.

As of Kafka 0.11 and CP 3.3, you can also load via the new plugin.path setting, which is what the environment variable CONNECT_PLUGIN_PATH would be doing. In that case, the structure is a bit different: inside that directory, we’d expect to find a directory for each plugin, and within that directory we’d expect to find the jars.
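The layout described above can be sketched like this (the root path and jar name are illustrative assumptions, not the actual artifact names; `touch` stands in for copying the jars produced by `mvn clean package`):

```shell
# plugin.path expects one subdirectory per plugin, each containing that
# plugin's jars plus its dependencies.
PLUGIN_ROOT=/tmp/kafka-connect-plugins
mkdir -p "$PLUGIN_ROOT/kafka-connect-spooldir"
touch "$PLUGIN_ROOT/kafka-connect-spooldir/kafka-connect-spooldir-1.0-SNAPSHOT.jar"
find "$PLUGIN_ROOT" -type f
# The worker would then be started with the matching setting, e.g.:
#   -e CONNECT_PLUGIN_PATH="/tmp/kafka-connect-plugins"
```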

Is there anything else in that log before that? The relevant parts will probably be earlier during the time when plugins are being loaded/the classpath is being scanned.

Also, you might want to increase the log4j log level to get more debug output.
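With the Confluent images, the root log4j level can typically be raised through an environment variable rather than by editing files in the container. A command fragment under that assumption (CONNECT_LOG4J_ROOT_LOGLEVEL is the cp-kafka-connect image convention; it may differ across image versions):

```shell
# Redeploy with DEBUG logging so the startup plugin scan is visible
# in the container log.
oc new-app chenchik/custom-connect-hdfs:latest \
  -e CONNECT_LOG4J_ROOT_LOGLEVEL="DEBUG" \
  --name=c
# (plus all the other -e variables from the full command above)
```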

0 reactions
anastath commented, Feb 19, 2019

Hi all, I have a similar problem creating a new connector and could use some help. The steps I followed were:

  1. I cloned kafka-docker locally from this repository (https://github.com/wurstmeister/kafka-docker).
  2. I downloaded the Docker images for ZooKeeper (https://hub.docker.com/_/zookeeper) and Kafka Connect (https://hub.docker.com/r/confluentinc/cp-kafka-connect) from Docker Hub.
  3. I cloned this repo locally (https://github.com/jcustenborder/kafka-connect-spooldir) and packaged it into a jar so as to use it as a Kafka Connect plugin.
  4. I set up the docker-compose file as per the attached docker-compose-1-broker-Connect-yml.txt, where I added to the kafka-connect image (i) a CONNECT_PLUGIN_PATH environment variable pointing to “/etc/kakfa-connect/jars”, and (ii) a volume mapping my host directory holding the jar file to “/etc/kakfa-connect/jars”.
  5. I then tried to create a new connector of this class using the Kafka Connect REST API as follows:

curl -X POST -H "Content-Type: application/json" --data '{"name": "quickstart-json-dir", "config": {"connector.class":"com.github.jcustenborder.kafka.connect.spooldir.SpoolDirJsonSourceConnector", "tasks.max":"1", "topic":"quickstart-data-6", "finished.path": "/data/results", "input.file.pattern": ".*json", "input.path": "/data/input", "error.path": "/data/error"}}' http://localhost:8083/connectors

The response is a 500 Request Failed error; see the attached PNG for more details.

I’m pretty sure there is neither a connection issue nor a server problem, since I can create a connector of another class without any problem. For example, the following works like a charm (the connector is created, and I can check that its status is running):

curl -X POST -H "Content-Type: application/json" --data '{"name": "quickstart-file-source-json-5", "config": {"connector.class":"org.apache.kafka.connect.file.FileStreamSourceConnector", "tasks.max":"1", "topic":"quickstart-data-5", "file": "/data/test-4.json"}}' http://localhost:8083/connectors

I tend to believe that although I set CONNECT_PLUGIN_PATH in the docker-compose file, the connector class cannot be loaded and found.

Should I set up plugin.path another way? There is a properties file in the kafka-connect container called /etc/kafka/connect-standalone.properties, but how can I edit it before starting the container?

Could it be another issue? Thanks in advance!
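On the properties-file question: with the Confluent image you normally don't edit /etc/kafka/connect-standalone.properties by hand; the image's entrypoint renders CONNECT_* environment variables into the worker config at startup. A minimal sketch under that assumption (host directory, mount path, and image tag are illustrative, and the other required CONNECT_* variables are omitted for brevity):

```shell
# Mount the plugin jars into a per-plugin subdirectory and point
# plugin.path at the parent; CONNECT_PLUGIN_PATH becomes plugin.path
# in the generated worker config, so no manual file editing is needed.
docker run -d --name connect \
  -v "$PWD/spooldir-jars:/etc/kafka-connect/jars/kafka-connect-spooldir" \
  -e CONNECT_PLUGIN_PATH="/etc/kafka-connect/jars" \
  confluentinc/cp-kafka-connect
```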
