Unable to create Integration if Kafka is not currently running/active
This is a…
[ ] Feature request
[ ] Regression (a behavior that used to work and stopped working in a new release)
[x] Bug report
[ ] Documentation issue or request
The problem
Unable to create an integration with Kafka if the broker/cluster is not currently active/running. To produce this issue try the following:
- Create a connection of any type (this will be the start connection in the integration)
- Create a Kafka connection to a broker URL that doesn’t exist/is not running
- Create an Integration from the connection in step 1 to Kafka. When you select the “Publish” action to send messages to the topic, “Something wrong” is shown (see attachment)
Expected behavior
Should be able to create the Integration even if the Kafka broker/cluster is down.
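A graceful-degradation approach would try to fetch topic metadata and, when the broker is unreachable, still let the user type the topic name manually instead of returning a 400. A minimal sketch of that idea (function and field names are hypothetical, not actual Syndesis code):

```python
def topic_property_options(fetch_topics, timeout=5.0):
    """Build options for the 'topic' property definition step.

    fetch_topics: callable returning topic names from the broker,
    raising ConnectionError when the broker is unreachable.
    """
    try:
        # Broker reachable: offer the discovered topics as suggestions.
        return {"suggestions": fetch_topics(timeout), "manual_entry": True}
    except ConnectionError:
        # Broker down: fall back to manual entry instead of failing hard;
        # invalid values then surface as failed exchanges at runtime.
        return {"suggestions": [], "manual_entry": True}
```

The cost of this trade-off is that a mistyped topic only fails later, when the integration runs, rather than at design time.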
Screenshot
Request and Response Data
400 response from https://HOSTNAME/api/v1/connections/i-M1le7YOaH8qq746bswFz/actions/io.syndesis:kafka-publish-action
```json
{
  "inputDataShape": {
    "kind": "any"
  },
  "outputDataShape": {
    "kind": "none"
  },
  "propertyDefinitionSteps": [
    {
      "name": "Select the Kafka topic",
      "properties": {
        "topic": {
          "order": 1,
          "componentProperty": false,
          "deprecated": false,
          "displayName": "Topic Name",
          "group": "common",
          "javaType": "java.lang.String",
          "kind": "path",
          "labelHint": "Select the Kafka topic to send data to.",
          "required": true,
          "secret": false,
          "type": "string"
        }
      },
      "description": "Specify Kafka topic name"
    }
  ],
  "_meta": {
    "message": "Connection to broker transactions-cluster-kafka-brokers.my-cluster.svc.cluster.local:9092 has failed.. Unable to fetch and process metadata",
    "type": "DANGER"
  }
}
```
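The `_meta.message` indicates the backend tried to fetch cluster metadata and the TCP connection to the broker never succeeded. That failure mode can be demonstrated with a plain socket check (a sketch for illustration, not Syndesis code; host and port are placeholders):

```python
import socket

def broker_reachable(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers refused connections, DNS failures, and timeouts.
        return False

# A broker URL that "doesn't exist / is not running" fails this check,
# which is what surfaces as "Unable to fetch and process metadata".
```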
Tasks involved / Steps to Reproduce
See above
Issue Analytics
- State:
- Created 4 years ago
- Comments: 9 (9 by maintainers)
Top GitHub Comments
I think we can remove this from 1.9.0, not to block the release, and address this in a patch release. Thoughts @syndesisio/backend?

@squakez it’s definitely a bug, we should allow the user to specify parameters even though we don’t consider them valid. It’s up to the user to deal with manually entered values, at the cost of failing exchanges in the integration.