Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

[kafka_consumer] Integration not respecting documentation

See original GitHub issue

The documentation for the Kafka consumer connection string seems to be wrong. I use the following config:

- kafka_connect_str:
    - localhost:9092
    - another_kafka_broker:9092
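Until a fix ships, one possible workaround (an assumption, not documented behavior: it presumes the check passes this value straight through to a Kafka client that accepts a comma-separated host list) is to collapse the brokers into a single string:

```yaml
# Hypothetical workaround: a single comma-separated string instead of
# a YAML list, so the check's string-keyed cache lookup still works.
- kafka_connect_str: localhost:9092,another_kafka_broker:9092
```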

The following error is logged in /var/log/datadog/collector.log:

2017-11-23 08:44:34 UTC | ERROR | dd.collector | checks.kafka_consumer(__init__.py:812) | Check 'kafka_consumer' instance #0 failed
Traceback (most recent call last):
  File "/opt/datadog-agent/agent/checks/__init__.py", line 795, in run
    self.check(copy.deepcopy(instance))
  File "/opt/datadog-agent/agent/checks.d/kafka_consumer.py", line 99, in check
    cli = self._get_kafka_client(instance)
  File "/opt/datadog-agent/agent/checks.d/kafka_consumer.py", line 166, in _get_kafka_client
    if kafka_conn_str not in self.kafka_clients:

If I understand correctly, this line expects kafka_connect_str to be a string, not a list:

https://github.com/DataDog/integrations-core/blob/8ec79412e9043d694db771dba29292709c0b741f/kafka_consumer/check.py#L166
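That would explain the truncated traceback: using a YAML list in a dictionary membership test raises an unhashable-type error. A minimal sketch of the failure and one way to normalize the value (hypothetical helper names, not the Datadog agent's actual code):

```python
# Sketch of why `kafka_conn_str not in self.kafka_clients` fails when
# the config value is a YAML list, and a possible normalization.
def get_conn_key(kafka_conn_str):
    """Return a hashable cache key for a connection string or list."""
    if isinstance(kafka_conn_str, (list, tuple)):
        # A list is unhashable, so a dict membership test on it raises
        # TypeError; join it into a single comma-separated string.
        return ",".join(kafka_conn_str)
    return kafka_conn_str

kafka_clients = {}
key = get_conn_key(["localhost:9092", "another_kafka_broker:9092"])
if key not in kafka_clients:   # safe now: key is a plain string
    kafka_clients[key] = "placeholder for a new Kafka client"
```

With the raw list, `["localhost:9092"] not in kafka_clients` would raise `TypeError: unhashable type: 'list'`, which matches the traceback stopping at that exact line.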

Issue Analytics

  • State: closed
  • Created: 6 years ago
  • Comments: 7 (5 by maintainers)

Top GitHub Comments

1 reaction
truthbk commented, Nov 23, 2017

PR #904 should address the issue and will be released with an agent bugfix release soon.

0 reactions
truthbk commented, Nov 30, 2017

5.20.1 was released; the issue should be addressed there.


Top Results From Across the Web

spring-kafka (not integration) consumer not consuming message
The referenced above documentation says: Although Serializer/Deserializer API is pretty simple and flexible from the low-level Kafka ...

Kafka Consumer | Confluent Documentation
The main reason is that the consumer does not retry the request if the commit fails. This is something that committing synchronously gives...

Structured Streaming + Kafka Integration Guide (Kafka broker ...)
If a topic column exists then its value is used as the topic when writing the given row to Kafka, unless the "topic"...

Apache Kafka Reference Guide - Quarkus
This is a good example of how to integrate a Kafka consumer with another downstream, in this example exposing it as a Server-Sent...

KafkaConsumer node - IBM
The KafkaConsumer node handles messages in the following message domains ... If this option is not selected, the KafkaConsumer node does not ...
