• 23-Jan-2023
Lightrun Team

Keep getting ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1076) in dpkp kafka-python


Explanation of the problem

We are using kafka-python 2.0.1 with Python 3.7.6 on macOS. The package generally works as expected, but an error occurs when connecting to a broker with SSL authentication enabled, using certificates one of which is self-signed within the company.

The following code is being executed:

return KafkaProducer(
    bootstrap_servers=self.bootstrap_servers,
    acks='all',
    compression_type=None,
    retries=5,
    batch_size=16384 * 5,
    max_block_ms=5000,
    retry_backoff_ms=100 * 10,
    linger_ms=5,
    client_id='data-importer',
    security_protocol='SSL',
    ssl_check_hostname=True,
    api_version=(0, 20),
    ssl_cafile=rel_to(__file__, '../kafkakeys/KafkaClientCAChain.pem'),
    ssl_certfile=rel_to(__file__, '../kafkakeys/certificate.pem'),
    ssl_keyfile=rel_to(__file__, '../kafkakeys/key.pem'),
)

When a message is sent with this configuration, the process becomes stuck in a retry loop and the following traceback is returned:

Traceback (most recent call last):
  File "/Users/-----/dev/prj/data-importer-python/.venv/lib/python3.7/site-packages/kafka/producer/sender.py", line 60, in run
    self.run_once()
  File "/Users/-----/dev/prj/data-importer-python/.venv/lib/python3.7/site-packages/kafka/producer/sender.py", line 160, in run_once
    self._client.poll(timeout_ms=poll_timeout_ms)
  File "/Users/-----/dev/prj/data-importer-python/.venv/lib/python3.7/site-packages/kafka/client_async.py", line 600, in poll
    self._poll(timeout / 1000)
  File "/Users/d-----i/dev/prj/data-importer-python/.venv/lib/python3.7/site-packages/kafka/client_async.py", line 646, in _poll
    conn.connect()
  File "/Users/d-----i/dev/prj/data-importer-python/.venv/lib/python3.7/site-packages/kafka/conn.py", line 426, in connect
    if self._try_handshake():
  File "/Users/-----i/dev/prj/data-importer-python/.venv/lib/python3.7/site-packages/kafka/conn.py", line 505, in _try_handshake
    self._sock.do_handshake()
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/ssl.py", line 1139, in do_handshake
    self._sslobj.do_handshake()
ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1076)

Troubleshooting with the Lightrun Developer Observability Platform

Getting a sense of what’s actually happening inside a live application is a frustrating experience, one that relies mostly on querying and observing whatever logs were written during development.
Lightrun is a Developer Observability Platform, allowing developers to add telemetry to live applications in real-time, on-demand, and right from the IDE.

  • Instantly add logs to, set metrics in, and take snapshots of live applications
  • Insights delivered straight to your IDE or CLI
  • Works where you do: dev, QA, staging, CI/CD, and production

Start for free today

Problem solution for keep getting ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1076)

When dealing with SSL certificate issues, pay close attention to the specific certificate and key files being used. The certificate file contains the public key and, in a mutual TLS setup like this one, is presented by the client to prove its identity to the broker. The key file contains the matching private key, which is used during the TLS handshake to prove ownership of that certificate. The two files must match and be valid, otherwise the client will not be able to establish a secure connection. For example, the certificate and key files can be loaded in Python as follows:

import ssl

cert = "user.crt"
key = "user.key"
context = ssl.create_default_context()
context.load_cert_chain(certfile=cert, keyfile=key)

Another important aspect to consider is the source of the SSL CA file. The CA file, also known as the certificate authority file, contains the root certificate of the certificate chain and is used to verify the authenticity of the certificate presented by the broker. In this case, because one certificate in the chain is self-signed within the company, the CA file passed to the client must include that self-signed root; otherwise verification fails with CERTIFICATE_VERIFY_FAILED. For example, the CA file can be loaded in Python as follows:

ssl_cafile = "ca.crt"
context.load_verify_locations(ssl_cafile)

It is also important to understand the implications of turning off hostname verification and setting the verification mode to none. Hostname verification is used to ensure that the certificate presented by the server matches the hostname of the server. By turning off hostname verification, the client will not be able to verify the identity of the server. Setting the verification mode to none means that the client will not verify the authenticity of the certificate presented by the server. This may be used as a workaround, but it could also have security implications, as the client may be vulnerable to man-in-the-middle attacks.

context.check_hostname = False
context.verify_mode = ssl.CERT_NONE

It is crucial to carefully weigh the trade-offs and decide whether these options are acceptable for the specific use case.
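If the workaround is chosen, the modified context can be handed to kafka-python directly through the ssl_context parameter of KafkaProducer (when ssl_context is provided, the ssl_cafile/ssl_certfile/ssl_keyfile options are ignored). A minimal sketch, assuming the broker address is a placeholder; the safer alternative of trusting the self-signed root is shown commented out:

```python
import ssl

# Safer fix: trust the company's self-signed CA chain instead of
# disabling verification (path is a placeholder from the question):
context = ssl.create_default_context()
# context.load_verify_locations("../kafkakeys/KafkaClientCAChain.pem")

# Workaround from above: disable hostname and certificate verification.
# check_hostname must be turned off before verify_mode can be CERT_NONE.
context.check_hostname = False
context.verify_mode = ssl.CERT_NONE

# The context is then passed to kafka-python:
# producer = KafkaProducer(
#     bootstrap_servers="broker:9093",   # placeholder address
#     security_protocol="SSL",
#     ssl_context=context,
# )
```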

Other popular problems with dpkp kafka-python

Problem: “NoBrokersAvailable” Error

One of the most common problems with kafka-python is encountering the “NoBrokersAvailable” error when attempting to produce or consume messages. This error occurs when the kafka-python client is unable to connect to any of the specified Kafka brokers. This can happen if the brokers are not running or if the incorrect host and port information is provided in the client configuration.

Solution:

To solve this problem, first ensure that the Kafka brokers are running and reachable by using the command line tools kafka-console-producer or kafka-console-consumer. If the brokers are running, check the client configuration and ensure that the host and port information is correct. The following code block is an example of how to configure a KafkaProducer client in kafka-python:

from kafka import KafkaProducer
producer = KafkaProducer(bootstrap_servers='host1:port1,host2:port2')
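Before digging into client configuration, a plain TCP probe can confirm whether each bootstrap host:port is reachable at all. This is a hypothetical helper using only the standard library, independent of kafka-python; the host names below are placeholders:

```python
import socket

def broker_reachable(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers refused connections, timeouts, and DNS failures.
        return False

# Example: probe each entry of bootstrap_servers before creating the client.
# for hostport in "host1:port1,host2:port2".split(","):
#     host, port = hostport.rsplit(":", 1)
#     print(hostport, broker_reachable(host, int(port)))
```

If the probe fails, the "NoBrokersAvailable" error is a network or broker problem rather than a kafka-python configuration issue.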

Problem: “KafkaError” when producing messages with key

Another common problem with kafka-python is encountering a “KafkaError” when producing messages with a key. This error occurs when the key is not a bytes type. In kafka-python, the key must be of type bytes when using the KafkaProducer.send method.

Solution:

To solve this problem, ensure that the key is of type bytes before producing the message. The following code block is an example of how to correctly produce a message with a key:

key = "key_string".encode()
value = "value_string".encode()
producer.send('topic', key=key, value=value)
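A small defensive helper (hypothetical, not part of kafka-python) can make the bytes requirement explicit before calling send; alternatively, KafkaProducer accepts a key_serializer such as str.encode so the conversion happens automatically:

```python
def ensure_bytes(key):
    """Return the key as bytes, encoding str keys as UTF-8."""
    if isinstance(key, bytes):
        return key
    if isinstance(key, str):
        return key.encode("utf-8")
    raise TypeError(
        "Kafka message keys must be bytes (or str, which will be encoded), "
        f"got {type(key).__name__}"
    )

# producer.send('topic', key=ensure_bytes("key_string"), value=b"value_string")
```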

Problem: “KafkaError” when consuming messages from specific partition

Another problem that may occur when using kafka-python is encountering a “KafkaError” when consuming messages from a specific partition. This error occurs when the partition does not exist or the consumer is not authorized to access it.

Solution:

To solve this problem, ensure that the partition exists and that the consumer has the necessary permissions to access it. Additionally, when consuming messages from a specific partition, it’s important to specify the correct topic and partition number. The following code block is an example of how to correctly consume messages from a specific partition:

from kafka import KafkaConsumer, TopicPartition

# Do not pass the topic to the constructor when assigning partitions manually;
# subscribing and manual assignment are mutually exclusive in kafka-python.
consumer = KafkaConsumer(bootstrap_servers='host1:port1')
consumer.assign([TopicPartition('topic', partition)])

A brief introduction to dpkp kafka-python

Kafka-python is a Python library for working with Apache Kafka, a distributed streaming platform. It provides a high-level API for producing and consuming messages, as well as more advanced functionality such as message key-value pairs, partitioning, and message compression. The library implements the Kafka protocol and is fully compatible with it, making it easy to integrate with existing Kafka systems.

Kafka-python's KafkaProducer and KafkaConsumer classes provide a synchronous API: blocking operations such as KafkaProducer.flush wait for acknowledgement from the Kafka broker before returning. For an async/await-based API there is the separate aiokafka library, whose AIOKafkaProducer and AIOKafkaConsumer classes allow non-blocking operations that return immediately, which can improve performance and scalability when working with large numbers of messages. Additionally, kafka-python supports features such as message timestamps and message headers.

Most popular use cases for dpkp kafka-python

  1. Kafka-python can be used for building real-time data pipelines and streaming applications. By using the library’s API for producing and consuming messages, developers can easily connect to a Kafka cluster and move data between systems in near real-time. This can be used to build data pipelines for tasks such as data processing, analytics, and machine learning.
  2. kafka-python can be used for building scalable and fault-tolerant message-based systems. Apache Kafka, the technology that kafka-python is built on, is designed to handle large amounts of data with high throughput and low latency. By using kafka-python, developers can leverage the power of Kafka to build highly available and fault-tolerant systems that can handle millions of messages per second.
  3. kafka-python can be used for building event-driven architectures. By using the library’s API for producing and consuming messages, developers can easily create systems that respond to specific events in near real-time. This can be used to build applications such as real-time analytics, fraud detection, and IoT systems that need to react to events as they happen. Additionally, kafka-python supports features such as message timestamps and message headers that allow building more complex event-driven architectures.

It’s Really not that Complicated.

You can actually understand what’s going on inside your live applications. It’s a registration form away.

Get Lightrun
