Add ability to pass secrets into Kafka Connect build container

Is your feature request related to a problem? Please describe.
One of our Kafka Connect plugins is hosted in a private Maven repository (Artifactory). Currently there is no way, as far as I can tell, to pass a secret (as an environment variable or otherwise) into the build container that we could use when making requests to our private Maven repo.

Describe the solution you’d like
Either:

  1. Add the ability to set environment variables from secrets in the build container (maybe add an additional buildExternalConfiguration section to the KafkaConnectSpec?). Users could then specify plugin URLs in the format:
https://username:$PASSWORD@MY_PRIVATE_REPO/some_plugin.tar.gz

This approach relies on curl converting the credentials in the URL to an Authorization header, meaning it won’t work for custom headers.

  2. Add an authHeader field to the DownloadableArtifact spec, for example:
plugins:
  - name: debezium-postgres-connector
    artifacts:
      - type: tgz
        url: https://MY_PRIVATE_REPO/some_plugin.tar.gz
        authHeader:
          secretName: my-private-repo-auth-secret
          header: my-private-repo-header

Users could then set the auth header secret to values such as Authorization: Basic abc12345=, Authorization: token 5199831f4dd3b79e7c5b7e0ebe75d67aa66e79d4, or X-JFrog-Art-Api: ABcdEF. The auth headers would be injected as environment variables into the build container, and the KafkaConnectDockerfile class would then add -H "$ENVIRONMENT_VARIABLE" to the generated curl request where appropriate.
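
For illustration, a plain Kubernetes Secret matching the hypothetical authHeader example above might look like the following sketch (the secret name and key are taken from the example; the header value is a placeholder):

apiVersion: v1
kind: Secret
metadata:
  name: my-private-repo-auth-secret
type: Opaque
stringData:
  # Complete header line that the generated curl command would pass via -H
  # (placeholder value)
  my-private-repo-header: "X-JFrog-Art-Api: ABcdEF"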

Describe alternatives you’ve considered
I tried finding a way to do this with the templates section of the KafkaConnect spec, but couldn’t find a way to set an environment variable based on a secret (it doesn’t seem to be supported in the code).
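
For reference, the standard Kubernetes pattern for exposing a secret value as a container environment variable is sketched below; this is a generic pod/container fragment, not a field the KafkaConnect template currently exposes:

# Generic Kubernetes container spec fragment (not part of the KafkaConnect API):
# exposes one key of a Secret as an environment variable.
containers:
  - name: build
    image: example-build-image:latest   # placeholder image
    env:
      - name: PRIVATE_REPO_AUTH
        valueFrom:
          secretKeyRef:
            name: my-private-repo-auth-secret
            key: my-private-repo-header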

Additional context
If you think the auth header feature sounds like a good approach, I’d be happy to contribute; let me know and I’ll send a PR.

Issue Analytics

  • State: open
  • Created: 3 years ago
  • Reactions: 3
  • Comments: 8 (5 by maintainers)

Top GitHub Comments

1 reaction
adrianisk commented, Apr 23, 2021

Awesome! Yeah, that’s in line with what I was thinking as well. I also agree that someone might need multiple headers, so an array makes sense to me. I’m happy to implement this. How about I take a look at the code sometime today or Monday and write up a quick summary of how I think it could be implemented, so you can double-check that it makes sense before I start?

1 reaction
scholzj commented, Apr 23, 2021

I had some thoughts on how this could be done. I think the API might look something like this:

plugins:
  - name: debezium-postgres-connector
    artifacts:
      - type: tgz
        url: https://MY_PRIVATE_REPO/some_plugin.tar.gz
        authentication:
          type: httpHeader
          header:
            valueFrom:
              secretKeyRef:
                name: authSecret
                key: authHeader

Where the value in the secret would be something like Authorization: Basic AXVubzpwQDU1dzByYM==.

Alternatively, I guess we could also split the header into two parts:

plugins:
  - name: debezium-postgres-connector
    artifacts:
      - type: tgz
        url: https://MY_PRIVATE_REPO/some_plugin.tar.gz
        authentication:
          type: httpHeader
          header: Authorization
          token:
            valueFrom:
              secretKeyRef:
                name: authSecret
                key: authHeader

Where the value in the secret would be basically just something like Basic AXVubzpwQDU1dzByYM== and the full header would be stitched together in the code.
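
To make the difference between the two variants concrete, the Secret for the split form would carry only the credential part, with the header name coming from the custom resource (names are taken from the examples above; the value is a placeholder, and a real Secret name would need to be lowercase):

apiVersion: v1
kind: Secret
metadata:
  name: auth-secret   # the examples use authSecret; Kubernetes Secret names must be lowercase
type: Opaque
stringData:
  # Split variant: credential only; the operator would prepend "Authorization: "
  authHeader: "Basic AXVubzpwQDU1dzByYM=="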

This should follow the usual Kube and Strimzi API designs and be easily extensible to additional types of authentication if needed. One thing I’m not sure about is whether we might in some cases need multiple headers at the same time, in which case we would need an array, e.g.:

plugins:
  - name: debezium-postgres-connector
    artifacts:
      - type: tgz
        url: https://MY_PRIVATE_REPO/some_plugin.tar.gz
        authentication:
          type: httpHeader
          headers:
            - header:
                valueFrom:
                  secretKeyRef:
                    name: authSecret
                    key: authHeader

I have not yet figured out how to implement it in the build. I guess we can mount the secrets as env vars and use them in the generated Dockerfile as build arguments. But we will need to double-check that it works with both the plain Kubernetes and OCP build implementations.
