
bug: A single Environment definition can not be split into multiple YAML files

See original GitHub issue

Meltano Version

2.5.0

Python Version

NA

Bug scope

API

Operating System

Any

Description

When an Environment (e.g. prod) is defined in multiple project files, Meltano fails with the following error message:

Plugin with path "environments:prod" already added in file /run/media/atif.imam/data/pycharm_projects/data-integration/source_config/file_1.yml.

This is because the multi-file implementation indexes environments by name alone, so an entire environment definition MUST be contained in a single YAML file. That makes it impossible to split plugin configuration into one file per plugin while sharing a common set of environments across those files.

From a Slack thread: https://meltano.slack.com/archives/C01TCRBBJD7/p1662443196912219
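The failure mode described above can be sketched in a few lines. The following is a hypothetical illustration, not Meltano's actual implementation: name-only indexing rejects a second definition of the same environment, whereas a deep merge keyed on the same name could combine definitions from multiple files.

```python
# Hypothetical sketch (not Meltano's code) of why indexing environments
# by name alone fails when a definition is split across files, and how
# a name-keyed deep merge could combine them instead.

def index_by_name(files):
    """Name-only indexing: a second definition of the same environment
    raises, mirroring the 'already added' error in the report."""
    index = {}
    for path, envs in files.items():
        for env in envs:
            if env["name"] in index:
                raise ValueError(
                    f'Plugin with path "environments:{env["name"]}" '
                    f"already added in file {path}."
                )
            index[env["name"]] = env
    return index


def deep_merge(base, extra):
    """Recursively merge `extra` into `base`; `extra` wins on scalar conflicts."""
    merged = dict(base)
    for key, value in extra.items():
        if key in merged and isinstance(merged[key], dict) and isinstance(value, dict):
            merged[key] = deep_merge(merged[key], value)
        else:
            merged[key] = value
    return merged


def merge_by_name(files):
    """Alternative behaviour: same-named environments are deep-merged."""
    index = {}
    for envs in files.values():
        for env in envs:
            index[env["name"]] = deep_merge(index.get(env["name"], {}), env)
    return index


# Minimal stand-in for the two include files in this issue.
files = {
    "file_1.yml": [{"name": "prod",
                    "config": {"plugins": {"loaders": [{"name": "target-snowflake-db1"}]}}}],
    "file_2.yml": [{"name": "prod",
                    "config": {"plugins": {"extractors": [{"name": "tap-mysql-db2"}]}}}],
}
```

Under the merge behaviour, the prod environment would end up with both the loaders override from file_1.yml and the extractors override from file_2.yml; under name-only indexing, loading file_2.yml raises.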

Code

YAML files
# meltano.yml
environments:
- name: dev
- name: uat
- name: prod
send_anonymous_usage_stats: true
project_id: 488164fd-cb5b-4fe1
include_paths:
  - ./source_config/file_1.yml
  - ./source_config/file_2.yml

plugins:
  extractors:
    - name: tap-mysql
      variant: transferwise
      pip_url: pipelinewise-tap-mysql
      config:
        ..
        ..
  loaders:
    - name: target-snowflake
      variant: transferwise
      pip_url: pipelinewise-target-snowflake
      config:
        add_metadata_columns: true
        hard_delete: true
        ..
        ..
# file_1.yml
environments:
  - name: prod
    config:
      plugins:
        extractors:
          - name: tap-mysql-db1
              
        loaders:
          - name: target-snowflake-db1
            config:
              s3_key_prefix: snowflake/db1/
  - name: uat
    config:
      plugins:
        extractors:
          - name: tap-mysql-db1
              
        loaders:
          - name: target-snowflake-db1
            config:

plugins:
  extractors:
    - name: tap-mysql-db1
      inherit_from: tap-mysql
  loaders:
    - name: target-snowflake-db1
      inherit_from: target-snowflake
      config:
        default_target_schema: db1
# file_2.yml
environments:
  - name: prod
    config:
      plugins:
        extractors:
          - name: tap-mysql-db2

        loaders:
          - name: target-snowflake-db2
            config:
              s3_key_prefix: snowflake/db2/
  - name: uat
    config:
      plugins:
        extractors:
          - name: tap-mysql-db2

        loaders:
          - name: target-snowflake-db2
            config:

plugins:
  extractors:
    - name: tap-mysql-db2
      inherit_from: tap-mysql
  loaders:
    - name: target-snowflake-db2
      inherit_from: target-snowflake
      config:
        default_target_schema: db2

Issue Analytics

  • State: open
  • Created: a year ago
  • Comments: 5

Top GitHub Comments

2 reactions
msardana94 commented, Sep 20, 2022

If we can add support for dynamic select expressions via interpolation of env variables, would this solve the most important parts of your use case?

If we can solve for the select issue, then (hopefully) this enables you to keep all of the distinct mysql data sources in separate files (one plugin declaration per file, presumably), with the expectation that only environment-specific overrides need to be in the environment declaration. Do you think this could work?

Yes 💯 it will definitely solve the problem and make testing easier.

I have opened an issue:

Thanks for opening the issue.
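For illustration, the dynamic select expressions discussed above might look like the following. This is a hypothetical sketch: the TAP_MYSQL_SELECT_FILTER variable name is invented, and at the time of this thread select expressions were not interpolated (that is what the follow-up issue requests); the sketch assumes they could use the same ${VAR} expansion that plugin config values support.

```yaml
# Hypothetical sketch only: assumes select expressions gain the ${VAR}
# environment-variable expansion that plugin config values already have.
# TAP_MYSQL_SELECT_FILTER is an invented variable name.
plugins:
  extractors:
    - name: tap-mysql-db1
      inherit_from: tap-mysql
      select:
        - ${TAP_MYSQL_SELECT_FILTER}
```

With per-environment values for TAP_MYSQL_SELECT_FILTER, each environment could narrow the selection without duplicating the plugin declaration across files.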

1 reaction
edgarrmondragon commented, Sep 7, 2022

@aaronsteers I too am wary of allowing environments to be spread across multiple files. Still, this is logged in case anybody searches the issues for "Plugin with path ... already added". I recommended your one-file-per-environment approach to the user 👍.
