
WorkflowsCreateExecutionOperator execution argument only receives bytes


Apache Airflow Provider(s)

google

Versions of Apache Airflow Providers

apache-airflow-providers-google==7.0.0

Apache Airflow version

2.3.2

Operating System

Ubuntu 20.04.5 LTS (Focal Fossa)

Deployment

Docker-Compose

Deployment details

No response

What happened

WorkflowsCreateExecutionOperator triggers a Google Cloud Workflows execution, and its execution parameter expects an argument of the form {"argument": {"key": "val", ...}}.

But when I passed the argument as a dict using render_template_as_native_obj=True, a protobuf error occurred: TypeError: {'projectId': 'project-id', 'location': 'us-east1'} has type dict, but expected one of: bytes, unicode.

When I passed the argument as bytes, {"argument": b'{\n  "projectId": "project-id",\n  "location": "us-east1"\n}'}, it worked.
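The TypeError comes from the underlying protobuf message, whose argument field accepts only str/bytes. A minimal illustration outside Airflow (assuming nothing beyond what the traceback shows, namely that the field rejects dicts):

```python
import json

argument = {"projectId": "project-id", "location": "us-east1"}

# Passing this dict straight into the protobuf field fails with:
#   TypeError: {...} has type dict, but expected one of: bytes, unicode.
# Serializing it first yields a value the field accepts:
serialized = json.dumps(argument).encode("utf-8")
print(serialized)  # b'{"projectId": "project-id", "location": "us-east1"}'
```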

What you think should happen instead

The execution argument should accept a dict instead of requiring bytes.

How to reproduce

Not working:

from airflow import DAG
from airflow.models.param import Param
from airflow.operators.dummy_operator import DummyOperator
from airflow.providers.google.cloud.operators.workflows import WorkflowsCreateExecutionOperator

with DAG(
    dag_id="continual_learning_deid_norm_h2h_test",
    params={
        "location": Param(type="string", default="us-east1"),
        "project_id": Param(type="string", default="project-id"),
        "workflow_id": Param(type="string", default="orkflow"),
        "workflow_execution_info": {
            "argument": {
                "projectId": "project-id",
                "location": "us-east1"
            }
        }
    },
    render_template_as_native_obj=True
) as dag:
    execution = "{{ params.workflow_execution_info }}"
    create_execution = WorkflowsCreateExecutionOperator(
        task_id="create_execution",
        location="{{ params.location }}",
        project_id="{{ params.project_id }}",
        workflow_id="{{ params.workflow_id }}",
        execution="{{ params.workflow_execution_info }}"
    )

    start_operator = DummyOperator(task_id='test_task')

    start_operator >> create_execution

Working:

from airflow import DAG
from airflow.models.param import Param
from airflow.operators.dummy_operator import DummyOperator
from airflow.providers.google.cloud.operators.workflows import WorkflowsCreateExecutionOperator

with DAG(
    dag_id="continual_learning_deid_norm_h2h_test",
    params={
        "location": Param(type="string", default="us-east1"),
        "project_id": Param(type="string", default="project-id"),
        "workflow_id": Param(type="string", default="orkflow"),
        "workflow_execution_info": {
            "argument": b'{\n  "projectId": "project-id",\n  "location": "us-east1"\n}'
        }
    },
    render_template_as_native_obj=True
) as dag:
    execution = "{{ params.workflow_execution_info }}"
    create_execution = WorkflowsCreateExecutionOperator(
        task_id="create_execution",
        location="{{ params.location }}",
        project_id="{{ params.project_id }}",
        workflow_id="{{ params.workflow_id }}",
        execution="{{ params.workflow_execution_info }}"
    )

    start_operator = DummyOperator(task_id='test_task')

    start_operator >> create_execution
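The working version above hardcodes the JSON bytes literal in params. As a sketch using the same values, the bytes can instead be produced from a plain dict with json.dumps, which keeps the params readable and avoids transcribing escape sequences by hand:

```python
import json

# Same payload as the working example above, expressed as a dict.
workflow_argument = {"projectId": "project-id", "location": "us-east1"}

# With indent=2 this reproduces the hardcoded literal byte-for-byte,
# so it can be dropped into "workflow_execution_info" -> "argument".
argument_bytes = json.dumps(workflow_argument, indent=2).encode("utf-8")
print(argument_bytes)
```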

Anything else

No response

Are you willing to submit PR?

  • Yes, I am willing to submit a PR!


Issue Analytics

  • State: closed
  • Created: a year ago
  • Comments: 5 (5 by maintainers)

Top GitHub Comments

1 reaction
potiuk commented, Oct 29, 2022

Commented there. You need to add a test. And making project_id templated for those might be a good idea for another PR.
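The actual fix lives in the linked PR and isn't shown here. As a hedged sketch only (the helper name and placement are hypothetical, not the provider's real code), accepting a dict could amount to pre-serializing it before the Workflows API call:

```python
import json

def coerce_execution_argument(execution: dict) -> dict:
    """Hypothetical pre-processing step: if the rendered "argument" value
    is a dict, serialize it to JSON bytes so the protobuf field accepts it;
    bytes/str values pass through unchanged."""
    argument = execution.get("argument")
    if isinstance(argument, dict):
        return {**execution, "argument": json.dumps(argument).encode("utf-8")}
    return execution
```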

0 reactions
rkarish commented, Oct 29, 2022

Also want to mention that the project_id parameter is not a templated field.
