WorkflowsCreateExecutionOperator execution argument only receives bytes
See original GitHub issue
Apache Airflow Provider(s)
google
Versions of Apache Airflow Providers
apache-airflow-providers-google==7.0.0
Apache Airflow version
2.3.2
Operating System
Ubuntu 20.04.5 LTS (Focal Fossa)
Deployment
Docker-Compose
Deployment details
No response
What happened
WorkflowsCreateExecutionOperator triggers a Google Cloud workflow, and its execution parameter receives an argument of the form {"argument": {"key": "val", ...}}.
But when I passed the argument as a dict using render_template_as_native_obj=True, a protobuf error occurred: TypeError: {'projectId': 'project-id', 'location': 'us-east1'} has type dict, but expected one of: bytes, unicode.
When I passed the argument as bytes, {"argument": b'{\n "projectId": "project-id",\n "location": "us-east1"\n}'}, it worked.
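Until the dict form is supported, the rendered value can be serialized by hand. A minimal sketch of that workaround, using the same field names as the reproduction below:

```python
import json

# The dict we would like to pass directly via render_template_as_native_obj:
workflow_args = {"projectId": "project-id", "location": "us-east1"}

# Workaround: serialize to UTF-8 JSON bytes, the form the protobuf
# Execution message currently accepts for its `argument` field.
execution_info = {"argument": json.dumps(workflow_args).encode("utf-8")}

print(execution_info["argument"])
```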
What you think should happen instead
The execution argument should accept a dict instead of requiring bytes.
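One way the operator could support this is to coerce the rendered value to bytes before building the Execution proto. A hedged sketch of such a coercion step (`coerce_argument` is a hypothetical helper name, not part of the provider's API):

```python
import json

def coerce_argument(value):
    """Hypothetical helper: accept dict, str, or bytes for `argument`
    and always return the bytes the protobuf layer expects."""
    if isinstance(value, dict):
        return json.dumps(value).encode("utf-8")
    if isinstance(value, str):
        return value.encode("utf-8")
    return value

print(coerce_argument({"projectId": "project-id", "location": "us-east1"}))
```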
How to reproduce
Not working:
from airflow import DAG
from airflow.models.param import Param
from airflow.operators.dummy_operator import DummyOperator
from airflow.providers.google.cloud.operators.workflows import WorkflowsCreateExecutionOperator

with DAG(
    dag_id="continual_learning_deid_norm_h2h_test",
    params={
        "location": Param(type="string", default="us-east1"),
        "project_id": Param(type="string", default="project-id"),
        "workflow_id": Param(type="string", default="orkflow"),
        "workflow_execution_info": {
            "argument": {
                "projectId": "project-id",
                "location": "us-east1"
            }
        }
    },
    render_template_as_native_obj=True
) as dag:
    execution = "{{ params.workflow_execution_info }}"
    create_execution = WorkflowsCreateExecutionOperator(
        task_id="create_execution",
        location="{{ params.location }}",
        project_id="{{ params.project_id }}",
        workflow_id="{{ params.workflow_id }}",
        execution="{{ params.workflow_execution_info }}"
    )
    start_operator = DummyOperator(task_id='test_task')
    start_operator >> create_execution
Working:
from airflow import DAG
from airflow.models.param import Param
from airflow.operators.dummy_operator import DummyOperator
from airflow.providers.google.cloud.operators.workflows import WorkflowsCreateExecutionOperator

with DAG(
    dag_id="continual_learning_deid_norm_h2h_test",
    params={
        "location": Param(type="string", default="us-east1"),
        "project_id": Param(type="string", default="project-id"),
        "workflow_id": Param(type="string", default="orkflow"),
        "workflow_execution_info": {
            "argument": b'{\n "projectId": "project-id",\n "location": "us-east1"\n}'
        }
    },
    render_template_as_native_obj=True
) as dag:
    execution = "{{ params.workflow_execution_info }}"
    create_execution = WorkflowsCreateExecutionOperator(
        task_id="create_execution",
        location="{{ params.location }}",
        project_id="{{ params.project_id }}",
        workflow_id="{{ params.workflow_id }}",
        execution="{{ params.workflow_execution_info }}"
    )
    start_operator = DummyOperator(task_id='test_task')
    start_operator >> create_execution
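Alternatively, the dict can stay the source of truth in the DAG file and be encoded once before it is placed in params. A sketch, assuming the argument does not need per-run templating (`to_proto_argument` is a local helper, not a provider API):

```python
import json

def to_proto_argument(d):
    # Encode a dict as the UTF-8 JSON bytes the Execution proto accepts.
    return json.dumps(d, indent=2).encode("utf-8")

workflow_args = {"projectId": "project-id", "location": "us-east1"}
workflow_execution_info = {"argument": to_proto_argument(workflow_args)}

print(workflow_execution_info["argument"])
```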
Anything else
No response
Are you willing to submit PR?
- Yes I am willing to submit a PR!
Code of Conduct
- I agree to follow this project’s Code of Conduct
Issue Analytics
- Created a year ago
- Comments: 5 (5 by maintainers)
Commented there. You need to add a test. And making project_id templated might be a good idea for another PR.
Also want to mention that the project_id parameter is not a templated field.
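For context, templated fields are just a class attribute, so a subclass can opt project_id in. A generic sketch of the pattern using a stand-in base class (not the real operator, whose template_fields tuple may differ):

```python
# Stand-in for WorkflowsCreateExecutionOperator; the real operator's
# template_fields tuple may differ.
class StandInOperator:
    template_fields = ("location", "workflow_id", "execution")

class PatchedOperator(StandInOperator):
    # Extend the parent's templated fields with project_id so Jinja
    # expressions in that argument get rendered too.
    template_fields = (*StandInOperator.template_fields, "project_id")

print(PatchedOperator.template_fields)
```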