Allow partition-wise copy from bq to bq tables.
Description
As of now, the existing code has no support for copying a partition from a source table into a destination table's partition. The bq cp CLI offers this via the partition decorator ($).
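For illustration, here is a minimal sketch of the CLI form being referred to: bq cp copies a single ingestion-time partition when both table names carry the $YYYYMMDD decorator. The helper function and the project/dataset names below are hypothetical; only the bq cp syntax itself comes from the BigQuery docs.

```python
from datetime import date

def partition_copy_cmd(src: str, dst: str, day: date, force: bool = True) -> str:
    """Compose a `bq cp` command that copies one ingestion-time partition,
    selected with the `$YYYYMMDD` decorator on both sides.
    (Hypothetical helper; the table names are single-quoted so the shell
    does not treat `$` as a variable reference.)"""
    suffix = day.strftime("%Y%m%d")
    flags = "-f " if force else ""  # -f overwrites the target without prompting
    return f"bq cp {flags}'{src}${suffix}' '{dst}${suffix}'"

print(partition_copy_cmd("proj-a:ds.events", "proj-b:ds.events", date(2023, 1, 15)))
# → bq cp -f 'proj-a:ds.events$20230115' 'proj-b:ds.events$20230115'
```

Keeping tables in sync then reduces to running this for each new partition date.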
Use case/motivation
We have many use cases every day where we keep tables across projects in sync by materialising new partitions on a schedule, so this feature would be great. The other option for now would be DBT, but adopting it just for this would be an added task.
Related issues
No response
Are you willing to submit a PR?
- Yes I am willing to submit a PR!
Code of Conduct
- I agree to follow this project’s Code of Conduct
Issue Analytics
- State:
- Created a year ago
- Comments:8 (6 by maintainers)
@eladkal, @kaxil Can you add the area:providers and provider:Google labels to this one? Also, it seems like a somewhat straightforward update to the operator (though GCP is not my area of expertise), so perhaps Good First Issue as well?

Hey @bhankit1410, I think this feature is already supported.
I have tested it with BigQueryToBigQueryOperator. I created a simple DAG, which you can see here: https://gist.github.com/bugraoz93/d3ee6d2d03d1881de4614d1e7c3b8234. I generated 1000 sample records with several column types (integer, UUID, and timestamp), inserted the data into a table, and tested with all three write dispositions. Every case I have tested has worked for me so far.
I have also checked the code and tested the individual methods to make sure they can process the $ sign within the table name without any exceptions. There is nothing in the code that prevents this feature from working. Could you please expand on your case a little? Which Apache Airflow version are you using, and which version of the Google provider (apache-airflow-providers-google)?
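To make the setup being tested concrete, below is a hedged sketch of the arguments one would pass to BigQueryToBigQueryOperator (from apache-airflow-providers-google) for a partition-wise copy. The project, dataset, table, and task names are made up, and a plain dict stands in for the operator call so the fragment stays self-contained; only the parameter names and the $YYYYMMDD decorator behaviour are taken from the provider and BigQuery docs.

```python
from datetime import date

run_day = date(2023, 1, 15)            # in a real DAG this comes from the logical date
partition = run_day.strftime("%Y%m%d")

# Keyword arguments for BigQueryToBigQueryOperator; the $YYYYMMDD decorator
# restricts both source and destination to a single ingestion-time partition.
# The names here (proj-a, proj-b, ds.events, the task_id) are hypothetical.
copy_partition_kwargs = {
    "task_id": "copy_daily_partition",
    "source_project_dataset_tables": f"proj-a.ds.events${partition}",
    "destination_project_dataset_table": f"proj-b.ds.events${partition}",
    "write_disposition": "WRITE_TRUNCATE",     # replace the target partition
    "create_disposition": "CREATE_IF_NEEDED",
}
print(copy_partition_kwargs["source_project_dataset_tables"])
# → proj-a.ds.events$20230115
```

In a DAG this dict would be splatted into BigQueryToBigQueryOperator(**copy_partition_kwargs), with the partition suffix typically templated from the run's logical date (e.g. {{ ds_nodash }}) rather than hard-coded.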