BigQuery Service Account JSON Authentication not working with env_var
Describe the bug
Same as https://github.com/fishtown-analytics/dbt/issues/2553, but:
if I put the value directly:
private_key: XXX
it works; using env_var it does not.
Steps To Reproduce
Same as https://github.com/fishtown-analytics/dbt/issues/2553:
private_key: "{{ env_var('BQ_PRIVATE_KEY') }}" or private_key: "{{env_var('BQ_PRIVATE_KEY')}}"
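For context, a sketch of the relevant part of profiles.yml (profile, project, and dataset names below are placeholders, not from the original report; the field names follow dbt's service-account-json method):

```yaml
my_profile:
  target: prod
  outputs:
    prod:
      type: bigquery
      method: service-account-json
      project: my-gcp-project        # placeholder
      dataset: my_dataset            # placeholder
      keyfile_json:
        type: service_account
        project_id: my-gcp-project   # placeholder
        private_key: "{{ env_var('BQ_PRIVATE_KEY') }}"
        client_email: "{{ env_var('BQ_CLIENT_EMAIL') }}"
        # ...remaining keyfile fields...
```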
Expected behavior
dbt should run normally
Screenshots and log output
Could not deserialize key data.
System information
Which database are you using dbt with?
- [ ] postgres
- [ ] redshift
- [x] bigquery
- [ ] snowflake
- [ ] other (specify: ____________)
The output of dbt --version:
installed version: 0.18.1
The operating system you’re using: Debian GNU/Linux 10 (buster) --> docker image fishtownanalytics/dbt:0.18.1
The output of python --version:
Python 3.8.3
Issue Analytics
- State:
- Created 3 years ago
- Comments: 5 (2 by maintainers)
Top Results From Across the Web
- BigQuery Service Account JSON Authentication not working: Trying to authenticate to BigQuery using Service Account JSON Authentication because I'm running dbt from a docker container.
- Authenticating with a service account key file | BigQuery: Create credentials in your application from the service account file. ... Load the credentials from the JSON file using GoogleCredential.FromStream(Stream).
- Setting GOOGLE_APPLICATION_CREDENTIALS for ...: I'm not sure about BigQuery, but I'm using Google Data Store for saving. ... gcloud auth activate-service-account --key-file=<path to your generated json...
- Google Auth Library for Python | Read the Docs: If the environment variable GOOGLE_APPLICATION_CREDENTIALS is set to the path of a valid service account JSON private key file, then it is loaded...
- Use Google Cloud user credentials when testing containers ...: But the suggested JSON file to use is a service account key file. It must be generated and stored locally! This solution isn't...
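The results above point at the standard alternative: keep the keyfile on disk and reference it by path instead of inlining the key through an environment variable. With the docker image from this issue, that could look like the following sketch (all paths are placeholders; google-auth picks up GOOGLE_APPLICATION_CREDENTIALS automatically):

```shell
# Mount the keyfile into the container and point the standard Google
# env var at it (placeholder paths, shown as a comment):
#   docker run \
#     -v /host/secrets/keyfile.json:/secrets/keyfile.json \
#     -e GOOGLE_APPLICATION_CREDENTIALS=/secrets/keyfile.json \
#     fishtownanalytics/dbt:0.18.1 run

# Inside the container, the variable just needs to hold the path:
export GOOGLE_APPLICATION_CREDENTIALS=/secrets/keyfile.json
```

This sidesteps the newline problem entirely, since the key never leaves the JSON file.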
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
As in #2553 and #2989, this is quite difficult to debug at a distance, but here's my best guess: the "Could not deserialize key data" error indicates that some character, often a newline, is added, missing, or in the wrong place.

Private keys contain many newlines, and they're quite sensitive about them. In JSON, newlines in strings are represented via "\n". When you download a JSON-formatted private keyfile from BigQuery, the private_key value will therefore contain "\n".

However, when you're passing the private_key value as a standalone string (not within a larger JSON blob), you need to replace all the instances of "\n" with a true newline. When setting the environment variable, you want to end up with something like:

I'm not positive why var is working while env_var is not (I wasn't able to replicate this!), but it's possible that printing the env var out to the CLI, then back in as an argument to --vars, replaces "\n" with newlines because it's a bash string. I don't think this reflects a functional difference between how dbt handles {{ var() }} vs. {{ env_var() }}.

If you're still having trouble, or if you don't care to manually edit the private key, you could also try passing the entire keyfile contents (as a JSON blob) into a single environment variable:
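A bash sketch of both suggestions (the key string and keyfile path below are placeholders, not taken from the original comment):

```shell
# Key as copied out of the JSON keyfile: it still contains literal \n
# escape sequences instead of real newlines (truncated placeholder key).
RAW_KEY='-----BEGIN PRIVATE KEY-----\nMIIEvQIBADANBg...\n-----END PRIVATE KEY-----\n'

# printf '%b' interprets the \n escapes, so the exported value holds
# real newlines, which is what the PEM parser expects.
export BQ_PRIVATE_KEY="$(printf '%b' "$RAW_KEY")"

# Alternative: skip editing the key entirely and pass the whole keyfile
# as one JSON blob; JSON keeps the \n escapes, and they are decoded
# when the blob is parsed (placeholder path, shown as a comment):
#   export BQ_KEYFILE_JSON="$(cat /path/to/keyfile.json)"
```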
Thanks, there must be some difference between env_var and var, as var works like a charm with the same complex key.