Some dbt_utils macros do not work with Spark SQL

Describe the bug

It appears that many dbt_utils macros are not supported on Spark SQL, because several of the macros cast using the Postgres-style `::` operator instead of the ANSI-standard cast() function.
Steps to reproduce

- Add dbt_utils to your packages.yml file:

    packages:
      - package: fishtown-analytics/dbt_utils
        version: 0.6.2

- Create a simple model with the following SQL:

    select {{ dbt_utils.current_timestamp() }}
Expected results

The model should compile and the view should be created successfully.
Actual results
Runtime Error in model dbt_test (models/staging/dbt_test.sql)
Database Error
Error running query: org.apache.spark.sql.catalyst.parser.ParseException:
mismatched input ':' expecting <EOF>(line 6, pos 21)
== SQL ==
/* {"app": "dbt", "dbt_version": "0.18.1", "profile_name": "databricks", "target_name": "dev", "node_id": "model.dbt_databricks.dbt_test"} */
create or replace view brian_dev_stg.dbt_test
as
select
current_timestamp::
---------------------^^^
timestamp
Done. PASS=0 WARN=0 ERROR=1 SKIP=0 TOTAL=1
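For context, the parser is rejecting the `::` shorthand itself; the same cast expressed in the ANSI form is valid Spark SQL. A minimal sketch of the two forms (the second is what a fix would compile to):

```sql
-- Postgres-style cast shorthand, rejected by Spark SQL's parser:
select current_timestamp::timestamp

-- ANSI-standard cast() form, accepted by Spark SQL (and Postgres alike):
select cast(current_timestamp() as timestamp)
```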
System information

The contents of your packages.yml file:

    packages:
      - package: fishtown-analytics/dbt_utils
        version: 0.6.2
Which database are you using dbt with?

- [ ] postgres
- [ ] redshift
- [ ] bigquery
- [ ] snowflake
- [x] other (specify: Spark SQL)
The output of dbt --version:

    installed version: 0.18.1
       latest version: 0.18.1

    Up to date!

    Plugins:
      - bigquery: 0.18.1
      - snowflake: 0.18.1
      - redshift: 0.18.1
      - postgres: 0.18.1
      - spark: 0.18.0
The operating system you're using:

    macOS Catalina Version 10.15.7

The output of python --version:

    Python 3.7.6
Additional context

Here is one place where casting is done with `::` instead of cast():
https://github.com/fishtown-analytics/dbt-utils/blob/9feaccc327a7409298a2bc362db53c2e597024fa/macros/cross_db_utils/current_timestamp.sql#L6

Are you interested in contributing the fix?

Yes! I am happy to change all the `::` casts to cast(), or to provide another solution. I would love to contribute to dbt-utils!
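One shape such a fix could take is an adapter-specific macro variant, following dbt's adapter-prefix naming convention (`spark__<macro_name>`). This is a sketch, not the actual patch; the macro body assumes Spark's built-in current_timestamp() function:

```sql
{# Default implementation: Postgres-style cast, which breaks on Spark #}
{% macro default__current_timestamp() %}
    current_timestamp::timestamp
{% endmacro %}

{# Hypothetical Spark-specific override: dbt selects this variant when
   running against a Spark adapter, avoiding the `::` operator entirely #}
{% macro spark__current_timestamp() %}
    current_timestamp()
{% endmacro %}
```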
Issue Analytics

- Created: 3 years ago
- Comments: 5 (4 by maintainers)
Just want to update the note above to v0.20.0 syntax for anyone stumbling across this issue like I did.

This is a super cool feature, btw! See the dispatch docs.
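For anyone on dbt >= 0.20, the dispatch configuration referred to above lives in dbt_project.yml. A sketch, assuming the spark_utils shim package is installed, telling dbt to search it first for Spark-compatible implementations of dbt_utils macros:

```yaml
# dbt_project.yml (dbt >= 0.20.0)
dispatch:
  - macro_namespace: dbt_utils
    search_order: ['spark_utils', 'dbt_utils']
```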
And it works even more “out of the box” in v1.2: https://github.com/dbt-labs/dbt-spark/pull/359
Closing this one, which has been open for a while 😃