Fail to fetch data w/ spark adapter

I tried to use the Spark adapter with the following Docker image.

FROM lightdash/lightdash:0.2.7

# libsasl2-dev is needed to build the Python SASL bindings used by PyHive for
# Hive/Spark Thrift connections; then pin dbt and add the Spark adapter.
RUN apt-get update ; apt-get install -y --no-install-recommends libsasl2-dev
RUN pip install -U dbt==0.19.2
RUN pip install 'dbt-spark[PyHive]'
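
As a quick sanity check (a hypothetical snippet, not part of the original setup), running this with python inside the built container confirms that dbt and the Spark adapter plugin are importable:

# Hypothetical check; run with python inside the built image.
import dbt                 # pinned to 0.19.2 in the Dockerfile above
import dbt.adapters.spark  # provided by the dbt-spark[PyHive] package

print("dbt and dbt-spark imported OK")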

When I run a query, the results field is empty, and the Export CSV output is also empty (500 blank lines).

[screenshot: the query results table in Lightdash is empty]

I suspect the generated SQL query includes two extra 'd' characters around each alias, as shown below (metric: Num_users, dimension: Org_1st).

SELECT
  testmart.org_1st AS dtestmart_org_1std,
  COUNT(testmart.count_user) AS dtestmart_num_usersd
FROM dbt.testmart AS testmart


GROUP BY 1

I think the right query would be:

SELECT
  testmart.org_1st AS "testmart_org_1st",
  COUNT(testmart.count_user) AS "testmart_num_users"
FROM dbt.testmart AS testmart


GROUP BY 1
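
Purely as an illustration of the suspicion above, and not Lightdash's actual code: if the helper that quotes field aliases were configured with the literal character 'd' instead of a double quote, it would produce exactly the aliases seen in the generated query.

# Hypothetical sketch only; not Lightdash's real implementation.
def quote_alias(name, quote_char):
    # Wrap a field alias in the configured identifier-quote character.
    return quote_char + name + quote_char

print(quote_alias("testmart_org_1st", '"'))  # "testmart_org_1st"   (what I expect)
print(quote_alias("testmart_org_1st", "d"))  # dtestmart_org_1std   (what the query contains)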

/usr/app/dbt/logs/dbt.log includes the right numbers.

2021-07-05 12:59:19.606216 (Thread-1): TFetchResultsResp(status=TStatus(statusCode=0, infoMessages=None, sqlState=None, errorCode=None, errorMessage=None), hasMoreRows=False, results=TRowSet(startRowOffset=0, rows=[], columns=[TColumn(boolVal=None, byteVal=None, i16Val=None, i32Val=None, i64Val=None, doubleVal=None, stringVal=TStringColumn(values=[(   snip   ), i64Val=TI64Column(values=[9492, 1704, 13997, 755, 1650, 4686, 912, 2132, 912, 4260, 912, 3830, 4664, 7477, 10262, 912, 4686, 852, 2280, 1638, 4200, 4039, 6384, 2966, 5964, 5599, 3192, 3437, 12218, 5837, 4681, 4260, 10741, 1212, 3072, 3499, 1764, 570], nulls=b'\x00'), doubleVal=None, stringVal=None, binaryVal=None)], binaryColumns=None, columnCount=None))
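
As a cross-check, and purely as a sketch (the host, port, and database below are assumptions), the same aggregation can be run against the Spark Thrift server directly with PyHive to confirm that the numbers shown in dbt.log come back:

from pyhive import hive

# Connection details are assumptions; substitute the actual Thrift server settings.
conn = hive.connect(host="spark-thrift-host", port=10000, database="dbt")
cursor = conn.cursor()
cursor.execute(
    "SELECT testmart.org_1st, COUNT(testmart.count_user) "
    "FROM dbt.testmart AS testmart GROUP BY 1"
)
for org, num_users in cursor.fetchall():
    print(org, num_users)

cursor.close()
conn.close()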

I also tried dbt-presto and it went fine.

Issue Analytics

  • State: closed
  • Created 2 years ago
  • Comments: 19 (1 by maintainers)

Top GitHub Comments

1 reaction
owlas commented, Jul 7, 2021

That’s awesome! I’m really pleased we’ve got this working. Please let us know if you hit any other problems with pyhive / pyspark.

I’m going to close this issue for now.

Would love to hear more about what you’re working on. You can always chat with the devs at:

1 reaction
owlas commented, Jul 7, 2021

Thanks @skame - I think I’ve found the problem. See #201

I’ll let you know when I’ve released the update.
