Connecting PySpark with Hive tables #7240
Replies: 5 comments
- One obvious issue: you are using Scala 2.12.
- I made a stupid mistake! I was calling functions on … BTW, I made it work with pyspark=3.5.
- I just realized I didn't quite solve this yet. When I execute a more complex SELECT query, I get an error: fetching simple data types works, but fetching ARRAY or STRUCT fails. It seems the Hive dialect translation is not working. I'm using …
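Complex types often break over JDBC because the generic dialect has no JDBC type to map Hive's ARRAY/STRUCT columns to. One common workaround (a sketch of my own, not something confirmed in this thread) is to serialize the complex columns to JSON strings inside a pushdown subquery, so the JDBC reader only ever sees STRING columns. Since Kyuubi runs the query on a Spark SQL engine, `to_json` should be available; the table and column names below are hypothetical:

```python
# Build a pushdown subquery that serializes complex-typed columns to JSON
# strings, so the JDBC source only receives STRING columns.
# Table and column names are illustrative, not from this thread.

def jsonify_query(table, simple_cols, complex_cols):
    """Wrap complex-typed columns in to_json() so they come back as strings."""
    select_list = list(simple_cols) + [
        f"to_json({c}) AS {c}_json" for c in complex_cols
    ]
    return f"(SELECT {', '.join(select_list)} FROM {table}) t"

query = jsonify_query("db.events", ["id", "ts"], ["tags", "payload"])
print(query)
# The resulting string can be passed as the `dbtable` option of a JDBC read,
# and the JSON columns parsed back on the client with from_json() if needed.
```

This sidesteps the dialect translation entirely at the cost of a parse step on the client side.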
- Here are a few more examples:
- After digging deeper, here is what I found:
- Hello, I am trying to use PySpark to access Kyuubi and Spark. The issue I have is with the Hive dialect: the queries that end up on the cluster have some weird syntax and fail because of it.
STEPS TO REPRODUCE:
My error is …
pyspark 4.0.1 (also tried 3.5, but no luck)
kyuubi 1.10.2
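For context, here is a minimal sketch of how such a connection is usually set up. Kyuubi exposes a HiveServer2-compatible Thrift endpoint, so the Hive JDBC driver is the typical choice; the host, port, table, and credentials below are placeholders, not the poster's actual configuration:

```python
# Minimal sketch of reading a table through Kyuubi's JDBC endpoint from
# PySpark. Host, port, table name, and credentials are placeholders.

def kyuubi_jdbc_options(host, port, table, user, password):
    """Assemble options for spark.read.format('jdbc'); Kyuubi speaks the
    HiveServer2 protocol, so the Hive JDBC driver is used."""
    return {
        "url": f"jdbc:hive2://{host}:{port}/default",
        "driver": "org.apache.hive.jdbc.HiveDriver",
        "dbtable": table,
        "user": user,
        "password": password,
    }

opts = kyuubi_jdbc_options("kyuubi-host", 10009, "db.events", "alice", "secret")
print(opts["url"])

# With a live SparkSession (and the Hive JDBC jar on the driver classpath):
# df = spark.read.format("jdbc").options(**opts).load()
```

The dialect problems described above would show up at the `.load()` step, when Spark generates SQL against this connection.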