
feat(pyspark): Add UUID support to pyspark when spark 4.0 is supported #8582

Open
@jitingxu1

Description


What happened?

Background: While implementing UUID support across the various backends, I ran into an issue with PySpark. PySpark does provide UUID functions, but they consistently threw errors. The problem turned out to be a known upstream bug, which has been fixed for the Spark 4.0 release (see PR: link).

Resolution: We have decided to leave this open for now. Once Spark 4.0 is officially supported, we will revisit adding UUID support for the PySpark backend and take advantage of the fixes and improvements in the new release.
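
For context, a minimal sketch of the kind of expression this issue is about, assuming `ibis.uuid()` (the expression added as part of the UUID work mentioned above) and the standard `ibis.pyspark.connect()` entry point; the exact call that triggered the upstream error is not reproduced in this issue:

```python
import ibis
from pyspark.sql import SparkSession

# Assumed setup: a local Spark session wired into the ibis PySpark backend.
spark = SparkSession.builder.getOrCreate()
con = ibis.pyspark.connect(spark)

# ibis.uuid() builds a UUID-typed expression; per this issue, executing it
# on the PySpark backend currently runs into a known upstream Spark bug
# that is only fixed in the Spark 4.0 release.
expr = ibis.uuid()
print(con.execute(expr))
```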

What version of ibis are you using?

8.0.0

What backend(s) are you using, if any?

pyspark

Relevant log output

No response

Code of Conduct

  • I agree to follow this project's Code of Conduct

Metadata


Assignees

No one assigned

    Labels

    feature: Features or general enhancements
    pyspark: The Apache PySpark backend
    requires upstream support: Feature or bug requires support from the upstream project

    Type

    No type

    Projects

    Status

    backlog

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests
