
[DISCUSSION] Add separate crate to cover spark builtin functions #5600

Open

Description

@comphead

Is your feature request related to a problem or challenge? Please describe what you are trying to do.
This discussion is to collect community thoughts on implementing the Spark built-in functions listed at https://spark.apache.org/docs/3.2.0/api/sql/

We increasingly face requests to implement Spark functions, and the use case depends heavily on a person's or company's stack: some treat Spark compatibility as more important, while for others it is the opposite and PG compatibility is the priority.

The built-in function lists of Postgres and Spark are, as expected, not the same. There are also rarer and worse cases where a function has the same name in both systems but a different signature and/or return type.

The goal of this discussion is to figure out how to organize DataFusion (DF) so it can stay compatible with major systems like Spark and Postgres, and perhaps others.

Describe the solution you'd like
@alamb in #5568 (comment) proposed creating an extensible crate for Spark functions, or even a separate subproject, so that PG-oriented users have the option to exclude Spark functions.
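
As a rough illustration only (not a concrete design), such a crate could expose Spark-style built-ins as ordinary DataFusion scalar UDFs that users register explicitly, so PG-oriented deployments simply never pull them in. The crate name `datafusion-spark-functions` and the `register_spark_functions` helper below are hypothetical, and the exact module paths and `create_udf` signature vary between DataFusion versions:

```rust
// Hypothetical sketch: a `datafusion-spark-functions` crate registering
// Spark-compatible built-ins as ordinary DataFusion scalar UDFs. Names and
// module paths are illustrative and may differ between DataFusion versions.
use std::sync::Arc;

use datafusion::arrow::array::{Array, ArrayRef, Int64Array};
use datafusion::arrow::datatypes::DataType;
use datafusion::error::{DataFusionError, Result};
use datafusion::logical_expr::Volatility;
use datafusion::physical_plan::ColumnarValue;
use datafusion::prelude::{create_udf, SessionContext};

/// Register Spark-style functions on a context; PG-oriented users simply
/// never call this (or never depend on the crate at all).
pub fn register_spark_functions(ctx: &SessionContext) {
    // Spark's `pmod(a, b)`: modulus whose result is always non-negative.
    let pmod = create_udf(
        "pmod",
        vec![DataType::Int64, DataType::Int64],
        Arc::new(DataType::Int64),
        Volatility::Immutable,
        Arc::new(|args: &[ColumnarValue]| -> Result<ColumnarValue> {
            // For brevity this sketch handles only array (non-scalar) inputs.
            let (a, b) = match (&args[0], &args[1]) {
                (ColumnarValue::Array(a), ColumnarValue::Array(b)) => (
                    a.as_any().downcast_ref::<Int64Array>().expect("Int64 input"),
                    b.as_any().downcast_ref::<Int64Array>().expect("Int64 input"),
                ),
                _ => {
                    return Err(DataFusionError::NotImplemented(
                        "pmod sketch handles only array inputs".to_string(),
                    ))
                }
            };
            let result: Int64Array = a
                .iter()
                .zip(b.iter())
                .map(|(x, y)| match (x, y) {
                    (Some(x), Some(y)) if y != 0 => Some(((x % y) + y) % y),
                    _ => None, // NULL on NULL input or division by zero
                })
                .collect();
            Ok(ColumnarValue::Array(Arc::new(result) as ArrayRef))
        }),
    );
    ctx.register_udf(pmod);
}
```

Whether such a registration helper lives in a crate inside the main repo, a subproject, or a separate repository is exactly the question for this discussion; the point is that Spark semantics stay opt-in rather than baked into the default function set.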

Related Issues

Describe alternatives you've considered

Not doing this

Additional context

Created after #5568 (comment)

Labels

enhancement (New feature or request)
