Description
Is your feature request related to a problem or challenge? Please describe what you are trying to do.
This discussion is to collect community thoughts on implementing the Spark builtin functions listed at https://spark.apache.org/docs/3.2.0/api/sql/
We increasingly receive requests to implement Spark functions, and the priority depends heavily on each person's or company's stack: some treat Spark compatibility as more important, while for others it is the opposite and PG compatibility is the priority.
The builtin function lists of Postgres and Spark are, unsurprisingly, not the same. There are also rarer and trickier cases where the function name is the same but the signature and/or return type differs (for example, `trunc` truncates numbers in Postgres but truncates dates in Spark).
The goal of this discussion is to figure out how to organize DataFusion so it stays compatible with major systems like Spark and Postgres, and perhaps others.
Describe the solution you'd like
@alamb in #5568 (comment) proposed creating an extensible crate for Spark functions; it could even be a separate subproject, so that PG-oriented users have the option to exclude Spark functions entirely. A sketch of what such an opt-in crate could look like follows below.
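As a rough illustration of the opt-in idea, here is a minimal sketch of how a hypothetical `datafusion-functions-spark` crate (the crate name is an assumption, not an existing package) could expose Spark-only builtins as regular `ScalarUDF`s and register them only when explicitly asked. It uses Spark's `expm1` (exp(x) - 1), which Postgres does not provide; the `ScalarUDFImpl` API shown matches recent DataFusion versions, though method names have shifted slightly between releases.

```rust
use std::any::Any;
use std::sync::Arc;

use datafusion::arrow::array::{Array, ArrayRef, Float64Array};
use datafusion::arrow::datatypes::DataType;
use datafusion::error::Result;
use datafusion::logical_expr::{
    ColumnarValue, ScalarUDF, ScalarUDFImpl, Signature, Volatility,
};
use datafusion::prelude::SessionContext;

/// Spark's `expm1(x)` = exp(x) - 1, a builtin Spark has but Postgres lacks.
#[derive(Debug)]
struct SparkExpm1 {
    signature: Signature,
}

impl SparkExpm1 {
    fn new() -> Self {
        Self {
            signature: Signature::exact(vec![DataType::Float64], Volatility::Immutable),
        }
    }
}

impl ScalarUDFImpl for SparkExpm1 {
    fn as_any(&self) -> &dyn Any {
        self
    }
    fn name(&self) -> &str {
        "expm1"
    }
    fn signature(&self) -> &Signature {
        &self.signature
    }
    fn return_type(&self, _arg_types: &[DataType]) -> Result<DataType> {
        Ok(DataType::Float64)
    }
    fn invoke(&self, args: &[ColumnarValue]) -> Result<ColumnarValue> {
        // Normalize scalar/array inputs to arrays, then apply exp_m1 per row,
        // preserving nulls.
        let arrays = ColumnarValue::values_to_arrays(args)?;
        let input = arrays[0]
            .as_any()
            .downcast_ref::<Float64Array>()
            .expect("Float64 input enforced by the signature");
        let result: Float64Array = input.iter().map(|v| v.map(f64::exp_m1)).collect();
        Ok(ColumnarValue::Array(Arc::new(result) as ArrayRef))
    }
}

/// Registration is an explicit, per-session opt-in: Postgres-first users
/// simply never call this, so Spark names are never pulled in by default.
fn register_spark_functions(ctx: &SessionContext) {
    ctx.register_udf(ScalarUDF::from(SparkExpm1::new()));
}
```

After calling `register_spark_functions(&ctx)`, a query like `SELECT expm1(0.5)` would resolve. For names that collide with Postgres-flavored builtins, something beyond plain registration order would be needed to decide which implementation wins, which is exactly the kind of policy question this discussion should settle.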
Related Issues
- Spark-compatible CAST operation #11201
- Make modulos with negative float zero compat with other engines #11051
- Support "standard" / alternate format arguments for
to_timestamp
#8915 - Decimal division compatibility mode with spark #7301
Describe alternatives you've considered
Not doing this
Additional context
Created after #5568 (comment)