
[Bug report] Gravitino REST Server Defaults to 'jdbc' Instead of Using Catalog Name from Spark Configuration #6310

Open
@dataageek

Description


Version

main branch

Describe what's wrong

I have multiple catalogs created using the Iceberg JDBC catalog. When I attempt to consume these catalogs through the Gravitino REST server (configured as a REST catalog in Spark), the implementation always defaults to jdbc as the backend catalog name instead of using the catalog name specified in the Spark configuration.

For example, when running the following command:

./spark-sql \
    --conf spark.sql.catalog.iceberg_jdbc_catalog=org.apache.iceberg.spark.SparkCatalog \
    --conf spark.sql.catalog.iceberg_jdbc_catalog.warehouse="/mnt/c/iceberg-warehouse" \
    --conf spark.sql.catalog.iceberg_jdbc_catalog.type=rest \
    --conf spark.sql.catalog.iceberg_jdbc_catalog.uri=http://127.0.0.1:9991/iceberg \
    --conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions

I expect the server to use iceberg_jdbc_catalog, the catalog name specified in the Spark configuration. However, the Gravitino REST server implementation instead looks for a catalog named jdbc.
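
For illustration, a hypothetical follow-up query in the same spark-sql session (the namespace name below is made up) exercises that lookup; with the current behavior the server presumably resolves the backend catalog as jdbc rather than iceberg_jdbc_catalog:

-- "db1" is a placeholder namespace; the lookup goes against whatever
-- backend catalog name the REST server resolved, not the name set in Spark
SHOW NAMESPACES IN iceberg_jdbc_catalog;
SHOW TABLES IN iceberg_jdbc_catalog.db1;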

Although I can override this behavior using the catalog-backend-name property in gravitino.conf, I would prefer a dynamic mechanism that fetches the catalog name from the Spark configuration directly.
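
For reference, a minimal sketch of that static workaround in gravitino.conf, assuming the Iceberg REST service property prefix gravitino.iceberg-rest (the JDBC connection settings such as uri, jdbc-user, jdbc-password, and jdbc-driver are omitted here; the values shown are placeholders):

# Pin the backend catalog name that the REST server would otherwise default to "jdbc"
gravitino.iceberg-rest.catalog-backend = jdbc
gravitino.iceberg-rest.catalog-backend-name = iceberg_jdbc_catalog

The request in this issue is for that resolution to happen dynamically, using the catalog name sent from the Spark side, rather than through a single static property.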

Error message and/or stacktrace

No error message; the behavior is just not as expected.

How to reproduce

0.7.0

Additional context

No response

