Spark 3.5.6 and Iceberg 1.9.1 #1960

Open · wants to merge 4 commits into base: main
2 changes: 1 addition & 1 deletion .github/workflows/spark_client_regtests.yml
@@ -56,7 +56,7 @@ jobs:
:polaris-server:quarkusAppPartsBuild --rerun \
-Dquarkus.container-image.build=true

-# NOTE: the regression test runs with spark 3.5.5 and scala 2.12 in Java 17. We also have integration
+# NOTE: the regression test runs with spark 3.5.6 and scala 2.12 in Java 17. We also have integration
# test runs with the existing gradle.yml, which only runs on Java 21. Since spark Java compatibility
# for 3.5 is 8, 11, and 17, we should run spark client with those compatible java versions.
# TODO: add separate spark client CI and run with Java 8, 11 and 17.
2 changes: 1 addition & 1 deletion getting-started/eclipselink/docker-compose.yml
@@ -79,7 +79,7 @@ services:
retries: 15
command: [
/opt/spark/bin/spark-sql,
--packages, "org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.9.0,org.apache.iceberg:iceberg-aws-bundle:1.9.0,org.apache.iceberg:iceberg-gcp-bundle:1.9.0,org.apache.iceberg:iceberg-azure-bundle:1.9.0",
--packages, "org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.9.1,org.apache.iceberg:iceberg-aws-bundle:1.9.1,org.apache.iceberg:iceberg-gcp-bundle:1.9.1,org.apache.iceberg:iceberg-azure-bundle:1.9.1",
--conf, "spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions",
--conf, "spark.sql.catalog.quickstart_catalog=org.apache.iceberg.spark.SparkCatalog",
--conf, "spark.sql.catalog.quickstart_catalog.type=rest",
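For reviewers who want to sanity-check the bump outside docker-compose, the flags above translate roughly into this programmatic session. This is a sketch, not part of the change: the endpoint URI and the choice of bundles are assumptions for illustration; the compose file's real URI and credential settings sit in lines this hunk does not show.

```python
# Hypothetical stand-alone equivalent of the spark-sql flags above.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .config(
        "spark.jars.packages",
        "org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.9.1,"
        "org.apache.iceberg:iceberg-aws-bundle:1.9.1",
    )
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .config("spark.sql.catalog.quickstart_catalog",
            "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.quickstart_catalog.type", "rest")
    # Assumed endpoint; the compose file defines the real one.
    .config("spark.sql.catalog.quickstart_catalog.uri", "http://localhost:8181/api/catalog")
    .getOrCreate()
)
```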
2 changes: 1 addition & 1 deletion getting-started/jdbc/docker-compose.yml
@@ -81,7 +81,7 @@ services:
retries: 15
command: [
/opt/spark/bin/spark-sql,
--packages, "org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.9.0,org.apache.iceberg:iceberg-aws-bundle:1.9.0,org.apache.iceberg:iceberg-gcp-bundle:1.9.0,org.apache.iceberg:iceberg-azure-bundle:1.9.0",
--packages, "org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.9.1,org.apache.iceberg:iceberg-aws-bundle:1.9.1,org.apache.iceberg:iceberg-gcp-bundle:1.9.1,org.apache.iceberg:iceberg-azure-bundle:1.9.1",
--conf, "spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions",
--conf, "spark.sql.catalog.polaris=org.apache.iceberg.spark.SparkCatalog",
--conf, "spark.sql.catalog.polaris.type=rest",
2 changes: 1 addition & 1 deletion getting-started/spark/notebooks/SparkPolaris.ipynb
@@ -256,7 +256,7 @@
"\n",
"spark = (SparkSession.builder\n",
" .config(\"spark.sql.catalog.spark_catalog\", \"org.apache.iceberg.spark.SparkSessionCatalog\")\n",
" .config(\"spark.jars.packages\", \"org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.9.0,org.apache.iceberg:iceberg-aws-bundle:1.9.0\")\n",
" .config(\"spark.jars.packages\", \"org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.9.1,org.apache.iceberg:iceberg-aws-bundle:1.9.1\")\n",
" .config('spark.sql.iceberg.vectorization.enabled', 'false')\n",
" \n",
" # Configure the 'polaris' catalog as an Iceberg rest catalog\n",
2 changes: 1 addition & 1 deletion gradle/libs.versions.toml
@@ -20,7 +20,7 @@
[versions]
checkstyle = "10.25.0"
hadoop = "3.4.1"
iceberg = "1.9.0" # Ensure to update the iceberg version in regtests to keep regtests up-to-date
iceberg = "1.9.1" # Ensure to update the iceberg version in regtests to keep regtests up-to-date
quarkus = "3.23.3"
immutables = "2.10.1"
picocli = "4.7.7"
4 changes: 2 additions & 2 deletions plugins/pluginlibs.versions.toml
@@ -18,7 +18,7 @@
#

[versions]
iceberg = "1.9.0"
spark35 = "3.5.5"
iceberg = "1.9.1"
spark35 = "3.5.6"
scala212 = "2.12.19"
scala213 = "2.13.15"
14 changes: 7 additions & 7 deletions plugins/spark/README.md
@@ -21,12 +21,12 @@

The Polaris Spark plugin provides a SparkCatalog class, which communicates with the Polaris
REST endpoints, and provides implementations for Apache Spark's
-[TableCatalog](https://github.com/apache/spark/blob/v3.5.5/sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/TableCatalog.java),
-[SupportsNamespaces](https://github.com/apache/spark/blob/v3.5.5/sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/SupportsNamespaces.java),
-[ViewCatalog](https://github.com/apache/spark/blob/v3.5.5/sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/ViewCatalog.java) classes.
+[TableCatalog](https://github.com/apache/spark/blob/v3.5.6/sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/TableCatalog.java),
+[SupportsNamespaces](https://github.com/apache/spark/blob/v3.5.6/sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/SupportsNamespaces.java),
+[ViewCatalog](https://github.com/apache/spark/blob/v3.5.6/sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/ViewCatalog.java) classes.

Right now, the plugin only provides support for Spark 3.5, Scala versions 2.12 and 2.13,
-and depends on iceberg-spark-runtime 1.9.0.
+and depends on iceberg-spark-runtime 1.9.1.

# Start Spark with local Polaris service using the Polaris Spark plugin
The following command starts a Polaris server for local testing; it runs on localhost:8181 with default
@@ -50,7 +50,7 @@ Run the following command to build the Polaris Spark project and publish the source JAR

```shell
bin/spark-shell \
--packages org.apache.polaris:polaris-spark-<spark_version>_<scala_version>:<polaris_version>,org.apache.iceberg:iceberg-aws-bundle:1.9.0,io.delta:delta-spark_2.12:3.3.1 \
--packages org.apache.polaris:polaris-spark-<spark_version>_<scala_version>:<polaris_version>,org.apache.iceberg:iceberg-aws-bundle:1.9.1,io.delta:delta-spark_2.12:3.3.1 \
--conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions,io.delta.sql.DeltaSparkSessionExtension \
--conf spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog \
--conf spark.sql.catalog.<catalog-name>.warehouse=<catalog-name> \
@@ -73,7 +73,7 @@ The Spark command would look like the following:

```shell
bin/spark-shell \
---packages org.apache.polaris:polaris-spark-3.5_2.12:1.1.0-incubating-SNAPSHOT,org.apache.iceberg:iceberg-aws-bundle:1.9.0,io.delta:delta-spark_2.12:3.3.1 \
+--packages org.apache.polaris:polaris-spark-3.5_2.12:1.1.0-incubating-SNAPSHOT,org.apache.iceberg:iceberg-aws-bundle:1.9.1,io.delta:delta-spark_2.12:3.3.1 \
--conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions,io.delta.sql.DeltaSparkSessionExtension \
--conf spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog \
--conf spark.sql.catalog.polaris.warehouse=polaris \
@@ -99,7 +99,7 @@ To start Spark using the bundle JAR, specify it with the `--jars` option as shown below
```shell
bin/spark-shell \
--jars <path-to-spark-client-jar> \
---packages org.apache.iceberg:iceberg-aws-bundle:1.9.0,io.delta:delta-spark_2.12:3.3.1 \
+--packages org.apache.iceberg:iceberg-aws-bundle:1.9.1,io.delta:delta-spark_2.12:3.3.1 \
--conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions,io.delta.sql.DeltaSparkSessionExtension \
--conf spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog \
--conf spark.sql.catalog.<catalog-name>.warehouse=<catalog-name> \
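The README above names the three Spark connector interfaces the plugin implements, so a short smoke test that touches each one may help when verifying the 1.9.1 bump. This is a sketch under assumptions: it presumes a session started as in the README, with the catalog registered under the name `polaris`.

```python
# One statement per implemented interface; `spark` is the session from the
# spark-shell/pyspark invocation above, `polaris` the configured catalog name.
spark.sql("CREATE NAMESPACE IF NOT EXISTS polaris.smoke")                              # SupportsNamespaces
spark.sql("CREATE TABLE IF NOT EXISTS polaris.smoke.t (id BIGINT) USING iceberg")      # TableCatalog
spark.sql("CREATE OR REPLACE VIEW polaris.smoke.v AS SELECT id FROM polaris.smoke.t")  # ViewCatalog
spark.sql("SHOW TABLES IN polaris.smoke").show()
```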
2 changes: 1 addition & 1 deletion plugins/spark/v3.5/getting-started/README.md
@@ -52,7 +52,7 @@ This will spin up 2 container services
* The `polaris` service for running Apache Polaris using an in-memory metastore
* The `jupyter` service for running Jupyter notebook with PySpark

-NOTE: Starting the container first time may take a couple of minutes, because it will need to download the Spark 3.5.5.
+NOTE: Starting the container the first time may take a couple of minutes, because it needs to download Spark 3.5.6.
When working with Delta, the Polaris Spark Client requires delta-io >= 3.2.1, which in turn requires at least Spark 3.5.3,
but the current jupyter Spark image only supports Spark 3.5.0.

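Given the note above about the jupyter image lagging at Spark 3.5.0, a one-cell check inside the notebook confirms the rebuilt image actually runs the bumped version. A minimal sketch; the expected value is an assumption based on this PR:

```python
# Run in a notebook cell; getOrCreate() returns the image's existing session.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
# Lexicographic compare is safe while 3.5.x patch versions stay single-digit.
assert spark.version >= "3.5.3", f"Delta support needs >= 3.5.3, got {spark.version}"
print(spark.version)  # expected: 3.5.6 after this change
```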
10 changes: 5 additions & 5 deletions plugins/spark/v3.5/getting-started/notebooks/Dockerfile
@@ -24,11 +24,11 @@ ENV LANGUAGE='en_US:en'
USER root

# Generic table support requires delta 3.2.1
-# Install Spark 3.5.5
-RUN wget -q https://archive.apache.org/dist/spark/spark-3.5.5/spark-3.5.5-bin-hadoop3.tgz \
-  && tar -xzf spark-3.5.5-bin-hadoop3.tgz \
-  && mv spark-3.5.5-bin-hadoop3 /opt/spark \
-  && rm spark-3.5.5-bin-hadoop3.tgz
+# Install Spark 3.5.6
+RUN wget -q https://archive.apache.org/dist/spark/spark-3.5.6/spark-3.5.6-bin-hadoop3.tgz \
+  && tar -xzf spark-3.5.6-bin-hadoop3.tgz \
+  && mv spark-3.5.6-bin-hadoop3 /opt/spark \
+  && rm spark-3.5.6-bin-hadoop3.tgz

# Set environment variables
ENV SPARK_HOME=/opt/spark
@@ -266,7 +266,7 @@
"\n",
"spark = (SparkSession.builder\n",
" .config(\"spark.jars\", \"../polaris_libs/polaris-spark-3.5_2.12-1.1.0-incubating-SNAPSHOT-bundle.jar\") # TODO: add a way to automatically discover the Jar\n",
" .config(\"spark.jars.packages\", \"org.apache.iceberg:iceberg-aws-bundle:1.9.0,io.delta:delta-spark_2.12:3.2.1\")\n",
" .config(\"spark.jars.packages\", \"org.apache.iceberg:iceberg-aws-bundle:1.9.1,io.delta:delta-spark_2.12:3.2.1\")\n",
" .config(\"spark.sql.catalog.spark_catalog\", \"org.apache.spark.sql.delta.catalog.DeltaCatalog\")\n",
" .config('spark.sql.iceberg.vectorization.enabled', 'false')\n",
"\n",
2 changes: 1 addition & 1 deletion plugins/spark/v3.5/regtests/run.sh
@@ -66,7 +66,7 @@ if [[ -n "$CURRENT_SCALA_VERSION" ]]; then
SCALA_VERSIONS=("${CURRENT_SCALA_VERSION}")
fi
SPARK_MAJOR_VERSION="3.5"
SPARK_VERSION="3.5.5"
SPARK_VERSION="3.5.6"

SPARK_SHELL_OPTIONS=("PACKAGE" "JAR")

2 changes: 1 addition & 1 deletion plugins/spark/v3.5/regtests/setup.sh
@@ -36,7 +36,7 @@ set -x

SCRIPT_DIR=$( cd -- "$( dirname -- "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )

-SPARK_VERSION=3.5.5
+SPARK_VERSION=3.5.6
SCALA_VERSION=2.12
POLARIS_CLIENT_JAR=""
POLARIS_VERSION=""
2 changes: 1 addition & 1 deletion regtests/run.sh
@@ -20,7 +20,7 @@
# Run without args to run all tests, or single arg for single test.
SCRIPT_DIR=$( cd -- "$( dirname -- "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )

-export SPARK_VERSION=spark-3.5.5
+export SPARK_VERSION=spark-3.5.6
export SPARK_DISTRIBUTION=${SPARK_VERSION}-bin-hadoop3

if [ -z "${SPARK_HOME}" ]; then
2 changes: 1 addition & 1 deletion regtests/run_spark_sql.sh
@@ -46,7 +46,7 @@ fi
REGTEST_HOME=$(dirname $(realpath $0))
cd ${REGTEST_HOME}

-export SPARK_VERSION=spark-3.5.5
+export SPARK_VERSION=spark-3.5.6
export SPARK_DISTRIBUTION=${SPARK_VERSION}-bin-hadoop3
export SPARK_LOCAL_HOSTNAME=localhost # avoid VPN messing up driver local IP address binding

2 changes: 1 addition & 1 deletion regtests/setup.sh
@@ -31,7 +31,7 @@ if [ -z "${SPARK_HOME}" ]; then
fi
SPARK_CONF="${SPARK_HOME}/conf/spark-defaults.conf"
DERBY_HOME="/tmp/derby"
ICEBERG_VERSION="1.9.0"
ICEBERG_VERSION="1.9.1"
export PYTHONPATH="${SPARK_HOME}/python/:${SPARK_HOME}/python/lib/py4j-0.10.9.7-src.zip:$PYTHONPATH"

# Ensure binaries are downloaded locally
4 changes: 2 additions & 2 deletions regtests/t_pyspark/src/iceberg_spark.py
@@ -73,8 +73,8 @@ def __enter__(self):
"""Initial method for Iceberg Spark session. Creates a Spark session with specified configs.
"""
packages = [
"org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.9.0",
"org.apache.iceberg:iceberg-aws-bundle:1.9.0",
"org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.9.1",
"org.apache.iceberg:iceberg-aws-bundle:1.9.1",
]
excludes = ["org.checkerframework:checker-qual", "com.google.errorprone:error_prone_annotations"]

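The hunk shows only the `packages` and `excludes` lists from `__enter__`; how they reach the session lies outside the diff. A plausible wiring for readers unfamiliar with the helper, where the builder calls are illustrative rather than copied from the file:

```python
# Hypothetical sketch: join the Maven coordinates into the standard Spark
# properties for runtime dependency resolution.
from pyspark.sql import SparkSession

packages = [
    "org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.9.1",
    "org.apache.iceberg:iceberg-aws-bundle:1.9.1",
]
excludes = ["org.checkerframework:checker-qual", "com.google.errorprone:error_prone_annotations"]

spark = (
    SparkSession.builder
    .config("spark.jars.packages", ",".join(packages))   # resolve the 1.9.1 artifacts
    .config("spark.jars.excludes", ",".join(excludes))   # skip conflicting annotation jars
    .getOrCreate()
)
```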
14 changes: 7 additions & 7 deletions runtime/admin/distribution/LICENSE
@@ -1003,13 +1003,13 @@ License: Apache License 2.0 - https://www.apache.org/licenses/LICENSE-2.0.txt

--------------------------------------------------------------------------------

-Group: org.apache.iceberg Name: iceberg-api Version: 1.9.0
-Group: org.apache.iceberg Name: iceberg-aws Version: 1.9.0
-Group: org.apache.iceberg Name: iceberg-azure Version: 1.9.0
-Group: org.apache.iceberg Name: iceberg-bundled-guava Version: 1.9.0
-Group: org.apache.iceberg Name: iceberg-common Version: 1.9.0
-Group: org.apache.iceberg Name: iceberg-core Version: 1.9.0
-Group: org.apache.iceberg Name: iceberg-gcp Version: 1.9.0
+Group: org.apache.iceberg Name: iceberg-api Version: 1.9.1
+Group: org.apache.iceberg Name: iceberg-aws Version: 1.9.1
+Group: org.apache.iceberg Name: iceberg-azure Version: 1.9.1
+Group: org.apache.iceberg Name: iceberg-bundled-guava Version: 1.9.1
+Group: org.apache.iceberg Name: iceberg-common Version: 1.9.1
+Group: org.apache.iceberg Name: iceberg-core Version: 1.9.1
+Group: org.apache.iceberg Name: iceberg-gcp Version: 1.9.1
Project URL: https://iceberg.apache.org/
License: Apache License 2.0 - https://www.apache.org/licenses/LICENSE-2.0.txt

14 changes: 7 additions & 7 deletions runtime/distribution/LICENSE
@@ -1300,13 +1300,13 @@ License: Apache License 2.0 - https://www.apache.org/licenses/LICENSE-2.0.txt

--------------------------------------------------------------------------------

-Group: org.apache.iceberg Name: iceberg-api Version: 1.9.0
-Group: org.apache.iceberg Name: iceberg-aws Version: 1.9.0
-Group: org.apache.iceberg Name: iceberg-azure Version: 1.9.0
-Group: org.apache.iceberg Name: iceberg-bundled-guava Version: 1.9.0
-Group: org.apache.iceberg Name: iceberg-common Version: 1.9.0
-Group: org.apache.iceberg Name: iceberg-core Version: 1.9.0
-Group: org.apache.iceberg Name: iceberg-gcp Version: 1.9.0
+Group: org.apache.iceberg Name: iceberg-api Version: 1.9.1
+Group: org.apache.iceberg Name: iceberg-aws Version: 1.9.1
+Group: org.apache.iceberg Name: iceberg-azure Version: 1.9.1
+Group: org.apache.iceberg Name: iceberg-bundled-guava Version: 1.9.1
+Group: org.apache.iceberg Name: iceberg-common Version: 1.9.1
+Group: org.apache.iceberg Name: iceberg-core Version: 1.9.1
+Group: org.apache.iceberg Name: iceberg-gcp Version: 1.9.1
Project URL: https://iceberg.apache.org/
License: Apache License 2.0 - https://www.apache.org/licenses/LICENSE-2.0.txt

14 changes: 7 additions & 7 deletions runtime/server/distribution/LICENSE
@@ -1294,13 +1294,13 @@ License: Apache License 2.0 - https://www.apache.org/licenses/LICENSE-2.0.txt

--------------------------------------------------------------------------------

-Group: org.apache.iceberg Name: iceberg-api Version: 1.9.0
-Group: org.apache.iceberg Name: iceberg-aws Version: 1.9.0
-Group: org.apache.iceberg Name: iceberg-azure Version: 1.9.0
-Group: org.apache.iceberg Name: iceberg-bundled-guava Version: 1.9.0
-Group: org.apache.iceberg Name: iceberg-common Version: 1.9.0
-Group: org.apache.iceberg Name: iceberg-core Version: 1.9.0
-Group: org.apache.iceberg Name: iceberg-gcp Version: 1.9.0
+Group: org.apache.iceberg Name: iceberg-api Version: 1.9.1
+Group: org.apache.iceberg Name: iceberg-aws Version: 1.9.1
+Group: org.apache.iceberg Name: iceberg-azure Version: 1.9.1
+Group: org.apache.iceberg Name: iceberg-bundled-guava Version: 1.9.1
+Group: org.apache.iceberg Name: iceberg-common Version: 1.9.1
+Group: org.apache.iceberg Name: iceberg-core Version: 1.9.1
+Group: org.apache.iceberg Name: iceberg-gcp Version: 1.9.1
Project URL: https://iceberg.apache.org/
License: Apache License 2.0 - https://www.apache.org/licenses/LICENSE-2.0.txt

2 changes: 1 addition & 1 deletion site/content/in-dev/getting-started/using-polaris.md
@@ -158,7 +158,7 @@ _Note: the credentials provided here are those for our principal, not the root c

```shell
bin/spark-sql \
---packages org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.9.0,org.apache.iceberg:iceberg-aws-bundle:1.9.0 \
+--packages org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.9.1,org.apache.iceberg:iceberg-aws-bundle:1.9.1 \
--conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions \
--conf spark.sql.catalog.quickstart_catalog.warehouse=quickstart_catalog \
--conf spark.sql.catalog.quickstart_catalog.header.X-Iceberg-Access-Delegation=vended-credentials \
10 changes: 5 additions & 5 deletions site/content/in-dev/polaris-spark-client.md
@@ -44,12 +44,12 @@ git clone https://github.com/apache/polaris.git ~/polaris

## Start Spark against a deployed Polaris service
Before starting, ensure that the deployed Polaris service supports Generic Tables and that Spark 3.5 (version 3.5.3 or later) is installed.
-Spark 3.5.5 is recommended, and you can follow the instructions below to get a Spark 3.5.5 distribution.
+Spark 3.5.6 is recommended, and you can follow the instructions below to get a Spark 3.5.6 distribution.
```shell
cd ~
-wget https://archive.apache.org/dist/spark/spark-3.5.5/spark-3.5.5-bin-hadoop3.tgz
+wget https://archive.apache.org/dist/spark/spark-3.5.6/spark-3.5.6-bin-hadoop3.tgz
mkdir spark-3.5
-tar xzvf spark-3.5.5-bin-hadoop3.tgz -C spark-3.5 --strip-components=1
+tar xzvf spark-3.5.6-bin-hadoop3.tgz -C spark-3.5 --strip-components=1
cd spark-3.5
```

@@ -59,7 +59,7 @@ a released Polaris Spark client.

```shell
bin/spark-shell \
--packages <polaris-spark-client-package>,org.apache.iceberg:iceberg-aws-bundle:1.9.0,io.delta:delta-spark_2.12:3.3.1 \
--packages <polaris-spark-client-package>,org.apache.iceberg:iceberg-aws-bundle:1.9.1,io.delta:delta-spark_2.12:3.3.1 \
--conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions,io.delta.sql.DeltaSparkSessionExtension \
--conf spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog \
--conf spark.sql.catalog.<spark-catalog-name>.warehouse=<polaris-catalog-name> \
@@ -87,7 +87,7 @@ You can also start the connection by programmatically initializing a SparkSession,
from pyspark.sql import SparkSession

spark = SparkSession.builder
.config("spark.jars.packages", "<polaris-spark-client-package>,org.apache.iceberg:iceberg-aws-bundle:1.9.0,io.delta:delta-spark_2.12:3.3.1")
.config("spark.jars.packages", "<polaris-spark-client-package>,org.apache.iceberg:iceberg-aws-bundle:1.9.1,io.delta:delta-spark_2.12:3.3.1")
.config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
.config("spark.sql.extensions", "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions,io.delta.sql.DeltaSparkSessionExtension")
.config("spark.sql.catalog.<spark-catalog-name>", "org.apache.polaris.spark.SparkCatalog")
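Once a session like the one above exists, whether from spark-shell or the builder, a short end-to-end probe verifies the client can reach Polaris. A sketch only: `polaris` below is an assumed stand-in for `<spark-catalog-name>`.

```python
# Minimal connectivity check against the configured Polaris catalog.
spark.sql("CREATE NAMESPACE IF NOT EXISTS polaris.quickstart")
spark.sql("CREATE TABLE IF NOT EXISTS polaris.quickstart.events (id BIGINT) USING iceberg")
spark.sql("SHOW NAMESPACES IN polaris").show()
```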