Merged
2 changes: 1 addition & 1 deletion fluss-lake/fluss-lake-paimon/pom.xml
@@ -33,7 +33,7 @@
<packaging>jar</packaging>

<properties>
-<paimon.version>1.2.0</paimon.version>
+<paimon.version>1.3.1</paimon.version>
</properties>

<dependencies>
@@ -31,6 +31,7 @@
import org.apache.paimon.manifest.IndexManifestEntry;
import org.apache.paimon.manifest.ManifestCommittable;
import org.apache.paimon.manifest.ManifestEntry;
+import org.apache.paimon.manifest.SimpleFileEntry;
import org.apache.paimon.operation.FileStoreCommit;
import org.apache.paimon.table.FileStoreTable;
import org.apache.paimon.table.sink.CommitCallback;
@@ -224,7 +225,10 @@ public static class PaimonCommitCallback implements CommitCallback {

@Override
public void call(
-List<ManifestEntry> list, List<IndexManifestEntry> indexFiles, Snapshot snapshot) {
+List<SimpleFileEntry> baseFiles,
+List<ManifestEntry> deltaFiles,
+List<IndexManifestEntry> indexFiles,
+Snapshot snapshot) {
currentCommitSnapshotId.set(snapshot.id());
}

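The `CommitCallback.call` change above can be illustrated with a self-contained sketch. The stub types below are hypothetical stand-ins for the real Paimon classes (`org.apache.paimon.Snapshot`, `org.apache.paimon.manifest.*`); only the four-argument shape mirrors the diff, where Paimon 1.3 passes base files and delta files to the callback separately instead of a single entry list:

```java
import java.util.List;
import java.util.concurrent.atomic.AtomicLong;

// Hypothetical stand-ins for the Paimon types referenced in the diff.
class Snapshot {
    private final long id;
    Snapshot(long id) { this.id = id; }
    long id() { return id; }
}
class SimpleFileEntry {}
class ManifestEntry {}
class IndexManifestEntry {}

interface CommitCallback {
    // Paimon 1.3 splits committed entries into base files and delta files,
    // so the callback now receives four arguments instead of three.
    void call(List<SimpleFileEntry> baseFiles,
              List<ManifestEntry> deltaFiles,
              List<IndexManifestEntry> indexFiles,
              Snapshot snapshot);
}

public class CallbackSketch {
    public static void main(String[] args) {
        AtomicLong currentCommitSnapshotId = new AtomicLong(-1);
        // Like PaimonCommitCallback in the diff, record only the snapshot id.
        CommitCallback cb = (base, delta, index, snapshot) ->
                currentCommitSnapshotId.set(snapshot.id());
        cb.call(List.of(), List.of(), List.of(), new Snapshot(42));
        System.out.println(currentCommitSnapshotId.get());
    }
}
```

Callers that ignore the file lists, as here, only need the extra parameter added; implementations that inspected the old single list must decide whether they want base entries, delta entries, or both.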
@@ -6,4 +6,4 @@ The Apache Software Foundation (http://www.apache.org/).

This project bundles the following dependencies under the Apache Software License 2.0 (http://www.apache.org/licenses/LICENSE-2.0.txt)

-- org.apache.paimon:paimon-bundle:1.2.0
+- org.apache.paimon:paimon-bundle:1.3.1
2 changes: 1 addition & 1 deletion pom.xml
@@ -88,7 +88,7 @@
<curator.version>5.4.0</curator.version>
<netty.version>4.1.104.Final</netty.version>
<arrow.version>15.0.0</arrow.version>
-<paimon.version>1.2.0</paimon.version>
+<paimon.version>1.3.1</paimon.version>
<iceberg.version>1.9.1</iceberg.version>

<fluss.hadoop.version>2.10.2</fluss.hadoop.version>
4 changes: 2 additions & 2 deletions website/docs/maintenance/tiered-storage/lakehouse-storage.md
@@ -35,7 +35,7 @@ datalake.paimon.metastore: filesystem
datalake.paimon.warehouse: /tmp/paimon
```

-Fluss processes Paimon configurations by removing the `datalake.paimon.` prefix and then use the remaining configuration (without the prefix `datalake.paimon.`) to create the Paimon catalog. Checkout the [Paimon documentation](https://paimon.apache.org/docs/1.1/maintenance/configurations/) for more details on the available configurations.
+Fluss processes Paimon configurations by removing the `datalake.paimon.` prefix and then using the remaining keys to create the Paimon catalog. Check out the [Paimon documentation](https://paimon.apache.org/docs/1.3/maintenance/configurations/) for more details on the available configurations.

For example, if you want to configure to use Hive catalog, you can configure like following:
```yaml
@@ -65,7 +65,7 @@ Then, you must start the datalake tiering service to tier Fluss's data to the la
you should download the corresponding [Fluss filesystem jar](/downloads#filesystem-jars) and also put it into `${FLINK_HOME}/lib`
- Put [fluss-lake-paimon jar](https://repo1.maven.org/maven2/org/apache/fluss/fluss-lake-paimon/$FLUSS_VERSION$/fluss-lake-paimon-$FLUSS_VERSION$.jar) into `${FLINK_HOME}/lib`
- [Download](https://flink.apache.org/downloads/) pre-bundled Hadoop jar `flink-shaded-hadoop-2-uber-*.jar` and put into `${FLINK_HOME}/lib`
-- Put Paimon's [filesystem jar](https://paimon.apache.org/docs/1.1/project/download/) into `${FLINK_HOME}/lib`, if you use s3 to store paimon data, please put `paimon-s3` jar into `${FLINK_HOME}/lib`
+- Put Paimon's [filesystem jar](https://paimon.apache.org/docs/1.3/project/download/) into `${FLINK_HOME}/lib`. If you use S3 to store Paimon data, also put the `paimon-s3` jar into `${FLINK_HOME}/lib`
- The other jars that Paimon may require, for example, if you use HiveCatalog, you will need to put hive related jars


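The prefix handling described in the doc change above (removing the `datalake.paimon.` prefix before creating the Paimon catalog) can be sketched as follows. This is a minimal illustration of the described behavior, not Fluss's actual implementation, and `toPaimonOptions` is a hypothetical helper name:

```java
import java.util.HashMap;
import java.util.Map;

public class PaimonOptionMapping {
    static final String PREFIX = "datalake.paimon.";

    // Keep only keys carrying the prefix, strip it, and hand the remaining
    // key/value pairs to the Paimon catalog; other keys are ignored here.
    static Map<String, String> toPaimonOptions(Map<String, String> flussConf) {
        Map<String, String> paimonOptions = new HashMap<>();
        for (Map.Entry<String, String> e : flussConf.entrySet()) {
            if (e.getKey().startsWith(PREFIX)) {
                paimonOptions.put(e.getKey().substring(PREFIX.length()), e.getValue());
            }
        }
        return paimonOptions;
    }

    public static void main(String[] args) {
        // The two options from the YAML example above.
        Map<String, String> conf = Map.of(
                "datalake.paimon.metastore", "filesystem",
                "datalake.paimon.warehouse", "/tmp/paimon");
        System.out.println(toPaimonOptions(conf));
    }
}
```

With the YAML example from the doc, this yields a map with `metastore=filesystem` and `warehouse=/tmp/paimon`, which are plain Paimon catalog options.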
4 changes: 2 additions & 2 deletions website/docs/quickstart/lakehouse.md
@@ -117,7 +117,7 @@ The Docker Compose environment consists of the following containers:
- **Flink Cluster**: a Flink `JobManager` and a Flink `TaskManager` container to execute queries.

**Note:** The `apache/fluss-quickstart-flink` image is based on [flink:1.20.3-java17](https://hub.docker.com/layers/library/flink/1.20-java17/images/sha256:296c7c23fa40a9a3547771b08fc65e25f06bc4cfd3549eee243c99890778cafc) and
-includes the [fluss-flink](engine-flink/getting-started.md), [paimon-flink](https://paimon.apache.org/docs/1.0/flink/quick-start/) and
+includes the [fluss-flink](engine-flink/getting-started.md), [paimon-flink](https://paimon.apache.org/docs/1.3/flink/quick-start/) and
[flink-connector-faker](https://flink-packages.org/packages/flink-faker) to simplify this guide.

3. To start all containers, run:
@@ -136,7 +136,7 @@ You can also visit http://localhost:8083/ to see if Flink is running normally.

:::note
- If you want to additionally use an observability stack, follow one of the provided quickstart guides [here](maintenance/observability/quickstart.md) and then continue with this guide.
-- If you want to run with your own Flink environment, remember to download the [fluss-flink connector jar](/downloads), [flink-connector-faker](https://github.com/knaufk/flink-faker/releases), [paimon-flink connector jar](https://paimon.apache.org/docs/1.0/flink/quick-start/) and then put them to `FLINK_HOME/lib/`.
+- If you want to run with your own Flink environment, remember to download the [fluss-flink connector jar](/downloads), [flink-connector-faker](https://github.com/knaufk/flink-faker/releases), and [paimon-flink connector jar](https://paimon.apache.org/docs/1.3/flink/quick-start/), and then put them into `FLINK_HOME/lib/`.
- All the following commands involving `docker compose` should be executed in the created working directory that contains the `docker-compose.yml` file.
:::

@@ -73,7 +73,7 @@ You can choose between two views of the table:
#### Read Data Only in Paimon

##### Prerequisites
-Download the [paimon-flink.jar](https://paimon.apache.org/docs/1.2/) that matches your Flink version, and place it in the `FLINK_HOME/lib` directory
+Download the [paimon-flink.jar](https://paimon.apache.org/docs/1.3/) that matches your Flink version, and place it in the `FLINK_HOME/lib` directory.

##### Read Paimon Data
To read only data stored in Paimon, use the `$lake` suffix in the table name. The following example demonstrates this:
@@ -92,7 +92,7 @@ SELECT * FROM orders$lake$snapshots;

When you specify the `$lake` suffix in a query, the table behaves like a standard Paimon table and inherits all its capabilities.
This allows you to take full advantage of Flink's query support and optimizations on Paimon, such as querying system tables, time travel, and more.
-For further information, refer to Paimon’s [SQL Query documentation](https://paimon.apache.org/docs/0.9/flink/sql-query/#sql-query).
+For further information, refer to Paimon’s [SQL Query documentation](https://paimon.apache.org/docs/1.3/flink/sql-query/#sql-query).

#### Union Read of Data in Fluss and Paimon

@@ -125,7 +125,7 @@ Key behavior for data retention:

### Reading with other Engines

-Since the data tiered to Paimon from Fluss is stored as a standard Paimon table, you can use any engine that supports Paimon to read it. Below is an example using [StarRocks](https://paimon.apache.org/docs/1.2/ecosystem/starrocks/):
+Since the data tiered to Paimon from Fluss is stored as a standard Paimon table, you can use any engine that supports Paimon to read it. Below is an example using [StarRocks](https://paimon.apache.org/docs/1.3/ecosystem/starrocks/):

First, create a Paimon catalog in StarRocks:
