@@ -68,6 +68,8 @@ services:
         datalake.format: paimon
         datalake.paimon.metastore: filesystem
         datalake.paimon.warehouse: /tmp/paimon
+    volumes:
+      - shared-tmpfs:/tmp/paimon
   tablet-server:
     image: fluss/fluss:$FLUSS_VERSION$
     command: tabletServer
@@ -84,6 +86,8 @@ services:
         datalake.format: paimon
         datalake.paimon.metastore: filesystem
         datalake.paimon.warehouse: /tmp/paimon
+    volumes:
+      - shared-tmpfs:/tmp/paimon
   zookeeper:
     restart: always
     image: zookeeper:3.9.2
@@ -363,9 +367,15 @@ SELECT * FROM fluss_customer WHERE `cust_key` = 1;
 To integrate with [Apache Paimon](https://paimon.apache.org/), you need to start the `Lakehouse Tiering Service`.
 Open a new terminal, navigate to the `fluss-quickstart-flink` directory, and execute the following command within this directory to start the service:
 ```shell
-docker compose exec coordinator-server ./bin/lakehouse.sh -Dflink.rest.address=jobmanager -Dflink.rest.port=8081 -Dflink.execution.checkpointing.interval=30s -Dbootstrap.servers=coordinator-server:9123
+docker compose exec jobmanager \
+    /opt/flink/bin/flink run \
+    /opt/flink/opt/fluss-flink-tiering-$FLUSS_VERSION_SHORT$.jar \
+    --fluss.bootstrap.servers coordinator-server:9123 \
+    --datalake.format paimon \
+    --datalake.paimon.metastore filesystem \
+    --datalake.paimon.warehouse /tmp/paimon
 ```
-You should see a Flink Job named `fluss-paimon-tiering-service` running in the [Flink Web UI](http://localhost:8083/).
+You should see a Flink job that tiers data from Fluss to Paimon running in the [Flink Web UI](http://localhost:8083/).
 
 ### Streaming into Fluss datalake-enabled tables
 
@@ -387,7 +397,10 @@ CREATE TABLE datalake_enriched_orders (
     `cust_mktsegment` STRING,
     `nation_name` STRING,
     PRIMARY KEY (`order_key`) NOT ENFORCED
-) WITH ('table.datalake.enabled' = 'true');
+) WITH (
+    'table.datalake.enabled' = 'true',
+    'table.datalake.freshness' = '30s'
+);
 ```
 
 Next, perform streaming data writing into the **datalake-enabled** table, `datalake_enriched_orders`:
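The mounts added above reference a named volume, `shared-tmpfs`, which must also be declared at the top level of the compose file so that the coordinator server and the tablet server share the same `/tmp/paimon` warehouse directory. A minimal sketch of that declaration — the `driver`/`driver_opts` values are an assumption (a tmpfs-backed local volume, as used in similar Flink quickstarts), not taken from this diff:

```yaml
# Hypothetical top-level volume declaration for the shared Paimon warehouse.
# The tmpfs-backed driver options below are an assumption; any volume that
# both containers can mount read-write at /tmp/paimon would serve the same purpose.
volumes:
  shared-tmpfs:
    driver: local
    driver_opts:
      type: "tmpfs"
      device: "tmpfs"
```

With both services mounting the same volume, the Paimon files written by one container are immediately visible to the other, which is what lets a single filesystem-metastore warehouse path work across services.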