@@ -37,6 +37,46 @@ Flink CDC 深度集成并由 Apache Flink 驱动,提供以下核心功能:
 * ✅ 整库同步
 * ✅ 具备表结构变更自动同步的能力(Schema Evolution)
 
+## 环境要求
+
+Flink CDC 有以下环境要求:
+
+* **JDK**:JDK 11 或更高版本(Flink CDC 从 3.6.0 版本开始基于 JDK 11 构建)
+* **Apache Flink**:Flink 1.20.x 或 Flink 2.2.x
+
+{{< hint info >}}
+在运行 Flink CDC 之前,请确保已安装正确的 JDK 版本。您可以使用 `java -version` 命令验证 Java 版本。
+{{< /hint >}}
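上述提示中的版本检查也可以用脚本自动化。下面是一个仅作演示的解析示例:其中的版本字符串是假设的样例数据,实际使用时可替换为 `java -version` 的真实输出。

```shell
# 示例:从 java -version 风格的输出中解析 Java 主版本号,并检查是否满足 JDK 11
# 注意:ver_line 是假设的样例字符串;实际场景可用
#   ver_line=$(java -version 2>&1 | head -n 1)
ver_line='openjdk version "17.0.9" 2023-10-17'

# 提取引号内第一段数字作为主版本号(JDK 8 及更早版本形如 "1.8.0",解析结果为 1,同样判定为不满足)
major=$(printf '%s\n' "$ver_line" | sed -E 's/.*version "([0-9]+).*/\1/')

if [ "$major" -ge 11 ]; then
  echo "JDK $major 满足要求"
else
  echo "JDK $major 过低,需要 JDK 11 或更高版本"
fi
```

该脚本只依赖 POSIX shell 与 `sed`,可以直接嵌入部署前的环境检查步骤中。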
+
+## 支持的连接器
+
+Flink CDC 提供了丰富的连接器生态系统,用于与各种外部系统进行交互:
+
+| 连接器 | 类型 |
+|-----------|------|
+| MySQL | [Source Connector]({{< ref "docs/connectors/flink-sources/mysql-cdc" >}}) / [Pipeline Source Connector]({{< ref "docs/connectors/pipeline-connectors/mysql" >}}) |
+| Oracle | [Source Connector]({{< ref "docs/connectors/flink-sources/oracle-cdc" >}}) / [Pipeline Source Connector]({{< ref "docs/connectors/pipeline-connectors/oracle" >}}) |
+| PostgreSQL | [Source Connector]({{< ref "docs/connectors/flink-sources/postgres-cdc" >}}) / [Pipeline Source Connector]({{< ref "docs/connectors/pipeline-connectors/postgres" >}}) |
+| Db2 | [Source Connector]({{< ref "docs/connectors/flink-sources/db2-cdc" >}}) |
+| MongoDB | [Source Connector]({{< ref "docs/connectors/flink-sources/mongodb-cdc" >}}) |
+| SQL Server | [Source Connector]({{< ref "docs/connectors/flink-sources/sqlserver-cdc" >}}) |
+| TiDB | [Source Connector]({{< ref "docs/connectors/flink-sources/tidb-cdc" >}}) |
+| Vitess | [Source Connector]({{< ref "docs/connectors/flink-sources/vitess-cdc" >}}) |
+| Apache Doris | [Pipeline Sink Connector]({{< ref "docs/connectors/pipeline-connectors/doris" >}}) |
+| Elasticsearch | [Pipeline Sink Connector]({{< ref "docs/connectors/pipeline-connectors/elasticsearch" >}}) |
+| Fluss | [Pipeline Sink Connector]({{< ref "docs/connectors/pipeline-connectors/fluss" >}}) |
+| Hudi | [Pipeline Sink Connector]({{< ref "docs/connectors/pipeline-connectors/hudi" >}}) |
+| Iceberg | [Pipeline Sink Connector]({{< ref "docs/connectors/pipeline-connectors/iceberg" >}}) |
+| Kafka | [Pipeline Sink Connector]({{< ref "docs/connectors/pipeline-connectors/kafka" >}}) |
+| MaxCompute | [Pipeline Sink Connector]({{< ref "docs/connectors/pipeline-connectors/maxcompute" >}}) |
+| OceanBase | [Pipeline Sink Connector]({{< ref "docs/connectors/pipeline-connectors/oceanbase" >}}) |
+| Paimon | [Pipeline Sink Connector]({{< ref "docs/connectors/pipeline-connectors/paimon" >}}) |
+| StarRocks | [Pipeline Sink Connector]({{< ref "docs/connectors/pipeline-connectors/starrocks" >}}) |
+
+{{< hint info >}}
+有关每个连接器的详细信息,包括支持的版本、功能和配置选项,请参考[连接器]({{< ref "docs/connectors" >}})部分。
+{{< /hint >}}
+
 ## 如何使用 Flink CDC
 
 Flink CDC 提供了基于 `YAML` 格式的用户 API,更适合于数据集成场景。以下是一个 `YAML` 文件的示例,它定义了一个数据管道(Pipeline),该 Pipeline 从 MySQL 捕获实时变更,并将它们同步到 Apache Doris:
@@ -77,8 +117,10 @@ pipeline:
 
 查看快速入门指南,了解如何建立一个 Flink CDC Pipeline:
 
-- [MySQL to Apache Doris]({{< ref "docs/get-started/quickstart/mysql-to-doris" >}})
-- [MySQL to StarRocks]({{< ref "docs/get-started/quickstart/mysql-to-starrocks" >}})
+| 示例 | 版本 |
+|---------|---------|
+| MySQL to Apache Doris | [1.20.x]({{< ref "docs/get-started/quickstart-for-1.20/mysql-to-doris" >}}) / [2.2.x]({{< ref "docs/get-started/quickstart-for-2.2/mysql-to-doris" >}}) |
+| MySQL to StarRocks | [1.20.x]({{< ref "docs/get-started/quickstart-for-1.20/mysql-to-starrocks" >}}) / [2.2.x]({{< ref "docs/get-started/quickstart-for-2.2/mysql-to-starrocks" >}}) |
 
 ### 理解核心概念
 