Commit 4e10631

docs: update kafka source doc (#3663)

Signed-off-by: Song Gao <disxiaofei@163.com>
1 parent d95877e

4 files changed: +40 -42 lines

docs/en_US/guide/sinks/plugin/kafka.md (2 additions, 2 deletions)

@@ -60,7 +60,7 @@ Restart the eKuiper server to activate the plugin.
 | topic              | false | The topic of the Kafka |
 | saslAuthType       | false | The Kafka sasl authType, support none, plain, scram |
 | saslUserName       | true  | The sasl user name |
-| saslPassword       | true  | The sasl password |
+| password           | true  | The sasl password |
 | insecureSkipVerify | true  | Whether to ignore SSL verification |
 | certificationPath  | true  | Kafka client ssl verification Cert file path |
 | privateKeyPath     | true  | Key file path for Kafka client SSL verification |
@@ -73,7 +73,7 @@ Restart the eKuiper server to activate the plugin.
 | key                | true  | Key information carried by the Kafka client in messages sent to the server |
 | headers            | true  | The header information carried by the Kafka client in messages sent to the server |
 | compression        | true  | Whether to enable compression when the Kafka client sends messages to the server; only `gzip`, `snappy`, `lz4`, `zstd` are supported |
-| batchBytes         | true  | Set the maximum number of bytes for the Kafka client to send batch messages to the server, default is 1048576 |
+| batchBytes         | true  | Set the maximum number of bytes for the Kafka client to send batch messages to the server, default is 1048576 |

 You can check the connectivity of the corresponding sink endpoint in advance through the API: [Connectivity Check](../../../api/restapi/connection.md#connectivity-check)

docs/en_US/guide/sources/plugin/kafka.md (18 additions, 19 deletions)

@@ -28,22 +28,21 @@ default:
 You can check the connectivity of the corresponding sink endpoint in advance through the API: [Connectivity Check](../../../api/restapi/connection.md#connectivity-check)

-### Global configurations
-
-User can specify the global Kafka source settings here. The configuration items specified in the `default` section will be taken as default settings for the source when running this source.
-
-### brokers
-
-Kafka message source address; addresses are separated by `,`.
-
-### groupID
-
-The group ID used by eKuiper when consuming kafka messages.
-
-### partition
-
-The partition specified when eKuiper consumes kafka messages
-
-### maxBytes
-
-The maximum number of bytes that a single Kafka message batch can carry, the default is 1MB
+### Properties
+
+| Property name      | Optional | Description |
+|--------------------|----------|-------------|
+| brokers            | false    | The broker address list, split with "," |
+| saslAuthType       | true     | The Kafka sasl authType, support none, plain, scram; default none |
+| saslUserName       | true     | The sasl user name |
+| password           | true     | The sasl password |
+| insecureSkipVerify | true     | Whether to ignore SSL verification |
+| certificationPath  | true     | Kafka client ssl verification Cert file path |
+| privateKeyPath     | true     | Key file path for Kafka client SSL verification |
+| rootCaPath         | true     | Kafka client ssl verified CA certificate file path |
+| certficationRaw    | true     | Kafka client ssl verified Cert base64 encoded original text; `certificationPath` is used first if both are defined |
+| privateKeyRaw      | true     | Kafka client ssl verified Key base64 encoded original text; `privateKeyPath` is used first if both are defined |
+| rootCARaw          | true     | Kafka client ssl verified CA base64 encoded original text; `rootCaPath` is used first if both are defined |
+| maxBytes           | true     | The maximum number of bytes that a single Kafka message batch can carry; the default is 1MB |
+| groupID            | true     | The group ID used by eKuiper when consuming kafka messages |
+| partition          | true     | The partition specified when eKuiper consumes kafka messages |
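Collected into the `default:` YAML layout shown in the hunk context, a source configuration built from the new table might look like the sketch below; every value is a placeholder assumption, not a recommendation from this commit:

```yaml
default:
  brokers: "127.0.0.1:9092"
  saslAuthType: none
  groupID: ekuiper_group
  partition: 0
  maxBytes: 1048576
```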

docs/zh_CN/guide/sinks/plugin/kafka.md (2 additions, 2 deletions)

@@ -60,7 +60,7 @@ $(PLUGINS_CUSTOM):
 | topic              || The kafka topic |
 | saslAuthType       || The sasl auth type; supports none, plain, scram |
 | saslUserName       || The sasl user name |
-| saslPassword       || The sasl password |
+| password           || The sasl password |
 | insecureSkipVerify || Whether to ignore SSL verification |
 | certificationPath  || Kafka client ssl verification crt file path |
 | privateKeyPath     || Kafka client ssl verification key file path |
@@ -73,7 +73,7 @@ $(PLUGINS_CUSTOM):
 | key                || Key information carried in messages the Kafka client sends to the server |
 | headers            || Header information carried in messages the Kafka client sends to the server |
 | compression        || Whether to enable compression when the Kafka client sends messages to the server; only `gzip`, `snappy`, `lz4`, `zstd` are supported |
-| batchBytes         || The maximum number of bytes for batch messages the Kafka client sends to the server; default is 1048576 |
+| batchBytes         || The maximum number of bytes for batch messages the Kafka client sends to the server; default is 1048576 |

 Other common sink properties are also supported; see [Common Properties](../overview.md#公共属性)

docs/zh_CN/guide/sources/plugin/kafka.md (18 additions, 19 deletions)

@@ -28,22 +28,21 @@ default:
 You can check the connectivity of the corresponding sink endpoint in advance through the API: [Connectivity Check](../../../api/restapi/connection.md#连通性检查)

-### Global configuration
-
-Users can specify global kafka source settings here. The configuration items specified in the `default` section will be taken as the source's default settings when running this source.
-
-### brokers
-
-The kafka message source address; multiple addresses are separated by `,`.
-
-### groupID
-
-The group ID used by eKuiper when consuming kafka messages.
-
-### partition
-
-The partition specified when eKuiper consumes kafka messages
-
-### maxBytes
-
-The maximum number of bytes that a single kafka message batch can carry; default is 1MB
+### Properties
+
+| Property name      | Optional | Description |
+|--------------------|----------|-------------|
+| brokers            | no       | The broker address list, separated by "," |
+| saslAuthType       | yes      | The sasl auth type; supports none, plain, scram; default none |
+| saslUserName       | yes      | The sasl user name |
+| password           | yes      | The sasl password |
+| insecureSkipVerify | yes      | Whether to ignore SSL verification |
+| certificationPath  | yes      | Kafka client ssl verification crt file path |
+| privateKeyPath     | yes      | Kafka client ssl verification key file path |
+| rootCaPath         | yes      | Kafka client ssl verification ca certificate file path |
+| certficationRaw    | yes      | Base64 encoded crt text for Kafka client ssl verification; `certificationPath` is used first if both are defined |
+| privateKeyRaw      | yes      | Base64 encoded key text for Kafka client ssl verification; `privateKeyPath` is used first if both are defined |
+| rootCARaw          | yes      | Base64 encoded ca text for Kafka client ssl verification; `rootCAPath` is used first if both are defined |
+| maxBytes           | yes      | The maximum number of bytes that a single kafka message batch can carry; default is 1MB |
+| groupID            | yes      | The group ID used by eKuiper when consuming kafka messages |
+| partition          | yes      | The partition specified when eKuiper consumes kafka messages |
