Commit 13ef946

(fix): wrong countings && indentation
1 parent 5c53fcd commit 13ef946

3 files changed: +9 −9 lines changed

content/cn/docs/quickstart/client/hugegraph-client.md
Lines changed: 5 additions & 5 deletions

```diff
@@ -227,11 +227,11 @@ public class BatchExample {
     public static void main(String[] args) {
         // If connect failed will throw a exception.
         HugeClient hugeClient = HugeClient.builder("http://localhost:8080",
-                "DEFAULT",
-                "hugegraph")
-                .configUser("username", "password")
-                // This is an example; use secure credentials in production
-                .build();
+                                                   "DEFAULT",
+                                                   "hugegraph")
+                                          .configUser("username", "password")
+                                          // This is an example; use secure credentials in production
+                                          .build();
 
         SchemaManager schema = hugeClient.schema();
 
```
content/cn/docs/quickstart/toolchain/hugegraph-loader.md
Lines changed: 2 additions & 2 deletions

```diff
@@ -601,7 +601,7 @@ bin/mapping-convert.sh struct.json
 
 ##### 3.3.2 Input Source
 
-Input sources currently fall into four categories: FILE, HDFS, JDBC, KAFKA and GRAPH, distinguished by the `type` node. We call them the local file input source, HDFS input source, JDBC input source, KAFKA input source and graph data source, each described below.
+Input sources currently fall into five categories: FILE, HDFS, JDBC, KAFKA and GRAPH, distinguished by the `type` node. We call them the local file input source, HDFS input source, JDBC input source, KAFKA input source and graph data source, each described below.
 
 ###### 3.3.2.1 Local file input source
 
@@ -842,7 +842,7 @@ schema: required
 | `--check-vertex` | false | | Whether to check that the vertices connected by an edge exist when inserting the edge |
 | `--print-progress` | true | | Whether to print the number of imported items in real time on the console |
 | `--dry-run` | false | | Enable this mode to only parse without importing; usually used for testing |
-| `--help` | false | | Print help information | |
+| `--help` | false | | Print help information |
 
 ##### 3.4.2 Resume-from-breakpoint mode
 
```
content/en/docs/quickstart/toolchain/hugegraph-loader.md
Lines changed: 2 additions & 2 deletions

```diff
@@ -590,7 +590,7 @@ A struct-v2.json will be generated in the same directory as struct.json.
 
 ##### 3.3.2 Input Source
 
-Input sources are currently divided into four categories: FILE, HDFS, JDBC, KAFKA and GRAPH, which are distinguished by the `type` node. We call them local file input sources, HDFS input sources, JDBC input sources, KAFKA input sources and GRAPH input source, which are described below.
+Input sources are currently divided into five categories: FILE, HDFS, JDBC, KAFKA and GRAPH, which are distinguished by the `type` node. We call them local file input sources, HDFS input sources, JDBC input sources, KAFKA input sources and GRAPH input source, which are described below.
 
 ###### 3.3.2.1 Local file input source
 
@@ -831,7 +831,7 @@ The import process is controlled by commands submitted by the user, and the user
 | `--check-vertex` | false | | Whether to check if the vertices connected by the edge exist when inserting the edge |
 | `--print-progress` | true | | Whether to print the number of imported items in real time on the console |
 | `--dry-run` | false | | Enable this mode to only parse data without importing; usually used for testing |
-| `--help` | false | | Print help information | |
+| `--help` | false | | Print help information |
 
 ##### 3.4.2 Breakpoint Continuation Mode
 
```

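For context on the sentence being corrected in the hunks above: in hugegraph-loader's mapping file, each input source is a JSON block whose category is selected by its `type` field. A minimal sketch of a local FILE source is shown below; the surrounding `vertices`/`label`/`input` shape follows typical struct.json examples, but the exact field names and values here are illustrative and should be checked against the loader documentation:

```json
{
  "vertices": [
    {
      "label": "person",
      "input": {
        "type": "file",
        "path": "vertex_person.csv",
        "format": "CSV",
        "charset": "UTF-8",
        "header": ["name", "age", "city"]
      }
    }
  ]
}
```

Swapping `"type": "file"` for `hdfs`, `jdbc`, `kafka`, or `graph` selects one of the other four input-source categories, each with its own source-specific fields.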