Commit 3685c7a

Docs: Extend the topic description in the Kafka output doc (#46909)
* Extend the topic description in the Kafka output doc for Filebeat
* Update the kafka-output.md for the rest of the beats
1 parent e8b965f commit 3685c7a

File tree

6 files changed: +168 −6 lines changed


docs/reference/auditbeat/kafka-output.md

Lines changed: 28 additions & 1 deletion
@@ -99,7 +99,21 @@ To use `GSSAPI` mechanism to authenticate with Kerberos, you must leave this fie
 
 The Kafka topic used for produced events.
 
-You can set the topic dynamically by using a format string to access any event field. For example, this configuration uses a custom field, `fields.log_topic`, to set the topic for each event:
+You can set a static topic, for example `auditbeat`, or you can use a format string to set a topic dynamically based on one or more [Elastic Common Schema (ECS)](ecs://reference/index.md) fields. Available fields include:
+
+* `data_stream.type`
+* `data_stream.dataset`
+* `data_stream.namespace`
+* `@timestamp`
+* `event.dataset`
+
+For example:
+
+```yaml
+topic: '%{[data_stream.type]}-%{[data_stream.dataset]}-%{[data_stream.namespace]}'
+```
+
+You can also set a custom field. This is useful if you need to construct a more complex or structured topic name. For example, this configuration uses the `fields.log_topic` custom field to set the topic for each event:
 
 ```yaml
 topic: '%{[fields.log_topic]}'
@@ -109,6 +123,19 @@ topic: '%{[fields.log_topic]}'
 To learn how to add custom fields to events, see the [`fields`](/reference/auditbeat/configuration-general-options.md#libbeat-configuration-fields) option.
 ::::
 
+To set a dynamic topic value for outputting {{auditbeat}} data to Kafka, you can add the [`add_fields` processor](/reference/auditbeat/add-fields.md) to {{auditbeat}}'s input configuration settings.
+
+For example, the following `add_fields` processor creates a dynamic topic value for the `fields.log_topic` field by combining multiple [ECS data stream fields](ecs://reference/ecs-data_stream.md):
+
+```yaml
+  - add_fields:
+      target: ''
+      fields:
+        log_topic: '%{[data_stream.type]}-%{[data_stream.dataset]}-%{[data_stream.namespace]}' <1>
+```
+1. Depending on the values of the data stream fields, this generates topic names such as `logs-nginx.access-production` or `metrics-system.cpu-staging` as the value of the custom `log_topic` field.
+
+For more information, refer to [Filter and enhance data with processors](/reference/auditbeat/filtering-enhancing-data.md).
 
 See the [`topics`](#topics-option-kafka) setting for other ways to set the topic dynamically.
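The `topics` setting referenced at the end of this section selects a topic per event with conditional rules rather than a single format string. A minimal sketch in that style, based on the Beats `topics`/`when` rule syntax (the broker address, topic names, and match strings here are illustrative, not part of this commit):

```yaml
output.kafka:
  hosts: ["localhost:9092"]        # illustrative broker address
  # Fallback topic used when no rule below matches.
  topic: "%{[fields.log_topic]}"
  topics:
    # Route events whose message contains CRITICAL to a dedicated topic.
    - topic: "critical-%{[agent.version]}"
      when.contains:
        message: "CRITICAL"
    - topic: "error-%{[agent.version]}"
      when.contains:
        message: "ERR"
```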

docs/reference/filebeat/kafka-output.md

Lines changed: 28 additions & 1 deletion
@@ -99,7 +99,21 @@ To use `GSSAPI` mechanism to authenticate with Kerberos, you must leave this fie
 
 The Kafka topic used for produced events.
 
-You can set the topic dynamically by using a format string to access any event field. For example, this configuration uses a custom field, `fields.log_topic`, to set the topic for each event:
+You can set a static topic, for example `filebeat`, or you can use a format string to set a topic dynamically based on one or more [Elastic Common Schema (ECS)](ecs://reference/index.md) fields. Available fields include:
+
+* `data_stream.type`
+* `data_stream.dataset`
+* `data_stream.namespace`
+* `@timestamp`
+* `event.dataset`
+
+For example:
+
+```yaml
+topic: '%{[data_stream.type]}-%{[data_stream.dataset]}-%{[data_stream.namespace]}'
+```
+
+You can also set a custom field. This is useful if you need to construct a more complex or structured topic name. For example, this configuration uses the `fields.log_topic` custom field to set the topic for each event:
 
 ```yaml
 topic: '%{[fields.log_topic]}'
@@ -109,6 +123,19 @@ topic: '%{[fields.log_topic]}'
 To learn how to add custom fields to events, see the [`fields`](/reference/filebeat/configuration-general-options.md#libbeat-configuration-fields) option.
 ::::
 
+To set a dynamic topic value for outputting {{filebeat}} data to Kafka, you can add the [`add_fields` processor](/reference/filebeat/add-fields.md) to {{filebeat}}'s input configuration settings.
+
+For example, the following `add_fields` processor creates a dynamic topic value for the `fields.log_topic` field by combining multiple [ECS data stream fields](ecs://reference/ecs-data_stream.md):
+
+```yaml
+  - add_fields:
+      target: ''
+      fields:
+        log_topic: '%{[data_stream.type]}-%{[data_stream.dataset]}-%{[data_stream.namespace]}' <1>
+```
+1. Depending on the values of the data stream fields, this generates topic names such as `logs-nginx.access-production` or `metrics-system.cpu-staging` as the value of the custom `log_topic` field.
+
+For more information, refer to [Filter and enhance data with processors](/reference/filebeat/filtering-enhancing-data.md).
 
 See the [`topics`](#topics-option-kafka) setting for other ways to set the topic dynamically.
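The `%{[field]}` format-string expansion used throughout these examples can be mimicked in a few lines of Python. This is an illustration of the substitution behavior only, not Beats code:

```python
import re


def expand(template: str, event: dict) -> str:
    """Mimic Beats-style %{[dotted.key]} format-string expansion (illustrative only)."""
    # Each reference names an event field; a flat dict of dotted keys keeps this simple.
    return re.sub(r"%\{\[([^\]]+)\]\}", lambda m: str(event[m.group(1)]), template)


event = {
    "data_stream.type": "logs",
    "data_stream.dataset": "nginx.access",
    "data_stream.namespace": "production",
}
topic = expand("%{[data_stream.type]}-%{[data_stream.dataset]}-%{[data_stream.namespace]}", event)
print(topic)  # logs-nginx.access-production
```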

docs/reference/heartbeat/kafka-output.md

Lines changed: 28 additions & 1 deletion
@@ -99,7 +99,21 @@ To use `GSSAPI` mechanism to authenticate with Kerberos, you must leave this fie
 
 The Kafka topic used for produced events.
 
-You can set the topic dynamically by using a format string to access any event field. For example, this configuration uses a custom field, `fields.log_topic`, to set the topic for each event:
+You can set a static topic, for example `heartbeat`, or you can use a format string to set a topic dynamically based on one or more [Elastic Common Schema (ECS)](ecs://reference/index.md) fields. Available fields include:
+
+* `data_stream.type`
+* `data_stream.dataset`
+* `data_stream.namespace`
+* `@timestamp`
+* `event.dataset`
+
+For example:
+
+```yaml
+topic: '%{[data_stream.type]}-%{[data_stream.dataset]}-%{[data_stream.namespace]}'
+```
+
+You can also set a custom field. This is useful if you need to construct a more complex or structured topic name. For example, this configuration uses the `fields.log_topic` custom field to set the topic for each event:
 
 ```yaml
 topic: '%{[fields.log_topic]}'
@@ -109,6 +123,19 @@ topic: '%{[fields.log_topic]}'
 To learn how to add custom fields to events, see the [`fields`](/reference/heartbeat/configuration-general-options.md#libbeat-configuration-fields) option.
 ::::
 
+To set a dynamic topic value for outputting {{heartbeat}} data to Kafka, you can add the [`add_fields` processor](/reference/heartbeat/add-fields.md) to {{heartbeat}}'s input configuration settings.
+
+For example, the following `add_fields` processor creates a dynamic topic value for the `fields.log_topic` field by combining multiple [ECS data stream fields](ecs://reference/ecs-data_stream.md):
+
+```yaml
+  - add_fields:
+      target: ''
+      fields:
+        log_topic: '%{[data_stream.type]}-%{[data_stream.dataset]}-%{[data_stream.namespace]}' <1>
+```
+1. Depending on the values of the data stream fields, this generates topic names such as `logs-nginx.access-production` or `metrics-system.cpu-staging` as the value of the custom `log_topic` field.
+
+For more information, refer to [Filter and enhance data with processors](/reference/heartbeat/filtering-enhancing-data.md).
 
 See the [`topics`](#topics-option-kafka) setting for other ways to set the topic dynamically.

docs/reference/metricbeat/kafka-output.md

Lines changed: 28 additions & 1 deletion
@@ -99,7 +99,21 @@ To use `GSSAPI` mechanism to authenticate with Kerberos, you must leave this fie
 
 The Kafka topic used for produced events.
 
-You can set the topic dynamically by using a format string to access any event field. For example, this configuration uses a custom field, `fields.log_topic`, to set the topic for each event:
+You can set a static topic, for example `metricbeat`, or you can use a format string to set a topic dynamically based on one or more [Elastic Common Schema (ECS)](ecs://reference/index.md) fields. Available fields include:
+
+* `data_stream.type`
+* `data_stream.dataset`
+* `data_stream.namespace`
+* `@timestamp`
+* `event.dataset`
+
+For example:
+
+```yaml
+topic: '%{[data_stream.type]}-%{[data_stream.dataset]}-%{[data_stream.namespace]}'
+```
+
+You can also set a custom field. This is useful if you need to construct a more complex or structured topic name. For example, this configuration uses the `fields.log_topic` custom field to set the topic for each event:
 
 ```yaml
 topic: '%{[fields.log_topic]}'
@@ -109,6 +123,19 @@ topic: '%{[fields.log_topic]}'
 To learn how to add custom fields to events, see the [`fields`](/reference/metricbeat/configuration-general-options.md#libbeat-configuration-fields) option.
 ::::
 
+To set a dynamic topic value for outputting {{metricbeat}} data to Kafka, you can add the [`add_fields` processor](/reference/metricbeat/add-fields.md) to {{metricbeat}}'s input configuration settings.
+
+For example, the following `add_fields` processor creates a dynamic topic value for the `fields.log_topic` field by combining multiple [ECS data stream fields](ecs://reference/ecs-data_stream.md):
+
+```yaml
+  - add_fields:
+      target: ''
+      fields:
+        log_topic: '%{[data_stream.type]}-%{[data_stream.dataset]}-%{[data_stream.namespace]}' <1>
+```
+1. Depending on the values of the data stream fields, this generates topic names such as `logs-nginx.access-production` or `metrics-system.cpu-staging` as the value of the custom `log_topic` field.
+
+For more information, refer to [Filter and enhance data with processors](/reference/metricbeat/filtering-enhancing-data.md).
 
 See the [`topics`](#topics-option-kafka) setting for other ways to set the topic dynamically.

docs/reference/packetbeat/kafka-output.md

Lines changed: 28 additions & 1 deletion
@@ -99,7 +99,21 @@ To use `GSSAPI` mechanism to authenticate with Kerberos, you must leave this fie
 
 The Kafka topic used for produced events.
 
-You can set the topic dynamically by using a format string to access any event field. For example, this configuration uses a custom field, `fields.log_topic`, to set the topic for each event:
+You can set a static topic, for example `packetbeat`, or you can use a format string to set a topic dynamically based on one or more [Elastic Common Schema (ECS)](ecs://reference/index.md) fields. Available fields include:
+
+* `data_stream.type`
+* `data_stream.dataset`
+* `data_stream.namespace`
+* `@timestamp`
+* `event.dataset`
+
+For example:
+
+```yaml
+topic: '%{[data_stream.type]}-%{[data_stream.dataset]}-%{[data_stream.namespace]}'
+```
+
+You can also set a custom field. This is useful if you need to construct a more complex or structured topic name. For example, this configuration uses the `fields.log_topic` custom field to set the topic for each event:
 
 ```yaml
 topic: '%{[fields.log_topic]}'
@@ -109,6 +123,19 @@ topic: '%{[fields.log_topic]}'
 To learn how to add custom fields to events, see the [`fields`](/reference/packetbeat/configuration-general-options.md#libbeat-configuration-fields) option.
 ::::
 
+To set a dynamic topic value for outputting {{packetbeat}} data to Kafka, you can add the [`add_fields` processor](/reference/packetbeat/add-fields.md) to {{packetbeat}}'s input configuration settings.
+
+For example, the following `add_fields` processor creates a dynamic topic value for the `fields.log_topic` field by combining multiple [ECS data stream fields](ecs://reference/ecs-data_stream.md):
+
+```yaml
+  - add_fields:
+      target: ''
+      fields:
+        log_topic: '%{[data_stream.type]}-%{[data_stream.dataset]}-%{[data_stream.namespace]}' <1>
+```
+1. Depending on the values of the data stream fields, this generates topic names such as `logs-nginx.access-production` or `metrics-system.cpu-staging` as the value of the custom `log_topic` field.
+
+For more information, refer to [Filter and enhance data with processors](/reference/packetbeat/filtering-enhancing-data.md).
 
 See the [`topics`](#topics-option-kafka) setting for other ways to set the topic dynamically.

docs/reference/winlogbeat/kafka-output.md

Lines changed: 28 additions & 1 deletion
@@ -99,7 +99,21 @@ To use `GSSAPI` mechanism to authenticate with Kerberos, you must leave this fie
 
 The Kafka topic used for produced events.
 
-You can set the topic dynamically by using a format string to access any event field. For example, this configuration uses a custom field, `fields.log_topic`, to set the topic for each event:
+You can set a static topic, for example `winlogbeat`, or you can use a format string to set a topic dynamically based on one or more [Elastic Common Schema (ECS)](ecs://reference/index.md) fields. Available fields include:
+
+* `data_stream.type`
+* `data_stream.dataset`
+* `data_stream.namespace`
+* `@timestamp`
+* `event.dataset`
+
+For example:
+
+```yaml
+topic: '%{[data_stream.type]}-%{[data_stream.dataset]}-%{[data_stream.namespace]}'
+```
+
+You can also set a custom field. This is useful if you need to construct a more complex or structured topic name. For example, this configuration uses the `fields.log_topic` custom field to set the topic for each event:
 
 ```yaml
 topic: '%{[fields.log_topic]}'
@@ -109,6 +123,19 @@ topic: '%{[fields.log_topic]}'
 To learn how to add custom fields to events, see the [`fields`](/reference/winlogbeat/configuration-general-options.md#libbeat-configuration-fields) option.
 ::::
 
+To set a dynamic topic value for outputting {{winlogbeat}} data to Kafka, you can add the [`add_fields` processor](/reference/winlogbeat/add-fields.md) to {{winlogbeat}}'s input configuration settings.
+
+For example, the following `add_fields` processor creates a dynamic topic value for the `fields.log_topic` field by combining multiple [ECS data stream fields](ecs://reference/ecs-data_stream.md):
+
+```yaml
+  - add_fields:
+      target: ''
+      fields:
+        log_topic: '%{[data_stream.type]}-%{[data_stream.dataset]}-%{[data_stream.namespace]}' <1>
+```
+1. Depending on the values of the data stream fields, this generates topic names such as `logs-nginx.access-production` or `metrics-system.cpu-staging` as the value of the custom `log_topic` field.
+
+For more information, refer to [Filter and enhance data with processors](/reference/winlogbeat/filtering-enhancing-data.md).
 
 See the [`topics`](#topics-option-kafka) setting for other ways to set the topic dynamically.
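Putting the two pieces together, a hypothetical end-to-end configuration could let the processor build the topic name and the output consume it. This sketch assumes the `add_fields` processor's default `fields` target (rather than the `''` target shown in the diff) so that the value lands under `fields.log_topic`; the broker address is illustrative:

```yaml
processors:
  # With the processor's default target, the value is stored as fields.log_topic.
  - add_fields:
      fields:
        log_topic: '%{[data_stream.type]}-%{[data_stream.dataset]}-%{[data_stream.namespace]}'

output.kafka:
  hosts: ["kafka1:9092"]           # illustrative broker address
  topic: '%{[fields.log_topic]}'
```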
