docs/overview.md
### Sink

Sink is the output component of the pipeline. It defines one or more destinations to which a pipeline publishes records.
### Processor
Processors are the intermediary processing units of the pipeline, with which users can filter, transform, and enrich records into the desired format before publishing them to the sink. The processor is an optional component of the pipeline; if none is defined, records are published in the format defined by the source. You can have more than one processor, and they are executed in the order in which they are defined in the pipeline spec.
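For example, processors are listed under the `processor:` key and run top to bottom. The following is a minimal sketch chaining two processors; the specific processors and options shown are illustrative:

```
processor:
  - grok:
      match:
        log: [ "%{COMMONAPACHELOG}" ]
  - string_converter:
      upper_case: true
```

Here each record would first be parsed by `grok` and then upper-cased by `string_converter`, in that order.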
### Route
Data Prepper supports routes on events. These allow pipeline authors to send only events matching certain conditions to different sinks.
### Prepper
Preppers were renamed to Processors and are no longer supported starting in Data Prepper 2.0. Please see the [Deprecated Pipeline Configuration Support](configuration.md#deprecated-pipeline-configuration-support) section of the Configuration document for more details.
output-pipeline-2:
```
The above configuration uses Pipeline Connectors. `input-pipeline` is configured with `output-pipeline-1` and `output-pipeline-2` as sinks. With the help of Pipeline Connectors, we can read once from the input file and write upper-case values to `output-1-file` and lower-case values to `output-2-file`.
## Conditional Routing
In many situations, pipeline authors want to route some events to certain sinks. They can configure their pipeline to do this by using Data Prepper's route feature. The pipeline author first configures routes in the `route:` part of the pipeline configuration.
The following shows how this is configured. In the example below, `info_level` and `warn_and_above` are the names of two different routes. As a pipeline author, you can define these names to match your use case. After the name of each route, you define the condition that must hold for the route to apply to a given event. For more information on how to define conditions, see the [Data Prepper Expression Syntax](expression_syntax.md) guide.
```
route:
  - info_level: '/loglevel == "INFO"'
  - warn_and_above: '/loglevel == "WARN" or /loglevel == "ERROR"'
```
Now that you have defined some routes, you can apply them to your sinks. You do this by adding the `routes:` property to any sink that you want to have routing applied to. You can specify a list of routes; any route that applies to an event will cause that event to be routed to the sink. If you do not apply a route to a sink, then all events will go to that sink.
The following shows a snippet of a sink that outputs events to a `file` sink. It will only output events that match the `info_level` or `warn_and_above` routes.
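A minimal sketch of such a sink, assuming Data Prepper's `file` sink; the output path is illustrative:

```
sink:
  - file:
      path: /usr/share/out-file.txt
      routes:
        - info_level
        - warn_and_above
```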
The following shows a full example pipeline. In this example, Data Prepper accepts log events from the `http` source, runs the `grok` processor to set the `loglevel` value, and then outputs to the sinks. The first sink (writing to `out-info-above.txt`) receives events matching either `info_level` or `warn_and_above`. The second sink (writing to `out-warn-above.txt`) only receives events matching `warn_and_above`. The last sink (writing to `out-all.txt`) receives all events.
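A sketch of such a pipeline follows. The pipeline name, `grok` pattern, and file paths are illustrative assumptions; the routing structure mirrors the description above:

```
routing-example-pipeline:
  source:
    http:
  processor:
    - grok:
        match:
          log: [ "%{DATA} %{LOGLEVEL:loglevel} %{GREEDYDATA}" ]
  route:
    - info_level: '/loglevel == "INFO"'
    - warn_and_above: '/loglevel == "WARN" or /loglevel == "ERROR"'
  sink:
    - file:
        path: /usr/share/out-info-above.txt
        routes:
          - info_level
          - warn_and_above
    - file:
        path: /usr/share/out-warn-above.txt
        routes:
          - warn_and_above
    - file:
        path: /usr/share/out-all.txt
```

Note that the last sink defines no `routes:` property, so it receives every event regardless of `loglevel`.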