
Commit 0805750

feat(kafka): add support for schema registries
1 parent 538d257 commit 0805750

7 files changed: +263 -1 lines changed


app/_hub/kong-inc/confluent-consume/overview/_index.md

Lines changed: 15 additions & 0 deletions
@@ -98,6 +98,21 @@ The following steps assume that {{site.base_gateway}} is installed.
  --header 'Accept: text/event-stream'
```

## Schema Registry Support

The Confluent Consume plugin supports integration with Schema Registry for AVRO and JSON schemas. Currently, only Confluent Schema Registry is supported.

For more information about Schema Registry integration, see the [Schema Registry documentation](/hub/kong-inc/schema-registry/).

## Configuration

To configure Schema Registry with the Confluent Consume plugin, use the `schema_registry` parameter in your plugin configuration. See the [Schema Registry Configuration](/hub/kong-inc/schema-registry/configuration/) for specific Schema Registry options.

## Related Resources

- [Schema Registry](/hub/kong-inc/schema-registry/)
- [Confluent Schema Registry Documentation](https://docs.confluent.io/platform/current/schema-registry/index.html)

## Learn more about the Confluent Consume plugin

* [Configuration reference](/hub/kong-inc/confluent-consume/configuration/)

app/_hub/kong-inc/confluent/overview/_index.md

Lines changed: 17 additions & 1 deletion
@@ -10,12 +10,13 @@ Kong also provides Kafka Log and Kafka Upstream plugins for publishing logs and
* See [Kafka Log](/hub/kong-inc/kafka-log/)
* See [Kafka Upstream](/hub/kong-inc/kafka-upstream/)

{:.note}
> **Note**: This plugin has the following known limitations:
> * Message compression is not supported.
> * The message format is not customizable.
> * {{site.base_gateway}} does not support Kafka 4.0.

## Quickstart

### Prerequisites
@@ -62,3 +63,18 @@ To check that the message has been added to the topic in the Confluent Cloud con
1. From the navigation menu, select **Topics** to show the list of topics in your cluster.
2. Select the topic you sent messages to.
3. In the topic detail page, select the **Messages** tab to view the messages being produced to the topic.

## Schema Registry Support

The Confluent plugin supports integration with Schema Registry for AVRO and JSON schemas. Currently, only Confluent Schema Registry is supported.

For more information about Schema Registry integration, see the [Schema Registry documentation](/hub/kong-inc/schema-registry/).

## Configuration

To configure Schema Registry with the Confluent plugin, use the `schema_registry` parameter in your plugin configuration. See the [Schema Registry Configuration](/hub/kong-inc/schema-registry/configuration/) for specific Schema Registry options.

## Related Resources

- [Schema Registry](/hub/kong-inc/schema-registry/)
- [Confluent Schema Registry Documentation](https://docs.confluent.io/platform/current/schema-registry/index.html)

app/_hub/kong-inc/kafka-consume/overview/_index.md

Lines changed: 16 additions & 0 deletions
@@ -15,6 +15,7 @@ Kong also provides Kafka plugins for publishing messages:
* See [Kafka Log](/hub/kong-inc/kafka-log/)
* See [Kafka Upstream](/hub/kong-inc/kafka-upstream/)

## Implementation details

The plugin supports two modes of operation:
@@ -97,6 +98,21 @@ The following steps assume that {{site.base_gateway}} is installed.
  --header 'Accept: text/event-stream'
```

## Schema Registry Support

The Kafka Consume plugin supports integration with Schema Registry for AVRO and JSON schemas. Currently, only Confluent Schema Registry is supported.

For more information about Schema Registry integration, see the [Schema Registry documentation](/hub/kong-inc/schema-registry/).

## Configuration

To configure Schema Registry with the Kafka Consume plugin, use the `schema_registry` parameter in your plugin configuration. See the [Schema Registry Configuration](/hub/kong-inc/schema-registry/configuration/) for specific Schema Registry options.

## Related Resources

- [Schema Registry](/hub/kong-inc/schema-registry/)
- [Confluent Schema Registry Documentation](https://docs.confluent.io/platform/current/schema-registry/index.html)

## Learn more about the Kafka Consume plugin

* [Configuration reference](/hub/kong-inc/kafka-consume/configuration/)

app/_hub/kong-inc/kafka-log/overview/_index.md

Lines changed: 14 additions & 0 deletions
@@ -144,3 +144,17 @@ This plugin supports the following authentication mechanisms:
{% endif_version %}

## Schema Registry Support

The Kafka Log plugin supports integration with Schema Registry for AVRO and JSON schemas. Currently, only Confluent Schema Registry is supported.

For more information about Schema Registry integration, see the [Schema Registry documentation](/hub/kong-inc/schema-registry/).

## Configuration

To configure Schema Registry with the Kafka Log plugin, use the `schema_registry` parameter in your plugin configuration. See the [Schema Registry Configuration](/hub/kong-inc/schema-registry/configuration/) for specific Schema Registry options.

## Related Resources

- [Schema Registry](/hub/kong-inc/schema-registry/)
- [Confluent Schema Registry Documentation](https://docs.confluent.io/platform/current/schema-registry/index.html)

app/_hub/kong-inc/kafka-upstream/overview/_index.md

Lines changed: 15 additions & 0 deletions
@@ -162,3 +162,18 @@ The following steps assume that {{site.base_gateway}} is installed and the Kafka
You should receive a `200 { message: "message sent" }` response, and should see the request bodies appear on
the Kafka consumer console you started in the previous step.

## Schema Registry Support

The Kafka Upstream plugin supports integration with Schema Registry for AVRO and JSON schemas. Currently, only Confluent Schema Registry is supported.

For more information about Schema Registry integration, see the [Schema Registry documentation](/hub/kong-inc/schema-registry/).

## Configuration

To configure Schema Registry with the Kafka Upstream plugin, use the `schema_registry` parameter in your plugin configuration. See the [Schema Registry Configuration](/hub/kong-inc/schema-registry/configuration/) for specific Schema Registry options.

## Related Resources

- [Schema Registry](/hub/kong-inc/schema-registry/)
- [Confluent Schema Registry Documentation](https://docs.confluent.io/platform/current/schema-registry/index.html)
Lines changed: 100 additions & 0 deletions
@@ -0,0 +1,100 @@
---
nav_title: Configuration
---

## Configuration Parameters

The Schema Registry integration is configured within the supported plugins. The configuration differs slightly between producer plugins (like Kafka Log) and consumer plugins (like Kafka Consume).

### Confluent Schema Registry

Currently, Kong supports integration with Confluent Schema Registry for both AVRO and JSON schemas. Only one schema registry provider can be configured at a time.

#### Common Parameters (Both Producers and Consumers)

| Parameter | Description |
|-----------|-------------|
| `schema_registry.confluent.url` | The URL of the Confluent Schema Registry service |
| `schema_registry.confluent.authentication.mode` | Authentication mode for the Schema Registry. Options: `none`, `basic` |
| `schema_registry.confluent.authentication.username` | (Optional) Username for basic authentication |
| `schema_registry.confluent.authentication.password` | (Optional) Password for basic authentication |

#### Producer-Specific Parameters

For plugins that produce messages to Kafka (Kafka Log, Kafka Upstream, Confluent):

| Parameter | Description |
|-----------|-------------|
| `schema_registry.confluent.value_schema.subject_name` | The subject name for the value schema in the Schema Registry |
| `schema_registry.confluent.value_schema.schema_version` | (Optional) Specific schema version to use. Can be a version number or `latest` to always use the most recent version |

#### Consumer-Specific Parameters

For plugins that consume messages from Kafka (Kafka Consume, Confluent Consume):

| Parameter | Description |
|-----------|-------------|
| No additional parameters required | Consumers automatically detect and use the schema ID embedded in the messages |

## Example Configurations

### Producer Plugin Example (Kafka Log)

```json
{
  "name": "kafka-log",
  "config": {
    "bootstrap_servers": [
      {
        "host": "kafka-server-1",
        "port": 9092
      }
    ],
    "topic": "kong-logs",
    "schema_registry": {
      "confluent": {
        "url": "http://schema-registry:8081",
        "authentication": {
          "mode": "basic",
          "username": "user",
          "password": "password"
        },
        "value_schema": {
          "subject_name": "kong-logs-value",
          "schema_version": "latest"
        }
      }
    }
  }
}
```

### Consumer Plugin Example (Kafka Consume)

```json
{
  "name": "kafka-consume",
  "config": {
    "bootstrap_servers": [
      {
        "host": "kafka-server-1",
        "port": 9092
      }
    ],
    "topics": [
      {
        "name": "kong-logs"
      }
    ],
    "schema_registry": {
      "confluent": {
        "url": "http://schema-registry:8081",
        "authentication": {
          "mode": "basic",
          "username": "user",
          "password": "password"
        }
      }
    },
    "mode": "http-get"
  }
}
```
Lines changed: 86 additions & 0 deletions
@@ -0,0 +1,86 @@
---
nav_title: Overview
---

## Introduction

Schema Registry is a feature that allows Kong to integrate with schema registry services to validate and serialize/deserialize messages in a standardized format. Schema registries provide a centralized repository for managing and validating schemas for data formats like AVRO and JSON.

Currently, Kong supports integration with:

- **Confluent Schema Registry** for AVRO and JSON schemas

Support for additional schema registry providers may be added in future releases.

## How Schema Registry Works

### Producer Workflow

When a producer plugin (like Kafka Log) is configured with Schema Registry, the following workflow occurs:

```
┌─────────┐     ┌─────────────┐     ┌───────────────┐     ┌──────────────────┐
│ Request │────>│ Kong Plugin │────>│ Fetch Schema  │────>│ Validate Message │
└─────────┘     └─────────────┘     │ from Registry │     │ Against Schema   │
                                    └───────────────┘     └──────────┬───────┘
                                                                     │
                ┌─────────────┐     ┌──────────────────┐             │
                │ Forward to  │<────│ Serialize Using  │<────────────┘
                │    Kafka    │     │      Schema      │
                └─────────────┘     └──────────────────┘
```
If validation fails, the request is rejected with an appropriate error message.
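
The serialization step can be sketched using Confluent's documented wire format: each record is prefixed with a single magic byte (`0`) followed by the schema ID as a 4-byte big-endian integer, then the encoded payload. The helper below is an illustrative Python sketch under that assumption; the function name is hypothetical and not part of any Kong or Confluent API:

```python
import json
import struct

MAGIC_BYTE = 0  # Confluent wire-format magic byte


def frame_message(schema_id: int, payload: dict) -> bytes:
    """Prefix a JSON-encoded payload with the Confluent wire-format header:
    one magic byte (0) followed by the schema ID as a 4-byte big-endian int."""
    body = json.dumps(payload).encode("utf-8")
    # ">bI" packs a signed byte and an unsigned 32-bit int, big-endian, no padding.
    return struct.pack(">bI", MAGIC_BYTE, schema_id) + body


framed = frame_message(42, {"level": "info", "message": "request logged"})
# The first five bytes carry the header; the rest is the serialized record.
print(framed[:5].hex())  # -> 000000002a
```

In practice the registry client (not your application) assigns the schema ID, which is why producers configure a `subject_name` rather than an ID.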

### Consumer Workflow

When a consumer plugin (like Kafka Consume) is configured with Schema Registry, the following workflow occurs:

```
┌─────────┐     ┌─────────────┐     ┌───────────────┐     ┌──────────────────┐
│ Kafka   │────>│ Kong Plugin │────>│ Extract       │────>│ Fetch Schema     │
│ Message │     └─────────────┘     │ Schema ID     │     │ from Registry    │
└─────────┘                         └───────────────┘     └──────────┬───────┘
                                                                     │
                ┌─────────────┐     ┌──────────────────┐             │
                │ Return to   │<────│ Deserialize      │<────────────┘
                │ Client      │     │ Using Schema     │
                └─────────────┘     └──────────────────┘
```
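
The "Extract Schema ID" step relies on Confluent's documented wire format: the first byte of each record is a magic byte (`0`) and the next four bytes are the schema ID, big-endian, which the consumer then looks up in the registry. A minimal illustrative sketch in Python (the function name is hypothetical, not a Kong API):

```python
import struct


def parse_schema_id(message: bytes) -> tuple[int, bytes]:
    """Split a Confluent-framed record into (schema_id, payload).

    The framing is one magic byte (0) followed by the schema ID
    as a 4-byte big-endian unsigned integer."""
    if len(message) < 5:
        raise ValueError("message too short to carry a schema ID header")
    magic, schema_id = struct.unpack(">bI", message[:5])
    if magic != 0:
        raise ValueError(f"unexpected magic byte: {magic}")
    return schema_id, message[5:]


schema_id, payload = parse_schema_id(b"\x00\x00\x00\x00\x07" + b'{"ok": true}')
print(schema_id)  # -> 7
```

Because the schema ID travels inside each message, consumer plugins need no subject or version configuration, only the registry URL and credentials.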

## Benefits

Using a schema registry with Kong provides several benefits:

- **Data Validation**: Ensures messages conform to a predefined schema before being processed
- **Schema Evolution**: Manages schema changes and versioning
- **Interoperability**: Enables seamless communication between different services using standardized data formats
- **Reduced Overhead**: Minimizes the need for custom validation logic in your applications

## Supported Plugins

The following Kong plugins currently support schema registry integration:

### Producer Plugins

These plugins produce messages to Kafka and can use Schema Registry for serialization:

- [Kafka Log](/hub/kong-inc/kafka-log/)
- [Kafka Upstream](/hub/kong-inc/kafka-upstream/)
- [Confluent](/hub/kong-inc/confluent/)

### Consumer Plugins

These plugins consume messages from Kafka and can use Schema Registry for deserialization:

- [Kafka Consume](/hub/kong-inc/kafka-consume/)
- [Confluent Consume](/hub/kong-inc/confluent-consume/)

## Configuration

Schema registry configuration is specified within the plugin configuration. See the [Configuration](/hub/kong-inc/schema-registry/configuration/) section for details.

## Related Resources

- [Confluent Schema Registry Documentation](https://docs.confluent.io/platform/current/schema-registry/index.html)
- [AVRO Specification](https://avro.apache.org/docs/current/spec.html)
