
feat(kafka): add support for schema registries #8800

Open · wants to merge 1 commit into base `main`
15 changes: 15 additions & 0 deletions app/_hub/kong-inc/confluent-consume/overview/_index.md
@@ -98,6 +98,21 @@ The following steps assume that {{site.base_gateway}} is installed.
--header 'Accept: text/event-stream'
```

## Schema Registry Support

The Confluent Consume plugin supports integration with Schema Registry for AVRO and JSON schemas. Currently, only Confluent Schema Registry is supported.

For more information about Schema Registry integration, see the [Schema Registry documentation](/hub/kong-inc/schema-registry/).

## Configuration

To configure Schema Registry with the Confluent Consume plugin, use the `schema_registry` parameter in your plugin configuration. See the [Schema Registry Configuration](/hub/kong-inc/schema-registry/configuration/) reference for specific Schema Registry options.
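As an illustration only (the registry URL and credentials are placeholders, and required fields such as the cluster and topic settings are omitted), the `schema_registry` block nests under the plugin's `config`:

```json
{
  "name": "confluent-consume",
  "config": {
    "schema_registry": {
      "confluent": {
        "url": "http://schema-registry:8081",
        "authentication": {
          "mode": "basic",
          "username": "user",
          "password": "password"
        }
      }
    }
  }
}
```

Because this is a consumer plugin, no schema subject is configured: the schema ID embedded in each message determines which schema is used for deserialization.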

## Related Resources

- [Schema Registry](/hub/kong-inc/schema-registry/)
- [Confluent Schema Registry Documentation](https://docs.confluent.io/platform/current/schema-registry/index.html)

## Learn more about the Confluent Consume plugin

* [Configuration reference](/hub/kong-inc/confluent-consume/configuration/)
18 changes: 17 additions & 1 deletion app/_hub/kong-inc/confluent/overview/_index.md
@@ -10,12 +10,13 @@ Kong also provides Kafka Log and Kafka Upstream plugins for publishing logs and
* See [Kafka Log](/hub/kong-inc/kafka-log/)
* See [Kafka Upstream](/hub/kong-inc/kafka-upstream/)

{:.note}
> **Note**: This plugin has the following known limitations:
> * Message compression is not supported.
> * The message format is not customizable.
> * {{site.base_gateway}} does not support Kafka 4.0.


## Quickstart

### Prerequisites
@@ -62,3 +63,18 @@ To check that the message has been added to the topic in the Confluent Cloud con
1. From the navigation menu, select **Topics** to show the list of topics in your cluster.
2. Select the topic you sent messages to.
3. In the topic detail page, select the **Messages** tab to view the messages being produced to the topic.

## Schema Registry Support

The Confluent plugin supports integration with Schema Registry for AVRO and JSON schemas. Currently, only Confluent Schema Registry is supported.

For more information about Schema Registry integration, see the [Schema Registry documentation](/hub/kong-inc/schema-registry/).

## Configuration

To configure Schema Registry with the Confluent plugin, use the `schema_registry` parameter in your plugin configuration. See the [Schema Registry Configuration](/hub/kong-inc/schema-registry/configuration/) reference for specific Schema Registry options.
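As a producer-side sketch (the registry URL and subject name are placeholders, and the plugin's cluster and topic settings are omitted), the `schema_registry` block nests under the plugin's `config`:

```json
{
  "name": "confluent",
  "config": {
    "schema_registry": {
      "confluent": {
        "url": "http://schema-registry:8081",
        "authentication": { "mode": "none" },
        "value_schema": {
          "subject_name": "my-topic-value",
          "schema_version": "latest"
        }
      }
    }
  }
}
```

`schema_version` can also pin a specific version number instead of `latest`.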

## Related Resources

- [Schema Registry](/hub/kong-inc/schema-registry/)
- [Confluent Schema Registry Documentation](https://docs.confluent.io/platform/current/schema-registry/index.html)
16 changes: 16 additions & 0 deletions app/_hub/kong-inc/kafka-consume/overview/_index.md
@@ -15,6 +15,7 @@ Kong also provides Kafka plugins for publishing messages:
* See [Kafka Log](/hub/kong-inc/kafka-log/)
* See [Kafka Upstream](/hub/kong-inc/kafka-upstream/)


## Implementation details

The plugin supports two modes of operation:
@@ -97,6 +98,21 @@ The following steps assume that {{site.base_gateway}} is installed.
--header 'Accept: text/event-stream'
```

## Schema Registry Support

The Kafka Consume plugin supports integration with Schema Registry for AVRO and JSON schemas. Currently, only Confluent Schema Registry is supported.

For more information about Schema Registry integration, see the [Schema Registry documentation](/hub/kong-inc/schema-registry/).

## Configuration

To configure Schema Registry with the Kafka Consume plugin, use the `schema_registry` parameter in your plugin configuration. See the [Schema Registry Configuration](/hub/kong-inc/schema-registry/configuration/) for specific Schema Registry options.
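As a minimal sketch (the topic name and registry URL are placeholders; required fields such as `bootstrap_servers` are omitted), the `schema_registry` block nests under the plugin's `config`:

```json
{
  "name": "kafka-consume",
  "config": {
    "topics": [{ "name": "my-topic" }],
    "mode": "http-get",
    "schema_registry": {
      "confluent": {
        "url": "http://schema-registry:8081",
        "authentication": { "mode": "none" }
      }
    }
  }
}
```

No schema subject is configured on the consumer side: the schema ID embedded in each message determines which schema is fetched for deserialization.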

## Related Resources

- [Schema Registry](/hub/kong-inc/schema-registry/)
- [Confluent Schema Registry Documentation](https://docs.confluent.io/platform/current/schema-registry/index.html)

## Learn more about the Kafka Consume plugin

* [Configuration reference](/hub/kong-inc/kafka-consume/configuration/)
14 changes: 14 additions & 0 deletions app/_hub/kong-inc/kafka-log/overview/_index.md
@@ -144,3 +144,17 @@ This plugin supports the following authentication mechanisms:

{% endif_version %}

## Schema Registry Support

The Kafka Log plugin supports integration with Schema Registry for AVRO and JSON schemas. Currently, only Confluent Schema Registry is supported.

For more information about Schema Registry integration, see the [Schema Registry documentation](/hub/kong-inc/schema-registry/).

## Configuration

To configure Schema Registry with the Kafka Log plugin, use the `schema_registry` parameter in your plugin configuration. See the [Schema Registry Configuration](/hub/kong-inc/schema-registry/configuration/) reference for specific Schema Registry options.
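As a minimal sketch (the registry URL and subject name are placeholders; required fields such as `bootstrap_servers` are omitted), the `schema_registry` block nests under the plugin's `config`:

```json
{
  "name": "kafka-log",
  "config": {
    "topic": "kong-logs",
    "schema_registry": {
      "confluent": {
        "url": "http://schema-registry:8081",
        "authentication": { "mode": "none" },
        "value_schema": {
          "subject_name": "kong-logs-value",
          "schema_version": "latest"
        }
      }
    }
  }
}
```

With `schema_version: "latest"`, the most recent version registered under the subject is used to validate and serialize each log entry.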

## Related Resources

- [Schema Registry](/hub/kong-inc/schema-registry/)
- [Confluent Schema Registry Documentation](https://docs.confluent.io/platform/current/schema-registry/index.html)
15 changes: 15 additions & 0 deletions app/_hub/kong-inc/kafka-upstream/overview/_index.md
@@ -162,3 +162,18 @@ The following steps assume that {{site.base_gateway}} is installed and the Kafka

You should receive a `200 { message: "message sent" }` response, and should see the request bodies appear on
the Kafka consumer console you started in the previous step.

## Schema Registry Support

The Kafka Upstream plugin supports integration with Schema Registry for AVRO and JSON schemas. Currently, only Confluent Schema Registry is supported.

For more information about Schema Registry integration, see the [Schema Registry documentation](/hub/kong-inc/schema-registry/).

## Configuration

To configure Schema Registry with the Kafka Upstream plugin, use the `schema_registry` parameter in your plugin configuration. See the [Schema Registry Configuration](/hub/kong-inc/schema-registry/configuration/) for specific Schema Registry options.
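As a minimal sketch (the topic, registry URL, and subject name are placeholders; required fields such as `bootstrap_servers` are omitted), the `schema_registry` block nests under the plugin's `config`:

```json
{
  "name": "kafka-upstream",
  "config": {
    "topic": "my-topic",
    "schema_registry": {
      "confluent": {
        "url": "http://schema-registry:8081",
        "authentication": { "mode": "none" },
        "value_schema": {
          "subject_name": "my-topic-value",
          "schema_version": "latest"
        }
      }
    }
  }
}
```

Incoming request bodies are validated and serialized against the resolved schema before being forwarded to the topic.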

## Related Resources

- [Schema Registry](/hub/kong-inc/schema-registry/)
- [Confluent Schema Registry Documentation](https://docs.confluent.io/platform/current/schema-registry/index.html)
100 changes: 100 additions & 0 deletions app/_hub/kong-inc/schema-registry/configuration/_index.md
@@ -0,0 +1,100 @@
---
nav_title: Configuration
---

## Configuration Parameters

The Schema Registry integration is configured within the supported plugins. The configuration differs slightly between producer plugins (like Kafka Log) and consumer plugins (like Kafka Consume).

### Confluent Schema Registry

Currently, Kong supports integration with Confluent Schema Registry for both AVRO and JSON schemas. Only one schema registry provider can be configured at a time.

#### Common Parameters (Both Producers and Consumers)

| Parameter | Description |
|-----------|-------------|
| `schema_registry.confluent.url` | The URL of the Confluent Schema Registry service |
| `schema_registry.confluent.authentication.mode` | Authentication mode for the Schema Registry. Options: `none`, `basic` |
| `schema_registry.confluent.authentication.username` | (Optional) Username for basic authentication |
| `schema_registry.confluent.authentication.password` | (Optional) Password for basic authentication |

#### Producer-Specific Parameters
For plugins that produce messages to Kafka (Kafka Log, Kafka Upstream, Confluent):

| Parameter | Description |
|-----------|-------------|
| `schema_registry.confluent.value_schema.subject_name` | The subject name for the value schema in the Schema Registry |
| `schema_registry.confluent.value_schema.schema_version` | (Optional) Specific schema version to use. Can be a version number or `latest` to always use the most recent version |

#### Consumer-Specific Parameters
For plugins that consume messages from Kafka (Kafka Consume, Confluent Consume):

No additional parameters are required. Consumer plugins automatically detect and use the schema ID embedded in each message.

## Example Configurations

### Producer Plugin Example (Kafka Log)

```json
{
  "name": "kafka-log",
  "config": {
    "bootstrap_servers": [
      {
        "host": "kafka-server-1",
        "port": 9092
      }
    ],
    "topic": "kong-logs",
    "schema_registry": {
      "confluent": {
        "url": "http://schema-registry:8081",
        "authentication": {
          "mode": "basic",
          "username": "user",
          "password": "password"
        },
        "value_schema": {
          "subject_name": "kong-logs-value",
          "schema_version": "latest"
        }
      }
    }
  }
}
```

### Consumer Plugin Example (Kafka Consume)

```json
{
  "name": "kafka-consume",
  "config": {
    "bootstrap_servers": [
      {
        "host": "kafka-server-1",
        "port": 9092
      }
    ],
    "topics": [
      {
        "name": "kong-logs"
      }
    ],
    "schema_registry": {
      "confluent": {
        "url": "http://schema-registry:8081",
        "authentication": {
          "mode": "basic",
          "username": "user",
          "password": "password"
        }
      }
    },
    "mode": "http-get"
  }
}
```
86 changes: 86 additions & 0 deletions app/_hub/kong-inc/schema-registry/overview/_index.md
@@ -0,0 +1,86 @@
---
nav_title: Overview
---

## Introduction

Schema Registry is a feature that allows Kong to integrate with schema registry services to validate and serialize/deserialize messages in a standardized format. Schema registries provide a centralized repository for managing and validating schemas for data formats like AVRO and JSON.

Currently, Kong supports integration with:
- **Confluent Schema Registry** for AVRO and JSON schemas

Support for additional schema registry providers may be added in future releases.

## How Schema Registry Works

### Producer Workflow

When a producer plugin (like Kafka Log) is configured with Schema Registry, the following workflow occurs:

```
┌─────────┐     ┌─────────────┐     ┌───────────────┐     ┌──────────────────┐
│ Request │────>│ Kong Plugin │────>│ Fetch Schema  │────>│ Validate Message │
└─────────┘     └─────────────┘     │ from Registry │     │ Against Schema   │
                                    └───────────────┘     └────────┬─────────┘
                ┌──────────────────┐     ┌──────────────────┐      │
                │ Forward to Kafka │<────│ Serialize Using  │<─────┘
                └──────────────────┘     │ Schema           │
                                         └──────────────────┘
```

If validation fails, the request is rejected with an appropriate error message.

### Consumer Workflow

When a consumer plugin (like Kafka Consume) is configured with Schema Registry, the following workflow occurs:

```
┌─────────┐     ┌─────────────┐     ┌───────────────┐     ┌──────────────────┐
│ Kafka   │────>│ Kong Plugin │────>│ Extract       │────>│ Fetch Schema     │
│ Message │     └─────────────┘     │ Schema ID     │     │ from Registry    │
└─────────┘                         └───────────────┘     └────────┬─────────┘
                ┌──────────────────┐     ┌──────────────────┐      │
                │ Return to Client │<────│ Deserialize      │<─────┘
                └──────────────────┘     │ Using Schema     │
                                         └──────────────────┘
```

## Benefits

Using a schema registry with Kong provides several benefits:

- **Data Validation**: Ensures messages conform to a predefined schema before being processed
- **Schema Evolution**: Manages schema changes and versioning
- **Interoperability**: Enables seamless communication between different services using standardized data formats
- **Reduced Overhead**: Minimizes the need for custom validation logic in your applications


## Supported Plugins

The following Kong plugins currently support schema registry integration:

### Producer Plugins
These plugins produce messages to Kafka and can use Schema Registry for serialization:
- [Kafka Log](/hub/kong-inc/kafka-log/)
- [Kafka Upstream](/hub/kong-inc/kafka-upstream/)
- [Confluent](/hub/kong-inc/confluent/)

### Consumer Plugins
These plugins consume messages from Kafka and can use Schema Registry for deserialization:
- [Kafka Consume](/hub/kong-inc/kafka-consume/)
- [Confluent Consume](/hub/kong-inc/confluent-consume/)

## Configuration

Schema registry configuration is specified within the plugin configuration. See the [Configuration](/hub/kong-inc/schema-registry/configuration/) section for details.



## Related Resources

- [Confluent Schema Registry Documentation](https://docs.confluent.io/platform/current/schema-registry/index.html)
- [AVRO Specification](https://avro.apache.org/docs/current/spec.html)