🛰️ Kafka Schema Registry Microservices Demo

This project demonstrates an event-driven microservices architecture using Spring Boot, Apache Kafka, and Confluent Schema Registry.
It includes:

  • Producer Service – generates sensor data and publishes events to Kafka in Avro format (a rough sketch follows this list)
  • Consumer Service – listens to Kafka and processes incoming sensor events
  • Schema Registry – stores and manages Avro schemas
  • Docker Compose – spins up Kafka, Schema Registry, and both services
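
As a rough illustration of the producer side, a Spring Kafka publisher might look like the sketch below. The SensorEvent class, its fields, and the sensor-events topic are assumptions for illustration; the real names live in this repo's Avro schema and producer code.

```java
import java.time.Instant;

import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

// Hypothetical sketch: SensorEvent stands in for the Avro-generated event class
// (see step 1 below); field and topic names are illustrative.
@Service
public class SensorEventPublisher {

    private final KafkaTemplate<String, SensorEvent> kafkaTemplate;

    public SensorEventPublisher(KafkaTemplate<String, SensorEvent> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void publish(String sensorId, double value) {
        // Avro-generated classes expose a builder for constructing records.
        SensorEvent event = SensorEvent.newBuilder()
                .setSensorId(sensorId)
                .setValue(value)
                .setTimestamp(Instant.now().toEpochMilli())
                .build();
        // Keyed by sensorId so events from the same sensor stay ordered on one partition.
        kafkaTemplate.send("sensor-events", sensorId, event);
    }
}
```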

🧱 Project Structure

microservices-demo/
│
├── docker-compose.yml
├── producer-service/
│   ├── src/
│   ├── pom.xml
│   └── Dockerfile
│
├── consumer-service/
│   ├── src/
│   ├── pom.xml
│   └── Dockerfile
└── README.md

⚙️ Requirements

| Tool | Version | Description |
|------|---------|-------------|
| Java | 17+ | Development runtime |
| Docker & Docker Compose | latest | Container management |
| Maven | auto-managed via wrapper | Dependency & build tool |

You don't need Maven installed globally; this project includes the Maven Wrapper (mvnw, mvnw.cmd).


🧬 1. Generate Avro classes

Both services use Avro schemas stored under:

schema-registry/src/main/avro/

To generate Java classes from Avro:

./mvnw clean compile

Generated classes will appear under:

schema-registry/target/generated-sources/avro/

This step runs automatically during the normal build, but you can trigger it manually if you edit .avsc files.
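
For reference, each generated class is a plain Java type that extends Avro's SpecificRecordBase and carries the schema it was built from, which is a quick way to confirm that code generation picked up your latest .avsc edits. The SensorEvent name below is illustrative; substitute whichever record your schema defines.

```java
import org.apache.avro.Schema;

// Illustrative only: SensorEvent stands in for whichever record your .avsc defines.
public class AvroGenerationCheck {
    public static void main(String[] args) {
        // Generated Avro classes expose the schema they were compiled from.
        Schema schema = SensorEvent.getClassSchema();
        System.out.println(schema.toString(true)); // pretty-prints the schema as JSON
    }
}
```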


📦 2. Build & Package services

Build both services using:

./mvnw clean package -DskipTests

This will:

  • Compile code and generated Avro classes
  • Package each microservice as a runnable JAR inside its respective target/ folder
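
After the build you can confirm that each module produced a JAR; the exact file names depend on the artifact IDs and versions in each pom.xml:

ls producer-service/target/*.jar consumer-service/target/*.jar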

🐳 3. Run everything with Docker Compose

From the project root:

docker-compose up --build

This will start:

  • Zookeeper
  • Kafka broker
  • Schema Registry
  • Producer Service
  • Consumer Service

All services run on a shared Docker network.
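
Once the stack is up, two quick checks are useful; the service name in the second command is an assumption, so use the names defined in docker-compose.yml:

docker-compose ps

docker-compose logs -f producer-service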


🌐 4. Useful URLs

| Service | URL | Notes |
|---------|-----|-------|
| Kafka Broker | localhost:9092 | External access |
| Schema Registry | http://localhost:8081 | REST API for schemas |
| Producer Service | http://localhost:8082 | REST endpoint (if defined) |
| Consumer Service | http://localhost:8083 | Listens for Kafka events |
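
The Schema Registry's REST API is handy for confirming that the producer has registered its schema. With the default subject naming strategy, subjects are named <topic>-value:

curl http://localhost:8081/subjects

curl http://localhost:8081/subjects/<topic>-value/versions/latest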

🧠 Notes

  • Both services share the same Avro schema to ensure compatibility.
  • The producer uses KafkaAvroSerializer, and the consumer uses KafkaAvroDeserializer with specific.avro.reader=true (a configuration sketch follows below).
  • Each service uses its own Dockerfile to create independent containers.
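
As a reference for the serializer/deserializer note above, a minimal consumer-side sketch with Spring Kafka is shown below. The topic, group id, host names, and SensorEvent class are assumptions carried over from the earlier sketch, and the same settings could equally live in application.yml; the repo's actual wiring may differ.

```java
import java.util.HashMap;
import java.util.Map;

import io.confluent.kafka.serializers.KafkaAvroDeserializer;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.stereotype.Component;

// Hypothetical consumer-side wiring; host names, topic, and group id are illustrative.
@Configuration
class ConsumerAvroConfig {

    @Bean
    ConsumerFactory<String, SensorEvent> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "sensor-consumer");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class);
        props.put("schema.registry.url", "http://schema-registry:8081");
        // Deserialize into the generated SensorEvent class instead of a GenericRecord.
        props.put("specific.avro.reader", true);
        return new DefaultKafkaConsumerFactory<>(props);
    }
}

@Component
class SensorEventListener {

    // Processes each incoming Avro event from the topic the producer writes to.
    @KafkaListener(topics = "sensor-events")
    public void onEvent(SensorEvent event) {
        System.out.println("Received sensor event: " + event);
    }
}
```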
