This project demonstrates an event-driven microservices architecture using Spring Boot, Apache Kafka, and Confluent Schema Registry.
It includes:
- Producer Service – generates sensor data and publishes events to Kafka (Avro format)
- Consumer Service – listens to Kafka and processes incoming sensor events
- Schema Registry – stores and manages Avro schemas
- Docker Compose – spins up Kafka, Schema Registry, and both services
```
microservices-demo/
│
├── docker-compose.yml
├── producer-service/
│   ├── src/
│   ├── pom.xml
│   └── Dockerfile
│
├── consumer-service/
│   ├── src/
│   ├── pom.xml
│   └── Dockerfile
└── README.md
```
| Tool | Version | Description |
|---|---|---|
| Java | 17+ | Development runtime |
| Docker & Docker Compose | latest | Container management |
| Maven | (auto-managed via wrapper) | Dependency & build tool |
You don't need Maven installed globally – this project includes the Maven Wrapper (`mvnw`, `mvnw.cmd`).
Both services use Avro schemas stored under:

```
schema-registry/src/main/avro/
```

To generate Java classes from Avro:

```bash
./mvnw clean compile
```

Generated classes will appear under:

```
schema-registry/target/generated-sources/avro/
```

This step runs automatically during the normal build, but you can trigger it manually if you edit `.avsc` files.
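For orientation, a minimal sensor-event schema of the kind this layout expects might look like the sketch below. The `SensorReading` record name and its fields are illustrative, not the repo's actual schema:

```json
{
  "type": "record",
  "name": "SensorReading",
  "namespace": "com.example.events",
  "fields": [
    {"name": "sensorId", "type": "string"},
    {"name": "temperature", "type": "double"},
    {"name": "timestamp", "type": {"type": "long", "logicalType": "timestamp-millis"}}
  ]
}
```

The Avro Maven plugin turns each such record into a generated Java class (here, `com.example.events.SensorReading`) that both the producer and the consumer compile against.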
Build both services using:

```bash
./mvnw clean package -DskipTests
```

This will:

- Compile code and generated Avro classes
- Package each microservice as a runnable JAR inside its respective `target/` folder
From the project root:
```bash
docker-compose up --build
```

This will start:
- Zookeeper
- Kafka broker
- Schema Registry
- Producer Service
- Consumer Service
All services run on a shared Docker network.
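For orientation, the infrastructure half of such a Compose file typically follows the pattern sketched below. Image tags, listener names, and ports here are illustrative defaults for Confluent's images; the repo's own `docker-compose.yml` is authoritative:

```yaml
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.5.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181

  kafka:
    image: confluentinc/cp-kafka:7.5.0
    depends_on: [zookeeper]
    ports: ["9092:9092"]
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      # One listener for containers on the Docker network, one for the host
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:29092,PLAINTEXT_HOST://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1

  schema-registry:
    image: confluentinc/cp-schema-registry:7.5.0
    depends_on: [kafka]
    ports: ["8081:8081"]
    environment:
      SCHEMA_REGISTRY_HOST_NAME: schema-registry
      SCHEMA_REGISTRY_KAFKASTORE_BOOTSTRAP_SERVERS: kafka:29092
```

The dual-listener setup is what lets the services reach the broker at `kafka:29092` inside the network while tools on your machine use `localhost:9092`.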
| Service | URL | Notes |
|---|---|---|
| Kafka Broker | `localhost:9092` | External access |
| Schema Registry | http://localhost:8081 | REST API for schemas |
| Producer Service | http://localhost:8082 | REST endpoint (if defined) |
| Consumer Service | http://localhost:8083 | Listens for Kafka events |
- Both services share the same Avro schema to ensure compatibility.
- The producer uses `KafkaAvroSerializer`, and the consumer uses `KafkaAvroDeserializer` with `specific.avro.reader=true`.
- Each service uses its own Dockerfile to create independent containers.
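In a Spring Boot service this pairing is usually wired through configuration rather than code. A consumer-side sketch is shown below; the property keys follow standard Spring Kafka and Confluent conventions, while the group id and hostnames are assumptions, not values taken from this repo:

```yaml
spring:
  kafka:
    bootstrap-servers: kafka:29092
    consumer:
      group-id: sensor-consumer
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
      properties:
        # Deserialize into the generated SpecificRecord class
        # instead of a GenericRecord
        specific.avro.reader: true
        schema.registry.url: http://schema-registry:8081
```

Without `specific.avro.reader: true`, the deserializer hands listeners a `GenericRecord`, and casts to the generated Avro class fail at runtime.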