
Matthaios Stavrou

Posted on • Originally published at Medium

Building a Kafka Event-Driven Spring Boot Application with Avro, Schema Registry and PostgreSQL

If you're building event-driven systems with Apache Kafka, you must think about data contracts early.

This post shows a practical, end-to-end Spring Boot example using:

  • Apache Kafka
  • Confluent Schema Registry
  • Avro serialization
  • PostgreSQL
  • Docker Compose

👉 Full source code:
🔗 https://github.com/mathias82/kafka-schema-registry-spring-demo

🧠 Why Schema Registry + Avro?

JSON works… until it doesn't.

Common problems in Kafka-based systems:

  • breaking consumers when producers change payloads
  • no schema versioning
  • unclear data contracts between teams

Avro + Schema Registry solves this by:

  • enforcing schema compatibility
  • allowing safe schema evolution
  • decoupling producers from consumers

This demo shows how to do it the right way with Spring Boot.
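A data contract here is just an Avro schema. As an illustration, a schema matching the payload used later in this post might look like the sketch below (the record name and namespace are assumptions, not taken from the repo):

```json
{
  "type": "record",
  "name": "User",
  "namespace": "com.example.users",
  "fields": [
    { "name": "id", "type": "string" },
    { "name": "email", "type": "string" },
    { "name": "firstName", "type": "string" },
    { "name": "lastName", "type": "string" },
    { "name": "isActive", "type": "boolean" },
    { "name": "age", "type": "int" }
  ]
}
```

Schema Registry stores this schema under a subject (by default, one per topic) and rejects any new version that violates the subject's compatibility rule.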

๐Ÿ—๏ธ Architecture Overview
Client (Postman)
|
v
Spring Boot Producer (REST)
|
v
Kafka Topic (users.v1)
|
v
Spring Boot Consumer
|
v
PostgreSQL

  • Producer exposes POST /users
  • Payload is converted to an Avro record
  • Message is published to Kafka
  • Consumer deserializes Avro and persists data to PostgreSQL
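The producer's REST-to-Kafka path can be sketched roughly like this. Everything here is a hypothetical illustration: `UserEvent` stands in for the class generated from the Avro schema by the Avro Maven plugin, `UserRequest` for a request-body record, and the actual repo code may differ:

```java
import org.springframework.http.ResponseEntity;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.*;

// Hypothetical sketch of the producer: the KafkaTemplate is
// auto-configured by Spring Boot from application.yml, where the
// value serializer points at the Schema Registry.
@RestController
@RequestMapping("/users")
public class UserController {

    private final KafkaTemplate<String, UserEvent> kafkaTemplate;

    public UserController(KafkaTemplate<String, UserEvent> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    @PostMapping
    public ResponseEntity<Void> createUser(@RequestBody UserRequest request) {
        // Map the JSON request onto the Avro-generated builder.
        UserEvent event = UserEvent.newBuilder()
                .setId(request.id())
                .setEmail(request.email())
                .setFirstName(request.firstName())
                .setLastName(request.lastName())
                .setIsActive(request.isActive())
                .setAge(request.age())
                .build();
        // Key by user id so events for the same user stay ordered
        // within a partition.
        kafkaTemplate.send("users.v1", event.getId().toString(), event);
        return ResponseEntity.accepted().build();
    }
}
```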

✨ What This Demo Includes

  • Spring Boot Kafka Producer (Avro)
  • Spring Boot Kafka Consumer (Avro)
  • Confluent Schema Registry
  • PostgreSQL persistence using Spring Data JPA
  • Schema evolution with backward compatibility
  • Docker Compose for local development

๐Ÿณ Local Setup (Kafka + Schema Registry + PostgreSQL)
Prerequisites

  • Java 21
  • Maven
  • Docker & Docker Compose

Start infrastructure

docker compose up -d

Services started:

  • Kafka → localhost:29092
  • Schema Registry → http://localhost:8081
  • PostgreSQL → localhost:5432
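A compose file for this stack typically looks something like the sketch below. Image tags, credentials, and broker listener settings are assumptions; check the repo's docker-compose.yml for the real one:

```yaml
services:
  kafka:
    image: confluentinc/cp-kafka:7.6.0
    ports:
      - "29092:29092"   # external listener for apps running on the host
    # broker listener / KRaft settings omitted for brevity

  schema-registry:
    image: confluentinc/cp-schema-registry:7.6.0
    depends_on: [kafka]
    ports:
      - "8081:8081"
    environment:
      SCHEMA_REGISTRY_HOST_NAME: schema-registry
      SCHEMA_REGISTRY_KAFKASTORE_BOOTSTRAP_SERVERS: PLAINTEXT://kafka:9092

  postgres:
    image: postgres:16
    ports:
      - "5432:5432"
    environment:
      POSTGRES_USER: demo
      POSTGRES_PASSWORD: demo
      POSTGRES_DB: users
```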

โ–ถ๏ธ Run the Applications
Consumer
cd consumer-app
mvn spring-boot:run

Listens to users.v1 and persists messages to PostgreSQL.
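The consumer side is essentially a listener plus a JPA save. A hypothetical sketch (the entity, repository, and group id names are assumptions):

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

// Hypothetical consumer sketch: the Confluent Avro deserializer
// (configured with specific.avro.reader=true) hands the listener a
// typed UserEvent, which is mapped to a JPA entity and persisted.
@Component
public class UserEventListener {

    private final UserRepository userRepository;

    public UserEventListener(UserRepository userRepository) {
        this.userRepository = userRepository;
    }

    @KafkaListener(topics = "users.v1", groupId = "user-consumer")
    public void onUserEvent(UserEvent event) {
        UserEntity entity = new UserEntity(
                event.getId().toString(),
                event.getEmail().toString(),
                event.getFirstName().toString(),
                event.getLastName().toString(),
                event.getIsActive(),
                event.getAge());
        userRepository.save(entity);
    }
}
```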

Producer
cd producer-app
mvn spring-boot:run

Exposes REST endpoint.
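Under the hood, both apps point their (de)serializers at the Schema Registry. A producer-side application.yml sketch, assuming the ports from the compose setup above:

```yaml
spring:
  kafka:
    bootstrap-servers: localhost:29092
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: io.confluent.kafka.serializers.KafkaAvroSerializer
    properties:
      schema.registry.url: http://localhost:8081
```

The consumer mirrors this with `KafkaAvroDeserializer` as the value deserializer.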

📬 Produce an Event
curl -X POST http://localhost:8080/users \
-H "Content-Type: application/json" \
-d '{
"id": "u-1",
"email": "user@test.com",
"firstName": "John",
"lastName": "Doe",
"isActive": true,
"age": 30
}'

You'll see:

  • Avro schema registered (or validated)
  • Message published to Kafka
  • Consumer saving the record to PostgreSQL
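If you want to confirm the registration yourself, Schema Registry exposes a REST API. The subject name below assumes the default TopicNameStrategy, which appends `-value` to the topic name:

```shell
# List all registered subjects
curl http://localhost:8081/subjects

# Inspect the latest schema registered for the topic's value
curl http://localhost:8081/subjects/users.v1-value/versions/latest
```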

🔄 Schema Evolution (The Important Part)

Avro allows safe evolution when rules are respected.

Example:

  • Add a new optional field
  • Provide a default value
  • Keep compatibility set to BACKWARD
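Concretely, a backward-compatible v2 of the schema might add a single optional field like this, leaving everything else unchanged (the `phoneNumber` field is a made-up example, not from the repo). In Avro, the default must match the first branch of the union, hence `null`:

```json
{ "name": "phoneNumber", "type": ["null", "string"], "default": null }
```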

Schema Registry ensures:

  • old consumers keep working
  • new producers don't break the system

This demo is designed to show real-world schema evolution, not toy examples.
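Stripped of all the Avro machinery, what BACKWARD compatibility buys you is that a reader can fill in schema-declared defaults for fields an old writer never sent. A conceptual plain-Java sketch of that idea (this is not Avro's actual decoder, and the `phoneNumber` field is a made-up example):

```java
import java.util.HashMap;
import java.util.Map;

public class BackwardCompatSketch {

    // Reader-side defaults for fields added after v1. Avro stores these
    // defaults in the reader schema itself.
    static final Map<String, Object> READER_DEFAULTS = Map.of("phoneNumber", "unknown");

    // Decode a record written with an older schema: copy what the writer
    // sent, then fill in defaults for any newer fields it never produced.
    static Map<String, Object> readWithDefaults(Map<String, Object> written) {
        Map<String, Object> result = new HashMap<>(written);
        READER_DEFAULTS.forEach(result::putIfAbsent);
        return result;
    }

    public static void main(String[] args) {
        Map<String, Object> v1Record = Map.of("id", "u-1", "email", "user@test.com");
        Map<String, Object> decoded = readWithDefaults(v1Record);
        System.out.println(decoded.get("phoneNumber")); // prints "unknown"
    }
}
```

This is why "add a field without a default" is a breaking change: the reader has nothing to fill in for old records.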

โ˜๏ธ Confluent Cloud Ready

The project also supports Confluent Cloud via Spring profiles:

  • SASL/SSL
  • Schema Registry API keys
  • use.latest.version=true
  • auto.register.schemas=false

With auto.register.schemas=false, schemas are registered ahead of deployment rather than at runtime, which makes this setup a good fit for CI/CD pipelines.
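A cloud-profile sketch is shown below. The property keys are the standard Confluent client settings; all endpoint and credential values are placeholders to be supplied via environment variables:

```yaml
spring:
  kafka:
    bootstrap-servers: ${CC_BOOTSTRAP_SERVERS}
    properties:
      security.protocol: SASL_SSL
      sasl.mechanism: PLAIN
      sasl.jaas.config: >
        org.apache.kafka.common.security.plain.PlainLoginModule required
        username="${CC_API_KEY}" password="${CC_API_SECRET}";
      schema.registry.url: ${CC_SR_URL}
      basic.auth.credentials.source: USER_INFO
      basic.auth.user.info: ${CC_SR_KEY}:${CC_SR_SECRET}
      auto.register.schemas: false
      use.latest.version: true
```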

🔗 Source Code

👉 GitHub repository:
https://github.com/mathias82/kafka-schema-registry-spring-demo

Includes:

  • Docker Compose
  • Avro schemas
  • Producer & Consumer apps
  • PostgreSQL setup
  • Postman collection

🧩 Who Is This For?

  • Java & Spring Boot developers
  • Kafka users moving beyond JSON
  • Teams building event-driven microservices
  • Anyone learning Schema Registry + Avro

โญ Final Thoughts

This is a production-style Kafka example, not a hello-world.

If you're serious about:

  • schema contracts
  • backward compatibility
  • safe evolution
  • real persistence

then this demo will save you a lot of trial and error.

👉 Star the repo if it helped you
👉 Fork it and adapt it to your own system
