
Event Sourcing using Spring Kafka

This repository contains a sample application that demonstrates how to implement an event-sourced system using the CQRS architectural style. The solution uses Apache Kafka as the event log, integrated into a Spring Boot based application via Spring for Apache Kafka (2.6.5), Apache Avro for event serialization and deserialization, and an in-memory H2 database that serves the query side of our CQRS-based system. The application itself is minimal and implements a subset of David Allen's Getting Things Done time management method.
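
To give an impression of how the pieces fit together, here is a minimal sketch of what a query-side projection could look like with Spring for Apache Kafka. All names in it (the topic item-events, the event type ItemCreated, the repository ItemRepository, the entity ItemEntity) are illustrative assumptions and do not necessarily match the actual codebase.

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

// Query side: consumes Avro-deserialized events from the event log and
// projects them into the H2-backed read model.
@Component
public class ItemProjection {

    private final ItemRepository items; // hypothetical Spring Data repository backed by H2

    public ItemProjection(ItemRepository items) {
        this.items = items;
    }

    // Spring for Apache Kafka dispatches deserialized ItemCreated events to this listener
    @KafkaListener(topics = "item-events", groupId = "query-side")
    public void on(ItemCreated event) {
        items.save(new ItemEntity(event.getItemId(), event.getDescription()));
    }
}

Since the read model is derived purely from events, the query side can be wiped and rebuilt by replaying the log at any time, which is the essential property of an event-sourced design.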

The code presented in this repository is the joint work of Boris Fresow and Markus Günther as part of an article series on Building Event-based applications with Spring Kafka for the German JavaMagazin.

Prerequisites

Running the showcase requires a working installation of Apache ZooKeeper and Apache Kafka. We provide Dockerfiles for both of them to get you started easily. Please make sure that Docker as well as Docker Compose are installed on your system.

Versions

Application        Version   Docker Image
Apache Kafka       2.6.0     wurstmeister/kafka:2.13-2.6.0
Apache ZooKeeper   3.4.13    wurstmeister/zookeeper

Building and Running the Containers

Before you execute the code samples, make sure that you have a working environment. If you have not done so already, use the script docker/build-images to create Docker images for all required applications. After a couple of minutes, you should be ready to go.
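
Assuming the script is executable, building the images boils down to a single invocation from the repository root:

$ docker/build-images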

Once the images have been successfully built, you can start the respective containers using the provided docker-compose configuration. Simply issue

$ docker-compose up

for starting Apache Kafka, Apache ZooKeeper and Yahoo Kafka Manager. Stopping the containers is best done from a separate terminal by issuing the following commands.

$ docker-compose stop
$ docker-compose rm

The final rm operation deletes the containers and thus clears all state so you can start over with a clean installation.

For simplicity, we restrict the Kafka cluster to a single Kafka broker. However, scaling to more Kafka brokers is easily done via docker-compose. You will have to provide a sensible value for KAFKA_ADVERTISED_HOST_NAME (other than localhost) for this to work, though (see the configuration excerpt after the commands below).

$ docker-compose scale kafka=3   # scales up to 3 Kafka brokers
$ docker-compose scale kafka=1   # scales down to 1 Kafka broker after the previous upscale
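
The following is a hypothetical excerpt of the relevant service definition in docker-compose.yml; the structure is an assumption and the IP address is a placeholder for an address of the Docker host that is reachable from all brokers.

# docker-compose.yml (excerpt, assumed structure)
kafka:
  image: wurstmeister/kafka:2.13-2.6.0
  environment:
    # must resolve to the Docker host; localhost only works for a single broker
    KAFKA_ADVERTISED_HOST_NAME: 192.168.0.10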

After changing the number of Kafka brokers, give the cluster some time so that all brokers can finish their cluster-join procedure. This should complete in a couple of seconds. You can inspect the output of the respective Docker containers just to be sure that everything is fine. Kafka Manager should also reflect the change in the number of Kafka brokers after they have successfully joined the cluster.
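
For instance, to follow the broker logs:

$ docker-compose logs -f kafka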

Using the API

Running the provided docker-compose setup will fire up a couple of services: first of all Apache Kafka as well as Apache ZooKeeper, then both the command side and the query side of the GTD application, as well as two small services that take care of service discovery and unify the API. The API gateway listens at localhost:8765. All interaction happens through the API gateway, which takes care of routing each request to an instance of the command side or the query side of the application.

Overview

API Endpoint      Method   Description
/items            POST     Creates a new item.
/items            GET      Lists all items that are currently managed.
/items/{itemId}   GET      Shows the details of a specific item.
/items/{itemId}   PUT      Modifies an existing item.
/items/{itemId}   DELETE   Closes an existing item.

The following sections will walk you through a simple example on how to use the API via cURL.

Creating a new item

To create a new item, we simply provide a short description of it in JSON as the HTTP payload.

{
  "description": "Go shopping"
}

Using cURL we can create the item:

$ curl http://localhost:8765/api/items -X POST -H "Content-Type: application/json" -d '{"description":"Go shopping"}'

This request will be routed to an instance of the command side of the GTD application, where the command is validated before the corresponding event is persisted to the event log.
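
A minimal sketch of what such a command handler could look like on the command side; as before, the command and event types as well as the topic name are assumptions made for the sake of illustration.

import java.util.UUID;

import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

// Command side: validates incoming commands and appends the resulting
// events to the event log.
@Service
public class CreateItemHandler {

    private final KafkaTemplate<String, ItemCreated> kafkaTemplate;

    public CreateItemHandler(KafkaTemplate<String, ItemCreated> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void handle(CreateItemCommand command) {
        // validate the command before emitting any event
        if (command.getDescription() == null || command.getDescription().isBlank()) {
            throw new IllegalArgumentException("An item requires a description.");
        }
        // build the Avro-generated event; the item ID doubles as the record key,
        // so all events of one item end up in the same partition, in order
        ItemCreated event = ItemCreated.newBuilder()
                .setItemId(UUID.randomUUID().toString())
                .setDescription(command.getDescription())
                .build();
        kafkaTemplate.send("item-events", event.getItemId(), event);
    }
}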

Retrieving a list of all items

After creating an item, we'd like to inspect what items our GTD application currently manages. There is an HTTP endpoint for that as well. If you issue the following cURL request

$ curl http://localhost:8765/api/items

you see something along the lines of the following output (pretty-printed).

[
  {
    "id": "07bad2d",
    "description": "Go shopping",
    "requiredTime": 0,
    "dueDate": null,
    "tags": [      
    ],
    "associatedList": null,
    "done": false
  }
]

This shows the item we just created in full detail.

Retrieving a single item

We can retrieve the details of a single item as well. With the next cURL command, we request the details for the item we just created (id: 07bad2d).

$ curl http://localhost:8765/api/items/07bad2d

This yields the following output (pretty-printed):

{
  "id": "07bad2d",
  "description": "Go shopping",
  "requiredTime": 0,
  "dueDate": null,
  "tags": [
      ],
  "associatedList": null,
  "done": false
}

Modifying an existing item

Let's try to update the item: associate it with a list of tags, set the required time, and put it into a dedicated list. The payload for this update looks like this:

{
  "tags": ["weekly"],
  "associatedList": "project",
  "requiredTime": 5
}

To issue the update, we simply execute the following cURL command.

$ curl http://localhost:8765/api/items/07bad2d -X PUT -H "Content-Type: application/json" -d '{"tags": ["weekly"], "associatedList": "project", "requiredTime": 5}'

This will validate the individual update commands extracted from the payload against the current state of the item. If the validation holds, the respective events will be emitted and the state of the item will be updated. If we look at the details of the item again using

$ curl http://localhost:8765/api/items/07bad2d

we see that the update has been successfully applied.

{
  "id": "07bad2d",
  "description": "Go shopping",
  "requiredTime": 5,
  "dueDate": null,
  "tags": [
    "weekly"
  ],
  "associatedList": "project",
  "done": false
}

Closing an item

To close an item, we issue a DELETE request via cURL.

$ curl http://localhost:8765/api/items/07bad2d -X DELETE

Looking again at the details of the item, we see that its done attribute is now true.

{
  "id": "07bad2d",
  "description": "Go shopping",
  "requiredTime": 5,
  "dueDate": null,
  "tags": [
    "weekly"
  ],
  "associatedList": "project",
  "done": true
}

License

This work is released under the terms of the MIT license.
