Test Automation with Kafka - Docker and Kafka
Introduction
This guide is an introduction to test automation with Apache Kafka®, a publish-subscribe messaging system that is used widely across many industries.
I’ve previously worked on projects that used Kafka for low-latency, high-volume, scalable data streaming in real-time applications.
In this series of posts, I will show you how to set up a full end-to-end test automation framework, which will include:
- Sending (producing) a message onto a Kafka topic
- Filtering (consuming) for a specific message
- Asserting against a value in the filtered message
This, in a nutshell, is the same automation functionality that you would use in industry.
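To make the filter-and-assert part of that flow concrete, here is a minimal sketch in plain JavaScript. The `findMessage` helper and the sample messages are hypothetical stand-ins for what a Kafka consumer would hand back; the real producer and consumer wiring comes later in the series.

```javascript
// Hypothetical shape of messages a Kafka consumer might return.
const consumedMessages = [
  { key: 'order-1', value: JSON.stringify({ status: 'PENDING' }) },
  { key: 'order-2', value: JSON.stringify({ status: 'SHIPPED' }) },
];

// Filter the consumed messages for the specific one we care about.
function findMessage(messages, key) {
  return messages.find((message) => message.key === key);
}

// Assert against a value in the filtered message.
const match = findMessage(consumedMessages, 'order-2');
const payload = JSON.parse(match.value);
console.log(payload.status); // → SHIPPED
```

In a real test, the assertion would be made through the test framework (AVA) rather than a `console.log`, but the filtering logic is the same.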
Prerequisites
- Knowledge of a programming language - we will be using JavaScript
- Knowledge of setting up a test automation framework - we will be using AVA
- NPM
- Docker
For brevity, I will not go into detail on every section of this tutorial. If you’d like anything elaborated on, please get in touch.
Setup
To develop the automation framework, we need a local instance of Kafka and its dependencies. We will use Docker and docker-compose. Compose is a tool for defining and running multi-container Docker applications.
docker-compose.yml
- Within your directory, create a docker-compose.yml file.
- We will use a public container for this, alongside Zookeeper.
- Kafka uses Zookeeper to store a variety of configuration settings, including:
  - The topics under a broker
  - The next available offset for a consumer/topic/partition
- We can now configure our Kafka instance:
version: '2'
services:
  zookeeper:
    image: wurstmeister/zookeeper
    ports:
      - "2181:2181"
  kafka:
    image: wurstmeister/kafka
    ports:
      - "9092:9092"
    environment:
      KAFKA_ADVERTISED_HOST_NAME: 127.0.0.1
      KAFKA_CREATE_TOPICS: "testAutomationTopic:1:1"
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
    depends_on:
      - zookeeper
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
This is the minimum configuration required, but it also includes KAFKA_CREATE_TOPICS, where we pre-define the topic name, its partition count (1) and replica count (1).
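Each entry in KAFKA_CREATE_TOPICS follows the colon-separated format name:partitions:replicas. A few lines of JavaScript illustrate how the entry above breaks down (the variable names here are my own, purely for illustration):

```javascript
// KAFKA_CREATE_TOPICS entries use the format "name:partitions:replicas".
const topicSpec = 'testAutomationTopic:1:1';

const [name, partitions, replicas] = topicSpec.split(':');

console.log(name);               // → testAutomationTopic
console.log(Number(partitions)); // → 1
console.log(Number(replicas));   // → 1
```

Multiple topics can be pre-created by separating entries with commas.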
Execute the docker-compose.yml file
Now that we have the configuration needed to run Kafka locally, we can build and start the Docker containers.
- Within the directory containing your docker-compose.yml, run docker-compose up.
- Once the container images have been downloaded, you’ll have a locally running Kafka instance.
Coming up next
In the next blog post, we will create our test framework.