How to run Kafka with Schema Registry locally with Docker


This short guide shows you how to get Kafka with Schema Registry up and running locally using Docker containers. It is useful when you want to test Kafka and Schema Registry locally without manually setting up every component. A single docker-compose file brings up zookeeper, broker, schema-registry, rest-proxy, connect, ksql-datagen, ksql-server, control-center and ksql-cli on your machine with no extra configuration. This guide will also show you how to remove the Docker containers once you are done testing Kafka.

My use case for this was working with Amazon MSK and needing to test locally before connecting to the AWS MSK cluster. You can set up a publisher through Control Center if needed, but it is a little involved, so I recommend reading the main Confluent guide, which can be found here: Confluent quick start guide

Prerequisites

  • Docker:
    • Docker version 1.11 or later is installed and running.
    • Docker Compose is installed. Docker Compose is installed by default with Docker for Mac.
    • Docker memory is allocated minimally at 8 GB. When using Docker Desktop for Mac, the default Docker memory allocation is 2 GB. You can change the default allocation to 8 GB in Docker > Preferences > Advanced.
  • Git.
  • Internet connectivity.
  • Ensure you are on an Operating System currently supported by Confluent Platform.
  • Networking and Kafka on Docker: Configure your hosts and ports so that components both inside and outside the Docker network can communicate. For more details, see this article and the listener sketch just after this list.
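
As background for that last point, the docker-compose file in this repo gives the broker two listeners: one advertised as broker:29092 for clients inside the Docker network, and one advertised as localhost:9092 for clients on your host machine. The snippet below is a minimal sketch of the relevant part of the broker's environment block; exact values can differ between Confluent Platform versions, so treat it as illustrative rather than a copy of the repo file.

# From the environment: section of the broker service in docker-compose.yml
KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://broker:29092,PLAINTEXT_HOST://localhost:9092
KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT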

Download and run Kafka using Docker

1. Clone the Confluent examples GitHub repository and check out the 5.4.0-post branch.
git clone https://github.com/confluentinc/examples
cd examples
git checkout 5.4.0-post

2. Navigate to the cp-all-in-one directory of the examples repository.

cd cp-all-in-one/

3. Start Kafka with Schema Registry by specifying two options: -d to run in detached mode, and --build to build the Kafka Connect image with the kafka-connect-datagen source connector from Confluent Hub.

You must allocate a minimum of 8 GB of memory to Docker. The default memory allocation on Docker Desktop for Mac is 2 GB and must be changed.

docker-compose up -d --build

This starts Kafka and Schema Registry, with a separate container for each Confluent Platform component. Your output should resemble the following:

Creating network "cp-all-in-one_default" with the default driver
 Creating zookeeper       … done
 Creating broker          … done
 Creating schema-registry … done
 Creating rest-proxy      … done
 Creating connect         … done
 Creating ksql-datagen    … done
 Creating ksql-server     … done
 Creating control-center  … done
 Creating ksql-cli        … done
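
If you are curious what the --build flag actually builds: the connect service in this repo is built from a small Dockerfile that layers the datagen connector on top of a Connect base image using the confluent-hub CLI. The lines below are a rough sketch of that idea; the base image tag and connector version are placeholders, so check the Dockerfile in the repo for the exact values.

FROM confluentinc/cp-kafka-connect-base:5.4.0
# Pull the kafka-connect-datagen source connector from Confluent Hub at build time
RUN confluent-hub install --no-prompt confluentinc/kafka-connect-datagen:0.2.0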

4. Optional: Run this command to verify that the services are up and running.

docker-compose ps

You should see the following:

     Name                    Command               State                Ports
------------------------------------------------------------------------------------------
broker            /etc/confluent/docker/run        Up      0.0.0.0:29092->29092/tcp,
                                                           0.0.0.0:9092->9092/tcp
connect           /etc/confluent/docker/run        Up      0.0.0.0:8083->8083/tcp,
                                                           9092/tcp
control-center    /etc/confluent/docker/run        Up      0.0.0.0:9021->9021/tcp
ksql-cli          ksql http://localhost:8088       Up
ksql-datagen      bash -c echo Waiting for K ...   Up
ksql-server       /etc/confluent/docker/run        Up      0.0.0.0:8088->8088/tcp
rest-proxy        /etc/confluent/docker/run        Up      0.0.0.0:8082->8082/tcp
schema-registry   /etc/confluent/docker/run        Up      0.0.0.0:8081->8081/tcp
zookeeper         /etc/confluent/docker/run        Up      0.0.0.0:2181->2181/tcp,
                                                           2888/tcp, 3888/tcp

If the state is not Up, rerun the docker-compose up -d command.
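
Another quick sanity check, beyond docker-compose ps, is to hit the REST endpoints the services expose on localhost. Each of the following should return a small JSON response (empty lists are normal on a fresh stack); the endpoints shown are the standard Schema Registry, REST Proxy and Kafka Connect APIs on their default ports.

# Schema Registry: list registered subjects (returns [] on a fresh cluster)
curl http://localhost:8081/subjects

# REST Proxy: list topics visible to the proxy
curl http://localhost:8082/topics

# Kafka Connect: list deployed connectors (returns [] until you create one)
curl http://localhost:8083/connectors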

You can navigate to the Control Center web interface at http://localhost:9021/ to manage Kafka (create topics, inspect messages, and so on).
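
If you prefer the command line, you can also create a topic and publish an Avro message straight from the containers, which exercises both the broker and Schema Registry. The commands below are a minimal sketch: the topic name orders and the record schema are made up for illustration, and the flags match the 5.4-era images, so adjust them if you are on a different version.

# Create a test topic on the broker (the topic name "orders" is just an example)
docker-compose exec broker kafka-topics --create --topic orders \
  --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1

# Publish an Avro message; this registers the value schema in Schema Registry.
# Type a record such as {"id": "123"} and press Ctrl+C when done.
docker-compose exec schema-registry kafka-avro-console-producer \
  --broker-list broker:29092 --topic orders \
  --property schema.registry.url=http://localhost:8081 \
  --property value.schema='{"type":"record","name":"Order","fields":[{"name":"id","type":"string"}]}'

With the default subject naming, curl http://localhost:8081/subjects should then list orders-value.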

Stop Kafka Docker containers

When you are done working with Docker, you can stop and remove Docker containers and images.

  1. View a list of all Docker container IDs.
docker container ls -aq

2. Run the following command to stop the Docker containers for Confluent:

docker container stop $(docker container ls -a -q -f "label=io.confluent.docker")

3. Run the following command to stop the containers and prune the Docker system. Running this command deletes containers, networks, volumes, and images, freeing up disk space:

docker container stop $(docker container ls -a -q -f "label=io.confluent.docker") && docker system prune -a -f --volumes
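
Note that docker system prune -a -f --volumes is aggressive: it removes all unused images, networks and volumes on your machine, not just the ones from this guide. If you only want to tear down this stack, running the standard docker-compose teardown from the cp-all-in-one directory is a lighter-touch option:

# Stop and remove only the containers, network and volumes created by this compose file
docker-compose down --volumes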

You can rebuild and restart the containers at any time using the docker-compose up -d --build command.

