A Docker Compose stack for experimenting with Kafka in Python.
- Clone the repo and `cd` into the folder.
- Run `docker compose build`.
- Run `docker compose up`.
- Open Jupyter Lab at http://localhost:8888/.
- Jupyter notebooks live in the notebooks/ directory, which is mounted into the container as a local volume, so your notebooks (and any changes you make to them) persist across `docker compose up` / `docker compose down` cycles.
- The Kafka / ZooKeeper data is stored persistently in a Docker volume created by Compose. To scrap the existing message store, run `docker compose down -v`, which brings down the containers and deletes the volume so you can start fresh. Note: this does not delete your notebooks/ folder.
- Start with 00-kafka-producer.ipynb.
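As a rough idea of what a producer cell in that notebook might contain, here is a minimal sketch using the kafka-python package. The topic name `demo-topic`, the `KAFKA_BOOTSTRAP` environment variable, and the broker address `localhost:9092` are illustrative assumptions, not details taken from this repo; the actual notebook may use different names.

```python
import json
import os


def serialize(value):
    """Encode a Python dict as UTF-8 JSON bytes, a common Kafka wire format."""
    return json.dumps(value).encode("utf-8")


# The producer half needs the kafka-python package and a reachable broker,
# so this sketch only attempts to send when KAFKA_BOOTSTRAP is set
# (e.g. "localhost:9092" when the compose stack is up).
bootstrap = os.environ.get("KAFKA_BOOTSTRAP")
if bootstrap:
    from kafka import KafkaProducer  # assumption: kafka-python is installed

    producer = KafkaProducer(
        bootstrap_servers=bootstrap,
        value_serializer=serialize,
    )
    # "demo-topic" is a hypothetical topic name for illustration only.
    producer.send("demo-topic", {"greeting": "hello from the notebook"})
    producer.flush()
```

Serializing to JSON bytes keeps the notebook's payloads human-readable when you inspect the topic from a consumer.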