Step 0 (Create Docker Compose file)
See my Docker Compose file
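A minimal docker-compose.yml equivalent of the containers created in Steps 1 and 2 might look like this (a sketch only: image tags and environment values are taken from the commands below; the service names and port mappings are assumptions):

```yaml
version: '2'
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:4.1.0
    ports:
      - "2181:2181"
      - "2888:2888"
      - "3888:3888"
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 36000
  kafka:
    image: confluentinc/cp-kafka:4.1.0
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_PORT: 9092
      BROKER_ID: 1
```

With this file in place, `docker-compose up -d` starts both containers, replacing the manual `docker run` commands in Steps 1 and 2.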
Step 1 (Create Zookeeper Container)
docker run -d --name zookeeper -p 2181:2181 -p 2888:2888 -p 3888:3888 -e ZOOKEEPER_CLIENT_PORT=2181 -e ZOOKEEPER_TICK_TIME=36000 confluentinc/cp-zookeeper:4.1.0
Step 2 (Create Kafka Container)
docker run -d --name kafka -p 9092:9092 -e ADVERTISED_HOST=192.168.99.100 -e KAFKA_ADVERTISED_PORT=9092 -e KAFKA_ADVERTISED_HOST_NAME=$(echo $DOCKER_HOST | cut -f3 -d'/' | cut -f1 -d':') -e LOG_LEVEL=DEBUG -e KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181 -e CONSUMER_THREADS=1 -e BROKER_ID=1 --link zookeeper:zookeeper confluentinc/cp-kafka:4.1.0
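To confirm the broker registered itself with Zookeeper, you can list the registered broker ids (an optional check; `zookeeper-shell` ships with the Confluent images):

```shell
# List broker ids registered in Zookeeper; a healthy single-broker
# setup should show the id set in BROKER_ID above, i.e. [1]
docker exec zookeeper bash -c "zookeeper-shell localhost:2181 ls /brokers/ids"
```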
Step 3 (Create Kafka Topic)
docker exec kafka kafka-topics --create --zookeeper zookeeper:2181 --replication-factor 1 --partitions 1 --topic testing_odenktools
Step 4 (Verify Kafka Topic)
docker exec kafka kafka-topics --describe --topic testing_odenktools --zookeeper zookeeper:2181
Step 5 (Get All Kafka Topics)
docker exec kafka kafka-topics --zookeeper zookeeper:2181 --list
Step 6 (Listen to Kafka Topic)
docker exec -it kafka bash -c "kafka-console-consumer --bootstrap-server kafka:9092 --topic testing_odenktools --from-beginning --max-messages 10"
Step 7 (Produce data to Kafka Topic)
docker exec -it kafka bash -c "kafka-console-producer --broker-list kafka:9092 --topic testing_odenktools"
Type a message and press Enter to send it, for example:
{"TEST":"HEYYYY ODENKTOOLS"}
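Messages can also be sent non-interactively by piping into the console producer, e.g. to replay the sample message above without an interactive shell:

```shell
# Pipe one message into the producer; -i keeps stdin open but no TTY is needed
echo '{"TEST":"HEYYYY ODENKTOOLS"}' | docker exec -i kafka bash -c "kafka-console-producer --broker-list kafka:9092 --topic testing_odenktools"
```

The consumer from Step 6 should then print the message.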