For Turkish documentation: README.tr.md
This module is a Kafka management and messaging REST API service built with FastAPI. It provides Kafka broker, topic, producer, consumer, and consumer group operations, all protected by JWT authentication. The application can be deployed easily with Docker and provisioned automatically with GitHub Actions.
- FastAPI-based REST API: Kafka broker, topic, producer, consumer, and consumer group management
- JWT Authentication: Obtain an access token via the /token endpoint; all APIs are protected
- User Interface: Bootstrap-based web UI (index.html)
- Full Kafka Integration: Topic creation, message send/receive, group management
- Dynamic Broker Definition: Kafka broker addresses can be set via environment variables or automatically fetched from Terraform state/S3
- Long-lived Token: Script to generate a token valid for 10 years for systems like Kafka Connect
- Prometheus Healthcheck: /health endpoint
- Easy Deployment: Quick setup with Dockerfile and docker-compose
- CI/CD: Automatic setup and test with GitHub Actions
- main.py: Main FastAPI application file
- auth.py: JWT-based authentication and token generation
- config.py: Configuration and environment variable management
- models.py: API data models
- routers/:
- brokers.py: Broker management
- topics.py: Topic management
- producer.py: Message sending
- consumer.py: Message reading
- consumer_groups.py: Consumer group operations
- services/: Helper services for Kafka
- templates/index.html: Web UI
- static/app.js: UI scripts
- generate_permanent_token.py: Script to generate a JWT valid for 10 years
- requirements.txt: Python dependencies
- Dockerfile: Docker image
- docker-compose.yml: Service definition
- All API endpoints are protected by JWT.
- Send a POST request to the /token endpoint with username and password to get an access_token.
- API requests must include an `Authorization: Bearer <TOKEN>` header.
- For long-lived tokens: run `python generate_permanent_token.py`
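Conceptually, a long-lived token is just a payload with a far-future `exp` claim signed with the service's SECRET_KEY (HS256). A minimal stdlib sketch of the idea — the real generate_permanent_token.py may use a JWT library, and the claim names and key value here are illustrative:

```python
import base64
import hashlib
import hmac
import json
import time

def b64url(data: bytes) -> str:
    # JWT uses unpadded URL-safe base64
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_token(username: str, secret_key: str, lifetime_s: int) -> str:
    header = {"alg": "HS256", "typ": "JWT"}
    payload = {"sub": username, "exp": int(time.time()) + lifetime_s}
    signing_input = (
        f"{b64url(json.dumps(header).encode())}."
        f"{b64url(json.dumps(payload).encode())}"
    )
    sig = hmac.new(secret_key.encode(), signing_input.encode(), hashlib.sha256).digest()
    return f"{signing_input}.{b64url(sig)}"

# A ten-year token, as used for systems like Kafka Connect
token = make_token("admin", "change-me", 10 * 365 * 24 * 3600)
```

Because the token only expires in ten years, rotating SECRET_KEY is the only way to revoke it — store it accordingly.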
- POST /topics: Create a new topic
- GET /topics: List all topics
- GET /topics/{topic}: Get topic details
- PUT /topics/{topic}/config: Update topic configuration
- DELETE /topics/{topic}: Delete topic
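The topic endpoints can be driven from Python as well as curl. A minimal stdlib sketch — the body fields mirror the curl example in this README, while the base URL and token are placeholders:

```python
import json
import urllib.request

API = "http://localhost:2020"  # placeholder base URL

def topic_request(name: str, partitions: int, replication: int, token: str) -> urllib.request.Request:
    # Build a POST /topics request with the JWT in the Authorization header
    body = json.dumps({
        "name": name,
        "num_partitions": partitions,
        "replication_factor": replication,
    }).encode()
    return urllib.request.Request(
        f"{API}/topics",
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

req = topic_request("topic-1", 3, 2, "<TOKEN>")
# urllib.request.urlopen(req) would send it against a running service
```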

- GET /consumer-groups: List all consumer groups
- GET /consumer-groups/{group_id}: Get consumer group details
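Group details typically include per-partition committed offsets, from which consumer lag can be derived. The response shape below is hypothetical (the actual field names of this API may differ); the lag arithmetic is the standard log-end-offset minus committed-offset:

```python
# Hypothetical response body for GET /consumer-groups/{group_id}
group = {
    "group_id": "rest-api-consumer",
    "offsets": [
        {"topic": "topic-1", "partition": 0, "current_offset": 42, "log_end_offset": 50},
        {"topic": "topic-1", "partition": 1, "current_offset": 10, "log_end_offset": 10},
    ],
}

def total_lag(group: dict) -> int:
    # Lag per partition = log end offset minus committed offset, summed over partitions
    return sum(o["log_end_offset"] - o["current_offset"] for o in group["offsets"])

print(total_lag(group))  # 8
```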

Start the service:

```bash
docker-compose up -d
```

Get an access token:

```bash
curl -X POST -d 'username=admin&password=admin123' http://localhost:2020/token
```

List brokers:

```bash
curl -H "Authorization: Bearer <TOKEN>" http://localhost:2020/brokers
```

Create a topic:

```bash
curl -X POST -H "Authorization: Bearer <TOKEN>" -H "Content-Type: application/json" \
  -d '{"name": "topic-1", "num_partitions": 3, "replication_factor": 2}' \
  http://localhost:2020/topics
```

Send a message:

```bash
curl -X POST -H "Authorization: Bearer <TOKEN>" -H "Content-Type: application/json" \
  -d '{"topic": "topic-1", "value": "Hello Kafka!"}' \
  http://localhost:2020/produce
```

Consume messages:

```bash
curl -X POST -H "Authorization: Bearer <TOKEN>" -H "Content-Type: application/json" \
  -d '{"topic": "topic-1", "group_id": "rest-api-consumer", "max_messages": 5}' \
  http://localhost:2020/consume
```

- KAFKA_BOOTSTRAP_SERVERS: Kafka broker addresses (comma-separated)
- FETCH_BROKERS_FROM_TERRAFORM: true/false (fetch brokers from Terraform state)
- AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY: For S3 access
- KAFKA_SECURITY_PROTOCOL, KAFKA_SASL_MECHANISM, KAFKA_SASL_USERNAME, KAFKA_SASL_PASSWORD: Security settings
- API_USERNAME, API_PASSWORD, SECRET_KEY: For API authentication and JWT
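An example environment file tying these variables together (all values are placeholders; adjust for your deployment):

```env
# Kafka connection
KAFKA_BOOTSTRAP_SERVERS=broker1:9092,broker2:9092
FETCH_BROKERS_FROM_TERRAFORM=false
KAFKA_SECURITY_PROTOCOL=SASL_PLAINTEXT
KAFKA_SASL_MECHANISM=PLAIN
KAFKA_SASL_USERNAME=kafka-user
KAFKA_SASL_PASSWORD=change-me

# Only needed when FETCH_BROKERS_FROM_TERRAFORM=true
AWS_ACCESS_KEY_ID=<your-access-key-id>
AWS_SECRET_ACCESS_KEY=<your-secret-access-key>

# API authentication / JWT
API_USERNAME=admin
API_PASSWORD=change-me
SECRET_KEY=a-long-random-string
```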
- Ansible playbooks automatically fetch Kafka Connect and broker info from the Terraform state file in S3.
- Inventory and environment files are generated dynamically.
- Docker and FastAPI app are automatically deployed with this information.
- The entire process is fully automated with GitHub Actions.
- Automatic setup and test with .github/workflows/ansible_kafka_api.yaml
- On every push: Ansible runs in Docker, generates inventory, fetches connect/broker info from S3, runs healthcheck, and executes playbooks
- Accessible at http://localhost:2020
- All operations can be securely performed via the JWT-protected web UI
- Change default passwords and use a strong SECRET_KEY
- Do not share tokens; store long-lived tokens securely
- All traffic should be protected with SSL (reverse proxy recommended in production)
- Keep Docker env files and secret environment variables out of version control
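One way to put SSL in front of the service is an nginx reverse proxy; a minimal sketch (server name and certificate paths are placeholders):

```nginx
server {
    listen 443 ssl;
    server_name kafka-api.example.com;

    ssl_certificate     /etc/ssl/certs/kafka-api.crt;
    ssl_certificate_key /etc/ssl/private/kafka-api.key;

    location / {
        # FastAPI service exposed on port 2020
        proxy_pass http://127.0.0.1:2020;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto https;
    }
}
```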
This service provides an AdminClient-based REST API for Kafka cluster management. All operations are performed on the Kafka cluster set up in Section 2 (bootstrap servers, security config) using the same connection details.
- The REST API service is packaged as a Docker container.
- The container is configured to connect to the Kafka cluster as described in Section 2 (bootstrap servers, security).
- API port 2020 is exposed and must be accessible.
- Image build and deployment steps are included in the documentation.