Kafka REST API (FastAPI)

For Turkish documentation: README.tr.md

Overview

This module is a Kafka management and messaging REST API service built with FastAPI. It exposes Kafka broker, topic, producer, consumer, and consumer group operations, all protected by JWT authentication. The application can be deployed easily with Docker and set up automatically with GitHub Actions.


Main Features

  • FastAPI-based REST API: Kafka broker, topic, producer, consumer, and consumer group management
  • JWT Authentication: Obtain an access token via the /token endpoint; all APIs are protected
  • User Interface: Bootstrap-based web UI (index.html)
  • Full Kafka Integration: Topic creation, message send/receive, group management
  • Dynamic Broker Definition: Kafka broker addresses can be set via environment variables or automatically fetched from Terraform state/S3
  • Long-lived Token: Script to generate a token valid for 10 years for systems like Kafka Connect
  • Prometheus Healthcheck: /health endpoint
  • Easy Deployment: Quick setup with Dockerfile and docker-compose
  • CI/CD: Automatic setup and test with GitHub Actions

Directory and File Structure


Authentication

  • All API endpoints are protected by JWT.
  • Send a POST request to the /token endpoint with username and password to get an access_token.
  • The Authorization: Bearer header is required for API requests.
  • For long-lived tokens: use python generate_permanent_token.py.
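The token flow above can be sketched in Python. This is a minimal client-side sketch using only the standard library; the helper names are illustrative, and it assumes the /token endpoint accepts form-encoded credentials and returns an access_token field, as the curl examples later in this README suggest.

```python
import json
import urllib.parse
import urllib.request

BASE_URL = "http://localhost:2020"  # default API port used throughout this README

def token_request(username: str, password: str) -> urllib.request.Request:
    """Build the POST /token request with form-encoded credentials."""
    data = urllib.parse.urlencode({"username": username, "password": password})
    return urllib.request.Request(
        f"{BASE_URL}/token", data=data.encode(), method="POST"
    )

def auth_header(token: str) -> dict:
    """Authorization: Bearer header required by all protected endpoints."""
    return {"Authorization": f"Bearer {token}"}

def get_token(username: str, password: str) -> str:
    """Fetch an access_token from a running instance of the service."""
    with urllib.request.urlopen(token_request(username, password)) as resp:
        return json.load(resp)["access_token"]
```

The returned header dict can then be passed to any subsequent request against the protected endpoints.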



Core API Endpoints

Authentication

  • POST /token: Get a JWT token

Broker Management

  • GET /brokers: List all brokers

Topic Management

  • POST /topics: Create a new topic
  • GET /topics: List all topics
  • GET /topics/{topic}: Get topic details
  • PUT /topics/{topic}/config: Update topic configuration
  • DELETE /topics/{topic}: Delete a topic

Producer

  • POST /produce: Send a message

Consumer

  • POST /consume: Read messages

Consumer Group

  • GET /consumer-groups: List all consumer groups
  • GET /consumer-groups/{group_id}: Get consumer group details

Healthcheck

  • GET /health: Service health check

Usage Example

1. Run with Docker

docker-compose up -d
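For reference, a minimal docker-compose.yml sketch is shown below. The service name, build context, and environment values are assumptions for illustration; only port 2020 and the variable names come from this README.

```yaml
services:
  kafka-rest-api:
    build: .
    ports:
      - "2020:2020"
    environment:
      KAFKA_BOOTSTRAP_SERVERS: "broker-1:9092,broker-2:9092"  # placeholder brokers
      API_USERNAME: admin
      API_PASSWORD: admin123
      SECRET_KEY: change-me   # use a strong secret in production
```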

2. Get Token

curl -X POST -d 'username=admin&password=admin123' http://localhost:2020/token

3. List Brokers

curl -H "Authorization: Bearer <TOKEN>" http://localhost:2020/brokers

4. Create Topic

curl -X POST -H "Authorization: Bearer <TOKEN>" -H "Content-Type: application/json" \
  -d '{"name": "topic-1", "num_partitions": 3, "replication_factor": 2}' \
  http://localhost:2020/topics

5. Send Message

curl -X POST -H "Authorization: Bearer <TOKEN>" -H "Content-Type: application/json" \
  -d '{"topic": "topic-1", "value": "Hello Kafka!"}' \
  http://localhost:2020/produce

6. Read Message

curl -X POST -H "Authorization: Bearer <TOKEN>" -H "Content-Type: application/json" \
  -d '{"topic": "topic-1", "group_id": "rest-api-consumer", "max_messages": 5}' \
  http://localhost:2020/consume
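The produce and consume request bodies above can also be built programmatically. The field names mirror the curl examples in this README; the helper names and the max_messages default are illustrative.

```python
import json

def produce_body(topic: str, value: str) -> bytes:
    """JSON body for POST /produce, matching the curl example above."""
    return json.dumps({"topic": topic, "value": value}).encode()

def consume_body(topic: str, group_id: str, max_messages: int = 5) -> bytes:
    """JSON body for POST /consume, matching the curl example above."""
    return json.dumps(
        {"topic": topic, "group_id": group_id, "max_messages": max_messages}
    ).encode()
```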

Environment Variables and Configuration

  • KAFKA_BOOTSTRAP_SERVERS: Kafka broker addresses (comma-separated)
  • FETCH_BROKERS_FROM_TERRAFORM: true/false (fetch brokers from Terraform state)
  • AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY: For S3 access
  • KAFKA_SECURITY_PROTOCOL, KAFKA_SASL_MECHANISM, KAFKA_SASL_USERNAME, KAFKA_SASL_PASSWORD: Security settings
  • API_USERNAME, API_PASSWORD, SECRET_KEY: For API authentication and JWT
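A hedged sketch of how these variables might be read at startup. The variable names come from this README; the helper name, defaults, and the parsed structure are assumptions.

```python
import os

def load_kafka_config(env=os.environ) -> dict:
    """Parse Kafka-related settings from environment variables."""
    return {
        # Comma-separated broker list, split into a Python list
        "bootstrap_servers": env.get(
            "KAFKA_BOOTSTRAP_SERVERS", "localhost:9092"
        ).split(","),
        # true/false flag for fetching brokers from Terraform state
        "fetch_from_terraform": env.get(
            "FETCH_BROKERS_FROM_TERRAFORM", "false"
        ).lower() == "true",
        "security_protocol": env.get("KAFKA_SECURITY_PROTOCOL", "PLAINTEXT"),
        "sasl_mechanism": env.get("KAFKA_SASL_MECHANISM"),
        "sasl_username": env.get("KAFKA_SASL_USERNAME"),
        "sasl_password": env.get("KAFKA_SASL_PASSWORD"),
    }
```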

Automated Deployment with Ansible and S3 Integration

  • Ansible playbooks automatically fetch Kafka Connect and broker info from the Terraform state file in S3.
  • Inventory and environment files are generated dynamically.
  • Docker and FastAPI app are automatically deployed with this information.
  • The entire process is fully automated with GitHub Actions.

Automation with GitHub Actions

  • Automatic setup and test with .github/workflows/ansible_kafka_api.yaml
  • On every push: Ansible runs in Docker, generates inventory, fetches connect/broker info from S3, runs healthcheck, and executes playbooks
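The workflow described above might be outlined as follows. This is only a sketch based on the description in this README; the job name, steps, image, and playbook file are placeholders, not the actual workflow contents.

```yaml
name: ansible_kafka_api
on: [push]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Placeholder step: run Ansible inside Docker to generate the inventory,
      # fetch connect/broker info from S3, run the healthcheck, and execute playbooks
      - name: Run Ansible in Docker
        run: docker run --rm -v "$PWD:/work" -w /work <ansible-image> ansible-playbook <playbook>.yml
```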



Web Interface

  • Accessible at http://localhost:2020
  • All operations can be securely performed via the JWT-protected web UI



Security and Best Practices

  • Change default passwords and use a strong SECRET_KEY
  • Do not share tokens; store long-lived tokens securely
  • All traffic should be protected with SSL (reverse proxy recommended in production)
  • Keep secrets, such as environment files and Docker credentials, out of version control
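For the SSL recommendation above, an illustrative nginx reverse-proxy sketch terminating TLS in front of the API might look like this; the server name and certificate paths are placeholders.

```nginx
server {
    listen 443 ssl;
    server_name kafka-api.example.com;           # placeholder hostname
    ssl_certificate     /etc/ssl/certs/kafka-api.pem;
    ssl_certificate_key /etc/ssl/private/kafka-api.key;

    location / {
        # Forward to the API on its default port from this README
        proxy_pass http://127.0.0.1:2020;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto https;
    }
}
```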

Operational Excellence Goal

This service provides an AdminClient-based REST API for Kafka cluster management. All operations are performed on the Kafka cluster set up in Section 2 (bootstrap servers, security config) using the same connection details.


Docker Deployment

  • The REST API service is packaged as a Docker container.
  • The container is configured to connect to the Kafka cluster as described in Section 2 (bootstrap servers, security).
  • API port 2020 is exposed and must be accessible.
  • Image build and deployment steps are included in the documentation.
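A hypothetical Dockerfile sketch for packaging a FastAPI service like this one. The file names and the uvicorn entrypoint (main:app) are assumptions; only the exposed port 2020 comes from this README.

```dockerfile
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 2020
# Assumed entrypoint; adjust module:app to match the actual application
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "2020"]
```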