
Kong API Gateway Examples

This repository contains practical examples demonstrating various features and use cases of Kong API Gateway.

Table of Contents

🤖 AI & Machine Learning

📨 Event & Message Processing

🔧 API Enhancement


Overview

Each example is self-contained and includes:

  • Complete configuration files
  • Docker Compose setup
  • Detailed documentation
  • Step-by-step instructions
  • Testing procedures

Prerequisites

General requirements for all examples:

  • Docker and Docker Compose
  • A Kong Enterprise license (set via KONG_LICENSE_DATA) or Kong Konnect access for the examples that provide ee.env / konnect.env files
  • curl (or a similar HTTP client) for the testing procedures

Examples

Request Callout (request-callout/)

Demonstrates how to use Kong's Request Callout plugin to enrich API requests with data from multiple backend services.

Key Features:

  • Sequential API calls
  • Request/response transformation
  • Mock backend services
  • Data enrichment patterns

Technologies:

  • Kong Request Callout plugin
  • Kong Mocking plugin
  • OpenAPI specifications
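A quick way to see the enrichment is to call the proxied route once the stack is up. This is only a sketch: the /orders path, the proxy port, and the payload fields are placeholders; the real routes and callout definitions live in request-callout/kong-config/kong.yaml and mock-api.yaml.

# Call the Kong route on the default proxy port (8000 is assumed here).
# Before proxying, the Request Callout plugin performs its configured
# sequential calls to the mocked backends and merges their data into the request.
curl -s http://localhost:8000/orders \
  -H "Content-Type: application/json" \
  -d '{"order_id": "1234"}'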

Kafka HTTP Mediation (kafka-http-mediation/)

Shows how to use Kong API Gateway to mediate between HTTP and Kafka protocols.

Key Features:

  • HTTP to Kafka message production
  • Kafka to HTTP consumption
  • Server-Sent Events (SSE) streaming
  • Synchronous and asynchronous patterns

Technologies:

  • Kong Kafka Upstream plugin
  • Kong Kafka Consumer plugin
  • Apache Kafka
  • Server-Sent Events
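A rough sketch of both directions; the /kafka/produce and /kafka/consume paths are placeholders, and the actual route names are defined in kafka-http-mediation/kong-config/kong.yaml.

# HTTP -> Kafka: the gateway turns this POST into a message on the configured topic
curl -s -X POST http://localhost:8000/kafka/produce \
  -H "Content-Type: application/json" \
  -d '{"event": "order-created", "id": 42}'

# Kafka -> HTTP: stream messages back out as Server-Sent Events
# (-N disables curl's output buffering so events appear as they arrive)
curl -N http://localhost:8000/kafka/consume \
  -H "Accept: text/event-stream"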

Kafka SSE with Authentication (kafka-sse-with-auth/)

Demonstrates how to consume Kafka messages via Server-Sent Events (SSE) with OpenID Connect (OIDC) authentication.

Key Features:

  • Kafka message consumption via SSE
  • OIDC authentication integration
  • Keycloak for identity management

Technologies:

  • Kong Kafka Consumer plugin
  • Kong OIDC plugin
  • Apache Kafka
  • Server-Sent Events
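A hedged end-to-end sketch, assuming Keycloak is exposed on port 8080, a realm and client as configured in realm-export.json (the realm, client ID, and secret below are placeholders), and jq available to extract the token.

# 1. Obtain an access token from Keycloak
TOKEN=$(curl -s http://localhost:8080/realms/kong/protocol/openid-connect/token \
  -d "grant_type=client_credentials" \
  -d "client_id=kafka-sse" \
  -d "client_secret=changeme" | jq -r '.access_token')

# 2. Stream Kafka messages over SSE through Kong, authenticating with the token
curl -N http://localhost:8000/kafka/stream \
  -H "Accept: text/event-stream" \
  -H "Authorization: Bearer $TOKEN"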

Kafka Schema Validation (kafka-schema-validation/)

Demonstrates how to use Kong Gateway as an event gateway with Kafka integration, including schema validation using a Schema Registry.

Key Features:

  • Kafka message production with schema validation
  • Multiple consumption patterns (REST and SSE)
  • Schema Registry integration
  • Avro and JSON schema support

Technologies:

  • Kong Kafka plugins
  • Apache Kafka
  • Apicurio Schema Registry (Confluent-compatible)
  • Avro schemas
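A sketch of the validation behavior; the route path and payload fields are placeholders, and the actual schemas are in avro-schema.json and json-schema.json and are registered in the Apicurio registry.

# A payload that matches the registered schema is published to Kafka
curl -s -X POST http://localhost:8000/kafka/orders \
  -H "Content-Type: application/json" \
  -d '{"order_id": 42, "status": "CREATED"}'

# A payload that violates the schema should be rejected at the gateway
# instead of reaching the topic
curl -s -X POST http://localhost:8000/kafka/orders \
  -H "Content-Type: application/json" \
  -d '{"unexpected": true}'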

AI Image Generation (ai-image-generation/)

Demonstrates how to use Kong's AI Proxy Advanced plugin to create an AI gateway for image generation and audio transcription services.

Key Features:

  • OpenAI DALL-E 3 image generation
  • OpenAI Whisper audio transcription
  • AI request routing and load balancing
  • Token counting and latency optimization
  • Request/response logging

Technologies:

  • Kong AI Proxy Advanced plugin
  • OpenAI API (DALL-E 3, Whisper)
  • AI request balancing and retries
  • CORS support
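Illustrative calls through the gateway, assuming routes such as /image-generation and /audio-transcription (the paths are placeholders; the request bodies follow the OpenAI formats that the plugin proxies, with the model supplied by the plugin configuration).

# Generate an image via the DALL-E 3 backend configured in the AI Proxy Advanced plugin
curl -s -X POST http://localhost:8000/image-generation \
  -H "Content-Type: application/json" \
  -d '{"prompt": "a watercolor hummingbird", "size": "1024x1024", "n": 1}'

# Transcribe an audio file via the Whisper backend (multipart upload)
curl -s -X POST http://localhost:8000/audio-transcription \
  -F "file=@sample.wav"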

AI Prompt Compressor (ai-prompt-compressor/)

Shows how to use Kong's AI Prompt Compressor plugin to optimize LLM requests by compressing prompts while maintaining semantic meaning.

Key Features:

  • Intelligent prompt compression
  • Token-based compression strategies
  • LLM request optimization
  • Configurable compression ranges
  • Integration with local LLM models

Technologies:

  • Kong AI Prompt Compressor plugin
  • Kong AI Proxy Advanced plugin
  • LLMlingua compression service
  • Local LLM models (Qwen2.5)
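A rough sketch of a request whose prompt is compressed before it reaches the model; the /chat route is a placeholder, and the body assumes the OpenAI-style chat format handled by the AI Proxy Advanced plugin.

# The AI Prompt Compressor plugin sends the prompt to the LLMlingua service,
# which shortens it within the configured compression range before the
# AI Proxy Advanced plugin forwards it to the local Qwen2.5 model.
curl -s -X POST http://localhost:8000/chat \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Please give me a very long, extremely detailed, step by step, repetitive explanation of what an API gateway does and why prompt compression reduces token usage."}]}'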

Solace HTTP Mediation (solace-http-mediation/)

Demonstrates how to use Kong API Gateway to mediate between HTTP and Solace PubSub+ messaging.

Key Features:

  • HTTP to Solace message production
  • Queue-based messaging patterns
  • Message transformation and routing
  • Persistent message delivery
  • Authentication integration

Technologies:

  • Kong Solace Upstream plugin
  • Solace PubSub+ Standard
  • Message queues and topics
  • Basic authentication
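An illustrative call that publishes a message to Solace through the gateway; the /solace/orders path and the kong:kong credentials are placeholders, and basic authentication plus the target queue are configured in solace-http-mediation/kong-config/kong.yaml.

# HTTP -> Solace: Kong authenticates the caller and publishes the JSON body
# to the configured Solace queue/topic with persistent delivery
curl -s -X POST http://localhost:8000/solace/orders \
  -u kong:kong \
  -H "Content-Type: application/json" \
  -d '{"order_id": 42, "status": "NEW"}'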

Getting Started

  1. Clone this repository:
git clone https://github.com/your-username/kong-api-gw-examples.git
cd kong-api-gw-examples
  2. Set up your Kong Enterprise license:
export KONG_LICENSE_DATA='{"license":{"version":1,...}}'
  3. Choose an example and follow its specific README instructions.
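
As a sketch of the typical per-example workflow (kafka-http-mediation is used here only as an illustration; some examples also require the ee.env or konnect.env files described in their READMEs):

# Start one of the examples from its own directory
cd kafka-http-mediation
docker compose up -d

# Follow the example's README for its testing procedures, then tear it down
docker compose down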

Project Structure

kong-api-gw-examples/
├── README.md
├── ai-image-generation/
│   ├── README.md
│   ├── docker-compose.yaml
│   ├── docker-compose.ee.yaml
│   ├── kong-config/
│   │   └── kong.yaml
│   ├── ee.env
│   └── konnect.env
├── ai-prompt-compressor/
│   ├── README.md
│   ├── docker-compose.yaml
│   ├── docker-compose.ee.yaml
│   ├── kong-config/
│   │   └── kong.yaml
│   ├── ee.env
│   └── konnect.env
├── kafka-http-mediation/
│   ├── README.md
│   ├── docker-compose.yaml
│   └── kong-config/
│       └── kong.yaml
├── kafka-schema-validation/
│   ├── README.md
│   ├── docker-compose.yaml
│   ├── docker-compose.ee.yaml
│   ├── kong-config/
│   │   └── kong.yaml
│   ├── avro-schema.json
│   └── json-schema.json
├── kafka-sse-with-auth/
│   ├── README.md
│   ├── docker-compose.yaml
│   ├── kong-config/
│   │   └── kong.yaml
│   └── realm-export.json
├── request-callout/
│   ├── README.md
│   ├── docker-compose.yaml
│   ├── kong-config/
│   │   └── kong.yaml
│   └── mock-api.yaml
└── solace-http-mediation/
    ├── README.md
    ├── docker-compose.yaml
    ├── docker-compose.ee.yaml
    ├── kong-config/
    │   └── kong.yaml
    ├── ee.env
    └── konnect.env

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

When adding a new example:

  1. Create a new directory with a descriptive name
  2. Include a comprehensive README
  3. Provide all necessary configuration files
  4. Add the example to this main README
  5. Ensure all code is well-documented

License

This project is licensed under the Apache 2.0 License.

Additional Resources