Kong — AI Agent Framework: Live Stats & TrendScore

Live GitHub stats, community sentiment, and trend data for Kong. TrendingBots tracks star velocity, fork activity, and what developers are saying — updated from real data sources.

GitHub data synced: Mar 27, 2026 • Sentiment updated: Apr 9, 2026

GitHub Statistics

Community Sentiment

Community Buzz: "Kong Enterprise is known for its extensive API management features," as mentioned on Dev.to.

Pros & Cons

What People Love

The core Kong API Gateway, its ease of use, scalability, and customizability

Common Complaints

No significant complaints in recent discussions

Biggest Positive: Kong API Gateway

Biggest Negative: No significant issues

Why Kong Stands Out

Kong Gateway provides a cloud-native, platform-agnostic, and scalable API and AI gateway with advanced features such as multi-LLM support, semantic security, and MCP traffic governance. Its extensible plugin architecture and native Kubernetes support set it apart. By centralizing common API, AI, and MCP functionality, Kong Gateway frees engineering teams to focus on the challenges that matter most, while its advanced routing, load balancing, health checking, and universal LLM API make it a standout solution in the market.
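As a sketch of the "universal LLM API" idea, the AI traffic features are typically enabled through the ai-proxy plugin in a declarative config. The service name, route path, provider, and model below are illustrative assumptions — consult the plugin documentation for the exact schema of your Kong version:

```yaml
# Hypothetical decK declarative config enabling Kong's ai-proxy plugin.
# All names, paths, and model values here are illustrative placeholders.
_format_version: "3.0"
services:
  - name: llm-service
    url: http://localhost:32000   # placeholder upstream; ai-proxy handles the real call
    routes:
      - name: chat-route
        paths:
          - /chat
    plugins:
      - name: ai-proxy
        config:
          route_type: llm/v1/chat
          auth:
            header_name: Authorization
            header_value: Bearer $OPENAI_API_KEY   # assumed env-substituted secret
          model:
            provider: openai      # illustrative provider choice
            name: gpt-4o          # illustrative model name
```

With a config like this, clients call `/chat` on the gateway with a provider-neutral request shape, and the plugin translates it to the configured backend LLM.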

Built With

  - Build a cloud-native API gateway with advanced AI traffic capabilities — Kong Gateway enables this with its high-performance core and extensible plugin architecture
  - Build a microservices architecture with service discovery and load balancing — Kong Gateway provides advanced routing, load balancing, and health checking features
  - Build an AI-powered API with multi-LLM support and semantic security — Kong AI Gateway offers a universal LLM API and MCP traffic governance
  - Build a Kubernetes ingress with native support — Kong Gateway ships an official Kubernetes Ingress Controller
  - Build a plugin-based architecture for extending gateway functionality — Kong Gateway's Plugin Hub hosts many community-developed plugins
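The Kubernetes use case above can be sketched with a standard Ingress resource handled by the Kong Ingress Controller. The resource names, path, and port are placeholder assumptions, not values from this page:

```yaml
# Minimal sketch: route /echo traffic through the Kong Ingress Controller
# to an in-cluster service. All names and ports are illustrative.
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: echo-ingress
spec:
  ingressClassName: kong          # assumes the controller registered this class
  rules:
    - http:
        paths:
          - path: /echo
            pathType: Prefix
            backend:
              service:
                name: echo-service   # placeholder backend service
                port:
                  number: 80
```

Because Kong implements the standard Ingress API, gateway features (plugins, auth, rate limiting) can then be layered onto this route without changing the backend service.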

Getting Started

  1. Clone the Docker repository and navigate to the compose folder: $ git clone https://github.com/Kong/docker-kong && cd docker-kong/compose/
  2. Start the Gateway stack: $ KONG_DATABASE=postgres docker-compose --profile database up
  3. Register a service via the Admin API (or decK), supplying a name and upstream URL (example values shown): $ curl -X POST http://localhost:8001/services -d name=example-service -d url=http://httpbin.org
  4. Add JWT authentication: $ curl -X POST http://localhost:8001/plugins -d name=jwt
  5. Send a request through the proxy port to verify the gateway responds: $ curl -i http://localhost:8000
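Steps 3 and 4 above can equivalently be expressed as a decK declarative file and applied with `deck gateway sync`. The service name, upstream URL, and route path below are illustrative assumptions matching the example commands, not required values:

```yaml
# Declarative equivalent of the Admin API calls in steps 3-4:
# one service, one route, and the JWT plugin scoped to that service.
_format_version: "3.0"
services:
  - name: example-service          # illustrative name
    url: http://httpbin.org        # illustrative upstream
    routes:
      - name: example-route
        paths:
          - /example
    plugins:
      - name: jwt                  # require a valid JWT on this service
```

Keeping gateway state in a file like this makes the configuration reviewable and repeatable, whereas the imperative curl calls mutate a running instance directly.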

About

🦍 The API and AI Gateway

Official site: https://konghq.com/install/

Category & Tags

Category: development

Tags: ai, ai-gateway, api-gateway, api-management, apis, artificial-intelligence, cloud-native, devops, kubernetes, kubernetes-ingress, kubernetes-ingress-controller, llm-gateway, llm-ops, mcp, mcp-gateway, microservice, microservices, openai-proxy, reverse-proxy, serverless

Market Context

Kong is a leading API gateway solution, competing with products such as AWS API Gateway.