API Gateway
Laposa was approached to implement a centralized API gateway as the connective layer, enabling routing, authentication, and load balancing, while abstracting the complexity of direct integrations. A robust and scalable API architecture was delivered using Kong API Gateway, with comprehensive configuration, seamless deployment, and continuous monitoring to ensure optimal performance and reliability.
- Client | Musgrave Limited
- Technologies | Kong · Kubernetes · DataDog · Event Hubs
- Date | 2017 – 2019
- Handling 4M+ API requests daily
- Serving 67 API consumers
- Exposing 82 APIs and web services
The Challenge
Our client, a large-scale enterprise that collaborates with multiple third-party suppliers and service providers, faced increasing complexity in managing their APIs. Both internal services and third-party APIs had grown into a tangled web of direct integrations over time.
As more API consumers and providers were introduced, maintaining individual connections became cumbersome, prone to failure, and hard to scale or secure. These challenges called for a solution that would simplify communication between services, both internal and external, while ensuring better security, scalability, and manageability.
The client decided to implement a centralized API gateway as the connective layer, enabling routing, authentication, and load balancing, while abstracting away the complexity of direct integrations. That’s where Laposa stepped in to deliver a robust, scalable API architecture.
The Solution
We deployed Kong API Gateway, a cloud-native solution renowned for its high performance and extensibility via plugins. The solution runs natively on a Kubernetes cluster with horizontal auto-scaling to handle high traffic volumes, and is well suited to the client's consumer-facing applications.
Kong proved to be an ideal fit because it offers functionality for proxying, routing, load balancing, health checking, and authentication management. Specifically, Kong's robust routing and API orchestration capabilities addressed the client's need to manage both internal and third-party APIs through a centralized proxy. With approximately 82 APIs and services routed through the gateway and 67 distinct consumer entities interacting with them, managing traffic flow, access control, and performance became far more streamlined.
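To illustrate the routing model, each upstream is registered with Kong as a service with one or more routes. A minimal declarative sketch (the service name, URL, and path are hypothetical, not the client's actual configuration):

```yaml
_format_version: "3.0"
services:
  # Hypothetical internal service proxied through the gateway
  - name: supplier-orders
    url: https://orders.internal.example.com
    routes:
      - name: supplier-orders-route
        paths:
          - /supplier/orders
```

Consumers call the gateway at `/supplier/orders` and never need to know where the upstream actually lives, which is what makes later changes to the service architecture painless.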
By serving as the central hub, Kong allows the client to decouple their services from individual API consumers or providers without creating complex interdependencies. Should the client face any changes in API or microservice architecture, there will be no need to untangle what would otherwise be a spider’s web of direct integrations.
Key Technical Features and Enhancements
During the deployment, several notable features of Kong were leveraged, including:
Kong's Horizontal Scalability on Kubernetes
Given the high-volume traffic—over 4 million API requests processed daily—we deployed Kong as part of a Kubernetes cluster with horizontal auto-scaling enabled. This ensures the API Gateway can rapidly scale to meet demand and maintain responsiveness under heavy loads.
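A Kubernetes horizontal pod autoscaler of roughly this shape drives that behavior (a sketch with illustrative replica counts and thresholds, not the client's actual values):

```yaml
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: kong-gateway
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: kong-gateway      # the Kong proxy Deployment
  minReplicas: 3            # baseline capacity
  maxReplicas: 10           # ceiling for traffic spikes
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70   # scale out before saturation
```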
Request Modification and Authentication Injection
For specific API calls, we utilized Kong's capabilities to modify HTTP headers, for example to enable cross-origin resource sharing (CORS). For certain APIs, Kong injected authentication tokens into requests, providing seamless, automated authentication for those services.
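Both behaviors map onto stock Kong plugins. A declarative sketch, with hypothetical route and service names and a placeholder token:

```yaml
plugins:
  # Allow a browser front end on another origin to call the API
  - name: cors
    route: partner-webapp-route        # hypothetical route
    config:
      origins:
        - https://app.example.com      # illustrative origin
      methods:
        - GET
        - POST
        - OPTIONS
  # Inject a credential so consumers never handle the upstream's token
  - name: request-transformer
    service: legacy-supplier-api       # hypothetical service
    config:
      add:
        headers:
          - "Authorization:Bearer <upstream-token>"  # placeholder secret
```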
DB-less Mode Migration
Initially, we used a PostgreSQL backend for storing configuration data. However, Kong later introduced a DB-less mode, which removed the need for a central database. We orchestrated a smooth migration to DB-less configuration, eliminating a potential single point of failure and further boosting performance by distributing configuration directly to each Kong node in the Kubernetes cluster. This shift not only simplified configuration management but also made the entire API ecosystem more resilient.
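In practice, the switch amounts to pointing Kong at a declarative configuration file instead of a database. An excerpt of the relevant container environment in a Kong Deployment (file path illustrative):

```yaml
# Container env in the Kong Deployment: DB-less mode reads all
# configuration from a mounted YAML file instead of PostgreSQL
env:
  - name: KONG_DATABASE
    value: "off"
  - name: KONG_DECLARATIVE_CONFIG
    value: /kong/declarative/kong.yml
```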
Security Measures
Security was a critical aspect of this deployment, considering that both internal systems and third-party services were routed through the API gateway. The approach involved multiple layers of security to ensure API traffic was tightly controlled and monitored:
Web Application Firewall (WAF) and Network Firewall
The client deployed a Web Application Firewall (WAF) in front of the API Gateway as the first line of defense, filtering HTTP requests and protecting against a variety of attacks, including SQL injection and cross-site scripting. A network firewall additionally monitors ingress and egress connections, controlling external traffic flows to keep communication secure.
Access Control Lists (ACLs)
Consumers in Kong were segmented into specific access groups, ensuring that every consumer could access only the services they were authorized to use. This was crucial for the client, whose API consumers include both internal teams and external third parties. Functional groups and strict ACL rules helped prevent unauthorized access to sensitive internal services.
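In Kong, this pattern pairs an authentication plugin with the `acl` plugin. A declarative sketch with hypothetical service, consumer, and group names:

```yaml
plugins:
  # Require a valid API key, then restrict the service to one group
  - name: key-auth
    service: internal-inventory-api    # hypothetical internal service
  - name: acl
    service: internal-inventory-api
    config:
      allow:
        - internal-teams               # only this group may call it
consumers:
  - username: warehouse-app            # hypothetical internal consumer
    acls:
      - group: internal-teams
```

A third-party consumer that authenticates but is not in the `internal-teams` group is rejected by the gateway before its request ever reaches the internal service.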
Custom Docker Images
We deployed Kong in custom Docker containers running with a non-root user for enhanced security. Using non-root users follows best DevOps security practices, reducing the likelihood of privilege escalation or container exploitation.
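At the Kubernetes level, the same policy can be enforced on the container itself. An illustrative container `securityContext` (the user ID is a placeholder):

```yaml
# Container securityContext enforcing the non-root policy
securityContext:
  runAsNonRoot: true               # refuse to start as UID 0
  runAsUser: 1000                  # illustrative unprivileged UID
  allowPrivilegeEscalation: false  # block setuid-style escalation
  readOnlyRootFilesystem: true     # limit damage from a compromise
```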
Monitoring and Observability
To provide continuous, real-time monitoring and reporting for the client, we implemented a robust observability stack anchored by DataDog and Azure Event Hubs.
DataDog Integration
We customized Kong's HTTP Logger plugin to send real-time logs to DataDog. These logs included key metrics such as the number of API requests, errors, and latency across various services. Several dashboards were created in DataDog to monitor critical services at a glance. These dashboards provided insights into:
- Total number of successful and failed requests
- Latency and request duration
- Daily traffic patterns (e.g. peak API request times)
- Error distribution and root cause analysis
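The stock `http-log` plugin (the basis for the customization described above) can be pointed at DataDog's HTTP log intake. A sketch with a placeholder API key; the customized fields are not shown:

```yaml
plugins:
  # Ship request/response logs to DataDog's HTTP intake in real time
  - name: http-log
    config:
      http_endpoint: https://http-intake.logs.datadoghq.com/api/v2/logs
      method: POST
      headers:
        DD-API-KEY: "<datadog-api-key>"   # placeholder credential
```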
Alerts and Anomaly Detection
In addition to monitoring, we integrated DataDog Monitors to trigger automated alerts when metrics exceeded predefined thresholds, such as slow request times or persistent errors from a service. This alerting enables both Laposa and the client to take corrective action proactively, minimizing service degradation.
Azure Event Hubs Logging
In parallel with DataDog, we also streamed logs to Azure Event Hubs, where the client's internal teams could process and analyze them further using in-house tools. This dual logging approach provided redundancy and let internal teams analyze logs with whichever tools they preferred.
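The same plugin pattern covers this second stream: Event Hubs exposes an HTTPS ingestion endpoint that another `http-log` instance can target (namespace, hub name, and SAS token below are placeholders):

```yaml
plugins:
  # Second log stream: Azure Event Hubs HTTPS ingestion endpoint
  - name: http-log
    config:
      http_endpoint: https://<namespace>.servicebus.windows.net/<hub>/messages
      method: POST
      headers:
        Authorization: "SharedAccessSignature <sas-token>"  # placeholder
```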
Results and Benefits
Since the deployment of the Kong API Gateway, the client has experienced significant improvements in both operational efficiency and service reliability:
Improved Scalability
The Kubernetes-based horizontal scaling and DB-less configuration allow Kong to handle increasing traffic levels seamlessly, processing over 4 million API requests daily without impacting performance.
Simplified Integrations
By abstracting away underlying complexities, Kong facilitated clean integrations with third-party APIs and internal services. This reduced the operational overhead and minimized the risk of unforeseen outages stemming from any disruption in third-party API dependencies.
Enhanced Security
With a multi-layered security approach including both API-level access control and robust external firewalls, the client’s APIs are now better protected against external threats.
Improved Monitoring and Reliability
Through DataDog, the client’s teams now have complete visibility into the health and performance of the API ecosystem, ensuring they are quickly alerted to any anomalies, thereby helping them maintain system integrity and reliability.
Conclusion
Laposa’s successful deployment of Kong API Gateway has empowered the client to manage complex APIs and third-party services with greater agility, enhanced security, and improved telemetry.
The reduction in direct, unmanaged integrations has simplified their operational landscape, making it easier to scale, monitor, and enforce security standards across their API-driven architecture. By eliminating database dependency and ensuring scalable, high-speed service delivery, Laposa has fortified the client’s infrastructure for its current needs and future growth.
Next Step
If you're looking to improve how your organization manages APIs, streamline third-party integrations, and implement scalable solutions with robust security, Laposa can help. Whether you need to enhance security controls, simplify integrations, or boost observability, we have the experience and know-how to deliver.
Get in Touch