
API Gateway vs Load Balancer

API gateways and load balancers both route HTTP requests to backend resources, but they operate in different ways. We'll walk through the key differences between the two, as well as how to use them together.

What is a load balancer?

A load balancer is a tool that distributes incoming requests among a group of servers or resources. It acts as a central interface for clients to reach backend services over HTTP. A load balancer may have multiple endpoints, each of which points to a specific resource such as an EC2 instance or a Lambda function. When a client makes a request to one of these endpoints, the load balancer is responsible for routing the request to a healthy backend resource and then returning the response to the client.
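To make this concrete, here is a minimal boto3 sketch that creates an Application Load Balancer target group, registers two EC2 instances as targets, and attaches a client-facing listener. All names, IDs, subnets, and the health check path are hypothetical placeholders, not values from this article.

```python
# Minimal sketch (boto3), assuming an existing VPC, subnets, and EC2 instances.
import boto3

elbv2 = boto3.client("elbv2")

# A target group collects the backend resources the load balancer can route to.
target_group = elbv2.create_target_group(
    Name="web-servers",
    Protocol="HTTP",
    Port=80,
    VpcId="vpc-0123456789abcdef0",
    HealthCheckPath="/health",      # only healthy targets receive traffic
)
tg_arn = target_group["TargetGroups"][0]["TargetGroupArn"]

# Point the target group at specific EC2 instances.
elbv2.register_targets(
    TargetGroupArn=tg_arn,
    Targets=[{"Id": "i-0aaa1111bbb22222c"}, {"Id": "i-0ddd3333eee44444f"}],
)

# The listener is the client-facing endpoint; it forwards requests to the group.
alb = elbv2.create_load_balancer(
    Name="public-alb",
    Subnets=["subnet-aaa111", "subnet-bbb222"],
)
elbv2.create_listener(
    LoadBalancerArn=alb["LoadBalancers"][0]["LoadBalancerArn"],
    Protocol="HTTP",
    Port=80,
    DefaultActions=[{"Type": "forward", "TargetGroupArn": tg_arn}],
)
```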

Load balancers are primarily used to distribute requests evenly across a horizontally scaled cluster, where multiple servers together provide enough capacity to handle the total demand. They also decouple clients from services: clients only need to know the load balancer's address, not the individual servers behind it, which is good practice from a cloud architecture perspective.
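The core idea behind that even distribution can be shown in a few lines of plain Python: rotate requests round-robin across whichever servers are currently passing health checks. This is an illustrative sketch with made-up addresses, not a production balancer.

```python
# Rotate requests evenly across the servers that are currently healthy.
import itertools

class RoundRobinBalancer:
    def __init__(self, servers):
        self.servers = servers            # e.g. ["10.0.1.10", "10.0.1.11", "10.0.1.12"]
        self.healthy = set(servers)       # updated by periodic health checks
        self._cycle = itertools.cycle(servers)

    def mark_unhealthy(self, server):
        self.healthy.discard(server)

    def next_server(self):
        # Skip servers that failed their health check.
        for _ in range(len(self.servers)):
            candidate = next(self._cycle)
            if candidate in self.healthy:
                return candidate
        raise RuntimeError("no healthy backends available")

balancer = RoundRobinBalancer(["10.0.1.10", "10.0.1.11", "10.0.1.12"])
print([balancer.next_server() for _ in range(4)])  # rotates evenly through the pool
```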

What is an API gateway?

An API gateway is a tool that can also manage and balance network traffic, but it does so differently than a load balancer. Instead of distributing requests evenly among a group of backend resources, an API gateway routes each request to a specific resource based on the endpoint being requested. This is especially useful in microservices architectures, where multiple services can be connected to the gateway and mapped to specific HTTP endpoints. The gateway is then responsible for routing each request to the appropriate backend service.
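Here is a sketch of that path-based routing using Amazon API Gateway's HTTP APIs via boto3. The API name, route keys, and backend service URLs are hypothetical; in a real system each URL would point at a separate microservice.

```python
# Map HTTP endpoints to backend services with an HTTP API (boto3 sketch).
import boto3

apigw = boto3.client("apigatewayv2")

api = apigw.create_api(Name="storefront", ProtocolType="HTTP")

services = {
    "ANY /orders/{proxy+}":   "https://orders.internal.example.com/{proxy}",
    "ANY /payments/{proxy+}": "https://payments.internal.example.com/{proxy}",
}

for route_key, backend_url in services.items():
    integration = apigw.create_integration(
        ApiId=api["ApiId"],
        IntegrationType="HTTP_PROXY",   # pass the request through to the backend
        IntegrationMethod="ANY",
        IntegrationUri=backend_url,
        PayloadFormatVersion="1.0",
    )
    # Requests matching the route key are sent to the matching integration.
    apigw.create_route(
        ApiId=api["ApiId"],
        RouteKey=route_key,
        Target=f"integrations/{integration['IntegrationId']}",
    )
```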

API gateways also scale differently than load balancers. Amazon API Gateway, for example, scales automatically to meet demand and handles up to 10,000 requests per second by default (a per-region, account-level quota), which makes it a common front door for AWS Lambda backends. When larger spikes in demand are expected, you may need to request a service quota increase to prevent client requests from being throttled.
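Within that account-level quota, you can also set stage-level throttling so a traffic spike degrades gracefully instead of overwhelming the backend. A minimal boto3 sketch, with a hypothetical API ID and limits:

```python
# Cap request rates for a stage of an HTTP API (boto3 sketch).
import boto3

apigw = boto3.client("apigatewayv2")

apigw.update_stage(
    ApiId="a1b2c3d4e5",
    StageName="prod",
    DefaultRouteSettings={
        "ThrottlingRateLimit": 5000.0,   # steady-state requests per second
        "ThrottlingBurstLimit": 2000,    # short burst capacity
    },
)
```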

API gateways also offer a number of features that most load balancers do not. These include authentication and authorization, API token issuance and management, SDK generation based on the API structure, and throttling to prevent abuse or to restrict access based on billing plans. They also integrate with IAM, which makes it easier to control access to the underlying resources.
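As one example of the token and billing-plan features, Amazon API Gateway's REST APIs support usage plans and API keys. The sketch below creates a rate-limited "free tier" plan and issues a key for a customer; the API ID, stage name, and limits are hypothetical placeholders.

```python
# Per-plan throttling and API key issuance with a REST API (boto3 sketch).
import boto3

apigw = boto3.client("apigateway")

# A "free tier" plan: limited request rate plus a monthly quota.
plan = apigw.create_usage_plan(
    name="free-tier",
    apiStages=[{"apiId": "a1b2c3d4e5", "stage": "prod"}],
    throttle={"rateLimit": 10.0, "burstLimit": 20},
    quota={"limit": 100000, "period": "MONTH"},
)

# Issue a key for a customer and attach it to the plan.
key = apigw.create_api_key(name="customer-42", enabled=True)
apigw.create_usage_plan_key(
    usagePlanId=plan["id"],
    keyId=key["id"],
    keyType="API_KEY",
)
```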

How would you use them together?

API gateways and load balancers can be used together in a system architecture. A common pattern is to put the API gateway in front to receive incoming requests and have it forward traffic to a load balancer, which then distributes the requests across a group of resources.

For example, a client might send a request to the API gateway endpoint for a specific service. The API gateway could then forward the request to a load balancer, which would distribute the request among a group of servers or resources that are capable of handling the request. The load balancer would receive the response from the resource, package it into an HTTP response, and send it back to the API gateway. The API gateway would then pass the response back to the client.
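One way to wire this up on AWS is an HTTP API with a private integration: the gateway sends matching requests through a VPC link to an internal Application Load Balancer, which spreads them across its registered targets. The sketch below assumes that setup; the API ID, subnets, security group, and listener ARN are all placeholders.

```python
# API Gateway -> VPC link -> internal ALB (boto3 sketch).
import boto3

apigw = boto3.client("apigatewayv2")

vpc_link = apigw.create_vpc_link(
    Name="internal-services",
    SubnetIds=["subnet-aaa111", "subnet-bbb222"],
    SecurityGroupIds=["sg-0123456789abcdef0"],
)

integration = apigw.create_integration(
    ApiId="a1b2c3d4e5",
    IntegrationType="HTTP_PROXY",
    IntegrationMethod="ANY",
    ConnectionType="VPC_LINK",
    ConnectionId=vpc_link["VpcLinkId"],
    # For private integrations, the URI is the ALB listener's ARN.
    IntegrationUri="arn:aws:elasticloadbalancing:us-east-1:123456789012:listener/app/internal-alb/abc/def",
    PayloadFormatVersion="1.0",
)

apigw.create_route(
    ApiId="a1b2c3d4e5",
    RouteKey="ANY /orders/{proxy+}",
    Target=f"integrations/{integration['IntegrationId']}",
)
```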

Using an API gateway in combination with a load balancer can provide the benefits of both tools. The API gateway can be used to handle routing, authentication, and other frontend tasks, while the load balancer can handle distributing requests and scaling resources in the backend.

Conclusion

API gateways are critical when deploying a sophisticated microservices architecture, while load balancers are simpler and shine when you need to handle high load through horizontal scaling. It's entirely possible you'll need both together if your microservice-based application has to handle high load.