Best Practices for Performance on DigitalOcean Load Balancers

DigitalOcean Load Balancers are a fully-managed, highly available network load balancing service. Load balancers distribute traffic to groups of Droplets, which decouples the overall health of a backend service from the health of a single server to ensure that your services stay online.

Here are some recommendations on how to get the best performance from your load balancers based on your use case and application architecture.

Use HTTP/2

When Should I Do This?

In most production workloads, HTTP/2 will outperform HTTP and HTTPS (which use HTTP/1.x) due to its multiplexing and more efficient connection handling. We recommend using it unless there is a clear case for HTTP or HTTPS.

How Does This Improve Performance?

HTTP/2 is a major update to the older HTTP/1.x protocol. It was designed primarily to reduce page load time and resource usage.

Its major features offer significant performance improvements; for example, HTTP/2 is binary (instead of text) and fully multiplexed, uses header compression, and has a prioritization mechanism for delivering files.

The IETF HTTP Working Group’s documentation on HTTP/2 is a good resource to learn more.

How Do I Implement This?

You can use HTTP/2 by setting your load balancer’s forwarding rules in the control panel. Additionally, load balancers can terminate HTTP/2 client connections, allowing them to function as gateways between HTTP/2 clients and HTTP/1.x applications. In other words, you can transition your existing applications to HTTP/2 without upgrading the backend apps on your Droplets from HTTP/1.x.
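If you prefer to manage forwarding rules programmatically rather than through the control panel, the same setting is exposed by the API. Below is a minimal sketch using DigitalOcean’s godo Go client; the load balancer ID, certificate ID, and environment variable name are placeholders you would replace with your own values.

```go
package main

import (
	"context"
	"log"
	"os"

	"github.com/digitalocean/godo"
)

func main() {
	// Assumes your API token is in the DIGITALOCEAN_TOKEN environment variable (adjust as needed).
	client := godo.NewFromToken(os.Getenv("DIGITALOCEAN_TOKEN"))
	ctx := context.Background()

	// Placeholder IDs: replace with your load balancer and certificate IDs.
	lbID := "your-load-balancer-id"
	certID := "your-certificate-id"

	// Forward HTTP/2 from clients on port 443 to HTTP/1.x backends on port 80.
	// The load balancer terminates TLS and HTTP/2, so the Droplets keep serving HTTP/1.x.
	rule := godo.ForwardingRule{
		EntryProtocol:  "http2",
		EntryPort:      443,
		TargetProtocol: "http",
		TargetPort:     80,
		CertificateID:  certID,
	}

	if _, err := client.LoadBalancers.AddForwardingRules(ctx, lbID, rule); err != nil {
		log.Fatalf("adding forwarding rule: %v", err)
	}
	log.Println("HTTP/2 forwarding rule added")
}
```

Because the load balancer terminates HTTP/2 at the edge, your backend applications continue to receive plain HTTP/1.x requests and need no changes.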

Monitor the Performance of Your Droplets

When Should I Do This?

Monitoring provides critical performance insights and should be part of any production setup.

How Does This Improve Performance?

Performance issues are often caused by a lack of resources on the backend rather than by the load balancer itself or its configuration. Monitoring enables you to identify the bottlenecks affecting your infrastructure’s performance, including when your workload is overloading your Droplets, so you can make the most impactful changes.

How Do I Implement This?

There are a number of ways to monitor performance. One place to start is with DigitalOcean Monitoring, a free, opt-in service that gives you information on your infrastructure’s resource usage.

You can start by looking at the default Droplet Graphs and setting up the DigitalOcean Agent to get more information on CPU, memory, and disk utilization. If you find that you don’t have enough resources for your workload, you can scale your Droplets.
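If you want to pull the same Droplet metrics programmatically, for example to feed an existing dashboard or alerting system, the Monitoring API is available through the godo client. The sketch below is a rough example under the assumption of a recent godo version; the Droplet ID is a placeholder, and it simply prints the raw CPU time series for the last hour.

```go
package main

import (
	"context"
	"encoding/json"
	"fmt"
	"log"
	"os"
	"time"

	"github.com/digitalocean/godo"
)

func main() {
	// Assumes your API token is in the DIGITALOCEAN_TOKEN environment variable (adjust as needed).
	client := godo.NewFromToken(os.Getenv("DIGITALOCEAN_TOKEN"))
	ctx := context.Background()

	// Placeholder: the numeric ID of the Droplet you want to inspect, as a string.
	dropletID := "123456789"

	// Request CPU metrics for the last hour.
	metrics, _, err := client.Monitoring.GetDropletCPU(ctx, &godo.DropletMetricsRequest{
		HostID: dropletID,
		Start:  time.Now().Add(-1 * time.Hour),
		End:    time.Now(),
	})
	if err != nil {
		log.Fatalf("fetching CPU metrics: %v", err)
	}

	// Print the raw time series; in practice you would aggregate it or alert on it.
	out, _ := json.MarshalIndent(metrics, "", "  ")
	fmt.Println(string(out))
}
```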

Scale Droplets Horizontally or Vertically

When Should I Do This?

If your backend Droplets don’t have enough resources to keep up with your workload, you should consider scaling up or out.

How Does This Improve Performance?

It won’t matter how your load balancer distributes work among your Droplets if the total workload is too large for them to handle, so verify that your backend Droplet pool has sufficient resources.

There are two ways to scale: horizontally, which distributes work over more servers, and vertically, which increases the resources available to existing servers. Although load balancers facilitate horizontal scaling, both kinds of scaling will improve performance.

How Do I Implement This?

To scale horizontally, you can add more Droplets to your load balancer by navigating to a particular load balancer’s page in the control panel and clicking the Add Droplets button.
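The same operation is available through the API. Here is a brief sketch with the godo client, assuming you already know the load balancer ID and the numeric IDs of the Droplets you want to add; all of the IDs shown are placeholders.

```go
package main

import (
	"context"
	"log"
	"os"

	"github.com/digitalocean/godo"
)

func main() {
	// Assumes your API token is in the DIGITALOCEAN_TOKEN environment variable (adjust as needed).
	client := godo.NewFromToken(os.Getenv("DIGITALOCEAN_TOKEN"))
	ctx := context.Background()

	// Placeholders: your load balancer ID and the Droplets to add to its backend pool.
	lbID := "your-load-balancer-id"
	newDropletIDs := []int{111111111, 222222222}

	// Attach the new Droplets; the load balancer starts sending them traffic
	// once they pass health checks.
	if _, err := client.LoadBalancers.AddDroplets(ctx, lbID, newDropletIDs...); err != nil {
		log.Fatalf("adding Droplets to load balancer: %v", err)
	}
	log.Println("Droplets added to the load balancer")
}
```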

The kind of Droplets you use will impact performance as well, so make sure you choose the right Droplet for your application. For example, CPU-Optimized Droplets work best for computationally intensive workloads, like CI/CD and high-performance application servers.

To scale vertically, you can resize your existing Droplets to give them more RAM and CPU.
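Resizing can also be scripted. The sketch below uses godo’s Droplet actions; the Droplet ID and size slug are placeholders, and the final argument is false so only CPU and RAM are changed, which keeps the resize reversible.

```go
package main

import (
	"context"
	"log"
	"os"

	"github.com/digitalocean/godo"
)

func main() {
	// Assumes your API token is in the DIGITALOCEAN_TOKEN environment variable (adjust as needed).
	client := godo.NewFromToken(os.Getenv("DIGITALOCEAN_TOKEN"))
	ctx := context.Background()

	// Placeholders: the Droplet to resize and the target size slug.
	dropletID := 123456789
	newSize := "s-4vcpu-8gb"

	// The Droplet must be powered off before it can be resized.
	// Passing false resizes CPU and RAM only; passing true also grows the disk,
	// which makes the change permanent.
	action, _, err := client.DropletActions.Resize(ctx, dropletID, newSize, false)
	if err != nil {
		log.Fatalf("resizing Droplet: %v", err)
	}
	log.Printf("resize action started: %d", action.ID)
}
```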

Scale the Load Balancer

If you determine that your load balancer cannot maintain enough connections or distribute traffic quickly enough, you can scale the load balancer so that it can manage more connections at once.

When Should I Do This?

Scale the load balancer when its number of connections or requests per second is approaching its maximum. You can monitor the load balancer’s traffic patterns using the Graphs tab.

You may also want to increase the load balancer’s number of nodes if you are expecting an increase in traffic.

How Do I Implement This?

You can change the number of nodes your load balancer contains in the Scaling Configuration section of its Settings tab.
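The node count can also be changed through the API. The following is a rough sketch with the godo client; it assumes a godo version that exposes the SizeUnit field, and because the update endpoint replaces the whole configuration, it copies key settings from the existing load balancer before raising the node count. Treat the field names as assumptions and carry over any other settings you rely on.

```go
package main

import (
	"context"
	"log"
	"os"

	"github.com/digitalocean/godo"
)

func main() {
	// Assumes your API token is in the DIGITALOCEAN_TOKEN environment variable (adjust as needed).
	client := godo.NewFromToken(os.Getenv("DIGITALOCEAN_TOKEN"))
	ctx := context.Background()

	lbID := "your-load-balancer-id" // placeholder

	// Fetch the current configuration; the update call replaces it entirely.
	lb, _, err := client.LoadBalancers.Get(ctx, lbID)
	if err != nil {
		log.Fatalf("fetching load balancer: %v", err)
	}

	// Copy the existing settings and raise the node count to 4.
	req := &godo.LoadBalancerRequest{
		Name:            lb.Name,
		Region:          lb.Region.Slug,
		SizeUnit:        4,
		ForwardingRules: lb.ForwardingRules,
		DropletIDs:      lb.DropletIDs,
		// Carry over health checks, sticky sessions, and any other settings you use.
	}

	if _, _, err := client.LoadBalancers.Update(ctx, lbID, req); err != nil {
		log.Fatalf("updating load balancer: %v", err)
	}
	log.Println("load balancer scaled to 4 nodes")
}
```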