Vultr Load Balancer Quickstart Guide
Introduction
The Vultr Load Balancer is a fully managed solution that distributes traffic across multiple application servers. With a Vultr Load Balancer, you can enable horizontal scaling and increase the reliability of your applications in seconds by setting a few parameters in the customer portal. You don't need to worry about the underlying load balancer's operating system, configuration files, or system management tasks. We handle all the details so you can focus on your application.
What is a Load Balancer?
Assume you have an e-commerce store with a single web server.
As your store becomes more popular, you need to scale your site to handle the traffic. You could move to a more powerful web server, but a better solution is often to run multiple servers behind a load balancer. If you add two more web servers and a Vultr Load Balancer, the load balancer distributes incoming traffic across all three servers.
Using a load balancer has several advantages.
- You can scale your application up by deploying more web servers as traffic grows.
- You can scale your application down by removing web servers when traffic drops.
- The Load Balancer detects failed web servers and stops routing traffic to them, improving your application's availability.
Vultr Load Balancers support custom health checks, multiple load balancing algorithms, sticky sessions, proxy protocol, SSL certificates, firewalls, private networks, and more. Vultr Load Balancers work with all our server products, including Bare Metal.
Load balancers are effective for applications that can scale with multiple parallel instances. They distribute the load but don't address file synchronization or database consistency between your application instances.
How to Deploy a Load Balancer
To deploy a new Vultr Load Balancer, navigate to the Add Load Balancer page in the customer portal.
Choose a location. Your load balancer and all instances attached to that load balancer must be in the same location.
Choose a Load Balancer Configuration.
- Enter a label of your choice for this load balancer.
- Choose an algorithm. The default, Roundrobin, selects servers in turn without regard for traffic. Leastconn selects the server with the fewest active connections. A toy illustration of both follows this list.
- If you use Sticky Sessions, your application must manage the persistence cookies; a minimal sketch follows this list. See the HAProxy documentation for more information.
- If you redirect all traffic from HTTP to HTTPS, you must use an HTTPS rule and SSL certificate.
- If you enable Proxy protocol, you must also configure your backend nodes to accept Proxy protocol, as shown in the example after this list.
- Enter the number of nodes for this load balancer.
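To make the algorithm choice concrete, here is a toy illustration of the two strategies in Python. This is not Vultr's implementation; the server addresses and connection counts are made up.

```python
import itertools

# Toy illustration only; the addresses and connection counts are made up.
servers = ["10.1.1.1", "10.1.1.2", "10.1.1.3"]

# Roundrobin: each new connection goes to the next server in turn.
round_robin = itertools.cycle(servers)
print([next(round_robin) for _ in range(5)])
# ['10.1.1.1', '10.1.1.2', '10.1.1.3', '10.1.1.1', '10.1.1.2']

# Leastconn: each new connection goes to the server with the fewest active connections.
active_connections = {"10.1.1.1": 12, "10.1.1.2": 3, "10.1.1.3": 7}
print(min(active_connections, key=active_connections.get))  # 10.1.1.2
```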
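For Sticky Sessions, the application issues the persistence cookie itself. Below is a minimal sketch, assuming Flask and a cookie named lb_session; the cookie name is a placeholder and must match whatever you configure on the load balancer.

```python
from flask import Flask, make_response

app = Flask(__name__)

@app.route("/")
def index():
    resp = make_response("hello from this backend")
    # The application issues the persistence cookie; the load balancer uses it
    # to route the client back to the same instance. "lb_session" is a placeholder.
    resp.set_cookie("lb_session", "backend-1")
    return resp

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```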
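Accepting Proxy protocol means the backend must read and strip the extra header the load balancer prepends to each connection before the application data. Most web servers have a built-in setting for this; the sketch below shows the idea by hand for Proxy protocol version 1, with port 8080 and the canned HTTP response as illustrative assumptions.

```python
import socket

# Minimal sketch: a TCP backend that strips a Proxy protocol v1 header before
# handling the request. The port and canned response are illustrative only.
def handle(conn):
    data = conn.recv(4096)
    if data.startswith(b"PROXY "):
        header, _, data = data.partition(b"\r\n")
        # v1 header: PROXY TCP4 <client-ip> <lb-ip> <client-port> <lb-port>
        fields = header.decode("ascii", "replace").split()
        client_ip = fields[2] if len(fields) >= 6 else "unknown"
        print("Real client IP from the Proxy protocol header:", client_ip)
    # ... handle the actual application payload in `data` ...
    conn.sendall(b"HTTP/1.1 200 OK\r\nContent-Length: 2\r\nConnection: close\r\n\r\nOK")
    conn.close()

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("0.0.0.0", 8080))
    server.listen()
    while True:
        connection, _ = server.accept()
        handle(connection)
```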
Create at least one forwarding rule. Do not use port 22 or ports 65300-65310; the Load Balancer reserves these for internal use.
HTTP2 defaults to Off. To enable it, you must add at least one HTTPS-to-HTTPS forwarding rule.
VPC Network defaults to Public. If you prefer sending traffic to your instances via their attached VPC, choose that here.
Firewall rules are optional.
Health checks allow the load balancer to determine if an instance is ready to receive traffic.
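For example, an HTTP health check only needs your application to answer the configured path with a 200 status. Here is a minimal sketch in Python; the /healthz path and port 8080 are placeholders for whatever you configure.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Minimal health check endpoint. Match the path and port to the values you
# set on the load balancer's health check.
class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/healthz":
            body = b"OK"
            self.send_response(200)  # the load balancer expects HTTP 200
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

HTTPServer(("0.0.0.0", 8080), HealthHandler).serve_forever()
```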
When you have completed the form, click the Add Load Balancer button to deploy.
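If you prefer to script deployments, the same settings can be sent through the Vultr API. The sketch below is a minimal example that assumes API v2's POST /v2/load-balancers endpoint and the field names shown; verify them against the current API reference and replace the region, label, and ports with your own values.

```python
import os

import requests

# Sketch: create a load balancer with one forwarding rule and a health check.
# Endpoint and field names assume Vultr API v2; confirm them in the API docs.
API_URL = "https://api.vultr.com/v2/load-balancers"
headers = {"Authorization": f"Bearer {os.environ['VULTR_API_KEY']}"}

payload = {
    "region": "ewr",                      # must match the region of your instances
    "label": "example-lb",
    "balancing_algorithm": "roundrobin",  # or "leastconn"
    "forwarding_rules": [
        {
            "frontend_protocol": "http",
            "frontend_port": 80,
            "backend_protocol": "http",
            "backend_port": 8080,         # avoid 22 and 65300-65310
        }
    ],
    "health_check": {
        "protocol": "http",
        "port": 8080,
        "path": "/healthz",
    },
}

response = requests.post(API_URL, json=payload, headers=headers, timeout=30)
response.raise_for_status()
print(response.json())
```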
How to Attach Instances to a Load Balancer
After the load balancer deploys, navigate to the Load Balancer section, click the three-dot More menu, and click Manage.
On the Manage Load Balancer page, click the Add Instance button, then select an available instance from your location.
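You can also script attachment. The sketch below assumes the Vultr API v2 update endpoint (PATCH /v2/load-balancers/{load-balancer-id}) accepts an instances list of instance IDs; the IDs are placeholders, and you should confirm the request shape in the current API reference.

```python
import os

import requests

# Sketch: attach instances by ID instead of using the portal. The load balancer
# and instance IDs are placeholders; the request shape assumes Vultr API v2.
LOAD_BALANCER_ID = "your-load-balancer-id"
INSTANCE_IDS = ["first-instance-id", "second-instance-id"]

response = requests.patch(
    f"https://api.vultr.com/v2/load-balancers/{LOAD_BALANCER_ID}",
    json={"instances": INSTANCE_IDS},
    headers={"Authorization": f"Bearer {os.environ['VULTR_API_KEY']}"},
    timeout=30,
)
response.raise_for_status()  # a successful update returns no body
```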
Advanced Configurations
Integrated Firewall
The Vultr Load Balancer has an integrated firewall. You can learn more in our article How to Use the Vultr Load Balancer Firewall.
Using the Vultr Firewall with a Load Balancer
The Vultr Firewall can use a Load Balancer as an IP source. We explain more in How to Use the Vultr Firewall with a Vultr Load Balancer.
Private Networks and Multiple Firewalls
Explore an advanced scenario with private networking and multiple firewalls in How to Configure a Vultr Load Balancer with Private Networking.
Feature Reference
This guide is an overview of Load Balancer concepts. If you need more information about a specific feature, the Vultr Load Balancer Feature Reference has detailed configuration information.
Frequently Asked Questions
What are nodes in load balancers?
Nodes allow you to scale your load balancer to handle more traffic. With more nodes, you can handle more concurrent connections and requests per second.
How many connections are allowed?
Load balancers support up to 15,000 simultaneous connections per node.
How many nodes can I have in a load balancer?
A load balancer supports up to 99 nodes, and the node count must be an odd number. This allows automatic failover in the event of a node failure.
Can I use a load balancer for servers in multiple regions?
No. A load balancer can only direct traffic to server instances in the same location as the load balancer itself.
Can I use my load balancer in one location with instances in a different location?
Unfortunately not. Load Balancers and attached instances must be in the same location.
My servers are working; why is my health check failing?
- If using the HTTP or HTTPS protocol, confirm the port and URL path are correct. The health check expects an HTTP 200 OK response; any other status code marks the node unhealthy.
- If using the TCP protocol, confirm that the configured port is open on the attached node. The sketch below shows a quick way to test both cases.
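A quick way to reproduce both checks from another machine, assuming Python with the requests library; the node address, port, and /healthz path are placeholders for your own values:

```python
import socket

import requests

# Placeholders: use your node's address and the port/path configured on the
# load balancer's health check.
NODE = "192.0.2.10"
PORT = 8080

# HTTP/HTTPS check: anything other than a 200 response counts as unhealthy.
resp = requests.get(f"http://{NODE}:{PORT}/healthz", timeout=5)
print("healthy" if resp.status_code == 200 else f"unhealthy ({resp.status_code})")

# TCP check: the port only needs to accept a connection.
with socket.create_connection((NODE, PORT), timeout=5):
    print("TCP port is open")
```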
How is bandwidth charged?
Vultr Load Balancers are bandwidth neutral. We only charge for bandwidth on the instances attached to the load balancer.
How do I attach instances to my Vultr Load Balancer?
You can attach and remove instances from your Load Balancer in the Vultr customer portal.
How do I manage my Load Balancer?
You do not need to manage the Vultr Load Balancer software yourself; Load Balancers are fully managed.
What protocols do you support?
Vultr Load Balancers support TCP, HTTP, and HTTPS.
More Information
- For comprehensive Load Balancer documentation, see the Load Balancer Feature Reference.
- Learn how to configure wildcard SSL on your load balancer.
- Learn about limits to the number of forwarding rules.