How to Manage Forwarding Rules on a Vultr Load Balancer

Updated on 10 September, 2025

Learn how to create, modify, and delete forwarding rules to control traffic distribution on your Vultr Load Balancer.


Forwarding rules on a Vultr Load Balancer define how incoming traffic is directed to your backend servers. Each rule specifies a frontend protocol and port on the Load Balancer, and the backend protocol and port on the instances that requests are routed to. By configuring forwarding rules, you can control and optimize how traffic is distributed, improving the performance and reliability of your load balancing setup.
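
For example, a rule that accepts HTTPS traffic on port 443 and forwards it to the instances over HTTP on port 8080 maps to the following fields, shown here in the JSON shape used by the Vultr API later in this guide (the port values are illustrative):

    json
    {
        "frontend_protocol": "https",
        "frontend_port": 443,
        "backend_protocol": "http",
        "backend_port": 8080
    }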

Follow this guide to manage forwarding rules on a Vultr Load Balancer using the Vultr Customer Portal, API, CLI, or Terraform.

Vultr Customer Portal

  1. Navigate to Products and click Load Balancers.

  2. Click your target Load Balancer to open its management page.

  3. Scroll down to Rules and click the pencil icon in the Forwarding Rules section.

  4. Review or modify the existing rules, or click Add to create a new rule.

  5. Set the LB Protocol / Port and Instance Protocol / Port.

    • HTTP: Forwards traffic using the Hypertext Transfer Protocol (HTTP).
    • HTTPS: Forwards traffic using the Hypertext Transfer Protocol Secure (HTTPS). Enabling HTTPS rules requires an SSL certificate attached to the Vultr Load Balancer.
    • TCP: Forwards traffic using the Transmission Control Protocol (TCP).
  6. Save the changes.

Vultr API

  1. Send a GET request to the List Load Balancers endpoint and note the target Load Balancer's ID.

    console
    $ curl "https://api.vultr.com/v2/load-balancers" \
        -X GET \
        -H "Authorization: Bearer ${VULTR_API_KEY}"
    
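    If you have jq installed, you can extract just the Load Balancer IDs from the response. This is a minimal sketch and assumes the list response wraps the results in a load_balancers array:

    console
    $ curl "https://api.vultr.com/v2/load-balancers" \
        -X GET \
        -H "Authorization: Bearer ${VULTR_API_KEY}" | jq -r '.load_balancers[].id'
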
  2. Send a POST request to the Create Forwarding Rule endpoint to add a new forwarding rule to the target Load Balancer.

    console
    $ curl "https://api.vultr.com/v2/load-balancers/{load-balancer-id}/forwarding-rules" \
        -X POST \
        -H "Authorization: Bearer ${VULTR_API_KEY}" \
        -H "Content-Type: application/json" \
        --data '{
            "frontend_protocol": "{load-balancer-protocol}",
            "frontend_port": {load-balancer-port},
            "backend_protocol": "{instance-protocol}",
            "backend_port": {instance-port}
        }'
    
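    For example, the following request creates a rule that accepts HTTPS traffic on port 443 and forwards it to the instances over HTTP on port 8080 (the Load Balancer ID remains a placeholder). Because the frontend protocol is HTTPS, the Load Balancer must have an SSL certificate attached.

    console
    $ curl "https://api.vultr.com/v2/load-balancers/{load-balancer-id}/forwarding-rules" \
        -X POST \
        -H "Authorization: Bearer ${VULTR_API_KEY}" \
        -H "Content-Type: application/json" \
        --data '{
            "frontend_protocol": "https",
            "frontend_port": 443,
            "backend_protocol": "http",
            "backend_port": 8080
        }'
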
  3. Send a GET request to the List Forwarding Rules endpoint to view the forwarding rules of the target Load Balancer.

    console
    $ curl "https://api.vultr.com/v2/load-balancers/{load-balancer-id}/forwarding-rules" \
        -X GET \
        -H "Authorization: Bearer ${VULTR_API_KEY}"
    
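    To remove a rule, note its id from the list response and send a DELETE request to the Delete Forwarding Rule endpoint. This is a sketch with placeholder IDs; substitute the actual Load Balancer and rule IDs:

    console
    $ curl "https://api.vultr.com/v2/load-balancers/{load-balancer-id}/forwarding-rules/{forwarding-rule-id}" \
        -X DELETE \
        -H "Authorization: Bearer ${VULTR_API_KEY}"
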
Vultr CLI

  1. List all Load Balancers and note the target Load Balancer's ID.

    console
    $ vultr-cli load-balancer list
    
  2. Add a new forwarding rule to the target Load Balancer.

    console
    $ vultr-cli load-balancer forwarding create <load-balancer-id> --frontend-protocol "<load-balancer-protocol>" --frontend-port <load-balancer-port> --backend-protocol "<instance-protocol>" --backend-port <instance-port>
    
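    For example, to forward HTTPS traffic received on port 443 to HTTP port 8080 on the instances (replace <load-balancer-id> with the ID noted in the previous step; an SSL certificate must already be attached to the Load Balancer for an HTTPS frontend):

    console
    $ vultr-cli load-balancer forwarding create <load-balancer-id> --frontend-protocol "https" --frontend-port 443 --backend-protocol "http" --backend-port 8080
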
  3. List all forwarding rules for the target Load Balancer.

    console
    $ vultr-cli load-balancer forwarding list <load-balancer-id>
    
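    To remove a rule, the CLI provides a delete subcommand in the same command group. The argument order below (Load Balancer ID followed by the rule ID) is an assumption; confirm the exact syntax with vultr-cli load-balancer forwarding --help.

    console
    # Assumed syntax; verify with: vultr-cli load-balancer forwarding --help
    $ vultr-cli load-balancer forwarding delete <load-balancer-id> <rule-id>
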
Terraform

  1. Open your Terraform configuration for the existing Load Balancer.

  2. Add or update forwarding_rules blocks to declare the frontend (listener) and backend mapping for each rule.

    terraform
    resource "vultr_load_balancer" "lb" {
        # ...existing fields (region, label, health_check, etc.)
    
        forwarding_rules {
            frontend_protocol = "http"
            frontend_port     = 80
            backend_protocol  = "http"
            backend_port      = 8080
        }
    
        forwarding_rules {
            frontend_protocol = "https"
            frontend_port     = 443
            backend_protocol  = "https"
            backend_port      = 8443
        }
    }
    
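    Optionally, run terraform plan first to preview the change. Because forwarding_rules are nested blocks on the existing vultr_load_balancer resource, the plan should show an in-place update of that resource rather than a replacement:

    console
    $ terraform plan
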
  3. Apply the configuration with terraform apply and confirm that Terraform reports an in-place update similar to the following:

    Apply complete! Resources: 0 added, 1 changed, 0 destroyed.
