Chapter 4: Server Load Balancing

This chapter describes how the REST API is used to control the server load balancing (SLB) feature in cOS Core. When the REST API is used with SLB, the load-balanced servers can send their load information back to cOS Core so that connections can be allocated across the servers appropriately. This means that a new connection can be allocated to the server with the least load.

The need for this feature usually comes from the uneven load distribution across servers that can arise due to "stickiness" with the cOS Core SLB feature. Stickiness can mean that connections from a particular client are always assigned to the same server, even though SLB has tried to allocate the load evenly.

Having the REST API feature for SLB means that cOS Core is aware of the actual load experienced by each server, so allocation decisions can be based on more than just the number of connections.

Configuring SLB to make use of the REST API consists of the following steps:

  1. Create custom software that runs on each of the servers to be balanced. This software uses the REST API to send load information back to cOS Core. The server load is expressed as a single integer percentage between 0 and 100 (one possible way of deriving this value is sketched after these steps).

  2. Enable REST API access by creating the appropriate Remote Management object in the cOS Core configuration. Doing this is described in Chapter 1, Introduction.

  3. Create a cOS Core Server Load Balancing Rule object which uses Resource-usage as its distribution method. The Server Identifier string property of this rule must be included in the POST messages sent by the servers so that, together with the server's IP address, the messages can be associated with the rule.

    Doing this is described further in the server load balancing section of the separate cOS Core Administration Guide.
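
How the load percentage in step 1 is derived is entirely up to the custom software; cOS Core only requires an integer between 0 and 100. As one possible illustration (this is not part of the cOS Core API), the software could report CPU utilization, here measured in Python with the third-party psutil library. Sending the value to cOS Core is described in Section 4.1 below.

  import psutil

  def current_load():
      # Sample CPU utilization over one second and round it to the
      # integer percentage (0-100) that cOS Core expects.
      return int(round(psutil.cpu_percent(interval=1)))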

4.1. Sending the Server Load to cOS Core Using POST

To send the current server load to cOS Core, perform an HTTP POST to the following URI:

/api/oper/slb

The POST message body can contain the following parameters:

  • serverID - A string which is identical to the Server Identifier property of an SLB Rule in the receiving cOS Core configuration. This is mandatory. Along with the serverIP parameter, it associates an SLB Rule with the server.

  • serverIP - The IP address of the sending server. This is mandatory. Along with the serverID parameter, it associates an SLB Rule with the server.

  • maintenance - This can take the value yes or no and indicates whether the server is in service. It provides a means for the server to temporarily make itself unavailable by sending the value yes. A value of no means the server is available and will make it available to cOS Core for balancing if it is not already. The parameter can be omitted when a load is being reported and the online status is not changing.

  • load - This is an integer representing the current server load as a percentage. This parameter can be omitted if the message is changing the maintenance value (either going offline or coming back online). If it is omitted when the server is going online, cOS Core will assume a load of zero.

Note: maintenance and/or load must be present

All SLB POST messages must have either maintenance or load present, or both. The message is invalid if neither is present.

Both can be present when a server is reporting its load and changing its status at the same time, for example when it comes back online and reports its load.

For example, to send a load value of 31% for a server with an IP address of 192.168.1.10 and an associated SLB rule identifier of serverA, the POST body would contain the following:

serverID=serverA&serverIP=192.168.1.10&maintenance=no&load=31
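
As a sketch of how the custom reporting software might send this message, the following Python snippet performs the POST using the third-party requests library. The firewall address shown is a placeholder, and any TLS or authentication settings depend on the Remote Management object described in Chapter 1, Introduction.

  import requests

  # Placeholder management address of the cOS Core firewall; adjust it to
  # match the Remote Management object from Chapter 1, including any
  # credentials or certificate settings that configuration requires.
  COS_CORE_SLB_URI = "https://192.168.1.1/api/oper/slb"

  def report_to_cos_core(server_id, server_ip, load=None, maintenance=None):
      """Send a form-encoded SLB POST with the parameters described above."""
      body = {"serverID": server_id, "serverIP": server_ip}
      if maintenance is not None:
          body["maintenance"] = "yes" if maintenance else "no"
      if load is not None:
          body["load"] = str(load)  # integer percentage between 0 and 100
      response = requests.post(COS_CORE_SLB_URI, data=body, timeout=5)
      response.raise_for_status()

  # Reproduce the example above: serverA at 192.168.1.10 reporting 31% load.
  report_to_cos_core("serverA", "192.168.1.10", load=31, maintenance=False)

  # Take the same server temporarily out of balancing; no load value is
  # needed when only the maintenance state is changing.
  report_to_cos_core("serverA", "192.168.1.10", maintenance=True)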