Thursday, March 4, 2010

Web Load Testing with Open-Source Tools (Linux)

There are many commercial tools for estimating the load your web servers can handle, but there are also several open-source tools that can do the job.

After my research, I found the following tools:

  1. Openload
  2. httperf
  3. ab
  4. siege

These tools are really nice and have plenty of options, but I was a little disappointed when I tried to use POST methods. Moreover, they only use the server IP to create connections, and the reports were very limited. If I wanted to simulate connections from several different source IPs, I had to write scripts. So I decided to look at curl as a starting point for scripting.
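
Still, for a quick single-URL smoke test they are handy. For example, ab (ApacheBench, shipped with the Apache httpd tools) hammers one URL with a fixed concurrency; the host and page below are placeholders:

#ab -n 1000 -c 10 http://webserver/index.php

Note that every request here targets one URL from one source IP, which is exactly the limitation I ran into.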

Curl is a very powerful command-line tool for making HTTP and HTTPS requests. Moreover, it supports POST data and cookies.

Example:

#curl -c cookie.txt -d "login=user&password=password" http://webserver/login.php

This command POSTs the login and password to the form and stores the resulting cookie in the cookie.txt file. You can then use that cookie to access restricted pages.

Example:

#curl -b cookie.txt http://webserver/search.php
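
Before I found curl-loader, my first attempt was a plain shell script around curl: each simulated client logs in once, then replays the search in a loop. A rough sketch (the host, form fields, and client count are placeholders taken from the examples above):

#!/bin/bash
# naive load generator: N parallel "clients", each logs in and then runs searches
N=10
REQUESTS=100
for i in $(seq 1 $N); do
  (
    jar=$(mktemp)                       # per-client cookie jar
    curl -s -c "$jar" -d "login=user&password=password" http://webserver/login.php > /dev/null
    for r in $(seq 1 $REQUESTS); do
      curl -s -b "$jar" http://webserver/search.php > /dev/null
    done
    rm -f "$jar"
  ) &                                   # run each client in the background
done
wait

It works, but there are no per-request statistics and every connection comes from the same source IP.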

Great tool, don't you think? Well, let's google for somebody who has already done load-testing scripts with curl.

Voila! I found a tool that is great for load testing: CURL-LOADER.

This tool has all the features that I need for testing:
  • Multiple clients.
  • Multiple IPs.
  • POST method available.
  • Complete sequence of page access.
  • Reports per client and totals.

Basically, you have to download the latest tarball from http://sourceforge.net/projects/curl-loader/files/ and make sure you have the GCC compiler, OpenSSL, and the OpenSSL development package installed.

Then do
$tar zxfv curl-loader-<version>.tar.gz
$cd curl-loader-<version>

$make optimize=1 debug=0
#make install
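
If the build and install succeed, the binary should be on your PATH; running it with no arguments should print the usage summary, which makes a quick sanity check:

#which curl-loader
#curl-loader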

The basic way to run the tool is:

#curl-loader -f script.conf

Where script.conf is the file that contains the parameters for the load test.
Some examples can be found in the following directory:

#cd /usr/share/doc/curl-loader/conf-examples/

Here is the example I used:

# cat script.conf
########### GENERAL SECTION ###################
#
BATCH_NAME=script-results
CLIENTS_NUM_MAX=255
CLIENTS_RAMPUP_INC=2
INTERFACE=eth0
NETMASK=22
IP_ADDR_MIN=10.100.0.0
IP_ADDR_MAX=10.100.0.254
IP_SHARED_NUM=255
URLS_NUM=4
CYCLES_NUM=200

########### URLs SECTION #######################

### Login URL - cycling

# GET-part
#
URL=http://webserver/login.php
URL_SHORT_NAME="Login-GET"
REQUEST_TYPE=GET
TIMER_URL_COMPLETION=5000
TIMER_AFTER_URL_SLEEP=1000

# POST-part
#
URL=""
URL_SHORT_NAME="Login-POST"
URL_USE_CURRENT=1
USERNAME=user
PASSWORD=pass
REQUEST_TYPE=POST
FORM_USAGE_TYPE=SINGLE_USER
FORM_STRING=username=%s&password=%s
TIMER_URL_COMPLETION=4000
TIMER_AFTER_URL_SLEEP=1000

### Cycling URL
#
URL=http://webserver/search.php?action=search&keyword=test
URL_SHORT_NAME="Service List"
REQUEST_TYPE=GET
TIMER_URL_COMPLETION=5000
TIMER_AFTER_URL_SLEEP=1000

### Logoff URL - cycling, uses GET and cookies to logoff.
#
URL=http://webserver/logout.php
URL_SHORT_NAME="Logoff"
REQUEST_TYPE=GET
TIMER_URL_COMPLETION=0
TIMER_AFTER_URL_SLEEP=1000

A quick explanation:

1- Results will be stored in files named after BATCH_NAME.
2- The machine will use 255 shared IPs; two new clients start every second, up to a maximum of 255 (CLIENTS_NUM_MAX).
3- Each client cycles through the 4 URLs: the login page (GET), the login POST with user and password, a search, and the logout.
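
Then run the batch and watch the counters. With BATCH_NAME=script-results I would expect the result files to be named script-results.* next to the config; the exact file extensions depend on your curl-loader version, so adjust the names below:

#curl-loader -f script.conf
#ls script-results.*
#tail script-results.txt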


Enjoy it!!!

5 comments:

  1. Hi,
    can you explain it to me in two words:
    the IP range you specified, should I have my "apache box" and "curl-loader box" in the same subnet?
    Also, the IPs should be free and not in use by any machine in the subnet?

    thanks

  2. Hi,
    your apache box and the curl-loader box do not need to be on the same subnet. However, the IP range you configure for curl-loader does need to be on the same subnet as the curl-loader box, so that reply packets can find their way back to your machine.
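
    For example, while a batch is running you can confirm that curl-loader has added the batch addresses to the interface and that they are locally routable. These are plain iproute2 commands, nothing curl-loader-specific, and 10.100.0.10 is just one address from the example range:

    #ip -4 addr show dev eth0
    #ip route get 10.100.0.10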

  3. OpenWebLoad is ancient!

  4. Antonio, yes, Openload is ancient, but it works.

  5. I am getting compile errors on Ubuntu:

    make optimize=1 debug=0
    gcc -W -Wall -Wpointer-arith -pipe -DCURL_LOADER_FD_SETSIZE=20000 -D_FILE_OFFSET_BITS=64 -fomit-frame-pointer -O3 -ffast-math -finline-functions -funroll-all-loops -finline-limit=1000 -mmmx -msse -foptimize-sibling-calls -I. -I./inc -I/usr//include -c -o obj/parse_conf.o parse_conf.c
    parse_conf.c: In function ‘read_callback’:
    parse_conf.c:3894: error: conflicting types for ‘pread’
    /usr/include/unistd.h:385: note: previous declaration of ‘pread’ was here
    make: *** [obj/parse_conf.o] Error 1
