cURL for Throughput Testing

I was recently faced with the dauntingly tedious task of throughput testing: uploading and downloading files from HTTP and FTP servers and noting the average throughput in each direction, both separately and simultaneously. This is fun for about 30 minutes if done by hand, but it quickly becomes tedious and even confusing, as constantly triggering uploads and downloads makes you lose your thread once your mind wanders off during the transfers. So I decided to automate the process.

There must be about a zillion ways to do this, and I chose cURL, a handy command line tool for uploading and downloading files over just about any protocol used on the net, including HTTP, FTP, POP3 and many more. It's ultra configurable via the command line and has a great variety of output options that make later analysis, such as averaging the download speeds of different files, very simple.
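For example, the -w (--write-out) option prints selected transfer statistics once a request has finished. A minimal illustration (the URL is just a placeholder):

# Print the HTTP status code, downloaded bytes and average download speed for one request
curl -s -o /dev/null -w '%{http_code} %{size_download} %{speed_download}\n' http://example.com/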

For doing repetitive downloads I came up with the following bash script (it works well under Ubuntu and macOS):

#!/bin/bash
# URL of the file to download; OUTFILE collects one result line per run
URL="http://ftp.xyz.com/name-of-file"
OUTFILE="test-down.csv"
rm -f "$OUTFILE"
# Discard the downloaded data and append "size, average speed" to the result file
curl "$URL" -o /dev/null -w '%{size_download}, %{speed_download}\n' >>"$OUTFILE"
curl "$URL" -o /dev/null -w '%{size_download}, %{speed_download}\n' >>"$OUTFILE"
curl "$URL" -o /dev/null -w '%{size_download}, %{speed_download}\n' >>"$OUTFILE"

cat "$OUTFILE"

The URL variable holds the URL of the file to be downloaded. Obviously, if you test high speed links, the server needs to have enough bandwidth available on its side for the purpose. The OUTFILE variable holds the name of the local file to which the download size and speed are written. The same curl instruction is then run three times and each time the result is appended to OUTFILE. While the script runs, each curl instruction prints its usual progress meter with the current speed, the percentage of the download completed, and so on.
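If you need more repetitions, the repeated curl lines can of course be replaced by a loop; a small sketch with a hypothetical run count of 10:

for i in {1..10}; do
    curl "$URL" -o /dev/null -w '%{size_download}, %{speed_download}\n' >>"$OUTFILE"
done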

And here's my script for automated uploading:

#!/bin/bash
# Upload URL (a server-side page that accepts POST data), the local file to send
# and the result file
UPURL="http://xyz.com/test/upload.html"
LOCALFILE="10MB.zip"
OUTFILE="test-upload.csv"
rm -f "$OUTFILE"
# Discard the server's response and append "size, average speed" to the result file
curl -d @"$LOCALFILE" "$UPURL" -o /dev/null -w '%{size_upload}, %{speed_upload}\n' >>"$OUTFILE"
curl -d @"$LOCALFILE" "$UPURL" -o /dev/null -w '%{size_upload}, %{speed_upload}\n' >>"$OUTFILE"
cat "$OUTFILE"

The trick with this one is to find or build a web server that acts as a sink for file uploads. The LOCALFILE variable holds the path and name of the file to be uploaded, and OUTFILE holds the name of the text file the results are written to.
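If no suitable HTTP upload sink is available, an FTP server you have write access to can serve the same purpose, since curl can also upload files with -T. A sketch, with placeholder credentials and host:

# Upload the local file to an FTP server and record size and average upload speed
curl -T "$LOCALFILE" "ftp://user:password@ftp.example.com/upload/" -o /dev/null -w '%{size_upload}, %{speed_upload}\n' >>"$OUTFILE"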

Note the '.csv' file extension of the output files, which makes it convenient to import the results into a spreadsheet for further analysis.
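If you prefer to stay on the command line, the average speed over all runs can also be computed directly from the result file, for example with a small awk one-liner (assuming the two-column format produced above):

# Average the second column (speed in bytes/s) over all lines of the result file
awk -F', ' '{ sum += $2 } END { if (NR > 0) print sum / NR, "bytes/s on average" }' test-down.csv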



A note on why I did not simply use iperf: in my case I needed a server that could handle uploads and downloads at 100+ Mbit/s. As I don't have such a server myself, I could not install an iperf server on the far end and hence had to look for another method, i.e. curl against a public server that is well connected.
