Writing code that runs efficiently, is well documented, and solves problems elegantly is a large part of successful development. But a fair number of tutorials, books, and courses stop short of the point where you need to deploy a solution and support all the traffic a web-based application will receive should it become popular.
A large part of why load testing is not done more often comes down to cost and complexity. A professionally managed load test can cost well over $5,000 and take weeks to set up and interpret. In this blog post I want to present a viable option for small- to medium-scale load testing within Amazon Web Services using an open source project named Goad.
Why load testing
One of the best practices I see ignored far too often is running load tests on websites and services to identify how much traffic they can handle before becoming unresponsive. This is not quite the same as performance tuning or memory profiling, though a load test does inform both.
We can guess where issues will occur, but it is much better to test and observe to be certain. Data is more valuable than intuition: a few times I have been surprised after a load test to find that what I worried would be a problem under load was just fine.
Goad is a Go-based load testing tool created during the 2016 Gopher Gala hackathon. It has been polished considerably since then and does an excellent job of reporting performance metrics under load.
What it does is best summarized on the project's website:
> Goad takes full advantage of the power of Amazon Lambdas for distributed load testing. You can use Goad to launch HTTP loads from up to four AWS regions at once at this time. Each lambda can handle hundreds of concurrent connections, we estimate that Goad should be able to achieve peak loads of up to 100,000 concurrent requests.
What I love about this so far:
- It cleans up after itself when finished, so nothing lingers wasting money on unused resources. Once the test is over there are no additional charges
- No complicated setup; tests can be run and managed from any computer that can run Go
- All traffic comes from AWS and does not burden the host computer network
- Can be deployed in any geographic region within AWS
- Relatively easy to set up and use
- You can use any URL reachable from your AWS environment
- For a smaller load test, I observed a total cost of only $0.80
What I feel could be improved:
- It would be great if it provided price estimates; right now you have to calculate costs manually
- Multiple URLs and manifest files of tests are not currently supported, so a separate test has to be initiated for each URL
- Reporting could be a bit more detailed
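Until multiple URLs are supported natively, one workaround is a small shell loop that generates one Goad invocation per URL. This is a sketch under the assumption that the `goad` binary is on your PATH; the URLs and flag values are illustrative, and it prints the commands rather than running them so you can review them first:

```shell
#!/bin/sh
# Emit one goad command per URL; pipe the output to `sh` to run them.
goad_commands() {
    for url in "$@"; do
        echo "goad -n 1000 -c 5 -u $url"
    done
}

goad_commands "https://example.com/" "https://example.com/about"
```

Running the script as-is just prints the two commands; `./run-tests.sh | sh` would execute them one after another, each as its own Goad test.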
These improvement ideas don't take away from what it does now, and I may wind up contributing to the project to help this already good tool cover a few items it doesn't currently support.
The only dependencies are a good internet connection, an AWS account, and Go installed on the computer running the test. There are two options for running the project: from source or from a compiled binary. The choice is yours, but I will be running tests from the compiled binary.
If you are already using a tool or the command line to do AWS deployments, you already have the configuration file in place. If not, you will need to create the file manually. On a Linux, OS X, or Unix computer the file needs to live at `~/.aws/credentials`.
If working on a Windows PC the file should be placed at `%UserProfile%\.aws\credentials`.
This file can contain multiple profiles (default, named profiles, and CLI-specific configuration), but we will use the simplest implementation:
```ini
[default]
aws_access_key_id=ATU4PLWRD9YGF2XEAKVZ
aws_secret_access_key=xTyuAXUtnFEMI/Pyx7ftS/ud1YEezzuEXAMPLEKEY
```
As mentioned, I am using the compiled binary, so I will need to use the full path when invoking Goad. If you install from source or add the binary to your system path, you won't need the full path.
To start, let's look at the help output on the command line:
```shell
# Get help:
$ goad -h
Usage of goad:
  -c uint
        number of concurrent requests (default 10)
  -m string
        HTTP method (default "GET")
  -n uint
        number of total requests to make (default 1000)
  -r string
        AWS regions to run in (comma separated, no spaces) (default "us-east-1")
  -t uint
        request timeout in seconds (default 15)
  -u string
        URL to load test (required)
  -H string
        HTTP Header to add to the request. This option can be set multiple times.

# For example:
$ goad -n 1000 -c 5 -u https://example.com
```
All of the parameters are explained well, and you should be ready to run your first load test against a URL. As mentioned, each URL has to be run as a separate test, but that is fine for trying out the tool. For this example I am going to spin up a static HTML site on a free-tier AWS Apache server.
For any load test, the ability to monitor the environment and to review the server/service logs is critical. Within AWS we have a dashboard for each server, so basic monitoring is available. We will also configure our website to log access and errors to their own files, separate from the normal Apache logs.
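As a sketch, per-site logging in Apache can be set up with the standard `CustomLog` and `ErrorLog` directives inside the site's virtual host. The server name and file paths below are placeholders; adjust them to your own layout:

```apache
<VirtualHost *:80>
    ServerName loadtest.example.com
    DocumentRoot /var/www/loadtest

    # Send this site's traffic to dedicated files, separate from the
    # default Apache logs, so load-test activity is easy to isolate.
    CustomLog /var/log/apache2/loadtest-access.log combined
    ErrorLog  /var/log/apache2/loadtest-error.log
</VirtualHost>
```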
Since the focus of this post is Goad and not configuring servers, I won't go into setting up a server environment and its monitoring, mostly because Amazon already provides this for your resources. A simple way to see how the environment is performing is to open the AWS dashboard and view the near real-time statistics.
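For the command-line inclined, the same statistics can also be pulled with the AWS CLI's CloudWatch `get-metric-statistics` command. This is a sketch with a placeholder instance ID and time window; substitute your own values:

```shell
# Average CPU utilization for one EC2 instance, sampled every 60 seconds.
# The instance ID and times below are placeholders.
aws cloudwatch get-metric-statistics \
    --namespace AWS/EC2 \
    --metric-name CPUUtilization \
    --dimensions Name=InstanceId,Value=i-0123456789abcdef0 \
    --start-time 2016-06-01T12:00:00Z \
    --end-time 2016-06-01T12:30:00Z \
    --period 60 \
    --statistics Average
```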
Hopefully this brief introduction has shown how Goad can be used to explore the limits of a website or service. In this era of caching, CDNs, Docker containers, and inexpensive cloud computing, it is tempting to just add more resources to cover application problems. But a simple load test gives evidence of how an application actually performs, and using Goad to run one should definitely be considered a best practice.