SteelArrow Wins First Round of Response Testing

Version 4.1 is easily the most efficient and stable version of SteelArrow to
date. We tested SteelArrow against other Web Application Servers (WAS) on the
market, and are proud to share our results with anyone who may be interested.
The original test cases were performed with SteelArrow v3.9 in mid-2001.
First, a description of how each of the listed Web Application Servers was
tested. Three tests were performed; each used virtual users to connect to the
web server (IIS on Windows 2000 Advanced Server) through our own HTTP tester
utility. The tester creates virtual users on separate threads, so we can
measure server response times under different load scenarios.
The data requested was taken from the customer table of the Microsoft Access
Northwind sample database, which every WAS used as an ODBC datasource.
Go here to see the HTML output that all Web Application Servers were expected
to deliver.
No cookies or any other form of session management were used with any of the
listed products. The times recorded represent the interval between a request
and the complete data set being returned.
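Our HTTP tester utility itself was not published; as a rough sketch of the mechanism just described (one thread per virtual user, a fixed pause between requests, timing each request until the complete response body has been read), something like the following Python would do. The function name and parameters are illustrative, not the actual utility.

```python
import threading
import time
import urllib.request

def run_virtual_users(url, users, requests_per_user, pause_s):
    """Spawn one thread per virtual user. Each user issues
    `requests_per_user` GET requests against `url`, waiting `pause_s`
    seconds between requests, and records how long each request took
    from send until the complete response body was read."""
    times = []
    lock = threading.Lock()

    def virtual_user():
        for _ in range(requests_per_user):
            start = time.perf_counter()
            with urllib.request.urlopen(url) as resp:
                resp.read()  # wait for the complete data set to be returned
            elapsed = time.perf_counter() - start
            with lock:
                times.append(elapsed)
            time.sleep(pause_s)

    threads = [threading.Thread(target=virtual_user) for _ in range(users)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return times

# Medium-load test shape: 20 virtual users, 100 requests each, 2 s pause.
# times = run_virtual_users("http://server/test", 20, 100, 2.0)
```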

The First Test - Medium Load

The first test used 20 virtual users, each making 100 requests on the server.
Between requests, each virtual user waited 2000 milliseconds (2 seconds)
before making its next request. The results were as follows (click the WAS
name to see the associated screen-shot):

The Second Test - Heavy Load

The second test used 50 virtual users, each making 50 requests on the server.
Between requests, each virtual user waited a random time of up to 2000
milliseconds (2 seconds) before making its next request. The results were as
follows (click the WAS name to see the associated screen-shot):
Note: PHP was given two tries, as its first run took an exceptional amount of
time to return from the first request.
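The only mechanical difference from the first test is the randomized pacing. A minimal sketch of that pause, assuming a uniform distribution (the article does not state which distribution was used, and the function name is illustrative):

```python
import random
import time

def random_pause(max_s=2.0):
    """Wait a uniformly random time between 0 and max_s seconds before
    the next request, as in the heavy-load test's pacing."""
    delay = random.uniform(0.0, max_s)
    time.sleep(delay)
    return delay
```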

The Third Test - Light Load

The third test used 1 virtual user making 50 requests on the server.
Between requests, the virtual user waited 1000 milliseconds (1 second) before
making its next request. The results were as follows (click the WAS name to
see the associated screen-shot):

The New Tests - Kill the Server

Version 4.1 of SteelArrow performed much better than we originally expected,
so we added a few more tests to see if we could find its limit. Based on the
previous tests, we felt there was no benefit in comparing against any of the
other server products. The following results represent our findings:
300 Concurrent Users (with 2-second pause):  SteelArrow v4.1 - 1.329 seconds, 20,998 bytes
350 Concurrent Users! (with 2-second pause): SteelArrow v4.1 - 1.760 seconds, 20,998 bytes
The following scripts were used in all of the above tests:
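The scripts themselves are not reproduced here. For illustration only, the shape shared by all five scripts (query the customer table, render the rows as an HTML table) can be sketched in Python, using the standard-library sqlite3 module as a stand-in for the ODBC datasource; the column names follow the Northwind sample, but the function itself is hypothetical.

```python
import sqlite3

def render_customers_html(conn):
    """Query the Customers table and render the rows as a plain HTML
    table, mirroring the output each WAS script was asked to deliver."""
    rows = conn.execute(
        "SELECT CustomerID, CompanyName, ContactName FROM Customers"
    ).fetchall()
    body = "".join(
        "<tr>" + "".join(f"<td>{cell}</td>" for cell in row) + "</tr>"
        for row in rows
    )
    return f"<table>{body}</table>"
```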
                                 Mem. Usage (K)    VM Size (K)
SteelArrow v4.1
  Start Test                     2,544             1,140
  End Test                       6,896             not recorded
  15 minutes after test          not recorded      not recorded
SteelArrow v3.9
  Start Test                     2,104             1,080
  End Test                       7,292             5,680
  15 minutes after test          4,548             2,716
ColdFusion v5.0 (cfserver.exe)
  Start Test                     10,060            13,416
  End Test                       13,944            9,960
  15 minutes after test          13,940            9,960
ColdFusion v4.5 (cfserver.exe)
  Start Test                     14,256            15,280
  End Test                       17,712            18,088
  15 minutes after test          17,732            18,092
ASP (DLLHost.exe)
  Start Test                     not loaded        not loaded
  End Test                       12,984            8,864
  15 minutes after test          12,664            8,496
PHP - loads and unloads as it executes
ActivePERL - loads and unloads as it executes

Note: Memory results were recorded during the medium-load test, as reported by Windows Task Manager.
For each Web Application Server, no code was optimized and no settings were
changed beyond those necessary to get the WAS working on the server. All tests
were performed on the same server (AMD Athlon 1 GHz, 128 MB RAM) over a local
area network. The client utility was run on an Intel 333 MHz machine with
512 MB RAM. All network cards and hubs supported 100 Mbps throughput.
- SteelArrow and ColdFusion were the easiest to get up and running: running the
installer was all the effort required. The many ColdFusion samples, and its
close similarity to SteelArrow, made for quick creation of a database query script.
- ASP was the second easiest to get running, once we found a working database
sample. The errors produced while developing the test script provided limited
information.
- The PHP install complained about a missing OCX control, and as a result we had
to install the IIS server plug-in manually. As with ASP, some time was spent
finding a working PHP database script for use with ODBC.
- The ActivePERL install placed all necessary components in their rightful places
on the server, though it was necessary to set directory permissions to allow
execution of the final script. Working with an ODBC datasource required some web
searching to find an appropriate database module; in the end, we used the
Win32::ODBC module developed by Roth Consulting. As with ASP and PHP, some time
was spent finding a working PERL database script for use with ODBC.
As SteelArrow and ColdFusion are tag-based languages, there was no real
programming required: with five or so language-specific tags, a database query
and data output script was easily built for each. PHP and ASP are closer to
true programming languages, so it was necessary to reference several web sites
to create the test scripts used here (to ensure similarity of function).
We do not claim to be experts in ASP, ColdFusion, PERL, or PHP. Our test scripts
were built from information and samples that are readily available on the World
Wide Web. Our tests were performed in-house, not by a third party, but they were
developed with fairness in mind, and everything possible was done to ensure the
validity of the results. The listed values represent our own findings, and we
welcome anyone who may dispute what we have communicated here.
We have to admit that we were quite pleasantly surprised by the results of our
newest v4.1 build. It's currently in use on the Ottawa Senators Fan Forum (over
600,000 hits per month), The House Finder (over 100,000 hits per month), and
Carleton University Parking Services (over 50,000 hits per month), to name but
a few... and yes, SteelArrow rocks! :)