Friday, May 30, 2008

Abusing SMTP Servers for fun and profit with JMeter and the SMTP plug-in

Let’s Be Spam Kings with JMeter! (aka Abusing SMTP Servers for fun and profit)

The other day I came across an SMTP plug-in for JMeter at http://www.beolink.org/index/jmeter-plug-in.

The plug-in needs two extra .jars that are not shipped with JMeter: mail.jar and activation.jar. There were no MD5 sum values posted, so I'm assuming the copies I grabbed are adequate since the solution appears to work. I'm not aware of any possible versioning conflicts, but then I'm no Java guru; I did write "Hello, World!" code once in Java back in 1995.

To install the plug-in, I copied both mail.jar and activation.jar to the jakarta-jmeter-2.3.1\lib\ sub-dir and SmtpSampler-0.9.jar to the jakarta-jmeter-2.3.1\lib\ext sub-dir.

Let’s create a brand new JMeter project:


Let’s add a thread for our spam chunking:


I’ve set up 5 threads with 200 iterations each for a total of 1,000 iterations, as that is the number of files that I want to chunk at our poor SMTP server.

Like in my FTP application, I have a CSV file that contains all the information that JMeter will require for chunking spam. It is pretty similar to the FTP data:

8<-----------------------------------------------------------------
user1@someserversomeplace.com,10.0.1.4,C:\work\smtpLoadAgent\data\dataFile1.txt.gpg
user2@someserversomeplace.com,10.0.1.4,C:\work\smtpLoadAgent\data\dataFile2.txt.gpg
user3@someserversomeplace.com,10.0.1.4,C:\work\smtpLoadAgent\data\dataFile3.txt.gpg
user4@someserversomeplace.com,10.0.1.4,C:\work\smtpLoadAgent\data\dataFile4.txt.gpg
user2@someserversomeplace.com,10.0.1.4,C:\work\smtpLoadAgent\data\dataFile5.txt.gpg
user1@someserversomeplace.com,10.0.1.4,C:\work\smtpLoadAgent\data\dataFile6.txt.gpg
user5@someserversomeplace.com,10.0.1.4,C:\work\smtpLoadAgent\data\dataFile7.txt.gpg
user4@someserversomeplace.com,10.0.1.4,C:\work\smtpLoadAgent\data\dataFile8.txt.gpg
user5@someserversomeplace.com,10.0.1.4,C:\work\smtpLoadAgent\data\dataFile9.txt.gpg
------------------------------------------------------------------>8

The columns are pretty easy to figure out: e-mail address, IP address of the SMTP server in question and the path to the file that is going to be chunked. In this case, just like the FTP server example, it’s a GPG’d text file containing random data.
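
For anyone without the plug-in handy, the same per-row send can be sketched with Python's standard library. This is a rough stand-in, not the plug-in's actual code; the field names match my CSV columns, and the port and sender defaults mirror the settings I use further down.

```python
import csv
import io
import smtplib
from email.message import EmailMessage
from pathlib import Path

FIELDS = ("emailAddress", "ipAddress", "fileToAttach")

def parse_rows(text):
    """Split the load data into dicts keyed like the JMeter variables."""
    return [dict(zip(FIELDS, row)) for row in csv.reader(io.StringIO(text))]

def send_file(row, port=35, sender="joe@blow.com"):
    """Mail one GPG'd file as a binary attachment to the server in the row."""
    path = Path(row["fileToAttach"])
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = row["emailAddress"]
    msg["Subject"] = path.name
    msg.add_attachment(path.read_bytes(), maintype="application",
                       subtype="octet-stream", filename=path.name)
    with smtplib.SMTP(row["ipAddress"], port) as smtp:
        smtp.send_message(msg)

sample = r"user1@someserversomeplace.com,10.0.1.4,C:\work\smtpLoadAgent\data\dataFile1.txt.gpg"
rows = parse_rows(sample)
print(rows[0]["ipAddress"])  # 10.0.1.4
```

Calling send_file(rows[0]) would perform one iteration's worth of work; looping over the parsed rows in a handful of threads approximates the JMeter thread group.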

Just like the other example, add a CSV Data Set Config to JMeter defining the variables that will be used:


For some reason, the name of the plug-in comes across as [res_key=smtp_title]. I don’t know if this is a config issue on my side or something not set properly in the .jar. I e-mailed the person who was nice enough to create this plug-in but haven’t heard anything back. It appears to be a cosmetic issue at this point and doesn’t cause any problems that I’ve seen so far.


Now we need to fill out the information for the SMTP sampler so we can abuse the SMTP server:


In this example, I replaced the default [res_key=smtp_title] with “smtpSampler” and set the Server field to the dynamic variable that I defined earlier as ${ipAddress}.

In my particular case, the SMTP server is running on port 35 and not the standard port 25. It’s great that the SMTP plug-in gives us this option as I couldn’t find an option to change the port number in the perl code that I had written with MIME::Lite and Net::SMTP. I had done some testing with a freeware port redirection tool that got the job done but I never got far enough to see how much overhead was consumed by the port redirection. Now I don’t have to worry about it!

I set the Address From field to be a (I think) non-existent e-mail address of joe@blow.com.

Address To is set to the dynamic variable ${emailAddress}.

The file that I want to send is set to be ${fileToAttach} which is the full path to the file in question. Since the CSV file has a different file for each line, we won’t be sending the exact same file with each iteration.

For the subject line I used “${fileToAttach} ” and then checked the “Include timestamp in subject” checkbox. The timestamp appended to the subject is epoch time, not a human-readable form. For most people I doubt that this will be a problem. I also checked “Calculate message size” so I can see the number of bytes that were transferred in the logged output.
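
If you ever need that timestamp back in human-readable form, the conversion is trivial. I'm assuming here that the plug-in writes milliseconds (JMeter's usual unit); if it turns out to write seconds, drop the division.

```python
from datetime import datetime, timezone

def readable(epoch_ms):
    """Render a millisecond epoch timestamp as a human-readable UTC string."""
    return datetime.fromtimestamp(epoch_ms / 1000, tz=timezone.utc).strftime(
        "%Y-%m-%d %H:%M:%S")

print(readable(1212121212000))  # 2008-05-30 04:20:12
```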

Let’s go ahead and throw in some Listeners so that we can see the results of our testing. I’m not going to use the spline graph as I prefer the scatter plot that Graph Results shows. Even that isn’t my preferred way of showing data, but it is easy enough to dump the output to an XML/CSV file and use some of my perl code to generate “pretty” graphs:


I have found that I prefer to watch the “Average”, “Deviation” and “Throughput” data points in the scatter plot.

In my particular case, I am pointing at an SMTP server running on my local laptop in a Virtual PC 2007 VM hosting a Win2k3 server running hMailServer. For a free e-mail server, hMailServer seems to be a fairly full-featured product: it includes IMAP support, integration with blacklists, ClamWin virus protection, and I even saw some entries for tar pitting and AD support. Pretty darn cool!

Here is a shot of hMailServer waiting to receive our mass spamming of attachments:



Let’s rip loose with JMeter and see what happens:





Well, look at that! 1,000 e-mails chunked at a rate of about 300 per minute or about 5 per second. Not too shabby.

Logging into Thunderbird in the VM we see that the e-mail attachment was sent without issue:



How cool is that?

I found that JMeter with the SMTP plug-in was faster than my perl code using MIME::Lite and Net::SMTP by about 1.5 e-mails per second. I also had some issues with MIME::Lite issuing a die() when a problem was encountered instead of returning false from $msg->send_by_smtp(), which would cause a worker thread to die off. That was pretty annoying.
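
The fix for that die() behavior is the same in any language: catch the transfer error inside the worker loop so the thread survives. Here is a minimal Python sketch, where send_one is a stand-in that always fails, purely to show the workers carrying on:

```python
import queue
import smtplib
import threading

def send_one(row):
    """Stand-in for the real SMTP send; here it always fails to show the guard."""
    raise smtplib.SMTPException("simulated failure from the server")

def worker(jobs, failures):
    """Drain the job queue; a failed send is recorded, not fatal to the thread."""
    while True:
        try:
            row = jobs.get_nowait()
        except queue.Empty:
            return
        try:
            send_one(row)
        except (smtplib.SMTPException, OSError) as err:
            failures.put((row, str(err)))  # note it and keep on trucking

jobs, failures = queue.Queue(), queue.Queue()
for i in range(3):
    jobs.put({"msg": i})
threads = [threading.Thread(target=worker, args=(jobs, failures)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(failures.qsize())  # 3 -- every send failed, yet both workers survived
```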

So, with the SMTP plug-in I get access to non-standard ports (as seen above) and better overall performance.

Something else I did find: if the “Check for Failure” checkbox on the SMTP Sampler screen is checked, each of the e-mail transfers is marked as a “warning”. An example is below:



Examining the log file that I specified shows the following:

“org.beolink.jmeter.protocol.smtp.UnexpectedSuccessException: Expected failure but got success...”

To check for a 500 error return, I went ahead and ran some tests and captured the TCP stream with Wireshark (née Ethereal). There was no difference between the perl chunking and the JMeter/SMTP plug-in chunking, and no error codes were returned.

My solution? Uncheck the “Check for failure” checkbox and keep on trucking.

Despite the two issues noted above, the SMTP plug-in for JMeter does a great job and I will be using it instead of my home grown perl solution.

Thursday, May 29, 2008

It's on its way!

I got news from management today that my portable server, the Billy Bad Ass quad core RAID5 notebook, has been approved. It's barely squeaking by the $4K limit but it should be on its way soon. Yay!

Can't wait to fire that bad boy up with Win2k[38] Server and get all four cores chunking load at a target.

I also got the SMTP Plugin for JMeter up and running and will make an entry about that tomorrow if I have time. It does a pretty good job and is faster than my perl code that makes use of MIME::Lite and Net::SMTP for chunking attachments at an e-mail server.

More details in the next post.

Wednesday, May 28, 2008

Let's abuse an FTP server with JMeter!

Let's use JMeter to chunk files at an FTP server!

For my job I had written some multi-threaded perl code using Net::FTP to chunk files at an FTP server. The data is generated by another chunk of code, which also produces a file named sourceList.txt containing the various elements of data used by my perl code.

The CSV file is in the form of:

8<-------------------------------------
user1,pass1,C:\work\ftpLoadAgent\data\dataFile1.txt.gpg,dataFile1.txt.gpg
user1,pass1,C:\work\ftpLoadAgent\data\dataFile2.txt.gpg,dataFile2.txt.gpg
user1,pass1,C:\work\ftpLoadAgent\data\dataFile3.txt.gpg,dataFile3.txt.gpg
user3,pass3,C:\work\ftpLoadAgent\data\dataFile4.txt.gpg,dataFile4.txt.gpg
user1,pass1,C:\work\ftpLoadAgent\data\dataFile5.txt.gpg,dataFile5.txt.gpg
user3,pass3,C:\work\ftpLoadAgent\data\dataFile6.txt.gpg,dataFile6.txt.gpg
user3,pass3,C:\work\ftpLoadAgent\data\dataFile7.txt.gpg,dataFile7.txt.gpg
user2,pass2,C:\work\ftpLoadAgent\data\dataFile8.txt.gpg,dataFile8.txt.gpg
user2,pass2,C:\work\ftpLoadAgent\data\dataFile9.txt.gpg,dataFile9.txt.gpg
user1,pass1,C:\work\ftpLoadAgent\data\dataFile10.txt.gpg,dataFile10.txt.gpg
------------------------------------->8

The first column contains the username on the FTP server and the second column contains the account password. The third column contains the path to the file that I want to chunk upstream to the FTP server, and the final column contains the name that the file is saved as on the FTP server.
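
The same records could drive a send directly from Python's ftplib. This is a sketch, not my perl agent; it assumes a server listening on 127.0.0.1, and storbinary does the binary STOR so the GPG data isn't mangled.

```python
import csv
import io
from ftplib import FTP

FIELDS = ("username", "password", "localFileName", "remoteFileName")

def parse_rows(text):
    """Turn sourceList.txt lines into dicts keyed like the JMeter variables."""
    return [dict(zip(FIELDS, row)) for row in csv.reader(io.StringIO(text))]

def upload(row, host="127.0.0.1"):
    """Log in and STOR the file in binary mode -- ASCII mode would mangle GPG data."""
    with FTP(host) as ftp, open(row["localFileName"], "rb") as fh:
        ftp.login(row["username"], row["password"])
        ftp.storbinary("STOR " + row["remoteFileName"], fh)

sample = r"user1,pass1,C:\work\ftpLoadAgent\data\dataFile1.txt.gpg,dataFile1.txt.gpg"
row = parse_rows(sample)[0]
print(row["username"], row["remoteFileName"])  # user1 dataFile1.txt.gpg
```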

Let's build a project to crush our FTP server!

Start with a blank JMeter project:


Let’s add a thread group to the test plan and name it “ftpLoadTest."


Let’s add a CSV Data Set Config under the ftpLoadTest thread group:


I set the name of the CSV group to “ftpLoadData”, gave it the path to my .TXT file (which is actually a CSV file with a .TXT extension), and listed the names of the columns of data (username, password, localFileName, remoteFileName).

As the thread executes the FTP sampler, the record being pointed at will advance. It’s my understanding that if there are multiple threads, they will alternate records so that multiple threads won’t pull from the same record. Sounds good to me so far.
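
My mental model of that behavior is a single shared cursor guarded by a lock. Here's a rough Python approximation (not JMeter's actual implementation) showing that two threads never consume the same record twice:

```python
import threading

class SharedRows:
    """Hand out records one at a time so no two threads replay the same row."""
    def __init__(self, rows):
        self._it = iter(rows)
        self._lock = threading.Lock()

    def next_row(self):
        with self._lock:  # only one thread may advance the cursor at a time
            return next(self._it, None)  # None once the data set is exhausted

rows = SharedRows(["user%d,pass%d" % (i, i) for i in range(1, 5)])
seen = []

def worker():
    while (row := rows.next_row()) is not None:
        seen.append(row)  # stand-in for one FTP sampler iteration

threads = [threading.Thread(target=worker) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(len(seen), len(set(seen)))  # 4 4 -- every record used exactly once
```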

Now it is time to add the FTP sampler that will do the FTP work for us:


Now we have the entry for the FTP sampler:


Now we need to populate the fields with the data being pulled from the .TXT file in CSV format. Referencing the variable data works the same way a variable can be referenced in perl code: a $ sigil with an open brace, the variable name, and a close brace, like this: ${someVariableName}.

In this case, I am going to hard-code the IP address of the FTP server (be it ever so humble, there is no place like 127.0.0.1). When filling out the password portion of the data entry all the characters will be masked, but go ahead and use the ${someVariableName} format. I also set the put (STOR) radio button and selected the binary checkbox. In my particular case, the data that I am sending is not an ASCII file but a GPG-encrypted file, and sending it as ASCII could corrupt the data, and I wouldn’t want that.

Even though you can’t see it in the password field, it reads “${password}.”


We want to see the results of the FTP chunking so let’s add a “View Results Tree” listener.


Now, in my particular case for this example, I am running a local FTP server on my laptop: the super handy stand-alone FTP server “Quick’n Easy FTP Server 3.1 Lite.” This FTP server is great for prototype testing or just sharing files at a LAN party. The config files are saved as .XML files, so you can just xcopy them to another machine and fire ‘er up. Good stuff, that!


I’ve already created the usernames and subdirs in advance and won’t go over that in this blog entry. But suffice it to say that setting up the users with the interface is easy enough that even *I* can do it.

The server allows us to see server statistics in real time:


And has the option to allow us to watch the FTP server log in real time:

So, let’s hit the run button and let’s see what happens!


Look at that! 10 successful runs (I only have 10 files to chunk at this time).

Let’s go ahead and add a listener to show results in tabular format and hit run again:


Here we can see more details such as response times, file size (in my case the file sizes are random) and success/failures. Good stuff!

Let’s go ahead and add a spline graph visualizer and see how it looks:


I generated more files so that I’d have more than 10 to work with. In this case, I generated 10,000 files of 5K bytes each, and instead of random upper-case characters I used an uppercase X. The resultant GPG’d files were only 640 to 641 bytes in size.

Appropriately, I changed the number of iterations to 10,000 and executed the test.


I didn’t like how the spline graph looked with so many data entries so I went ahead and added a graph listener which does a scatter graph of the results. Below are the first 2,458 iterations:


The graph isn’t very readable, so when I execute this test in the future I’ll just record the results to a file and generate graphs either with some perl code that I have already written or with Excel. But the graph is good enough for watching while a test is executing. To be truthful, I am too much of a slacker to modify the original code to graph the way I’d like, so I really can’t complain.

The rate of chunking the files is about the same as my perl code, but in the future I am more likely to use JMeter to run my FTP tests, as my fellow coworkers might not be fluent in perl.

All in all, cool stuff!

Today, sniffing around the ‘net, I found an SMTP sampler, so I might be able to use JMeter for my SMTP chunking as well. I hope it works out the same as the FTP JMeter project. I like my perl code and all, but this is a better solution for those folks who aren’t perl coders.

The only major difference between my perl code and the JMeter results is that when an FTP failure occurs, my perl code pushes the data back onto a stack to be picked up by another thread.

If this were a LoadRunner scenario with FTP users using a .DAT file in a similar manner, records from failed iterations would be skipped the same way, so it’s no biggie in the end and good enough for my purposes.
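
For the curious, the push-it-back-on-the-stack behavior of my perl agent looks roughly like this in Python; transfer is a stand-in that fails exactly once for one file so the retry path actually fires:

```python
import queue
import threading

flaky = {"dataFile2.txt.gpg"}  # pretend this upload fails on its first attempt

def transfer(name):
    """Stand-in for the real FTP put; raises OSError once for the flaky file."""
    if name in flaky:
        flaky.discard(name)
        raise OSError("451 transfer failed")

def worker(jobs, done, max_attempts=3):
    """On a failure, push the record back onto the queue instead of dropping it."""
    while True:
        try:
            name, attempts = jobs.get_nowait()
        except queue.Empty:
            return
        try:
            transfer(name)
            done.append(name)
        except OSError:
            if attempts + 1 < max_attempts:
                jobs.put((name, attempts + 1))  # another thread can pick it up

jobs, done = queue.Queue(), []
for name in ("dataFile1.txt.gpg", "dataFile2.txt.gpg", "dataFile3.txt.gpg"):
    jobs.put((name, 0))
t = threading.Thread(target=worker, args=(jobs, done))
t.start()
t.join()
print(sorted(done))  # all three files land, including the retried one
```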

Friday, May 23, 2008

Why have one Zombie when 10,000 will do?

Ok. I couldn't resist. Sure, the ftpLoadAgent was doing a decent job against the prototype server when it was single-threaded, but ADD got the best of me and I ended up multi-threading it with a thread pool size that the user can specify on the command line. Now I can really stress the heck out of the FTP server.

Did I really need to? Nope. It was just the cool thing to do. What fun is load testing if you can't crush the server into pulp? None I say! I won't always crush the server into a pulp but if I need to for stress testing purposes, by golly gosh I can!

I got my prototype server in Virtual PC 2007 up and running for the most part. Properly setting up WCF with an exported personal certificate was a pain in the butt. It was also my first exposure to nant. I can see where nant can be really handy in a server-preparation scenario. I need to learn more about the proper usage of nant.

I did some more playing around with VS2008 TE last night and have pretty much wrapped my noggin' around the basics of WebTestRequest and the related event handlers, and I have to say that it is pretty nifty. My initial thought is that VS2008 will have a steeper learning curve but be better for those testers that have a development background and want to leverage the .NET Framework.

Thursday, May 22, 2008

Rolling! Rolling! Rolling! Time to get that code 'a rolling!

I am getting some push back from IT about the laptop I want as the normal vendors do not carry the Sager Laptops. That is a bummer. I found that the laptop is a rebadged Clevo D901C. I found some other non brand name vendors that sell variants of the D901C with quad core Xeon procs and 8 gig of RAM. It's cool to think that if I do get the Sager (or some other variant of the D901C) that it can be upgraded to 8 gigs of RAM. *Insert Tim Allen grunt here*

I wrote a basic ftpLoadAgent and data-generation code in perl today. The data-generation code generates text files of a random length filled with random characters, which are then GPG'd into encrypted files that will be decrypted by the application that I'm testing. Along with the files, I also generate a CSV list of random usernames and the associated passwords that will be FTP'd into the server to force work to be done. I also did basic testing of the ftpLoadAgent and it worked as expected.
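
A rough Python equivalent of that data-generation step, with the GPG encryption pass omitted and a purely illustrative user/password scheme:

```python
import random
import string
import tempfile
from pathlib import Path

def generate(out_dir, count=10, min_len=1000, max_len=5000, seed=42):
    """Write `count` files of random upper-case noise plus a CSV manifest."""
    rng = random.Random(seed)  # seeded so a run is reproducible
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    lines = []
    for i in range(1, count + 1):
        name = "dataFile%d.txt" % i
        size = rng.randint(min_len, max_len)  # random length, like the perl version
        (out / name).write_text("".join(rng.choices(string.ascii_uppercase, k=size)))
        user = "user%d" % rng.randint(1, 3)  # illustrative user/password pairing
        lines.append("%s,pass%s,%s,%s" % (user, user[-1], out / name, name))
    (out / "sourceList.txt").write_text("\n".join(lines) + "\n")
    return lines

manifest = generate(tempfile.mkdtemp(), count=3)
print(len(manifest))  # 3
```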

Writing the ftpLoadAgent is the easy part. Right now it isn't multi-threaded, and I don't think that in this case I will need to generate a zombie horde of FTP requests against the server, as the polling interval of the app in question is at minimum 60 seconds and my ftpLoadAgent can average about 240 xfers per minute. If need be, I can multi-thread the ftpLoadAgent to get more xfers per unit time.

I will probably re-write the perl ftpLoadAgent in C# just to get back on development with Visual Studio. It's been a while since I've done any general development in C#. My experience with C# is usually all or nothing and I tend to forget various tidbits. When I'm not coding C# I am mainly coding perl. It's funny how programmers tend to think of solutions in the language that they are currently using the most. Once upon a time I would think of solutions in Pascal or C, and then it was FoxPro or Clipper and then VB and then perl and then C# and then back to perl. Stoopid contextual thinking processes.

I also did some playing around with VS2008 coded web tests. I haven't quite wrapped my brain around it all yet, but I can already see that the validation rules are going to be very handy, and I can see where I could have used the stock or even custom validation rules in some projects at my previous employer. I ended up writing a bunch of HTML-parsing code in C to get the job done, and I can see where I could have extracted the information that I needed with those validation rules (even the stock ones) rather easily and made my life a better place.

One thing is for sure, it is time to hop back on the C# coding bandwagon and get back up to speed so that I am thinking of solutions in C# rather than perl. I went ahead and did the perl stuff just because it was faster for me in the short run but I need to be thinking longer term.

Oh yeah... VS2008 didn't make any calls to any licensing servers that I could tell unless somehow my laptop tonight mysteriously VPN'd into work (and I don't have the VPN software installed yet) and communicated.

Tuesday, May 20, 2008

VS2008 licenses, eyeballing JMeter and hitting the ground running.

It turns out that the license for VS2008 TE is only a single-use license, which means that only one of us can use it at a time. I haven't had a chance to test this in actuality to see if it contacts an MS licensing server or something like that. I did get it installed but haven't had a chance to make any coded web tests.

I did take a look at JMeter for my current assignment and I don't think that it is going to be exactly what I need. The FTP sampler might work, as it will upload/download any given file name, and it is my assumption that the filename can be referenced with a variable of some sort, but I haven't played with JMeter enough to determine if that is true. The E-Mail sampler appeared to be an e-mail reader only, with POP3/IMAP, whereas I need to send e-mails with SMTP.

The current thought is that in the short term I will code some agents with perl to do the tasks that I need (FTP/SMTP) and see if I can generate the load that I need against the server.

I'm also in the process of installing the application on a Virtual PC 2007 VM of Win2k3 Standard and going through the various machinations of getting the application setup. Today I took care of the FTP server and the e-mail server. Tomorrow will be SQL Server 2005. Hopefully by the end of the week I'll know the various sub-components and then be able to do some rudimentary perl coding and get an idea if I am going down the right path.

I still want to go back and give JMeter a once-over just to see if I have missed something with respect to SMTP and the E-Mail sampler. The learning curve on JMeter is pretty steep, but I still believe that some good stuff is there to be had. I haven't had a chance to look at The Grinder 3 at all but am still very interested in a Jython solution. But if my employer is willing to spend the fundage on VS2008 TE licenses, I may not concentrate on The Grinder 3 much and instead give more attention to JMeter and VS2008 for my load-testing needs. I don't want to take any tool off the table completely, though.

First day on the new job.

I put in a request for the laptop I mentioned earlier (the Sager UberLaptop of Powah!) but with a BlueRay burner, extra battery (as the battery life of that notebook is zilch, one hour max under good conditions I bet) along with an extra power brick to make moving it between home and work less of a hassle. Can't wait for it to arrive!

It'll be heavy at nearly 20# with laptop, PS and extra battery. We'll just have to see how the mobile server works out. Hopefully I can run with Win2k8 server on the machine for maximum networking performance for load generation. *knock on wood*

It appears that my new employer might be licensed for VS2008 Team Edition and if so, I'll have the basic tools for coded web tests in C#. I don't know about Agent licensing via MS yet but it is one step at a time right now.

Got a little heads up on what appears will be my first LT and it isn't the typical LT that I've performed in the past. There are HTTP interfaces as well as FTP and POP3. I believe that JMeter is capable of running both FTP and POP3 protocols for load testing and I will probably look into that further once I get settled in my current work location. I'm sure that there are more important details and I gathered the above information from just an informal discussion at my first lunch with my workmates.

Tuesday, May 13, 2008

Mental Masturbation and Notebooks

I have been looking for a mobile loadgen notebook for operations when "out in the field." The biggest criteria is portable horsepower and I don't mind a "desktop replacement." I'm not looking for a travel friendly unit but the most portable horsepower that I can carry and I might have found the unit that I am looking for...

Sager NP9262:

  • 17" Wide Viewing Angles WSXGA+ LCD with Super Glossy Surface (1680 x 1050)
  • Intel® Core™ 2 Quad Processor Q6700 / 8MB L2 Cache, 2.66GHz, 1066MHz FSB
  • Single Nvidia GeForce 8800M GTX Graphics with 512MB DDR3 Video Memory
  • Genuine MS Windows® VISTA BUSINESS 32/64-Bit Edition
  • 4GB Dual Channel DDR2 SDRAM at 800MHz - 2 X 2048MB
  • RAID-5 Storage ( Data Strip & Parity - Requires 2nd and 3rd Hard Disk Drives )
  • 200GB 7200 rpm SATA 150 Hard Drive
  • 200GB 7200 rpm SATA 150 Hard Drive
  • 200GB 7200 rpm SATA 150 Hard Drive

How much? $3299.00

It's not so much a notebook as it is a portable four-core server. That's a lot of portable horsepower! 400 gig of filespace in RAID5 goodness with four cores. If I can set processor affinity for various processes using the `start /affinity` command then I should be golden.

But it comes at a price: that sucker should weigh in at around 13 pounds, not including the monster 230-watt PS.

I've also looked at the HP 8710P and Dell XPS high-end laptops, but none of those have support for the RAID5 goodness that gives you a warm and happy feeling if a drive decides to drop off the face of the planet.

Friday, May 9, 2008

The start of this adventure

On Friday, 25 April 2008 I accepted a position in a firm that doesn't have any load testing department. I will be the first load tester and blazing the trail for the firm in question.

I'm leaving my comfy position with a Fortune 50 company with over 1.5 billion in online sales, where I've been load testing various systems for the past eight years with Mercury Interactive LoadRunner.

The big question will be this: Can I find a good open source load testing tool that can get the job done? Can I find a closed source load testing tool that can get the job done and not be as expensive as LoadRunner? LoadRunner is crazy expensive. It gets the job done but there are some things about it that just drive me nuts.

LoadRunner support under HP has really sucked. Big time... Getting a new license for a controller for my replacement laptop took over eight business days. There is no reason it should have been the nightmare that it was.

When I had to create the credentials for the AR system, our contract was bound to HP OpenView, not HP MI LoadRunner. That took some time to straighten out. It was just a big cluster frak.

For some reason my VUGen would crash both IE and Firefox and I couldn't record scripts. I must've sent over 20 e-mails back and forth with Tier I support, who communicated with Tier II support. What a PITA that was. I suggested that they fdisk the drive and reload the OS; there has got to be something at the OS level that is the problem. Ultimately it doesn't matter since I am leaving my current employer tomorrow.

Well... We'll just have to see how it goes.

Tomorrow is the last day with my current employer and on 19 May 2008 I'll start with the new firm and we'll see how it goes. It is exciting and scary all at the same time.

So far my initial research has led me to take a closer look at JMeter, The Grinder 3 and OpenSTA. JMeter and OpenSTA look promising but something about The Grinder 3 makes me tingle inside: A loose framework allowing me to run rampant in Jython with actual meaningful string handling routines and native regex support.

OpenSTA seems to be very popular, but I don't know if I want to be limited by the SCL scripting language. I've been very spoiled over the past 10 years by perl anonymous hashes and regular expressions. Does Jython even have anonymous hashes? So much to learn! I haven't coded in Python in about seven years, so I've forgotten almost all of it.

Weeeee!