Monday, June 30, 2008

Another neat use for PostRequest

I have a page that I am testing against that comes back as a failure because a .js file hasn't been migrated yet, and I don't want my coded web test to hit that file and fail. I took care of the problem with a PostRequest handler on the WebTestRequest, like this:



void preventCallsToJS(object sender, PostRequestEventArgs e) {
  List<WebTestRequest> requestsToRemove = new List<WebTestRequest>();
  foreach (WebTestRequest linkToRemove in e.Request.DependentRequests) {
    if (linkToRemove.Url.EndsWith("someJavaScriptFile.js")) {
      requestsToRemove.Add(linkToRemove);
    }
  }
  foreach (WebTestRequest linkToRemove in requestsToRemove) {
    e.Request.DependentRequests.Remove(linkToRemove);
  }
}



For each request, I wired the above handler up to the PostRequest event like this:



request4.PostRequest += new EventHandler<PostRequestEventArgs>(preventCallsToJS);



No more problems hitting those pesky not yet deployed .js files. w00t!

Saving sessions as webtests with Fiddler2

Today I was trying to save some sessions captured with Fiddler2 as a webtest, since I cannot record webtests with VS2008 under the common user profile that I use on my laptop.

When I tried to save the file I got a very annoying error that a specified assembly could not be found. Fiddler2 tries to load version 8.0.0.0 of Microsoft.VisualStudio.QualityTools.WebTestFramework.dll, but alas, I do not have VS2005 installed; I have VS2008. What to do? I'm already bummed that I cannot record webtests as I should be able to, and recording them shouldn't be this annoying.

At this point you're probably asking, "So, Mr. Auswipe-Load-Tester-Dude, does this shake your faith in the VS2008 framework for load testing?" Not yet, is my reply. Why, you ask, when these two issues are clearly impediments? Well, with LoadRunner I ran into a problem where for over three weeks I could not record vusers on my work machine: whenever I tried, both Firefox and IE would detonate and crash. I worked with Level I and Level II techs for over three weeks trying to correct the problem, and the issue was never solved. As it turns out, I got my current job offer and accepted it, and I told my former coworkers that I figured the solution was to take the fdisk quiz, reload the OS and start again. I certainly hope the same solution is not required for the MS solution.

Anyway, back to the issue at hand...

I did some Googling and found this entry on CodeProf.com with a solution by Ed Glas. Ed basically says to modify fiddler.exe.config and add a bindingRedirect as documented here, but he did not include a full-fledged example, just a link to the .NET config file documentation. I'm not a .NET expert by any stretch of the imagination (but am working on it slowly but surely), so I wasn't quite sure which config tags were required for the redirect. Here is what I used, and it appears to be working *cross fingers*:



<configuration>
  <runtime>
    <legacyUnhandledExceptionPolicy enabled="1" />
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="Microsoft.VisualStudio.QualityTools.WebTestFramework"
                          publicKeyToken="b03f5f7f11d50a3a"
                          culture="neutral" />
        <bindingRedirect oldVersion="8.0.0.0"
                         newVersion="9.0.0.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>



After doing some recording, it appears that the code generated by going from Fiddler2 session to .webtest to coded web test looks good, and I see the automagic correlation of data extracted from form fields and the ever-dreaded VIEWSTATE.

This oughta be good enough for me right now until I resolve the situation with VS2008. I have no personal problem with recording a session, saving it as a .webtest, adding the .webtest to a project and converting it to a coded web test, so it's all good with me.

Saturday, June 28, 2008

The Good, The Bad and The Ugly.

I've been real busy with non load testing stuff at work and haven't had much of a chance to continue playing around with VS2008 for LT purposes but I have found out some interesting stuff.

The Good:

You can specify the data store (either SQL Server or SQL Server Express) for storing all the individual hits from web sites and then query that information, which renders what I was doing in the earlier blog post moot. Yay! By default, VS2008 TSTE keeps these results in the SQL Server Express instance that is installed along with Visual Studio. It's pretty cool that you can choose to store results in another location for historical reasons.

Looking around the data in the tables, I'm not sure if you can group page hits together by transaction ID; I haven't had much of a chance to play around with querying the data, so I don't know for sure. If you can't, I can still use a method similar to a previous blog entry and collect the metrics myself, catching page response time as it relates to individual transactions. A bit of a PITA, but fantastic that I can do it if I want. Yay for a flexible framework that allows me, as a tester with a development background, to run around and do this kind of stuff.

One thing I know for sure, though, is that with the ability to query the databases I should now be able to crunch the numbers for transactions and get averages, standard deviations, hit counts, etc. The idea is that in a nicely controlled environment I should be able to generate an automagic report for day-to-day automated load test execution and do something like a mean population comparison of transaction times from one test to the next, picking out statistically significant deviations for alpha = 0.05. That'll be kinda neat if it works out.
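To make that concrete, here is a self-contained sketch (plain C#, no test-framework dependencies; the sample timings are made up for illustration) of the kind of comparison I mean, using Welch's t statistic for two sets of transaction times:

```csharp
using System;

class TransactionCompare {
    // Welch's t statistic for two independent samples.
    public static double WelchT(double[] a, double[] b) {
        double meanA = Mean(a), meanB = Mean(b);
        double seSquared = Variance(a, meanA) / a.Length + Variance(b, meanB) / b.Length;
        return (meanA - meanB) / Math.Sqrt(seSquared);
    }

    static double Mean(double[] xs) {
        double sum = 0;
        foreach (double x in xs) sum += x;
        return sum / xs.Length;
    }

    // Sample variance (n - 1 denominator).
    static double Variance(double[] xs, double mean) {
        double sum = 0;
        foreach (double x in xs) sum += (x - mean) * (x - mean);
        return sum / (xs.Length - 1);
    }

    static void Main() {
        // Made-up page response times (ms) from two test runs.
        double[] baseline = { 272, 265, 281, 270, 268 };
        double[] current  = { 301, 295, 310, 299, 305 };
        Console.WriteLine("t = " + WelchT(baseline, current).ToString("F3"));
    }
}
```

Compare |t| against the critical value for your degrees of freedom at alpha = 0.05 to flag runs that drifted.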

Now that I think about it, the VS2008 tool should have been reporting these metrics all along. LoadRunner ships with basic statistical information by default, and VS2008 should report the same. It is kind of annoying, but I like the fact that I have a published schema that I can use to extract the information I want (see the link to the VS2008 schema information further down below). So, another plus in the LoadRunner column. [I might really like the VS2008 product, but I do have to be fair in my Pro/Con comparisons of VS2008 versus LoadRunner.]

I wasn't able to use this kind of statistical modeling at my previous employer because our systems were just too large, with too many variables involved. We had communication from the web server to the database server (which was on the same switch, so not much of a problem), but we also had a bunch of communications to the Host. The Host was required for testing the web sites, and we were actually hitting production systems that merely disregarded the requests and sent back bogus results so that we wouldn't kill the mainframe.

Many years ago I tried to get rid of this variability by writing what I called a "parroting proxy server." It was a bit of C# code that was essentially a protocol-agnostic proxy server that would pass messages from point A to point C, where the intermediate point B was the proxy. It would store responses in an in-memory hash, and if the same request came through again it would send back the cached response, eliminating the variance that we would see. It worked out pretty well, but we never got around to implementing it. Unfortunately, in a Fortune 50 company the bureaucracy can be too great, and with Empire Building being what it is, the other managers didn't see what was in it for them; I suppose I didn't do a very good sales job, either. So be it.

To flip the switch that logs all the data so you can query it for fun and profit, set the "Timing Details Storage" property in the Run Settings node of the load test editor to "All individual details."

And unlike the LoadRunner database (AFAIK, I could be wrong!), this schema is published with all sorts of handy information located at http://blogs.msdn.com/billbar/articles/529874.aspx. How great is that?

The Bad:

I haven't had enough time to continue my little load testing adventure. I'm finding that my new employer needs to update their processes as they relate to building production-final machines; the current steps are extremely painful. But I suspect it'll be just fine in the long run, as I've had some experience with this in the past, and the process can be fully automated and integrated into our build process to make it painless.

The Ugly:

Ok, this is ugly. I tried to write a simple script to augment some stuff that I did last week, only to find that I couldn't bring up an instance of IE with the Web Recorder active, and it is frustrating as hell. I found several posts that essentially say the "solution" is to delete your login profile and start again. Yeah, that really sucks, Microsoft. I'd like a solution that doesn't require me to blow away my profile and spend hours getting my desktop back into the shape that I like. That is a real PITA for sure.

I have three profiles on my machine and can only generate recorded web tests with one of them. I blew away one of the smaller profiles (not my main profile, where I have 2.5+ gigs of files) and gave it a try: no-go. I still could not record web tests. Yeah, that is bad. Very bad.

I'm not sure what the solution on this is going to be. In the short term I'm going to have to use my Administrator login to create scripts and then xfer them over to my main login. How craptacular is that? Once I get some free time I'll have to look into this further, but I am dreading it. No fun!

Here are some MS blog entries related to the problem. Perhaps they could help somebody else but I haven't seen any love for me, yet.

http://blogs.msdn.com/mtaute/archive/2007/11/09/diagnosing-and-fixing-web-test-recorder-bar-issues.aspx
http://blogs.msdn.com/edglas/archive/2007/11/01/web-test-recorder-bar-not-showing-up.aspx

It appears that Fiddler2 might be an interim solution, from what I read in this MS blog post: http://blogs.msdn.com/slumley/pages/enhanced-web-test-support-in-fiddler.aspx.

A PITA? Sure, but it beats not having any solution at all. Fiddler is purt near cool and I've "fiddled" with it a little bit and it is nifty.

Friday, June 20, 2008

Getting all the metrics using WebTestRequestPlugin

If there is one shortcoming that I've noticed with VS2008 for load testing, it is the default report exporting. With LR (since at least version 6.5, as far as I can remember) we've always had the ability to export the final output of page response times to an Excel spreadsheet.

At my previous employer that was part of the routine: Analyze results, export to Excel spreadsheet and then use some custom software to crunch the before and after results into a snazzy easy to read format (props to Brian of www.alphasixty.com!).

From what I've seen with VS2008 so far, the exported XML format (with an extension of .trx) contains a lot of good data, but it isn't as simple as opening an Excel spreadsheet with all the page response times and slicing and dicing from there. So, I still have to give props to LR for that.

Something that I've always wanted from LR was even more information: more page response times, and the response times of the objects pulled by the pages we were testing. Transaction times are all we ever got.

Well, while playing with VS2008 today I figured out some nifty stuff. Microsoft has done a pretty good job of allowing the tester who happens to be a coder to do just this type of thing, and I am doing it with a WebTestRequestPlugin.

Here is the basic class that I wrote to get the page response time of the primary call. It's nothing fancy; for right now it just dumps to STDOUT, but it easily can (and will) be adapted to insert into a database for further analysis. I've just dumped the page and the time in milliseconds for ease of demonstration.

Here is my plugin code:



private class GetAllMetrics : WebTestRequestPlugin {
  public override void PostRequest(object sender, PostRequestEventArgs e) {
    string requestedURL = e.Request.UrlWithQueryString.ToString();
    double requestTime  = e.Response.Statistics.MillisecondsToLastByte;
    string outputString = "The resource " + requestedURL + " took " + requestTime.ToString() +
                          " milliseconds to load.";
    Debug.WriteLine(outputString);
  }
}



Pretty simple, eh? Sure it is. Short and sweet and to the point.

For this example I am hitting AOL.com. The reasons will be clear in a bit.

I made a private instance of this class in my coded web test:



private GetAllMetrics getMetrics = new GetAllMetrics();



The reason for this will be clear in a moment. Now, wire up the PostRequest to the instance declared above for my request to AOL.com:



WebTestRequest request1 = new WebTestRequest("http://www.aol.com/");
request1.ThinkTime = 2;
request1.PostRequest += new EventHandler<PostRequestEventArgs>(getMetrics.PostRequest);
yield return request1;
request1 = null;



When I hit AOL.com I get something like this in my output window:

"The resource http://www.aol.com/ took 272 milliseconds to load."

That is nifty.

After that, I got to thinking. Why can't I wire up all the dependent requests with the same plumbing and get all the metrics for dependent requests?

I couldn't think of any reason why not and coded this method to do so:



   1:  private void setupDependentMetrics(object sender, PostRequestEventArgs e) {
   2:    foreach (WebTestRequest request in e.Request.DependentRequests) {
   3:      request.PostRequest += new EventHandler<PostRequestEventArgs>(getMetrics.PostRequest);
   4:    }
   5:  }



There on line #3 you see where I reference the private instance declared in my coded web test class. Makes sense now, eh?

I wired up a PostRequest with the method that I coded and got this:



WebTestRequest request1 = new WebTestRequest("http://www.aol.com/");
request1.ThinkTime = 2;
request1.PostRequest += new EventHandler<PostRequestEventArgs>(setupDependentMetrics);
request1.PostRequest += new EventHandler<PostRequestEventArgs>(getMetrics.PostRequest);
yield return request1;
request1 = null;



There we go! Let's fire off this bad boy and see what kind of output we get now.



   1:  The resource http://www.aol.com/ took 272 milliseconds to load.

   2:  The resource http://www.aolcdn.com/_media/aolp_v31/main.js took 44 milliseconds to load.

   3:  The resource http://www.aolcdn.com/_media/aolp_v31/main.css took 86 milliseconds to load.

   4:  The resource http://o.aolcdn.com/ads/adsWrapperAT.js took 41 milliseconds to load.

   5:  The resource http://www.aolcdn.com/aolp_mkhome/474f6b9b-00394-029c3-400cb8e1 took 22 milliseconds to load.

   6:  The resource http://o.aolcdn.com/omniunih.js took 51 milliseconds to load.

   7:  The resource http://www.aolcdn.com/aolp_mkhome/474f6b9a-00272-029c3-400cb8e1 took 13 milliseconds to load.

   8:  The resource http://www.aolcdn.com/aolp/oo_engine_main.js took 14 milliseconds to load.

   9:  The resource http://www.aolcdn.com/_media/aolp_v31/updated.gif took 17 milliseconds to load.

  10:  The resource http://www.aolcdn.com/_media/aolp_v31/new.gif took 20 milliseconds to load.

  11:  The resource http://www.aolcdn.com/aolportal/mars-200-061708.jpg took 36 milliseconds to load.

  12:  The resource http://www.aolcdn.com/aolp_dlsnag/snagbutton_default.gif took 12 milliseconds to load.

  13:  The resource http://www.aolcdn.com/aolportal/derek-jeter-yankees-60mh0620.jpg took 20 milliseconds to load.

  14:  The resource http://www.aolcdn.com/_media/aolp_v31/bctrl.gif took 10 milliseconds to load.

  15:  The resource http://www.aolcdn.com/_media/aolp_v31/pctrl.gif took 11 milliseconds to load.

  16:  The resource http://www.aolcdn.com/_media/aolp_v31/fctrl.gif took 11 milliseconds to load.

  17:  The resource http://ar.atwola.com/image/93238809/81806566/aoladp took 114 milliseconds to load.

  18:  The resource http://twx.doubleclick.net/ad/TW.AOLCom/Site_WS_3/Ticker;MN=93238809;wm=o;rdv=xyz;!c=AIR;!c=AMU;!c=APP;!c=ASS;!c=AUC;!c=AUT;!c=AVE;!c=BEV;!c=BSS;!c=BWL;!c=CMP;!c=COS;!c=CPG;!c=DRG;!c=EDU;!c=EYE;!c=FIT;!c=FLE;!c=FOD;!c=GOV;!c=HCS;!c=HOM;!c=HPS;!c=JLY;!c=MED;!c=MFG;!c=MUS;!c=NEW;!c=PET;!c=PHM;!c=PHO;!c=PRI;!c=PUB;!c=RES;!c=RLE;!c=RTL;!c=SPT;!c=TEL;!c=TOB;!c=TOY;!c=TRL;!c=TV;!c=PER;!c=RDO;!c=SRC;!c=TIC;!c=d-fls;!c=d-jav;!c=d-dxp;!c=d-pxp;sz=88x31;dcove=rd;ord=81806566? took 123 milliseconds to load.

  19:  The resource http://www.aolcdn.com/htmlws30_r1/aol_logo_v3.gif took 484 milliseconds to load.

  20:  The resource http://2mdn.aolcdn.com/viewad/1092682/aolmf.gif took 34 milliseconds to load.

  21:  The resource http://www.aolcdn.com/aolmovies/stars-in-hats-denise-richards-78x78 took 27 milliseconds to load.

  22:  The resource http://www.aolcdn.com/aolp_newspromo3/485bd664-00237-00c1e-400cb8e1 took 12 milliseconds to load.

  23:  The resource http://www.aolcdn.com/aolp_newspromo3/485aafec-0019c-06c66-400cb8e1 took 93 milliseconds to load.

  24:  The resource http://www.aolcdn.com/aolportal/couple-holding-hands-120az060508.jpg took 29 milliseconds to load.

  25:  The resource http://www.aolcdn.com/aolp_supertab_mail/4602fec6-002ab-06ee1-0a00ec5a took 11 milliseconds to load.

  26:  The resource http://www.aolcdn.com/_media/aolp_v31/stb_arr_dn took 13 milliseconds to load.

  27:  The resource http://www.aolcdn.com/aolp_w/3s30 took 15 milliseconds to load.

  28:  The resource http://www.aolcdn.com/aolp_supertab_radio/48172fde-00384-00095-400cb8e1 took 15 milliseconds to load.

  29:  The resource http://www.aolcdn.com/aolp_supertab_video/467abd8c-000b9-05971-0a00ead6 took 12 milliseconds to load.

  30:  The resource http://www.aolcdn.com/aolp_page3/45fb58fc-0006d-04433-0a00ec5a took 13 milliseconds to load.

  31:  The resource http://www.aolcdn.com/_media/aolp_v31/stb_arr_up took 11 milliseconds to load.

  32:  The resource http://ar.atwola.com/image/93227127/81806568/aoladp took 52 milliseconds to load.

  33:  The resource http://ar.atwola.com/image/93236004/81806571/aoladp took 116 milliseconds to load.

  34:  The resource http://twx.doubleclick.net/ad/TW.AOLCom/Site_WS_3;MN=93227127;u=re446207ff2c45ddf;wm=o;rdv=xyz;!c=d-jav;sz=300x250;dcove=rd;ord=81806568? took 98 milliseconds to load.

  35:  The resource http://2mdn.aolcdn.com/viewad/1022462/nbcu_hulk_300x250_now.jpg took 36 milliseconds to load.

  36:  The resource http://www.aolcdn.com/pops_promo/fp_ws_0308_re_luxhouse.jpg took 24 milliseconds to load.

  37:  The resource http://twx.doubleclick.net/ad/TW.AOLCom/Site_WS_3;MN=93236004;u=re446207ff2c45ddf;wm=o;rdv=xyz;!c=d-fls;!c=d-jav;!c=d-dxp;!c=d-pxp;sz=121x60;dcove=rd;ord=81806571? took 99 milliseconds to load.

  38:  The resource http://2mdn.aolcdn.com/viewad/817-grey.gif took 10 milliseconds to load.

  39:  The resource http://www.aolcdn.com/pops_promo/wrtflash_v6.js took 12 milliseconds to load.

  40:  The resource http://ar.atwola.com/image/93227135/81806572/aoladp took 51 milliseconds to load.

  41:  The resource http://twx.doubleclick.net/ad/TW.AOLCom/Site_WSA_3;MN=93227135;u=re446207ff2c45ddf;wm=o;rdv=xyz;!c=d-gif;!c=d-jpg;!c=d-imrd;!c=d-fls;!c=d-jav;!c=d-dxp;!c=d-pxp;sz=291x30;dcove=rd;ord=81806572? took 102 milliseconds to load.

  42:  The resource http://2mdn.aolcdn.com/viewad/817-grey.gif took 10 milliseconds to load.



Look at all that! Why did I use aol.com? Because I knew a bunch of crap would be loaded with the home page hit. :^)

I dunno about you but I thought this was pretty nifty.

I figure that if I load the current transaction into the Context of the current request, I can log the transaction and all the associated hits and do some neat number crunching. That'll come later, but that's enough for right now.
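The number crunching itself doesn't need the test framework at all. Here is a minimal sketch (transaction names and timings invented for illustration) of bucketing hit times by transaction and averaging them:

```csharp
using System;
using System.Collections.Generic;

class TransactionMetrics {
    // Request timings bucketed by transaction name.
    private Dictionary<string, List<double>> timings =
        new Dictionary<string, List<double>>();

    public void Record(string transaction, double milliseconds) {
        if (!timings.ContainsKey(transaction)) {
            timings[transaction] = new List<double>();
        }
        timings[transaction].Add(milliseconds);
    }

    public double Average(string transaction) {
        double sum = 0;
        foreach (double ms in timings[transaction]) sum += ms;
        return sum / timings[transaction].Count;
    }

    static void Main() {
        // Hypothetical hits that a PostRequest handler might have recorded.
        TransactionMetrics metrics = new TransactionMetrics();
        metrics.Record("AOLHomePage", 272);
        metrics.Record("AOLHomePage", 44);
        metrics.Record("AOLHomePage", 86);
        Console.WriteLine(metrics.Average("AOLHomePage") + " ms average");
    }
}
```

Swap Console.WriteLine for a database insert and you have the raw material for per-transaction averages and standard deviations.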

Here is the entire coded web test:



namespace WebTestRequestMetricCollection {
  using System;
  using System.Collections.Generic;
  using System.Text;
  using Microsoft.VisualStudio.TestTools.WebTesting;
  using Microsoft.VisualStudio.TestTools.WebTesting.Rules;
  using System.Diagnostics;

  public class AOLHomePage : WebTest {
    private GetAllMetrics getMetrics = new GetAllMetrics();

    public AOLHomePage() {
      this.PreAuthenticate = true;
    }

    public override IEnumerator<WebTestRequest> GetRequestEnumerator() {
      // Initialize validation rules that apply to all requests in the WebTest
      if ((this.Context.ValidationLevel >= Microsoft.VisualStudio.TestTools.WebTesting.ValidationLevel.Low)) {
        ValidateResponseUrl validationRule1 = new ValidateResponseUrl();
        this.ValidateResponse += new EventHandler<ValidationEventArgs>(validationRule1.Validate);
      }
      WebTestRequest request1 = new WebTestRequest("http://www.aol.com/");
      request1.ThinkTime = 2;
      request1.PostRequest += new EventHandler<PostRequestEventArgs>(setupDependentMetrics);
      request1.PostRequest += new EventHandler<PostRequestEventArgs>(getMetrics.PostRequest);
      yield return request1;
      request1 = null;
    }

    private void setupDependentMetrics(object sender, PostRequestEventArgs e) {
      foreach (WebTestRequest request in e.Request.DependentRequests) {
        request.PostRequest += new EventHandler<PostRequestEventArgs>(getMetrics.PostRequest);
      }
    }

    private class GetAllMetrics : WebTestRequestPlugin {
      public override void PostRequest(object sender, PostRequestEventArgs e) {
        string requestedURL = e.Request.UrlWithQueryString.ToString();
        double requestTime  = e.Response.Statistics.MillisecondsToLastByte;
        string outputString = "The resource " + requestedURL + " took " + requestTime.ToString() +
                              " milliseconds to load.";
        Debug.WriteLine(outputString);
      }
    }
  }
}

Wednesday, June 18, 2008

Loading parameter data from a CSV in a VS2008 coded web test

Today I've been working on how to load correlated data from a CSV file into a coded web test. VS2008 has support for a wide range of data sources for correlation, but I still prefer the good ol' CSV format.

I rag on LR a lot, but selecting a data file and accessing it with the "{pSomeParam}" format is much easier in LR than in VS2008. Details below:

In my little webtest I recorded a simple script hitting Google and I wanted to be able to pass different search terms to Google. Nothing fancy, just enough to figure out how to do what I wanted. This task in LR is pretty trivial.

I created a file by the name of searchTerms.csv that contains the following:

8<------------------
column1
loadrunner
perl
superbase
vs2008
------------------>8

I then added a data source to the web test and converted it over to code to see how I need to code things in the future and this is how it looks:



[DataSource("searchTerms", 
            "Microsoft.VisualStudio.TestTools.DataSource.CSV", 
            "|DataDirectory|\\databindingtowebtest\\searchTerms.csv", 
            Microsoft.VisualStudio.TestTools.WebTesting.DataBindingAccessMethod.Sequential,
            "searchTerms#csv")]
[DataBinding("searchTerms", "searchTerms#csv", "column1", "GoogleSearchTerm")]



The context for the current web test should now be automagically updated with an index entry by the name of "GoogleSearchTerm" that is referenced like this:



request2.QueryStringParameters.Add("q", this.Context["GoogleSearchTerm"].ToString(), false, false);



I don't mind accessing the correlated data via the context; that is no big deal, but setting up the data binding is kind of a complicated PITA. I have to give nods to LR at this point for the ease of creating params. On the other hand, VS2008 gives me a lot more control over slicing and dicing returned HTML data, so there had to be a trade-off someplace, and I guess this is it.

Something important that I've learned is that numeric data needs to be double quoted to prevent mangling of the values. For example, I have the following entries now:



  [DataSource("lastNames", 

              "Microsoft.VisualStudio.TestTools.DataSource.CSV", 

              "c:\\work\\data\\LRData\\lastNames.csv", 

              Microsoft.VisualStudio.TestTools.WebTesting.DataBindingAccessMethod.Random, //   .Sequential, 

              "lastNames#csv")]

 

  [DataSource("targetServer",

            "Microsoft.VisualStudio.TestTools.DataSource.CSV",

            "c:\\work\\data\\LRData\\targetServer.csv",

            Microsoft.VisualStudio.TestTools.WebTesting.DataBindingAccessMethod.Sequential,

            "targetServer#csv")]

 

  [DataBinding("lastNames", "lastNames#csv", "lastName", "lastName")]

  [DataBinding("targetServer", "targetServer#csv", "TargetServer", "targetServer")]



The IP address that I want to hit is bound to the context entry for "targetServer."

I found that if I have the entry as:

8<----------------------
TargetServer
10.0.0.100
---------------------->8

If the value is not quoted, it gets mangled into "10.001." Just adding double quotes around the value lets me use the IP address as I originally intended.
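With the quotes in place, targetServer.csv looks like this:

8<----------------------
TargetServer
"10.0.0.100"
---------------------->8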



string targetWebServer = this.Context["targetServer"].ToString();

WebTestRequest request1 = new WebTestRequest("http://" + targetWebServer + "/someVDir/somePage.aspx");



Knowing is half the battle. (GI Joe!)

Tuesday, June 17, 2008

Selecting a random button in a grid with VS2008

I am finally getting into writing virtual users with the VS2008 VSTE tools, and the learning curve is pretty steep, but the amount of control that I get is pretty darn cool, so I think the pain points will be worth it.

Something that has always been a pain with LR is randomly selecting some parsed HTML to randomize the next step in a vuser script. Sure, you can use web_reg_save_param with "Ord=All", a la:

web_reg_save_param ("someParam", "LB= value=\"", "RB=\"", "Ord=All", LAST);

and pull it all into an array parameter and then iterate over the entries in the parameter by utilizing sprintf() ala:




web_reg_save_param ("someParam", "LB= value=\"", "RB=\"", "Ord=All", LAST);

for (i = 1; i <= atoi(lr_eval_string("{someParam_count}")); i++) {
  sprintf(someParamValue, "{someParam_%d}", i);
  lr_output_message("%s", lr_eval_string(someParamValue));
}




That way of accessing the parameter array info has always seemed like a PITA to me (maybe I'm in the minority on this?). Want to substring the output even further? Even more of a PITA, since C doesn't have any quick and friendly string handling functions (substr(), left(), ltrim(), etc.).


But, you have to be able to do it to get the job done.
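For comparison, the same kind of slicing in .NET is one-liner territory. A toy example (the URL and values here are invented, not from any real script):

```csharp
using System;

class StringSlicing {
    static void Main() {
        // Grab the file name off a URL, the way you might trim a captured value.
        string url = "http://www.example.com/scripts/someJavaScriptFile.js";
        string fileName = url.Substring(url.LastIndexOf('/') + 1);
        Console.WriteLine(fileName);           // someJavaScriptFile.js
        Console.WriteLine("  perl  ".Trim());  // perl
    }
}
```

Substring(), Trim(), IndexOf() and friends replace the hand-rolled substr()/ltrim() helpers you end up writing in a C vuser script.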

Today I needed to basically select a random customer view button from a grid that was returned from this WebTestRequest hit:




   1:  WebTestRequest request3 = new WebTestRequest("http://xxx.yyy.zzz.iii/someWebPage.aspx");

   2:  request3.Method = "POST";

   3:  FormPostHttpBody request3Body = new FormPostHttpBody();

   4:  request3Body.FormPostParameters.Add("__EVENTTARGET", this.Context["$HIDDEN1.__EVENTTARGET"].ToString());

   5:  request3Body.FormPostParameters.Add("__EVENTARGUMENT", this.Context["$HIDDEN1.__EVENTARGUMENT"].ToString());

   6:  request3Body.FormPostParameters.Add("__LASTFOCUS", this.Context["$HIDDEN1.__LASTFOCUS"].ToString());

   7:  request3Body.FormPostParameters.Add("__VIEWSTATE", this.Context["$HIDDEN1.__VIEWSTATE"].ToString());

   8:  request3Body.FormPostParameters.Add("__VIEWSTATEENCRYPTED", this.Context["$HIDDEN1.__VIEWSTATEENCRYPTED"].ToString());

   9:  request3Body.FormPostParameters.Add("__EVENTVALIDATION", this.Context["$HIDDEN1.__EVENTVALIDATION"].ToString());

  10:  request3.Body = request3Body;

  11:  ExtractHiddenFields extractionRule2 = new ExtractHiddenFields();

  12:  extractionRule2.Required = true;

  13:  extractionRule2.HtmlDecode = true;

  14:  extractionRule2.ContextParameterName = "1";

  15:  request3.ExtractValues += new EventHandler<ExtractionEventArgs>(selectRandomButton);

  16:  request3.ExtractValues += new EventHandler<ExtractionEventArgs>(extractionRule2.Extract);

  17:  yield return request3;

  18:  request3 = null;



The WebTestRequest has an event that allows coders to write their own code for extracting data from the HTML output. I wrote the code below, which is referenced in the above code at line #15:



   1:  void selectRandomButton(object sender, ExtractionEventArgs e) {
   2:    if (e.Response.HtmlDocument != null) {
   3:      List<String> buttonNameList = new List<String>();
   4:      foreach (HtmlTag tag in e.Response.HtmlDocument.GetFilteredHtmlTags(new string[] { "input" })) {
   5:        if (tag.GetAttributeValueAsString("type") == "submit") {
   6:          buttonNameList.Add(tag.GetAttributeValueAsString("name"));
   7:        }
   8:      }
   9:      if (buttonNameList.Count > 0) {
  10:        Random randomizer = new Random();
  11:        e.WebTest.Context.Add("RANDOMCUSTOMERBUTTON",
  12:                               buttonNameList[randomizer.Next(buttonNameList.Count)]);
  13:        e.Success = true;
  14:      } else {
  15:        e.Success = false;
  16:        e.Message = "No entries were found for customer view buttons.";
  17:      }
  18:    } else {
  19:      e.Success = false;
  20:      e.Message = "No HTML to work against!";
  21:    }
  22:  }



Take a gander at the loop at line #4. In that loop I'm inspecting a group of elements, searching for the data that I want. No need to specify a dozen calls to web_reg_save_param only to reference said params via lr_eval_string() and strcmp(). That's right: nice, simple, easy-to-read code! It's all there. I can spin, fold and mutilate the information all I want! And I have access to the .NET containers, so I don't have to write my own doubly linked list for storing data or declare a square array in advance (I did a lot of square arrays because I am a slacker like that). How sweet is that?!

While it might seem more complex at first, I believe the extra control I get over the selection of data is fantastic. I can think of several times in the past that I've had to write more complicated code than I wanted for the slicing and dicing of HTML, and this method would have made things so much easier for me.

And I found a cool code formatting site today located at http://www.manoli.net/csharpformat/format.aspx. It took a little work getting the CSS stuff taken care of (after all, I'm not a GUI whiz) but it was worth it for sure.

Saturday, June 14, 2008

Still waiting for laptop and getting good training.

Well, the IT department is still delaying on my laptop, as Sager does not extend a line of credit. *sigh* I found another vendor with a similar rebadged Clevo laptop, and for about the same price I can get quad Xeon procs if I drop the Blu-ray burner. The burner isn't a big deal and I'd rather have the extra horsepower.

The past two weeks I have been receiving training in "Agile Mastery" and "Test Driven Development" and I think both are pretty darn cool. I really enjoyed the "Test Driven Development" training and got a lot out of it.

I can see where the Agile process can be really handy. Load testing has always been the bane of the developer and the PM, since we usually find problems during the final testing phase and force code rework and pushed-back go-live dates. I know that I've pushed many a project back, to the chagrin of many a program manager. Tough chit, I always say.

If all goes well, we will eventually have automated LTs with continuous integration, and that should be real cool. I still see formalized before/after load tests at the end of each iteration for formal iteration results, though.

TDD seemed really bass-ackwards to me at first, but after I got into it, I really dug what I was seeing. The night of the second day of class I continued working on JUnit tests for the lab and went back to refactor some of my code, and I got a real warm and fuzzy feeling from re-executing those unit tests and seeing that I hadn't changed the end results of the methods I had refactored. Suh-weet!

I am hoping to start creating VS2008 coded web tests for the project on Monday. The customer has agreed to supply information, but it looks like I will be volunteered to scrub the data of any identifying information (which, while a pain, is a good thing to do). Hello, perl! I love perl for slicing and dicing.

In the TDD class I was also introduced to the Eclipse IDE. The folks I have been talking with really seem to like the Eclipse IDE more than the VS IDE. So, I took it upon myself to download the latest "Eclipse Classic" and give it an honest go. I found some plug-ins for perl, python/jython (I still want to play around with The Grinder 3), ruby (there is some RoR development here) and BeanShell (for JMeter development).

Can I really pry my hands off elvis for perl development? When I'm coding perl my hands automagically go into vi mode; normally I don't drop out of elvis unless I'm invoking Komodo for perl debugging. One of the other developers said that there is a plug-in for vi keybindings for Eclipse, so maybe there is hope after all!

If the data scrubbing goes well, I am hoping to finally be chunking FTP/SMTP data at the test servers and start doing some metrics collection and analysis.

Wednesday, June 4, 2008

JMeter SMTP Plug-in author responds

Luca Maragnani, the author of the SMTP Plug-in for JMeter, responded to the e-mail I sent the other day about some things I found interesting about the plug-in, and he gave me the skinny on the "[res_key=smtp_title]" issue:

"To solve the issue of [res_key=smtp_title] in order to be "SMTP Sampler" you must edit the properties file org/apache/jmeter/resources/messages.properties in ApacheJmeter_core.jar and add the key 'smtp_title=SMTP Sampler'."

So I took the .jar, uncompressed it with WinRAR, made the edits as described, re-archived the files as a .zip, renamed the extension back to .jar, and gave it a run. It worked like a champ.

No word on the "every e-mail comes back with a warning when the check-for-failure checkbox is checked" issue. I believe we had a communication issue there. No big deal all in all.

It was purt near nice of him to respond to my e-mail in the first place, and I thank him for his useful contribution.