Performance testing of UAG devices

  • General discussion

  • Hi, I was wondering if anybody has any experience of performance testing UAGs.

We're currently using HP LoadRunner to record/replay user interactions via the UAGs and have noticed a unique hash being sent to the server on every login attempt. The problem we have is that this hash is generated locally within the SetPolicy.asp step, and LoadRunner is unable to execute the code necessary to generate it. Hence every subsequent attempt to log in fails.

    What I'm after are some tips/suggestions of how to get around this. We're currently looking at recreating the code that generates the hash within the LoadRunner script, but this is proving difficult.

We have used LoadRunner successfully against the IAGs previously; there, the hash only changed when we updated our local AV (i.e. once a week), so we were able to re-use the same hash for every access. I'm therefore thinking there is some other data being used (date/time, session ID, etc.) as input to the hash algorithm, giving us unique values.

    Any help would be much appreciated.

    Regards, Amit

    Monday, May 23, 2011 2:15 PM
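The hypothesis above can be sketched in code. This is purely illustrative: the real UAG hash algorithm is not public, and the secret, inputs, and HMAC construction below are all assumptions. The point is only that if per-session data (session ID, timestamp) feeds the hash alongside the client health results, a value captured in one session will never verify in the next, which would explain the replay failures.

```python
import hashlib
import hmac

# Hypothetical stand-in for whatever key material UAG uses; not the real scheme.
SECRET = b"uag-shared-secret"

def client_hash(health_results: bytes, session_id: bytes, timestamp: bytes) -> bytes:
    """Hash over the health results plus assumed per-session inputs."""
    return hmac.new(SECRET, health_results + session_id + timestamp,
                    hashlib.sha256).digest()

results = b"av=up-to-date;fw=on"

# Same health data posted in two different sessions:
h1 = client_hash(results, b"session-aaa", b"2011-05-23T14:15")
h2 = client_hash(results, b"session-bbb", b"2011-05-24T14:45")

# Different session inputs -> different hashes, so replaying h1 in a
# new session would be rejected by the server.
assert h1 != h2
```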

All replies

I would look at using Visual Studio tools ;)
    Jason Jones | Forefront MVP | Silversands Ltd | My Blogs: http://blog.msedge.org.uk and http://blog.msfirewall.org.uk
    Monday, May 23, 2011 4:23 PM
  • Hello again - thanks for your response.

VSTS is an option for us and I have tried recording against the UAG using VSTS (Web Performance Test), but I am experiencing the same issue. When I replay the recorded script, the post to /InternalSite/SetPolicy.asp that sends back the client results also includes a hash value. This hash value changes every time the client results are sent back, and hence the original recording fails on replay.

Would you know how I could reproduce a valid hash value, one the UAG server would accept, each time the recording is replayed?

    You said you have successfully run load testing against the UAG - did you run into this issue?

    Again, any help would be appreciated.

    Regards, Amit



    Tuesday, May 24, 2011 2:45 PM
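For context, the standard way load-test scripts handle dynamic values is correlation: capture the value from a server response and substitute it into the next request. A generic sketch (not VSTS or LoadRunner API; the field name and boundary strings are hypothetical) of that pattern follows. Crucially, this only works when the server issues the value; it cannot help here, because the UAG hash is computed client-side and never appears in a response to capture.

```python
import re

# A hypothetical response fragment carrying a server-issued token.
response_body = '<input type="hidden" name="hash" value="3f7a9c">'

# Correlation step: capture the dynamic value between known boundaries.
match = re.search(r'name="hash" value="([0-9a-f]+)"', response_body)
captured = match.group(1) if match else None

# Substitution step: feed the captured value into the next request.
# (The URL is illustrative only.)
next_request = f"POST /InternalSite/SetPolicy.asp hash={captured}"

assert captured == "3f7a9c"
```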
  • Hi Jason,

    Many thanks for your email yesterday; I've copied it below here so that we've got a single thread and also so that some of my colleagues who I've asked for help have visibility of the responses.

From speaking to a contact at MS, they confirmed that the hash value is purposefully unique, to stop someone fooling UAG into thinking a client was healthy just by passing the correct values. MS (not surprisingly) will not release the hash algorithm, as this would defeat the purpose of adding the security feature in the first place ;)

    The only option I think you have is to create a dedicated trunk for performance testing and then disable the endpoint installation and detection option for that trunk. This will prevent the detection process from running and should allow you to record the session in LR.  

    Kind Regards


    I've sent this request on to our suppliers and await their response.

What I wanted to ask was whether there are any metrics on how much overhead the endpoint detection adds on the UAGs. What I'm trying to avoid is running performance tests with detection turned off, reporting that our solution can support x users, and then finding out that in production, with endpoint detection enabled, it can really only support half or a tenth of that number. I know this is wishful thinking on my part, but any idea of how intensive the endpoint detection checks are for the UAGs (e.g. receiving and processing the client data, generating a hash to confirm the client data hasn't been corrupted or tampered with, redirecting to the web server, etc.) would be very useful.

    In the meantime, work with VSTS continues but we're still hitting the same problem of capturing/generating the unique hash.

    Regards, Amit

    Wednesday, May 25, 2011 9:45 AM
Based on my understanding, the detection script runs entirely client-side, so the only server impact occurs when the client uploads/posts the returned results. Even then, I would imagine this is a pretty small processing hit... I am not aware of any official metrics on this, though :(



    Jason Jones | Forefront MVP | Silversands Ltd | My Blogs: http://blog.msedge.org.uk and http://blog.msfirewall.org.uk
    Wednesday, May 25, 2011 9:53 AM
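A rough back-of-envelope check supports the "small processing hit" intuition. The sketch below times an HMAC-SHA256 verification of a posted results blob as a stand-in for whatever integrity check UAG actually performs server-side; the key size and payload size are assumptions, not UAG internals.

```python
import hashlib
import hmac
import timeit

key = b"k" * 32
payload = b"x" * 4096  # a generous size for posted client health results

expected = hmac.new(key, payload, hashlib.sha256).digest()

def verify() -> bool:
    # Constant-time comparison of a freshly computed digest against the
    # value the client posted.
    return hmac.compare_digest(
        hmac.new(key, payload, hashlib.sha256).digest(), expected)

# Average seconds per verification: microseconds on commodity hardware,
# i.e. many thousands of logins per second before this check alone
# becomes a bottleneck.
seconds = timeit.timeit(verify, number=10_000) / 10_000
assert verify()
assert seconds < 0.01
```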
  • Morning Jason,

    You mentioned in a previous email that you were aware of/involved in some successful performance testing of the UAGs using VS; could you provide us with any additional details on how this was achieved?

    Many thanks, Amit

    Thursday, May 26, 2011 10:03 AM