ProClarity hangs on large queries

  • Question

  • I've got a report that contains between 40,000 and 50,000 cells when all rows (> 4,000) are returned.  Using ProClarity Web Standard it returns the first 100 rows just fine, but when a user clicks "Get All Data" the browser completely hangs.  The same report works fine in Web Professional, though.

    I have:

    + Upgraded to ProClarity Analytics Server 6.3 SP2 (Version 6.3.2217.188)
    + Increased AspMaxRequestEntityAllowed in Metabase.xml to 1204800
    + Modified various global.asa settings:
       o RowCount = 5000
       o MaxQuerySize = 100000
       o LimitQuerySize = FALSE
       o SpaceLimitMB = 2048
       o HardSpaceLimitMB = 4096

    This is clearly an IIS/Web Standard issue as it works fine in Professional.  What other config settings can I play with to prevent the browser from hanging?

    Thanks!


    NKB
    Tuesday, December 23, 2008 5:37 PM

Answers

  • What you're seeing is the result of attempting to run a transactional report in an analysis tool.  ProClarity Standard is admittedly not at all good at running a view with this many rows and columns.  Professional handles it all right because it pages the data, but Standard loads it all at once and is forced to work through the entire JavaScript block returned from the server to format and display the grid.  Believe it or not, we actually spent quite a bit of time and effort on that JavaScript to be sure it was as efficient as possible, but we still see results such as the ones you are getting when the dataset becomes fairly large.
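The paging difference described above can be sketched as follows (illustrative JavaScript only, not ProClarity's actual client code; the function and data names are made up):

```javascript
// Illustrative only: why a paging client stays responsive.
// Rendering work is proportional to the rows actually displayed,
// not to the full result set.
function getPage(rows, pageSize, pageIndex) {
  // Slice out just the rows needed for one page of the grid.
  const start = pageIndex * pageSize;
  return rows.slice(start, start + pageSize);
}

// A 4,000-row result set, as in the report described in this thread.
const allRows = Array.from({ length: 4000 }, (_, i) => "row " + i);

// A paging client (like Professional) formats 100 rows per interaction...
const page = getPage(allRows, 100, 0);

// ...while a "Get All Data" style client must format all 4,000 rows
// (and every cell in them) before the browser can respond again.
```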

    Your suggestions for an architecture change are good ones, and ones we've talked about before, but at its foundation ProClarity is an analysis tool, so major changes to accommodate what would be considered transactional reports have not been undertaken.

    My suggestion would be to bring SQL Server Reporting Services into the environment to handle reporting needs such as this, and then use ProClarity to augment that reporting with analysis.
    Microsoft ProClarity | This posting is provided "AS IS" with no warranties, and confers no rights.
    • Marked as answer by NBauer Monday, August 17, 2009 10:07 PM
    Thursday, January 8, 2009 11:56 PM

All replies

  • Hi NBauer,

    The first thing that jumps out at me would be to raise AspMaxRequestEntityAllowed incrementally, to see whether a larger value allows the complete data set to be returned to the Standard client.  You're probably still running into that limit not being adequate for the volume of data you're attempting to retrieve.  Here is a link to a post on the ProClarity team blog that may be of use to you:  http://blogs.technet.com/proclarity/archive/2008/05/20/proclarity-web-standard-behavior-my-views-don-t-show-up-after-being-saved.aspx   
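For reference, in IIS 6 this property is stored as an attribute on the W3SVC node in metabase.xml; a fragment with the value already applied might look like the sketch below (AspMaxRequestEntityAllowed is the real metabase property name, but the node's many other attributes are omitted, so this is not a complete element):

```xml
<!-- metabase.xml: sketch of the W3SVC node (other attributes omitted) -->
<IIsWebService Location="/LM/W3SVC"
    AspMaxRequestEntityAllowed="1204800"
/>
```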

    Hope this helps,

    Bob


    Microsoft ProClarity | This posting is provided "AS IS" with no warranties, and confers no rights.
    Friday, December 26, 2008 5:06 PM
  • Bob,

    I've increased the AspMaxRequestEntityAllowed to 1 GB (and restarted IIS), but to no avail.

    I've also installed Fiddler and traced the HTTP traffic.  For smaller result sets (a few hundred rows) I can see the request and response (JavaScript/HTML) generated by ProClarity in Fiddler, and the output displays fine in my browser.  For larger result sets of a few thousand rows (after clicking "Get All Data") I also see the response in Fiddler, however the browser hangs (both IE and Firefox).  The size of the JavaScript returned is a little over 7 MB, which is well below the 1 GB setting for AspMaxRequestEntityAllowed.  Is 7 MB of JavaScript too much data for the browser?  Any other settings you can think of that are worth updating?

    Niels
    NKB
    Tuesday, December 30, 2008 8:30 PM
  • Hello NKB,

    I have seen this happen a few times in the past, though the occurrence is very rare.  You'll need to test whether or not the ‘Get All Data’ function is exceeding an IIS metabase limit (AspBufferingLimit, which defaults to 4 MB).  You can adjust this setting in one of two ways: 

    1)      The metabase.xml file under C:\Windows\system32\inetsrv\.  You’ll need to stop the IIS Admin Service before making the change, and restart it once you’ve saved the metabase.xml file with its new settings.  If you’re unable to stop the IIS Admin Service you can navigate to IIS Manager and right click the server name.  Then select Properties and check the Enable Direct Metabase Edit box.

    2)      The IIS 6 Resource Kit’s Metabase Explorer.  Once in the Metabase Explorer, highlight the W3SVC node and you’ll see all of its properties listed along the right side of your screen.  You can directly edit all IIS settings by double-clicking the appropriate row.

    You’ll want to set AspBufferingLimit to approximately 1 MB for every 1,000 records.  Alternatively, you can set it to a much larger value (such as 2000000000, roughly 2 GB) to see whether that eliminates the problem without testing different limits; that may be the fastest approach for test purposes.  There are no known adverse performance effects on IIS beyond what is imposed by the bandwidth of your web clients. 
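Putting the steps above together for this report: at roughly 1 MB per 1,000 records, a 4,000-row result set needs about 4 MB, which sits right at the default limit, so a larger value is the safer test.  As a sketch of option 1 (AspBufferingLimit is the real IIS 6 metabase property; the node's other attributes are omitted here):

```xml
<!-- metabase.xml: sketch of the W3SVC node (other attributes omitted) -->
<IIsWebService Location="/LM/W3SVC"
    AspBufferingLimit="2000000000"
/>
```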

    Hopefully this helps.  Keep us posted.

    Regards,
    Amanda


    Microsoft ProClarity | This posting is provided "AS IS" with no warranties, and confers no rights.
    Tuesday, December 30, 2008 9:38 PM
  • Amanda,

    Here's what I've found.  The report does in fact run; it just outlasted my patience.  Again, as seen through Fiddler, the results are returned from PAS in a timely manner.  The apparent hanging is just the browser executing the returned JavaScript and rendering the results.

    For a report with 12 columns and a little over 4,000 rows it took 50 minutes for Internet Explorer to execute the JavaScript.  I am running an Intel Core Duo at 1.8 GHz with 2 GB of RAM.  Nothing spectacular, but more powerful than what many of our customers are running.  Fifty minutes won't satisfy our performance goals.

    Is this expected performance for a result set of this size?  Are there alternatives within ProClarity (e.g., rather than "Get All Data" returning JavaScript, could it return plain HTML in a new browser window, or offer some sort of next-page/previous-page functionality)?

    Thanks,

    Niels


    NKB
    Tuesday, January 6, 2009 5:45 PM
  • What you're seeing is the result of attempting to run a transactional report in an analysis tool.  ProClarity Standard is admittedly not at all good at running a view with this many rows and columns.  Professional handles it all right because it pages the data, but Standard loads it all at once and is forced to work through the entire JavaScript block returned from the server to format and display the grid.  Believe it or not, we actually spent quite a bit of time and effort on that JavaScript to be sure it was as efficient as possible, but we still see results such as the ones you are getting when the dataset becomes fairly large.

    Your suggestions for an architecture change are good ones, and ones we've talked about before, but at its foundation ProClarity is an analysis tool, so major changes to accommodate what would be considered transactional reports have not been undertaken.

    My suggestion would be to bring SQL Server Reporting Services into the environment to handle reporting needs such as this, and then use ProClarity to augment that reporting with analysis.
    Microsoft ProClarity | This posting is provided "AS IS" with no warranties, and confers no rights.
    • Marked as answer by NBauer Monday, August 17, 2009 10:07 PM
    Thursday, January 8, 2009 11:56 PM