Where is the ProClarity Usage data stored?

  • Question

  • I'm logging both usage and exceptions, and I would like to extract this data into a table and build a cube on the usage data.  Can someone tell me the name of the file where this information is stored?  Thank you.

    Kole 
    KRS
    Monday, January 5, 2009 4:22 PM

Answers

  • The information is stored in a Windows Event Log file.  If you open up the Windows Event Viewer or the PAS Administration Tool, you should see a log called ProClarity.  I think SSIS has a task that will read Windows Event Log files and allow you to load the data into a table.  The log file gets overwritten after it reaches a certain size so you may want to grab the data at least once a day (or more if usage is heavy) so as not to lose the info.
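
    Just to illustrate: if that task is the WMI Data Reader, the query for this kind of extract could look something like the sketch below (untested, and the columns are just standard Win32_NTLogEvent properties rather than anything ProClarity-specific).

        SELECT RecordNumber, TimeGenerated, SourceName, Message
        FROM Win32_NTLogEvent
        WHERE Logfile = 'ProClarity' AND RecordNumber > 12345

    Replace 12345 with the highest RecordNumber from your previous load so that each run only picks up the new events before the log rolls over.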

    Back in the days of ProClarity 5.x and SQL 2000, I had written a complete solution for doing this (log extract, star schema tables, and cube) that was available on the ProClarity Best Practices community.  I'm not sure that the code is available any more though.

    -Jason
    BTG Services
    • Marked as answer by Sean_Flanagan Tuesday, January 6, 2009 2:35 PM
    Monday, January 5, 2009 10:28 PM

All replies

  • Thank you Jason. 

    Kole
    KRS
    Tuesday, January 6, 2009 3:04 PM
  • I think this is roughly the sample Jason is referring to.

    http://blogs.technet.com/proclarity/archive/2008/06/18/accessing-proclarity-analytics-server-usage-information.aspx
    Microsoft ProClarity | This posting is provided "AS IS" with no warranties, and confers no rights.
    Thursday, January 8, 2009 11:37 PM
  • Hi Jason

    Could you please explain how you extracted the log details, or provide a document that gives more details of the extraction?
    This would be of great help.

    Thanks
    Ahkreddy
    Tuesday, April 14, 2009 4:02 PM
  • ahkreddy,

    I was in the same boat as you about a year ago, and I ended up creating my own ETL solution in SSIS to extract the ProClarity usage event log, write it to SQL tables, and combine it with the existing ProClarity_PAS tables so I could load it into SSAS and report on usage.

    As far as extracting the event log goes, use the WMI Data Reader Task in SSIS.  It connects to the ProClarity event log, which for us is stored in C:\Windows\system32\config\PCEvent.Evt, extracts the usage events (EventType = '4'), and saves the results to a text file.
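
    The WQL itself is nothing special; as a rough sketch (not my exact query, and the columns are just standard Win32_NTLogEvent properties, so check them against your own log) it is along these lines:

        SELECT RecordNumber, TimeGenerated, User, Message, InsertionStrings
        FROM Win32_NTLogEvent
        WHERE Logfile = 'ProClarity' AND EventType = 4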

    I had to run this package on the server itself rather than remotely; it didn't like performing the WMI query remotely, but I didn't pursue why, as it wasn't an issue for me.

    Then I have a second package that does the ETL.  It extracts the BookElements, Books, Libraries and Users tables from ProClarity_PAS.  To get the event log data from the new text file into SQL tables, there are a number of SSIS steps.  Start by creating a Flat File Connection Manager for the output text file from the WMI extract, and write that to a staging table in SQL.  Next you have to do some work on the data to deal with the carriage returns and line feeds that mess it up.  Then pivot the data so that instead of having 16 rows in your table for one event record, you have one row per event record, and finally do some more manipulation to deal with data types, datetimes, trimming and so on.  Eventually you have the usage data in a format that you can use in conjunction with the ProClarity_PAS tables, load it up into a cube, and hey presto, you know which books your users are accessing, when, and so on.
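
    To give you a feel for the staging and pivot steps, here is a much-simplified sketch.  The table and column names are made up for illustration (not my actual package and not the ProClarity schema), and it assumes the text file lands as one property name/value pair per row, 16 rows per event.

        -- Staging table the flat file gets written to (illustrative layout only)
        CREATE TABLE dbo.PCEventStaging (
            RowId         int IDENTITY(1,1) NOT NULL,
            PropertyName  varchar(50)       NULL,
            PropertyValue varchar(4000)     NULL
        );

        -- Strip the stray carriage returns / line feeds and trim whitespace
        UPDATE dbo.PCEventStaging
        SET PropertyValue = LTRIM(RTRIM(REPLACE(REPLACE(PropertyValue, CHAR(13), ''), CHAR(10), '')));

        -- Pivot: 16 consecutive rows per event become 1 row per event
        SELECT (RowId - 1) / 16 AS EventNumber,
               MAX(CASE WHEN PropertyName = 'TimeGenerated' THEN PropertyValue END) AS EventTime,
               MAX(CASE WHEN PropertyName = 'User'          THEN PropertyValue END) AS UserName,
               MAX(CASE WHEN PropertyName = 'Message'       THEN PropertyValue END) AS EventMessage
        INTO dbo.PCEventUsage
        FROM dbo.PCEventStaging
        GROUP BY (RowId - 1) / 16;

    The datetime conversion and the joins back to the ProClarity_PAS tables come after that, once everything is one row per event.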

    Hope that helps you on the right path, as I know the old solutions were built in DTS.
    Regards
    Stuart
    STUART LAWRENCE
    Wednesday, April 29, 2009 8:38 PM
  • Hi Stuart

    Can you provide the package code you built, if possible, so that it can guide me in creating my own?
    Please help here.

    Thanks
    Ahkreddy
    Friday, May 8, 2009 5:51 PM