Delta View vs Delta Table on SQL Server Management Agent

  • Question

  • Hi FIM Experts,

    I'd like to ask your opinion on this "Delta View vs Delta Table" question.

    Using Delta View (our current scenario), triggers for Add, Edit, and Delete insert the record Id and a ChangeType into a "Changes" table. An SQL view then presents the complete data plus the ChangeType. The script that runs synchronization manages the "Changes" table by getting the max id from "Changes" before the delta import, performing the delta import, and then deleting the records in "Changes" where id <= previouslyFetchedMaxId once the sync cycle is finished.
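
    Roughly, the pieces look like this (a simplified sketch; the table, column, and trigger names here are made up for illustration, not our actual schema):

        -- Change log populated by triggers on the source table
        CREATE TABLE dbo.Changes (
            id         INT IDENTITY(1,1) PRIMARY KEY,
            PersonId   INT          NOT NULL,
            ChangeType NVARCHAR(10) NOT NULL   -- 'Add', 'Modify' or 'Delete'
        );
        GO
        -- One trigger per operation; the update trigger, for example:
        CREATE TRIGGER trg_Person_Update ON dbo.Person AFTER UPDATE AS
            INSERT INTO dbo.Changes (PersonId, ChangeType)
            SELECT Id, 'Modify' FROM inserted;
        GO
        -- Delta view: the complete row data plus the ChangeType.
        -- LEFT JOIN so deleted keys (no longer in dbo.Person) still appear.
        CREATE VIEW dbo.v_PersonDelta AS
            SELECT c.id AS ChangeId, c.PersonId, c.ChangeType, p.*
            FROM dbo.Changes c
            LEFT JOIN dbo.Person p ON p.Id = c.PersonId;

        -- Before delta import:  SELECT @maxId = MAX(id) FROM dbo.Changes;
        -- After the sync cycle: DELETE FROM dbo.Changes WHERE id <= @maxId;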

    Using Delta Table, I implemented it following this article:
    http://www.wapshere.com/missmiis/how-to-make-a-sql-delta-table
    I have 3 additional tables, and in my PowerShell script that runs synchronization I just call a stored procedure to populate the delta table before the synchronization cycle. In summary, the SP does this: truncate t_snapshot, truncate t_delta, load t_snapshot from the current table, load t_delta (by comparing t_snapshot with t_archive), truncate t_archive, load t_archive from t_snapshot. So t_archive contains the data as of the last synchronization. The SP takes about 17 seconds to process close to 200K records.
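
    In outline, the SP body is something like this (a simplified sketch, not the production code: "dbo.Person" stands in for our source table, a key column "Id" is assumed, and t_delta is assumed to have the source columns plus a trailing ChangeType column):

        CREATE PROCEDURE dbo.usp_LoadDelta AS
        BEGIN
            TRUNCATE TABLE dbo.t_snapshot;
            TRUNCATE TABLE dbo.t_delta;

            -- 1. Snapshot the current state of the source table
            INSERT INTO dbo.t_snapshot SELECT * FROM dbo.Person;

            -- 2. Adds/updates: rows in the new snapshot that differ from
            --    the baseline captured at the last synchronization
            INSERT INTO dbo.t_delta
            SELECT diff.*, 'Modify'
            FROM (SELECT * FROM dbo.t_snapshot
                  EXCEPT
                  SELECT * FROM dbo.t_archive) AS diff;

            -- 3. Deletes: keys in the baseline but gone from the snapshot
            INSERT INTO dbo.t_delta
            SELECT a.*, 'Delete'
            FROM dbo.t_archive a
            WHERE NOT EXISTS (SELECT 1 FROM dbo.t_snapshot s
                              WHERE s.Id = a.Id);

            -- 4. The new snapshot becomes the baseline for the next run
            TRUNCATE TABLE dbo.t_archive;
            INSERT INTO dbo.t_archive SELECT * FROM dbo.t_snapshot;
        END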

    My question:

    1. Generally, which is recommended: DV or DT?
    2. Using DT, I truncate and reload the 3 tables every 10 minutes (because that's the sync interval). Any issues with that?
    3. In terms of synchronization issues (e.g. data not being synced), which approach is more prone to them?

    I'd really appreciate it if you could share your experience and expertise. Thanks!!!

    Regards,

    Joseph

    Thursday, December 19, 2013 3:57 AM

All replies

  • Both options are documented on MS TechNet.

    I have implemented the trigger approach and (in my experience) found it unreliable. My colleague (Carol - MissMIIS) much prefers the snapshot approach, and if I had to choose between them, I too would now go with the snapshot.

    Regarding Q2, yes, you are right to use truncates. The only problem I can see is that over time your database will probably become highly fragmented, and you would then need to counter this with a regular housekeeping SQL Agent job to rebuild your indexes. Also, make very sure that it is impossible for your imports to start until the script has finished.
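
    Something along these lines in the scheduled job would cover the housekeeping (illustrative only - substitute your own table names, and consider REORGANIZE instead of REBUILD if you need lower-impact maintenance):

        -- Rebuild the indexes that the frequent truncate/reload churns up
        ALTER INDEX ALL ON dbo.t_snapshot REBUILD;
        ALTER INDEX ALL ON dbo.t_delta    REBUILD;
        ALTER INDEX ALL ON dbo.t_archive  REBUILD;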

    For Q3, I would expect sync issues to be more likely with the trigger approach, because you are more likely to introduce coding errors and/or unhandled use cases that way. Either way, I would run a regular (nightly/weekly) full import / delta sync run profile to pick up anything that got missed.


    Bob Bradley (FIMBob @ TheFIMTeam.com) ... now using FIM Event Broker for just-in-time delivery of FIM 2010 policy via the sync engine, and continuous compliance for FIM

    • Proposed as answer by UNIFYBobMVP Thursday, December 26, 2013 12:48 PM
    Thursday, December 26, 2013 12:48 PM