Custom workflow for writing to external DB

  • Question

  • In my scenario, after creating a user in FIM, I need to pass a single attribute to a separate external database.  I've walked through the 'Create a Logging Custom Activity' how-to, but as I reverse-engineer it, I'm not sure it explains the method I should use in my scenario. So I'm asking the forum: what are the basic workflow components I need in order to take an attribute defined within the FIM Portal and pass it to a database as part of an INSERT command?  Should I also be considering CurrentRequestActivity?


    • Edited by Osho27 Monday, December 31, 2012 9:12 PM update
    Monday, December 31, 2012 7:33 PM

Answers

  • I would look into DB triggers for solving your problem, and have a trigger fire your stored procedure before writes (I'm not a DBA, so not too sharp on this, but pretty sure it can be done).

    There is no guaranteed timing between running the MA and running the workflows, so I don't think you can be sure that the workflows have run before your MAs run. But you are on the right track with CurrentRequestActivity when it comes to the workflow. That activity will give you the ID of the target hit by the workflow, and from there you can do a ReadObject and get the rest of the information on the object if you need more - and then finish up doing your DB/SP thing. I do think the trigger bit is a bit more straightforward, if possible.
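    To sketch the database end of such a custom activity: once the workflow has the attribute value in hand (via CurrentRequestActivity plus a ReadObject), the stored-procedure call itself is ordinary ADO.NET. The connection string, procedure name, and parameter name below are placeholders, not anything FIM-specific:

    ```csharp
    using System.Data;
    using System.Data.SqlClient;

    // Called from the custom activity once the target attribute value is known.
    // Connection string, procedure name, and parameter are hypothetical.
    static void WriteAttributeToExternalDb(string attributeValue)
    {
        using (var conn = new SqlConnection(
            "Data Source=SQLHOST;Initial Catalog=ExternalDb;Integrated Security=SSPI"))
        using (var cmd = new SqlCommand("dbo.usp_InsertAttribute", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.Add("@AttributeValue", SqlDbType.NVarChar, 256).Value = attributeValue;
            conn.Open();
            cmd.ExecuteNonQuery(); // the stored procedure performs the INSERT
        }
    }
    ```

    The FIM-side plumbing (resolving the target ID and reading the attribute) would wrap this call inside the activity's Execute logic.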


    Regards, Soren Granfeldt
    blog is at http://blog.goverco.com | twitter at https://twitter.com/#!/MrGranfeldt

    • Marked as answer by Osho27 Tuesday, January 1, 2013 6:42 PM
    Tuesday, January 1, 2013 6:08 PM

All replies

  • Not sure exactly what you're trying to do, but it seems like you may want to use a Management Agent for writing information to a database? If you indeed want to do it in a workflow, you may need to do a lookup on the 'target' object passed to your custom workflow and use information from that object to write to your database.

    Regards, Soren Granfeldt
    blog is at http://blog.goverco.com | twitter at https://twitter.com/#!/MrGranfeldt

    Tuesday, January 1, 2013 1:51 PM
  • To be more specific, I have a management agent configured for the database and can successfully write identity information to the database.  The issue I have is that for one of the attributes I need to first call a stored procedure in the database, which will perform an operation before writing that attribute value to the store.  I haven't found any way for the MA to call an SP, so I'm thinking I must use a custom workflow that I'll pass the value to, letting it pass that attribute value to the SP while the MA handles the rest of the attributes.

    I am reading up on FIM's CurrentRequestActivity; not sure if that is the right starting point for capturing the FIM value and then connecting to the DB, calling the SP, etc.  This seems like a good how-to, so I'll be happy to write this up in full detail once completed.


    • Edited by Osho27 Tuesday, January 1, 2013 5:55 PM
    Tuesday, January 1, 2013 5:25 PM
  • Thanks Soren - I'm going to give the trigger approach a try.  Do we know authoritatively how the SQL MA performs writes to SQL?  We are presuming here that it uses INSERT or UPDATE, but how can I know that for sure?
    Wednesday, January 2, 2013 2:45 PM
  • Few solutions here:

    (1) Triggers, as Soren has suggested. However, some caution has to be taken here, as using triggers might render the view or table non-updatable from OLE DB's point of view. The OLE DB provider used by the Management Agent can check the database structure and "decide" that, because of the triggers, it cannot update the table. This happens sometimes, but it depends on the construction of the data source (mostly views) and the triggers.

    (2) A custom workflow activity that calls the stored procedure. I actually went this path in one of my recent projects, where I needed to set particular flags in the DB in response to some events. I could have synchronized it, but I would have had to introduce additional attributes into the schema and synchronization rules just to flow these bit flags, which are local to a single DB. So I decided to use a workflow and a custom activity in the action phase. Pretty easy to develop.

    (3) A custom MA used to talk to the database. I have done this a few times in the past, but it actually looks like overkill here.

    So either (1) or (2).


    Tomek Onyszko, memberOf Predica FIM Team (http://www.predica.pl), IdAM knowledge provider @ http://blog.predica.pl

    Wednesday, January 2, 2013 3:02 PM
  • Thanks for the insight on triggers, Tomasz. Good stuff.

    If you do want to go with Tomasz's third suggestion, you could use my PowerShell MA (http://blog.goverco.com/p/powershell-management-agent.html) - there is a sample script for talking to SQL (even with deltas). With that, you have full control over how SQL is accessed, since you're doing it yourself :-)


    Regards, Soren Granfeldt
    blog is at http://blog.goverco.com | twitter at https://twitter.com/#!/MrGranfeldt

    Wednesday, January 2, 2013 3:25 PM
  • Thanks for explaining the options Tomasz and Soren - and thanks for the PowerShell MA reference.  I'll investigate each of these methods and report back how things go once we decide and implement.
    Wednesday, January 2, 2013 4:28 PM
  • Soren/Tomasz -

    Thinking a bit more about option 1... how can I pass a trigger or SP (that does the encryption) a value to encrypt if the management agent is running the show, so to speak?  I'm missing the handoff piece (MA to SP or trigger) of this solution.  SQL Server has some built-in encryption capabilities; however, a clear-text value is expected as input.  Before reading into it a bit, I imagined that somehow a trigger could capture what was being inserted into the field, work its magic, then write - without the agent even being aware of it.  But I don't think that's the case; I think I need to input a value first, then store what comes out.  Thoughts?




    • Edited by Osho27 Thursday, January 3, 2013 5:27 AM
    Thursday, January 3, 2013 5:25 AM
  • I'm not sure what type of encryption you'd like to do, and I think more information on that is needed.

    You could of course flow values to the MA that you can use in the SQL script, if that's what you want to do. Otherwise, you could store some encryption keys in other DBs/tables on the SQL box itself.


    Regards, Soren Granfeldt
    blog is at http://blog.goverco.com | twitter at https://twitter.com/#!/MrGranfeldt

    Friday, January 4, 2013 7:55 AM
  • Here is what I did to get this working.  I have a function loaded within a trigger that does the encryption when passed the clear text value of the string during the MA export.  So far, so good.  The only issue is that I'm dropping the clear text value into the ether and when I run the SQL management agent export, then import - the import status complains about not getting the value it wrote back (since that clear text field value is now NULL).  I'm posting this as a separate question on the forum, but thought you might chime in on options.
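    For readers following along, the trigger pattern described here looks roughly like the sketch below; the table, columns, and passphrase are invented, and ENCRYPTBYPASSPHRASE stands in for whichever built-in SQL Server encryption is actually in use. The final SET of the clear-text column to NULL is exactly what produces the exported-change-not-reimported complaint on the confirming import:

    ```sql
    -- Hypothetical table: dbo.Person(PersonId, SecretClear, SecretEncrypted varbinary)
    CREATE TRIGGER dbo.trg_Person_EncryptSecret
    ON dbo.Person
    AFTER INSERT, UPDATE
    AS
    BEGIN
        SET NOCOUNT ON;
        -- Encrypt the clear-text value written by the MA export...
        UPDATE p
        SET p.SecretEncrypted = ENCRYPTBYPASSPHRASE('passphrase-here', i.SecretClear),
            p.SecretClear     = NULL   -- ...and drop the clear text
        FROM dbo.Person p
        JOIN inserted i ON i.PersonId = p.PersonId
        WHERE i.SecretClear IS NOT NULL;  -- guard also avoids re-firing on the NULL write
    END
    ```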

    Thanks

    Monday, January 7, 2013 8:37 PM
  • Might I suggest another way of doing this: through an MA rules extension.

    Create an advanced export flow to the DB field and create a rules extension by right-clicking the MA.

    Use the built-in MapAttributesForExport function (most of the code will be there if you created the project after creating the advanced export flow).

    Now put your encryption logic in there, and you can do a confirming import without an error.
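    As a sketch of what that rules extension could look like - the flow rule name, attribute names, and the EncryptLocally helper are all hypothetical, and (as later replies note) this only works if the encryption can be done outside SQL Server:

    ```csharp
    using Microsoft.MetadirectoryServices;

    public class MAExtension : IMASynchronization
    {
        // Advanced export flow: encrypt the MV value before it reaches the CS/DB.
        // "encryptSecret", "secret", "encryptedSecret" and EncryptLocally are placeholders.
        public void MapAttributesForExport(string FlowRuleName, MVEntry mventry, CSEntry csentry)
        {
            switch (FlowRuleName)
            {
                case "encryptSecret":
                    if (mventry["secret"].IsPresent)
                        csentry["encryptedSecret"].Value = EncryptLocally(mventry["secret"].Value);
                    break;
                default:
                    throw new EntryPointNotImplementedException();
            }
        }

        // ... other IMASynchronization members omitted for brevity ...
    }
    ```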

    Tuesday, January 8, 2013 9:13 AM
  • Certainly a plausible approach, however we're using SQL Server's encryption methods, as those encrypted values are decrypted and read by another product that is integrated with SQL Server.  So, don't think I'll be able to reference SQL functions from within FIM.
    Tuesday, January 8, 2013 9:02 PM
  • You can write C# code in your rules extension, so you should be able to execute whatever SQL commands you need. That said, I've never written a lot of SQL .NET code, so maybe I'm wrong.

    If you do go this route, I'd actually recommend doing it during an import rule (i.e., MapAttributesForImport), as it seems like you need to execute this AFTER the object is created in the database (if you did it in the MapAttributesForExport method, it would likely execute before the object was actually created). Me, I would have a boolean attribute in the MV such as "storedprocedureexecuted", and an IAF from SQL to this attribute with a rules extension that states "if storedprocedureexecuted is not present, execute the SP and set the boolean value to true".

    Sounds like what you're doing is exporting the clear text to a database field, which the SP consumes and then clears. This is fine if you're happy to take the risk of having the clear-text field in the DB, but as you note, you get the "exported value not reimported" warning.
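    A rough sketch of that flag-based import flow, assuming a hypothetical connection string, stored procedure, flow rule, and attribute names; the SqlCommand call is standard ADO.NET, and the boolean latch follows the "storedprocedureexecuted" idea:

    ```csharp
    using System.Data;
    using System.Data.SqlClient;
    using Microsoft.MetadirectoryServices;

    // Lives in the same IMASynchronization class as the other flow methods.
    // Runs the SP once per object, then latches a flag so it never runs again.
    public void MapAttributesForImport(string FlowRuleName, CSEntry csentry, MVEntry mventry)
    {
        if (FlowRuleName == "runStoredProcedureOnce" &&
            !mventry["storedprocedureexecuted"].IsPresent)
        {
            using (var conn = new SqlConnection("...connection string..."))
            using (var cmd = new SqlCommand("dbo.usp_EncryptSecret", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                // "personId" is a placeholder anchor attribute on the CS object.
                cmd.Parameters.AddWithValue("@PersonId", csentry["personId"].StringValue);
                conn.Open();
                cmd.ExecuteNonQuery();
            }
            mventry["storedprocedureexecuted"].BooleanValue = true;
        }
    }
    ```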


    MCTS: Forefront Identity Manager 2010, Configuring

    Wednesday, January 9, 2013 10:47 PM
  • Yes Ross, that can be done, but it's not recommended, as it will connect to the database and table that the FIM MA is already connected to for export; that might lead to problems at a later stage when the data is huge, I think.
    Thursday, January 10, 2013 6:34 AM
    Yes Ross, that can be done, but it's not recommended, as it will connect to the database and table that the FIM MA is already connected to for export; that might lead to problems at a later stage when the data is huge, I think.

    I think you mean "SQL MA" and "import". He could separate the import/sync operations and that wouldn't be an issue. But I don't see how he could do it at all in an EAF (export attribute flow), as he couldn't write to the database entry yet, because it wouldn't exist.

    Anyway, if we go back and look at his original problem: He has a clear text attribute that he wants to encrypt and put in a field related to that user.

    Osho, why not just encrypt the value in FIM and flow it encrypted out to the database?


    FIMSpecialist.com | MCTS: FIM 2010

    Thursday, January 10, 2013 3:50 PM
  • Hi Ross

    I wasn't suggesting he should write to the database, but just connect to it, run a query to encrypt his attribute, collect the encrypted contents, and pass them to the csentry; from there on, FIM will export it on its own. That will stop FIM from throwing an exported-change-not-reimported error.

    Thursday, January 10, 2013 7:30 PM
  • The SQL Server DBA uses built-in SQL encryption methods, so when we discussed the option of me encrypting within FIM, that would make a lot of work for everyone else who currently uses the encrypt/decrypt features of SQL.  (Believe me, your suggestion was my first recommendation.)

    I think we have a work around for the challenges - I'm going to post a how-to when we are done.

    Thanks for the help...

    Thursday, January 10, 2013 7:37 PM
  • Right, this was considered too; we were actually going to call the encryption within a workflow, but it became a bit unwieldy... not off the table as an approach, though...

    Thursday, January 10, 2013 7:39 PM