Calling out to SQL Server While Crawling and Indexing a SharePoint 2010 List

  • Question

  • Hi,

       I have a scenario where we want to crawl a SharePoint 2010 list that contains a field, let's say FileName-A, using FAST Search for SharePoint 2010.  Using the value in FileName-A, we need to look it up in a SQL Server database and pull some values from a table.  My thought is to create an extensibility stage; this will let me get the FileName-A value, call into SQL Server to retrieve the values, and then map the result to a managed property.
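       Roughly what I have in mind for the stage (a minimal sketch only; the property set GUID, property names, table, and connection string are placeholders, and the two arguments are the %(input)s / %(output)s file paths from the pipelineextensibility.xml registration):

        using System;
        using System.Data.SqlClient;
        using System.Xml.Linq;

        // Minimal sketch of a FAST pipeline extensibility stage: read the crawled
        // properties, look up one value in SQL Server, and emit a new crawled
        // property that can later be mapped to a managed property.
        class LookupStage
        {
            const string PropertySet = "00000000-0000-0000-0000-000000000000"; // placeholder custom property set GUID
            const string ConnectionString = @"Server=SQLSRV01;Database=Metadata;Integrated Security=true"; // placeholder

            static int Main(string[] args)
            {
                XDocument doc = XDocument.Load(args[0]);   // %(input)s
                string fileName = null;

                // Find the FileName-A crawled property emitted by the list crawl.
                foreach (XElement cp in doc.Root.Elements("CrawledProperty"))
                {
                    if ((string)cp.Attribute("propertyName") == "FileNameA")
                        fileName = cp.Value;
                }

                if (!string.IsNullOrEmpty(fileName))
                {
                    using (SqlConnection conn = new SqlConnection(ConnectionString))
                    using (SqlCommand cmd = new SqlCommand(
                        "SELECT Author FROM dbo.FileMetadata WHERE FileName = @fn", conn))
                    {
                        cmd.Parameters.AddWithValue("@fn", fileName);
                        conn.Open();
                        object author = cmd.ExecuteScalar();

                        if (author != null)
                        {
                            // Emit the looked-up value as a new crawled property.
                            doc.Root.Add(new XElement("CrawledProperty",
                                new XAttribute("propertySet", PropertySet),
                                new XAttribute("propertyName", "AuthorFromDb"),
                                new XAttribute("varType", 31),   // 31 = string
                                author.ToString()));
                        }
                    }
                }

                doc.Save(args[1]);                         // %(output)s
                return 0;
            }
        }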

     

    Is this a good approach, or is there a better one?  I don't want to set up two separate crawls: one for the list and a second one for the SQL Server database.

     

    Thanks.

    Monday, January 24, 2011 8:19 PM

Answers

  • This is exactly how you should do it. I have a tutorial on my blog for creating and debugging extensibility stages.

    Performance-wise, I'm not sure how calling an exe repeatedly during the crawl will work out, but you could make the extensibility stage a thin stub loader that calls a WCF service, which in turn caches data from the database to limit the number of DB calls.
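    A rough sketch of that split, with an illustrative service contract and the same kind of lookup table (none of these names come from your actual setup):

        using System;
        using System.Collections.Concurrent;
        using System.Data.SqlClient;
        using System.ServiceModel;

        // Sketch of the WCF lookup service: the extensibility exe stays a thin stub
        // and calls this service, which keeps recent results in memory so the
        // database is hit once per file name rather than once per crawled item.
        [ServiceContract]
        public interface IMetadataLookup
        {
            [OperationContract]
            string GetAuthor(string fileName);
        }

        public class MetadataLookup : IMetadataLookup
        {
            static readonly ConcurrentDictionary<string, string> Cache =
                new ConcurrentDictionary<string, string>();

            public string GetAuthor(string fileName)
            {
                // Serve from the cache when possible; otherwise one DB round trip.
                return Cache.GetOrAdd(fileName, fn =>
                {
                    using (var conn = new SqlConnection(@"Server=SQLSRV01;Database=Metadata;Integrated Security=true"))
                    using (var cmd = new SqlCommand("SELECT Author FROM dbo.FileMetadata WHERE FileName = @fn", conn))
                    {
                        cmd.Parameters.AddWithValue("@fn", fn);
                        conn.Open();
                        return (cmd.ExecuteScalar() as string) ?? string.Empty;
                    }
                });
            }
        }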

    Your other option is to create an event receiver for your list which pulls in the database values and populates a field on the item when the entry is saved, thus storing the DB data in the list itself before it is crawled.
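    For example, something along these lines (assuming the list has a column to hold the looked-up value; the column, table, and connection string names are placeholders):

        using System;
        using System.Data.SqlClient;
        using Microsoft.SharePoint;

        // Sketch of the event receiver alternative: look up the value when the
        // item is saved and store it in the list so the crawler picks it up.
        public class MetadataEventReceiver : SPItemEventReceiver
        {
            public override void ItemUpdated(SPItemEventProperties properties)
            {
                SPListItem item = properties.ListItem;
                string fileName = item["FileNameA"] as string;   // placeholder column name
                if (string.IsNullOrEmpty(fileName)) return;

                using (var conn = new SqlConnection(@"Server=SQLSRV01;Database=Metadata;Integrated Security=true"))
                using (var cmd = new SqlCommand("SELECT Author FROM dbo.FileMetadata WHERE FileName = @fn", conn))
                {
                    cmd.Parameters.AddWithValue("@fn", fileName);
                    conn.Open();
                    object author = cmd.ExecuteScalar();

                    // Write the value back, suppressing event firing so this
                    // receiver is not triggered again by its own update.
                    EventFiringEnabled = false;
                    item["AuthorFromDb"] = author as string ?? string.Empty;
                    item.SystemUpdate(false);
                    EventFiringEnabled = true;
                }
            }
        }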

    Regards,
    Mikael Svenson


    Search Enthusiast - MCTS SharePoint/WCF4/ASP.Net4
    http://techmikael.blogspot.com/ - http://www.comperiosearch.com/
    • Marked as answer by Mark L. Smith Tuesday, January 25, 2011 6:41 PM
    Monday, January 24, 2011 8:54 PM

All replies

  • Hi Mikael,

         Thanks for the quick reply!  I agree about the performance hit, which is a concern for me.  I also thought about the event receiver and putting the information on the list, but since it's an existing setup I'm not even sure why they put the metadata into SQL tables instead of storing it on the list; alas, it's already done :(.  I'll have to work through how we cache the DB info, maybe even run a daily job to extract the information from the database into an XML file, since there are only 5 fields we need and the data does not get updated very often.
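     Something like this for the daily extract (table, column, and output path are placeholders; the extensibility stage would then read the XML from disk instead of calling the database per item):

        using System.Data.SqlClient;
        using System.Xml.Linq;

        // Sketch of the scheduled export: dump the handful of columns we need
        // into an XML file for the extensibility stage to read.
        class ExportMetadata
        {
            static void Main()
            {
                var root = new XElement("Files");
                using (var conn = new SqlConnection(@"Server=SQLSRV01;Database=Metadata;Integrated Security=true"))
                using (var cmd = new SqlCommand(
                    "SELECT FileName, Author, Title, Project, Version FROM dbo.FileMetadata", conn))
                {
                    conn.Open();
                    using (SqlDataReader reader = cmd.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            root.Add(new XElement("File",
                                new XAttribute("name", reader["FileName"] as string ?? ""),
                                new XElement("Author", reader["Author"] as string ?? ""),
                                new XElement("Title", reader["Title"] as string ?? ""),
                                new XElement("Project", reader["Project"] as string ?? ""),
                                new XElement("Version", reader["Version"] as string ?? "")));
                        }
                    }
                }
                new XDocument(root).Save(@"D:\FASTSearch\metadata.xml");
            }
        }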

    Regards,

     Mark.

    Monday, January 24, 2011 9:03 PM