No crawl results

  • Question

  • Hi everyone!

    The problem occurs in the client's testing environment. On our own machines the crawl completes successfully, with no errors.

    But when the solution is deployed to that environment, no results are shown.

    Looking through the logs, I found that the crawl processes start one after another without completing.

    What might be the cause of this behavior?

    P.S. If it helps, the solution is implemented as a custom connector.

    Thursday, October 27, 2011 5:00 PM

Answers

  • Hi Victor,

    It could be a number of reasons, so first check your crawl log and the SharePoint logs and report back with your findings :) You should find at least some error entries related to your crawling in the SharePoint logs.

    As you have probably implemented ULS logging in your connector, check those entries as well.
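
    For example, from the SharePoint 2010 Management Shell you can pull recent error-level ULS entries on the crawl server; a minimal sketch (the category filter is an assumption — adjust it to whatever categories your connector logs under):

    # Error-level ULS entries from the last hour, crawl/gatherer related
    Get-SPLogEvent -StartTime (Get-Date).AddHours(-1) -MinimumLevel "Error" |
        Where-Object { $_.Category -match "Crawl|Gatherer" } |
        Select-Object Timestamp, Category, Message |
        Format-List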

    A common issue is forgetting to set up the certificate needed for communication with the FAST for SharePoint farm (the content distributor), or the certificate having expired if the system was installed a year ago.
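
    You can verify the certificate channel from the SharePoint crawl server with the FS4SP Ping-SPEnterpriseSearchContentService cmdlet; a sketch (the host name below is a placeholder — use your own content distributor, default port 13391):

    # Verify SSL communication with the FAST content distributor
    Ping-SPEnterpriseSearchContentService -HostName "fastserver.contoso.com:13391"

    # Check that the FAST certificate in the local store has not expired
    # ("FASTSearchCert" is the default self-signed name; adjust if you use your own)
    Get-ChildItem Cert:\LocalMachine\My |
        Where-Object { $_.Subject -match "FASTSearchCert" } |
        Select-Object Subject, NotAfter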

    Regards,
    Mikael Svenson 


    Search Enthusiast - SharePoint MVP/WCF4/ASP.Net4
    http://techmikael.blogspot.com/
    Thursday, October 27, 2011 5:53 PM

All replies

  • Hi Victor,

    So, what was the error? :)

    -m


    Search Enthusiast - SharePoint MVP/WCF4/ASP.Net4
    http://techmikael.blogspot.com/
    Sunday, October 30, 2011 2:48 PM
  • Hi Mikael!

    As you suggested, it was the common issue with the FAST certificate. After creating and configuring the SA, I had forgotten to configure the SSL communication channel.
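
    For reference, the SSL channel is enabled with the SecureFASTSearchConnector.ps1 script that ships with FAST Search Server 2010 for SharePoint, roughly like this (certificate path, SSA name, and crawl account below are placeholders for your own environment):

    # Run on the SharePoint crawl server; the script and the certificate are
    # copied over from the FAST server (installer\scripts and
    # data\data_security\cert respectively)
    .\SecureFASTSearchConnector.ps1 `
        -certPath "C:\certs\FASTSearchCert.pfx" `
        -ssaName "FAST Content SSA" `
        -username "CONTOSO\sp_crawl"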

    But now I have another small problem. In this environment we have several search applications (SAs), one per connector. After launching the crawls one after another, the search results look like they are being overwritten. For example, after crawling SA1 I get 500 documents, but when I then run a crawl for SA2 (and likewise the other way around), only 200 documents remain.

    Is this the reason why Microsoft advises using only one SA?

    Monday, October 31, 2011 4:09 PM
  • Hi Victor,

    The search "service application" architecture is based on having one SA and expanding it with several crawler components, so there is no need to create several search applications. You add all the content sources to one SA, and then add more crawler components if you need to distribute the crawler over several servers.

    You can read more about adding and removing crawl components at TechNet.
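
    For example, registering each connector as its own content source on the one Content SSA looks roughly like this from the SharePoint 2010 Management Shell (the SSA name, source name, protocol, and start address are placeholders, and the exact parameters depend on your connector type):

    $ssa = Get-SPEnterpriseSearchServiceApplication "FAST Content SSA"

    # List the sources already registered on this SSA
    Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa

    # Add a source for a second connector instead of creating a second SSA
    New-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa `
        -Name "Connector2" -Type Custom -CustomProtocol "myproto" `
        -StartAddresses "myproto://datasource2"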

    If you create more than one SSA, you can run into issues with duplicate internal FAST IDs if they index to the same FAST farm.

    Regards,
    Mikael Svenson 


    Search Enthusiast - SharePoint MVP/WCF4/ASP.Net4
    http://techmikael.blogspot.com/
    Monday, October 31, 2011 5:55 PM
  • Hi Mikael.

    Your words confirmed the case. I tested twice: first with several content types from several connectors, and the results were fine. After that I started a crawl from the other SA, and almost all results were overwritten.

    The main idea was to use several search applications so that the index could be reset for just one data source when necessary.

    Yes, that is the article I was guided by.

    Can you help me find an official Microsoft statement about using only one SSA? It would be great if it were on TechNet or MSDN.

    Regards,

    Victor Panasenko

    Tuesday, November 1, 2011 3:28 PM
  • Hi Victor,

    Using one SSA is only mentioned in "notes" on TechNet, and I haven't found any other official documentation explaining it.

    You can also read this thread: http://social.technet.microsoft.com/Forums/en-US/fastsharepoint/thread/00697e28-3544-4996-a3d6-a9399d0187b5/

    and this one: http://social.technet.microsoft.com/Forums/en-US/fastsharepoint/thread/6ad4c886-7e2d-4817-8191-6777110b1766/

    Both are answered by MS employees.

    If you want to clear the content of one source, you can also use this procedure:

    1. Remove the start URL for the source you want to clear
    2. Save
    3. Re-add it

    This will delete the items from the crawler database as well as send commands to the FAST server to clear out the content from that source.
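
    Scripted, the same procedure looks roughly like this (a sketch; the SSA and source names are placeholders, -StartAddresses takes a comma-separated string, and whether an empty string is accepted for clearing may vary — the UI equivalent is simply emptying the start address box and saving):

    $ssa = Get-SPEnterpriseSearchServiceApplication "FAST Content SSA"
    $cs  = Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa -Identity "Connector1"

    # Steps 1 + 2: remember the start addresses, then remove them and save
    $urls = @($cs.StartAddresses | ForEach-Object { $_.AbsoluteUri })
    Set-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa -Identity "Connector1" -StartAddresses ""

    # Step 3: re-add them
    Set-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa -Identity "Connector1" -StartAddresses ($urls -join ",")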

    That said, having two Content SSAs, each crawling to a separate collection in FAST, perhaps makes it easier to clear out content, but it's not supported by MS.

    And remember, having two collections in FAST is merely a convenience for tooling, as all items are stored in the same binary index, with an added property called "meta.collection".

    Regards,
    Mikael Svenson 


    Search Enthusiast - SharePoint MVP/WCF4/ASP.Net4
    http://techmikael.blogspot.com/
    Tuesday, November 1, 2011 6:08 PM