Enterprise crawler

  • Question

  • Hi

    Can anyone help me? I am trying to crawl a website, but the content is not being fed into my collection on FAST ESP.

    In the crawler logs I can see the URLs being extracted and the crawler is crawling, but in FAST no documents are appearing in my collection.

    Thursday, July 7, 2011 8:55 AM

All replies

  • Hi Clayton,

    Maybe you should check whether your pipeline is rejecting your documents. You can review this in CLARITY or with doclog -w or doclog -e.
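    For example, run doclog on the ESP admin host (only the -w and -e switches mentioned above are used here; anything further, such as filtering by collection, depends on your ESP version):

```shell
# Show documents that triggered pipeline warnings during processing
doclog -w

# Show documents the document-processing pipeline rejected with errors
doclog -e
```

    If documents show up here with errors, the crawler is working and the pipeline is the stage dropping your content.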



    Thursday, July 7, 2011 9:38 PM
  • I have the same problem; I will let you know if I fix it. I am using the enterprise web crawler.


    I created a new collection with "New-FASTSearchContentCollection -Name <collection name>", then added the config file with the "crawleradmin -f <collection config file with path>" command. The collection name is case sensitive, so both commands have to use the same collection name. It used to work before, but something is broken or I am doing something wrong.
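    The two steps above, sketched with a hypothetical collection name ("webdocs") and config path (both are placeholders, not values from this thread):

```shell
# 1. Create the content collection (FAST Search for SharePoint PowerShell cmdlet);
#    the collection name is case sensitive.
New-FASTSearchContentCollection -Name webdocs

# 2. Register the crawl configuration with the enterprise web crawler;
#    the collection name inside the XML must match "webdocs" exactly.
crawleradmin -f D:\FASTSearch\etc\webdocs-crawl.xml
```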

    Thank you !
    Tuesday, July 19, 2011 8:41 PM
  • It's working now. I had to restart all the services; then the association was picked up. I think the mess happened because some collections had been defined previously, and the enterprise web crawler will not delete a collection even if Remove-FASTSearchContentCollection is issued. You have to specifically run crawleradmin -d <collection name>, then restart the services and add the collection again.
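    A sketch of that cleanup sequence, again with the hypothetical collection name "webdocs" (the nctrl commands are an assumption; restarting the FAST services from the Windows Services console works as well):

```shell
# 1. Remove the collection on both sides: the crawler keeps its own record,
#    so Remove-FASTSearchContentCollection alone is not enough.
Remove-FASTSearchContentCollection -Name webdocs
crawleradmin -d webdocs

# 2. Restart the FAST services so the stale association is dropped
#    (assumption: nctrl is available on the FAST admin node).
nctrl stop
nctrl start

# 3. Re-create the collection and re-register the crawl config.
New-FASTSearchContentCollection -Name webdocs
crawleradmin -f D:\FASTSearch\etc\webdocs-crawl.xml
```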

    Thank you !
    Tuesday, July 19, 2011 9:58 PM