We are using the FAST Search for SharePoint 2010 web crawler to index a website. It feeds all content into a FAST Search content collection named "ssi". The crawler and the document processor run on the same server, SERVER_1.
We decided to clear the collection first and then start a full crawl. After I confirmed that ssi was not crawling, I performed the steps below:
1. Clear-FASTSearchContentCollection ssi
2. Get-FASTSearchContentCollection ssi, verified DocumentCount = 0
3. crawleradmin -F ssi
4. crawleradmin -Q ssi, but got an odd result: Downloaded (tot/stored/mod/del) : 1,059 URIs / 0 / 0 / 0
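The verification in step 2 can also be scripted by reading the DocumentCount property of the collection object directly (a sketch only; it assumes the Microsoft.FASTSearch.PowerShell snap-in is available on the server and the collection is named ssi):

```shell
:: Print the document count of the "ssi" collection (sketch; FS4SP snap-in assumed).
powershell -Command "Add-PSSnapin Microsoft.FASTSearch.PowerShell -ErrorAction SilentlyContinue; (Get-FASTSearchContentCollection ssi).DocumentCount"
```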
I also checked the crawler log and found that the crawler can access the content (HTTP status code 200), but it apparently cannot process it and feed it into the index. Re-running Get-FASTSearchContentCollection ssi shows the document count is still 0, and lastinput
Then I checked Event Viewer and found tons of these errors: "WARNING : crawler@SERVER_1: systemmsg: Unable to post job"
The latest entries of the crawler log in var\log\syslog:
[2012-11-29 06:36:33] INFO : crawler@SERVER_1: systemmsg: Shutdown completed (PID: 5996)
[2012-11-29 06:37:16] INFO : crawler@SERVER_1: systemmsg: Started Enterprise Crawler 14.0.0325.0000 on port 13000 (PID: 7008)
[2012-11-29 06:37:16] INFO : crawler@SERVER_1: ssi: Added collection 'ssi' for crawling
[2012-11-29 06:37:41] WARNING : crawler@SERVER_1: systemmsg: Unable to post job
[2012-11-29 06:37:41] WARNING : crawler@SERVER_1: systemmsg: The last message was repeated 224 times
I stopped the crawler with nctrl stop and rebooted the server, but that did not help.
Does anyone know what causes the crawler's "Unable to post job" error?
Edited by Feng_Lu, Thursday, November 29, 2012 6:16 AM
I managed to resolve this problem. It turned out I had to delete the collection and create it again. The correct steps are below:
1. nctrl start crawler
2. crawleradmin -s ssi
3. crawleradmin -d ssi (Note: you have to delete the collection; simply clearing it does NOT work and causes the crawler's "unable to post job" error)
4. Clear-FASTSearchContentCollection ssi
5. del D:\FASTSearch\data\crawler\config\node\ssi\*.*
6. del D:\FASTSearch\data\crawler\dsqueues\collection.queues\ssi\*.*
7. del D:\FASTSearch\data\crawler\queues\node\ssi\*.*
8. del D:\FASTSearch\data\crawler\queues\worker\ssi\*.*
9. recreate your collection by using crawleradmin -f <your_xml_file.xml>
10. nctrl restart crawler
11. Run crawleradmin --status and verify the crawler is crawling
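For convenience, the recovery steps above can be gathered into a single batch script. This is only a sketch of my own sequence: the install path D:\FASTSearch and the collection name ssi come from my environment, and the XML file passed to crawleradmin -f is a placeholder for your own collection configuration file.

```shell
:: Recovery sketch for the crawler "Unable to post job" error.
:: Run from a FAST Search for SharePoint 2010 command shell on the crawler server.
:: Assumes FAST Search is installed under D:\FASTSearch and the collection is "ssi".
nctrl start crawler
crawleradmin -s ssi
:: Deleting the collection (not just clearing it) is the key step.
crawleradmin -d ssi
powershell -Command "Add-PSSnapin Microsoft.FASTSearch.PowerShell; Clear-FASTSearchContentCollection ssi"
:: Remove the crawler's on-disk state for the collection.
del /q D:\FASTSearch\data\crawler\config\node\ssi\*.*
del /q D:\FASTSearch\data\crawler\dsqueues\collection.queues\ssi\*.*
del /q D:\FASTSearch\data\crawler\queues\node\ssi\*.*
del /q D:\FASTSearch\data\crawler\queues\worker\ssi\*.*
:: Recreate the collection from its XML configuration file (placeholder path).
crawleradmin -f your_xml_file.xml
nctrl restart crawler
crawleradmin --status
```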
However, I still have no idea why the crawler reports "unable to post job" when the collection is not removed first.