Is it normal to see thousands of warnings and errors in the crawl logs?

  • Question

  • I'm working for a new client, reviewing their FAST installation for best practices and to resolve a handful of issues they've documented. After a couple of days of review, I've been able to solve 99% of the issues simply by completing a few steps that were missed when they set up some customizations. I've also identified errors in the crawl logs that I'm confident are the root cause of the final issue I've been asked to address (items they expect to see in results are not being crawled).

    My question is related to this: even once I correct the root cause of that problem, there will still be around two thousand errors and/or warnings in their crawl log. In my experience with FAST there are always some errors and warnings in the log, but I've never seen this volume before.

    I'm interested in gathering as many opinions as I can: given that there are no obvious or reported problems, what would you consider a 'normal' number of errors and/or warnings in the crawl logs?
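
    For reference, the kind of breakdown I mean can be pulled with the SharePoint crawl log object model from PowerShell. This is only a rough sketch: the SSA name is a placeholder, and the error-level codes passed to GetCrawledUrls are from memory, so verify them before trusting the counts.

    ```powershell
    # Rough sketch: count crawl log warnings and errors for the content SSA.
    # The SSA name and the error-level codes (1 = warning, 2 = error) are
    # assumptions to verify against your farm and the CrawlLog documentation.
    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

    $ssa      = Get-SPEnterpriseSearchServiceApplication "FAST Content SSA"
    $crawlLog = New-Object Microsoft.Office.Server.Search.Administration.CrawlLog $ssa

    $since = (Get-Date).AddDays(-7)   # look at the last week of crawl activity
    $until = Get-Date

    foreach ($level in @{ Warnings = 1; Errors = 2 }.GetEnumerator()) {
        # getCountOnly = $true returns a one-row DataTable holding only the count
        $table = $crawlLog.GetCrawledUrls($true, 0, "", $false, -1, $level.Value, -1, $since, $until)
        Write-Host ("{0}: {1}" -f $level.Key, $table.Rows[0][0])
    }
    ```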

    Tuesday, January 24, 2012 3:48 PM

All replies

  • What kinds of warnings do you see, and what content source types are you dealing with?

    If you're crawling "wild" content, i.e. the web, with fairly unrestricted settings, it's in the nature of things that you'll encounter a lot of warnings.



    Marcus Johansson | Search Nerd | comperiosearch.com | linkedin.com/in/marcusjohansson
    Tuesday, January 24, 2012 5:44 PM
  • As Marcus said, without knowing the warnings it's hard to say. Do you have the Advanced Filter Pack enabled on the FAST side? Could the errors be due to the content itself, as mentioned previously? You might also see warnings/errors if batches are constantly being re-submitted, which would imply a bottleneck somewhere.
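
    If the Advanced Filter Pack does turn out to be disabled, enabling it on the FAST admin server is quick. A minimal sketch from memory, assuming a default %FASTSEARCH% installation and the documented AdvancedFilterPack.ps1 helper script, so double-check the paths in your environment:

    ```powershell
    # Rough sketch, not verified against your farm: enable the Advanced Filter Pack
    # on the FS4SP admin server. Paths assume a default %FASTSEARCH% installation.
    cd "$env:FASTSEARCH\installer\scripts"

    # Documented enable script for FAST Search Server 2010 for SharePoint
    .\AdvancedFilterPack.ps1 -enable

    # If the script does not restart the document processors itself, reset them
    & "$env:FASTSEARCH\bin\psctrl" reset
    ```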


    Igor Veytskin
    Tuesday, January 24, 2012 7:37 PM
    Moderator