This Wiki page contains best practices regarding the size of the crawl database. The crawl database stores crawl data (time, status, and so on) for every item that has been crawled. Use the following formula to calculate its average size:

Crawl database size = 0.046 × (sum of content database sizes)

For more info: http://thebitsthatbyte.com/calculating-sharepoint-2010-search-administration-crawl-and-property-database-sizes/
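The formula above can be sketched as a small helper. This is only an estimating aid; the function name and the example database sizes are illustrative, not from the source:

```python
def estimate_crawl_db_size(content_db_sizes):
    """Estimate the crawl database size as 4.6% of the total
    content database size (per the formula above).

    `content_db_sizes` is a list of content database sizes in any
    single unit (e.g. GB); the result is in that same unit.
    """
    return 0.046 * sum(content_db_sizes)

# Hypothetical farm with three content databases of 200, 150, and 50 GB:
print(estimate_crawl_db_size([200, 150, 50]))  # 18.4 (GB, estimated)
```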

There are some built-in health analyzer rules that can assist you in reducing the size of your crawl database. Run the health analyzer rule "Search - One or more crawl databases may have fragmented indices." If it reports a high percentage of fragmented indices, you can have the health rule repair the fragmented indices automatically.

For more info: http://technet.microsoft.com/en-us/library/cc262731.aspx
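To give a feel for how fragmentation percentages are typically acted on, here is a minimal sketch. The thresholds follow the commonly cited SQL Server index maintenance guidance (reorganize between roughly 5% and 30%, rebuild above 30%); the SharePoint health rule applies its own internal logic, so this is an illustrative assumption, not the rule's actual implementation:

```python
def suggested_index_action(fragmentation_pct):
    """Map an index fragmentation percentage to a maintenance action,
    using the common SQL Server guidance (assumed thresholds, not the
    health rule's own): >30% rebuild, 5-30% reorganize, else nothing.
    """
    if fragmentation_pct > 30:
        return "rebuild"
    if fragmentation_pct >= 5:
        return "reorganize"
    return "none"

print(suggested_index_action(42))  # rebuild
print(suggested_index_action(12))  # reorganize
```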

The crawl database can contain large amounts of empty space (literally tens of gigabytes). To reclaim it, truncate the log file, restore the crawl database, or possibly rebuild the crawl database.

For more info: http://www.aiim.org/community/blogs/expert/Is-your-search-database-bloated
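Before deciding whether a truncate or rebuild is worthwhile, it helps to quantify how much of the database file is actually empty. A minimal sketch of that arithmetic, with hypothetical figures (in practice you would read the allocated and used sizes from SQL Server, e.g. via `sp_spaceused`):

```python
def empty_space(allocated_gb, used_gb):
    """Return the empty space in a database file, both in GB and as a
    percentage of the allocated file size. Inputs are assumed to come
    from SQL Server's reported allocated/used sizes.
    """
    free_gb = allocated_gb - used_gb
    return free_gb, 100.0 * free_gb / allocated_gb

# Hypothetical 80 GB crawl database file of which only 20 GB is in use:
free_gb, pct = empty_space(allocated_gb=80.0, used_gb=20.0)
print(f"{free_gb:.0f} GB empty ({pct:.0f}% of the file)")  # 60 GB empty (75% of the file)
```

A high percentage like this is the "bloat" the article above describes, and is the signal that reclaiming space is worth the effort.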

Other languages

This article is also available in the following languages: