If you need to pull huge amounts of data from a list or several lists, you'll run into list throttling limits. This wiki page discusses best practices for dealing with those limits.

The short-term solution is to increase the throttling limits. Once you resort to this strategy, you'll soon find that every developer wants the limits lifted, and server performance is in jeopardy. The throttling settings are there for a reason. Instead, follow these best practices:
  • Organize your content or retrieval mechanism so that you don't need to get more than the default threshold (5,000 items) in one go.
  • Use an advanced caching solution (such as Windows Server AppFabric Caching) so that the query with all the list items doesn't need to run repeatedly.
  • Use the search API instead: it has minimal impact on performance (which can be reduced further by using dedicated query servers) and is very fast at returning results. Be aware, however, that results lag behind the list until the next incremental crawl has run.
  • Make sure you are writing an optimized CAML query so you don't hit the threshold. You can do this by using indexed fields in your Where clauses. Another possible optimization is to order or sort by an indexed field. You can force this by always sorting by the ID field (which is always indexed) using the following:

    <OrderBy Override='TRUE'><FieldRef Name='ID' /></OrderBy>
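A full query combining an indexed Where clause with the forced ID sort might look like the sketch below. Note that `Status` is a hypothetical column used for illustration; substitute a column that actually has an index in your list:

```xml
<View>
  <Query>
    <Where>
      <!-- 'Status' stands in for any indexed column on your list -->
      <Eq>
        <FieldRef Name='Status' />
        <Value Type='Text'>Active</Value>
      </Eq>
    </Where>
    <!-- ID is always indexed, so forcing this sort keeps the query efficient -->
    <OrderBy Override='TRUE'><FieldRef Name='ID' /></OrderBy>
  </Query>
  <RowLimit>2000</RowLimit>
</View>
```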
  • You can also query in batches to stay under the limit. The PowerShell example at the link below uses a row limit of 2000 together with ListItemCollectionPosition: http://blogs.msdn.com/b/kaevans/archive/2012/02/13/iterating-large-sharepoint-lists-with-powershell.aspx
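The batching pattern from the linked PowerShell example can be sketched language-agnostically. In the Python sketch below, `fetch_page` is a stand-in simulation of running an SPQuery with a RowLimit and carrying the ListItemCollectionPosition forward between calls; all names here are illustrative, not the SharePoint API:

```python
def fetch_page(items, position, row_limit):
    """Simulated paged fetch mimicking the RowLimit /
    ListItemCollectionPosition contract: a position of None means
    start at the beginning; a returned position of None means done."""
    start = position or 0
    batch = items[start:start + row_limit]
    next_pos = start + row_limit
    return batch, (next_pos if next_pos < len(items) else None)

def fetch_all(items, row_limit=2000):
    """Loop until the collection position comes back empty,
    accumulating batches -- the same shape as the PowerShell example."""
    results, position = [], None
    while True:
        batch, position = fetch_page(items, position, row_limit)
        results.extend(batch)
        if position is None:
            return results
```

The key design point, in any language, is that each round trip requests at most `row_limit` items (kept under the threshold) and the loop terminates when the returned position is empty.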
Inspired by the forum discussion: http://social.technet.microsoft.com/Forums/en-US/sharepoint2010programming/thread/18802ebb-d9b6-44ad-a58b-e7c4810502fd