Based on my research, a table is theoretically limited only by available client and server memory (or by browser page-size constraints). On a 32-bit machine, the worker process handling the request has less than 2 GB of memory to work with, probably closer to 1 GB once kernel-mode memory is factored in. If you are going to serve up large reports like this, you probably need a 64-bit setup with at least 4 GB of memory on the box. That configuration allows the worker process to allocate more than 2 GB of usable memory and complete these large requests without issue.
Furthermore, adding page breaks can improve the performance of published reports, because only the items on the current page have to be processed as you view the report. So to reduce report processing time, I suggest you try adding page breaks.
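The idea behind page breaks can be sketched abstractly: when rows are consumed one page at a time, viewing a page only touches that page's items, not the whole result set. A minimal Python sketch of that pattern (the page size of 40 and the row source are hypothetical, not SSRS values):

```python
from itertools import islice

def paginate(rows, page_size):
    """Yield successive pages (lists) of at most page_size rows."""
    it = iter(rows)
    while True:
        page = list(islice(it, page_size))
        if not page:
            return
        yield page

# Hypothetical report data: 100,000 rows.
rows = range(100_000)

# Viewing the first page touches only 40 rows, not all 100,000.
first_page = next(paginate(rows, 40))
print(len(first_page))  # 40
```

This is only an illustration of lazy, page-at-a-time processing; in SSRS the page breaks themselves are configured on the report items, not in code.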
Hope this helps.
TechNet Community Support
Thank you very much for your reply. The server is 64-bit and has 4 GB of RAM. With this setup, what would be the maximum number of rows that can be retrieved in one shot? I do have pagination, but retrieving each next set of records takes a long time, and I would like to retrieve everything at once. Since the data does not change often in this case, I can set up caching.
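Since the data rarely changes, a simple time-based cache can avoid re-running the slow query on every request. SSRS has its own report caching (configured in Report Manager); the sketch below only illustrates the time-to-live idea in Python, with a hypothetical `run_report_query` standing in for the real, slow query:

```python
import time

class TtlCache:
    """Cache a value for ttl_seconds, then recompute on the next access."""
    def __init__(self, fetch, ttl_seconds):
        self.fetch = fetch        # zero-argument function that runs the query
        self.ttl = ttl_seconds
        self.value = None
        self.expires_at = 0.0

    def get(self):
        now = time.monotonic()
        if now >= self.expires_at:      # stale or never fetched
            self.value = self.fetch()   # run the slow query once
            self.expires_at = now + self.ttl
        return self.value

# Hypothetical stand-in for the slow report query.
calls = 0
def run_report_query():
    global calls
    calls += 1
    return [("row", i) for i in range(5)]

cache = TtlCache(run_report_query, ttl_seconds=3600)
cache.get()
cache.get()   # served from cache; the underlying query ran only once
```

With a one-hour TTL, every report view within that window is served from the cached result instead of hitting the database again.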