Friday, December 14, 2012 5:02 AM
Hi, this relates to SQL Server 2012 Analysis Services running out of memory when processing an SSAS solution in Tabular mode.
I have battled with this issue for a while now and went back to the actual Visual Studio project solution
to try to debug there.
I ran each table one by one and everything checks out.
The last table is also the largest, so I have split it up into partitions. I processed the first
partition in full and it completed okay. It contains almost 4 million records. The second
partition, which is much smaller, fails during processing at approximately 1.3 million records with the
error "The operation has been cancelled because there is not enough memory available for the
application. If using a 32-bit version of the product, consider upgrading to the 64-bit version or
increasing the amount of memory available on the machine."
I changed the partition to cover only a handful of records (around 35), but it still fails with the same error.
My settings are:
- C: 100GB with 67GB free when processing
- E: 100GB with 80GB free when processing
- OS: Windows Server 2012 64-bit
- HardMemoryLimit: 0
- HeapTypeForObjects: 0
- LowMemoryLimit: 65
- MemoryHeapType: 2
- TotalMemoryLimit: 80
- VertiPaqMemoryLimit: 60
- VertiPaqPagingPolicy: tried both 1 and 2
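For reference, these server properties live in msmdsrv.ini (or under the instance's Advanced Properties in SSMS). A sketch of the relevant section with the values above, assuming the standard element layout; note that values between 1 and 100 are interpreted as percentages of total physical memory, while larger values are treated as bytes:

```xml
<ConfigurationSettings>
  <Memory>
    <!-- Values 1-100 are percentages of physical RAM; larger values are bytes -->
    <LowMemoryLimit>65</LowMemoryLimit>
    <TotalMemoryLimit>80</TotalMemoryLimit>
    <!-- 0 = default: midway between TotalMemoryLimit and physical memory -->
    <HardMemoryLimit>0</HardMemoryLimit>
    <VertiPaqMemoryLimit>60</VertiPaqMemoryLimit>
    <!-- 0 = keep everything in memory; nonzero allows paging to the system page file -->
    <VertiPaqPagingPolicy>1</VertiPaqPagingPolicy>
  </Memory>
</ConfigurationSettings>
```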
Please help, this is doing my head in!
'If this was helpful, please mark as an answer or as being helpful'
Friday, December 14, 2012 6:33 AM
Consider upgrading the machine with more RAM.
Please vote as helpful or mark as answer, if it helps
Cheers, Raunak | t: @raunakjhawar | My Blog
Friday, December 14, 2012 12:41 PM
Moderator
I agree, 4GB RAM is probably not enough and whatever you do, you should probably upgrade.
However... there must be something else going on if processing fails, even when you only have a small number of records in the second partition. Some ideas:
- Because SSAS Tabular is a column-store database, thinking about your data in terms of numbers of rows is not very helpful. What you need to look out for instead is columns with a very large number of distinct values, such as primary key columns on fact tables. Can you try deleting any of the columns on the table that you don't need, especially ones with a large number of distinct values? This would reduce your overall memory overhead.
- You can also try to unprocess the entire model before you process anything, because this will also reduce your memory overhead: when you process a table, the 'old' data stays in memory until processing finishes so it can still be queried. Unprocessing means there is no 'old' data left to take up memory.
- When you process a partition, SSAS will also recalculate any calculated columns on the table this partition is in, or that refer to that table. If you have any calculated columns this could explain what's going on here, because complex DAX calculations can be very memory-intensive.
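To see which columns have the biggest dictionaries (i.e. the most distinct values), you can query the storage engine DMVs from an MDX query window in SSMS connected to the Tabular instance. A sketch:

```sql
-- Per-column storage details for the deployed model; look at the
-- DICTIONARY_SIZE column (bytes) - very large dictionaries indicate
-- columns with many distinct values, which are candidates for removal
SELECT * FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMNS
```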
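Unprocessing the model can be done from SSMS (right-click the database, Process, then choose Process Clear) or with an XMLA command. A sketch, where MyTabularDB is a placeholder for your actual database ID:

```xml
<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <!-- ProcessClear drops all processed data from the database -->
  <Type>ProcessClear</Type>
  <Object>
    <DatabaseID>MyTabularDB</DatabaseID>
  </Object>
</Process>
```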
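On the calculated-column point, one way to reduce the memory spike is to load data with ProcessData (which skips recalculation) and then run a single ProcessRecalc at the end, rather than letting every partition trigger a recalculation. A hedged XMLA sketch, with MyTabularDB and MyLargeTable as placeholder IDs:

```xml
<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Process>
    <!-- Loads data only; calculated columns are not evaluated yet -->
    <Type>ProcessData</Type>
    <Object>
      <DatabaseID>MyTabularDB</DatabaseID>
      <DimensionID>MyLargeTable</DimensionID>
    </Object>
  </Process>
  <Process>
    <!-- Rebuilds calculated columns, relationships and hierarchies once -->
    <Type>ProcessRecalc</Type>
    <Object>
      <DatabaseID>MyTabularDB</DatabaseID>
    </Object>
  </Process>
</Batch>
```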
Check out my MS BI blog. I also do SSAS, PowerPivot, MDX and DAX consultancy and run public SQL Server and BI training courses in the UK.