Poor performance of Table Valued Parameters

    Question

  • Last night I changed my code to use TVPs instead of generating huge SQL statements. I need this because I do a Lucene.Net full-text search and then pass the found document identifiers to a SQL query that has to perform some operations. Sometimes I have to pass more than 1 million identifiers, so I thought a TVP would be a good fit.

    I changed my code to this:

            private SqlParameter GenerateLongTableSql(string variableName, IList<long> values)
            {
                SqlParameter tvpParam = new SqlParameter();
                tvpParam.ParameterName = "@" + variableName;
                tvpParam.Value = GenerateLongTable(values);
                tvpParam.SqlDbType = SqlDbType.Structured;
                tvpParam.TypeName = "dbo.tp_longList";
                return tvpParam;
            }
            private DataTable GenerateLongTable(IList<long> intList)
            {
                DataTable tmpTable = new DataTable();
                tmpTable.Columns.Add(new DataColumn("id", typeof(long)));
                if (intList != null)
                {
                    foreach (long id in intList.Distinct())
                        tmpTable.Rows.Add((object)id);
                }
                return tmpTable;
            }
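    For completeness, here is a sketch of how that generated parameter might be attached to a command. The method name, connection string, and query text are illustrative placeholders, not from the original post; the table type dbo.tp_longList must already exist in the database.

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;

// Hypothetical usage of the GenerateLongTableSql helper shown above.
void RunCategoryQuery(string connectionString, IList<long> documentIds)
{
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(
        "SELECT COUNT(*) FROM catJournalBids AS jb " +
        "INNER JOIN @categoryTable AS catJoin ON catJoin.id = jb.catid;",
        connection))
    {
        // The TVP travels like any other parameter; no SQL text is built.
        command.Parameters.Add(GenerateLongTableSql("categoryTable", documentIds));
        connection.Open();
        var count = command.ExecuteScalar();
    }
}
```

    The point of this approach is that the identifiers never appear in the command text, so the statement stays small and cacheable regardless of how many ids are passed.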


    Everything worked fine, but after a few hours (during the day, when I assume overall usage of the Azure databases is higher) I started getting timeouts (30 seconds) on queries against federation members. While looking for the reason, I reverted my code, and now I have no timeout issues. It's strange, because the amount of data and the way it is used didn't change.

    Now my code looks like this:

            private string GenerateLongTableSqlString(string variableName, IList<long> values)
            {
                StringBuilder sqlQuery = new StringBuilder();
                sqlQuery.AppendFormat("DECLARE @{0} tp_longList;", variableName);
                sqlQuery.AppendFormat("INSERT INTO @{0} VALUES {1};", variableName, String.Join<string>(",", values.Select(x => "(" + x + ")")));
                return sqlQuery.ToString();
            }


    Can someone explain to me why, with the TVP, the command cannot finish within 30 seconds, but when I change it to the query-string-building solution, it runs in less than a second?

    Parameters, data, server and usage volumes are identical. The only difference is how I put the data into the query.

    And the tested SQL query is as simple as:

    SELECT [...]
    FROM catJournalBids AS jb
    INNER JOIN (SELECT id FROM @categoryTable) AS catJoin ON catJoin.id = jb.catid
    WHERE jb.dtDay BETWEEN '2012-01-01' AND '2012-01-30'
    GROUP BY jb.dtDay ORDER BY jb.dtDay ASC


    • Edited by Malachowski Thursday, May 24, 2012 7:01 PM Added sql query
    Thursday, May 24, 2012 6:52 PM

Answers

  • A couple of thoughts that might improve performance.  First, consider passing an IEnumerable<SqlDataRecord> instead of a DataTable.  Second, make sure the "id" column of your table type is declared as a primary key.
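    A minimal sketch of both suggestions applied to the question's tp_longList scenario (the type definition and the streaming class below are illustrative, assuming a single bigint "id" column):

```csharp
using System.Collections;
using System.Collections.Generic;
using System.Data;
using Microsoft.SqlServer.Server;

// Table type with "id" declared as primary key (run once in the database):
//   CREATE TYPE dbo.tp_longList AS TABLE (id bigint NOT NULL PRIMARY KEY);

// Streams ids as SqlDataRecords so ADO.NET can send rows as they are
// enumerated, instead of materializing a whole DataTable up front.
class LongListRecords : IEnumerable<SqlDataRecord>
{
    private static readonly SqlMetaData[] metaData =
        { new SqlMetaData("id", SqlDbType.BigInt) };
    private readonly IEnumerable<long> ids;

    public LongListRecords(IEnumerable<long> ids)
    {
        this.ids = ids;
    }

    public IEnumerator<SqlDataRecord> GetEnumerator()
    {
        foreach (var id in ids)
        {
            var record = new SqlDataRecord(metaData);
            record.SetInt64(0, id);
            yield return record;
        }
    }

    IEnumerator IEnumerable.GetEnumerator()
    {
        return GetEnumerator();
    }
}
```

    An instance of this class can then be assigned to tvpParam.Value in place of the DataTable returned by GenerateLongTable in the question.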


    Dan Guzman, SQL Server MVP, http://weblogs.sqlteam.com/dang/

    Saturday, June 02, 2012 1:39 AM
  • Since both SQL Server 2012 (and to a lesser extent 2008 and 2008 R2) and SQL Azure share a code-base, they also share the query optimizer engine.  Therefore, direct comparisons between the two are a good way to allow you to leverage the more mature manageability features of SQL Server (Profiler, XEvents, etc.) to diagnose your problem and then apply the solution to SQL Azure.

    If you don't want to go that route, you can certainly start by approaching this as a query execution problem.  Look at your execution plans and look for query plan problems (missing indexes, implicit conversions, etc.).  Have you taken that step yet?

    Evan

    Saturday, June 02, 2012 3:23 PM

All replies

  • The core engine that runs SQL Azure is the same one that runs SQL Server 2012.  Therefore, it would surprise me to see any significant difference in the overall performance of a specific TVP-based query.  Have you tried your TVP approach against an on-premises version of SQL Server (preferably 2012, but 2008 or 2008 R2 should be close enough for this test)?

    Evan

    Friday, June 01, 2012 1:04 PM
  • Have you tried your TVP approach against an on-premises version of SQL Server (preferably 2012, but 2008 or 2008 R2 should be close enough for this test)?

    I have not. My goal is to get this app working on Azure, so testing it against an on-premises SQL Server makes no sense to me; that test wouldn't give me an answer I could act on.
    Saturday, June 02, 2012 12:43 AM
  • http://blogs.msdn.com/b/benko/archive/2011/12/15/how-to-get-to-sql-azure-query-performance-data.aspx

    http://social.technet.microsoft.com/wiki/contents/articles/sql-azure-performance-and-elasticity-guide.aspx

    Do you think the execution plan of a query can change depending on how I put values into a table variable before executing that query?
    Saturday, June 02, 2012 12:45 AM
  • Can I ask how you can pass 1 million identifiers with

    sqlQuery.AppendFormat("DECLARE @{0} tp_longList;", variableName);
    sqlQuery.AppendFormat("INSERT INTO @{0} VALUES {1};", variableName, String.Join<string>(",", values.Select(x => "(" + x + ")")));

    given that a table value constructor is limited to 1000 rows?

    Tuesday, November 26, 2013 10:34 AM
  • Can I ask how you can pass 1 million identifiers with

    sqlQuery.AppendFormat("DECLARE @{0} tp_longList;", variableName);
    sqlQuery.AppendFormat("INSERT INTO @{0} VALUES {1};", variableName, String.Join<string>(",", values.Select(x => "(" + x + ")")));

    given that a table value constructor is limited to 1000 rows?

    Rather than building the list of values with string concatenation, you can pass the values as an IEnumerable<SqlDataRecord>.  This will perform better and allow you to pass any number of values.  Below is a sample SQL script along with a C# method and class you can refactor for your needs.

    --CREATE type for TVP
    CREATE TYPE dbo.StringTvpType AS TABLE(
    	StringValue varchar(10) NOT NULL
    	);

    using System.Collections.Generic;
    using System.Data;
    using System.Data.SqlClient;
    using System.Linq;
    using System.Text;
    using Microsoft.SqlServer.Server;
    
    void insertTvpData(IEnumerable<String> values)
    {
    
    	var records = new StringSqlDataRecords(values);
    
    	using (var connection = new SqlConnection(connectionString))
    	using (var command = new SqlCommand("INSERT INTO dbo.SomeTable SELECT StringValue FROM @tvp;", connection))
    	{
    		var tvpParm = command.CreateParameter();
    		command.Parameters.Add(tvpParm);
    		tvpParm.ParameterName = "@tvp";
    		tvpParm.SqlDbType = SqlDbType.Structured;
    		tvpParm.TypeName = "dbo.StringTvpType";
    		tvpParm.Value = records;
    		connection.Open();
    		command.ExecuteNonQuery();
    		connection.Close();
    	}
    
    	return;
    
    }
    
    class StringSqlDataRecords : IEnumerable<SqlDataRecord>
    {
    
    	private IEnumerable<String> strings;
    	private SqlMetaData[] metaData = new SqlMetaData[1] { new SqlMetaData("value", SqlDbType.VarChar, 10) };
    
    	public StringSqlDataRecords(IEnumerable<String> strings)
    	{
    		this.strings = strings;
    	}
    
    	public IEnumerator<SqlDataRecord> GetEnumerator()
    	{
    
    		foreach (var str in strings)
    		{
    			var record = new SqlDataRecord(metaData);
    			record.SetString(0, str);
    			yield return record;
    		}
    	}
    
    	System.Collections.IEnumerator System.Collections.IEnumerable.GetEnumerator()
    	{
    		return GetEnumerator();
    	}
    }


    Dan Guzman, SQL Server MVP, http://www.dbdelta.com

    Tuesday, November 26, 2013 1:06 PM
  • Can I ask how you can pass 1 million identifiers with

    sqlQuery.AppendFormat("DECLARE @{0} tp_longList;", variableName);
    sqlQuery.AppendFormat("INSERT INTO @{0} VALUES {1};", variableName, String.Join<string>(",", values.Select(x => "(" + x + ")")));

    given that a table value constructor is limited to 1000 rows?

    I've used a ChunkedEnumerable extension method (easy to find online), so my code looks like this:

            private string GenerateLongTableSqlString(string variableName, IList<long> values)
            {
                StringBuilder sqlQuery = new StringBuilder();
                sqlQuery.AppendFormat("CREATE TABLE #{0} (id bigint not null, PRIMARY KEY(id), UNIQUE(id));", variableName);
    
                foreach (var group in values.Chunk(1000))
                {
                    sqlQuery.AppendFormat("INSERT INTO #{0} VALUES {1};", variableName, String.Join<string>(",", group.Select(x => "(" + x + ")")));
                }
                return sqlQuery.ToString();
            }

    Tuesday, November 26, 2013 1:14 PM