Microsoft SQL Server Code-Named "Denali" CTP3 Release Notes




This Release Notes document describes known issues that you should read about before you install or troubleshoot SQL Server Code-Named “Denali” CTP3 (click here to download it). This Release Notes document is available online only, not on the installation media, and it is updated periodically.

For information about how to get started and install SQL Server Code-Named “Denali”, see the SQL Server Code-Named “Denali” Readme. The Readme document is available on the installation media and from the Readme download page. You can also find more information in SQL Server Books Online and on the SQL Server "Denali" Forums.
Contents
1.0 Before You Install
Before you install SQL Server Code-Named “Denali”, read the following required actions.
1.1 Installation Prerequisite Handling During SQL Server Code-Named “Denali” Setup
The following items describe the prerequisite installation behavior during SQL Server Code-Named “Denali” Setup:
  • On Windows 7 and Windows Server 2008 R2, installing SQL Server Code-Named “Denali” is supported only with SP1. However, Setup does not block installing SQL Server Code-Named “Denali” on Windows 7 or Windows Server 2008 R2 without the service pack.
  • The .NET Framework 3.5 SP1 is a requirement for SQL Server Code-Named “Denali” when you select Database Engine, Replication, Master Data Services, Reporting Services, Data Quality Services (DQS), or SQL Server Management Studio, and the framework is no longer installed by SQL Server Setup.
    • If you run Setup on a computer with the Windows Vista SP2 or Windows Server 2008 SP2 operating system, and you do not have the .NET Framework 3.5 SP1 installed, SQL Server Setup requires you to download and install the .NET Framework 3.5 SP1 before you can continue with the SQL Server installation. The error message includes a link to the Download Center, or you can download the .NET Framework 3.5 SP1 from Windows Update. To avoid the interruption during SQL Server Setup, you can download and install the .NET Framework 3.5 SP1 before you run SQL Server Setup.
    • If you run Setup on a computer with the Windows 7 SP1 or Windows Server 2008 R2 SP1 operating system, you must enable the .NET Framework 3.5 SP1 before you install SQL Server Code-Named "Denali".
  • The .NET Framework 4 is a requirement for SQL Server Code-Named “Denali”. SQL Server Setup installs the .NET Framework 4 during the feature installation step. SQL Server Express does not install the .NET Framework 4 when installing on the Windows Server 2008 R2 SP1 Server Core operating system. You must install the .NET Framework 4 before you install SQL Server Express on a Windows Server 2008 R2 SP1 Server Core operating system.
  • To make sure that the Visual Studio component can be installed correctly, SQL Server requires you to install an update. SQL Server Setup checks for the presence of this update and then requires you to download and install the update before you can continue with the SQL Server installation. To avoid the interruption during SQL Server Setup, you can download and install the update as described below before running SQL Server Setup (or you can install all the updates for the .NET Framework 3.5 SP1 that are available on Windows Update):
    • If you install SQL Server Code-Named “Denali” on a computer with the Windows Vista SP2 or Windows Server 2008 SP2 operating system, you can get the required update from here.
    • If you install SQL Server Code-Named “Denali” on a computer with the Windows 7 SP1 or Windows Server 2008 R2 SP1 operating system, this update is already installed on the computer.
  • Windows PowerShell 2.0 is a prerequisite for installing SQL Server Code-Named “Denali” Database Engine components and SQL Server Management Studio, but Windows PowerShell is no longer installed by SQL Server Setup. If PowerShell 2.0 is not present on your computer, you can enable it by following the instructions on the Windows Management Framework page.
1.2 Supported Operating Systems for SQL Server Code-Named “Denali” Evaluation Edition
SQL Server Code-Named “Denali” Evaluation Edition is supported on the Windows Vista SP2, Windows Server 2008 SP2, Windows Server 2008 R2 SP1, and Windows 7 SP1 operating systems.
1.3 Installing SQL Server Code-Named “Denali” on Windows Server 2008 R2 Server Core Service Pack 1
Starting with SQL Server Code-Named “Denali” CTP3, you can install SQL Server on Windows Server 2008 R2 Server Core SP1, with the following limitations:
  • Microsoft SQL Server Code-Named “Denali”, Community Technology Preview 3 (CTP 3) does not support Setup using the installation wizard on the Server Core operating system. When installing on Server Core, SQL Server Setup supports full quiet mode by using the /Q parameter, or Quiet Simple mode by using the /QS parameter.
  • Upgrade of an earlier version of SQL Server to Microsoft SQL Server Code-Named “Denali”, Community Technology Preview 3 (CTP 3) is not supported on a computer that is running Windows Server 2008 R2 Server Core SP1.
  • Installing a 32-bit version of Microsoft SQL Server Code-Named “Denali”, Community Technology Preview 3 (CTP 3) edition is not supported on a computer running Windows Server 2008 R2 Server Core SP1.
  • Microsoft SQL Server Code-Named “Denali”, Community Technology Preview 3 (CTP 3) cannot be installed side-by-side with earlier versions of SQL Server on a computer that is running Windows Server 2008 R2 Server Core SP1.
  • When you try to launch SQL Server Setup by running “setup.exe /QS” on Server Core without specifying /Action, Setup opens the SQL Server Installation Center and cannot proceed with the installation. The workaround is to specify an action by using the /Action command-line parameter.
  • Microsoft Sync Framework is supported on Windows Server 2008 R2 Server Core SP1. You can download the appropriate version of Sync Framework from this Microsoft Download Center page (http://go.microsoft.com/fwlink/?LinkId=221788) and install it on a computer that is running Windows Server 2008 R2 Server Core SP1.
1.4 “Test” Update Available to Try New Product Update Feature
In this release of SQL Server, you can try out the new Product Update feature. This feature integrates the latest product updates with the main product installation so that the main product and its applicable updates are installed at the same time. Product Update can search Microsoft Update, Windows Server Update Services (WSUS), a local folder, or a network share for applicable updates.
For this CTP release, the test update will not be distributed to WSUS. When SQL Server 2011 releases, updates will be available on WSUS under the product “SQL Server 2011 Product Updates For Setup”. After a WSUS administrator approves the updates, any client that is pointed to the WSUS service will see them. For this CTP release, you can still verify the WSUS functionality by importing from the Microsoft Update Catalog. To do this, follow these steps:
  1. Right-click the Updates node within your WSUS server, and then click Import Updates.
  2. In the Search box, search for SQL Server "Denali" CTP3 Test Update.
  3. Select the update to add, and then click View Basket.
  4. Select Import directly into Windows Server Update Services.
  5. Approve the imported update.
For more information about the Product Updates feature in SQL Server Code-Named “Denali”, see "Product Updates" in the SQL Server "Denali" Installation topic.
1.5 Setup Error on Windows Vista and Windows Server 2008
SQL Server Code-Named “Denali” requires Windows PowerShell 2.0 as a prerequisite. PowerShell 2.0 currently cannot be installed on either Windows Vista or Windows Server 2008 if the operating system language is one of the following 11 languages: Bulgarian (Bulgaria), Estonian (Estonia), Croatian (Croatia), Lithuanian (Lithuania), Latvian (Latvia), Slovak (Slovakia), Slovenian (Slovenia), Serbian (Latin, Serbia), Romanian (Romania), Thai (Thailand), or Ukrainian (Ukraine). You might see the following error:
"This package is not applicable."
1.6 Renaming the SQL Network Name Resource Is Not Supported in SQL Server Code-Named "Denali"
SQL Server Code-Named “Denali” does not support renaming the SQL Network Name resource. If you rename an existing SQL Network Name resource using Windows Cluster Administrator, SQL Server Upgrade and other setup operations of the renamed cluster instance might fail.
1.7 Instance Name Ending with '$' Fails Validation During Setup on Clustered Environment
SQL Server Setup requires users to define the instance name for new installations. If you define an instance name that ends with "$", Setup fails to validate the default accounts derived from the instance name on the Server Configuration page, where accounts for services are identified. You will see the following validation error in the Setup UI, or in the Detail.txt setup log if Setup is run from the command line:
“The name provided is not a properly formed account name.”
Workaround: Change the instance name, making sure that it does not end with "$". Or, you can change service accounts that end with "$" to another account, such as a service account, that is appropriate for this purpose.

1.8 A Cluster Group Name that Includes a "\" or "'" Will Cause Setup to Fail When Installing a Clustered Instance 

SQL Server Setup requires users to specify a name for the SQL Server cluster resource group for new cluster installations. If you define a cluster group name that includes a "\" or "'", Setup will later fail when it configures the cluster group. You might see an error similar to the following:

“The cluster group ‘MYGROUP\WITHBACKSLASH’ could not be created.”


Workaround: Change the cluster group name, making sure that it does not include "\" or "'".

1.9 Semantic Search Requires You to Install an Additional Dependency
Statistical Semantic Search has an additional prerequisite, the semantic language statistics database, which is not installed by the SQL Server Setup program.
Workaround: To set up the semantic language statistics database as a prerequisite for semantic indexing, do the following things:
  1. Locate and run the Windows Installer package named SemanticLanguageDatabase.msi on the SQL Server installation media to extract the database. For SQL Server Code-Named “Denali” Express, download the semantic language statistics database from Microsoft Download Center (http://go.microsoft.com/fwlink/?LinkId=221787), and then run the Windows Installer package.
  2. Move the database to an appropriate data folder. If you leave the database in the default location, you must change permissions before you can attach it successfully.
  3. Attach the extracted database.
  4. Register the database by calling the stored procedure sp_fulltext_semantic_register_language_statistics_db and providing the name that you gave to the database when you attached it.
If these tasks are not completed, you will see the following error message when you try to create a semantic index:
Msg 41209, Level 16, State 3, Line 1
A semantic language statistics database is not registered. Full-text indexes using 'STATISTICAL_SEMANTICS' cannot be created or populated.
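Steps 3 and 4 above can be sketched in Transact-SQL as follows. This is a minimal sketch: the database name and file path are illustrative, and should be replaced with the location to which you moved the extracted database.

```sql
-- Step 3: attach the extracted semantic language statistics database
-- (the database name and file path below are illustrative)
CREATE DATABASE semanticsdb
    ON (FILENAME = 'C:\SqlData\semanticsdb.mdf')
    FOR ATTACH;
GO

-- Step 4: register the database under the name you used when attaching it
EXEC sp_fulltext_semantic_register_language_statistics_db @dbname = N'semanticsdb';
GO
```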
1.10 Sync Framework Is Not Included in the Installation Package
Sync Framework is not included in the SQL Server Code-Named "Denali" CTP3 installation package. You can download the appropriate version of Sync Framework from this Microsoft Download Center page.

1.11 Using Denali's SharePoint Features Requires SharePoint Server 2010 SP1
If you plan to use SQL Server Code-Named "Denali" features in a SharePoint environment, then SharePoint Server 2010 Service Pack 1 (SP1) is required.
Workaround: You must install SP1 and fully patch the server farm before you add SQL Server features to the farm. This requirement applies to the following Denali features: using an instance of Database Engine as the farm's database server, configuring PowerPivot for SharePoint, or deploying Reporting Services in SharePoint mode.

2.0 Analysis Services
2.1 Cannot Create or Open Tabular Projects When BUILTIN\Administrators Group Name Cannot Be Resolved
You must be an administrator on a workspace database server before you can create or open tabular projects. A user can be added to the server administrators group by user name or by group name. If you are a member of the BUILTIN\Administrators group, you cannot create or edit BIM files unless the workspace database server is joined to the domain from which it was originally provisioned. If you open or create a BIM file, the operation fails with the following error message:
"The BIM file cannot be opened. The server connected to is not valid. Reason: You are not an administrator of server [server name]."
Workarounds:
  • Re-join the workspace database server and Business Intelligence Development Studio (BIDS) computer to the domain.
  • If the workspace database server and/or BIDS computers are not going to be domain joined at all times, add individual user names instead of the BUILTIN\Administrators group as administrators on the workspace database server.
2.2 RS Data Driven Subscriptions Might Not Work Against an AS Data Source
SQL Server 2008 Analysis Services (AS) OLE DB Provider unconditionally overwrites the registry to point to itself as the latest provider, whether or not a newer provider is installed. Data Driven Subscriptions within Reporting Services (RS) require the SQL Server Code-Named "Denali" provider to work.
Office 2010 installs the SQL Server 2008 provider. In the case where Office is installed after Denali, the SQL Server 2008 provider declares itself as the latest even though it is not. If you attempt to create or edit a Data Driven Subscription with a Delivery Query that obtains data from Analysis Services using MDX, then you receive the following error:
"The dataset cannot be generated. An error occurred while connecting to a data source, or the query is not valid for the data source."
Workaround: Repair the Denali CTP3 version of Microsoft Analysis Services OLE DB Provider for Microsoft SQL Server.
2.3 Installing SQL Server 2008 R2 after installing SQL Server Denali causes parse failures in the Reporting Services Query Designer
Create a new report project in SQL Server Denali. Add a new dataset to this report. Open the Query Designer and add a query. You might receive the following error message:
"Query preparation failed. Failed to parse the query to detect if it is MDX or DMX. (MDXQueryGenerator)"
Workaround: This error message occurs only if you install SQL Server 2008 R2 after installing SQL Server Denali. To fix the problem, re-install the MSOLAP provider using the Denali provider. Download SQL Server Code-Named “Denali” CTP3 from the Download Center page.
2.4 Databases Cannot Be Restored Using SSMS When a Server Only Has Models in DirectQuery Mode
You cannot restore a database by using SQL Server Management Studio (SSMS) when the server only has models in DirectQuery Mode.
If you install an Analysis Services instance running in VertiPaq mode, deploy only DirectQuery models to it, right-click any database, click Restore or Restore from PowerPivot, and then click Browse to locate a file, the following error message appears:
"DirectQuery error: MDX/SQL operations are not supported in DirectQuery mode."
Workaround: You can restore the database by using XMLA script, PowerShell, or AMO. You can use SSMS to restore DirectQuery models if there is at least one VertiPaq model on the server.
2.5 Creating Partitions After Deployment Can Fail the Next Deployment
Processing might fail after you re-deploy a model with modified partition definition queries.
For example, you might start with a model that has one partition (Partition 1), deploy and process this model, and leave your data unchanged. You then modify your partition definition, define a second partition (Partition 2) that includes a subset of the data previously in Partition 1, and deploy and process your model again. When the model is deployed and processed by using the Default processing option, the engine determines that Partition 1 does not need to be processed because it already has data, and it therefore introduces duplicate data into the model when Partition 2 is processed.
Workarounds:
  • Before you re-deploy your model, go to SQL Server Management Studio (SSMS), right-click the database, and then click Process Database. Select the Process Clear option from the dialog box, and then click OK. This removes all the data from your database, and then you can deploy and process from Business Intelligence Development Studio (BIDS) as expected.
  • Change your deployment setting for the model. Right-click your .smproj file, and then click Properties. Change the Processing Option property from Default to Do Not Process or Full.
2.6 Metadata Pane in SSMS Is Empty When You Browse an AS Database
The metadata pane is empty when you browse an Analysis Services (AS) database in SQL Server Management Studio (SSMS).
For example, if you open SSMS, connect to an AS server, right-click a database, and then click Browse, the metadata pane in the cube browser is empty.
Workaround: Drag the metadata pane to resize it, or switch from Design mode to Text mode. This will show the model metadata and allow you to drag-and-drop into the cube browser.
2.7 Working with Large Models Can Be Difficult to Navigate
In tabular modeling, when you work on a model with many tables, the table tabs do not allow scrolling as they do in Excel. Navigating the tables is a challenge when you have about 10 or more tables.
Workarounds:
  • You can drag a table from the list and position the tab where you want, so that you can access specific tabs more easily.
  • You can rearrange the order and position of the existing tabs.
2.8 SSIS Components for AS Tabular Models Do Not Work as Expected
SQL Server Integration Services (SSIS) components for Analysis Services (AS) do not work as expected for tabular models.
There are a number of known issues when you try to write an SSIS package for working with tabular models:
  • The AS Connection Manager cannot use a tabular model in the same solution as a data source. You must explicitly connect to the AS server before configuring the AS Processing Task or the AS Execute DDL Task.
  • There are a number of issues with the AS Processing Task when you work with tabular models:
    • Instead of databases, tables, and partitions, you see cubes, measure groups, and dimensions. This is a limitation of the task. You can still process your tabular model using the cube/measure group/dimension structure.
    • Some processing options supported by AS running in VertiPaq mode are not exposed in the AS Processing Task, such as Process Defrag.

      Workaround: Use the Analysis Services Execute DDL task instead to execute an XMLA script that contains the ProcessDefrag command.

    • Some configuration options in the tool are not applicable. For example, "Process related objects" should not be used when processing partitions, and the "Parallel Processing" configuration option contains an invalid error message stating that parallel processing is not supported on the standard SKU.
 
2.9 A Model Cannot Be Deployed if You Previously Excluded the BIM File from the Tabular Model Project
You cannot deploy a model if you have previously excluded the .bim file from the tabular model project.
For example, you might open a tabular model project, display the Solution Explorer window, make sure Show all files is selected, right-click the .bim file in Solution Explorer, click Exclude from Project, right-click the .bim file in Solution Explorer, click Include in Project, and then deploy the project.
The build fails, and the following error appears in the Error List:
"Error 1 Cannot build projects that contain more than one BIM file C:\Program Files (x86)\MSBuild\Business Intelligence Semantic Model\1.0\Microsoft.AnalysisServices.VSHostBuilder.targets 71 8 TabularProject7"
Workaround: Exclude the file from the project again. To add it back to the project, right-click the project file in the Solution Explorer, click Add, click Existing Item, and then click the .bim file that was previously excluded.
2.10 Cannot Change Connection in SSMS for Some Connection Types in Tabular Models
In the SQL Server Management Studio (SSMS) Connection Properties dialog box for tabular models, the Connection Manager shows the wrong provider for some types of connections, such as data feeds.
Workaround: Manually type the connection string in the Connection Properties dialog box rather than using the Connection Manager.
2.11 Filtering Members from Different Levels of a Hierarchy Causes an Error
With SQL Server Analysis Services installed in tabular mode, Data Analysis Expressions (DAX) and Multidimensional Expressions (MDX) queries against a tabular model that involve arbitrary shapes with mixed granularity will fail with an error message that indicates such queries are not supported in VertiPaq mode. You see the following error in the PivotTable:
“The query requires the generation of a data request involving a slice that consists of an arbitrary-shaped set with tuples from different hierarchy levels, which is not supported when using Analysis Services in VertiPaq mode.”
2.12 Installing BIDS Fails to Load the VSTS Package
If you install and open Business Intelligence Development Studio (BIDS) and then create a new Analysis Services, Reporting Services, or Integration Services project, the following error message appears:
"'VSTS for Database Professionals SQL Server Data-tier Application' package didn't load correctly"
You can ignore this message if you want to work with Analysis Services, Reporting Services, or Integration Services.
Workaround: To work with the data applications, install Project Juneau after you install BIDS.
2.13 BIDS Stops Responding When You Copy a Large Amount of Data to the Clipboard
If you copy a large amount of data to the clipboard or select many columns and rows in a tabular model project, Business Intelligence Development Studio (BIDS) stops responding.
Workaround: Copy only small data sets from the tabular project.
2.14 Overriding a Database in Restore from PowerPivot Is Not Supported
In the Restore from PowerPivot dialog box, you can find an Override Database checkbox. Overriding a database when restoring a PowerPivot workbook is not supported for the SQL Server Code-Named "Denali" CTP3 release of PowerPivot.
2.15 You Cannot Use a Calculated Column to Mark a Table as a Date Table
In Business Intelligence Development Studio (BIDS), you cannot use calculated columns to mark a table as a Date table. For example, if you add a calculated column, click the Mark As Date Table button, and select the calculated column, then you receive the following error:
"The calculated column 'Table[CalculatedColumn1]' cannot be used as a primary key of the table."
2.16 New Item Dialog Box in BIDS Does Not Include the BIM Item Template
In Business Intelligence Development Studio (BIDS), you might open Solution Explorer, right-click an empty tabular model project, and then click Add New Item. The New Item dialog box is empty, and you cannot add a new BIM file.
Workaround: Create a new project. Use Add Existing Item to add the newly created BIM file to your tabular project.
[Top]
3.0 Books Online
3.1 Help Viewer for SQL Server Crashes in Environments Configured to Run Only IPv6
If your environment is configured to run only IPv6, the Help Viewer for SQL Server Code-Named “Denali” will crash, and you will be presented with the following error message:
HelpLibAgent.exe has stopped working.
Workaround: To avoid this issue, enable IPv4, or use the following steps to add a registry entry and create an ACL to enable the Help viewer for IPv6:
  1. Create a registry key with the name “IPv6” and a value of “1 (DWORD(32 bit))” under HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Help\v1.0.
  2. Set the security ACLs for the port for IPv6 by executing the following from an administrator command prompt:
netsh http add urlacl url=http://[::1]:47873/help/ sddl=D:(A;;GX;;;WD).
Note
This applies to all environments running with only IPv6 enabled. Environments with IPv4 enabled (including IPv4 together with IPv6) are not affected.
4.0 Data Quality Services
4.1 To Reinstall DQS Server, Delete the DQS Objects After Uninstalling DQS Server
If you uninstall the Data Quality Services (DQS) server, the DQS objects (DQS databases, DQS logins, and a DQS stored procedure) are not deleted from the SQL Server instance.
Workaround: To reinstall the DQS server on the same computer and in the same SQL Server instance, you must manually delete the DQS objects from the SQL Server instance. Additionally, you must also delete the DQS databases (DQS_MAIN and DQS_PROJECTS) files from the C:\Program Files\Microsoft SQL Server\MSSQL11.<SQL_Server_Instance>\MSSQL\DATA folder on your computer before you reinstall DQS server. Otherwise, the DQS server installation fails. Move the database files instead of deleting them if you want to preserve data, such as knowledge bases or data quality projects. For more information about removing DQS objects after the uninstall process is complete, see Remove Data Quality Services Objects.
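As a sketch, the manual cleanup described above might look like the following Transact-SQL. The object names below are the ones named in this section; verify the complete list against the Remove Data Quality Services Objects topic before running anything.

```sql
-- Drop the DQS databases (move the .mdf/.ldf files first if you want to
-- preserve knowledge bases or data quality projects)
DROP DATABASE DQS_MAIN;
DROP DATABASE DQS_PROJECTS;
GO
-- Drop the DQS logins
DROP LOGIN [##MS_dqs_db_owner_login##];
DROP LOGIN [##MS_dqs_service_login##];
GO
-- Also drop the DQS stored procedure in the master database;
-- see the linked topic for its name.
```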
4.2 Indication of a Terminated Knowledge Discovery or Interactive Cleansing Activity is Delayed

If an administrator terminates an activity in the Activity Monitoring screen, an interactive user who is running the knowledge discovery, domain management, or interactive cleansing activity will not receive any indication that his or her activity was terminated until he or she performs the next operation.
4.3 When a Knowledge Base is Locked, a Knowledge Management Activity Cannot Be Opened
If a knowledge base is locked, you should be able to open a knowledge discovery or domain management activity as read-only. However, in CTP3 if a knowledge base is locked, the knowledge discovery and domain management activities are not available.
4.4 Terminating an Activity While Finish Is in Progress Can Fail
If a knowledge discovery or domain management activity is terminated on the Activity Monitoring screen while a Finish operation is in progress for the activity, an exception is thrown.
Workaround: To avoid this, do not terminate a knowledge discovery or domain management activity while Finish is in progress.
4.5 The Data Connection Will Fail If ##MS_dqs_db_owner_login## and ##MS_dqs_service_login## Are Not Users of the Database or the Data Is Not in a DQS Database
For a connection to a data source to work, the user must define the ##MS_dqs_db_owner_login## and ##MS_dqs_service_login## SQL Login as a user of the database, and the database must be in the same SQL Server instance as the DQS databases. This is true even if the user has full permissions to the data source.
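A minimal sketch of granting the required access, assuming the source data is in a database named SourceDb (an illustrative name) in the same SQL Server instance as the DQS databases:

```sql
USE SourceDb;  -- the database that contains the data to be cleansed (illustrative name)
GO
-- Map the DQS logins to users of the source database
CREATE USER [##MS_dqs_db_owner_login##] FOR LOGIN [##MS_dqs_db_owner_login##];
CREATE USER [##MS_dqs_service_login##] FOR LOGIN [##MS_dqs_service_login##];
GO
```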
4.6 Dates Earlier than 1753 Cause DQS to Fail
If a knowledge discovery activity is performed on a value in a date field that contains dates earlier than 1-1-1753, the process fails.
Workaround: To avoid this, exclude such dates, or use the datetime2 data type, which supports dates earlier than 1753.
4.7 Values Longer than 450 Characters Cause DQS to Fail
If data correction is performed on a source table that contains values longer than 450 characters, the process fails. This occurs with a column of type nvarchar(MAX) or varchar(MAX) in the source table, and when the string domain is configured to break domain values into terms, rather than parsing text as a single value.
Workaround: To avoid this, exclude such values.
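One way to find the values to exclude is to query the source table for over-length values before running the cleansing activity. The table and column names below are illustrative:

```sql
-- Identify source rows whose values exceed the 450-character limit so that
-- they can be excluded before data correction is performed
SELECT CustomerId, CompanyName
FROM dbo.SourceCustomers
WHERE LEN(CompanyName) > 450;
```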
4.8 A Cancel Operation Can Discard Work from Multiple Activities
If you click Cancel for a running knowledge discovery or domain management activity, and other activities completed earlier without a publish operation being performed, the work from all activities performed since the last publish is discarded, not just the work from the current activity.
Workaround: To avoid this, publish work that you need to persist in the knowledge base before starting a new activity.
4.9 The DQS Client Application Stops Responding with Large Samples
When the Knowledge Discovery process is run with 15,000 or more records, the client application stops responding. This is caused by a bug that results in an error in some cases in which a query is executed through SQLCLR and a high level of parallelism is required.
Workaround: To avoid this, run the following script in SSMS, and then restart SQL Server:
sp_configure 'show advanced options', 1;
GO
RECONFIGURE WITH OVERRIDE;
GO
sp_configure 'max degree of parallelism', 1;
GO
RECONFIGURE WITH OVERRIDE;
GO
When you run this script, the instance uses only one CPU for all queries that it runs.
4.10 Error While Importing Knowledge Bases, Domains, or Domain Values
While you import a knowledge base, domain, or domain values, an error might occur when you click Browse to select the file to be imported. The last-browsed folder location for the import operation is saved in the registry (Initial Directory key), and DQS opens the same folder location when you click Browse. This error occurs if the last saved folder location is not available because of the renaming, moving, or deletion of the target folder.
Workaround: To fix this issue, you must delete the Initial Directory key at HKEY_CURRENT_USER\Software\Microsoft\Microsoft SQL Server\110\Tools\DQ\Studio\Configurations\Dialogs in the registry.
4.11 DQSInstaller Fails if There Is a Pending Restart Because of .NET Framework 4 Updates
If you attempt to run DQSInstaller.exe on a computer where a restart is pending because of an update installed for .NET Framework 4, the installation operation fails with the following message:
“Failed to create AppDomain “DQS_MAIN.dbo[ddl].62”.”
Workaround: To continue, restart your computer, and then run DQSInstaller.exe again.
4.12 Changes in the DQS Configuration Do Not Require You to Restart SQL Server
In the current version of DQS, you no longer have to restart SQL Server after changing the DQS server logging configuration. Instead, after making changes in the DQS server logging configuration, you must run the following Transact-SQL statements for the new logging settings to take effect:
USE [DQS_MAIN]
GO
DECLARE @return_value int
EXEC @return_value = [internal_core].[RefreshLogSettings]
SELECT 'Return Value' = @return_value
GO
For more information about DQS log settings, see Managing DQS Log Files.
4.13 Databases Owned by the SQL Server System Administrator (sa) User Are Not Available for DQS Operations
If a database is owned by the SQL Server System Administrator (sa) user, it is not accessible in DQS. To fix this issue, change the owner of the database to any other user, and retry. For information about changing the owner of a database, see Changing the Database Owner.
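For example, ownership can be reassigned with ALTER AUTHORIZATION; the database and login names below are illustrative:

```sql
-- Change the database owner from sa to another login so that DQS can access it
ALTER AUTHORIZATION ON DATABASE::[SourceDb] TO [DOMAIN\DataSteward];
```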
4.14 Controls Do Not Scale Properly for New Knowledge Scan
If you change the size of text to “Larger – 150%” (in Windows Server 2008 or Windows 7), or change the Custom DPI setting to 200% (in Windows 7), the Cancel and Create buttons on the New Knowledge Base page are not accessible.
4.15 Screen Resolution of 800x600 Is Not Supported
The DQS client application does not display correctly if the screen resolution is set to 800x600.
Workaround: To resolve the issue, set the screen resolution to a higher value.
4.16 Incorrect Cleansing Results for Values with Term-Based Relations
If you define a term-based relation for a value in a domain, and the source data to be cleansed already contains the correct value as defined in the term-based relation, the cleansing results (both the preview data and the exported data) are affected as follows:
  • The DQS client contains incorrect data in the Source column and an incorrect status in the Status column.
  • The SQL Server Integration Services data correction component contains an incorrect status in the Status column.
For example, you might define a term-based relation for a domain named Company Name and specify that “Inc” is to be corrected to “Incorporated”. If any of the company names in the source data already contain the term “Incorporated”, such as “ABC Incorporated”, the cleansing results are affected as follows:
  • The DQS client incorrectly displays “ABC Inc” in the Source column, instead of “ABC Incorporated”, and displays “Corrected” in the Status column. The cleansing activity should not have changed the data in the source column because it was already correct as per the term-based relation, and “Correct” should be displayed in the Status column.
  • The SQL Server Integration Services data correction component incorrectly displays “Corrected” instead of “Correct” in the Status column. However, the source column correctly displays the original data (“ABC Incorporated”).
The Output column in the cleansing results in both the DQS client and the SQL Server Integration Services data correction component correctly displays the original data (“ABC Incorporated”).
4.17 Map Bigint Column in the Source Data to a Decimal Domain to Prevent Data Loss
If a column in your source data is of the bigint data type, you must map the column to a domain of the decimal data type rather than the integer data type in DQS, because the decimal data type represents a larger range of values than the integer data type and can therefore hold bigint values without data loss.
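As a rough illustration of the range difference (not DQS-specific), the following Transact-SQL shows that a decimal value can hold the full bigint range while int cannot:

```sql
DECLARE @big bigint = 9223372036854775807;   -- maximum bigint value
DECLARE @dec decimal(19,0) = @big;           -- succeeds: decimal(19,0) covers the bigint range
SELECT @dec AS StoredValue;
-- DECLARE @i int = @big;                    -- would fail with an arithmetic overflow error
```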
4.18 DQS does not support case-sensitive and some accent-sensitive server collation
If case-sensitive server collation is set for the SQL Server instance where you are installing DQS server, an error occurs while running DQSInstaller.exe, and you will not be able to install DQS server. If you require case-sensitive server collation for your SQL Server instance, you must install DQS server in another SQL Server instance that has case-insensitive server collation. This issue may also occur if you are using some accent-sensitive server collations, for example, Turkish_CI_AS. For more information about collations in SQL Server, see Working with Collations.
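Before running DQSInstaller.exe, you can check the server collation of the target instance with a query such as the following:

```sql
-- Returns the server-level collation, for example SQL_Latin1_General_CP1_CI_AS.
-- A collation name containing _CS_ is case-sensitive and will block DQS server installation.
SELECT SERVERPROPERTY('Collation') AS ServerCollation;
```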
4.19 DQS Client and DQS Cleansing Component in Integration Services Do Not Work If Not Installed in the Default Location
If you installed the DQS client and Integration Services in any location other than the default directory for shared components, the following error will occur if you run the DQS client or use the DQS Cleansing component in Integration Services:
“The Data Quality Services client application could not start.”
Message Id: ApplicationStartupFailure
Workaround: To avoid this, make sure you install the DQS client and Integration Services in the default location.
4.20 Recommended DQS edition and computer memory for optimal performance
We recommend that you install the 64-bit edition of DQS server on a computer with at least 4 GB RAM for optimal performance.
4.21 Issue with Concurrent Tasks in DQS
You might receive the following error while running a task in DQS such as knowledge discovery, cleansing, or matching:
“Execution of parallel task failed.”
To cater to concurrent task requests from various DQS clients, the concurrent DQS tasks are scheduled and run as background tasks by SQL Server Service Broker based on certain parameters. This error occurs due to a bug in the component that manages these concurrent tasks.

Workaround:
To fix this issue, do the following:
  1. Exit the DQS client.
  2. Using SQL Server Management Studio, run the following SQL statements against the DQS_MAIN database to disable DQS process concurrency:
    USE [DQS_MAIN]
    GO
    /****** Object: ServiceQueue [dbo].[ParallelExecutionRequestQueue] ******/
    ALTER QUEUE [dbo].[ParallelExecutionRequestQueue] WITH STATUS = ON , RETENTION = ON , ACTIVATION ( STATUS = ON , PROCEDURE_NAME = [internal_core].[ParallelExecutionActivator] , MAX_QUEUE_READERS = 1 , EXECUTE AS N'dbo' ), POISON_MESSAGE_HANDLING (STATUS = ON)
    USE [DQS_MAIN]
    GO
    /****** Object: ServiceQueue [dbo].[ParallelCalibrationRequestQueue] ******/
    ALTER QUEUE [dbo].[ParallelCalibrationRequestQueue] WITH STATUS = ON , RETENTION = ON , ACTIVATION ( STATUS = ON , PROCEDURE_NAME = [internal_core].[ParallelCalibrationActivator] , MAX_QUEUE_READERS = 1 , EXECUTE AS N'dbo' ), POISON_MESSAGE_HANDLING (STATUS = ON)
    USE [DQS_MAIN]
    GO
    /****** Object: ServiceQueue [dbo].[CalibrationRequestQueue] ******/
    ALTER QUEUE [dbo].[CalibrationRequestQueue] WITH STATUS = ON , RETENTION = ON , ACTIVATION ( STATUS = ON , PROCEDURE_NAME = [internal_core].[CalibrationActivator] , MAX_QUEUE_READERS = 1 , EXECUTE AS N'dbo' ), POISON_MESSAGE_HANDLING (STATUS = ON)
  3. Restart the SQL Server service.
  4. Start the DQS client, proceed with your task, and complete it.

Note: If you want to re-enable DQS process concurrency, run the above script again, substituting MAX_QUEUE_READERS = 4 for MAX_QUEUE_READERS = 1 in all three places in the script.

4.22 Limitations with Using Regular Expressions in a Domain Rule
You cannot create a definitive regular expression domain rule (“Value equals regular expression”) that contains ‘^’ at the beginning and/or ‘$’ at the end of a domain rule expression. This is because DQS adds these anchors (‘^’ and ‘$’) itself to the rule expression if you are using the definitive regular expression domain rule. For detailed information about regular expression language elements in the .NET Framework, see Regular Expression Language Elements.
Workarounds:
  • Do not use these anchors (‘^’ and ‘$’) in the definitive regular expression domain rule.
  • If you must use them, use parenthesis around the whole clause. For example, if you want to return all domain values starting with B, use (^B) instead of ^B as the rule expression.

4.23 Some .NET Framework 4 Updates Might Cause DQS to Fail

Some .NET Framework 4 updates installed after DQS might cause DQS to fail, and an error message is displayed when you use the DQS client. You might receive either of the following error messages:
“A .NET Framework error occurred during execution of user-defined routine or aggregate "SetDataQualitySessionPhaseTwo".”

"Execution of the Init DQ Server stored procedure failed."
In both cases, the error description contains the following message:

            “Assembly in host store has a different signature than assembly in GAC.”

This happens because of a mismatch between the Module Version IDs (MVIDs) of the SQL Common Language Runtime (SQLCLR) assemblies in the SQL Server “Denali” database and in the Global Assembly Cache (GAC). The mismatch is caused by a .NET Framework 4 update that changed the MVID of the assembly in the GAC.

Workaround:
To fix this issue, do the following:
  1. Exit the DQS client.
  2. Using SQL Server Management Studio, run the following SQL statements against the DQS_MAIN database. In the SQL statements below, replace $(NETAssemblies) according to your OS type:
    • For 32-bit OS: Replace with %Windir%\Microsoft.NET\Framework\v4.0.30319.
  • For 64-bit OS: Replace with %Windir%\Microsoft.NET\Framework64\v4.0.30319.
    Note: In both cases, replace %Windir% with the value of the environment variable (for example, C:\Windows).


    BEGIN
        DECLARE @AssemblyName nvarchar(200), @RefreshCmd nvarchar(200), @ErrMsg nvarchar(200)

        DECLARE ASSEMBLY_CURSOR CURSOR FOR
            SELECT name AS NAME
            FROM sys.assemblies
            WHERE name NOT LIKE '%ssdqs%'
              AND name NOT LIKE '%microsoft.sqlserver.types%'
              AND name NOT LIKE '%practices%'
              AND name NOT LIKE '%office%'
              AND name NOT LIKE '%stdole%'
              AND name NOT LIKE '%Microsoft.Vbe.Interop%'

        OPEN ASSEMBLY_CURSOR

        FETCH NEXT FROM ASSEMBLY_CURSOR INTO @AssemblyName

        WHILE @@FETCH_STATUS = 0
        BEGIN
            BEGIN TRY
                SET @RefreshCmd = 'ALTER ASSEMBLY [' + @AssemblyName + '] FROM ''' +
                    '$(NETAssemblies)\' + @AssemblyName + '.dll' +
                    ''' WITH PERMISSION_SET = UNSAFE'
                EXEC sp_executesql @RefreshCmd
                PRINT 'Successfully upgraded assembly ''' + @AssemblyName + ''''
            END TRY
            BEGIN CATCH
                IF ERROR_NUMBER() != 6285
                BEGIN
                    SET @ErrMsg = ERROR_MESSAGE()
                    PRINT 'Failed refreshing assembly ' + @AssemblyName + '. Error message: ' + @ErrMsg
                END
            END CATCH

            FETCH NEXT FROM ASSEMBLY_CURSOR INTO @AssemblyName
        END

        CLOSE ASSEMBLY_CURSOR
        DEALLOCATE ASSEMBLY_CURSOR
    END


  3. Restart the SQL Server service.
  4. Start the DQS client. 
4.24 NVARCHAR(MAX) and VARCHAR(MAX) Data Types Are Not Supported in the DQS Cleansing Component in Integration Services
Data columns of the NVARCHAR(MAX) and VARCHAR(MAX) data types are not supported in the DQS Cleansing component in Integration Services. These columns are therefore unavailable for mapping on the Mapping tab of the DQS Cleansing Transformation Editor and cannot be cleansed.
Workaround: Before processing these data columns using the DQS Cleansing component, you must convert them to DT_STR or DT_WSTR data type using the Data Conversion transform.
4.25 Cannot Import Domain Values from an Excel File (.xls) Containing More Than 255 Columns of Data
If you import values into a domain from an Excel 97-2003 file (.xls) that contains more than 255 columns of data, an exception message appears, and the import fails.
Workaround: To fix this issue, do one of the following:
  • Save the .xls file as .xlsx file, and then import the values from the .xlsx file into a domain.
  • Remove data in all the columns beyond column 255 in the .xls file, save the file, and then import the values from the .xls file into a domain.


[Top]
5.0 Database Engine
5.1 Opening Multiple .sql Files Launches Multiple Instances of SSMS
If you double-click multiple .sql files to open them, a separate instance of SQL Server Management Studio (SSMS) is launched for each file.
Workaround: To open multiple .sql files in a single instance of SSMS, while in SSMS, click File, Open, and then use the menu to browse for the files.
5.2 SQL Server "Denali" Introduces a New Layout for Maintenance Plans
When SQL Server Code-Named "Denali" CTP3 management tools are used to modify an existing maintenance plan created in a previous version of SQL Server management tools (SQL Server 2008 R2, SQL Server 2008, or SQL Server 2005), the maintenance plan is saved in a new format. Earlier versions of SQL Server management tools do not support this new format.
5.3 Need Admin Privileges to Debug a Query in SSMS
When SQL Server “Denali” CTP3 Management Studio is run in normal user mode and you try to debug a query using the Debug, Start Debug menu or using the Alt+F5 shortcut key, an error dialog pops up that says:
"Unable to start Transact-SQL Debugging. Could not connect to computer 'servername'. Visual Studio has insufficient privileges to debug this process. To debug this process, Visual Studio must be run as an administrator."
This issue is caused because SQL Server Management Studio has insufficient privileges to debug the process.
Workaround: To avoid this issue, launch SQL Server Management Studio with the "Run as administrator" option.
5.4 Reinstall Visual Studio after Installing SSMS
When you install either SQL Server Management Studio Code-Named “Denali” CTP3 or SQL Server Management Studio Express Code-Named “Denali” CTP3 on a computer that is running Visual Studio 2010 SP1, some Visual Studio 2010 components are updated to the Visual Studio 2010 SP1 version level while others are not. This “mixed mode” state of Visual Studio is not supported.
Workaround: To resolve this issue, reinstall Visual Studio 2010 SP1, which is available for download from this Download Center page.
5.5 Restore Dialog Box in SSMS Hangs When Required Backup Files Are Not Specified
When a valid restore plan cannot be created from a set of backup files, the Restore Database Dialog hangs indefinitely and becomes unresponsive. This occurs when files required for the restore plan are not specified.
Workaround: Identify the missing backup files, and then create the restore plan again, or use Transact-SQL to restore the database directly.
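For example, a minimal Transact-SQL restore sequence looks like the following; the database and file names are hypothetical placeholders, so substitute your own backup files:

```sql
-- Restore the full backup first, leaving the database ready for further restores...
RESTORE DATABASE [MyDatabase]
FROM DISK = N'C:\Backups\MyDatabase_Full.bak'
WITH NORECOVERY;

-- ...then restore the log backup and bring the database online.
RESTORE LOG [MyDatabase]
FROM DISK = N'C:\Backups\MyDatabase_Log.trn'
WITH RECOVERY;
```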
5.6 SMO Transfer of Databases Requires Column and Database Collations to be the Same
SQL Server Code-Named “Denali” CTP3 uses the SQLBulkCopy API for SMO transfer. To avoid data corruption, SQL Server Code-Named “Denali” CTP3 does not support SMO transfer of databases with tables that have column collations different from the database collation. There is no workaround in this release.
5.7 CTP3 Utility Control Point Does Not Work Correctly with SQL Server 2008 SP2
SQL Server Code-Named “Denali” CTP3 utility control point does not work correctly with instances of SQL Server 2008 PCU2 when both products are installed on the same computer.
Workarounds:
  • Install SQL Server 2008 SP2 and SQL Server Code-Named "Denali" CTP3 on different computers, and then enroll the SQL Server 2008 SP2 instance in the utility control point in SQL Server Code-Named “Denali” CTP3.
  • If both products are installed on the same computer, then download and install the Microsoft® SQL Server® 2008 R2 SP1 Shared Management Objects redistributable package.
5.8 x64 Operating Systems Require 64-bit PowerShell 2.0
32-bit installations of Windows PowerShell Extensions for SQL Server are not supported for instances of SQL Server “Denali” CTP3 on 64-bit operating systems.
Workarounds:
  • Install 64-bit SQL Server Code-Named “Denali” CTP3 with 64-bit Management Tools and 64-bit Windows PowerShell Extensions for SQL Server.
  • Or, import the SQLPS Module from a 32-bit Windows PowerShell 2.0 prompt.
5.9 Cannot Load the Data-tier Application Assembly in PowerShell
In SQL Server Code-Named “Denali” CTP3, PowerShell scripting operations with data-tier applications (DAC) fail with an error, indicating that the assembly Microsoft.SqlServer.Management.Dac could not be loaded. The SQL Server Code-Named “Denali” CTP3 DAC assemblies are compiled using .Net Framework 4.0, and .Net Framework 4.0 assemblies are not supported in any currently available version of PowerShell. No workaround exists for this issue; DAC operations are not supported in PowerShell for SQL Server Code-Named “Denali” CTP3.
5.10 Transact-SQL IntelliSense for New Keywords
The Transact-SQL IntelliSense feature of the Database Engine Query Editor has not been updated to support all of the new syntax introduced in SQL Server Code-Named "Denali" CTP3. New keywords are not encoded as keywords, and they are underlined as syntax errors. We currently plan to add IntelliSense support for the new syntax in a future version of SQL Server Code-Named "Denali".
Workaround: Ignore the syntax errors. The Transact-SQL statements will run successfully.
5.11 AlwaysOn Availability Groups
5.11.1 Some Functionality Is Supported Only When Trace Flag 9532 Is Enabled
In CTP3, some features of AlwaysOn Availability Groups are only at preview quality and might have more bugs than other features. The preview-quality features are under the control of trace flag 9532. We encourage you to enable this trace flag so that you can test the AlwaysOn preview features and report any feedback on them, along with the rest of the AlwaysOn feature set.
The Transact-SQL options listed below are supported only with trace flag 9532 enabled. If you attempt to use any of these options without trace flag 9532, the Database Engine will issue a "Feature not supported yet" error.
Preview-quality Transact-SQL options:
  • CREATE AVAILABILITY GROUP Transact-SQL options:
    • Using more than two REPLICA ON clauses
    • FAILURE_CONDITION_LEVEL
    • HEALTH_CHECK_TIMEOUT
    • SESSION_TIMEOUT
  • ALTER AVAILABILITY GROUP Transact-SQL options:
    • ADD REPLICA ON
    • REMOVE REPLICA ON
    • ADD DATABASE
    • REMOVE DATABASE
    • ENDPOINT_URL
    • FAILURE_CONDITION_LEVEL
    • HEALTH_CHECK_TIMEOUT
    • SESSION_TIMEOUT
    • PRIMARY_ROLE (ALLOW_CONNECTIONS, READ_ONLY_ROUTING_LIST)
    • SECONDARY_ROLE (ALLOW_CONNECTIONS, READ_ONLY_ROUTING_URL)
In addition to turning on these Transact-SQL options, enabling trace flag 9532 also turns on the corresponding SQL Server Management Studio options and PowerShell cmdlets or cmdlet parameters.
To test preview-quality features, enable trace flag 9532 on every server instance that hosts an availability replica for a given availability group.
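The trace flag can be enabled globally with DBCC TRACEON; a sketch:

```sql
-- Enable trace flag 9532 for all sessions on this instance (reverts at the next service restart).
DBCC TRACEON (9532, -1);

-- Verify the trace flag status.
DBCC TRACESTATUS (9532);
```

To keep the flag enabled across restarts, add -T9532 to the SQL Server startup parameters instead.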
5.11.2 Deleting and Re-Creating a WSFC Cluster Requires Disabling and Re-enabling AlwaysOn Availability Groups
If you delete and re-create the WSFC cluster on which you have deployed availability groups, you must disable and re-enable the AlwaysOn Availability Groups feature on each server instance on which this feature was enabled on the original WSFC cluster. For more information, see "Enable and Disable AlwaysOn (SQL Server)" in Books Online.
5.11.3 Client-Connectivity for AlwaysOn Availability Groups in CTP3
This section describes driver support for AlwaysOn Availability Groups and workarounds for using ADO.NET with .NET Framework 4.0 or SQL Native Client 11.0 OLEDB for multi-subnet failover.
Driver Support
The following table summarizes driver support for AlwaysOn Availability Groups in CTP3.
Driver                                   | Multi-Subnet Failover | ApplicationIntent | Read-Only Routing
SQL Native Client 11.0 ODBC              | Yes                   | Yes               | Yes
SQL Native Client 11.0 OLEDB             | No                    | Yes               | Yes
ADO.NET with .NET Framework 4.0          | Q4 CY2011*            | Q4 CY2011         | Q4 CY2011
Microsoft JDBC driver 4.0 for SQL Server | Yes                   | Yes               | Q4 CY2011
* Q4 CY2011 indicates the fourth quarter of the 2011 calendar year.
MultiSubnetFailover=True Not Supported by .NET Framework or OLEDB
In CTP3, only SQL Native Client 11.0 ODBC supports MultiSubnetFailover=True.
If you have created an availability group with a listener name (known as the network name or Client Access Point in the WSFC Cluster Manager), and you are using either ADO.NET with .NET Framework 4.0 or SQL Native Client 11.0 OLEDB, approximately 50 percent of your client-connection requests to the availability group listener might hit a connection timeout.
Workarounds: We recommend that you do one of the following tasks.
  • If you do not have permission to manipulate cluster resources, change your connection timeout to 30 seconds (this value results in a 20-second TCP timeout period plus a 10-second buffer).
    • Pros: If a cross-subnet failover occurs, client recovery time is short.
    • Cons: Half of the client connections will take more than 20 seconds.
  • If you have permission to manipulate cluster resources, the recommended approach is to set RegisterAllProvidersIP=0 on the network name of your availability group listener. For more information, see "Sample PowerShell Script to Disable RegisterAllProvidersIP and Reduce TTL", later in this section.
    • Pros: You do not need to increase your client-connection timeout value.
    • Cons: If a cross-subnet failover occurs, the client recovery time could be 15 minutes or longer, depending on your HostRecordTTL setting and your cross-site DNS/AD replication schedule.
Sample PowerShell Script to Disable RegisterAllProvidersIP and Reduce TTL
The following sample PowerShell script demonstrates how to disable RegisterAllProvidersIP and reduce TTL. Replace yourListenerName with the name of the listener you are changing.
Import-Module FailoverClusters
Set-ClusterResource yourListenerName|Set-ClusterParameter RegisterAllProvidersIP 0
Set-ClusterResource yourListenerName|Set-ClusterParameter HostRecordTTL 300
[Top]
6.0 Integration Services
6.1 How to Move to the Next Page in Long Reports
When you view a standard report in SQL Server Management Studio to troubleshoot Integration Services packages, and the report has multiple pages, press PAGE DOWN or use the scroll wheel to move through the pages.

6.2 How to Set Parameter Values for Integration Services Packages in SQL Agent

Values for parameters and connection manager properties cannot be specified while creating a SQL Agent job step of type “SQL Server Integration Services Package” because the parameters and properties aren’t displayed in the UI.

Workaround: Use the following procedure:
  1. Use the “Execute Package” dialog to set the parameter and connection manager values as appropriate and then click the “Script” button.
  2. Copy the generated script and paste it into a new T-SQL SQL Agent job step.

[Top]
7.0 Master Data Services
7.1 APIs Are Not Intended to Be Used
The following APIs support the MDS infrastructure and are not intended to be used directly from your code. For a list of these APIs, see What's New in Master Data Services in Denali CTP3.

7.2 Names Cannot be Displayed for Domain-Based Attributes
In the Master Data Services (MDS) Add-in for Excel, only the code is displayed for domain-based attributes. If you attempt to view the Name column by clicking the Settings button, and then in the Data section, you select Show Name column for domain-based attributes, Excel might become unresponsive.
Workaround: Restart Excel, and then clear the option.
7.3 Refresh Might Cause Excel to Become Unresponsive
In the Master Data Services (MDS) Add-in for Excel, clicking the Refresh button on the Master Data tab might cause Excel to become unresponsive. This issue occurs when changes have been made to the entity since the last time the worksheet was refreshed.
Workaround: Delete the worksheet, and then load the updated entity into a new worksheet.
7.4 The Add-in is Only Available to the User Who Installed it
The Master Data Services (MDS) Add-in for Excel is installed only for the user who is logged on during installation. The Add-in is not displayed to other users who open Excel on the same computer.
Workaround: Each individual user who is an administrator must install the Add-in.
[Top]
8.0 Reporting Services
8.1 An Error Occurs When You Create a New Reporting Services Service Application
In SharePoint integrated mode, when you attempt to create a new SQL Server Reporting Services Service application, you see an error message similar to the following.
Login failed for user 'NT AUTHORITY\NETWORK SERVICE'
Workaround: Configure the remote database for SharePoint access using one of the following two procedures:
  • Provide an SSRS Service Database script to the DBA of the remote SQL instance.
    1. Run the PowerShell cmdlet to create the T-SQL script:
      Get-SPRSCreateDatabaseScripts -Databasename -username
    2. The DBA of the remote SQL instance then runs the script on the remote computer.
    3. Run the 'SSRS create service application' page, and then specify the SSRS Service Database that was granted rights through the script.
  • Grant DBO Access for the SharePoint Farm Administrators group to the remote SQL Server instance:
    1. Have a SQL DBA of the remote SQL Server instance grant DBO access to the SharePoint Farm Administrators Group.
    2. Run the 'SSRS create service application' page.
8.2 Data Driven Subscriptions and MDX Query Designer Do Not Work Against the AS Data Source
SQL Server Code-Named “Denali” Reporting Services integration with Analysis Services (AS) relies on a code module that is shared by several SQL Server components. In Denali CTP3, the Report Server Data Driven Subscriptions feature and the MDX Query Designer feature cannot load the code module.
If you attempt to create or edit a Data Driven Subscription with a Delivery Query that obtains data from Analysis Services using MDX, you receive an error. If you attempt to use the MDX Query Designer to create or edit an MDX statement that obtains data from Analysis Services, you receive an error. You might see the following error:
“Could not load file or assembly 'msmgdsrv, Version=9.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. This assembly is built by a runtime newer than the currently loaded runtime and cannot be loaded.”
Workaround: Use the following procedure:
  1. To correct the issue for the MDX Query Designer, install the Microsoft® Analysis Services OLE DB Provider for Microsoft® SQL Server® 2008 R2 specific to your architecture.
  2. To correct the issue for SSRS Data Driven Subscriptions, do the following:
    1. Modify the Reporting Services web.config file as described in the .NET 3.5 documentation. The setting is shown below.
    2. Change the configuration file to the following:
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="msmgdsrv" publicKeyToken="89845dcd8080cc91" />
        <codeBase version="9.0.0.0" href="[ProgramFiles_path]\Microsoft Analysis Services\AS OLEDB\10\msmgdsrv.dll"/>
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>
Note: Verify that the value in href points to the assembly installed with the Analysis Services OLE DB Provider.

8.3 MDX Syntax Highlighting in SSMS Is Not Available
When you write an MDX query or statement in the SQL Server Denali SQL Server Management Studio (SSMS) MDX query window, no syntax highlighting for incorrect statements is displayed. Currently there is no workaround for this using SSMS for SQL Server Code-Named "Denali". The functionality is available in SQL Server 2008 R2.
8.4 SQL Server PDW Files Do Not Work in BIDS and RB
SQL Server PDW data extension files do not work in Business Intelligence Development Studio (BIDS) and Report Builder (RB) in SQL Server Code-Named "Denali" CTP3. The data extension cannot be loaded when the query is executed, and the Edit button is disabled in the Data Source Property dialog box.
 
8.5 English Text Found in Chinese Versions of Project Crescent and in SSRS Alerts
In the following localized versions of Project Crescent, and of the alerts in SQL Server Reporting Services (SSRS), in Microsoft SQL Server Code-Named "Denali" Community Technology Preview 3 (CTP3), the user interface text is English:
  • Chinese Simplified (CHS)
  • Chinese Traditional (CHT)
This is expected behavior for this CTP. We plan to localize this text for the released version of Microsoft SQL Server Code-Named "Denali".
8.6 Custom Configuration File Changes Are not Preserved
If you are upgrading a computer from a previous CTP release of Microsoft SQL Server Code-Named "Denali", and you had made custom changes to the configuration files, you will need to make the same changes to the configuration files after you upgrade to SQL Server Code-Named "Denali" Community Technology Preview 3 (CTP3).
The following files are affected:
  • web.config
  • client.config
Workaround:
  1. Make a backup copy of the configuration files before you run SQL Server setup.
  2. Run SQL Server setup.
  3. Manually edit the configuration files, and then add in your custom entries.
8.7 Chinese Localized Text in Project Crescent Is Not Currently Available
Localization into Simplified Chinese and Traditional Chinese is not available in Project Crescent in SQL Server Code-Named "Denali" CTP3. This is a known issue, and we plan to address it before the release of Microsoft SQL Server Code-Named "Denali".
8.8 Project Crescent with Arabic Client: Cannot Add Date Fields to Filters in CTP3
In Microsoft SQL Server Code-Named "Denali" CTP3, when using Project Crescent in an Arabic client, if you drag a date field to the filters area, you see a System.ArgumentOutOfRangeException error message. Thus, you cannot filter on a date field in the Filters Area in the CTP3 release of Project Crescent. We currently plan to fix this issue in a future release.
Workaround: For the CTP3 release of Project Crescent, you can use other means of filtering, such as slicers.
[Top]
9.0 StreamInsight
9.1 Download StreamInsight
This release of Microsoft SQL Server Code-Named "Denali" includes a preliminary English version of Microsoft StreamInsight 1.2 on the installation media. To download the final version of StreamInsight 1.2 in English or another language, go to the StreamInsight download page on the Microsoft Download Center.
10.0 Upgrade Advisor
10.1 Link to Install Upgrade Advisor Is Not Enabled on Chinese (HK) Operating Systems
When you try to install Upgrade Advisor on any supported Windows version in Chinese (Hong Kong) operating systems (OS), you might find that the link to install Upgrade Advisor is not enabled.
Workaround: Locate the SQLUA.msi file on your SQL Server Code-Named "Denali" media at \1028_CHT_LP\x64\redist\Upgrade Advisor or at \1028_CHT_LP\x86\redist\Upgrade Advisor, depending on your operating system architecture, and run it to install Upgrade Advisor.
11.0 How to Find More Known Issues
You can join the SQL Server Community to share your experiences and learn from other people who are using SQL Server Code-Named “Denali”. For more information, see the SQL Server "Denali" Forums, SQL Server Connect Feedback Center (to search for and file bugs or suggestions) and the SQL Server Community Web site.
© 2011 Microsoft Corporation. All rights reserved.