Part 3


1  Introduction

The previous two parts of this series focused on databases as an Azure service (e.g. SQL Server, MySQL etc.) and databases powered by the Azure Cosmos DB platform (e.g. MongoDB, Cassandra). In this final part we’ll look briefly at databases-as-a-service hosted outside of Azure, namely Oracle and SAP Hana. While there are many databases that fall into this category, these two will be enough to demonstrate how we can connect and execute a query. The techniques highlighted in the previous article (part 2) for uploading the necessary client assemblies to connect to various cloud databases can be employed again here. There are also a couple of additional technical considerations to address: one where the client assembly is too large for the Integration Account Assemblies blade, and one where the client uses unmanaged code.

This article assumes familiarity with the databases being connected to and queried.


2  How to Connect to Oracle

As mentioned in part 1, section 2, the System.Data.OracleClient assembly exists in the GAC. Therefore, we ought to be able to add a reference to the assembly and the namespaces that we need. There is, however, an issue: the Oracle client (System.Data.OracleClient), the OLE DB provider (MSDAORA) and the ODBC Driver for Oracle (Msorcl32) have all been deprecated and will be removed from later versions of the .NET Framework.

Fortunately, Oracle have provided an ODP.NET Managed Driver that requires no additional client software to connect to an Oracle database. The package is available on NuGet (Oracle.ManagedDataAccess). There is, however, one small drawback: the client assembly is larger than the 2MB limit for assemblies; it’s 4.8MB.

Steps 

  1. Create a new Oracle database. See here for trial instructions and here to create a new service;
  2.  Install Oracle SQL Developer (see here) and create a ‘sales’ database and ‘customers’ table. Insert some test rows;
  3.  Download the NuGet package Oracle.ManagedDataAccess 18.3.0;
  4.  Create an Azure blob storage;
  5.  Locate the Oracle.ManagedDataAccess.dll and upload it to the blob storage. Note: Use the Storage Explorer to check the content-type. Ensure it is set to application/octet-stream and not application/x-msdownload, e.g.:

  6. Copy the Oracle.ManagedDataAccess blob URI and go to the Integration Account Assemblies blade;
  7. Upload the Assembly, choose larger than 2MB and paste in the URI;
  8. Create the XSLT Oracle.ManagedDataAccess map given below. Update the conString variable with the Oracle DB connection details;
  9. Create a new, or edit an existing, Logic App:
    1. Add a Compose Action and copy the Person XML message from Appendix A;
    2. Add a Transform XML action. Set the Content value to the previous Compose action output. Set the Map value to one of the XSLT maps previously uploaded;
  10. Run the Logic App. The output should appear like the following:

  11. The Phone element should be populated with the phone number from the test data created in step 2.
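The content-type check in step 5 can also be handled programmatically at upload time. Below is a sketch using the (era-appropriate) WindowsAzure.Storage SDK; the connection string, container name and file paths are placeholders to substitute with your own:

```csharp
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class UploadClientAssembly
{
    static void Main()
    {
        // Placeholder connection string, container and paths: substitute your own
        CloudStorageAccount account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=xxxxxxxx");
        CloudBlobContainer container = account.CreateCloudBlobClient()
            .GetContainerReference("assemblies");
        container.CreateIfNotExists();

        CloudBlockBlob blob = container.GetBlockBlobReference("Oracle.ManagedDataAccess.dll");
        blob.UploadFromFile(@"C:\packages\Oracle.ManagedDataAccess.dll");

        // Force the content type the Integration Account expects, rather than
        // the application/x-msdownload default that .dll files often receive
        blob.Properties.ContentType = "application/octet-stream";
        blob.SetProperties();
    }
}
```

Setting the content type explicitly after the upload avoids having to correct it afterwards in Storage Explorer.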

Oracle.ManagedDataAccess

<?xml version="1.0" encoding="utf-16"?>
<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns:msxsl="urn:schemas-microsoft-com:xslt"  exclude-result-prefixes="msxsl s0 ScriptNS0" version="1.0" xmlns:s0="http://AzureBizTalkMapsDemo/Person" xmlns:ns0="http://AzureBizTalkMapsDemo/Contact" xmlns:ScriptNS0="http://schemas.microsoft.com/BizTalk/2003/ScriptNS0">
  <xsl:output omit-xml-declaration="yes" method="xml" indent="yes" version="1.0" />
  <xsl:template match="/">
  <xsl:apply-templates select="/s0:Person" />
  </xsl:template>
  <xsl:template match="/s0:Person">
  <ns0:Contact>
  <Title><xsl:value-of select="Title/text()" /></Title>
  <Forename><xsl:value-of select="Forename/text()" /></Forename>
  <Surname><xsl:value-of select="Surname/text()" /></Surname>
  <Phone><xsl:value-of select="ScriptNS0:GetPhoneNo(ID/text())" /></Phone>
  </ns0:Contact>
  </xsl:template>
  <msxsl:script language="C#" implements-prefix="ScriptNS0">
  <msxsl:assembly name ="System.Data"/>
  <msxsl:assembly name ="Oracle.ManagedDataAccess, Version=4.122.18.3, Culture=neutral, PublicKeyToken=89b483f429c47342"/>
  <msxsl:using namespace="Oracle.ManagedDataAccess.Client" />
  <![CDATA[
  public string GetPhoneNo(int id)
  {
  //Create a connection to Oracle
  string conString = "User Id=system; Password=xxxxxxxxx;" +
  
  //How to connect to an Oracle DB without a SQL*Net configuration file,
  //also known as tnsnames.ora.
  "Data Source=xxx.xxx.xxx.xxx:1521/PDB1.xxxxxxxxx.oraclecloud.internal; Pooling=false;";
  
  //How to connect to an Oracle Database with a database alias.
  //Uncomment below and comment above.
  //"Data Source=pdborcl;Pooling=false;";
  
  using (OracleConnection con = new OracleConnection(conString))
  {
  con.Open();
  
  //Create a command within the context of the connection and
  //bind the id value rather than hard-coding the key
  OracleCommand cmd = con.CreateCommand();
  cmd.CommandText = "SELECT PHONE FROM CUSTOMERS WHERE ID = :id";
  cmd.Parameters.Add(new OracleParameter("id", id));
  
  //Execute the command and use a data reader to return the data
  using (OracleDataReader reader = cmd.ExecuteReader())
  {
  reader.Read();
  return reader.GetString(0);
  }
  }
  }
  ]]>
  </msxsl:script>
</xsl:stylesheet>

Points to Note

Remember that the assemblies will get cached and so this download from blob storage will not occur every time the map is run.

 


3  How to Connect to SAP Hana

In this final section we’ll look at connecting to SAP Hana, which proves to be the most complicated. There are two main issues to overcome: first, uploading the unmanaged code; second, the managed code searches the registry for the unmanaged code locations. It is assumed that you have access to a SAP Hana database and have created a test schema and database.

Unfortunately, there is no NuGet package for this client. Download the SAP HANA Client Software Packages SAP HANA Client Revision 120 for Windows x86 32 bit; this package contains the ADO.NET assemblies, plus a lot more besides that is not of interest in our case. Three assemblies are required: two can be found in the client/ADONET.TGZ/ADONET.tar archive, libadonetHDB.dll and Sap.Data.Hana.v4.5.dll (in /ado.net/v4.5), and libSQLDBCHDB.dll is in client/SQLDBC.TGZ/SQLDBC.tar. The first and third of these are the data provider’s unmanaged code as described here.

One way to upload the unmanaged code is by embedding the assemblies as resources within a .NET assembly and then calling a class method to write the resources out to the local file system. The process is as follows:

1. Copy the three files to a separate folder;

2. Right-click on the file libadonetHDB.dll and click Send to |Compressed (zipped) folder;

3. Right-click on the file libSQLDBCHDB.dll and click Send to |Compressed (zipped) folder;

4. Create a new C# class library and add the two files from steps 2 & 3 as resources;

5. Replace the class library code with the following:

using Microsoft.Win32;
using System;
using System.IO;
using System.IO.Compression;
 
namespace SAPHanaClient
{
    static public class Client
    {
        static Client()
        {
        }
 
        static public string Initialize()
        {
            string status = string.Empty;
            string sapClientPath = System.IO.Path.GetTempPath() + "SAP\\";
 
            try
            {
                // check SAPHana folder exists, if not create it
                if (!Directory.Exists(sapClientPath))
                {
                    System.IO.Directory.CreateDirectory(sapClientPath);
                    status += "Directory created. ";
                }
 
                // check libadonetHDB.dll exists, if not decompress and write file
                if (!File.Exists(sapClientPath + "libadonetHDB.dll"))
                {
                    byte[] libadonetHDB = SAPHanaClient.Properties.Resources.libadonetHDB;
                    File.WriteAllBytes(sapClientPath + "libadonetHDB.zip", libadonetHDB);
                    status += "libadonetHDB.zip written. ";
                    ZipFile.ExtractToDirectory(sapClientPath + "libadonetHDB.zip", sapClientPath);
                    status += "libadonetHDB.zip unzipped. ";
                }
 
                // check libSQLDBCHDB.dll exists, if not decompress and write file
                if (!File.Exists(sapClientPath + "libSQLDBCHDB.dll"))
                {
                    byte[] libSQLDBCHDB = SAPHanaClient.Properties.Resources.libSQLDBCHDB;
                    File.WriteAllBytes(sapClientPath + "libSQLDBCHDB.zip", libSQLDBCHDB);
                    status += "libSQLDBCHDB.zip written. ";
                    ZipFile.ExtractToDirectory(sapClientPath + "libSQLDBCHDB.zip", sapClientPath);
                    status += "libSQLDBCHDB.zip unzipped. ";
                }
 
                return status;
            }
            catch (Exception ex)
            {
                return "Status: " + status + " Exception: " + ex.ToString();
            }
        }
    }
}

6. The method above, when executed, will create a ‘SAP’ folder in the local ‘TEMP’ folder, write out the two zip files, and then unzip them both;

7. Sign and build the assembly. An assembly SapHanaClient.dll should have been built.

The second problem is more involved. Updating the registry with the appropriate keys requires Administrator privileges, which we don’t have. There are two possibilities: edit the managed code so that it no longer depends on the registry values, or implement a hook to catch the registry read calls (e.g. see here). Going with the first option:

1. Install dnSpy (see here);

2. Open the Sap.Data.Hana.v4.5.dll assembly and locate the class HanaUnmanagedDll and the method SearchNativeDlls(string):

3. Replace the contents with the following:
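A hedged sketch of what the replacement body could look like: it bypasses the registry probe and resolves the native DLL against the TEMP\SAP folder that Client.Initialize extracts to. The exact signature and surrounding class in Sap.Data.Hana.v4.5.dll may differ from this standalone illustration:

```csharp
using System.IO;

static class HanaUnmanagedDllPatch
{
    // Sketch of the patched SearchNativeDlls body: instead of reading the
    // client install location from the registry (which we cannot write to
    // without Administrator privileges), resolve the native DLL against
    // the TEMP\SAP folder created by SAPHanaClient.Client.Initialize().
    public static string SearchNativeDlls(string dllName)
    {
        return Path.Combine(Path.GetTempPath(), "SAP", dllName);
    }
}
```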

4. Make sure the assembly is strong named and save the changes.

At this point we have our two client assemblies to use. Notice that one is over 2MB, so we’ll have to use the same technique as for the Oracle client in the previous section and upload it to blob storage. It is well worth testing these assemblies in a console app to ensure they work as expected before proceeding.
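As a sanity check, a minimal console harness along the following lines exercises both assemblies before they go anywhere near a Logic App; the connection details, schema and table names are placeholders:

```csharp
using System;
using SAPHanaClient;
using Sap.Data.Hana;

class HarnessProgram
{
    static void Main()
    {
        // Write the unmanaged DLLs out to TEMP\SAP, exactly as the XSLT map will
        Console.WriteLine(Client.Initialize());

        // Placeholder connection details: substitute your own
        string connectionString = "Server=xx.xxx.xxx.xxx:39015; UserName=xxxxx; Password=xxxxxxxx";
        using (HanaConnection conn = new HanaConnection(connectionString))
        {
            conn.Open();
            HanaCommand cmd = new HanaCommand(
                "SELECT PHONE FROM \"DEMO\".\"CUSTOMER\" WHERE ID = '1'", conn);
            Console.WriteLine(cmd.ExecuteScalar());
        }
    }
}
```

If Client.Initialize reports the files being written and the query returns a phone number, the patched driver is finding the unmanaged DLLs correctly.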

Steps

 

1. Create a new SAP Hana database. See here for trial instructions and here to create a new service;

2. Install Eclipse and the SAP plugin (see here & here) and create a ‘sales’ database and ‘customers’ table. Insert some test rows;

3. Create an Azure blob storage;

4. Locate the SapHanaClient.dll and upload it to the blob storage. Note: Use the Storage Explorer to check the content-type. Ensure it is set to application/octet-stream and not application/x-msdownload;

5. Copy the SapHanaClient blob URI and go to the Integration Account Assemblies blade;

6. Upload the assembly, choose larger than 2MB and paste in the URI;

7. Upload the Sap.Data.Hana.v4.5.dll assembly to the Integration Account Assemblies blade;

8. Create the XSLT Sap.Data.Hana map given below. Update the connectionString variable with the SAP Hana DB connection details;

9. Create a new, or edit an existing, Logic App:

a. Add a Compose Action and copy the Person XML message from Appendix A;

b. Add a Transform XML action. Set the Content value to the previous Compose action output. Set the Map value to one of the XSLT maps previously uploaded;

10. Run the Logic App. The output should appear like the following:

 

First run

Second run

 

 

11. The Phone element should be populated in both runs with the phone number from the SAP Hana database. On the first run the Notes element should also be populated with the initialization result from unzipping the unmanaged code.

Sap.Data.Hana

<?xml version="1.0" encoding="utf-16"?>
<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns:msxsl="urn:schemas-microsoft-com:xslt"  exclude-result-prefixes="msxsl s0 ScriptNS0" version="1.0" xmlns:s0="http://AzureLogicAppsMapsDemo/Person" xmlns:ns0="http://AzureLogicAppsMapsDemo/Contact" xmlns:ScriptNS0="http://schemas.microsoft.com/BizTalk/2003/ScriptNS0">
  <xsl:output omit-xml-declaration="yes" method="xml" indent="yes" version="1.0" />
  <xsl:variable name ="initialize" select ="ScriptNS0:Initialize()" />
  <xsl:template match="/">
    <xsl:apply-templates select="/s0:Person" />
  </xsl:template>
  <xsl:template match="/s0:Person">
    <ns0:Contact>
      <Title><xsl:value-of select="Title/text()" /></Title>
      <Forename><xsl:value-of select="Forename/text()" /></Forename>
      <Surname><xsl:value-of select="Surname/text()" /></Surname>
      <Phone><xsl:value-of select="ScriptNS0:GetPhoneNo(ID/text())" /></Phone>
      <Notes><xsl:value-of select="$initialize" /></Notes>
    </ns0:Contact>
  </xsl:template>
  <msxsl:script language="C#" implements-prefix="ScriptNS0">
    <msxsl:assembly name ="System.Data, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"/>
    <msxsl:assembly name ="SapHanaClient, Version=1.0.0.0, Culture=neutral, PublicKeyToken=xxxxxxxxxxxxxxx"/>
    <msxsl:assembly name ="Sap.Data.Hana.v4.5, Version=1.0.120.0, Culture=neutral, PublicKeyToken=xxxxxxxxxxxxxxx"/>
    <msxsl:using namespace ="SapHanaClient"/>
    <msxsl:using namespace ="Sap.Data.Hana"/>
    <![CDATA[
      public string Initialize()
      {
        return Client.Initialize();
      }
      public string GetPhoneNo(int id)
      {
        string connectionString = "Server=xx.xxx.xxx.xxx:39015; UserName=xxxxx; Password=xxxxxxxx";

        string SCHEMA = "DEMO";
        string CUSTOMER_TABLE = "CUSTOMER";

        using (HanaConnection hanaConn = new HanaConnection(connectionString))
        {
          hanaConn.Open();

          HanaCommand hanaCmd = new HanaCommand("SELECT PHONE FROM \"" + SCHEMA + "\".\"" + CUSTOMER_TABLE + "\" WHERE ID = '1'", hanaConn);
          using (HanaDataReader hanaDR = hanaCmd.ExecuteReader())
          {
            hanaDR.Read();
            return hanaDR.GetString(0);
          }
        }
      }
    ]]>
  </msxsl:script>   
</xsl:stylesheet>

Points to Note

We’ve been able to upload and install unmanaged code into the ‘map’ environment by utilizing the ‘TEMP’ folder. Notice the variable ‘initialize’ declared at the top of the XSLT; this ensures that the Client.Initialize method is called before any other processing. The method first checks whether the unmanaged code already exists in the environment, since there’s no need to write out the files each time the map is called. The method also returns debugging information.


4  Further Considerations

Performance

  • Before implementing this pattern it is well worth understanding how it is implemented in the BizTalk Server built-in database functoids (see here), and in the two community database functoids (BizTalk Mapper Extensions UtilityPack: New Database Functoids & BlogicalFunctoids: BizTalk Database lookup functoid with caching feature). Basically, the database table containing the reference data is first loaded into a Hashtable. This Hashtable is then queried to extract the required data; in other words, only a single connection and query is made. This is an important consideration if, say, the data enrichment process is within a loop; you would not want to open a connection and execute a query for each element in the loop;
  • In some cases it may be more appropriate to use an alternative pattern, especially if the map will connect and query many times during its execution. One alternative is to extract the data prior to executing the map, passing both messages to the map. In BizTalk this can be done by using a multi-source map, and this technique of using an envelope message (see here) can still be employed in a Logic App XSLT map. Another way is to make use of the xsl:param feature and pass the reference data to the XSLT map as a parameter (see here);
  • C# scripts should be avoided if high performance is required, as they require loading the script engine multiple times (see here);
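The Hashtable approach described in the first bullet can be sketched as follows; the class name and loader delegate are illustrative, not taken from the functoid source:

```csharp
using System;
using System.Collections;

// Illustrative sketch of the functoid caching pattern: the reference table
// is queried once per map execution, loaded into a Hashtable keyed on ID,
// and every subsequent lookup hits the in-memory copy instead of the database.
public class ReferenceDataCache
{
    private static Hashtable cache;

    public static string Lookup(string id, Func<Hashtable> loadAll)
    {
        // A single connection and query, however many elements the map processes
        if (cache == null)
        {
            cache = loadAll();
        }
        return cache.Contains(id) ? (string)cache[id] : null;
    }
}
```

Inside the map, GetPhoneNo could then call ReferenceDataCache.Lookup(id, LoadAllCustomers), where LoadAllCustomers is a hypothetical method that runs a single SELECT over the CUSTOMERS table and fills the Hashtable.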

Development

  • These XSLT maps are best developed and tested in Visual Studio using the XSLT | Start XSLT Without Debugging run option, rather than in the Logic Apps portal or the BizTalk mapper;
  • During development, when uploading and testing maps in the Logic Apps portal, because of the way the XSLT map gets compiled, cached and loaded into the AppDomain, you may find errors like “Could not load file or assembly xxxxxxxxxx or one of its dependencies. The system cannot find the file specified.”, or see the same output generated for a newly uploaded map or assembly with the same name. This is because while assemblies can be loaded into an AppDomain, they cannot be unloaded; the only way is to unload the AppDomain itself. The AppDomain will get unloaded when the AppPool gets recycled, but waiting for that to occur is rather tedious. One way round this is to execute a script to unload the AppDomain, e.g.: AppDomain.Unload(AppDomain.CurrentDomain).
  • Appendix B contains an updated version of the XSLT map provided in part 1 that can be used to unload the current AppDomain. It has also been modified to bring back certain environment details only when enabled (otherwise we start to get timeouts). Remember to set the Transform options to ‘Generate text output’:

5  Summary

In this series we’ve managed to implement message-enricher patterns for ten cloud databases, connecting to and querying them with varying degrees of ease and complexity. The ‘map’ integration environment need not be seen purely as a black box; rather, using a number of techniques, we’ve been able to learn about the environment, what’s installed and configured, and what can and cannot be accessed. Further, by understanding how Integration Account assemblies are used, we can use this mechanism to upload both system and third-party assemblies to the environment and reference them from our XSLT maps.

One thing to note is that the assemblies can only be used with XSLT 1.0 maps. It is important to make a distinction between the XSLT 1.0 processor and the XSLT 2.0 processor for two reasons: first, XSLT 2.0 is not backwards compatible with XSLT 1.0 (unless a vendor optionally implements a backwards-compatibility mode) and, second, the new XSLT 2.0/3.0 support in Logic Apps has come about not through enhancements to the Microsoft XSLT processor, but by implementing the Saxon HE (9.8.0.8) open-source XSLT processor. With Saxon there is no ability to include inline C# script (or any of the other .NET languages available) nor to call external assemblies, so this enrichment pattern can only be implemented in XSLT 1.0.

The addition of external assemblies to the Integration Account now makes the migration of XSLT maps that utilize custom assemblies a lot easier. By careful analysis of the assemblies and their references we can easily migrate XSLT maps. From a BizTalk perspective, maps utilizing the Database Functoids can now be migrated too, and the Message Content Enricher pattern implementation can be extended to many cloud databases. Thinking further afield, similar implementations can be developed for other cloud data sources not just databases. 

 




7  Appendix B

 

<?xml version="1.0" encoding="utf-8"?>
<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns:msxsl="urn:schemas-microsoft-com:xslt" exclude-result-prefixes="msxsl userCSharp" version="1.0" xmlns:userCSharp="http://schemas.microsoft.com/BizTalk/2003/userCSharp">
  <xsl:output omit-xml-declaration="yes" media-type="application/text" method="text" version="1.0" />
  <xsl:param name ="GetAppDomainAssemblies" select ="'false'" />
  <xsl:param name ="GetGACAssemblies" select ="'false'" />
  <xsl:param name ="GetOleDbProviders" select ="'false'" />
  <xsl:param name ="GetODBCDrivers" select ="'false'" />
  <xsl:param name ="GetFolderContents" select ="'none'" />
  <xsl:param name ="AppDomain_Unload" select ="'false'" />
  <xsl:template match="/">
  <xsl:value-of select ="userCSharp:Invoke($GetAppDomainAssemblies, $GetGACAssemblies, $GetOleDbProviders, $GetODBCDrivers, $GetFolderContents, $AppDomain_Unload)" />
  </xsl:template>
  <msxsl:script language="C#" implements-prefix="userCSharp">
  <msxsl:assembly name="GACManagerApi, Version=1.0.0.0, Culture=neutral, PublicKeyToken=53b236e983296548" />
  <msxsl:assembly name="Microsoft.Web.Administration, Version=7.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35, processorArchitecture=MSIL" />
  <msxsl:assembly name ="System.Data"/>
  <msxsl:using namespace="System.Data.OleDb" />
  <msxsl:using namespace="GACManagerApi" />
  <msxsl:using namespace="System.Reflection" />
  <msxsl:using namespace="System.Reflection.Emit" />
  <msxsl:using namespace="System.IO" />
  <msxsl:using namespace="System" />
  <msxsl:using namespace="System.Collections" />
  <msxsl:using namespace="System.Runtime.Serialization" />
  <msxsl:using namespace="System.Runtime.Serialization.Formatters.Binary" />
  <msxsl:using namespace="System.Data" />
  <msxsl:using namespace="Microsoft.Win32" />
  <msxsl:using namespace="System.Reflection" />
  <msxsl:using namespace="System.Runtime.Versioning" />
  <msxsl:using namespace="Microsoft.Web.Administration" />
  <msxsl:using namespace="Microsoft.Web.Management" />
  <msxsl:using namespace="System.Threading" />
  <msxsl:using namespace="System.Threading.Tasks" />
  <![CDATA[
  public string Invoke(string getAppDomainAssemblies, string getGACAssemblies, string getOleDbProviders, string getODBCDrivers, string getFolderContents, string appDomainUnload)
  {
  // Need to start a thread in order to return to the calling Logic App
  if (appDomainUnload == "true")
  {
  var t = Task.Run( () =>
  {
  //System.Threading.Thread.Sleep(1000);
  AppDomain.Unload(AppDomain.CurrentDomain);
  });
  return "AppDomain.Unload(AppDomain.CurrentDomain) called.";
  }
  
  string response = string.Empty;
   
  if (getGACAssemblies == "true")
  {
  string gacAssemblies = "GAC Assemblies:" + System.Environment.NewLine;
   
  // Create an assembly enumerator.
  var assemblyEnumerator = new AssemblyCacheEnumerator();
  
  // Get the first assembly.
  var assemblyName = assemblyEnumerator.GetNextAssembly();
  
  // Start to loop through the assemblies. 
  while (assemblyName != null)
  {
  // The 'assemblyName' object is a COM interface, if we create an
  // AssemblyDescription from it, we will have access to more information.
  var assemblyDescription = new AssemblyDescription(assemblyName);
  gacAssemblies += assemblyDescription.DisplayName + System.Environment.NewLine;
  
  // Move to the next assembly.
  assemblyName = assemblyEnumerator.GetNextAssembly();
  }
  response = gacAssemblies;
  }
  
  if (getAppDomainAssemblies == "true")
  {
  if (response != string.Empty)
  {
  response += System.Environment.NewLine;
  }
  string adAssemblies = "AppDomain Assemblies:" + System.Environment.NewLine;
  foreach (Assembly a in AppDomain.CurrentDomain.GetAssemblies())
  {
  adAssemblies += a.GetName().FullName + System.Environment.NewLine;
  }
  response += adAssemblies;
  }
  
  if (getOleDbProviders == "true")
  {
  if (response != string.Empty)
  {
  response += System.Environment.NewLine;
  }
  string sProviders = "OleDb Providers:" + System.Environment.NewLine;
  using (IDataReader dr = OleDbEnumerator.GetRootEnumerator())
  {
  while (dr.Read())
  {
  sProviders += dr.GetString(0) + System.Environment.NewLine;
  }
  }
  response += sProviders;
  }
  
  if (getODBCDrivers == "true")
  {
  if (response != string.Empty)
  {
  response += System.Environment.NewLine;
  }
  string[] odbcDriverNames = null;
  using (RegistryKey localMachineHive = Registry.LocalMachine)
  using (RegistryKey odbcDriversKey = localMachineHive.OpenSubKey(@"SOFTWARE\ODBC\ODBCINST.INI\ODBC Drivers"))
  {
  if (odbcDriversKey != null)
  {
  odbcDriverNames = odbcDriversKey.GetValueNames();
  }
  }
  String odbcDrivers = "ODBC Drivers:" + System.Environment.NewLine + string.Join(System.Environment.NewLine, odbcDriverNames);
  response += odbcDrivers;
  }
  
  if (getFolderContents != "none")
  {
  if (response != string.Empty)
  {
  response += System.Environment.NewLine;
  }
  string[] dirContents = Directory.GetFiles(getFolderContents);
  string contents = string.Join(System.Environment.NewLine, dirContents);
  response += "Contents of : " + getFolderContents + System.Environment.NewLine + contents;
  }
   
  return response;
  }
  ]]>
  </msxsl:script>
</xsl:stylesheet>