Monday, November 21, 2011

Sql Azure Import-Export Service / bacpac / dac / Extract Data-Tier Application

B"H

It goes by many names, but to me it is one thing ...
The much-needed SQL Azure backup solution that has been so sorely missing since SQL Azure launched.

At first they tried telling us that there is no need for backups because their datacenters are so secure that nothing could ever go wrong, and they may well be right - they will probably never lose our data. BUT - and this is a big but - one of the main reasons we back up data is to protect against human error. So unless we could create versioned backups, our data was vulnerable.

There were many quasi-solutions - SqlAzureMW, Sync Framework, etc. - but none of them let you schedule a backup or give your users a way to create an instantaneous backup before they do something that could corrupt data. Until now ... I am hoping that this new Export service might finally be what we need.

Before we start
It appears that the service - which is an HTTP endpoint that exports your entire database to an Azure storage account - needs to be run by the "server-level principal login (created by the provisioning process)". There are many documents that suggest there is some combination of permissions that would allow another user to execute the command and trigger a backup. However, I have not been able to get any other user to execute it - maybe because most of those documents (including the HTTP error returned by the server) seem to be written for Denali (primarily the on-premises version). Even if I could get it to work, I would not want to grant those permissions to any user using the database interactively - much less every user.
That leaves us with only one option: creating a service on a secure server that executes the command to trigger the backup using my personal login. Since the location of the backups is hardcoded into the service (i.e. not a parameter), together with my Azure storage access key, I can be comfortable allowing anyone to trigger a backup without it being a security risk - I know exactly where those bacpacs are going and that they cannot be redirected into the wrong hands.
That being said... how do you call this http endpoint?

The good stuff
Redgate has a free utility that takes your parameters and issues the call on your behalf. They even have a command-line version that you could put in a batch file. But what if you want to include it in your application (or run it in your service)?

The DAC team put out a sample project on CodePlex, but no documentation.
I found the sample project too complex to show the basic ideas, so I put something together for anyone who is trying to figure it out.

The service has a general endpoint, http://dacdc.cloudapp.net/DACWebService.svc, and region-specific endpoints such as North Central US - https://ch1prod-dacsvc.azure.com/DACWebService.svc

The service is REST-like, so you cannot just use WCF to call straight into it; you will need to use an HTTP library. The service does, however, require the parameters to be passed in as an XML document, and the easiest way to create XML in the proper format is probably to download the contracts from the service.
You can do that with svcutil.exe or with "Add Service Reference" in VS. Either tool will create proxy wrappers for the service's data contracts. Neither will create the app.config sections that usually come along with WCF client proxies (because the service is not a WCF service in that sense), and their operation contract proxies will not be of any use either.
Once you have the classes matching the data contracts, the following C# snippet should be all that you need.

I restructured the CodePlex project so that the logic is more direct, and I stripped out all of the parts that distracted from the example.
I also used the Task Parallel Library to wrap the async pattern, which I find more understandable, and this way it is ready for the async and await keywords in C# 5.
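
For orientation, the data contract classes that the snippet below expects look roughly like this sketch. Treat the shapes as an assumption - use the classes that svcutil / "Add Service Reference" actually generate, since those carry the [DataContract] namespaces the service requires:

    namespace DacImportExport
    {
        // Sketch only - the generated classes include the proper [DataContract] attributes.
        public class BlobCredentials
        {
            public string Uri { get; set; }
        }

        public class BlobStorageAccessKeyCredentials : BlobCredentials
        {
            public string StorageAccessKey { get; set; }
        }

        public class ConnectionInfo
        {
            public string ServerName { get; set; }
            public string DatabaseName { get; set; }
            public string UserName { get; set; }
            public string Password { get; set; }
        }

        public class ExportInput
        {
            public BlobCredentials BlobCredentials { get; set; }
            public ConnectionInfo ConnectionInfo { get; set; }
        }
    }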


    using System;
    using System.IO;
    using System.Net;
    using System.Runtime.Serialization;
    using System.Text;
    using System.Threading.Tasks;
    using System.Xml;

    public class BackupUsingDacService
    {
        public BackupUsingDacService(Credentials credentials)
        {
            _Credentials = credentials;
        }

        // Credentials is a simple holder (from the sample project) for the
        // database name, user name and password used below.
        private Credentials _Credentials;
        const string _EndPoint = "https://ch1prod-dacsvc.azure.com/DACWebService.svc"; // North Central US
        const string _DbServerName = "###.database.windows.net";

        public Task<Guid> Backup()
        {
            var ImportExport = new DacImportExport.ExportInput
                {
                    BlobCredentials = new DacImportExport.BlobStorageAccessKeyCredentials
                    {
                        // Use a sortable timestamp - the default DateTime.ToString() contains
                        // slashes and colons, which are not valid in a blob name.
                        Uri = "http://###.blob.core.windows.net/sqlbackup/coc_" + DateTime.Now.ToString( "yyyyMMddHHmmss" ) + ".bacpac",
                        StorageAccessKey = "###"
                    },
                    ConnectionInfo = new DacImportExport.ConnectionInfo
                    {
                        ServerName = _DbServerName,
                        DatabaseName = _Credentials.InitialCatalog,
                        UserName = _Credentials.UserName,
                        Password = _Credentials.Password
                    }
                };
            var req = (HttpWebRequest)WebRequest.Create( new Uri( _EndPoint + "/Export" ) );
            req.Method = "POST";
            req.ContentType = "application/xml";
            return Task.Factory.FromAsync( req.BeginGetRequestStream, req.EndGetRequestStream, req )
                .ContinueWith( ar =>
                {
                    // Serialize the ExportInput contract into the request body.
                    DataContractSerializer serializer = new DataContractSerializer( typeof( DacImportExport.ExportInput ) );
                    serializer.WriteObject( ar.Result, ImportExport );
                    ar.Result.Close();
                    var req2 = ar.AsyncState as HttpWebRequest;
                    return Task.Factory.FromAsync( req2.BeginGetResponse, req2.EndGetResponse, req2 )
                        .ContinueWith( ar2 =>
                        {
                            // The service answers with the GUID that identifies the export request.
                            Guid resultGuid = Guid.Empty;
                            try
                            {
                                Encoding enc = Encoding.GetEncoding( 1252 );
                                using (var responseStream = new StreamReader( ar2.Result.GetResponseStream(), enc ))
                                using (XmlDictionaryReader reader = XmlDictionaryReader.CreateTextReader( responseStream.BaseStream, new XmlDictionaryReaderQuotas() ))
                                {
                                    DataContractSerializer dcs = new DataContractSerializer( typeof( Guid ) );
                                    resultGuid = (Guid)dcs.ReadObject( reader, true );
                                }
                                ar2.Result.Close();
                            } catch (AggregateException aex)
                            {
                                // A failed request surfaces here as an AggregateException
                                // wrapping the WebException thrown by GetResponse.
                                var ex = aex.GetBaseException() as WebException;
                                var response = (ex != null) ? ex.Response as HttpWebResponse : null;
                                if (response != null)
                                {
                                    Console.WriteLine( "Service Error: {0} {1} ", response.StatusCode, response.StatusDescription );
                                } else
                                {
                                    Console.WriteLine( "Service Error: {0}", aex.GetBaseException().Message );
                                }
                            }
                            return resultGuid;
                        } );

                } )
                .Unwrap(); // flatten the nested Task<Task<Guid>> into a single Task<Guid>
        }
    }
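
To call it, something like the following should do. This is just a sketch - it assumes the Credentials class from the sample project has settable InitialCatalog, UserName and Password members, and that blocking on the task is acceptable where you call it:

    // Hypothetical caller - the Credentials member names are taken from the snippet above.
    var service = new BackupUsingDacService( new Credentials
        {
            InitialCatalog = "MyDatabase",
            UserName = "serverAdminLogin",
            Password = "..."
        } );
    Guid requestId = service.Backup().Result; // blocks until the service accepts the request
    Console.WriteLine( "Export request submitted, id {0}", requestId );

Keep in mind the GUID only identifies the export request - the export itself runs asynchronously on the service side.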

Sunday, June 6, 2010

How To Re-Provision a scope in Microsoft Sync Framework

B"H

What happens when you are using Sync Framework and you need to make a change to the schema/structure of the database?

Whether you are just adding a column, changing the primary key on a table in the scope, or adding a table to the scope - if you change a table in the scope, you invalidate the scope and it will no longer sync with its peers. Even if you update all of the peers so that their schemas match, the scope config data will still be out of sync, as will the stored procedures and maybe even the triggers and the _tracking tables.

So here is my proposal. I have written a function in C# to drop the tracking infrastructure for the tables that have changed, and a second function that uses the framework to create a new scope while only letting it create the infrastructure for the changed tables. Then it copies the scope config data from the temporary scope into the old one and manually deletes the temporary scope's info.

Before you run this function, you need to apply the schema changes to all the peers. What I do is create a script of the changes I want to make and run it on every peer. You then need to run the ReProvision function once for every peer.

Here they are:
     public void ReProvision(SqlConnection conn, string scopeName, IEnumerable<string> tablesThatChanged)
     {
         var serverConfig = new SqlSyncScopeProvisioning();
         var scopeDesc = SqlSyncDescriptionBuilder.GetDescriptionForScope(scopeName, conn);
         scopeDesc.ScopeName += "_temp";

         foreach (var tableName in tablesThatChanged)
         {
             DropTrackingForTable(conn, tableName);

             var bracketedName = string.Format("[{0}]", tableName);
             scopeDesc.Tables.Remove(scopeDesc.Tables[bracketedName]);

             var tableDescription = SqlSyncDescriptionBuilder.GetDescriptionForTable(bracketedName, conn);
             scopeDesc.Tables.Add(tableDescription);
         }

         serverConfig.PopulateFromScopeDescription(scopeDesc);
         serverConfig.SetCreateProceduresDefault(DbSyncCreationOption.Skip);
         serverConfig.SetCreateTableDefault(DbSyncCreationOption.Skip);
         serverConfig.SetCreateTrackingTableDefault(DbSyncCreationOption.Skip);
         serverConfig.SetCreateTriggersDefault(DbSyncCreationOption.Skip);
         serverConfig.SetPopulateTrackingTableDefault(DbSyncCreationOption.Skip);
         serverConfig.SetCreateProceduresForAdditionalScopeDefault(DbSyncCreationOption.Skip);

         foreach (var tableName in tablesThatChanged)
         {
             var bracketedName = string.Format("[{0}]", tableName);

             serverConfig.Tables[bracketedName].CreateProcedures = DbSyncCreationOption.Create;
             serverConfig.Tables[bracketedName].CreateTrackingTable = DbSyncCreationOption.Create;
             serverConfig.Tables[bracketedName].PopulateTrackingTable = DbSyncCreationOption.Create;
             serverConfig.Tables[bracketedName].CreateTriggers = DbSyncCreationOption.Create;
         }
         serverConfig.Apply(conn);

         using (SqlCommand comm1 = new SqlCommand(@"
             declare @config_id uniqueidentifier, @config_data xml
             SELECT @config_id=sc.config_id, @config_data=sc.config_data
             FROM scope_config sc JOIN [scope_info] si ON si.scope_config_id=sc.config_id
             WHERE si.scope_name = @scope_name + '_temp'

             UPDATE [scope_config] SET config_data=@config_data
             FROM scope_config sc JOIN [scope_info] si ON si.scope_config_id=sc.config_id
             WHERE si.scope_name = @scope_name

             DELETE [scope_config] WHERE config_id=@config_id;
             DELETE [scope_info] WHERE scope_config_id=@config_id;
             ", conn))
         {
             conn.Open();
             comm1.Parameters.AddWithValue("@scope_name", scopeName);
             comm1.ExecuteNonQuery();
             conn.Close();
         }
     }




     public void DropTrackingForTable(SqlConnection conn, string tableName)
     {
         SqlCommand comm;
         StringBuilder sb = new StringBuilder();
         //Drop tracking table & triggers
         sb.AppendFormat(@"
             IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[{0}_tracking]') AND type in (N'U'))
                 DROP TABLE [dbo].[{0}_tracking]
             IF EXISTS (SELECT * FROM sys.triggers WHERE object_id = OBJECT_ID(N'[dbo].[{0}_insert_trigger]'))
                 DROP TRIGGER [dbo].[{0}_insert_trigger]
             IF EXISTS (SELECT * FROM sys.triggers WHERE object_id = OBJECT_ID(N'[dbo].[{0}_delete_trigger]'))
                 DROP TRIGGER [dbo].[{0}_delete_trigger]
             IF EXISTS (SELECT * FROM sys.triggers WHERE object_id = OBJECT_ID(N'[dbo].[{0}_update_trigger]'))
                 DROP TRIGGER [dbo].[{0}_update_trigger]
             ", tableName);
         //Drop Procedures
         foreach (string procName in new string[] { "delete", "deletemetadata", "insert", "insertmetadata", "update", "updatemetadata", "selectrow", "selectchanges" })
         {
             sb.AppendFormat(@"
                 IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[{0}_{1}]') AND type in (N'P', N'PC'))
                     DROP PROCEDURE [dbo].[{0}_{1}]
                 ", tableName, procName);
         }
         using (comm = new SqlCommand(sb.ToString(), conn))
         {
             conn.Open();
             comm.ExecuteNonQuery();
             conn.Close();
         }
     }

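As a rough sketch of how these functions get used (the connection strings, scope name and table names below are made up - you would loop over every peer after running the same schema-change script against each one):

    // Hypothetical driver - peer connection strings and table names are placeholders.
    var changedTables = new[] { "Orders", "OrderDetails" };
    var peerConnectionStrings = new[]
    {
        "Data Source=peer1;Initial Catalog=MyDb;Integrated Security=True",
        "Data Source=peer2;Initial Catalog=MyDb;Integrated Security=True"
    };

    foreach (var cs in peerConnectionStrings)
    {
        using (var conn = new SqlConnection(cs))
        {
            // The schema-change script must already have been applied to this peer.
            ReProvision(conn, "MyScope", changedTables);
        }
    }
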
If anyone would like me to explain what is going on in these functions, just post a comment or send me an email. If I know someone is actually reading this thing I will put more effort into it.

Friday, May 28, 2010

A call to action for a new Heterogeneous Sync Framework

B"H

We need a framework to sync separate, unrelated data sources - like LDAP and Outlook, or Outlook and your custom in-house CRM.

I am constantly coming across scenarios that sound exactly like this, and every time it seems like I am re-inventing the wheel and building the entire solution over again. I am actually quite confident that everyone reading this post has dozens of examples similar to what is described here.

I came to the conclusion a while ago that this problem domain needs to be abstracted so that we can build a framework to handle some of the recurring themes. I was really excited when I found Microsoft Sync Framework because I thought it was exactly that type of abstraction. However, after researching it for about two months now, and implementing it a few times, I have to conclude that it is not what we are looking for.

We need a Heterogeneous Sync Framework. A Framework to sync unrelated (but naturally relatable) Data Sources.

The Microsoft Sync Framework deals too much with the relationship and change tracking. It assumes that you start with matching data and only then tracks how it changes over time.

We need a Sync Framework that assumes we have conflicting data at every step and have no idea how it got that way, because the data is coming from two sources that couldn't give a darn about each other.

Enough ranting for now.
If anyone knows of such a framework, or wants to start working on one with me (it would need to be much less complicated than Microsoft's solution), or knows how SyncFX can gracefully handle these scenarios - please POST A COMMENT!

Thank you

Thursday, May 27, 2010

Microsoft Sync Framework Data Provider ignores records where a field in the PK changes

If you update a field in a primary key, the SQL Server provider does not record a change (or an insert/delete pair). Therefore your change never gets propagated and you are left with an orphan record.

Thursday, May 13, 2010

There goes SQL Azure's SSL support again

B"H

This morning I tried accessing some MS Access reports that have worked just fine in the past, and all of a sudden they gave the error "The certificate's CN name does not match the passed value."

When I checked my C# clients and saw that they were all working fine, I remembered seeing a post on the MSDN forums which told me to add "TrustServerCertificate=true" to my ADO.NET connection string. I had done that back then, and that was what was making my C# clients work, so I needed a way to add the same setting to my ODBC parameters.

Thank G-D, I recently switched the way that I have my clients specify ODBC parameters on their personal computers.

I distribute a *.reg file which creates a User DSN on their computer.
Then all of my ODBC connections refer to that DSN, knowing that it will definitely exist on their computer - or that I can just instruct them to run the *.reg file again.

So this morning all I had to do was figure out the correct syntax for "TrustServerCertificate=true" in ODBC, redistribute the *.reg file, and instruct all of my clients to rerun it.

the file now looks like this:

Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\ODBC\ODBC.INI\NameOfDSN]
"Driver"="c:\\Windows\\system32\\sqlncli10.dll"
"Server"="NameOfMyServer.database.windows.net"
"Database"="NameOfDatabase"
"Encrypt"="Yes"
"TrustServerCertificate"="Yes"
"LastUser"="username@NameOfMyServer"
"Trusted_Connection"="No"
"AutoTranslate"="No"

[HKEY_CURRENT_USER\Software\ODBC\ODBC.INI\ODBC Data Sources]
"
NameOfDSN"="SQL Server Native Client 10.0"

Replace NameOfDSN with the actual name of the DSN, NameOfDatabase with the actual database, and NameOfMyServer with the actual name of the server.
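
For comparison, on the ADO.NET side the fix mentioned above is just a connection-string flag. A minimal sketch (the server, database and login are placeholders):

    // Placeholder server/database/login - the Encrypt and TrustServerCertificate flags are the point here.
    var connectionString =
        "Server=tcp:NameOfMyServer.database.windows.net;" +
        "Database=NameOfDatabase;" +
        "User ID=username@NameOfMyServer;Password=...;" +
        "Encrypt=True;TrustServerCertificate=True;";

    using (var conn = new System.Data.SqlClient.SqlConnection(connectionString))
    {
        conn.Open();
        // run commands as usual
    }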

I hope this helps someone.

Thursday, April 29, 2010

How to get your data into (and out of) SQL Azure

B"H

Most applications that will be hosting their data in SQL Azure at this point will be applications being ported from other databases - particularly SQL Server 2008 hosted on premises.

So how do you get the data from your local SQL to SQL Azure?

What you'll find out from the various sources on the net is that you have many options. I will try writing a blog post about each one. If I missed any, please let me know.
  • Write ADO.net code.
    I'll post some samples; for now there is a small sketch after this list.
  • Write code to use the BCP utility.
    The SQL Azure team has instructions on their blog.
  • Write BCP API code.
    Most of us are more comfortable with C# than with batch files, so automating a transfer would feel more natural through an API.
  • Use the "SQL Azure Migration Wizard"
    Download it here. It uses bcp internally and is open source, hosted on CodePlex, so we can tweak it to do whatever we need. Currently a migration cannot be saved and/or repeated; however, since it is open source, maybe one of us can add that functionality.
  • Use the Microsoft Sync Framework with its new Azure capabilities

  • Use SSIS - SQL Server Integration Services
    Not available in SQL Express, so I will not write about it.
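
For the first option, here is roughly what the ADO.net code amounts to - a minimal SqlBulkCopy sketch, with placeholder connection strings and table name (the destination table must already exist in SQL Azure with a matching schema):

    using System.Data.SqlClient;

    // Copy one table from the local server to SQL Azure. Placeholders throughout.
    const string sourceCs = "Server=localhost;Database=MyDb;Integrated Security=True";
    const string azureCs = "Server=tcp:myserver.database.windows.net;Database=MyDb;" +
                           "User ID=user@myserver;Password=...;Encrypt=True";

    using (var source = new SqlConnection(sourceCs))
    using (var target = new SqlConnection(azureCs))
    {
        source.Open();
        target.Open();

        using (var cmd = new SqlCommand("SELECT * FROM dbo.MyTable", source))
        using (var reader = cmd.ExecuteReader())
        using (var bulk = new SqlBulkCopy(target) { DestinationTableName = "dbo.MyTable" })
        {
            bulk.WriteToServer(reader);
        }
    }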

Monday, April 26, 2010

My first step

B"H

Recently I decided to take a client to the cloud!

The client did not have reliable server space at their location (which was continuously causing problems) and we were having issues integrating their data with their partners. So we decided to move to the cloud.

Being a Microsoft shop the obvious choice was Azure.

What wasn't so obvious was some of the turbulence that we were going to hit (and continue to hit) on our journey cloudward.

So here I am typing my words into the ether, resigned to the fact that they may never be read by another human being, but with the hope that I will be able to help at least one person navigate the journey and avoid some of the pitfalls that I encountered.

I am a programmer/developer/architect (about as far from an administrator as possible) so things should be taken from that perspective.

I will start with the issues I encounter as they arise and then get back to the first challenges when I have time, so this may not be in chronological order. Sorry, no road map here, but I do hope that it will help with particular details.

I will cover issues relating to Azure (right now SQL Azure, then on to the other offerings), SQL when applicable, and the Sync Framework - which I now use for so much more, but which I originally discovered while trying to migrate my data to Azure.