Sunday, June 6, 2010

How To Re-Provision a scope in Microsoft Sync Framework

B"H

What happens when you are using Sync Framework and you need to make a change to the schema/structure of the database?

Whether you are just adding a column, changing the primary key on a table in the scope, or adding a table to the scope: if you change a table in the scope, you invalidate the scope, and it will no longer sync with its peers. Even if you update all of the peers so that their schemas match, the scope config data will still be out of sync, as will the stored procedures and maybe even the triggers and the _tracking tables.

So here is my proposal. I have written a function in C# to drop the tracking infrastructure for the tables that have changed, and a second function that uses the framework to create a new scope, letting it create infrastructure only for the changed tables. Then it copies the new scope config data over the old scope's config and manually deletes the new scope's records.

Before you run this function you need to apply the schema changes to all the peers. What I do is create a script of the changes I want to make and then run it on every peer. You need to run the ReProvision function once for every peer.

Here they are:
public void ReProvision(SqlConnection conn, string scopeName, IEnumerable<string> tablesThatChanged)
{
    var serverConfig = new SqlSyncScopeProvisioning();
    var scopeDesc = SqlSyncDescriptionBuilder.GetDescriptionForScope(scopeName, conn);
    scopeDesc.ScopeName += "_temp";

    foreach (var tableName in tablesThatChanged)
    {
        // Drop the existing tracking infrastructure for the changed table
        DropTrackingForTable(conn, tableName);

        var bracketedName = string.Format("[{0}]", tableName);
        scopeDesc.Tables.Remove(scopeDesc.Tables[bracketedName]);

        // Re-read the table description so it reflects the new schema
        var tableDescription = SqlSyncDescriptionBuilder.GetDescriptionForTable(bracketedName, conn);
        scopeDesc.Tables.Add(tableDescription);
    }

    serverConfig.PopulateFromScopeDescription(scopeDesc);

    // By default, skip creating everything...
    serverConfig.SetCreateProceduresDefault(DbSyncCreationOption.Skip);
    serverConfig.SetCreateTableDefault(DbSyncCreationOption.Skip);
    serverConfig.SetCreateTrackingTableDefault(DbSyncCreationOption.Skip);
    serverConfig.SetCreateTriggersDefault(DbSyncCreationOption.Skip);
    serverConfig.SetPopulateTrackingTableDefault(DbSyncCreationOption.Skip);
    serverConfig.SetCreateProceduresForAdditionalScopeDefault(DbSyncCreationOption.Skip);

    // ...except for the tables that changed
    foreach (var tableName in tablesThatChanged)
    {
        var bracketedName = string.Format("[{0}]", tableName);

        serverConfig.Tables[bracketedName].CreateProcedures = DbSyncCreationOption.Create;
        serverConfig.Tables[bracketedName].CreateTrackingTable = DbSyncCreationOption.Create;
        serverConfig.Tables[bracketedName].PopulateTrackingTable = DbSyncCreationOption.Create;
        serverConfig.Tables[bracketedName].CreateTriggers = DbSyncCreationOption.Create;
    }
    serverConfig.Apply(conn);

    // Copy the temp scope's config data over the original scope's config,
    // then delete the temp scope's rows
    using (SqlCommand comm1 = new SqlCommand(@"
DECLARE @config_id uniqueidentifier, @config_data xml

SELECT @config_id = sc.config_id, @config_data = sc.config_data
FROM scope_config sc JOIN [scope_info] si ON si.scope_config_id = sc.config_id
WHERE si.scope_name = @scope_name + '_temp'

UPDATE sc SET config_data = @config_data
FROM scope_config sc JOIN [scope_info] si ON si.scope_config_id = sc.config_id
WHERE si.scope_name = @scope_name

DELETE [scope_config] WHERE config_id = @config_id;
DELETE [scope_info] WHERE scope_config_id = @config_id;
", conn))
    {
        conn.Open();
        comm1.Parameters.AddWithValue("@scope_name", scopeName);
        comm1.ExecuteNonQuery();
        conn.Close();
    }
}




public void DropTrackingForTable(SqlConnection conn, string tableName)
{
    SqlCommand comm;
    StringBuilder sb = new StringBuilder();

    // Drop tracking table & triggers
    sb.AppendFormat(@"
IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[{0}_tracking]') AND type in (N'U'))
    DROP TABLE [dbo].[{0}_tracking]
IF EXISTS (SELECT * FROM sys.triggers WHERE object_id = OBJECT_ID(N'[dbo].[{0}_insert_trigger]'))
    DROP TRIGGER [dbo].[{0}_insert_trigger]
IF EXISTS (SELECT * FROM sys.triggers WHERE object_id = OBJECT_ID(N'[dbo].[{0}_delete_trigger]'))
    DROP TRIGGER [dbo].[{0}_delete_trigger]
IF EXISTS (SELECT * FROM sys.triggers WHERE object_id = OBJECT_ID(N'[dbo].[{0}_update_trigger]'))
    DROP TRIGGER [dbo].[{0}_update_trigger]
", tableName);

    // Drop procedures
    foreach (string procName in new string[] { "delete", "deletemetadata", "insert", "insertmetadata", "update", "updatemetadata", "selectrow", "selectchanges" })
    {
        sb.AppendFormat(@"
IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[{0}_{1}]') AND type in (N'P', N'PC'))
    DROP PROCEDURE [dbo].[{0}_{1}]
", tableName, procName);
    }

    using (comm = new SqlCommand(sb.ToString(), conn))
    {
        conn.Open();
        comm.ExecuteNonQuery();
        conn.Close();
    }
}
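To tie the two functions together, a calling driver might look like the sketch below. The connection strings, scope name, and table names are placeholders, not part of the original functions; remember that the schema-change script must already have been applied to every peer.

```csharp
// Hypothetical driver: run ReProvision once per peer after the schema
// changes have been applied everywhere. All names here are placeholders.
var peerConnectionStrings = new[]
{
    "Data Source=Peer1;Initial Catalog=MyDb;Integrated Security=True",
    "Data Source=Peer2;Initial Catalog=MyDb;Integrated Security=True"
};
var changedTables = new[] { "Customers", "Orders" };

foreach (var cs in peerConnectionStrings)
{
    using (var conn = new SqlConnection(cs))
    {
        // ReProvision opens and closes the connection internally
        ReProvision(conn, "MyScope", changedTables);
    }
}
```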

If anyone would like me to explain what is going on in these functions, just post a comment or send me an email. If I know someone is actually reading this thing I will put more effort into it.

Friday, May 28, 2010

A call to action for a new Heterogeneous Sync Framework

B"H

We need a framework to sync separate, unrelated data sources, like LDAP and Outlook, or Outlook and your custom in-house CRM.

I am constantly coming across scenarios that sound exactly like this, and every time it seems like I am re-inventing the wheel and building the entire solution over again. I am actually quite confident that everyone reading this post has dozens of examples similar to what is described here.

I came to the conclusion a while ago that this problem domain needs to be abstracted, so that we can build a framework to handle some of the recurring themes. I was really excited when I found Microsoft Sync Framework because I thought it was exactly that type of abstraction. However, after researching it for about two months now and implementing it a few times, I have to conclude that it is not what we are looking for.

We need a Heterogeneous Sync Framework. A Framework to sync unrelated (but naturally relatable) Data Sources.

The Microsoft Sync Framework deals too much with the relationship and change tracking. It assumes that you start with matching data, and that only then does it change over time.

We need a Sync Framework that assumes we have conflicting data at every step, with no idea how it got that way, because the data comes from two sources that couldn't give a darn about each other.

Enough ranting for now.
If anyone knows of such a framework, wants to start working on one with me (it would need to be much less complicated than Microsoft's solution), or knows how SyncFX can gracefully handle these scenarios, please POST A COMMENT!

Thank you

Thursday, May 27, 2010

Microsoft Sync Framework Data Provider ignores records where a field in the PK changes

If you update a field that is part of a primary key, the SQL Server provider does not record a change (or an insert/delete pair). Therefore your change never gets propagated, and you are left with an orphan record.
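One way around this (my own suggestion, not something the framework provides) is to never update a primary-key column directly; instead, delete the row and re-insert it with the new key inside a transaction, so the tracking triggers record both operations and the change propagates as a delete plus an insert. The table, columns, and variables below are hypothetical.

```csharp
// Hedged workaround sketch: replace a PK update with delete + insert in one
// transaction so the Sync Framework triggers see both changes.
// [Customers], CustomerId, and the variables are hypothetical names.
using (var conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (var tran = conn.BeginTransaction())
    {
        var sql = @"
DELETE FROM [Customers] WHERE [CustomerId] = @oldId;
INSERT INTO [Customers] ([CustomerId], [Name]) VALUES (@newId, @name);";
        using (var comm = new SqlCommand(sql, conn, tran))
        {
            comm.Parameters.AddWithValue("@oldId", oldId);
            comm.Parameters.AddWithValue("@newId", newId);
            comm.Parameters.AddWithValue("@name", name);
            comm.ExecuteNonQuery();
        }
        tran.Commit();
    }
}
```

The cost is that any foreign keys pointing at the old key value need to be updated in the same transaction.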

Thursday, May 13, 2010

There goes SQL Azure's SSL support again

B"H

This morning, I tried accessing some MS Access reports that had worked just fine in the past, and all of a sudden they gave the error "The certificate's CN name does not match the passed value."

When I checked my C# clients and saw that they were all working fine, I remembered seeing this post on the MSDN forums, which told me to add "TrustServerCertificate=true" to my ADO.NET connection string. I had done that back then, and that was what was keeping my C# clients working, so I needed a way to add the same setting to my ODBC parameters.
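For comparison, an ADO.NET connection string with that setting looks roughly like this (server, database, and credentials are placeholders):

```text
Server=tcp:NameOfMyServer.database.windows.net;Database=NameOfDatabase;
User ID=username@NameOfMyServer;Password=yourPassword;
Encrypt=True;TrustServerCertificate=True;
```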

Thank G-D, I recently switched the way that I have my clients specify ODBC parameters on their personal computers.

I distribute a *.reg file which creates a User DSN on their computer.
All of my ODBC connections then refer to that DSN, knowing that it will definitely exist on their computer, or that I can just instruct them to run the *.reg file.

So this morning all I had to do was figure out the correct syntax for "TrustServerCertificate=true" in ODBC, redistribute the *.reg file, and instruct all of my clients to rerun the file.

the file now looks like this:

Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\ODBC\ODBC.INI\NameOfDSN]
"Driver"="c:\\Windows\\system32\\sqlncli10.dll"
"Server"="NameOfMyServer.database.windows.net"
"Database"="NameOfDatabase"
"Encrypt"="Yes"
"TrustServerCertificate"="Yes"
"LastUser"="username@NameOfMyServer"
"Trusted_Connection"="No"
"AutoTranslate"="No"

[HKEY_CURRENT_USER\Software\ODBC\ODBC.INI\ODBC Data Sources]
"NameOfDSN"="SQL Server Native Client 10.0"

Replace NameOfDSN with the actual name of the DSN and NameOfMyServer with the actual name of the server.

I hope this helps someone.

Thursday, April 29, 2010

How to get your data into (and out of) SQL Azure

B"H

Most applications hosting their data in SQL Azure at this point are applications being ported from other databases, particularly SQL Server 2008 hosted on premises.

So how do you get the data from your local SQL to SQL Azure?

What you'll find out from the various sources on the net is that you have many options. I will try writing a blog post about each one. If I missed any, please let me know.
  • Write ADO.net code.
    I'll post some samples.
  • Write code to use the BCP utility.
    The SQL Azure team has instructions on their blog.
  • Write BCP API code.
    Most of us are more comfortable with C# than with batch files, so automating a transfer is more natural for us with an API.
  • Use the "SQL Azure Migration tool"
    Download it here. It uses BCP internally and is open source, hosted on CodePlex, so we can tweak it to do whatever we need. Currently a migration cannot be saved and/or repeated; however, since it is open source, maybe one of us can add that functionality.
  • Use the Microsoft Sync Framework with its new Azure capabilities

  • Use SSIS - SQL Server Integration Services
    Not available in SQL Express, so I will not write about it.
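Until I post proper samples for the first option, here is a minimal sketch of the ADO.NET approach using SqlBulkCopy to push a table from a local SQL Server into SQL Azure. The connection strings and table name are placeholders, and this is only one possible shape of the code, not a definitive implementation.

```csharp
// Minimal sketch: stream rows from the local database into SQL Azure
// with SqlBulkCopy. All connection strings and names are placeholders.
var localCs = "Data Source=localhost;Initial Catalog=MyDb;Integrated Security=True";
var azureCs = "Server=tcp:myserver.database.windows.net;Database=MyDb;" +
              "User ID=username@myserver;Password=yourPassword;" +
              "Encrypt=True;TrustServerCertificate=True;";

using (var source = new SqlConnection(localCs))
using (var dest = new SqlConnection(azureCs))
{
    source.Open();
    dest.Open();

    using (var comm = new SqlCommand("SELECT * FROM [dbo].[Customers]", source))
    using (var reader = comm.ExecuteReader())
    using (var bulk = new SqlBulkCopy(dest))
    {
        // The destination table must already exist with a matching schema
        bulk.DestinationTableName = "[dbo].[Customers]";
        bulk.WriteToServer(reader);
    }
}
```

Note that the destination table has to be created ahead of time; bulk copy moves data, not schema.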

Monday, April 26, 2010

My first step

B"H

Recently I decided to take a client to the cloud!

The client did not have reliable server space at their location (which was continuously causing problems) and we were having issues integrating their data with their partners. So we decided to move to the cloud.

Being a Microsoft shop the obvious choice was Azure.

What wasn't so obvious was some of the turbulence that we were going to hit (and continue to hit) on our journey cloudward.

So here I am typing my words into the ether, resigned to the fact that they may never be read by another human being, but with the hope that I will be able to help at least one person navigate the journey and avoid some of the pitfalls that I encountered.

I am a programmer/developer/architect (about as far from an administrator as possible) so things should be taken from that perspective.

I will start with the issues I encounter as they arise and then get back to the first challenges when I have time. So this may not be in chronological order. Sorry, no road map here. But I do hope that it will help with particular details.

I will cover issues relating to Azure (right now SQL Azure, then on to the other offerings), SQL when applicable, and the Sync Framework, which I now use for so much more but originally discovered while trying to migrate my data to Azure.