Posts Tagged ‘Sync Framework’

How to sync a large database with Sync Framework

May 11, 2011

Using Sync Framework 2.1 we can easily sync databases on a schedule, at a configurable interval. In one of our projects we had to let the client schedule the sync frequency. We implemented this with Sync Framework 2.1 in an ASP.NET MVC application. You can go through the main sync code here.

It worked fine until we tested with a large database (greater than 10 GB). With a database that size, we got an error in the middle of the sync process.

The error was “There is not enough space on the disk”. We increased the size of the target database to 20 GB (the source database is 10 GB) but got the same error. We searched Google and could not find much help for the issue. After digging into it for a few days we found the root cause, and it was not related to Sync Framework at all. While a sync is running, a log file is written in the background to a location such as “C:\Resources\temp\7aad4070ce51495c82cde6b1d410aa65.WorkerRole1\RoleTemp” on the worker role’s virtual machine (WaHostBootstrapper). This log file grows continuously, and there is a limit on its size (normally 60 MB). A large database obviously takes a long time to sync, so the log file kept growing and the sync crashed once the maximum size was exceeded. Eventually we found the solution, and now our application can sync a large database without any error. The solution is given below.

1.    We need to extend the maximum size of the log file. We can do that with the following entry in “ServiceDefinition.csdef”:

<LocalResources>
  <LocalStorage name="localStoreOne" sizeInMB="20480" cleanOnRoleRecycle="true" />
</LocalResources>

    name (localStoreOne): the name of your local storage resource.

    sizeInMB: the maximum size you want to allow for the log file.

    cleanOnRoleRecycle: when set to true, the log file is deleted and recreated each time the worker role recycles.
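For context, the LocalResources element sits inside the WorkerRole element of “ServiceDefinition.csdef”. A minimal sketch (the service and role names here are placeholders, not from our project):

```xml
<ServiceDefinition name="MySyncService"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WorkerRole name="WorkerRole1">
    <LocalResources>
      <!-- 20480 MB = 20 GB of local disk for temp/log files -->
      <LocalStorage name="localStoreOne" sizeInMB="20480" cleanOnRoleRecycle="true" />
    </LocalResources>
  </WorkerRole>
</ServiceDefinition>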

2.    In the “OnStart()” method of the worker role we need to map the temp folder with the following code:

string customTempLocalResourcePath =
    RoleEnvironment.GetLocalResource("localStoreOne").RootPath;
Environment.SetEnvironmentVariable("TMP", customTempLocalResourcePath);
Environment.SetEnvironmentVariable("TEMP", customTempLocalResourcePath);

Then we can see the log file being archived partway through the sync once it reaches a certain size, and it can grow up to the size we specified in the ServiceDefinition file. It is better to set cleanOnRoleRecycle to true, so the log files are automatically deleted and recreated whenever the worker role restarts.
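Putting step 2 in context, the environment-variable mapping belongs in the worker role’s OnStart override. A minimal sketch (the class shape is the standard worker-role template; only the resource name comes from the steps above):

```csharp
using System;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WorkerRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // Point TMP/TEMP at the local storage resource declared in
        // ServiceDefinition.csdef, so background log files can grow
        // up to the sizeInMB we configured instead of the default limit.
        string customTempLocalResourcePath =
            RoleEnvironment.GetLocalResource("localStoreOne").RootPath;
        Environment.SetEnvironmentVariable("TMP", customTempLocalResourcePath);
        Environment.SetEnvironmentVariable("TEMP", customTempLocalResourcePath);

        return base.OnStart();
    }
}
```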


How to sync a database after schema changes using Sync Framework?

May 4, 2011

I was working on Azure database synchronization last month and successfully completed the task. In my case both databases, source and target, are in Azure; we can also sync from Azure to a local database. Below is the code for syncing two Azure databases.
Step-by-step process of syncing databases using Sync Framework:
1. Create connection strings for both the source and target databases.
2. Create a DbSyncScopeDescription object ‘myscope’ describing our scope.
3. Add the tables we need to sync to ‘myscope’.
4. Provision both the source and target databases.
5. Set the memory allocation for the database providers.
6. Set the application transaction size on the destination provider.
7. Create a SyncOrchestrator object and call its Synchronize() method.

public static void Setup(string sqlSourceConnectionString,
    string sqlTargetConnectionString,
    string scopeName, int scopeid, Scope ScopeDetails)
{
    try
    {
        SqlConnection sqlServerConn = new SqlConnection(sqlSourceConnectionString);
        SqlConnection sqlAzureConn = new SqlConnection(sqlTargetConnectionString);
        DbSyncScopeDescription myScope = new DbSyncScopeDescription(scopeName);

        // ConfigurationManager replaces the obsolete ConfigurationSettings
        string connectionstring = ConfigurationManager.AppSettings["ConnectionString"];
        SqlConnection conn = new SqlConnection(connectionstring);

        // Read the tables we need to sync from the source db;
        // the scope id is passed as a parameter to avoid SQL injection
        SqlCommand cmd = new SqlCommand(
            "SELECT TableName FROM scheduler_tables WHERE scopeid = @scopeid", conn);
        cmd.Parameters.AddWithValue("@scopeid", scopeid);
        conn.Open();
        SqlDataReader myDataReader = cmd.ExecuteReader();
        while (myDataReader.Read())
        {
            DbSyncTableDescription tableDescription =
                SqlSyncDescriptionBuilder.GetDescriptionForTable(
                    Convert.ToString(myDataReader["TableName"]), sqlServerConn);
            // Add the table from above to the scope
            myScope.Tables.Add(tableDescription);
        }
        myDataReader.Close();
        conn.Close();

        // Set up the source SQL Server database for sync
        SqlSyncScopeProvisioning sqlServerProv =
            new SqlSyncScopeProvisioning(sqlServerConn, myScope);
        sqlServerProv.CommandTimeout = 60 * 30;
        if (!sqlServerProv.ScopeExists(scopeName))
        {
            // Apply the scope provisioning.
            sqlServerProv.Apply();
        }

        // Set up the target SQL Azure database for sync
        SqlSyncScopeProvisioning sqlAzureProv =
            new SqlSyncScopeProvisioning(sqlAzureConn, myScope);
        sqlAzureProv.CommandTimeout = 60 * 30;
        if (!sqlAzureProv.ScopeExists(scopeName))
        {
            sqlAzureProv.Apply();
        }

        sqlAzureConn.Close();
        sqlServerConn.Close();

        Sync(sqlSourceConnectionString, sqlTargetConnectionString, ScopeDetails);
    }
    catch (Exception)
    {
        throw;
    }
}
public static void Sync(string sqlSourceConnectionString,
    string sqlTargetConnectionString,
    Scope ScopeDetails)
{
    try
    {
        SqlConnection sqlServerConn = new SqlConnection(sqlSourceConnectionString);
        SqlConnection sqlAzureConn = new SqlConnection(sqlTargetConnectionString);

        SqlSyncProvider RemoteProvider =
            new SqlSyncProvider(ScopeDetails.ScopeName, sqlAzureConn);
        SqlSyncProvider LocalProvider =
            new SqlSyncProvider(ScopeDetails.ScopeName, sqlServerConn);

        // Set the memory allocation (in KB) for the database providers;
        // MemorySize and BatchSize are configuration fields of this class
        RemoteProvider.MemoryDataCacheSize = MemorySize;
        LocalProvider.MemoryDataCacheSize = MemorySize;

        // Set the application transaction size on the destination provider
        RemoteProvider.ApplicationTransactionSize = BatchSize;

        // Count applied changes
        RemoteProvider.ChangesApplied +=
            new EventHandler<DbChangesAppliedEventArgs>(RemoteProvider_ChangesApplied);

        SyncOrchestrator orch = new SyncOrchestrator();
        orch.RemoteProvider = RemoteProvider;
        orch.LocalProvider = LocalProvider;
        orch.Direction = SyncDirectionOrder.Upload;

        string syncdetails = ShowStatistics(orch.Synchronize());

        sqlAzureConn.Close();
        sqlServerConn.Close();
    }
    catch (Exception)
    {
        throw;
    }
}
public static string ShowStatistics(SyncOperationStatistics syncStats)
{
    syncStartTime = syncStats.SyncStartTime.ToString();
    syncEndTime = syncStats.SyncEndTime.ToString();

    string message = "\tSync Start Time :" + syncStats.SyncStartTime;
    message += "\tSync End Time :" + syncStats.SyncEndTime;
    message += "\tUpload Changes Applied :" + syncStats.UploadChangesApplied;
    message += "\tUpload Changes Failed :" + syncStats.UploadChangesFailed;
    message += "\tUpload Changes Total :" + syncStats.UploadChangesTotal;
    message += "\tDownload Changes Applied :" + syncStats.DownloadChangesApplied;
    message += "\tDownload Changes Failed :" + syncStats.DownloadChangesFailed;
    message += "\tDownload Changes Total :" + syncStats.DownloadChangesTotal;
    return message;
}

The code above works fine until a schema change occurs in the source database; once the schema changes, the sync fails.
In my case, I added a new column called ‘isActive’ to the source database, and the next time I tried to sync I got the error “invalid column name ‘isActive’”.
The cause is that the first sync creates many tables, stored procedures and triggers; Sync Framework uses them to track which areas need to be synced in future and which have changed since the previous sync.
So we need to clear all of this sync metadata before any schema change in the source database is picked up. To delete it, Microsoft provides the SqlSyncScopeDeprovisioning class and its DeprovisionStore() method. So whenever we have schema changes in the source database, we need to create an object of this class and call this method before applying the provisioning. Below is the code to deprovision the entire database:

SqlSyncScopeDeprovisioning deprovisioningvar = new SqlSyncScopeDeprovisioning(sqlServerConn);
deprovisioningvar.DeprovisionStore();

Then it works fine..! The drawback is that this clears all the data about previous syncs, so the first sync afterwards takes a long time; from the second sync on there is no need to deprovision again until another schema change occurs.
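To tie it together, the order of operations after a schema change can be sketched as follows. Note the post above deprovisions only the source connection; deprovisioning the target as well is my assumption here, so both stores get the new scope definition when Setup re-provisions them:

```csharp
// After a schema change: drop all sync metadata first, then re-provision.
SqlConnection sqlServerConn = new SqlConnection(sqlSourceConnectionString);
SqlConnection sqlAzureConn = new SqlConnection(sqlTargetConnectionString);

SqlSyncScopeDeprovisioning sourceDeprovisioning =
    new SqlSyncScopeDeprovisioning(sqlServerConn);
sourceDeprovisioning.DeprovisionStore();

// Assumption: deprovision the target too, so ScopeExists() returns false
// and Setup re-applies provisioning with the new schema on both sides.
SqlSyncScopeDeprovisioning targetDeprovisioning =
    new SqlSyncScopeDeprovisioning(sqlAzureConn);
targetDeprovisioning.DeprovisionStore();

// Re-provision and sync with the new schema (Setup from the earlier code).
Setup(sqlSourceConnectionString, sqlTargetConnectionString,
    scopeName, scopeid, ScopeDetails);
```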
