
How to sync large size database with sync framework

With Sync Framework 2.1 we can easily sync databases on a schedule, at whatever interval we choose. In one of our projects we had to let the client schedule syncs frequently, and we implemented this with Sync Framework 2.1 in an ASP.NET MVC application. You can go through the main sync code here.
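For readers who have not used Sync Framework 2.1 before, the core sync call looks roughly like the sketch below. The scope name, connection strings, and database names are placeholders, and it assumes both databases have already been provisioned for that sync scope:

```csharp
using System.Data.SqlClient;
using Microsoft.Synchronization;
using Microsoft.Synchronization.Data;
using Microsoft.Synchronization.Data.SqlServer;

// Hypothetical connection strings and scope name -- adjust for your setup.
// Both databases must already be provisioned for the "ProductsScope" scope.
using (var serverConn = new SqlConnection("Data Source=.;Initial Catalog=SourceDb;Integrated Security=True"))
using (var clientConn = new SqlConnection("Data Source=.;Initial Catalog=TargetDb;Integrated Security=True"))
{
    var orchestrator = new SyncOrchestrator
    {
        LocalProvider  = new SqlSyncProvider("ProductsScope", clientConn),
        RemoteProvider = new SqlSyncProvider("ProductsScope", serverConn),
        Direction      = SyncDirectionOrder.UploadAndDownload
    };

    // Runs one sync session and reports how many changes were applied
    SyncOperationStatistics stats = orchestrator.Synchronize();
}
```

In our application this call runs inside the worker role on the schedule the client configures.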

It was working fine until we tested with a large database (greater than 10 GB). With a large database, we got an error in the middle of the sync process.

The error was "There is not enough space on the disk". We then increased the size of the target database to 20 GB (the source database was 10 GB) but got the same error. We searched Google and could not find enough support for the issue. We kept digging, and after some days we found the root cause: it was not related to Sync Framework at all. While the sync process runs, a log file is written as a backend process (WaHostBootstrapper) to the location "C:\Resources\temp\7aad4070ce51495c82cde6b1d410aa65.WorkerRole1\RoleTemp" on the worker role's virtual machine. This log file grows continuously, and there is a limit on its size (normally 60 MB). Syncing a large database obviously takes a long time, so the log file keeps growing and the sync crashes once the maximum size is exceeded. At last we found the solution, and now our application can sync large databases without any error. The solution is given below.

1.    We need to extend the maximum size of the log file. We can achieve this by adding the following to "ServiceDefinition.csdef":


      <LocalStorage name="localStoreOne" sizeInMB="20480" cleanOnRoleRecycle="true" />


    name (localStoreOne): the name of your local storage resource.

    sizeInMB: the maximum size you want to allow for the log file.

    cleanOnRoleRecycle: when set to true, the log file is deleted and recreated each time the worker role recycles.
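For context, this element sits inside the LocalResources section of the worker role definition in ServiceDefinition.csdef. The service and role names below are just examples:

```xml
<ServiceDefinition name="MyAzureService"
                   xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WorkerRole name="WorkerRole1" vmsize="Small">
    <LocalResources>
      <LocalStorage name="localStoreOne" sizeInMB="20480" cleanOnRoleRecycle="true" />
    </LocalResources>
  </WorkerRole>
</ServiceDefinition>
```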

2.    In the OnStart() method of the worker role we need to map the temp folder to that local storage using the following code.

       // Resolve the local storage resource declared in ServiceDefinition.csdef
       string customTempLocalResourcePath =
           RoleEnvironment.GetLocalResource("localStoreOne").RootPath;

       // Point the temp directories at the larger local resource
       Environment.SetEnvironmentVariable("TMP", customTempLocalResourcePath);
       Environment.SetEnvironmentVariable("TEMP", customTempLocalResourcePath);

After this, we can see the log file being archived partway through the sync once it reaches a certain size, and it can grow up to the size we specified in the ServiceDefinition file. It is better to set cleanOnRoleRecycle to true, so that the log files are automatically deleted and recreated whenever the worker role restarts.
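Putting the two steps together, a minimal OnStart() override might look like the sketch below, assuming the local storage resource is named localStoreOne as in the ServiceDefinition example above:

```csharp
using System;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WorkerRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // Look up the local storage resource declared in ServiceDefinition.csdef
        LocalResource localStore = RoleEnvironment.GetLocalResource("localStoreOne");

        // Redirect temp directories so the sync log is written to the larger store
        Environment.SetEnvironmentVariable("TMP", localStore.RootPath);
        Environment.SetEnvironmentVariable("TEMP", localStore.RootPath);

        return base.OnStart();
    }
}
```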

  1. Sharon
    October 30, 2012 at 11:23 am

    I'm getting the same error when I try to sync two SQL 2008 R2 databases from a simple Windows application. One of them is large and I'm getting a System.OutOfMemoryException. Could you please guide me to implement this logic?

  2. October 31, 2012 at 11:05 am

    Hi Sharon,

    When we sync a large database, the log file created during the sync exceeds its size limit, so the sync breaks down in the middle. To avoid this, we have to follow the two steps mentioned in this post.

    Please try this, and if you get any error, please send me more details about it.

