Archive for the ‘Windows Azure’ Category

How to deploy Sync Framework to Windows Azure / Deploy a Sync Framework project to Windows Azure

October 5, 2011

Deployment of a Sync Framework worker role / web role project into the Azure environment

This article explains how to deploy Sync Framework worker role or web role projects created in Visual Studio to the Azure environment. We built a worker role application that synchronizes Azure databases at regular intervals, using Sync Framework 2.1 for database synchronization in Azure. Once the application was complete, the challenge was to deploy it to the Azure platform: we had to include a few DLLs and make some configuration changes to deploy the project to Windows Azure.

Steps to deploy a Sync Framework project to Azure

1. Open your Windows Azure Cloud Service project in Visual Studio. In Solution Explorer, right-click the Web Role project, point to Add, and then click Add Reference.

2. Add references to Microsoft.Synchronization.dll, Microsoft.Synchronization.Data.dll, and Microsoft.Synchronization.Data.SqlServer.dll from the Sync Framework 2.1 installation folder (typically C:\Program Files (x86)\Microsoft Sync Framework\2.1).

3. Select all three references, open the Properties window, set the Aliases property to global, and set Copy Local to True.

4. Create a class file named activationcontext.cs with the following content and add it to the Worker Role/Web Role project.

 using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Web;
using System.Runtime.InteropServices;
using System.IO;

namespace Microsoft.Samples.Synchronization
{
    public class ActivationContext
    {
        // Activation Context API Functions

        [DllImport("Kernel32.dll", SetLastError = true)]
        private extern static IntPtr CreateActCtx(ref ACTCTX actctx);

        // Activation context structure
        private struct ACTCTX
        {
            public int cbSize;
            public uint dwFlags;
            public string lpSource;
            public ushort wProcessorArchitecture;
            public ushort wLangId;
            public string lpAssemblyDirectory;
            public string lpResourceName;
            public string lpApplicationName;
        }

        private const int ACTCTX_FLAG_ASSEMBLY_DIRECTORY_VALID = 0x004;
        private const int ACTCTX_FLAG_SET_PROCESS_DEFAULT = 0x00000010;
        private IntPtr m_hActCtx = (IntPtr)0;
        public const UInt32 ERROR_SXS_PROCESS_DEFAULT_ALREADY_SET = 14011;

        /// <summary>
        /// Explicitly load a manifest and create the process-default activation
        /// context. It takes effect immediately and stays there until the process exits.
        /// </summary>
        static public void CreateActivationContext()
        {
            string rootFolder = AppDomain.CurrentDomain.BaseDirectory;
            string manifestPath = Path.Combine(rootFolder, "webapp.manifest");
            UInt32 dwError = 0;

            // Build the activation context information structure
            ACTCTX info = new ACTCTX();
            info.cbSize = Marshal.SizeOf(typeof(ACTCTX));
            info.dwFlags = ACTCTX_FLAG_SET_PROCESS_DEFAULT;
            info.lpSource = manifestPath;
            if (null != rootFolder && "" != rootFolder)
            {
                info.lpAssemblyDirectory = rootFolder;
                info.dwFlags |= ACTCTX_FLAG_ASSEMBLY_DIRECTORY_VALID;
            }

            dwError = 0;

            // Create the activation context
            IntPtr result = CreateActCtx(ref info);
            if (-1 == result.ToInt32())
            {
                dwError = (UInt32)Marshal.GetLastWin32Error();
            }

            if (-1 == result.ToInt32() &&
                ActivationContext.ERROR_SXS_PROCESS_DEFAULT_ALREADY_SET != dwError)
            {
                string err = string.Format(
                    "Cannot create process-default win32 sxs context, error={0} manifest={1}",
                    dwError, manifestPath);
                ApplicationException ex = new ApplicationException(err);
                throw ex;
            }
        }
    }
}

5. Add a folder named synchronization.assemblies to the Web Role/Worker Role project and add the following files to the folder.

 Microsoft.Synchronization.dll
Microsoft.Synchronization.Data.dll
Microsoft.Synchronization.Data.SqlServer.dll
Synchronization21.dll

 Create a file named synchronization.assemblies.manifest, add the following content, and add the file to this folder.

<?xml version='1.0' encoding='UTF-8' standalone='yes'?>
<assembly xmlns='urn:schemas-microsoft-com:asm.v1' manifestVersion='1.0'>
  <assemblyIdentity
      type="x64"
      name="synchronization.assemblies"
      version="2.1.0.0"/>
  <file name="synchronization21.dll">
    <comClass clsid="{EC413D66-6221-4ebb-AC55-4900FB321011}"
              threadingModel="Both"/>
  </file>
</assembly>

6. Multi-select all files under the synchronization.assemblies folder, right-click, and then click Properties. Set the Build Action property to Content and Copy To Output Directory to Copy Always.

7. Create a file named webapp.manifest, add the following content, and add the file to the Web Role project.

<?xml version='1.0' encoding='UTF-8' standalone='yes'?>
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
  <assemblyIdentity name="webapp" version="8.0.0.0" type="x64"/>
  <dependency>
    <dependentAssembly>
      <assemblyIdentity name="synchronization.assemblies" version="2.1.0.0" type="x64"/>
    </dependentAssembly>
  </dependency>
</assembly>

8. Set the Build Action property to Content and Copy To Output Directory to Copy Always for the webapp.manifest file using the Properties window.

9. Add the following statement to the OnStart method, before the base.OnStart call, in the WebRole.cs (or WorkerRole.cs) file.

 Microsoft.Samples.Synchronization.ActivationContext.CreateActivationContext();
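
For context, here is a minimal sketch of how that call sits in the role's OnStart method. The surrounding class is just the standard Visual Studio role template and is shown only for illustration:

public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // Register the Sync Framework COM components described in webapp.manifest
        // before any synchronization code runs in this role instance.
        Microsoft.Samples.Synchronization.ActivationContext.CreateActivationContext();

        return base.OnStart();
    }
}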

SQL query to truncate all tables in a database in SQL/SQL Azure

September 26, 2011

How to delete or truncate all the data from all the tables in a database

To clean up a database or switch to a new one, we may need to truncate or delete all tables in the database. It is not easy to select and delete every table manually, because the database may contain any number of tables. We can get all tables in a database from sys.sysobjects and apply a delete or truncate script to each table from a single cursor.

SQL script to delete/truncate all tables in a database in SQL/SQL Azure

The following script truncates all tables in the selected database. If you want to delete rather than truncate, edit the query to build a DELETE statement instead of a TRUNCATE statement (for example, 'delete from ' + @DBObjName).

 DECLARE @GetObjectName AS CURSOR 
DECLARE @StringVal AS nvarchar(max) 
DECLARE @DBObjName AS nvarchar(100)
DECLARE @Type as Varchar(2)

SET @GetObjectName = CURSOR FOR 
Select type,Name from sys.sysobjects where type in('U')
OPEN @GetObjectName 
FETCH NEXT 
FROM @GetObjectName INTO @Type,@DBObjName

WHILE @@FETCH_STATUS = 0 
BEGIN
     Set @StringVal = 'truncate table ' + @DBObjName
       exec (@StringVal)           
FETCH NEXT 
FROM @GetObjectName INTO @Type,@DBObjName
END 
CLOSE @GetObjectName 
DEALLOCATE @GetObjectName

 How to drop all stored procedures (sps) and functions from a database

The following script drops all procedures and user-defined functions from a database.

 DECLARE @GetObjectName AS CURSOR 
DECLARE @StringVal AS nvarchar(max) 
DECLARE @DBObjName AS nvarchar(100)
DECLARE @Type as Varchar(2)

SET @GetObjectName = CURSOR FOR 
Select type,Name from sys.sysobjects where type in('P','FN','TF')
AND Name not in ('DBM_EnableDisableAllTableConstraints')
OPEN @GetObjectName 
FETCH NEXT 
FROM @GetObjectName INTO @Type,@DBObjName

WHILE @@FETCH_STATUS = 0 
BEGIN

      IF @Type='P'
      BEGIN
            Set @StringVal = 'Drop Procedure ' + @DBObjName
            exec (@StringVal)
      END
      ELSE IF @Type='FN' OR @Type='TF'
      BEGIN
            Set @StringVal = 'Drop Function ' + @DBObjName
            exec (@StringVal)
      END

FETCH NEXT 
FROM @GetObjectName INTO @Type,@DBObjName
END 
CLOSE @GetObjectName 
DEALLOCATE @GetObjectName
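
These maintenance scripts can also be run from application code. The following is only a rough C# sketch (the connection string and the script text are placeholders, not values from this post) showing how such a dynamic-SQL batch can be executed against SQL/SQL Azure with plain ADO.NET:

using System.Data.SqlClient;

class MaintenanceScriptRunner
{
    static void Main()
    {
        // Placeholder connection string; replace with your own SQL Azure server and credentials.
        string connectionString =
            "Server=tcp:yourserver.database.windows.net;Database=yourdb;" +
            "User ID=youruser;Password=yourpassword;Encrypt=True;";

        // Paste the full cursor script from above here; a single statement is shown for brevity.
        string script = "truncate table MyTable";

        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand(script, connection))
        {
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}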

How to find the size and cost of a SQL Azure database

September 15, 2011

SQL query to find the cost and size of a SQL Azure database

SQL Azure is a paid service billed according to database usage, so we should have a clear idea of how much data our SQL Azure database holds; only then can we use it cost-effectively. It is therefore worth checking the size and cost of the database regularly.

Query to find the size and cost of the entire SQL Azure database

The following query returns the cost per byte of the SQL Azure database, based on its current size and edition. The latest Windows Azure pricing can be seen here: http://www.microsoft.com/windowsazure/pricing/

The price tiers used in the query (encoded in the CASE expression below) are the 2011 monthly rates: Web Edition $9.99 up to 1 GB and $49.95 up to 5 GB; Business Edition $99.99 per 10 GB, in steps up to 50 GB.

DECLARE @SizeInBytes bigint
SELECT @SizeInBytes =
(SUM(reserved_page_count) * 8192)
    FROM sys.dm_db_partition_stats

DECLARE @Edition sql_variant
SELECT  @Edition =DATABASEPROPERTYEX ( DB_Name() , 'Edition' )

SELECT    (CASE
    WHEN @SizeInBytes/1073741824.0 < 1 THEN
(CASE @Edition WHEN 'Web' THEN 9.99 ELSE 99.99 END)
    WHEN @SizeInBytes/1073741824.0 < 5 THEN
(CASE @Edition WHEN 'Web' THEN 49.95 ELSE 99.99 END)
    WHEN @SizeInBytes/1073741824.0 < 10 THEN 99.99 
    WHEN @SizeInBytes/1073741824.0 < 20 THEN 199.98
    WHEN @SizeInBytes/1073741824.0 < 30 THEN 299.97            
    WHEN @SizeInBytes/1073741824.0 < 40 THEN 399.96             
    WHEN @SizeInBytes/1073741824.0 < 50 THEN 499.95            
         END)  / @SizeInBytes
 

Query to find the size and cost of SQL Azure database indexes

The following query returns the size and cost of the SQL Azure database's non-clustered indexes.

 DECLARE @SizeInBytes bigint
SELECT @SizeInBytes =
(SUM(reserved_page_count) * 8192)
    FROM sys.dm_db_partition_stats

DECLARE @Edition sql_variant
SELECT  @Edition =DATABASEPROPERTYEX ( DB_Name() , 'Edition' )

DECLARE @CostPerByte float

SELECT    @CostPerByte = (CASE
    WHEN @SizeInBytes/1073741824.0 < 1 THEN
(CASE @Edition WHEN 'Web' THEN 9.99 ELSE 99.99 END)
    WHEN @SizeInBytes/1073741824.0 < 5 THEN
(CASE @Edition WHEN 'Web' THEN 49.95 ELSE 99.99 END)
    WHEN @SizeInBytes/1073741824.0 < 10 THEN 99.99 
    WHEN @SizeInBytes/1073741824.0 < 20 THEN 199.98
    WHEN @SizeInBytes/1073741824.0 < 30 THEN 299.97            
    WHEN @SizeInBytes/1073741824.0 < 40 THEN 399.96             
    WHEN @SizeInBytes/1073741824.0 < 50 THEN 499.95            
         END)  / @SizeInBytes

SELECT idx.name,
SUM(reserved_page_count) * 8192 'bytes',
      (SUM(reserved_page_count) * 8192) * @CostPerByte 'cost'
FROM sys.dm_db_partition_stats AS ps
    INNER JOIN sys.indexes AS idx ON
idx.object_id = ps.object_id AND idx.index_id = ps.index_id
WHERE type_desc = 'NONCLUSTERED'
GROUP BY idx.name
ORDER BY 3 DESC
 

Query to find the size and cost of each table in a SQL Azure database

The following query returns the monthly cost per row for every table in the database.

DECLARE @SizeInBytes bigint
SELECT @SizeInBytes =
(SUM(reserved_page_count) * 8192)
    FROM sys.dm_db_partition_stats
DECLARE @Edition sql_variant
SELECT  @Edition =DATABASEPROPERTYEX ( DB_Name() , 'Edition' )
DECLARE @CostPerByte float
SELECT    @CostPerByte = (CASE
    WHEN @SizeInBytes/1073741824.0 < 1 THEN
(CASE @Edition WHEN 'Web' THEN 9.99 ELSE 99.99 END)
    WHEN @SizeInBytes/1073741824.0 < 5 THEN
(CASE @Edition WHEN 'Web' THEN 49.95 ELSE 99.99 END)
    WHEN @SizeInBytes/1073741824.0 < 10 THEN 99.99 
    WHEN @SizeInBytes/1073741824.0 < 20 THEN 199.98
    WHEN @SizeInBytes/1073741824.0 < 30 THEN 299.97            
    WHEN @SizeInBytes/1073741824.0 < 40 THEN 399.96             
    WHEN @SizeInBytes/1073741824.0 < 50 THEN 499.95            
         END)  / @SizeInBytes
SELECT    
      sys.objects.name,
      sum(reserved_page_count) * 8192 'Bytes',
      row_count 'Row Count', 
      (CASE row_count WHEN 0 THEN 0 ELSE
       (sum(reserved_page_count) * 8192)/ row_count END)
        'Bytes Per Row',
      (CASE row_count WHEN 0 THEN 0 ELSE
       ((sum(reserved_page_count) * 8192)/ row_count)
        * @CostPerByte END)
        'Monthly Cost Per Row'
FROM    
      sys.dm_db_partition_stats, sys.objects
WHERE    
      sys.dm_db_partition_stats.object_id = sys.objects.object_id
GROUP BY sys.objects.name, row_count
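
Because the size should be checked regularly, here is a small C# sketch (the connection string is a placeholder, not a real account) that reads the current database size using the same sys.dm_db_partition_stats view, so the check can be automated from a worker role or a scheduled job:

using System;
using System.Data.SqlClient;

class DatabaseSizeCheck
{
    static void Main()
    {
        // Placeholder connection string; replace with your SQL Azure server and credentials.
        string connectionString =
            "Server=tcp:yourserver.database.windows.net;Database=yourdb;" +
            "User ID=youruser;Password=yourpassword;Encrypt=True;";

        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand(
            "SELECT SUM(reserved_page_count) * 8192 FROM sys.dm_db_partition_stats",
            connection))
        {
            connection.Open();

            // Total reserved size of the database in bytes.
            long sizeInBytes = Convert.ToInt64(command.ExecuteScalar());
            Console.WriteLine("Database size: {0:F2} GB", sizeInBytes / 1073741824.0);
        }
    }
}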

How to upload a large file/blob to Azure storage using ASP.Net, C#

June 28, 2011

Tool to upload local files to Azure storage

We have already demonstrated an application for uploading files to Azure storage with a simple C# application in one of our previous posts. Using that application we can upload files from a local folder to Azure storage very easily, and we can see an error log for each file uploaded. It works fine up to a certain file size (we verified with 1 GB). But when we tried to upload files larger than 3 GB using the same code in one of our live projects, we got an error in the middle of uploading to Azure storage. So we worked out another solution for transferring very large files to and from an Azure storage blob.

How to upload large files to Azure storage using C#, ASP.Net

When we try to move very large files between the local disk and an Azure storage blob, we get a timeout-related error. To resolve this, we split the large file into small packages, transfer each package separately, and reassemble the file once the transfer has completed. This block-by-block, parallel transfer is shown in the following code; the sample method (ParallelDownloadToFile) moves a blob in chunks, and the same chunking idea lets us handle very huge files between an ASP.Net/C# application and Azure blob storage.

public void ParallelDownloadToFile(CloudBlockBlob blob,
    string fileName, int maxBlockSize)
{
    try
    {
        // refresh the values
        blob.FetchAttributes();
        long fileSize = blob.Attributes.Properties.Length;
        var filePath = Path.GetDirectoryName(fileName);
        var fileNameWithoutPath = Path.GetFileNameWithoutExtension(fileName);

        // let's figure out how big the file is here
        long leftToRead = fileSize;
        int startPosition = 0;

        // have 1 block for every maxBlockSize bytes plus 1 for the remainder
        var blockCount =
            ((int)Math.Floor((double)(fileSize / maxBlockSize))) + 1;

        // setup the control array
        BlockTransferDetail[] transferDetails =
            new BlockTransferDetail[blockCount];

        // create an array of block keys
        string[] blockKeys = new string[blockCount];
        var blockIds = new List<string>();

        // populate the control array...
        for (int j = 0; j < transferDetails.Length; j++)
        {
            int toRead = (int)(maxBlockSize < leftToRead ?
                maxBlockSize :
                leftToRead);

            string blockId = Path.Combine(filePath,
                string.Format("{0}_{1}.dat",
                    fileNameWithoutPath,
                    j.ToString("00000000000")));

            if (startPosition < 0)
                startPosition = startPosition * -1;
            if (toRead < 0)
                toRead = toRead * -1;

            transferDetails[j] = new BlockTransferDetail()
            {
                StartPosition = startPosition,
                BytesToRead = toRead,
                BlockId = blockId
            };

            if (toRead > 0)
            {
                blockIds.Add(blockId);
            }

            // increment the starting position
            startPosition += toRead;
            leftToRead -= toRead;
        }

        // now we do a parallel download of the file, one block per iteration.
        var result = Parallel.For(0, transferDetails.Length, j =>
        {
            // get the blob as a stream
            try
            {
                using (BlobStream stream = blob.OpenRead())
                {
                    Thread.Sleep(10000);
                    stream.Seek(transferDetails[j].StartPosition, SeekOrigin.Begin);

                    // setup a buffer with the proper size
                    byte[] buff = new byte[transferDetails[j].BytesToRead];

                    // read into the buffer
                    stream.Read(buff, 0, transferDetails[j].BytesToRead);

                    using (Stream fileStream = new FileStream(transferDetails[j].BlockId,
                        FileMode.Create, FileAccess.Write, FileShare.None))
                    {
                        using (BinaryWriter bw = new BinaryWriter(fileStream))
                        {
                            bw.Write(buff);
                            bw.Close();
                        }
                    }
                    buff = null;
                }
            }
            catch (Exception)
            {
                throw;
            }
        });

        // assemble the file into one now...
        using (Stream fileStream = new FileStream(fileName,
            FileMode.Append, FileAccess.Write, FileShare.None))
        {
            using (BinaryWriter bw = new BinaryWriter(fileStream))
            {
                // loop through each of the files on the disk
                for (int j = 0; j < transferDetails.Length; j++)
                {
                    // read them into the file (append)
                    bw.Write(File.ReadAllBytes(transferDetails[j].BlockId));

                    // and then delete them
                    File.Delete(transferDetails[j].BlockId);
                }
            }
        }

        transferDetails = null;
    }
    catch (Exception)
    {
        throw;
    }
}

public static class BlobExtensions
{
    static readonly string DFDriveEnvVarName = "AZURE_DRIVE_DEV_PATH";
    static readonly string containername =
        RoleEnvironment.GetConfigurationSettingValue("Container")
            .ToLowerInvariant();

    public static bool Exists(this CloudBlob blob, string folderName)
    {
        try
        {
            // the FetchAttributes test doesn't work in the development fabric,
            // so check the local development storage folder directly
            if (RoleEnvironment.DeploymentId.ToLowerInvariant().StartsWith("deployment("))
            {
                string path = Environment.GetEnvironmentVariable(DFDriveEnvVarName);
                path += "\\devstoreaccount1\\";
                path += containername;
                path += "\\";
                path += folderName;
                //path += blob.Uri.Segments.Last();
                if (Directory.Exists(path))
                    return true;
                else
                    return false;
            }
            else
            {
                blob.FetchAttributes();
                return true;
            }
        }
        catch (StorageClientException e)
        {
            if (e.ErrorCode == StorageErrorCode.ResourceNotFound)
            {
                return false;
            }
            else
            {
                throw;
            }
        }
    }
}

Using the code above we can easily move very large files between a local folder and Azure blob storage. What the function does is split the stream into separate byte packets and transfer these small pieces one at a time, so no single request runs long enough to hit a timeout. The function works well and is implemented in one of our projects; we have handled files larger than 2 GB (and the approach should work for even larger files) in our ASP.Net MVC application.
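
For the upload direction specifically, the same chunking idea can be expressed with the block blob API of the storage client library. The following is only a rough sketch, not the code used in our project: it reads the local file block by block, uploads each block with PutBlock, and finally commits the block list with PutBlockList (the method name and block-id scheme are illustrative).

// Requires Microsoft.WindowsAzure.StorageClient, the same library used above.
public void ChunkedUploadFromFile(CloudBlockBlob blob, string fileName, int maxBlockSize)
{
    var blockIds = new List<string>();

    using (FileStream fileStream = File.OpenRead(fileName))
    {
        byte[] buffer = new byte[maxBlockSize];
        int blockNumber = 0;
        int bytesRead;

        // Read the local file block by block and push each block to the blob.
        while ((bytesRead = fileStream.Read(buffer, 0, buffer.Length)) > 0)
        {
            // Block ids must be base64 strings of equal length.
            string blockId = Convert.ToBase64String(BitConverter.GetBytes(blockNumber));

            using (var blockData = new MemoryStream(buffer, 0, bytesRead))
            {
                blob.PutBlock(blockId, blockData, null);
            }

            blockIds.Add(blockId);
            blockNumber++;
        }
    }

    // Commit the uploaded blocks as a single blob.
    blob.PutBlockList(blockIds);
}

Each block is an independent, retriable request, which is why the timeout seen with a single large UploadFromStream call does not occur.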

How to upload a blob to Azure using ASP.Net / Tool to upload files to Azure using ASP.Net, C#

June 24, 2011

Simple way to upload files to an Azure blob

When we are building a Windows Azure application in ASP.Net, we need to communicate with blobs in Azure from our code. Here we demonstrate how to upload files from the local system to Windows Azure storage. The application lets the user upload all files in a mapped local folder to a blob container in Azure. It is a useful tool for working with Windows Azure storage when uploading files.

Requirements for uploading files to an Azure blob

To upload files from a local folder to Azure we need an Azure storage account name and account key. We can purchase these from Microsoft.

Steps to upload files to Azure storage

  1. Create a simple Windows application in Visual Studio: File >> New Project >> Visual C# >> Windows >> Windows Forms Application.
  2. Add a reference to Microsoft.WindowsAzure.StorageClient.dll, which is available after installing the Windows Azure SDK.
  3. Drag a Button (for upload), a TextBox (to display status) and a ListBox (to list files with an error or success message) onto the form.
  4. In the Button Click handler we upload the files from the local folder (here C:\myfiles; this can be configured through a browse option).

UI Design for the upload file to Azure Storage application

C# Source Code for Upload files to Azure Storage  

private void button1_Click(object sender, EventArgs e)
{
    if (txtFolderName.Text == "")
    {
        MessageBox.Show("Please select a folder to upload files");
        return;
    }
    if (MessageBox.Show("Do you want to Upload files to Blob?",
        "Confirm Upload", MessageBoxButtons.YesNo) == DialogResult.Yes)
    {
        lblHeader.Visible = false;
        txtStatusMessage.Text = "Upload Starting...";
        lstBxFiles.Items.Clear();
        UploadFileIntoBlop(txtFolderName.Text.Trim());
        lblHeader.Visible = true;
    }
}

public void UploadFileIntoBlop(string fileFullPath)
{
    try
    {
        string dataCenterSettingKey =
            "DefaultEndpointsProtocol=" + "http" + ";AccountName="
            + "YourAccountNameHere" + ";AccountKey=" +
            "YourAccountKeyHere";
        string filepath = "";
        CloudStorageAccount storageAccount =
            CloudStorageAccount.Parse(dataCenterSettingKey);
        CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
        string csvfileblob = "migrationfiles";
        CloudBlobContainer blobContainer =
            blobClient.GetContainerReference(csvfileblob);
        bool isError = false;
        foreach (string file in Directory.GetFiles(fileFullPath))
        {
            string[] filename = file.Split('\\');
            string theFile = filename[filename.Length - 1];
            try
            {
                DateTime startDate = DateTime.Now;
                filepath = file.Trim();
                FileStream fileContent = ReadFile(filepath);
                var blob = blobContainer.GetBlobReference(theFile);
                blob.UploadFromStream(fileContent);
                TimeSpan difference = DateTime.Now.Subtract(startDate);
                int totalMinutes = (int)Math.Ceiling(difference.TotalMinutes);
                lstBxFiles.Items.Add(theFile +
                    " _Success (" + totalMinutes + " min)");
            }
            catch (Exception)
            {
                isError = true;
                lstBxFiles.Items.Add(theFile + " _Error");
            }
        }
        if (isError)
            txtStatusMessage.Text = "Process completed with errors";
        else
            txtStatusMessage.Text = "Upload done successfully";
    }
    catch (Exception ex)
    {
        txtStatusMessage.Text = "Error occurred : " + ex.Message;
    }
}

public FileStream ReadFile(string fileFullPath)
{
    FileStream fileStream = null;
    if (System.IO.File.Exists(fileFullPath))
    {
        fileStream = new FileStream(fileFullPath,
            FileMode.Open, FileAccess.Read);
    }
    return fileStream;
}

private void btnBrowse_Click(object sender, EventArgs e)
{
    DialogResult result = this.folderBrowserDialog1.ShowDialog();
    if (result == DialogResult.OK)
    {
        txtFolderName.Text = this.folderBrowserDialog1.SelectedPath;
    }
}

Run the application and press the Upload button. In the button click event we call the UploadFileIntoBlop function, which takes all files under the local folder (the folder path is passed as a parameter) and uploads them to a container (here, migrationfiles). While uploading to the blob, every file name under that local folder is added to the list box together with its upload status. We can use this application as a tool for uploading files from a local folder to Azure blob storage.

How to sync a large database with Sync Framework

May 11, 2011

Using Sync Framework 2.1 we can easily sync databases on a schedule, at some chosen interval. In one of our projects we had to let the client schedule frequent syncs, and we implemented this with Sync Framework 2.1 in an ASP.Net MVC application. You can go through the main sync code here.

It worked fine until we tested with a large database (greater than 10 GB). With a database of that size, we got an error in the middle of the sync process.

The error was "There is not enough space on the disk". We increased the size of the target database to 20 GB (the source database is 10 GB) but got the same error. We searched on Google and could not find much help for the issue. We kept digging, and after a few days we found the root cause: it was not related to Sync Framework at all. While the sync process is running, a log file is written in the background to the location "C:\Resources\temp\7aad4070ce51495c82cde6b1d410aa65.WorkerRole1\RoleTemp" on the worker role's virtual machine (WaHostBootstrapper). The size of this log file grows continuously, and there is a limit on its size (normally 60 MB). Syncing a large database obviously takes a long time, so the log file grew until it exceeded the maximum size and the process crashed. We eventually found the solution, and now our application can sync a large database without any error. The solution is given below.

1. We need to extend the maximum size of the log file. We can achieve this with the following entry in ServiceDefinition.csdef.

<LocalResources>
  <LocalStorage name="localStoreOne" sizeInMB="20480" cleanOnRoleRecycle="true" />
</LocalResources>

name (localStoreOne): the name of your local storage resource.

sizeInMB: the maximum size you want to allow for the log file.

cleanOnRoleRecycle: when set to true, the log file is deleted and recreated each time the worker role recycles.

2. In the OnStart() method of the worker role we need to map the temp folder using the following code.

string customTempLocalResourcePath =
    RoleEnvironment.GetLocalResource("localStoreOne").RootPath;
Environment.SetEnvironmentVariable("TMP", customTempLocalResourcePath);
Environment.SetEnvironmentVariable("TEMP", customTempLocalResourcePath);

Then we can see that the log file is archived partway through the sync once it reaches a certain size, and it can now grow up to the size specified in the ServiceDefinition file. It is better to set cleanOnRoleRecycle to true, so the log files are automatically deleted and recreated whenever the worker role restarts.
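
For clarity, here is a minimal sketch of how those two environment-variable lines sit inside the worker role's OnStart method (assuming the local resource is named localStoreOne as above):

public override bool OnStart()
{
    // Redirect the role's TMP/TEMP folders to the larger local resource
    // declared in ServiceDefinition.csdef, so the sync log file has room to grow.
    string customTempLocalResourcePath =
        RoleEnvironment.GetLocalResource("localStoreOne").RootPath;
    Environment.SetEnvironmentVariable("TMP", customTempLocalResourcePath);
    Environment.SetEnvironmentVariable("TEMP", customTempLocalResourcePath);

    return base.OnStart();
}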

Compress and extract files using GZipStream

May 4, 2011

I faced a situation where I had to download large blob files from the Azure server to a local folder. It was easy to download a file from the Azure server, but when the file was larger than about 2 GB the download failed partway through. When I hit this issue I searched Google and did not quickly find a feasible solution, which is why I am posting the compress and decompress code here.

using System;
using System.IO;
using System.IO.Compression;


Compress files ….

public static void Compress(FileInfo fi)
{
    // Get the stream of the source file.
    using (FileStream inFile = fi.OpenRead())
    {
        // Prevent compressing hidden and
        // already compressed files.
        if ((File.GetAttributes(fi.FullName)
            & FileAttributes.Hidden)
            != FileAttributes.Hidden & fi.Extension != ".gz")
        {
            // Create the compressed file.
            using (FileStream outFile =
                File.Create(fi.FullName + ".gz"))
            {
                using (GZipStream Compress =
                    new GZipStream(outFile,
                    CompressionMode.Compress))
                {
                    // Copy the source file into
                    // the compression stream.
                    inFile.CopyTo(Compress);
                    Console.WriteLine("Compressed {0} from {1} to {2} bytes.",
                        fi.Name, fi.Length.ToString(), outFile.Length.ToString());
                }
            }
        }
    }
}

 

 

Decompress compressed file..

public void Decompress(FileInfo fi)
{
    // Get the stream of the source file.
    using (FileStream inFile = fi.OpenRead())
    {
        // Get original file name without the ".gz" extension,
        // for example "report.doc" from report.doc.gz.
        string origName = fi.FullName.Remove(fi.FullName.Length -
            fi.Extension.Length);

        // Create the decompressed file.
        using (FileStream outFile = File.Create(origName))
        {
            using (GZipStream Decompress = new GZipStream(inFile,
                CompressionMode.Decompress))
            {
                // Copy the decompression stream
                // into the output file.
                Decompress.CopyTo(outFile);
                Console.WriteLine("Decompressed: {0}", fi.Name);
            }
        }
    }
}
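
A minimal usage sketch tying the two methods together (the file path is illustrative, and both methods are assumed to live in the same class):

// Compress a downloaded file, then restore it again.
FileInfo sourceFile = new FileInfo(@"C:\myfiles\report.doc");
Compress(sourceFile);                                   // creates C:\myfiles\report.doc.gz
Decompress(new FileInfo(@"C:\myfiles\report.doc.gz"));  // recreates C:\myfiles\report.doc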
