
How to upload a large file/blob to Azure storage using ASP.NET, C#


A tool to upload local files into Azure storage.

We have already demonstrated an application to upload files into Azure storage with a simple C# application in one of our previous posts. Using that application we can upload files from a local folder into Azure storage very easily, and we can see an error log for each file upload. It works fine up to a certain file size (we checked with 1 GB). But when we tried to upload files larger than 3 GB using the same code in one of our live projects, we got an error in the middle of uploading the files into Azure storage. So we worked out another solution for uploading very large files into an Azure storage blob.

How to upload large files into Azure storage using C#, ASP.NET

When we try to upload very large local files to an Azure storage blob, we get a timeout-related error. To resolve this issue, we need to split the large file into small blocks, transfer those blocks, and then, once the transfer has completed successfully, reassemble the file. This parallel block transfer can be achieved with the following code, and it lets us handle very large files between Azure blob storage and a web page in ASP.NET, C#.

public void ParallelDownloadToFile(CloudBlockBlob blob,
    string fileName, int maxBlockSize)
{
    try
    {
        // refresh the blob attributes so the length is current
        blob.FetchAttributes();
        long fileSize = blob.Attributes.Properties.Length;
        var filePath = Path.GetDirectoryName(fileName);
        var fileNameWithoutPath = Path.GetFileNameWithoutExtension(fileName);

        // let's figure out how big the file is here
        long leftToRead = fileSize;
        long startPosition = 0;

        // have 1 block for every maxBlockSize bytes plus 1 for the remainder
        var blockCount = (int)(fileSize / maxBlockSize) + 1;

        // setup the control array and the list of block ids
        BlockTransferDetail[] transferDetails =
            new BlockTransferDetail[blockCount];
        var blockIds = new List<string>();

        // populate the control array...
        for (int j = 0; j < transferDetails.Length; j++)
        {
            int toRead = (int)(maxBlockSize < leftToRead ?
                maxBlockSize :
                leftToRead);

            // temporary file that will hold this block on disk
            string blockId = Path.Combine(filePath,
                string.Format("{0}_{1}.dat",
                    fileNameWithoutPath,
                    j.ToString("00000000000")));

            transferDetails[j] = new BlockTransferDetail()
            {
                StartPosition = startPosition,
                BytesToRead = toRead,
                BlockId = blockId
            };

            if (toRead > 0)
            {
                blockIds.Add(blockId);
            }

            // increment the starting position
            startPosition += toRead;
            leftToRead -= toRead;
        }

        // now we transfer the blocks in parallel
        var result = Parallel.For(0, transferDetails.Length, j =>
        {
            // get the blob as a stream
            try
            {
                using (BlobStream stream = blob.OpenRead())
                {
                    Thread.Sleep(10000);
                    stream.Seek(transferDetails[j].StartPosition, SeekOrigin.Begin);

                    // setup a buffer with the proper size
                    byte[] buff = new byte[transferDetails[j].BytesToRead];

                    // read into the buffer
                    stream.Read(buff, 0, transferDetails[j].BytesToRead);

                    // write this block to its temporary file on disk
                    using (Stream fileStream = new FileStream(transferDetails[j].BlockId,
                        FileMode.Create, FileAccess.Write, FileShare.None))
                    using (BinaryWriter bw = new BinaryWriter(fileStream))
                    {
                        bw.Write(buff);
                    }
                }
            }
            catch (Exception)
            {
                throw;
            }
        });

        // assemble the file into one now...
        using (Stream fileStream = new FileStream(fileName,
            FileMode.Append, FileAccess.Write, FileShare.None))
        using (BinaryWriter bw = new BinaryWriter(fileStream))
        {
            // loop through each of the block files on the disk
            for (int j = 0; j < transferDetails.Length; j++)
            {
                // read them into the file (append)
                bw.Write(File.ReadAllBytes(transferDetails[j].BlockId));

                // and then delete them
                File.Delete(transferDetails[j].BlockId);
            }
        }
    }
    catch (Exception)
    {
        throw;
    }
}
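For reference, this is how the method above might be called. It is only an illustrative sketch: the configuration setting name, container name, blob name, local path and the 50 MB block size below are placeholders, not values from the original project.

// illustrative usage only - all names below are placeholders
CloudStorageAccount account = CloudStorageAccount.Parse(
    RoleEnvironment.GetConfigurationSettingValue("DataConnectionString"));
CloudBlobClient blobClient = account.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");
CloudBlockBlob blob = container.GetBlockBlobReference("bigfile.zip");

// 52428800 bytes = 50 MB per block (see the discussion in the comments below)
ParallelDownloadToFile(blob, @"C:\temp\bigfile.zip", 52428800);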

public static class BlobExtensions
{
    static readonly string DFDriveEnvVarName = "AZURE_DRIVE_DEV_PATH";
    static readonly string containername =
        RoleEnvironment.GetConfigurationSettingValue("Container")
            .ToLowerInvariant();

    public static bool Exists(this CloudBlob blob, string folderName)
    {
        try
        {
            // the FetchAttributes test doesn't work in df (development fabric),
            // so check the development storage folder on disk instead
            if (RoleEnvironment.DeploymentId.ToLowerInvariant().StartsWith("deployment("))
            {
                string path = Environment.GetEnvironmentVariable(DFDriveEnvVarName);
                path += "\\devstoreaccount1\\";
                path += containername;
                path += "\\";
                path += folderName;
                //path += blob.Uri.Segments.Last();
                return Directory.Exists(path);
            }
            else
            {
                // FetchAttributes throws if the blob does not exist
                blob.FetchAttributes();
                return true;
            }
        }
        catch (StorageClientException e)
        {
            if (e.ErrorCode == StorageErrorCode.ResourceNotFound)
            {
                return false;
            }
            else
            {
                throw;
            }
        }
    }
}
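And a quick sketch of how the Exists() extension above might be used, assuming a CloudBlobContainer obtained as in the earlier sketch; the folder and blob names here are examples only.

// illustrative usage only
CloudBlob blobRef = container.GetBlobReference("uploads/bigfile.zip");
if (blobRef.Exists("uploads"))
{
    // the blob (or its dev-storage folder) exists, so it is safe to read it
}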

By using the above code we can easily handle very large files between a local folder and Azure blob storage. What the function does is split the file stream into separate byte blocks and transfer these small pieces in parallel, so no timeout error is generated. The above function works well and we have implemented it in one of our projects as well. We were able to handle large files (checked with files larger than 2 GB; it should work for much larger files too) in our ASP.NET MVC application.
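The same block-splitting idea applies in the upload direction using PutBlock and PutBlockList (the API mentioned in the comments below). The following is only a rough sketch under that assumption: the method name UploadFromFileInBlocks and the block-numbering scheme are illustrative, not the code from our project, and the blocks could also be pushed in parallel in the same way as above.

// Rough upload-side sketch (illustrative only, not the original project code).
// Assumes the classic StorageClient API: CloudBlockBlob.PutBlock(blockId, stream, contentMD5)
// and CloudBlockBlob.PutBlockList(blockIds).
public void UploadFromFileInBlocks(CloudBlockBlob blob, string fileName, int maxBlockSize)
{
    var blockIds = new List<string>();

    using (FileStream fs = File.OpenRead(fileName))
    {
        long leftToRead = fs.Length;
        int blockNumber = 0;

        while (leftToRead > 0)
        {
            // read one block from the local file
            int toRead = (int)Math.Min((long)maxBlockSize, leftToRead);
            byte[] buff = new byte[toRead];
            fs.Read(buff, 0, toRead);

            // block ids must be base64 strings of equal length within one blob
            string blockId = Convert.ToBase64String(
                Encoding.UTF8.GetBytes(blockNumber.ToString("00000000000")));

            // upload this block; null means no MD5 check for the block
            blob.PutBlock(blockId, new MemoryStream(buff), null);
            blockIds.Add(blockId);

            blockNumber++;
            leftToRead -= toRead;
        }
    }

    // commit the uploaded blocks as the final blob
    blob.PutBlockList(blockIds);
}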

  1. Devendra
    December 1, 2012 at 6:37 pm

    I am getting the following error:

    The type or namespace name ‘BlockTransferDetail’ could not be found (are you missing a using directive or an assembly reference?)

    Which namespace contains the definition for BlockTransferDetail?

  2. December 3, 2012 at 11:20 am

    Hi Devendra,

    Sorry for the late reply and the incomplete post.
    BlockTransferDetail is a class created by us; I am including that class below.

    public class BlockTransferDetail
    {
        public long StartPosition { get; set; }
        public int BytesToRead { get; set; }
        public string BlockId { get; set; }
    }

  3. Devendra
    December 3, 2012 at 1:39 pm

    Thanks Tuvian,

    Actually I want to upload a large file (about 3-5 GB) with a resume feature.
    I have been searching for this for the last four days, and I am running short on time.
    I found this thread of yours and am working on it to fulfil my needs.

    If you have some more suggestions/links for my problem, then please let me know.

    Thanks again
    Devendra

  4. Devendra
    December 3, 2012 at 5:49 pm

    tuvian :
    Hi Devendra,
    Sorry for the late reply and the incomplete post.
    BlockTransferDetail is a class created by us; I am including that class below.
    public class BlockTransferDetail
    {
        public long StartPosition { get; set; }
        public int BytesToRead { get; set; }
        public string BlockId { get; set; }
    }

    I am getting the following error when it calls blob.PutBlock(transferDetails[j].BlockId, new MemoryStream(buff), convertedHash, options);

    Error: Unable to write data to the transport connection: An existing connection was forcibly closed by the remote host.

    Can you please tell me what value should be used for “maxBlockSize”?

  5. December 4, 2012 at 9:48 am

    Hi Devendra,

    I will post the detailed source code for this later today.
    Please post your email ID as well.
    thanks,

  6. Devendra
    December 4, 2012 at 12:09 pm

    tuvian :
    Hi Devendra,
    I will post the detailed source code for this later today.
    Please post your email ID as well.
    thanks,

    My mail id is – devendra_sipl@systematixindia.com

  7. December 5, 2012 at 10:30 am

    Hi Devendra,
    I am using 52428800 (50 MB) for maxBlockSize.
    I didn't get the details of your error. Please post the detailed error message.

  8. Devendra
    December 7, 2012 at 2:49 pm

    tuvian :
    Hi Devendra,
    I am using 52428800 (50 MB) for maxBlockSize.
    I didn't get the details of your error. Please post the detailed error message.

    Hi,

    I set maxBlockSize = 4194304 (4 MB) and I am getting an OutOfMemory exception on

    byte[] buff = new byte[transferDetails[j].BytesToRead]; (when its size grows to 1 GB+)

    Today I set maxBlockSize = 52428800 and got the same error.

    I also read on a blog that the maximum block size is 4 MB only. Is that right?

    Thanks
    Devendra

  9. Devendra
    December 10, 2012 at 6:19 pm

    Any reply please?

  10. December 13, 2012 at 10:31 am

    Hi Devendra,

    Please include the code below in the service definition file and try.

  11. December 13, 2012 at 10:45 am

    Hi Devendra,

    Please include the below mentioned code in the service definition file and try.

    <LocalResources>
      <LocalStorage name="InstanceDriveCache" cleanOnRoleRecycle="false" sizeInMB="300" />
    </LocalResources>

  12. Sandeep Patel
    January 28, 2013 at 10:34 pm

    Hi Tuvian,

    I see that you have a very good understanding of this topic. Please see if you can help me clear a few things up.

    I have an ASP.NET web application that uses the standard way to upload a file to storage. It takes close to 1 minute to upload a 500 MB file. (see code below)

    However, if I use the AzCopy utility provided by Microsoft, it is very quick and takes less than 10 seconds to upload the same file via the command prompt.

    I understand that in a web application it has to go through a memory stream and then upload to storage, but why so long? Is there any way to get closer to 10 seconds to upload a 500 MB file via a web application? How long does your code take to upload a 500 MB file?

    Am I missing anything important?

    Kindly help me out.

    http://blogs.msdn.com/b/windowsazurestorage/archive/2012/12/03/azcopy-uploading-downloading-files-for-windows-azure-blobs.aspx

    http://www.windowsazure.com/en-us/develop/net/how-to-guides/blob-storage/#upload-blob

    Thanks,
    SPatel

  13. anil
    August 12, 2013 at 6:02 pm

    Title: Upload images to an Azure storage account and store the inserted information in a database using MVC

    Hi,

    TABLE:

    ID Name ImageURL

    1 aaa http://example/Blob/Images/1.jpg

    At the time of inserting the data, I want to insert a row into the SQL Server database along with the image URL, like the example above, while the image itself is stored in the Azure storage account.

    In other words, at the time of uploading an image it should be stored in the Azure storage account and a row in the above format should be saved to the database automatically, using MVC.

    Update, Delete and Display should work in the same way.

    Can you give me example code or any sample for this?

    Help me.
  14. August 13, 2013 at 10:21 am

    Hi Anil,

    You can use the same method ‘ParallelDownloadToFile’ from the above example to upload the image into the blob.
    In the same method you have to call another ordinary method to store the ID, Name and URL of the image in SQL Server. That is a normal method for saving data into SQL Server, not related to Azure.

    Please try it and let me know if you need any help.

    Thanks
