

Archive for June, 2011

How to upload large size file/blob to azure storage using ASP.Net,C#

June 28, 2011 19 comments

Tool to upload local files into azure storage.

We have already demonstrated a simple C# application for uploading files into Azure storage in one of our previous posts. Using that application we can easily upload files from a local folder into Azure storage and see an error log for each file. It works fine up to a certain file size (we tested it with 1 GB). But when we tried to upload files larger than 3 GB with the same code in one of our live projects, the upload failed partway through. So we looked for another solution for transferring very large files to and from an Azure storage blob.

How can we upload large files into Azure storage using C#, ASP.Net?

When we try to move very large local files to or from an Azure storage blob in a single request, we get a timeout error. To resolve this issue we need to split the transfer into small blocks, move each block separately, and reassemble the file once every block has gone through. The helper below shows this chunked, parallel pattern; note that it is written as a parallel read of a large blob into a local file (ParallelDownloadToFile), and the same block-by-block idea applies when uploading from an ASP.Net, C# application (see the upload sketch after the code).

public void ParallelDownloadToFile(CloudBlockBlob blob,
    string fileName, int maxBlockSize)
{
    try
    {
        // refresh the values
        blob.FetchAttributes();
        long fileSize = blob.Attributes.Properties.Length;
        var filePath = Path.GetDirectoryName(fileName);
        var fileNameWithoutPath = Path.GetFileNameWithoutExtension(fileName);

        // let's figure out how big the file is here
        long leftToRead = fileSize;
        int startPosition = 0;

        // have 1 block for every maxBlockSize bytes plus 1 for the remainder
        var blockCount =
            ((int)Math.Floor((double)(fileSize / maxBlockSize))) + 1;

        // setup the control array
        BlockTransferDetail[] transferDetails =
            new BlockTransferDetail[blockCount];

        // create an array of block keys
        string[] blockKeys = new string[blockCount];
        var blockIds = new List<string>();

        // populate the control array...
        for (int j = 0; j < transferDetails.Length; j++)
        {
            int toRead = (int)(maxBlockSize < leftToRead ?
                maxBlockSize :
                leftToRead);

            string blockId = Path.Combine(filePath,
                string.Format("{0}_{1}.dat",
                    fileNameWithoutPath,
                    j.ToString("00000000000")));

            if (startPosition < 0)
                startPosition = startPosition * -1;
            if (toRead < 0)
                toRead = toRead * -1;

            transferDetails[j] = new BlockTransferDetail()
            {
                StartPosition = startPosition,
                BytesToRead = toRead,
                BlockId = blockId
            };

            if (toRead > 0)
            {
                blockIds.Add(blockId);
            }

            // increment the starting position
            startPosition += toRead;
            leftToRead -= toRead;
        }

        // now we do a parallel download of the blob, one byte range per block
        var result = Parallel.For(0, transferDetails.Length, j =>
        {
            // get the blob as a stream
            using (BlobStream stream = blob.OpenRead())
            {
                // delay carried over from the original sample; it only throttles
                // each block and can be removed for faster transfers
                Thread.Sleep(10000);
                stream.Seek(transferDetails[j].StartPosition, SeekOrigin.Begin);

                // setup a buffer with the proper size
                byte[] buff = new byte[transferDetails[j].BytesToRead];

                // read into the buffer
                stream.Read(buff, 0, transferDetails[j].BytesToRead);

                // write this block out to its own temporary .dat file
                using (Stream fileStream = new FileStream(transferDetails[j].BlockId,
                    FileMode.Create, FileAccess.Write, FileShare.None))
                using (BinaryWriter bw = new BinaryWriter(fileStream))
                {
                    bw.Write(buff);
                }
            }
        });

        // assemble the file into one now...
        using (Stream fileStream = new FileStream(fileName,
            FileMode.Append, FileAccess.Write, FileShare.None))
        using (BinaryWriter bw = new BinaryWriter(fileStream))
        {
            // loop through each of the files on the disk
            for (int j = 0; j < transferDetails.Length; j++)
            {
                // read them into the file (append)
                bw.Write(File.ReadAllBytes(transferDetails[j].BlockId));

                // and then delete them
                File.Delete(transferDetails[j].BlockId);
            }
        }
    }
    catch (Exception)
    {
        throw;
    }
}
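The BlockTransferDetail type is not shown in the original post; it only has to carry the per-block bookkeeping used above, so an assumed minimal shape would be:

public class BlockTransferDetail
{
    public int StartPosition { get; set; }
    public int BytesToRead { get; set; }
    public string BlockId { get; set; }
}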

public static class BlobExtensions
{
    static readonly string DFDriveEnvVarName = "AZURE_DRIVE_DEV_PATH";
    static readonly string containername =
        RoleEnvironment.GetConfigurationSettingValue("Container")
        .ToLowerInvariant();

    public static bool Exists(this CloudBlob blob, string folderName)
    {
        try
        {
            // the FetchAttributes test doesn't work in the development fabric,
            // so fall back to checking the development storage folder on disk
            if (RoleEnvironment.DeploymentId.ToLowerInvariant().StartsWith("deployment("))
            {
                string path = Environment.GetEnvironmentVariable(DFDriveEnvVarName);
                path += "\\devstoreaccount1\\";
                path += containername;
                path += "\\";
                path += folderName;
                //path += blob.Uri.Segments.Last();
                return Directory.Exists(path);
            }
            else
            {
                blob.FetchAttributes();
                return true;
            }
        }
        catch (StorageClientException e)
        {
            if (e.ErrorCode == StorageErrorCode.ResourceNotFound)
            {
                return false;
            }
            else
            {
                throw;
            }
        }
    }
}
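For the actual upload direction, the same block-by-block idea maps directly onto block blobs: upload the file in chunks with PutBlock and commit them at the end with PutBlockList. The following is only a minimal sketch using the same Microsoft.WindowsAzure.StorageClient library (the usual System.IO, System.Text and System.Collections.Generic usings are assumed); the block size, container and naming scheme are assumptions you would adapt to your own project.

public void UploadFileInBlocks(CloudBlobContainer container,
    string filePath, int blockSize)
{
    // one block blob per local file; keep blockSize within the service block limit (4 MB at the time of writing)
    CloudBlockBlob blob = container.GetBlockBlobReference(Path.GetFileName(filePath));
    var blockIds = new List<string>();

    using (FileStream fs = File.OpenRead(filePath))
    {
        byte[] buffer = new byte[blockSize];
        int blockNumber = 0;
        int bytesRead;

        // push the file up block by block, so a timeout only costs one block
        while ((bytesRead = fs.Read(buffer, 0, blockSize)) > 0)
        {
            // block IDs must be base64 strings of equal length within one blob
            string blockId = Convert.ToBase64String(
                Encoding.UTF8.GetBytes(blockNumber.ToString("d10")));

            using (var ms = new MemoryStream(buffer, 0, bytesRead))
            {
                blob.PutBlock(blockId, ms, null);
            }

            blockIds.Add(blockId);
            blockNumber++;
        }
    }

    // commit the block list; the blob only becomes visible after this call
    blob.PutBlockList(blockIds);
}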

By using the above approach we can easily move very large files between a local folder and Azure blob storage. The function splits the transfer into byte-range blocks and processes them in parallel, so no single request runs long enough to hit the timeout. The code above is working well and we have implemented it in one of our projects too. We have verified it with files larger than 2 GB in our ASP.Net MVC application, and the same approach should hold for even larger files.


Nested Common Table Expression(CTE) in SQL Server 2005/2008

June 28, 2011 10 comments

What is Common Table Expression (CTE)?

A common table expression (CTE) is a temporary named result set which is accessible only within the execution scope of the very next statement; after that statement runs, the CTE result is no longer available. A CTE is similar to a derived table in that it is not stored as an object and lasts only for the duration of the query. Unlike a derived table, a CTE can be self-referencing and can be referenced multiple times in the same query.

A very simple example of Common Table Expression (CTE) in SQL Server for pagination

Suppose we have a table ‘Employee’ with columns id, name, age and joindate, and we need to apply pagination to it, that is, fetch only the rows that fall within a given index range. To achieve this we would usually insert all records into a temporary table to get a new row-number column, and then fetch data from that table based on the new column.

By using a Common Table Expression (CTE) we can accomplish the pagination query in a single statement, as follows. In the CTE we select the data together with a row number, and from the CTE we then fetch the range of rows we need. The scope of the CTE ends as soon as this statement has executed.

WITH CTE AS
(
SELECT id
,name
,age
,joindate
,ROW_NUMBER() OVER (ORDER BY id DESC) AS RowNumber
FROM employee
)
SELECT *
FROM CTE
WHERE RowNumber BETWEEN 1 AND 5;

OUTPUT: the five most recent employee rows (id, name, age, joindate) with RowNumber values 1 to 5.

Is a nested Common Table Expression (CTE) possible in SQL Server?

In some scenarios we need to do further calculations on the result of the first CTE and produce another result set from it. Can we declare one CTE on top of another? Yes: in SQL Server we can declare a second CTE right after the first one and have it use the first CTE's result. Suppose we need the number of years of service of each employee. If the table holds a very large number of rows, it would be slow to compute this for every employee when we only have to show 5 of them at a time.

The better method in this case is to first fetch the 5 records for the current page from the table, and then apply the calculation we need to those 5 records only. We can accomplish this with CTEs: first we fetch the 5 records using CTE1, and immediately afterwards we declare CTE2, which performs the required calculation on the result of CTE1. This gives faster pagination in SQL Server even when the query contains complex calculations. The same nested-CTE mechanism can be applied anywhere in SQL Server where further calculations have to be done on a previous result. We can chain any number of CTEs this way in SQL Server 2005/2008: each CTE can refer to the CTEs declared before it, and the final SELECT uses the last one.

;with CTE1 as (
SELECT id
,name
,age
,joindate
,ROW_NUMBER() OVER (ORDER BY id DESC) AS RowNumber
FROM employee    
)
,CTE2 AS (
SELECT CTE1.id
,CTE1.name
,CTE1.age
,CTE1.joindate
,CTE1.RowNumber
,DATEDIFF(YEAR,CTE1.joindate,GETDATE()) as yearOfservce
FROM employee INNER JOIN CTE1
ON employee.id = CTE1.id
WHERE CTE1.RowNumber between 1 and 5
)
SELECT * FROM CTE2;

OUTPUT: the same five rows together with the computed yearOfservce column.

Why use a Common Table Expression (CTE) for a pagination query?

In some scenarios a complex calculation has to be done as part of the fetching query. If we run that calculation over all rows in the table and only then pick the paginated rows, there is a performance cost and we lose much of the benefit of pagination. Instead we should fetch the paginated rows from the table first and apply the complex calculation to those rows only; this improves the performance of the pagination query. We can achieve this exactly as in the queries above using a CTE (Common Table Expression) in SQL Server 2005/2008. The CTE improves the performance of the pagination query because it narrows the data down first and only then does the calculation.

How to create a Setup package by using Visual Studio .NET?

June 24, 2011 1 comment

Creating installation package for C# application in Visual Studio

Here we demonstrate how to deploy a C# Windows application so that we get an installation package. After the deployment we get a setup file that can be installed on any system. To install this package on another system, the .NET Framework and Windows Installer are the minimum requirements.

Simple steps to create a setup project for a Windows application, with screenshots

We explain how to create a setup file for a Windows application in a step-by-step process, including screenshots.

  1. Create a setup project: File >> New Project >> Setup and Deployment >> Setup Project
  2. View the file system of the setup project: right click the setup project >> View >> File System
  3. Add the primary output of the application into the Application Folder: right click Application Folder >> Add >> Project Output >> select Primary Output >> OK
  4. Add the application exe to the Application Folder: right click Application Folder >> Add >> File >> select the exe from the project's bin folder >> OK
  5. To show the installed files on the desktop and the programs menu, create a shortcut of the exe (right click the exe and click Create Shortcut) and copy the shortcut to the User's Desktop and User's Programs Menu folders.
  6. Build the setup project in Release: right click the setup project in Solution Explorer and select Build. After the build, a setup exe is generated inside the release folder of the setup project (Your Main Project Folder >> Setup Project Folder >> bin >> Release).

How to upload blob to azure using ASP.Net / Tool for upload files to azure using ASP.Net,C#

June 24, 2011 5 comments

Simple way to Upload files to azure blob  

When we are working with a Windows Azure application in ASP.Net we need to communicate with blobs in Azure storage from our code. Here we demonstrate how to upload files from the local system to Windows Azure storage using C# code. The application lets the user upload all files in a chosen local folder to a blob container in Azure. It is a useful little tool for pushing files into Windows Azure storage.

Requirements for uploading files into an Azure blob

To upload files from a local folder to Azure we need an Azure storage account name and account key, which we can purchase from Microsoft.

Steps to upload files into azure storage

  1. Create a simple Windows application in Visual Studio: File >> New Project >> Visual C# >> Windows >> Windows Forms Application
  2. Add a reference to ‘Microsoft.WindowsAzure.StorageClient.dll’, which is available after installing the Windows Azure SDK
  3. Drag a Button (for upload), a TextBox (to display status) and a ListBox (to list files with an error or success message) onto the form.
  4. In the button click handler we upload the files from the local folder (here C:\\myfiles; this can be made configurable with a browse option).

UI Design for the upload file to Azure Storage application

C# Source Code for Upload files to Azure Storage  

private void button1_Click(object sender, EventArgs e)
{
    if (txtFolderName.Text == "")
    {
        MessageBox.Show("Please select a folder to upload files");
        return;
    }
    if (MessageBox.Show("Do you want to Upload files to Blob?",
        "Confirm Upload", MessageBoxButtons.YesNo) == DialogResult.Yes)
    {
        lblHeader.Visible = false;
        txtStatusMessage.Text = "Upload Starting...";
        lstBxFiles.Items.Clear();
        UploadFileIntoBlop(txtFolderName.Text.Trim());
        lblHeader.Visible = true;
    }
}

public void UploadFileIntoBlop(string fileFullPath)
{
    try
    {
        // connection string built from the storage account name and key
        string dataCenterSettingKey =
            "DefaultEndpointsProtocol=" + "http" + ";AccountName="
            + "YourAccountNameHere" + ";AccountKey=" +
            "YourAccountKeyHere";
        string filepath = "";
        CloudStorageAccount storageAccount =
            CloudStorageAccount.Parse(dataCenterSettingKey);
        CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
        string csvfileblob = "migrationfiles";
        CloudBlobContainer blobContainer =
            blobClient.GetContainerReference(csvfileblob);
        // make sure the target container exists before uploading into it
        blobContainer.CreateIfNotExist();
        bool isError = false;

        foreach (string file in Directory.GetFiles(fileFullPath))
        {
            string[] filename = file.Split('\\');
            string theFile = filename[filename.Length - 1];
            try
            {
                DateTime startDate = DateTime.Now;
                filepath = file.Trim();
                using (FileStream fileContent = ReadFile(filepath))
                {
                    var blob = blobContainer.GetBlobReference(theFile);
                    blob.UploadFromStream(fileContent);
                }
                TimeSpan difference = DateTime.Now.Subtract(startDate);
                // round the elapsed time up to whole minutes for the status entry
                int totalMinutes = (int)Math.Ceiling(difference.TotalMinutes);
                lstBxFiles.Items.Add(theFile +
                    " _Success (" + totalMinutes + " mnts)");
            }
            catch (Exception)
            {
                isError = true;
                lstBxFiles.Items.Add(theFile + " _Error");
            }
        }

        if (isError)
            txtStatusMessage.Text = "Process completed with errors";
        else
            txtStatusMessage.Text = "Upload done successfully";
    }
    catch (Exception ex)
    {
        txtStatusMessage.Text = "Error Occurred : " + ex.Message;
    }
}

public FileStream ReadFile(string fileFullPath)
{
    FileStream fileStream = null;
    if (System.IO.File.Exists(fileFullPath))
    {
        fileStream = new FileStream(fileFullPath,
            FileMode.Open, FileAccess.Read);
    }
    return fileStream;
}

private void btnBrowse_Click(object sender, EventArgs e)
{
    DialogResult result = this.folderBrowserDialog1.ShowDialog();
    if (result == DialogResult.OK)
    {
        txtFolderName.Text = this.folderBrowserDialog1.SelectedPath;
    }
}

Run the application and press the upload button; in the button click event we call the UploadFileIntoBlop function. It takes all the files in the local folder (the folder path is passed as a parameter) and uploads them to a container (here 'migrationfiles'). While uploading, every file name in the folder is added to the list box along with a status telling whether it was uploaded or not. We can use this application as a small tool for uploading files from a local folder to Azure blob storage.

Press Enter for search in ASP.Net / Call a button click event on Enter key press in ASP.Net

June 23, 2011 5 comments

How to call a code behind function on pressing Enter in ASP.Net?

In an ASP.Net application we sometimes have to call a server side function when the user presses Enter in a text box. Suppose we have a text box for searching items and we need to call the search function once the user has typed the keywords and pressed the Enter key. We could use JavaScript or jQuery to call the server side function, but that is a little extra overhead for the developer.

How to call the search function when Enter is pressed in the search textbox?

To call the search function when the user presses Enter in the search box, we can set the search button as the default button of an asp:Panel in our aspx page.

Steps to call a server side function on Enter key press / accessing a code behind function on the keypress event

  1. Create an aspx page and add an asp:Panel.
  2. Create a search text box (txtSearch) and a search button (btnSearch) inside the panel control.
  3. Set the DefaultButton of the panel to the btnSearch button.
  4. In the button click event, call the search function in the code behind (a minimal handler sketch follows the markup below).
  5. Now, when we press Enter from the search textbox, the button click event is raised in the code behind.

ASPX Page 

<asp:Panel ID="Panel1" DefaultButton="btnSearch" runat="server">
<table width="100%" cellpadding="0" cellspacing="0" style="padding-top: 15px;">
 <tr>
  <td valign="middle" style="width: 250px;">
   What are you looking for?
  </td>
  <td align="right" style="width: 300px;">
   <asp:TextBox ID="txtSearch" runat="server" CssClass="qkSearchTextBox" />
  </td>
  <td align="left">
   <asp:Button ID="btnSearch" runat="server" Text="Search"
   CssClass="searchButton" OnClick="btnSearch_Click1" />
  </td>
 </tr>
</table>
</asp:Panel>
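In the code behind, the button click handler simply runs the search. A minimal sketch (the search routine and result binding are only placeholders here):

protected void btnSearch_Click1(object sender, EventArgs e)
{
    string keyword = txtSearch.Text.Trim();
    if (keyword.Length == 0)
        return;

    // call your own search routine here and bind the results, e.g.
    // gridResults.DataSource = SearchItems(keyword);
    // gridResults.DataBind();
}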

10 Simple Steps to improve the performance of an ASP.Net application after development is complete

June 16, 2011 Leave a comment

How to improve an ASP.Net application once development is complete?

Most developers don't think much about performance while coding, and once the site reaches deployment and performance problems show up, it is very hard to get good performance back. Here we describe 10 useful and easy-to-apply performance tips that can be followed even after development is complete.

10 simple tips to improve a website after development in ASP.Net, C#
1) Turn off tracing if it is not required

Tracing is useful for developers to monitor trace logging during development. Once the application is live, tracing is no longer needed and can be disabled to improve performance. We can disable tracing in web.config with the following element.

<trace enabled="false" requestLimit="10" pageOutput="false" traceMode="SortByTime" localOnly="true" />

2) If we are not using session variables, turn off session state

Session state is used to track details of the current user, such as the authenticated user, shopping cart and so on. Session is a nice feature in ASP.Net, but it has a cost, so it is better to avoid session where possible and to disable session state.

<%@ Page EnableSessionState="false" %>

If we use session state only to read data and never to update it, we can make it read only with the directive:

<%@ Page EnableSessionState="ReadOnly" %>

3) Disable view state
View state is an automatic facility provided by ASP.Net to preserve data between page postbacks. In some scenarios we don't need view state, for example when displaying static data or listing data from the database without performing any operation on it. In such cases we can disable view state, either per page or per control.

<%@ Page EnableViewState="false" %>

4) Set debug=false in web.config
During development, debug is set to true in web.config. Once we deploy we no longer need this mode, and running a production site with debug="true" adds overhead, so we should set debug="false" in web.config before deployment.
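A minimal web.config fragment for this (the system.web section already exists in a standard web.config):

<system.web>
  <compilation debug="false" />
</system.web>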

 
5) Avoid Response.Redirect
Response.Redirect tells the browser to request the given link, which costs an extra round trip and is relatively expensive. If we only want to load another page of the same application on the server, we can use the Server.Transfer method instead.
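A quick illustration of the two options (the page name is only a placeholder):

// issues an HTTP redirect, so the browser makes a second request
Response.Redirect("Result.aspx");

// runs the target page on the server within the same request, no extra round trip
Server.Transfer("Result.aspx");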

6) Use StringBuilder to concatenate strings
If we concatenate strings many times with the string type, each concatenation creates a new string object and copies the combined value into it, which is expensive over time. So we should use StringBuilder instead of string when building up a string repeatedly.
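A small sketch of the difference (requires using System.Text; the loop and values are only illustrative):

// slow: every += allocates a brand new string
string csv = "";
for (int i = 0; i < 1000; i++)
{
    csv += i + ",";
}

// better: StringBuilder appends into a single buffer
StringBuilder sb = new StringBuilder();
for (int i = 0; i < 1000; i++)
{
    sb.Append(i).Append(',');
}
string csv2 = sb.ToString();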

7) Avoid throwing exceptions

Exceptions are among the heavier resources and are best reserved for genuinely exceptional situations. A better approach is to catch errors with try/catch in each method and log them for later review by the developer, instead of letting exceptions surface to users.

8) Use Page.IsPostBack
Make sure you don't execute code needlessly. Use the Page.IsPostBack property to ensure that page initialization logic runs only when the page is loaded for the first time, and not again in response to client postbacks.
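A typical Page_Load sketch (BindData is just a placeholder for your own initialization code):

protected void Page_Load(object sender, EventArgs e)
{
    if (!Page.IsPostBack)
    {
        // runs only on the first load, not on subsequent postbacks
        BindData();
    }
}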

9) Use a foreach loop instead of a for loop for string iteration
For iterating over the characters of a string, a foreach loop performs better than a for loop, so it is better to use foreach for string iteration.

10) Make JavaScript and CSS external

If our JavaScript and CSS are kept in external files, pages load faster because in most cases the browser caches those files and serves them quickly on later requests. So it is better to keep JavaScript and CSS in external files rather than inline.

ListView inside another ListView Control in ASP.Net, C#

June 14, 2011 14 comments

We have already discussed how to implement a ListView control for a shopping site in ASP.Net, C#. As we know, a ListView repeats the controls inside it once for every item in the data source bound to it. In some scenarios we need to repeat controls dynamically inside other repeated controls.

Dynamic menu using a ListView inside another ListView control in ASP.Net

For example, suppose we need to implement a dynamic menu control by fetching categories and subcategories from the database. There can be any number of categories, so we need a ListView for them; but each category also has any number of subcategories, so we have to implement another ListView for the subcategory list.


Steps to use a ListView inside another ListView

In our scenario there is a category list, and under each category a number of subcategories should be displayed. First we created a ListView control that holds the category name and the list of subcategories under that category. To display the category name we included a div and bound the ‘Name’ value to it. To list the subcategories we created another ListView control inside the ItemTemplate of the first ListView and included a LinkButton to display each subcategory name.

<asp:ListView runat="server" ID="RightMenuItems" 
ItemPlaceholderID="PlaceHolder2">
<LayoutTemplate>
<asp:PlaceHolder runat="server" ID="PlaceHolder2" />
</LayoutTemplate>
<ItemTemplate>
<asp:Panel runat="server" ID="panelCategory" CssClass="h1header">
    <div style="float: left;">
        <%# Eval("Name") %></div>
    <div style="float: right; margin-top: 5px; margin-right: 5px;">
        <%--<asp:Image runat="server" ID="imgCollapse" />--%></div>
</asp:Panel>
<asp:ListView runat="server" ID="subMenu" 
ItemPlaceholderID="PlaceHolder3"
DataSource='<%# Eval("subCategories") %>'
OnItemCommand="listViewTest_ItemCommand">
<LayoutTemplate>
    <asp:Panel runat="server" ID="panelSubCategory" 
HorizontalAlign="center" Width="100%">
        <table width="100%" cellspacing="0" border="0" cellpadding="0">
            <asp:PlaceHolder runat="server" ID="PlaceHolder3" />
        </table>
    </asp:Panel>
</LayoutTemplate>
<ItemTemplate>
    <tr>
        <td>
            <asp:LinkButton ID="lnkBtnSubMenu" runat="server"
            Text='<%# Eval("Name") %>' CommandName="clickSubMenu"
                CommandArgument='<%# Eval("ID") %>' />
            <asp:Label runat="server" ID="lblID" Visible="false" 
Text='<%# Eval("ID") %>' />
        </td>
    </tr>
</ItemTemplate>
</asp:ListView>
</ItemTemplate>
</asp:ListView>

How to create structured data for a ListView inside another ListView control

In order to bind a data source to the above ListView controls, the data should have the same nested structure: a category list in which each category carries its own list of child subcategories. To achieve this we fetch the categories and subcategories from the database and build a list with the required structure as follows (the assumed entity classes are sketched just below).
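The binding code below uses entity classes from the eMallEntity namespace that are not shown in the original post; a minimal assumed shape, based only on the properties the code touches, would be:

public class ItemSubCatagory
{
    public int ID { get; set; }
    public string Name { get; set; }
}

public class ItemCatagory
{
    public int ID { get; set; }
    public string Name { get; set; }
    public IList<ItemSubCatagory> subCategories { get; set; }
}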

DataTable dtTblCategory = new DataTable();
dtTblCategory = (new eMallBL()).getCategories(0);
DataTable dtTblSubCategory = new DataTable();
dtTblSubCategory = (new eMallBL()).getSubCategories(0, 0);
IList<eMallEntity.ItemCatagory> categories =
    new List<eMallEntity.ItemCatagory>();
for (int i = 0; i < dtTblCategory.Rows.Count; i++)
{
    eMallEntity.ItemCatagory category = new eMallEntity.ItemCatagory();
    category.Name = dtTblCategory.Rows[i]["Name"].ToString();
    category.ID = Convert.ToInt32(dtTblCategory.Rows[i]["ID"].ToString());
    IList<eMallEntity.ItemSubCatagory> subCategories =
        new List<eMallEntity.ItemSubCatagory>();
    for (int j = 0; j < dtTblSubCategory.Rows.Count; j++)
    {
        // keep only the subcategories that belong to the current category
        if (dtTblSubCategory.Rows[j]["CategoryID"].ToString()
            == dtTblCategory.Rows[i]["ID"].ToString())
        {
            eMallEntity.ItemSubCatagory subCategory = new eMallEntity.ItemSubCatagory();
            subCategory.Name = dtTblSubCategory.Rows[j]["Name"].ToString();
            subCategory.ID = Convert.ToInt32(dtTblSubCategory.Rows[j]["ID"].ToString());
            subCategories.Add(subCategory);
        }
    }
    category.subCategories = subCategories;
    categories.Add(category);
}
RightMenuItems.DataSource = categories;
RightMenuItems.DataBind();
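The nested ListView markup above wires OnItemCommand to listViewTest_ItemCommand, which is not shown in the original post. A minimal sketch of such a handler, with the redirect target as a placeholder, could look like this:

protected void listViewTest_ItemCommand(object sender, ListViewCommandEventArgs e)
{
    if (e.CommandName == "clickSubMenu")
    {
        // the LinkButton passes the subcategory ID as the command argument
        int subCategoryId = Convert.ToInt32(e.CommandArgument.ToString());

        // for example, show the items of the selected subcategory
        Response.Redirect("~/Items.aspx?subCategoryId=" + subCategoryId);
    }
}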