I am exporting a large DataTable to an Excel sheet (.xls format). I call a stored procedure here, which returns the DataTable. The stored procedure returns about 750,000 rows with 93 columns.
On the back end, if I execute the stored procedure directly with the same parameters, it takes 8 minutes to complete, since it is a long-running process.
But when I call the procedure from the front end, a "System.OutOfMemoryException" is thrown while the records are being assigned to the DataTable.
When I googled it, I learned that this happens because the size of the result set exceeds the server's available RAM, which causes this exception.
What is an alternate way to do this? Any suggestions? Below is my code:
C#:
DataTable dt = BindDatatable();
Response.ClearContent();
Response.Buffer = true;
Response.AddHeader("content-disposition", string.Format("attachment; filename={0}", "MBQ_Auto.xls"));
Response.ContentType = "application/ms-excel";
//Response.ContentType = "application/octet-stream";
//Response.ContentType = "text/tab-separated-values";

string str = string.Empty;
foreach (DataColumn dtcol in dt.Columns)
{
    Response.Write(str + dtcol.ColumnName);
    str = "\t";
}
Response.Write("\n");

int rowCount = 0;
foreach (DataRow dr in dt.Rows)
{
    str = "";
    for (int j = 0; j < dt.Columns.Count; j++)
    {
        Response.Write(str + Convert.ToString(dr[j]));
        str = "\t";
    }
    Response.Write("\n");
    if (unchecked(++rowCount % 1024 == 0))
        Response.Flush();
}
Response.End();
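For reference, the alternative I have seen suggested is to avoid materializing the whole result set in a DataTable at all, and instead stream each row from a SqlDataReader to the response as it arrives. This is only a rough sketch of that idea; the connection string and stored procedure name ("YourConnectionString", "usp_GetMbqData") are placeholders, not my real names:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.Web;

public static class MbqExport
{
    // Streams the stored procedure's rows straight to the HTTP response,
    // so the full 750,000-row result never has to fit in server memory.
    public static void ExportToResponse(HttpResponse response)
    {
        response.ClearContent();
        response.Buffer = false; // do not buffer the whole file server-side
        response.AddHeader("content-disposition", "attachment; filename=MBQ_Auto.xls");
        response.ContentType = "application/ms-excel";

        using (var conn = new SqlConnection("YourConnectionString")) // placeholder
        using (var cmd = new SqlCommand("usp_GetMbqData", conn))     // placeholder
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.CommandTimeout = 600; // the procedure itself takes ~8 minutes

            conn.Open();
            using (var reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
            {
                // Header row: column names separated by tabs.
                for (int i = 0; i < reader.FieldCount; i++)
                    response.Write((i > 0 ? "\t" : "") + reader.GetName(i));
                response.Write("\n");

                // Data rows: write and periodically flush to the client.
                int rowCount = 0;
                while (reader.Read())
                {
                    for (int j = 0; j < reader.FieldCount; j++)
                        response.Write((j > 0 ? "\t" : "") + Convert.ToString(reader.GetValue(j)));
                    response.Write("\n");

                    if (++rowCount % 1024 == 0)
                        response.Flush();
                }
            }
        }
        response.End();
    }
}
```

With `Response.Buffer = false` and `CommandBehavior.SequentialAccess`, memory use should stay roughly constant regardless of row count, since only one row is held at a time, though I have not verified this against my exact schema.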