I have a DataTable being generated by C# NI-DAQmx code. I want to write this DataTable to an Excel file when a CheckBox is checked. The DAQmx code records data 'x' samples at a time. When this number is high, the program is slow, but it still works. I want to record a low number of samples at a time and then save that data into an Excel file.
In my current code, the data in the Excel file is constantly overwritten. This is not desirable, as I need all recorded data.
Currently the data actively records when the box is checked, but it does not concatenate. I have tried many searches and explored many methods for this, but I haven't quite been able to adapt anything for my needs.
Relevant code will be included below. Any help is appreciated, thanks.
Note: the data does not have to be a .xlsx file; it can be a .csv.
This code is the DataTable generation via DAQmx:
private void DataToDataTable(AnalogWaveform<double>[] sourceArray, ref DataTable dataTable)
{
    // Iterate over channels; each waveform fills one column.
    int currentLineIndex = 0;
    foreach (AnalogWaveform<double> waveform in sourceArray)
    {
        for (int sample = 0; sample < waveform.Samples.Count; ++sample)
        {
            // The table was initialized with 50 rows, so stop there.
            if (sample == 50)
                break;
            dataTable.Rows[sample][currentLineIndex] = waveform.Samples[sample].Value;
        }
        currentLineIndex++;
    }
}
public void InitializeDataTable(AIChannelCollection channelCollection, ref DataTable data)
{
int numOfChannels = channelCollection.Count;
data.Rows.Clear();
data.Columns.Clear();
dataColumn = new DataColumn[numOfChannels];
int numOfRows = 50;
for (int currentChannelIndex = 0; currentChannelIndex < numOfChannels; currentChannelIndex++)
{
dataColumn[currentChannelIndex] = new DataColumn()
{
DataType = typeof(double),
ColumnName = channelCollection[currentChannelIndex].PhysicalName
};
}
data.Columns.AddRange(dataColumn);
for (int currentDataIndex = 0; currentDataIndex < numOfRows; currentDataIndex++)
{
object[] rowArr = new object[numOfChannels];
data.Rows.Add(rowArr);
}
}
This is my current method of saving to an Excel file:
private void Excel_cap_CheckedChanged(object sender, EventArgs e)
{
int i = 0;
for (excel_cap.Checked = true; excel_cap.Checked == true; i++)
{
    StringBuilder sb = new StringBuilder();
    IEnumerable<string> columnNames = dataTable.Columns.Cast<DataColumn>()
        .Select(column => column.ColumnName);
    sb.AppendLine(string.Join(",", columnNames));
    foreach (DataRow row in dataTable.Rows)
    {
        IEnumerable<string> fields = row.ItemArray.Select(field => field.ToString());
        sb.AppendLine(string.Join(",", fields));
    }
    File.AppendAllText(filename_box.Text, sb.ToString());
}
}
Since you mentioned it does not have to be Excel and could be a CSV, you can keep your CSV code but change the File.WriteAllText line to File.AppendAllText, which will append the text rather than replacing the existing file. AppendAllText will create the file if it doesn't exist.
File.AppendAllText("test.csv", sb.ToString());
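One wrinkle when appending repeatedly: the header line is written on every pass, so it will repeat throughout the file. A minimal sketch of a guard (using filename_box.Text as in your handler):
bool writeHeader = !File.Exists(filename_box.Text);
StringBuilder sb = new StringBuilder();
if (writeHeader)
{
    // Emit the column names only when the file is first created.
    sb.AppendLine(string.Join(",", columnNames));
}
// ...append the data rows as before...
File.AppendAllText(filename_box.Text, sb.ToString());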
Are you sure you are using EPPlus? This CreateExcelFile looks like it is a copied code snippet.
With EPPlus, this would be as easy as
using (var package = new ExcelPackage(new FileInfo(@"a.xlsx")))
{
    if (!package.Workbook.Worksheets.Any())
        package.Workbook.Worksheets.Add("sheet");
    var sheet = package.Workbook.Worksheets.First();
    // sheet.Dimension is null while the sheet is still empty, so start at row 1.
    var appendRow = (sheet.Dimension?.Rows ?? 0) + 1;
    sheet.Cells[appendRow, 1].LoadFromDataTable(new DataTable(), false); // swap in your populated table
    package.SaveAs(new FileInfo(@"a.xlsx"));
}
It looks like you have some objects, then convert them to a DataTable and then write them to Excel/CSV. If you skip the DataTable conversion, you'll speed things up. EPPlus has LoadFromCollection, which may just work with your AnalogWaveform<double>.
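A rough sketch of that idea (illustrative, not from the post; samples stands in for a flat IEnumerable of your own sample type, since LoadFromCollection maps one object per row):
using (var package = new ExcelPackage(new FileInfo(@"a.xlsx")))
{
    if (!package.Workbook.Worksheets.Any())
        package.Workbook.Worksheets.Add("sheet");
    var sheet = package.Workbook.Worksheets.First();
    var appendRow = (sheet.Dimension?.Rows ?? 0) + 1;
    // 'samples' is an assumed collection of your own type, not something EPPlus provides.
    sheet.Cells[appendRow, 1].LoadFromCollection(samples, false);
    package.SaveAs(new FileInfo(@"a.xlsx"));
}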
Shameless advertisement: I got these snippets from my blog post about EPPlus.
Related
What I have is a CSV that I have imported into a DataGridView.
I am now looking for a way to import only the columns with the headers # and Delay, not all the info in the CSV, so any help on this would be appreciated.
Here is the Code I have thus far:
private void button1_Click(object sender, EventArgs e)
{
DataTable dt = new DataTable();
DialogResult result = openFileDialog1.ShowDialog();
if (result == DialogResult.OK) // Test result.
{
String Fname = openFileDialog1.FileName;
//String Sname = "export";
string[] raw_text = System.IO.File.ReadAllLines(Fname);
string[] data_col = null;
int x = 0;
foreach (string text_line in raw_text)
{
data_col = text_line.Split(';');
if (x == 0)
{
for (int i = 0; i < data_col.Count(); i++)
{
dt.Columns.Add(data_col[i]);
}
x++;
}
else
{
dt.Rows.Add(data_col);
}
}
dataGridView1.DataSource = dt;
}
}
When I read from CSV files, I create a list of values that I want for each row and use that list as the basis for my INSERT statement to the database.
I know where to find the data I want in the CSV file so I specifically target those items while I'm building my list of parameters for the query.
See the code below:
// Read the file content from the function parameter.
string content = System.Text.Encoding.ASCII.GetString(bytes);
// Split the content into an array where each array item is a line for
// each row of data.
// The Replace simply removes the CarriageReturn LineFeed characters from
// the source text and replaces them with a Pipe character (`|`)
// and then does the split from that character.
// This is just personal preference to do it this way
string[] data = content.Replace("\r\n", "|").Split('|');
// Loop through each row and extract the data you want.
// Note that each value is in a fixed position in the row.
foreach (string row in data)
{
if (!String.IsNullOrEmpty(row))
{
string[] cols = row.Split(';');
List<MySqlParameter> args = new List<MySqlParameter>();
args.Add(new MySqlParameter("@sid", Session["storeid"]));
args.Add(new MySqlParameter("@name", cols[0]));
args.Add(new MySqlParameter("@con", cols[3]));
try
{
// Insert the data to the database.
}
catch (Exception ex)
{
// Report an error.
}
}
}
In the same way, you could build your list/dataset/whatever as a data source for your DataGridView. I would build a table.
Here's a mockup (I haven't got time to test it right now, but it should get you on the right track).
DataTable table = new DataTable();
table.Columns.Add("#");
table.Columns.Add("Delay");
foreach (var line in raw_text)
{
    // Split each raw line into its columns before picking values out of it.
    string[] cols = line.Split(';');
    DataRow row = table.NewRow();
    row[0] = cols[0]; // The # value you want.
    row[1] = cols[1]; // The Delay value you want.
    table.Rows.Add(row);
}
dataGridView1.DataSource = table;
// (DataBind() is only needed on the ASP.NET GridView; the WinForms DataGridView
// binds through DataSource alone.)
Using TextFieldParser can make handling CSV input less brittle:
// add this using statement for TextFieldParser - needs reference to Microsoft.VisualBasic assembly
using Microsoft.VisualBasic.FileIO;
...
// TextFieldParser implements IDisposable so you can let a using block take care of opening and closing
using (TextFieldParser parser = new TextFieldParser(Fname))
{
// configure your parser to your needs
parser.TextFieldType = FieldType.Delimited;
parser.Delimiters = new string[] { ";" };
parser.HasFieldsEnclosedInQuotes = false; // no messy code if your data comes with quotes: ...;"text value";"another";...
// read the first line with your headers
string[] fields = parser.ReadFields();
// add the desired headers with the desired data type
dt.Columns.Add(fields[2], typeof(string));
dt.Columns.Add(fields[4], typeof(string));
// read the rest of the lines from your file
while (!parser.EndOfData)
{
// all fields from one line
string[] line = parser.ReadFields();
// create a new row <-- this is missing in your code
DataRow row = dt.NewRow();
// put data values; cast if needed - this example uses string type columns
row[0] = line[2];
row[1] = line[4];
// add the newly created and filled row
dt.Rows.Add(row);
}
}
// assign to DGV
this.dataGridView1.DataSource = dt;
I am using the code below to export data from a CSV file to a DataTable.
As the values are mixed text, i.e. both numbers and alphabets, some of the columns are not getting imported into the DataTable.
I have done some research here and found that we need to set ImportMixedType = Text and TypeGuessRows = 0 in the registry, which did not solve the problem.
The code below is working for some files, even with mixed text.
Could someone tell me what is wrong with the code below? Am I missing something here?
if (isFirstRowHeader)
{
header = "Yes";
}
using (OleDbConnection connection = new OleDbConnection(@"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + pathOnly +
    ";Extended Properties=\"text;HDR=" + header + ";FMT=Delimited\";"))
{
using (OleDbCommand command = new OleDbCommand(sql, connection))
{
using (OleDbDataAdapter adapter = new OleDbDataAdapter(command))
{
adapter.Fill(table);
}
connection.Close();
}
}
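With the text driver, column types can also be forced per file with a schema.ini placed in the same folder as the CSV, instead of the registry tweaks. A sketch (the file and column names are illustrative; explicit Text columns bypass the type guessing that drops mixed-type values):
[yourfile.csv]
ColNameHeader=True
Format=CSVDelimited
MaxScanRows=0
Col1=Field1 Text
Col2=Field2 Text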
For a comma-delimited file, this worked for me:
public DataTable CSVtoDataTable(string inputpath)
{
DataTable csvdt = new DataTable();
string Fulltext;
if (File.Exists(inputpath))
{
using (StreamReader sr = new StreamReader(inputpath))
{
while (!sr.EndOfStream)
{
Fulltext = sr.ReadToEnd(); // read full content
string[] rows = Fulltext.Split('\n');//split file content to get the rows
for (int i = 0; i < rows.Count() - 1; i++)
{
var regex = new Regex("\\\"(.*?)\\\"");
var output = regex.Replace(rows[i], m => m.Value.Replace(",", "\\c"));//replace commas inside quotes
string[] rowValues = output.Split(',');//split rows with comma',' to get the column values
{
if (i == 0)
{
for (int j = 0; j < rowValues.Count(); j++)
{
csvdt.Columns.Add(rowValues[j].Replace("\\c",","));//headers
}
}
else
{
try
{
DataRow dr = csvdt.NewRow();
for (int k = 0; k < rowValues.Count(); k++)
{
if (k >= dr.Table.Columns.Count) // more columns may exist
{
    csvdt.Columns.Add("clmn" + k);
    dr = csvdt.NewRow(); // note: values already copied into the old row are lost here
}
dr[k] = rowValues[k].Replace("\\c", ",");
}
csvdt.Rows.Add(dr);//add other rows
}
catch
{
Console.WriteLine("error");
}
}
}
}
}
}
}
return csvdt;
}
The main thing that would probably help is to first stop using OleDB objects for reading a delimited file. I suggest using the 'TextFieldParser' which is what I have successfully used for over 2 years now for a client.
http://www.dotnetperls.com/textfieldparser
There may be other issues, but without seeing your .CSV file, I can't tell you where your problem may lie.
The TextFieldParser is specifically designed to parse delimited files such as CSV; the OleDb objects are not. So start there, and then we can determine what the problem may be, if it persists.
If you look at an example on the link I provided, they are merely writing lines to the console. You can alter this code portion to add rows to a DataTable object, as I do, for sorting purposes.
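A minimal sketch of that alteration (csvPath stands in for your file path; assuming a comma-delimited file with a header row, and typing every column as string sidesteps the mixed-type guessing entirely):
using (TextFieldParser parser = new TextFieldParser(csvPath))
{
    parser.TextFieldType = FieldType.Delimited;
    parser.SetDelimiters(",");
    DataTable dt = new DataTable();
    // First line: one string column per header.
    foreach (string header in parser.ReadFields())
        dt.Columns.Add(header, typeof(string));
    // Remaining lines: one row each.
    while (!parser.EndOfData)
        dt.Rows.Add(parser.ReadFields());
}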
I have 2 procedures in SQL and their results should be dumped to a CSV file from C#. I'm able to get the results of one procedure, but I'm clueless as to how to go about adding the results from the second procedure.
Procedure 1: GetCSVData
Procedure 2: GetHeader
The C# code below successfully gets data into a CSV from procedure 1 ("GetCSVData").
Please help me integrate the data from procedure 2 ("GetHeader") into "GetCSVData" and write it to the same CSV file.
My C# code:
public string GetCSVData(string SId, string TotalRow)
{
    try
{
Sql v = new Sql("Block_Rpt_CSV");
v.Add("SId",SId);
v.Add("TotalRow",TotalRow);
v.Run();
DataTable dt=new DataTable();
dt.Clear();
dt.Columns.Add("a", typeof(String));
dt.Columns.Add("b", typeof(String));
dt.Columns.Add("c", typeof(String));
foreach (System.Data.DataRow item in v.Results.Rows)
{
dt.Rows.Add(item.GetString("Name"), item.GetString("Id"),
    item.GetString("Class"));
}
dt.AcceptChanges();
string csvData = DataTableToCSVFile(dt, SId, TotalRow);
System.Web.HttpResponse response = System.Web.HttpContext.Current.Response;
response.ClearContent();
response.Clear();
string filename = "CsvView.csv";
response.ContentType = "text/csv";
response.AddHeader("Content-Disposition", "attachment;filename=\"" + filename + "\";");
HttpContext.Current.Response.AddHeader("Content-Length", csvData.Length.ToString());
response.Write(csvData);
response.Flush();
response.End();
AjaxBuilder r = new AjaxBuilder();
r.Add("GetCSVData", "Success");
return r.ToString();
}
catch (Exception e)
{
AjaxBuilder r = new AjaxBuilder();
r.Add("GetCSVData", "Failed");
r.Add("Exception_Message", e.Message);
r.Add("Exception_StackTrace", e.StackTrace);
return r.ToString();
}
}
@Harvey: Sorry I missed seeing your post until now. Thanks for the code. However, I had been struggling with this before I saw yours. Can you please take a look at my code and see if I can get it to work? Otherwise I'll use your code. As of now, all 20 of my column heads are displayed in my CSV and obviously the data in each column does not match.
private string DataTableToCSVFile(DataTable dt, string SId, string TotalRow)
{
    DataTable dtExcel = dt;
StringBuilder sb = new StringBuilder();
sb.Append("CSV Data");
sb.Append("\n");
foreach (DataColumn column in dtExcel.Columns)
{
sb.Append(column.ColumnName + ",");
}
sb.Append("\n");
foreach (DataRow row in dtExcel.Rows)
{
for (int i = 0; i < dtExcel.Columns.Count; i++)
{
sb.Append(row[i].ToString() + ",");
}
sb.Append("\n");
}
return sb.ToString();
}
You can make your DataTable object global and then use that same object in both functions.
Run whichever you want first and insert its columns and records into the table. Then run the second one and insert its columns and records accordingly.
You can also use two separate DataTable objects, one per function, and then merge them after you have received results in both, as sketched below.
Also, make sure you write the actual CSV generation code at the end, after both your functions have been called.
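A rough sketch of the two-table approach (GetHeaderTable and GetCsvDataTable are hypothetical helpers standing in for your two procedure calls; DataTable.Merge matches columns by name):
DataTable header = GetHeaderTable();   // results of "GetHeader"
DataTable data = GetCsvDataTable();    // results of "GetCSVData"
header.Merge(data);                    // header now holds both result sets
string csvData = DataTableToCSVFile(header, SId, TotalRow);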
UPDATE:
Here's how you can write your DataTable values to a CSV file:
string csvPath = System.Web.Hosting.HostingEnvironment.MapPath("pathToCSV.csv");
// A using block makes sure the writer is flushed and the file handle released.
using (StreamWriter sw = new StreamWriter(csvPath, false))
{
    if (dtDataTablesList.Rows.Count > 0)
    {
        // First we will write the headers.
        int iColCount = dtDataTablesList.Columns.Count;
        for (int i = 0; i < iColCount; i++)
        {
            sw.Write(dtDataTablesList.Columns[i]);
            if (i < iColCount - 1)
            {
                sw.Write(",");
            }
        }
        sw.Write(sw.NewLine);
        // Now write all the rows.
        foreach (DataRow dr in dtDataTablesList.Rows)
        {
            for (int i = 0; i < iColCount; i++)
            {
                if (!Convert.IsDBNull(dr[i]))
                {
                    sw.Write(dr[i].ToString());
                }
                if (i < iColCount - 1)
                {
                    sw.Write(",");
                }
            }
            sw.Write(sw.NewLine);
        }
    }
}
Now, since you want to split out some columns and write them first and then the others, you will have to modify the for loop a bit, and I think that should work for you.
Hope this gives an idea of how to progress with this.
Can someone provide a link to a tutorial about exporting data to an Excel file using C# in an ASP.NET web application? I searched the internet but didn't find any tutorials that explain how to do it.
You can use Interop: http://www.c-sharpcorner.com/UploadFile/Globalking/datasettoexcel02272006232336PM/datasettoexcel.aspx
Or, if you don't want to install Microsoft Office on a webserver,
I recommend using CarlosAg.ExcelXmlWriter, which can be found here: http://www.carlosag.net/tools/excelxmlwriter/
code sample for ExcelXmlWriter:
using CarlosAg.ExcelXmlWriter;
class TestApp {
static void Main(string[] args) {
Workbook book = new Workbook();
Worksheet sheet = book.Worksheets.Add("Sample");
WorksheetRow row = sheet.Table.Rows.Add();
row.Cells.Add("Hello World");
book.Save(@"c:\test.xls");
}
}
There is an easy way using npoi.mapper, with just the two lines below:
var mapper = new Mapper();
mapper.Save("test.xlsx", objects, "newSheet");
Pass a List<T> to the method below; it converts the list to a byte buffer and returns it, and the buffer is then sent back as a file download.
List<T> resultList = new List<T>();
byte[] buffer = Write(resultList, true, "AttendenceSummary");
return File(buffer, "application/excel", reportTitle + ".xlsx");
public static byte[] Write<T>(IEnumerable<T> list, bool xlsxExtension = true, string sheetName = "ExportData")
{
if (list == null)
{
throw new ArgumentNullException("list");
}
XSSFWorkbook hssfworkbook = new XSSFWorkbook();
int Rowspersheet = 15000;
int TotalRows = list.Count();
int TotalSheets = TotalRows / Rowspersheet;
for (int i = 0; i <= TotalSheets; i++)
{
ISheet sheet1 = hssfworkbook.CreateSheet(sheetName + "_" + i);
IRow row = sheet1.CreateRow(0);
int index = 0;
foreach (PropertyInfo property in typeof(T).GetProperties())
{
ICellStyle cellStyle = hssfworkbook.CreateCellStyle();
IFont cellFont = hssfworkbook.CreateFont();
cellFont.Boldweight = (short)NPOI.SS.UserModel.FontBoldWeight.Bold;
cellStyle.SetFont(cellFont);
ICell cell = row.CreateCell(index++);
cell.CellStyle = cellStyle;
cell.SetCellValue(property.Name);
}
int rowIndex = 1;
foreach (T obj in list.Skip(Rowspersheet * i).Take(Rowspersheet))
{
row = sheet1.CreateRow(rowIndex++);
index = 0;
foreach (PropertyInfo property in typeof(T).GetProperties())
{
ICell cell = row.CreateCell(index++);
cell.SetCellValue(Convert.ToString(property.GetValue(obj)));
}
}
}
MemoryStream file = new MemoryStream();
hssfworkbook.Write(file);
return file.ToArray();
}
You can try the following links :
http://www.codeproject.com/Articles/164582/8-Solutions-to-Export-Data-to-Excel-for-ASP-NET
Export data as Excel file from ASP.NET
http://codeissue.com/issues/i14e20993075634/how-to-export-gridview-control-data-to-excel-file-using-asp-net
I've written a C# class which lets you write your DataSet, DataTable or List<> data directly into an Excel .xlsx file using the OpenXML libraries.
http://mikesknowledgebase.com/pages/CSharp/ExportToExcel.htm
It's completely free to download, and very ASP.Net friendly.
Just pass my C# function the data to be written, the name of the file you want to create, and your page's "Response" variable, and it'll create the Excel file for you, and write it straight to the Page, ready for the user to Save/Open.
// Given a class Employee and a populated list:
List<Employee> listOfEmployees = new List<Employee>();
// The following ASP.Net code gets run when I click on my "Export to Excel" button.
protected void btnExportToExcel_Click(object sender, EventArgs e)
{
// It doesn't get much easier than this...
CreateExcelFile.CreateExcelDocument(listOfEmployees, "Employees.xlsx", Response);
}
(I work for a financial company, and we'd be lost without this functionality in every one of our apps!!)
I am using Microsoft.Office.Interop.Excel to read a spreadsheet that is open in memory.
gXlWs = (Microsoft.Office.Interop.Excel.Worksheet)gXlApp.ActiveWorkbook.ActiveSheet;
int NumCols = 7;
string[] Fields = new string[NumCols];
string input = null;
int NumRow = 2;
while (Convert.ToString(((Microsoft.Office.Interop.Excel.Range)gXlWs.Cells[NumRow, 1]).Value2) != null)
{
for (int c = 1; c <= NumCols; c++)
{
Fields[c-1] = Convert.ToString(((Microsoft.Office.Interop.Excel.Range)gXlWs.Cells[NumRow, c]).Value2);
}
NumRow++;
//Do my other processing
}
I have 180,000 rows and this turns out to be very slow. I am not sure the "Convert" is efficient. Is there any way I could do this faster?
Hi, I found a much faster way.
It is better to read the entire data block in one go using get_Range. This loads the data into memory, and I can loop through it like a normal array.
Microsoft.Office.Interop.Excel.Range range = gXlWs.get_Range("A1", "F188000");
object[,] values = (object[,])range.Value2;
int NumRow = 1;
// Note: the array returned by Value2 is 1-based in both dimensions,
// so the last row is values.GetLength(0).
while (NumRow <= values.GetLength(0))
{
for (int c = 1; c <= NumCols; c++)
{
Fields[c - 1] = Convert.ToString(values[NumRow, c]);
}
NumRow++;
}
There are several options; all involve an additional library:
OpenXML 2.0 (a free library from MS) can be used to read/modify the content of an .xlsx, so you can do with it what you want (see the sketch below).
Some (commercial) 3rd-party libraries come with grid controls allowing you to do much more with Excel files in your application (be it WinForms/WPF/ASP.NET...), like SpreadsheetGear, Aspose.Cells etc.
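A rough sketch of reading cell values with the OpenXML SDK (untested; note that string cells usually store an index into the shared-string table, which needs an extra lookup):
using DocumentFormat.OpenXml.Packaging;
using DocumentFormat.OpenXml.Spreadsheet;
...
using (SpreadsheetDocument doc = SpreadsheetDocument.Open(@"c:\test.xlsx", false))
{
    WorksheetPart ws = doc.WorkbookPart.WorksheetParts.First();
    foreach (Row row in ws.Worksheet.Descendants<Row>())
    {
        foreach (Cell cell in row.Elements<Cell>())
        {
            // Raw stored value; when cell.DataType is SharedString this is an
            // index into the workbook's SharedStringTablePart, not the text itself.
            string value = cell.CellValue?.InnerText;
        }
    }
}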
I am not sure the "Convert" is efficient. Is there any way I could do this faster?
What makes you believe this? I promise you that Convert.ToString() is the most efficient call in the code you posted. Your problem is that you're looping through 180,000 records in an Excel document...
You could split the work up; since you know the number of rows, this is trivial to do.
Why are you converting Value2 to a string, exactly?
I found a really fast way to read Excel that fits my specific case: I need the data as a two-dimensional array of strings. With a really big Excel file, the old way took about one hour; this way, I get my values in about 20 seconds.
I am using this NuGet package: https://reposhub.com/dotnet/office/ExcelDataReader-ExcelDataReader.html
And here is my code:
DataSet result = null;
//https://reposhub.com/dotnet/office/ExcelDataReader-ExcelDataReader.html
using (var stream = File.Open(path, FileMode.Open, FileAccess.Read))
{
// Auto-detect format, supports:
// - Binary Excel files (2.0-2003 format; *.xls)
// - OpenXml Excel files (2007 format; *.xlsx)
using (var reader = ExcelReaderFactory.CreateReader(stream))
{
result = reader.AsDataSet();
}
}
foreach (DataTable table in result.Tables)
{
if (//my conditions)
{
continue;
}
var rows = table.AsEnumerable().ToArray();
var dataTable = new string[table.Rows.Count][];//[table.Rows[0].ItemArray.Length];
Parallel.For(0, rows.Length, new ParallelOptions { MaxDegreeOfParallelism = 8 },
i =>
{
var row = rows[i];
dataTable[i] = row.ItemArray.Select(x => x.ToString()).ToArray();
});
importedList.Add(dataTable);
}
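If you need the first row of each sheet as column headers, AsDataSet takes a configuration object (assuming the ExcelDataReader.DataSet companion package, 3.x API):
result = reader.AsDataSet(new ExcelDataSetConfiguration
{
    ConfigureDataTable = _ => new ExcelDataTableConfiguration { UseHeaderRow = true }
});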
I guess it's not the Convert that is the source of the slowdown...
Actually, retrieving cell values one at a time is what is very slow.
I think this conversion is not necessary:
(Microsoft.Office.Interop.Excel.Range)gXlWs
It should work without that.
And you can ask directly:
gXlWs.Cells[NumRow, 1].Value != null
Try to move the entire range or, at least, the entire row into an object matrix and work with it instead of the range itself.
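A minimal sketch of that idea (UsedRange avoids hard-coding the extent; the array returned by Value2 is 1-based in both dimensions):
// Pull every used cell into memory with a single Interop call.
object[,] values = (object[,])gXlWs.UsedRange.Value2;
string firstCell = Convert.ToString(values[1, 1]); // row 1, column A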
Use the OleDb method; that is the fastest, as follows:
string con =
    @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=D:\temp\test.xls;" +
    @"Extended Properties='Excel 8.0;HDR=Yes;'";
using(OleDbConnection connection = new OleDbConnection(con))
{
connection.Open();
OleDbCommand command = new OleDbCommand("select * from [Sheet1$]", connection);
using(OleDbDataReader dr = command.ExecuteReader())
{
while(dr.Read())
{
var row1Col0 = dr[0];
Console.WriteLine(row1Col0);
}
}
}
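Note that the Jet 4.0 provider only opens the legacy .xls format. For an .xlsx workbook you would need the ACE 12.0 provider instead (this assumes the Access Database Engine redistributable is installed on the machine):
string conXlsx =
    @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=D:\temp\test.xlsx;" +
    @"Extended Properties='Excel 12.0 Xml;HDR=Yes;'";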