I need some suggestions on the code below, in which I am writing some data to a file and, at the same time, updating records in the database.
foreach (DataRow dr in dtCSVdata.Rows)
{
    StringBuilder sbData = new StringBuilder();

    // The line below updates the record in the database
    SQLHelper.MoveOtherQuotesToWaitingArivalOfBags(quoteHeaderId, Config.WSUserName, refCode);

    foreach (object obj in dr.ItemArray)
    {
        if (sbData.Length == 0)
        {
            sbData.Append("\"");
            sbData.Append(RemoveSpecialCharacters(obj.ToString()));
            sbData.Append("\"");
        }
        else
        {
            sbData.Append(",");
            sbData.Append("\"");
            sbData.Append(RemoveSpecialCharacters(obj.ToString()));
            sbData.Append("\"");
        }
    }

    // Write to file
    writer.WriteLine(sbData.ToString());
}
Sometimes a timeout exception occurs while updating the records in the database. When that happens, control passes to the catch block and no data is written to the file.
Suppose there are 100 records and the loop has updated 50 of them when the exception occurs: nothing has been written to the file for those 50 records, so it becomes very difficult to track which records have been updated.
Can you please help me with this? If something goes wrong, I want the data processed before the failure to already be in the file.
Just move the database update so that it runs after the data has been written to the file:
foreach (DataRow dr in dtCSVdata.Rows)
{
    StringBuilder sbData = new StringBuilder();

    // The database update that used to be here is moved below the file write
    // SQLHelper.MoveOtherQuotesToWaitingArivalOfBags(quoteHeaderId, Config.WSUserName, refCode);

    foreach (object obj in dr.ItemArray)
    {
        if (sbData.Length == 0)
        {
            sbData.Append("\"");
            sbData.Append(RemoveSpecialCharacters(obj.ToString()));
            sbData.Append("\"");
        }
        else
        {
            sbData.Append(",");
            sbData.Append("\"");
            sbData.Append(RemoveSpecialCharacters(obj.ToString()));
            sbData.Append("\"");
        }
    }

    // Write to file first
    writer.WriteLine(sbData.ToString());
    writer.Flush(); // flush so the line is on disk (or wrap the writer in a using block, as sketched below)

    // ...after writing, update the database table.
    SQLHelper.MoveOtherQuotesToWaitingArivalOfBags(quoteHeaderId, Config.WSUserName, refCode);
}
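If you take the using route mentioned in the comment above, it would look roughly like this. This is only a sketch built from the names in the question: BuildCsvLine is a hypothetical helper standing in for the inner quoting loop, and csvFilePath is a placeholder.
using (StreamWriter writer = new StreamWriter(csvFilePath))
{
    foreach (DataRow dr in dtCSVdata.Rows)
    {
        StringBuilder sbData = BuildCsvLine(dr); // same quoting logic as the inner loop above
        writer.WriteLine(sbData.ToString());
        writer.Flush(); // the line is on disk before the database call below can time out

        SQLHelper.MoveOtherQuotesToWaitingArivalOfBags(quoteHeaderId, Config.WSUserName, refCode);
    }
} // the writer is flushed and closed here even if an exception escapes the loop
That way, whatever record the database call fails on, every previous line is already safely in the file, so you know exactly where to resume.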
I am working with a CSV file and a DataGridView in a C# project for an inventory app, and I am trying to update a row in the CSV file.
When the user edits a row I need to replace the current word with a new word, but my problem is that I also need to keep both the current word and the new word and track a running total. In pseudo code:
foreach (DataGridViewRow row in dataGridView1.Rows)
{
if(row in column is modified)
update specific row with comma to current file and load it...
}
The CSV file looks like this.
Current:
1;2;;4;5
Updated:
1;2,A;;4;5 (changed device A, total: 1 time...)
Next row modified:
1;A;;4,B,C;5 (changed devices B and C, total changes: 2 times...)
With a database it would be easy to update the data, but I don't have SQL Server installed, so I don't think that option is available to me.
My goal is to track devices going out/in, so if you have a solution please share it.
Short of using a SQL server, maybe something like LiteDB could help? You'd have LiteDB host your data and export it to CSV whenever you need. Working with CSV files usually means you re-write the whole file every time there is an update to make, which is slow and cumbersome. I recommend using CSV to transport data from point A to point B, but not to maintain data.
Also, if you really want to stick with CSV, have a look at the Microsoft ACE OLEDB driver, previously known as the JET driver. I use it to query CSV files, but I have never used it to update them... so your mileage may vary. A rough sketch of such a query follows.
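As an untested sketch, querying a CSV through the ACE OLEDB driver looks roughly like this; the folder name and file name are placeholders, and the driver has to be installed separately:
using System.Data;
using System.Data.OleDb;

// The Data Source is the FOLDER; the file name goes in the FROM clause.
// Note: semicolon-delimited files like the one above need a schema.ini in the
// same folder with "Format=Delimited(;)", since the driver assumes commas.
string connStr = @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\MyCsvFolder;" +
                 @"Extended Properties=""text;HDR=Yes;FMT=Delimited""";

using (var conn = new OleDbConnection(connStr))
using (var cmd = new OleDbCommand("SELECT * FROM [devices.csv]", conn))
{
    conn.Open();
    var table = new DataTable();
    table.Load(cmd.ExecuteReader());
    // table now holds the CSV rows; as noted above, updating through this
    // driver is untried here, so treat it as read-only.
}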
Short of using an actual database or a database driver, you'll have to use a StreamReader along with a StreamWriter: read the file with the StreamReader and write the new file with the StreamWriter. This implies you'll have code around the StreamReader to find the correct line(s) to update, as in the sketch below.
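A rough sketch of that read-then-rewrite approach; the file names, the field position being matched, and the replacement value are assumptions based on the sample rows above:
using System.IO;

// Read the existing file line by line, write an updated copy, then swap the files.
using (var reader = new StreamReader("devices.csv"))
using (var writer = new StreamWriter("devices.tmp"))
{
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        string[] fields = line.Split(';');
        // Hypothetical condition: find the row whose second field is the device being edited.
        if (fields.Length > 1 && fields[1] == "2")
        {
            fields[1] = "2,A"; // keep the old value and append the new one, as in the example
        }
        writer.WriteLine(string.Join(";", fields));
    }
}
File.Delete("devices.csv");
File.Move("devices.tmp", "devices.csv");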
Here's the class I created and am using to interact with LiteDB. It's not all that robust, but it did exactly what I needed it to do at the time. I had to make changes to a slew of products hosted on my platform, and I used this to keep track of the progress.
using System;
using LiteDB;
namespace FixProductsProperty
{
public enum ListAction
{
Add = 0,
Remove,
Update,
Disable,
Enable
}
class DbInteractions
{
public static readonly string dbFilename = "MyDatabaseName.db";
public static readonly string dbItemsTableName = "MyTableName";
public void ToDataBase(ListAction incomingAction, TrackingDbEntry dbEntry = null)
{
if (dbEntry == null)
{
Exception ex = new Exception("dbEntry can not be null");
throw ex;
}
// Open the database (or create it if it does not exist)
using (var db = new LiteDatabase(dbFilename))
{
var backupListInDB = db.GetCollection<TrackingDbEntry>(dbItemsTableName);
// override the action if needed
if (incomingAction == ListAction.Add)
{
var existingEntry = backupListInDB.FindOne(p => p.ProductID == dbEntry.ProductID);
if (existingEntry != null)
{
//the record already exists
incomingAction = ListAction.Update;
//IOException ex = new IOException("Err: Duplicate. " + dbEntry.ProductID + " is already in the database.");
//throw ex;
}
else
{
//the record does not already exist
incomingAction = ListAction.Add;
}
}
switch (incomingAction)
{
case ListAction.Add:
backupListInDB.Insert(dbEntry);
break;
case ListAction.Remove:
//backupListInDB.Delete(p => p.FileOrFolderPath == backupItem.FileOrFolderPath);
if (dbEntry.ProductID != 0)
{
backupListInDB.Delete(dbEntry.ProductID);
}
break;
case ListAction.Update:
if (dbEntry.ProductID != 0)
{
backupListInDB.Update(dbEntry.ProductID, dbEntry);
}
break;
case ListAction.Disable:
break;
case ListAction.Enable:
break;
default:
break;
}
backupListInDB.EnsureIndex(p => p.ProductID);
// Use Linq to query documents
//var results = backupListInDB.Find(x => x.Name.StartsWith("Jo"));
}
}
}
}
I use it like this:
DbInteractions yeah = new DbInteractions();
yeah.ToDataBase(ListAction.Add, new TrackingDbEntry { ProductID = dataBoundItem.ProductID, StoreID = dataBoundItem.StoreID, ChangeStatus = true });
Sorry... my variable naming convention sometimes blows...
I'm fairly new to C# and this has me stumped. My project is using DataTables and TableAdapters to connect to a SQL Server database. I have a method that opens Excel, builds a DataRow and then passes that to the method below which adds it to my DataTable (cdtJETS) via the TableAdapter (ctaJETS).
public bool AddJETSRecord(DataRow JETSDataRow)
{
bool bolException = false;
cdtJETS.BeginLoadData();
// Add the data row to the table
try
{
cdtJETS.ImportRow(JETSDataRow);
}
catch (Exception e)
{
// Log an exception
bolException = true;
Console.WriteLine(e.Message);
}
cdtJETS.EndLoadData();
// If there were no errors and no exceptions, then accept the changes
if (!cdtJETS.HasErrors && !bolException)
{
ctaJETS.Update(cdtJETS);
return true;
}
else
return false;
}
The above works fine and the records show up in SQL Server as expected. I have another method that grabs a subset of the records in that DataTable and outputs them to another Excel file (this is a batch process that will collect records over time using the above method and then occasionally output them, so I can't directly move the data from the first Excel file to the second). After the second Excel file is updated I want to delete the records from the table so that they aren't duplicated the next time the method is run. This is where I'm having the issue:
public bool DeleteJETSRecords(DataTable JETSData)
{
int intCounter = 0;
DataRow drTarget;
// Parse all of the rows in the JETS Data that is to be deleted
foreach (DataRow drCurrent in JETSData.Rows)
{
// Search the database data table for the current row's OutputID
drTarget = cdtJETS.Rows.Find(drCurrent["OutputID"]);
// If the row is found, then delete it and increment the counter
if (drTarget != null)
{
cdtJETS.Rows.Remove(drTarget);
intCounter++;
}
}
// Continue if all of the rows were found and removed
if (JETSData.Rows.Count == intCounter && !cdtJETS.HasErrors)
{
cdtJETS.AcceptChanges();
try
{
ctaJETS.Update(cdtJETS);
}
catch (Exception)
{
throw;
}
return true;
}
else
cdtJETS.RejectChanges();
return false;
}
As I step through the method I can see the rows being removed from the DataTable (i.e. if JETSData has 10 rows, at the end cdtJETS has n-10 rows) and no exceptions are thrown, but after I AcceptChanges and Update the TableAdapter, the underlying records are still in my SQL Server table. What am I missing?
The Rows.Remove method is equivalent to calling the row's Delete method followed by the row's AcceptChanges method.
As with DataTable.AcceptChanges, that marks the rows as having no pending changes, so the subsequent TableAdapter Update finds no deletes to send to the database. This is not what you want.
The following should work:
public bool DeleteJETSRecords(DataTable JETSData)
{
int intCounter = 0;
DataRow drTarget;
// Parse all of the rows in the JETS Data that is to be deleted
foreach (DataRow drCurrent in JETSData.Rows)
{
// Search the database data table for the current row's OutputID
drTarget = cdtJETS.Rows.Find(drCurrent["OutputID"]);
// If the row is found, then delete it and increment the counter
if (drTarget != null)
{
drTarget.Delete();
intCounter++;
}
}
// Continue if all of the rows were found and removed
if (JETSData.Rows.Count == intCounter && !cdtJETS.HasErrors)
{
// You have to call Update *before* AcceptChanges:
ctaJETS.Update(cdtJETS);
cdtJETS.AcceptChanges();
return true;
}
cdtJETS.RejectChanges();
return false;
}
I make use of SqlBulkCopy to insert a large number of entries into our logging database.
The layout of the program is:
It receives a stream of data from the network (other servers), then parses the stream and builds up Log objects (200 - 400 per second). I then add each log to a SQL DataTable object.
I then increment a counter. Once I have 10,000 logs I do the SqlBulkCopy insert.
Now the issue I am having is that if one of the rows fails SQL validation (i.e. one field is too long, etc.), I lose all the remaining logs.
Is there a way to call validate on the DataTable for each Log item I add to it? That way I could skip the invalid ones and keep all the valid ones safe.
Currently I am inserting one item at a time, and if it fails I ignore it and carry on with the next. But this obviously defeats the point and the performance benefits of SqlBulkCopy.
Some Code:
private DataTable _logTable;
public void AddLog(Log log)
{
if (log.serverId != null || log.serverId > 1)
{
try
{
_logTable.Rows.Add(log.logId, log.messageId, log.serverId, log.time, log.direction, log.hasRouting,
log.selfRouting, log.deviceType, log.unitId, log.accountCode, log.clientId, log.data);
if (_logBufferCounter++ > BufferValue)
{
_logBufferCounter = 0;
using (var sbc = new SqlBulkCopy(_connectionString, SqlBulkCopyOptions.TableLock))
{
sbc.DestinationTableName = "dbo.Logs";
sbc.BulkCopyTimeout = 0;
sbc.WriteToServer(_logTable);
_logTable.Clear();
sbc.Close();
}
}
}
catch (Exception e)
{
Log.Error("Failed to write bulk insert for LOG Table", e);
_logTable.Clear();
}
}
else
{
Log.Error("Server Id is null for LOG: " + LogToString(log));
}
}
No, there is not.
But you, as the programmer, can do the validation before inserting. Not exactly that hard, you know. And there is no need to have a heavy DataTable at all: use normal objects and feed them into the SqlBulkCopy instance through your own implementation of the IDataReader interface ;)
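For example, a length check before the row ever reaches the DataTable keeps one oversized field from poisoning the whole batch. This is only a sketch: the limits, the IsValidLog helper, and the assumption that accountCode and data are strings are all made up here; substitute whatever the Logs table actually defines.
// Hypothetical column limits, taken from wherever the Logs table is defined.
private const int MaxAccountCodeLength = 50;
private const int MaxDataLength = 4000;

private static bool IsValidLog(Log log)
{
    if (log.serverId == null || log.serverId < 1) return false;
    if (log.accountCode != null && log.accountCode.Length > MaxAccountCodeLength) return false;
    if (log.data != null && log.data.Length > MaxDataLength) return false;
    return true;
}

// At the top of AddLog, skip (and log) anything that would fail on the server:
if (!IsValidLog(log))
{
    Log.Error("Skipping invalid log entry: " + LogToString(log));
    return;
}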
I have a method that queries a table for the count of its records. QA has discovered an "edge case" where, if a particular operation is canceled in a particular order and speed (as fast as possible), the GUI "forgets" about the rest of the records in that table (the contents of the tables are uploaded to a server; when each one finishes, the corresponding table is deleted).
To be clear, the table that is having records deleted from it and then being queried for its count ("workTables") is a table of table names, which are deleted after they are processed.
What I have determined (I'm pretty sure) is that this anomaly occurs when a record from the workTables table is in the process of being deleted at the moment workTables is queried for its record count. This causes an exception, which causes the method to return -1, which in our case tells the GUI not to display those records.
Is there a way to check whether a table is in the process of having a record deleted from it, and wait until that operation has completed before proceeding with the query, so that it won't throw an exception?
For those interested in the specifics, this method is the one that, under those peculiar circumstances, throws an exception:
public int isValidTable(string tableName)
{
int validTable = -1;
string tblQuery = "SELECT COUNT(*) FROM ";
tblQuery += tableName;
openConnectionIfPossibleAndNecessary();
try
{
SqlCeCommand cmd = objCon.CreateCommand();
cmd.CommandText = tblQuery;
object objcnt = cmd.ExecuteScalar();
validTable = Int32.Parse(objcnt.ToString());
}
catch (Exception ex)
{
validTable = -1;
}
return validTable;
}
...and this is the method that deletes a record from the "workTables" table after the corresponding table has had its contents uploaded:
private void DropTablesAndDeleteFromTables(string recordType, string fileName)
{
try
{
WorkFiles wrkFile = new WorkFiles();
int tableOK = 0;
DataSet workfiles;
tableOK = wrkFile.isValidWorkTable(); // -1 == "has no records"
if (tableOK > 0) //Table has at least one record
{
workfiles = wrkFile.getAllRecords();
//Go thru dataset and find filename to clean up after
foreach (DataRow row in workfiles.Tables[0].Rows)
{
. . .
dynSQL = string.Format("DELETE FROM workTables WHERE filetype = '{0}' and Name = '{1}'", tmpType, tmpStr);
dbconn = DBConnection.GetInstance();
dbconn.DBCommand(dynSQL, false);
populateListBoxWithWorkTableData();
return;
} // foreach (DataRow row in workfiles.Tables[0].Rows)
}
}
catch (Exception ex)
{
SSCS.ExceptionHandler(ex, "frmCentral.DropTablesAndDeleteFromTables");
}
}
// method called by DropTablesAndDeleteFromTables() above
public int isValidWorkTable() //reverted to old way to accommodate old version of DBConnection
{
// Pass the buck
return dbconn.isValidTable("workTables");
}
I know this code is very funky and klunky and kludgy; refactoring it to make more sense and be more easily understood is a long and ongoing process.
UPDATE
I'm not able to test this code:
lock (this)
{
// drop the table
}
...yet, because the handheld is no longer allowing me to copy files to it (I get "Cannot copy [filename].[dll,exe]: The device has either stopped responding or has been disconnected", even though it is connected, as shown by ActiveSync).
If that doesn't work, I might have to try this:
// global var
bool InDropTablesMethod;
// before querying that database from elsewhere:
while (InDropTablesMethod)
{
Pause(500);
}
UPDATE 2
I've finally been able to test my lock code (copies of binaries were present in memory, not allowing me to overwrite them; the StartUp folder had a *.lnk to the .exe, so every time I started the handheld, it tried to run the buggy versions of the .exe), but it doesn't work - I still get the same conflict/contention.
UPDATE 3
What seems to work, as kludgy as it may be, is:
public class CCRUtils
{
public static bool InDropTablesMethod;
. . .
if (CCRUtils.InDropTablesMethod) return;
CCRUtils.InDropTablesMethod = true;
. . . // do it all; can you believe somebody from El Cerrito has never heard of CCR?
CCRUtils.InDropTablesMethod = false;
UPDATE 4
Wrote too soon - the bug is back. I added this MessageBox.Show(), and do indeed see the text "proof of code re-entrancy" at run-time.
while (HHSUtils.InDropTablesMethod)
{
MessageBox.Show("proof of code re-entrancy");
i++;
if (i > 1000000) return;
}
try
{
HHSUtils.InDropTablesMethod = true;
. . .
}
HHSUtils.InDropTablesMethod = false;
...so my guess that code re-entrancy may be a problem is correct...
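For reference, a lock taken on a single shared object (rather than on this) by both code paths is the variant of the earlier lock attempt that actually serializes the delete and the count. A rough sketch only: the TableGuard class is hypothetical, and the guarded statements stand in for the existing delete and count code.
public static class TableGuard
{
    // One lock object shared by every code path that touches workTables.
    public static readonly object WorkTablesLock = new object();
}

// In DropTablesAndDeleteFromTables, around the DELETE:
lock (TableGuard.WorkTablesLock)
{
    dbconn.DBCommand(dynSQL, false);
}

// In isValidTable, around the COUNT(*):
lock (TableGuard.WorkTablesLock)
{
    object objcnt = cmd.ExecuteScalar();
    validTable = Int32.Parse(objcnt.ToString());
}
Unlike a plain bool flag, a lock cannot be seen "half set" by the other code path, so the count either runs before the delete starts or after it finishes.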
I know this question exists, because it's mine and I put up 500 bounty points on it:
Exporting C# report to Excel when there are more than 5K lines
The answer got me over the hump (to some degree) but we're sort of at the point where we just accept that abnormally large datasets just can't be exported via our ASP front end, so we ship those requests off to our SQL Server DBs, who then run the appropriate stored procedures and copy/paste to Excel spreadsheets.
My question here is: can someone definitively answer whether or not it's absolutely impossible to export a large dataset to an Excel spreadsheet via an ASP front end? Once a particular report hits about 8K records, it just can't seem to be done. I'm just trying to determine whether any other potential tweak can be made, or if that much data is simply more than ASP can handle.
Well... since I've streamed gigabytes of data directly from ASP.NET, I'm pretty sure you're doing something wrong. Try to isolate the problem first - is it in putting the data into the session, is it request / response limits, is it request timeouts? Figure out where the problem is, and then go ahead and solve it! :)
In general terms, there's no reason why you should put the data in a DataSet first. Instead, use a SqlDataReader and write the data to output in chunks. This way you'll avoid having the whole data set in memory; the same way, you can just directly write to the output stream, without buffering the generated HTML in memory. Why do you keep data in Session? Wouldn't it be better to just hold the parameters necessary to retrieve it from the DB as needed, using the DataReader?
If you're having trouble with timeouts, periodic calls to Flush help. This also helps reduce the memory footprint on the ASP.NET side.
Saving the output data to a file on the server first also helps, and it allows you to wire up partial file downloads too - just make sure you actually have enough space on the drive.
EDIT:
Ok, so you've got an SqlCommand. Instead of using it in a SqlDataAdapter, you can do something like this (cmd being your SqlCommand instance):
HtmlTextWriter wr = new HtmlTextWriter(Response.Output);
using (var rdr = cmd.ExecuteReader())
{
    int index = 0;
    wr.WriteBeginTag("table");
    wr.Write(HtmlTextWriter.TagRightChar); // WriteBeginTag does not emit the closing '>'
    wr.WriteLine("<tr><td>Column 1</td><td>Column 2</td></tr>");
    while (rdr.Read())
    {
        wr.WriteBeginTag("tr");
        wr.Write(HtmlTextWriter.TagRightChar);
        wr.WriteBeginTag("td");
        wr.Write(HtmlTextWriter.TagRightChar);
        wr.Write(rdr["Column1"]);
        wr.WriteEndTag("td");
        wr.WriteBeginTag("td");
        wr.Write(HtmlTextWriter.TagRightChar);
        wr.Write(rdr["Column2"]);
        wr.WriteEndTag("td");
        wr.WriteEndTag("tr");
        if (index++ % 1000 == 0) Response.Flush();
    }
    wr.WriteEndTag("table");
}
I have not tested it, so it might need some tweaking, but the idea should be pretty obvious.
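One more thing worth checking: if the goal is for the browser to hand the result to Excel rather than render it as a page, set the content type and disposition before writing anything. This is a general ASP.NET pattern rather than something from the question; the file name is just an example, and newer versions of Excel will warn that the content is really HTML.
Response.Clear();
Response.ContentType = "application/vnd.ms-excel";
Response.AddHeader("Content-Disposition", "attachment; filename=report.xls");
// ...stream the table as above, calling Response.Flush() periodically...
Response.End();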
It is possible to do this. I have actually just finished some code specifically to do it, as part of a reporting project I am working on where we have in excess of 20K records that need to be pulled back and exported into Excel.
I will pull out the code and stick it on github for you to look at.
I am actually using NPOI's Excel processing package, and with my custom code I am able to process any List of classes dynamically into a DataSet and then dump it into the worksheets.
I need to tidy up the code for you but I should have something ready for you this evening.
This code will work for both desktop and web apps.
To give you an idea, my code has been able to process a dataset of over 30K rows relatively quickly. I have to resolve an issue with datasets over the 65,536-row limit first, before it is ready for you.
The nice thing with this solution is that it doesn't rely on Excel being installed on the machine hosting the solution.
EDIT
I have loaded a project onto github here:
https://github.com/JellyMaster/ExcelHelper
but here is the main bit that does all the excel processing:
public static MemoryStream CreateExcelSheet(DataSet dataToProcess)
{
MemoryStream stream = new MemoryStream();
if (dataToProcess != null)
{
var excelworkbook = new HSSFWorkbook();
foreach (DataTable table in dataToProcess.Tables)
{
var worksheet = excelworkbook.CreateSheet();
var headerRow = worksheet.CreateRow(0);
foreach (DataColumn column in table.Columns)
{
headerRow.CreateCell(table.Columns.IndexOf(column)).SetCellValue(column.ColumnName);
}
//freeze top panel.
worksheet.CreateFreezePane(0, 1, 0, 1);
int rowNumber = 1;
foreach (DataRow row in table.Rows)
{
var sheetRow = worksheet.CreateRow(rowNumber++);
foreach (DataColumn column in table.Columns)
{
sheetRow.CreateCell(table.Columns.IndexOf(column)).SetCellValue(row[column].ToString());
}
}
}
excelworkbook.Write(stream);
}
return stream;
}
public static DataSet CreateDataSetFromExcel(Stream streamToProcess, string fileExtentison = "xlsx")
{
DataSet model = new DataSet();
if (streamToProcess != null)
{
if (fileExtentison == "xlsx")
{
XSSFWorkbook workbook = new XSSFWorkbook(streamToProcess);
model = ProcessXLSX(workbook);
}
else
{
HSSFWorkbook workbook = new HSSFWorkbook(streamToProcess);
model = ProcessXLSX(workbook);
}
}
return model;
}
private static DataSet ProcessXLSX(HSSFWorkbook workbook)
{
DataSet model = new DataSet();
for (int index = 0; index < workbook.NumberOfSheets; index++)
{
ISheet sheet = workbook.GetSheetAt(index);
if (sheet != null)
{
DataTable table = GenerateTableData(sheet);
model.Tables.Add(table);
}
}
return model;
}
private static DataTable GenerateTableData(ISheet sheet)
{
DataTable table = new DataTable(sheet.SheetName);
for (int rowIndex = 0; rowIndex <= sheet.LastRowNum; rowIndex++)
{
//we will assume the first row contains the column names
IRow row = sheet.GetRow(rowIndex);
//a completely empty row of data so break out of the process.
if (row == null)
{
break;
}
if (rowIndex == 0)
{
for (int cellIndex = 0; cellIndex < row.LastCellNum; cellIndex++)
{
ICell headerCell = row.GetCell(cellIndex);
string value = headerCell == null ? string.Empty : headerCell.ToString();
if (string.IsNullOrEmpty(value))
{
break;
}
else
{
table.Columns.Add(new DataColumn(value));
}
}
}
else
{
//get the data and add to the collection
//now we know the number of columns to iterate through lets get the data and fill up the table.
DataRow datarow = table.NewRow();
object[] objectArray = new object[table.Columns.Count];
for (int columnIndex = 0; columnIndex < table.Columns.Count; columnIndex++)
{
try
{
ICell cell = row.GetCell(columnIndex);
if (cell != null)
{
objectArray[columnIndex] = cell.ToString();
}
else
{
objectArray[columnIndex] = string.Empty;
}
}
catch (Exception error)
{
Debug.WriteLine(error.Message);
Debug.WriteLine("Column Index" + columnIndex);
Debug.WriteLine("Row Index" + row.RowNum);
}
}
datarow.ItemArray = objectArray;
table.Rows.Add(datarow);
}
}
return table;
}
private static DataSet ProcessXLSX(XSSFWorkbook workbook)
{
DataSet model = new DataSet();
for (int index = 0; index < workbook.NumberOfSheets; index++)
{
ISheet sheet = workbook.GetSheetAt(index);
if (sheet != null)
{
DataTable table = GenerateTableData(sheet);
model.Tables.Add(table);
}
}
return model;
}
}
This does require the NPOI nuget package to be installed in your project.
Any questions give me a shout. The github project does a bit more but this is enough to get you going hopefully.
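A quick usage sketch for the helper above, assuming you already have a populated DataSet; GetReportData and the output path are placeholders I've made up:
// Build the workbook and save it to disk (HSSFWorkbook produces .xls).
DataSet report = GetReportData(); // stand-in for however you build the DataSet
using (MemoryStream stream = CreateExcelSheet(report))
{
    File.WriteAllBytes(@"C:\temp\report.xls", stream.ToArray());
}

// In a web app you could instead stream it straight to the response:
// Response.ContentType = "application/vnd.ms-excel";
// Response.AddHeader("Content-Disposition", "attachment; filename=report.xls");
// Response.BinaryWrite(stream.ToArray());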