I'm having major problems trying to clear a DataTable in C#.
At all other points when editing information in the database, I've used something like
somethingTableTableAdapter1.Update(greenParksDataSet);
greenParksDataSet.AcceptChanges();
and this has worked just fine. However, in one of my forms I have a button to clear the open table. It seems to clear the DataTable in memory, but not actually propagate the changes to the database. Essentially the code boils down to
greenParksDataSet.HistoryTable.RecieptIDColumn.AutoIncrementSeed = -1;
greenParksDataSet.HistoryTable.RecieptIDColumn.AutoIncrementSeed = 1;
greenParksDataSet.HistoryTable.Rows.Clear();
historyTableTableAdapter.Update(greenParksDataSet);
greenParksDataSet.AcceptChanges();
I've tried changing the order of these calls, but nothing seems to work. With that code on its own, the DataTable is cleared (and the change shows up in the DataGridView), but if I add a historyTableTableAdapter.Fill(greenParksDataSet.HistoryTable); after that, it becomes clear that the database itself is not being changed at all.
I've probably missed something obvious, but I've been puzzling over this for ages, and any help would be appreciated. Thanks!
Have a look at this question/answer.
Clear only works on the DataTable in memory - it does not flag the rows for deletion.
I think you are using a binding source, so try this:
greenParksDataSet.HistoryTable.Rows.Clear();
HistoryTableBindingSource.EndEdit();
historyTableTableAdapter.Update(greenParksDataSet);
greenParksDataSet.AcceptChanges();
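One thing worth noting, given that Clear does not flag rows for deletion: the TableAdapter only issues DELETE statements for rows whose RowState is Deleted. If the binding-source fix alone still doesn't push the deletes to the database, a sketch of the alternative (using the table and adapter names from the question, and assuming the generated adapter has a DeleteCommand) would be to mark each row deleted instead of clearing:
// Mark every row as deleted (iterating backwards so rows in the Added
// state, which are removed immediately, don't upset the loop), then let
// the adapter issue the DELETE statements.
for (int i = greenParksDataSet.HistoryTable.Rows.Count - 1; i >= 0; i--)
{
    greenParksDataSet.HistoryTable.Rows[i].Delete();
}
historyTableTableAdapter.Update(greenParksDataSet.HistoryTable);
greenParksDataSet.AcceptChanges();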
I'm experimenting with C#/.net/WPF all for the first time. I've created a project and set up a datasource (just a table with some sample data) and created two tableadapters named Prods and Prods1 - the latter has a filter applied in the query to return slightly different results. I've dropped both tables on my form and both dutifully display their respective data.
I thought I would then swap the data source for each. So the default generated Window_Loaded:
MSDSTest.prodtestDataSet prodtestDataSet = ((MSDSTest.prodtestDataSet)(this.FindResource("prodtestDataSet")));
// Load data into the table Prods. You can modify this code as needed.
MSDSTest.prodtestDataSetTableAdapters.ProdsTableAdapter prodtestDataSetProdsTableAdapter = new MSDSTest.prodtestDataSetTableAdapters.ProdsTableAdapter();
prodtestDataSetProdsTableAdapter.Fill(prodtestDataSet.Prods);
System.Windows.Data.CollectionViewSource prodsViewSource = ((System.Windows.Data.CollectionViewSource)(this.FindResource("prodsViewSource")));
prodsViewSource.View.MoveCurrentToFirst();
// Load data into the table Prods1. You can modify this code as needed.
MSDSTest.prodtestDataSetTableAdapters.Prods1TableAdapter prodtestDataSetProds1TableAdapter = new MSDSTest.prodtestDataSetTableAdapters.Prods1TableAdapter();
prodtestDataSetProds1TableAdapter.Fill(prodtestDataSet.Prods1);
System.Windows.Data.CollectionViewSource prods1ViewSource = ((System.Windows.Data.CollectionViewSource)(this.FindResource("prods1ViewSource")));
prods1ViewSource.View.MoveCurrentToFirst();
I now want to make the first data grid (prodsViewSource) instead display the data for the second table, and ignore the second table entirely. So, I changed that as follows:
MSDSTest.prodtestDataSet prodtestDataSet = ((MSDSTest.prodtestDataSet)(this.FindResource("prodtestDataSet")));
// Load data into the table Prods. You can modify this code as needed.
MSDSTest.prodtestDataSetTableAdapters.Prods1TableAdapter prodtestDataSetProdsTableAdapter = new MSDSTest.prodtestDataSetTableAdapters.Prods1TableAdapter();
prodtestDataSetProdsTableAdapter.Fill(prodtestDataSet.Prods1);
System.Windows.Data.CollectionViewSource prodsViewSource = ((System.Windows.Data.CollectionViewSource)(this.FindResource("prodsViewSource")));
prodsViewSource.View.MoveCurrentToFirst();
With the second block having been commented out.
I must be missing something fundamental - what I think I'm doing is redefining the prodtestDataSetProdsTableAdapter variable to use an instance of the Prods1 table adapter, and then using that to populate the prodsViewSource grid on the form, but I end up with a blank grid. Where's my error?
...
Well, I posted this after beating my head against it for an hour and, minutes later, realized the FAR easier thing to do is to just change the datacontext property of the grid in question.
I would still like to understand why doing it the vastly more complicated-bordering-on-Rube-Goldbergian way didn't work, though, so if anyone can explain that, it would still be welcome.
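The likely reason the adapter swap drew a blank (a guess from the generated code above): prodsViewSource is defined in XAML with its Source pointing at the Prods table, and that table was never filled - the modified code fills Prods1 instead, which nothing behind the first grid is looking at. Re-pointing the grid, as described above, sidesteps that. A minimal sketch of that approach, assuming the grid in question is named prodsDataGrid (that name is a guess) and reusing the prods1ViewSource setup from the original Window_Loaded:
// Leave the generated fill code for Prods1 in place and simply point the
// first grid's DataContext at the view source that wraps Prods1.
// "prodsDataGrid" is a hypothetical name for the grid in question.
prodsDataGrid.DataContext = this.FindResource("prods1ViewSource");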
I'm looking at updating stored values in a RethinkDB using the C# RethinkDB.Driver library and I'm just not getting it right.
I can achieve an update by getting the result, altering that object, then making a separate call to update with that object. The problem is that when a record receives many updates like this, its value can be changed elsewhere whilst the application is still working with its own copy of the record.
TestingObject record = r.Db("test").Table("learning").Get("c8c54346-e35f-4025-8641-7117f12ebc5b").Run(_conn);
record.fieldNameIntValue = record.fieldNameIntValue + 1;
var result = r.Db("test").Table("learning").Get("c8c54346-e35f-4025-8641-7117f12ebc5b").Update(record).Run(_conn);
I've been trying something along these lines:
var result = r.Db("test").Table("learning").Get("c8c54346-e35f-4025-8641-7117f12ebc5b").Update(row => row["fieldNameIntValue"].Add(1)).Run(_conn);
but the result errors with Inserted value must be an OBJECT (got NUMBER):101 which suggests this is only passing the field value back instead of updating the object.
Ideally I'd like to update multiple columns at once, any advice is appreciated :)
This is an example that works in the ReQL data explorer. You can chain as many filters before the update as you want. I assume this will translate to the C# driver, but I don't have any experience with that.
r.db('database').table('tablename').update({clicks: r.row("clicks").add(1)}).run().then(function(result){ ...
Thanks T Resudek, your answer and a clearer head emphasised the need to map the calculation to the property.
Looking at the javadocs for update, it has a HashMap method, which I followed with the C# library, and it works.
var result = r.Db("test").Table("learning").Get("c8c54346-e35f-4025-8641-7117f12ebc5b").Update(row => r.HashMap("fieldNameIntValue",row["fieldNameIntValue"].Add(1))).Run(_conn);
I'd be interested to know if this is the right way or whether there is a better way.
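For the multiple-columns case from the original question, the same HashMap call appears to extend by chaining .With(...) for each extra field. This is a sketch, not tested; fieldNameOtherValue is a made-up second column used only for illustration:
// Increment two fields in a single update call.
var result = r.Db("test").Table("learning")
    .Get("c8c54346-e35f-4025-8641-7117f12ebc5b")
    .Update(row => r.HashMap("fieldNameIntValue", row["fieldNameIntValue"].Add(1))
                    .With("fieldNameOtherValue", row["fieldNameOtherValue"].Add(1)))
    .Run(_conn);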
My program is writing records to a db table. So far, it has written about 58 new records to this table. All of a sudden, I get an error message saying "Row not found or changed." Which is odd, because I'm inserting a new record, not trying to find one or update an existing one. Here's the small bit of code that I'm using to create an object and then insert to the table:
// create new comment object
var comment = new Comment
{
    TableName = "Circuit",
    TableKey = circuitId,
    Text = remarks,
    CreatedOn = DateTime.Now,
    CreatedByName = "loadCC03Circuits",
    CreatedByUupic = "000000000"
};
cimsContext.Comments.InsertOnSubmit(comment);
cimsContext.SubmitChanges();
I'm not quite sure what to do at this point. Each field has a value; there are no nulls. And, as I said, 58 records have already been written out by this very same bit of code before this happens, so, other than the data being off (which, according to the field values in my debugger session, it is not), I'm not quite sure what else to check. Any advice?
EDIT: I added an answer below that made this problem go away. But, I'm not sure why this solution worked.
I found a solution, but not the "answer". The solution, in this case, was to make a variable that contained the DateTime.Now value:
var dateNow = DateTime.Now;
And I changed the affected line of code to look like this:
CreatedOn = dateNow,
Wonder of wonders, I no longer received the error. I'm not sure why this fixed the problem; I only tried it on a suggestion from a co-worker. He theorizes that the sandbox database I'm working in is sluggish and could be affecting the DateTime.Now value. Regardless, this made the issue go away. I wish I had a definitive answer, though. I hate making a problem go away when I don't understand why the solution worked.
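For completeness, this is roughly how the insert looks with the workaround applied - it's just the snippet from the question with DateTime.Now captured once up front:
// Capture the timestamp once and reuse it in the initializer.
var dateNow = DateTime.Now;

var comment = new Comment
{
    TableName = "Circuit",
    TableKey = circuitId,
    Text = remarks,
    CreatedOn = dateNow,
    CreatedByName = "loadCC03Circuits",
    CreatedByUupic = "000000000"
};
cimsContext.Comments.InsertOnSubmit(comment);
cimsContext.SubmitChanges();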
I have a function that saves an XML file and then binds it to a gridview. My problem is that the gridview is updating before the file is done saving.
So far I've been able to get the save to occur first by inserting a one-second pause; however, I realize that this is a terrible, not to mention unreliable, way of getting the desired result. My code currently looks like this:
editingFunction();
gsXML.Save(Server.MapPath("~/xmlFile.xml"));
System.Threading.Thread.Sleep(1000); // Ill-advised, I know...
XmlDataSource1.Data = gsXML.OuterXml;
XmlDataSource1.DataBind();
updatePanel1.Update();
Does anyone know a better way to ensure that the save function occurs before the binding?
EDIT: It seems that I misdiagnosed the problem. The save function was in fact executing first; however, I needed to clear XmlDataSource1.Data by setting it to null before reassigning it. Thanks to Graffito for pointing this out!
As the GridView is already bound to its source, the instruction XmlDataSource1.Data = gsXML.OuterXml on its own has no effect.
To force a new binding, remove the binding first:
XmlDataSource1.Data = null;
XmlDataSource1.Data = gsXML.OuterXml;
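Putting the question's snippet and this fix together, the sequence would look roughly like this (same control and variable names as above; the Sleep is no longer needed):
editingFunction();
gsXML.Save(Server.MapPath("~/xmlFile.xml"));

// Clear the old data so the XmlDataSource picks up the new XML,
// then rebind and refresh the UpdatePanel.
XmlDataSource1.Data = null;
XmlDataSource1.Data = gsXML.OuterXml;
XmlDataSource1.DataBind();
updatePanel1.Update();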
I'm working with a SqlDataAdapter on a Windows Form with C#. I have a BindingSource linking it to my fields with functional record traversal and saving changes back to the database.
I'd like to give users the option of updating the database with changes to the current record, not writing those made to other records but keeping them in the set of cached modifications (i.e. Save vs. Save All).
I've put together the following which works (sort of):
SqlCommand updateCurrent = new SqlCommand("UPDATE Table SET Attribute = @attribute WHERE ID = @currentRecord", sqlConnection);
updateCurrent.Parameters.AddWithValue("@currentRecord", bindingSource.GetItemProperties(null)["ID"].GetValue(bindingSource.Current));
updateCurrent.Parameters.AddWithValue("@attribute", bindingSource.GetItemProperties(null)["Attribute"].GetValue(bindingSource.Current));
updateCurrent.ExecuteNonQuery();
It works in that it updates the currently shown record (and only that record), but when the regular update function is called later on, it causes a System.Data.DBConcurrencyException (UpdateCommand affected 0 of the expected 1 records).
I think I understand why the error is happening (I've made changes to the database that now aren't reflected in the cached copy), but not how to proceed.
Is there a best practice for doing this? Is it an inherently bad idea to start out with?
All you need to do in order to achieve what you want is the following:
This command will update your database with the content of this particular row (yourDataRow).
YourTableAdapter.Update(yourDataRow);
This command will update the whole DataTable.
YourTableAdapter.Update(yourDataTable);
The DataTable will know which rows have been changed and which have already been saved.
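In the context of the question, "this particular row" would be the record currently shown by the BindingSource. A minimal sketch of how that might be wired up, assuming the BindingSource is bound to the DataTable and the generated TableAdapter exposes the usual Update(DataRow) overload:
// Commit any pending edits from the bound controls into the DataTable,
// then push just the current row back to the database.
bindingSource.EndEdit();
var currentRow = ((DataRowView)bindingSource.Current).Row;
yourTableAdapter.Update(currentRow);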
Just spitballing here after taking a look at it, but:
Problem #1
I would do it as such: if you're saving the updates as they happen, then the idea of a "Save All" is pretty much thrown out the window (useless), because saving everything is obviously inefficient when it's already up to date.
...So update one at a time OR require a SAVE ALL.
Problem #2 (actual complained about problem)
The DBConcurrencyException is not an error, it's a thrown exception (there's a difference), and the only reason it's thrown is that there were no updates left to make to the database (because you are already saving on a row basis). So why would there be an update? There wouldn't. Perhaps catching and ignoring that exception would be the best route, since you seem to be auto-saving anyway.
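A rough sketch of that suggestion, with placeholder adapter and table names; bear in mind that by default a DBConcurrencyException stops the adapter's Update at the offending row, so catch-and-ignore can leave later changes unsaved:
try
{
    // Regular full update; a row already written by the per-record save
    // can fail the concurrency check here.
    yourTableAdapter.Update(yourDataTable);
}
catch (DBConcurrencyException)
{
    // Already saved individually - nothing useful left to do for that row.
}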
The Way I Would do it (honestly):
Unless you're working with large amounts of data (let's say > 10,000 rows), I would create a "Save All" function which updates all rows that were changed (maybe use focus listeners and add changed rows to a list, or something similar, to track the changes). If you want to save each time an edit is made, like you are doing, then call that same "Save All" function, which in that case will only touch the one row. If other rows were changed, Save All to the rescue. Works either way.
Added bonus: using a cached copy is actually a dumb idea (unless your computer is a beast). Like I said, for small data it's totally fine. But let's imagine a 1,000,000-row database. Now try caching 1,000,000 rows... you're right, comparing will be faster, but loading all that unneeded data into memory is a horrible idea. Your program will crash when it has to scale.
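A bare-bones sketch of that "Save All" idea, with placeholder names for the BindingSource, typed DataSet and TableAdapter; GetChanges is used so nothing is sent when there are no modified rows:
// Push all pending inserts, updates and deletes back to the database.
// The adapter looks at each row's RowState, so unchanged rows are skipped.
public void SaveAll()
{
    bindingSource.EndEdit();                        // commit edits from bound controls
    if (yourDataSet.YourTable.GetChanges() != null) // anything actually modified?
    {
        yourTableAdapter.Update(yourDataSet.YourTable);
    }
}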