Explanation
Typing in my ComboBox adds items from the database that begin with the given string (a LIKE 'text%' match). That is too much data, so I'm having to limit the query or the program freezes. The limit prevents further updates to my ComboBox, and any attempt to clear the items on each text-changed event causes a crash with the error: Attempted to read or write protected memory. This is often an indication that other memory is corrupt. In addition to the crash, I need the limit removed, but doing so floods the ComboBox with too much data, e.g. the 10,000+ items.
Text Update Event
private void PartDescription_TextUpdate(object sender, EventArgs e)
{
    // Fires on every keystroke and runs a prefix search against the database.
    OdbcLink.ConnectODBC(this, $"SELECT DISTINCT PM_PRT, PM_DES, PM_NET " +
        $"FROM PFWF0022.PARTMAST WHERE PM_PRT LIKE '{partNumberInput.Text}%' LIMIT 10");
}
Yes, I realize this query is susceptible to SQL injection, but I need to work out the kinks first; that fix will be coming afterward if I'm able to find a solution.
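For reference, a parameterized version of the same query might look like the sketch below (ODBC uses positional ? markers; the connectionString variable is an assumption, since I don't know what OdbcLink.ConnectODBC does internally):

using System.Data.Odbc;

// Sketch: same prefix search, but with the user's text passed as a parameter
// instead of being concatenated into the SQL string.
using (var conn = new OdbcConnection(connectionString))
using (var cmd = new OdbcCommand(
    "SELECT DISTINCT PM_PRT, PM_DES, PM_NET " +
    "FROM PFWF0022.PARTMAST WHERE PM_PRT LIKE ? LIMIT 10", conn))
{
    cmd.Parameters.AddWithValue("@prefix", partNumberInput.Text + "%");
    conn.Open();
    using (OdbcDataReader reader = cmd.ExecuteReader())
    {
        // ... same reader loop as in "Data Handling From Query" below
    }
}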
Data Handling From Query
if (queryString.Contains("WHERE PM_PRT LIKE"))
{
    //DEBUG: This line of code causes the memory issue, but it was my best attempt
    //at trying to get the data to update as the queries came in.
    form.partNumberInput.Items.Clear();

    while (reader.Read())
    {
        //INFO: This is my ComboBox and this adds to it for each item
        //returned from the database.
        form.partNumberInput.Items.Add(reader[0].ToString());
    }
}
Expected Solution
I'd like to be able to poll my database queries dynamically without overloading the ComboBox suggestions that are generated. The suggestions MUST be pulled from the database, as items are continually changing minute to minute, and loading everything at application start isn't an option since it would take quite some time to return that data.
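Something like the following is the shape of what I'm after (a sketch, not tested: it assumes a System.Windows.Forms.Timer used as a debounce, a hypothetical RunQuery helper that returns the matching part numbers, and MainForm as a stand-in for my form class):

// Sketch: debounce keystrokes so the database is only hit after the user
// pauses, then swap the suggestion list in one batch.
private readonly Timer debounce = new Timer { Interval = 300 };

public MainForm()
{
    InitializeComponent();
    debounce.Tick += (s, e) =>
    {
        debounce.Stop();
        List<string> matches = RunQuery(partNumberInput.Text); // still LIMITed server-side
        partNumberInput.BeginUpdate();   // suppress repaints while items are swapped
        partNumberInput.Items.Clear();
        partNumberInput.Items.AddRange(matches.ToArray());
        partNumberInput.EndUpdate();
    };
}

private void PartDescription_TextUpdate(object sender, EventArgs e)
{
    debounce.Stop();    // restart the debounce window on every keystroke
    debounce.Start();
}

Keeping the server-side LIMIT while the prefix narrows on each keystroke should keep the list small without ever loading all 10,000+ rows.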
Related
I have a C# front end with a MySQL 8 back end for my company's database. I have a query that returns one month's production material consumption by way of material inventory and project process tracking. I have no problem with the query itself, only with the time spent on the query (actually many queries).
This is the setup of my code, run by clicking a label.
private void lblQry_Click(object sender, EventArgs e)
{
    DateTime startDT = DateTime.Now;
    DataTable qryResults = new DataTable();
    datagridviewResult.DataSource = qryResults;

    // Set up qryResults as projects vs. materials consumed and the associated
    // cost. (I will skip the actual code here, but it consists of several
    // foreach loops with many queries to assemble the consumption and cost
    // data per project...)

    DateTime endDT = DateTime.Now;
    TimeSpan ts = endDT - startDT;
    // TotalSeconds, not Seconds: Seconds only reports the 0-59 component.
    MessageBox.Show("Query time: " + ts.TotalSeconds.ToString());
}
I have run this code many times and the results were checked and verified. But it takes a long time to finish, and the time is much longer than what the tracking in my code reports: the popup message usually shows about 20-30 seconds, but the actual time until the data appeared in the DataGridView after I clicked the label is over 10 minutes.
Can someone give me some ideas as to where the problem is? Why are these two times off by so much? Thanks a lot.
Don't assign a DataTable to a DataGridView's DataSource and then start filling the table; you could trigger a huge number of updates and refreshes of the DataGridView. If your DataTable contained a thousand items and you managed to trigger a refresh of the grid after every item, it would refresh itself a thousand times.
Fill your table and then assign it to the DataSource
If you absolutely must make this assignment first, call BeginLoadData/EndLoadData, which should turn off a lot of the bookkeeping for the duration of the load. If you're still finding that the table loads in a few seconds but it takes the form 10 minutes to finish rendering its updates, then we'll probably need to see more code to work out what is taking up the time.
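A minimal sketch of the fill-then-bind pattern, reusing the names from the question:

private void lblQry_Click(object sender, EventArgs e)
{
    DateTime startDT = DateTime.Now;

    DataTable qryResults = new DataTable();
    qryResults.BeginLoadData();   // suspend index maintenance, constraints, events
    // ... run the queries and fill qryResults here ...
    qryResults.EndLoadData();

    // Bind once, after the table is complete, so the grid renders a single time.
    datagridviewResult.DataSource = qryResults;

    TimeSpan ts = DateTime.Now - startDT;
    MessageBox.Show("Query time: " + ts.TotalSeconds + " s");
}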
I have a TreeView, and I am using a DataReader to populate it with data from specific columns in the database, specifically "consultationDate" and "consultationNotes".
The thing is, as soon as it gets around to executing the code that is meant to add the notes to the TreeView, it tells me that I have an IndexOutOfRangeException. The columns concerned do have data in them.
The code is below.
Any ideas as to why I am getting that error?
while (treeNodeReader.Read())
{
    childNode = parentNode.Nodes.Add("Note Details: " + treeNodeReader["consultationNotes"].ToString());
}
(For ease of readability, because the issue is the same for both columns, I've included only one column's code here.)
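An IndexOutOfRangeException from a DataReader's string indexer typically means the named column isn't in the result set at all, so one check worth trying is a sketch like this (same reader and nodes as above):

// Sketch: resolve the column ordinal once, up front. If "consultationNotes"
// isn't part of the query's SELECT list, GetOrdinal throws
// IndexOutOfRangeException immediately, confirming the query (not the data)
// is the problem.
int notesOrdinal = treeNodeReader.GetOrdinal("consultationNotes");

while (treeNodeReader.Read())
{
    string notes = treeNodeReader.IsDBNull(notesOrdinal)
        ? "(no notes)"
        : treeNodeReader.GetString(notesOrdinal);
    childNode = parentNode.Nodes.Add("Note Details: " + notes);
}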
I'm working with a SqlDataAdapter on a Windows Form with C#. I have a BindingSource linking it to my fields with functional record traversal and saving changes back to the database.
I'd like to give users the option of updating the database with changes to the current record, not writing those made to other records but keeping them in the set of cached modifications (i.e. Save vs. Save All).
I've put together the following which works (sort of):
SqlCommand updateCurrent = new SqlCommand(
    "UPDATE Table SET Attribute = @attribute WHERE ID = @currentRecord", sqlConnection);
updateCurrent.Parameters.AddWithValue("@currentRecord",
    bindingSource.GetItemProperties(null)["ID"].GetValue(bindingSource.Current));
updateCurrent.Parameters.AddWithValue("@attribute",
    bindingSource.GetItemProperties(null)["Attribute"].GetValue(bindingSource.Current));
updateCurrent.ExecuteNonQuery();
It works in that it updates the currently shown record (and only that record), but when the regular update function is called later on, it causes a System.Data.DBConcurrencyException (UpdateCommand affected 0 of the expected 1 records).
I think I understand why the error is happening (I've made changes to the database that now aren't reflected in the cached copy), but not how to proceed.
Is there a best practice for doing this? Is it an inherently bad idea to start out with?
All you need to do in order to achieve what you want is the following:
This command will update your database with the content of this particular row (yourDataRow).
YourTableAdapter.Update(yourDataRow);
This command will update the whole DataTable.
YourTableAdapter.Update(yourDataTable);
The DataTable knows which rows have been modified and which have already been saved.
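Tying that back to the question's BindingSource, getting hold of the current row might look like this sketch (typed TableAdapters expose an Update(DataRow) overload; yourTableAdapter and yourDataTable are placeholders):

// Sketch: when bound to a DataTable, the BindingSource's Current item is a
// DataRowView, so the underlying DataRow is one cast away.
DataRow currentRow = ((DataRowView)bindingSource.Current).Row;

// "Save": persist only the current record.
yourTableAdapter.Update(currentRow);

// "Save All": persist every pending change in the table.
yourTableAdapter.Update(yourDataTable);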
Just spitballing here after taking a look at it. But:
Problem #1
I would put it this way: if you're saving updates as they happen, then the idea of a "Save All" is pretty much thrown out the window (useless), because saving all is obviously redundant when everything is already up to date.
...So update one at a time OR require a SAVE ALL.
Problem #2 (actual complained about problem)
The DBConcurrencyException is not an error, it's a thrown exception (there's a difference), and the only reason it's thrown is that there were no updates left to make to the database (because you are already saving on a row basis). So why would there be an update? There wouldn't. Perhaps an empty try/catch would be the simplest route, since you seem to be auto-saving pretty much everything.
The Way I Would do it (honestly):
Unless you're working with large amounts of data (let's say > 10,000 rows), I would create a "Save All" function which updates all rows that were changed (maybe use focus listeners and add changed rows to a list, or something similar, to track the changes). If you want to save each time an edit is made, like you are doing, then use that same "Save All" function, which in that case covers just the one row. If other rows were changed too, Save All to the rescue. It works either way.
Added bonus: using a cached copy is actually a dumb idea (unless your computer is a beast). Like I said, for small data it's totally fine. But let's imagine a 1,000,000-row database. Now try caching 1,000,000 rows... you're right that comparing will be faster, but loading all that unneeded data into memory is a horrible idea. Your program will crash when scaling.
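A sketch of that change-tracking "Save All", leaning on the row states a DataTable already maintains instead of a hand-rolled list (adapter and table names are placeholders):

// Sketch: DataTable tracks which rows are Added/Modified/Deleted on its own,
// so "Save All" only needs to push the delta through the adapter.
public void SaveAll(SqlDataAdapter adapter, DataTable table)
{
    if (table.GetChanges() == null)
        return;   // nothing dirty, nothing to do

    // Update only issues INSERT/UPDATE/DELETE for rows whose RowState is
    // Added, Modified, or Deleted; clean rows cost nothing.
    adapter.Update(table);
}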
I'm new to n-tier enterprise development. I got quite a tutorial just reading through the 'questions that may already have your answer', but didn't find what I was looking for.
I'm doing a genealogy site that starts off with the first guy that came over on the boat; you click on his name and the grid gets populated with all his children, then you click on one of his kids that has kids and the grid gets populated with that child's kids, and so forth. Each record has an ID and a ParentID. When you choose any given person, the ID is stored and then used in a search for all records matching that ParentID, which returns all the kids.
The data is never changed (at least by the user), so I want to do just one database access, fill all fields into one DataTable, and then requery it each time to get the records to display. In the DAL I put all the records into a List, which the ObjectDataSource's function that fills the GridView just returns. What I want to do is requery the DataTable, fill the List back up with the new query, and display that in the GridView. My code is in 3 files here
(I can't get the backticks to show my code in this window.) All I need is to figure out how to run a new query against the existing DataTable and copy the result to a new DataTable. Hope this explains it well enough.
[edit: It would be easier to just do a new query against the database each time, and it would be less resource-intensive (in the future, if the database gets too large) than storing everything in memory, but I just want to know if I can do it this way - that is, working from one copy of the entire table.] Any ideas...
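For what it's worth, the in-memory requery itself can be sketched like this, assuming the ID/ParentID columns described above (allPeople, selectedId, and gridView are placeholder names):

// Sketch: filter the cached table by ParentID and copy the matches into a
// fresh DataTable for binding. CopyToDataTable needs .NET 3.5's
// System.Data.DataSetExtensions and throws on an empty set, hence the guard.
DataRow[] kids = allPeople.Select("ParentID = " + selectedId);

DataTable children = kids.Length > 0
    ? kids.CopyToDataTable()    // same schema, only the matching rows
    : allPeople.Clone();        // empty table with the same schema

gridView.DataSource = children;
gridView.DataBind();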
Your data represents a tree structure by nature.
A grid to display it may not be my first choice...
Querying all data in one query can be done by using a complex SP.
But you are already considering performance. That's always a good thing to keep in mind when coming up with a design. Still, creating something, improving it, and only then starting to optimize seems a better way to go.
Since relational databases are not really good at hierarchical data, consider a NoSQL (graph) database. As you mentioned, there are almost no writes to the DB, and that is where NoSQL shines.
I'm trying to make sense of a situation I have using Entity Framework on .NET 3.5 SP1 + MySQL 6.1.2.0 as the provider. It involves the following code:
Response.Write("Products: " + plist.Count() + "<br />");
var total = 0;
foreach (var p in plist)
{
//... some actions
total++;
//... other actions
}
Response.Write("Total Products Checked: " + total + "<br />");
Basically the total products count varies on each run, and it doesn't match the full total in plist. It varies widely, from roughly one fifth to half.
There isn't any control-flow code inside the foreach, i.e. no break, continue, try/catch, or conditions around total++; nothing that could affect the count. As confirmation, there are other totals captured inside the loop related to the actions, and those match the lower and higher total runs.
I can't find any reason for the above, other than something in Entity Framework or the MySQL provider that causes it to end the foreach early when retrieving an item.
The body of the foreach can vary a good deal in duration, as the actions involve file & network access. My best guess at this point is that when the .NET code takes beyond a certain threshold, some type of timeout fires in the underlying framework/provider, and instead of causing an exception it silently reports that there are no more items to enumerate.
Can anyone give some light in the above scenario and/or confirm if the entity framework/mysql provider has the above behavior?
Update 1: I can't reproduce the behavior by using Thread.Sleep in a simple foreach in a test project; not sure where else to look for this weird behavior :(.
Update 2: in the example above, .Count() always returns the same, correct number of items. Using ToList or ToArray as suggested gets around the issue as expected (there are no flow-control statements in the foreach body), and both counts match and don't vary between runs.
What I'm interested in is what causes this behavior in Entity Framework + MySQL. I would really prefer not to change the code in every project that uses Entity Framework + MySQL to call .ToArray before enumerating results, because I don't know when it will swallow some results. Or if I do, I'd at least like to know what happened and why.
If the problem is related to the provider or whatever, then you can solve/identify that by realising the enumerable before you iterate over it:
var realisedList = plist.ToArray();
foreach(var p in realisedList)
{
//as per your example
}
If, after doing this, the problem still persists then
a) One of the actions in the enumerator is causing an exception that is getting swallowed somewhere
b) The underlying data really is different every time.
UPDATE: (as per your comment)
[deleted - multiple enumerations stuff as per your comment]
At the end of the day, I'd put the ToArray() call in to have the problem fixed in this case (if the Count() method is required to get a total, just change it to .Length on the array that's constructed).
Perhaps MySql is killing the connection while you're enumerating, and doesn't throw an error to EF when the next MoveNext() is called. EF then just dutifully responds by saying that the enumerable is simply finished. If so, until such a bug in the provider is fixed, the ToArray() is the way forward.
I think you actually hit on the answer in your question, but it may be the data that is causing the problem, not the timeout. Here is the theory:
One (or several) row(s) in the result set contain some data that causes an exception or problem; when the system hits that row, it thinks it has reached the last row.
To test this you could try:
Ordering the data and seeing if the number returned in the foreach statement is the same each time.
Selecting only the id column and seeing if the problem goes away.
Removing all rows from the table, then adding them back a few at a time, to see if a specific row is causing the problem.
If it is a timeout problem, have you tried changing the timeout in the connection string?
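For Connector/NET the relevant setting should be the command timeout, which can be raised in the connection string itself; a sketch (server and credential values are placeholders):

// Sketch: "Default Command Timeout" is in seconds; 0 means wait indefinitely.
var connectionString =
    "Server=myserver;Database=mydb;Uid=myuser;Pwd=mypass;" +
    "Default Command Timeout=300;";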
I believe it has to do with the way EF handles lazy loading. You might have to use either Load() or Include(), and also check the IsLoaded property within your processing loop. Check out these two links for more information:
http://www.singingeels.com/Articles/Entity_Framework_and_Lazy_Loading.aspx
http://blogs.msdn.com/jkowalski/archive/2008/05/12/transparent-lazy-loading-for-entity-framework-part-1.aspx
I apologize that I don't know more about EF and can't be more specific. Hopefully the links will provide enough info to get you started, and others can chime in on any questions you might have.
The issue, cause, and workaround are described exactly in this MySQL bug.
As suspected, it is a timeout-related error in the provider, but it's not the regular timeout, i.e. net_write_timeout. That's why the simple reproduction in a test project didn't work: the timeout relates to all the cycles of the foreach, not just a particularly long body between the reads of two rows.
As of now, the issue is present in the latest version of the MySQL provider and, under normal conditions, would only affect scenarios where rows are being read over a connection kept open for a long time (which may or may not involve a slow query). This is great, because it doesn't affect all of the previous projects where I have used MySQL, and applying the workaround to the sources also means it doesn't fail silently.
P.S. A couple of what seem to be related MySQL bugs: 1, 2