Scanning my table in C# with MySQL - c#

So I'm making a library system, and I've run into a problem with my Borrowing and Returning of books.
When I borrow a book and then return it (changing the status to Returned), borrowing that same book again succeeds, but when I try to return it the second time, my trap for already-returned books shows its message.
sql = "SELECT * FROM tbltransactionbooks WHERE fBarCodeNo LIKE '" + txtBARCODE_R.Text.Trim() + "%'";
cfgotcall.engageQuery(sql);
if (cfgotcall.tbl.Rows[0]["fStatus"].ToString().Equals("Borrowed"))
{
txtTITLE_R.Text = cfgotcall.tbl.Rows[0]["fBookTitle"].ToString();
txtAUTHOR_R.Text = cfgotcall.tbl.Rows[0]["fAuthor"].ToString();
txtYEAR_R.Text = cfgotcall.tbl.Rows[0]["fBookYr"].ToString();
txtACCNO_R.Text = cfgotcall.tbl.Rows[0]["fAccNo"].ToString();
txtCALLNO_R.Text = cfgotcall.tbl.Rows[0]["fCallNo"].ToString();
txtBARCODE_BR.Text = cfgotcall.tbl.Rows[0]["fBarcodeNo"].ToString();
txtSEARCH.Text = cfgotcall.tbl.Rows[0]["fStudent"].ToString();
txtReturnDate.Text = cfgotcall.tbl.Rows[0]["fBorrowDate"].ToString();
txtIDNO.Text = cfgotcall.tbl.Rows[0]["fIDStudent"].ToString();
txtLEVEL.Text = cfgotcall.tbl.Rows[0]["fLevel"].ToString();
}
else
{
MessageBox.Show("Book already returned.");
}
What I found while debugging is that my code doesn't scan all the rows in tbltransactionbooks; it only reads the first row of my table. The matching rows look like this:
33 123 NAME IT 2/20/2017 2/20/2017 [HISTORY OF] COMPUTERS: THE MACHINES WE THINK WITH Returned
33 123 NAME IT 2/21/2017 2/21/2017 [HISTORY OF] COMPUTERS: THE MACHINES WE THINK WITH Borrowed
How do I scan all the rows in my table? If my code above doesn't look good, I'm open to suggestions on how to make it clean. Thanks.

cfgotcall.tbl.Rows[0] refers to the first row only.
You should iterate over each row:
sql = "SELECT * FROM tbltransactionbooks WHERE fBarCodeNo LIKE '" + txtBARCODE_R.Text.Trim() + "%'";
cfgotcall.engageQuery(sql);
foreach(var row in cfgotcall.tbl.Rows)
{
if (row["fStatus"].ToString().Equals("Borrowed"))
{
txtTITLE_R.Text = row["fBookTitle"].ToString();
txtAUTHOR_R.Text = row["fAuthor"].ToString();
txtYEAR_R.Text = row["fBookYr"].ToString();
txtACCNO_R.Text = row["fAccNo"].ToString();
txtCALLNO_R.Text = row["fCallNo"].ToString();
txtBARCODE_BR.Text = row["fBarcodeNo"].ToString();
txtSEARCH.Text = row["fStudent"].ToString();
txtReturnDate.Text = row["fBorrowDate"].ToString();
txtIDNO.Text = row["fIDStudent"].ToString();
txtLEVEL.Text = row["fLevel"].ToString();
}
else
{
MessageBox.Show("Book already returned.");
}
}
To avoid run-time errors due to typos in the column names or the SQL query, you can use LINQ to SQL or Entity Framework. That way each row is converted to an object and you can access each column through an object property/field.
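Separately, the concatenated SQL above is open to SQL injection. Here is a minimal parameterized sketch, assuming MySQL Connector/NET and a plain open MySqlConnection named conn in place of the engageQuery helper:
using System.Data;
using MySql.Data.MySqlClient;

// The barcode is passed as a parameter, so quotes typed into the
// textbox cannot break out of the SQL string.
var table = new DataTable();
using (var cmd = new MySqlCommand(
    "SELECT * FROM tbltransactionbooks WHERE fBarCodeNo LIKE @barcode", conn))
{
    cmd.Parameters.AddWithValue("@barcode", txtBARCODE_R.Text.Trim() + "%");
    using (var reader = cmd.ExecuteReader())
    {
        table.Load(reader);
    }
}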

Only the first row is checked because you access only the first row:
if (cfgotcall.tbl.Rows[0]...)
To check all the rows, iterate through the cfgotcall.tbl.Rows collection using either a for or a foreach loop.
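For example, the index-based variant looks like this (a sketch against the same cfgotcall helper):
for (int i = 0; i < cfgotcall.tbl.Rows.Count; i++)
{
    DataRow row = cfgotcall.tbl.Rows[i];
    if (row["fStatus"].ToString().Equals("Borrowed"))
    {
        // found the copy that is still borrowed; fill the form fields here
        break;
    }
}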

You are only accessing the first row [0] of the table. You need to iterate through all of them using a for or foreach loop.
Side note: I guess this is homework. It will not do you any good to find a complete solution on the internet. You should rather try to understand how collections/arrays work and how to traverse them using loops.
A couple of tutorials to help you with the task:
https://msdn.microsoft.com/en-us/library/aa288462(v=vs.71).aspx
http://csharp.net-informations.com/collection/csharp-collection-tutorial.htm

Related

C#: DataTable getting only one row of the search result

I'm having a sudden and strange problem with DataTable. I'm using C# with a MySQL database to develop a system, and I'm trying to export custom reports. The problem is that, somehow, my DataTable is getting only one result (I've tested my query on MySQL and there should be something like 30 rows in the xls file and the DataTable).
Strangely, these functions are used in other parts of the system to export other kinds of reports, and work perfectly. This is the select function that I'm using:
public DataTable selectBD(String tabela, String colunas)
{
    var query = "SELECT " + colunas + " FROM " + tabela;
    var dt = new DataTable();
    Console.WriteLine("\n\n" + query + "\n\n");
    try
    {
        using (var command = new MySqlCommand(query, bdConn))
        {
            MySqlDataReader reader = command.ExecuteReader();
            dt.Load(reader);
            reader.Close();
        }
    }
    catch (MySqlException)
    {
        return null;
    }
    bdConn.Close();
    return dt;
}
And this is my query:
SELECT
cpf_cnpj, nomeCliente, agenciaContrato, contaContrato,
regionalContrato, carteiraContrato, contratoContrato,
gcpjContrato, avalistaContrato, enderecoContrato,
telefoneContrato, dataChegadaContrato, dataFatoGerContrato,
dataPrimeiraParcelaContrato, dataEmissaoContrato, valorPlanilhaDebitoContrato
FROM
precadastro
INNER JOIN
contrato
ON precadastro.cpf_cnpj = contrato.FK_cpf_cnpj
LEFT JOIN faseprocessual
ON contrato.idContrato = faseprocessual.FK_idContrato
And that is the result of the query in SQLyog.
I've tested it, and the DataTable returned by the function receives only one row, and it's not even the first row of the MySQL results. Has anyone had this kind of problem before?
DataTable.Load expects a primary key in your data (supplied by the DataReader) and tries to infer it from the rows passed in. Since there's no such key, the Load method guesses that it's the first column (cpf_cnpj). But the values in that column aren't unique, so each row gets overwritten by the next one, and the result is just one row in your DataTable.
It's an issue that has persisted for years, and I'm not sure there's one solution to rule them all. :)
You can try to:
change the query so that some unique values end up in the first column (unfortunately, I can't see anything unique in your screenshot), or concatenate two or more values to get a unique value;
prepare the DataTable yourself by creating the columns (mirroring the structure of the result set) and then iterating through the DataReader to copy the data (see the sketch at the end of this answer);
add some auto-increment value in your query (or create a temporary table with an auto_increment column and fill it).
The last suggestion could be something like this (I haven't worked much with MySQL, so this is a suggestion I googled :)):
SELECT
    @i := @i + 1 AS id,
    cpf_cnpj, nomeCliente, agenciaContrato, contaContrato,
    regionalContrato, carteiraContrato, contratoContrato,
    gcpjContrato, avalistaContrato, enderecoContrato,
    telefoneContrato, dataChegadaContrato, dataFatoGerContrato,
    dataPrimeiraParcelaContrato, dataEmissaoContrato, valorPlanilhaDebitoContrato
FROM
    precadastro
INNER JOIN
    contrato
    ON precadastro.cpf_cnpj = contrato.FK_cpf_cnpj
LEFT JOIN faseprocessual
    ON contrato.idContrato = faseprocessual.FK_idContrato
CROSS JOIN (SELECT @i := 0) AS i
Here's an answer on SO which uses an auto-number in the query.
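For the second suggestion above (building the DataTable yourself so that Load's key inference never runs), a minimal sketch, with the column list shortened and the same bdConn connection assumed:
var dt = new DataTable();
dt.Columns.Add("cpf_cnpj", typeof(string));
dt.Columns.Add("nomeCliente", typeof(string));
// ... add the remaining columns of the result set ...

using (var command = new MySqlCommand(query, bdConn))
using (var reader = command.ExecuteReader())
{
    while (reader.Read())
    {
        // Copy each column by hand; no primary key is inferred,
        // so duplicate cpf_cnpj values stay as separate rows.
        var row = dt.NewRow();
        row["cpf_cnpj"] = reader["cpf_cnpj"];
        row["nomeCliente"] = reader["nomeCliente"];
        // ...
        dt.Rows.Add(row);
    }
}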

Optimizing query that uses AsEnumerable and SingleOrDefault

Not long ago there was a feature request in the program I am maintaining. Basically it has to fill up a table in the database with info from a text file. These files can be pretty big, but it was fairly easy to do because they were defined as the complete list of user data. Therefore the table could be truncated and then just filled up again with data from the text file.
But then a week ago it was decided that these files are actually updates of current user info, so now I have to retrieve the correct MeteringPointId (which exists only once, if it exists at all) and then update its info. If it doesn't exist, just insert the data as before.
The way I do this is by retrieving the complete database table into memory, updating the info there, and finally saving the changes by calling the DataTable's Update function. It works fine, except that finding the row with the MeteringPointId is slow:
DataRow row = MeteringPointsDataTable.NewRow();
// this is called for each line in the text file to find the corresponding MeteringPointId. It can be 300.000 times.
row = MeteringPointsDataTable.AsEnumerable().SingleOrDefault(r => r.Field<string>("MeteringPointId").ToString() == MeteringPointId);
Is there a way to retrieve a DataRow from a DataTable that is faster than this?
If you are sure that only one item can fulfill the condition, use FirstOrDefault instead of SingleOrDefault. That way the query won't enumerate the whole table but will stop at the first entry it finds.
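In code, the same lookup as a sketch:
row = MeteringPointsDataTable.AsEnumerable()
    .FirstOrDefault(r => r.Field<string>("MeteringPointId") == MeteringPointId);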
You can use the Select method of DataTable:
var expression = "[MeteringPointId] = '" + MeteringPointId + "'";
DataRow[] result = MeteringPointsDataTable.Select(expression);
You can also create an expression like this:
var idList = new []{"id1", "id2", "id3", ...};
var expression = "[MeteringPointId] in " + string.Format("({0})", string.Join(",", idList.Select(i=> "'"+i+"'")));
A similar usage is shown here.
Hope it helps.
You could put the whole table in a dictionary:
//At the start:
var meteringPoints = MeteringPointsDataTable.AsEnumerable()
    .ToDictionary(r => r.Field<string>("MeteringPointId"));

//For each row of the text file:
DataRow row;
if (!meteringPoints.TryGetValue(MeteringPointId, out row))
{
    row = MeteringPointsDataTable.NewRow();
    MeteringPointsDataTable.Rows.Add(row); // NewRow only creates a detached row; attach it
    meteringPoints[MeteringPointId] = row;
}

How to check if string is in column value of returned DataRow array in C#?

I know this is very basic, but I can't seem to get it right. I have this DataTable that I will fill with processed information from my database.
After searching whether the EmpRequestID has already been added to the DataTable, I want to be able to get the value of the column named "RequestedEmp" of the returned row and check if it already contains the initials that my variable is currently holding (it is in a loop). If it does not, append the initials in the variable to the existing initials in the row.
DataRow[] MyEmpReq_Row = MyEmpRequests_DataTable.Select("EmpRequestID='" + EmpRequestID + "'");
int SameReqID = MyEmpReq_Row.Length;
if (SameReqID > 0) //REQ ID IN DT, MULTIPLE EMP 1 REQUEST
{
    //INSERT HERE
}
else //ID NOT IN DT YET
{
    MyEmpRequests_DataTable.Rows.Add(EmpRequestID, ActionBy, Requested_Initials, DateSubmitted, RequestStatus);
}
I want to be able to do something like this
string RetrievedInitials = MyEmpReq_Row["RequestedEmp"].ToString();
if (RetrievedInitials LIKE '%" + Requested_Initials + "'") // but if statements doesnt have LIKE
or this, and then know whether the column contains the value or not:
MyEmpReq_Row.Select("RequestedEmp LIKE '%" + Requested_Initials + "'");
if (RetrievedInitials.Contains(Requested_Initials))
Take a look at the string class:
http://msdn.microsoft.com/en-us/library/system.string.aspx
As already mentioned by some other posters, the Contains() method may be of use. However, you should also look at EndsWith() (which is more in line with your use of the wildcard in your LIKE query) and StartsWith().
Example:
if (RetrievedInitials.EndsWith(Requested_Initials))
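Putting the pieces together for the append case, a sketch that assumes the first matching row and a simple comma-separated initials format:
DataRow matched = MyEmpReq_Row[0];
string retrievedInitials = matched["RequestedEmp"].ToString();
if (!retrievedInitials.Contains(Requested_Initials))
{
    // Append the new initials to the ones already stored in the row.
    matched["RequestedEmp"] = retrievedInitials + "," + Requested_Initials;
}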

Store multiple ListBox selections in database?

I have a ListBox called listbox. Its "multiple selection" property is set to true. I need to store the selections from this ListBox into a database field.
Note that I am using web forms, ASP.NET, C#, and Visual Studio Web Developer 2010 Express.
My code is as follows:
SqlCommand insertNL = new SqlCommand("insert into dbo.newsletter (newsletter_subject,newsletter_body,newsletter_sentto) VALUES ('" + TextBox1.Text + "', '" + TextBox2.Text + "', '" + ListBox1.SelectedItem + "')", badersql);
badersql.Open();
insertNL.ExecuteNonQuery();
badersql.Close();
Unfortunately, this code only stores the first selected value of the ListBox in the "newsletter_sentto" column of my newsletter table. Does anyone have any suggestions as to how to fix this code? Thanks in advance.
Things to fix:
Before doing anything else, parameterize your SQL. This is ripe for injection. Tutorial here.
You aren't disposing of your command or connection. Wrap those in using blocks. Example here.
Do a foreach over the items to see which ones are selected. Either store those as a comma-separated list in the database (which needs parsing on the way back out) or store them in their own table.
You need to decide how you want to store the multiple "newsletter_sentto" values first.
Your best solution is to create a new child table with one row (and one column) per selected item, plus a foreign key back to your newsletter table (see the sketch below).
You can try to store them all together in one row with multiple columns (sentto1, sentto2, etc.), but this limits the maximum number of values you can store and causes problems when searching across multiple fields. How will you query what was sent to a particular person? WHERE sentto1=@user OR sentto2=@user... no index can be used there.
You can stuff all the values into a single row and column using a "," or ";" to separate the values, but this will cause many problems because you'll have to constantly split the string apart every time you need to get at one of the sentto values.
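For the child-table option, a sketch of what the inserts could look like; the newsletter_recipient table, its columns, and the identity column on dbo.newsletter are hypothetical, and badersql is assumed to be open:
// Insert the newsletter and capture its generated identity value.
int newsletterId;
using (var cmd = new SqlCommand(
    "INSERT INTO dbo.newsletter (newsletter_subject, newsletter_body) " +
    "VALUES (@subject, @body); SELECT SCOPE_IDENTITY();", badersql))
{
    cmd.Parameters.AddWithValue("@subject", TextBox1.Text);
    cmd.Parameters.AddWithValue("@body", TextBox2.Text);
    newsletterId = Convert.ToInt32(cmd.ExecuteScalar());
}

// One child row per selected recipient.
foreach (ListItem item in ListBox1.Items)
{
    if (!item.Selected) continue;
    using (var cmd = new SqlCommand(
        "INSERT INTO dbo.newsletter_recipient (newsletter_id, sentto) " +
        "VALUES (@id, @sentto)", badersql))
    {
        cmd.Parameters.AddWithValue("@id", newsletterId);
        cmd.Parameters.AddWithValue("@sentto", item.Text);
        cmd.ExecuteNonQuery();
    }
}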
First, you should use parameterized queries instead of straight concatenation to protect against SQL injection. Second, you need to cycle through the selected items and build (presumably) a delimited list of the selections.
// The ASP.NET ListBox exposes selection per item, so collect the selected ones first.
var selectedItems = new List<string>();
foreach (ListItem item in ListBox1.Items)
{
    if (item.Selected)
        selectedItems.Add(item.Text);
}

var sql = "INSERT INTO dbo.newsletter (newsletter_subject, newsletter_body, newsletter_sentto)"
        + " VALUES (@newsletter_subject, @newsletter_body, @newsletter_sentto)";

badersql.Open();
using (var cmd = new SqlCommand(sql, badersql))
{
    cmd.Parameters.AddWithValue("@newsletter_subject", TextBox1.Text);
    cmd.Parameters.AddWithValue("@newsletter_body", TextBox2.Text);
    cmd.Parameters.AddWithValue("@newsletter_sentto", string.Join(",", selectedItems));
    cmd.ExecuteNonQuery();
}
Try to get the selected items like this:
string newsletterSentTo = "";
foreach (ListItem item in ListBox1.Items)
{
    if (item.Selected)
        newsletterSentTo += "," + item.Text;
}

DataTable.Select and Performance Issue in C#

I'm importing the data from three tab-delimited files into DataTables, and after that I need to go through every row of the master table and find all the matching rows in the two child tables. For each DataRow[] array I find in the child tables, I have to go through each row individually and check its values against different parameters, and at the end I need to create a final record which is a merge of the master and the two child tables' columns.
Now I have done that, and it's working, but the problem is its performance. I'm using DataTable.Select to find all the child rows in the child table, which I believe is making it very slow.
Please note that none of the tables has a primary key, as duplicate rows are acceptable.
At the moment I have 1200 rows in the master table and around 8000 rows in the child table, and the total time it takes is 8 minutes.
Any idea how I can improve the performance?
Thanks in advance.
The code is below:
DataTable rawMasterdt = importMasterFile();
DataTable rawDespdt = importDescriptionFile();
dsHelper = new DataSetHelper();
DataTable distinctdt = new DataTable();
distinctdt = dsHelper.SelectDistinct("DistinctOffers", rawMasterdt, "C1");
if (distinctdt.Rows.Count > 0)
{
    int count = 0;
    foreach (DataRow offer in distinctdt.Rows)
    {
        string exp = "C1 = '" + offer[0].ToString() + "'";
        DataRow masterRow = rawMasterdt.Select(exp)[0];
        count++;
        txtBlock1.Text = "Importing Offer " + count.ToString() + " of " + distinctdt.Rows.Count.ToString();
        if (masterRow != null)
        {
            Product newProduct = new Product();
            newProduct.Code = masterRow["C4"].ToString();
            newProduct.Name = masterRow["C5"].ToString();
            // -----
            newProduct.Description = getProductDescription(offer[0].ToString(), rawDespdt);
            newProduct.Weight = getProductWeight(offer[0].ToString(), rawDespdt);
            newProduct.Price = getProductRetailPrice(offer[0].ToString(), rawDespdt);
            newProduct.UnitPrice = getProductUnitPrice(offer[0].ToString(), rawDespdt);
            // ------- more functions similar to the above here
            productList.Add(newProduct);
        }
    }
    txtBlock1.Text = "Import Completed";
}

public string getProductDescription(string offercode, DataTable dsp)
{
    string exp = "((C1 = '" + offercode + "') AND ( C6 = 'c' ))";
    DataRow[] dRows = dsp.Select(exp);
    string descrip = "";
    if (dRows.Length > 0)
    {
        for (int i = 0; i < dRows.Length - 1; i++)
        {
            descrip = descrip + " " + dRows[i]["C12"];
        }
    }
    return descrip;
}
.Net 4.5 and the issue is still there.
Here are the results of a simple benchmark comparing DataTable.Select with different dictionary implementations for CPU time (results are in milliseconds):
#Rows    Table.Select    Hashtable[]    SortedList[]    Dictionary[]
 1000           43,31           0,01            0,06            0,00
 6000          291,73           0,07            0,13            0,01
11000          604,79           0,04            0,16            0,02
16000          914,04           0,05            0,19            0,02
21000         1279,67           0,05            0,19            0,02
26000         1501,90           0,05            0,17            0,02
31000         1738,31           0,07            0,20            0,03
Problem:
The DataTable.Select method internally creates a "System.Data.Select" class instance, and this "Select" class creates indexes based on the fields (columns) specified in the query. The Select class re-uses the indexes it has created, but the DataTable implementation does not re-use the Select class instance, so the indexes are re-created every time DataTable.Select is invoked. (This behaviour can be observed by decompiling System.Data.)
Solution:
Assume the following query
DataRow[] rows = data.Select("COL1 = 'VAL1' AND (COL2 = 'VAL2' OR COL2 IS NULL)");
Instead, create and fill a Dictionary with keys corresponding to the different value combinations of the values of the columns used as the filter. (This relatively expensive operation must be done only once and the dictionary instance must then be re-used)
Dictionary<string, List<DataRow>> di = new Dictionary<string, List<DataRow>>();
foreach (DataRow dr in data.Rows)
{
string key = (dr["COL1"] == DBNull.Value ? "<NULL>" : dr["COL1"]) + "//" + (dr["COL2"] == DBNull.Value ? "<NULL>" : dr["COL2"]);
if (di.ContainsKey(key))
{
di[key].Add(dr);
}
else
{
di.Add(key, new List<DataRow>());
di[key].Add(dr);
}
}
Query the Dictionary (multiple queries may be required) to filter the rows and combine the results into a List
string key1 = "VAL1//VAL2";
string key2 = "VAL1//<NULL>";
List<DataRow> results = new List<DataRow>();
if (di.ContainsKey(key1))
{
    results.AddRange(di[key1]);
}
if (di.ContainsKey(key2))
{
    results.AddRange(di[key2]);
}
I know this is an old question, and the code underpinning this issue may have changed, but I've recently encountered (and gained some insight into) this very issue.
For anyone coming along at a later date ... here's what I found.
Performance of the DataTable.Select(condition) is quite sensitive to the nature and structure of the 'condition' you provide. This looks like a bug to me (where would I report it to Microsoft?) but it may merely be a quirk.
I've written a set of tests to demonstrate the issue, structured as follows:
Define a DataTable with a few simple columns, like this:
var dataTable = new DataTable();
var idCol = dataTable.Columns.Add("Id", typeof(Int32));
dataTable.Columns.Add("Code", typeof(string));
dataTable.Columns.Add("Name", typeof(string));
dataTable.Columns.Add("FormationDate", typeof(DateTime));
dataTable.Columns.Add("Income", typeof(Decimal));
dataTable.Columns.Add("ChildCount", typeof(Int32));
dataTable.Columns.Add("Foreign", typeof(Boolean));
dataTable.PrimaryKey = new DataColumn[1] { idCol };
Populate the table with 40000 records, each with a unique 'Code' field.
Perform a batch of 'selects' (each with different parameters) against the datatable using two similar, but differently formatted, queries and record and compare the total time taken by each of the two formats.
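A sketch of what such a harness can look like (the population and timing code below is a reconstruction, not the original test):
// Populate 40000 rows, each with a unique Code value.
for (int i = 0; i < 40000; i++)
{
    dataTable.Rows.Add(i, "C" + i, "Name" + i, DateTime.Today, 0m, 0, false);
}

// Time a batch of lookups for one condition format.
var sw = System.Diagnostics.Stopwatch.StartNew();
for (int i = 0; i < 320; i++)
{
    string code = "C" + (i * 100); // a code known to exist
    dataTable.Select("[Code] = '" + code + "'"); // vs. "([Code] = '" + code + "')"
}
sw.Stop();
Console.WriteLine("Total: " + sw.ElapsedMilliseconds + " ms");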
You get remarkable results. Testing, for example, the below two conditions side-by-side:
Q1: [Code] = 'XX'
Q2: ([Code] = 'XX')
[ I do multiple Select calls using the above two queries, each iteration I replace the XX with a valid code that exists in the datatable ]
The result?
Time comparison for 320 lookups against 40000 records: 180 ms total search time with no brackets versus 6871 ms total search time WITH brackets.
Yes - 38 times slower if you just have the extra brackets surrounding the condition.
There are other scenarios which react differently.
For example,
[Code] = '{searchCode}' OR 1=0 vs ([Code] = '{searchCode}' OR 1=0) take similar (slow) times to execute, but:
[Code] = '{searchCode}' AND 1=1 vs ([Code] = '{searchCode}' AND 1=1) again shows the non-bracketed version to be close to 40 times faster.
I've not investigated all scenarios, but it seems that the introduction of brackets - either redundantly around a simple comparison check, or as required to specify sub-expression precedence - or the presence of an 'OR' slows the query down considerably.
I could speculate that the issue is caused by how the datatable parses the condition you use and how it creates and uses internal indexes ... but I won't.
You can speed it up a lot by using a dictionary. For example:
if (distinctdt.Rows.Count > 0)
{
    // Build an index of C1 values to speed up the inner loop.
    Dictionary<string, DataRow> masterIndex = new Dictionary<string, DataRow>();
    foreach (DataRow row in rawMasterdt.Rows)
        masterIndex[row["C1"].ToString()] = row;

    int count = 0;
    foreach (DataRow offer in distinctdt.Rows)
    {
Then in place of
string exp = "C1 = " + "'" + offer[0].ToString() + "'" + "";
DataRow masterRow = rawMasterdt.Select(exp)[0];
You would do this
DataRow masterRow;
if (masterIndex.ContainsKey(offer[0].ToString()))
    masterRow = masterIndex[offer[0].ToString()];
else
    masterRow = null;
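Equivalently, with a single dictionary probe instead of two:
DataRow masterRow;
masterIndex.TryGetValue(offer[0].ToString(), out masterRow);
// masterRow is null when the key is absent, matching the logic above.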
If you create a DataRelation between your parent and child DataTables, you can look up child rows by invoking DataRow.GetChildRows(DataRelation) on the parent row (or the generated child-row accessors in the case of typed DataSets). The lookup uses a tree-map index, so performance should be fine even with a lot of child rows.
In case you have to search for rows based on criteria other than a DataRelation's foreign keys, I recommend using DataView.Sort / DataView.FindRows() instead of DataTable.Select() as soon as you have to query the data more than once. DataView.FindRows() is based on a tree-map lookup (O(log n)), whereas DataTable.Select() has to scan all rows (O(n)). This article contains more details: http://arnosoftwaredev.blogspot.com/2011/02/when-datatableselect-is-slow-use.html
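A sketch of both lookups against the asker's tables (the relation name is arbitrary; passing false for createConstraints avoids foreign-key enforcement on the child table):
// Put both tables in one DataSet and relate them on C1.
var ds = new DataSet();
ds.Tables.Add(rawMasterdt);
ds.Tables.Add(rawDespdt);
DataRelation rel = ds.Relations.Add("MasterToDesp",
    rawMasterdt.Columns["C1"], rawDespdt.Columns["C1"], false);

foreach (DataRow master in rawMasterdt.Rows)
{
    DataRow[] children = master.GetChildRows(rel); // indexed lookup per parent
}

// Alternatively, a sorted DataView for repeated lookups on any column:
var view = new DataView(rawDespdt) { Sort = "C1" };
DataRowView[] hits = view.FindRows("someOfferCode"); // O(log n) binary search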
DataTables can be made to have relationships with other DataTables in a DataSet. See http://msdn.microsoft.com/en-us/library/ay82azad%28VS.71%29.aspx for a bit of discussion and as a starting point for browsing. I don't have much experience using them, but as I understand it they will do what you want (assuming your tables are in a suitable format). I would assume these are more efficient than a manual process doing the same, but I may be wrong. It might be worth seeing if they work for you and benchmarking whether they are an improvement.
Have you run it through a profiler? That should be the first step. Anyhow, this might help:
Read the master text file into memory line by line. Put each master record into a dictionary, keyed on its id, and add it to the dataset (one pass through the master file).
Read the child text file line by line, and add each line as a value to the appropriate master record in the dictionary created above.
Now you have everything in the dictionary in memory, having done only one pass through each file.
Do a final pass through the dictionary/children, process each column, and perform the final calcs (see the sketch below).
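A minimal sketch of that shape; the file names, the tab-delimited layout, and the key being the first field are all assumptions:
// Map: key -> master line followed by its child lines.
var masters = new Dictionary<string, List<string>>();

// Pass 1: index the master records by key.
foreach (string line in System.IO.File.ReadLines("master.txt"))
{
    string key = line.Split('\t')[0];
    masters[key] = new List<string> { line }; // element 0 is the master line
}

// Pass 2: attach each child line to its master.
foreach (string line in System.IO.File.ReadLines("child.txt"))
{
    string key = line.Split('\t')[0];
    List<string> entry;
    if (masters.TryGetValue(key, out entry))
        entry.Add(line); // elements 1..n are the child lines
}

// Final pass: merge each master with its children into the output record.
foreach (var pair in masters)
{
    string masterLine = pair.Value[0];
    // pair.Value[1..] are the child lines; build the merged record here.
}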
