SQLite Select statement returns nothing - C#

Fairly simple problem: I have a table in a database that contains one column and fourteen rows.
When trying to return all of the rows in the table, I use the following:
command = new SQLiteCommand("SELECT Value FROM Currency", connection);
Yet when I look at the number of records affected (and the lack of elements in my array), apparently there's nothing in there. I've even checked with two separate tools that confirm the data is in the table. Am I executing this incorrectly? I simply want to iterate over the values returned and store them in an array.
Thanks for your time!
Edit:
Solution!
int i = 0;
while (dReader.Read())
{
    _data[i] = Convert.ToSingle(dReader[0]);
    i++;
}
This works fine :)

SqlDataReader.RecordsAffected is not set for SELECT queries. From the documentation:
Gets the number of rows changed, inserted, or deleted by execution of the Transact-SQL statement.
EDIT:
while (dReader.Read())
{
    Console.WriteLine("Value " + dReader[0]);
}

Shouldn't you be doing a
while (dReader.Read())
{
    ....
}
to loop through your result set?
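Putting the pieces together, a minimal sketch of that approach (not from the original thread; it assumes the connection is already open and uses a List<float> so the number of rows doesn't need to be known up front):
var values = new List<float>();
using (var command = new SQLiteCommand("SELECT Value FROM Currency", connection))
using (var dReader = command.ExecuteReader())
{
    // Read one row at a time; column 0 is the Value column.
    while (dReader.Read())
    {
        values.Add(Convert.ToSingle(dReader[0]));
    }
}
float[] _data = values.ToArray();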

Related

Insert and ignore duplications

I have a list of rows that I want to insert in one batch (add X rows, single call to SaveChanges). Unfortunately, from time to time, some of the items in the list already exist. Since the insertion takes place in a transaction (all or nothing), nothing gets added.
Code to show the idea:
using (var context = new CacheDbContext())
{
    context.Counter.Add(new Counter
    {
        Id = "test-3",
        CounterType = "test",
        Expiry = DateTime.UtcNow.AddHours(1),
        Value = 0
    });
    context.Counter.Add(new Counter
    {
        Id = "test-2",
        CounterType = "test",
        Expiry = DateTime.UtcNow.AddHours(1),
        Value = 0
    });
    await context.SaveChangesAsync().ConfigureAwait(false);
}
My goal is to do the insert and, if one or more items already exist, ignore them.
The naive solution is to check whether the ID exists before inserting it. That would work, but it has poor performance; I want to execute multiple inserts with one call.
I know that it is possible with SQL like this:
INSERT INTO table_name(c1)
VALUES(c1)
ON DUPLICATE KEY UPDATE c1 = VALUES(c1) + 1;
If EF could translate my INSERT statement into something like this, that would work for me.
Is this possible?
Any other solution will be welcome.
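One possible workaround, shown only as a sketch (it is not from the original question): fetch the IDs that already exist in a single query, then add only the missing rows before calling SaveChangesAsync. Counter and CacheDbContext are the types from the snippet above; ToListAsync assumes EF Core's Microsoft.EntityFrameworkCore extensions (System.Data.Entity for EF6).
var candidates = new List<Counter>
{
    new Counter { Id = "test-3", CounterType = "test", Expiry = DateTime.UtcNow.AddHours(1), Value = 0 },
    new Counter { Id = "test-2", CounterType = "test", Expiry = DateTime.UtcNow.AddHours(1), Value = 0 }
};
using (var context = new CacheDbContext())
{
    var ids = candidates.Select(c => c.Id).ToList();
    // One round trip to find which IDs are already present.
    var existing = await context.Counter
        .Where(c => ids.Contains(c.Id))
        .Select(c => c.Id)
        .ToListAsync()
        .ConfigureAwait(false);
    // Add only the rows that are not in the table yet.
    context.Counter.AddRange(candidates.Where(c => !existing.Contains(c.Id)));
    await context.SaveChangesAsync().ConfigureAwait(false);
}
Note that this is not safe against a concurrent writer inserting the same keys between the check and the save; a database-side INSERT ... ON DUPLICATE KEY / INSERT IGNORE statement remains the more robust option.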

C#: DataTable getting only one row of the search result

I'm having a sudden and strange problem with DataTable. I'm using C# with a MySQL database to develop a system, and I'm trying to export custom reports. The problem is that, somehow, my DataTable is getting only one result (I've tested my query on MySQL and there should be something like 30 results in the xls file and the DataTable).
Strangely, these functions are used in other parts of the system to export other kinds of reports, and work perfectly. This is the select function that I'm using:
public DataTable selectBD(String tabela, String colunas)
{
    var query = "SELECT " + colunas + " FROM " + tabela;
    var dt = new DataTable();
    Console.WriteLine("\n\n" + query + "\n\n");
    try
    {
        using (var command = new MySqlCommand(query, bdConn))
        {
            MySqlDataReader reader = command.ExecuteReader();
            dt.Load(reader);
            reader.Close();
        }
    }
    catch (MySqlException)
    {
        return null;
    }
    bdConn.Close();
    return dt;
}
And this is my query:
SELECT
cpf_cnpj, nomeCliente, agenciaContrato, contaContrato,
regionalContrato, carteiraContrato, contratoContrato,
gcpjContrato, avalistaContrato, enderecoContrato,
telefoneContrato, dataChegadaContrato, dataFatoGerContrato,
dataPrimeiraParcelaContrato, dataEmissaoContrato, valorPlanilhaDebitoContrato
FROM
precadastro
INNER JOIN
contrato
ON precadastro.cpf_cnpj = contrato.FK_cpf_cnpj
LEFT JOIN faseprocessual
ON contrato.idContrato = faseprocessual.FK_idContrato
And that is the result of the query on SQLyog
I've tested it, and the DataTable returned by the function receives only one row, and it's not even the first row of the MySQL results. Has anyone had this kind of problem before?
DataTable.Load expects a primary key in your data (supplied by the DataReader) and tries to guess it from the passed rows. Since there's no such key, the Load method guesses it's the first column (cpf_cnpj). But the values in that column aren't unique, so each row gets overwritten by the next one, and the result is just one row in your DataTable.
It's an issue that has persisted for years, and I'm not sure there's one solution to rule them all. :)
You can try:
Change the query so that some unique value ends up in the first column (unfortunately, I can't see anything unique in your screenshot), or concatenate two or more values to get a unique value.
Prepare the DataTable yourself by creating the columns (mirroring the structure of the result set) and then iterate through the DataReader to copy the data (see the sketch at the end of this answer).
Add some auto-increment value in your query (or make a temporary table with an auto_increment column and then fill that table).
The last suggestion could be something like this (I haven't worked much with MySQL, so this is a suggestion I googled :)):
SELECT
@i := @i + 1 AS id,
cpf_cnpj, nomeCliente, agenciaContrato, contaContrato,
regionalContrato, carteiraContrato, contratoContrato,
gcpjContrato, avalistaContrato, enderecoContrato,
telefoneContrato, dataChegadaContrato, dataFatoGerContrato,
dataPrimeiraParcelaContrato, dataEmissaoContrato, valorPlanilhaDebitoContrato
FROM
precadastro
INNER JOIN
contrato
ON precadastro.cpf_cnpj = contrato.FK_cpf_cnpj
LEFT JOIN faseprocessual
ON contrato.idContrato = faseprocessual.FK_idContrato
CROSS JOIN (SELECT @i := 0) AS i
Here's an answer on SO which uses an auto-number in the query.
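The second suggestion could look roughly like this sketch (it reuses the query string and the bdConn connection from the question; since no primary key is inferred, no rows get collapsed):
var dt = new DataTable();
using (var command = new MySqlCommand(query, bdConn))
using (var reader = command.ExecuteReader())
{
    // Create one column per field in the result set.
    for (int c = 0; c < reader.FieldCount; c++)
        dt.Columns.Add(reader.GetName(c), reader.GetFieldType(c));
    // Copy every row from the reader instead of letting Load guess a key.
    while (reader.Read())
    {
        var row = dt.NewRow();
        for (int c = 0; c < reader.FieldCount; c++)
            row[c] = reader[c];
        dt.Rows.Add(row);
    }
}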

Optimizing query that uses AsEnumerable and SingleOrDefault

Not long ago there was a feature request in the program I am maintaining. Basically it has to fill up a table in the database with info from a text file. These files can be pretty big, but it was fairly easy to do because these files were defined as the complete list of user data. Therefore the table could be truncated and then just filled up again with data from the text file.
But then a week ago it was decided that these files are actually updates of the current user info, so now I have to retrieve the correct MeteringPointId (which only exists once, if it exists at all) and then update the info on it. If it doesn't exist, just insert the data as before.
The way I do this is to retrieve the complete table from the database into memory and then update that data before finally saving the changes by calling the DataTable's update function. It works fine, except that finding the row with the MeteringPointId is slow:
DataRow row = MeteringPointsDataTable.NewRow();
// this is called for each line in the text file to find the corresponding MeteringPointId. It can be 300.000 times.
row = MeteringPointsDataTable.AsEnumerable().SingleOrDefault(r => r.Field<string>("MeteringPointId").ToString() == MeteringPointId);
Is there a way to retrieve a DataRow from a DataTable that is faster than this?
If you are sure that only one item can fulfil the condition, use FirstOrDefault instead of SingleOrDefault. That way you won't scan the whole table but stop at the first entry found.
You can use the Select method of the DataTable:
var expression = "[MeteringPointId] = '" + MeteringPointId + "'";
DataRow[] result = MeteringPointsDataTable.Select(expression);
You can also create an expression like:
var idList = new []{"id1", "id2", "id3", ...};
var expression = "[MeteringPointId] in " + string.Format("({0})", string.Join(",", idList.Select(i=> "'"+i+"'")));
Similar usage is here
Hope it helps..
You could put the whole table in a dictionary:
//At the start
var meteringPoints = MeteringPointsDataTable.AsEnumerable().ToDictionary(r => r.Field<string>("MeteringPointId").ToString());
//For each row of the text file:
DataRow row;
if (!meteringPoints.TryGetValue(MeteringPointId, out row))
{
    row = MeteringPointsDataTable.NewRow();
    meteringPoints[MeteringPointId] = row;
}
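Note that a row created with NewRow() is detached: it still needs its MeteringPointId field set and has to be added to the table with MeteringPointsDataTable.Rows.Add(row) before the update logic and the final save will pick it up.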

check if values are in datatable

I have an array of strings:
private static string[] dataNames = new string[] {"value1", "value2".... };
I have a table in my SQL database with a column of varchar type. I want to check which values from the array of strings exist in that column.
I tried this:
public static void testProducts()
{
    string query = "select * from my table";
    var dataTable = from row in dt.AsEnumerable()
                    where String.Equals(row.Field<string>("columnName"), dataNames[0], StringComparison.OrdinalIgnoreCase)
                    select new
                    {
                        Name = row.Field<string>("columnName")
                    };
    foreach (var oneName in dataTable)
    {
        Console.WriteLine(oneName.Name);
    }
}
That code is not the actual code; I am just trying to show you the important part.
As you can see, that code checks against dataNames[index].
It works fine, but I have to run it 56 times because the array has 56 elements, changing the index each time.
Is there a faster way?
The comparison is case-insensitive.
First, you should not filter records in memory but in the database.
But if you already have a DataTable and you need to find rows where one of its fields is in your string[], you can use Linq-To-DataTable.
For example, with Enumerable.Contains:
var matchingRows = dt.AsEnumerable()
    .Where(row => dataNames.Contains(row.Field<string>("columnName"), StringComparer.OrdinalIgnoreCase));
foreach (DataRow row in matchingRows)
    Console.WriteLine(row.Field<string>("columnName"));
Here is a more efficient (but less readable) approach using Enumerable.Join: the join hashes the names once, so each row is matched in constant time instead of scanning the whole array per row.
var matchingRows = dt.AsEnumerable().Join(dataNames,
    row => row.Field<string>("columnName"),
    name => name,
    (row, name) => row,
    StringComparer.OrdinalIgnoreCase);
Try using Contains; it should return all the values you need:
var data = from row in dt.AsEnumerable()
           where dataNames.Contains(row.Field<string>("columnName"))
           select new
           {
               Name = row.Field<string>("columnName")
           };
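Note that, unlike the answers above, this Contains call uses the default string comparer, so the match is case-sensitive; pass StringComparer.OrdinalIgnoreCase as the second argument if you need the case-insensitive behaviour the question asks for.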
Passing a list of values is surprisingly difficult. Passing a table-valued parameter requires creating a T-SQL data type on the server. You can pass an XML document containing the parameters and decode that using SQL Server's convoluted XML syntax.
Below is a relatively simple alternative that works for up to a thousand values. The goal is to build an IN query:
select col1 from YourTable where col1 in ('val1', 'val2', ...)
In C#, you should probably use parameters:
select col1 from YourTable where col1 in (@par1, @par2, ...)
Which you can pass like:
var com = yourConnection.CreateCommand();
com.CommandText = @"select col1 from YourTable where col1 in (";
for (var i = 0; i < dataNames.Length; i++)
{
    var parName = string.Format("@par{0}", i + 1);
    com.Parameters.AddWithValue(parName, dataNames[i]);
    com.CommandText += parName;
    if (i + 1 != dataNames.Length)
        com.CommandText += ", ";
}
com.CommandText += ");";
var existingValues = new List<string>();
using (var reader = com.ExecuteReader())
{
    while (reader.Read())
        existingValues.Add((string)reader["col1"]);
}
Given the complexity of this solution I'd go for Max' or Tim's answer. You could consider this answer if the table is very large and you can't copy it into memory.
Sorry I don't have a lot of relevant code here, but I did a similar thing quite some time ago, so I will try to explain.
Essentially I had a long list of item IDs that I needed to return to the client, which then told the server which ones it wanted loaded at any particular time. The original query passed the values as a comma-separated set of strings (they were actually GUIDs). The problem was that once the number of entries hit 100, there was a noticeable lag for the user; once it got to 1000 possible entries, the query took a minute and a half, and when we went to 10,000, let's just say you could boil the kettle and drink your tea/coffee before it came back.
The answer was to stick the values to check directly into a temporary table, where one row of the table represented one value to check against. The temporary table was keyed against the user who performed the search, so this meant other users searches wouldn't become corrupted with each other, and when the user logged out, then we knew which values in the search table could be removed.
Where this data comes from will determine the best way for you to load the reference table (a rough loading sketch is at the end of this answer). But once it is there, your new query will look something like:
SELECT COUNT(t.columnName), rt.dataName
FROM table t
RIGHT JOIN referenceTable rt ON rt.dataName = t.columnName
WHERE rt.userRef = @UserIdValue
GROUP BY rt.dataName
The RIGHT JOIN here should give you a value for each of your reference-table entries, including 0 if the value did not appear in your table. If you don't care which ones don't appear, changing it to an INNER JOIN will eliminate the zeros.
The WHERE clause is to ensure that your search only returns the unique items that you are looking for at the moment - the design should consider that concurrent access will someday occur here (even if it doesn't at the moment), so writing something in to protect it is advisable.
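As a rough illustration of loading such a reference table from C# (this is not part of the original answer; it assumes SQL Server, a referenceTable(userRef, dataName) layout, and a hypothetical currentUserId value identifying the user's search):
var refTable = new DataTable();
refTable.Columns.Add("userRef", typeof(Guid));
refTable.Columns.Add("dataName", typeof(string));
foreach (var name in dataNames)
    refTable.Rows.Add(currentUserId, name);
using (var bulk = new SqlBulkCopy(yourConnection))
{
    bulk.DestinationTableName = "referenceTable";
    // Map by name so the column order in the database doesn't matter.
    bulk.ColumnMappings.Add("userRef", "userRef");
    bulk.ColumnMappings.Add("dataName", "dataName");
    bulk.WriteToServer(refTable);
}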

How to delete all rows from datatable

I want to delete all rows from a DataTable so that their RowState property value becomes Deleted.
DataTable dt;
dt.Clear(); // this will not set rowstate property to delete.
Currently I am iterating through all rows and deleting each row.
Is there a more efficient way?
I don't want to delete the rows in SQL Server; I want to use a DataTable method.
We are doing it this way:
for (int i = table.Rows.Count - 1; i >= 0; i--)
{
    DataRow row = table.Rows[i];
    if (row.RowState == DataRowState.Deleted)
    {
        table.Rows.RemoveAt(i);
    }
}
This will satisfy any FK cascade relationships, like 'delete' (that DataTable.Clear() will not):
DataTable dt = ...;
// Remove all
while (dt.Rows.Count > 0)
{
    dt.Rows[0].Delete();
}
dt.Rows.Clear();
dt.Columns.Clear(); // warning: this removes all columns as well
dt.Dispose();
I typically execute the following SQL command:
DELETE FROM TABLE WHERE ID>0
Since you're using an SQL Server database, I would advocate simply executing the SQL command "DELETE FROM " + dt.TableName.
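A minimal sketch of executing that command (assuming an open SqlConnection named connection, which is not shown in the original answer):
using (var cmd = new SqlCommand("DELETE FROM " + dt.TableName, connection))
{
    cmd.ExecuteNonQuery();
}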
I would drop the table; that's the fastest way to delete everything. Then recreate the table.
You could create a stored procedure on the SQL Server db that deletes all the rows in the table, execute it from your C# code, then requery the datatable.
Here is the solution that I settled on in my own code after searching for this question, taking inspiration from Jorge's answer.
DataTable RemoveRowsTable = ...;
int i = 0;
// Remove all
while (i < RemoveRowsTable.Rows.Count)
{
    DataRow currentRow = RemoveRowsTable.Rows[i];
    if (currentRow.RowState != DataRowState.Deleted)
    {
        currentRow.Delete();
    }
    else
    {
        i++;
    }
}
This way, you ensure all rows either get deleted, or have their DataRowState set to Deleted.
Also, you won't get the InvalidOperationException due to modifying a collection while enumerating, because foreach isn't used. However, the infinite loop bug that Jorge's solution is vulnerable to isn't a problem here because the code will increment past a DataRow whose DataRowState has already been set to Deleted.
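If the point of marking rows as Deleted (rather than removing them) is to push the deletions to the database afterwards, the usual follow-up is a single call on the configured data adapter, for example adapter.Update(RemoveRowsTable); the adapter name here is assumed, not taken from the answer.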
