I have a query similar to the one on the MSDN site:
// Query the database for the row to be updated.
var query =
    from ord in db.Orders
    where ord.OrderID == 11000
    select ord;

// Execute the query, and change the column values
// you want to change.
foreach (Order ord in query)
{
    ord.ShipName = "Mariner";
    ord.ShipVia = 2;
    // Insert any additional changes to column values.
}

// Submit the changes to the database.
try
{
    db.SubmitChanges();
}
catch (Exception e)
{
    Console.WriteLine(e);
    // Provide for exceptions.
}
What I want now is a way to know the number of rows affected by the last update command. I've tried using:
int affectedRows = dc.GetChangeSet().Updates.Count;
in various ways, but this instruction always returns 0, even though the table is updated correctly.
dc.GetChangeSet() tells you how many changes your LINQ to SQL context is planning to make when you call SubmitChanges(). It does not track the number of rows affected as reported by the database.
If you call int affectedRows = dc.GetChangeSet().Updates.Count; before calling SubmitChanges, you will see how many rows it expects to update. After calling SubmitChanges, there are no more pending changes, so you will always get a count of zero.
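A minimal sketch of the timing (reusing the db context from the question; the counts in the comments assume only the one order shown above was modified):

// count pending updates BEFORE submitting; afterwards the change set is empty
int pendingUpdates = db.GetChangeSet().Updates.Count; // 1 here
db.SubmitChanges();
int afterSubmit = db.GetChangeSet().Updates.Count; // always 0 at this point

If you need the count as reported by the database itself, DataContext.ExecuteCommand returns the affected-row count for a statement you issue directly, though this bypasses the usual change tracking:

int affectedRows = db.ExecuteCommand(
    "UPDATE Orders SET ShipName = {0} WHERE OrderID = {1}", "Mariner", 11000);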
I have a problem and I don't know why it's happening!
I am trying to find the max value in a column inside a database table, using this code:
private void FrmCustomer_New_Load(object sender, EventArgs e)
{
    int NewID;
    DataTable _Dt = CusVar.GetCustomersIDs();
    if (_Dt.Rows.Count == 0)
    {
        NewID = 1;
        this.txt1.Text = NewID.ToString();
        DataRow _Row = CusVar.GetCustomersIDs().Rows[0];
        MessageBox.Show(Convert.ToString(_Dt.Rows[0]["MaxID"]));
    }
}
The code works, but it gives 1 even though there are no records in the table. Why?
I am using C# and an Access database (.accdb).
I use this function in Cls_Customers:
public DataTable GetCustomersIDs()
{
    DAL.DataAccessLayer DAL = new DAL.DataAccessLayer();
    DataTable Dt = new DataTable();
    Dt = DAL.DataSelect("Select Max(CustomerID) AS MaxID From TblCustomers", null);
    DAL.CloseConn();
    return Dt;
}
What is the problem, please?
This is your query:
Select Max(CustomerID) AS MaxID
From TblCustomers
It is an aggregation query. An aggregation query with no GROUP BY always returns exactly one row, regardless of whether any rows match.
When the table is empty, the value returned for MaxID in that single row is NULL.
I am highly suspicious of what you want to do. If you want the maximum id -- and no rows if the table is empty -- then do:
select c.CustomerID
from TblCustomers c
order by c.CustomerID desc
fetch first 1 row only;
(This uses ANSI/ISO standard syntax, so the exact form depends on your database; Access, for instance, uses SELECT TOP 1 instead of FETCH FIRST.)
My suspicion is that you then want to use this id for an insert -- and that is a bad approach. Almost all databases support some sort of auto-incremented column (with syntax elements such as auto_increment, serial, or identity). That is the right way to assign a unique incrementing id to a column in a table.
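In Access SQL, for example, the column type for this is AUTOINCREMENT (a sketch; the CustomerName column is invented for illustration):

CREATE TABLE TblCustomers (
    CustomerID AUTOINCREMENT PRIMARY KEY,
    CustomerName TEXT(100)
)

The database then assigns CustomerID on every INSERT, and there is no need to query for MAX first.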
The SQL query might be returning NULL if there are no rows that match the criteria.
If the intention is to find the number of rows returned, change the SQL query to return a count rather than the MAX aggregate.
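For instance, against the question's table:

SELECT COUNT(*) AS RecordCount FROM TblCustomers

COUNT(*) returns 0 for an empty table rather than NULL, so the empty case is easy to detect.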
Try with this:
SELECT MAX(T1.CustomerID) AS MaxID From TblCustomers T1
Similar to this question, I am running through a DataTable, using the data to fill a new DataSet for the purposes of data migration.
The migration inserts into a DataSet, then every 5000 records the added rows get saved to the database using ErikEJ's SqlCeBulkCopy method.
My problem is that for the first batch of records (roughly 5000) the average time taken per record is around 150-200 ms, but it gradually increases; by record 11000 the figure is around 475 ms.
I have a typed data set with EnforceConstraints turned off.
The actual database write always takes less than a second, so I am pretty sure it is not the database itself. That leaves the code taking longer to run on each iteration, which could be down to the code itself or something I am not realising about DataSets.
Could the dataset be increasing the time because it is using indexes or some keys that are not turned off by using the EnforceConstraints = false property?
One other thought: I am checking to see if a record exists before inserting it, and I have tried both the LINQ methods .Any() and FirstOrDefault() != null.
I iterate through a datatable, for each record I read some values then pass them to this method.
private int MigrateItems(string reference, string brand, string captureSite, string captureOperator, DateTime captureDate, DateTime addedDate, DateTime updatedDate, bool retain)
{
    // prepare the inputs
    reference = reference.Trim();
    int brandID = -1, databaseUpdateID = -1, captureID = -1, insertedRowID = -1;

    // get the foreign keys
    brandID = MigrateBrands(brand);
    databaseUpdateID = MigrateDatabaseUpdates(reference);
    captureID = MigrateCaptures(captureSite, captureOperator, captureDate);

    // if the item doesn't exist then add it
    bool exists = dataSet.Item.FirstOrDefault(a => string.Equals(a.Reference, reference, StringComparison.CurrentCultureIgnoreCase)) != null;
    if (exists == false)
    {
        var insertedRow = dataSet.Item.AddItemRow(brandID, databaseUpdateID, captureID, reference, retain, updatedDate, addedDate);
        insertedRowID = insertedRow.ID;
    }
    else
    {
        insertedRowID = dataSet.Item.Single(a => string.Equals(a.Reference, reference, StringComparison.CurrentCultureIgnoreCase)).ID;
    }
    return insertedRowID;
}
Once 5000 records have been iterated, or all records have been done, I call this method:
private void BulkInsertData()
{
    using (var bulkCopier = new SqlCeBulkCopy(connectionString))
    {
        bulkCopier.DestinationTableName = dataSet.Brand.TableName;
        bulkCopier.WriteToServer(dataSet.Brand.Where(a => a.RowState == DataRowState.Added).AsEnumerable());
        // (same code for all the tables)

        // change all row states to unchanged
        dataSet.AcceptChanges();
    }
}
I'm using the following:
C#
Visual Studio 2012
SQL Server CE 4.0
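A note on the existence check in MigrateItems: both .Any() and FirstOrDefault() scan the in-memory table row by row, so each lookup gets slower as rows accumulate, which matches the gradual per-record slowdown described. A hedged sketch of a constant-time alternative (itemIdsByReference is an invented name, and this assumes Reference is unique per item):

// maintained alongside the table, keyed case-insensitively like the original comparison
private readonly Dictionary<string, int> itemIdsByReference =
    new Dictionary<string, int>(StringComparer.CurrentCultureIgnoreCase);

// inside MigrateItems, replacing the FirstOrDefault/Single scans:
int existingID;
if (itemIdsByReference.TryGetValue(reference, out existingID))
{
    insertedRowID = existingID;
}
else
{
    var insertedRow = dataSet.Item.AddItemRow(brandID, databaseUpdateID, captureID, reference, retain, updatedDate, addedDate);
    insertedRowID = insertedRow.ID;
    itemIdsByReference[reference] = insertedRowID;
}

Whether this accounts for the whole slowdown is an open question, but it removes the one cost in the method that necessarily grows with row count.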
I have come across a problem in using the DataAdapter, which I hope someone can help with. Basically I am creating a system, which is as follows:
Data is read in from a data source (MS Access, SQL Server or Excel), converted to data tables and inserted into a local SQL Server database, using DataAdapters. This bit works fine. The SQL Server table has a PK, which is an identity field with auto-increment turned on.
Subsequent data loads read in the data from the source and compare it to what we already have. If the record is missing then it is added (this works fine). If the record is different then it needs to be updated (this doesn't work).
When doing the differential data load I create a data table which reads in the schema from the destination table (SQL Server) and ensures it has the same columns etc.
The PK in the destination table is column 0, so when a record is inserted all of the values from column 1 onwards are set (as mentioned, this works perfectly). I don't change the row status for items I am adding. The PK in the data table is set correctly and I can confirm this.
When updating data I set column 0 (the PK column) to the value of the record I am updating and set all of the other columns to be the same as the source data.
For updated records I call AcceptChanges and SetModified on the row to ensure (I thought) that the application calls the correct method.
The DataAdapter is set with SelectCommand and UpdateCommand using the command builder.
When I run it, I have traced it using SQL Profiler and can see that the insert command is being run correctly, but the update command isn't being run at all, which is the crux of the problem. For reference, the data table will look something like the following:
PK    Value1  Value2  Row State
===   ======  ======  =========
124   Test1   Test2   Added
123   Test3   Test4   Updated
A couple of things to be aware of:
I have tested this by loading the row to be changed into the DataTable, changing some column fields, and running Update, and this works. However, this is impractical for my solution because the data is huge (>1 GB), so I can't simply load it all into a DataTable without taking a huge performance hit. What I am doing instead is creating the data table with a maximum of 500 rows and then running the Update. Testing during the initial data load showed this to be the most efficient in terms of memory usage and performance. The data table is cleared after each batch is run.
Anyone any ideas on where I am going wrong here?
Thanks in advance
Andrew
==========Update==============
Following is the code to create the insert/update rows
private static void AddNewRecordToDataTable(DbDataReader pReader, ref DataTable pUpdateDataTable)
{
    // create a new row in the table
    DataRow pUpdateRow = pUpdateDataTable.NewRow();
    // loop through each item in the data reader - setting all the columns apart from the PK
    for (int addCount = 0; addCount < pReader.FieldCount; addCount++)
    {
        pUpdateRow[addCount + 1] = pReader[addCount];
    }
    // add the row to the update table
    pUpdateDataTable.Rows.Add(pUpdateRow);
}
private static void AddUpdateRecordToDataTable(DbDataReader pReader, int pKeyValue,
    ref DataTable pUpdateDataTable)
{
    DataRow pUpdateRow = pUpdateDataTable.NewRow();
    // set the first column (PK) to the value passed in
    pUpdateRow[0] = pKeyValue;
    // loop for each column apart from the PK column
    for (int addCount = 0; addCount < pReader.FieldCount; addCount++)
    {
        pUpdateRow[addCount + 1] = pReader[addCount];
    }
    // add the row to the table and then mark it as modified
    pUpdateDataTable.Rows.Add(pUpdateRow);
    pUpdateRow.AcceptChanges();
    pUpdateRow.SetModified();
}
The following code is used to actually do the update:
updateAdapter.Fill(UpdateTable);
updateAdapter.Update(UpdateTable);
UpdateTable.AcceptChanges();
The following is used to create the data table to ensure it has the same fields/data types as the source data
private static DataTable CreateDataTable(DbDataReader pReader)
{
    DataTable schemaTable = pReader.GetSchemaTable();
    DataTable resultTable = new DataTable(<tableName>); // edited out personal info
    // loop for each row in the schema table
    try
    {
        foreach (DataRow dataRow in schemaTable.Rows)
        {
            // create a new DataColumn object and set values depending
            // on the current DataRow's values
            DataColumn dataColumn = new DataColumn();
            dataColumn.ColumnName = dataRow["ColumnName"].ToString();
            dataColumn.DataType = Type.GetType(dataRow["DataType"].ToString());
            dataColumn.ReadOnly = (bool)dataRow["IsReadOnly"];
            dataColumn.AutoIncrement = (bool)dataRow["IsAutoIncrement"];
            dataColumn.Unique = (bool)dataRow["IsUnique"];
            resultTable.Columns.Add(dataColumn);
        }
    }
    catch (Exception ex)
    {
        message = "Unable to create data table " + ex.Message;
        throw new Exception(message, ex);
    }
    return resultTable;
}
In case anyone is interested, I did manage to get around the problem, though I never managed to get the data adapter to work as originally intended. Basically what I did was as follows:
Create a list of objects with an index and a list of field values as members.
Read in the rows that have changed and store the values from the source data (i.e. the values that will overwrite the current ones in the object). In addition, I create a comma-separated list of the indexes.
When I am finished, I use the comma-separated list in a SQL IN statement to return the rows and load them into my data adapter.
For each one I run a LINQ query against the index and extract the new values, updating the data set. This sets the row status to Modified.
I then run the update and the rows are updated correctly.
This isn't the quickest or neatest solution, but it does work and allows me to run the changes in batches.
Thanks
Andrew
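A rough sketch of that workaround (the table name, ID column, PendingChange shape, and ReadChangedRowsFromSource helper are illustrative stand-ins for the real schema; the real code batches in groups of 500 as described above):

// requires System.Linq, System.Data and System.Data.SqlClient
// each pending change: the PK value plus the new field values read from the source
class PendingChange
{
    public int Index;
    public object[] NewValues;
}

List<PendingChange> changes = ReadChangedRowsFromSource(); // hypothetical helper
string idList = string.Join(",", changes.Select(c => c.Index)); // e.g. "123,124"

var adapter = new SqlDataAdapter("SELECT * FROM MyTable WHERE ID IN (" + idList + ")", connection);
var builder = new SqlCommandBuilder(adapter); // supplies the UPDATE command

var table = new DataTable();
adapter.Fill(table); // rows now carry genuine original values from the database

foreach (DataRow row in table.Rows)
{
    var change = changes.First(c => c.Index == (int)row["ID"]);
    for (int col = 1; col < table.Columns.Count; col++)
        row[col] = change.NewValues[col - 1]; // assignment flips RowState to Modified
}
adapter.Update(table); // the UPDATE command now fires for each modified row

Because Fill loads the current database values as each row's originals, the command builder's UPDATE can match them, which (as far as I can tell) is what the AcceptChanges/SetModified route was missing.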
We used the VS2010 data connection design wizard to define a connection to a SQL CE database with just one table (database = "UserMetrics", table = "User"). Try as we might, the update doesn't seem to hold. I've been through posts on SO and MSDN but can't see the glaring error...
//initialize
UserMetricsDataSet umDataSet = new UserMetricsDataSet( );
UserMetricsDataSetTableAdapters.UserTableAdapter umTableAdapter = new UserMetricsDataSetTableAdapters.UserTableAdapter( );

// check that test data is there and count is correct
umTableAdapter.Fill( umDataSet.User );
UserMetricsDataSet.UserRow umRow = (UserMetricsDataSet.UserRow)umds.User.Rows[0];
int count = umDataSet.User.Rows.Count; //yep its there

//lets add some rows
for (int i = 0; i < 100; i++)
    umDataSet.User.AddUserRow( "smith", (float)54, (float)3, 1);
umds.User.AcceptChanges( );
//umTableAdapter.Update(umDataSet.User); //tried this also ... no change...

// there are now 101 rows !!
count = umDataSet.User.Rows.Count; //yep its there
umRow = (UserMetricsDataSet.UserRow)umds.User.Rows[100];

//lets double check
umTableAdapter.Fill( umDataSet.User );
count = umds.User.Rows.Count; //huh!!! count==1 ???
I'm assuming umDataSet is the same thing as umds.
What about?
UserMetricsDataSet.UserRow umRow = umDataSet.User.NewUserRow();
umRow["Name"] = "smith";
// etc
umDataSet.User.ImportRow(umRow);
umDataSet.User.AcceptChanges();
Or what about reversing the order? The TableAdapter.Update() method sends updates back to the database based on the changes in the DataTable. If you accept the changes first, then the RowState is reset on each DataTable row, so there are no updates found to send back to the database.
From MSDN:
In order to send the modified data to a database, you call the Update method of a TableAdapter. The adapter's Update method will update a single data table and execute the correct command (INSERT, UPDATE, or DELETE) based on the RowState of each data row in the table.
try
{
    umTableAdapter.Update(umDataSet.User);
    umDataSet.User.AcceptChanges();
}
catch (Exception ex)
{
    // TableAdapter.Update() can throw exceptions
}
Fairly simple problem, I have a table in a database that contains one column and fourteen rows.
When trying to return all of the rows in the database I try the following:
command = new SQLiteCommand("SELECT Value FROM Currency", connection);
Yet when I look at the number of results affected (and the lack of elements in my array), apparently there's nothing in there. I've even checked with two separate tools that confirm the data is in the table. Am I executing this incorrectly? I simply want to iterate over the values returned and store them in an array.
Thanks for your time!
Edit:
Solution!
int i = 0;
while (dReader.Read())
{
    _data[i] = Convert.ToSingle(dReader[0]);
    i++;
}
This works fine :)
SqlDataReader.RecordsAffected is not populated for SELECT queries (it returns -1 for them). From the documentation:
Gets the number of rows changed, inserted, or deleted by execution of the Transact-SQL statement.
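If you actually need a changed-row count, ExecuteNonQuery is what returns it; a small sketch (the UPDATE statement here is invented purely for illustration):

// ExecuteNonQuery/RecordsAffected count changed rows; a SELECT reports -1
var update = new SQLiteCommand("UPDATE Currency SET Value = Value", connection);
int changed = update.ExecuteNonQuery(); // 14 here, one per row touched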
EDIT:
while ( dReader.Read() )
{
    Console.WriteLine("Value " + dReader[0]);
}
Shouldn't you be doing a
while ( dReader.Read() )
{
    ....
}
to loop through your resultset?