I am in the process of moving a database from one server to another, but now I am getting the error 'Invalid column name msrepl_tran_version'. This is a column I deleted from the new database, as it was related to replication, which I no longer need.
I have recreated the datasets and searched the entire solution for anything containing msrepl_tran_version, and found nothing. I can't see where this column is being referenced from; it doesn't exist!
Any ideas would be much appreciated.
Thanks.
It is a transactional replication column and not straightforward to remove; it seems you haven't totally removed it...
See: how to remove the msrepl_tran_version column
The more I read on this, the more confused I get, so I hope someone can help. I have a complex database setup which sometimes produces this error on update:
"Concurrency violation: the UpdateCommand affected 0 of the expected 1 records"
I say sometimes because I cannot recreate the conditions that trigger it consistently. I have a remote MySQL database connected to my app through the Data Source Wizard, which produces the dataset, tables and linked DataTableAdapters.
My reading suggests that this error is meant to occur when there is more than one open connection to the database trying to update the same record? This shouldn't happen in my instance, as the only updates are sequential from my app.
I am wondering whether it has something to do with running the update from a background worker? I have my table updates in one, for example, thusly:
Gi_gamethemeTableAdapter.Update(dbDS.gi_gametheme)
Gi_gameplaystyleTableAdapter.Update(dbDS.gi_gameplaystyle)
Gi_gameTableAdapter.Update(dbDS.gi_game)
These run serially in the BackgroundWorker, however, so I'm unsure about this. The main thread also waits for it to finish, and there are no other DB operations going on before or after this is started.
I did read about going into the dataset designer view, choosing "configure" in the datatableadapter > advanced options and setting "Use optimistic concurrency" to false. This might have worked (hard to say because of the seemingly random nature of the error), however, there are drawbacks to this that I want to avoid:
I have around 60 tables. I don't want to do this for each one.
I sometimes have to re-import the mysql schema into the dataset designer, or delete a table and re-add it. This would obviously lose this setting and I would have to remember to do it on all of them again, potentially. I also can't find a way to do this automatically in code.
I'm afraid I'm not at code level in terms of the database updates etc, relying on the Visual Studio wizards. It's a bit late to change the stack as well (e.g. can't change to Entity Framework etc).
So my questions are:
What is causing the error, and how can I find it?
What can I do about it?
Thanks.
When you have TableAdapters that download data into DataTables, they can be configured for optimistic concurrency.
This means that for a table like:
Person
ID Name
1 John
They might generate an UPDATE query like:
UPDATE Person SET Name = @newName WHERE ID = @oldID AND Name = @oldName
(In reality they are more complex than this but this will suffice)
DataTables track original values and current values; you download 1/"John", then change the name to "Jane", and you (or the TableAdapter) can ask the DataTable what the original value was and it will say "John".
The DataTable can also feed this value into the UPDATE query, and that's how we detect "did something else change the row in the time we had it", i.e. a concurrency violation.
The row was "John" when we downloaded it, we edited it to "Jane", and went to save... but someone else had been in and changed it to "Joe". Our update will fail because Name is no longer the "John" it was (and we still think it is) when we downloaded it. By dint of the TableAdapter having an update query that says AND Name = @oldName, and setting the @oldName parameter to the original value someDataRow["Name", DataRowVersion.Original] (i.e. "John"), we cause the update to fail. This is a useful thing; mostly updates will succeed, so we can opportunistically hope our users can update our DB without needing to get into locking rows while they have them open in some UI.
Resolving the cases where it doesn't work is usually a case of coding up some strategy:
My changes win - don't use an optimistic query that features old values, just UPDATE and erase their changes
Their changes win - cancel your attempts
Re-download the latest DB state and choose what to do - auto merge it somehow (maybe the other person changed fields you didn't), or show the user so they can pick and choose what to keep etc (if both people edited the same fields)
Now you're probably sat there saying "but no one else changes my DB" - we can still get this, though, if the database has changed some values upon one save and you don't have the latest ones in your dataset.
There's another option in the TableAdapter wizard - "refresh the dataset" - which is supposed to run a SELECT after a modification to import any latest database-calculated values (like auto-increment primary keys or triggers/defaults/etc). A query like INSERT INTO Person(Name) VALUES(@name) is supposed to silently have a SELECT * FROM Person WHERE ID = LAST_INSERT_ID() tagged on the end of it to retrieve the latest values.
Except "refresh the dataset" doesn't work :/
So, while I can't tell you exactly why you're getting your concurrency violation exception, I hope that explaining why they occur, and pointing out that there are sometimes bugs that cause them (insert a new record, the calculated ID is not retrieved, edit this recent record, the update fails because the data wasn't fresh), will arm you with what you need to find the problem. When you get one, keep the app stopped on the breakpoint and inspect the DataRow: take a look at the query being run and what original/current values are being put in as parameters, inspect the original and current values held by the row using the overload of the Item indexer that allows you to state the version you want, and look in the DB.
Somewhere in all of that there will be the mismatch that explains why 0 records were updated - the db has "Joe" as the name or 174354325 as the ID, your datarow has "John" as the original name or -1 as the ID (it never refreshed), and the WHERE clause is finding 0 records as a result
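If it helps, inspecting those two versions from code (or the Immediate window) looks roughly like this - a sketch; pass in any of your tables, e.g. dbDS.gi_game, just before calling Update:

using System;
using System.Data;

// Sketch: print original vs current values for every modified row in a table.
static void DumpPendingChanges(DataTable table)
{
    foreach (DataRow row in table.Rows)
    {
        if (row.RowState != DataRowState.Modified) continue;

        foreach (DataColumn col in table.Columns)
        {
            object original = row[col, DataRowVersion.Original]; // value as downloaded
            object current = row[col, DataRowVersion.Current];   // value about to be saved
            if (!Equals(original, current))
                Console.WriteLine($"{table.TableName}.{col.ColumnName}: '{original}' -> '{current}'");
        }
    }
}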
Some of your tables will contain a field that is marked as a [ConcurrencyCheck] or [Timestamp] concurrency token.
When you update a record, the SQL generated will include a WHERE [ConcurrencyField]='Whatever the value was when the record was retrieved'.
If that record was updated by another thread, another process, or anything other than the current code path, then your UPDATE will report 0 records updated rather than the 1 (or more) that was expected.
What can you do about it? Firstly, put a try/catch (DBConcurrencyException) around your code. Then you can re-read the offending record and try the update again.
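A minimal sketch of that pattern in C#, using the TableAdapter and dataset names from the question (Gi_gameTableAdapter, dbDS); adjust to your own names:

using System;
using System.Data;

// Sketch: catch the concurrency failure on one of the question's updates,
// then refresh the table so a retry works against current data.
try
{
    Gi_gameTableAdapter.Update(dbDS.gi_game);
}
catch (DBConcurrencyException ex)
{
    // ex.Row is the DataRow whose UPDATE affected 0 records.
    Console.WriteLine("Concurrency failure on table: " + ex.Row?.Table?.TableName);

    // Simplest recovery: discard the stale snapshot, re-download, and redo
    // (or merge) the change against fresh data before updating again.
    dbDS.gi_game.Clear();
    Gi_gameTableAdapter.Fill(dbDS.gi_game);
}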
I have a table in an Excel worksheet where I need to programmatically remove entire rows using VSTO. After a lot of searching here and everywhere else, I was unable to find the answer. Due to some unrelated code, I also cannot delete the first row of the table, but need to remove all other rows.
Here are the specific requirements:
One of the functions of this add-in is to populate the table. This is done through a loop starting with the "root" named range in the left column of the first row of the table.
Whenever populating the table, I first need to delete all data from the table and then add the new data. I need to use the "root" to add the data, so I can't have it deleted.
I am using the Table for the automated formatting instead of formatting the table manually after adding each cell.
I never know how many rows will be added, but it will always be at least one.
After banging my head on this for a few hours, I slept on it and came at it refreshed this morning. After much trial and error, here is the code I came up with.
// Evaluate resolves the "DeploymentTable" name to its Range, whose ListObject
// is the table itself (ThisSheet is set to the correct worksheet earlier).
var deplTable = ThisSheet.Evaluate("DeploymentTable");

// Keep deleting the second data row until only the first row remains;
// the first row must stay because the "root" named range lives in it.
while (deplTable.ListObject.ListRows.Count > 1)
{
    deplTable.ListObject.ListRows[2].Delete();
}
NOTE: ThisSheet is set to the correct sheet earlier. The application works on multiple sheets, so it needs to be flexible.
I tried this a few ways before finally getting it to work. Looping through the rows gave unexpected results, possibly due to timing issues between Excel and VSTO.
Hope this helps other people!
The error occurs when I try to populate a dataset's DataTable.
ds = dsMyDataset;
sqlAdapter.Fill(ds, dsMyDataset.MyTable.TableName); // Uses the SqlDataAdapter's SelectCommand
On fill, I get this error:
Additional information: 'Failed to enable constraints. One or more rows contain values violating non-null, unique, or foreign-key constraints.'
I run this command in the command window:
dsMyDataset.MyTable.GetErrors()[0].RowError
And it informs me that "Column 'XMLHistory' exceeds the MaxLength limit."
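(For reference, the same check across every table at once looks roughly like this - a sketch using the dataset name from above:)

using System;
using System.Data;

// Sketch: list every constraint/row error in the dataset, not just the first one.
foreach (DataTable table in dsMyDataset.Tables)
{
    foreach (DataRow row in table.GetErrors())
    {
        Console.WriteLine($"{table.TableName}: {row.RowError}");
    }
}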
Odd, because the dataset's length property specifies a length of '2147483647' for that field.
So in my SqlDataAdapter I specify a size:
sqlComm.Parameters.Add("@XMLHistory", SqlDbType.Text, 2147483647, "XMLHistory");
But this apparently does not help my issue. I also noticed that the size in the DB for that column is '16'.
Now here's an odd thing I noticed. What this program allows me to do is archive data from one DB to another, so I have a list of tables that the user can select from to archive. What's odd is that if I select one table, it archives just fine. The issue I've described above doesn't actually fire until after the first table archives and the system attempts to move on to the second table. The moment it tries to fill the DS, booooom. Dead.
Any suggestions/thoughts appreciated. Cheers.
I just had the same error when reading a database in an old project. It was weird, because it was reporting that column B cannot exceed length 110, but in the DB design that was the size of its adjacent field A. I tried a reconfigure on the TableAdapter in the dataset designer and it still acted cross-wired on the field max sizes. I then just dropped the table in the designer and re-added it. Then it worked. The table design had been modified in the database since the .xsd was created.
So I figured out the issue. One of the XML files that ties to the back end of the dataset itself had a node specifying a max length. It was set to 7700, and this 7700 was overriding the property setting in the designer. Not sure how it got set to begin with, but it was super frustrating.
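If anyone else hits this and can't immediately correct the .xsd, one possible stop-gap (a sketch; dsMyDataset, MyTable, XMLHistory and sqlAdapter are the names used in this question) is to lift the limit on the DataColumn at runtime before filling:

using System.Data;

// Sketch: override a stale MaxLength inherited from the typed dataset's .xsd.
// A MaxLength of -1 means "no maximum" for a DataColumn.
dsMyDataset.MyTable.Columns["XMLHistory"].MaxLength = -1;
sqlAdapter.Fill(dsMyDataset, dsMyDataset.MyTable.TableName);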
Thanks all for the help.
I have a LONG field in my table. When I copy the data using bulk copy, the following message appears:
ORA-26089: LONG column "CORPO_EMAIL" must be specified last
I do not understand the reason for this error. Is there any special configuration I need to do for this column in a DataTable?
To resolve the issue you need to specify the LONG column, i.e. CORPO_EMAIL, last.
The cause of this error is that a client of the direct path API specified a LONG column to be loaded, but the LONG column was not the last column to be specified.
Also check How to solve Error Code ORA-26089.
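If you are loading from a DataTable, one way to satisfy that requirement (a sketch; CORPO_EMAIL comes from the error above, and GetEmailTable stands in for however you build the table today) is to move the LONG column to the last ordinal before the bulk copy:

using System.Data;

// Sketch: make sure the LONG column is the last column of the source DataTable
// so the direct-path load receives it last.
DataTable emailTable = GetEmailTable();              // placeholder for your own code
DataColumn corpoEmail = emailTable.Columns["CORPO_EMAIL"];
corpoEmail.SetOrdinal(emailTable.Columns.Count - 1); // move it to the end

// Then run your existing bulk copy against the reordered table, and make sure
// any explicit column mappings also list CORPO_EMAIL last.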
I have a problem with C# and Visual Studio 2012: I'm trying to resize a column of a table in my database and have the change reflected by the DataTableAdapter on a DataSet.xsd.
I'm using a DataTableAdapter, built from a stored procedure with a SELECT statement, to populate a DataGridView, reports and more.
I created the table a long time ago, but now there is a problem with it.
I had to increase the length of a column, and I changed the corresponding column length in the DataTable as well. But that didn't solve it: whenever I Fill or Get data through that DataTableAdapter, it responds with the previous (original) size of the column.
But when I create a new DataTable and redirect my code to the new DataTableAdapter, it works.
Why is this happening ?
Redirecting code to the new DataTableAdapter is a little bit difficult, because I don't know all the places the old one is used in the entire solution.
And also, if you can, please tell me how to add a new column to the table and get the DataTableAdapter to work with it as well.
Thanks and waiting for your reply.
After doing a little research with my friends, I found a proper way to fix this error.
It's not just resizing: after any change to the database table or stored procedure, you have to reconfigure the DataTableAdapter; otherwise it just keeps working the way it did when it was created.
It will keep failing with major errors, or sometimes it will function incorrectly without you even realizing there is an error.
So whenever you change the database table or stored procedure, go to the DataSet.xsd where the DataTable is located, right-click on the DataTable, and configure it again.
This saved me and worked.