SMO - PropertyNotSetException when accessing table properties - c#

I'm not sure if this is intended behaviour since I can't find it documented anywhere on MSDN, but this still seems a little bit odd.
I'm writing a program that makes use of SMO. For the sake of testing, I've got a test database with a table in it (the structure is unimportant) that has the following trigger.
CREATE TRIGGER [dbo].[MyTableInsert] ON [dbo].[MyTable] AFTER INSERT
AS
BEGIN
PRINT 'after insert';
END
Now, in my code, I expected to use the following to see (and prove) that the table 'MyTable' has an INSERT trigger. However, running that code throws a PropertyNotSetException.
var table = new Table(database, tableName.ToString());
Console.WriteLine(table.HasInsertTrigger); // Throws PropertyNotSetException
But if I call the Refresh() method after initialising the table, the call to HasInsertTrigger returns true as expected.
var table = new Table(database, tableName.ToString());
table.Refresh();
Console.WriteLine(table.HasInsertTrigger); // Returns true.
Making the call to Refresh seems unnecessary, but is it actually required? I can't find any documentation that says it needs to be called before accessing any properties.

I should have seen this a mile off... I was creating new instances of the tables rather than accessing existing ones.
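For reference, a minimal sketch of fetching the existing table from the SMO Database object instead of constructing a new one (the server and database names here are assumptions, not from the original code):

using Microsoft.SqlServer.Management.Smo;

// Look the table up in the Tables collection instead of newing up an empty Table.
var server = new Server(".");                        // server name is an assumption
var database = server.Databases["TestDatabase"];     // database name is an assumption
var table = database.Tables["MyTable", "dbo"];       // the existing table, with properties populated

Console.WriteLine(table.HasInsertTrigger);           // true, no Refresh() needed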

Related

SqlCommandBuilder() creates insert/update for underlying tables instead for a view

I have two schemas, like this:
Schema 'data' --> holds tables, and nobody has access to them from outside
Schema 'ui' --> holds views which are accessible from outside; the idea is that you can select/delete/update/insert on these views. Thus, I am relying on ownership chaining.
For example:
create table data.tblTest (TestKey int not null primary key);
create view ui.vwTest as select * from data.tblTest;
Now, if I connect as a user with SQL Studio, everything is OK:
select * from ui.vwTest; -- WORKS (this is correct)
select * from data.tblTest; -- ERROR (this is correct)
insert into ui.vwTest (TestKey) values (17); -- WORKS (this is correct)
insert into data.tblTest (TestKey) values (17); -- ERROR (this is correct)
However, if I write a program in .NET/C# which uses SqlCommandBuilder:
SqlDataAdapter ada = new SqlDataAdapter("select * from ui.vwTest", conn);
SqlCommandBuilder b = new SqlCommandBuilder(ada);
ada.UpdateCommand = b.GetUpdateCommand();
ada.InsertCommand = b.GetInsertCommand();
ada.DeleteCommand = b.GetDeleteCommand();
==> Then, when the adapter submits changes, the INSERT DOES NOT WORK!
[EDIT]:
The SqlCommandBuilder is analyzing the View, and instead of creating a command like
INSERT INTO ui.vwTest ...
it is creating
INSERT INTO data.tblTest ...
So in fact, the SqlCommandBuilder tries to be "intelligent" and accesses the underlying tables of the view, instead of accessing the view.
Question: Can this behaviour be changed?
BTW, just to make it more clear, I am doing ownership chaining here.
My users have the right to see the views in schema ui, but they have no rights to schema data. However, due to ownership chaining, the users can indirectly access the tables in schema data via the views in schema ui.
In detail, a user is attached to a custom role, e.g. "role_user", and the role has rights to the schema, as follows:
GRANT SELECT, UPDATE, INSERT, DELETE ON SCHEMA::ui TO role_user;
but the role has NO RIGHTS on schema 'data'!
The nice thing about this setup is that you can apply row-level security: with a WHERE filter within the view, you can select only the records the user is allowed to see.
As said, it works fine within the SQL window, but not with the SqlCommandBuilder. The SqlCommandBuilder analyzes the view and tries to access the underlying tables directly, instead of accessing the view.
7 years ago, someone asked this: https://stackoverflow.com/a/320684/2504785
And his solution then was to write the SQL commands himself.
But possibly another solution exists by now? So far, I have found none...
[/EDIT]
OK, the definitive answer now is:
The SqlCommandBuilder is trying to be "intelligent". If you open it with a command like SELECT * FROM vwTest, then it analyzes the view and creates commands for the underlying table, like INSERT into tblTest ...
So the problem is: SqlCommandBuilder creates commands for the underlying table, instead of for the view.
Solution:
So far, I found no way to change the behaviour of SqlCommandBuilder.
Therefore, I rewrote all the loading and updating, and I now do everything manually. Loading takes place purely with SqlDataReader, without loading into a DataTable via SqlDataAdapter, and all updating is done by creating and executing SqlCommand objects, without SqlCommandBuilder.
It was a lot of work, but as a reward, the application is now blazing fast. Loading is far faster than with the SqlCommandBuilder and SqlDataAdapter. Possibly I will make a benchmark comparison at some time. But where a load took 5 seconds before, it is now done "immediately".
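For illustration, a minimal sketch of what such a hand-written insert through the view might look like (the connection string is an assumption; the key point is that the command targets ui.vwTest directly, so the ownership chain still applies):

using System.Data.SqlClient;

// Insert through the view, not the underlying table, so ownership chaining is honoured.
using (var conn = new SqlConnection(connectionString))   // connectionString is assumed to exist
using (var cmd = new SqlCommand("INSERT INTO ui.vwTest (TestKey) VALUES (@TestKey);", conn))
{
    cmd.Parameters.AddWithValue("@TestKey", 17);
    conn.Open();
    cmd.ExecuteNonQuery();
}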
Note this part of the ownership chaining documentation:
"When an object is accessed through a chain, SQL Server first compares the owner of the object to the owner of the calling object. This is the previous link in the chain. If both objects have the same owner, permissions on the referenced object are not evaluated."
Therefore, in order for ownership chaining to work properly, your table and view must be owned by the same principal.
You can change the view's owner or the table's owner by executing an ALTER AUTHORIZATION SQL statement on the object.
Note that this statement only changes the owner, not the schema the object belongs to.
In your case, I would recommend changing the owner of the UI schema to the same owner as the Data schema, while keeping the permissions of the database principal that uses the UI schema intact.
ALTER AUTHORIZATION ON SCHEMA::UI TO <owner_of_the_data_schema>;
Note: <owner_of_the_data_schema> is a placeholder I've used since I don't know the owner name.
This way, your application user still only has access to whatever is in the ui schema, but the ownership chaining allows the objects in the ui schema to interact with the objects in the data schema.

Why does the old value stay in this object after it is reset?

I've shown three programmers this problem and we're all stumped. I call a SQL Server stored procedure in a foreach loop, and the result is always the same as the first call. Even if I hard-code the parameters (removing the loop), only the first result is assigned on all subsequent calls.
The stored procedure is called by an Entity Framework function import (EF4 database first using the designer). The calling code lives in a repository that is a class library. The repository is called by a separate Asp.net webforms project. The problem code looks like this:
IEnumerable<WorkOrder> orders = _context.GetWorkOrders(UserName, workOrder, customerCode).ToList();
OrderStatus lastStatus = new OrderStatus();
foreach (WorkOrder order in orders)
{
    lastStatus = _context.GetOrderStatus(order.OrderNumber).FirstOrDefault();
    order.LastOrderStatus = lastStatus.OrderStatus;
}
As you can see this is pretty basic stuff. Depending on the order numbers passed in I always get the result of the first order number in the loop. I've turned off Ajax (part of the Telerik controls I use) because that has caused baffling errors for me in the past. I really hope you can suggest a way to debug this problem! Thanks in advance.
EDIT: Daniel J.G.'s comment led me to this possible solution. Now I need to figure out how to apply Ladislav Mrnka's answer..."Try to call ExecuteFunction directly with MergeOption.OverwriteChanges."
I'm answering my own question (since no one else has after a few days). The problem is caused by the Entity Framework database first designer. It generates code that caches the first stored procedure result causing the bad results in subsequent calls.
As I mentioned in the edit to my question the fix involves replacing the default MergeOption parameter used by ExecuteFunction. You need to use MergeOption.OverwriteChanges instead of the default (which I believe is MergeOption.PreserveChanges).
You could change that parameter in the generated code but your changes would be lost each time the designer is rebuilt. Instead I simply copied the generated code to my repository class, changed the MergeOption to OverwriteChanges, and stopped using the generated code. The end result looks like this:
IEnumerable<WorkOrder> orders = _context.GetWorkOrders(UserName, workOrder, customerCode).ToList();
OrderStatus lastStatus = new OrderStatus();
foreach (WorkOrder order in orders)
{
    ObjectParameter workOrderParameter;
    if (order.WorkOrder != null)
    {
        workOrderParameter = new ObjectParameter("WorkOrder", order.WorkOrder);
    }
    else
    {
        workOrderParameter = new ObjectParameter("WorkOrder", typeof(global::System.String));
    }
    lastStatus = _context.ExecuteFunction<OrderStatus>("GetOrderStatus", MergeOption.OverwriteChanges, workOrderParameter).FirstOrDefault();
    if (lastStatus != null)
    {
        order.LastOrderStatus = lastStatus.OrderStatus;
    }
}
I also see that there is a way you can modify the T4 template to make the generated code use the correct MergeOption parameter. I haven't tried it though. If you're interested take a look here.
I'm back with a second answer to my own question. Be sure the Entity Key is truly a unique identifier for each Entity!
In my case, the OrderStock Entity was missing OrderID (along with StockID) in the Entity Key. Typically the designer picks up the primary key fields from the database, but I have a unique situation (my entity is based on a view). Since I left OrderID off the Entity Key, I saw duplicate rows for a single OrderStock Entity.
When I marked OrderID with Entity Key = True, the duplicate problem went away.

Fine Grained CRUD with Subsonic's SimpleRepository

Let's say I have a TestClass in my C# app with property A and property B.
I change the value of property B in my code and leave property A unchanged.
I then update TestClass in the database via SimpleRepository's Update method.
As far as I can see, it also updates property A's value in the database.
It is easy to test: I change the value of A in my database outside my app ('by hand'), then I perform the update from my app. The value of property A changes back to whatever TestClass holds in my app.
So, my question: is it possible to update only some properties with SimpleRepository, rather than the whole class? Is there some kind of 'IgnoreFields' option?
What you need is optimistic concurrency on your UPDATE statement, not to exclude certain fields. In short, what that means is that when updating a table, a WHERE clause is appended to your UPDATE statement that ensures the values of the fields in the row are in fact what they were when the last SELECT was run.
So, let's assume in your example I selected some data and the values for A and B were 1 and 2 respectively. Now let's assume I wanted to update B (below statement is just an example):
UPDATE TestClass SET B = '3' WHERE Id = 1;
However, instead of running that statement (because there's no concurrency there), let's run this one:
UPDATE TestClass SET B = '3' WHERE Id = 1 AND A = '1' AND B = '2';
That statement now ensures the record hasn't been changed by anybody.
However, at the moment it doesn't appear that Subsonic's SimpleRepository supports any type of concurrency, and so that's going to be a major downfall. If you're looking for a very straightforward repository library where you can use POCOs, I would recommend Dapper. In fact, Dapper is used by Stack Overflow. It's extremely fast and will easily allow you to build concurrency into your update statements, because you send down parameterized SQL statements yourself.
This Stackoverflow article is an overall article on how to use Dapper for all CRUD ops.
This Stackoverflow article shows how to perform inserts and updates with Dapper.
NOTE: with Dapper you could actually do what you're asking for as well, because you send down plain SQL statements, but I just wouldn't recommend skipping concurrency.
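To illustrate, a rough Dapper sketch with the concurrency check built into the WHERE clause (the connection string is an assumption; the table and values are carried over from the example above):

using Dapper;
using System.Data.SqlClient;

using (var conn = new SqlConnection(connectionString))   // connectionString is assumed
{
    // Only update B if A and B still hold the values we read earlier.
    int rowsAffected = conn.Execute(
        "UPDATE TestClass SET B = @NewB WHERE Id = @Id AND A = @OriginalA AND B = @OriginalB",
        new { NewB = "3", Id = 1, OriginalA = "1", OriginalB = "2" });

    if (rowsAffected == 0)
    {
        // The row was changed by somebody else since we read it: handle the conflict here.
    }
}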
Don't call the Update method on the data object in such cases; by doing so you are indicating that the whole object has been changed and needs to be updated in the DB.
So SubSonic will generate a query like
UPDATE TestClass SET A ='', B='', ModifiedOn = 'DateHere' WHERE PrimaryKey = ID
To change only property B, you need to construct the UPDATE query manually.
Have a look at the Subsonic.Update class.
Ideally you shouldn't be constructing a new instance of the data object manually; if you do, make sure the values are copied from the object returned by the Subsonic.Select query.
That way, when you update the value of even a single property, all the other properties will hold their own values from the DB rather than default values determined by the property types.

Can entities be attached to an ISession that weren't previously attached?

I'm playing around with NHibernate 3.0. So far things are pretty cool. I'm trying to attach an entity that wasn't detached previously:
var post = new Post(){ Id = 2 };
session.Update(post); // Thought this would work but it doesn't.
post.Title = "New Title After Update";
session.Flush();
What I'm trying to write is the following:
var post = new Post(){ Id = 2 };
session.Attach(post);
post.Title = "New Title After Update";
session.Flush(); // Sql should be something like: UPDATE Post SET Title='New Title After Update' WHERE Id=2
Is this possible so that only Title gets updated? This is currently possible in Entity Framework. I'd like to avoid loading Post from the database when I just need to update a few properties. Also, I'm trying to avoid a method call that would create the object... since, in my opinion, that moves away from an object-oriented approach.
EDIT: I know about using transactions, I just used Flush() to make the code simple. Ok so I think we're sort of getting on the right track for what I'm trying to achieve. I'd like to be able to create an entity with a known Id using the constructor, like I have in the 2nd code block above. I don't want to have to make a call to Get<T> or Load<T> since it feels rather wrong constructing objects like this that already exist in the database. For example, in Entity Framework I can write the 2nd code example and it will "just work". It only updates the Title property.
You can use session.Save() or session.SaveOrUpdate().
Update:
Okay, I think I see now what you are trying to do. You are trying to update a single property on a Post that was previously persisted, not a new Post, and to do that you're instantiating a new Post and giving it the Id of one in the database.
I'm not sure what you mean when you say you're trying to avoid a method call that would create the object, but the way to do this with NHibernate is this:
var post = session.Load<Post>(2);
post.Title = "New Title";
session.SaveOrUpdate(post);
In general, you should not be calling Flush() on your sessions.
The important thing to note here is the use of session.Load. Calling Load with an id in and of itself does not load the entity from the database. The entity's property values will only be loaded when/if you access them.
Of course, in this scenario, I believe that NHibernate will load the properties for the Post, (but not collections unless you've specified an eager fetch mode), and that makes sense (frankly, I don't understand why EF would not load the entity). What if the setter for your Title property does something important, like check it against the existing title, validate the title's length, check your credentials, or update another property? Simply sending an UPDATE to the database isn't sufficient.
It's possible to only update changed properties by setting dynamic-update in the mapping. However, as far as I know, it is not possible (without reverting to SQL) to perform an update without retrieving the object from the database at some point.
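For reference, a rough sketch of enabling dynamic-update, shown here with Fluent NHibernate's DynamicUpdate() (an assumption about your mapping style; in hbm.xml it is the dynamic-update="true" attribute on the class element):

// Fluent NHibernate mapping sketch: UPDATE statements will contain only dirty columns.
public class PostMap : ClassMap<Post>
{
    public PostMap()
    {
        DynamicUpdate();        // emit only changed columns in UPDATEs
        Id(x => x.Id);
        Map(x => x.Title);
    }
}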
Use the Merge method. You have to create a new variable to receive the attached entity; NHibernate will not do anything else with your detached instance.
var post = new Post(){ Id = 2 };
post.Title = "New Title After Update";
// Must create a new instance to hold final attached entity
var attachedPost = session.Merge(post);
session.Update(attachedPost);
session.Flush();
// Use attachedPost after this if still needed as in session entity
That covers the "attach" functionality you are looking for, but I don't see how you are going to be able to update only the one property. If the object instance has not been populated from the database, the properties will be different. Dynamic-update mapping will not solve this - NHibernate sees the properties as "updated" to a bunch of nulls and empty strings.
Gotta say, you are creating a new instance, but what you are actually doing is updating an existing one. You are working directly with IDs, not objects. And you are setting a single property and now have an instance potentially hanging around and doing more things, but it has not enforced any invariants and may in fact bear no resemblance to the real deal other than the id property...
It all feels pretty anti-object oriented to me personally.

Object is reinserted into database immediately after delete (DbLinq)

I have a MySql database, whose general structure looks like this:
Manufacturer <== ProbeDefinition <== ImagingSettings
ElementSettings ====^ ^==== ProbeInstance
I'm using InnoDB to allow foreign keys, and all foreign keys pointing to a ProbeDefinition have set ON DELETE CASCADE.
The issue I'm having is that when I delete a ProbeDefinition in my code, it gets immediately reinserted. The cascading delete happens properly, so the other tables are cleared, but it seems that LINQ to SQL may be sending an insert for no reason. Checking the ChangeSet property on the DataContext shows 1 delete and no inserts.
I'm using the following small bit of code to perform the delete:
database.ProbeDefinition.DeleteOnSubmit(probe);
database.SubmitChanges();
Logs in MySql show the following commands being executed when this is run:
BEGIN
use `wetscoprobes`; DELETE FROM wetscoprobes.probedefinition WHERE ID = 10
use `wetscoprobes`; INSERT INTO wetscoprobes.probedefinition (CenterFrequency, Elements, ID, IsPhased, ManufacturerID, Name, Pitch, Radius, ReverseElements) VALUES (9500000, 128, 10, 1, 6, 'Super Probe 2', 300, 0, 1)
COMMIT /* xid=2424 */
What could cause this unnecessary INSERT? Note that deleting a Manufacturer in the exact same way deletes correctly, with the following log:
BEGIN
use `wetscoprobes`; DELETE FROM wetscoprobes.manufacturer WHERE ID = 9
COMMIT /* xid=2668 */
Edit: Upon further testing, it seems that this only happens after I've populated a ListBox with a list of ProbeDefinitions.
I tried running the above delete code before and after the following snippet had run:
var manufacturer = (Manufacturer)cbxManufacturer.SelectedItem;
var probes = manufacturer.ProbeDefinition;
foreach (var probe in probes)
{
cbxProbeModel.Items.Add(probe);
}
The object gets deleted properly before said code has run, but anytime after this point, it performs an insert after the delete. Does it not like the fact that the object is referenced somewhere?
Here's the code I'm running to test deleting a definition from the intermediate window:
database.ProbeDefinition.DeleteOnSubmit(database.ProbeDefinition.Last())
database.SubmitChanges()
Turns out there are issues when there are multiple references to your object. Stepping through the DbLinq source, I learned that after a DELETE is completed, it steps through all other "watched" objects, looking for references.
In this case, I have multiple references: through the table database.ProbeDefinition as well as through the manufacturer reference, manufacturer.ProbeDefinition. This isn't an issue until I have accessed objects through both paths. Using Remove deletes the reference from manufacturer; using DeleteOnSubmit deletes the object from the table. If I do one or the other, the other reference still exists, and thus the object is marked to be reinserted. I'm not sure whether it is a bug in DbLinq that it doesn't delete the other references, or expected behavior.
Either way, in my case, the solution is either to access the table through only a single path and delete using that path, or to delete through both. For the sake of getting it working, I used the second approach:
// Delete probe
this.manufacturer.ProbeDefinition.Remove(probe);
database.ProbeDefinition.DeleteOnSubmit(probe);
database.SubmitChanges();
EDIT: Upon further work on the project and similar issues, I have found the true underlying issue of my implementation. I have a long-lived DataContext, and with how caching works (to make SubmitChanges work), you can't do this. The real solution is to have a short-lived DataContext, and reconnect to the database in each method.
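For example, a rough sketch of the short-lived DataContext pattern (the context type name and connectionString are hypothetical; requires using System.Linq for Single):

// A fresh, short-lived DataContext per operation keeps its change-tracking cache
// from holding stale references that get reinserted on SubmitChanges.
public void DeleteProbeDefinition(int probeId)
{
    using (var database = new WetscoProbesDataContext(connectionString))   // names assumed
    {
        var probe = database.ProbeDefinition.Single(p => p.ID == probeId);
        database.ProbeDefinition.DeleteOnSubmit(probe);
        database.SubmitChanges();
    }
}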
