I'm porting some existing (dynamic) SQL to C# via the SMO namespace, but I'm having a little trouble figuring out how to join an existing database to my AlwaysOn Availability Group. The SMO namespace has a Database object and an AvailabilityDatabase object, but the two seem to be somewhat orthogonal... I can't see a way to move back and forth between the two concepts. In our existing implementation we create the database, perform some operations on it, create a full backup, and then join it to the Availability Group. I'm trying to recreate this workflow in SMO, but I'm getting hung up at the join-to-Availability-Group step. If I do this...
AvailabilityGroup ag = new AvailabilityGroup(sqlServer, myExistingAgName);
AvailabilityDatabase agDb = new AvailabilityDatabase(ag, myExistingDbName);
agDb.JoinAvailablityGroup();
The operation fails and tells me that the AvailabilityDatabase hasn't been created yet. However, if I do this...
AvailabilityGroup ag = new AvailabilityGroup(sqlServer, myExistingAgName);
AvailabilityDatabase agDb = new AvailabilityDatabase(ag, myExistingDbName);
agDb.Create();
agDb.JoinAvailablityGroup();
The operation fails and tells me that the creation of the AvailabilityDatabase failed. Although the error message doesn't explicitly state this, I would assume the reason for the failure was that a DB by the name of myExistingDbName already exists, which is expected. I'm sure I'm just missing something fundamental here, but the MSDN documentation isn't very illustrative, and I'm not having any luck finding any tutorials/examples of this sort of thing online.
To add the database to the Availability Group on primary, I used the following...
// note that the Availability Group is instantiated differently than above
AvailabilityGroup ag = sqlServer.AvailabilityGroups[availabilityGroup];
AvailabilityDatabase agDb = new AvailabilityDatabase(ag, database);
agDb.Create();
After restoring the database on the secondary server(s), here's what I used to join it to the Availability Group...
AvailabilityGroup ag = sqlServer.AvailabilityGroups[availabilityGroup];
ag.AvailabilityDatabases[database].JoinAvailablityGroup();
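Putting the two pieces together, a hedged end-to-end sketch of the whole workflow might look like the following. The Backup/Restore calls, the share path, and the possibly required log backup/restore are assumptions for illustration and not part of the answer above; primaryServer and secondaryServer are assumed to be connected Server objects.
// Sketch only. A log backup/restore WITH NORECOVERY may also be required,
// depending on the recovery state the Availability Group expects.

// 1. Full backup of the database on the primary (path is hypothetical).
Backup backup = new Backup { Action = BackupActionType.Database, Database = database };
backup.Devices.AddDevice(@"\\backupShare\" + database + ".bak", DeviceType.File);
backup.SqlBackup(primaryServer);

// 2. Add the database to the Availability Group on the primary.
AvailabilityGroup primaryAg = primaryServer.AvailabilityGroups[availabilityGroup];
new AvailabilityDatabase(primaryAg, database).Create();

// 3. Restore the backup on the secondary WITH NORECOVERY.
Restore restore = new Restore { Action = RestoreActionType.Database, Database = database, NoRecovery = true };
restore.Devices.AddDevice(@"\\backupShare\" + database + ".bak", DeviceType.File);
restore.SqlRestore(secondaryServer);

// 4. Join the restored copy to the Availability Group on the secondary.
AvailabilityGroup secondaryAg = secondaryServer.AvailabilityGroups[availabilityGroup];
secondaryAg.AvailabilityDatabases[database].JoinAvailablityGroup();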
I have an issue with the following schema (attached). I want to query my database using only one object, the "Manufacturer" class, like:
var res = new XPQuery<Manufacturer>(session);
and then query all the information related to my condition in LINQ.
I have tried XPLiteObject, XPObject, the Association attribute, the NoForeignKey attribute, XPCollection, and a lot of other things, but nothing helped.
I have tried a lot of approaches, and every time I get a new exception, such as:
SelectMany - method is not supported.
Can't set foreign key in table.
Duplicate primary key.
My question is: how should I describe my classes so that data can be extracted from the database normally?
UPD:
My current solution is to call .ToList() on every object set
and then use a LINQ-to-Objects query to join the data and build the query I need.
var manufacturer = new XPQuery<Manufacturer>(session).ToList();
var cars = new XPQuery<Car>(session).ToList();
var countries = new XPQuery<Country>(session).ToList();
var result = from m in manufacturer ....
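For illustration only, the elided join over those materialized lists could look roughly like this; every property name below is hypothetical and not from the original post:
// Hypothetical in-memory join over the three materialized lists.
var result = from m in manufacturer
             join c in cars on m.Oid equals c.ManufacturerId
             join country in countries on m.CountryId equals country.Oid
             select new { Manufacturer = m.Name, Car = c.CarModel, Country = country.Name };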
So, I have found a solution to my question.
I downloaded the DevExpress installer, which adds item templates to Visual Studio.
Then I added a new item to my project using the "DevExpress ORM DataModel Wizard" template.
This wizard can create persistent objects for an existing database.
After that I can query the database with the following syntax:
var manufacturer = new XPQuery<Manufacturer>(session).Select(x => x....)...;
But if you want to use .SelectMany() in your LINQ query, you should call .ToList() first and then .SelectMany(). I ran into a lot of issues when I tried to join or perform other LINQ operations directly against XPQuery. So if you get errors, first call .ToList() after your .Select() and then perform the rest of the operation in memory.
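As a rough sketch of that pattern (the Cars collection and CarModel property on the persistent classes are assumptions for illustration):
// Materialize first, then SelectMany in memory.
var models = new XPQuery<Manufacturer>(session)
    .ToList()                      // runs against the database via XPO
    .SelectMany(m => m.Cars)       // LINQ to Objects from this point on
    .Select(c => c.CarModel)
    .ToList();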
I'm trying to join two tables from two different catalogs, but I'm not able to get it to work.
I know I have to do something with CatalogNameOverwriteHashtable, but apparently I'm doing something wrong.
The documentation link (http://www.llblgen.com/documentation/2.6/using%20the%20generated%20code/Adapter/gencode_dataaccessadapter_adapter.htm) doesn't give enough information to resolve my challenge.
I have the following situation:
I've got two Catalogs: CatalogA and CatalogB
There's an Article table in CatalogA and a StockCount table in CatalogB
I've created a manual relation. So far so good.
My guess is that I need to take the following steps:
Create a new CatalogNameOverwriteHashtable instance:
var foo = new CatalogNameOverwriteHashtable();
foo.Add("StockCount", "CatalogA");
foo.Add("Article", "CatalogB");
Assign it to the adapter: adapter.CatalogNameOverwrites = foo;
Which results in the following query:
SELECT
[dbo].[StockCount].[ArticleId],
[dbo].[Article].[Description],
[dbo].[StockCount].[ShopId],
[dbo].[StockCount].[LastMutationDateTime]
FROM ( [dbo].[StockCount] INNER JOIN
[dbo].[Article] ON [dbo].[StockCount].[ArticleId]=[dbo].[Article].[ArticleId])
Clearly I'm doing something wrong because the catalog name is missing in the query. The question is, what?
In the end the solution is very simple and proves why LLBLGen is so powerful and easy to use.
First of all: when you generate your entities, the catalog name is stored in the FieldInfoProviders. It can be retrieved with the following instruction:
var catalogName = ORMapper.Utils.DatabasePersistenceInfo.GetFieldPersistenceInfo(ArticleFields.PKey).SourceCatalogName;
This catalog name is the catalog you used to generate the entities against. This is probably your development database.
The next thing you do is map the development catalog name to the production catalog name, like so:
adapter = new DataAccessAdapter("Your connection string");
adapter.CatalogNameOverwrites =new CatalogNameOverwriteHashtable();
adapter.CatalogNameOverwrites.Add(catalogName, "ProductionCatalogName");
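With the overwrite in place, a fetch could look like the sketch below. The entity names follow the Article/StockCount example from the question, and the generated relation property (ArticleEntityUsingArticleId) is an assumption based on the manually created relation:
using (var adapter = new DataAccessAdapter("Your connection string"))
{
    adapter.CatalogNameOverwrites = new CatalogNameOverwriteHashtable();
    adapter.CatalogNameOverwrites.Add(catalogName, "ProductionCatalogName");

    // Fetch StockCount rows joined to Article across the two catalogs.
    var stockCounts = new EntityCollection<StockCountEntity>();
    var bucket = new RelationPredicateBucket();
    bucket.Relations.Add(StockCountEntity.Relations.ArticleEntityUsingArticleId); // assumed relation name
    adapter.FetchEntityCollection(stockCounts, bucket);
}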
I'm working on a piece of code, written by a coworker, that interfaces with a CRM application our company uses. There are two LINQ to Entities queries in this piece of code that get executed many times in our application, and I've been asked to optimize them because one of them is really slow.
These are the queries:
First query, this one compiles pretty much instantly. It gets relation information from the CRM database, filtering by a list of relation IDs given by the application:
from relation in context.ADRELATION
where ((relationIds.Contains(relation.FIDADRELATION)) && (relation.FLDELETED != -1))
join addressTable in context.ADDRESS on relation.FIDADDRESS equals addressTable.FIDADDRESS
into temporaryAddressTable
from address in temporaryAddressTable.DefaultIfEmpty()
join mailAddressTable in context.ADDRESS on relation.FIDMAILADDRESS equals
mailAddressTable.FIDADDRESS into temporaryMailAddressTable
from mailAddress in temporaryMailAddressTable.DefaultIfEmpty()
select new { Relation = relation, Address = address, MailAddress = mailAddress };
The second query, which takes about 4-5 seconds to compile, gets information about people from the database (again filtered by a list of IDs):
from role in context.ROLE
join relationTable in context.ADRELATION on role.FIDADRELATION equals relationTable.FIDADRELATION into temporaryRelationTable
from relation in temporaryRelationTable.DefaultIfEmpty()
join personTable in context.PERSON on role.FIDPERS equals personTable.FIDPERS into temporaryPersonTable
from person in temporaryPersonTable.DefaultIfEmpty()
join nationalityTable in context.TBNATION on person.FIDTBNATION equals nationalityTable.FIDTBNATION into temporaryNationalities
from nationality in temporaryNationalities.DefaultIfEmpty()
join titelTable in context.TBTITLE on person.FIDTBTITLE equals titelTable.FIDTBTITLE into temporaryTitles
from title in temporaryTitles.DefaultIfEmpty()
join suffixTable in context.TBSUFFIX on person.FIDTBSUFFIX equals suffixTable.FIDTBSUFFIX into temporarySuffixes
from suffix in temporarySuffixes.DefaultIfEmpty()
where ((rolIds.Contains(role.FIDROLE)) && (relation.FLDELETED != -1))
select new { Role = role, Person = person, relation = relation, Nationality = nationality, Title = title.FTXTBTITLE, Suffix = suffix.FTXTBSUFFIX };
I've set up the SQL Profiler and took the SQL from both queries, then ran it in SQL Server Management Studio. Both queries ran very fast, even with a large (~1000) number of IDs. So the problem seems to lie in the compilation of the LINQ query.
I have tried to use a compiled query, but since those can only contain primitive parameters, I had to strip out the part with the filter and apply that after the Invoke() call, so I'm not sure if that helps much. Also, since this code runs in a WCF service operation, I'm not sure if the compiled query will even still exist on subsequent calls.
Finally, what I tried was to select only a single column in the second query. While this obviously won't give me the information I need, I figured it would be faster than the ~200 columns we're selecting now. No such luck; it still took 4-5 seconds.
I'm not a LINQ guru at all, so I can barely follow this code (I have a feeling it's not written optimally, but can't put my finger on it). Could anyone give me a hint as to why this problem might be occurring?
The only solution I have left is to manually select all the information instead of joining all these tables. I'd then end up with about 5-6 queries. Not too bad I guess, but since the SQL I'm dealing with here isn't horribly inefficient (or is at least at an acceptable level of inefficiency), I was hoping to avoid that.
Thanks in advance, hope I made things clear. If not, feel free to ask and I'll provide additional details.
Edit:
I ended up adding associations on my entity framework (the target database didn't have foreign keys specified) and rewriting the query thusly:
context.ROLE.Where(role => rolIds.Contains(role.FIDROLE) && role.Relation.FLDELETED != -1)
.Select(role => new
{
ContactId = role.FIDROLE,
Person = role.Person,
Nationality = role.Person.Nationality.FTXTBNATION,
Title = role.Person.Title.FTXTBTITLE,
Suffix = role.Person.Suffix.FTXTBSUFFIX
});
Seems a lot more readable and it's faster too.
Thanks for the suggestions, I will definitely keep the one about making multiple compiled queries for different numbers of arguments in mind!
Gabriel's answer is correct: use a compiled query.
It looks like you are compiling it again for every WCF request which of course defeats the purpose of one-time initialization. Instead, put the compiled query into a static field.
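A minimal sketch of the static-field pattern, assuming the generated context ("CrmEntities" here) derives from ObjectContext and that FIDROLE is an int; CompiledQuery is the one in System.Data.Objects for the EF4 ObjectContext API:
// Compiled once per AppDomain, not once per WCF call.
static readonly Func<CrmEntities, int, IQueryable<ROLE>> RoleById =
    CompiledQuery.Compile((CrmEntities ctx, int roleId) =>
        ctx.ROLE.Where(r => r.FIDROLE == roleId));

// Inside the service operation the delegate is simply reused:
// var roles = RoleById(context, someRoleId).ToList();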
Edit:
Do this: Send maximum load to your service and pause the debugger 10 times. Look at the call stack. Did it stop more often in L2S code or in ADO.NET code? This will tell you if the problem is still with L2S or with SQL Server.
Next, let's fix the filter. We need to push it back into the compiled query. This is only possible by transforming this:
rolIds.Contains(role.FIDROLE)
to this:
role.FIDROLE == rolIds_0 || role.FIDROLE == rolIds_1 || ...
You need a new compiled query for every cardinality of rolIds. This is nasty, but it is necessary to get it to compile. In my project, I have automated this task but you can do a one-off solution here.
I guess most queries will have very few role ids, so you can materialize 10 compiled queries for cardinalities 1-10, and if the cardinality exceeds 10, fall back to client-side filtering.
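Building on the RoleById sketch above, the per-cardinality dispatch could look roughly like this (the names and the int id type are still assumptions):
// Only the two-id variant is shown; add more for higher cardinalities.
static readonly Func<CrmEntities, int, int, IQueryable<ROLE>> RolesByTwoIds =
    CompiledQuery.Compile((CrmEntities ctx, int id0, int id1) =>
        ctx.ROLE.Where(r => r.FIDROLE == id0 || r.FIDROLE == id1));

static List<ROLE> FetchRoles(CrmEntities ctx, IList<int> rolIds)
{
    switch (rolIds.Count)
    {
        case 1: return RoleById(ctx, rolIds[0]).ToList();
        case 2: return RolesByTwoIds(ctx, rolIds[0], rolIds[1]).ToList();
        default:
            // Fallback: filter client-side when the cardinality exceeds
            // the precompiled set.
            return ctx.ROLE.AsEnumerable()
                           .Where(r => rolIds.Contains(r.FIDROLE))
                           .ToList();
    }
}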
If you decide to keep the query inside the code, you could compile it. You still have to compile the query once when you run your app, but all subsequent calls will use the already compiled query. You can take a look at the MSDN help here: http://msdn.microsoft.com/en-us/library/bb399335.aspx.
Another option would be to use a stored procedure and call the procedure from your code. Hence no compile time.
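A hedged sketch of the stored procedure route, assuming a function import named "GetRolesByIds" has been added to the EF model and mapped to a stored procedure that does the joins server-side and accepts a comma-separated id list (both are assumptions):
// ObjectParameter and ExecuteFunction come from the EF4 ObjectContext API.
var idList = string.Join(",", rolIds);
var rows = context.ExecuteFunction<RoleResultRow>(   // RoleResultRow is a hypothetical complex type
    "GetRolesByIds",
    new ObjectParameter("roleIds", idList)).ToList();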
I'd like to ask about a problem I have run into. I've started working on a project with a very difficult database: many tables in the DB don't have primary keys, or have multiple PKs, so I can't add correct associations for all entities in my edmx. However, for some entities it's possible, and I managed to do so. Thus, I have two entities with an association between them: Vulner and VulnerDescription. I also have a "bad" connection table for Vulners called VulnerObjectTie (with an implied FK: VulnerObjectTie.Vulner = Vulner.Id), which I can't add correct associations to. So I decided to add the following LINQ to Entities query:
var vulerIdList = from vulner in _businessModel.DataModel.VulverSet.Include("Descriptions")
join objectVulnerTie in _businessModel.DataModel.ObjectVulnerTieSet on vulner.Id equals objectVulnerTie.Vulner
where softwareId.Contains(objectVulnerTie.SecurityObject)
select vulner;
where Descriptions is the navigation property for the association with the VulnerDescription table. The query works, but it does not load the Descriptions property. However, if I remove the join operator, the Descriptions are loaded correctly.
The most obvious solution to this problem is to split the single query into the following two queries:
var vulerIdList = from vulner in _businessModel.DataModel.VulverSet
join objectVulnerTie in _businessModel.DataModel.ObjectVulnerTieSet on vulner.Id equals objectVulnerTie.Vulner
where softwareId.Contains(objectVulnerTie.SecurityObject)
select vulner.Id;
var query = from vulner in _businessModel.DataModel.VulverSet.Include("Descriptions")
where vulerIdList.Contains(vulner.Id)
select vulner;
But I think it looks ugly. Can anyone suggest a simpler solution to this problem, or is this just a quirk of EF4?
Thank you! :)
It's a known 'feature' or limitation, depending on how you look at it. Here's an interesting discussion on the topic, I'm sure there are more references to find: http://social.msdn.microsoft.com/forums/en-US/adodotnetentityframework/thread/d700becd-fb4e-40cd-a334-9b129344edc9/
The problem here is that EF is not very well suited to "bad" databases. EF (and especially automation tools like the model wizard) expects a clear and correct database design.
Include is not supported in queries that use custom joins or projections. "Not supported" in this case means that it is silently omitted.
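One commonly mentioned workaround, sketched here under the assumption that change tracking is enabled so EF's relationship fixup populates the navigation property, is to project the Descriptions explicitly in the same query:
var query = from vulner in _businessModel.DataModel.VulverSet
            join objectVulnerTie in _businessModel.DataModel.ObjectVulnerTieSet
                on vulner.Id equals objectVulnerTie.Vulner
            where softwareId.Contains(objectVulnerTie.SecurityObject)
            // Projecting the collection makes EF materialize it alongside the entity.
            select new { Vulner = vulner, Descriptions = vulner.Descriptions };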
I have 2 data contexts in my application (different databases) and need to be able to query a table in context A with a right join on a table in context B. How do I go about doing this in LINQ2SQL?
Why?: We are using a SaaS product for tracking our time, projects, etc. and would like to send new service requests to this product to prevent our team from duplicating data entry.
Context A: This db stores service request information. It is a third party DB and we are not able to make changes to the structure of this DB as it could have unintended non-supportable consequences downstream.
Context B: This DB stores the "log" data of service requests that have been processed. My team and I have full control over this DB's structure, etc. Unprocessed service requests should find their way into this DB, and another process will identify them as not having been processed and send the records to the SaaS product.
This is the query that I am looking to modify. I was able to do a !list.Contains(c.swHDCaseId) initially, but this cannot handle more than 2100 items. Is there a way to add a join to the other context?
var query = (from c in contextA.Cases
where monitoredInboxList.Contains(c.INBOXES.inboxName)
//right join d in contextB.CaseLog on d.ID = c.ID....
select new
{
//setup fields here...
});
You could try using GetTable. I think this loads all of contextB.TableB's data first, though I'm not 100% sure. I don't have an environment set up to play around in or test this, so let me know if it works. =)
from a in contextA.TableA
join b in contextB.GetTable<TableB>() on a.id equals b.id
select new { a, b }
Your best bet, outside of database solutions, is to join using LINQ (to objects) after execution.
I realize this isn't the solution you were hoping for, but at least at this level you won't have to worry about the IN list limitation (.Contains).
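A rough sketch of that approach, with the CaseLogs set name and the ID fields assumed for illustration:
// Materialize both sides, then join in memory with LINQ to Objects.
var cases = (from c in contextA.Cases
             where monitoredInboxList.Contains(c.INBOXES.inboxName)
             select c).ToList();

var logs = contextB.CaseLogs.ToList();   // set name is assumed

// Cases with no matching log entry have not been processed yet.
var unprocessed = from c in cases
                  join log in logs on c.ID equals log.ID into matched
                  from log in matched.DefaultIfEmpty()
                  where log == null
                  select c;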
Edit:
"Outside of database solutions" above really points to linked-server solutions, where you allow the table/view from context A to exist in the database behind context B.
If you cannot extract the 2 tables into List objects and then join them, then you will probably have to do something database-side. I would recommend creating a linked server and a view on the DB server you have control of. You can then do the join in the view, and you would have a very simple LINQ query that just retrieves the view. I am not sure how LINQ to SQL could ever do a join between 2 data contexts pointing to 2 different servers.