Dynamics CRM: creating Connection entities via API - C#

So Connections in Dynamics CRM provide a general purpose way of linking things together.
Internally the Connections entity has a Record1Id attribute and a Record2Id attribute, among other things.
When you create a connection via the UI, CRM actually "creates two entries in the Connection table in the database. Each entry allows you to search for the related record from the originating record or the related record."
That is, if you connect A and B, it saves two rows to the (behind the scenes) table:
one with Record1Id = A and Record2Id = B
and one with Record1Id = B and Record2Id = A
This is to make searching for connections easier. If you do an Advanced Find on connections, you only have to do the search 'one way round'.
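For example, a minimal late-bound query sketch (using the same hypothetical someContactId as the code below) only needs to filter on record1id to find everything connected from a given contact:
// Sketch: retrieve all connections originating from one contact (one direction is enough)
var query = new QueryExpression("connection")
{
    ColumnSet = new ColumnSet("record2id", "record2roleid")
};
query.Criteria.AddCondition("record1id", ConditionOperator.Equal, someContactId);
EntityCollection results = service.RetrieveMultiple(query);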
So my question is:
When you create Connections via the API (late bound), which goes something like this:
Entity connection = new Entity("connection");
connection["record1id"] = new EntityReference("contact", someContactId);
connection["record1objecttypecode"] = new OptionSetValue(2);
connection["record1roleid"] = new EntityReference("connectionrole", someConnectionRoleId);
connection["record2id"] = new EntityReference("incident", someCaseId);
connection["record2objecttypecode"] = new OptionSetValue(122);
connection["record2roleid"] = new EntityReference("connectionrole", someOtherConnectionRoleId);
var newId = service.Create(connection);
... is it sufficient to create them 'one way round' as above, and then behind the scenes CRM will create connections in both directions?
... or do you need to manually create them in both directions? (by saving twice and swapping round the record1id record2id values, etc)
Or, in other words, does the CRM API for Connections encapsulate the 'its actually two connections behind the scenes' functionality, or do you need to manually handle that yourself?

You just need to create one connection record. One thing to note is that I don't think you need to set the typecodes as you are doing above; just setting the logical names in the entity references should be enough. Here is the sample from the SDK:
Connection newConnection = new Connection
{
    Record1Id = new EntityReference(Account.EntityLogicalName, _accountId),
    Record1RoleId = new EntityReference(ConnectionRole.EntityLogicalName, _connectionRoleId),
    Record2RoleId = new EntityReference(ConnectionRole.EntityLogicalName, _connectionRoleId),
    Record2Id = new EntityReference(Contact.EntityLogicalName, _contactId)
};
_connectionId = _serviceProxy.Create(newConnection);
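Since the question uses late binding, here is roughly the same thing late bound and without the typecode attributes; treat the omission of record1objecttypecode/record2objecttypecode as an assumption to verify in your own organization:
Entity connection = new Entity("connection");
connection["record1id"] = new EntityReference("contact", someContactId);
connection["record1roleid"] = new EntityReference("connectionrole", someConnectionRoleId);
connection["record2id"] = new EntityReference("incident", someCaseId);
connection["record2roleid"] = new EntityReference("connectionrole", someOtherConnectionRoleId);
// A single Create is enough; CRM maintains the reciprocal row behind the scenes.
var newId = service.Create(connection);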

How to programmatically create an Azure SQL Database in an Elastic Pool?

I want to write code in C# which would programmatically create and add an Azure SQL database into an existing elastic pool.
I have looked into the Elastic Database Client Library, but it does not handle the creation of databases, only the registration of existing databases as shards, which I would definitely make use of.
Is it possible to create the database by using something simple like SqlClient, or can this be done with the Azure SQL Management SDK or some other option?
You can use Transact-SQL to create the database and add it to an elastic pool in one statement. In this example, we create a new database in a pool named S3M100:
CREATE DATABASE db1 ( SERVICE_OBJECTIVE = ELASTIC_POOL ( name = S3M100 ) )
You can also use Transact-SQL to first create the database.
CREATE DATABASE YourNewDB ( EDITION = 'GeneralPurpose' );
It can be a copy of another database.
CREATE DATABASE YourNewDB AS COPY OF OldDB;
After that you can move it to any elastic pool.
ALTER DATABASE YourNewDB
MODIFY ( SERVICE_OBJECTIVE = ELASTIC_POOL ( name = pool1 ) ) ;
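Since you asked about plain SqlClient: these statements are ordinary T-SQL, so they can be issued from C# with a SqlConnection/SqlCommand against the master database of your logical server. A rough sketch (server name, credentials and pool name are placeholders):
using (var conn = new SqlConnection(
    "Server=tcp:your-server.database.windows.net;Initial Catalog=master;User ID=youradmin;Password=...;Encrypt=True;"))
{
    conn.Open();
    // CREATE DATABASE must be the only statement in its batch and completes asynchronously on Azure SQL.
    using (var cmd = new SqlCommand(
        "CREATE DATABASE YourNewDB ( SERVICE_OBJECTIVE = ELASTIC_POOL ( name = S3M100 ) )", conn))
    {
        cmd.CommandTimeout = 300; // provisioning can take a while
        cmd.ExecuteNonQuery();
    }
}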
Please see this tutorial: Create a new elastic database pool with C#.
It gives you a C# code example to create a new database in a pool:
// Create a database: configure create or update parameters and properties explicitly
DatabaseCreateOrUpdateParameters newPooledDatabaseParameters = new DatabaseCreateOrUpdateParameters()
{
    Location = currentServer.Location,
    Properties = new DatabaseCreateOrUpdateProperties()
    {
        Edition = "Standard",
        RequestedServiceObjectiveName = "ElasticPool",
        ElasticPoolName = "ElasticPool1",
        MaxSizeBytes = 268435456000, // 250 GB
        Collation = "SQL_Latin1_General_CP1_CI_AS"
    }
};
var poolDbResponse = sqlClient.Databases.CreateOrUpdate("resourcegroup-name", "server-name", "Database2", newPooledDatabaseParameters);
If you already have Azure SQL databases, you can refer to Monitor and manage an elastic database pool with C#. For example, to move a database into an elastic pool:
// Retrieve current database properties.
currentDatabase = sqlClient.Databases.Get("resourcegroup-name", "server-name", "Database1").Database;

// Configure create or update parameters with existing property values, override those to be changed.
DatabaseCreateOrUpdateParameters updatePooledDbParameters = new DatabaseCreateOrUpdateParameters()
{
    Location = currentDatabase.Location,
    Properties = new DatabaseCreateOrUpdateProperties()
    {
        Edition = "Standard",
        RequestedServiceObjectiveName = "ElasticPool",
        ElasticPoolName = "ElasticPool1",
        MaxSizeBytes = currentDatabase.Properties.MaxSizeBytes,
        Collation = currentDatabase.Properties.Collation,
    }
};

// Update the database.
var dbUpdateResponse = sqlClient.Databases.CreateOrUpdate("resourcegroup-name", "server-name", "Database1", updatePooledDbParameters);
Hope this helps.

Mongo DB with C# - document added regardless of transaction

I'm trying to test the newly supported transactions in Mongo DB with a simple example I wrote.
I'm using Mongo DB version 4.0.5 with driver version 2.8.1.
It's only a primary instance with no shards/replicas.
I must be missing something basic in the following code.
I create a Mongo client, session & database, then start a transaction, add a document and abort the transaction. After this code, I expect nothing to change in the database, but the document is added. When debugging I can also see the document right after the InsertOne() by using Robo 3T (Mongo client GUI).
Any idea what am I missing?
var client = new MongoClient("mongodb://localhost:27017");
var session = client.StartSession();
var database = session.Client.GetDatabase("myDatabase", new MongoDatabaseSettings
{
    GuidRepresentation = GuidRepresentation.Standard,
    ReadPreference = ReadPreference.Primary,
    WriteConcern = new WriteConcern(1,
        new MongoDB.Driver.Optional<TimeSpan?>(TimeSpan.FromSeconds(30))),
});
var entities = database.GetCollection<MyEntity>("test");
session.StartTransaction();
// After this line I can already see the document in the db collection using the Mongo client GUI (Robo 3T), although I expect not to see it until committing
entities.InsertOne(new MyEntity { Name = "Entity" });
// This does not have any effect
session.AbortTransaction();
Edit:
It's possible to run MongoDB as a 1-node replica set, although I'm not sure what the difference is between a standalone and a 1-node replica set.
See my post below.
In any case, to use the started transaction, the insertion code must receive the session as a parameter:
entities.InsertOne(session, new MyEntity { Name = "Entity" });
With these two changes the transaction now works.
This is inherently a property of MongoDB itself. (More here and here)
Transactions are only available in a replica set setup
Why isn't it available for standalone instances?
With subdocuments and arrays, document databases (MongoDB) allow related data to be unified hierarchically inside a single data structure. The document can be updated with an atomic operation, giving it the same data integrity guarantees as a multi-table transaction in a relational database.
I found a solution, although I'm not sure what the consequences are; maybe someone can point them out:
It seems it's possible to use Mongo DB as a 1-node replica set (instead of a standalone) by simply adding the following in the mongod.cfg file:
replication:
  replSetName: rs1
Also, thanks to the following link, the code should use the correct overload of InsertOne(), which receives the session as the first parameter (see the edit on the original post):
multiple document transaction not working in c# using mongodb 4.08 community server
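Putting both pieces together (a replica-set deployment plus the session-aware overload), the corrected test looks roughly like this sketch:
var client = new MongoClient("mongodb://localhost:27017/?replicaSet=rs1");
var database = client.GetDatabase("myDatabase");
var entities = database.GetCollection<MyEntity>("test");
using (var session = client.StartSession())
{
    session.StartTransaction();
    // Passing the session makes the insert part of the transaction.
    entities.InsertOne(session, new MyEntity { Name = "Entity" });
    // Now the abort actually rolls the insert back; call session.CommitTransaction() to persist it instead.
    session.AbortTransaction();
}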

C# how to get list of XEvent Sessions in a given database

I'm trying to write some simple Extended Events management code in C#, but am fairly new to it. I am able to set up XEvent sessions in SSMS and was able to get the LINQ stream from that created session in C# using this example.
What I would like to do now, is to be able to query a given database for what sessions exist. I could manually query the sys.dm_xe* tables and create the mapped classes for those, but it looks like the classes already exist in the Microsoft.SqlServer.Management.XEvent namespace - so I'd hate to do a poor re-implementation if something already exists.
The specific table holding what sessions exist is sys.dm_xe_sessions.
Any example code or help is appreciated. Thanks!
The class to look for is XEStore in Microsoft.SqlServer.Management.XEvent. With this you can see what Extended Events sessions exist as well as create new ones.
using (SqlConnection conn = new SqlConnection(connString)) {
    XEStore store = new XEStore(new SqlStoreConnection(conn));
    if (store.Sessions[sessionName] != null) {
        Console.WriteLine("dropping existing session");
        store.Sessions[sessionName].Drop();
    }
    Session s = store.CreateSession(sessionName);
    s.MaxMemory = 4096;
    s.MaxDispatchLatency = 30;
    s.EventRetentionMode = Session.EventRetentionModeEnum.AllowMultipleEventLoss;
    Event rpc = s.AddEvent("rpc_completed");
    rpc.AddAction("username");
    rpc.AddAction("database_name");
    rpc.AddAction("sql_text");
    rpc.PredicateExpression = @"sqlserver.username NOT LIKE '%testuser'";
    s.Create();
    s.Start();
    //s.Stop();
    //s.Drop();
}
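To answer the "list the sessions" part directly: the same XEStore exposes a Sessions collection you can enumerate, which (as far as I understand) is backed by the server's Extended Events metadata, so you don't need to hand-roll mapped classes for sys.dm_xe_sessions:
// Sketch: enumerate the Extended Events sessions defined on the server.
using (SqlConnection conn = new SqlConnection(connString)) {
    XEStore store = new XEStore(new SqlStoreConnection(conn));
    foreach (Session existing in store.Sessions) {
        Console.WriteLine(existing.Name);
    }
}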

DbMigrator does not detect pending migrations after switching database

EntityFramework migrations become useless after switching to a new Context.
DbMigrator uses the list of pending migrations from the first database instance, which means no migrations are applied to the other databases, which then leads to errors during Seed().
C# .NET 4.5 MVC project with EF 6
MS SQL Server 2014, multiple instances of same database model.
CodeFirst approach with migrations.
DbContext initializer is set to null.
On Application Start we have custom Db initialization to create and update databases. CreateDatabaseIfNotExists is working as intended; new databases have all migrations applied. However, both the MigrateDatabaseToLatestVersion initializer and our custom one fail to update any database other than the first one on the list.
foreach (var connectionString in connectionStrings)
{
    using (var context = new ApplicationDbContext(connectionString))
    {
        //Create database
        var created = context.Database.CreateIfNotExists();
        var conf = new Workshop.Migrations.Configuration();
        var migrator = new DbMigrator(conf);
        migrator.Update();
        //initial values
        conf.RunSeed(context);
    }
}
context.Database.CreateIfNotExists(); works correctly.
migrator.GetLocalMigrations() always returns correct values.
migrator.GetPendingMigrations() returns an empty list after the first database.
migrator.GetDatabaseMigrations() mirrors the pending migrations; after the first database it contains the full list even for empty databases.
Fetching data (context.xxx.ToList()) from the Db instance confirms the connection is up and working, and points to the correct instance.
Forcing an update to the most recent migration with migrator.Update("migration_name"); changes nothing. From what I gather by reading the EF source code, it checks the pending migration list on its own, which gives it faulty results.
There seems to be some caching going on under the hood, but it eludes me how to reset it.
Is there a way to perform migrations on multiple databases or is it yet another "bug by design" in EF?
Edit:
The real problem is that DbMigrator creates a new Context for its own use. It does so via the default parameterless constructor, which in my case falls back to the default (first) connection string in web.config.
I do not see a good solution for this problem, but a primitive workaround in my case is to temporarily edit the default connection string:
var originalConStr = WebConfigurationManager.ConnectionStrings["ApplicationDbContext"].ConnectionString;
var setting = WebConfigurationManager.ConnectionStrings["ApplicationDbContext"];
var fi = typeof(ConfigurationElement).GetField("_bReadOnly", BindingFlags.Instance | BindingFlags.NonPublic);
//disable readonly flag on field
fi.SetValue(setting, false);
setting.ConnectionString = temporaryConnectionString; //now it works
//DO STUFF
setting.ConnectionString = originalConStr; //revert changes
Cheat from: How do I set a connection string config programatically in .net?
I still hope someone will find real solution so for now I will refrain with self-answer.
You need to correctly set the DbMigrationsConfiguration.TargetDatabase property; otherwise the migrator will use the default connection info.
So in theory you can do something like this:
conf.TargetDatabase = new System.Data.Entity.Infrastructure.DbConnectionInfo(...);
Unfortunately, the only two public constructors of DbConnectionInfo are
public DbConnectionInfo(string connectionName)
connectionName: The name of the connection string in the application configuration.
and
public DbConnectionInfo(string connectionString, string providerInvariantName)
connectionString: The connection string to use for the connection.
providerInvariantName: The name of the provider to use for the connection. Use 'System.Data.SqlClient' for SQL Server.
I see you have the connection string, but I have no idea how you can get the providerInvariantName.
UPDATE: I didn't find a good "official" way of getting the needed information, so I ended up using a hack that accesses internal members via reflection, but IMO it's still quite a bit safer than what you have used:
var internalContext = context.GetType().GetProperty("InternalContext", BindingFlags.Instance | BindingFlags.NonPublic).GetValue(context);
var providerName = (string)internalContext.GetType().GetProperty("ProviderName").GetValue(internalContext);
var conf = new Workshop.Migrations.Configuration();
conf.TargetDatabase = new System.Data.Entity.Infrastructure.DbConnectionInfo(connectionString, providerName);
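For completeness, here is a sketch of how that could plug into the loop from the question; passing "System.Data.SqlClient" directly is an assumption that holds for the SQL Server provider, otherwise use the provider name obtained via the reflection snippet above:
foreach (var connectionString in connectionStrings)
{
    using (var context = new ApplicationDbContext(connectionString))
    {
        context.Database.CreateIfNotExists();

        var conf = new Workshop.Migrations.Configuration();
        // Point the migrator at THIS database instead of the default connection string.
        conf.TargetDatabase = new System.Data.Entity.Infrastructure.DbConnectionInfo(
            connectionString, "System.Data.SqlClient");

        var migrator = new DbMigrator(conf);
        migrator.Update();

        conf.RunSeed(context);
    }
}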

SQL Change Tracking and Microsoft Sync Framework

I'm kind of new to databases and SQL, and I'm struggling to understand how SQL Change Tracking and Microsoft Sync Framework work together.
I couldn't find clear examples of how to sync databases with Microsoft Sync Framework, but luckily I found this site, modified the code, and got syncing working on my two databases. Here is the code I ended up with:
// Server connection
using (SqlConnection serverConn = new SqlConnection(serverConnectionString))
{
    if (serverConn.State == ConnectionState.Closed)
        serverConn.Open();

    // Client connection
    using (SqlConnection clientConn = new SqlConnection(clientConnectionString))
    {
        if (clientConn.State == ConnectionState.Closed)
            clientConn.Open();

        const string scopeName = "DifferentPKScope";

        // Provision Server
        var serverProvision = new SqlSyncScopeProvisioning(serverConn);
        if (!serverProvision.ScopeExists(scopeName))
        {
            var serverScopeDesc = new DbSyncScopeDescription(scopeName);
            var serverTableDesc = SqlSyncDescriptionBuilder.GetDescriptionForTable(table, serverConn);
            // Add the table to the descriptor
            serverScopeDesc.Tables.Add(serverTableDesc);
            serverProvision.PopulateFromScopeDescription(serverScopeDesc);
            serverProvision.Apply();
        }

        // Provision Client
        var clientProvision = new SqlSyncScopeProvisioning(clientConn);
        if (!clientProvision.ScopeExists(scopeName))
        {
            var clientScopeDesc = new DbSyncScopeDescription(scopeName);
            var clientTableDesc = SqlSyncDescriptionBuilder.GetDescriptionForTable(table, clientConn);
            // Add the table to the descriptor
            clientScopeDesc.Tables.Add(clientTableDesc);
            clientProvision.PopulateFromScopeDescription(clientScopeDesc);
            clientProvision.SetCreateTrackingTableDefault(DbSyncCreationOption.CreateOrUseExisting);
            clientProvision.Apply();
        }

        // Create the sync orchestrator
        var syncOrchestrator = new SyncOrchestrator();

        // Setup providers
        var localProvider = new SqlSyncProvider(scopeName, clientConn);
        var remoteProvider = new SqlSyncProvider(scopeName, serverConn);
        syncOrchestrator.LocalProvider = localProvider;
        syncOrchestrator.RemoteProvider = remoteProvider;

        // Set the direction of sync session
        syncOrchestrator.Direction = direction;

        // Execute the synchronization process
        return syncOrchestrator.Synchronize();
    }
}
So in this way, any changes are synchronized between my two databases. But I wanted a way for my C# app to automatically synchronize both databases when something changes, so I found something called Change Tracking here. I downloaded the example code, which provides a SynchronizationHelper that also creates tables in my databases called "{TableName}_tracking". This is another table that tracks the changes, and indeed it does: whenever I change something in my database, the _tracking table is updated with the elements I changed, added or removed. But Change Tracking doesn't automatically synchronize my databases, it just keeps track of the changes in them, so what's the purpose of it?
With the first code, synchronization works but no _tracking table is created. Does it just synchronize everything in the table no matter what changed? If that's the case, should I be using Change Tracking for big databases?
Maybe this is something trivial, but I have been googling and testing a lot of code and I can't find a clear answer.
When you install Sync Framework, it comes with a help file that includes several walkthroughs of synchronizing databases. The first link you referred to and the second one use the same sync provider, and both approaches have tracking tables. Sync Framework supports using the built-in SQL Change Tracking feature or a custom change-tracking scheme that Sync Framework creates by itself (the _tracking tables).
Sync Framework sits outside of your database, and you need to invoke it in order to fire the synchronization. Change Tracking is what it says it is: tracking changes.
If you want your databases to do the sync themselves, you might want to check out SQL Replication instead.
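If "automatically" just means "shortly after something changes", one low-tech option is to keep the Sync Framework code you already have and fire it on a timer; SynchronizeDatabases below is a hypothetical wrapper around the code in the question, not part of the framework:
// Poll every 30 seconds; Sync Framework only moves rows whose tracking data changed since the last run.
var timer = new System.Timers.Timer(30000);
timer.Elapsed += (sender, args) =>
{
    SyncOperationStatistics stats = SynchronizeDatabases(); // hypothetical wrapper around the question's sync code
    Console.WriteLine("Uploaded {0} changes, downloaded {1} changes",
        stats.UploadChangesTotal, stats.DownloadChangesTotal);
};
timer.Start();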
