NHibernate transaction not working properly on Postgres - c#

I have a common DLL referenced by a desktop application and a web application. One of its methods creates 3 objects and inserts them into the DB within a transaction.
However, when I call this method at the same time from the web application and the desktop application, the objects are not inserted 3 by 3; their order is mixed (one from the desktop application, followed by one from the web application, and so on).
Is my code OK? Does this have something to do with the mapping, or the NHibernate configuration?
Thank you very much in advance.
Notifications not = new Notifications();
not.Notes = applicationName;
not.GeneratedTime = DateTime.Now;

using (var session = SessionManager.OpenSession())
using (var transaction = session.BeginTransaction())
{
    // do what you need to do with the session
    session.Save(not);

    not = new Notifications();
    not.Notes = applicationName;
    not.GeneratedTime = DateTime.Now;
    session.Save(not);

    not = new Notifications();
    not.Notes = applicationName;
    not.GeneratedTime = DateTime.Now;
    // do what you need to do with the session
    session.Save(not);

    transaction.Commit();
}

You are under the incorrect assumption that performing inserts in one transaction will prevent inserts from happening in other concurrent transactions.
This is just not true: a transaction guarantees that the three inserts commit or roll back together, not that other transactions wait their turn (unless you use a specific transaction isolation level, exclusively lock the whole table, etc... but let's just say it's not true).
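If you really need each trio of rows to land as an uninterrupted group, you have to serialize the writers yourself. A minimal sketch, assuming a Postgres table named notifications (the table name and lock mode are assumptions, and taking the lock will cost you concurrency):

using (var session = SessionManager.OpenSession())
using (var transaction = session.BeginTransaction(IsolationLevel.Serializable))
{
    // Blocks concurrent writers until commit, so the three inserts
    // from one caller land together (IsolationLevel is System.Data).
    session.CreateSQLQuery("LOCK TABLE notifications IN ACCESS EXCLUSIVE MODE").ExecuteUpdate();

    // ... the three Save calls from above ...

    transaction.Commit();
}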

Related

Dynamics CRM: CreateRequest concurrency issue

I am using the MS Dynamics CRM SDK with C#. I have a WCF service method which creates an entity record.
I am using CreateRequest in the method. The client calls this method with 2 identical requests, one immediately after the other.
There is a fetch before creating a record; if the record already exists, we update it. However, the 2 inserts happen at exactly the same time.
So 2 records with identical data are getting created in CRM.
Can someone help me prevent this concurrency issue?
You should enforce the duplicate detection rule and decide what to do when it fires. Read more
Account a = new Account();
a.Name = "My account";

CreateRequest req = new CreateRequest();
req.Parameters.Add("SuppressDuplicateDetection", false);
req.Target = a;

try
{
    service.Execute(req);
}
catch (FaultException<OrganizationServiceFault> ex)
{
    if (ex.Detail.ErrorCode == -2147220685)
    {
        // Account with name "My account" already exists
    }
    else
    {
        throw;
    }
}
As Filburt commented on your question, the preferred approach would be to use an Alternate Key and Upsert requests, but unfortunately that's not an option for you if you're working with CRM 2013.
In your scenario, I'd implement a very lightweight cache in the WCF service, probably using the MemoryCache object from the System.Runtime.Caching.dll library (small example). Before executing the query against CRM, check whether the record exists in the cache. If it doesn't, continue with your current processing, remembering to add the record to the cache with a small expiration time to cover potential concurrent executions. If it does, handle the scenario where the record already exists: here you can go from quite complex checks that detect and prevent potential data loss and unnecessary updates, down to a simple and stupid Thread.Sleep(1000).
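A minimal sketch of that idea; the cache key and the expiration window are assumptions:

using System.Runtime.Caching;
using System.Threading;

// MemoryCache.Add returns false if the key is already present, which gives
// an atomic "first caller wins" check across concurrent WCF requests.
var cache = MemoryCache.Default;
bool isFirstCaller = cache.Add(
    accountName,                        // hypothetical key for the record
    true,                               // placeholder value
    DateTimeOffset.Now.AddSeconds(30)); // small expiration window

if (!isFirstCaller)
{
    // A concurrent request is already creating this record: run the complex
    // checks mentioned above, or fall back to the simple and stupid option.
    Thread.Sleep(1000);
}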

TFS Object Model: Do WorkItems keep sessions open?

We're currently investigating an issue where, according to the firewall provider, we at times have around 1500 parallel sessions open. I strongly suspect that our TFS replication, a service which fetches work items via the TFS Object Model from an external TFS and saves some data into a local SQL database, is causing the issue.
The access to the object model looks like this:
internal static async Task<IReadOnlyCollection<WorkItem>> QueryWorkItemsAsync(string wiqlQuery)
{
    var tfsConfig = TFSConfiguration.Instances[Constants.TfsPrefix.External];
    var uri = new Uri(tfsConfig.WebbaseUri);
    var teamProject = new TfsTeamProjectCollection(uri, new VssBasicCredential(string.Empty, ConfigurationManager.AppSettings[Infrastructure.Constants.APP_SETTINGS_TFS_PAT]));
    var workItemStore = teamProject.GetService<WorkItemStore>();
    var query = new Query(workItemStore, wiqlQuery, null, false);

    var result = await Task.Run(
        () =>
        {
            var castedWorkItems = query.RunQuery().Cast<WorkItem>();
            return castedWorkItems.ToList();
        });

    return result;
}
Nothing too fancy: a WIQL query can be passed into the method. Currently I'm fetching blocks, so the WIQL looks like:
var wiql = "SELECT * FROM WorkItems";
wiql += $" WHERE [System.Id] > {minWorkItemId}";
wiql += $" AND [System.Id] <= {maxWorkItemId}";
wiql += " ORDER BY [System.Id] DESC";
I'm doing pretty much nothing with these WorkItems except mapping some of their fields; no writing, saving, or anything. I found no hints about open sessions on the objects I'm using, and the WorkItem objects themselves live in memory only very briefly.
Am I missing something here, that could explain the open sessions within that service?
The client object model does a number of things:
It keeps a connection pool for each user/projectcollection combination to speed up data transfers.
Each work item revision you hit needs to be fetched.
Each work item materialized from a query contains only the fields selected in the query, additional fields being accessed are fetched on demand.
The TfsTeamProjectCollection class implements IDisposable and must be cleaned up once in a while to ensure connections are closed. An internal cache is maintained, but disposing ensures that connections are closed.
It's probably a good idea to wrap this code in a using block, or to provide the Team Project Collection through dependency injection and manage the connection at a higher level (otherwise your on-demand fields will fail to be populated once the collection is disposed).
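A hedged sketch of the disposal idea applied to the method above; note the caveat just mentioned, i.e. everything you will need must be materialized before disposal:

// pat stands in for the personal access token lookup from the original method.
using (var teamProject = new TfsTeamProjectCollection(uri, new VssBasicCredential(string.Empty, pat)))
{
    var workItemStore = teamProject.GetService<WorkItemStore>();
    var query = new Query(workItemStore, wiqlQuery, null, false);

    // Materialize the list (including all fields you will read later)
    // before the using block ends and the connections are released.
    return query.RunQuery().Cast<WorkItem>().ToList();
}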
I don't know the exact details behind the WorkItem class, but I have observed that when you, for example, select only a few fields in the WIQL, you can still access the others... and that access is comparatively slow. If I select all the fields I later read through the indexer, it is much faster.
From that observation I would say: yes, a connection is kept open.
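To illustrate that observation (the field names are just examples), selecting up front the fields you will later read through the indexer avoids the per-field round trips:

// Slow: only System.Id is materialized; every wi["System.Title"] access
// later triggers an on-demand fetch over the kept-open connection.
string narrowWiql = "SELECT [System.Id] FROM WorkItems ORDER BY [System.Id] DESC";

// Faster overall: the fields come back with the query itself.
string wideWiql = "SELECT [System.Id], [System.Title], [System.State] FROM WorkItems ORDER BY [System.Id] DESC";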

How to track MongoDB requests from a console application

I have a Console Application project written in C# which I've added Application Insights to with the following NuGet packages.
Microsoft.ApplicationInsights
Microsoft.ApplicationInsights.Agent.Intercept
Microsoft.ApplicationInsights.DependencyCollector
Microsoft.ApplicationInsights.NLogTarget
Microsoft.ApplicationInsights.PerfCounterCollector
Microsoft.ApplicationInsights.Web
Microsoft.ApplicationInsights.WindowsServer
Microsoft.ApplicationInsights.WindowsServer.TelemetryChannel
I've configured my InstrumentationKey in the config file and I'm firing up a TelemetryClient on startup with the following code:
var telemetryClient = new TelemetryClient();
telemetryClient.Context.User.Id = Environment.UserName;
telemetryClient.Context.Session.Id = Guid.NewGuid().ToString();
telemetryClient.Context.Device.OperatingSystem = Environment.OSVersion.ToString();
Everything is working well, except that AI is not capturing any of the requests sent to Mongo. I can see requests going off to SQL Server in the 'Application map', but there is no sign of any other external requests. Is there any way I can see telemetry for the requests made to Mongo?
EDIT - Thanks to Peter Bons I ended up with pretty much the following which works like a charm and allows me to distinguish between success and failure:
var telemetryClient = new TelemetryClient();
var connectionString = connectionStringSettings.ConnectionString;
var mongoUrl = new MongoUrl(connectionString);
var mongoClientSettings = MongoClientSettings.FromUrl(mongoUrl);

mongoClientSettings.ClusterConfigurator = clusterConfigurator =>
{
    clusterConfigurator.Subscribe<CommandSucceededEvent>(e =>
    {
        telemetryClient.TrackDependency("MongoDB", e.CommandName, DateTime.Now.Subtract(e.Duration), e.Duration, true);
    });

    clusterConfigurator.Subscribe<CommandFailedEvent>(e =>
    {
        telemetryClient.TrackDependency("MongoDB", $"{e.CommandName} - {e.ToString()}", DateTime.Now.Subtract(e.Duration), e.Duration, false);
    });
};

var mongoClient = new MongoClient(mongoClientSettings);
I am not familiar with MongoDB but as far as I can tell there is no default support for it when it comes to Application Insights. But that does not mean you cannot do this, it will just involve some more code.
Again, I am not familiar with MongoDB but according to http://www.mattburkedev.com/logging-queries-from-mongodb-c-number-driver/ there is built-in support for logging the generated queries. Now, we only need to hook this up to Application Insights.
Since you already know how to use the TelemetryClient we can use the custom tracking methods provided by that class. See https://learn.microsoft.com/nl-nl/azure/application-insights/app-insights-api-custom-events-metrics for the available custom tracking methods.
All you need to do is to insert some code like this:
telemetryClient.TrackDependency(
    "MongoDB",               // The name of the dependency
    query,                   // Text of the query
    DateTime.Now,            // Time the query is executed
    TimeSpan.FromSeconds(0), // Time taken to execute the query
    true);                   // Indicates success
The TelemetryClient class is thread-safe, so you can reuse it.
Now, according to the referenced blogpost you should be able to do something like this:
var client = new MongoClient(new MongoClientSettings()
{
    Server = new MongoServerAddress("localhost"),
    ClusterConfigurator = cb =>
    {
        cb.Subscribe<CommandStartedEvent>(e =>
        {
            telemetryClient.TrackDependency(
                "MongoDB",               // The name of the dependency
                e.Command.ToJson(),      // Text of the query
                DateTime.Now,            // Time the query is executed
                TimeSpan.FromSeconds(0), // Time taken to execute the query
                true);                   // Indicates success
        });
    }
});
Again, I am not familiar with MongoDB but I hope this is a starting point for your imagination on how to adapt it to your needs using your knowledge of MongoDB.
EDIT:
If there is also a CommandCompletedEvent or similar event, as opposed to the CommandStartedEvent, you should probably track the dependency there, because then you can calculate (or simply read) the time spent and get the actual value for the success indicator. (The edit above does exactly that, using CommandSucceededEvent and CommandFailedEvent.)

How to make SqlConnection.RetrieveStatistics work?

I can't make SqlConnection.RetrieveStatistics work; it always returns a hashtable of 18 elements that are all zeros.
What I want to do: in my test environment, I want to add a header to each HTTP response with the number of SQL commands that were executed while processing the request.
My app is only (REST) web services, and I have one SQL connection per request (one NHibernate session per request).
After the request is processed, I do:
var connection = statelessSession.NHSession.Connection as ReliableSqlDbConnection;
if (connection != null)
    queries += Convert.ToInt32(connection.ReliableConnection.Current.RetrieveStatistics()["Prepares"]);
The number is always 0, and when I break on this line, I see that all numbers are zeros.
After the session is opened, I do:
var ret = module.OpenSession();
var connection = (ret.Connection as ReliableSqlDbConnection);
if (connection != null)
{
    connection.ReliableConnection.Current.StatisticsEnabled = true;
    ret.CreateSQLQuery("SET STATISTICS IO ON; SET STATISTICS TIME ON;").ExecuteUpdate();
    connection.ReliableConnection.Current.ResetStatistics();
}
return ret;
I use https://github.com/MRCollective/NHibernate.SqlAzure because in production I'm using SQL Azure, but locally I have an instance of SQL Server 2014 Developer Edition.
For my database, I have:
Auto create statistics: true
Auto update statistics: true
Did I miss something?
The cause of this was that NHibernate opens a new connection for each request; that is why I was losing the statistics each time.
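For reference, statistics are accumulated per connection object, which is why a freshly opened connection always reports zeros. A minimal sketch on a plain ADO.NET connection, outside NHibernate:

using (var conn = new SqlConnection(connectionString))
{
    conn.StatisticsEnabled = true; // must be enabled before the commands run
    conn.Open();

    // ... execute all commands for this request on this same connection ...

    // Returns the counters gathered since the last ResetStatistics call,
    // but only for work done on this particular connection object.
    var stats = conn.RetrieveStatistics();
    var prepares = Convert.ToInt64(stats["Prepares"]);
}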

mongodb C# query doesn't respond

I'm trying to get an item from a MongoDB server; sometimes it works, and after 4-5 attempts it stops responding on the last row (I can't get the object out of the query).
Has anyone seen this before? What is the right way to get the object out?
var client = new MongoClient(connectionString);
var server = client.GetServer();
var database = server.GetDatabase("myPlaces");
var collection = database.GetCollection<MongoPlace>("Places");

int startDay = int.Parse(Request.QueryString["day"]);

MongoPlace mp = collection.AsQueryable<MongoPlace>()
    .Where(x => x.guid == Request.QueryString["id"])
    .FirstOrDefault();
It's likely you're hitting the default connection pool limit.
As it looks like this is a web application, you shouldn't be opening the client more than once per instance of your web application.
The MongoClient, MongoServer, MongoDatabase and MongoCollection are all thread-safe and generally there should only be one instance of each. (See here for more information).
You'd probably want to do this as the application starts and then maintain the connections statically until the application exits.
In my ASP.NET MVC applications, I usually add a "DatabaseConfig" class that's called in the same way as other app configurations. As an example here's some code I've got in the project I'm currently building using MongoDB (there isn't any error handling yet):
var client = new MongoClient(ConfigurationManager.ConnectionStrings["DefaultConnection"].ConnectionString);
var server = client.GetServer();
DataLayer.Client = client;
DataLayer.Server = server;
var settings = new MongoDatabaseSettings(server, "default");
settings.WriteConcern = WriteConcern.Acknowledged;
DataLayer.Database = DataLayer.GetDatabase(settings);
Then, in Application_Start, I call an Initialize method that contains the code above.
DatabaseConfig.Initialize();
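For completeness, a sketch of what the static DataLayer holder referenced above might look like; the real class isn't shown in the answer, so these members are inferred from the snippet (legacy 1.x driver types):

public static class DataLayer
{
    public static MongoClient Client { get; set; }
    public static MongoServer Server { get; set; }
    public static MongoDatabase Database { get; set; }

    public static MongoDatabase GetDatabase(MongoDatabaseSettings settings)
    {
        // The legacy driver resolves a database from the server plus settings.
        return Server.GetDatabase(settings);
    }
}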
