Amazon RDS MySQL -> concurrent connection deadlocks - C#

I've built an application in C#.NET that uses MySQL as a database. My application inserts and updates data in the database using the Parallel API (so it is multithreaded). For reference, the queries fired from the different threads look like:
Update User set IsActive = 1 where UserID = 1 --each thread updates a different user id
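Roughly, the calling pattern looks like this (a simplified sketch; the connection handling and the list of user IDs are illustrative, not the exact production code):

using System.Collections.Generic;
using System.Threading.Tasks;
using MySql.Data.MySqlClient;

class UserActivator
{
    // Each iteration runs on its own thread and updates a different user.
    public static void ActivateUsers(IEnumerable<int> userIds, string connectionString)
    {
        Parallel.ForEach(userIds, userId =>
        {
            using (var connection = new MySqlConnection(connectionString))
            using (var command = new MySqlCommand(
                "UPDATE User SET IsActive = 1 WHERE UserID = @id", connection))
            {
                command.Parameters.AddWithValue("@id", userId);
                connection.Open();
                command.ExecuteNonQuery();
            }
        });
    }
}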
This has always worked on servers that I manage and on development machines.
Recently, however, we migrated the MySQL database to Amazon RDS. Since then, running this query from multiple threads (even with as few as two concurrent threads) throws an exception:
Deadlock found when trying to get lock; try restarting transaction
Any experience with this issue? Are there settings in Amazon RDS that are configured differently from a default MySQL installation?
Thanks in advance!
P.S. I have posted the same question on dba.stackexchange.com - I hope that is ok with the mods!

Related

Real Time message broadcasting using SignalR and Azure SQL Db

We have a real-time application hosted on-premises (IIS) using .NET Core 2.1. Whenever an update happens in a table, we have to capture the change and show it on the page, and to achieve this we have implemented SignalR. Since we are using Azure SQL DB, we couldn't use SqlTableDependency, as Azure SQL DB doesn't support Service Broker.
Current Implementation:
We have created a stored procedure that checks the records and captures changes, with its timeout set to 90 seconds. Once the execution completes, it captures the change and pushes it to the front end. To achieve this, we keep a loop running for 10 minutes in the back end.
Problem Statement:
As the number of users increases, the overall SQL CPU consumption becomes high. Is there a way to reduce the CPU cost, or any other approach to get real-time data using Azure SQL DB and SignalR?
SignalR Service is a fully managed Azure service that simplifies the process of adding real-time web functionality to applications over HTTP.
It seems there is limited compatibility between Azure SQL and SignalR. From this discussion we can also see that there is a compatibility issue between the two.
The solution to this problem is to use SQL Server instead of Azure SQL DB. You can install SQL Server on one of your servers, or alternatively install it on a VM in Azure.
You can follow these two documents for the process:
SignalR Scaleout with SQL Server.
SignalR with SQL Server.
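If you follow the SQL Server scaleout route, the backplane setup is roughly the following (a minimal sketch for classic ASP.NET SignalR with the Microsoft.AspNet.SignalR.SqlServer package, which is what the linked documents target rather than ASP.NET Core SignalR; the backplane database name is an assumption):

using Microsoft.AspNet.SignalR;
using Owin;

public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        // Database used purely as the SignalR backplane ("SignalRBackplane" is an illustrative name).
        string sqlConnectionString =
            "Server=.;Database=SignalRBackplane;Integrated Security=True;";

        // Register the SQL Server backplane before mapping SignalR.
        GlobalHost.DependencyResolver.UseSqlServer(sqlConnectionString);

        app.MapSignalR();
    }
}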

Background running app in asp.net c#

I want to make a background running desktop app using asp.net c#.
The idea is to check a local database table and get all the data from it in order to insert that data into a live database table.
If there is a better solution for this scenario, please suggest one.
It seems that what you want to solve is moving data from multiple local databases into a central one so you can consume it via an ASP.NET website.
I won't discuss the application part, since that's an easy problem to solve once you can connect to a DB that has all the data you need.
There are several things to consider here:
Recurrence (one time migration vs continuous synchronization)
Direction (do you need to sync one-way from local to central or two-way)
Data Contract (do local databases have same or different schema than central database)
The Data Contract is your biggest problem if schemas are different since you will need to design at least a target schema for the central database that can take in data from local dbs.
Even if the schemas are identical, you will need to devise a way to partition the data in the central database. You might need to introduce a sourceDatabaseId column in your tables so you won't have conflicting primary keys (you won't have this problem if your primary keys are GUIDs).
The other concerns can be solved by building either:
A Windows service - Inputs: periodicity (e.g. every hour), source DB and target DB connection strings. You will have a main loop that waits until it is time to run (based on the periodicity), then fetches data from the source DB and saves it into the target DB (preferably in batches).
A console application - Inputs: source DB and target DB connection strings. You will just move data in batches. You can configure a Scheduled Task on the server that performs scheduled runs of your console application to handle the periodic-running part.
You would set up one such windows service or scheduled console app per local database.
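As an illustration, the console-application variant could look roughly like this (a minimal sketch; the connection strings and the Users table are illustrative, and it simply copies everything in one batch rather than tracking deltas):

using System.Data;
using System.Data.SqlClient;

class SyncRunner
{
    static void Main()
    {
        // Illustrative connection strings; in practice read them from configuration.
        const string sourceCs = "Server=localhost;Database=LocalDb;Integrated Security=True;";
        const string targetCs = "Server=central;Database=CentralDb;Integrated Security=True;";

        using (var source = new SqlConnection(sourceCs))
        using (var target = new SqlConnection(targetCs))
        {
            source.Open();
            target.Open();

            // Pull the rows to move (hypothetical Users table).
            var rows = new DataTable();
            using (var cmd = new SqlCommand("SELECT UserID, IsActive FROM Users", source))
            using (var adapter = new SqlDataAdapter(cmd))
            {
                adapter.Fill(rows);
            }

            // Push them to the central database in one batch.
            using (var bulkCopy = new SqlBulkCopy(target))
            {
                bulkCopy.DestinationTableName = "Users";
                bulkCopy.WriteToServer(rows);
            }
        }
    }
}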
If you have complex databases you can look into tools like Microsoft Sync Framework to perform this data synchronization.
You can develop a Windows service to do this type of work and install it on the server with a timer.

There are no active servers. Background tasks will not be processed

I have a problem that I have been looking at for a few days, and I still have no solution.
I found this exception in my C# web app log.
[2015-12-03 13:56:06] [ERROR] [] [Error occurred during execution of 'Server Bootstrapper' component. Execution will be retried (attempt 120 of 2147483647) in 00:05:00 seconds.]
[System.Data.SqlClient.SqlException (0x80131904): Login failed for user "A Network account".
It appears to me that it is using the network account to access the SQL database, and because that network account is not granted access to the database, the login fails and the server cannot start up.
However, when I go to the Hangfire dashboard, I can see the recurring jobs, which suggests that Hangfire can access the database with the right account when retrieving them.
Also, on the IIS server, we have already set the Identity of the application pools to "ApplicationPoolIdentity". Hence, it should be using the virtual account instead of the network account.
Has anyone had a similar problem and found a solution? I'd really appreciate your help!
I ran into the same issue using Hangfire with SQL Server and Entity Framework Code First, where Entity Framework was responsible for creating the database, so this suggestion is based purely on that scenario.
If you are using EF Code First, your database isn't created until the context is created and accessed for the first time.
This means that at App Start your database might not exist. Hangfire won't create the database, it only creates the tables in an already existing database.
So what you might be seeing is:
No database exists.
Your app starts up and Hangfire tries to get itself running; the server process throws an error because EF hasn't created the DB.
The web application startup finishes, since the Hangfire service crashing isn't fatal to the application.
Something in your web app calls into EntityFramework.
EF runs and creates the database (likely with no Hangfire tables at this point).
You access the Hangfire dashboard; Hangfire is now able to connect, sees that the tables don't exist, and creates them (now you will see the tables in your DB).
The dashboard can now see the database and show you the stats (likely 0 servers running) so it seems like everything is working.
The way I solved it was to make sure that even if the database is empty (no tables), it is at least created. This way Hangfire can access it to install its tables and start, and EF can access it and create its schema.
Also (and this probably isn't it), GlobalConfiguration.Configuration.UseSqlServerStorage() should run before any other Hangfire startup configuration.
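A rough sketch of that ordering at application start (MyDbContext and the "HangfireConnection" connection string name are hypothetical; Database.CreateIfNotExists() is the EF6 call that creates an empty database when it is missing):

using System.Data.Entity;
using Hangfire;

public class MyDbContext : DbContext { } // hypothetical EF Code First context

public static class HangfireBootstrap
{
    public static void Configure()
    {
        // Make sure the database itself exists before Hangfire touches it.
        using (var context = new MyDbContext())
        {
            context.Database.CreateIfNotExists();
        }

        // Now Hangfire can create its own tables inside the existing database.
        GlobalConfiguration.Configuration
            .UseSqlServerStorage("HangfireConnection"); // connection string name is illustrative

        // The rest of the Hangfire startup configuration goes after this.
    }
}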
Hope that helps!
Steve
You should create a background job server instance; see the documentation here.
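For reference, a minimal sketch of creating such a server instance (the connection string name is illustrative; the important part is constructing a BackgroundJobServer after storage has been configured):

using Hangfire;

public static class JobServerBootstrap
{
    private static BackgroundJobServer _server;

    public static void Start()
    {
        // Configure storage first, then start a server so queued
        // and recurring jobs actually get processed.
        GlobalConfiguration.Configuration
            .UseSqlServerStorage("HangfireConnection"); // illustrative name

        _server = new BackgroundJobServer();
    }

    public static void Stop()
    {
        _server?.Dispose();
    }
}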
In my case, the server name was too long and there was a SQL error saying it would be truncated. I was only able to see this error after starting it with Just My Code turned off in Visual Studio.
Moral of the story: use a shorter name for the server and don't append your own unique-per-instance identifiers to it, so that it doesn't run into a truncation error. The column is an nvarchar(100).

How to Synchronize a SQL Server Database and a MySQL Database

My Scenario:
I have two applications. The first one is a website connected to a MySQL database, and the second one is a desktop application connected to a SQL Server 2008 R2 database.
The desktop application updates records locally, and the MySQL database is updated online through the website.
Problem:
Two different databases - how can we update one on the spot when changes are made in either the MySQL or the SQL Server database?
What I Want:
Databases should be synchronized with each other (e.g. if changes are made in MySQL then the SQL Server database should be updated, or if changes are made in the SQL Server database then the MySQL database should be updated)
Could anybody please suggest some code, any idea, or any solution to solve this issue?
Make use of RESTful APIs to update information from the MS SQL Server to the MySQL server.
One of the first things I would point out is that complete and perfect syncing is not possible. Unfortunately there will be data types that exist in SQL Server that don't exist in MySQL and vice versa.
But assuming the data types are pretty simple and the schemas are similar, here are some options:
Use a service bus. You can write an application that monitors both database systems and when it sees a change, it pushes an object onto the service bus. Listeners to the service bus will see the objects and write them to the appropriate destination.
Use triggers like Alex suggested. SQL Server can have CLR code execute on a trigger. The CLR code could be some C# that writes directly to MySQL. Takes some setup, but it's possible. I've investigated running a process from a trigger in MySQL and all options are ugly. It's possible, but security is a major concern. The idea is that a record is changed, trigger is fired and an external process is run.
Write an application that constantly looks for "diffs" in tables and moves data back and forth. You'll need to modify all tables to make sure there is support for date/time stamps for each record so you can track when a record has "changed".
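A rough sketch of the third option, assuming each table gets a LastModifiedUtc column and the caller keeps track of the last successful sync time (the table and column names are illustrative):

using System;
using System.Data;
using System.Data.SqlClient;
using MySql.Data.MySqlClient;

class DiffSync
{
    // Copies rows changed in SQL Server since the last run over to MySQL.
    // Table and column names (Users, UserID, IsActive, LastModifiedUtc) are illustrative.
    public static void PushChangedUsers(string sqlServerCs, string mySqlCs, DateTime lastSyncedUtc)
    {
        var changed = new DataTable();

        using (var source = new SqlConnection(sqlServerCs))
        using (var cmd = new SqlCommand(
            "SELECT UserID, IsActive FROM Users WHERE LastModifiedUtc > @since", source))
        {
            cmd.Parameters.AddWithValue("@since", lastSyncedUtc);
            source.Open();
            new SqlDataAdapter(cmd).Fill(changed);
        }

        using (var target = new MySqlConnection(mySqlCs))
        {
            target.Open();
            foreach (DataRow row in changed.Rows)
            {
                using (var upsert = new MySqlCommand(
                    "INSERT INTO Users (UserID, IsActive) VALUES (@id, @active) " +
                    "ON DUPLICATE KEY UPDATE IsActive = @active", target))
                {
                    upsert.Parameters.AddWithValue("@id", row["UserID"]);
                    upsert.Parameters.AddWithValue("@active", row["IsActive"]);
                    upsert.ExecuteNonQuery();
                }
            }
        }
    }
}

A mirror-image method would handle changes flowing from MySQL back to SQL Server.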

Synchronize Oracle database to Access database

My client application maintains an Access database and works offline most of the time.
On the server side there is a web service with an Oracle back end. I need to update the Access database tables with the latest Oracle database table data. Currently I'm doing this with a C# Windows service triggered by a timer. Is there any alternative to achieve this data synchronization with fault tolerance and good performance? Please share your experience.
You can use Quartz.NET (a full-featured, open source job scheduling system that can be used from the smallest apps to large-scale enterprise systems) to schedule the synchronization.
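For example, with Quartz.NET 3.x a recurring sync job could be wired up roughly like this (the job body, the identities, and the 15-minute interval are illustrative):

using System;
using System.Threading.Tasks;
using Quartz;
using Quartz.Impl;

// Hypothetical job that pulls the latest Oracle data (via the web service) into Access.
public class OracleToAccessSyncJob : IJob
{
    public Task Execute(IJobExecutionContext context)
    {
        // Call the web service and update the Access tables here.
        Console.WriteLine($"Sync run at {DateTime.UtcNow:O}");
        return Task.CompletedTask;
    }
}

public static class SyncScheduler
{
    public static async Task StartAsync()
    {
        var factory = new StdSchedulerFactory();
        IScheduler scheduler = await factory.GetScheduler();
        await scheduler.Start();

        IJobDetail job = JobBuilder.Create<OracleToAccessSyncJob>()
            .WithIdentity("oracleToAccessSync")
            .Build();

        // Run every 15 minutes, starting immediately.
        ITrigger trigger = TriggerBuilder.Create()
            .WithIdentity("oracleToAccessSyncTrigger")
            .StartNow()
            .WithSimpleSchedule(s => s.WithIntervalInMinutes(15).RepeatForever())
            .Build();

        await scheduler.ScheduleJob(job, trigger);
    }
}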
Database synchronization can be done using Microsoft Sync Framework, but I'm not sure whether it supports Access databases or not.
