Real-time message broadcasting using SignalR and Azure SQL DB - C#

We have a real-time application hosted on-premises (IIS) using .NET Core 2.1. Whenever an update happens in a table, we have to capture the change and show it on the page, and to achieve this we have implemented SignalR. Since we are using Azure SQL Database, we couldn't use SqlTableDependency because Azure SQL doesn't have Service Broker.
Current Implementation:
We have created a stored procedure that checks for changed records, with its timeout set to 90 seconds. Once the execution completes, it captures the change and pushes it to the front-end. To achieve this, we keep a loop running for 10 minutes in the back-end.
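A minimal sketch of this kind of polling loop (the hub, stored procedure name, and connection string below are illustrative, not the actual code):

// Sketch of the described polling approach: a background loop runs for ~10
// minutes, calls a change-detection stored procedure, and pushes any changed
// rows to connected clients through an ASP.NET Core SignalR hub.
using System;
using System.Data;
using System.Data.SqlClient;
using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR;

public class ChangesHub : Hub { }

public class ChangePoller
{
    private readonly IHubContext<ChangesHub> _hub;
    private readonly string _connectionString;

    public ChangePoller(IHubContext<ChangesHub> hub, string connectionString)
    {
        _hub = hub;
        _connectionString = connectionString;
    }

    public async Task PollForTenMinutesAsync()
    {
        var until = DateTime.UtcNow.AddMinutes(10);
        while (DateTime.UtcNow < until)
        {
            using (var conn = new SqlConnection(_connectionString))
            using (var cmd = new SqlCommand("dbo.usp_GetChangedRecords", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.CommandTimeout = 90;                  // the 90-second timeout
                await conn.OpenAsync();
                using (var reader = await cmd.ExecuteReaderAsync())
                {
                    while (await reader.ReadAsync())
                    {
                        // Push each changed row to all connected clients.
                        await _hub.Clients.All.SendAsync(
                            "recordChanged", reader["Id"], reader["Payload"]);
                    }
                }
            }
            await Task.Delay(TimeSpan.FromSeconds(5));    // pause between polls
        }
    }
}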
Problem Statement:
As the number of users increases, the overall SQL CPU consumption gets high. Is there a way to reduce the CPU cost, or any other approach to get real-time data using Azure SQL Database and SignalR?

SignalR Service is a fully managed Azure service that simplifies the process of adding real-time web functionality to applications over HTTP.
It seems there is limited compatibility between Azure SQL and SignalR. From this discussion we can also see that there is a compatibility issue between the two.
The solution to this problem is to use SQL Server instead of Azure SQL DB. You can install SQL Server on one of your servers, or alternatively install SQL Server on a VM in Azure.
You can follow these two documents for the process:
SignalR Scaleout with SQL Server.
SignalR with SQL Server.
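For reference, the SQL Server scaleout in those documents targets classic ASP.NET SignalR (not ASP.NET Core SignalR); the backplane is registered in the OWIN startup class roughly like this (the connection string is a placeholder):

// Sketch of the SQL Server backplane registration from the classic ASP.NET
// SignalR scaleout documentation (ASP.NET SignalR 2.x on the full framework).
using Microsoft.AspNet.SignalR;
using Owin;

public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        // Connection string to the database used as the backplane (placeholder).
        string sqlConnectionString =
            "Server=myserver;Database=SignalRBackplane;Integrated Security=True;";
        GlobalHost.DependencyResolver.UseSqlServer(sqlConnectionString);
        app.MapSignalR();
    }
}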

Related

Two-Way Synchronization between Azure SQL and wpf/uwp clients

I have an Azure SQL database, and clients (UWP and WPF) running SQLite that get data from the database.
The clients can change the data, which updates the Azure SQL rows, and they can insert new items/rows.
My problem is that multiple clients are working at the same time, and I would like each client to update as soon as there is a change (insert/update/delete) in the Azure SQL database. Today the user has to press update in order to get the latest data.
However, I am not sure how to implement this.
I have been looking into using SQL "Change Tracking" (https://learn.microsoft.com/en-us/sql/relational-databases/track-changes/about-change-tracking-sql-server?view=sql-server-2017) and can easily activate Change Tracking in Azure on the SQL server and table, but I cannot find any documentation on how to use it to sync to the client in C#. Is there some?
Maybe there is a better way of doing this?
Any input is much appreciated.
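For what it's worth, one way to consume Change Tracking from C# is to poll CHANGETABLE with the last version number you synced; a minimal sketch, assuming change tracking is enabled on a hypothetical dbo.Items table with an int primary key Id:

// Sketch of polling SQL Change Tracking from C#. Table and column names
// (dbo.Items, Id) are hypothetical.
using System.Collections.Generic;
using System.Data.SqlClient;

public static class ChangeTrackingPoller
{
    // Returns the ids of rows changed since 'lastVersion', plus the new version
    // number to remember for the next poll.
    public static (long newVersion, List<int> changedIds) GetChanges(
        string connectionString, long lastVersion)
    {
        var changedIds = new List<int>();
        long newVersion;

        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // Current version to pass in on the next poll.
            using (var cmd = new SqlCommand(
                "SELECT CHANGE_TRACKING_CURRENT_VERSION()", conn))
            {
                newVersion = (long)cmd.ExecuteScalar();
            }

            // All rows changed since the version we last synced.
            using (var cmd = new SqlCommand(
                @"SELECT ct.Id
                  FROM CHANGETABLE(CHANGES dbo.Items, @lastVersion) AS ct",
                conn))
            {
                cmd.Parameters.AddWithValue("@lastVersion", lastVersion);
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                        changedIds.Add(reader.GetInt32(0));
                }
            }
        }

        return (newVersion, changedIds);
    }
}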

How many databases should be replicated (publisher, distributor, subscriber) in Sitecore 9.0?

We have a Sitecore 9.0 production server which has two MSSQL databases. The environment is set up with an AWS VM and an on-premises MSSQL database. We want to replicate all databases (master, core, shared, reporting, etc.) from one SQL Server to another SQL Server.
How can we replicate the Sitecore 9.0 databases between the AWS VM and the on-premises SQL Server?
Check this link:
https://doc.sitecore.com/developers/90/platform-administration-and-architecture/en/scale-databases.html
For Sitecore there should be only one master database. There should also be only one active CM server (exceptions are possible with configuration; in the picture it is named "cold", so that looks good).
This stays close to the drawing, which is not necessarily the best solution; alternative options are certainly possible, especially for the web databases. One option is a high-availability database using SQL Server Always On. See "Sitecore: Configure SQL Server Always On", and see "Support for SQL Server scaling features" for the other options if data loss or a longer recovery time is acceptable.
SQL Server Always On is basically the Sitecore-supported option on VMs. As for database mirroring: that feature will be removed from Microsoft SQL Server in a future version, and it is recommended to use Always On Availability Groups instead.

Amazon RDS MySQL -> concurrent connection deadlocks

I've built an application in C#.NET that uses MySQL as a database. My application inserts and updates data in the database using the Parallel API (so multithreaded). For reference, the queries being fired from multiple threads look like:
Update User set IsActive = 1 where UserID = 1 --each thread updates a different user id
This has always worked on servers that I manage and development machines.
We recently migrated the MySQL database to Amazon RDS, however. Since then, if I run this query multithreaded (even with as few as two concurrent threads), it throws an exception:
Deadlock found when trying to get lock; try restarting transaction
Any experience with this issue? Are there some settings in Amazon RDS which are configured differently from a default MySql installation?
Thanks in advance!
P.S. I have posted the same question on dba.stackexchange.com - I hope that is ok with the mods!
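For what it's worth, a common mitigation is to retry the statement when MySQL reports a deadlock (error 1213); a sketch of the parallel update with a simple retry, using MySql.Data (the connection string is a placeholder and the query follows the example above):

// Sketch: run the per-user update in parallel and retry on deadlock
// (MySQL error code 1213, ER_LOCK_DEADLOCK).
using System.Threading;
using System.Threading.Tasks;
using MySql.Data.MySqlClient;

public static class UserUpdater
{
    public static void ActivateUsers(string connectionString, int[] userIds)
    {
        Parallel.ForEach(userIds, userId =>
        {
            const int maxAttempts = 3;
            for (int attempt = 1; ; attempt++)
            {
                try
                {
                    using (var conn = new MySqlConnection(connectionString))
                    using (var cmd = new MySqlCommand(
                        "UPDATE User SET IsActive = 1 WHERE UserID = @id", conn))
                    {
                        cmd.Parameters.AddWithValue("@id", userId);
                        conn.Open();
                        cmd.ExecuteNonQuery();
                    }
                    break; // success
                }
                catch (MySqlException ex) when (ex.Number == 1213 && attempt < maxAttempts)
                {
                    // Deadlock detected: back off briefly and retry the statement.
                    Thread.Sleep(50 * attempt);
                }
            }
        });
    }
}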

Efficiently using manual data synchronization between SQL Server Compact and WCF service

I have a web server and a database server. There is a WCF service on the web server and a website using it. The website requests data from the WCF service, and the WCF service connects to the database server, fetches the data, and returns it to the website.
To optimize this process and decrease the calls to the WCF service, I decided to manually cache the data on the web server. One option I thought of was the Microsoft Sync Framework, but then I realized I would have to build a sync framework myself to achieve my objective, because the Microsoft Sync Framework does not provide any option for my kind of process. My process will actually be like this:
Website requests the data.
The business logic of the website checks whether the data is available in the Compact Edition (.sdf) database in the website's App_Data folder.
If present, fetch the data from the Compact Edition database.
If not present, connect to the WCF service, fetch the data from the main database server, copy it to the Compact database, and then fetch it from the Compact Edition.
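In code, those steps amount to a cache-aside lookup; a rough sketch, assuming a hypothetical WCF client proxy (IProductServiceClient) and a simple Products table in the local .sdf file:

// Cache-aside sketch: read from the local SQL Server Compact (.sdf) file in
// App_Data first, fall back to the WCF service and populate the local copy.
// IProductServiceClient and the Products table are hypothetical.
using System.Data.SqlServerCe;

public interface IProductServiceClient
{
    string GetProductName(int productId);
}

public class ProductCache
{
    private readonly string _sdfConnectionString;    // e.g. "Data Source=|DataDirectory|\\cache.sdf"
    private readonly IProductServiceClient _service; // hypothetical WCF client proxy

    public ProductCache(string sdfConnectionString, IProductServiceClient service)
    {
        _sdfConnectionString = sdfConnectionString;
        _service = service;
    }

    public string GetProductName(int productId)
    {
        using (var conn = new SqlCeConnection(_sdfConnectionString))
        {
            conn.Open();

            // 1. Try the local Compact database first.
            using (var select = new SqlCeCommand(
                "SELECT Name FROM Products WHERE Id = @id", conn))
            {
                select.Parameters.AddWithValue("@id", productId);
                var cached = select.ExecuteScalar() as string;
                if (cached != null)
                    return cached;
            }

            // 2. Not cached: fetch from the WCF service and store locally.
            string name = _service.GetProductName(productId);
            using (var insert = new SqlCeCommand(
                "INSERT INTO Products (Id, Name) VALUES (@id, @name)", conn))
            {
                insert.Parameters.AddWithValue("@id", productId);
                insert.Parameters.AddWithValue("@name", name);
                insert.ExecuteNonQuery();
            }
            return name;
        }
    }
}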
So what I want to ask is: is this technique efficient? And if yes, is there any alternative way to quickly achieve it, or do I have to code all of it manually?
It could work, but if it were me, and my goal was to increase performance and cache the calls between the web server and a back-end SQL Server, I wouldn't choose SQL Server Compact Edition for caching purposes.
Something like Redis or memcached might be a more appropriate, higher-performance choice for your caching layer. SQL Server Compact may cut down on the number of calls to your back-end server, but it might do that at the expense of slower response times overall.
It could be efficient in some circumstances, but I would propose preserving the data cache in a local SQL Server Express instance instead of Compact.
The type system of SQL Server Compact is different from SQL Server's. I know that EF6 supports both, but I would not start this journey on my own...
In any case, there should be good reasons why you need to work with a cache through an RDBMS...
You can also use NHibernate second-level caching. For more information, read this article: http://www.codeproject.com/Articles/529016/NHibernate-Second-Level-Caching-Implementation. It will store data in memcache and detect changes to records.
Another caching technology is AppFabric; you can see more details here: http://msdn.microsoft.com/en-us/library/ff383731(v=azure.10).aspx
Performance can be increased by using memcache.
These are the steps to implement memcache:
Create a Windows service that retrieves data from the database and stores it in memcache in JSON format as key-value pairs.
For the website, create a handler file as an API that retrieves data from memcache and displays the result.
I have implemented this in one of my projects; it retrieves thousands of records in milliseconds.
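As a rough illustration of those two steps (assuming the Enyim memcached client and Json.NET; the type and key names are examples only):

// Sketch of the two memcache steps: the Windows service side stores rows as
// JSON key-value pairs, and the website handler side reads them back.
using Enyim.Caching;
using Enyim.Caching.Memcached;
using Newtonsoft.Json;

public class UserDto
{
    public int UserId { get; set; }
    public string Name { get; set; }
}

public static class UserCache
{
    // Reads memcached server settings from the application's config file.
    private static readonly MemcachedClient Client = new MemcachedClient();

    // Called from the Windows service after reading a row from the database.
    public static void Store(UserDto user)
    {
        string json = JsonConvert.SerializeObject(user);
        Client.Store(StoreMode.Set, "user:" + user.UserId, json);
    }

    // Called from the website's handler (API) to serve the cached data.
    public static UserDto Get(int userId)
    {
        var json = Client.Get<string>("user:" + userId);
        return json == null ? null : JsonConvert.DeserializeObject<UserDto>(json);
    }
}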

IIS 7 Can't read data from ASP.Net application services tables

I'm working on deploying a web application written in C# with ASP.Net Application services databases.
The application runs fine on the development machine.
A Windows Server 2003 machine has been built to test the application.
The database has been scripted across using the MS SQL Server GUI.
The ASP.NET application services tables were created using a utility.
The connection strings are stored in the web.config and connectionStrings.config.
The application connects to the database successfully, but then it times out after 10 seconds.
I think we'll need more than this to figure out why the timeout is happening. How do you know that the application is successfully connecting to the database?
If that's all the information I had and I observed those symptoms, I'd likely try to run SQL Server Profiler against the SQL instance in question to see what activity against the database is timing out.
Connection string was hardcoded somewhere in the app...ahhh
