C# Update Application Based on Updates in DB

I'm creating a chat application in C# where clients can chat privately, but I'm struggling to update the messages based on changes in the DB (SQL Server).
Should I create a constantly running thread that requests data after the last message id displayed? I think this may be inefficient; are there other methods, or could you recommend a better one?
public void Display_Messages(ListBox listbox, int conversation_id)
{
    Connection _connection = new Connection();
    _connection.conn.Open();
    // Parameterised to avoid SQL injection.
    string sql = "SELECT userId, text, messageId FROM message WHERE conversationId = @conversationId;";
    SqlCommand command = new SqlCommand(sql, _connection.conn);
    command.Parameters.AddWithValue("@conversationId", conversation_id);
    SqlDataReader data_reader = command.ExecuteReader();
    while (data_reader.Read())
    {
        listbox.Items.Add(data_reader.GetValue(0) + ": " + data_reader.GetValue(1));
    }
    data_reader.Close();
    command.Dispose();
    _connection.conn.Close();
}

You can use SQL Server's Service Broker, which can notify you about the data changes you subscribe to, but there must not be too many listeners on it for performance reasons. Polling mechanisms are usually not efficient enough and put a constant load on the system even when there is no new data to be notified about.
You can find all the information you need to use Service Broker online.
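For illustration, a minimal sketch of that approach applied to the chat scenario might look like the following. It assumes a WinForms ListBox field named listbox, a connectionString field, and that the table is reachable as dbo.message (query notifications require an explicit column list and two-part table names); this is only a sketch, not a drop-in implementation:

void StartListening(int conversationId)
{
    // Must be called once for the application before subscribing.
    SqlDependency.Start(connectionString);
    Subscribe(conversationId);
}

void Subscribe(int conversationId)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(
        "SELECT userId, text, messageId FROM dbo.message WHERE conversationId = @conversationId", conn))
    {
        cmd.Parameters.AddWithValue("@conversationId", conversationId);
        var dependency = new SqlDependency(cmd);
        dependency.OnChange += (s, e) =>
        {
            // A notification fires only once, so re-subscribe first.
            Subscribe(conversationId);
            // The event arrives on a worker thread, so marshal back to the UI thread.
            listbox.BeginInvoke(new Action(() =>
            {
                listbox.Items.Clear();
                Display_Messages(listbox, conversationId);
            }));
        };
        conn.Open();
        using (var reader = cmd.ExecuteReader())
        {
            // The command must be executed for the subscription to register.
            while (reader.Read()) { /* optionally populate the list here */ }
        }
    }
}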

Related

Can SQL Server push a message to a program?

Can SQL Server push a message to a program that listens to SQL Server?
For example:
There is a program A listening to SQL Server. SQL Server watches a table named B; when B has some data, SQL Server gets the data and pushes it to A.
Yes, it's possible, see How to run a program from SQL?.
But, as that post states, there are a lot of reasons not to do so. SQL Server was written to be queried, so it will be a lot more efficient at answering queries than at pushing messages.
You can use SqlDependency (class details) to detect changes in tables/views. This requires enabling Query Notifications.
void Initialization()
{
    // Create a dependency connection.
    SqlDependency.Start(connectionString, queueName);
}
void SomeMethod()
{
    // Assume connection is an open SqlConnection.
    // Create a new SqlCommand object.
    using (SqlCommand command = new SqlCommand(
        "SELECT ShipperID, CompanyName, Phone FROM dbo.Shippers",
        connection))
    {
        // Create a dependency and associate it with the SqlCommand.
        SqlDependency dependency = new SqlDependency(command);
        // Maintain the reference in a class member.
        // Subscribe to the SqlDependency event.
        dependency.OnChange += new OnChangeEventHandler(OnDependencyChange);
        // Execute the command.
        using (SqlDataReader reader = command.ExecuteReader())
        {
            // Process the DataReader.
        }
    }
}
// Handler method
void OnDependencyChange(object sender, SqlNotificationEventArgs e)
{
    // Handle the event (for example, invalidate this cache entry).
}
void Termination()
{
    // Release the dependency.
    SqlDependency.Stop(connectionString, queueName);
}
To use SqlDependency, Service Broker must be enabled for the SQL Server database being used, and users must have permissions to receive notifications. Service Broker objects, such as the notification queue, are predefined.
Are you looking for something like SqlServerNotifications | MSDN? For me that did not work because it requires some configuration that we were not able to implement (security reasons), so I implemented my own notification layer with my own TCP network wrapper. When one client updates the database, it sends a message with the ID and table name to all other clients, and each client updates the entry by itself. It's not easy to implement and requires a lot of design.
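Just to illustrate the idea (a simplified sketch, not the actual implementation; the host name, port, and helper methods are made up):

// Sent by the client that performed the update; every other client
// receiving it re-reads the affected row from the database.
void NotifyOtherClients(string tableName, int rowId)
{
    // Very simple wire format: "tableName|rowId" terminated by a newline.
    string message = tableName + "|" + rowId;
    using (var client = new System.Net.Sockets.TcpClient("notification-server", 9000)) // hypothetical host/port
    using (var writer = new System.IO.StreamWriter(client.GetStream()))
    {
        writer.WriteLine(message);
        writer.Flush();
    }
}

// On the receiving side, parse the message and refresh just that row.
void OnNotificationReceived(string message)
{
    var parts = message.Split('|');
    string tableName = parts[0];
    int rowId = int.Parse(parts[1]);
    // Reload the single changed entry instead of the whole table.
    ReloadRow(tableName, rowId); // hypothetical method that re-queries one row
}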

Row by row streaming of data while waiting for response

I have a WCF service and client developed using C# which handle data transfer from the SQL Server database on the server to the SQL Server database on the client end. I am facing some issues with the current architecture and plan to modify it based on an idea I have, and would like to know if it is possible to achieve, or how best I can modify the architecture to suit my needs.
The server-side database server is SQL 2008 R2 SP1 and the client-side servers are SQL 2000.
Before I state the idea, below is the overview and current shortcomings of the architecture design I am using.
Overview:
The client requests a table's data.
The WCF service queries the server database for all pending data for the requested table. This data is loaded into a DataSet.
WCF compresses the DataSet using GZIP compression and converts it to bytes for the client to download.
The client receives the byte stream, un-compresses it, and replicates the data from the DataSet to the physical table on the client database. This data is inserted row by row since I need the primary key column field to be returned to the server so that it can be flagged off as transferred.
Once the client has finished replicating the data, it uploads the successful rows' primary key fields back to the server, and in turn the server updates each field one by one.
The above procedure uses a basic http binding, with streamed transfer mode.
Shortcomings:
This works great for small amounts of data, but when it comes to bulk data, maintaining the DataSet in memory while the download is ongoing, and again on the client side while replication is ongoing, is becoming impossible, as the DataSet size sometimes goes up to 4 GB. The server can hold this much data since it's a 32 GB RAM server, but at the client side I get a System.OutOfMemoryException since the client machine has 2 GB RAM.
There are numerous deadlocks while the select query is running and also when updating, since I am using the read committed transaction isolation level.
For bulk data it is very slow and completely hangs the client machine when the DTS is ongoing.
Idea in mind:
Maintain the same service and logic of row by row transfer since I cannot change this due to sensitivity of the data, but rather than downloading bulk data I plan to use the sample given in http://code.msdn.microsoft.com/Custom-WCF-Streaming-436861e6.
Thus the new flow will be as:
Upon receiving the download request, the server will open a connection to the DB using snapshot isolation as the transaction level.
Build the row object on the server row by row and send it to the client on the requested channel; as the client receives each row object, it is processed and a success or failure response is sent back to the server on the same method and channel, since I need to update the data within the same snapshot transaction.
This way I will reduce bulk objects in memory, and rely on SQL for the snapshot data that will be maintained in tempdb once the transaction is initiated.
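To illustrate, the snapshot part of the idea would be roughly the following (a sketch only; it assumes ALLOW_SNAPSHOT_ISOLATION is ON for the server database, and the names are illustrative):

using (var conn = new SqlConnection(serverConnectionString))
{
    conn.Open();
    // Everything read inside this transaction sees a consistent snapshot;
    // the row versions are kept in tempdb until the transaction ends.
    using (SqlTransaction tx = conn.BeginTransaction(IsolationLevel.Snapshot))
    {
        using (var cmd = new SqlCommand(
            "SELECT * FROM dbo.SomeTable WHERE ISNULL(ColToCheck, 'N') <> 'Y'", conn, tx))
        using (SqlDataReader rdr = cmd.ExecuteReader())
        {
            while (rdr.Read())
            {
                // Stream each row to the client, wait for confirmation,
                // then flag it as transferred on the same transaction.
                // (Updating on the same connection while the reader is open
                // requires MultipleActiveResultSets=True in the connection string.)
            }
        }
        tx.Commit();
    }
}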
Challenge:
How can I send the row object and wait for a confirmation before sending the next one, given that the update to the server row has to occur in the same snapshot transaction? If I create another method on the service to perform the flagging off, the snapshots will be different, and this will cause data integrity issues in case the data changes after the snapshot transaction was initiated.
If this is the wrong approach, then please suggest a better one, as I am open to any suggestions.
If my understanding of the snapshot isolation is wrong, then please correct me as I am new to this.
Update 1:
I would like to achieve something like this when the client is the one requesting:
//Client invokes this method on the server
public Stream GetData(string sTableName)
{
    //Open the snapshot transaction on the server
    SqlDataReader rdr = operations.InitSnapshotTrans("Select * from " + sTableName + " Where Isnull(ColToCheck,'N') <> 'Y'");
    //Check if there are rows available
    if (rdr.HasRows)
    {
        while (rdr.Read())
        {
            SendObj sendobj = Logic.CreateObject(rdr);
            //Here is where I am stuck
            //At this point I want to write the object to the Stream
            ...Write sendobj to Stream
            //Once the client is done processing, it returns true for success or false for failure.
            if (returnObj == true)
            {
                operations.updateMovedRecord(rdr);
            }
        }
    }
}
For the server sending, I have written the code as such (I used a Pub Sub model for this):
public void ServerData(string sServerText)
{
    List<SubList> subscribers = Filter.GetClients();
    if (subscribers == null) return;
    Type type = typeof(ITransfer);
    MethodInfo publishMethodInfo = type.GetMethod("ServerData");
    foreach (SubList subscriber in subscribers)
    {
        try
        {
            //Open the snapshot transaction on the server
            SqlDataReader rdr = operations.InitSnapshotTrans("Select * from " + sTableName + " Where Isnull(ColToCheck,'N') <> 'Y'");
            //Check if there are rows available
            if (rdr.HasRows)
            {
                while (rdr.Read())
                {
                    SendObj sendobj = Logic.CreateObject(rdr);
                    bool rtnVal = Convert.ToBoolean(publishMethodInfo.Invoke(subscriber.CallBackId, new object[] { sendobj }));
                    if (rtnVal == true)
                    {
                        operations.updateMovedRecord(rdr);
                    }
                }
            }
        }
        catch (Exception ex)
        {
            Debug.WriteLine(ex.Message);
        }
    }
}
Just off the top of my head, this sounds like it might take longer. That may or may not be a concern.
Given the requirement in challenge 1 (that everything happen in the context of one method call), it sounds like what actually needs to happen is for the server to call a method on the client, sending a record, and then waiting for the client to return confirmation. That way, everything that needs to happen, happens in the context of a single call (server to client). I don't know if that's feasible in your situation.
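If it is feasible, a rough sketch of that shape using a WCF duplex (callback) contract could look like this (all names here are illustrative, not your actual contracts):

[ServiceContract(CallbackContract = typeof(IRowCallback))]
public interface ITransferService
{
    [OperationContract]
    void StartTransfer(string tableName);
}

// Implemented on the client; the server calls this for each row and
// blocks until the client returns success or failure.
public interface IRowCallback
{
    [OperationContract]
    bool ReceiveRow(SendObj row);
}

public class TransferService : ITransferService
{
    public void StartTransfer(string tableName)
    {
        IRowCallback callback = OperationContext.Current.GetCallbackChannel<IRowCallback>();
        // ... open the snapshot transaction and the data reader here ...
        // foreach row:
        //     bool ok = callback.ReceiveRow(row);   // synchronous round trip to the client
        //     if (ok) { /* flag the row as transferred on the same transaction */ }
    }
}

Note that a duplex callback needs a binding that supports it (for example netTcpBinding or wsDualHttpBinding) rather than basicHttpBinding.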
Another option might be to use some kind of double-queue system (perhaps with MSMQ?) so that the server and client can maintain an ongoing conversation within a single session.
I assume there's a reason why you can't just divide the data to be downloaded into manageable chunks and repeatedly execute the original process on the chunks. That sounds like the least ambitious option, but you probably would have done it already if it met all your needs.
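If chunking does turn out to be acceptable, a sketch of it could look like this (the helper methods are hypothetical; keyset paging with TOP keeps it compatible with the existing row-by-row flagging):

int batchSize = 1000;      // small enough to keep the client's memory usage bounded
long lastKey = 0;
while (true)
{
    DataSet chunk = GetPendingRows(tableName, lastKey, batchSize);    // hypothetical server call
    if (chunk.Tables[0].Rows.Count == 0)
        break;                                                        // nothing left to transfer

    // Replicate this chunk row by row on the client, exactly as before,
    // and collect the primary keys of the rows that succeeded.
    List<long> succeededKeys = ReplicateChunk(chunk);                 // hypothetical client-side method
    MarkAsTransferred(succeededKeys);                                 // hypothetical server call

    lastKey = GetMaxKey(chunk);                                       // continue after the last row seen
}

The server-side query behind GetPendingRows would be something like "SELECT TOP (@batchSize) * FROM dbo.SomeTable WHERE PrimaryKey > @lastKey AND ISNULL(ColToCheck,'N') <> 'Y' ORDER BY PrimaryKey".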

SqlDependency in a Windows Service not firing

I am trying to monitor a database table for changes using the SqlDependency class. Though I must be missing something. I have followed all of the examples that I see online and I have reviewed all the questions on this site. I just don't see what I am missing. Here are the initial commands that I ran on the database to enable the service broker and create the queue and service.
CREATE QUEUE ScheduleChangeQueue
GO
CREATE SERVICE ScheduleChangeService ON QUEUE ScheduleChangeQueue ([http://schemas.microsoft.com/SQL/Notifications/PostQueryNotification])
GO
ALTER DATABASE [database] SET ENABLE_BROKER
On the C# side, I have created a class that has a single static Setup method that is called to initiate the process. Here is the code for that:
public class SqlDependencyManager
{
    private static bool DoesUserHavePermission()
    {
        var success = false;
        try
        {
            Program.Log.Info("Retrieving SqlPermission to establish dependency...");
            var clientPermission = new SqlClientPermission(System.Security.Permissions.PermissionState.Unrestricted);
            // this will throw an error if the user does not have the permissions
            clientPermission.Demand();
            success = true;
            Program.Log.Info("SqlPermission established. Continue setting up dependency.");
        }
        catch (Exception ex)
        {
            Program.Log.Error(ex, "SqlPermission not able to be established.");
        }
        return success;
    }

    public static void Setup()
    {
        if (!DoesUserHavePermission())
        {
            return;
        }
        var connectionString = ConfigurationManager.ConnectionStrings["ShowMakerPro"].ConnectionString;
        // You must stop the dependency before starting a new one.
        // You must start the dependency when creating a new one.
        SqlDependency.Stop(connectionString);
        SqlDependency.Start(connectionString);
        using (var cn = new SqlConnection(connectionString))
        {
            using (var cmd = cn.CreateCommand())
            {
                cmd.CommandType = CommandType.Text;
                //cmd.CommandText = "SELECT MAX(LastChangeTime) FROM Schedule WHERE ChannelID IN ( SELECT ID FROM Channels WHERE Type = 1 ) AND StartTime BETWEEN (GETDATE() - 7) AND (GETDATE() + 30)";
                cmd.CommandText = "SELECT LastChangeTime FROM dbo.Schedule";
                cmd.Notification = null;
                // Creates a new dependency for the SqlCommand, then attaches a handler for the notification of data changes
                new SqlDependency(cmd).OnChange += SqlDependency_OnChange;
                cn.Open();
                cmd.ExecuteReader();
            }
        }
        Program.Log.Info("SQL Dependency set. Now monitoring schedule table for changes.");
    }

    private static void SqlDependency_OnChange(object sender, SqlNotificationEventArgs e)
    {
        if (e.Type == SqlNotificationType.Change)
        {
            // this will remove the event handler since the dependency is only for a single notification
            ((SqlDependency)sender).OnChange -= SqlDependency_OnChange;
            ScheduleOutputterService.BuildSchedules();
            Program.Log.Info("SQL Dependency triggered schedule rebuild. Resetting SqlDependency to monitor for changes.");
            Setup();
        }
    }
}
I see the code get set up OK, and the OnChange method fires once for the subscribe, but then I never see it fire again. I manually go into the database and change the LastChangeTime field, hoping that it will force the event to fire, but nothing happens.
Can someone please shed some light on where I am going wrong? I see some people saying online that this works fine in a Windows Forms app, but they are also having problems while in a service.
So I finally figured out the answer to my question and I thought I should list all the steps I took to get to this point so someone else coming along behind me will also have another place to look for answers since I seemed unable to find all of my answers in one place.
First off, I noticed in my situation that as soon as the subscription was set, the OnChange event would fire right away. That is why I put in the check for the change type, so I could ignore those events. It turns out that ignoring those events was not a good thing, because those events were actually trying to tell me something. Searching on my values directed me here:
http://msmvps.com/blogs/siva/archive/2011/11/22/subtle-sqldependency-notification-issue.aspx
This was very valuable because it helped me to see that there must have been a problem with some of my options in the database. Upon further inspection I noticed that my database was set to SQL Server 2000 compatibility. That was obviously my first problem, because this is a 2005-and-later feature. So I changed my setting to the higher version. This worked OK, but then I noticed that I was still receiving the same event. So I checked my database settings and found that they did not match the options required to run Service Broker. You can see all the required option settings here:
http://msdn.microsoft.com/en-us/library/ms181122(v=SQL.100).aspx
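As a side note, a quick sketch of how to check the two settings that bit me here (Service Broker enabled and the database compatibility level) from code, using the same connection string as above:

using (var cn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(
    "SELECT is_broker_enabled, compatibility_level FROM sys.databases WHERE name = DB_NAME()", cn))
{
    cn.Open();
    using (var rdr = cmd.ExecuteReader())
    {
        if (rdr.Read())
        {
            // is_broker_enabled must be 1, and the compatibility level must be 90 (2005) or higher,
            // for query notifications to work.
            Program.Log.Info(string.Format("Broker enabled: {0}, compatibility level: {1}",
                rdr.GetBoolean(0), rdr.GetByte(1)));
        }
    }
}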
After I inspected all of these and tried some workarounds to get the right settings squared away, the setup was still failing. But this time it was failing because the updates would not save. It turned out that the client had triggers on the table in question, and the database settings being used to execute those triggers were in conflict with the settings needed to run query notifications.
So long story short, the client decided that they did not want to change all the triggers that were being used, so they abandoned the effort. But I learned a lot about how to troubleshoot SqlDependency and Service Broker. Hopefully these few links will be helpful for someone else. A couple more links that were very helpful were mentioned in the comments, but I am going to repost them in this answer so you have some other items to review.
http://rusanu.com/2006/06/17/the-mysterious-notification/
http://rusanu.com/2005/12/20/troubleshooting-dialogs/

alternate options for infinite loop

We are running a third party application which uses C# and SQL Server. We have created another application which prints the pass.
Basically it continuously checks for new entries from the third party application in one of the tables on a remote database. If a new entry is present, it prints the pass. Accessing a network database in this way is not good, and sometimes the application hangs.
Instead of a continuous loop, I am searching for some other way, e.g. as a new entry comes in, it triggers my application to print, or any other good way to implement this.
What you are looking for is SqlDependency, which can help you listen for the OnChange event.
Example from MSDN:
void Initialization()
{
    // Create a dependency connection.
    SqlDependency.Start(connectionString, queueName);
}
void SomeMethod()
{
    // Assume connection is an open SqlConnection.
    // Create a new SqlCommand object.
    using (SqlCommand command = new SqlCommand(
        "SELECT ShipperID, CompanyName, Phone FROM dbo.Shippers",
        connection))
    {
        // Create a dependency and associate it with the SqlCommand.
        SqlDependency dependency = new SqlDependency(command);
        // Maintain the reference in a class member.
        // Subscribe to the SqlDependency event.
        dependency.OnChange += new OnChangeEventHandler(OnDependencyChange);
        // Execute the command.
        using (SqlDataReader reader = command.ExecuteReader())
        {
            // Process the DataReader.
        }
    }
}
// Handler method
void OnDependencyChange(object sender, SqlNotificationEventArgs e)
{
    // Handle the event (for example, invalidate this cache entry).
}
void Termination()
{
    // Release the dependency.
    SqlDependency.Stop(connectionString, queueName);
}
Check it out: http://msdn.microsoft.com/en-us/library/62xk7953.aspx
If any user subsequently changes the underlying data, Microsoft SQL Server detects that there is a notification pending for such a change, and posts a notification that is processed and forwarded to the client through the underlying SqlConnection that was created by calling SqlDependency.Start. The client listener receives the invalidation message. The client listener then locates the associated SqlDependency object and fires the OnChange event.
Let me just sum up the options.
You need to give us more information for more intelligent help here, but basically you can:
Use a timer: that way you are not in an infinite loop and you don't check the 3rd party every few ticks. You can be more sophisticated and make the time interval grow and shrink according to whether data changes were found or not (see the sketch after this list).
If you can change the 3rd party so it sends events or signals when it has new data, then you can register for those events and save a lot of processing time.
If you can change the DB, add triggers and use them to know when new data comes in; then you don't need anything more than that.
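For the timer option, a minimal sketch of the grow/shrink interval idea (CheckForNewEntry and PrintPass are hypothetical stand-ins for your existing query and printing code):

// Adaptive polling: check often while data is flowing, back off when idle.
var timer = new System.Timers.Timer(1000);            // start by checking once per second
const double minInterval = 1000;
const double maxInterval = 30000;
timer.Elapsed += (s, e) =>
{
    bool foundNewEntry = CheckForNewEntry();           // hypothetical: your existing "is there a new row" query
    if (foundNewEntry)
    {
        PrintPass();                                   // hypothetical: your existing printing code
        timer.Interval = minInterval;                  // data is flowing, poll more often
    }
    else
    {
        timer.Interval = Math.Min(timer.Interval * 2, maxInterval);   // nothing new, back off gradually
    }
};
timer.Start();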

SQLDependency + Service Broker

I'm using SqlDependency to get notification when data in some table are changed.
private void subscribeBroker()
{
    using (var conn = new SqlConnection(connString))
    {
        conn.Open();
        var cmd = new SqlCommand("SELECT text FROM dbo.Test");
        cmd.Connection = conn;
        var dependency = new SqlDependency(cmd);
        dependency.OnChange += dependency_OnChange;
        SqlDependency.Start(connString);
        cmd.ExecuteNonQuery();
    }
}
void dependency_OnChange(object sender, SqlNotificationEventArgs e)
{
    //Do something...
    subscribeBroker();
}
It is working but I have some questions.
1) I didn't find a way to get information about which row was changed. I need to read all the data from the entire table to see what is different. Is there a way to get this information (a primary ID, or something)? Maybe by using a different approach than SqlDependency?
2) What if "somebody" changes the data very fast? Is it possible that some changes will not be notified? (I'm concerned about the time between a notification and the time when I subscribe again.)
Thank you.
About 1 - a query notification informs you about the fact that something has changed. If you want to get what was changed since last time, you could probably use a timestamp (rowversion) column.
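For example, with a rowversion column (here assumed to be called rv on dbo.Test), you could fetch only the rows changed since the last value you saw; a rough sketch:

byte[] lastVersion = new byte[8];   // keep the highest rowversion seen so far

void ReadChanges(SqlConnection conn)
{
    using (var cmd = new SqlCommand(
        "SELECT text, rv FROM dbo.Test WHERE rv > @lastVersion ORDER BY rv", conn))
    {
        cmd.Parameters.Add("@lastVersion", SqlDbType.Binary, 8).Value = lastVersion;
        using (var rdr = cmd.ExecuteReader())
        {
            while (rdr.Read())
            {
                // process only the changed row
                string text = rdr.GetString(0);
                lastVersion = (byte[])rdr.GetValue(1);   // rows are ordered, so this ends up as the max
            }
        }
    }
}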
About 2 - a query notification informs you about changes and is then dropped; you then subscribe for a notification again. That means the time between the dropping and the re-creation of the notification is the window in which changes are not notified.
Query notifications are more for situations when your data is not changing frequently, for example some cached classification values. You subscribe for changes in some table, wait for changes, and when they happen you get the latest version of the data. Also consider that query notifications use server resources, so if you have a huge table and want to get changes on some small subset of data, a lot of queries can be affected in terms of performance (something like an indexed view).
If you need to take some action based on changed data and each change is important, then I would guess that a trigger + Service Broker could be more effective. Or, depending on your needs, Change Data Capture.
