Performance improvement of a client-server system - C#

I have a legacy client-server system where the server maintains a record of some data stored in a SQLite database. The data relates to monitoring the access patterns of files stored on the server. The client application is basically a remote viewer of the data. When the client is launched, it connects to the server and fetches the data to display in a grid view. The data gets updated in real time on the server, and the view in the client is automatically refreshed.
There are two problems with the current implementation:
When the database gets too big, it takes a long time to load the client. What are the best ways to deal with this? One option is to maintain a cache on the client side. How is such a cache best implemented?
How can the server maintain a diff so that it only sends the diff during each refresh cycle? There can be multiple clients, and each client needs to display the latest data available on the server.
The server is a Windows service daemon. Both the client and the server are implemented in C#.

You could put an indexed date/timestamp column on the data and then load only the rows with a timestamp greater than that of the last successful load.
Load the data in pages so you get a quicker startup, then load the rest in the background.
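
For illustration, a minimal sketch of the timestamp approach, assuming a hypothetical FileAccess table with an indexed UpdatedAt column storing sortable ISO-8601 UTC strings (all names are invented), using the Microsoft.Data.Sqlite package:

    using System.Collections.Generic;
    using Microsoft.Data.Sqlite;

    public record FileAccessRow(long Id, string Path, int AccessCount, string UpdatedAt);

    public static class IncrementalLoader
    {
        // Returns only the rows changed since the client's last successful load.
        public static List<FileAccessRow> LoadSince(string dbPath, string lastLoadedUtc)
        {
            var rows = new List<FileAccessRow>();
            using var conn = new SqliteConnection($"Data Source={dbPath}");
            conn.Open();

            var cmd = conn.CreateCommand();
            cmd.CommandText =
                @"SELECT Id, Path, AccessCount, UpdatedAt
                  FROM FileAccess
                  WHERE UpdatedAt > $since
                  ORDER BY UpdatedAt";
            cmd.Parameters.AddWithValue("$since", lastLoadedUtc);

            using var reader = cmd.ExecuteReader();
            while (reader.Read())
                rows.Add(new FileAccessRow(
                    reader.GetInt64(0), reader.GetString(1),
                    reader.GetInt32(2), reader.GetString(3)));
            return rows;
        }
    }

The client would persist the largest UpdatedAt value it has seen and pass it back on the next refresh, so each refresh transfers only the diff.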

If you go for the "work offline" (cached) solution, you should take a look at the MS ADO.NET Sync Framework. It supports providers and solves most of the hard problems of synchronizing data.
Another option is to retrieve only selected columns, such as the primary key and a single descriptive column, and lazy-load the remaining data on demand, for example when a row scrolls into view or is accessed.
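
A minimal sketch of that lazy-loading pattern (the RowDetails shape and the fetch delegate are invented for illustration; the fetch would typically call back to the server):

    using System;

    public record RowDetails(string Path, int AccessCount, DateTime LastAccess);

    public class LazyRow
    {
        private readonly long _id;
        private readonly Func<long, RowDetails> _fetch;
        private RowDetails? _details;

        public LazyRow(long id, string displayName, Func<long, RowDetails> fetch)
        {
            _id = id;
            DisplayName = displayName;
            _fetch = fetch;
        }

        // Cheap column fetched up front for the grid.
        public string DisplayName { get; }

        // Fetched from the server only the first time it is accessed.
        public RowDetails Details => _details ??= _fetch(_id);
    }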

Related

How to send/receive data via dial-up with Atapi.dll?

I have a C# project that retrieves data from a SQL Server database and stores data in it. I have copies of this software, each running in a different place (a different area), and each copy stores data in its own database. I have to sync the data among these databases over a phone line. I have recently read about Atapi.dll. Could I use this DLL to synchronize the databases by sending and receiving data between the applications?
For example: the first site has to send its new records to the other site.
The first site has a dial-up phone number (e.g. 1234566).
The other site has a number (e.g. 3456784). How can files be sent and received between the two applications using these dial-up numbers?
Writing your own file-sync mechanism may sound simple, but it is not easy, especially if you need to sync multiple parties.
Rather than writing your own sync tool, I would strongly encourage you to use SQL Server replication, a feature built into SQL Server itself that supports exactly the scenario you describe above.
If I am understanding your scenario:
You have a master database with all records from all branch sites
You have a subset of that data at each site - the latest copy of the master data plus any changes made at the local site
You periodically want each site to dial in to the master server and sync data back and forth, so that your site changes are pushed up to the master server and the master DB's changes are pushed out to the branch DBs.
To support this scenario, you just configure your branch offices to dial into the master office periodically and configure SQL Server to replicate data as appropriate.
I've previously configured a 25-branch organization to use dial-up and broadband connections to sync a large SQL Server production database in less than 2 days, including time to update their backup strategy to account for the needs of the replication strategy employed.
Compared to writing your own sync engine, using SQL Server replication will likely save you many months of development effort and many man-years of debugging and operational support!
You don't want to be dealing with dial-up yourself. Investigate Windows RAS, which sets up a TCP/IP connection between two hosts using dial-up. It can be driven from C#.
Once you've done that, investigate SQL Server Replication to sync the data over the established connection.
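
If a full RAS wrapper is more than you need, a crude but workable sketch is to drive the built-in rasdial.exe utility from C# (the phonebook entry name and credentials here are placeholders; a proper RAS P/Invoke wrapper would be more robust):

    using System.Diagnostics;

    public static class DialUp
    {
        // Dials an existing RAS phonebook entry via rasdial.exe and
        // reports whether the connection came up (exit code 0).
        public static bool Connect(string entryName, string user, string password)
        {
            var psi = new ProcessStartInfo
            {
                FileName = "rasdial.exe",
                Arguments = $"\"{entryName}\" {user} {password}",
                UseShellExecute = false,
                CreateNoWindow = true
            };
            using var proc = Process.Start(psi)!;
            proc.WaitForExit();
            return proc.ExitCode == 0;
        }

        // Drops the connection when the sync is finished.
        public static void Disconnect(string entryName)
            => Process.Start("rasdial.exe", $"\"{entryName}\" /disconnect")!.WaitForExit();
    }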

Database Data Protection

The scenario is that our client owns and manages a system (we wrote it) hosted at their client's premises. Their client is contractually restricted from changing any data in the database behind the system, but they could change the data if they chose to, because they have full admin rights (the server is procured by them and hosted on their premises).
The requirement is to get notification if they change any data. For now, please ignore deleting data; this discussion is about amendments to data in tables.
We are using LINQ to SQL and have overridden the data context so that for each read of the data, we compare a hash of the row's data against a stored hash, previously computed during insert/update and held on each row in the table.
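
For illustration, a minimal sketch of such per-row hashing (the helper, separator, and column handling are invented; the question does not show the real implementation). A keyed hash is assumed here, since a plain hash could simply be recomputed by whoever edits the row:

    using System;
    using System.Security.Cryptography;
    using System.Text;

    public static class RowHasher
    {
        // Joins the column values with a unit separator (so ("ab","c") and
        // ("a","bc") hash differently) and computes an HMAC over the result.
        public static string ComputeHash(byte[] secretKey, params object?[] columns)
        {
            var payload = string.Join("\u001F",
                Array.ConvertAll(columns, c => c?.ToString() ?? ""));
            using var hmac = new HMACSHA256(secretKey);
            return Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(payload)));
        }

        public static bool Verify(byte[] secretKey, string storedHash, params object?[] columns)
            => ComputeHash(secretKey, columns) == storedHash;
    }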
We are concerned about scalability, so I would like to know if anyone has any other ideas. We are trying to get notified of data changes made in SSMS, by queries run directly against the DB, etc. Also, if someone were to stop our service (a Windows service), upon startup we would need to know whether a row had been changed. Any thoughts?
EDIT: Let me just clarify, as I could have been clearer. We are not necessarily trying to stop changes from being made (this is impossible, as they have full access) but rather to get notified if they change the data.
The answer is simple: to prevent the client from directly manipulating the data, store it out of their reach in a Windows Azure or Amazon EC2 instance. The most they will be able to do is get the connection string, which will then connect them as a limited-rights user.
Also, if someone were to stop our service (a Windows service), upon startup we would need to know whether a row had been changed.
You can create triggers that write whatever info you want to an audit table; you can then inspect the audit table to distinguish changes made by your application from changes made directly by the client. Auditing database changes is a well-known problem that has been solved many times before, and there is plenty of information out there about it.
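
A minimal sketch of installing such a trigger from C# (the Orders/OrderAudit tables and their columns are invented for illustration; the audit table must already exist). The trigger records every UPDATE regardless of whether it came from the application, SSMS, or an ad-hoc query:

    using Microsoft.Data.SqlClient;

    public static class AuditSetup
    {
        public static void CreateAuditTrigger(string connectionString)
        {
            const string ddl = @"
                CREATE TRIGGER trg_Orders_Audit ON dbo.Orders
                AFTER UPDATE
                AS
                BEGIN
                    INSERT INTO dbo.OrderAudit (OrderId, ChangedAt, ChangedBy)
                    SELECT i.OrderId, SYSUTCDATETIME(), SUSER_SNAME()
                    FROM inserted i;
                END";

            using var conn = new SqlConnection(connectionString);
            conn.Open();
            using var cmd = new SqlCommand(ddl, conn);
            cmd.ExecuteNonQuery();
        }
    }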
for each read of the data, we compare a hash of the row's data against a stored hash
As you can probably guess, this is painfully slow and not scalable.

What is the best way to sync multiple SQL Servers to one SQL Server 2005?

I have several client databases used by my Windows application.
I want to send this data to an online web site.
The client database and server database structures are different because we need to add a client ID column to some tables in the server database.
The way I sync the databases now is with another application that uses C# bulk copy within a transaction.
The SQL Server hosting my server database is too busy, and a parallel task cannot be run.
I am working on this solution:
I use AFTER UPDATE, DELETE, and INSERT triggers to save the changes to one table, and I build a SQL query to send to a web service to sync the data.
But I must send all the data first! It is a huge data set (bigger than 16 MB).
I think I can't use replication because the structures and primary keys are different.
Have you considered using SSIS to do scheduled data synchronization? You can do data transformation and bulk inserts fairly easily.
As I understand what you're trying to do, you want to allow multiple client applications to have their data synchronized to a server in such a way that the server has all the data from all the sites, but that each record also has a client identifier so you can maintain traceability back to the source.
Why must you send all the data to the server before you get the other information set up? You should be able to build all these things concurrently. Also, you don't have to upload all the data at one time. Stage the uploads out to one per day (assuming you have a small number of client databases); that would give you a way to focus on each in turn and make sure the process completes accurately.
Will you be replicating the data back to the clients after consolidating it all into one table? Your size information was unclear: were you saying each database is larger than 16 GB? Five sites would then have a cumulative 80 GB to replicate back to the individual sites.
Otherwise, the method you outlined, using a separate application to handle the uploading of data, would be the most appropriate.
Are you going to upgrade the individual schemas after you update the master database? You may want to ALTER TABLE and add a bool column to every table to mark records as "sent" or "not sent", to keep track of any records that were updated or inserted late. I have a feeling you're going to be doing a rolling deployment/upgrade and are trying to figure out how to keep it all in sync without losing anything.
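
A minimal sketch of that "sent" flag combined with SqlBulkCopy (the table names, ClientId column, and monotonically increasing Id are all invented assumptions; note the two servers are not joined in a distributed transaction):

    using System;
    using System.Data;
    using System.Data.SqlClient;

    public static class BatchUploader
    {
        public static void PushUnsentRows(string clientConn, string serverConn, int clientId)
        {
            using var src = new SqlConnection(clientConn);
            src.Open();
            using var tx = src.BeginTransaction();

            // Read the rows not yet sent to the central server.
            var table = new DataTable();
            using (var reader = new SqlCommand(
                "SELECT Id, Payload FROM dbo.Measurements WHERE Sent = 0", src, tx)
                .ExecuteReader())
            {
                table.Load(reader);
            }
            if (table.Rows.Count == 0) { tx.Commit(); return; }

            // Add the client ID column the server schema requires and
            // remember the highest Id in this batch.
            table.Columns.Add("ClientId", typeof(int));
            long maxId = 0;
            foreach (DataRow row in table.Rows)
            {
                row["ClientId"] = clientId;
                maxId = Math.Max(maxId, Convert.ToInt64(row["Id"]));
            }

            // Bulk-insert the batch into the consolidated server table.
            using (var dst = new SqlConnection(serverConn))
            {
                dst.Open();
                using var bulk = new SqlBulkCopy(dst) { DestinationTableName = "dbo.Measurements" };
                bulk.ColumnMappings.Add("Id", "SourceId");
                bulk.ColumnMappings.Add("Payload", "Payload");
                bulk.ColumnMappings.Add("ClientId", "ClientId");
                bulk.WriteToServer(table);
            }

            // Mark the batch as sent only after the upload succeeded,
            // so a failed upload is simply retried on the next run.
            var mark = new SqlCommand(
                "UPDATE dbo.Measurements SET Sent = 1 WHERE Sent = 0 AND Id <= @max", src, tx);
            mark.Parameters.AddWithValue("@max", maxId);
            mark.ExecuteNonQuery();
            tx.Commit();
        }
    }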
You could use SQL Server Transactional Replication: HOW TO: Replicate Between Computers Running SQL Server in Non-Trusted Domains or Across the Internet

Content download from within a Windows client - Best Practices

Our application is a Windows client (C++/MFC, migrating to C#) that uses SQL Server Express as its data store. We release regular updates to the data the application works with. (Our users use the content data we provide as a basis for their own projects built using the client; the database stores their projects as well, to enable collaboration across a network.)
For a while we've used the archaic method of providing huge "update packages" containing all of the latest data. The updater had to be run on the server and would swap out the database files with the ones included in the package. Yes, a horrible practice for many reasons. We want to do away with it.
Specifically, we will provide an update dialog wherein the user checks off the items they want updated, then clicks Update. A background process then pulls the selected items from the content server and inserts them into the user's database.
What's a secure way to pull this data from inside a Windows client, given that users may opt in or out of potentially hundreds of separate items?
I have considered:
A remote connection to a SQL server, querying for the data directly. Easy to implement, but not secure. Searching for better ways to do this turned up suggestions to use a VPN or SSH, neither of which seems particularly convenient for our customers to set up on their client machines.
An HTTP content service that provides a zip file for each separate content item, or builds them dynamically per the user's request (e.g., we submit an XML file listing the desired items, then a server-side process exports the data from the SQL Server and zips it up into one package); a sketch of this option follows.
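
This client-side sketch assumes a hypothetical endpoint URL and XML request format (both invented); everything travels over HTTPS and the SQL Server is never exposed:

    using System.Collections.Generic;
    using System.IO;
    using System.Linq;
    using System.Net.Http;
    using System.Text;
    using System.Threading.Tasks;

    public static class ContentUpdater
    {
        public static async Task DownloadPackageAsync(
            IEnumerable<string> itemIds, string destinationZip)
        {
            using var http = new HttpClient();

            // Submit the list of items the user checked off.
            var body = "<items>" +
                string.Concat(itemIds.Select(id => $"<item id=\"{id}\"/>")) +
                "</items>";
            using var response = await http.PostAsync(
                "https://content.example.com/api/packages",
                new StringContent(body, Encoding.UTF8, "application/xml"));
            response.EnsureSuccessStatusCode();

            // Stream the zip the server built for this request to disk.
            await using var file = File.Create(destinationZip);
            await response.Content.CopyToAsync(file);
        }
    }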
How do other apps featuring in-program content updates do it? Any suggestions for heading in the right direction (or at least a good one)?
Thanks,
D
Edit: We aren't necessarily concerned about whether the data transmission is secure (the content is not sensitive data); by "security" in this case I mean "is there a better way than exposing a remote SQL Server to slammers in China?"
Exposing SQL Server directly is generally a bad idea. Use a web service or WCF instead.
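
A minimal sketch of what that boundary could look like in WCF (the service and operation names are invented; binding and streaming configuration are omitted). The client only ever talks to this contract, never to the database:

    using System.IO;
    using System.ServiceModel;

    [ServiceContract]
    public interface IContentService
    {
        // Returns the requested content item as a zip stream.
        [OperationContract]
        Stream DownloadItem(string itemId);
    }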

Best way to synchronise client database with server database

I have a data-logging application (C#/.NET) that logs data to a SQLite database. This database is written to constantly while the application is running. The database can also be archived and a new database created once the SQLite database reaches a predefined size.
I'm writing a web application for reporting on the data. My web setup is C#/.NET with SQL Server. Clients will be able to see their own data, gathered online from their instance of my application.
For test purposes, I've written a rough-and-dirty application to upload data to test with: it reads from the SQLite DB and injects the data into the SQL Server using SQL. I run the application once to populate the online SQL Server DB.
My application is written in C# and is modular, so I could add a process that periodically checks the SQLite DB and transfers new data in batches to my SQL Server.
My question is: if I wanted to continually synchronise the client-side SQLite database(s) with my server while the application is data-logging, what would be the best way of going about this?
Is there any technology/strategy I should be looking into employing here? Any recommended techniques?
Several options come to mind. You can add a timestamp column to each table that you want to copy from and then select only the rows written after the last update. This is fast and will work even if you archive the database and start with an empty one.
You can also journal your updates for each table into an XML string that describes the changes, and store that in a new table that is treated as a queue.
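
A minimal sketch of the timestamp idea (the LogEntries table and its columns are invented; LoggedAt is assumed to be a sortable ISO-8601 string). The transfer process keeps the largest LoggedAt value it has seen as the watermark for the next pass:

    using System.Collections.Generic;
    using Microsoft.Data.Sqlite;

    public static class LogReader
    {
        // Streams only the log rows written after the given watermark.
        public static IEnumerable<(long Id, string Value, string LoggedAt)> ReadSince(
            string dbPath, string watermark)
        {
            using var conn = new SqliteConnection($"Data Source={dbPath}");
            conn.Open();
            var cmd = conn.CreateCommand();
            cmd.CommandText =
                @"SELECT Id, Value, LoggedAt
                  FROM LogEntries
                  WHERE LoggedAt > $w
                  ORDER BY LoggedAt";
            cmd.Parameters.AddWithValue("$w", watermark);

            using var reader = cmd.ExecuteReader();
            while (reader.Read())
                yield return (reader.GetInt64(0), reader.GetString(1), reader.GetString(2));
        }
    }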
You could take a look at the Sync Framework. How complex is the schema that you're looking to sync up, and is it only one-way or does data need to come back down?
As a simple solution, I'd look at exporting the data in some delimited format and then using bcp/BULK INSERT to pull it into your central server.
You might want to investigate the concept of log shipping.
There is an open source project on GitHub, also available on NuGet, called SyncWinR; it implements the Sync Framework Toolkit to enable synchronization with WinRT or Windows Phone 8 and SQLite.
You can access the project from https://github.com/Mimetis/SyncWinRT.
