I have a datalogging application (C#/.NET) that logs data to a SQLite database. This database is written to constantly while the application is running. It is also possible for the database to be archived and a new database created once the SQLite database reaches a predefined size.
I'm writing a web application for reporting on the data. My web setup is C#/.NET with SQL Server. Clients will be able to see, online, their own data gathered from their instance of my application.
For test purposes, I've written a rough-and-dirty application to upload the data to test with: it reads from the SQLite DB and then injects the data into the SQL Server using SQL. I run the application once to populate the online SQL Server DB.
My application is written in C# and is modular, so I could add a process that periodically checks the SQLite DB and then transfers new data in batches to my SQL Server.
My question is: if I wanted to continually synchronise the client-side SQLite database(s) with my server while the application is datalogging, what would be the best way of going about this?
Is there any technology/strategy I should be looking into employing here? Any recommended techniques?
Several options come to mind. You can add a timestamp to each table that you want to copy from and then select rows written after the last update. This is fast and will work if you archive the database and start with an empty one.
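As a rough sketch of the timestamp approach (the table and column names here are invented for illustration), the logger stamps each row and the uploader keeps a watermark of the last value it copied:

```sql
-- Hypothetical SQLite schema: SensorLog is the logged table; LastModified
-- holds an ISO-8601 timestamp, which SQLite sorts correctly as text.
ALTER TABLE SensorLog ADD COLUMN LastModified TEXT;

-- Each sync pass selects only rows written after the stored watermark,
-- then remembers the highest LastModified value it transferred.
SELECT Id, SensorId, Value, LastModified
FROM SensorLog
WHERE LastModified > :lastSyncedTimestamp
ORDER BY LastModified;
```

If the database is archived and a fresh one created, the watermark keeps working, because new rows start from the current time.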
You can also journal your updates for each table into an XML string that describes the changes and store that into a new table that is treated as a queue.
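A minimal sketch of the journal-as-queue idea in SQLite (all names invented; a real version would escape the XML values properly):

```sql
-- A queue table holding one XML fragment per change.
CREATE TABLE IF NOT EXISTS SyncQueue (
    Id        INTEGER PRIMARY KEY AUTOINCREMENT,
    ChangeXml TEXT NOT NULL
);

-- An AFTER INSERT trigger journals each new row; the uploader drains the
-- queue in Id order and deletes rows it has shipped.
CREATE TRIGGER SensorLog_Journal AFTER INSERT ON SensorLog
BEGIN
    INSERT INTO SyncQueue (ChangeXml)
    VALUES ('<row table="SensorLog" id="' || NEW.Id ||
            '" value="' || NEW.Value || '"/>');
END;
```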
You could take a look at the Sync Framework. How complex is the schema that you're looking to sync up, and is it only one-way, or does data need to come back down?
As a simple solution I'd look at exporting data in some delimited format and then using bcp/BULK INSERT to pull it into your central server.
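For example (path and table names are made up), after exporting each batch as a tab-delimited file, you could load it on the central server with:

```sql
-- Hypothetical sketch: load a tab-delimited export into the central table.
BULK INSERT dbo.SensorLog
FROM 'C:\uploads\sensorlog.tsv'
WITH (
    FIELDTERMINATOR = '\t',
    ROWTERMINATOR   = '\n',
    TABLOCK          -- reduces logging overhead for large batches
);
```

The command-line equivalent would be something like `bcp MyDb.dbo.SensorLog in sensorlog.tsv -S myserver -T -c`.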
You might want to investigate the concept of log shipping.
There is an open-source project on GitHub, also available on NuGet, called SyncWinR. It implements the Sync Framework Toolkit to enable synchronization with WinRT or Windows Phone 8 and SQLite.
You can access the project from https://github.com/Mimetis/SyncWinRT.
Related
I have an application that currently runs against MS SQL Server Standard at multiple client sites, each with a local server. The program was built using WPF plus stored procedures/functions/triggers, and there are no SQL Server Agent jobs.
I want to create a desktop version of my application, where the user can save all the data to a file and load it on another desktop with the program running.
I've never done this type of application before, and I have a few doubts about how I should do it; I don't know if what I'm planning is going to work.
Basically, I was thinking of installing my application with SQL Server Express and, to save/restore the data to files, calling BACKUP/RESTORE DATABASE through stored procedures.
Am I thinking right? Is there a better way to do this?
I'm not sure I understand why you would need to export the entire contents of the database to a file. The central database should be accessible by all of the client applications, avoiding the need for each client application to have its own dedicated copy of the database. But ... if this is really what you want, then I think your two options are:
1.) Use Backup+Restore.
2.) Use LocalDB with the AttachDbFileName property, which is designed to keep your database in a single self-contained local file that you can read/write without having to connect to an actual database server. Some starting info here.
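A minimal sketch of option 2 (the file path and instance name are invented for illustration):

```csharp
using System.Data.SqlClient;

// AttachDbFileName points LocalDB at a self-contained .mdf file; no
// full SQL Server instance needs to be installed or running.
var connectionString =
    @"Data Source=(LocalDB)\MSSQLLocalDB;" +
    @"AttachDbFileName=C:\Data\MyAppData.mdf;" +
    @"Integrated Security=True";

using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    // Read and write as usual; all data lives in MyAppData.mdf,
    // which the user can copy to another machine.
}
```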
I have a C# project that retrieves data from a SQL Server database and stores data in it, and I have copies of this software, each running in a different place (different area) and storing to its own database. I have to sync the data among these databases over a phone line. I have recently read about atapi.dll. Could I use this DLL to synchronize the databases by sending and receiving data between the applications?
For example: the first site has to send the new records to the other site.
The first site has a phone number (dial-up, e.g. 1234566).
The other site has a number (dial-up, e.g. 3456784). How can I send and receive files between the two applications using dial-up numbers?
Writing your own file-sync mechanism may sound simple, but it is not easy, especially if you need to sync multiple parties.
Rather than writing your own sync tool, I would strongly encourage you to use SQL Server replication which is a feature built-in to SQL Server itself to support exactly the scenario you describe above.
If I am understanding your scenario:
You have a master database with all records from all branch sites
You have a subset of that data at each site - the latest copy of the master data plus any changes made at the local site
You periodically want to have each site dial-in to the master server and sync data back and forth so that your site-changes are pushed up to the master server and the master DB's changes are pushed out to the branch DB.
To support this scenario, you just configure your branch offices to dial in to the master office periodically, and configure SQL Server to replicate data as appropriate.
I've previously configured a 25-branch organization to use dial-up and broadband connections to sync a large SQL Server production database in less than 2 days, including time to update their backup strategy to account for the needs of the replication strategy employed.
Compared to writing your own sync engine, using SQL Server replication will likely save you many months of development effort and many man-years of debugging & operational support!
You don't want to be dealing with dial-up yourself. Investigate Windows RAS, which sets up a TCP/IP connection between two hosts using dial-up. It can be driven from C#.
Once you've done that, investigate SQL Server Replication in order to sync the data once the connection is up.
My Scenario:
I have two applications. The first is a website connected to a MySQL database, and the second is a desktop application connected to a SQL Server 2008 R2 database.
The desktop application updates records locally, and the MySQL database is updated online through the website.
Problem:
With two different databases, how can we apply updates on the spot when changes are made in either the MySQL or the SQL Server database?
What I Want:
Databases should be synchronized with each other (e.g. if changes are made in MySQL then the SQL Server database should be updated, and if changes are made in the SQL Server database then MySQL should be updated).
Could anybody please suggest some code, any idea, or any solution to solve this issue?
Make use of RESTful APIs to update information from the MS SQL Server to the MySQL server.
One of the first things I would point out is that complete and perfect syncing is not possible. Unfortunately there will be data types that exist in SQL Server that don't exist in MySQL and vice versa.
But assuming the data types are pretty simple and the schemas are similar, here are some options:
Use a service bus. You can write an application that monitors both database systems and when it sees a change, it pushes an object onto the service bus. Listeners to the service bus will see the objects and write them to the appropriate destination.
Use triggers like Alex suggested. SQL Server can have CLR code execute on a trigger. The CLR code could be some C# that writes directly to MySQL. Takes some setup, but it's possible. I've investigated running a process from a trigger in MySQL and all options are ugly. It's possible, but security is a major concern. The idea is that a record is changed, trigger is fired and an external process is run.
Write an application that constantly looks for "diffs" in tables and moves data back and forth. You'll need to modify all tables to make sure there is support for date/time stamps for each record so you can track when a record has "changed".
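A rough sketch of that last option (the connection string, table, and watermark helpers are all hypothetical):

```csharp
using System;
using System.Data.SqlClient;

// One pass of the diff-mover: pull rows changed since the last pass.
// LoadWatermark/SaveWatermark are assumed helpers that persist the
// last-seen timestamp between runs.
DateTime since = LoadWatermark();
DateTime newest = since;

using (var source = new SqlConnection(sourceConnectionString))
{
    source.Open();
    var cmd = new SqlCommand(
        "SELECT Id, Payload, ModifiedUtc FROM dbo.Orders " +
        "WHERE ModifiedUtc > @since ORDER BY ModifiedUtc", source);
    cmd.Parameters.AddWithValue("@since", since);

    using (var reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            // Upsert the row into the destination database here,
            // tracking the newest timestamp seen so far.
            newest = (DateTime)reader["ModifiedUtc"];
        }
    }
}
SaveWatermark(newest);   // the next pass starts where this one ended
```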
I'm very new to .NET, C# and SQL Server.
I need to develop a socket application using C#. That application should insert, delete, update, etc. the data in the database in response to requests.
I need to develop another Windows service application using C# that sends the data in the database to the web application over HTTP.
Both applications run on the same system. The database is SQL Server, and both applications use the same database.
I am unsure whether, while one application is deleting or inserting data in the database, the other application is still able to access the database at the same time.
SQL Server can handle multiple requests just fine. Each connected client gets a different SPID.
However, it is up to your SQL code (batches or stored procedures) to handle data concurrency issues via isolation levels and transactions. So in order to control one user reading data while others are updating, deleting, etc., you need to use isolation levels and locks.
Isolation levels:
http://msdn.microsoft.com/en-us/library/aa213034(v=sql.80).aspx
Also: http://dbalink.wordpress.com/2008/05/27/isolation-levels-and-locks-in-sql-server-2005/
Good writeup on transactions:
https://web.archive.org/web/20210614153200/http://aspnet.4guysfromrolla.com/articles/072705-1.aspx
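To illustrate the idea (the table names are made up), an explicit isolation level plus a transaction keeps one application's readers from seeing the other's half-finished work:

```sql
-- Readers at READ COMMITTED never see uncommitted changes made by the
-- other application; the transaction makes the two statements atomic.
SET TRANSACTION ISOLATION LEVEL READ COMMITTED;

BEGIN TRANSACTION;
    UPDATE dbo.Readings SET Value = Value + 1 WHERE Id = 42;
    DELETE FROM dbo.StaleReadings WHERE SourceId = 42;
COMMIT TRANSACTION;
```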
I think that the simple answer is yes - multiple applications/people can access a database at the same time.
You may want to read up on transactions within a database and concurrency:
http://en.wikipedia.org/wiki/Atomicity_(database_systems)
http://msdn.microsoft.com/en-us/library/cs6hb8k4(v=vs.80).aspx
It is possible to access the same SQL Server database from as many applications as you want. If you are afraid that one application might attempt to read data being changed by another at the same time, read about transactions and isolation. In short, though, if you make a small atomic change that requires only one line of SQL, you probably shouldn't worry about this at the moment.
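For instance (names invented), wrapping a change in a transaction from C# is straightforward with TransactionScope:

```csharp
using System.Data.SqlClient;
using System.Transactions;

// The scope commits only if Complete() is reached; otherwise everything
// inside it is rolled back automatically when the scope is disposed.
using (var scope = new TransactionScope())
using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    var cmd = new SqlCommand(
        "UPDATE dbo.Requests SET Status = 'Done' WHERE Id = @id",
        connection);
    cmd.Parameters.AddWithValue("@id", requestId);
    cmd.ExecuteNonQuery();
    scope.Complete();
}
```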
I have several client databases that are used by my Windows application.
I want to send this data to an online web site.
The client database and server database structures are different because we need to add a client ID column to some tables in the server database.
Currently I sync the databases this way: a separate application uses C# bulk copy inside a transaction to sync them.
But my SQL Server database is too busy, and parallel tasks cannot run.
I am working on this solution:
I use AFTER UPDATE/DELETE/INSERT triggers to save the changes to one table, and build a SQL query to send to a web service that syncs the data.
But I must send all the data first! It is a huge data set (bigger than 16mg).
I think I can't use replication because the structures and primary keys are different.
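What I have in mind for the triggers is roughly this (all names changed for illustration; one trigger per tracked table):

```sql
-- A change table that a background job later ships to the web service.
CREATE TABLE dbo.PendingChanges (
    ChangeId   INT IDENTITY PRIMARY KEY,
    TableName  SYSNAME   NOT NULL,
    RowKey     INT       NOT NULL,
    ChangeType CHAR(1)   NOT NULL,  -- 'I', 'U' or 'D'
    ChangedAt  DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME()
);
GO
CREATE TRIGGER trg_Orders_Track ON dbo.Orders
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    -- Rows only in inserted are inserts, rows in both are updates,
    -- rows only in deleted are deletes.
    INSERT INTO dbo.PendingChanges (TableName, RowKey, ChangeType)
    SELECT 'Orders', i.Id, 'I' FROM inserted i
        WHERE NOT EXISTS (SELECT 1 FROM deleted d WHERE d.Id = i.Id)
    UNION ALL
    SELECT 'Orders', i.Id, 'U' FROM inserted i
        JOIN deleted d ON d.Id = i.Id
    UNION ALL
    SELECT 'Orders', d.Id, 'D' FROM deleted d
        WHERE NOT EXISTS (SELECT 1 FROM inserted i WHERE i.Id = d.Id);
END;
```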
Have you considered using SSIS to do scheduled data synchronization? You can do data transformation and bulk inserts fairly easily.
As I understand what you're trying to do, you want to allow multiple client applications to have their data synchronized to a server in such a way that the server has all the data from all the sites, but that each record also has a client identifier so you can maintain traceability back to the source.
Why must you send all the data to the server before you get the other information setup? You should be able to build all these things concurrently. Also, you don't have to upload all the data at one time. Stage them out to one per day (assuming you have a small number of client databases), that would give you a way to focus on each in turn and make sure the process was completed accurately.
Will you be replicating the data back to the clients after consolidating all the data into one table? Your size information was unclear; were you saying each database was larger than 16GB? Would 5 sites then have a cumulative size of 80GB to be replicated back to the individual sites?
Otherwise, the method you outlined with using a separate application to specifically handle the uploading of data would be the most appropriate.
Are you going to upgrade the individual schemas after you update the master database? You may want to ALTER TABLE and add a bool column to every record and mark them as "sent" or "not sent" to keep track of any records that were updated/inserted "late". I have a feeling you're going to be doing a rolling deployment/upgrade and you're trying to figure out how to keep it all "in sync" without losing anything.
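The sent/not-sent flag could be as simple as this (the table name is just an example):

```sql
-- Track which rows have already been uploaded to the master database.
ALTER TABLE dbo.Orders ADD Sent BIT NOT NULL DEFAULT 0;

-- Each pass uploads only unsent rows...
SELECT Id, Payload FROM dbo.Orders WHERE Sent = 0;

-- ...and flags them once the upload has succeeded.
UPDATE dbo.Orders SET Sent = 1
WHERE Sent = 0 AND Id <= @lastUploadedId;
```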
You could use SQL Server Transactional Replication: HOW TO: Replicate Between Computers Running SQL Server in Non-Trusted Domains or Across the Internet