I want to make a desktop app that runs in the background, using ASP.NET C#.
The idea is to check a local database table, read all of the data from that table, and insert that data into a live database table.
If there is a better solution for that scenario, please suggest one.
It seems that what you want to solve is moving data from multiple local databases into a central one so you can consume it via an ASP.NET website.
I won't be discussing the application part, since that's an easy problem to solve once you can connect to a database that has all the data you need.
There are several things to consider here:
Recurrence (one-time migration vs. continuous synchronization)
Direction (do you need to sync one-way, from local to central, or two-way?)
Data Contract (do the local databases have the same schema as the central database, or a different one?)
The Data Contract is your biggest problem if the schemas are different, since you will need to design at least a target schema for the central database that can take in data from the local dbs.
Even if the schemas are identical, you will need to devise a way to partition the data in the central database. You might need to introduce a sourceDatabaseId column in your tables so you won't have conflicting primary keys (you won't have this problem if your primary keys are GUIDs).
The other concerns can be solved by building either:
A Windows service - Inputs: periodicity (e.g. every hour), source db and target db connection strings. You will have a main loop that waits until the time to run has come (based on the periodicity), fetches data from the source db, and saves it into the target db (preferably in batches).
A console application - Inputs: source db and target db connection strings. You will just move data in batches. You can configure a Scheduled Task on the server that performs scheduled runs of your console application, which solves the periodic-running part.
You would set up one such Windows service or scheduled console app per local database; a minimal sketch of the copy loop is shown below.
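A minimal sketch of that loop, assuming SQL Server on both ends and a hypothetical dbo.Readings table with a ModifiedUtc column used as a watermark (these names are placeholders, not from the question):

```csharp
// Periodic copy loop: pull rows changed since the last run from the local db
// and bulk-insert them into the central db. Connection strings, table and
// column names are hypothetical placeholders.
using System;
using System.Data;
using System.Data.SqlClient;
using System.Threading;

class SyncLoop
{
    static void Main()
    {
        TimeSpan period = TimeSpan.FromHours(1);            // periodicity input
        string sourceCs = "<local db connection string>";   // source db input
        string targetCs = "<central db connection string>"; // target db input
        DateTime lastSynced = DateTime.MinValue;

        while (true)
        {
            var changedRows = new DataTable();
            using (var source = new SqlConnection(sourceCs))
            using (var cmd = new SqlCommand(
                "SELECT * FROM dbo.Readings WHERE ModifiedUtc > @since", source))
            {
                cmd.Parameters.AddWithValue("@since", lastSynced);
                source.Open();
                changedRows.Load(cmd.ExecuteReader());
            }

            using (var target = new SqlConnection(targetCs))
            {
                target.Open();
                using (var bulk = new SqlBulkCopy(target) { DestinationTableName = "dbo.Readings" })
                {
                    bulk.WriteToServer(changedRows); // batched insert into the central db
                }
            }

            lastSynced = DateTime.UtcNow;
            Thread.Sleep(period); // wait until the next run is due
        }
    }
}
```

The console-application variant is the same body without the outer loop and the sleep; the Scheduled Task provides the periodicity instead.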
If you have complex databases you can look into tools like Microsoft Sync Framework to perform this data synchronization.
You can develop a Windows service to do this type of work and install it on the server with a timer setting.
I have a distributed application with multiple processes on multiple servers that connect to a SQL Server database.
I need to migrate the database schema in code during the first startup, because the upgrade deployment can be done by a user without database access (we use computer-object database access).
Currently this is done by providing a SQL file with the statements, which a user with db access (but potentially without app access) then runs independently.
Because the apps do not talk to each other (firewalls, different DCs, etc.), I was thinking I'd have to designate one server as 'master' and all others as 'slaves'; on the master, the first process to start would obtain the mutex and do the schema migrations, while all others would simply wait until they can see the schema has been migrated.
However, this has a certain code smell to me.
I tried researching how Entity Framework handles this in code-first migrations, and it seems it doesn't (e.g. if two processes start at exactly the same time, they will both try to migrate the schema).
Any other approaches?
You can change the database to single-user mode (other connections will be refused), make the changes, and then switch back to multi-user mode.
EDIT:
There is a trick for getting a "mutex". You can update/delete record(s); as long as the transaction is open, the exclusive lock is still held. If you delete 0 records (with a table-lock hint) from every table inside a transaction, you can probably achieve the same behaviour as a "global mutex" for the users of the database. I don't know how this will behave with schema changes, though. A hedged sketch of the idea is shown below.
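A sketch of that trick in C#, assuming SQL Server and a single hypothetical dbo.SchemaLock table (the answer suggests doing this for every table); the open transaction holds the exclusive table lock, so other processes block until the migration commits:

```csharp
// Take an exclusive table lock as a makeshift 'global mutex', run the schema
// migration, then commit to release the lock. The table name is a placeholder.
using System.Data.SqlClient;

class MigrationLock
{
    public static void MigrateWithLock(string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            using (var tx = conn.BeginTransaction())
            {
                // Deletes 0 rows but still acquires (and keeps) an exclusive table lock.
                using (var lockCmd = new SqlCommand(
                    "DELETE FROM dbo.SchemaLock WITH (TABLOCKX) WHERE 1 = 0", conn, tx))
                {
                    lockCmd.ExecuteNonQuery();
                }

                // ... run the schema migration statements on this connection/transaction ...

                tx.Commit(); // releases the lock; waiting processes continue
            }
        }
    }
}
```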
My Scenario:
I have two applications. The first is a website connected to a MySQL database, and the second is a desktop application connected to a SQL Server 2008 R2 database.
The desktop application updates records locally, and the MySQL database is updated online through the website.
Problem:
There are two different databases; how can we update one on the spot when changes are made in either MySQL or the SQL Server database?
What I Want:
The databases should be synchronized with each other (e.g. if changes are made in MySQL, then the SQL Server database should be updated, and if changes are made in the SQL Server database, then the MySQL database should be updated).
Could anybody please suggest some code, any idea, or any solution to solve this issue?
Make use of RESTful APIs to update information from the MS SQL Server to the MySQL server; a hedged sketch is shown below.
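A sketch of that idea, assuming the website exposes a (hypothetical) endpoint that writes into MySQL and that the Newtonsoft.Json package is available; the URL and the ChangedOrder type are placeholders, not anything from the question:

```csharp
// Push a changed row from the desktop/SQL Server side to the site's REST API,
// which then updates MySQL. Endpoint and payload type are hypothetical.
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json;

class ChangedOrder
{
    public int Id { get; set; }
    public string Status { get; set; }
}

class RestSync
{
    static readonly HttpClient Client = new HttpClient();

    public static async Task PushAsync(ChangedOrder order)
    {
        string json = JsonConvert.SerializeObject(order);
        var content = new StringContent(json, Encoding.UTF8, "application/json");
        HttpResponseMessage response =
            await Client.PostAsync("https://example.com/api/orders/sync", content);
        response.EnsureSuccessStatusCode(); // throw if the site rejected the update
    }
}
```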
One of the first things I would point out is that complete and perfect syncing is not possible. Unfortunately there will be data types that exist in SQL Server that don't exist in MySQL and vice versa.
But assuming the data types are pretty simple and the schemas are similar, here are some options:
Use a service bus. You can write an application that monitors both database systems and when it sees a change, it pushes an object onto the service bus. Listeners to the service bus will see the objects and write them to the appropriate destination.
Use triggers, like Alex suggested. SQL Server can have CLR code execute on a trigger, and the CLR code could be some C# that writes directly to MySQL. It takes some setup, but it's possible. I've investigated running a process from a trigger in MySQL, and all the options are ugly. It's possible, but security is a major concern. The idea is that a record is changed, a trigger fires, and an external process runs.
Write an application that constantly looks for "diffs" in tables and moves data back and forth. You'll need to modify all tables to make sure there is support for date/time stamps on each record so you can track when a record has "changed". A rough sketch of this approach is shown below.
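A rough sketch of the "diff" poller, one direction only (SQL Server to MySQL), assuming a ModifiedUtc timestamp column, a hypothetical dbo.Customers table, and the MySql.Data package (all placeholders):

```csharp
// Poll SQL Server for rows changed since the last run and upsert them into MySQL.
// Table and column names are hypothetical placeholders.
using System;
using System.Data.SqlClient;
using MySql.Data.MySqlClient;

class DiffPoller
{
    public static void PushChanges(string sqlServerCs, string mySqlCs, DateTime since)
    {
        using (var source = new SqlConnection(sqlServerCs))
        using (var cmd = new SqlCommand(
            "SELECT Id, Name, ModifiedUtc FROM dbo.Customers WHERE ModifiedUtc > @since", source))
        {
            cmd.Parameters.AddWithValue("@since", since);
            source.Open();

            using (var reader = cmd.ExecuteReader())
            using (var target = new MySqlConnection(mySqlCs))
            {
                target.Open();
                while (reader.Read())
                {
                    // Upsert each changed row into MySQL.
                    using (var upsert = new MySqlCommand(
                        "INSERT INTO customers (id, name) VALUES (@id, @name) " +
                        "ON DUPLICATE KEY UPDATE name = @name", target))
                    {
                        upsert.Parameters.AddWithValue("@id", reader.GetInt32(0));
                        upsert.Parameters.AddWithValue("@name", reader.GetString(1));
                        upsert.ExecuteNonQuery();
                    }
                }
            }
        }
    }
}
```

The reverse direction (MySQL to SQL Server) would be the mirror image, which is why each record needs a timestamp in both databases.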
My client application maintains an Access database and works offline most of the time.
On the server side there is a web service with an Oracle back end. I need to update the Access database tables with the latest Oracle table data. Currently I'm doing this with a C# Windows service triggered by a timer. Is there any alternative way to achieve this data synchronization with fault tolerance and good performance? Please share your experience.
You could use Quartz.NET (a full-featured, open source job scheduling system that can be used from the smallest apps to large-scale enterprise systems) to schedule the synchronization; a minimal sketch is shown below.
Database synchronization can be done using the Microsoft Sync Framework, but I'm not sure whether it supports Access databases or not.
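A minimal sketch of the Quartz.NET suggestion, assuming the Quartz NuGet package (3.x API); the job body, identifiers, and interval are placeholders:

```csharp
// Schedule a recurring sync job with Quartz.NET instead of a hand-rolled timer.
using System.Threading.Tasks;
using Quartz;
using Quartz.Impl;

public class SyncJob : IJob
{
    public Task Execute(IJobExecutionContext context)
    {
        // Call the web service, read the latest Oracle rows, write them to Access.
        return Task.CompletedTask;
    }
}

public static class SyncScheduler
{
    public static async Task StartAsync()
    {
        IScheduler scheduler = await StdSchedulerFactory.GetDefaultScheduler();
        await scheduler.Start();

        IJobDetail job = JobBuilder.Create<SyncJob>().WithIdentity("accessOracleSync").Build();
        ITrigger trigger = TriggerBuilder.Create()
            .StartNow()
            .WithSimpleSchedule(s => s.WithIntervalInMinutes(15).RepeatForever())
            .Build();

        await scheduler.ScheduleJob(job, trigger);
    }
}
```

Quartz also gives you misfire handling and persistent job stores, which helps with the fault-tolerance requirement.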
I have several client databases that are used by my Windows application.
I want to send this data to an online web site.
The client database and server database structures are different, because we need to add a client ID column to some tables in the server database.
The way I sync the databases now is with another application that uses C# bulk copy inside a transaction.
My server database (SQL Server) is too busy, and parallel tasks cannot be run.
I am working on this solution:
I use AFTER UPDATE, DELETE, and INSERT triggers to save the changes into one table, and build a SQL query to send to a web service to sync the data.
But I must send all the data first! It's a huge data set (bigger than 16mg).
I think I can't use replication because the structures and primary keys are different.
Have you considered using SSIS to do scheduled data synchronization? You can do data transformation and bulk inserts fairly easily.
As I understand what you're trying to do, you want to allow multiple client applications to have their data synchronized to a server in such a way that the server has all the data from all the sites, and each record also has a client identifier so you can maintain traceability back to the source.
Why must you send all the data to the server before you get the other pieces set up? You should be able to build all these things concurrently. Also, you don't have to upload all the data at one time. Stage the uploads out to one per day (assuming you have a small number of client databases); that would give you a way to focus on each in turn and make sure the process completes accurately.
Will you be replicating the data back to the clients after consolidating all the data into one table? Your size information was unclear; were you saying each database was larger than 16GB? If so, 5 sites would have a cumulative size of 80GB to be replicated back to the individual sites?
Otherwise, the method you outlined with using a separate application to specifically handle the uploading of data would be the most appropriate.
Are you going to upgrade the individual schemas after you update the master database? You may want to ALTER TABLE and add a bool column to every record, marking rows as "sent" or "not sent", to keep track of any records that were updated/inserted "late". I have a feeling you're going to be doing a rolling deployment/upgrade and you're trying to figure out how to keep it all "in sync" without losing anything. A hedged sketch of the "upload only unsent rows" idea is shown below.
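A sketch of that idea: upload only the "not sent" rows with SqlBulkCopy, tagging each with the client ID the server schema expects, then mark them as sent. The table and column names (dbo.Orders, Sent, ClientId) are placeholders:

```csharp
// Upload unsent rows from a client database to the server, adding the ClientId
// column required by the server schema, then flag them as sent locally.
using System.Data;
using System.Data.SqlClient;

class ClientUploader
{
    public static void UploadUnsent(string clientCs, string serverCs, int clientId)
    {
        var unsent = new DataTable();
        using (var client = new SqlConnection(clientCs))
        using (var cmd = new SqlCommand("SELECT * FROM dbo.Orders WHERE Sent = 0", client))
        {
            client.Open();
            unsent.Load(cmd.ExecuteReader());
        }

        // Tag every row with the client ID expected by the server schema.
        unsent.Columns.Add("ClientId", typeof(int));
        foreach (DataRow row in unsent.Rows)
            row["ClientId"] = clientId;

        using (var server = new SqlConnection(serverCs))
        {
            server.Open();
            using (var bulk = new SqlBulkCopy(server) { DestinationTableName = "dbo.Orders" })
            {
                // Map by name so the extra ClientId column lands in the right place.
                foreach (DataColumn col in unsent.Columns)
                    bulk.ColumnMappings.Add(col.ColumnName, col.ColumnName);
                bulk.WriteToServer(unsent);
            }
        }

        using (var client = new SqlConnection(clientCs))
        using (var mark = new SqlCommand("UPDATE dbo.Orders SET Sent = 1 WHERE Sent = 0", client))
        {
            client.Open();
            mark.ExecuteNonQuery();
        }
    }
}
```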
You could use SQL Server Transactional Replication: HOW TO: Replicate Between Computers Running SQL Server in Non-Trusted Domains or Across the Internet
I have a datalogging application (c#/.net) that logs data to a SQLite database. This database is written to constantly while the application is running. It is also possible for the database to be archived and a new database created once the size of the SQLite database reaches a predefined size.
I'm writing a web application for reporting on the data. My web setup is c#/.Net with a SQL Server. Clients will be able to see their own data gathered online from their instance of my application.
For test purposes, to have data to test with, I've written a rough-and-dirty application which basically reads from the SQLite DB and then injects the data into SQL Server using SQL; I run the application once to populate the online SQL Server DB.
My application is written in c# and is modular so I could add a process that periodically checks the SQLite DB then transfer new data in batches to my SQL Server.
My question is: if I wanted to continually synchronise the client-side SQLite database(s) with my server while the application is datalogging, what would be the best way of going about this?
Is there any technology/strategy I should be looking into employing here? Any recommended techniques?
Several options come to mind. You can add a timestamp to each table that you want to copy from and then select rows written after the last update. This is fast and will work if you archive the database and start with an empty one; a sketch of this timestamp approach is shown below.
You can also journal your updates for each table into an XML string that describes the changes and store that into a new table that is treated as a queue.
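A sketch of the timestamp option, assuming the System.Data.SQLite package and a hypothetical LogEntries table with a LoggedUtc column on both sides (these names are placeholders):

```csharp
// Read rows logged since the last sync from the SQLite file and bulk-insert
// them into the reporting SQL Server database.
using System;
using System.Data;
using System.Data.SqlClient;
using System.Data.SQLite;

class LogUploader
{
    public static DateTime PushNewRows(string sqlitePath, string sqlServerCs, DateTime lastSynced)
    {
        var newRows = new DataTable();
        using (var source = new SQLiteConnection($"Data Source={sqlitePath}"))
        using (var cmd = new SQLiteCommand(
            "SELECT * FROM LogEntries WHERE LoggedUtc > @since", source))
        {
            cmd.Parameters.AddWithValue("@since", lastSynced);
            source.Open();
            newRows.Load(cmd.ExecuteReader());
        }

        if (newRows.Rows.Count > 0)
        {
            using (var target = new SqlConnection(sqlServerCs))
            {
                target.Open();
                using (var bulk = new SqlBulkCopy(target) { DestinationTableName = "dbo.LogEntries" })
                    bulk.WriteToServer(newRows); // batched transfer to the reporting database
            }
        }

        return DateTime.UtcNow; // persist this as the new 'last synced' watermark
    }
}
```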
You could take a look at the Sync Framework. How complex is the schema that you're looking to sync up & is it only one-way or does data need to come back down?
As a simple solution, I'd look at exporting the data in some delimited format and then using bcp/BULK INSERT to pull it in to your central server.
You might want to investigate the concept of log shipping.
There is an open source project on GitHub, also available on NuGet, called SyncWinR. It implements the Sync Framework Toolkit to enable synchronization with WinRT or Windows Phone 8 and SQLite.
You can access the project from https://github.com/Mimetis/SyncWinRT.