My client application maintains an Access database and works offline most of the time.
On the server side there is a web service with an Oracle back end. I need to update the Access database tables with the latest data from the Oracle tables. Currently I'm doing this with a C# Windows service triggered by a timer. Is there any alternative way to achieve this data synchronization with fault tolerance and good performance? Please share your experience.
You can use Quartz.NET (a full-featured, open source job scheduling system that can be used from the smallest apps to large-scale enterprise systems) to schedule the synchronization (see the sketch below).
The database synchronization itself could be done with the Microsoft Sync Framework, but I'm not sure whether it supports Access databases.
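For illustration, here is a minimal sketch of scheduling such a sync job with Quartz.NET, assuming the 3.x async API; the job body, the job/trigger names and the 30-minute interval are placeholders, not part of the original answer.

```csharp
using System;
using System.Threading.Tasks;
using Quartz;
using Quartz.Impl;

// The sync job: pull the latest Oracle data via the web service and update
// the local Access tables. The body below is only a placeholder.
public class SyncJob : IJob
{
    public Task Execute(IJobExecutionContext context)
    {
        Console.WriteLine($"Sync started at {DateTimeOffset.Now}");
        // TODO: call the web service and update the Access database here.
        return Task.CompletedTask;
    }
}

public static class Program
{
    public static async Task Main()
    {
        // Create the default scheduler and start it.
        IScheduler scheduler = await StdSchedulerFactory.GetDefaultScheduler();
        await scheduler.Start();

        IJobDetail job = JobBuilder.Create<SyncJob>()
            .WithIdentity("accessOracleSync")
            .Build();

        // Run every 30 minutes, starting immediately.
        ITrigger trigger = TriggerBuilder.Create()
            .WithIdentity("syncEvery30Minutes")
            .StartNow()
            .WithSimpleSchedule(x => x.WithIntervalInMinutes(30).RepeatForever())
            .Build();

        await scheduler.ScheduleJob(job, trigger);
        await Task.Delay(-1); // keep the process alive so the scheduler can fire
    }
}
```

Quartz.NET also supports cron triggers and persistent job stores, which can help with the fault-tolerance requirement.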
I want to make a background-running desktop app using ASP.NET and C#.
The idea is to check a local database table and get all the data from that table in order to insert it into a live database table.
If there is a better solution for that scenario, please suggest one.
It seems that what you want to solve is moving data from multiple local databases into a central one so you can consume it via an ASP.NET website.
I won't discuss the application part, since that's an easy problem to solve once you can connect to a database that has all the data you need.
There are several things to consider here:
Recurrence (one-time migration vs. continuous synchronization)
Direction (do you need to sync one-way, from local to central, or two-way?)
Data Contract (do the local databases have the same schema as the central database, or a different one?)
The Data Contract is your biggest problem if the schemas are different, since you will need to design at least a target schema for the central database that can take in data from the local dbs.
Even if the schemas are identical, you will need to devise a way to partition the data in the central database; you might need to introduce a sourceDatabaseId column in your tables so you won't have conflicting primary keys (you won't have this problem if your primary keys are GUIDs).
The other concerns can be solved by building either:
A Windows service - Inputs: periodicity (e.g. every hour), source db and target db connection strings. It has a main loop that waits until it is time to run (based on the periodicity), then fetches data from the source db and saves it into the target db (preferably in batches).
A console application - Inputs: source db and target db connection strings. It just moves data in batches; a minimal sketch follows below. You can configure a Scheduled Task on the server to run the console application periodically, which covers the scheduling part.
You would set up one such Windows service or scheduled console app per local database.
If you have complex databases, you can look into tools like the Microsoft Sync Framework to perform this data synchronization.
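As a rough illustration of the console-application option, here is a minimal sketch that moves recently changed rows from a source database to the central one in batches. It assumes both databases are SQL Server; the Orders table, the ModifiedAt column and the watermark handling are illustrative, not part of the original answer.

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class BatchMover
{
    static void Main(string[] args)
    {
        string sourceCs = args[0];  // source (local) connection string
        string targetCs = args[1];  // target (central) connection string

        // In a real run you would read this watermark from a small state table.
        DateTime lastSync = DateTime.UtcNow.AddHours(-1);

        using (var source = new SqlConnection(sourceCs))
        using (var target = new SqlConnection(targetCs))
        {
            source.Open();
            target.Open();

            // Fetch only the rows changed since the last run.
            var select = new SqlCommand(
                "SELECT Id, CustomerName, ModifiedAt FROM dbo.Orders WHERE ModifiedAt > @since",
                source);
            select.Parameters.AddWithValue("@since", lastSync);

            var table = new DataTable();
            new SqlDataAdapter(select).Fill(table);

            // Push the rows to the central database in batches.
            using (var bulk = new SqlBulkCopy(target))
            {
                bulk.DestinationTableName = "dbo.Orders";
                bulk.BatchSize = 500;
                bulk.WriteToServer(table);
            }
        }
    }
}
```

The same body could be hosted inside a Windows service; only the scheduling wrapper changes.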
You can develop a Windows service to do this type of work and install it on the server with a timer setting.
Existing Application:
I have an existing distributed Windows Forms application developed with VS 2013 and SQL Server 2008. The application communicates with the central database from various remote locations through web APIs.
Requirement/Feature:
1. I have to add a new feature so that the application can run in offline mode when there is no internet connectivity. When the application is connected to the internet, it should sync all the data automatically to the server. Also, if there are any changes in the central database, they should be synced to the remote locations as well.
The sync mechanism must use the web APIs to sync both the data and the database structure.
My Proposal:
I was planning to use the Microsoft Sync Framework to add the sync feature, but it cannot be done through the web APIs (and it has to be done through the APIs as per the client requirement).
We also need complete audit/reporting of the sync process, so this cannot be done without customizing the program to add the sync feature, and we cannot use the Microsoft Sync Framework.
My Question/Queries
Can anyone suggest the best approach to do this so that both the data and the schema can be synced?
It would be great if anyone has used a generic approach that can be applied to all database tables.
Thank you very much in advance!!
I'm not sure of the best way to do this. I have an application which runs against a MS SQL database on a local server, and I want to migrate this to Azure. I want to establish a sync between the local and cloud databases and be able to switch my connection string between local and cloud regularly, to test whether working with Azure is suitable before switching over completely.
I am having this issue with the MS SQL Data Sync preview:
Unable to apply a row that exceeds the Maximum Application Transaction Size for table 'dbo.XXXXXX'. Please increase the transaction size to be greater than the size of the largest row being synchronized
Should I use the Microsoft.Synchronization framework to do my own custom sync? Is this a lot of work? Or is there a way to allow the sync agent to synchronize heavy tables?
The root cause of your sync failure is that the row size of table dbo.XXX is 32278 KB, which is larger than the 24576 KB that Data Sync supports. Here is a doc describing the limitations of the Data Sync Service:
http://msdn.microsoft.com/en-us/library/jj590380.aspx.
I'm an engineer from the SQL Data Sync team. The potential root cause of this issue is as Paul said. For confirmation, could you please provide the related Sync Group Id and Azure Subscription Id here (you can find this information in the "PROPERTIES" tab of the specific Sync Group)? I'll then do a further check on our service backend.
https://learn.microsoft.com/en-us/azure/sql-database/sql-database-sync-data
This is a known limitation of Azure SQL Data Sync; try Azure Data Factory v2 (ADFv2) with a delta copy instead. It's simple and effective for databases larger than 20 GB.
I'm very new to .NET, C# and SQL Server.
I need to develop a socket application using C#. That application should insert, delete, update, etc. the data in the database in response to requests.
I need to develop another Windows service application using C# that sends the data from the database to the web application over HTTP.
Both applications run on the same system, the database is SQL Server, and both applications use the same database.
I am unsure whether, while one application is deleting or inserting data in the database, the other application can still access the database at the same time.
SQL Server can handle multiple requests just fine. Each connected client gets a different SPID.
However, it is up to your SQL Server code (batches or stored procedures) to handle data concurrency issues via isolation levels and transactions. So in order to control one user reading data while others are updating, deleting, etc., you need to use isolation levels and locks (a short example follows the links below).
Isolation levels:
http://msdn.microsoft.com/en-us/library/aa213034(v=sql.80).aspx
Also: http://dbalink.wordpress.com/2008/05/27/isolation-levels-and-locks-in-sql-server-2005/
Good writeup on transactions:
https://web.archive.org/web/20210614153200/http://aspnet.4guysfromrolla.com/articles/072705-1.aspx
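To make the isolation/transaction point concrete, here is a minimal C# sketch (not taken from the linked articles) of one client updating a row inside an explicit transaction with a chosen isolation level; the connection string, table and column names are illustrative.

```csharp
using System.Data;
using System.Data.SqlClient;

class ConcurrentWriteExample
{
    static void UpdateStatus(string connectionString, int orderId, string status)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // ReadCommitted is the default level; stricter levels
            // (RepeatableRead, Serializable) block more but prevent more anomalies.
            using (var tx = conn.BeginTransaction(IsolationLevel.ReadCommitted))
            {
                var cmd = new SqlCommand(
                    "UPDATE dbo.Orders SET Status = @status WHERE Id = @id",
                    conn, tx);
                cmd.Parameters.AddWithValue("@status", status);
                cmd.Parameters.AddWithValue("@id", orderId);
                cmd.ExecuteNonQuery();

                // Other connections see the change only after the commit; until
                // then they are blocked or read the old value, depending on
                // their own isolation level.
                tx.Commit();
            }
        }
    }
}
```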
I think that the simple answer is yes - multiple applications/people can access a database at the same time.
You may want to read up on transactions within a database and concurrency:
http://en.wikipedia.org/wiki/Atomicity_(database_systems)
http://msdn.microsoft.com/en-us/library/cs6hb8k4(v=vs.80).aspx
It is possible to access the same SQL Server database from as many applications as you want. If you are afraid that one application may attempt to read data while another is changing it at the same time, read about transactions and isolation. In short, if you are making a small atomic change that requires only one line of SQL, you probably don't need to worry about this at the moment.
I have a datalogging application (c#/.net) that logs data to a SQLite database. This database is written to constantly while the application is running. It is also possible for the database to be archived and a new database created once the size of the SQLite database reaches a predefined size.
I'm writing a web application for reporting on the data. My web setup is c#/.Net with a SQL Server. Clients will be able to see their own data gathered online from their instance of my application.
For test purposes, to upload data to test with, I've written a rough-and-dirty application which basically reads from the SQLite DB and then injects the data into the SQL Server using SQL - I run the application once to populate the online SQL Server DB.
My application is written in c# and is modular so I could add a process that periodically checks the SQLite DB then transfer new data in batches to my SQL Server.
My question is: if I wanted to continually synchronise the client-side SQLite database(s) with my server while the application is datalogging, what would be the best way of going about this?
Is there any technology/strategy I should be looking into employing here? Any recommended techniques?
Several options come to mind. You can add a timestamp column to each table that you want to copy from and then select the rows written after the last update (sketched below). This is fast and will still work if you archive the database and start with an empty one.
You can also journal your updates for each table into an XML string that describes the changes, and store that in a new table that is treated as a queue.
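A minimal sketch of the timestamp approach, assuming the System.Data.SQLite provider on the client side and SqlBulkCopy on the SQL Server side; the Log table, the LoggedAt column and the watermark handling are illustrative assumptions.

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.Data.SQLite;

class SqliteToSqlServerSync
{
    // Copies rows logged since lastSync and returns the new watermark,
    // which the caller should persist for the next run.
    public static DateTime PushNewRows(string sqlitePath, string sqlServerCs, DateTime lastSync)
    {
        var newWatermark = DateTime.UtcNow;
        var table = new DataTable();

        // Read only the rows written after the last successful sync.
        using (var source = new SQLiteConnection($"Data Source={sqlitePath}"))
        {
            source.Open();
            var cmd = new SQLiteCommand(
                "SELECT Id, SensorId, Value, LoggedAt FROM Log WHERE LoggedAt > @since",
                source);
            cmd.Parameters.AddWithValue("@since", lastSync);
            new SQLiteDataAdapter(cmd).Fill(table);
        }

        // Push the batch to the central SQL Server database.
        using (var bulk = new SqlBulkCopy(sqlServerCs))
        {
            bulk.DestinationTableName = "dbo.Log";
            bulk.BatchSize = 1000;
            bulk.WriteToServer(table);
        }

        return newWatermark;
    }
}
```

Running this periodically from the existing application (or a small service) gives you continuous one-way synchronization while the logger keeps writing.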
You could take a look at the Sync Framework. How complex is the schema that you're looking to sync up, and is it only one-way or does data need to come back down?
As a simple solution, I'd look at exporting the data in some delimited format and then using bcp/BULK INSERT to pull it into your central server.
You might want to investigate the concept of Log Shipping.
There is an open source project on GitHub, also available on NuGet, called SyncWinRT. It implements the Sync Framework Toolkit to enable synchronization with WinRT or Windows Phone 8 and SQLite.
You can access the project from https://github.com/Mimetis/SyncWinRT.