Sync databases customization - C#

Existing Application:
I have an existing distributed Windows Forms application developed with VS 2013 and SQL Server 2008. The application communicates from various remote locations to the central database through Web APIs.
Requirement/Feature:
1. I have to add a new feature so that the application can run in offline mode when there is no internet connectivity. When the application is connected to the internet again, it should automatically sync all the data to the server. Likewise, any changes made to the central database should be synced back to the remote locations.
The sync mechanism must use the Web APIs to sync both the data and the database structure.
My Proposal:
I was planning to use the Microsoft Sync Framework to add the sync feature, but it cannot work through Web APIs (and the sync has to go through the APIs as per the client requirement).
We also need complete auditing/reporting of the sync process, so this cannot be done without customizing the application to add the sync feature, and we cannot use the Microsoft Sync Framework.
My Question/Queries:
Can anyone suggest the best approach so that both the data and the schema can be synced?
It would be great if anyone has used a generic approach that can be applied to all database tables.
Thank you very much in advance!!
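One common generic pattern (a sketch under assumptions, not a definitive design): give every table a LastModifiedUtc column plus a tombstone table for deletes, record the last successful sync time per client, and push/pull changed rows as JSON through the existing Web API so the server can write an audit record for every applied change. The endpoint route, the Json.NET usage, and the column names below are assumptions.

// Sketch: generic per-table change push over an existing Web API.
// Assumes each table has a LastModifiedUtc column; the /api/sync/{table}
// route and Json.NET are assumptions. Conflict handling and the matching
// pull direction are omitted for brevity.
using System;
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json;

public class TableSyncClient
{
    private readonly HttpClient _http =
        new HttpClient { BaseAddress = new Uri("https://central.example.com/") };

    public async Task PushChangesAsync(string connectionString, string table, DateTime lastSyncUtc)
    {
        var rows = new List<Dictionary<string, object>>();

        // Read only the rows changed since the last successful sync.
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            string.Format("SELECT * FROM [{0}] WHERE LastModifiedUtc > @since", table), conn))
        {
            cmd.Parameters.AddWithValue("@since", lastSyncUtc);
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    var row = new Dictionary<string, object>();
                    for (int i = 0; i < reader.FieldCount; i++)
                        row[reader.GetName(i)] = reader.GetValue(i);
                    rows.Add(row);
                }
            }
        }

        // The server applies the rows and writes an audit record per change,
        // which covers the audit/reporting requirement.
        var content = new StringContent(
            JsonConvert.SerializeObject(rows), Encoding.UTF8, "application/json");
        var response = await _http.PostAsync("api/sync/" + table, content);
        response.EnsureSuccessStatusCode();
    }
}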

Related

Visual FoxPro data in .NET

We are in the process of migrating an old VFP application into a .NET WPF application with SQL server.
During the process we still need to read/write to the DBF files to keep our business working properly.
To do this, we use the standard OLEDB adapter that is available. However, our sysadmin is asking if we have an alternative way to access the DBF files.
Having each user connect to the files is not the best option from a network/security perspective, especially when connecting from home through a VPN.
I've already tried to move the connection to a single server by exposing the data through an API. But that was slowing down the application too much. In some situations we synchronise the data through background jobs (Hangfire implementation). But this can be time consuming to implement.
Has anybody used any other techniques to do something similar while migrating a VFP application?
OLE DB is still the best option. Within the application, you could impersonate a specific user that has access to the files.
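If you stay on OLE DB, a minimal read sketch looks like the following (assuming the Visual FoxPro OLE DB provider, VFPOLEDB, is installed; the folder path and table name are placeholders, and the provider is 32-bit only so the process must run as x86):

// Minimal OLE DB read from a folder of VFP/DBF files.
using System;
using System.Data.OleDb;

class DbfReaderDemo
{
    static void Main()
    {
        // Data Source points at the folder containing the .dbf files.
        var connectionString = @"Provider=VFPOLEDB.1;Data Source=C:\data\vfp\;";

        using (var conn = new OleDbConnection(connectionString))
        using (var cmd = new OleDbCommand("SELECT * FROM customers", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine(reader[0]);
            }
        }
    }
}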
Sybase Advantage Database Server can also connect to and work with VFP data files. Local mode is (or was) free and server mode is paid. You might try checking that too.
Locate the data on a single PC acting as a server and access it via RDP (kludges are available to support multiple connections). If needed, increase security by connecting over VPN first, then RDP.

Advice on options for a shared database for a distributed C# application

I'd like to know my options for the following scenario:
I have a C# winforms application (developed in VS 2010) distributed to a number of offices within the country. The application communicates with a C# web service which lies on a main server at a separate location and there is one database (SQL Server 2012) at a further location. (All servers run Windows Server 2008)
Head Office (where we are) utilize the same front-end to manage certain information on the database which needs to be readily available to all offices - real-time. At the same time, any data they change needs to be readily available to us at Head Office as we have a real-time dashboard web application that monitors site-wide statistics.
Currently, the users are complaining about the speed at which the application operates. They say it is really slow. We work in a business-critical environment where every minute waiting may mean losing a client.
I have researched the following options, but do not come from a DB background, so not too sure what the best route for my scenario is.
Terminal Services/Sessions (which I've just implemented at Head Office and they say it's a great improvement, although there's a terrible lag, like remoting onto someone's desktop, which is not nice to work on).
Transactional Replication (sounds quite plausible for my scenario, but it would require all offices to have their own SQL Server database on their individual servers, and they have a tendency to "fiddle" and break everything they're left in charge of! We wish we could take over all their servers, but they are franchises, so they have their own IT people on site).
I've currently got a whole lot of the look-up data being cached on start-up of the application, but this too takes 2-3 minutes to complete, which is just not acceptable!
Does anyone have any ideas?
With everything running through the web service, there is no need for additional SQL Servers to be deployed local to the clients; the WS wouldn't be able to communicate with those databases unless the WS was also deployed locally.
Before suggesting any specific improvements, you need to benchmark where your bottlenecks are occurring. What is the latency between the various clients and the web service, and then from the web service and the database? Does the database show any waiting? Once you know the worst case scenario, improve that, and then work your way down.
Some general thoughts, though:
Move the WS closer to the database
Cache the data at the web service level to save on DB calls (see the caching sketch after this list)
Find the expensive WS calls, and try to optimize their throughput
If the lookup data doesn't change all that often, use a local copy of SQL CE to cache that data, and use the MS Sync Framework to keep the data synchronized to the SQL Server
Use SQL CE for everything on the client computer, and use a background process to sync between the client and WS
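As a sketch of the web-service-level caching idea above (the cache key and loader delegate are placeholders), System.Runtime.Caching.MemoryCache can hold rarely changing lookup data for a fixed period so repeated client calls don't hit SQL Server every time:

// Cache lookup data in the web service process.
using System;
using System.Collections.Generic;
using System.Runtime.Caching;

public class LookupCache
{
    private static readonly MemoryCache Cache = MemoryCache.Default;

    public IList<string> GetLookupValues(Func<IList<string>> loadFromDatabase)
    {
        var cached = Cache.Get("LookupValues") as IList<string>;
        if (cached != null)
            return cached;

        // Single DB hit; subsequent calls are served from memory for 30 minutes.
        var values = loadFromDatabase();
        Cache.Set("LookupValues", values,
            new CacheItemPolicy { AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(30) });
        return values;
    }
}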
UPDATE
After your comment, two additional thoughts. If your web service payloads are large, you can try adding compression on the web service (if it isn't already implemented).
You can also update your client to make the WS calls asynchronously, either on a background thread or, if you are using .NET 4.5, with async/await. This would at least keep the UI usable, but wouldn't necessarily fix any issues with data load times.
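A minimal client-side sketch of both ideas, assuming .NET 4.5 (the endpoint URL and control layout are placeholders): HttpClient with automatic GZip decompression, called with async/await from a WinForms handler so the UI stays responsive.

using System;
using System.Net;
using System.Net.Http;
using System.Windows.Forms;

public class DashboardForm : Form
{
    // Request gzip/deflate so large payloads travel compressed
    // (the service must also be configured to compress responses).
    private static readonly HttpClient Http = new HttpClient(
        new HttpClientHandler
        {
            AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate
        });

    private readonly Button loadButton = new Button { Text = "Load", Dock = DockStyle.Top };
    private readonly TextBox resultTextBox = new TextBox { Multiline = true, Dock = DockStyle.Fill };

    public DashboardForm()
    {
        Controls.Add(resultTextBox);
        Controls.Add(loadButton);
        loadButton.Click += OnLoadClicked;
    }

    // async/await keeps the UI thread free while the call is in flight.
    private async void OnLoadClicked(object sender, EventArgs e)
    {
        loadButton.Enabled = false;
        try
        {
            // Placeholder URL for the real web service endpoint.
            resultTextBox.Text = await Http.GetStringAsync("https://ws.example.com/api/statistics");
        }
        finally
        {
            loadButton.Enabled = true;
        }
    }
}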

SQL Data synchronization

We have multiple clients operating the same software application, with the same database structure, on their individual PCs. All the clients are offline and at different locations (not connected by LAN, etc.).
Is it possible for each client PC to collect its own data, and then for the server to restore each client's backup and get updated with that client's data? When restoring, the data on the server should be merged with all the clients' data, so that the server admin can view the activity at each client site.
I hope I am clear.
Thank you very much in advance.
Regards.
Edit: We will be using SQL Server Express 2008 Edition.
As @Dennis mentioned, you could use the Microsoft Sync Framework.
Synchronization is a non-trivial task because of factors like conflict handling, change detection, and timestamp synchronization. The Microsoft Sync Framework handles all of that for you.
There are several other frameworks that do synchronization; see OpenSync or SymmetricDS.
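For reference, a minimal Sync Framework sketch (assuming the Sync Framework 2.1 SQL providers; the scope name, table name, and connection strings are placeholders): provision a scope once per database, then synchronize a client database with the server.

// Sketch: provision a sync scope and synchronize two SQL Server databases.
using System;
using System.Data.SqlClient;
using Microsoft.Synchronization;
using Microsoft.Synchronization.Data;
using Microsoft.Synchronization.Data.SqlServer;

class SyncDemo
{
    static void Provision(string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            var scopeDesc = new DbSyncScopeDescription("ClientScope");
            scopeDesc.Tables.Add(
                SqlSyncDescriptionBuilder.GetDescriptionForTable("Orders", conn));

            var provisioning = new SqlSyncScopeProvisioning(conn, scopeDesc);
            if (!provisioning.ScopeExists("ClientScope"))
                provisioning.Apply();   // creates change-tracking tables and triggers
        }
    }

    static void Synchronize(string clientConnStr, string serverConnStr)
    {
        using (var clientConn = new SqlConnection(clientConnStr))
        using (var serverConn = new SqlConnection(serverConnStr))
        {
            var orchestrator = new SyncOrchestrator
            {
                LocalProvider = new SqlSyncProvider("ClientScope", clientConn),
                RemoteProvider = new SqlSyncProvider("ClientScope", serverConn),
                Direction = SyncDirectionOrder.UploadAndDownload
            };

            SyncOperationStatistics stats = orchestrator.Synchronize();
            Console.WriteLine("Uploaded: {0}, Downloaded: {1}",
                stats.UploadChangesApplied, stats.DownloadChangesApplied);
        }
    }
}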

Synchronize Oracle database to Access database

My client application maintains an Access database and works offline most of the time.
On the server side there is a web service with an Oracle back end. I need to update the Access database tables with the latest Oracle table data. Currently I'm doing this with a C# Windows service triggered by a timer. Is there any alternative to achieve this data synchronization with fault tolerance and good performance? Please share your experience.
You could use Quartz.NET (a full-featured, open source job scheduling system that can be used from the smallest apps to large-scale enterprise systems) to schedule the synchronization.
Database synchronization can be done using the Microsoft Sync Framework, but I'm not sure whether it supports Access databases or not.
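A minimal scheduling sketch, assuming Quartz.NET 2.x (where IJob.Execute is synchronous); the SyncJob body and the 15-minute interval are placeholders for whatever copies the Oracle data into Access:

// Schedule a recurring synchronization job with Quartz.NET 2.x.
using System;
using Quartz;
using Quartz.Impl;

public class SyncJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        // Pull changed rows from the Oracle-backed web service and
        // write them into the local Access database here.
        Console.WriteLine("Sync run at {0}", DateTime.Now);
    }
}

class SchedulerHost
{
    static void Main()
    {
        ISchedulerFactory factory = new StdSchedulerFactory();
        IScheduler scheduler = factory.GetScheduler();
        scheduler.Start();

        IJobDetail job = JobBuilder.Create<SyncJob>()
            .WithIdentity("oracleToAccessSync")
            .Build();

        ITrigger trigger = TriggerBuilder.Create()
            .StartNow()
            .WithSimpleSchedule(s => s.WithIntervalInMinutes(15).RepeatForever())
            .Build();

        scheduler.ScheduleJob(job, trigger);

        // Keep the host process alive; Quartz runs the job on its own threads.
        Console.WriteLine("Scheduler running; press Enter to exit.");
        Console.ReadLine();
        scheduler.Shutdown();
    }
}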

Implementing a desktop .NET application that can work offline

I need to create a desktop WPF application in .NET.
The application communicates with a web server, and can work in offline mode when the web server isn't available.
For example the application needs to calculate how much time the user works on a project. The application connects to the server and gets a list of projects, the user selects one project, and presses a button to start timer. The user can later stop the timer. The project start and stop times need to be sent to the server.
How can I implement this functionality when the application is in offline mode?
Is there an existing solution or some libraries that simplify this task?
Thanks in advance.
You'll need to do a couple of things differently in order to work offline.
First, you'll need to cache a list of projects. This way, the user doesn't have to go online to get the project list - you can pull it from your local cache when the user is offline.
Secondly, you'll need to save your timing results locally. Once you go online again, you can update the server with all of the historic timing data.
This just requires saving the information locally. You can choose to save it anywhere you wish, and even a simple XML file would suffice for the information you're saving, since it's simple - just a project + a timespan.
It sounds like this is a timing application for business tracking purposes, in which case you'll want to prevent the user from easily changing the data. Personally, I would probably save this in Isolated Storage, and potentially encrypt it.
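A minimal sketch of that local save, using XmlSerializer and Isolated Storage (the TimeEntry type and file name are placeholders; encryption is left out):

// Persist offline time entries to Isolated Storage as XML so they can be
// uploaded to the server later.
using System;
using System.Collections.Generic;
using System.IO;
using System.IO.IsolatedStorage;
using System.Xml.Serialization;

public class TimeEntry
{
    public string ProjectId { get; set; }
    public DateTime StartUtc { get; set; }
    public DateTime StopUtc { get; set; }
}

public static class OfflineStore
{
    private const string FileName = "pending-time-entries.xml";
    private static readonly XmlSerializer Serializer = new XmlSerializer(typeof(List<TimeEntry>));

    public static void Save(List<TimeEntry> entries)
    {
        using (var store = IsolatedStorageFile.GetUserStoreForAssembly())
        using (var stream = new IsolatedStorageFileStream(FileName, FileMode.Create, store))
        {
            Serializer.Serialize(stream, entries);
        }
    }

    public static List<TimeEntry> Load()
    {
        using (var store = IsolatedStorageFile.GetUserStoreForAssembly())
        {
            if (!store.FileExists(FileName))
                return new List<TimeEntry>();

            using (var stream = new IsolatedStorageFileStream(FileName, FileMode.Open, store))
                return (List<TimeEntry>)Serializer.Deserialize(stream);
        }
    }
}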
You can use SQL Server Compact for your local storage and then use the Microsoft Sync Framework to sync your local database to the server database. I recommend doing some research on the Microsoft Sync Framework.
Hello all, I implemented this application by creating my own offline framework, based on this article and the Microsoft Disconnected Service Agent (DSA).
I've adapted that framework to my needs.
Thank you all.
You can use a typed or untyped DataSet for offline storage.
When online (connected to the internet), you can download the data into a DataSet and upload it back to the database server. The DataSet can be loaded from and saved to a local file.
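For illustration, a tiny sketch of persisting a DataSet to a local file with its schema (the file name is a placeholder; filling the DataSet from the server is left to a DataAdapter):

// Save a DataSet to disk while offline and reload it later.
using System.Data;

class DataSetOfflineDemo
{
    static void Main()
    {
        var data = new DataSet("OfflineCache");
        // ... fill "data" from the server with a DataAdapter while online ...

        // Persist with the schema so column types survive the round trip.
        data.WriteXml("offline-cache.xml", XmlWriteMode.WriteSchema);

        var restored = new DataSet();
        restored.ReadXml("offline-cache.xml", XmlReadMode.ReadSchema);
    }
}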
