I'm not sure I understand the best way to do this. I have an application that runs against an MS SQL database on a local server, and I want to migrate it to Azure. I want to establish a sync between the local and cloud databases and be able to switch my connection string between local and cloud regularly, to test whether the application works well against Azure before switching over completely.
I am having this issue with the MS SQL Data Sync preview:
Unable to apply a row that exceeds the Maximum Application Transaction Size for table 'dbo.XXXXXX'. Please increase the transaction size to be greater than the size of the largest row being synchronized
Should I use the Microsoft.Synchronization framework to build a custom sync? Is that a lot of work? Or is there a way to allow the sync agent to synchronize heavy tables?
The root cause of your sync failure is that the row size of table dbo.XXX is 32278KB, which is larger than the 24576KB that Data Sync supports. Here is a doc describing the limitations of the Data Sync Service:
http://msdn.microsoft.com/en-us/library/jj590380.aspx.
I'm an engineer from the SQL Data Sync team. The potential root cause of this issue is as Paul said. To confirm, could you please provide the related Sync Group Id and Azure Subscription Id here (you can find this information in the "PROPERTIES" tab of the specific Sync Group)? I'll then do a further check on our service backend.
https://learn.microsoft.com/en-us/azure/sql-database/sql-database-sync-data
This is a known limitation of Azure Data Sync. Try Azure Data Factory v2 (ADFv2) with a delta copy; it's simple and effective for databases larger than 20 GB.
I want to make a background-running desktop app using ASP.NET / C#.
The idea is to check a local database table and get all the data from that table in order to insert it into a live database table.
If there is a better solution for that scenario, please suggest one.
It seems that what you want to solve is moving data from multiple local databases into a central one so you can consume it via an ASP.NET website.
I won't discuss the application part, since that's an easy problem to solve once you can connect to a database that has all the data you need.
There are several things to consider here:
Recurrence (one time migration vs continuous synchronization)
Direction (do you need to sync one-way from local to central or two-way)
Data Contract (do local databases have same or different schema than central database)
The Data Contract is your biggest problem if the schemas are different, since you will need to design at least a target schema for the central database that can take in data from the local dbs.
Even if the schemas are identical, you will need to devise a way to partition data in the central database; you might need to introduce a sourceDatabaseId column in your tables so you won't have conflicting primary keys (you won't have this problem if your primary keys are GUIDs).
The others can be solved by building either:
A Windows service - Inputs: periodicity (e.g. every hour), source db and target db connection strings. The main loop waits until it is time to run (based on the periodicity), then fetches data from the source db and saves it into the target db (preferably in batches).
A console application - Inputs: source db and target db connection strings. It just moves data in batches (see the sketch at the end of this answer). You can configure a Scheduled Task on the server to run the console application periodically, which solves the scheduling part.
You would set up one such Windows service or scheduled console app per local database.
If you have complex databases you can look into tools like Microsoft Sync Framework to perform this data synchronization.
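A minimal sketch of the console-application approach, assuming hypothetical table names (dbo.LocalSales, dbo.CentralSales) and a SourceDatabaseId column on the central table; adapt it to your own schema:

```csharp
// Minimal sketch: copy a local table into the central database in batches.
// Table names, column names, and the SourceDatabaseId convention are assumptions.
using System;
using System.Data;
using System.Data.SqlClient;

class BatchCopy
{
    static void Main(string[] args)
    {
        string sourceCs = args[0];                  // local database connection string
        string targetCs = args[1];                  // central database connection string
        int sourceDatabaseId = int.Parse(args[2]);  // identifies this local site centrally

        using (var source = new SqlConnection(sourceCs))
        using (var target = new SqlConnection(targetCs))
        {
            source.Open();
            target.Open();

            // Read the local rows (for very large tables, page this query instead).
            var table = new DataTable();
            new SqlDataAdapter("SELECT Id, CustomerName, Amount FROM dbo.LocalSales", source).Fill(table);

            // Tag every row with the originating database so primary keys cannot collide centrally.
            table.Columns.Add("SourceDatabaseId", typeof(int));
            foreach (DataRow row in table.Rows)
                row["SourceDatabaseId"] = sourceDatabaseId;

            using (var bulk = new SqlBulkCopy(target) { DestinationTableName = "dbo.CentralSales", BatchSize = 1000 })
            {
                bulk.ColumnMappings.Add("Id", "SourceId");
                bulk.ColumnMappings.Add("CustomerName", "CustomerName");
                bulk.ColumnMappings.Add("Amount", "Amount");
                bulk.ColumnMappings.Add("SourceDatabaseId", "SourceDatabaseId");
                bulk.WriteToServer(table);
            }
        }
    }
}
```

The same data-moving code can run inside the Windows-service variant; only the scheduling shell differs.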
You can develop a Windows service to do this type of work and install it on the server with a timer setting, for example:
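A minimal sketch of a timer-driven Windows service, assuming the classic .NET Framework System.ServiceProcess model; RunSync() is a placeholder for your actual data transfer:

```csharp
// Minimal sketch: a Windows service that runs a sync on a timer.
// The service name, interval, and RunSync() body are placeholders.
using System.ServiceProcess;
using System.Timers;

public class SyncService : ServiceBase
{
    private readonly Timer _timer = new Timer(60 * 60 * 1000); // once per hour

    public SyncService()
    {
        ServiceName = "LocalToLiveSync";
        _timer.Elapsed += (sender, e) => RunSync();
    }

    protected override void OnStart(string[] args) => _timer.Start();

    protected override void OnStop() => _timer.Stop();

    private void RunSync()
    {
        // Read new rows from the local database and insert them into the live
        // database, e.g. with SqlBulkCopy as in the console-app sketch above.
    }

    public static void Main() => Run(new SyncService());
}
```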
I have a Metro app which takes a picture of a burning flame and sends it to Azure. I am storing the image directly in a SQL Server table, not in BLOB storage, because the image is generally < 100KB. The way I am implementing it: the image is inserted into the table, and after a successful insert a push notification is sent to the client with a set of instructions indicating the action to be taken for the flame.
Now, I am researching how I can implement pattern matching in the SQL Server table.
The table already has 10 images; my app takes a picture, inserts it into the table, and tries to compare it to find the closest match, and based on that match specific instructions will be sent to the Metro app.
Is there any framework I can use to do this pattern matching in the cloud and carry out a specific task based on the result?
Can anybody please help me with any info in this regard?
While I can't recommend anything specific: Whatever app you find that will install and run in either a Windows or Linux virtual machine (or in a Windows VM in a Cloud Service, if the install can be automated and quick), should be ok. Just make sure whatever library you use doesn't rely on any specific GPU (since Azure doesn't offer GPU support today).
I saw another answer recommending a CLR procedure to process the images. I really wouldn't recommend that, as you're now stressing the CPU of your SQL Server, and that's not something that can easily be scaled out to multiple servers. And if you choose to use Windows Azure SQL Database, you won't have CLR as an option. You're better off placing processing in, say, a Cloud Service worker role, where you can scale out to any number of instances, and you can then use an Azure Queue to instruct the workers to perform specific comparisons / processing.
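As a rough illustration of that queue-driven worker pattern, here is a minimal sketch using the Azure Storage queue client; the queue name, the message format (the id of the newly inserted image), and CompareToKnownImages() are all assumptions:

```csharp
// Minimal sketch: a worker that pulls image ids from an Azure Storage queue and
// runs the comparison. Queue name, message format, and the comparison routine
// are placeholders; scale out by running more instances of this worker.
using System;
using System.Threading.Tasks;
using Azure.Storage.Queues;

class CompareWorker
{
    static async Task Main()
    {
        var queue = new QueueClient("<storage-connection-string>", "image-compare");
        await queue.CreateIfNotExistsAsync();

        while (true)
        {
            var response = await queue.ReceiveMessageAsync();
            var message = response.Value;
            if (message == null)
            {
                await Task.Delay(TimeSpan.FromSeconds(5)); // nothing queued, back off
                continue;
            }

            int imageId = int.Parse(message.MessageText);
            CompareToKnownImages(imageId); // fetch the image from SQL and run the match

            await queue.DeleteMessageAsync(message.MessageId, message.PopReceipt);
        }
    }

    static void CompareToKnownImages(int imageId)
    {
        // Placeholder: load the image bytes from the database, compare against the
        // stored reference images, and push the matching instructions to the client.
    }
}
```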
My client application maintains an Access database and works offline most of the time.
On the server side there is a web service with an Oracle back end. I need to update the Access database tables with the latest Oracle table data. Currently I'm doing this with a C# Windows service triggered by a timer. Is there any alternative that achieves this data synchronization with fault tolerance and good performance? Please share your experience.
You can use Quartz.NET (a full-featured, open source job scheduling system that can be used from the smallest apps to large-scale enterprise systems) to schedule the synchronization.
Database synchronization can be done using the Microsoft Sync Framework, but I'm not sure whether it supports Access databases.
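A minimal sketch of scheduling the sync with Quartz.NET (3.x API); the job identity, the interval, and the body of SyncJob are assumptions:

```csharp
// Minimal sketch: run a synchronization job every 15 minutes with Quartz.NET.
// SyncJob's body is a placeholder for the Oracle-to-Access transfer.
using System;
using System.Threading;
using System.Threading.Tasks;
using Quartz;
using Quartz.Impl;

public class SyncJob : IJob
{
    public Task Execute(IJobExecutionContext context)
    {
        // Call the web service, read the latest Oracle rows, and update Access here.
        Console.WriteLine($"Sync run at {DateTime.UtcNow:O}");
        return Task.CompletedTask;
    }
}

class Program
{
    static async Task Main()
    {
        IScheduler scheduler = await StdSchedulerFactory.GetDefaultScheduler();
        await scheduler.Start();

        IJobDetail job = JobBuilder.Create<SyncJob>()
            .WithIdentity("oracle-to-access-sync")
            .Build();

        ITrigger trigger = TriggerBuilder.Create()
            .StartNow()
            .WithSimpleSchedule(s => s.WithIntervalInMinutes(15).RepeatForever())
            .Build();

        await scheduler.ScheduleJob(job, trigger);
        await Task.Delay(Timeout.Infinite); // keep the host alive
    }
}
```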
I have several client databases that use my Windows application.
I want to send this data to an online web site.
The client database and server database structures are different, because we need to add a client ID column to some tables in the server database.
Currently I sync the databases this way: a separate application uses C# bulk copy, inside a transaction, to sync the databases.
My server database (SQL Server) is too busy, and parallel tasks cannot be run.
I am working on this solution:
I use AFTER UPDATE, DELETE, and INSERT triggers to save changes into one table, and build a SQL query to send to a web service to sync the data.
But I must send all the data first! It is a huge data set (bigger than 16mg).
I think I can't use replication because the structures and primary keys are different.
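A minimal sketch of the client-side sender for that trigger-based approach; the dbo.ChangeLog table, the web-service URL, and the payload shape are all assumptions:

```csharp
// Minimal sketch: read the rows queued up by the AFTER INSERT/UPDATE/DELETE
// triggers and post them to the web service. Table, URL, and payload are placeholders.
using System;
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Net.Http;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

class ChangeSender
{
    static async Task Main()
    {
        using var connection = new SqlConnection("<client-connection-string>");
        connection.Open();

        var changes = new List<Dictionary<string, object>>();
        long maxChangeId = 0;

        using (var cmd = new SqlCommand(
            "SELECT ChangeId, TableName, Operation, PayloadJson FROM dbo.ChangeLog WHERE Sent = 0", connection))
        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                long changeId = reader.GetInt64(0);
                maxChangeId = Math.Max(maxChangeId, changeId);
                changes.Add(new Dictionary<string, object>
                {
                    ["changeId"] = changeId,
                    ["tableName"] = reader.GetString(1),
                    ["operation"] = reader.GetString(2),
                    ["payload"] = reader.GetString(3)
                });
            }
        }

        if (changes.Count == 0) return;

        // Send the batch; only mark the rows we actually read as sent.
        using var http = new HttpClient();
        var content = new StringContent(JsonSerializer.Serialize(changes), Encoding.UTF8, "application/json");
        var response = await http.PostAsync("https://example.com/api/sync", content);
        if (response.IsSuccessStatusCode)
        {
            using var mark = new SqlCommand(
                "UPDATE dbo.ChangeLog SET Sent = 1 WHERE Sent = 0 AND ChangeId <= @maxId", connection);
            mark.Parameters.AddWithValue("@maxId", maxChangeId);
            mark.ExecuteNonQuery();
        }
    }
}
```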
Have you considered using SSIS to do scheduled data synchronization? You can do data transformation and bulk inserts fairly easily.
As I understand what you're trying to do, you want to allow multiple client applications to have their data synchronized to a server in such a way that the server has all the data from all the sites, but that each record also has a client identifier so you can maintain traceability back to the source.
Why must you send all the data to the server before you get the other information set up? You should be able to build all these things concurrently. Also, you don't have to upload all the data at one time. Stage them out to one per day (assuming you have a small number of client databases); that would give you a way to focus on each in turn and make sure the process completes accurately.
Will you be replicating the data back to the clients after consolidating all the data into one table? Your size information was miscommunicated; were you saying each database was larger than 16GB? If so, five sites would have a cumulative size of 80GB to be replicated back to the individual sites.
Otherwise, the method you outlined with using a separate application to specifically handle the uploading of data would be the most appropriate.
Are you going to upgrade the individual schemas after you update the master database? You may want to ALTER TABLE and add a bool column so every record can be marked as "sent" or "not sent", to keep track of any records that were updated/inserted "late". I have a feeling you're going to be doing a rolling deployment/upgrade and you're trying to figure out how to keep it all "in sync" without losing anything.
You could use SQL Server Transactional Replication: HOW TO: Replicate Between Computers Running SQL Server in Non-Trusted Domains or Across the Internet
I have a datalogging application (C#/.NET) that logs data to a SQLite database. This database is written to constantly while the application is running. It is also possible for the database to be archived and a new database created once the SQLite database reaches a predefined size.
I'm writing a web application for reporting on the data. My web setup is C#/.NET with SQL Server. Clients will be able to see their own data, gathered online from their instance of my application.
For test purposes, to upload data to test with, I've written a rough-and-ready application which basically reads from the SQLite DB and then injects the data into SQL Server; I run the application once to populate the online SQL Server DB.
My application is written in C# and is modular, so I could add a process that periodically checks the SQLite DB and then transfers new data in batches to my SQL Server.
My question is: if I wanted to continually synchronise the client-side SQLite database(s) with my server while the application is datalogging, what would be the best way of going about this?
Is there any technology/strategy I should be looking into employing here? Any recommended techniques?
Several options come to mind. You can add a timestamp to each table that you want to copy from and then select rows written after the last update. This is fast and will work if you archive the database and start with an empty one.
You can also journal your updates for each table into an XML string that describes the changes, and store that in a new table that is treated as a queue.
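For the timestamp approach, a minimal sketch (assuming the Microsoft.Data.Sqlite package on the client and hypothetical table/column names) might look like this:

```csharp
// Minimal sketch: copy rows logged since the last sync from SQLite into SQL Server.
// The Readings table, its columns, and the connection strings are placeholders.
using System;
using System.Data.SqlClient;
using Microsoft.Data.Sqlite;

class IncrementalSync
{
    static void Main()
    {
        DateTime lastSyncUtc = DateTime.MinValue; // persist this value between runs

        using var source = new SqliteConnection("Data Source=datalog.db");
        source.Open();
        using var cmd = source.CreateCommand();
        cmd.CommandText =
            "SELECT Id, LoggedAtUtc, SensorId, Value FROM Readings WHERE LoggedAtUtc > $since";
        cmd.Parameters.AddWithValue("$since", lastSyncUtc.ToString("o"));
        using var reader = cmd.ExecuteReader();

        using var target = new SqlConnection("<sql-server-connection-string>");
        target.Open();
        using var bulk = new SqlBulkCopy(target)
        {
            DestinationTableName = "dbo.Readings",
            BatchSize = 1000
        };
        bulk.WriteToServer(reader); // streams the new rows to SQL Server in batches
    }
}
```

After a successful run, record the newest LoggedAtUtc value you copied and use it as the starting point for the next run.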
You could take a look at the Sync Framework. How complex is the schema that you're looking to sync up & is it only one-way or does data need to come back down?
As a simple solution I'd look at exporting the data in some delimited format and then using bcp/BULK INSERT to pull it in to your central server.
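A minimal sketch of that export step (reading from the SQLite log and writing a tab-separated file; the table, columns, and file name are assumptions), with the server-side load left to bcp or BULK INSERT:

```csharp
// Minimal sketch: export the SQLite log table to a tab-separated file that the
// central SQL Server can load, e.g. with:
//   bcp dbo.Readings in readings.tsv -S <server> -d <db> -T -c
// Table, column names, and the output path are placeholders.
using System.IO;
using Microsoft.Data.Sqlite;

class DelimitedExport
{
    static void Main()
    {
        using var source = new SqliteConnection("Data Source=datalog.db");
        source.Open();
        using var cmd = source.CreateCommand();
        cmd.CommandText = "SELECT Id, LoggedAtUtc, SensorId, Value FROM Readings";
        using var reader = cmd.ExecuteReader();

        using var writer = new StreamWriter("readings.tsv");
        while (reader.Read())
        {
            writer.WriteLine(string.Join("\t",
                reader.GetInt64(0), reader.GetString(1), reader.GetInt64(2), reader.GetDouble(3)));
        }
    }
}
```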
You might want to investigate the concept of log shipping.
There is an open source project on GitHub, also available on NuGet, called SyncWinR; it implements the Sync Framework Toolkit to enable synchronization with WinRT or Windows Phone 8 and SQLite.
You can access the project from https://github.com/Mimetis/SyncWinRT.