WPF C# MS SQL - Desktop version and save/load data to file

I have an application that currently runs against MS SQL Server Standard with multiple clients and a local server. The program was built using WPF plus stored procedures/functions/triggers, and there are no SQL Server (Agent) jobs.
I want to create a desktop version of my application, where the user can save all of the data to a file and load it on another desktop running the program.
I have never built this type of application before, and I have a few doubts about how I should do this; I don't know if what I'm planning is going to work.
Basically, I was thinking of installing my application together with SQL Server Express, and for saving/restoring to files I thought about calling BACKUP/RESTORE DATABASE through stored procedures.
Am I thinking right? Is there a better way to do this?

I'm not sure I understand why you would need to export the entire contents of the database to a file. The central database should be accessible by all of the client applications, avoiding the need for the client application to have its own dedicated copy of the database. But ... if this is really what you want, then I think your two options are:
1.) Use Backup+Restore (a rough sketch follows after this list).
2.) Use LocalDB with the AttachDbFileName property, which is designed to keep your database in a single self-contained local file that you can read/write without having to connect to an actual database server. Some starting info here.
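For option 1, a rough sketch of what the backup and restore calls might look like from C# is below; the database name (MyAppDb), connection strings and file paths are placeholders I made up for the example, not taken from the original application.

    // Rough sketch, not the original application's code: database name,
    // connection strings and paths are placeholders.
    using Microsoft.Data.SqlClient; // or System.Data.SqlClient on older projects

    public static class DbFileTransfer
    {
        // Writes the whole database to a single .bak file the user can copy elsewhere.
        public static void BackupToFile(string masterConnStr, string backupPath)
        {
            using var conn = new SqlConnection(masterConnStr);
            conn.Open();
            using var cmd = new SqlCommand(
                "BACKUP DATABASE [MyAppDb] TO DISK = @path WITH INIT", conn);
            cmd.Parameters.AddWithValue("@path", backupPath);
            cmd.ExecuteNonQuery();
        }

        // Restores that file on another machine; connect to master so the target
        // database itself is not in use during the restore.
        public static void RestoreFromFile(string masterConnStr, string backupPath)
        {
            using var conn = new SqlConnection(masterConnStr);
            conn.Open();
            using var cmd = new SqlCommand(
                "RESTORE DATABASE [MyAppDb] FROM DISK = @path WITH REPLACE", conn);
            cmd.Parameters.AddWithValue("@path", backupPath);
            cmd.ExecuteNonQuery();
        }
    }

If you go the LocalDB route instead, the AttachDbFileName connection string keeps everything in one .mdf file, so "saving to a file" amounts to copying that file while no connection is open.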

Related

Visual FoxPro data in .NET

We are in the process of migrating an old VFP application into a .NET WPF application with SQL server.
During the process we still need to read/write to the DBF files to keep our business working properly.
To do this, we use the standard OLEDB adapter that is available. However, our sysadmin is asking if we have an alternative way to access the DBF files.
Having each user connect to the files is not the best option from a network/security perspective, especially when connecting from home through a VPN.
I've already tried moving the connection to a single server by exposing the data through an API, but that slowed the application down too much. In some situations we synchronise the data through background jobs (a Hangfire implementation), but this can be time-consuming to implement.
Has anybody used any other techniques to do something similar while migrating a VFP application?
OLEDB is still the best option. Within the application, you could impersonate a specific user that has access to the files.
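For completeness, here is a minimal sketch of reading a DBF table through the VFP OLE DB provider; the folder and table names are invented for the example, and it assumes the (32-bit) VFPOLEDB provider is installed (on .NET Core/.NET 5+ the System.Data.OleDb package and a 32-bit process would be needed).

    // Minimal sketch: folder and table names are placeholders.
    using System.Data;
    using System.Data.OleDb;

    static class DbfReader
    {
        public static DataTable LoadCustomers()
        {
            const string connStr = @"Provider=VFPOLEDB.1;Data Source=C:\Data\;";
            using var conn = new OleDbConnection(connStr);
            conn.Open();
            using var adapter = new OleDbDataAdapter("SELECT * FROM customers", conn);
            var table = new DataTable();
            adapter.Fill(table);   // reads customers.dbf into an in-memory DataTable
            return table;
        }
    }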
Sybase Advantage Database Server can also connect to and work with VFP data files. Local mode is (or was) free and server mode is paid. You might try checking that too.
Locate the data on a single PC acting as a server and access it via RDP (kludges are available to support multiple connections). Increase security if needed by connecting over VPN first, then RDP.

Can't save data in SQL Server database, where to place the file?

In my application I need to use a local database (the application I'm creating works with everything locally). I had one database that worked really badly, because sometimes it saved the data and other times it didn't. And when I published the program I couldn't find the database file.
But I am having some trouble deciding where to place the database. I have created one at E:\PAP\Trabalhos\Trabalhos\database.mdf and another at E:\PAP\Trabalhos\Trabalhos\bin\Debug\database.mdf, but with either of those paths the database is recreated/goes back to its previous state when I start the program.
In my connection string I have this:
Data Source=(LocalDB)\MSSQLLocalDB;AttachDbFilename=|DataDirectory|database.mdf; Integrated Security=True
and that points to the file ...\bin\debug\database.mdf
I want to be able to access the database on any computer where I use the program, and to actually be able to save data.
What is the recommended path for the database file so that it can be accessed regardless of the computer I am using?
Should I use Windows authentication or SQL Server authentication?
tl;dr: The database doesn't save data, and I want to be able to access it on any computer without any extra steps.
You can't use "(LocalDB)\" and access it from any computer. LocalDB is by design accessible only from the applications running on the same computer (it is an embedded database).
To access a database over the network you need to install a SQL Server instance, such as a full SQL Server Express instance, or use a cloud service like AWS or Azure.
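To make the difference concrete, here is a small illustrative sketch; the server, instance, and database names (APPSERVER\SQLEXPRESS, MyAppDb) are invented placeholders, not taken from the question.

    // Illustrative connection strings only; all names are placeholders.
    using Microsoft.Data.SqlClient;

    // Single-machine, embedded use: LocalDB attaches the .mdf file directly.
    const string LocalDbConnStr =
        @"Data Source=(LocalDB)\MSSQLLocalDB;AttachDbFilename=|DataDirectory|\database.mdf;Integrated Security=True";

    // Multi-machine use: a SQL Server / SQL Server Express instance reachable over the network.
    // Windows authentication keeps things simple on a domain; SQL authentication works without one.
    const string NetworkConnStr =
        @"Data Source=APPSERVER\SQLEXPRESS;Initial Catalog=MyAppDb;Integrated Security=True";

    using var conn = new SqlConnection(NetworkConnStr);
    conn.Open();   // the same client code works against either connection string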

Detach a SQL Server database from Visual Studio

I'm pretty new to developing in Visual Studio and working with databases.
I am working on a program that reads and writes data to a database that I created with Visual Studio.
I need to work on this project from another computer. Copying over the project files was a breeze, but I ran into issues when it came to copying the .mdf database file.
From my research it seems that, at least in the Microsoft SQL Server program, I would have to "Detach" the database before copying it over to a different computer. So I am assuming I would have to do something similar with my Visual Studio database as well.
Does anyone have any input with regard to this?
If there is not much that I can do, I guess I could recreate all my tables and everything in the Microsoft SQL Server program, so that it would be easier to move the database if needed.
I was in a similar situation such as yourself when I began developing my first core application. You have a few different options including:
Detach an already created database from the hosting SQL Server service and "re-attach" it to another SQL Server service that is accessible from the desired set of hosts. You essentially have to disconnect the database from the service before you are able to transfer or migrate it, since the process holds an exclusive lock on the .mdf file (a rough sketch follows after this answer). https://msdn.microsoft.com/en-us/library/ms190794.aspx
Create the necessary .sql scripts to construct the database and run them in the appropriate order, e.g. create database, create tables, etc., to re-construct the database at the service location. The neat thing about this technique is that if you have already created the database (which it sounds like you have), SQL Server allows you to generate the scripts rather than having to write them yourself. https://technet.microsoft.com/en-us/library/ms178078(v=sql.105).aspx
Finally you may use a subscription based service such as SQL Server through Azure to host the service for controlled global access aka DBaaS (Database as a Service). I can't post anymore links, but look at Microsoft's Azure SQL Server hosting service if you are curious about this option.
The unfortunate part is that you have to decide how much time you would like to invest in this. I began developing the application from scratch, which led me to developing scripts to conjure up the database for deployment purposes. Good luck!
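For the detach/re-attach option, a rough sketch of the T-SQL round trip (run here from C#) might look like the following; the database name and file paths are placeholders, and the .mdf/.ldf files are copied manually between the two steps.

    // Rough sketch only: database name, paths and connection strings are placeholders.
    using Microsoft.Data.SqlClient;

    static class DbMover
    {
        // Detach on the source machine so the .mdf/.ldf files can be copied.
        public static void Detach(string masterConnStr)
        {
            using var conn = new SqlConnection(masterConnStr);
            conn.Open();
            Run(conn, "ALTER DATABASE [MyDb] SET SINGLE_USER WITH ROLLBACK IMMEDIATE");
            Run(conn, "EXEC sp_detach_db 'MyDb'");
        }

        // Attach on the target machine after copying the files over.
        public static void Attach(string masterConnStr)
        {
            using var conn = new SqlConnection(masterConnStr);
            conn.Open();
            Run(conn, @"CREATE DATABASE [MyDb]
                        ON (FILENAME = 'C:\Data\MyDb.mdf'), (FILENAME = 'C:\Data\MyDb_log.ldf')
                        FOR ATTACH");
        }

        static void Run(SqlConnection conn, string sql)
        {
            using var cmd = new SqlCommand(sql, conn);
            cmd.ExecuteNonQuery();
        }
    }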

Implementing a desktop .NET application that can work offline.

I need to create a desktop WPF application in .NET.
The application communicates with a web server, and can work in offline mode when the web server isn't available.
For example the application needs to calculate how much time the user works on a project. The application connects to the server and gets a list of projects, the user selects one project, and presses a button to start timer. The user can later stop the timer. The project start and stop times need to be sent to the server.
How to implement this functionality when the application is in offline mode?
Are there any existing solutions or libraries to simplify this task?
Thanks in advance.
You'll need to do a couple of things differently in order to work offline.
First, you'll need to cache a list of projects. This way, the user doesn't have to go online to get the project list - you can pull it from your local cache when the user is offline.
Secondly, you'll need to save your timing results locally. Once you go online again, you can update the server with all of the historic timing data.
This just requires saving the information locally. You can choose to save it anywhere you wish, and even a simple XML file would suffice for the information you're saving, since it's simple - just a project + a timespan.
It sounds like this is a timing application for business tracking purposes, in which case you'll want to prevent the user from easily changing the data. Personally, I would probably save this in Isolated Storage, and potentially encrypt it.
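A minimal sketch of the local XML file idea described above follows; the type, folder and file names are illustrative only, and for brevity it writes to the local application data folder rather than Isolated Storage and skips encryption.

    // Sketch only: type, folder and file names are placeholders.
    using System;
    using System.Collections.Generic;
    using System.IO;
    using System.Xml.Serialization;

    public class TimeEntry
    {
        public string Project { get; set; }
        public DateTime StartUtc { get; set; }
        public DateTime StopUtc { get; set; }
    }

    public static class OfflineStore
    {
        static readonly string FilePath = Path.Combine(
            Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData),
            "TimeTracker", "pending.xml");

        // Load any entries recorded while offline (empty list if none yet).
        public static List<TimeEntry> Load()
        {
            if (!File.Exists(FilePath)) return new List<TimeEntry>();
            using var stream = File.OpenRead(FilePath);
            return (List<TimeEntry>)new XmlSerializer(typeof(List<TimeEntry>)).Deserialize(stream);
        }

        // Persist the pending entries until the server is reachable again.
        public static void Save(List<TimeEntry> entries)
        {
            Directory.CreateDirectory(Path.GetDirectoryName(FilePath));
            using var stream = File.Create(FilePath);
            new XmlSerializer(typeof(List<TimeEntry>)).Serialize(stream, entries);
        }
    }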
You can use SQL Server Compact for your local storage and then use the Microsoft Sync Framework to sync your local database to the server database. I recommend doing some research on the Microsoft Sync Framework.
Hello all, I implemented this application by creating my own offline framework based on this article and the Microsoft Disconnected Service Agent (DSA).
I've adapted this framework for my needs.
Thank you all.
You can use a typed or untyped DataSet for offline storage.
When online (connected to the internet) you can download the data into a DataSet and upload it back to the database server. The DataSet can be loaded from and saved to a local file.
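A small sketch of that DataSet round trip is below; the connection string, table name and file name are placeholders I invented for the example.

    // Sketch only: connection string, table and file names are placeholders.
    using System.Data;
    using Microsoft.Data.SqlClient;

    var serverConnectionString = @"Data Source=APPSERVER\SQLEXPRESS;Initial Catalog=MyAppDb;Integrated Security=True";

    var ds = new DataSet("OfflineCache");

    // While online: pull the data down into the DataSet.
    using (var conn = new SqlConnection(serverConnectionString))
    using (var adapter = new SqlDataAdapter("SELECT * FROM Projects", conn))
    {
        adapter.Fill(ds, "Projects");   // Fill opens/closes the connection as needed
    }

    // Persist locally, including the schema so column types survive the round trip.
    ds.WriteXml("offline-cache.xml", XmlWriteMode.WriteSchema);

    // Later, offline: reload without touching the server.
    var restored = new DataSet();
    restored.ReadXml("offline-cache.xml", XmlReadMode.ReadSchema);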

Best way to synchronise client database with server database

I have a datalogging application (c#/.net) that logs data to a SQLite database. This database is written to constantly while the application is running. It is also possible for the database to be archived and a new database created once the size of the SQLite database reaches a predefined size.
I'm writing a web application for reporting on the data. My web setup is C#/.NET with SQL Server. Clients will be able to see their own data, gathered online from their instance of my application.
For test purposes, I've written a rough and dirty application that basically reads from the SQLite DB and then injects the data into the SQL Server using SQL; I run the application once to populate the online SQL Server DB.
My application is written in c# and is modular so I could add a process that periodically checks the SQLite DB then transfer new data in batches to my SQL Server.
My question is: if I wanted to continually synchronise the client-side SQLite database(s) with my server while the application is datalogging, what would be the best way of going about this?
Is there any technology/strategy I should be looking into employing here? Any recommended techniques?
Several options come to mind. You can add a timestamp to each table that you want to copy from and then select rows written after the last update. This is fast and will work if you archive the database and start with an empty one.
You can also journal your updates for each table into an XML string that describes the changes and store that into a new table that is treated as a queue.
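A rough sketch of the timestamp approach described above follows; it assumes a log_entries table with a created_utc column stored as ISO-8601 text in the SQLite file and a matching table on SQL Server, and every name here is a placeholder, not from the original post.

    // Sketch only: table, column and connection names are placeholders.
    using System;
    using Microsoft.Data.Sqlite;      // client-side SQLite access
    using Microsoft.Data.SqlClient;   // server-side SQL Server access

    static class Uploader
    {
        public static void PushNewRows(string sqlitePath, string sqlServerConnStr, DateTime lastSyncUtc)
        {
            using var source = new SqliteConnection($"Data Source={sqlitePath}");
            source.Open();

            using var select = source.CreateCommand();
            select.CommandText =
                "SELECT id, value, created_utc FROM log_entries WHERE created_utc > $since";
            // assumes created_utc is stored as ISO-8601 text, so string comparison sorts correctly
            select.Parameters.AddWithValue("$since", lastSyncUtc.ToString("o"));

            using var reader = select.ExecuteReader();

            using var target = new SqlConnection(sqlServerConnStr);
            target.Open();
            using var bulk = new SqlBulkCopy(target) { DestinationTableName = "dbo.log_entries" };
            bulk.WriteToServer(reader);   // streams only the rows written since the last sync
        }
    }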
You could take a look at the Sync Framework. How complex is the schema that you're looking to sync up & is it only one-way or does data need to come back down?
As a simple solution I'd look at exporting the data in some delimited format and then using bcp/BULK INSERT to pull it in to your central server.
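A minimal sketch of the BULK INSERT route, assuming the client has already exported its rows to a tab-delimited file at a path the SQL Server instance can reach; the table and file names are placeholders.

    // Sketch only: table name, file path and connection string are placeholders.
    using Microsoft.Data.SqlClient;

    static class BulkLoader
    {
        public static void Load(string connStr, string dataFilePath)
        {
            using var conn = new SqlConnection(connStr);
            conn.Open();
            // The path must be visible to the SQL Server service, not just to the client.
            using var cmd = new SqlCommand(
                $"BULK INSERT dbo.log_entries FROM '{dataFilePath}' " +
                "WITH (FIELDTERMINATOR = '\\t', ROWTERMINATOR = '\\n')", conn);
            cmd.ExecuteNonQuery();
        }
    }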
You might want to investigate the concept of Log Shipping.
There is an open source project on GitHub, also available on NuGet, called SyncWinR. It implements the Sync Framework Toolkit to enable synchronization with WinRT or Windows Phone 8 and SQLite.
You can access the project from https://github.com/Mimetis/SyncWinRT.
