Now, I am planning to have 1 main computer and 2 client computers in the same domain. I want to run a Forms application on the main computer that uses a SQLite database. Then I want to query some data on the main computer from the client computers. What would be your suggestions regarding these two questions of mine:
What is the best way to implement this server-client structure so the computers can communicate?
What is the best way to get a big DataTable from the main computer that uses SQLite?
I am using .NET Framework 4.0 for the Forms applications.
You may share the directory, either via NetBIOS (a.k.a. Samba, for Linux users) or NFS.
However, this is not a good idea: since SQLite locks the file, this may break your implementation if the file system misbehaves for whatever reason. You might want to use a real distributed architecture instead, which helps with concurrency and load balancing.
Another way would be to keep SQLite but proxy it through a web service written by you specifically for this. It would serve the requests and run on the same server that holds the SQLite file, so you can avoid sharing the file/directory containing the database.
You could set up a WCF service that serializes and deserializes commonly shared model/domain objects. WCF sends them across the wire.
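For instance, here is a minimal sketch of such a WCF service that runs on the main computer and proxies the SQLite file; the service and table names, the DTO, and the use of the System.Data.SQLite provider are assumptions to be adapted to your schema:

    // Minimal sketch of a WCF service that proxies a SQLite database.
    // Assumes the System.Data.SQLite ADO.NET provider and a hypothetical
    // "Customers" table; adjust names and the connection string to your schema.
    using System.Collections.Generic;
    using System.Data.SQLite;
    using System.Runtime.Serialization;
    using System.ServiceModel;

    [DataContract]
    public class Customer
    {
        [DataMember] public int Id { get; set; }
        [DataMember] public string Name { get; set; }
    }

    [ServiceContract]
    public interface IDataService
    {
        [OperationContract]
        List<Customer> GetCustomers();
    }

    public class DataService : IDataService
    {
        private const string ConnStr = @"Data Source=C:\Data\MyDB.db3;Version=3;";

        public List<Customer> GetCustomers()
        {
            var result = new List<Customer>();
            using (var cn = new SQLiteConnection(ConnStr))
            using (var cmd = new SQLiteCommand("SELECT Id, Name FROM Customers", cn))
            {
                cn.Open();
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        result.Add(new Customer
                        {
                            Id = reader.GetInt32(0),
                            Name = reader.GetString(1)
                        });
                    }
                }
            }
            return result;
        }
    }

You would host this on the main computer (for example in a Windows service or a console host via ServiceHost with basicHttpBinding) and let the clients generate a proxy with "Add Service Reference"; the clients then never touch the SQLite file directly.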
I am developing a UWP application (Windows Phone 10) and I have a SQLite database in a shared folder on a PC in my LAN. I would like to know if I can use this database in the Windows Phone app like I do with my WPF application, where I can set the path of the database and use it from any computer in my LAN.
Thanks.
Real Path:
\\192.168.1.102\Database\MyDB.db3
USE THIS Connection String:
Data Source='//192.168.1.102/Database/MyDB.db3';Version=3
cnStr.Replace("\\", "/") on Windows (the backslash has to be escaped in the C# string literal)
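For a desktop (WPF/WinForms) client, a minimal sketch of using that connection string with the System.Data.SQLite provider might look like this; the share path is the one from the question, and the provider choice is an assumption:

    // Sketch: opening the shared database with System.Data.SQLite (assumed provider).
    // Note the doubled backslashes required in a regular C# string literal.
    using System.Data.SQLite;

    public static class SharedDb
    {
        public static SQLiteConnection Open()
        {
            string path = @"\\192.168.1.102\Database\MyDB.db3";
            string cnStr = "Data Source='" + path.Replace("\\", "/") + "';Version=3";
            var cn = new SQLiteConnection(cnStr);
            cn.Open();
            return cn;
        }
    }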
Short answer:
Yes, you can; however, you will need to be careful about the journaling mode you choose. For example, WAL does not work over a network file system.
Long answer:
If you see yourself in a situation where many clients/programs need to access a common database over a network, you should consider a client/server database, or provide an API of some sort that would sequentially persist the clients' data to the SQLite DB.
For more info see Appropriate Uses of SQLite
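Regarding the journaling-mode caveat in the short answer above, here is a small hedged sketch (assuming the System.Data.SQLite provider) of forcing a non-WAL journal mode before putting the file on a network share:

    // Sketch: make sure the database is not in WAL mode before sharing it over a
    // network file system (System.Data.SQLite assumed as the provider).
    using (var cn = new System.Data.SQLite.SQLiteConnection("Data Source=MyDB.db3;Version=3;"))
    using (var cmd = new System.Data.SQLite.SQLiteCommand("PRAGMA journal_mode=DELETE;", cn))
    {
        cn.Open();
        cmd.ExecuteScalar(); // returns the journal mode actually in effect
    }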
As far as I remember, it is not recommended by the SQLite developers, since file locking is very unreliable in such settings (networked file systems).
From the FAQ:
You should avoid putting SQLite database files on NFS if multiple processes might try to access the file at the same time. On Windows, Microsoft's documentation says that locking may not work under FAT filesystems if you are not running the Share.exe daemon. People who have a lot of experience with Windows tell me that file locking of network files is very buggy and is not dependable. If what they say is true, sharing an SQLite database between two or more Windows machines might cause unexpected problems.
But if you intend to use it from only one process at a time (no concurrency involved), it should be fine.
You can't just arbitrarily open a file from a UWP app; you need to go through the RuntimeBroker, and the only mechanism that can talk to the RuntimeBroker is the StorageFile API, which in turn knows nothing about shared folders in a LAN environment.
To access a shared database (SQLite or otherwise), your best option is to host it on some form of server and build an API to interface with it (be it SOAP, REST, etc.). You could still use a local cache on each client for disconnected usage, but you would need to handle the replication yourself.
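As a rough sketch of that approach, the UWP client would talk to a hypothetical REST endpoint instead of opening the file directly; the URL and the JSON handling are placeholders:

    // Sketch: UWP client reading from a hypothetical REST endpoint instead of
    // opening the SQLite file directly. The URL below is an assumption.
    using System.Net.Http;
    using System.Threading.Tasks;

    public static class ApiClient
    {
        private static readonly HttpClient Http = new HttpClient();

        public static async Task<string> GetCustomersJsonAsync()
        {
            // Returns the raw JSON; deserialize it with your preferred library.
            return await Http.GetStringAsync("http://192.168.1.102:8080/api/customers");
        }
    }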
I have an application that has two main parts. First, the client, which is basically the user interface; second, a repository, which is a library that connects to the database and contains all the logic to insert, update, delete... and ensures the coherence of the data.
The application is not deployed yet, and at the moment the client uses the repository directly to access the database. But when I have to deploy the application to be used by many users inside the LAN, I think this is not the best solution.
First solution
Install the client and the repository on all the computers of the users that need the application.
This has the disadvantage that when I update the application, I have to update many installations, and perhaps not all of them get updated for one reason or another. So if the update fixes a problem in the repository, a client that has not been updated could still introduce inconsistent data into the database, which is exactly the kind of problem the fix was meant to correct.
Second solution
The client still uses the repository directly, but the application is installed on a network drive. There is only one installation, so if I need to update the application, I only have to do it once.
The application is not very big, about 12 MB, but it could be a bit slow because it has to travel over the network from the server to the user's computer. So some users might copy the application to their local computer, and then I can't guarantee that the problem from the first solution won't happen.
Third solution
The client application does not use the repository directly; the repository lives on the server, the client uses WCF to communicate with the server, and the server uses the repository to access the database.
The disadvantage is that the server has to run the repository, so if there are many clients connected it needs a lot of RAM, whereas if the users' computers run the application locally, the memory is needed on the local machines instead.
In summary, when I have to deploy this kind of application, which is the best solution, or which solution would you use in your projects?
Thank you so much.
This really depends on your deployment method. Are you using ClickOnce to deploy it? If so, you could keep the data local to each PC and avoid the RAM issue, and when you send out a new update, change the required version number and set the application to check for updates before running; that way users will be unable to run the program without updating it. The problem is that they must have network access, but this would also be an issue with remote data. In this situation you would only need network access during the update; I'm not sure whether that would be an issue for you or not.
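If you want to trigger the check in code as well, a rough sketch using System.Deployment.Application is shown below; note that the forced update described above is normally configured in the ClickOnce publish settings, so this is only an illustration:

    // Sketch: programmatic ClickOnce update check (requires a reference to
    // System.Deployment). The "minimum required version" publish setting
    // normally enforces this automatically; this is just an illustration.
    using System.Deployment.Application;

    public static class Updater
    {
        public static void UpdateIfAvailable()
        {
            if (!ApplicationDeployment.IsNetworkDeployed)
                return; // running outside ClickOnce (e.g. from the IDE)

            var deployment = ApplicationDeployment.CurrentDeployment;
            if (deployment.CheckForUpdate())
            {
                deployment.Update();
                // WinForms restart shown here; use the restart mechanism of your UI stack.
                System.Windows.Forms.Application.Restart();
            }
        }
    }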
I am planning on creating a student information system where multiple computers can access the same server through a LAN connection. The server will have a database with the students' information. I pretty much have a background in creating a program similar to this, but I was only able to make it for one computer. I don't know how to connect that database to another computer. For instance:
Teacher A uses Computer 1 to access the Student Information System and Teacher B uses Computer 2 to access it as well. Any modification done by Teacher A will be updated in the database and will be visible to Teacher B as well.
My other concern is whether I should make it web-based or not. I only have experience with using WPF applications for this kind of system, but someone told me that it's easier to use a web-based application for this kind of networked setup. Can you tell me the advantages and disadvantages of a web-based application versus its desktop counterpart?
To summarize:
How can I connect multiple computers to access a single database system?
Which is better to use: a web-based application or a desktop application?
Much appreciated.
A couple of points:
Set up one machine as the server where your database will reside. All the other machines will point to this server and database over the LAN.
You can build either a web-based or a desktop application. I would prefer a web-based application so that in the future you can extend the system to be accessed outside the college premises.
Both web-based and desktop applications can work for you in this case. For the database you can use SQL Server 2008 and share it over the LAN so that it can be accessed from all the computers in the network.
You must create one server machine for the database, which will be accessed by the other client machines on your LAN.
It is quite easy: you just need to put the IP address of the SQL Server machine in the connection string instead of ".", and everything else stays the same.
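For example (a hedged sketch; the server IP, database name and authentication mode are placeholders):

    // Sketch: the same code works locally and over the LAN; only the Data Source
    // changes. The server IP and database name below are placeholders.
    using System.Data.SqlClient;

    public static class Db
    {
        // Local:  "Data Source=.;Initial Catalog=StudentDB;Integrated Security=True"
        // Remote: point Data Source at the server's IP or host name instead of "."
        private const string ConnStr =
            "Data Source=192.168.1.10;Initial Catalog=StudentDB;Integrated Security=True";

        public static SqlConnection Open()
        {
            var cn = new SqlConnection(ConnStr);
            cn.Open();
            return cn;
        }
    }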
You have more than one option for making the database usable from different locations:
1- Regarding the desktop application, we can set up the application on a terminal server (considering the server configuration and how many users will access it), and the users access the server through Terminal Services (the application will access only one database for all users in different locations).
2- The desktop application can run in several locations, each accessing a database at its own location; by using database replication we will have up-to-date data in all locations.
3- An application that the users access over the internet through a web browser running on the client's machine is called a web application; the application will access only one database for all users.
4- Adding a web service to the desktop application allows different machines to interact with each other over a network.
The question is which is the best solution?
Option (1) is the easiest one because we don't need to change anything in the application, but if the internet connection goes down no one can access the application, and with a bad connection we will face many complaints from the users.
Option (2) is good because we don't need to change anything in the application either; the database servers will replicate the information between each other and keep it up to date in all locations.
I prefer option (4) when designing a new application.
I prefer option (2) when the application is old and cannot be modified, or when modifying it would take a lot of time.
I really don't like the concept of opening my SQL server(s) to the internet - even if I can lock down the firewall. However I've always been working directly with databases. I'm building a system now which involves 1 SQL Server database, a web application in ASP.NET/C#, and a few windows applications in Delphi XE2. But from the beginning, I'd like to put some sort of 'filter' around the database so I don't have to open it up.
I know there are many things out there for this, but don't know anything about them or what to get for my scenario. I'd like to keep it native to SQL Server; I don't plan on using any other type of database engine.
The client needs to connect to the server by some means other than a standard SQL connection, acting like a filter: it creates its own encrypted packets and transfers the data its own way. I will have a wrapper class for both Delphi and C# which will be pretty much identical, and which will be able to stream its data into the DLL to interact with the DB.
Now there are three different ways I can go about this...
Complete SQL Server wrapper, most likely with no source code, that might even have its own language (I don't want to pick up another database language), and that is independent of my project as its own separate system.
Open-source wrapper, preferably in Delphi (XE2), or if not then C#, specific protocols for my system, entirely dedicated to my project, and in the final form of a DLL which can be used on both the Website (in C#) and the applications (in Delphi).
Web service - however, I only have 1 hosting spot (paying for 1 site; a 2nd site would be a double charge on me). I can't host any additional web services or Windows services; it has to be integrated with the website. Otherwise, I would have done a web service for this.
I would much prefer the second option, and do not want to go anywhere close to the first one, and can't do the third one at all.
So any good libraries for database layers? And might there be some already installed in Delphi XE2? I'm thinking maybe an encrypted XML packet?
As an example, let's say I have a table for 'Customers'. In both my website and applications, I should never have any SQL like select * from Customers, or indeed any SQL at all. Instead, I will have a wrapper around the database, so I can call a function such as DBGetCustomers(Conditions: TGetCustomersConditions): TDBCustomers; where TGetCustomersConditions is some way of filtering the query and TDBCustomers represents the results of the query.
There could also be a function DBAddCustomer(Item: TCustomerToAdd): TInsertSuccess; where TCustomerToAdd represents what to insert, and TInsertSuccess represents any result, such as error message(s) or rows affected. I do not intend for it to work exactly like this; this is just to explain the concept of a wrapper in general. When the app sends the request to the server, nothing has been converted into a SQL query yet. Only when the request reaches the server (which is able to connect to the database) does the server alone decode everything into the SQL query.
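For what it's worth, a rough C# sketch of what such a wrapper contract could look like on the client side; the type names mirror the Delphi-style ones above, and the fields and the transport behind the interface are assumptions:

    // Rough C# counterpart of the wrapper described above. The types and the
    // transport (WCF, custom DLL, ...) are placeholders for whatever is chosen;
    // the point is that no SQL ever appears on the client side.
    using System.Collections.Generic;

    public class GetCustomersConditions
    {
        public string NameFilter { get; set; }   // hypothetical filter field
    }

    public class Customer
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public class AddCustomerResult
    {
        public int RowsAffected { get; set; }
        public string ErrorMessage { get; set; }
    }

    public interface IDbGateway
    {
        List<Customer> DBGetCustomers(GetCustomersConditions conditions);
        AddCustomerResult DBAddCustomer(Customer item);
    }
    // The server-side implementation of IDbGateway is the only place that
    // builds and executes the actual SQL.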
What's the problem even if you have just one "hosting spot"? A web service is just a "site", and a web server can easily host multiple sites even with a single IP address. Anyway, what you're looking for is an "application server" in a multi-tier design.
While Java invested heavily in that direction, MS did not. Delphi has DataSnap, which is a so-so framework; I don't know if the new "RESTful" interface is easily callable from C#, and it looks to have security flaws anyway. The .NET way of doing it is WCF; as long as it uses a standard protocol you can call it from Delphi as well.
You could also look at RemObjects DataAbstract. It's not open source, but it is a mature library.
One of the 'traditional' ways to do this is via webservices (although this technique is now considered by some to be a bit dated).
One disadvantage is that it is not a generic wrapper you can throw around the database, but it has the advantage that you can easily limit access to specific stored procedures, for example, which maximises security and is a standard technique when you wish to provide limited, authorised access to external applications.
If you already use an ASP.NET C# web application, you could also add a WCF (Windows Communication Foundation) based web service to your web site. This can provide database access to external applications, which need to connect to the web service somehow. It should even be possible to use the same standard HTTP port for both the normal web site pages and the service, by mapping the web service to a specific context path such as www.example.com/services/servicename
kbmMW allows you to build an n-tier DB architecture.
However, since you have a web application involved, a better option would be WCF (as already suggested).
I am writing a plugin for an application in C#. The plugin allows me full access to the internal information model for the application.
I would like to create a mechanism to allow external applications to be able to connect to the information so they can report on it etc.
In days of old this used to be achieved via ODBC links; is that still the way to go?
I assume it's a significant task to create an ODBC driver for this; are there any easier recommendations or example C# code for creating a driver?
Looking back, I was not very clear in the original question. The requirement is to allow two applications on the same PC to share data. The "host" application uses a proprietary storage format, and as such the data cannot be accessed without using the "host" application. The "host" application allows the development of plugins (using C#), and the plugins have access to all of the data within the application. On that basis I was exploring whether a plugin could expose an interface to another external application and thereby act as a "data access layer".
My reference to ODBC is probably a "red herring" - just shows how out of touch I am in this area.
You are probably looking for something like Remoting and/or web services and/or the more modern WCF (Windows Communication Foundation).
You can write your own services and access those services from any language you want.
C# support for WCF, Remoting and web services is very good and allows you to write your server-client infrastructure in a very clean, object-oriented and easy way.
Using HTTP, each service call is handled as a serialized object sent as XML through an HTTP server, for example IIS.
Clients can be written in any language you want, from PHP to C# to C++ to Java to whatever; they only need to connect over HTTP and parse/serialize/deserialize XML.
You can choose your architecture. If both clients and servers are written in C#, all of this is transparent to you: XML serialization and deserialization, remote procedure calls and IIS integration are all ready for you to use. You only need to write your applications.
You can export services instead of tables the way a relational DBMS does; in this way you can separate the logic of your system from the data layer and the presentation layer.
In this way you gain scalability and multi-platform, multi-system support.
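As a hedged illustration of the C#-to-C# case, here is a minimal sketch of a client consuming a WCF service over HTTP with ChannelFactory; the contract, binding and address are placeholders (in practice you can also let Visual Studio generate the proxy with "Add Service Reference"):

    // Sketch: consuming a WCF service from a C# client with ChannelFactory.
    // The contract, binding and address below are placeholders.
    using System.ServiceModel;

    [ServiceContract]
    public interface IStudentService
    {
        [OperationContract]
        string GetStudentName(int studentId);
    }

    public static class Client
    {
        public static string GetStudentName(int id)
        {
            var factory = new ChannelFactory<IStudentService>(
                new BasicHttpBinding(),
                new EndpointAddress("http://server-name/services/StudentService.svc"));
            IStudentService proxy = factory.CreateChannel();
            try
            {
                return proxy.GetStudentName(id);
            }
            finally
            {
                ((ICommunicationObject)proxy).Close();
            }
        }
    }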
Some links to read:
http://en.wikipedia.org/wiki/Windows_Communication_Foundation
http://www.codeproject.com/KB/webservices/myservice.aspx
http://msdn.microsoft.com/en-us/library/aa730857(v=vs.80).aspx
http://msdn.microsoft.com/en-us/library/kwdt6w2k(v=vs.71).aspx
http://blogs.microsoft.co.il/blogs/bursteg/archive/2008/02/10/how-to-build-an-n-tier-application-with-wcf-and-datasets-in-visual-studio-2008.aspx
http://msmvps.com/blogs/williamryan/archive/2008/05/16/doing-tiers-with-wcf.aspx
Instead, if you are on an intranet, for example, or on a single computer, and you just want to share a DB service, you can simply use SQL Server, MySQL or PostgreSQL and connect to it via TCP/IP.
It is not safe/secure, however, to expose a DB service on the internet, or on an intranet where security can be a problem.
Note also that SQL Server Express is free and may be suitable for you if you don't have many users/connections and your DB is not bigger than 4 GB.
MySQL and PostgreSQL are free and open source.