Single solution with various database management software - C#

I am new to software development, and I have an idea for a web application using C#.NET for the front end and SQL Server for the back end. What happens if my client asks me to use a database management system other than SQL Server? Is there any solution that lets the same application run without changing the SQL code, such as dynamic database creation based on the client's requirements?
Any help is appreciated.

Each database, such as SQL Server or Oracle, has its own syntax and dialect, and they differ in many ways. You cannot simply reuse code written for one DBMS against another.
In most cases basic SQL statements will work across different databases, since SQL is a standard for relational databases, but anything beyond that tends to rely on each product's own dialect and conventions.
There are also different kinds of databases available, such as relational (RDBMS) and NoSQL systems, and each is different in its own way.
There are tools available that can help convert code from one dialect to another.

You need to isolate database access from the rest of the program so that database-specific code lives in one place only.
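One way to keep that isolation, assuming you stay on ADO.NET, is to code against the abstract provider classes and let DbProviderFactories pick the concrete driver from configuration. The class below is only a sketch; the provider names and the Customers table are placeholders:

    using System;
    using System.Data.Common;

    // Sketch of provider-agnostic data access: the provider invariant name
    // (e.g. "System.Data.SqlClient" or "MySql.Data.MySqlClient") and the
    // connection string come from configuration, so switching databases
    // means changing configuration, not this class.
    public class LookupRepository
    {
        private readonly DbProviderFactory _factory;
        private readonly string _connectionString;

        public LookupRepository(string providerInvariantName, string connectionString)
        {
            _factory = DbProviderFactories.GetFactory(providerInvariantName);
            _connectionString = connectionString;
        }

        public int CountCustomers()
        {
            using (DbConnection connection = _factory.CreateConnection())
            {
                connection.ConnectionString = _connectionString;
                using (DbCommand command = connection.CreateCommand())
                {
                    // Keep the SQL as close to ANSI as possible so it runs everywhere.
                    command.CommandText = "SELECT COUNT(*) FROM Customers";
                    connection.Open();
                    return Convert.ToInt32(command.ExecuteScalar());
                }
            }
        }
    }

Anything beyond simple, ANSI-style SQL will still need per-provider handling, which is why wrapping each database behind your own interface (described in an answer further down) is often the more realistic long-term shape.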

To make this easier, it helps to learn a layered pattern such as MVC and keep data access in its own layer. This only works up to a certain level: some functionality will still require database-specific methods to be called, but keeping them in one place improves maintainability.

Related

Does migrating data from SQL Server to SQLite affect the application code?

I have a project I developed with C# Windows Forms using SQL Server as the database, but I want to make the project standalone, without any database server. I found SQLite as an option. Is it possible to migrate my SQL Server database to SQLite without affecting my code? And how do I go about it?
I used Entity Framework Code First to connect to the SQL Server database.
The answer is almost certainly going to be "yes." Depending on a few things, you might have to change very little (or no) code, or you might have to change a lot.
The first consideration is your SQL code. If you were very careful to write ANSI-compliant SQL and you didn't use any of the built-in SQL Server views or T-SQL-specific functions, you may not have to re-write much code at all. In reality, you probably will have to at least write some. In particular, while SQL Server's engine is meant to handle multiple concurrent sessions and queries, SQLite is not: you will need to manage your program carefully to ensure no two threads attempt to access the SQLite database at once.
The second consideration is how your application calls the database. Again, depending on your design, you may need to re-write almost no code, or you may need to re-write a lot. In my C# applications, I create an interface for database providers that defines common functionality (select, delete, insert, etc). Then I create simple wrapper classes for different RDBMS that implement the interface. When I need to switch databases, I simply instantiate and use a different class. If you have your project set up like this, then you'd simply need to create a new class for SQLite that implements your database interface and instantiate that instead of your SQL Server class. If you wrote a lot of SQL Server-specific C# code into your business logic, you might have a lot of coding to do.
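To make that concrete, here is a hedged sketch of the interface-and-wrapper layout; IDatabaseProvider, Customer and the Customers table are invented names, and the SQLite side assumes the System.Data.SQLite package:

    using System.Collections.Generic;
    using System.Data.SqlClient;
    using System.Data.SQLite;

    public class Customer
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    // The interface carries whatever operations the application actually needs.
    public interface IDatabaseProvider
    {
        List<Customer> GetCustomers();
    }

    public class SqlServerProvider : IDatabaseProvider
    {
        private readonly string _connectionString;
        public SqlServerProvider(string connectionString) { _connectionString = connectionString; }

        public List<Customer> GetCustomers()
        {
            var customers = new List<Customer>();
            using (var connection = new SqlConnection(_connectionString))
            using (var command = new SqlCommand("SELECT Id, Name FROM Customers", connection))
            {
                connection.Open();
                using (var reader = command.ExecuteReader())
                    while (reader.Read())
                        customers.Add(new Customer { Id = reader.GetInt32(0), Name = reader.GetString(1) });
            }
            return customers;
        }
    }

    public class SqliteProvider : IDatabaseProvider
    {
        private readonly string _connectionString;
        public SqliteProvider(string connectionString) { _connectionString = connectionString; }

        public List<Customer> GetCustomers()
        {
            var customers = new List<Customer>();
            using (var connection = new SQLiteConnection(_connectionString))
            using (var command = new SQLiteCommand("SELECT Id, Name FROM Customers", connection))
            {
                connection.Open();
                using (var reader = command.ExecuteReader())
                    while (reader.Read())
                        customers.Add(new Customer { Id = reader.GetInt32(0), Name = reader.GetString(1) });
            }
            return customers;
        }
    }

Switching databases then comes down to instantiating a different class, e.g. IDatabaseProvider db = new SqliteProvider("Data Source=app.db");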

C# service to copy data between two SQL Servers

I have two SQL Server instances installed on my computer: SQL Server 2008 Express and the SQL Server 2008 instance that comes with the specific software we are using.
I need to make a service that runs all the time and, at a specific time each day, inserts into the SQL 2008 Express instance any records that exist only in the SQL 2008 instance. Can you suggest a way of doing this?
Currently the best solution I have is making a local copy in an Excel file, but that would result in 365 Excel files per year, which I don't think is a good idea :)
P.S. Sorry if my English is bad :)
You don't have to hand-craft your own software for that. There are 3rd-party tools like OpenDbDiff or RedGate dbdiff that can do it. These tools generate the differential SQL that you can apply to your target database.
I'm confused when you mention Excel. What would Excel have anything to do with moving data from one SQL database to another?
The short answer is, if you need a C# service, then write a C# service that copies the data directly from one database to the other. The problem that you are trying to solve is not very clear.
Having said all that, and with my limited understanding of the problem, it sounds like what you need is a SQL job that is scheduled to run once a day that copies the data from one server to the other. Since it sounds like they are on separate instances, you'll just need to set up a linked server on either the source or destination database and either push or pull the data into the correct table(s).
EDIT:
Ok, so if a windows service is a requirement, that is perfectly acceptable. But, like I mentioned, you should forget about Excel. You wouldn't want to go from SQL->Excel->SQL if you have no other reason for the data to exist in Excel.
Here is some information on creating a windows service:
Easiest language for creating a Windows service
Here is a simple tutorial on accessing SQL in C#: http://www.codeproject.com/Articles/4416/Beginners-guide-to-accessing-SQL-Server-through-C
If you want a more formal solution (read: data access layer), I'd point you toward Entity Framework. The complexity of the project will probably be the driving factor on whether you just want to do SQL statements in your code vs. going with a full blown DAL.
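If you do end up writing the service with plain ADO.NET rather than Entity Framework, the nightly copy step could look roughly like the following. The dbo.Readings table, its columns and the connection strings are invented for illustration, and the row-by-row insert is only meant to show the idea (SqlBulkCopy would scale better for large tables):

    using System;
    using System.Collections.Generic;
    using System.Data.SqlClient;

    public static class ReadingsCopier
    {
        // Copies rows that exist in the source instance but not yet in the
        // destination (Express) instance, matching on the Id column.
        public static void CopyMissingRows(string sourceConnStr, string destConnStr)
        {
            using (var source = new SqlConnection(sourceConnStr))
            using (var dest = new SqlConnection(destConnStr))
            {
                source.Open();
                dest.Open();

                // 1. Remember which keys the destination already has.
                var existing = new HashSet<int>();
                using (var cmd = new SqlCommand("SELECT Id FROM dbo.Readings", dest))
                using (var reader = cmd.ExecuteReader())
                    while (reader.Read())
                        existing.Add(reader.GetInt32(0));

                // 2. Walk the source rows and insert the ones that are missing.
                using (var select = new SqlCommand("SELECT Id, Value, ReadAt FROM dbo.Readings", source))
                using (var reader = select.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        int id = reader.GetInt32(0);
                        if (existing.Contains(id))
                            continue;

                        using (var insert = new SqlCommand(
                            "INSERT INTO dbo.Readings (Id, Value, ReadAt) VALUES (@id, @value, @readAt)", dest))
                        {
                            insert.Parameters.AddWithValue("@id", id);
                            insert.Parameters.AddWithValue("@value", reader.GetDecimal(1));
                            insert.Parameters.AddWithValue("@readAt", reader.GetDateTime(2));
                            insert.ExecuteNonQuery();
                        }
                    }
                }
            }
        }
    }

A Windows service would then just call CopyMissingRows from a timer that fires at the scheduled time.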

Correct solution for persistent table/grid in C# that does not require a full database solution?

My WinForms C#/.NET application requires a table/grid control to display records to the end user. The records will be simple, containing only two fields, a string and a date/time field. I need to persist the data and I am wondering what the most efficient control and storage back-end is to use. The data is non-critical (i.e. - not health or financial records, or anything sensitive requiring extensive safety or any encryption).
One solution I have found so far is the DataGrid control in conjunction with SQL Server Compact Edition. I learned about this solution from this tutorial:
http://www.dotnetperls.com/datagridview-tutorial
It seems though that this may be overkill for my application. In addition, I am worried about the complexities of installing SQL Server CE, especially when it comes to admin vs. user account privilege issues during installation:
http://msdn.microsoft.com/en-us/library/aa983326(v=vs.80).aspx
Is there a table or grid control with built-in file load/save capabilities that uses a simple disk file as the storage method, perhaps a comma-delimited ASCII file? I'd like something that I can still interface with using SQL (via LINQ). Also, I am hoping that this can be done transparently: if I later want to upgrade to a SQL database engine, the code on my end that interfaces with the data would not change (except perhaps the database open/create code, of course).
Or am I better off simply biting the bullet and going with SQL Server CE or perhaps SQLite:
Good embedded database solution (like SQLite) for .Net
If you have any caveats or anecdotes regarding installation issues and ease of use, they would be appreciated.
In my projects, we use object data sources. Grids can be bound to collections of objects just as easily as they can be to DataTables. You can store and restore the data using a simple serialization engine (XmlSerializer is rather easy to use). Make a basic object, use List<T> or BindingList<T> as the data set, and serialize/deserialize it in the back end when you need it.
List<T> and BindingList<T> both support LINQ queries.
Adding a database save later is as simple as writing code that saves the objects to the database in place of the serialization code, with no change to the front end at all.
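A minimal sketch of that pattern, assuming a made-up Note class and a notes.xml file next to the executable:

    using System;
    using System.Collections.Generic;
    using System.ComponentModel;
    using System.IO;
    using System.Xml.Serialization;

    // The two-field record shown in the grid: a string and a date/time.
    public class Note
    {
        public string Text { get; set; }
        public DateTime CreatedAt { get; set; }
    }

    public static class NoteStore
    {
        private static readonly XmlSerializer Serializer = new XmlSerializer(typeof(List<Note>));

        public static BindingList<Note> Load(string path)
        {
            if (!File.Exists(path))
                return new BindingList<Note>();

            using (var stream = File.OpenRead(path))
                return new BindingList<Note>((List<Note>)Serializer.Deserialize(stream));
        }

        public static void Save(string path, BindingList<Note> notes)
        {
            using (var stream = File.Create(path))
                Serializer.Serialize(stream, new List<Note>(notes));
        }
    }

    // In the form:
    //   var notes = NoteStore.Load("notes.xml");
    //   dataGridView1.DataSource = notes;      // edits in the grid update the list
    //   ...
    //   NoteStore.Save("notes.xml", notes);    // persist on close
    // LINQ works directly on the list, e.g. notes.Where(n => n.CreatedAt > DateTime.Today).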
As far as a "correct" solution is concerned, there are so many different ways to do it that it boils down to personal preference, and possibly to actual requirements and expected future development. I find it easier to code using objects because the data manipulation is easier, but if you are going for straight record entry with no data manipulation required, going directly to a database is easier. It just depends on the data and what you plan on doing with it.
I strongly recommend using an embedded database, because it will make it easier to move to a full database in the near future. SQL Server CE is a good option, and if you want to go big you can move to a full SQL Server database with minimal changes to your code. The only downside of SQL Server CE is that you need to install it and it requires .NET Framework 4; aside from that, I don't see a big problem with it.
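For comparison, this is roughly what SQL Server CE access looks like in code (the .sdf path and Notes table are made up); the API mirrors System.Data.SqlClient closely, which is what keeps a later move to full SQL Server cheap:

    using System.Data.SqlServerCe;

    public static class NoteCounter
    {
        // Counts rows in a local .sdf database file, e.g. "notes.sdf".
        public static int CountNotes(string sdfPath)
        {
            using (var connection = new SqlCeConnection("Data Source=" + sdfPath))
            using (var command = new SqlCeCommand("SELECT COUNT(*) FROM Notes", connection))
            {
                connection.Open();
                return (int)command.ExecuteScalar();
            }
        }
    }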

Legacy MySQL database mapping to a good .NET ORM for system migration

This is quite a long one, but I'd very much appreciate your thoughts and suggestions.
We are busy rebuilding a legacy system which was written in PHP and MySQL and replacing its components with ASP.NET MVC in C# and SQL Server. The legacy architecture leaves much to be desired: there are serious issues with spaghetti code, no referential integrity in the DB, unused code and database fields, and just generally bad coding.
As much as I'd love to, we can't just rip out all of the old code and replace it. The company needs to stay functional during the development process, so we will need to build new functionality while using the old databases to ensure that their data is accurate at all times. The level of data accuracy isn't real-time, but if we had 2 systems, they would have to be in sync 100% of the time. The old system uses 6 different MySQL databases, all on the same server, running Linux. We will be running Windows 2008 R2 on the new server for the new system and we are planning to use the latest version of SQL Server.
The problem I'm having to solve is: I need to somehow map all of these databases into a consolidated model that we can use through C# to develop the new system on. Once we have moved all the functionality over to C#, we need to port the data into a DB that matches our code model. This DB will be running on SQL Server. I'm not too worried about the migration just yet; my current issue is finding an ORM tool that will allow me to map these 6 MySQL databases into a single, well planned out and designed model that we can use for the new development.
The new model might have additional fields that we would have to store in a new MySQL database until we port the data across at some stage, so the ORM should support easily building entities that span multiple tables and databases.
Is what I'm trying to do possible? Is it viable in terms of effort? Is there an ORM that can do all of this? And what other way is there to maintain the company's operational capacity while actively developing the system?
I have looked at these ORM options:
SubSonic (great, but I think too lightweight for what we are trying)
Entity Framework (looks like I might be able to use this if I use very dirty models with tons of stored procedures for inserts, updates and deletes)
NHibernate (the client does not want us to use this due to bad experiences in the past)
LLBLGen (seems like it can do what we need it to, but long term support could be a concern with the client)
Anything else I should look at? Is there a different approach I could try?
ORMs aren't designed to solve the problem you have. That said, a quality ORM will get you some percentage of the way toward a solution.
NHibernate is the easy choice. LLBLGen would be my second choice. I wouldn't even bother with EF or SubSonic as they are very feature poor compared to the other two and you need decent feature support in your scenario.
You'll likely have to invest a lot of time in writing custom code around your migration requirements. Your use case is not a standard, well traveled path.
For Entity Framework: if you're prepared to maintain one complete set of stored procedures with a static interface (i.e. same signature) you could implement them all in Transact-SQL on the SQL Server box, with linked servers (to the MySQL farm).
When the time comes, you could migrate the data into SQL Server and update your stored procedures.
Basically, design a nice model with nice stored procedures, and as a temporary solution implement any ugliness inside the stored procedures. Once MySQL is out of the way, you can replace the stored procedures with better ones.
SQL Server has a tendency to retrieve the entire remote table when you're running queries against a linked server, so if performance is a concern it might eventuate that all your stored procedures are wrappers around OPENROWSET (see Example A for running a query on a remote server).
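On the C# side, the pay-off of keeping the stored-procedure signatures static is that the application only ever calls a procedure by name, so whether it currently reads from the linked MySQL servers or from native SQL Server tables is invisible to the caller. A rough sketch (the procedure name and parameter are invented):

    using System.Data;
    using System.Data.SqlClient;

    public static class OrderQueries
    {
        // The calling code never changes; only dbo.GetCustomerOrders does,
        // first querying the linked MySQL databases, later the migrated tables.
        public static DataTable GetCustomerOrders(string connectionString, int customerId)
        {
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand("dbo.GetCustomerOrders", connection))
            {
                command.CommandType = CommandType.StoredProcedure;
                command.Parameters.AddWithValue("@CustomerId", customerId);

                var table = new DataTable();
                using (var adapter = new SqlDataAdapter(command))
                    adapter.Fill(table);
                return table;
            }
        }
    }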

RavenDB - synchronize with Sql Server DB

I was thinking about using RavenDB for some of the look-up scenarios in a high-throughput application. This would replace all of the look-up calls I currently make to the DB to get things like site location, etc. I am really looking at a couple of options (including .NET caching). I know that you can replicate indexes from RavenDB to SQL Server, but I am wondering if anyone has done the reverse, keeping RavenDB in sync with SQL Server?
Any suggestions / comments would be appreciated.
--S
I've done a similar scenario where data needed to be transferred in batch from a SQL Server system nightly into our RavenDB instance.
I couldn't find an off-the-shelf tool to do what I wanted, since you typically want to model the data you give RavenDB differently from the way it is modelled in SQL Server.
I wrote a custom console app that put the data into my RavenDB instance.
For example my console app:
Compacted several relationships into one document
Dealt with the different datatypes
TLDR: I wrote my own console app as I couldn't find a generic product that could do it.
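For what it's worth, a stripped-down sketch of that kind of console app, written against the older Raven.Client.Document API; the Site class, Sites table and database name are invented, and a real importer would also fold related tables into each document:

    using System.Data.SqlClient;
    using Raven.Client.Document;

    // One relational row becomes one RavenDB document.
    public class Site
    {
        public string Id { get; set; }
        public string Name { get; set; }
        public string Location { get; set; }
    }

    public static class SiteImporter
    {
        public static void Run(string sqlConnStr, string ravenUrl)
        {
            using (var store = new DocumentStore { Url = ravenUrl, DefaultDatabase = "Lookups" })
            {
                store.Initialize();

                using (var sql = new SqlConnection(sqlConnStr))
                using (var session = store.OpenSession())
                {
                    sql.Open();
                    using (var cmd = new SqlCommand("SELECT Id, Name, Location FROM dbo.Sites", sql))
                    using (var reader = cmd.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            session.Store(new Site
                            {
                                Id = "sites/" + reader.GetInt32(0),
                                Name = reader.GetString(1),
                                Location = reader.GetString(2)
                            });
                        }
                    }
                    session.SaveChanges();
                }
            }
        }
    }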
So far the only available solution is to write your own sync process.
I was looking for ways to improve search scenarios using RavenDB, where RavenDB would be filled from my SQL Server relational database.
I think there should be a better way; however, the only one I can think of right now is an ETL process that keeps updating the NoSQL version of your structured data.
