I have a problem using different databases in an MVC application with Entity Framework 6. Our client will use a database of their own, which could be a MySQL, PostgreSQL or Oracle database.
I have made an .edmx file for MySQL and one for PostgreSQL; these models work individually, but if possible I want to use only one model for all databases. The databases will have identical tables and columns.
So I want to make something like this:
using (var connection = new DbConnection())
{
string id = connection.Set<user>().First().Id;
}
The DbConnection has to be a connection to the right database (you can tell which database to use from a resource file).
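As a hedged sketch of what that could look like with the provider-independent ADO.NET APIs (the context class MyContext, the resource names, and the provider invariant name are assumptions, not part of the question):

```csharp
// Sketch only: resolve the ADO.NET provider at runtime (the invariant name
// would come from your resource file), then hand the open connection to a
// single DbContext over the one shared model.
string providerName = Resources.DatabaseProvider; // e.g. "MySql.Data.MySqlClient"
DbProviderFactory factory = DbProviderFactories.GetFactory(providerName);

using (DbConnection connection = factory.CreateConnection())
{
    connection.ConnectionString = Resources.ConnectionString;
    using (var context = new MyContext(connection, contextOwnsConnection: false))
    {
        string id = context.Set<user>().First().Id;
    }
}
```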
I am encountering the following problem with a database first approach:
I have read http://www.codeproject.com/Articles/82017/Preparing-an-Entity-Framework-model-for-multi-prov.aspx, and when I followed the instructions I got a
MetadataException: Unable to load the specified metadata resource.
I have tried everything to find the .ssdl file but I just can't find it.
Am I doing it the right way or does anyone know if there is a better way to do this?
I tried doing exactly what you are trying to do, but with MS SQL and VistaDB. I followed that tutorial and ran into the same problem.
By following the tutorial, you will end up with a separate .ssdl file for each DB. Make sure you have set it to copy to the output directory and then update your connection string accordingly.
To set it to copy to the output folder, right-click on the .ssdl file in Solution Explorer and change "Copy to Output Directory" to "Copy if newer".
Then change the metadata part of your connection string to something like this:
metadata=res://*/DataModel.csdl|c:\dev\program\bin\debug\DataModel.VistaDB.ssdl|res://*/DataModel.msl;
Notice that the path to the .ssdl is a file-system path. The path I have here is the one I used; you will need to change it accordingly.
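For comparison, a full connection-string entry with a file-based .ssdl might look like this (illustrative only; substitute your own provider name and provider connection string for the elided parts):

```xml
<connectionStrings>
  <!-- The csdl and msl stay embedded as resources; only the
       provider-specific ssdl is loaded from disk. -->
  <add name="MyEntities"
       connectionString="metadata=res://*/DataModel.csdl|c:\dev\program\bin\debug\DataModel.VistaDB.ssdl|res://*/DataModel.msl;provider=...;provider connection string=&quot;...&quot;"
       providerName="System.Data.EntityClient" />
</connectionStrings>
```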
Related
I'm working on an SSIS 2008 package in which I need to read a flat file so that I can access its content (three directory paths) and store those paths in variables.
The flat file lives on a different server for each instance I work on (dev, QA, production), so I can't just hard-code the path into a variable, because I'd have to rewrite that value every time I deploy the solution to a different instance.
Something I've tried in the past is reading a flat file using Directory.GetCurrentDirectory(), but I couldn't debug that, and running the package with F5 in VS2008 didn't work (I've read that it doesn't work inside VS but works fine once the package is deployed, but I have no way to prove that except by trying).
So I figured that if I could read the path saved in the flat file connection manager and store it in a string variable, I could modify the connection-string value in the .config file once the package is deployed, and read the file's contents like a normal flat file.
My problem is that I can't figure out how to read the connection-string value, and I couldn't find anything online that pointed me in the right direction.
Thanks in advance.
To access connection manager information from a Script task you can use the Dts.Connections property; just declare a string variable and read the ConnectionString property:
string cs = Dts.Connections["myFlatFileConnection"].ConnectionString;
Reference:
According to this Microsoft Docs article:
Connection managers provide access to data sources that have been configured in the package.
The Script task can access these connection managers through the Connections property of the Dts object. Each connection manager in the Connections collection stores information about how to connect to the underlying data source.
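Put together, a minimal Script task Main might look like the sketch below; the connection manager name "myFlatFileConnection" and the variable "User::FilePath" are assumed names, not part of the question:

```csharp
public void Main()
{
    // For a flat-file connection manager, ConnectionString holds the file path.
    string path = Dts.Connections["myFlatFileConnection"].ConnectionString;

    // Hand the path to a package variable so other tasks can use it
    // (the variable must be listed in the task's ReadWriteVariables).
    Dts.Variables["User::FilePath"].Value = path;

    Dts.TaskResult = (int)ScriptResults.Success;
}
```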
You'd want something like a C# Script task. You can modify the connection string dynamically there. Within the script you'd modify the value of (if I recall correctly) Dts.Connections["YourConnection"].ConnectionString.
Since nothing seemed to work, I ended up doing the following:
Inserted the values I needed into a parameters table in the database,
created an Execute SQL Task,
and assigned the results of that task to the variables.
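The steps above can be sketched as follows; the table and column names are illustrative, not the poster's actual schema:

```sql
-- Illustrative parameters table: one row per environment-specific setting.
CREATE TABLE dbo.PackageParameters
(
    ParameterName  varchar(50)  NOT NULL PRIMARY KEY,
    ParameterValue varchar(260) NOT NULL
);

INSERT INTO dbo.PackageParameters (ParameterName, ParameterValue)
VALUES ('InputPath1', '\\server\share\input1');

-- The Execute SQL Task runs this query and maps the single-row
-- result set to a package variable on its Result Set page.
SELECT ParameterValue
FROM dbo.PackageParameters
WHERE ParameterName = 'InputPath1';
```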
It took me all day but I finally got it.
I followed this thread for references.
I am now involved in a small C# WinForms project (just 2-3 select statements),
and the project needs to connect to MS SQL.
I know there are many ways to connect to a DB,
but why do people use .ini or XML files for the connection?
Isn't it fine to put the connection details (server, id, password ...) in a class?
It is easier to use a configuration file (.ini, .xml) if you want to change the connection string later. If you hard-code it, every time you change it you must re-compile your code.
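As a sketch, the connection string would live in App.config and be read via ConfigurationManager; the name "MyDb" and the server details are illustrative:

```xml
<configuration>
  <connectionStrings>
    <add name="MyDb"
         connectionString="Server=myServer;Database=myDb;User Id=...;Password=...;"
         providerName="System.Data.SqlClient" />
  </connectionStrings>
</configuration>
```

```csharp
// Requires a project reference to System.Configuration.
using System.Configuration;

string cs = ConfigurationManager.ConnectionStrings["MyDb"].ConnectionString;
```

Changing the server or credentials then only means editing the deployed .config file, with no rebuild.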
I'm using Entity Framework 5 with a database-first approach. I created a new database table and updated my .edmx. The .cs file was not created by my Model.tt; as explained here, this is a bug in VS 2012.
I followed the workarounds explained in this thread and eventually updated to VS2012.4. The .cs file is still not created for my new table. Any idea why this is happening and how I can fix it?
Your tables must have a primary key for the .cs files to be generated.
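If your new table is missing a key, adding one and then updating the model from the database should let the .tt templates generate the class; the table and column names here are illustrative:

```sql
-- EF database-first treats keyless tables as views/unsupported,
-- so give the table a primary key before refreshing the .edmx.
ALTER TABLE dbo.MyNewTable
ADD CONSTRAINT PK_MyNewTable PRIMARY KEY (Id);
```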
I decided to delete my .edmx file and create a new connection to the database, and got this error:
Connection failed, your password expired
Needless to say, a lot of time could have been saved if Entity Framework had at least warned me when updating my model from the database that my password had expired, instead of just 'updating' as if nothing was wrong.
So I renewed my password and guess what, everything is working!
Are you using a source control system that could be setting the directory as read-only? I've seen that problem before, where automated tools did not generate what they were supposed to because of that.
If you are using source control, check out the whole directory and exit VS. Open it back up again and try to generate again.
I've had this problem both with and without an .edmx file, and both times the cause was the same: when you create and save a new table in SQL Server, EF can't see it until you click Tables > Refresh in the Object Browser.
I had a similar problem, and it turned out that my .context.tt file and my .tt file had the wrong .edmx file name in them for the inputFile string variable. I believe this occurred because someone renamed the .edmx file at some point.
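If that's the cause, the fix is a one-line edit near the top of each .tt file; the file name shown is illustrative and must match your actual .edmx:

```csharp
// In Model.tt and Model.Context.tt, inputFile must point at the
// current name of the .edmx file in the project:
const string inputFile = @"Model.edmx";
```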
Working with a model-first approach in Entity Framework 4, I'd like to switch the database back and forth between real SQL Server (System.Data.SqlClient) and SQL CE (System.Data.SqlServerCe).
I know how to do it manually:
Change the provider from System.Data.SqlServerCe.3.5 to System.Data.SqlClient
change the connection string of the Model Container/Context
change the .edmx file (the Schema Namespace="Model1.Store" Provider="..." attribute)
What I can't figure out is how I could make that change at build/compile time, so I could easily switch between SQLClient and SQLServerCE based on a configuration.
Any other way how to achieve the same result would be appreciated too!
(have one model where the data source can be switched between SQL and SQL CE)
I don't know if it's the best way for you in this situation, but I would just like to make sure you know of this way to solve your problem.
You can right-click inside the designer view of your .edmx (not on the file in Solution Explorer) and click "Properties"; in the property grid, look for "Metadata Artifact Processing" and change the value from "Embed in Output Assembly" to "Copy to Output Directory".
However, once you have changed this, instead of having the edmx baked into the assembly, you will find three XML-based files (MyModel.ssdl, MyModel.csdl, MyModel.msl) in your output directory.
You are now free to script any changes to those files as a part of your build process.
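For example, a hypothetical post-build step could rewrite the Provider attribute in the copied .ssdl so the same model targets SQL CE; the file name, path, and sample content below are assumptions for illustration (and this assumes GNU sed):

```shell
# Demonstration: the build copied MyModel.ssdl next to the binaries
# ("Copy to Output Directory"); here we fake that copy with a sample file.
mkdir -p bin/Debug
echo '<Schema Namespace="Model1.Store" Provider="System.Data.SqlClient" />' > bin/Debug/MyModel.ssdl

# The actual post-build step: swap the storage provider in place.
sed -i 's/Provider="System.Data.SqlClient"/Provider="System.Data.SqlServerCe.3.5"/' bin/Debug/MyModel.ssdl
```

The same substitution could of course be done from MSBuild or PowerShell instead.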
Also make sure to change your connection string to something like this:
<add name="MyEntities" connectionString="metadata=.\MyModel.csdl|.\MyModel.ssdl|.\MyModel.msl; (..)" />
As I said, I'm not sure if it will be the best approach for your specific problem. However, I use it to generate different builds for different database schemas. It works.
I have an SSIS package that copies the data in a table from one SQL Server 2005 to another SQL Server 2005. I do this with a "Data Flow" task. In the package config file I expose the destination table name.
Problem is, when I change the destination table name in the config file (via Notepad) I get the following error: "vs_needsnewmetadata". I think I understand the problem: the destination table's column mapping is fixed when I first set up the package.
Question: what's the easiest way to do the above with an ssis package?
I've read online about setting up the metadata programmatically, but I'd like to avoid this. I also wrote a C# console app that does everything just fine (all tables etc. are specified in the app.config), but apparently this solution isn't good enough.
Have you set DelayValidation on the Data Flow destination's properties? If not, try that.
Edit: of course that should be DelayValidation set to True, so it just goes ahead and tries rather than validating up front. Also, instead of altering your package in Notepad, why not put the table name in a variable, use the variable in an Expression on the destination, and then expose the variable in a .dtsConfig configuration file? Then you can change it without danger.
Matching the source and destination column names case-sensitively did the trick for me.
In my case, the column was named SrNo_prod in dev and we built the .dtsx against it, while it had been created as SrNo_Prod in prod; after changing the case from P to p, the package executed successfully.
Check if the new destination table has the same columns as the old one.
I believe the error occurs if the columns are different, and the destination can no longer map its input columns to the table columns. If two tables have the same schema, this error should not occur.
If all you are doing is copying data from one SQL 2005 server to another, I would just create a linked server and use a stored proc to copy the data. An SSIS package is overkill.
How to Create linked server
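Creating the linked server itself is a couple of system procedure calls; the server name below is illustrative:

```sql
-- Register server2 as a linked server on server1,
-- using the caller's own credentials for the remote login.
EXEC sp_addlinkedserver
    @server = N'server2',
    @srvproduct = N'',
    @provider = N'SQLNCLI',
    @datasrc = N'server2';

EXEC sp_addlinkedsrvlogin
    @rmtsrvname = N'server2',
    @useself = N'True';
```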
Once the linked server is created you would just program something like...
INSERT INTO server1.database1.dbo.table1 (id, name)
SELECT id, name FROM server2.database1.dbo.table1
As for the SSIS package, I have always had to reopen and rebuild the package so that the metadata gets updated when modifying a table's column properties.