I have an SSIS package that copies the data in a table from one SQL Server 2005 instance to another. I do this with a "Data Flow" task. In the package config file I expose the destination table name.
The problem is that when I change the destination table name in the config file (via Notepad), I get the following error: "vs_needsnewmetadata". I think I understand the problem... the destination table's column mapping is fixed when I first set up the package.
Question: what's the easiest way to do the above with an SSIS package?
I've read online about setting up the metadata programmatically, but I'd like to avoid this. Also, I wrote a C# console app that does everything just fine (all tables etc. are specified in the app.config), but apparently this solution isn't good enough.
Have you set DelayValidation to False on the Data Flow destination's properties? If not, try that.
Edit: Of course that should be DelayValidation set to True, so the package just goes ahead and tries rather than validating first. Also, instead of altering your package in Notepad, why not put the table name in a variable, use the variable in an Expression on the destination, and then expose the variable in a .dtsConfig configuration file? Then you can change it without danger.
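For example, the .dtsConfig entry for such a variable might look something like this (a minimal sketch; the variable name User::DestTableName and the table name are placeholders for whatever you use):

<?xml version="1.0"?>
<DTSConfiguration>
  <Configuration ConfiguredType="Property" Path="\Package.Variables[User::DestTableName].Properties[Value]" ValueType="String">
    <ConfiguredValue>dbo.MyDestinationTable</ConfiguredValue>
  </Configuration>
</DTSConfiguration>

You can then edit ConfiguredValue between runs without opening the package. Note that the new table still needs the same columns as the old one, or the metadata error will come back.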
Matching the source and destination column names case-sensitively did the trick for me.
In my case, the column was named SrNo_prod in dev and we developed the .dtsx against it, while it had been created as SrNo_Prod in prod. After changing the case from P to p, the package executed successfully.
Check if the new destination table has the same columns as the old one.
I believe the error occurs if the columns are different, and the destination can no longer map its input columns to the table columns. If two tables have the same schema, this error should not occur.
If all you are doing is copying data from one SQL 2005 server to another, I would just create a linked server and use a stored proc to copy the data. An SSIS package is overkill.
How to Create linked server
Once the linked server is created you would just run something like the following (note that the four-part name order is server.database.schema.table):
INSERT INTO server1.database1.dbo.table1 (id, name)
SELECT id, name FROM server2.database1.dbo.table1
As for the SSIS package, I have always had to reopen and rebuild the package so that the metadata gets updated after modifying a table's column properties.
I'm working on a 2008 SSIS package in which I need to read a flat file so that I can access its content (three directory paths) and store those paths in variables.
The flat file lives on three different servers, depending on the instance I'm working on (dev, QA, production), so I can't just hard-code the path into a variable, because I'd have to rewrite that value every time I deployed the solution to a different instance.
Something I tried in the past was reading a flat file using Directory.GetCurrentDirectory(), but I couldn't debug it, and running the package with F5 in VS2008 didn't work (I've read that it doesn't work in VS but works fine once the package is deployed, but I have no way to prove that except by trying).
So I figured that if I can read the path saved in a flat file connection and store it in a string variable, I can modify the connection string value in the .config file once the package is deployed and read its contents like a normal flat file.
My problem is that I can't figure out how to read the connection string value, and I couldn't find anything online that pointed me in the right direction.
Thanks in advance.
To access connection manager information from a script task you can use the Dts.Connections property; just declare a string variable and read the connection string:
string cs = Dts.Connections["myFlatFileConnection"].AcquireConnection(Dts.Transaction).ToString();  // AcquireConnection returns object, hence the ToString()
Reference:
According to this Microsoft Docs article:
Connection managers provide access to data sources that have been configured in the package. The Script task can access these connection managers through the Connections property of the Dts object. Each connection manager in the Connections collection stores information about how to connect to the underlying data source.
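A minimal sketch of a Script Task Main() that does this, assuming a connection manager named "myFlatFileConnection" and a variable User::FilePath (both names are placeholders, and the variable must be listed in the task's ReadWriteVariables):

public void Main()
{
    // For a flat file connection manager, ConnectionString is the file path.
    string cs = Dts.Connections["myFlatFileConnection"].ConnectionString;
    Dts.Variables["User::FilePath"].Value = cs;
    Dts.TaskResult = (int)ScriptResults.Success;
}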
You'd want something like a C# Script Task. You can modify the connection string dynamically there. Within the script you'd modify the value of (if I recall correctly) Dts.Connections["YourConnection"].ConnectionString.
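A sketch of that, going in the other direction from the previous example (again, the connection and variable names are placeholders):

// Point the connection manager at a path taken from a package variable.
string newPath = Dts.Variables["User::FilePath"].Value.ToString();
Dts.Connections["YourConnection"].ConnectionString = newPath;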
Since nothing seemed to work, I ended up doing the following:
inserted the values I needed into a parameters table in the database,
added an Execute SQL Task to read them,
assigned the results of that task to my variables.
It took me all day but I finally got it.
I followed this thread for references.
I am working on a small windows forms program that reads data from a local database, which I have created by following this guide.
I have populated these tables with data using the Designer in Visual Studio. There is no ability (nor will there ever be) for the program to change this database at run-time, as it represents static, known data; I could get identical results by hard-coding the instantiation of each corresponding table-row object in a constructor somewhere.
When I have Visual Studio build my solution, it generates two files: the .exe that opens the form, and an .mdf file with the database tables.
Two related questions: does it even make sense to use a database for this kind of read-only data? And if so, is there a way to combine the .mdf file into the .exe? Again, there is zero need to ever modify the data, so the fact that you can't modify .exes shouldn't prevent this.
Yes, you can do this, but first you need to construct your DataSet. I would create your data in SQL or similar, then export it to XML along with an XSD schema.
Then in Visual Studio, add a DataSet object to your project.
Then you can talk to the DataSet object and use the XSD and XML to populate it.
https://msdn.microsoft.com/en-us/library/atchhx4f(v=vs.110).aspx
To keep this all within the .EXE, embed your XML data into a Resource file, then access it via the Resources static class.
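A minimal sketch of the loading side, assuming the XML was added to the project's resources as a text file named TableData (a hypothetical name) and that it defines a table called MyTable:

using System.Data;
using System.IO;

// Load the embedded XML into the DataSet at startup.
var ds = new DataSet();
using (var reader = new StringReader(Properties.Resources.TableData))
{
    // Infers the schema from the XML; call ReadXmlSchema with the XSD first
    // if you want the types enforced explicitly.
    ds.ReadXml(reader);
}
DataRow firstRow = ds.Tables["MyTable"].Rows[0];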
This is an add-on to this question, which was asked previously with no answer.
The problem I have is a need to occasionally update a set of 35 child SSIS packages with one parent. They are all the same, differing only in which data they process. When I make a change, I delete all the children and paste them again into the same folder, updating the value of a variable that tells each package which child it is, so it knows which data to process (a value of 1-35).
My goal is to find a solution that lets the packages somehow be aware of who they are (by file name, variable, configuration, etc.) so that it cuts down on maintenance and setup for production after an update.
The file names of the packages keep the appended number after the paste (packagename 1, packagename 2, ... packagename X) in the same folder. I am using package deployment in SSIS 2012, so I don't have access to the file name as a parameter like I would if I were using project deployment. All of the packages are in an SSDT solution with a parent package calling all 35 children. With package deployment, I'm using configurations in a SQL table to change the file path as it's promoted from server to server.
I'd love to automate other things related to the children, but I can't until I solve this part. Also, I need to add another 15 children or so, and this would save a LOT of time.
Any help is appreciated
Have you tried using environment variables and starting the packages with different parameters?
Packages_with_Parameter_from_Environments
(Sorry, I am not allowed to comment.)
update a set of 35 child SSIS packages with one parent. They are all the same, differing only in which data they process.
It seems like you shouldn't be using 35 different copies of the same package as children; instead, you should just use parameters to solve the problem.
If the key to how a file is processed is in its filename, you can use a Foreach Loop with a filename mask to pull that value out and feed it as a parameter into the package being called. If not, you can store the processing options in a SQL table, keyed by file name, and have the parent package read that information and use it when calling the child packages.
I have a problem using different databases in an MVC application with Entity Framework 6. Our client will use a database of their own, which could be a MySQL, PostgreSQL, or Oracle database.
I have made an .edmx file for MySQL and one for PostgreSQL; these models work individually, but if possible I want to use only one model for all databases. The databases will have identical tables and columns.
So I want to make something like this:
using (var context = new DbContext(connectionString))
{
    string id = context.Set<user>().First().Id;
}
The context has to be a connection to the right database (a resource file indicates which database to use).
I am encountering the following problem with a database first approach:
I have read http://www.codeproject.com/Articles/82017/Preparing-an-Entity-Framework-model-for-multi-prov.aspx, and when I followed the instructions I got a "MetadataException: Unable to load the specified metadata resource" exception.
I have tried everything to find the .ssdl file but I just can't find it.
Am I doing it the right way or does anyone know if there is a better way to do this?
I tried doing exactly what you are trying to do, but with MS SQL and VistaDB. I followed that tutorial and ran into the same problem.
By following the tutorial, you will have a separate .ssdl file for each DB. Make sure you have set it to copy to the output directory and then updated your connection string accordingly.
To set it to copy to the output folder, right-click the .ssdl file in Solution Explorer and change "Copy to Output Directory" to "Copy if newer".
Then change the metadata part of your connection string to something like this:
metadata=res://*/DataModel.csdl|c:\dev\program\bin\debug\DataModel.VistaDB.ssdl|res://*/DataModel.msl;
Notice that the path to the .ssdl is a file system path rather than an embedded-resource (res://) reference. The path I have here is the one I used; you will need to change it accordingly.
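If you'd rather not hard-code that metadata string, here is a sketch of building it in code with EntityConnectionStringBuilder, so the .ssdl path can be chosen per provider at run time (the provider names, connection details, and paths below are illustrative):

using System.Data.Entity.Core.EntityClient;  // EF6 namespace

var builder = new EntityConnectionStringBuilder
{
    Provider = "MySql.Data.MySqlClient",  // or "Npgsql", etc., depending on the database
    ProviderConnectionString = "server=myserver;database=mydb;uid=me;pwd=secret;",  // provider-specific
    Metadata = @"res://*/DataModel.csdl|c:\dev\program\bin\debug\DataModel.MySql.ssdl|res://*/DataModel.msl"
};
string entityConnectionString = builder.ConnectionString;  // pass this to your context's constructor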
I'm using Entity Framework 5 with a database-first approach. I created a new database table and updated my .edmx. The .cs file was not created for my Model.tt; as explained here, this is a bug in VS 2012.
I followed the workarounds explained in this thread and eventually updated to VS 2012.4. The .cs file is still not created for my new table. Any idea why this is happening and how I can fix it?
Your tables must have a primary key for the .cs files to be generated.
I decided to delete my .edmx file and create a new connection to the database, and got this error:
Connection failed, your password expired
Needless to say, a lot of time could have been saved if Entity Framework had at least warned me, when updating my model from the database, that my password had expired, instead of just 'Updating' as if nothing were wrong.
So I renewed my password and guess what, everything is working!
Are you using a source control system that could be setting the directory as read-only? I've seen that problem before, where automated tools were not generating what they were supposed to because of it.
If you are using source control, check out the whole directory and exit VS. Open it back up again and try to generate again.
I've had this problem both with and without an .edmx file, and both times the cause was the same: when you create and save a new table in SQL Server, EF can't see it until you click Tables > Refresh in the Object Explorer.
I had a similar problem, and it turned out that my .context.tt file and my .tt file had the wrong .edmx file name in them for the inputFile string variable. I believe this occurred because someone renamed the .edmx file at some point.