SSIS: Column Size Not Changing Based on Query - c#

I have a package. It has a query that feeds into a Script Component.
In the query I select a varchar(8) column from a table and cast it with CAST(myDateCol AS varchar(10)):
    SELECT
        myPK,
        CAST(myDateCol AS varchar(10)), -- myDateCol defined as varchar(8)
        myOtherCol
    FROM
        MyServer.MySchema.MyTable
In my script I am trying to add two characters to Row.myDateCol in Input0, but I get a buffer error in the property setter for myDateCol: it accepts a value of 8 characters but errors out beyond that.
What I've done is add an output column with Length = 10, set it, and map that to the next component in the package, but that seems a little silly.
Is there a way to force the size of the input columns based on the query, or is there a way to manually force a refresh, in case the package is simply stuck thinking I'm dealing with a varchar(8) because the CAST was added later?
Additional Info:
    Row.myDateCol = "20170404";   // works
    Row.myDateCol = "2017-04-04"; // errors out
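For reference, a minimal sketch of that workaround in the Script Component, assuming a hypothetical 10-character output column named myDateColExpanded:

    // Input0_ProcessInputRow runs once per row in the Script Component.
    // Row.myDateCol is stuck at 8 characters, so the reformatted value is
    // written to a separate output column (myDateColExpanded, hypothetical
    // name) that was added with Length = 10.
    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        string raw = Row.myDateCol; // e.g. "20170404"
        Row.myDateColExpanded = raw.Substring(0, 4) + "-"
                              + raw.Substring(4, 2) + "-"
                              + raw.Substring(6, 2); // "2017-04-04"
    }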

This is normal behavior for SSIS. When you create a data source that uses a SQL query, SSIS looks at the query and builds the metadata for the data flow. The data source will only recalculate that metadata if you change the structure of the query, for example the number of columns or their names.
The easiest way to force a refresh of the data types, without resorting to renaming columns, is to go to the Columns page of the data source editor and untick, then re-tick, the checkbox at the top of the Available External Columns list. This deselects and re-selects all columns, refreshing the metadata at the same time. You can easily confirm it worked by hovering your mouse over the External/Output column names listed in the lower section.

Your problem is the result of dealing with a date(time) as text instead of the number(s) it really is, and I cannot tell from your question whether you want the extra characters added at the data layer (SQL) or at the application (C#) layer.
Casting varchar(8) => varchar(10) will still just return the 8-character value if you don't fill in (pad) it. You could try casting varchar(8) to char(10) instead.
Another option is a double conversion: convert your column value to date and then back to your desired varchar(10).
    SELECT
        myPK,
        CONVERT(varchar(10), CONVERT(date, myDateCol, 112), 120),
        myOtherCol
    FROM
        MyServer.MySchema.MyTable

So, after some playing around, I found that renaming the column changed the size to varchar(10) per below:
    SELECT
        myPK,
        CAST(myDateCol AS varchar(10)) AS DATECOL,
        myOtherCol
    FROM
        MyServer.MySchema.MyTable
I then changed it back:
    SELECT
        myPK,
        CAST(myDateCol AS varchar(10)),
        myOtherCol
    FROM
        MyServer.MySchema.MyTable
And the change stuck. I don't know why or how, but VS/SSIS somehow never refreshed itself to pick up the new type. I assume it does no handling of query changes after the initial query is entered unless column names or aliases change.
This wasn't just my machine either. Weird.

Related

How can I create a column using a Script Component in SSIS?

I have the following situation:
I need to create an SSIS project to import data from a CSV file into our system, and to do this I must read some columns, one of which is a "group" of values.
These are planning-horizon values, and the horizon can change from process to process, so some processes may cover 5 months and others 15 months.
The CSV file will always have 21 fixed columns, but after that (columns 22, 23, ...) I don't know whether there will be 1, 2, or more horizon columns.
Because of this I can't create the columns on the Inputs and Outputs page of the Script Transformation Editor; I need to create them based on the length of the horizon.
So my question is whether it is possible to create a column at run time, once I discover the length of the horizon.
Regards
SSIS doesn't work that way. The number of columns is set at design time.
If you can set a reasonable upper limit, say 50 columns, you can read in the last "column" of data and then parse it, via a Script Component, into those fields. Otherwise, you're looking at preprocessing the file to unpivot the variable-width rows into a normalized set.
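As a rough illustration of that first approach (all column names hypothetical), the flat file source reads the 21 fixed columns plus one trailing Rest column, and the Script Component splits it into pre-declared outputs:

    // Splits the variable-width remainder into pre-declared output
    // columns Horizon1..HorizonN, leaving the unused ones NULL.
    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        string rest = Row.Rest_IsNull ? string.Empty : Row.Rest;
        string[] horizon = rest.Split(new[] { ';' },
            StringSplitOptions.RemoveEmptyEntries);

        if (horizon.Length > 0) Row.Horizon1 = horizon[0];
        if (horizon.Length > 1) Row.Horizon2 = horizon[1];
        // ...continue up to the agreed upper limit (e.g. Horizon50).
    }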
You can do this in two different ways:
Add column(s) to a script component: https://msdn.microsoft.com/en-us/library/ms188192.aspx
Or add a Derived Column transformation and add a custom column with the appropriate expression.
Thanks for all the answers. I changed my approach and did it differently.
I use a Script Transformation to:
check how many columns I need to create;
open a connection and drop the existing horizon columns;
recreate the columns based on the new horizon.
Afterwards I added an Execute SQL Task that calls a procedure which does all the logic to fill the columns.
Regards,
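A rough sketch of the approach described in that answer, with the CSV delimiter, table, and column names all assumed for illustration:

    using System.Data.SqlClient;
    using System.IO;
    using System.Linq;

    public static void RebuildHorizonColumns(string csvPath, string connectionString)
    {
        // The first 21 CSV columns are fixed; everything after is horizon.
        int horizonCount = File.ReadLines(csvPath).First().Split(';').Length - 21;

        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            // The old horizon columns are assumed to have been dropped
            // already; recreate one column per horizon period.
            for (int i = 1; i <= horizonCount; i++)
            {
                string sql = $"ALTER TABLE dbo.Horizon ADD Horizon{i} decimal(18, 2) NULL;";
                using (var cmd = new SqlCommand(sql, conn))
                    cmd.ExecuteNonQuery();
            }
        }
    }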

Store values in separate, C# type-specific columns or all in one column?

I'm building a C# project configuration system that will store configuration values in a SQL Server db.
I was originally going to set the table up as such:
    KeyId int
    FieldName varchar
    DataType varchar
    StringValue varchar
    IntValue int
    DecimalValue decimal
    ...
Values would be stored and retrieved with the value in the DataType column determining which Value column to use, but I really don't like that design. So I thought I'd go this route:
    KeyId int
    FieldName varchar
    DataType varchar
    Value varbinary
Here the value in DataType would still determine the type of Value brought back, but it would all be in one column, and I wouldn't have to write a ton of overloads to accommodate the different types as I would with the previous solution. I would just pull Value in as a byte array and use DataType to perform whatever conversion(s) are necessary to get my value.
Is the varbinary approach going to cause any performance issues, or is it just bad practice to drop all these different types of data into a varbinary column? I've been searching around for about an hour and I can't get to a definitive answer.
Also, if there is a more preferred method anyone can think of to reach the same conclusion, I'm all ears (or eyes).
You could serialize your settings as JSON and just store that as a string. Then you have all the settings within one row and your clients can deserialize as needed. This is also a safe way to add additional settings at any time without any modifications to your database.
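For example, a minimal sketch using Newtonsoft.Json (the settings shape here is made up):

    using Newtonsoft.Json;

    public class AppSettings
    {
        public string Theme { get; set; }
        public int TimeoutSeconds { get; set; }
    }

    // Serialize the whole settings object into one string column...
    string json = JsonConvert.SerializeObject(
        new AppSettings { Theme = "dark", TimeoutSeconds = 30 });

    // ...and deserialize it client-side when the settings are needed.
    AppSettings settings = JsonConvert.DeserializeObject<AppSettings>(json);

Adding a new setting is then just a new property on the class; rows written before the change simply deserialize with that property at its default value.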
We are using the second solution and it works well. Remember that disk access is orders of magnitude slower than, e.g., a casting operation (milliseconds vs. nanoseconds; see ref), so don't look for the bottleneck here.
A solution could be to implement a polymorphic association (1, 2), but I don't think there is a need for that, or that you should do it. The second solution is close to a non-SQL db: you can dump in anything as a value, even the entire HTML markup for a page. It should be the caller's responsibility to know what to do with the data.
Also, see these threads on how to store settings in a DB: 1, 2, and 3 for critique.
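For completeness, a rough sketch of the reading side of that second design; the encoding choices here are assumptions, not a given:

    using System;
    using System.Globalization;
    using System.Text;

    // Converts the varbinary payload back to a CLR value based on the
    // DataType column. Assumes ints were stored as 4 little-endian bytes
    // and strings/decimals as UTF-8 text.
    public static object DecodeValue(string dataType, byte[] value)
    {
        switch (dataType)
        {
            case "int":     return BitConverter.ToInt32(value, 0);
            case "string":  return Encoding.UTF8.GetString(value);
            case "decimal": return decimal.Parse(Encoding.UTF8.GetString(value),
                                                 CultureInfo.InvariantCulture);
            default:        throw new NotSupportedException(dataType);
        }
    }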

linq for entities hide columns from model

I am using LINQ to Entities to read and update data from a SQL Server. The database is a Dynamics NAV database, and every time someone changes a column in it, my application needs to be recompiled.
Is it possible to ignore or hide columns in the database from LINQ to Entities and still get updates to work correctly? Let's say there are 100 columns in a table and I am using only 10; when I update a value, I want the remaining 90 values to stay in the row.
You can just tell the people that add new columns to either:
allow NULL for newer columns, or
add a default constraint so a good default value is added automatically for new rows.
Either of these will allow LINQ to work correctly.
The best way would be to create a custom view in your database. If you want to be able to insert/update/delete through that view, you can create the appropriate triggers on it. LINQ will treat the view just like any other table.

Converting logic of DateTime.FromBinary method in TSQL query

I have a table that contains a column with the VARBINARY(MAX) data type. That column represents different values for different types in my C# DB layer class: it can be int, string, or datetime. Now I need to split that one column into three according to the type, so values of type int go to a new column ObjectIntValue, and so on for each new column.
But I have a problem migrating the data to the datetime column, because the old column stores the datetime value as a long produced by the C# DateTime.ToBinary method when the data was saved.
I have to do this in T-SQL and can't use .NET to convert the value into the new column. Any ideas?
Thanks for any advice!
Use CLR in T-SQL.
Basically, you use CREATE ASSEMBLY to register the DLL containing your function(s), then create a user-defined function to call it, and then you can use it.
There are several rules depending on what you want to do, but since all you basically need is DateTime.FromBinary(), it shouldn't be too hard to figure out.
I've never done it myself, but these guys seem to know what they are talking about:
CLR in TSQL tutorial
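A minimal sketch of what that CLR function might look like (class and function names assumed):

    using System;
    using System.Data.SqlTypes;
    using Microsoft.SqlServer.Server;

    public class DateTimeConversions
    {
        // Reverses DateTime.ToBinary: takes the stored long and returns
        // a DATETIME value that T-SQL can use directly.
        [SqlFunction]
        public static SqlDateTime FromBinary(SqlInt64 binaryValue)
        {
            if (binaryValue.IsNull)
                return SqlDateTime.Null;

            return new SqlDateTime(DateTime.FromBinary(binaryValue.Value));
        }
    }

Once the assembly is registered with CREATE ASSEMBLY and wrapped in a user-defined function with CREATE FUNCTION ... EXTERNAL NAME, the conversion can run inside the UPDATE that populates the new datetime column.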
This is a one-off conversion, right? Your response to @schglurps is a bit of a concern.
If I understand you, there would have to be a break in your update script: the one you have would work up to the point you implement this change, then you'd have a one-off procedure for this manoeuvre, and then you would be updating from a new version.
If you want to validate it, just check for the existence or non-existence of the new columns.
Another option would be to write a wee application that fills in the new columns from the old one, and invoke it. Ugh...
If this isn't a one-off and you want to keep and maintain the old column, then you have problems.

Best Practice - Handling multiple fields, user roles, and one stored procedure

I have multiple fields, both asp:DropDownList and asp:TextBox controls. I also have a number of user roles that change the Visible property of certain controls so the user cannot edit them. All of this data is saved with a stored procedure call on PostBack. The problem is that when I send in the parameters and a control was not on the page, there obviously isn't a value for it, so in the stored procedure I initialize those parameters to NULL. But then the previous value in the database, which I didn't want changed, is overwritten with NULL.
This seems like a pretty common problem, but I didn't have a good way of explaining it. So my question is: how can I keep some fields off the page while also keeping their values in the database, all with one stored procedure?
Apply the same logic when choosing what data to update as the logic you use when choosing what data (and its associated UI) to render.
I think the problem is that you want to update all fields in a single SQL UPDATE, regardless of their values.
You should do some sanity checking of your input before the update, even if that implies doing individual updates for certain parameters.
Without an example it is a little difficult to know your exact circumstances, but here is a fictitious statement that will hopefully give you some ideas. It uses T-SQL (MS SQL Server), since you did not mention a specific version of SQL:
    UPDATE SomeImaginaryTable
    SET FakeMoneyColumn = COALESCE(@FakeMoneyValue, FakeMoneyColumn)
    WHERE FakeRowID = @FakeRowID
This basically updates the column to the parameter value unless the parameter is NULL, in which case it keeps the column's existing value.
Generally, to overcome this in my update function:
I load the current values for the user,
replace any loaded values with the newly changed values from the form,
and update the row in the DB.
This way I have all the current values, and everything that has been changed gets changed.
This logic also works for an add form, because all the fields start out null and get replaced with new values before being sent to the DB; you just have to check whether to do an insert or an update.
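A minimal sketch of that pattern (entity and repository names are hypothetical):

    // Merges posted form values over the current row so that fields hidden
    // from this user's role keep their stored values.
    public void SaveForm(FormModel posted)
    {
        FormModel current = repository.Load(posted.Id); // assumed data access

        // Null means the control was not rendered for this role,
        // so the existing database value is kept.
        current.Department = posted.Department ?? current.Department;
        current.Comments   = posted.Comments   ?? current.Comments;

        repository.Update(current); // single stored procedure call
    }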
