EzAPI OLE DB Destination - c#

I've searched all over and I now have to ask SO. I'm trying to construct a simple dataflow using EzAPI. It's been anything but easy, but I'm committed to figuring this out. What I can't figure out is how to get the EzOleDbDestination working. Here's my complete code:
var a = new Application();
// using a template since it's impossible to set up an ADO.NET connection to MySQL
// using EzAPI and potentially even with the raw SSIS API...
var pkg = new EzPackage(a.LoadPackage(@"C:\...\Package.dtsx", null));
pkg.Name = "Star";
var df = new EzDataFlow(pkg);
df.Name = "My DataFlow";
var src = new EzAdoNetSource(df);
src.Name = "Source Database";
src.SqlCommand = "SELECT * FROM enum_institution";
src.AccessMode = AccessMode.AM_SQLCOMMAND;
src.Connection = new EzConnectionManager(pkg, pkg.Connections["SourceDB"]);
src.ReinitializeMetaData();
var derived = new EzDerivedColumn(df);
derived.AttachTo(src);
derived.Name = "Prepare Dimension Attributes";
derived.LinkAllInputsToOutputs();
derived.Expression["SourceNumber"] = "id";
derived.Expression["Name"] = "(DT_STR,255,1252)description";
// EDIT: reordered the operation here and I no longer get an error, but
// I'm not getting any mappings or any input columns when I open the package in the designer
var dest = new EzOleDbDestination(df);
dest.AttachTo(derived, 0, 0);
dest.Name = "Target Database";
dest.AccessMode = 0;
dest.Table = "[dbo].[DimInstitution]";
dest.Connection = new EzConnectionManager(pkg, pkg.Connections["TargetDB"]);
// this comes from Yahia's link
var destInput = dest.Meta.InputCollection[0];
var destVirInput = destInput.GetVirtualInput();
var destInputCols = destInput.InputColumnCollection;
var destExtCols = destInput.ExternalMetadataColumnCollection;
var sourceColumns = derived.Meta.OutputCollection[0].OutputColumnCollection;
foreach(IDTSOutputColumn100 outputCol in sourceColumns) {
    // Now getting COM Exception here...
    var extCol = destExtCols[outputCol.Name];
    if(extCol != null) {
        // Create an input column from an output col of previous component.
        destVirInput.SetUsageType(outputCol.ID, DTSUsageType.UT_READONLY);
        var inputCol = destInputCols.GetInputColumnByLineageID(outputCol.ID);
        if(inputCol != null) {
            // map the input column with an external metadata column
            dest.Comp.MapInputColumn(destInput.ID, inputCol.ID, extCol.ID);
        }
    }
}
Basically, anything that involves a call to ReinitializeMetaData() results in 0xC0090001, because that method is where the error happens. There's no real documentation to help me, so I have to rely on any gurus here.
I should mention that the source DB is MySQL and the target DB is SQL Server. Building packages like this using the SSIS designer works fine, so I know it's possible.
Feel free to tell me if I'm doing anything else wrong.
EDIT: here's a link to the base package I'm using as a template: http://www.filedropper.com/package_1 . I've redacted the connection details, but any MySQL and SQL Server database will do. The package will read from MySQL (using the MySQL ADO.NET Connector) and write to SQL Server.
The database schema is mostly irrelevant. For testing, just make a table in MySQL that has two columns: id (int) and description (varchar), with id being the primary key. Make equivalent columns in SQL Server. The goal here is simply to copy from one to the other. It may end up being more complex at some point, but I have to get past this hurdle first.

I can't test this now BUT I am rather sure that the following will help you get it working:
Calling ReinitializeMetaData() causes the component to fetch the table metadata. It should only be called after setting the AccessMode and its related properties. You are calling it before setting AccessMode... (a sketch of the suggested ordering follows the links below)
Various samples including advice on debugging problems
define the derived column(s) directly in the SQL command instead of using an EzDerivedColumn
try to get it working with two SQL Server DBs first; some of the available MySQL ADO.NET providers have shortcomings under certain circumstances
UPDATE - as per the comments, some more information on debugging this and a link to a complete end-to-end sample with source:
http://blogs.msdn.com/b/mattm/archive/2009/08/03/looking-up-ssis-hresult-comexception-errorcode.aspx
http://blogs.msdn.com/b/mattm/archive/2009/08/03/debugging-a-comexception-during-package-generation.aspx
Complete working sample with source
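Applied to the destination from the question, that ordering advice would look roughly like this. This is only a sketch reusing the names and calls already in the question; the point is simply that the connection, access mode and table are all in place before the metadata is (re)initialized:
var dest = new EzOleDbDestination(df);
dest.Name = "Target Database";
dest.Connection = new EzConnectionManager(pkg, pkg.Connections["TargetDB"]);
dest.AccessMode = 0; // table or view
dest.Table = "[dbo].[DimInstitution]";
dest.AttachTo(derived, 0, 0);
// only now ask the component to fetch the table metadata
dest.ReinitializeMetaData();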

I've had this exact same issue and was able to resolve it with a lot of experimentation. In short, you must set the connection for both the source and the destination, and then call AttachTo after both connections are set. You must call AttachTo for every component.
I've written a blog post about starting with an SSIS package as a template and then manipulating it programmatically to produce a set of new packages.
The article explains the issue in more detail.
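In terms of the code in the question, that sequencing would look roughly like this. It is only a sketch that reuses the same EzAPI calls from the question; nothing changes except the order:
// create the components and assign their connections first...
var src = new EzAdoNetSource(df);
src.Connection = new EzConnectionManager(pkg, pkg.Connections["SourceDB"]);
var derived = new EzDerivedColumn(df);
var dest = new EzOleDbDestination(df);
dest.Connection = new EzConnectionManager(pkg, pkg.Connections["TargetDB"]);
// ...and only wire the pipeline together once every connection is set
derived.AttachTo(src);
dest.AttachTo(derived, 0, 0);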

Sync to all CLs within a certain Date/Time range

My documentation and Google-fu are seriously failing me on this one, so:
how do I use P4API's GetChangelist() function to sync a range of files (i.e. all files from #now to #twoDaysAgo)? I can easily construct the command line to do this like so:
p4 changes -s submitted //...#2016/12/01,2016/12/06
but the API wants me to interface with the server via
GetChangelist(Options options, FileSpec[] files)
It's driving me crazy that I have to construct a combo of Options and FileSpec[] to make the request instead, and (AFAIK) can't just pass the actual command line string, especially because all documentation seems to be non-existent.
Can somebody enlighten me as to what kind of filespec parameters I have to pass along? (I think that's what I need to use to specify the fact that I want to get a range of all CLs inside a certain time?) Thanks!
(As an aside: I was surprised there isn't a "P4API" tag yet, and I can't create one.)
And here's the non-command line version that you really want to use, from the Perforce documentation (once you find it :))
PathSpec path = new DepotPath("//depot/...");
DateTimeVersion lowerTimeStamp = new DateTimeVersion(new DateTime(2016,12,06));
DateTimeVersion upperTimeStamp = new DateTimeVersion(DateTime.Now);
VersionSpec version = new VersionRange(lowerTimeStamp, upperTimeStamp);
FileSpec[] fileSpecs = { new FileSpec(path, version) };
ChangesCmdOptions changeListOptions = new ChangesCmdOptions(ChangesCmdFlags.FullDescription | ChangesCmdFlags.IncludeTime, null, 0, ChangeListStatus.None, null);
IList<Changelist> changes = m_Repository.GetChangelists(changeListOptions, fileSpecs);
Alright, after a couple more hours of digging, I have found that there is a way to feed the actual command line parameters to the command. You create a DepotPath, and then something like this is working for me to restrict the time range for CLs retrieved from the server:
ChangesCmdOptions changeListOptions = new ChangesCmdOptions(ChangesCmdFlags.FullDescription|ChangesCmdFlags.IncludeTime, null, 0, ChangeListStatus.None, null);
FileSpec[] fileSpecs = new FileSpec[1] { new FileSpec(new DepotPath("//depot/...#2016/12/05 21:57:30,#now"), null, null, null) };
IList<Changelist> changes = m_Repository.GetChangelists(changeListOptions, fileSpecs);
All this might be "indulgent smile" old news to people who've worked with the API for a while. It's just all a bit confusing to newcomers when documentation like the two pages mentioned in this post ("FileSpec object docs", "SyncFiles method docs") is offline now: Perforce Api - How to command "get revision [changelist number]"

One RFC call is returning data, but another is not

I'm using two different BAPIs to get data from SAP. When I use BAPI_SALESORDER_GETLIST, it works perfectly. But when I use BAPI_BILLINGDOC_GETLIST, I get no data. This is how I'm trying to call the BAPI:
DataTable table = null;
SapConfig cfg = new SapConfig();
if (RfcDestinationManager.TryGetDestination("SAP") == null)
    RfcDestinationManager.RegisterDestinationConfiguration(cfg);
RfcDestination dest = RfcDestinationManager.GetDestination("SAP");
RfcRepository repo = dest.Repository;
IRfcFunction fnc = repo.CreateFunction("BAPI_BILLINGDOC_GETLIST");
IRfcStructure param = fnc.GetStructure("REFDOCRANGE");
param.SetValue("SIGN", "I");
param.SetValue("OPTION", "EQ");
param.SetValue("REF_DOC_LOW", salesOrderNumber);
param.SetValue("REF_DOC_HIGH", "");
fnc.Invoke(dest);
table = fnc.GetTable("BILLINGDOCUMENTDETAIL").ToDataTable();
return table;
As far as I can tell, it all looks right. I got together with the SAP team and they made sure the account I'm using has access to everything; we ran the BAPI in SAP and it worked fine.
So SAP seems fine. Any ideas on what I'm doing wrong here?
Please check if the parameter value is correct. Sometimes SAP function modules expect document numbers to be submitted with leading zeros. You can test function modules in transaction SE37 to check your parameters.
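If leading zeros turn out to be the problem, padding the value before passing it to the BAPI is usually enough. A minimal sketch, assuming the common 10-character SAP document number format (salesOrderNumber is the variable from the question):
// SAP commonly stores sales document numbers (VBELN) as 10-character fields
// padded with leading zeros, e.g. "12345" becomes "0000012345"
string paddedNumber = salesOrderNumber.PadLeft(10, '0');
param.SetValue("REF_DOC_LOW", paddedNumber);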

SQL query in Crystal Reports doesn't update

I'm writing an application which changes Crystal Reports database access parameters in report files. I open reports within a .NET Windows Forms app and apply the SDK functionality to change the driver type (ODBC/OLEDB), server name, database name, user, password, authentication type, etc. I'm having a problem with the database name. My code DOES change the specific properties of the table ConnectionInfo (in subreports too) but fails to update the general SQL query within the report. This results in the report still accessing the old database.
So if the original report was configured to access database_1 and I'm changing it to database_2, it will have all table properties properly changed to database_2 (verifiable in the Designer). It will still have database_1 in the query though. The database name remains unchanged in both the SDK RowsetController.GetSQLStatement() result and in the Crystal Reports Developer query view (Database->Show SQL Query...).
Also, I have to have both databases (database_1 and database_2) online while the conversion takes place, otherwise I get exceptions on either GetSQLStatement (when database_1 is offline, because the query still refers to it) or SetTableLocation (when database_2 is offline; this is expected and acceptable behavior though). If both DBs are online, there are no errors.
Here is exactly what I'm using:
1) CrystalDecisions.CrystalReports.Engine.ReportDocument.Load(filePath, OpenReportMethod.OpenReportByTempCopy)
(...)
2) Make and fill CrystalDecisions.ReportAppServer.DataDefModel.PropertyBag
3) Iterate through CrystalDecisions.ReportAppServer.DataDefModel.Tables and apply all properties with SetTableLocation() for each one.
4) Repeat with each subreport
5) RowsetController.GetSQLStatement() to view the report's SQL query.
Is there some way to update the query based on the new table ConnectionInfos (which seem to be set properly)? I don't even see any possibility of manually updating the query (GET, search & replace, SET).
I'm using:
.NET 4.5,
Visual Studio 2012,
CR for VS 13.0.5,
Crystal Reports Developer 9.2.2.693 for results verification (source reports are also created with it)
Answer: set a proper QualifiedName for each table. The QualifiedName is the full name of the table including the DB name, and it is what later appears in the SQL query of the report. By qualified name we mean:
myDatabase.mySchema.myTableName
Code example:
CrystalDecisions.ReportAppServer.DataDefModel.Table boTable = new CrystalDecisions.ReportAppServer.DataDefModel.Table();
CrystalDecisions.ReportAppServer.DataDefModel.PropertyBag boMainPropertyBag = new CrystalDecisions.ReportAppServer.DataDefModel.PropertyBag();
CrystalDecisions.ReportAppServer.DataDefModel.PropertyBag boInnerPropertyBag = new CrystalDecisions.ReportAppServer.DataDefModel.PropertyBag();
// Custom function to fill property bags with values which influence the table properties as seen in CR Developer
FillPropertyBags(boMainPropertyBag, boInnerPropertyBag);
CrystalDecisions.ReportAppServer.DataDefModel.ConnectionInfo boConnectionInfo = new CrystalDecisions.ReportAppServer.DataDefModel.ConnectionInfo();
boConnectionInfo.Attributes = boMainPropertyBag;
boConnectionInfo.Kind = CrystalDecisions.ReportAppServer.DataDefModel.CrConnectionInfoKindEnum.crConnectionInfoKindCRQE;
boTable.ConnectionInfo = boConnectionInfo;
CrystalDecisions.ReportAppServer.DataDefModel.Tables boTables = boReportDocument.ReportClientDocument.DatabaseController.Database.Tables;
for (int i = 0; i < boTables.Count; i++)
{
    boTable.Name = boTables[i].Name;
    // the QualifiedName is directly taken into the CR general query so this is a quick fix to change it
    boTable.QualifiedName = boTables[i].QualifiedName.Replace("oldDbName", "newDbName");
    boTable.Alias = boTables[i].Alias;
    boReportDocument.ReportClientDocument.DatabaseController.SetTableLocation(boTables[i], boTable);
}
Uhh... I researched this for a whole day and found the answer right after publishing the question on SO.

SQL Server CE 4.0 - DbUpdate Exception was unhandled

I have been struggling with SQL Server CE 4.0 almost all day. I get this error when I try to save a new record (entity) to my database.
First a little background: I installed SQL Server CE 4.0 via NuGet and created a database for my project. Then I created the Entity Framework model layer and started working with that.
When I get this error, the inner exception says this:
The column cannot be modified. [ Column name = Id ]
I did a little research and found out that this may be caused by my settings for the Id property, so I changed them like this:
I already have 17 records that I inserted manually for testing purposes, but I doubt that causes any problem.
So what I try to do is this:
public override void Save()
{
    using (RalBaseEntities ctx = new RalBaseEntities())
    {
        System.Data.Entity.DbSet<MainInfo> mainInfoEntity = ctx.Set<MainInfo>();
        MainInfo entity = new MainInfo();
        entity.Manager = txtManager.Text;
        entity.Broker = txtBroker.Text;
        mainInfoEntity.Add(entity);
        ctx.SaveChanges();
    }
}
So when I try to execute the Save() method I get the error above. I read that in previous versions one had to create the Id manually, but that's supposed to be fixed in v4.0, and judging by the settings that I show as an image here I don't see any reason why I shouldn't just get a new record with an auto-generated unique Id.
The word Update in the DbUpdateException I get is worrying me a bit. Maybe I'm trying to save the data in a wrong way, but I spent a lot of time googling and it seems to be the right way.
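For context, the Id settings referred to above usually boil down to marking the column as database-generated. A hypothetical sketch of that in attribute-based EF terms (the model in the question comes from a designer, where the corresponding setting is typically StoreGeneratedPattern = Identity):
using System.ComponentModel.DataAnnotations;
using System.ComponentModel.DataAnnotations.Schema;

public class MainInfo
{
    // Hypothetical sketch: marking Id as database-generated (identity) tells EF
    // not to send a value for the Id column on INSERT.
    [Key]
    [DatabaseGenerated(DatabaseGeneratedOption.Identity)]
    public int Id { get; set; }

    public string Manager { get; set; }
    public string Broker { get; set; }
}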

subsonic in visual studio design host

I'm currently facing a problem regarding SubSonic configuration.
What I want to achieve is using SubSonic data access in a System.Web.UI.Design.ControlDesigner class.
This class is hosted in the Visual Studio environment and enables design-time operations on the attached System.Web.UI.WebControls.Control.
The only problem is that SubSonic always seems to look for a SubSonicSection in the application configuration, regardless of the connection string passed to it.
The relevant code snippet:
using (SharedDbConnectionScope dbScope = new SharedDbConnectionScope(new SqlDataProvider(), ConnectionString))
{
    Table1 _table1 = new Select().From<...>().Where(...).IsEqualTo(...).ExecuteSingle<...>();
This throws an exception on the ExecuteSingle() method (configuration section was not found), while
using (SharedDbConnectionScope dbScope = new SharedDbConnectionScope(ConnectionString))
{
This throws an exception on new SharedDbConnectionScope() (configuration section was not found).
So the question is:
Is there any way to pass the settings at runtime and bypass the configuration section lookup, as I don't want to add any SubSonic-specific configuration to the devenv configuration?
Thanks
I don't think you can do this in 2.x without customising the templates (which can obviously give support issues when a newer version of SubSonic is released).
Sorry, don't know about 3.0
I'm assuming you're using SubSonic 2.x based on your query syntax. Have a look at the following two forum posts, which should point you in the right direction. What you're trying to do is possible; in fact, SubCommander does exactly this. Download the source and have a look at the SetProviderManually() method.
http://forums.subsonicproject.com/forums/t/1617.aspx
http://forums.subsonicproject.com/forums/t/1502.aspx
The method to configure the SubSonic provider at runtime (example):
private void SetSubsonicProviderManually(string ConnectionString)
{
    //clear the providers and reset
    DataService.Provider = new SqlDataProvider();
    DataService.Providers = new DataProviderCollection();
    //instance a section - we'll set this manually for the DataService
    SubSonicSection section = new SubSonicSection();
    section.DefaultProvider = __SubsonicProviderName__;
    //set the properties
    DataProvider provider = DataService.Provider;
    NameValueCollection config = new NameValueCollection();
    //need to add this for now
    config.Add("connectionStringName", __ConnectionString__);
    //initialize the provider
    provider.Initialize(__SubsonicProviderName__, config);
    provider.DefaultConnectionString = ConnectionString;
    DataService.Providers.Add(provider);
}
