I'm writing an application which changes Crystal Reports database access parameters in report files. I open reports within a .NET Windows Forms app and apply the SDK functionality to change the driver type (ODBC/OLEDB), server name, database name, user, password, authentication type, etc. I'm having a problem with the database name. My code DOES change the specific properties of the table ConnectionInfo (in subreports too) but fails to update the general SQL query within the report. This results in the report still accessing the old database.
So if the original report was configured to access database_1 and I change it to database_2, all table properties are properly changed to database_2 (verifiable in the Designer), but the query still contains database_1. The database name remains unchanged both in the SDK RowsetController.GetSQLStatement() result and in the Crystal Reports Developer query view (Database->Show SQL Query...).
Also, I have to have both databases (database_1 and database_2) online while the conversion takes place, otherwise I get exceptions from either GetSQLStatement (when database_1 is offline, because the query still refers to it) or SetTableLocation (when database_2 is offline; this is expected and acceptable behavior, though). If both databases are online, there are no errors.
Here is exactly what I'm using:
1) CrystalDecisions.CrystalReports.Engine.ReportDocument.Load(filePath, OpenReportMethod.OpenReportByTempCopy)
(...)
2) Make and fill CrystalDecisions.ReportAppServer.DataDefModel.PropertyBag
3) Iterate through CrystalDecisions.ReportAppServer.DataDefModel.Tables and apply all properties with SetTableLocation() for each one.
4) Repeat with each subreport
5) RowsetController.GetSQLStatement() to view the report's SQL query.
Is there some way to update the query based on the new table ConnectionInfos (which seem to be set properly)? I don't even see any possibility of manually updating the query (GET, search & replace, SET).
I'm using:
.NET 4.5,
Visual Studio 2012,
CR for VS 13.0.5,
Crystal Reports Developer 9.2.2.693 for results verification (source reports are also created with it)
Answer: set the proper QualifiedName for each table. The QualifiedName is the full name of the table including the database name, and it is what later appears in the SQL query of the report. By qualified name we mean:
myDatabase.mySchema.myTableName
Code example:
CrystalDecisions.ReportAppServer.DataDefModel.Table boTable = new CrystalDecisions.ReportAppServer.DataDefModel.Table();
CrystalDecisions.ReportAppServer.DataDefModel.PropertyBag boMainPropertyBag = new CrystalDecisions.ReportAppServer.DataDefModel.PropertyBag();
CrystalDecisions.ReportAppServer.DataDefModel.PropertyBag boInnerPropertyBag = new CrystalDecisions.ReportAppServer.DataDefModel.PropertyBag();
// Custom function to fill property bags with values which influence the table properties as seen in CR Developer
FillPropertyBags(boMainPropertyBag, boInnerPropertyBag);
CrystalDecisions.ReportAppServer.DataDefModel.ConnectionInfo boConnectionInfo = new CrystalDecisions.ReportAppServer.DataDefModel.ConnectionInfo();
boConnectionInfo.Attributes = boMainPropertyBag;
boConnectionInfo.Kind = CrystalDecisions.ReportAppServer.DataDefModel.CrConnectionInfoKindEnum.crConnectionInfoKindCRQE;
boTable.ConnectionInfo = boConnectionInfo;
CrystalDecisions.ReportAppServer.DataDefModel.Tables boTables = boReportDocument.ReportClientDocument.DatabaseController.Database.Tables;
for (int i = 0; i < boTables.Count; i++)
{
    boTable.Name = boTables[i].Name;
    // the QualifiedName is taken directly into the CR general query, so this is a quick fix to change it
    boTable.QualifiedName = boTables[i].QualifiedName.Replace("oldDbName", "newDbName");
    boTable.Alias = boTables[i].Alias;
    boReportDocument.ReportClientDocument.DatabaseController.SetTableLocation(boTables[i], boTable);
}
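The same fix has to be repeated for each subreport (step 4 above). A minimal sketch, assuming the RAS SDK's SubreportController behaves here the way DatabaseController does for the main report (I haven't re-verified every overload), and reusing the boTable and property bags set up above:

// apply the same QualifiedName fix to every subreport
foreach (string subreportName in boReportDocument.ReportClientDocument.SubreportController.GetSubreportNames())
{
    CrystalDecisions.ReportAppServer.DataDefModel.Tables boSubTables =
        boReportDocument.ReportClientDocument.SubreportController.GetSubreportDatabase(subreportName).Tables;
    for (int i = 0; i < boSubTables.Count; i++)
    {
        boTable.Name = boSubTables[i].Name;
        boTable.QualifiedName = boSubTables[i].QualifiedName.Replace("oldDbName", "newDbName");
        boTable.Alias = boSubTables[i].Alias;
        boReportDocument.ReportClientDocument.SubreportController.SetTableLocation(subreportName, boSubTables[i], boTable);
    }
}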
Uhh... I researched this for a whole day and then found the answer right after publishing the question on SO.
Related
I have been given a task to use the TFS API to check which build has which changeset number, after deployment. I haven't worked with TFS before, so mainly I've been trying to Google things to find the answer. I've been at it for two days now, so I'm hoping someone can nudge me in the right direction...
Here is what I have done so far:
Uri collectionUri = new Uri("mytfs/tfs/");
var server = TfsConfigurationServerFactory.GetConfigurationServer(collectionUri);
server.Authenticate();
server.EnsureAuthenticated();
var service = server.GetService<TswaClientHyperlinkService>();
var projectCollection = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(new Uri("http://mytfs/tfs/collection"));
var cssService = projectCollection.GetService<ICommonStructureService3>();
var project = cssService.GetProjectFromName("project");
WorkItemStore workItemStore = projectCollection.GetService<WorkItemStore>();
WorkItemCollection workItemCollection = workItemStore.Query("SELECT * FROM WorkItems");
With the workItemCollection object I tried a few queries, but it seems it doesn't allow me to change database, use joins, etc.; only a simple select/from statement works.
Am I on the right track - is this how I should be getting build and changeset numbers? If yes, where can I see what tables I need to query?
The problem here is that you're thinking of this as a database. It's not a database. It's an object model that allows you to programmatically access various aspects of TFS through a well-defined API.
Work item queries are not SQL, they are WIQL (work item query language). The work item object will definitely have a link to the associated changeset, but it won't have a link to a build. Some work item types have a field for "fixed in" that will be automatically updated with the build, but not all of them, so it's not necessarily reliable.
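For example, a minimal WIQL query against the store from the question (the project name is a placeholder):

// WIQL, not SQL: explicit system fields, no joins, no switching databases
WorkItemCollection bugs = workItemStore.Query(
    "SELECT [System.Id], [System.Title], [System.State] " +
    "FROM WorkItems " +
    "WHERE [System.TeamProject] = 'project' AND [System.WorkItemType] = 'Bug'");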
To find particular builds, you'll need to use the IBuildServer service and query builds with a build spec, for example:
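A minimal sketch, assuming Microsoft.TeamFoundation.Build.Client is referenced and projectCollection is the collection object from the question:

using Microsoft.TeamFoundation.Build.Client;

IBuildServer buildServer = projectCollection.GetService<IBuildServer>();
// a spec that matches every build definition in the team project
IBuildDetailSpec spec = buildServer.CreateBuildDetailSpec("project");
foreach (IBuildDetail build in buildServer.QueryBuilds(spec).Builds)
{
    // SourceGetVersion holds the source version the build ran against,
    // e.g. "C12345" for changeset 12345
    Console.WriteLine("{0} -> {1}", build.BuildNumber, build.SourceGetVersion);
}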
I have been struggling with SQL Server CE 4.0 almost all day. I get this error when I try to save a new record (entity) to my database.
First a little preview - I installed SQL Server CE 4.0 via NuGet and I created a database for my project. Then I created the Entity Framework model layer and started working with that.
When I get this error, the inner exception says this:
The column cannot be modified. [ Column name = Id ]
I did a little research and found out that this may be caused by my settings for the Id property, so I changed those settings:
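(The original screenshot isn't included here. A minimal sketch of what such an Id setting corresponds to in code-first terms, assuming it was StoreGeneratedPattern = Identity; the entity shape below is hypothetical:)

using System.ComponentModel.DataAnnotations;
using System.ComponentModel.DataAnnotations.Schema;

public class MainInfo
{
    // EDMX equivalent: StoreGeneratedPattern = Identity on the Id property
    // (assumption: Id is an IDENTITY column in the SQL Server CE database)
    [Key]
    [DatabaseGenerated(DatabaseGeneratedOption.Identity)]
    public int Id { get; set; }

    public string Manager { get; set; }
    public string Broker { get; set; }
}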
I already have 17 records that I inserted manually for testing purposes, but I doubt this causes any problem.
So what I try to do is this:
public override void Save()
{
    using (RalBaseEntities ctx = new RalBaseEntities())
    {
        System.Data.Entity.DbSet<MainInfo> mainInfoEntity = ctx.Set<MainInfo>();
        MainInfo entity = new MainInfo();
        entity.Manager = txtManager.Text;
        entity.Broker = txtBroker.Text;
        mainInfoEntity.Add(entity);
        ctx.SaveChanges();
    }
}
So when I try to execute the Save() method I get the error above. I read that in previous versions one had to create the Id manually, but that this was fixed in v4.0, and judging by the settings shown in the image I don't see any reason not to get a new record with an auto-generated unique Id.
The word Update in the DbUpdateException I get worries me a bit. Maybe I'm saving the data the wrong way, but I've spent a lot of time googling and this seems to be the right way.
So, I have a Sync Fx C# project which connects to two databases. I am currently adding provisioning for both, but before I provision, I deprovision first, as seen in the code:
SqlSyncScopeProvisioning sqlAzureProv = new SqlSyncScopeProvisioning(sqlAzureConn, myScope);
SqlSyncScopeDeprovisioning sqlAzureDeprov = new SqlSyncScopeDeprovisioning(sqlAzureConn);
sqlAzureDeprov.DeprovisionStore();
sqlAzureProv.Apply();
The problem is, every time I call the Apply() method, it throws the error:
The column 'local_update_peer_timestamp' was specified multiple times
for 'changes'.
I haven't used any column named local_update_peer_timestamp in my database; I've checked everything.
It only happens when reprovisioning; a fresh provision doesn't return any errors.
Any ideas? Thanks.
Just putting the actual issue here:
When populating the scope description, make sure you don't include the tables created by Sync Fx (the _tracking tables and the scope_xxx tables) in the sync scope description.
This normally happens when you just scan all tables in the database, loop through them, and add every one of them to the sync scope description, as in the sketch below.
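A minimal filtering sketch (the name patterns are the conventional Sync Fx metadata tables; myTableNames is a hypothetical list of the table names you scanned, and scopeDesc/sqlAzureConn come from the question's setup):

foreach (string tableName in myTableNames)
{
    // skip the metadata tables Sync Fx creates; they must not go into the scope
    if (tableName.EndsWith("_tracking") ||
        tableName.StartsWith("scope_") ||
        tableName == "schema_info")
    {
        continue;
    }
    scopeDesc.Tables.Add(
        SqlSyncDescriptionBuilder.GetDescriptionForTable(tableName, sqlAzureConn));
}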
Below is the basic structure of provisioning using a scope (filter); if you don't need the filter, just take the parts you need:

// define a new scope named ProductsScope
DbSyncScopeDescription scopeDesc = new DbSyncScopeDescription("ProductsScope");
// get the description of the Products table from the SyncDB database
DbSyncTableDescription tableDesc = SqlSyncDescriptionBuilder.GetDescriptionForTable("Products", serverConn);
// add the table description to the sync scope definition
scopeDesc.Tables.Add(tableDesc);
// create a server scope provisioning object based on the ProductsScope
SqlSyncScopeProvisioning serverProvision = new SqlSyncScopeProvisioning(serverConn, scopeDesc);
// skipping table creation since the tables already exist on the server
serverProvision.SetCreateTableDefault(DbSyncCreationOption.Skip);
// start the provisioning process
serverProvision.Apply();
I've searched all over and now I have to ask SO. I'm trying to construct a simple dataflow using EzAPI. It's been anything but easy, but I'm committed to figuring this out. What I can't figure out is how to get the EzOleDbDestination working. Here's my complete code:
var a = new Application();
// using a template since it's impossible to set up an ADO.NET connection to MySQL
// using EzAPI and potentially even with the raw SSIS API...
var pkg = new EzPackage(a.LoadPackage(@"C:\...\Package.dtsx", null));
pkg.Name = "Star";
var df = new EzDataFlow(pkg);
df.Name = "My DataFlow";
var src = new EzAdoNetSource(df);
src.Name = "Source Database";
src.SqlCommand = "SELECT * FROM enum_institution";
src.AccessMode = AccessMode.AM_SQLCOMMAND;
src.Connection = new EzConnectionManager(pkg, pkg.Connections["SourceDB"]);
src.ReinitializeMetaData();
var derived = new EzDerivedColumn(df);
derived.AttachTo(src);
derived.Name = "Prepare Dimension Attributes";
derived.LinkAllInputsToOutputs();
derived.Expression["SourceNumber"] = "id";
derived.Expression["Name"] = "(DT_STR,255,1252)description";
// EDIT: reordered the operation here and I no longer get an error, but
// I'm not getting any mappings or any input columns when I open the package in the designer
var dest = new EzOleDbDestination(df);
dest.AttachTo(derived, 0, 0);
dest.Name = "Target Database";
dest.AccessMode = 0;
dest.Table = "[dbo].[DimInstitution]";
dest.Connection = new EzConnectionManager(pkg, pkg.Connections["TargetDB"]);
// this comes from Yahia's link
var destInput = dest.Meta.InputCollection[0];
var destVirInput = destInput.GetVirtualInput();
var destInputCols = destInput.InputColumnCollection;
var destExtCols = destInput.ExternalMetadataColumnCollection;
var sourceColumns = derived.Meta.OutputCollection[0].OutputColumnCollection;
foreach(IDTSOutputColumn100 outputCol in sourceColumns) {
    // Now getting COM Exception here...
    var extCol = destExtCols[outputCol.Name];
    if(extCol != null) {
        // Create an input column from an output col of previous component.
        destVirInput.SetUsageType(outputCol.ID, DTSUsageType.UT_READONLY);
        var inputCol = destInputCols.GetInputColumnByLineageID(outputCol.ID);
        if(inputCol != null) {
            // map the input column with an external metadata column
            dest.Comp.MapInputColumn(destInput.ID, inputCol.ID, extCol.ID);
        }
    }
}
Basically, anything that involves calls to ReinitializeMetadata() results in 0xC0090001, because that method is where the error happens. There's no real documentation to help me, so I have to rely on any gurus here.
I should mention that the source DB is MySQL and the target DB is SQL Server. Building packages like this using the SSIS designer works fine, so I know it's possible.
Feel free to tell me if I'm doing anything else wrong.
EDIT: here's a link to the base package I'm using as a template: http://www.filedropper.com/package_1 . I've redacted the connection details, but any MySQL and SQL Server database will do. The package will read from MySQL (using the MySQL ADO.NET Connector) and write to SQL Server.
The database schema is mostly irrelevant. For testing, just make a table in MySQL that has two columns: id (int) and description (varchar), with id being the primary key. Make equivalent columns in SQL Server. The goal here is simply to copy from one to the other. It may end up being more complex at some point, but I have to get past this hurdle first.
I can't test this now BUT I am rather sure that the following will help you get it working:
Calling ReinitializeMetadata() causes the component to fetch the table metadata. This should only be called after setting the AccessMode and related properties. You are calling it before setting AccessMode... (see the reordering sketch after this list)
Various samples including advice on debugging problems
define the derived column(s) directly in the SQL command instead of using an EzDerivedColumn
try to get it working with two SQL Server DBs first; some of the available MySQL ADO.NET providers have shortcomings under some circumstances
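A minimal reordering sketch using the question's own source component (assumption: everything else stays as posted):

var src = new EzAdoNetSource(df);
src.Name = "Source Database";
src.Connection = new EzConnectionManager(pkg, pkg.Connections["SourceDB"]);
// set the access mode and the command *before* asking the component for metadata
src.AccessMode = AccessMode.AM_SQLCOMMAND;
src.SqlCommand = "SELECT * FROM enum_institution";
src.ReinitializeMetaData(); // only now fetch the table metadata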
UPDATE - as per comments some more information on debugging this and a link to a complete end-to-end sample with source:
http://blogs.msdn.com/b/mattm/archive/2009/08/03/looking-up-ssis-hresult-comexception-errorcode.aspx
http://blogs.msdn.com/b/mattm/archive/2009/08/03/debugging-a-comexception-during-package-generation.aspx
Complete working sample with source
I've had this exact same issue and was able to resolve it with a lot of experimentation. In short, you must set the connection for both the source and the destination, and then call AttachTo after both connections are set. You must call AttachTo for every component (a sketch follows below).
I've written a blog about starting with an SSIS package as a template, and then manipulating it programmatically to produce a set of new packages.
The article explains the issue more.
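A sketch of that ordering, reusing the names from the question (assumption: pkg, df, and the connection managers are set up as in the question's code):

var src = new EzAdoNetSource(df);
var derived = new EzDerivedColumn(df);
var dest = new EzOleDbDestination(df);
// set both connections first...
src.Connection = new EzConnectionManager(pkg, pkg.Connections["SourceDB"]);
dest.Connection = new EzConnectionManager(pkg, pkg.Connections["TargetDB"]);
// ...and only then wire the components together
derived.AttachTo(src);
dest.AttachTo(derived, 0, 0);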
I've built an ASP.NET application to generate reports over particular data. Initially I created local reports (.rdlc). I created a separate .xsd for each RDLC and designed the reports. I build the dataset programmatically and bind it to the RDLC. I used the following code for binding the datasource to the reports:
rptMyReport.LocalReport.ReportPath = Server.MapPath(srdlcName);
rptMyReport.LocalReport.DataSources.Add(rds);
Now I have converted all the RDLCs to RDL following this MSDN article, and I've published the reports to the report server.
rptMyReport.ServerReport.ReportServerUrl = new System.Uri("http://ReportServer/ReportServer");
rptMyReport.ServerReport.ReportPath = "/ReportFolder/ReportName";
Now how can I set the datasource for the reports programmatically?
This is impossible. You should create your datasource in your RDL report: you must write the queries needed to gather the report data. The query can be text or a stored procedure, and you can pass parameters to it and filter its output.
You can only pass the parameters to the RDL report, like this:
ReportParameter[] Params = new ReportParameter[1];
Params[0] = new ReportParameter("ParameterName", "Parameter Value"); // "ParameterName" is a placeholder
ReportViewerControl.ServerReport.SetParameters(Params);
What are you trying to do? I believe the datasource is specified within the .RDL file itself.
For example, when you create a report using BIDS, you can specify the datasource, and it gets added to the .RDL file. The same concept holds true here as well.
This is not impossible, but may take a little effort if this is what you want to do.
You have a couple of options:
If you create a shared datasource on your report server you can add it manually using the RDLObjectModel. Get the shared datasource name and GUID from your report server and you can add it to your report.
Example:
'create the datasource for the report
Dim dataSrcRFoo = New RdlObjectModel.DataSource
dataSrcRFoo.Name = "DataSourceName"
dataSrcRFoo.DataSourceReference = "/path/to/DataSource"
dataSrcRFoo.IsShared = True
dataSrcRFoo.SecurityType = 2 ' RdlObjectModel.SecurityTypeEnum.DataBase
dataSrcRFoo.DataSourceID = New Guid("shareddatasourceguid")
'add data source to report
rdlRpt.DataSources.Add(dataSrcRFoo)
Another option is to use templates on the server that have the shared (or report-level) datasource built in.