Creating and populating an Excel sheet with late binding - C#

I've been looking at the Microsoft support page Binding for Office automation servers with Visual C# .NET to try to create an Excel Worksheet, populate it with values from a DataTable, and save it to the machine.
I have an implementation that uses early binding and simply loops through the items, but I don't know how you would achieve this with late binding, and I need to be able to embed the Interop types to make the application version-independent with regard to MS Office.
How would I add the rows from a datatable to a new Excel Worksheet using late-binding?

I would recommend writing an interface and abstracting both the data-population step and the Excel step. That way you have one class that implements the interface using early binding against Excel, and an engine that uses the interface to populate the spreadsheet. Step 2 would be to write a second implementation of the interface using late binding rather than early binding. Then you just substitute the second implementation for the first at the point in your code where you create the instance.
In the code you only ever declare one variable, typed as the interface itself. When creating it, though, you can assign it any class/implementation that implements that interface. Here's an example from my own code:
ISpreadsheetControl SSInterface;

if (conditionCheck())
    SSInterface = new ExcelImplementer();
else
    SSInterface = new OpenOfficeImplementer();
I only ever use that one object, SSInterface, when placing data, changing page settings, and whatever else I implemented, but it can do so in two different manners based on which class I assigned to the interface variable at load time.
As for the specifics and details of the "how to": I find the second example in the link you provided very helpful indeed. It's all about Type and InvokeMember. The difficulty will be keeping track of what you are working with at any given time; that is one of the things that makes late binding harder to work with, and a good reason to extract the early-binding implementation first. That way you can see all the method names and parameter lists you will need when writing the second.
I also want to add this: the very simple and short answer to your question is "do it exactly the same way you already are." You just change how you call the methods that populate the data, and all the rest of the Excel interop implementation along with them.
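To make the shape of that abstraction a little more concrete, here is a minimal sketch; the member names are only illustrative guesses, not a prescribed API, and your interface would expose whatever operations your engine actually needs:

// Minimal sketch only - the member names are placeholders.
public interface ISpreadsheetControl
{
    void CreateWorkbook();
    void SetCellValue(int row, int column, object value);
    void SaveAs(string path);
    void Close();
}

// The early-bound implementer calls the interop types directly;
// a late-bound implementer does the same work through Type.InvokeMember.
public class ExcelImplementer : ISpreadsheetControl
{
    public void CreateWorkbook() { /* early-binding interop calls */ }
    public void SetCellValue(int row, int column, object value) { /* ... */ }
    public void SaveAs(string path) { /* ... */ }
    public void Close() { /* ... */ }
}

The engine that loops over your DataTable only ever sees ISpreadsheetControl, so swapping the early-bound implementer for a late-bound one is a one-line change.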
UPDATE
I think this might do what you are looking for, although it's messy enough that I would recommend putting it (both operations, actually; one can call the other) into its own separate method somewhere.
//Get a range object that contains the cell.
Parameters = new Object[2];
Parameters[0] = iRow + 1;
Parameters[1] = iCol;
objRange_Late = objSheet_Late.GetType().InvokeMember("Cells",
    BindingFlags.GetProperty, null, objSheet_Late, Parameters);

//Write value in cell
Parameters = new Object[1];
Parameters[0] = row[col.ColumnName];
objRange_Late.GetType().InvokeMember("Value", BindingFlags.SetProperty,
    null, objRange_Late, Parameters);
I have to admit, I don't have an implementation that I can test this on right now, but according to the things I know about it, that should work. If "Cells" doesn't work, I would also try the same code with "Range"... I don't actually know whether that one takes numerical input or not.
link to Cells property description (msdn)
You might also want to explore that whole area of the documentation a bit; it can help you find some of the things you might be looking for.
Tested: I managed to successfully create and run the above code, and it works perfectly.
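For context, here is a rough, untested sketch of how that snippet could sit inside a loop over a DataTable. It assumes objSheet_Late was obtained via late binding exactly as in the Microsoft sample, that dataTable holds the rows to export, and that System.Reflection and System.Data are imported:

// Rough sketch, assuming objSheet_Late is the late-bound worksheet object
// from the Microsoft sample and dataTable is the System.Data.DataTable to export.
Object[] Parameters;
Object objRange_Late;

for (int iRow = 0; iRow < dataTable.Rows.Count; iRow++)
{
    DataRow row = dataTable.Rows[iRow];
    int iCol = 1;                                    // Excel cells are 1-based
    foreach (DataColumn col in dataTable.Columns)
    {
        // Get a range object for the target cell ("+ 1" converts the 0-based
        // DataTable row index to Excel's 1-based rows).
        Parameters = new Object[2];
        Parameters[0] = iRow + 1;
        Parameters[1] = iCol;
        objRange_Late = objSheet_Late.GetType().InvokeMember("Cells",
            BindingFlags.GetProperty, null, objSheet_Late, Parameters);

        // Write the value into the cell.
        Parameters = new Object[1];
        Parameters[0] = row[col.ColumnName];
        objRange_Late.GetType().InvokeMember("Value", BindingFlags.SetProperty,
            null, objRange_Late, Parameters);

        iCol++;
    }
}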

Related

C# custom file parsing with 2 delimiters and different record types

I have a (not quite valid) CSV file that contains rows of multiple types. Any record could be one of about 6 different types and each type has a different number of properties. The first part of any row contains the timestamp and the type of record, followed by a standard CSV of the data.
Example
1456057920 PERSON, Ted Danson, 123 Fake Street, 555-123-3214, blah
1476195120 PLACE, Detroit, Michigan, 12345
1440581532 THING, Bucket, Has holes, Not a good bucket
And to make matters more complex, I need to be able to do different things with the records depending on certain criteria. So a PERSON type can be automatically inserted into a DB without user input, but a THING type would be displayed on screen for the user to review and approve before adding to DB and continuing the parse, etc.
Normally, I would use a library like CsvHelper to map the records to a type, but in this case, since the types could be different and the first part uses a space instead of a comma, I don't know how to do that with a standard CSV library. So currently, on each loop, what I am doing is:
String split based off comma.
Split the first array item by the space.
Use a switch statement to determine the type and create the object.
Put that object into a List of type object.
Get confused as to where to go next, because I now have a list of various types and will have to use yet another switch or if to determine the next steps.
I don't really know for sure if I will actually need that List but I have a feeling the user will want the ability to manually flip through records in the file.
By this point, this is starting to make for very long, confusing code, and my gut feeling tells me there has to be a cleaner way to do this. I thought maybe using Type.GetType(string) would help simplify the code some, but this seems like it might be terribly inefficient in a loop with 10k+ records and might make things even more confusing. I then thought maybe making some interfaces might help, but I'm not the greatest at using interfaces in this context and I seem to end up in about this same situation.
So what would be a more manageable way to parse this file? Are there any C# parsing libraries out there that would be able to handle something like this?
You can implement an IRecord interface that has a Timestamp property and a Process method (perhaps others as well).
Then, implement concrete types for each type of record.
Use a switch statement to determine the type and create and populate the correct concrete type.
Place each object in a List<IRecord> (a rough sketch of this shape follows the examples below).
After that you can do whatever you need. Some examples:
Loop through each item and call Process() to handle it.
Use LINQ .OfType<{concrete type}> to segment the list. (Warning: with 10k records this would be slow, since it would traverse the entire list for each concrete type.)
Use an overridden ToString method to give a single text representation of the IRecord
If using WPF, you can define a datatype template for each concrete type, bind an ItemsControl derivative to a collection of IRecords and your "detail" display (e.g. ListItem or separate ContentControl) will automagically display the item using the correct DataTemplate
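Here is a bare-bones sketch of that shape; the type and member names are only illustrative, and the Unix-timestamp conversion assumes .NET 4.6+ for DateTimeOffset.FromUnixTimeSeconds:

public interface IRecord
{
    DateTime Timestamp { get; }
    void Process();    // e.g. insert into DB, or queue for user review
}

public class PersonRecord : IRecord
{
    public DateTime Timestamp { get; set; }
    public string Name { get; set; }
    public string Address { get; set; }
    public string Phone { get; set; }
    public void Process() { /* insert into DB without user input */ }
}

// Parsing: split off the "timestamp TYPE" prefix, then switch on the type token.
public static IRecord ParseLine(string line)
{
    string[] fields = line.Split(',');
    string[] prefix = fields[0].Split(' ');          // e.g. "1456057920 PERSON"
    DateTime ts = DateTimeOffset.FromUnixTimeSeconds(long.Parse(prefix[0])).UtcDateTime;

    switch (prefix[1])
    {
        case "PERSON":
            return new PersonRecord
            {
                Timestamp = ts,
                Name = fields[1].Trim(),
                Address = fields[2].Trim(),
                Phone = fields[3].Trim()
            };
        // case "PLACE": ...   case "THING": ...
        default:
            throw new FormatException("Unknown record type: " + prefix[1]);
    }
}

From there, a plain foreach calling Process(), or records.OfType<PersonRecord>(), both work as described above.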
Continuing from my comment: well, that depends. What you described is actually pretty good for starters. You can of course expand it into a series of factories, one for each object type, so that you move from an explicit switch to searching for the first factory that can parse a line. That might prove useful if you are looking to add more object types in the future: you just add another factory for the new kind of object. It's up to you whether these objects should share a common interface; an interface is generally used to define a behavior, so it doesn't seem necessary here. Maybe you should rather just use a Dictionary? You need to ask yourself whether you actually need strongly typed objects here. Maybe what you need is a simple class with an ObjectType property and a Dictionary of properties, plus some helper methods for easy typed property access like GetBool, GetInt, or a generic Get?
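A rough sketch of the factory idea (the names are illustrative, it reuses the IRecord/PersonRecord types from the sketch above, and First needs System.Linq; swap in your own record representation if you prefer the loosely-typed dictionary route):

public interface IRecordFactory
{
    bool CanParse(string typeToken);                          // e.g. "PERSON"
    IRecord Parse(DateTime timestamp, string[] fields);
}

public class PersonFactory : IRecordFactory
{
    public bool CanParse(string typeToken) { return typeToken == "PERSON"; }

    // "fields" here are the comma-separated values after the "timestamp TYPE" prefix.
    public IRecord Parse(DateTime timestamp, string[] fields)
    {
        return new PersonRecord
        {
            Timestamp = timestamp,
            Name = fields[0].Trim(),
            Address = fields[1].Trim(),
            Phone = fields[2].Trim()
        };
    }
}

// Instead of a switch, pick the first factory that claims the line's type token.
// Adding a new record type then only means adding one more factory to the list.
List<IRecordFactory> factories = new List<IRecordFactory>
{
    new PersonFactory() /*, new PlaceFactory(), new ThingFactory(), ... */
};
IRecordFactory match = factories.First(f => f.CanParse(typeToken));
IRecord record = match.Parse(timestamp, fields);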

How to create the ability to apply a generic data source class

This may be something I already know how to do or have done in the past, but for some reason I am drawing a blank on how to wrap my head around it. This is for learning as much as for trying to implement something in my app.
I am using a set of third-party controls. These controls offer a lot of functionality, which is great. However, I want to be able to create a custom object that handles the logic/properties for the data source of such a control.
For example, there is a spreadsheet-like control that I am using. You supply it some data and it pulls your data in. The problem is that you need to set the columns, their data types, and other formatting/events, as well as some logic to spit the data back to the user.
List<CustomClassWithProperties> dataSource
The custom class has some properties that will be translated into the columns, like ProductName, Price, SalesDepartment, DatePurchased, etc. This can be done by supplying the spreadsheet the columns and their data types each time, but I want to create a helper class to which you just supply a data list, a visible-column list, and an editable-column list, and the data fills in without any other issues.
Using the above list, I would imagine something similar to this:
DataHelperClass dtHlpr = new DataHelperClass(List<CustomClassWithProperties> data, List<string> visibleColumns, List<string> editableColumns)
This data helper class would take the input list as the spreadsheet's data source. It would then take the visibleColumns list and use it to set the visible columns, and do the same for editableColumns.
Where I am running into a mental block (long week) is when I want to be able to reuse this. Let's say I have a List of some other class with completely different properties. I would want my data helper's constructor to be able to handle any List I send to it. Looking at whatever code I can see for the third-party controls, it appears that their data source is of type object.
Could someone point me in the right direction? I am thinking it has to do with generics and some interface implementation. I just honestly cannot think of where to start.
You can make the class itself generic:
public class DataHelperClass<T>
{
public DataHelperClass(List<T> data, ...) { ... }
}
DataHelperClass<CustomClassWithProperties> dtHlpr = new DataHelperClass<CustomClassWithProperties>(data, visibleColumns, editableColumns);
You'd then perform your reflection against typeof(T).
I'd also be tempted to use IEnumerable<T> rather than List<T> if possible, but that's a matter of preference, more or less.
This is similar to using a simple List<object>, except that it enforces that all objects in the list inherit from the same type (which might well be object), so you get some more type-checking than you otherwise would.
You mentioned interfaces; I don't see any reason to involve one here (from what you've told us, at least), but you can certainly make a generic interface via the same syntax.
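As a loose sketch of where that can go (the member names and the reflection-based column discovery are my own guesses, not tied to any particular third-party control's API):

using System.Collections.Generic;
using System.Reflection;

public class DataHelperClass<T>
{
    private readonly IEnumerable<T> _data;
    private readonly HashSet<string> _visibleColumns;
    private readonly HashSet<string> _editableColumns;

    public DataHelperClass(IEnumerable<T> data,
                           IEnumerable<string> visibleColumns,
                           IEnumerable<string> editableColumns)
    {
        _data = data;
        _visibleColumns = new HashSet<string>(visibleColumns);
        _editableColumns = new HashSet<string>(editableColumns);
    }

    // The public instance properties of T become the candidate columns.
    public IEnumerable<PropertyInfo> Columns
    {
        get { return typeof(T).GetProperties(BindingFlags.Public | BindingFlags.Instance); }
    }

    public bool IsVisible(PropertyInfo column) { return _visibleColumns.Contains(column.Name); }
    public bool IsEditable(PropertyInfo column) { return _editableColumns.Contains(column.Name); }

    // Most grid-style controls accept an object/IEnumerable data source directly.
    public object DataSource { get { return _data; } }
}

The part that actually pushes columns and formatting into the spreadsheet control would then loop over Columns, checking IsVisible/IsEditable for each property name.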

Integration Test for All References of a Method Invocation

So, I've been searching around on the internet for a bit, trying to see if someone has already invented the wheel here. What I want to do is write an integration test that will parse the current project, find all references to a certain method, find its arguments, and then check the database for each argument. For example:
public interface IContentProvider
{
ContentItem GetContentFor(string descriptor);
}
public class ContentProvider : IContentProvider
{
public virtual ContentItem GetContentFor(string descriptor)
{
// Fetches content from the database for the descriptor and returns it
}
}
Any other class will get an IContentProvider injected into their constructor using IOC, such that they could write something like:
contentProvider.GetContentFor("SomeDescriptor");
contentProvider.GetContentFor("SomeOtherDescriptor");
Basically, the test finds all these references, builds the set of strings ["SomeDescriptor", "SomeOtherDescriptor"], and then I can check the database to make sure I have rows defined for those descriptors. Furthermore, the descriptors are hard-coded.
I could make an enum value for all descriptors, but the enum would have thousands of possible options, and that seems like kind of a hack.
Now, this link on SO: How I can get all reference with Reflection + C# basically says it's impossible without some very advanced IL parsing. To clarify: I don't need Reflector or anything; this is just meant to be an automated test I can run, so that if any other developers on my team check in code that calls for this content without creating the DB record, the test will fail.
Is this possible? If so, does anyone have a resource to look at or sample code to modify?
EDIT: Alternatively, is there perhaps a different way of doing this vs. trying to find all references? The end result is that I want a test to fail when the record doesn't exist.
This will be very difficult: your program may compute the value of a descriptor, which means your test won't be able to know which values are possible without executing said code.
I would suggest changing the way you program here, by using an enum type or the type-safe enum pattern. That way each and every use of GetContentFor is safe: the argument is part of the enum, and the language's type checker performs the check.
Your test can then simply iterate over the enum's fields and check that each one is declared in your database.
Adding a new content key requires editing the enum, but that is a small inconvenience you can live with, as it helps a lot in ensuring all calls are safe.
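As a hedged sketch of what that could look like (ContentKey, ContentRepository.ContentItemExists, and the NUnit-style test attributes are placeholders; use whatever data access and test framework you already have, and ContentItem is the type from your question):

// All known descriptors live in one place; GetContentFor takes the enum
// instead of a raw string, so an unknown descriptor no longer compiles.
public enum ContentKey
{
    SomeDescriptor,
    SomeOtherDescriptor,
    // ...
}

public interface IContentProvider
{
    ContentItem GetContentFor(ContentKey key);
}

// Integration test: every enum member must have a matching row in the database.
[Test]
public void EveryContentKeyHasADatabaseRow()
{
    foreach (ContentKey key in Enum.GetValues(typeof(ContentKey)))
    {
        // ContentItemExists is a placeholder for your own data-access check.
        Assert.IsTrue(ContentRepository.ContentItemExists(key.ToString()),
            "No content row found for descriptor: " + key);
    }
}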

DataTable vs. Collection in .Net

I am writing a program that needs to read a set of records describing the register map of a device I need to communicate with. Each record will have a handful of fields that describe the properties of each register.
I don't really need to edit or modify the data in my VB or C# program, though I would like to be able to display it in a grid. I would like to store the data in a CSV file, or perhaps an XML file. I need to enable users to edit the data offline, preferably in Excel.
I am considering using a DataTable or a Collection of "Register" objects (which I would define).
I prototyped a DataTable and found I can read/write XML easily using the built-in methods and can easily bind to a DataGridView. I was not able to find a way to retrieve info on a single register without using a query that returns a collection of rows, even though I defined a unique primary key column. The syntax to get a value from a column is also complex, though I could be missing something on both counts.
I'm tempted to use a collection of Register objects that I can access via a unique key. It would be a little more coding up front, but it seems like a cleaner solution overall. I should still be able to use LINQ to DataSet to query subsets of registers when I need them, but I would also be able to grab a single field using the key value, something like this: Registers(keyValue).fieldName.
Which would be a cleaner approach to the problem?
Is there a way to read/write XML into a Collection without needing custom code?
Could this be accomplished using String for a key?
UPDATE: Sounds like the consensus is towards the collection of Register objects. Makes sense to me. I was leaning that way, and since nobody pointed out any DataTable features that would simplify accessing a single row, it looks like the collection is clearly the way to go. Thanks to those who weighed in.
I would be inclined not to use data sets. It would be better to work with objects and collections. Your code will be more maintainable/readable, composable, testable & reusable.
Given that you can query the data set to return a particular row, you might find that a LINQ query to turn the rows into objects is all the custom code you need.
Using a Dictionary<string, Register> for lookups is a good idea if you have a large number of items (say, greater than 1000). Otherwise a simple LINQ query should be fine.
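For illustration, a small sketch of the collection-of-objects route with the built-in XML serializer and a keyed lookup (the Register properties here are guesses based on the question):

using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Xml.Serialization;

public class Register
{
    public string Name { get; set; }          // used as the lookup key
    public int Address { get; set; }
    public string Access { get; set; }
    public string Description { get; set; }
}

public static class RegisterMap
{
    // Reads the whole register map from an XML file with no custom parsing code.
    public static Dictionary<string, Register> Load(string path)
    {
        var serializer = new XmlSerializer(typeof(List<Register>));
        using (var stream = File.OpenRead(path))
        {
            var registers = (List<Register>)serializer.Deserialize(stream);
            return registers.ToDictionary(r => r.Name);
        }
    }
}

// Usage: single-item access by key, no query needed.
// var map = RegisterMap.Load("registers.xml");
// int address = map["STATUS"].Address;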
It depends on how you define 'clean'.
A generic collection is potentially MUCH more lightweight than a DataTable, but on the other hand that doesn't seem to be too much of an issue for you. And unless you go into heavy reflection, you'll have to write some code to read/write the XML.
If you use a key, I'd also recommend (in the case of the collection) using a Dictionary. That way you have a collection of the raw data and can still identify each entry through its key in the Dictionary.
I usually use DataTables if it's something quick and unlikely to be used in any other way; if it's something I can see evolving into an object that has its own use within the app (like the Register object you mentioned), I write the class instead.
It might be a little extra code up front, but it saves converting from a DataTable to the collection in the future if you come up with something you would like to do based on an individual row, or if you want/need to add some sort of extra functionality to that element down the road.
I would go with the collection of objects so you can swap out the data access later if you need to.
You can serialize classes with an XML serializer by defining a Serializable attribute or something like that (it has been a while since I've done it, sorry for the vagueness). A DataSet or DataTable also works great with XML.
Both DS and DT have ReadXml and WriteXml methods. The XML must be in a predefined format, but it works seamlessly.
Otherwise, I personally like collections or dictionaries; DS/DT are OK, but I like custom objects, and LINQ adds in some power.
HTH.

Programatic way to do linear referencing in ArcGIS

I am working on a custom ArcGIS Desktop tool project and I would like to implement an automated linear referencing feature in it. To make a long story short, I would like to display problematic segments along a route and show their severity using a color code (say green, yellow, red, etc.). I know this is a pretty common scenario and have come to understand that the "right way" to accomplish this task is to create a linear event table, which will allow me to assign different codes to certain route segments. Some of my colleagues know how to do it manually, but I can't seem to find any way to replicate this programmatically.
The current tool is written in C# and already performs all the needed calculations to determine the problematic areas. The problem mainly is that I don't know where to start since I don't know a lot about ArcObjects. Any code sample or suggestion is welcome (C# is preferred but C++, VB and others will surely help me anyway).
EDIT :
I'm trying to use the MakeRouteEventLayer tool but can't seem to get the different preconditions met. The routes are hosted on an SDE server. So far, I am establishing a connection this way:
ESRI.ArcGIS.esriSystem.IPropertySet pConnectionProperties = new ESRI.ArcGIS.esriSystem.PropertySet();
ESRI.ArcGIS.Geodatabase.IWorkspaceFactory pWorkspaceFactory;
ESRI.ArcGIS.Geodatabase.IWorkspace pWorkspace;
ESRI.ArcGIS.Location.ILocatorManager pLocatorManager;
ESRI.ArcGIS.Location.IDatabaseLocatorWorkspace pDatabaseLocatorWorkspace;
pConnectionProperties.SetProperty("server", "xxxx");
pConnectionProperties.SetProperty("instance", "yyyy");
pConnectionProperties.SetProperty("database", "zzzz");
pConnectionProperties.SetProperty("AUTHENTICATION_MODE", "OSA");
pConnectionProperties.SetProperty("version", "dbo.DEFAULT");
pWorkspaceFactory = new ESRI.ArcGIS.DataSourcesGDB.SdeWorkspaceFactory();
pWorkspace = pWorkspaceFactory.Open(pConnectionProperties, 0);
pLocatorManager = new ESRI.ArcGIS.Location.LocatorManager();
pDatabaseLocatorWorkspace = (ESRI.ArcGIS.Location.IDatabaseLocatorWorkspace)pLocatorManager.GetLocatorWorkspace(pWorkspace);
Now I am stuck trying to prepare everything for MakeRouteEventLayer's constructor. I can't figure out how I'm supposed to get the feature layer to pass as the Input Route Features. Also, I don't understand how to create an event table properly. I can't find any example related to what I am trying to accomplish aside from this one, which I don't understand since it isn't documented/commented and the data types are not mentioned.
I'm not entirely certain what it is you want to do. If you want to get linear referencing values or manipulate them directly in a feature class that already has linear referencing defined, that's pretty straightforward.
IFeatureClass fc = ....;
IFeature feature = fc.GetFeature(...);
IMSegmentation3 seg = (IMSegmentation3)feature.Shape;   // cast the M-aware geometry, not the feature itself
... blah ...
If you need to create a feature class with linear referencing, you should start with the "Geoprocessing" tools in the ArcToolbox. If the out-of-the-box tools can do most of what you need, this will minimize your coding.
I would strongly recommend trying to figure out what you need to do in ArcMap first, if at all possible... then backing out the ArcObjects.
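If you do end up going the MakeRouteEventLayer route, getting a feature layer for the input route features out of the workspace you already opened could look roughly like this (untested sketch; "MyRoutes" is a placeholder for your actual route feature class name):

// Untested sketch: open the route feature class from the existing SDE workspace
// and wrap it in a feature layer to use as the input route features.
ESRI.ArcGIS.Geodatabase.IFeatureWorkspace pFeatureWorkspace =
    (ESRI.ArcGIS.Geodatabase.IFeatureWorkspace)pWorkspace;
ESRI.ArcGIS.Geodatabase.IFeatureClass pRouteFeatureClass =
    pFeatureWorkspace.OpenFeatureClass("MyRoutes");     // placeholder name

ESRI.ArcGIS.Carto.IFeatureLayer pRouteLayer = new ESRI.ArcGIS.Carto.FeatureLayerClass();
pRouteLayer.FeatureClass = pRouteFeatureClass;
pRouteLayer.Name = pRouteFeatureClass.AliasName;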
Linear Referencing API
Linear Referencing Toolbox
Understanding Linear Referencing
