Early Bound Class Generation, how to make separate files with CrmSvcUtil? - c#

We're in the process of upgrading from CRM 2011 to CRM 2016 and as such are rewriting and restructuring quite a lot of our back end code. One of the things we'd like to do is move the Early Bound classes from one gigantic file containing all of the classes to one class per file.
I know this was available in CRM 4.0, and that it's doable via the XrmToolBox using the Early Bound Generator plugin, but I can't figure it out for the life of me!
Here is my command line for running CrmSvcUtil:
"C:\CRM_SDK\sdk\bin\crmsvcutil.exe" /url:http://XXX/XRMServices/2011/Organization.svc /o:"C:\CRM_SDK\sdk\Bin\Entities" /n:XXX.crm /serviceContextName:XrmServiceContext /domain:XXX /username:XXX /password:XXX
Currently this just outputs 1 file, but I want somewhere in the region of 250 files!
Any help would be appreciated!

The Early Bound Generator has custom code that runs as part of the write process of CrmSvcUtil and splits apart the file post generation. There is no supported method for doing it via CrmSvcUtil without custom code. Any reason you don't just use the Early Bound Generator? You can run it in command line mode if you want to make it part of the build process. It actually spits out the required command line when you generate the entities/optionsets/actions.
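For illustration only (this is not how the Early Bound Generator itself does it), a post-generation splitter can be a small console app that parses the single generated file and writes one .cs file per class. The sketch below assumes the Microsoft.CodeAnalysis.CSharp (Roslyn) NuGet package, and the input/output names are placeholders:

// Hypothetical post-generation splitter: one file per generated class.
using System.IO;
using System.Linq;
using Microsoft.CodeAnalysis.CSharp;
using Microsoft.CodeAnalysis.CSharp.Syntax;

class EntitySplitter
{
    static void Main(string[] args)
    {
        var inputFile = args.Length > 0 ? args[0] : "Entities.cs"; // file produced by CrmSvcUtil
        var outputDir = args.Length > 1 ? args[1] : "Entities";
        Directory.CreateDirectory(outputDir);

        var root = CSharpSyntaxTree.ParseText(File.ReadAllText(inputFile)).GetRoot();
        var usings = string.Join("\r\n", root.DescendantNodes().OfType<UsingDirectiveSyntax>());

        foreach (var cls in root.DescendantNodes().OfType<ClassDeclarationSyntax>())
        {
            // Re-wrap each class in its original namespace plus the using directives
            // so the split files still compile (a simplification: attributes, enums and
            // nested types would need similar handling).
            var ns = cls.Ancestors().OfType<NamespaceDeclarationSyntax>().FirstOrDefault();
            var nsName = ns != null ? ns.Name.ToString() : "Generated";
            var body = usings + "\r\n\r\nnamespace " + nsName + "\r\n{\r\n" + cls.ToFullString() + "\r\n}\r\n";
            File.WriteAllText(Path.Combine(outputDir, cls.Identifier.Text + ".cs"), body);
        }
    }
}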

Related

AXL API - Extremely long processing time of CS file

Just need a point in the right direction with this one.
I've created the Cisco Unified Call Manager API client via the instructions provided by Cisco; the API for CUCM is called AXL.
It's currently in my C# WPF project and works just fine (I've retrieved some phone data successfully). The issue is that the API is in a single CS file that's 345K lines long. This is causing an extremely long delay when I attempt the first action using the API (after it has compiled).
As one user on the Cisco forum advised:
There is a very high chance that your problem is with the time that it takes the .net framework to generate the xml serialization assembly.
Pre-generate the xml serialization assembly when using AXL on .net and your first response will be MUCH faster.
I've tried to pre-generate it using the instructions from user brain backup in this thread. Unfortunately the first use of the API still takes around 45 seconds (pre-generation did reduce it by about a minute). I'm not extremely savvy with the debugging tools within Visual Studio, so I'm unsure how to check what exactly is causing the issue (but it certainly looks like an issue related to generating the XML).
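For reference, the pre-generation step amounts to a single sgen.exe call against the assembly that contains the AXL proxy, which drops a MyAxlClient.XmlSerializers.dll next to it (the SDK path and assembly name here are placeholders for my setup):

"C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\bin\sgen.exe" /assembly:MyAxlClient.dll /force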
I was wondering if anyone could recommend of a way to remove the unnecessary methods from the CS file (99% of it won't be used anyway) without having to manually re-create it. Any type of tool that can pull/delete methods and their dependencies from a CS file would be absolutely brilliant.
There is a way to check whether your method has been used or not, and if used, how many times and where; check this out:
https://visualstudiomagazine.com/Blogs/Tool-Tracker/2014/12/Finding-Method-Property-Variable.aspx
It might make sense to pare down the AXL WSDL itself and re-compile - as mentioned, it's unlikely you'll ever use anywhere near the whole schema.
You should be able to just edit AXLAPI.wsdl and remove the unused operation and message elements, keeping only the items you are actually using.
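Regenerating the trimmed proxy is then a single wsdl.exe call; the schema file names below are the ones shipped in the AXL toolkit zip and the namespace is a placeholder, so adjust them for your version:

wsdl.exe /language:CS /namespace:AxlNetClient /out:AXLAPI.cs AXLAPI.wsdl AXLEnums.xsd AXLSoap.xsd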
I had the same issue; it was almost unusable with the delay. There are two things I have found to get around this, with almost instant results.
Don't use the WSDL. Write your own methods to handle the SOAP requests. This takes time and can be error prone, but the results are almost instant (a rough sketch is shown at the end of this answer).
Use a tool that can handle large text files, like Notepad++, to open your WSDL-generated code file and pull out only the methods you need. This is the method I've chosen and it works great.
Also, I believe you could just use the executeSQLQuery methods and cut out a good portion of the rest of the code, but I've yet to try it. I tried each approach above without pre-generating the XML serialization assembly; I found the problem to be the size of the generated C# AXL code file.
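A rough sketch of the hand-rolled SOAP approach using HttpClient; the endpoint, AXL namespace version, SOAPAction string, device name and credentials are all placeholders you would adjust for your CUCM version:

// Minimal hand-built AXL call with no generated proxy.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;

class AxlRawSoapExample
{
    static void Main()
    {
        // If CUCM presents a self-signed certificate you may also need to relax
        // certificate validation (for testing only).
        using (var client = new HttpClient())
        {
            var credentials = Convert.ToBase64String(Encoding.ASCII.GetBytes("axluser:axlpassword"));
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", credentials);
            client.DefaultRequestHeaders.Add("SOAPAction", "CUCM:DB ver=11.5 getPhone");

            // Hand-built envelope for a single AXL operation; the namespace version is an assumption.
            string envelope =
                "<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\" " +
                "xmlns:ns=\"http://www.cisco.com/AXL/API/11.5\">" +
                "<soapenv:Body><ns:getPhone><name>SEP001122334455</name></ns:getPhone></soapenv:Body>" +
                "</soapenv:Envelope>";

            var content = new StringContent(envelope, Encoding.UTF8, "text/xml");
            var response = client.PostAsync("https://cucm-host:8443/axl/", content).Result;
            Console.WriteLine(response.Content.ReadAsStringAsync().Result);
        }
    }
}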

Programmatically create MSTest unit tests

I am looking for a way to programmatically create unit tests using MSTest. I would like to loop through a series of configuration data and create tests dynamically based on the information. The configuration data will not be available at compile time and may come from an external data source such as a database or an XML file. Scenario: load configuration data into a test harness and loop through the data while creating a new test for each element. I would like each dynamically created test to be reported on separately (success/fail).
You can use Data Driven Testing depending on how complex your data is. If you are just substituting values and testing to make sure that your code can handle the same inputs, that might be the way to go, but this doesn't really sound like what you are after. (You could make this more complex; after all, all you are doing is pulling in values from a data source and then making a programmatic decision based on it.)
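For reference, a data-driven MSTest looks roughly like this; the XML file name, its Row element and the column names are all example values (note that the rows show up as iterations of one test rather than as fully separate tests):

// Classic MSTest data-driven test; file, element and column names are examples.
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class ConfigDrivenTests
{
    public TestContext TestContext { get; set; }

    [TestMethod]
    [DeploymentItem("TestData.xml")]
    [DataSource("Microsoft.VisualStudio.TestTools.DataSource.XML",
                "|DataDirectory|\\TestData.xml", "Row", DataAccessMethod.Sequential)]
    public void ValueIsHandled()
    {
        // Runs once per <Row> element in TestData.xml.
        int input = System.Convert.ToInt32(TestContext.DataRow["Input"]);
        int expected = System.Convert.ToInt32(TestContext.DataRow["Expected"]);
        Assert.AreEqual(expected, input * 2); // stand-in for a call to the code under test
    }
}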
All MS Test really does is run a series of tests and then produce the results (in an xml file) which is then interpreted by the calling application. It's just a wrapper for executing methods that you designate through attributes.
What it sounds like you're asking is to write C# code dynamically, and have it execute in the harness.
If you really want to run this through MS Test you could:
Build a method (or series of methods) which looks at the XML file
Write out the C# code (I would maybe look at T4 templates for this; personally, I would use F# to do this, but I'm more partial to functional languages and it would be easier for me)
Call csc.exe (the C# compiler)
Invoke MS Test (a rough sketch of these last two steps is shown below)
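A sketch of shelling out to the compiler and then to MSTest.exe; all paths and file names are placeholders:

// Compile generated test code and run it through MSTest; paths are placeholders.
using System;
using System.Diagnostics;

class GenerateAndRun
{
    static void RunTool(string fileName, string arguments)
    {
        var psi = new ProcessStartInfo(fileName, arguments)
        {
            UseShellExecute = false,
            RedirectStandardOutput = true
        };
        using (var process = Process.Start(psi))
        {
            Console.WriteLine(process.StandardOutput.ReadToEnd());
            process.WaitForExit();
        }
    }

    static void Main()
    {
        // 1. Compile the generated source into a test assembly (reference the MSTest framework DLL).
        RunTool(@"C:\Windows\Microsoft.NET\Framework\v4.0.30319\csc.exe",
                @"/target:library /out:GeneratedTests.dll " +
                @"/r:""C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\PublicAssemblies\Microsoft.VisualStudio.QualityTools.UnitTestFramework.dll"" " +
                @"GeneratedTests.cs");

        // 2. Execute the tests; MSTest writes a .trx results file you can parse afterwards.
        RunTool(@"C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\MSTest.exe",
                @"/testcontainer:GeneratedTests.dll /resultsfile:Results.trx");
    }
}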
You could also write MSIL code into the running application directly and try to get MS Test to execute it, which for some might be fun, but it could be time consuming and isn't necessarily guaranteed to work (I haven't tried it, so I don't know what the pitfalls would be).
Based on this, it might be easier to quickly build your own harness which will interpret your XML file, dynamically build out your test scenarios and produce the same results file. (After all, the results are what's important, not how you got there.) Since you said it won't be available at compile time, I would guess that you aren't interested in viewing the results in the Visual Studio window.
Actually, personally, I wouldn't use XML as your Domain Specific Language (DSL). Parsing it is easy, because .NET already does that for you, but it's limiting in how it can define how your method can function. It's meant for conveying data, and although technically code is a form of data, it doesn't have sufficient expressive strength to convey many abilities of a more formal language. This is just my personal opinion though, and there are many ways to skin a cat.

Manipulating a Python file from C#

I'm working on some tools for a game I'm making. The tools serve as a front end to make editing game files easier. Several of the files are Python scripting files. For instance, I have an Items.py file that contains the following (minimized for this example):
from ItemModule import *
import copy

class ScriptedItem(Item):
    def __init__(self, name, description, itemtypes, primarytype, flags, usability, value, throwpower):
        Item.__init__(self, name, description, itemtypes, primarytype, flags, usability, value, throwpower, Item.GetNextItemID())

    def Clone(self):
        return copy.deepcopy(self)

ItemLibrary.AddItem(ScriptedItem("Abounding Crystal", "A colourful crystal composed of many smaller crystals. It gives off a warm glow.", ItemType.SynthesisMaterial, ItemType.SynthesisMaterial, 0, ItemUsage.Unusable, 0, 50))
As I mentioned, I want to provide a front end for editing this file without requiring an editor to know Python or edit the file directly. My editor needs to be able to:
Find and list all the class types (in this example, only ScriptedItem)
Find and list all created items (in this case there's only one, Abounding Crystal). I'd need to find the type (in this case ScriptedItem) and all the parameter values
Allow editing of parameters and the creation/removal of items.
To do this, I started writing my own parser, looking for the class keyword and for when these recorded classes are used to construct objects. This worked for simple data, but when I started using classes with complex constructors (lists, maps, etc.) it became increasingly difficult to parse correctly.
After searching around, I found IronPython made it easy to parse Python files, so that's what I went about doing. Once I built the Abstract Syntax Tree I used PythonWalkers to identify and find all the information I need. This works perfectly for reading in data, but I don't see an easy way to push updated data back into the Python file. As far as I can tell, there's no way to change the values in the AST, much less convert the AST back into a script file. If I'm wrong, I'd love for someone to tell me how I could do this. What I'd need to do now is search through the file until I find the correct line, then try to push the data into the constructor, ensuring correct ordering.
Is there some obvious solution I'm not seeing? Should I just keep working on my parser and make it support more complex data types? I really thought I had it with the IronPython parser, but I didn't think about how tricky it would be to push modified data back into the file.
Any suggestions would be appreciated
You want a source-to-source program transformation tool.
Such a tool parses a language to an internal data structure (invariably an AST), allows you to modify the AST, and then can regenerate source text from the modified AST without changing essentially anything about the source except where the AST changes were made.
Such a program transformation tool has to parse text to ASTs, and "anti-parse" (prettyprint) ASTs back to text. If IronPython has a prettyprinter, that's what you need.
If it doesn't, you can build one with some (maybe a lot of) effort; as you've observed, this isn't as easy as one might think. See my answer to Compiling an AST back to source code.
If that doesn't work, our DMS Software Reengineering Toolkit with its Python front end might do the trick. It has all the above properties.
Provided you can find a complete and up-to-date context-free grammar file for Python, you could use the Coco/R parser generator to generate a Python parser in C#.
You can add production code to the grammar file itself to populate a data structure in your C# app. Said data structure can hold all the information you need (methods and their arguments, properties, constructors, destructors, etc.). Once you have this data structure, it's just a task of designing a front end for the user and representing this data structure in a way that makes it editable to them (this is more of a design task than a complicated programming task).
Finally, iterate through your data structure and write out a .py file.
You can use the Python inspect module to print the source of an object. In your case: to print the source of your module, the file you just parsed with IronPython. I haven't checked to see if inspect works with IronPython yet, though.
As to adding stuff, well, it's a module, right? You can just add stuff to a module... I'd load the module, then alter it, use inspect to print it, and save it to disk.
From your post, it looks like you're already deep in the trenches and having fun, so I'd be really happy to see a post here on how you solved this problem!
To me it sounds more like you are at the point where you shove it all into an SQLite database and start editing it that way. Hooking up some forms to edit tables is simpler for the UI. At that point you generate new Python files by dumping your tables out with some formatting to produce the surrounding Python script (a rough sketch of that step is shown below).
SVN / Git / whatever can merge the updated changes via the Python files.
This is what I ended up doing for my project, at any rate. I started using Python to hook up the various items using their computed keys, and then just added some forms UI to avoid editing mistakes in the Python files.
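A sketch of the dump step, assuming the System.Data.SQLite package and an invented Items table whose columns mirror the constructor arguments in the question:

// Regenerate Items.py from an SQLite "Items" table; schema and file names are assumptions.
using System.Data.SQLite;
using System.IO;
using System.Text;

class ItemsPyWriter
{
    static void Main()
    {
        var py = new StringBuilder();
        // The ScriptedItem class itself can live in a module this generated file imports.
        py.AppendLine("from ItemModule import *");
        py.AppendLine("import copy");
        py.AppendLine();

        using (var connection = new SQLiteConnection("Data Source=items.db"))
        {
            connection.Open();
            using (var command = new SQLiteCommand(
                "SELECT Name, Description, ItemType, PrimaryType, Flags, Usability, Value, ThrowPower FROM Items",
                connection))
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Emit one AddItem call per row, mirroring the hand-written script.
                    py.AppendLine(string.Format(
                        "ItemLibrary.AddItem(ScriptedItem(\"{0}\", \"{1}\", {2}, {3}, {4}, {5}, {6}, {7}))",
                        reader.GetString(0), reader.GetString(1), reader.GetString(2), reader.GetString(3),
                        reader.GetInt32(4), reader.GetString(5), reader.GetInt32(6), reader.GetInt32(7)));
                }
            }
        }

        File.WriteAllText("Items.py", py.ToString());
    }
}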

Monitoring .NET ASP.NET Applications

I have a number of applications running on top of ASP.NET I want to monitor. The main things I care about are:
Exceptions: We currently have some custom code which will email us when an exception occurs. If the application is failing hard it will crash our Outlook... I know (and use) ELMAH, which partly solves the problem, however it is still just a big table of exceptions with a pretty(ish) UI. I want something that makes sense of all of these exceptions (e.g. groups exceptions, alerts when new ones occur, tells me what the common ones are that I should fix, etc.)
Logging: We currently log to files which are then accessible via a shared folder which devs grep and tail. Does anyone know of better ways of presenting this information? In an ideal world I want to associate it with exceptions.
Performance: Request times, memory usage, CPU, etc. Whatever stats I can get.
I'm guessing this is probably going to be solved by a number of tools, has anyone got any suggestions?
You should take a look at Gibraltar. I've not used it myself but it looks very good! It also works with NLog and log4net, so if you use those you are in luck!
Well, we have exactly the same current solution. Emails upon emails litter my inbox and mostly get ignored. Over the holidays an error caused everyone in dev to hit their inbox limit ;)
So where are we headed with a solution?
We already generate classes for all our exceptions; they rarely get used from more than one place. This was essentially step one, but we started the code base this way.
We modified the generator to create unique HRESULT values for all exceptions.
Again we added a generator to create a .MC message resource file for every exception.
Now every exception can write itself to the Windows Event Log, and thus we removed all emails etc, and rely on the event log.
Now with an event log full of information, including unique message codes and parameters for each exception, we can use off-the-shelf tools to aggregate, monitor, and alert.
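Stripped of the generated HRESULTs and message DLLs, the event-log write itself boils down to something like this (the source name is an example and has to be registered once with admin rights):

// Minimal event-log write for an exception; "MyWebApp" is an example source name.
using System;
using System.Diagnostics;

static class ExceptionEventLogger
{
    public static void Log(Exception ex, int eventId)
    {
        const string source = "MyWebApp";

        // Creating the source requires admin rights and is normally done once at install time.
        if (!EventLog.SourceExists(source))
            EventLog.CreateEventSource(source, "Application");

        EventLog.WriteEntry(source, ex.ToString(), EventLogEntryType.Error, eventId);
    }
}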
The exception generator (before modifications above) is located here:
http://csharptest.net/browse/src/Tools/Generators
It integrates with Visual Studio by replacing the ResX generator with this:
http://csharptest.net/browse/src/Tools/CmdTool
I have not yet published the MC generator, or the HRESULT generation; however, it will be available in the above locations when I get the time.
-- UPDATE --
All the tools and source are now available online for this. So where do I go from here?
Download the source or binaries from: http://code.google.com/p/csharptest-net/
Take a look at the help for CmdTool.exe Visual Studio Integration
Then review the help on Generators for ResX and MC files; there are several ways to generate MC files or complete message DLLs from ResX files. Pick the approach that fits you best.
Run CmdTool.exe REGISTER to register the tool with Visual Studio
Create a ResX file as normal, then change the custom tool to CmdTool
You will need to add some entries to the resx file. At a minimum, create the following:
".AutoLog" = true
".NextMessageId" = 1
".EventSource" = "HelloWorld"
"MyCustomException" = "Some exception text"
Other settings are demonstrated by the NUnit test: http://csharptest.net/browse/src/Tools/Generators/Test/TestResXAutoLog.cs#80
Now you should have an exception class being generated that writes log events. You will need to build the MC file as a pre/post build action with something like:
CSharpTest.Net.Generators.exe RESXTOMESSAGEDLL /output=MyCustomMessages.dll /input=TheProjectWithAResX.csproj
Lastly, you need to register it, run the framework's InstallUtil.exe MyCustomMessages.dll
That should get you started until I get time to document it all.
One suggestion from Ryans Roberts that I really like is Exceptioneer, which seems to solve my exception woes at least.
I would first go for log4net. The SmtpAppender can wait for N exceptions to accumulate before sending an email, which avoids crashing Outlook. And log4net also logs to log files that can be stored on network drives, read with cat and grep, etc.
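A minimal programmatic version of that buffering setup looks something like this; the addresses, host and buffer size are placeholders, and the same settings are usually put in the XML config instead:

// Buffer errors and send them in one email instead of one message per exception.
using log4net.Appender;
using log4net.Config;
using log4net.Core;
using log4net.Layout;

static class LoggingSetup
{
    public static void Configure()
    {
        var appender = new SmtpAppender
        {
            To = "devteam@example.com",
            From = "app@example.com",
            Subject = "Application errors",
            SmtpHost = "smtp.example.com",
            BufferSize = 20,                             // wait for 20 events before mailing
            Lossy = false,
            Evaluator = new LevelEvaluator(Level.Error), // flush immediately on ERROR or worse
            Layout = new PatternLayout("%date %-5level %logger - %message%newline")
        };
        appender.ActivateOptions();
        BasicConfigurator.Configure(appender);
    }
}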
As for stats, you can perform health/performance logging with the same tools, i.e. spawn a thread that logs CPU usage etc. every minute.
I don't have a concrete answer for the first part of the question, since it implies automated log analysis and coalescence. At university, we made a tool that is designed to do part of these things but doesn't apply to your scenario (though it is two-way integrated with log4net).
In terms of handled exceptions or just typical logging, l4ndash is worth a look. I always set up our log4net to not only write out text files, but also to append to the database. That way l4ndash can analyse it easily. It'll group your errors and let you see where bad things are occurring a lot. You get one free dev license.
With ELMAH we just pull down the logs periodically. It can export as CSV, and then we use Excel to filter/group the data. It's not ideal, but it works. It would be nice to write a script to automate this a bit more. I've not seen much else out there for ELMAH.
You can get some metrics on request times (and anything else that's saved) by running LogParser over the IIS logs.
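For example, average request times per URL straight from the IIS logs; the log path and the time-taken field depend on your IIS logging configuration:

LogParser.exe -i:IISW3C "SELECT cs-uri-stem, COUNT(*) AS Hits, AVG(time-taken) AS AvgMs FROM C:\inetpub\logs\LogFiles\W3SVC1\u_ex*.log GROUP BY cs-uri-stem ORDER BY AvgMs DESC"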
We have built a simple monitoring app that sits on the desktop and flashes red when either an exception is written to the event log from one of the apps on the server or an error is written to the error log table in the database. It also monitors database health, checking fragmentation and the like.
The only problem we have with this is that it can be a little intrusive on the desktop, as it keeps popping up with a red message box if there is a problem. However, it does encourage you to fix it asap.
We currently have this running on several of the developers' machines. The improvement we are thinking of making is to have one monitoring app running on a server that then publishes an RSS feed, so that the app is only checking once in one place but we can consume the information from anywhere using whichever method we choose at the time (such as through our phones when we aren't in the office).
You can have an RSS feed select from your Exceptions table (and other things).
Then you can subscribe to the RSS feed in MS Outlook or on any smart phone. I use an RSS feed reader called NewsRob because it alerts me when there is something new.
I blog about how to do this HERE.
As a related step, I found a way to notify myself when something DIDN'T happen. That blog is HERE.

Simple & Practical C# code generation (VS 2008 or 2010)

I've put off using generated code as part of the build process for fear of the complexity it introduces.
Is there a simple way to integrate build-time generated code into an app?
The kind of code I'm thinking of is similar to the resource and settings file code generation that Visual Studio performs:
Having intellisense here is valuable
There are a lot of properties and links between properties that are trivial to describe, but impossible to implement tersely in C#.
The underlying resource is modifiable and the code is automatically regenerated without needing any user interaction and without any need to understand the internals of the generator.
For a (non-real-world) example, consider a precompiler that generated accessors for the named capture groups of a Regex via similarly named C# properties (or methods). This is typical of the kind of thing I'd like to generate: long snippets of boilerplate wrappers whose primary function is to enable compile-time checking for errors (in the above, accessing non-existent capturing groups or writing an invalid regex) and, no less importantly, IntelliSense for these properties. Finally, this setup should be trivially usable by others on the team with only the bare minimum of learning curve. I.e., it's absolutely not acceptable to require manual intervention to regenerate the code, nor acceptable to commit the generated code into source control. At worst, everyone should just need to install some extension; ideally the extension should be installable into the source tree so that anyone who checks out the tree can build the project without any introduction.
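To make that concrete, hand-written output of such a generator might look roughly like this (the regex, class and property names are invented for illustration):

// Hypothetical hand-written equivalent of what the generator would emit for a
// resource defining a "LogLine" regex with named groups "date" and "level".
using System.Text.RegularExpressions;

public sealed class LogLineMatch
{
    static readonly Regex Pattern = new Regex(
        @"^(?<date>\d{4}-\d{2}-\d{2}) \[(?<level>\w+)\]", RegexOptions.Compiled);

    readonly Match match;

    private LogLineMatch(Match match) { this.match = match; }

    public static LogLineMatch TryParse(string input)
    {
        var m = Pattern.Match(input);
        return m.Success ? new LogLineMatch(m) : null;
    }

    // One strongly named property per capture group: typos become compile errors
    // and IntelliSense lists the available groups.
    public string Date  { get { return match.Groups["date"].Value; } }
    public string Level { get { return match.Groups["level"].Value; } }
}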
For that to work well, it's critical that the IDE integration be excellent: Updating the underlying "resource" definition file should trigger a regeneration of the code without any user interaction, and ideally the generator itself would be easy to maintain for other developers later on (i.e. some amount of generator debug-ability is a plus).
Finally, an XSLT-like approach where the same template can be applied to various input resources is ideal; both because this means that you don't even need to look at the actual generator code if all you want to do is update the resource, and because it makes template reuse trivial.
I've looked at T4, but from what I've seen it has a less handy ASP-like approach where template and resource aren't cleanly split (i.e. the generator is responsible for finding the resource, which makes template reuse harder).
Is there a better (cleaner) solution, or some way of running T4 such that the same template can be trivially reused and (much like .NET settings files) any update of the resource automatically triggers a regeneration of the implemented code?
Summary:
I'm looking for a code-gen approach that can
Regenerate code automatically without dev intervention when the underlying resource (not the template!) changes.
Be somewhat simple to maintain
Be able to share the same generator template between several resources (which, with point #1 probably implies the resource should refer to the generator and not vice-versa).
You can use T4ScriptFileGenerator from T4 Toolbox. Change "Custom Tool" property for your "resource" file to T4ScriptFileGenerator and save changes. The custom tool will generate a new, empty T4 script (.tt file). Place your code generation logic in this .tt file. Any time you modify (and save) the resource file, the T4ScriptFileGenerator will use the .tt file to generate the output code. For an example of how this works, see "LINQ to SQL Model" generator in the T4 Toolbox, which uses a .dbml file as the "resource". In the .tt file created by this generator, you will see that all of the code generation logic resides in separate .tt files and is reused with the help of include directives.
You may want to keep an eye on ABSE (http://www.abse.info). ABSE is a code-generation and model-driven software development methodology that is completely agnostic in terms of platform and language, so you wouldn't have any trouble creating your own generators for C# and anything else you wish. The big plus is that you can generate code exactly the way you want. The downside is that you may have more work to do at first to build your templates.
ABSE allows you to capture your domain knowledge into "Atoms", which are basically fragments of larger models you can build. ABSE is both declarative and executable. The model is able to generate code by your specification and incorporate custom code at the model level.
Unfortunately, ABSE is still work in progress and an Integrated Development Environment (named AtomWeaver) is still in the making. Anyway a CTP release of the generator is scheduled for January 2010, so we're already close to it.
