How can I test the backward compatibility of API between .net assemblies - c#

I have an assembly that provides an API and is used by some other assemblies. I need to verify that a newer version of API dll is still compatible with the older assemblies that were using the older version of API.
I've found a couple of questions that ask the same, but there are no answers that resolve my problem:
Tool to verify compatibility of a public APIs
Tool for backwards compatibility for the C#/.NET API
Suggested tools can only compare two assemblies and report possible breaking changes in the API; they cannot tell whether the newest API actually breaks an older assembly that uses it.
I'd like to find a tool or write a test that will be able to check whether each of the older dlls can work with my new API dll.
As for the API changes, most likely I will only extend it, but even pure additions can still break code in older assemblies. Some examples of such changes can be found here:
A definite guide to API-breaking changes in .NET
.NET: with respect to AssemblyVersion, what defines binary compatibility?
For now the only solution I see is to recompile the source code of the older assemblies against the newest API, but I would like to work with the compiled assemblies alone and make the check part of my unit tests. Is there any better way I can handle that?
edit:
I'm looking for a tool that can automate the process of verifying backward compatibility between .NET assemblies (command line or with an API).

What you want is to do a diff and generate the list of breaking changes. Then you want to check whether any of your assemblies uses any of the broken APIs. You can do this with the ApiChange tool, which can do the diff and find any affected users.
To make it more concrete: if you have removed a method from an interface, then you need to find all implementers of this method and all users of it - that is, any class that implements the interface method or calls it.
ApiChange can search for implementers and users of specific methods on the command line with the commands -whoimplementsinterface and -whousesmethod. This is not automated at the command line, but you can use ApiChange.Api.dll directly to automate these queries.
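If you would rather script a basic version of that check yourself than drive ApiChange.Api.dll (whose exact object model I won't guess at here), a rough sketch with Mono.Cecil could look like the following: it tries to resolve every type and member an old client assembly references against the new API binaries, so removed types, removed members, and changed signatures all show up as unresolved references. The paths are hypothetical placeholders.

    using System;
    using Mono.Cecil;

    class BreakingChangeCheck
    {
        static void Main()
        {
            // Folder holding the NEW Api.dll next to the OLD client assemblies.
            var resolver = new DefaultAssemblyResolver();
            resolver.AddSearchDirectory(@"c:\compat-check");               // hypothetical

            var client = ModuleDefinition.ReadModule(
                @"c:\compat-check\OldClient.dll",                          // hypothetical
                new ReaderParameters { AssemblyResolver = resolver });

            foreach (var type in client.GetTypeReferences())
                if (!Resolves(() => type.Resolve()))
                    Console.WriteLine("Missing type: " + type.FullName);

            foreach (var member in client.GetMemberReferences())
            {
                var method = member as MethodReference;
                var field = member as FieldReference;
                bool found = method != null ? Resolves(() => method.Resolve())
                           : field == null || Resolves(() => field.Resolve());
                if (!found)
                    Console.WriteLine("Missing member: " + member.FullName);
            }
        }

        // Resolve() yields null (or throws) when the target no longer exists.
        static bool Resolves(Func<object> resolve)
        {
            try { return resolve() != null; } catch { return false; }
        }
    }

Because a MethodReference encodes the full signature, a method whose parameters changed fails to resolve just like a deleted one, which is exactly the binary-compatibility question being asked.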
Edit1:
I just forgot: the ApiChange tool actually has the functionality you are interested in already. It is the option
-ShowrebuildTargets -new <newVersion> -old <oldVersion> [-old2 <otherOldVersion>] -searchin <directory>
We used it in our department with good results. The only gotcha is the XML IntelliSense files. If another target does not use the removed method but references it inside its XmlDoc comments, the compiler will emit a warning that a non-existent method was referenced. This is quite hard to catch and would involve parsing the IntelliSense documentation files as well. But that is quite an edge case.

I've spent the day looking around for an answer to this. It seems like the tools referenced in the related (unhelpfully closed) questions are now dead or as good as. But I've just taken a look at Telerik's assembly diff tool JustAssembly, and it looks much better than rolling your own, which, judging by their library, is a whole heap of work and likely to go wrong.
They have a UI which isn't of much help for integrating into your CI build (it is pretty basic), but you can build the library from source, which I've just done, and it looks like it has everything you need to get up and running pretty quickly.

Related

determine minimum compatible API version

Given that the THINC API is written to be backwards compatible, and lower versions enable a greater number of potential machines to run a given application, everyone should strive to use the minimum version necessary.
Does anyone know if there is an easy way to determine the minimum version necessary for a given application?
For example, I've got an application that only uses 3 API functions:
GetHourMeterCount, GetActiveProgramName, and GetMachiningReport
How do I know what API version I can use?
I can think of several possibilities:
For your situation, the easiest solution I can think of is just to check the .chm documentation for your earliest THINC API version to see if it supports GetHourMeterCount, GetActiveProgramName, and GetMachiningReport. If not, continue checking later versions until you find one that does.
If you had a more complicated solution that used more THINC API functionality, a quick check would be:
Make sure the project builds cleanly.
Go into the project references and remove the reference to THINC API. Now you will have a compilation error everywhere that THINC API is referenced.
Add a reference to your earliest version of THINC API.
Rebuild. If there are still compiler errors, then your code references one or more THINC methods that do not exist in this version. Move on to the next version and rebuild.
Once your project builds cleanly again, you have found the version of THINC API to reference.
You could also code a tool that examines your code (via code analysis) or your compiled assembly (via reflection) to find all THINC API functionality, and then looks at multiple versions of THINC API to find the earliest that implements all of the functionality. That shouldn't be difficult, but still seems like overkill.
For your purposes, it would also be convenient to have a table of all THINC API methods, vs. the versions in which those methods are supported. I don't have such a table, but someone conceivably might.
All of these methods just check whether certain functions exist in a given version of THINC API. They won't warn you about any breaking changes or different behavior between different versions. That requires knowing the API, checking the release notes, and/or testing.
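If you do decide to write the small tool mentioned above, a reflection-based probe is enough for the existence check. Here is a minimal sketch, assuming one folder per API version and a hypothetical ThincApi.dll file name (adjust both to your layout); note that, as said, it only proves the methods exist, not that they behave the same:

    using System;
    using System.IO;
    using System.Linq;
    using System.Reflection;

    class MinVersionProbe
    {
        static readonly string[] RequiredMethods =
            { "GetHourMeterCount", "GetActiveProgramName", "GetMachiningReport" };

        static void Main()
        {
            // One sub-directory per API version, e.g. versions\1.9\ThincApi.dll.
            foreach (var dir in Directory.GetDirectories(@"c:\thinc\versions").OrderBy(d => d))
            {
                var api = Assembly.LoadFrom(Path.Combine(dir, "ThincApi.dll")); // hypothetical name
                bool hasAll = RequiredMethods.All(name =>
                    api.GetExportedTypes().Any(t =>
                        t.GetMethods().Any(m => m.Name == name)));

                if (hasAll)
                {
                    Console.WriteLine("Earliest version with all methods: " + dir);
                    return;
                }
            }
            Console.WriteLine("No probed version exposes all three methods.");
        }
    }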

Sonar runner on a project containing c# and javascript

I have been trying to get Sonar code analysis to work on a C# project. Since it's a web project, I'd also like to run the analysis on the JavaScript.
However, as mentioned in the following link, you cannot run multi-module projects on a .NET solution (http://sonar.15.x6.nabble.com/Multi-language-javascript-amp-c-td5011530.html).
The suggested workaround is to trigger two analysis profiles separately and then combine them with the views plugin (http://www.sonarsource.com/products/plugins/governance/portfolio-management/)
But this plugin costs about $1,800. Because Sonar can analyse multiple projects in .NET through the solution file, it disables multiple modules for .NET solutions (to prevent a specific error).
I find it really annoying that this forces me to use a paid module (and not a cheap one) for a sub-optimal workaround.
Are there any other better solutions for this?
No, there's currently no better solution for this case. This issue has been identified and we'll take a look at it during one of the next sprints - but I'm not sure it can be solved easily.
You can watch and vote for it here: http://jira.codehaus.org/browse/SONARDOTNT-291
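For reference, the two-pass workaround looks roughly like this: two directories, each with its own sonar-project.properties, and sonar-runner executed once in each. The file contents below are illustrative only; keys and values may need adjusting for your Sonar and runner versions.

    # cs-analysis/sonar-project.properties - first pass, the C# solution
    sonar.projectKey=myapp-cs
    sonar.projectName=MyApp (C#)
    sonar.projectVersion=1.0
    sonar.language=cs
    sonar.sources=.

    # js-analysis/sonar-project.properties - second pass, the JavaScript only
    sonar.projectKey=myapp-js
    sonar.projectName=MyApp (JS)
    sonar.projectVersion=1.0
    sonar.language=js
    sonar.sources=../Scripts

This produces two independent projects in Sonar; combining their numbers into one dashboard is exactly the part that currently requires the paid views plugin.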

Umbraco Code Library - Version Non-specific

I've created a code library for use with Umbraco; as you'd expect, it does all of the common tasks that I use over and over. I work for a digital agency and we support sites built on a range of Umbraco versions (4.5.x onwards).
To date we've always compiled the library against the same DLLs as the current project uses, but this isn't great, and we've ended up with lots of different branches, one for each version. Having this many branches is a nightmare, and I'm trying to find a solution with one project that can be used with all versions.
I'm just wondering if anyone can think of or knows a way of doing this, or has any experience with it?
If you code purely against the INode interface, you should be able to keep your library independent of the version. DynamicNode and DynamicMedia both implement INode.
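For illustration, a minimal sketch of such a version-independent helper, coded purely against umbraco.interfaces (this assumes the INode.GetProperty and IProperty.Value members; the helper name and usage are otherwise hypothetical):

    using umbraco.interfaces;

    public static class NodeHelpers
    {
        // Compiles against any Umbraco version shipping INode/IProperty (4.5.x+),
        // because only interface members are used - no concrete node types.
        public static string GetTextOrDefault(INode node, string alias, string fallback)
        {
            IProperty property = node.GetProperty(alias);
            return property == null || property.Value == null ? fallback : property.Value;
        }
    }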

Roslyn vs Reflection for TypeScript code generator

I'm developing a TypeScript code generator that will use custom attributes on C# classes to generate TypeScript definitions and code files.
I'm considering two options for TypeScript code generation / source file analysis:
Reflection on compiled assemblies
Roslyn CTP
The tool would use custom attributes on properties and methods to generate a TypeScript file. Right now I'm not planning to convert the C# method body to JavaScript, but in the future this may be done. So for this reason I am seriously considering Roslyn. However to simply generate the outline of my TypeScript classes I think I could use reflection and custom attributes.
I am wondering:
a) Does Roslyn provide functionality that is impossible with Reflection? My understanding is that I cannot get method bodies with Reflection.
b) Would the Roslyn CTP license prevent me from distributing the tool under an open source license? This is not clear to me after reading the license.
I just did something along these lines - it works great for creating your data model in TypeScript from your C# classes. I built it to generate a single AMD module with an interface which mimics the basic data of your models. It also supports generics, and creates a class with Knockout properties, including a toJS() method and an update(data: Interface) method to update your class.
The whole thing is just a single T4 template. If anyone finds this and is interested: http://spabuilder.wordpress.com/2014/07/31/generating-typescript-from-c/
Also honors [KeyAttribute] and [Timespan] attributes for data models if you are using data annotations.
I've been messing around with generating JS, and I'm finding Reflection to be a better tool for this. I'm basically pointing my generator at the bin folder of the project the metadata comes from. There might be some difficulties loading all the needed assemblies, and caveats around the assembly versions in the bin folder versus the versions your generator project references. But once you get past all of this, which I did with minimal difficulty, Reflection is a lot easier to use, and more reliable.
With Roslyn, you are basically just parsing C#. Roslyn does this very well, but I'm hesitant to switch to it from Reflection. With reflection, you get metadata more reliably.
Let's say you want the Prefix property of a RoutePrefixAttribute that decorates a controller class. If you're parsing C#, you may have:
[RoutePrefix("stringliteral")] or [RoutePrefix(constantString)]. So you have to worry about whether it's a literal or a constant expression, then find out how to get the value of a constant expression, and worry about all the different ways in which you can pass parameters to an attribute (for example, will this break your code: [RoutePrefix(Prefix="literal")])?
Once you're dealing with the actual runtime objects with reflection, everything is just easier. You have a nice RoutePrefixAttribute object, and you can go routePrefix.Prefix to get, reliably, the value of the prefix.
This is just one example of how doing things with Reflection is easier. It's the difference between gathering metadata from a set of C# objects in a type-safe way, and scraping data from C# code, albeit with a really nice scraping tool.
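To illustrate, here is a sketch of that reflection approach: point the generator at a compiled assembly and read attribute values as plain runtime objects. The path and assembly name are hypothetical; RoutePrefixAttribute is ASP.NET Web API's.

    using System;
    using System.Reflection;
    using System.Web.Http;

    class AttributeScan
    {
        static void Main()
        {
            var assembly = Assembly.LoadFrom(@"c:\myproject\bin\MyWebApp.dll"); // hypothetical
            foreach (var type in assembly.GetTypes())
            {
                // Whether the source used a string literal or a constant expression,
                // reflection hands back the final evaluated value.
                var prefix = type.GetCustomAttribute<RoutePrefixAttribute>();
                if (prefix != null)
                    Console.WriteLine(type.Name + " -> " + prefix.Prefix);
            }
        }
    }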
EDIT: Since writing this answer, I've bitten the bullet and switched to Roslyn. It's fairly powerful once you get the hang of it, and I did find one big advantage: you can get a reference to the workspace from a Visual Studio plugin and easily do all kinds of stuff within the plugin.
Update - Nov 2018
The accepted answer was valid for its time; it's dated April 2013.
Roslyn is now distributed under the Apache License, Version 2.0.
An excerpt from the license:
Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions: ...
Roslyn now has a number of NuGet packages.
Doesn't the license only forbid you personally from distributing the binaries? It doesn't forbid you from adding a dependency from your NuGet package to the Roslyn CTP NuGet package. You personally cannot deliver the bits, but you can have NuGet pull in Roslyn automatically.
So just avoid checking Roslyn source or binaries into your version control.
The Roslyn website now clearly states that:
The current license is for evaluation and preview purposes only and does not allow redistribution of the Roslyn binaries. Sharing of sample projects built on the Roslyn APIs is permitted, but sample users must have either the Roslyn CTP or the Roslyn NuGet package installed in order to build and run.
I wouldn't use the current Roslyn CTP - simply because there will be new versions in 2014, and those will certainly bring many breaking changes. So you might end up with totally deprecated code.
(There was recently a blog post on this by an MS team member, but I'm afraid I don't have the link at hand.)
Edit: There's a good chance that Roslyn will then get a license that also permits commercial use...
Update - July 2015
Roslyn is still in CTP, but their FAQ on GitHub is much more to the point:
For sample code or learning purposes, the recommended way to redistribute the Roslyn DLLs is with the Roslyn NuGet package: Microsoft.CodeAnalysis (http://www.nuget.org/packages/Microsoft.CodeAnalysis).
So it appears that you still cannot redistribute the DLLs in finished products. The project will need to be open sourced, and the solution will need to reference the NuGet package.
Original Answer (November 2012)
I don't believe you can distribute under open source.
6. DISTRIBUTABLE CODE. The software contains code that you are permitted to distribute in programs you develop if you comply with the terms below.
6.c Distribution Restrictions. You may not modify or distribute the source code of any Distributable Code so that any part of it becomes subject to an Excluded License. An Excluded License is one that requires, as a condition of use, modification or distribution, that the code be disclosed or distributed in source code form, or that others have the right to modify it.
At first it sounds like you could do it if you just include the Roslyn binaries, but the Distributable Code definition specifically says "The software contains code...", and I believe that is what everything after it refers to.
To your other question: Roslyn isn't fully finished and is still in beta. I don't know if it is currently in a state that can handle your needs; that's something you may just want to spend a couple of hours tinkering with. I wouldn't expect it to have more functionality than what .NET currently allows. You can see what they recently added in September here, and what is currently not implemented here.
In my experience, T4 generation based on reflection, as TypeLite does it, is somewhat simpler but has some drawbacks: once the project depends on the generated classes, regenerating them after a breaking change (such as a renamed class) leads to a non-compiling project, so running the template again outputs a blank file and the user has a hard time making everything compile again.
So, having the same need, I started experimenting with Roslyn, and it seems very promising, but I have many doubts about how to use it properly...
You can take a look at what I'm doing and maybe help me here: https://github.com/TrabacchinLuigi/RoslynExporter

Plugin-like architecture in .NET

I'm trying to implement a plugin-like application. I know there are already several solutions out there, but this is just going to be a proof of concept, nothing more. The idea would be to make the main application almost featureless by default and then let the plugins know about each other, having them implement all the needed features.
A couple of issues arise:
I want the plugins to know about each other at runtime through my application. That wouldn't mean that at code time they couldn't reference other plugins' assemblies so they could use their interfaces, only that plugin-feature initialization should always go through my main app. For example: if I have both plugins X and Y loaded, and Y wants to use X's features, it should "register" its interest through my application to use those features. I'd have a kind of "dictionary" in my application where I store all the loaded plugins. After registering its interest with my application, plugin Y would get a reference to X so it could use it. Is this a good approach?
When coding plugin Y, which uses X, I'd need to reference X's assembly so I can program against its interface. That raises the issue of versioning. What if I code my plugin Y against an outdated version of plugin X? Should I always use a "central" place where all the assemblies are kept, always holding the up-to-date versions?
Are there by chance any books out there that specifically deal with these kinds of designs for .NET?
Thanks
edit: I think people are drifting away from the two questions I asked. I can take a look at both MEF and #develop, but I'd like to get specific answers to the questions I asked.
I recommend looking into MEF. This is a new way of doing plugins in .NET; it is the recommended way of doing new add-ins for VS2010, for example. I've not used it myself, but what I've read about it looks great. Adding this as an answer at the prodding of others :)
Look into the System.AddIn namespace. It's a little lower-level than MEF, and so should give you the "implement it myself" experience you're looking for.
There is a good book on building what you are looking for: Dissecting a C# Application: Inside SharpDevelop. Here's a link: http://www.icsharpcode.net/OpenSource/SD/InsideSharpDevelop.aspx
The SharpDevelop application is fully plugin-based, and the book talks about how they built it, the pitfalls they faced, and how they overcame them. The book is freely available from the site, or you can buy it too.
I once did it using this example. I loved it, but that was a couple of years ago; I think there might be better solutions now. As far as I remember, the basic idea was that there is an abstract class in your program, and your plug-ins inherit that class and are compiled as DLLs... or something similar using interfaces. Anyway, that approach worked great for me. Later I added a FileSystemWatcher so it could load those DLL plugins while running.
To load an assembly and to get the types it exposes, see the sketch below.
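For illustration, a sketch of that approach: scan a plugins folder, load each DLL, and instantiate every concrete type implementing a shared contract. The folder path is hypothetical, and the IPlugin interface is a stand-in that would really live in a common assembly referenced by both the host and the plugins.

    using System;
    using System.IO;
    using System.Linq;
    using System.Reflection;

    // Stand-in contract; in a real app this lives in a shared assembly.
    public interface IPlugin { string Name { get; } }

    class PluginLoader
    {
        static void Main()
        {
            foreach (var file in Directory.GetFiles(@"c:\myapp\plugins", "*.dll"))
            {
                var assembly = Assembly.LoadFrom(file);       // load the assembly
                var pluginTypes = assembly.GetTypes()         // the types it exposes
                    .Where(t => typeof(IPlugin).IsAssignableFrom(t)
                                && !t.IsAbstract && !t.IsInterface);

                foreach (var type in pluginTypes)
                {
                    var plugin = (IPlugin)Activator.CreateInstance(type);
                    Console.WriteLine("Loaded plugin: " + plugin.Name);
                }
            }
        }
    }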
About the two specific issues you raised:
1) I'm not sure what you are trying to achieve, but my guess is that you want lazy initialization of features, and maybe lazy loading of add-ins. If that's the goal, what you are proposing might work. It could go like this:
The Y plugin provides a list of features it needs to use (it could be done for example through a specific interface implementation or through an xml manifest).
The X add-in implements an API which allows initializing a feature, with a method like Initialize(featureId).
The host application gets the feature list required by Y, loads/initializes the X plugin, and calls Initialize for each feature.
The host application also provides a GetFeature() method which Y can use to get a reference to a 'feature' object, which would be implemented in X.
However, if plugin Y has direct access to the X API, I think it is unnecessary to have all that infrastructure for registering features. Y can just access X's features by using the X API directly, and Y would take care of lazily initializing each feature when required. For example, Y could just call SomeXFeature.DoSomething(), and the implementation of that class would initialize the feature the first time it is used.
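For what it's worth, here is a minimal sketch of the host-mediated registration from point 1; all names are hypothetical rather than taken from an existing framework.

    using System.Collections.Generic;

    public interface IFeature { }

    public interface IPluginHost
    {
        void RegisterFeature(string featureId, IFeature feature);  // X publishes a feature
        IFeature GetFeature(string featureId);                     // Y looks it up later
    }

    public class PluginHost : IPluginHost
    {
        // The "dictionary of loaded features" kept by the main application.
        private readonly Dictionary<string, IFeature> features =
            new Dictionary<string, IFeature>();

        public void RegisterFeature(string featureId, IFeature feature)
        {
            features[featureId] = feature;
        }

        public IFeature GetFeature(string featureId)
        {
            IFeature feature;
            return features.TryGetValue(featureId, out feature) ? feature : null;
        }
    }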
2) If the API of an assembly changes, any assembly depending on it may break. Plugins are just assemblies which depend on other assemblies, so they will also break. Here are some things you can do to alleviate this problem:
Assign a version number to each plugin. This could be just the assembly version.
When loading a plugin, ensure that all dependencies can be properly satisfied (that is, all plugins on which it depends must be present and have the required version). Refuse to load the plugin if dependencies can't be satisfied.
Implement a plugin management tool, to be used for all plugin install/uninstall operations. The manager can check dependencies and report errors when trying to install plugins with unsatisfied dependencies, or when trying to uninstall a plugin on which other plugins depend.
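For the second point, a rough sketch of such a dependency check using reflection-only loading (a simplification; real assembly binding rules are more involved):

    using System;
    using System.IO;
    using System.Reflection;

    static class DependencyCheck
    {
        public static bool CanLoad(string pluginPath)
        {
            var plugin = Assembly.ReflectionOnlyLoadFrom(pluginPath);
            foreach (AssemblyName reference in plugin.GetReferencedAssemblies())
            {
                try
                {
                    // Probe for the referenced assembly without executing any code.
                    var resolved = Assembly.ReflectionOnlyLoad(reference.FullName);
                    if (resolved.GetName().Version < reference.Version)
                        return false;                  // present but too old
                }
                catch (FileNotFoundException)
                {
                    return false;                      // missing entirely
                }
            }
            return true;
        }
    }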
Similar solutions are used by the Mono.Addins framework. In Mono.Addins, each add-in has a version number and a list of add-ins/versions on which it depends. When loading an add-in, the add-in engine ensures that all dependent add-ins with the correct versions are also loaded. It also provides an API and a command line tool for managing the installation of add-ins.
