I'm having trouble finding information on this topic, possibly because I'm not sure how to phrase the question. Hopefully the braintrust here can help or at least advise. This situation might just be me being retentive but it's bugging me so I thought I'd ask for help on how to get around it.
I have a C# library filled with utility classes used in other assemblies. All of my extensions reside in this library and it comes in quite handy. Any of my other libraries or executables that need to use those classes must naturally reference the library.
But one of the extensions I have in there is an extension on the Control class to handle cross thread control updates in a less convoluted fashion. As a consequence the utility library must reference System.Windows.Forms.
The problem is that any library or executable that references the utilities library must now have a reference to System.Windows.Forms as well, or I get a build error for the missing reference. While this is not a big deal, it seems sort of stupid to have assemblies that have nothing to do with controls or forms referencing System.Windows.Forms just because the utilities library does, especially since most of them aren't actually using the InvokeAsRequired() extension I wrote.
I thought about moving the InvokeAsRequired() extension into its own library, which would eliminate the System.Windows.Forms problem, as only assemblies that needed to use the InvokeAsRequired() extension would already have a reference to SWF... but then I'd have a library with only one thing in it, which would probably bother me more.
Is there a way around this requirement beyond separating out the 'offending' method and creating a nearly empty library? Maybe a compile setting or something?
It should be noted that the 'offending method' is actually used across multiple projects that have UI. A lot of the UI updates I do are the result of events coming in, and trying to update Windows Forms controls from another thread causes various UI thread problems. Hence the method handling the Invoke when needed. (Though personally I think that whole InvokeRequired pattern should be wrapped up into the control itself rather than having something external do the thread alignment in the first place.)
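For context, a minimal sketch of what such an extension probably looks like (the name InvokeAsRequired comes from the question; the body is my assumption, not the actual implementation):

using System;
using System.Windows.Forms;

public static class ControlExtensions
{
    // Runs the action on the control's UI thread, marshalling via Invoke only when necessary.
    public static void InvokeAsRequired(this Control control, Action action)
    {
        if (control.InvokeRequired)
            control.Invoke(action);
        else
            action();
    }
}

It is this extension on Control that drags the System.Windows.Forms reference into the utility library.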
If it's just one function, then package it up as a source code file into a NuGet package and then add the NuGet package to your projects. Then, this code will be easily deployable to new projects (as well as easily updateable), but you don't need to create a separate assembly. Just compile it into your application.
You would then store your NuGet package in a local NuGet repository, or get a MyGet account, or even just store it somewhere on your network. Worst case, you can check it into your version control, but I would just check in the "project" that you build the NuGet package from, so you can rebuild the package if need be.
Who knows, at some point you may add more utility functions that require Windows Forms, and at that point you could justify a separate assembly.
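As a rough illustration (the package ID and file name here are made up), a source-only package can be described with a .nuspec that ships the file as content rather than as a compiled assembly:

<?xml version="1.0"?>
<package>
  <metadata>
    <id>MyUtils.InvokeAsRequired.Source</id>
    <version>1.0.0</version>
    <authors>you</authors>
    <description>Source-only helper that compiles into the consuming project.</description>
  </metadata>
  <files>
    <!-- Ships the .cs file into the consuming project instead of adding an assembly reference. -->
    <file src="InvokeAsRequired.cs" target="content" />
  </files>
</package>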
It's easy: you have to move the offending code out. For now it might seem like a minor concern, but in the end you'll be glad you did it now instead of at the moment you are forced to.
Even if it is just one method (for now), move the method into another assembly. That doesn't have to be a brand new one: it can live in the assembly that uses it, if there is only one, or in a UI-oriented assembly that all the others that need it can reference.
You can solve your problem by switching the utility library code from the early-binding pattern to the late-binding pattern when it comes to types declared in the System.Windows.Forms namespace.
This article shows how to do it the short way: Stack Overflow: C#.NET - Type.GetType(“System.Windows.Forms.Form”) returns null
And this code snippet shows how the monoresgen tool from the Mono Project (open source ECMA CLI, C# and .NET implementation) solves the System.Windows.Forms dependency problem.
public const string AssemblySystem_Windows_Forms = "System.Windows.Forms, Version=" + FxVersion + ", Culture=neutral, PublicKeyToken=b77a5c561934e089";

// ...

static Assembly swf;
static Type resxr;
static Type resxw;

/*
 * We load the ResX format stuff on demand, since the classes are in
 * System.Windows.Forms (!!!) and we can't depend on that assembly in mono, yet.
 */
static void LoadResX () {
    if (swf != null)
        return;
    try {
        swf = Assembly.Load(Consts.AssemblySystem_Windows_Forms);
        resxr = swf.GetType("System.Resources.ResXResourceReader");
        resxw = swf.GetType("System.Resources.ResXResourceWriter");
    } catch (Exception e) {
        throw new Exception ("Cannot load support for ResX format: " + e.Message);
    }
}

// ...

static IResourceReader GetReader (Stream stream, string name, bool useSourcePath) {
    string format = Path.GetExtension (name);
    switch (format.ToLower (System.Globalization.CultureInfo.InvariantCulture)) {
    // ...
    case ".resx":
        LoadResX ();
        IResourceReader reader = (IResourceReader) Activator.CreateInstance (
            resxr, new object[] {stream});
        if (useSourcePath) { // only possible on 2.0 profile, or higher
            PropertyInfo p = reader.GetType ().GetProperty ("BasePath",
                BindingFlags.Public | BindingFlags.Instance);
            if (p != null && p.CanWrite) {
                p.SetValue (reader, Path.GetDirectoryName (name), null);
            }
        }
        return reader;
    // ...
    }
}
Snippet source: https://github.com/mono/mono/blob/mono-3.10.0/mcs/tools/resgen/monoresgen.cs#L30
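Applied to the question's scenario, a hedged sketch of the same late-binding idea: resolve the Control members at run time so the utility assembly needs no compile-time reference to System.Windows.Forms. (The extension-on-object shape and the member lookups are my illustration, not the asker's actual code.)

using System;
using System.Reflection;

public static class LateBoundControlExtensions
{
    // Works against any Control instance handed in as object; "InvokeRequired" and
    // "Invoke" are the real System.Windows.Forms.Control members, resolved by name.
    public static void InvokeAsRequired(this object control, Action action)
    {
        Type type = control.GetType();
        bool invokeRequired = (bool)type.GetProperty("InvokeRequired").GetValue(control, null);
        if (invokeRequired)
            type.GetMethod("Invoke", new[] { typeof(Delegate) }).Invoke(control, new object[] { action });
        else
            action();
    }
}

The trade-off is the usual one with late binding: typos in member names only surface at run time, and an extension method on object is visible everywhere.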
The question in short is: How do you reference a second script containing reusable script code, under the constraints that you need to be able to unload and reload the scripts when either of them changes without restarting the host application?
I'm trying to compile a script class using the CS-Script "compiler as service" (CSScript.Evaluator), while referencing an assembly that has just been compiled from a second "library" script. The purpose is that the library script should contain code that can be reused for different scripts.
Here is a sample code that illustrates the idea but also causes a CompilerException at runtime.
using CSScriptLibrary;
using NUnit.Framework;

[TestFixture]
public class ScriptReferencingTests
{
    private const string LibraryScriptCode = @"
        public class Helper
        {
            public static int AddOne(int x)
            {
                return x + 1;
            }
        }";

    private const string ScriptCode = @"
        using System;
        public class Script
        {
            public int SumAndAddOne(int a, int b)
            {
                return Helper.AddOne(a+b);
            }
        }";

    [Test]
    public void CSScriptEvaluator_CanReferenceCompiledAssembly()
    {
        var libraryEvaluator = CSScript.Evaluator.CompileCode(LibraryScriptCode);
        var libraryAssembly = libraryEvaluator.GetCompiledAssembly();
        var evaluatorWithReference = CSScript.Evaluator.ReferenceAssembly(libraryAssembly);

        dynamic scriptInstance = evaluatorWithReference.LoadCode(ScriptCode);
        var result = scriptInstance.SumAndAddOne(1, 2);
        Assert.That(result, Is.EqualTo(4));
    }
}
To run the code you need NuGet packages NUnit and cs-script.
This line causes a CompilerException at runtime:
dynamic scriptInstance = evaluatorWithReference.LoadCode(ScriptCode);
{interactive}(7,23): error CS0584: Internal compiler error: The invoked member is not supported in a dynamic assembly.
{interactive}(7,9): error CS0029: Cannot implicitly convert type '<fake$type>' to 'int'
Again, the reason for using CSScript.Evaluator.LoadCode instead of CSScript.LoadCode is so that the script can be reloaded at any time without restarting the host application when either of the scripts changes. (CSScript.LoadCode already supports including other scripts according to http://www.csscript.net/help/Importing_scripts.html)
Here is the documentation on the CS-Script Evaluator: http://www.csscript.net/help/evaluator.html
The lack of Google results for this is discouraging, but I hope I'm missing something simple. Any help would be greatly appreciated.
(This question should be filed under the tag cs-script which does not exist.)
There is some slight confusion here. Evaluator is not the only way to achieve reloadable script behavior. CSScript.LoadCode allows reloading as well.
I do indeed advise considering CSScript.Evaluator.LoadCode as the first candidate for the hosting model, as it offers less overhead and an arguably more convenient reloading model. However, it comes at a cost: you have very little control over reloading and the inclusion of dependencies (assemblies, scripts), memory leaks are not 100% avoidable, and it makes script debugging completely impossible (a Mono bug).
In your case I would really advise you to move to the more conventional hosting model: CodeDOM.
Have a look at the "[cs-script]\Samples\Hosting\CodeDOM\Modifying script without restart" sample.
And "[cs-script]\Samples\Hosting\CodeDOM\InterfaceAlignment" will also give you an idea of how to use interfaces with reloading.
CodeDOM was for years the default CS-Script hosting model, and it is in fact very robust, intuitive and manageable. The only real drawback is that every object you pass to (or get from) the script needs to be serializable or inherit from MarshalByRefObject. This is a side effect of the script being executed in the "automatic" separate AppDomain, so you have to deal with all the "pleasures" of Remoting.
BTW, this is the only reason I implemented the Mono-based evaluator.
The CodeDOM model will also automatically manage the dependencies and recompile them when needed. But it looks like you are aware of this anyway.
CodeDOM also allows you to define precisely the mechanism of checking dependencies for changes:
//the default algorithm "recompile if script or dependency is changed"
CSScript.IsOutOfDateAlgorithm = CSScript.CachProbing.Advanced;
or
//custom algorithm "never recompile script"
CSScript.IsOutOfDateAlgorithm = (s, a) => false;
The quick solution to the CompilerException appears to be to not use the Evaluator to compile the library assembly, but instead plain CSScript.CompileCode, like so:
var compiledAssemblyName = CSScript.CompileCode(LibraryScriptCode);
var evaluatorWithReference = CSScript.Evaluator.ReferenceAssembly(compiledAssemblyName);
dynamic scriptInstance = evaluatorWithReference.LoadCode(ScriptCode);
However, as stated in the previous answer, this limits the possibilities for dependency control that the CodeDOM model offers (like css_include). Also, changes to the LibraryScriptCode are not picked up, which again limits the usefulness of the Evaluator method.
The solution I chose is the AsmHelper.CreateObject and AsmHelper.AlignToInterface<T> methods. This lets you use the regular css_include in your scripts, while at the same time allowing you at any time to reload the scripts by disposing the AsmHelper and starting over. My solution looks something like this:
AsmHelper asmHelper = new AsmHelper(CSScript.Compile(filePath), null, false);
object obj = asmHelper.CreateObject("*");
IMyInterface instance = asmHelper.TryAlignToInterface<IMyInterface>(obj);
// Any other interfaces you want to instantiate...
...
if (instance != null)
instance.MyScriptMethod();
Once a change is detected (I use FileSystemWatcher), you just call asmHelper.Dispose and run the above code again.
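For illustration, a rough sketch of that change-detection wiring (the FileSystemWatcher setup is my own arrangement, not part of the CS-Script API):

// Watch the script file and rebuild the AsmHelper whenever it changes.
var watcher = new FileSystemWatcher(Path.GetDirectoryName(filePath), Path.GetFileName(filePath));
watcher.Changed += (sender, e) =>
{
    asmHelper.Dispose();   // unload the old script AppDomain
    asmHelper = new AsmHelper(CSScript.Compile(filePath), null, false);
    instance = asmHelper.TryAlignToInterface<IMyInterface>(asmHelper.CreateObject("*"));
};
watcher.EnableRaisingEvents = true;

In practice you may also want to debounce the Changed event, since editors often raise it more than once per save.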
This method requires the script class to be marked with the Serializable attribute, or simply inherit from MarshalByRefObject.
Note that your script class does not need to inherit any interface. The AlignToInterface works both with and without it. You could use dynamic here, but I prefer having a strongly typed interface to avoid errors down the line.
I couldn't get the built in try-methods to work, so I made this extension method for less clutter when it is not known whether or not the interface is implemented:
public static class InterfaceExtensions
{
    public static T TryAlignToInterface<T>(this AsmHelper helper, object obj) where T : class
    {
        try
        {
            return helper.AlignToInterface<T>(obj);
        }
        catch
        {
            return null;
        }
    }
}
Most of this is explained in the hosting guidelines (http://www.csscript.net/help/script_hosting_guideline_.html), and there are helpful samples mentioned in the previous post.
I feel I might have missed something regarding script change detection, but this method works solidly.
I am developing a C#/.NET 3.5 application. I am using a legacy DLL written in C, signals.dll, which I invoke from a .NET wrapper using P/Invoke. I am calling two types of processing functions, type A and type B. When I call only one type of processing, all works fine. When I interleave calls to A and B processing, the resulting data is corrupted. I believe that signals.dll uses C-style global variables and the shared state gets corrupted.
To resolve that, I created two copies of the DLL on disk, signals.dll and signals2.dll. Then I modified the .NET P/Invoke wrapper to direct type A processing to one DLL and type B processing to the other instance. And now all works fine.
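A hedged sketch of that two-copies workaround: identical entry points bound to two different file names, so each processing type gets its own copy of the DLL's global state (ProcessTypeA/ProcessTypeB are placeholder export names, not the real ones):

using System.Runtime.InteropServices;

static class SignalsA
{
    // Bound to the original DLL; used only for type A processing.
    [DllImport("signals.dll", CallingConvention = CallingConvention.Cdecl)]
    public static extern int ProcessTypeA(double[] data, int length);
}

static class SignalsB
{
    // Bound to the on-disk copy; used only for type B processing.
    [DllImport("signals2.dll", CallingConvention = CallingConvention.Cdecl)]
    public static extern int ProcessTypeB(double[] data, int length);
}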
Then I saw a similar problem, and a solution for it, on the forums (Supporting multiple instances of a plugin DLL with global data).
Basically, the proposed solution is to do this dynamically from code: create a new copy of the DLL on disk as needed, load it, and invoke functions from it. The key part of the code looks like this:
private IntPtr dllHandle;

string myDllPath = Path.Combine(dllDir, String.Format("mylib-{0}.dll", GetHashCode()));
File.Copy(origDllPath, myDllPath);
dllPath = myDllPath;

dllHandle = LoadLibrary(dllPath);
_getVersion = GetProcEntryDelegate<_getVersionDelegate>(dllHandle, "GetVersion");

private delegate int _getVersionDelegate();
private readonly _getVersionDelegate _getVersion;

public int GetVersion()
{
    return _getVersion();
}

private static D GetProcEntryDelegate<D>(IntPtr hModule, string name)
    where D : class
{
    IntPtr addr = _getProcAddress(hModule, name);
    if (addr == IntPtr.Zero)
        throw new Win32Exception();
    return Marshal.GetDelegateForFunctionPointer(addr, typeof(D)) as D;
}
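For completeness, the Win32 imports that snippet relies on look roughly like this (standard kernel32 signatures; the snippet's _getProcAddress presumably wraps GetProcAddress, and FreeLibrary is added so a copy can be unloaded and deleted later):

// requires: using System.Runtime.InteropServices;
[DllImport("kernel32", SetLastError = true, CharSet = CharSet.Ansi)]
private static extern IntPtr LoadLibrary(string lpFileName);

[DllImport("kernel32", SetLastError = true)]
private static extern IntPtr GetProcAddress(IntPtr hModule, string procName);

[DllImport("kernel32", SetLastError = true)]
[return: MarshalAs(UnmanagedType.Bool)]
private static extern bool FreeLibrary(IntPtr hModule);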
What comes to my mind is: would it be possible to modify the above code to create the copy of the DLL IN MEMORY, not on disk, and load it from there? I think the IntPtr dllHandle just needs to be fooled into getting its value from memory rather than from LoadLibrary. How can I do that?
Both LoadLibrary and LoadLibraryEx require a file path. You'll need a custom loading procedure, including memory mappings and what-not. I've found a blog post ("Loading a DLL from memory") describing the procedure, and a matching GitHub project: MemoryModule.
There is nothing 'just' about it :) - it's far more complex and involved.
Here is a link that might help - Load Library/Module from Memory
And as @Hans Passant said, I'd discourage you from going that way - even though it may be a tempting solution for some scenarios (and honestly, I don't see that you really need it; it would merely be nice to have).
It involves dealing with the portable executable format - and I doubt that project covers all that needs to be done.
You could try making your own C++/CLI wrapper - or exporting the MemoryLoadLibrary function and trying to P/Invoke it - but I doubt that'd work easily.
Before I start, let me clear up any confusion before anyone suggests or asks the obvious question...
This question is about "Windows Mobile 6 Professional", NOT Windows Phone 7, and it can't be ported to Windows Phone 7 as it has to go onto an older device.
Now, with that out of the way...
I'm currently trying to port a C# library that I have the source for so that it runs on a Windows Mobile 6 device. I've coded these things for what seems like forever, but there's one thing I've never had to deal with until now, and that's reflection.
Now, everyone knows that the .NET Compact Framework has some limitations, and it appears that lack of support for a lot of the methods and properties in the System.Reflection namespace is one of them.
The desktop version of the library targets .NET v2.0, and I have devices running .NET 3.5 SP1, so for the most part I've had very little trouble getting things to work. I cannot, however, seem to find a sensible way to get the following two chunks of code working:
var a = AppDomain.CurrentDomain**.GetAssemblies**();
foreach (var assembly in a)
{
    if (assembly is System.Reflection**.Emit.**AssemblyBuilder) continue;
    if (assembly**.GetType().**FullName == "System.Reflection.Emit.InternalAssemblyBuilder") continue;
    if (assembly**.GlobalAssemblyCache** && assembly**.CodeBase** == Assembly.GetExecutingAssembly()**.CodeBase**) continue;

    foreach (var t in GetLoadableTypes(assembly))
    {
        if (t.IsInterface) continue;
        if (t.IsAbstract) continue;
        if (t.IsNotPublic) continue;
        if (!typeof(IGeometryServices).IsAssignableFrom(t)) continue;

        var constuctors = t.GetConstructors();
        foreach (var constructorInfo in constuctors)
        {
            if (constructorInfo.IsPublic && constructorInfo.GetParameters().Length == 0)
                return (IGeometryServices)Activator.CreateInstance(t);
        }
    }
}
And
catch (**ReflectionTypeLoadException** ex)
{
    var types = ex**.Types**;
    IList<Type> list = new List<Type>(types**.Length**);
    foreach (var t in types)
        if (t != null && t**.IsPublic**)
            list.Add(t);
    return list;
}
Specifically, the items in bold in the above code are the methods and properties that don't appear to be present in the Compact Framework, and after spending quite a chunk of time with IntelliSense and the Object Browser, I've not found anything that returns (or makes available) the same types.
My question then is as follows:
Does anyone have any experience of using reflection in the .NET Compact Framework, and can you suggest how this code can be made to work as expected? Or am I going to have to start writing custom stubs and functionality to replace the missing methods?
I know there are some reflection capabilities in the framework, so I'm sure there must be an equivalent way of achieving this.
Just on a final note, for anyone who may recognise the code: YES, it is from the NetTopologySuite, and yes, that is the library I'm trying to build a WM6 version of, so if you know of anyone who has already done it, please do leave a comment to that effect and I'll go take a look at the easier path :-)
======================================================================
Update after posting
It appears 'Bold' text doesn't work in code snippets, so those methods / properties in the above code that are surrounded by ** are the parts supposed to be in bold.
So, after studying some older builds and an experimental Silverlight build (which apparently has a lot of the same limitations as Windows Mobile / CE), I figured out how to make the magic work.
The first part was to fill the array with 'Assembly' structures representing the assemblies to search. Originally it was:
var a = AppDomain.CurrentDomain.GetAssemblies();
but since GetAssemblies doesn't exist on WM, the quickest way is to use the following:
var a = new[] {Assembly.GetCallingAssembly(), Assembly.GetExecutingAssembly()};
Now, in my case this meant that I got two identical assemblies, but if I was calling into an assembly from my main exe, which in turn used GeoAPI, then two different assemblies would show up here.
The next challenge was the "stripping out" of assemblies we didn't need to check:
if (assembly is System.Reflection.Emit.AssemblyBuilder) continue;
if (assembly.GetType().FullName == "System.Reflection.Emit.InternalAssemblyBuilder") continue;
if (assembly.GlobalAssemblyCache && assembly.CodeBase == Assembly.GetExecutingAssembly().CodeBase) continue;
The first and third checks use members that never crop up in WM6, so it's safe to just comment them out. The second one does work, but since you'll never see any assemblies with the given name on WM6 (due to the builder type being missing), then again, you should be safe to comment it out. I didn't comment it out, but it also never evaluated to true.
The final part (at least as far as the original puzzle went anyway) was this:
foreach (var t in GetLoadableTypes(assembly))
{
    if (t.IsInterface) continue;
    if (t.IsAbstract) continue;
    if (t.IsNotPublic) continue;
    if (!typeof(IGeometryServices).IsAssignableFrom(t)) continue;
In my original attempt, 'IsInterface' and 'IsNotPublic' were missing; however, once I managed to fix up the contents of the 'var a' variable above with the data type it expected, everything started to work OK with nothing missing.
The final question is: did this solve everything? Well, not quite...
It turns out that the whole purpose of the 'ReflectInstance' method in GeoAPI was to find a user-defined 'GeometryFactory' via an 'IGeometryServices' interface.
Since I was only reflecting over the assemblies I was calling from (var a above), the 'NetTopologySuite' assembly (where the geometry factory is defined) was not added to the selection list (obviously CurrentDomain.GetAssemblies would have included it).
The net result was that I still ended up hitting the exception at the end, as no assemblies of the correct type could be located.
It turns out, however, that the 'Geometry' constructor in GeoAPI has an overload that allows you to pass in a GeometryFactory; when you do this it completely ignores the ReflectInstance method and just uses what it's told.
Or, to put it another way, I never had to do any of this in the first place. I could have just made the method return 'null' and passed in the geometry factory I wanted to use.
Anyway, if anyone is interested I now have a working copy of:
GeoAPI.NET
NetTopologySuite
Wintellect.PowerCollections
Built and working fine under Windows Mobile 6 and Windows CE with .NET CF 3.5.
In my VSPackage I need to replace a reference to a property in code with its actual value. For example:
public static void Main(string[] args) {
    Console.WriteLine(Resource.HelloWorld);
}
What I want is to replace "Resource.HelloWorld" with its actual value - that is, find the class Resource and get the value of its static property HelloWorld. Does Visual Studio expose any API to handle the code model of the project? It definitely has one, because this is very similar to the common task of renaming variables. I don't want to use reflection on the output assembly, because it's slow and it locks the file for a while.
There is no straightforward way to do this that I know of. Reliably getting an AST out of Visual Studio (and changes to it) has always been a big problem. Part of the goal of the Roslyn project is to create a unified way of doing this, because many tool windows had their own way of doing this sort of thing.
There are four ways to do this:
Symbols
FileCodeModel + CodeDOM
Roslyn AST
Unexplored Method
Symbols
I believe most tool windows, such as the CodeView and things like Code Element Search, use the symbols created from a compiled build. This is not ideal, as it is a little more heavyweight and hard to keep in sync; you'd have to cache symbols to make this not slow. Using Reflector, you can see how the CodeView implements this.
This approach uses private assemblies. The code for getting the symbols would look something like this:
var compilerHost = new IDECompilerHost();

var typeEnumerator = (from compiler in compilerHost.Compilers.Cast<IDECompiler>()
                      from type in compiler.GetCompilation().MainAssembly.Types
                      select new Tuple<IDECompiler, CSharpType>(compiler, type));

foreach (var typeTuple in typeEnumerator)
{
    Trace.WriteLine(typeTuple.Item2.Name);

    var csType = typeTuple.Item2;
    foreach (var loc in csType.SourceLocations)
    {
        var file = loc.FileName.Value;
        var line = loc.Position.Line;
        var charPos = loc.Position.Character;
    }
}
FileCodeModel + CodeDOM
You could try using the EnvDTE service to get the FileCodeModel associated with a code document. This will let you get at classes and methods, but it does not support getting the method body, and you're messing with buggy COM. It's ugly because a COM object reference to a CodeFunction or CodeClass can get invalidated without you knowing it, meaning you'd have to keep your own mirror.
Roslyn AST
This provides the same capabilities as both the FileCodeModel and Symbols approaches. I've been playing with it and it's actually not too bad.
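The Roslyn available at the time was a CTP whose API has since changed, so purely as an illustrative sketch with today's Microsoft.CodeAnalysis.CSharp package, finding the "Resource.HelloWorld" references in a source file might look like this (documentPath is an assumed variable):

using System.IO;
using System.Linq;
using Microsoft.CodeAnalysis.CSharp;
using Microsoft.CodeAnalysis.CSharp.Syntax;

var tree = CSharpSyntaxTree.ParseText(File.ReadAllText(documentPath));
var helloWorldReferences = tree.GetRoot()
    .DescendantNodes()
    .OfType<MemberAccessExpressionSyntax>()
    .Where(m => m.ToString() == "Resource.HelloWorld")
    .ToList();
// Each node exposes its exact source span, so the text can be rewritten in place.

To resolve the property's actual value you would additionally need a semantic model (a compilation with references), not just the syntax tree.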
Unexplored Method
You could try getting the underlying LanguageServiceProvider that is associated with the Code Document. But this is really difficult to pull off, and leaves you with many issues.
There are a lot of questions and answers around AOP in .NET here on Stack Overflow, often mentioning PostSharp and other third-party products. So there seems to be quite a range of AOP options in the .NET and C# world. But each of those has its restrictions, and after downloading the promising PostSharp I found in their documentation that 'methods have to be virtual' in order to be able to inject code (edit: see ChrisWue's answer and my comment - the virtual constraint must have been on one of the contenders, I suppose). I haven't investigated the accuracy of this statement any further, but its categorical tone made me return to Stack Overflow.
So I'd like to get an answer to this very specific question:
I want to inject simple "if (some-condition) Console.WriteLine" style code to every method and property (static, sealed, internal, virtual, non-virtual, doesn't matter) in my project that does not have a custom annotation, in order to dynamically test my software at run-time. This injected code should not remain in the release build, it is just meant for dynamic testing (thread-related) during development.
What's the easiest way to do this? I stumbled upon Mono.Cecil, which looks ideal, except that you seem to have to write the code that you want to inject in IL. This isn't a huge problem, it's easy to use Mono.Cecil to get an IL version of code written in C#. But nevertheless, if there was something simpler, ideally even built into .NET (I'm still on .NET 3.5), I'd like to know. [Update: If the suggested tool is not part of the .NET Framework, it would be nice if it was open-source, like Mono.Cecil, or freely available]
I was able to solve the problem with Mono.Cecil. I am still amazed how easy to learn, easy to use, and powerful it is. The almost complete lack of documentation did not change that.
These are the 3 sources of documentation I used:
static-method-interception-in-net-with-c-and-monocecil
Migration to 0.9
the source code itself
The first link provides a very gentle introduction, but as it describes an older version of Cecil - and much has changed in the meantime - the second link was very helpful in translating the introduction to Cecil 0.9. After getting started, the (also undocumented) source code was invaluable and answered every question I had - except perhaps those about the .NET platform in general, but there are tons of books and material on that online, I'm sure.
I can now take a DLL or EXE file, modify it, and write it back to disk. The only thing I haven't figured out yet is how to keep the debugging information - file name, line number, etc. currently get lost after writing the DLL or EXE file. My background isn't .NET, so I'm guessing here, and my guess is that I need to look at Mono.Cecil.Pdb to fix that. That's for later - it's not super important for me right now. I'm creating this EXE file and running the application - a complex GUI application, grown over many years, with all the baggage you would expect to find in such a piece of, ahem, software - and it checks things and logs errors for me.
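For the debug-symbol part, my understanding (untested, so treat it as a sketch) is that Mono.Cecil.Pdb can round-trip the symbols via the reader and writer parameters:

var readerParameters = new ReaderParameters
{
    AssemblyResolver = assemblyResolver,
    ReadSymbols = true   // picks up the .pdb sitting next to the assembly
};
var assembly = AssemblyDefinition.ReadAssembly(assemblyFilename, readerParameters);

// ... patch methods as below ...

assembly.Write(assemblyFilename, new WriterParameters { WriteSymbols = true });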
Here's the gist of my code:
DefaultAssemblyResolver assemblyResolver = new DefaultAssemblyResolver();
// so it won't complain about not finding assemblies sitting in the same directory as the dll/exe we are going to patch
assemblyResolver.AddSearchDirectory(assemblyDirectory);
var readerParameters = new ReaderParameters { AssemblyResolver = assemblyResolver };
AssemblyDefinition assembly = AssemblyDefinition.ReadAssembly(assemblyFilename, readerParameters);

foreach (var moduleDefinition in assembly.Modules)
{
    foreach (var type in ModuleDefinitionRocks.GetAllTypes(moduleDefinition))
    {
        foreach (var method in type.Methods)
        {
            if (!HasAttribute("MyCustomAttribute", method.CustomAttributes))
            {
                ILProcessor ilProcessor = method.Body.GetILProcessor();
                ilProcessor.InsertBefore(method.Body.Instructions.First(),
                    ilProcessor.Create(OpCodes.Call, threadCheckerMethod));
                // ...

private static bool HasAttribute(string attributeName, IEnumerable<CustomAttribute> customAttributes)
{
    return GetAttributeByName(attributeName, customAttributes) != null;
}

private static CustomAttribute GetAttributeByName(string attributeName, IEnumerable<CustomAttribute> customAttributes)
{
    foreach (var attribute in customAttributes)
        if (attribute.AttributeType.FullName == attributeName)
            return attribute;
    return null;
}
If someone knows an easier way how to get this done, I'm still interested in an answer and I won't mark this as the solution - unless no easier solutions show up.
I'm not sure where you got the idea that methods have to be virtual. We use PostSharp to time and log calls to WCF service interface implementations, utilizing the OnMethodBoundaryAspect to create an attribute we can decorate the classes with. Quick example:
[Serializable]
public class LogMethodCallAttribute : OnMethodBoundaryAspect
{
    public Type FilterAttributeType { get; set; }

    public LogMethodCallAttribute(Type filterAttributeType)
    {
        FilterAttributeType = filterAttributeType;
    }

    public override void OnEntry(MethodExecutionEventArgs eventArgs)
    {
        if (!Proceed(eventArgs)) return;
        Console.WriteLine(GetMethodName(eventArgs));
    }

    public override void OnException(MethodExecutionEventArgs eventArgs)
    {
        if (!Proceed(eventArgs)) return;
        Console.WriteLine(string.Format("Exception at {0}:\n{1}",
            GetMethodName(eventArgs), eventArgs.Exception));
    }

    public override void OnExit(MethodExecutionEventArgs eventArgs)
    {
        if (!Proceed(eventArgs)) return;
        Console.WriteLine(string.Format("{0} returned {1}",
            GetMethodName(eventArgs), eventArgs.ReturnValue));
    }

    private string GetMethodName(MethodExecutionEventArgs eventArgs)
    {
        return string.Format("{0}.{1}", eventArgs.Method.DeclaringType, eventArgs.Method.Name);
    }

    private bool Proceed(MethodExecutionEventArgs eventArgs)
    {
        return Attribute.GetCustomAttributes(eventArgs.Method, FilterAttributeType).Length == 0;
    }
}
And then use it like this:
[LogMethodCallAttribute(typeof(MyCustomAttribute))]
class MyClass
{
    public void LogMe()
    {
    }

    [MyCustomAttribute]
    public void DoNotLogMe()
    {
    }
}
Works like a charm without having to make any methods virtual in PostSharp 1.5.6. Maybe they have changed it for 2.x, but I certainly hope not - it would make it way less useful.
Update: I'm not sure if you can easily convince PostSharp to only inject code into certain methods based on which attributes they are decorated with. If you look at this tutorial, it only shows ways of filtering on type and method names. We have solved this by passing the type we want to check for into the attribute; then, in OnEntry, you can use reflection to look for the attributes and decide whether to log or not. The result of that is cached, so you only have to do it on the first call.
I adjusted the code above to demonstrate the idea.
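For what it's worth, a sketch of what that caching could look like (this is just the idea, not our production code):

// Remember the filter decision per method so the reflection lookup only runs once.
// Requires: using System.Collections.Generic; using System.Reflection;
private static readonly Dictionary<MethodBase, bool> _proceedCache = new Dictionary<MethodBase, bool>();

private bool Proceed(MethodExecutionEventArgs eventArgs)
{
    lock (_proceedCache)
    {
        bool proceed;
        if (!_proceedCache.TryGetValue(eventArgs.Method, out proceed))
        {
            proceed = Attribute.GetCustomAttributes(eventArgs.Method, FilterAttributeType).Length == 0;
            _proceedCache[eventArgs.Method] = proceed;
        }
        return proceed;
    }
}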