Generic dependency injection with Unity - C#

We are wrapping an existing logging library in our own logging service in a C# application to surround it with predefined methods for certain logging situations.
public class LoggingBlockLoggingService : ILoggingService
{
    private LogWriter writer;

    public LoggingBlockLoggingService(LogWriter writer)
    {
        this.writer = writer;
    }

    // ...logging convenience methods, LogWarning(), etc...
}
I would like to modify this implementation so that it takes in the Type of the class that instantiates it (the underlying logger, LogWriter in this case, would be a singleton). So either make this implementation (and the interface ILoggingService) generic:
public class LoggingBlockLoggingService<T> : ILoggingService<T>
{
    ...
    private string typeName = typeof(T).FullName;
    ...
Or add an additional constructor parameter:
public class LoggingBlockLoggingService : ILoggingService
{
    private LogWriter writer;
    private string typeName;

    public LoggingBlockLoggingService(LogWriter writer, Type type)
    {
        this.writer = writer;
        this.typeName = type.FullName;
    }

    // ...include the typeName in the logs so we know the class that is logging.
}
Is there a way to configure this once in Unity when registering our types? I'd like to avoid having to add an entry for every class that wants to log. Ideally, if someone wants to add logging to a class in our project, they'd just add an ILoggingService to the constructor of the class they are working in, instead of adding another line to our unity config to register each class they are working on.
We are using run time/code configuration, not XML

Yes, you can use:
container.RegisterType(typeof(IMyGenericInterface<>), typeof(MyConcreteGenericClass<>));
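
Applied to the generic flavour of the logging service in the question, that single open-generic registration covers every consumer; no per-class entries are needed. Below is a minimal sketch under that assumption (OrderProcessor and ContainerSetup are illustrative names, and the LogWriter singleton is assumed to be registered elsewhere):

// Sketch only: one open-generic registration maps every ILoggingService<T>
// to LoggingBlockLoggingService<T>; consumers simply declare the dependency.
using Microsoft.Practices.Unity; // Unity 4.x and earlier; the Unity.* packages in 5+

public class OrderProcessor // hypothetical consumer
{
    private readonly ILoggingService<OrderProcessor> log;

    public OrderProcessor(ILoggingService<OrderProcessor> log)
    {
        this.log = log;
    }
}

public static class ContainerSetup
{
    public static IUnityContainer Build()
    {
        var container = new UnityContainer();

        // LogWriter registration omitted: per the question it is already a singleton.

        // One registration covers ILoggingService<T> for every T.
        container.RegisterType(typeof(ILoggingService<>), typeof(LoggingBlockLoggingService<>));

        return container;
    }
}

// Resolving OrderProcessor hands it a LoggingBlockLoggingService<OrderProcessor>,
// whose typeof(T).FullName identifies the class doing the logging:
// var processor = ContainerSetup.Build().Resolve<OrderProcessor>();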

In your case, where there is a simple, direct generic-parameter-to-generic-parameter mapping, Unity most likely handles that. I doubt more advanced cases are handled, though, because something at some point has to provide the mapping of generic parameters between the types (reordering Key-Value vs. Value-Key, for example).
If Dave's answer is not enough, I'm fairly sure that you could write a plugin to Unity/ObjectBuilder that registers a new strategy or set of strategies covering just about any type mapping you would like, including automatic assembly scanning or materialization of generics.
See the series of articles at http://www.orbifold.net/default/unity-objectbuilder-part-ii/ and the section near
Context.Strategies.AddNew<BuildKeyMappingStrategy>(UnityBuildStage.TypeMapping);
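
If you go that route, the usual entry point is a container extension. A minimal shell might look like the sketch below; MyTypeMappingStrategy is hypothetical and is where your custom mapping logic would live:

// Sketch only: hooking a custom build strategy into Unity via a container extension.
using Microsoft.Practices.ObjectBuilder2;
using Microsoft.Practices.Unity;

public class CustomMappingExtension : UnityContainerExtension
{
    protected override void Initialize()
    {
        // Run our strategy during the type-mapping stage of the build pipeline.
        Context.Strategies.AddNew<MyTypeMappingStrategy>(UnityBuildStage.TypeMapping);
    }
}

// Hypothetical strategy: this is where your custom type-mapping logic would go.
public class MyTypeMappingStrategy : BuilderStrategy
{
    public override void PreBuildUp(IBuilderContext context)
    {
        // e.g. rewrite context.BuildKey to map an interface to a concrete type
    }
}

// Registered once when the container is created:
// container.AddNewExtension<CustomMappingExtension>();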


Using dependency injection in SpecFlow step-file

We're using Unity as our dependency injection framework.
I want to create an acceptance test and need an instance of DossierService.
Unfortunately I get the following exception:
BoDi.ObjectContainerException: 'Interface cannot be resolved [...]'
[Binding]
public class DossierServiceSteps : BaseSteps
{
    private IDossierService dossierService;

    public DossierServiceSteps(IDossierService dossierService)
    {
        this.dossierService = dossierService;
    }
}
What exactly is BoDi? I can't find any useful information..
How can I tell SpecFlow to use the normal Unity container?
Thanks in advance
Edit:
I've tried using SpecFlow.Unity like so:
public static class TestDependencies
{
    [ScenarioDependencies]
    public static IUnityContainer CreateContainer()
    {
        var container = UnityConfig.GetConfiguredContainer();
        container.RegisterTypes(
            typeof(TestDependencies).Assembly.GetTypes().Where(t => Attribute.IsDefined(t, typeof(BindingAttribute))),
            WithMappings.FromMatchingInterface,
            WithName.Default,
            WithLifetime.ContainerControlled);
        return container;
    }
}
In UnityConfig the types are correctly registered
container.RegisterType<IDossierService, DossierService>(new InjectionConstructor(typeof(IDataService), typeof(IDossierRepository), typeof(IDbContext), true));
But I still get the same exception. When I put a breakpoint at the start of the CreateContainer() method of TestDependencies it doesn't break...
For anyone looking for available plugins/libraries that support DI in a SpecFlow project: https://docs.specflow.org/projects/specflow/en/latest/Extend/Available-Plugins.html#plugins-for-di-container
I prefer https://github.com/solidtoken/SpecFlow.DependencyInjection
Example
Create DI container:
[ScenarioDependencies]
public static IServiceCollection CreateServices()
{
    var services = new ServiceCollection();

    Config config = JObject.Parse(File.ReadAllText("config.json")).ToObject<Config>();
    services.AddSingleton(config);

    services.AddScoped<DbConnections>();
    services.AddScoped<ApiClients>();

    return services;
}
Consume dependencies (via parameterized constructors):
[Binding]
public sealed class CalculatorStepDefinitions
{
    private readonly DbConnections dbConnections;

    public CalculatorStepDefinitions(DbConnections dbConnections) => this.dbConnections = dbConnections;

    ...
}
We solved this problem by implementing a SpecFlow runtime plugin. In our case it was Castle.Windsor, but the principle is the same. First you define the plugin, which overrides the default SpecFlow instance resolver:
public class CastleWindsorPlugin : IRuntimePlugin
{
    public void Initialize(RuntimePluginEvents runtimePluginEvents, RuntimePluginParameters runtimePluginParameters)
    {
        runtimePluginEvents.CustomizeScenarioDependencies += (sender, args) =>
        {
            args.ObjectContainer.RegisterTypeAs<CastleWindsorBindingInstanceResolver, IBindingInstanceResolver>();
        };
    }
}
In CastleWindsorBindingInstanceResolver we needed to implement a single method: object ResolveBindingInstance(Type bindingType, IObjectContainer scenarioContainer). This class holds the container and performs the resolution (in your case an instance of IUnityContainer; I recommend registering the container with itself, so that you can also inject the IUnityContainer instance into SpecFlow binding classes).
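As a rough illustration, a Unity-flavoured equivalent of that resolver might look like the sketch below. The class name is made up, UnityConfig.GetConfiguredContainer() is borrowed from the question, and the namespace of IBindingInstanceResolver may differ between SpecFlow versions:

// Sketch only: a Unity-backed binding instance resolver for SpecFlow.
using System;
using BoDi;
using Microsoft.Practices.Unity;        // Unity 4.x and earlier
using TechTalk.SpecFlow.Infrastructure; // IBindingInstanceResolver location may vary by version

public class UnityBindingInstanceResolver : IBindingInstanceResolver
{
    private readonly IUnityContainer container;

    public UnityBindingInstanceResolver()
    {
        // Reuse the application's container so step classes see the same registrations.
        container = UnityConfig.GetConfiguredContainer();

        // Register the container with itself so bindings can also ask for IUnityContainer.
        container.RegisterInstance(container);
    }

    public object ResolveBindingInstance(Type bindingType, IObjectContainer scenarioContainer)
    {
        // Let Unity build the binding class and all of its constructor dependencies.
        return container.Resolve(bindingType);
    }
}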
This plugin needs to be in a separate assembly, and you load it into your test project by adjusting app.config like this:
<specFlow>
  <plugins>
    <add name="PluginAssemblyName" path="." type="Runtime" />
  </plugins>
  ...
</specFlow>
What exactly is BoDi? I can't find any useful information..
BoDi is a very basic dependency injection framework that ships with SpecFlow. You can find its code repository on GitHub.
See this entry from the blog of SpecFlow's creator, Gáspár Nagy (emphasis mine):
SpecFlow uses a special dependency injection framework called BoDi to handle these tasks. BoDi is an embeddable open-source mini DI framework available on GitHub. Although it is a generic purpose DI, its design and features are driven by the needs of SpecFlow. By the time the SpecFlow project started, NuGet had not existed yet, so the libraries were distributed via zip downloads and the assemblies had to be referenced manually. Therefore we wanted to keep the SpecFlow runtime as a single-assembly library. For this, we needed a DI framework that was small and could be embedded as source code. Also we did not want to use a well-known DI framework, because it might have caused a conflict with the version of the framework used by your own project. This led me to create BoDi.
You can find an example of how to register types and interfaces in BoDi here:
[Binding]
public class DependencyConfiguration
{
    private IObjectContainer objectContainer;

    public DependencyConfiguration(IObjectContainer objectContainer)
    {
        this.objectContainer = objectContainer;
    }

    [BeforeScenario(Order = 0)]
    public void ConfigureDependencies()
    {
        if (...)
            objectContainer.RegisterTypeAs<RealDbDriver, IControllerDriver>();
        else
            objectContainer.RegisterTypeAs<StubDbDriver, IControllerDriver>();
    }
}
However, be warned that (in the words of Gáspár Nagy):
Although it is possible to customize the dependency injection used by SpecFlow, it already scratches the boundaries of BoDi’s capacity. A better choice would be to use a more complex DI framework for this.
In this situation you should usually use a mock of your interface.

log4net - Rendering an (object) message depending on the appender

I'm using log4net, and a custom logging facade practically identical to log4net.ILog to access it.
I would like to log to a simple text file, but also use a custom XML appender to write output to XML.
Now here's the issue. Say I'm doing something like
interface IMyILogFacade
{
    // An implementation of this interface will eventually route this call to
    // log4net.ILogger.Trace
    void Trace(string format, params object[] args);
}

class Test
{
    IMyILogFacade log; // assume this is given (injected)

    public void testMethod(Assembly assembly)
    {
        string msg = "Entering Method 'testMethod'. Method Parameter 0 (assembly): {0}";
        log.Trace(msg, assembly);
    }
}
By default, log4net will render the message with a simple String.Format(msg, assembly), when accessing %message in any conversion pattern, or RenderedMessage explicitly.
This behaviour is fine for my text file log. However for my XML log, I want it to be rendered differently. In short, I would then like to reflect on the runtime type of the parameter (assembly) and dump all its public members, and its public members' public members, and.. so on... to a nested XML structure. So instead of rendering with String.Format(msg, assembly), I'd need to use something else, say String.Format(msg, MyXmlDumper.Dump(assembly)).
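To make the idea concrete, a reflection-based dumper along those lines might look roughly like this. MyXmlDumper is just the placeholder name used above, and the depth cap is an assumption to keep recursion over graphs like Assembly bounded:

// Sketch only: dump the public properties of an object (and, recursively, of
// their values) into a nested XML string.
using System;
using System.Linq;
using System.Xml.Linq;

public static class MyXmlDumper
{
    public static string Dump(object obj, int maxDepth = 2)
    {
        return DumpElement("dump", obj, maxDepth).ToString();
    }

    private static XElement DumpElement(string name, object obj, int depth)
    {
        var element = new XElement(name);
        if (obj == null)
        {
            return element;
        }

        Type type = obj.GetType();
        element.SetAttributeValue("type", type.FullName);

        // Stop descending at the depth limit or for simple values.
        if (depth <= 0 || type.IsPrimitive || obj is string)
        {
            element.Value = obj.ToString();
            return element;
        }

        var properties = type.GetProperties()
            .Where(p => p.CanRead && p.GetIndexParameters().Length == 0);

        foreach (var property in properties)
        {
            object value;
            try { value = property.GetValue(obj, null); }
            catch { continue; } // some members throw when read via reflection

            element.Add(DumpElement(property.Name, value, depth - 1));
        }

        return element;
    }
}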
I haven't found any good way to make rendering depend on the type of the appender, though.
My current approach was to have my logger facade transform all calls to Trace, Debug, ... etc into an object of type LogMessage.
public class LogMessage
{
    public string message { get; protected set; }
    public object[] @params { get; protected set; }

    public LogMessage(string message, params object[] @params)
    {
        this.message = message;
        this.@params = @params;
    }
}
I'd then use a class implementing IObjectRenderer to render this. This would be the one for a simple ToString-like rendering:
public class LogMessageStringRenderer : IObjectRenderer
{
    public void RenderObject(RendererMap rendererMap, object obj, TextWriter writer)
    {
        LogMessage logMessage = obj as LogMessage;
        if (logMessage == null)
        {
            throw new InvalidOperationException("LogMessageStringRenderer can only render objects of type LogMessage");
        }
        else
        {
            writer.Write(
                String.Format(logMessage.message, logMessage.@params)
            );
        }
    }
}
and of course it would be straightforward to create an equivalent one for my XML dump.
This doesn't seem like an ideal solution, though. For a start, these renderers can (as far as I know) only be attached to a log4net repository, either in code or in the config file as a root element like
<renderer renderedClass="LogMessage" renderingClass="LogMessageStringRenderer"/>
But that means I'd have to create two logging repositories in my application: one for the text file appender, another one for the XML appender, and set their rendering objects accordingly.
This seems unnecessarily complex, though (not to mention that I don't have a clue how best to use logging repositories at the moment).
Is there a better solution to this problem? Maybe a way to select renderers on a per-appender basis instead of for the whole repository, or some conceptually different solution I haven't thought of.
Any pointers would be very appreciated.
Thanks
Classic: I've been spending the entire day on this, and just after posting here I found a solution...
I'll wrap my messages in the LogMessage class as posted above. Then I'll create a converter deriving from PatternLayoutConverter and apply it to the message parameter:
<param name="ConversionPattern" value="%date - %thread - %level - %-80logger - %-40method - %message%newline" />
<converter>
  <name value="message" />
  <type value="MyMessageConverter" />
</converter>
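A minimal sketch of the converter referenced in that config could look like this. It assumes the LogMessage wrapper from above; anything fancier (e.g. the XML dump) would go where string.Format is called:

// Sketch only: a pattern converter that unwraps LogMessage and renders it.
using System.IO;
using log4net.Core;
using log4net.Layout.Pattern;

public class MyMessageConverter : PatternLayoutConverter
{
    protected override void Convert(TextWriter writer, LoggingEvent loggingEvent)
    {
        // If the message object is our wrapper, render it ourselves...
        var logMessage = loggingEvent.MessageObject as LogMessage;
        if (logMessage != null)
        {
            writer.Write(string.Format(logMessage.message, logMessage.@params));
            return;
        }

        // ...otherwise fall back to log4net's default rendering.
        writer.Write(loggingEvent.RenderedMessage);
    }
}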
If I'm lazy, I may create two converters: one for a ToString implementation, one for my XML-based dumping.
If I'm more ambitious, I may create a converter that allows a custom IObjectRenderer to be specified in log4net.config and uses that to render the message. My IObjectRenderer implementations can then take care of transforming the LogMessage as described in my first post.
Both approaches should work nicely; I will implement this next week, so drop me a message if anyone in the future is looking for specific implementations.

Dynamic Connection String in Entity Framework

GOAL, using the database-first paradigm (not code-first) for a deployed desktop WPF application with unique databases for end users:
1) Have EntityFramework use a connection string determined at run time.
2) Not deploy different app.config files.
Things attempted:
1) Overload the constructor - while successful, this solution is undesired as it leaves the door open for developers to make mistakes and makes unit testing more difficult.
2) Attempted modifying the connection / context factory - threw an exception.
3) Change the default constructor - could be successful, but this solution is undesired as the default constructor is autogenerated.
4) Attempted modifying the ConfigurationSettings - threw an exception; it is read-only.
5) Have a customer-side deployment of app.config - while plausible, this solution is undesired as it requires a rewrite of our deployment engine.
Help?
EDIT:
Some code related to the first item we tried (overloading the constructor):
public partial class DatabaseContext
{
    public DatabaseContext(EntityConnection con)
        : base(con, true)
    {
    }
}

public static class DbContextHelper
{
    public static string ConnectionString { get; set; }

    public static DatabaseContext GetDbContext()
    {
        EntityConnectionStringBuilder builder = new EntityConnectionStringBuilder
        {
            Provider = "System.Data.SqlClient",
            ProviderConnectionString = ConnectionString,
            Metadata = @"res://*/DatabaseContext.csdl|res://*/DatabaseContext.ssdl|res://*/DatabaseContext.msl"
        };
        EntityConnection con = new EntityConnection(builder.ToString());
        return new DatabaseContext(con);
    }
}
Usage:
public void SomeMethod()
{
    using (DatabaseContext db = DbContextHelper.GetDbContext())
    {
        // db things
    }
}
EDIT: code for adding the connection string with the configuration manager:
public MainWindow()
{
    InitializeComponent();
    ConfigurationManager.ConnectionStrings.Add(new ConnectionStringSettings(
        "DatabaseContext",
        @"metadata=res://*/DatabaseContext.csdl|res://*/DatabaseContext.ssdl|res://*/DatabaseContext.msl;provider=System.Data.SqlClient;provider connection string=""data source=sqldev;initial catalog=Dev;persist security info=True;user id=user;password=password;MultipleActiveResultSets=True;App=EntityFramework""",
        "System.Data.EntityClient"));
}
The configuration manager code just throws an exception, so there's no point in any code after that.
The generated DatabaseContext class is partial. With partial you can add code in another file (just remember the partial keyword there) and still be able to re-generate everything. The generator will only overwrite the file it generated; all other files with extra additions to that partial class will not evaporate. There is no problem maintaining generated and handwritten parts there.
Also, the generated class is not sealed, so you can inherit from it. Instead of using DatabaseContext directly, you might try inheriting from it and using the derived class. This derived class will not inherit the constructors, but it will inherit all the other important public members. You will then be able to provide your own constructor, even a default one, that calls e.g. a parameterized base class ctor. Actually, I have not tried it that way, but it looks simple and may work.
What I propose is not using DbContextHelper.GetDbContext() (which is obviously static, and which you think the devs may misuse or forget), but rolling your own DbContext class.
In the project where you have the EDMX and generated DatabaseContext context, add a file with:
public partial class DatabaseContext
{
    protected DatabaseContext(string nameOrConnstring) : base(nameOrConnstring) { }
}
This adds a new overload that exposes the base DbContext constructor taking the connection string.
Then add another file with:
public class ANewContext : DatabaseContext
{
    public ANewContext() : base(DbContextHelper.FindMyConnectionString()) { }
}
And that's all. Since your helper was static anyway, we can call it like that. Just change it to return the connection string, which it needed to determine anyway.
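A minimal sketch of what that helper method might return, reusing the EntityConnectionStringBuilder from the question (the method name comes from the snippet above; everything else is illustrative):

// Sketch only: builds the EF connection string that the derived context passes
// to its base constructor, reusing the pieces from the asker's DbContextHelper.
using System.Data.Entity.Core.EntityClient; // System.Data.EntityClient in EF5

public static class DbContextHelper
{
    public static string ConnectionString { get; set; } // set at runtime, e.g. at startup

    public static string FindMyConnectionString()
    {
        var builder = new EntityConnectionStringBuilder
        {
            Provider = "System.Data.SqlClient",
            ProviderConnectionString = ConnectionString,
            Metadata = @"res://*/DatabaseContext.csdl|res://*/DatabaseContext.ssdl|res://*/DatabaseContext.msl"
        };
        return builder.ToString();
    }
}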
Now rename the classes:
DatabaseContext -> InternalDatabaseContextDontUseMe
ANewContext -> DatabaseContext
or something like that, and I bet no one will ever be confused as to which one of them should be used. Usage:
public void SomeMethod()
{
    using (var db = new DatabaseContext()) // that's ANewContext after renaming
    {
        ...
    }
}
With partial on InternalDatabaseContextDontUseMe, you will be able to regenerate the model, and the extra ctor will not be deleted. With one extra inheritance level, the autogenerated default constructor will be hidden, and devs using the derived class will not be able to call it accidentally; they'll get a new default ctor that does what's needed.
If you are really interested in hearing about what I found while digging in the EF source (LazyContext, factories, resolvers, etc.), look at this article of mine. I put in it everything I could recall today, and while it's somewhat chaotic, it may help you if you like digging and decompiling, especially the EntityConnection/DbProviderFactory resolvers mentioned at the end.

Dependency Injection - Choose DLL and class implementation at runtime through configuration file

I've an API DLL (API.dll, for example) which, in addition to many other things, makes available an abstract class (AbstractClass).
Making use of that AbstractClass, I've implemented it in two different DLLs:
First.API.Implementation.dll with ConcreteImplementation1
Second.API.Implementation.dll with ConcreteImplementation2
Both ConcreteImplementation1 and ConcreteImplementation2 are implementations of the same abstract class.
What I want is an application where I can choose which of those two DLLs to use and, through that, which implementation to use, without the user having to change anything in the code and, if possible, without stopping the application.
Some configuration file through which I can make the application use whatever implementation I want. Something like:
<appconfiguration>
  <implementation_to_use>
    <dll>First.API.Implementation.dll</dll>
    <class>ConcreteImplementation1</class>
  </implementation_to_use>
</appconfiguration>
I know next to nothing about dependency injection apart from its concept, but I guess that's the perfect fit for this task.
I've researched several DI/IoC libraries, but I'm not familiar with all the concepts and names. I can use whatever library I want. From what I can tell, these are the most used: StructureMap, Ninject and Spring.NET.
Moreover, apart from all the DLLs and implementations, I need to indicate a file to be used by the application. Can I indicate its path in that same configuration file?
I just need some tips and directions to implement such a thing. Some examples using one of those libraries, would be awesome.
Thanks.
To get you started with StructureMap, create a console application and include in it:
structuremap.config:
<?xml version="1.0" encoding="utf-8" ?>
<StructureMap MementoStyle="Attribute">
  <DefaultInstance
    PluginType="DemoIoC.AbstractBase,DemoIoC"
    PluggedType="DemoIoC.ConcreteImplementation1,DemoIoC"
    Scope="Singleton" />
</StructureMap>
The PluginType and PluggedType attributes are "FullyQualifiedClassName,AssemblyName"
By default it will look for assemblies in the executable folder; I'm not sure how you would specify another location for the assemblies.
There are plenty of options for Scope, e.g. Singleton, Transient, etc.
Program.cs:
namespace DemoIoC
{
    using System;
    using StructureMap;

    public static class Program
    {
        public static void Main(string[] args)
        {
            // here you initialize structuremap from the config file.
            // You could probably use a FileSystemWatcher to reinitialize
            // whenever the structuremap.config file changes
            ObjectFactory.Initialize(x =>
            {
                x.UseDefaultStructureMapConfigFile = true;
            });

            var concrete = ObjectFactory.GetInstance<AbstractBase>();
            concrete.Method1();

            Console.ReadKey(true);
        }
    }
}
AbstractBase.cs:
namespace DemoIoC
{
    public abstract class AbstractBase
    {
        public abstract void Method1();
    }
}
ConcreteImplementation1.cs:
namespace DemoIoC
{
    using System;

    public class ConcreteImplementation1 : AbstractBase
    {
        public override void Method1()
        {
            Console.WriteLine("Called ConcreteImplementation1");
        }
    }
}

How to use TraceSource across classes

I was recently studying the documentation on TraceSource. Microsoft says that TraceSource is the new way and should be used instead of the old Trace class.
// create single TraceSource instance to be used for logging
static TraceSource ts = new TraceSource("TraceTest");
// somewhere in the code
ts.TraceEvent(TraceEventType.Warning, 2, "File Test not found");
Now my question. You have a large project with several assemblies containing lots of classes. Say you want to trace a specific bit of functionality that is spread across classes. The obvious idea is that you need to create one specific TraceSource.
1) To work with TraceSource I need to create an instance first. What does MS think about sharing this instance across various classes or assemblies? Should I create one dummy class with a static singleton property? What do you do in that case?
2) Why do I need a TraceSource instance? Every property is described in the configuration file. The old logic based on the Trace class did not require an instance and only provided static methods to work with.
1. Just define the TraceSource in each class where you want to use it. You can make the TraceSource static so that it is shared among all instances of the class you define it in. There is no need to share the instance among all classes (types) that need the "same" TraceSource. Each time you declare a new TraceSource instance (TraceSource ts = new TraceSource("somename");) you get a new TraceSource object, but it references the same config information. So, if you create a new TraceSource in each of several classes and you use the same name for each one, you will get different instances of TraceSource, but they will all be configured the same.

In short, there is no need to try to share TraceSource instances among classes, and there is also no need to create a dummy class with a static singleton. See my examples below. I have also included several more links from here on SO that describe how to work with TraceSources.
//
// In this example, tracing in classes A and B is controlled by the "TraceTest" TraceSource
// in the app.config file. Tracing in class C is controlled by the "TraceTestTwo"
// TraceSource in the app.config.
//
// In addition to using different TraceSource names, you can also use SourceSwitches
// (in the app.config). See some examples of app.config in the
// "turning-tracing-off-via-app-config" link below.
//
public class A
{
    private static readonly TraceSource ts = new TraceSource("TraceTest");

    public void DoSomething()
    {
        ts.TraceEvent(TraceEventType.Warning, 2, "File Test not found");
    }
}

public class B
{
    //
    // Use the same config info for TraceTest in this class.
    // It's ok to use a different instance of TraceSource, but with the same name,
    // in this class; the new instance will be configured based on the params in the
    // app.config file.
    //
    private static readonly TraceSource ts = new TraceSource("TraceTest");

    public void DoSomething()
    {
        ts.TraceEvent(TraceEventType.Warning, 2, "File Test not found");
    }
}

public class C
{
    //
    // Use a different TraceSource in this class.
    //
    private static readonly TraceSource ts = new TraceSource("TraceTestTwo");

    public void DoSomething()
    {
        ts.TraceEvent(TraceEventType.Warning, 2, "File Test not found");
    }
}
2. One benefit of using multiple TraceSources is that you have more granular control over your tracing. You can trace via "TraceTest" at one level (or not at all) and via "TraceTestTwo" at a different level (or, again, not at all). You can send each TraceSource to its own TraceListener, send all to the same TraceListener, or mix and match. Compare the ability to tailor the configuration of individual TraceSources to the limitation of only using the static methods on the Trace class: you can configure where the "trace" information goes (which TraceListener(s)) and the level of the "trace" information, but you cannot control the level per class or per functional area like you can when using TraceSources.

Finally, one more benefit of multiple TraceSources is the "free" context information you get in your output. By default (or optionally, I can't remember), the TraceListener will log the name of the TraceSource that logged a message. So you can look at that line in your output and get some idea of the class or functional area it came from without having to put a lot of contextual information at the call site. In the code examples above, the trace output from classes A and B will be tagged with "TraceTest" and the trace output from class C will be tagged with "TraceTestTwo".
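For illustration, an app.config along these lines (listener names, file names, and levels are made up) would let the two sources from the example above log at different levels to different files:

<!-- Sketch only: two independently configured sources and listeners. -->
<configuration>
  <system.diagnostics>
    <sources>
      <source name="TraceTest" switchValue="Warning">
        <listeners>
          <add name="traceTestLog"
               type="System.Diagnostics.TextWriterTraceListener"
               initializeData="traceTest.log" />
        </listeners>
      </source>
      <source name="TraceTestTwo" switchValue="Verbose">
        <listeners>
          <add name="traceTestTwoLog"
               type="System.Diagnostics.TextWriterTraceListener"
               initializeData="traceTestTwo.log" />
        </listeners>
      </source>
    </sources>
    <trace autoflush="true" />
  </system.diagnostics>
</configuration>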
Please forgive the link bombardment below, but I have posted some pretty good information (if I do say so myself!) about TraceSource and System.Diagnostics in the past.
If you are going to use TraceSource, consider using the library mentioned in this SO post for formatting your output like log4net/NLog:
Does the .Net TraceSource/TraceListener framework have something similar to log4net's Formatters?
See my answer in this post for more info on using TraceSource and some ideas on how you can improve your "TraceSource experience".
More info on TraceSource: Add Trace methods to System.Diagnostics.TraceListener
More info on TraceSource: System.Diagnostics.Debug namespace vs Other logging solutions (log4net, MS Enterprise Library, etc.)
More info on TraceSource: Turning tracing off via app.config
