AutoMapper stops working after moving models to another project - c#

I have an api that uses code first entity framework, and uses AutoMapper to map the entities to models before surfacing the data to the consumer. The mapping profiles exist in the same file as the respective model classes.
I'm trying to restructure the projects a bit and pull the models out into a separate project, keeping the mapping profiles where they are in the existing project (and, of course, updating the references to point to the new project once the models are removed locally). Upon doing so, AutoMapper stops working.
Before the restructure
The structure of the solution before the restructure is as follows:
Api project (Houses the controllers and api endpoints)
Startup.cs
public void ConfigureServices(IServiceCollection services) {
...
services.AddAutoMapper(typeof(LocaleModel).GetTypeInfo().Assembly);
...
}
Application project (Houses the MediatR handlers, models, mapping profiles, etc.)
Models/LocaleModel.cs
public class LocaleModel {
public long Id { get; set; }
public string Code { get; set; }
public string Name { get; set; }
}
public class LocaleModelMapping : Profile {
public LocaleModelMapping() {
CreateMap<Locale, LocaleModel>();
CreateMap<LocaleModel, Locale>();
}
}
Queries/Locales/Get/GetLocalesRequestHandler.cs
public async Task<IEnumerable<LocaleModel>> Handle(GetLocalesRequest request, CancellationToken cancellationToken) {
var locales = await DbContext.Locales
.AsNoTracking()
.ToListAsync(cancellationToken);
return Mapper.Map<List<LocaleModel>>(locales);
}
After the restructure
The structure of the solution after the restructure:
Api project (Houses the controllers and api endpoints)
Startup.cs
public void ConfigureServices(IServiceCollection services) {
...
services.AddAutoMapper(typeof(LocaleModelMapping).GetTypeInfo().Assembly);
...
}
Application project (Houses the MediatR handlers, models, mapping profiles, etc.)
Mappings/LocaleModelMapping.cs
public class LocaleModelMapping : Profile {
public LocaleModelMapping() {
CreateMap<Locale, LocaleModel>();
CreateMap<LocaleModel, Locale>();
}
}
Models project (Houses the models only)
LocaleModel.cs
public class LocaleModel {
public long Id { get; set; }
public string Code { get; set; }
public string Name { get; set; }
}
The references were updated as necessary so that the Application project is aware of the Models project.
The moment I remove the models from the Application project and refer to the Models project, AutoMapper stops working, even though I updated Startup.cs to use the assembly of a mapping profile from the Application project.
The error that is generated is as follows (full namespace censored):
Mapping types:
List`1 -> List`1
List[...Domain.Locale, ...Domain, Version=2.0.0.0]
-> List[...Application.Models.LocaleModel, ...Application.Models, Version=1.0.0.0]
---> AutoMapper.AutoMapperMappingException: Missing type map configuration or unsupported mapping.
Things I've tried:
Putting all the mapping profiles in a single class.
Several different ways of passing in the assembly, types, etc. into services.AddAutoMapper().
Registering the mapping profiles manually in Startup.cs.
Mirrored another project with the same intended structure, where it works fine.
Any help is appreciated!

Moving the model and mapping should not have required any change to the services.AddAutoMapper call beyond where the project resolves the assembly reference. You should have been able to leave Startup.cs as services.AddAutoMapper(typeof(LocaleModel).GetTypeInfo().Assembly); the only difference would be the namespace from which LocaleModel resolves.
The first thing I would check is whether you accidentally selected an IntelliSense option to create a new LocaleModelMapping class when you updated Startup.cs. Instead of appending a using directive for the new namespace in your Application project, it may have created a dummy LocaleModelMapping class somewhere in your API project, leaving AutoMapper trying to resolve mappings from the API project instead of the Application project. You can verify this by right-clicking on LocaleModelMapping inside typeof(LocaleModelMapping)... and selecting "Go to Definition". Does this navigate you into your Application project, to an empty class in your API project, or somewhere else (i.e. a disassembly of a stale assembly reference)?

Found the solution after a bit more searching.
In our Autofac dependency-injection module, which is called from ConfigureContainer(...) within Startup.cs, there was some code registering the AutoMapper profiles using one of the models. The models used to live alongside the mappings before I pulled them out into a separate project, but they no longer do.
// AutoMapper Profiles
var profiles = typeof(LocaleModelMapping).Assembly.GetTypes()
.Where(t => typeof(Profile).IsAssignableFrom(t))
.Select(t => (Profile)Activator.CreateInstance(t));
builder.Register(ctx => new MapperConfiguration(cfg =>
{
foreach (var profile in profiles) cfg.AddProfile(profile);
}));
builder.Register(ctx => ctx.Resolve<MapperConfiguration>().CreateMapper())
.As<IMapper>()
.InstancePerLifetimeScope();
Simply updating the reference to use a mapping class instead of a model class fixed the issue.
Alternatively, removing the code altogether also worked. The Startup.cs code is already registering the AutoMapper profiles via services.AddAutoMapper(...), so this code in the Autofac module is not necessary.
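For reference, the AddAutoMapper extension accepts multiple marker types or assemblies, so if profiles or models ever end up spread across projects, a single registration can scan all of them. A minimal sketch using the type names from the question:

```csharp
using AutoMapper;
using Microsoft.Extensions.DependencyInjection;

public partial class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        // Scan both the Application assembly (mapping profiles) and the
        // Models assembly, so profiles are found regardless of which
        // project they live in.
        services.AddAutoMapper(
            typeof(LocaleModelMapping).Assembly,  // Application project
            typeof(LocaleModel).Assembly);        // Models project
    }
}
```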

Related

AutoMapper 9.0.0 (non static) gives Missing type map configuration or unsupported mapping error

I have an ASP.NET Web API that was using AutoMapper 7.0.1 with static mappings. I recently upgraded to AutoMapper 9.0.0, which does not have static mappings. So I used the recommended way of using my dependency container (Unity) to register the instances of IMapper and IConfigurationProvider as singletons.
var config = AutoMapperConfig.GetMapperConfiguration();
_container.RegisterInstance<IConfigurationProvider>(config, new SingletonLifetimeManager());
_container.RegisterInstance<IMapper>(new Mapper(config), new SingletonLifetimeManager());
AutoMapperConfig.GetMapperConfiguration() is a static method that returns a new MapperConfiguration with all the mappings.
public static MapperConfiguration GetMapperConfiguration()
{
return new MapperConfiguration(config =>
{
config.CreateMap<MyDtoReq, MyModel1>(MemberList.Destination);
config.CreateMap<MyModel1, MyDtoRes>(MemberList.Destination);
// other mappings
});
}
Thereafter, I have resolved and used IMapper in numerous services which are registered with PerRequestLifetimeManager, like:
_container.RegisterType<IService1, Service1>(new PerRequestLifetimeManager());
I can see that Unity resolved both the services and the mapper properly, but when I call Map() using:
_service1.Mapper.Map<MyDtoRes>(myModel1ObjectInstance);
It gives me an AutoMapperException saying
Missing type map configuration or unsupported mapping error
I have tried many things, including registering the AutoMapper objects as PerRequest dependencies, and even as singletons using a static class (without a DI container), but to no avail.
I am sure that my mappings are correct because they worked with Static AutoMapper in v 7.0.1. What have I missed out after the upgrade?
It turns out there were two problems.
I needed to use a Profile containing all the mappings. So I moved all the mappings from the MapperConfiguration to a Profile as follows
public class AutoMapperProfile : Profile
{
public AutoMapperProfile()
{
CreateMap<DtoReq, Model>();
// other mappings
}
}
And then used MapperConfigurationExpression as follows:
var mce = new MapperConfigurationExpression();
mce.ConstructServicesUsing(type => MyUnityContainer.Resolve(type));
mce.AddProfiles(new List<Profile>() { new AutoMapperProfile() });
var mc = new MapperConfiguration(mce);
mc.CompileMappings(); // prevents lazy compilation
mc.AssertConfigurationIsValid(); // ensures all mappings are in place
There were some mappings that were missing (only two) which worked in the older version, probably because automatic mapping was enabled in it by default. The newer version 9.0.0 showed the exact missing mappings only after I moved them to the Profile.
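A minimal sketch of why AssertConfigurationIsValid helps here: it surfaces missing maps at startup instead of at the first Map call. Source and Dest are hypothetical stand-in types, not from the question:

```csharp
using AutoMapper;

public class Source { public int Id { get; set; } }
public class Dest   { public int Id { get; set; } }

public static class Demo
{
    public static void Main()
    {
        var config = new MapperConfiguration(cfg =>
        {
            cfg.CreateMap<Source, Dest>(); // Dest -> Source deliberately not mapped
        });

        // Validates all declared maps up front; any invalid map throws here,
        // at startup, with the offending members listed.
        config.AssertConfigurationIsValid();

        var mapper = config.CreateMapper();
        var dest = mapper.Map<Dest>(new Source { Id = 1 }); // works

        // mapper.Map<Source>(new Dest()); // would throw AutoMapperMappingException:
        // "Missing type map configuration or unsupported mapping."
    }
}
```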

How to load controllers from different versions of the same assembly

I am trying to solve the following problem and I am not exactly sure how to do it:
I am building a web server that has different APIs/controllers that are loaded from .dll files on startup. It will run in a Linux Docker container and is implemented as an ASP.NET web application on .NET Core 2.1.
Loading assemblies that contain controllers works fine by doing something like this:
public void ConfigureServices(IServiceCollection services)
{
services.AddMvc().SetCompatibilityVersion(CompatibilityVersion.Version_2_1);
services.AddMvc().AddApplicationPart(AssemblyLoadContext.Default.LoadFromAssemblyPath("/PATH/APIPlugin.dll"));
}
This application must have versioned REST APIs. That means I need to load the same assembly multiple times in different versions; then I need some kind of routing between the versions.
For example:
/api/data/latest/
/api/data/v1/
I cannot use AssemblyLoadContext.Default.LoadFromAssemblyPath to load multiple versions of the same assembly. I also tried to grab the controller from the assembly and create an instance of it like this:
var controllerAssembly = AssemblyLoadContext.Default.LoadFromAssemblyPath("/PATH/APIPlugin.dll");
var pluginType = controllerAssembly.ExportedTypes.First();
var pluginInstance = (ControllerBase)Activator.CreateInstance(pluginType);
services.Add(new ServiceDescriptor(pluginType, pluginInstance));
This throws no exception but ultimately does not work. I am pretty new to ASP.NET, so this might very well be nonsense, and I would still have to find a solution to route between the different versions even if it worked like this.
My Question:
How would one approach this requirement?
Is it a good idea/possible to load multiple controllers from the "same" assembly? If yes, how would one achieve this?
Or would it be a better solution to have one controller that does all the routing and loads some self-defined implementation from the assemblies, so that the controller would route between the versions and API methods?
I was able to find a solution while tinkering around:
public class ControllerPluginProvider : IApplicationFeatureProvider<ControllerFeature>
{
public void PopulateFeature(IEnumerable<ApplicationPart> parts, ControllerFeature feature)
{
var basePath = AppContext.BaseDirectory;
var pluginPath = Path.Combine(basePath, "plugins");
foreach (var file in Directory.GetFiles(pluginPath, "*.dll")){
var assembly = Assembly.LoadFile(file);
var controllers = assembly.GetExportedTypes().Where(t => typeof(ControllerBase).IsAssignableFrom(t));
foreach (var candidate in controllers)
{
feature.Controllers.Add(candidate.GetTypeInfo());
}
}
}
}
In Startup:
public void ConfigureServices(IServiceCollection services)
{
...
services.AddMvc().ConfigureApplicationPartManager(m =>
m.FeatureProviders.Add(new ControllerPluginProvider()));
}
This led to the following error when the same assembly, and therefore a controller with the same name, was loaded: Attribute routes with the same name 'Get' must have the same template
I was able to fix it, and also add versioning with the versioning library:
https://github.com/microsoft/aspnet-api-versioning
[ApiVersion("2.0")]
[ApiController]
[Route("api/v{version:apiVersion}/MyController")]
public class MyControllerController : ControllerBase
{
}
Now the only step that is missing, is routing the /latest/ path to the most recent controller of the given type. I have not found a solution to this yet, but this should be doable.
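For completeness, the versioning library also needs to be registered in ConfigureServices. A sketch of that registration; the particular option values here are assumptions for this setup, not from the original post:

```csharp
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.DependencyInjection;

public partial class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddMvc();

        services.AddApiVersioning(options =>
        {
            // Requests without an explicit version fall back to a default,
            // which can serve as a crude stand-in for a /latest/ route.
            options.DefaultApiVersion = new ApiVersion(2, 0);
            options.AssumeDefaultVersionWhenUnspecified = true;

            // Adds api-supported-versions / api-deprecated-versions headers.
            options.ReportApiVersions = true;
        });
    }
}
```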

Why is an ASP.NET-Core app 'Configuration/AppSettings' POCO passed around as IOptions<T> instead of just T? [duplicate]

It seems to me that it's a bad idea to have a domain service require an instance of IOptions<T> to pass it configuration. Now I've got to pull additional (unnecessary?) dependencies into the library. I've seen lots of examples of injecting IOptions all over the web, but I fail to see the added benefit of it.
Why not just inject that actual POCO into the service?
services.AddTransient<IConnectionResolver>(x =>
{
var appSettings = x.GetService<IOptions<AppSettings>>();
return new ConnectionResolver(appSettings.Value);
});
Or even use this mechanism:
AppSettings appSettings = new AppSettings();
Configuration.GetSection("AppSettings").Bind(appSettings);
services.AddTransient<IConnectionResolver>(x =>
{
return new ConnectionResolver(appSettings.SomeValue);
});
Usage of the settings:
public class MyConnectionResolver
{
// Why this?
public MyConnectionResolver(IOptions<AppSettings> appSettings)
{
...
}
// Why not this?
public MyConnectionResolver(AppSettings appSettings)
{
...
}
// Or this
public MyConnectionResolver(IAppSettings appSettings)
{
...
}
}
Why the additional dependencies? What does IOptions buy me instead of the old school way of injecting stuff?
Technically nothing prevents you from registering your POCO classes with ASP.NET Core's dependency injection, or creating a wrapper class and returning IOptions<T>.Value from it.
But you will lose the advanced features of the Options package, namely having the values updated automatically when the source changes, as you can see in the source here.
As you can see in that code example, if you register your options via services.Configure<AppSettings>(Configuration.GetSection("AppSettings"));, it will read and bind the settings from appsettings.json into the model and additionally track it for changes. When appsettings.json is edited, it will rebind the model with the new values, as seen here.
Of course you need to decide for yourself whether you want to leak a bit of infrastructure into your domain or pass on the extra features offered by the Microsoft.Extensions.Options package. It's a pretty small package which is not tied to ASP.NET Core, so it can be used independently of it.
The Microsoft.Extensions.Options package is small enough that it contains only abstractions; the concrete services.Configure overload for IConfiguration (which is tied more closely to how the configuration is obtained: command line, JSON, environment, Azure Key Vault, etc.) lives in a separate package.
So, all in all, its dependencies on "infrastructure" are pretty limited.
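The automatic rebinding mentioned above is surfaced to consumers through IOptionsMonitor<T>. A small sketch; SettingsConsumer is a hypothetical service, and AppSettings/SomeValue are the names used in the question:

```csharp
using System;
using Microsoft.Extensions.Options;

public class AppSettings
{
    public string SomeValue { get; set; }
}

public class SettingsConsumer
{
    private readonly IOptionsMonitor<AppSettings> _settings;

    public SettingsConsumer(IOptionsMonitor<AppSettings> settings)
    {
        _settings = settings;

        // Fires whenever the underlying configuration source is reloaded,
        // e.g. when appsettings.json is edited on disk.
        _settings.OnChange(updated =>
            Console.WriteLine($"Settings changed: {updated.SomeValue}"));
    }

    // Always reflects the latest bound values, unlike IOptions<T>.Value,
    // which is fixed for the lifetime of the consumer.
    public string Current => _settings.CurrentValue.SomeValue;
}
```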
To avoid polluting constructors with IOptions<>: with these two simple lines inside ConfigureServices in Startup.cs, you can inject the options value directly:
public void ConfigureServices(IServiceCollection services)
{
//...
services.Configure<AppSettings>(Configuration.GetSection("AppSettings"));
services.AddScoped(cfg => cfg.GetService<IOptions<AppSettings>>().Value);
}
And then use with:
public MyService(AppSettings appSettings)
{
...
}
credit
While using IOption is the official way of doing things, I just can't seem to move past the fact that our external libraries shouldn't need to know anything about the DI container or the way it is implemented. IOption seems to violate this concept since we are now telling our class library something about the way the DI container will be injecting settings - we should just be injecting a POCO or interface defined by that class.
This annoyed me badly enough that I've written a utility to inject a POCO into my class library populated with values from an appSettings.json section. Add the following class to your application project:
public static class ConfigurationHelper
{
public static T GetObjectFromConfigSection<T>(
this IConfigurationRoot configurationRoot,
string configSection) where T : new()
{
var result = new T();
foreach (var propInfo in typeof(T).GetProperties())
{
if (propInfo.CanWrite)
{
var value = Convert.ChangeType(configurationRoot.GetValue<string>($"{configSection}:{propInfo.Name}"), propInfo.PropertyType);
propInfo.SetValue(result, value, null);
}
}
return result;
}
}
There are probably some enhancements that could be made, but it worked well when I tested it with simple string and integer values. Here's an example of where I used it in the application project's Startup.cs -> ConfigureServices method, for a settings class named DataStoreConfiguration and an appSettings.json section with the same name:
services.AddSingleton<DataStoreConfiguration>((_) =>
Configuration.GetObjectFromConfigSection<DataStoreConfiguration>("DataStoreConfiguration"));
The appSettings.json config looked something like the following:
{
"DataStoreConfiguration": {
"ConnectionString": "Server=Server-goes-here;Database=My-database-name;Trusted_Connection=True;MultipleActiveResultSets=true",
"MeaningOfLifeInt" : "42"
},
"AnotherSection" : {
"Prop1" : "etc."
}
}
The DataStoreConfiguration class was defined in my library project and looked like the following:
namespace MyLibrary.DataAccessors
{
public class DataStoreConfiguration
{
public string ConnectionString { get; set; }
public int MeaningOfLifeInt { get; set; }
}
}
With this application and libraries configuration, I was able to inject a concrete instance of DataStoreConfiguration directly into my library using constructor injection without the IOption wrapper:
using System.Data.SqlClient;
namespace MyLibrary.DataAccessors
{
public class DatabaseConnectionFactory : IDatabaseConnectionFactory
{
private readonly DataStoreConfiguration dataStoreConfiguration;
public DatabaseConnectionFactory(
DataStoreConfiguration dataStoreConfiguration)
{
// Here we inject a concrete instance of DataStoreConfiguration
// without the `IOption` wrapper.
this.dataStoreConfiguration = dataStoreConfiguration;
}
public SqlConnection NewConnection()
{
return new SqlConnection(dataStoreConfiguration.ConnectionString);
}
}
}
Decoupling is an important consideration for DI, so I'm not sure why Microsoft have funnelled users into coupling their class libraries to an external dependency like IOptions, no matter how trivial it seems or what benefits it supposedly provides. I would also suggest that some of the benefits of IOptions seem like over-engineering. For example, it allows me to dynamically change configuration and have the changes tracked - I've used three other DI containers which included this feature and I've never used it once... Meanwhile, I can virtually guarantee you that teams will want to inject POCO classes or interfaces into libraries for their settings to replace ConfigurationManager, and seasoned developers will not be happy about an extraneous wrapper interface. I hope a utility similar to what I have described here is included in future versions of ASP.NET Core OR that someone provides me with a convincing argument for why I'm wrong.
I can't stand the IOptions recommendation either. It's a crappy design to force this on developers. IOptions should be clearly documented as optional, oh the irony.
This is what I do for my configuration values:
var mySettings = new MySettings();
Configuration.GetSection("Key").Bind(mySettings);
services.AddTransient(p => new MyService(mySettings));
You retain strong typing and don't need to use IOptions in your services/libraries.
You can do something like this:
services.AddTransient(
o => ConfigurationBinder.Get<AppSettings>(Configuration.GetSection("AppSettings")
);
Using .NET Core 2.2, this worked for me.
Or else, use IOptions<T>.Value.
It would look something like this
services.Configure<AppSettings>(Configuration.GetSection("AppSettings"));
I would recommend avoiding it wherever possible. I used to really like IOptions back when I was working primarily with core but as soon as you're in a hybrid framework scenario it's enough to drive you spare.
I found a similar issue with ILogger: code that should work across frameworks won't, because I just can't get it to bind properly; the code is too dependent on the DI framework.

Xamarin Forms and EntityFramework Attributes compatibility

I have a client/server solution using C#, WPF, ASP.NET Web API and Entity Framework. The client and server classes share the model across their projects. Now I am trying to create a new client using Xamarin Forms, sharing the model too, but Entity Framework attributes (MaxLength, Index, NotMapped, etc.) are not compatible in a PCL. So these are the things that I've tried:
Import Microsoft.EntityFrameworkCore to the PCL Model
As described here, you should be able to use Entity Framework with Xamarin Forms, so I converted the PCL to .NET Standard 1.3, and it works: every Entity Framework attribute is allowed. But now the server project is not compatible with that standard, and I cannot add packages like Prism and Newtonsoft.Json to the model project.
Mock the attributes for Xamarin forms using the bait and switch trick
I've tried the approach described here, based on creating custom attributes in the model PCL and redefining them in the class libraries. MyClient.Droid and MyClient.UWP redefine the attributes, leaving them empty, and MyServer redefines them with the Entity Framework functionality.
Custom IndexAttribute - Model PCL:
namespace Model.Compatibility
{
public class IndexAttribute : Attribute
{
public IndexAttribute()
{
}
}
}
Custom IndexAttribute - Server side:
[assembly: TypeForwardedToAttribute(typeof(Model.Compatibility.IndexAttribute))]
namespace Model.Compatibility
{
public class MockedIndexAttribute : System.ComponentModel.DataAnnotations.Schema.IndexAttribute
{
public MockedIndexAttribute()
{
}
}
}
I tested this approach by calling var attribute = new Model.Compatibility.IndexAttribute();. The MockedIndexAttribute constructor is never called.
Create a Shared Project Instead of PCL
This way is a little messier, but it looks like it works. Just create a new shared project for the model and use conditional compilation flags like this:
#if !__MOBILE__
[NotMapped, Index]
#endif
public Guid Id { get; set; }
I have not fully deployed this approach yet, but if I cannot make either of the first two ways work, I will go with this.
EDIT - Trying to make the "Bait and Switch Attributes" approach work
As @AdamPedley suggested, and this thread too, I've redefined IndexAttribute in a new PCL (Xamarin.Compatibility), using the same namespace as the original one:
namespace System.ComponentModel.DataAnnotations.Schema
{
[AttributeUsage(AttributeTargets.Property, AllowMultiple = true)]
public class IndexAttribute : Attribute
{
public IndexAttribute() { }
}
}
Now, my PCL Model includes a reference to Xamarin.Compatibility, so I can use Index attribute in my model properties:
[Index]
public Guid Id { get; set; }
Then, from my server project, I call the following lines of code to check which constructor is called, the custom attribute's or the one defined by Entity Framework:
PropertyInfo prop = typeof(MyClass).GetProperty("Id");
object[] attributes = prop.GetCustomAttributes(true);
The constructor called is the custom one, so it does not work, because it needs to call the attribute defined by Entity Framework. That is the part I don't understand: what mechanism makes my model's PCL select the custom attribute or the EF attribute depending on the calling assembly?
I've also added a file to my server project, called TypeForwarding.Net.cs (as suggested here), that contains:
[assembly: TypeForwardedTo(typeof(IndexAttribute))]
But it still does not work.
I believe the EF fluent API is PCL and .NET Standard friendly. Thus you can create POCO objects and let the fluent API do the cross-platform mappings instead of using attributes. msdn.microsoft.com/en-us/library/jj591617(v=vs.113).aspx
Note: I did this with a project using EF6 and PCL projects to share across MVC / WPF / Mobile
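As a sketch of that comment's suggestion, the attribute-based mappings from the question could live in OnModelCreating on the server side instead, leaving the shared POCO attribute-free. EF6 fluent syntax is assumed here; the property names Name and ClientOnlyValue are hypothetical stand-ins:

```csharp
using System.Data.Entity;

public class MyContext : DbContext
{
    public DbSet<MyClass> MyClasses { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // Equivalent of [Index] on Id, kept out of the shared model assembly.
        // HasIndex is available from EF 6.2; earlier EF6 versions use
        // HasColumnAnnotation with an IndexAnnotation instead.
        modelBuilder.Entity<MyClass>()
            .HasIndex(c => c.Id);

        // Equivalent of [MaxLength(50)]:
        modelBuilder.Entity<MyClass>()
            .Property(c => c.Name)
            .HasMaxLength(50);

        // Equivalent of [NotMapped]:
        modelBuilder.Entity<MyClass>()
            .Ignore(c => c.ClientOnlyValue);
    }
}
```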

Where dependency-injection registrations have to be put?

I've read the question Ioc/DI - Why do I have to reference all layers/assemblies in application's entry point?
So, in a Asp.Net MVC5 solution, the composition root is in the MVC5 project (and having a DependencyInjection assembly in charge of all the registrations does not make sense).
Within this picture, it is not clear to me what is the better approach among the following.
Approach 1
The concrete implementations are public class ... and all registrations clauses are centralized within the composition root (e.g. in one or more files under a CompositionRoot folder). MVC5 project must reference all the assemblies providing at least one concrete implementation to be bound. No library references the DI library. MVC project can contain interfaces to be bound with no drawbacks.
Approach 2
The concrete implementations are internal class .... Each library exposes a DI 'local' configuration handler. For example
public class DependencyInjectionConfig {
public static void Configure(Container container) {
//here registration of assembly-provided implementations
//...
}
}
which is up to register its own implementations. The composition root triggers registrations by calling all the Configure() methods, just one for each project. MVC5 project must then reference all the assemblies providing at least one concrete implementation to be bound. Libraries must reference the DI library. In this case, the MVC5 project cannot contain interfaces (otherwise there would be a circular reference): a ServiceLayer assembly would be needed to hold public interfaces to be bound.
Approach 3
Same as Approach 2, but local configuration modules are discovered dynamically through assembly reflection (by convention?). So MVC5 project has not to reference libraries. MVC project can contain interfaces and can be referenced by libraries. Libraries must reference the DI library.
What is the best practice here? Is there some other better possibility?
EDIT 1 (2016-12-22)
Thanks to received answers, I published this github project describing the best solution I found so far.
EDIT 2 (2018-09-09)
This answer provides an interesting option.
EDIT 3 (2020-12-29)
Finally, I came up with a complete solution, packaged in the form of a WebApi application template. I published this solution on GitHub HERE. This approach, not only gives a clear understanding about where DI rules have to be put, but also suggests to setup the application according to SOLID principles and CQRS pattern. The commit history of this project has been structured to have educational purposes.
EDIT 4 (2023-01-31)
The repository linked above publishes an article describing the solution as well.
I typically like to encapsulate these types of things in each project. So, for example, I might have the following. (This is an extremely simplified example; I'll use Autofac here, but I'd imagine all DI frameworks have something similar.)
Common area for just POCOs and Interfaces.
// MyProject.Data.csproj
namespace MyProject.Data
{
public interface IPersonRepository
{
Person Get();
}
public class Person
{
}
}
Implementation of Repositories and Data Access
// MyProject.Data.EF.csproj
// This project uses EF to implement that data
namespace MyProject.Data.EF
{
// internal, because I don't want anyone to actually create this class
internal class PersonRepository : IPersonRepository
{
public Person Get()
{ /* implementation */ }
}
public class Registration : Autofac.Module
{
protected override void Load(ContainerBuilder builder)
{
builder.RegisterType<PersonRepository>()
.As<IPersonRepository>()
.InstancePerLifetimeScope();
}
}
}
Consumer
// MyPrject.Web.UI.csproj
// This project requires an IPersonRepository
namespace MyProject.Web.UI
{
// Asp.Net MVC Example
internal class IoCConfig
{
public static void Start()
{
var builder = new ContainerBuilder();
var assemblies = BuildManager.GetReferencedAssemblies()
.Cast<Assembly>();
builder.RegisterAssemblyModules(assemblies);
// Build the container and hand it to MVC (Autofac.Integration.Mvc)
var container = builder.Build();
DependencyResolver.SetResolver(new AutofacDependencyResolver(container));
}
}
}
So the Dependencies look like:
MyProject.Data.csproj
- None
MyProject.Data.EF.csproj
- MyProject.Data
MyProject.Web.UI.csproj
- MyProject.Data
- MyProject.Data.EF
In this setup, the Web.UI cannot know anything about what is registered nor for what reason. It only knows that the EF project has implementations but can't access them.
I can drop EF for, say, Dapper extremely easily, as each project encapsulates its own implementations and registration.
If I was adding unit tests and had an InMemoryPersonRepository, how would I swap out the PersonRepository for my InMemoryPersonRepository?
Assuming we ignore any business logic layer and have an MVC Controller directly access our Data Accessor, my code might look like:
public class MyController
{
private readonly IPersonRepository _repo;
public MyController(IPersonRepository repo)
{
_repo = repo;
}
public IActionResult Index()
{
var person = _repo.Get();
var model = Map<PersonVM>(person);
return View(model);
}
}
Then a test using nSubstitute Might look like:
public class MyControllerTests
{
public void Index_Executed_ReturnsObjectWithSameId()
{
// Assign
var repo = Substitute.For<IPersonRepository>();
var expectedId = 1;
repo.Get().Returns(new Person { Id = expectedId });
var controller = new MyController(repo);
// Act
var result = controller.Index() as ActionResult<PersonVM>;
// Assert
Assert.That(result.Value.Id, Is.EqualTo(expectedId));
}
}
You've identified a real problem. (One could say it's a good problem to have.) If entry application A references B, B references C, and B and/or C require some DI registration, that makes A (your entry application) responsible for knowing enough about the details of B and C to register all the dependencies.
The solution is to have a separate assembly that handles composing all of the registrations for B and C. A references that, and it provides all of the container configuration that A needs to use B and C.
The benefits are
A doesn't know more about B and C than it should
Neither A, B, nor C have to be tied to one particular DI framework like Unity or Windsor.
Here's an example. This is an event bus class that works best with a DI container. But in order to use it you shouldn't have to know all about the dependencies it needs to register. So for Windsor I created a DomainEventFacility. You just call
_container.AddFacility<DomainEventFacility>();
and all of the dependencies are registered. The only things you register yourself are your event handlers.
Then if I want to use the same event bus library with a different DI container like Unity I can just create some similar assembly to handle the same configuration for Unity.
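As a sketch of what that Unity counterpart might look like, an extension method can play the role of the Windsor facility. All type names here (IEventBus, EventBus, IEventHandlerResolver, UnityEventHandlerResolver) are hypothetical placeholders for whatever the event bus library actually needs registered:

```csharp
using Unity;
using Unity.Lifetime;

public static class DomainEventUnityExtensions
{
    // Registers everything the event bus needs, so consumers only
    // have to register their own event handlers afterwards,
    // mirroring _container.AddFacility<DomainEventFacility>() in Windsor.
    public static IUnityContainer AddDomainEvents(this IUnityContainer container)
    {
        container.RegisterType<IEventBus, EventBus>(
            new ContainerControlledLifetimeManager());
        container.RegisterType<IEventHandlerResolver, UnityEventHandlerResolver>(
            new ContainerControlledLifetimeManager());
        return container;
    }
}

// Usage in the composition root:
// _container.AddDomainEvents();
```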
