I have a project that requires certain functions, and I have created an interface for these functions. If the implementations of the interface are to be provided by external parties, what is the correct way to go about this?
The idea I have is to create a class library project (MyInterface), define my interface (IModule) in this project, compile this and give the DLL (MyInterface.dll) to the external parties. The external parties would develop/implement using the DLL as a reference, then give me their final DLL. My main project would reference the MyInterface project, as well as all the implementations. In my code, I would do something like this:
IModule module = new ImplementationA();
module.DoSomething();
module = new ImplementationB(); // Change implementation at runtime
Is this approach correct? Are there better alternatives?
Side question: Is this the strategy design pattern, or the facade?
I would probably use configuration to load the 3rd party assembly at runtime, look for a single factory type, create a singleton instance, and create each implementation class through the factory. This way you can give all of your assemblies to a 3rd party, and they can build and update their component at any time. They depend on your code, but you don't depend on theirs.
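A rough sketch of that approach (IModuleFactory and the "ModuleAssemblyPath" setting are illustrative names, not anything defined in the question; IModule is the question's interface):

using System;
using System.Configuration;
using System.Linq;
using System.Reflection;

// Shipped alongside IModule: the 3rd party must expose exactly one concrete
// implementation of this factory in their assembly.
public interface IModuleFactory
{
    IModule Create(string moduleName);
}

public static class ModuleLoader
{
    // Reads the 3rd-party assembly path from config, loads it, and finds the
    // single factory type it contains.
    public static IModuleFactory LoadFactory()
    {
        string path = ConfigurationManager.AppSettings["ModuleAssemblyPath"];
        Assembly thirdParty = Assembly.LoadFrom(path);

        Type factoryType = thirdParty.GetTypes()
            .Single(t => typeof(IModuleFactory).IsAssignableFrom(t) && !t.IsAbstract);

        return (IModuleFactory)Activator.CreateInstance(factoryType);
    }
}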
I don't know much about your project or situation, but I would consider publishing a NuGet package of your interface and allowing the third parties to install it. That gives you the power of versioning when you need to make changes to your interface(s): you publish the changes as a new version, and they can update their installed package of your DLL accordingly. That at least gives them a concrete way to develop against what you require in a controlled, robust manner.
In case you aren't familiar with NuGet packages: in Visual Studio you can right-click the project that produces the DLL and select Pack. You can also change pack settings by going to the Package tab in the project properties (same context menu on the project, select Properties).
You can also run dotnet pack from the command line, which has a lot of arguments that you can leverage.
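For example (project name, version, and output folder are placeholders):

dotnet pack MyInterface.csproj -c Release -o ./nupkgs /p:PackageVersion=1.2.0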
Whichever way you go about it, you can publish your NuGet package to nuget.org, to some other service with an artifact feed, or simply to a folder on disk.
Helpful NuGet package references:
Installing a NuGet package
Create and publish a NuGet package
As for your question about the pattern: typically, when you switch implementations like that at run time, that is the strategy pattern at work. The facade is when you use a single class to abstract away multiple instances of different sub-system classes. For example, let's say you have operation classes for GetUser, UpdateUser, and DeleteUser. Your facade could be a single class called UserManager, and within that class you would expose a function for each user operation. Inside those functions you would access each individual operation-class instance and pass the call on internally. In this example, inside the facade you know you are working with three classes; from the perspective of code/callers outside of it, they only know about the single UserManager class.
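A bare-bones illustration of that facade (the operation classes, their Execute signatures, and the User type are all made up for the example):

public class User { public int Id; public string Name; }

// Illustrative sub-system classes.
public class GetUserOperation { public User Execute(int id) => new User { Id = id }; }
public class UpdateUserOperation { public void Execute(User user) { /* persist changes */ } }
public class DeleteUserOperation { public void Execute(int id) { /* remove the user */ } }

// The facade: callers only ever see UserManager; the three operation classes
// above stay hidden behind it.
public class UserManager
{
    private readonly GetUserOperation _get = new GetUserOperation();
    private readonly UpdateUserOperation _update = new UpdateUserOperation();
    private readonly DeleteUserOperation _delete = new DeleteUserOperation();

    public User GetUser(int id) => _get.Execute(id);
    public void UpdateUser(User user) => _update.Execute(user);
    public void DeleteUser(int id) => _delete.Execute(id);
}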
Related
I want to create a class library and publish it to artifactory. However, I have organized my solution such that there is a public "Client" project, then an internal "Domain" project that contains business logic that I do not want to publish.
My issue is that apparently I have to publish the Domain project if the Client project depends on it. If I don't, then when I attempt to install the Client dependency in another project, I get an error stating that Domain cannot be found.
Is it possible to accomplish what I'm looking for, i.e., an entire assembly that is only accessible to another assembly in the same solution? The key point is that I want the classes within Domain to not be accessible to other projects. I know I could put them in the Client assembly and mark them internal, but I prefer the organization of having an entirely separate project, as well as not having to mark every single class internal.
I have a question that has been bothering me for a while. I ran across this problem a few years back when I was writing a generic logging wrapper around some hosted provider instances using log4net.
The idea was simple enough: I wanted to write a logging and metrics library that hid all the implementation in a separate Visual Studio project, so that when you wanted to add telemetry support to another application you could just include the project, new up an instance of the logger, and start logging using generic calls. If you ever switched providers or tweaked logging settings, it wouldn't require any changes to the host applications.
This creates a strong decoupling point, where the main application uses an interface in a logging class library but knows nothing about the packages or providers that the logging class library is using to do the real work.
When I did this and tried it out using Loggly's NuGet package and log4net, I found that the calling application had to have a reference to the NuGet package or else the dependent assembly would not be copied to the build directory. At the time I just wrote this off as something odd that the Loggly engineers were doing, but I have since encountered the same behavior in some, but not all, other packages. (DogstatsD doesn't have a problem, Raygun does, etc.)
I have noticed that some NuGet packages referenced by assemblies are automatically copied into the parent output directory, but when I look for the setting that controls this, I cannot find it.
I have written dozens of class libraries over the years, and I have never had a problem with chained dependency assemblies (A refs B, B refs C, etc.) resolving when I build. It only seems to be some NuGet packages that are a problem.
How do I force NuGet packages referenced by a class library project to be copied into the build directory without an explicit reference in the application?
Ok, I figured this one out.
This is a problem for the log4net & Loggly wrapper assembly combo in particular because it is referenced entirely at runtime. log4net loads its required log appenders at runtime, and because .NET doesn't see a reference to the assembly at build time, it assumes the assembly isn't being used and omits copying it to the bin directory. The solution, once you know this, is simple: just write an empty dummy method in the referenced library that you can call from the main application. This will cause .NET to include the assembly in the build.
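In code, the workaround looks something like this (names are made up):

// In the logging wrapper class library: a do-nothing method that gives the
// compiler a concrete reason to keep and copy the assembly.
public static class LoggingAnchor
{
    public static void Touch()
    {
        // Intentionally empty: its only job is to be called once from the
        // host application so the build treats this assembly as used.
    }
}

// Then, somewhere in the main application (e.g. at startup):
//     LoggingAnchor.Touch();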
While this problem is specific to the log4net library, it could occur anywhere you are using an assembly that is only referenced through runtime reflection.
We are looking to implement reusable functionality within more than one of our products.
What I would like to do is:
Create a C# project that contains one or more Azure Functions (static methods with the FunctionNameAttribute attached to them)
Turn this project into a NuGet package
Reference this NuGet package in an Azure Functions project
Have the function(s) from the NuGet package exposed in the project it is being used in
Whilst I find that it all compiles, the functions in the NuGet package are not "found" on startup and are ignored. I can see that this could be desirable for security reasons, but I'm wondering if there is a way to override how the Functions runtime operates, to say "scan this assembly and use any functions contained within it".
Even having that custom code would be preferable to the situation in which we find ourselves: joint functionality in our package is referenced by two different products, but each of the products has to create a duplicate set of functions that then call the common code.
I think that the best approach for this would be git submodules and not NuGet packages.
Create a project that both products will reference: a base class with some methods that you can override, or just reuse in both solutions by inheriting from it.
Following your comment, let's try this workflow:
1 - Write an Azure Functions (AZF) project with Swagger for all the common Product operations.
2 - Write a new project containing a ProductBase class that consumes all the Product REST endpoints.
3 - Convert the consuming project to a NuGet package.
4 - Install it in both Projects A and B, and inherit from the base Product class.
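A rough sketch of what the base class from step 2 could look like (the URL, route, and member names are placeholders):

using System.Net.Http;
using System.Threading.Tasks;

// Consumes the common Product REST endpoints exposed by the AZF project in
// step 1; Products A and B inherit from this and can override as needed.
public abstract class ProductBase
{
    private static readonly HttpClient Http = new HttpClient();

    protected virtual string BaseUrl => "https://common-functions.example.com/api";

    public virtual async Task<string> GetProductAsync(string id)
    {
        return await Http.GetStringAsync($"{BaseUrl}/products/{id}");
    }
}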
I've searched for an answer on Google using:
"The type 'Microsoft.SqlServer.Management.Smo.Server' is defined in an assembly that is not referenced."
Why does using Microsoft Sql Server Management Objects (SMO) in the DAL require references to SMO dlls in a referenced project?
using sql smo in referenced projects
sql smo in layered solutions
sql smo reference requirements
and probably a few others and have not found a solution or explanation to this issue.
Admittedly, I'm only a journeyman googler, so if someone wishes to power level me and point the way to an existing resource, I'll gladly go spelunking from there.
Here's my set up:
I've got a layered solution: DAL, Business Logic, Services, UI. There's a hosts project that hosts the services. I'm actually using the VS2010 extension layerguidance.codeplex.com, which is quite nice, to set up all these projects. I'm using SQL Server 2008 Express and SMO v 10. All solution projects are referenced using Project References. All projects compile to a common top level Bin folder.
Now the problem:
Among the classes in the DAL I have an SmoTasks class, which handles interfacing with SMO objects, and a Utilities class, which abstracts over SmoTasks and provides access to its functions without requiring any SMO objects as parameters, so that referencing projects (read: the business logic layer) can interface using non-SMO types. All is well in the DAL: it compiles fine, the methods pass their tests, and it feels good about its place in my world. Then in the BLL I have a component that uses the Utilities class to perform database configuration for the application, which will be exposed via the services. The BLL uses a project reference to the DAL and sees the DAL classes (à la IntelliSense) as expected. When I compile, though, I get:
The type 'Microsoft.SqlServer.Management.Smo.Server' is defined in an assembly that is not referenced. You must add a reference to assembly 'Microsoft.SqlServer.Smo, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91'.
The code in the BLL looks like this:
public bool CreateTables(string connectionString)
{
    bool result = default(bool);

    // Data access component declarations.
    Utilities utilities = new Utilities();

    // Step 1 - Calling CreateTables on Utilities.
    result = utilities.CreateTables(connectionString);

    return result;
}
The line the error points to is:
result = utilities.CreateTables(connectionString);
I could, obviously, add the SMO references to the BLL and then the BLL would be happy, but that violates my design goal of loosely coupled projects. If I add the SMO assemblies to the BLL, it compiles and then referencing BLL objects in the services layer doesn't cause a complaint. My question is, why? More specifically: Why does the BLL need references to SMO when the Utilities class in the DAL already abstracts away the SMO types?
What I want is everything database related to live in the DAL (duh) and only business logic in the BLL (double duh). Is there another way to achieve this using SMO that I have overlooked?
Thank you for your valuable time and answers, I humbly await your responses
Edit:
I've adjusted my solution based on suggestions by Chris, verified that I'm using project references (I am), and re-added the SMO references in the DAL using Muse.VSExtensions to add GAC references (before, I had been browsing and adding them manually). Then I went ahead and set Copy Local = True for those assemblies just to be doubly sure they're around... but I'm still stuck with this annoying compile error.
I think this boils down to how things are referenced in your solution. So I'm going to take a couple guesses.
It sounds like your BLL references the DAL as an assembly instead of as a project reference.
At compile time Visual Studio copies everything it thinks is necessary to the project's bin directory. If you reference an external DLL (the DAL), then it will copy only that DLL to your BLL's bin directory.
What you need to do is get it to copy the SMO assemblies as well OR have those SMO assemblies available through the GAC. Personally, I don't like GAC'ing things, so I'll ignore that.
There are three ways of doing this. The easiest is to simply add a reference to those other assemblies to your BLL. Obviously that's not what you want.
The second way is to reference the DAL project as a project reference. This will allow Visual Studio to detect the extra dependencies and copy them accordingly. This is also not exactly what you want.
The third way is to copy them as part of a build step. Right-click on your BLL project and go to Build Events. In the Pre-build event command line, put in the commands to copy the necessary SMO files to your BLL project's bin directory.
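For example, in the Pre-build event command line (the source folder is a placeholder; $(SolutionDir) and $(TargetDir) are standard Visual Studio macros):

copy "$(SolutionDir)Lib\Microsoft.SqlServer.Smo.dll" "$(TargetDir)"
copy "$(SolutionDir)Lib\Microsoft.SqlServer.ConnectionInfo.dll" "$(TargetDir)"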
You'll have to do this again for the main service project as well.
It's depressing to answer your own question with a mea culpa, "I'm an idiot"... but, well, I'm an idiot:
In Utilities there was an overload for the offending method which did contain an Smo.Server parameter. I removed that overload (an artifact from testing before refactoring) and voila, problem solved/idiocy confirmed!

The interesting thing I learned here is that using the other methods of the Utilities class, which did not have overloads containing Smo objects, was absolutely fine. Even with a function in the Utilities class that required an Smo object as a parameter, as long as I didn't call that method or one of its overloads, the references resolved without a hitch.

The new question I have is: why? Why does that overload's existence matter for reference resolution if I call another version of the function from a project higher in the dependency chain? Is there some internal loop where the compiler goes over all versions of a function, checking references, if any version has been called?
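The likely explanation: when the compiler resolves a call like utilities.CreateTables(connectionString), it has to consider every accessible overload in the CreateTables method group in order to pick the best match, and that means it must be able to resolve every overload's parameter types, including Smo.Server, even for overloads that are never chosen. Methods without such overloads never drag SMO into the resolution, which is why they compiled fine. Roughly, the situation looked like this (a reconstruction with made-up bodies):

public class Utilities
{
    // Callable from the BLL with no SMO reference... once the overload below is gone.
    public bool CreateTables(string connectionString) { /* ... */ return true; }

    // The leftover overload: the compiler must consider it as a candidate for
    // every call to CreateTables, which requires resolving the Smo.Server type,
    // hence the missing-reference error in the BLL.
    public bool CreateTables(Microsoft.SqlServer.Management.Smo.Server server) { /* ... */ return true; }
}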
I have created several small applications that use my own DLL. The problem is, this DLL is constantly changing. My current solution to this problem is that I have a Setup project in the class library solution that creates and registers the DLL. In all my applications I then have to open the solution and re-reference the newly created/registered DLL. Then I have to re-compile their setup projects, uninstall the old applications, and then re-install the new application.
There has to be a better way and I'm just not sure because I'm fairly new to all this. I have looked into ClickOnce but I don't think that will solve my issue as I cannot publish a class library. I have looked into checking version numbers but I must be doing something wrong because it doesn't work either.
I understand that once a DLL is created and being used in an application it should essentially not be touched. I do not have that option in this situation. It is constantly updated. Done.
So, is there a better way? A point in the direction of a guide or related question/answer/forum would be greatly appreciated.
Edit: The DLL is not constantly changing during runtime, but it is constantly evolving to allow more functionality and detail within the other applications. Also, one big thing I should have mentioned: the public interface is constantly changing, usually by adding new methods.
Make sure the references to your DLL specify SpecificVersion=false. Then just deploy each new version into the GAC and that should do the trick.
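In an old-style .csproj that setting looks like this (the assembly name is a placeholder):

<Reference Include="MyCompany.MyLibrary">
  <SpecificVersion>False</SpecificVersion>
</Reference>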
Alternatively, you can manually force versions using binding redirection.
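A binding redirect goes in the consuming application's app.config and looks roughly like this (name, token, and version numbers are placeholders):

<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="MyCompany.MyLibrary"
                          publicKeyToken="32ab4ba45e0a69a1"
                          culture="neutral" />
        <bindingRedirect oldVersion="0.0.0.0-2.0.0.0" newVersion="2.0.0.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>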
One approach you can try is to put everything in a single solution and use project references wherever the library is needed.
Check out NuGet
You could set up an internal NuGet repository (really just a folder that stores .nupkg files). Then when you build a new DLL, you can update the apps as needed in Visual Studio, which ensures they have the latest version. They shouldn't need a redeployment unless there are bugs in the DLL that you're fixing.
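Pointing the apps at such a folder feed is just a NuGet.config entry (the path is a placeholder):

<configuration>
  <packageSources>
    <add key="InternalFeed" value="\\fileserver\nuget-packages" />
  </packageSources>
</configuration>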
One solution is as follows:
Physically separate the interface from the implementation. E.g. AssemblyA is the interface; the application (AssemblyB, say) knows only the interface at compile time. The implementation (AssemblyC) also knows/references AssemblyA, of course. The point is that AssemblyB does not reference AssemblyC. This will require you to use an IoC container (like MS Unity 2.0, but there are many others) in order to resolve and instantiate your concretes at runtime.
Write an update process that finds the new AssemblyC.dll, replaces the local copy, and uses reflection along with the IoC container to 'load' the new implementation at whatever interval you require, typically app start-up.
The above relies on your interface being stable. If it isn't, you may be able to write a (more) stable facade.
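A pared-down sketch of points 1 and 2, with Activator standing in for a real IoC container (IWorker and all type/file names here are illustrative):

using System;
using System.Linq;
using System.Reflection;

// AssemblyA: the stable interface that both AssemblyB and AssemblyC reference.
public interface IWorker
{
    void DoWork();
}

// AssemblyB (the application): loads whatever AssemblyC.dll is present and
// resolves the concrete type through reflection; it never references
// AssemblyC at compile time.
public static class PluginHost
{
    public static IWorker LoadWorker(string pluginPath)
    {
        Assembly implementation = Assembly.LoadFrom(pluginPath);

        Type workerType = implementation.GetTypes()
            .First(t => typeof(IWorker).IsAssignableFrom(t) && !t.IsAbstract);

        return (IWorker)Activator.CreateInstance(workerType);
    }
}

With this in place, the update process from point 2 just swaps AssemblyC.dll on disk; the next call to LoadWorker (typically at app start-up) picks up the new implementation without recompiling AssemblyB.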