We are looking to implement reusable functionality within more than one of our products.
What I would like to do is:
Create a C# project that contains one or more Azure Functions (static methods with the FunctionNameAttribute attached to them)
Turn this project into a NuGet package
Reference this NuGet package in an Azure Functions project
Have the function(s) from the NuGet package exposed in the project it is being used in
Whilst I find that it all compiles, the functions in the NuGet package are not "found" on startup and are ignored. I can see that this could be desirable for security reasons, but I'm wondering if there is a way to override how the Functions runtime operates to say "scan this assembly and use any functions contained within it".
Even having that custom code would be preferable to the situation in which we find ourselves - having joint functionality in our package that is referenced by two different products, but each of the products having to create a duplicate set of functions that then call the common code.
I think that the best approach for this would be through git submodules and not NuGet packages.
Create a project that both products will reference: a base class with methods that you can override, or simply reuse in both solutions by inheriting from it.
Following your comment, let's try this workflow (a rough sketch of the base class follows the list):
1 - Write an Azure Functions project with Swagger for all the common Product operations.
2 - Write a new project containing a ProductBase class that consumes all of the Product REST endpoints.
3 - Convert that consumer project into a NuGet package.
4 - Install it in both Projects A and B, and inherit from the ProductBase class.
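A minimal sketch of what that base class might look like, assuming a hypothetical shared Functions endpoint (the URL and member names below are made up for illustration):

using System;
using System.Net.Http;
using System.Threading.Tasks;

// Hypothetical base class living in the shared consumer project.
// Products A and B inherit from it and override only what they need.
public abstract class ProductBase
{
    private readonly HttpClient _client;

    protected ProductBase(string baseUrl)
    {
        _client = new HttpClient { BaseAddress = new Uri(baseUrl) };
    }

    // Calls a common operation exposed by the shared Azure Functions project.
    public virtual Task<string> GetCatalogAsync()
    {
        return _client.GetStringAsync("api/catalog");
    }
}

// In Product A (or B), after installing the NuGet package:
public class ProductAClient : ProductBase
{
    public ProductAClient() : base("https://shared-functions.example.com/") { }

    // Override members only where this product needs different behaviour.
}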
I'm not new to C# programming, but I suppose I'm new to programming "the right way" in C#. I've worked in C on embedded devices for years and have written desktop apps to support them. First in VB6, then in C#.
I recently started making better use of classes for reusing code (and for instantiating more than one instance of the class in a program). For example, I "wrapped" a UART interface with some additional functionality so I can use the same code for multiple ports by creating an instance of the class for each one.
It is in a separate file, but still in the same program namespace, so when I want to reuse it, I have to copy the file and change the namespace to the new project.
I'm sure there's a way to create it such that I can just reference it like everything else with either a "using..." reference at the top of the program or with a "Project | References..." checkbox. But for the life of me I can't find a good learning journey for this.
Any direction would help.
You want to create your reusable class in an assembly - this is the equivalent of a DLL from your C experience.
To create an assembly, add a separate project of the class library type (instead of an exe). You can reference the assembly from other projects: if the project is in the same solution you can add a project reference, otherwise you can reference the compiled assembly.
C# uses a packaging system called NuGet, so you can package your assemblies into "NuGets" which you host on a NuGet server. You can then use tooling to discover and import these.
Please create a Class Library project and include your class in that project. Make sure your class is public. Once you build this project you'll get an assembly which can be referenced from other projects. See Tutorial: Create a .NET class library using Visual Studio
There are different ways of referencing it.
You can have the class library project in the same solution as the main project. In this case you should add a project reference.
You can copy the compiled *.dll file to some folder in your solution (e.g. Lib) and add an assembly reference.
If this assembly is to be used in multiple projects please consider creating a NuGet package with this library and pushing it to some repository. Then other projects can add a package reference to this package.
Details:
How to: Add or remove references by using the Reference Manager
Install and manage packages in Visual Studio using the NuGet Package Manager
It is in a separate file, but still in the same program namespace, so when I want to reuse it, I have to copy the file and change the namespace to the new project.
Well, it isn't best practice, but it is (unfortunately) still a common approach. So don't worry too much about it.
What you could do to improve it is place the file (and other reusable parts) in a separate csproj.
For example, create a project of type class library and name it VinDag.Tools. Within the project create a folder UART and place the wrapper there. The namespace of the wrapper would then be VinDag.Tools.UART.
From now on you can just reference the class library instead of copying and renaming the file. The namespace isn't necessarily required to match the project name.
From there you can start considering (private) NuGet packages. This would prevent you from copying files/csproj references around.
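To make that concrete, here is a minimal sketch of the wrapper living in that class library (the member names are made up; only the project/namespace layout matters):

// In the VinDag.Tools class library, file UART/SerialWrapper.cs
using System.IO.Ports;

namespace VinDag.Tools.UART
{
    // Public so it is visible to any project that references the library.
    public class SerialWrapper
    {
        private readonly SerialPort _port;

        public SerialWrapper(string portName, int baudRate)
        {
            _port = new SerialPort(portName, baudRate);
        }

        public void Open() => _port.Open();

        public void Send(string text) => _port.WriteLine(text);
    }
}

// In any project that references VinDag.Tools (project, assembly or package reference):
// using VinDag.Tools.UART;
// var port1 = new SerialWrapper("COM1", 115200);
// var port2 = new SerialWrapper("COM2", 9600);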
I'm trying to use the Fody MethodDecorator to log method entry/exit in multiple projects. I've created a project in visual studio and implemented the IMethodDecorator interface as described in the ReadMe.md in the Fody MethodDecorator project.
The Fody MethodDecorator works fine in the one project where I've implemented the IMethodDecorator interface and I can log method entries/exits. But when I try to log method entry/exits in other projects by adding a reference to my first project, there is no logging in any other projects.
I've created a GitHub repository with a minimal working example just to show the issue.
There are others that have had the same issue, like this post from 3 years ago. But the solution in that post is to add the Fody and MethodDecorator NuGet packages to all projects. I have already added both NuGet packages to all projects. I have also added the FodyWeavers.xml to all projects.
If you want to know what my code looks like, just check out the git repo. Is there anything else I have to do to get logging of method entry/exit working in other projects?
I would like to simply add a reference to my first project and reuse the method decorator from the first project. I'm using .Net 4.8, Fody 6.5.3 and MethodDecorator.Fody 1.1.1.
Edit
If you download the repository and get the error: A numeric comparison was attempted on "$(MsBuildMajorVersion)" that evaluates ... This is apparently due to a bug in Visual Studio that can be overcome by a restart.
Not really an answer to the question, but I ended up switching to MethodBoundaryAspect.Fody.
It's a NuGet package that does the same thing as MethodDecorator.Fody, and has an interface that you implement in an almost identical manner.
I got MethodBoundaryAspect.Fody to work in multiple projects by referencing an attribute that I only created in one of the projects. You still have to add the NuGet packages MethodBoundaryAspect.Fody and Fody, and also add the FodyWeavers.xml file to each project, though.
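For illustration, a shared logging attribute along the lines of the MethodBoundaryAspect.Fody README might look roughly like the sketch below; check the exact type and member names against the package's documentation before relying on them.

using System;
using MethodBoundaryAspect.Fody.Attributes;

// Defined once in the shared project and referenced from the others.
public sealed class LogAttribute : OnMethodBoundaryAspect
{
    public override void OnEntry(MethodExecutionArgs args) =>
        Console.WriteLine($"Entering {args.Method.Name}");

    public override void OnExit(MethodExecutionArgs args) =>
        Console.WriteLine($"Leaving {args.Method.Name}");

    public override void OnException(MethodExecutionArgs args) =>
        Console.WriteLine($"Exception in {args.Method.Name}: {args.Exception}");
}

// Usage in any project that also has the Fody packages and FodyWeavers.xml:
// [Log]
// public void DoWork() { ... }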
For MethodDecorator.Fody to work in other projects, you need to add the module-level attribute
[module: Interceptor] in each project.
My suggestion is to create an interface as below in each project, and make sure your classes implement that interface. (This way you only add the module-level attribute in one place per project.)
[module: Interceptor]

public interface IInterceptor
{
    // No methods here.
    // The interface is used only to register the Interceptor attribute at the module level.
}

public class SomeClass : IInterceptor
{
    [Interceptor]
    public void SomeMethod()
    {
    }
}
We have different C# Visual Studio solutions in different Git repositories. Each repository contains source code that needs functional tests and these tests will be integrated with Azure DevOps. The idea is to have a single C# testing automation framework with generic steps, hooks and logic that can be used among all solutions.
Some ideas I came up with:
Having a separate repository with automation framework files and just copy and paste hooks, steps, configurations files, etc. for each of the solutions / repositories.
Create a SpecFlow project inside of each solution / repository and maintain one automation framework for each solution.
Use NuGet to pack the testing framework and install it into each of the solutions. I personally like this last approach but I am not sure how it can be achieved.
Any suggestions or ideas on how to solve it?
You can use Bindings from External Assemblies, which allows you to create a general class library containing your step definitions and hooks, and then reuse them in multiple test projects.
If using SpecFlow.json in your test project, add this:
{
  ... other JSON settings for SpecFlow ...
  "stepAssemblies": [
    {
      "assembly": "NameOfYourSharedBindings"
    }
  ]
}
If using App.config:
<specFlow>
  <!-- other SpecFlow settings -->
  <stepAssemblies>
    <stepAssembly assembly="NameOfYourSharedBindings" />
  </stepAssemblies>
</specFlow>
Just replace NameOfYourSharedBindings with the file name of the DLL file containing your shared steps and hooks. Be sure to add a project reference to your test project so it references the class library project containing your shared steps.
You are not limited to step definitions in another project inside the same Visual Studio solution. You can reference pre-compiled DLL files as well as NuGet packages. So, really, options 2 or 3 would work for you. Just pick which one makes the most sense for your use case.
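For completeness, the shared class library itself just contains ordinary SpecFlow bindings; a minimal sketch (the step text and names are only illustrative):

using TechTalk.SpecFlow;

namespace NameOfYourSharedBindings
{
    [Binding]
    public class CommonSteps
    {
        [Given(@"the user is logged in")]
        public void GivenTheUserIsLoggedIn()
        {
            // Shared step logic reused by every test project.
        }

        [BeforeScenario]
        public void ResetTestState()
        {
            // Shared hook that runs before every scenario.
        }
    }
}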
I have a project that requires certain functions, and I have created an interface for these functions. If the implementation of the interface is to be performed by external parties, what is the correct way to do this?
The idea I have is to create a class library project (MyInterface), define my interface (IModule) in this project, compile this and give the DLL (MyInterface.dll) to the external parties. The external parties would develop/implement using the DLL as a reference, then give me their final DLL. My main project would reference the MyInterface project, as well as all the implementations. In my code, I would do something like this:
IModule module = new ImplementationA();
module.DoSomething();
module = new ImplementationB(); // Change implementation at runtime
Is this approach correct? Are there better alternatives?
Side question: Is this strategy design pattern? Or facade?
I would probably use configuration to load the 3rd party assembly at runtime, look for a single factory type, create a singleton instance, and create each implementation class through the factory. This way you can give all of your assemblies to a 3rd party, and they can build and update their component at any time. They depend on your code, but you don't depend on theirs.
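A rough sketch of that idea, assuming a hypothetical IModuleFactory contract that the third party agrees to implement and an assembly path that comes from configuration:

using System;
using System.Linq;
using System.Reflection;

// Hypothetical factory contract shipped to the third party alongside IModule.
public interface IModuleFactory
{
    // IModule is the interface from the MyInterface project described above.
    IModule Create(string moduleName);
}

public static class ModuleLoader
{
    public static IModuleFactory LoadFactory(string assemblyPath)
    {
        // Load the third-party assembly named in configuration.
        Assembly assembly = Assembly.LoadFrom(assemblyPath);

        // Find the single concrete type that implements the factory contract.
        Type factoryType = assembly.GetTypes()
            .Single(t => typeof(IModuleFactory).IsAssignableFrom(t) && !t.IsAbstract);

        // Create one instance and reuse it as a singleton.
        return (IModuleFactory)Activator.CreateInstance(factoryType);
    }
}

// Usage (the configuration key is hypothetical):
// IModuleFactory factory = ModuleLoader.LoadFactory(config["ModuleAssemblyPath"]);
// IModule module = factory.Create("ImplementationA");
// module.DoSomething();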
I don't know much about your project or situation, but I would consider publishing a NuGet package of your interface and allowing the third parties to install it. That gives you the power of versioning when you need to make changes to your interface(s). You publish the change(s) as a new version and then they can update their installed package of your DLL accordingly. That at least gives them a concrete way to develop against what you require in a controlled, robust manner.
In case you aren't familiar with NuGet packages: in Visual Studio you can right-click the project that produces the DLL within the solution and select Pack. You can also change pack settings by going to the Package tab in the project properties (same context menu on the project, select Properties).
You can also do dotnet pack through the command line, which has a lot of command line arguments that you can leverage.
Whichever way you go about it, you can publish your NuGet package to nuget.org, to some other service with an artifact feed, or simply to a file on disk.
Helpful NuGet package references:
Installing a NuGet package
Create and publish a NuGet package
As for your question about the pattern: typically, when you switch implementations like that at run time, that is the strategy pattern at work. The facade pattern is when you use a single class to abstract away multiple instances of different sub-system classes. For example, let's say you have operation classes for GetUser, UpdateUser and DeleteUser. Your facade could be a single class called UserManager, and within that class you would expose a function for each user operation. Inside those functions you would access the individual instance of each operation class and pass the call on internally. In this example, inside the facade you know you are working with 3 classes; from the perspective of callers outside of it, they only know about the single UserManager class.
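A quick sketch of that facade, using the example names above (the method bodies are placeholders):

// Sub-system operation classes; only the facade needs to know about them.
public class GetUser
{
    public string Execute(int id) => $"user {id}";
}

public class UpdateUser
{
    public void Execute(int id, string name) { /* update logic */ }
}

public class DeleteUser
{
    public void Execute(int id) { /* delete logic */ }
}

// The facade: callers only ever see UserManager.
public class UserManager
{
    private readonly GetUser _getUser = new GetUser();
    private readonly UpdateUser _updateUser = new UpdateUser();
    private readonly DeleteUser _deleteUser = new DeleteUser();

    public string Get(int id) => _getUser.Execute(id);
    public void Update(int id, string name) => _updateUser.Execute(id, name);
    public void Delete(int id) => _deleteUser.Execute(id);
}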
I quite like separating functionality across a few assemblies, for example a facade to a data provider, contracts for the data provider and the data provider implementation itself... to my mind, it makes it easy to unit test the individual components of a piece of functionality and easy to swap out one thing in the future (in the case of my example, it makes the data provider easy to swap out).
If I create a solution with 3 projects and use project references, when I dotnet build the entry assembly, all the references are copied to the output folder. When I dotnet pack the entry assembly project to create a NuGet package, only the entry assembly (not the contracts or the data provider) is included in the NuGet package.
This appears to be by design; the documentation for .NET Core dotnet-pack states that
Project-to-project references aren't packaged inside the project.
Currently, you must have a package per project if you have project-to-project dependencies.
My question is - why is this the case? If I want to separate my code into logical assemblies, I am forced to either create separate NuGet packages and reference those, or simply lump all my code into a single assembly. Is there any way to include project references in a NuGet package?
I am using VS2017 / .NET Core v1.1 (csproj, not xproj)
A possible way to achieve what you need is to use a custom .nuspec file in which you can specify the DLLs you want packed:
<PropertyGroup>
  <NuspecFile>App.nuspec</NuspecFile>
</PropertyGroup>
Having this, dotnet pack will produce the package according to App.nuspec rather than the project file.
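As a rough illustration, App.nuspec could then explicitly list the DLLs produced by the project references (all names, paths and the target framework below are placeholders):

<?xml version="1.0"?>
<package>
  <metadata>
    <id>MyCompany.DataProvider</id>
    <version>1.0.0</version>
    <authors>MyCompany</authors>
    <description>Facade, contracts and data provider implementation in one package.</description>
  </metadata>
  <files>
    <file src="bin\Release\netcoreapp1.1\MyCompany.Facade.dll" target="lib\netcoreapp1.1" />
    <file src="bin\Release\netcoreapp1.1\MyCompany.Contracts.dll" target="lib\netcoreapp1.1" />
    <file src="bin\Release\netcoreapp1.1\MyCompany.DataProvider.dll" target="lib\netcoreapp1.1" />
  </files>
</package>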
Furthermore, if you have, say, 3 projects with contracts and implementation and you don't want to add 3 package references, you can create a meta-package that simply has those 3 as dependencies and reference that single meta-package.