We have different C# Visual Studio solutions in different Git repositories. Each repository contains source code that needs functional tests and these tests will be integrated with Azure DevOps. The idea is to have a single C# testing automation framework with generic steps, hooks and logic that can be used among all solutions.
Some ideas I came up with:
1. Have a separate repository with the automation framework files and copy and paste the hooks, steps, configuration files, etc. into each of the solutions/repositories.
2. Create a SpecFlow project inside each solution/repository and maintain one automation framework per solution.
3. Use NuGet to pack the testing framework and install it into each of the solutions. I personally like this last approach, but I am not sure how it can be achieved.
Any suggestions or ideas on how to solve it?
You can use Bindings from External Assemblies, which allows you to create a general class library containing your step definitions and hooks, and then reuse them in multiple test projects.
If using SpecFlow.json in your test project, add this:
{
  ... other JSON settings for SpecFlow ...
  "stepAssemblies": [
    {
      "assembly": "NameOfYourSharedBindings"
    }
  ]
}
If using App.config:
<specFlow>
  <!-- other SpecFlow settings -->
  <stepAssemblies>
    <stepAssembly assembly="NameOfYourSharedBindings" />
  </stepAssemblies>
</specFlow>
Just replace NameOfYourSharedBindings with the name of the DLL containing your shared steps and hooks. Be sure to add a reference from your test project to the class library project containing the shared steps.
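For illustration, here is a minimal sketch of what the shared class library might contain; the namespace, class names, and step text are hypothetical:

using TechTalk.SpecFlow;

namespace NameOfYourSharedBindings
{
    [Binding]
    public class SharedSteps
    {
        // A generic step that feature files in any solution can reuse.
        [Given(@"the test environment is initialized")]
        public void GivenTheTestEnvironmentIsInitialized()
        {
            // Shared setup logic goes here.
        }
    }

    [Binding]
    public class SharedHooks
    {
        // Runs before every scenario in every test project
        // that references this assembly.
        [BeforeScenario]
        public void BeforeScenario()
        {
            // Shared per-scenario initialization.
        }
    }
}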
You are not limited to step definitions in another project inside the same Visual Studio solution. You can reference pre-compiled DLL files as well as NuGet packages. So, really, options 2 or 3 would work for you. Just pick which one makes the most sense for your use case.
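For example, if the shared bindings are distributed as a NuGet package, the consuming test project just adds a package reference (the package ID and version here are placeholders) and lists the assembly under stepAssemblies as shown above:

<ItemGroup>
  <PackageReference Include="MyCompany.SharedBindings" Version="1.0.0" />
</ItemGroup>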
I quite like separating functionality across a few assemblies, for example a facade to a data provider, contracts for the data provider and the data provider implementation itself... to my mind, it makes it easy to unit test the individual components of a piece of functionality and easy to swap out one thing in the future (in the case of my example, it makes the data provider easy to swap out).
If I create a solution with 3 projects and use project references, when I run dotnet build on the entry assembly, all the references are copied to the output folder. When I dotnet pack the entry assembly project to create a NuGet package, only the entry assembly (not the contracts or the data provider) is included in the NuGet package.
This appears to be by design; the documentation for .NET Core dotnet-pack states that
Project-to-project references aren't packaged inside the project.
Currently, you must have a package per project if you have project-to-project dependencies.
My question is: why is this the case? If I want to separate my code into logical assemblies, I am forced to either create separate NuGet packages and reference those, or simply lump all my code into a single assembly. Is there any way to include project references in a NuGet package?
I am using VS2017 / .NET Core v1.1 (csproj, not xproj)
A possible way to achieve this is to use a custom .nuspec file, in which you can specify the DLLs you want packed:
<PropertyGroup>
  <NuspecFile>App.nuspec</NuspecFile>
</PropertyGroup>
With this in place, dotnet pack will produce the package according to App.nuspec.
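As a rough sketch, App.nuspec could then explicitly list the entry assembly together with the DLLs built from its project references. The package ID, paths, and target framework below are placeholders:

<?xml version="1.0"?>
<package>
  <metadata>
    <id>MyApp</id>
    <version>1.0.0</version>
    <authors>you</authors>
    <description>Entry assembly packed together with its project-reference DLLs.</description>
  </metadata>
  <files>
    <!-- Pack the entry assembly plus the assemblies built from project references. -->
    <file src="bin\Release\netstandard1.6\MyApp.dll" target="lib\netstandard1.6" />
    <file src="bin\Release\netstandard1.6\MyApp.Contracts.dll" target="lib\netstandard1.6" />
    <file src="bin\Release\netstandard1.6\MyApp.DataProvider.dll" target="lib\netstandard1.6" />
  </files>
</package>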
Furthermore, if you have, say, 3 projects with contracts and implementations and you don't want to add 3 package references, you can create a meta-package that simply has those 3 as dependencies and reference that single meta-package.
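A meta-package along those lines would contain no assemblies of its own, only dependencies (again, the package IDs and versions are placeholders):

<?xml version="1.0"?>
<package>
  <metadata>
    <id>MyApp.All</id>
    <version>1.0.0</version>
    <authors>you</authors>
    <description>Meta-package that pulls in the contracts, provider, and facade packages.</description>
    <dependencies>
      <dependency id="MyApp.Contracts" version="1.0.0" />
      <dependency id="MyApp.DataProvider" version="1.0.0" />
      <dependency id="MyApp.Facade" version="1.0.0" />
    </dependencies>
  </metadata>
</package>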
I would like to use VS2012's "Add as Link" functionality, which is meant for files, but instead for a certain block of code.
I've got a solution with lots of projects. I am creating a unit testing project that will house all the algorithms that exist in the other projects in the solution. I can copy over all the algorithms I want to test into a file in the new unit testing project, however I am also looking for a way to automatically update the code in the test file if say the code in the other projects updates. It is almost as if I want to create a reference to a code chunk in VS.
If no such functionality exists, is there some sort of script I could create that updates the code in the test project every time I build?
EDIT: The reason I cannot have testing code within the projects themselves is that the other projects in the solution are .NET Micro Framework projects, and the .NET Micro Framework does not support the C# attributes used by the NUnit testing framework. However, there are some algorithms in the driver files of the .NET MF projects that I would like to test, and these algorithms are independent of the project type. So I am looking for a way to keep this code in sync, so that if any change is made to an algorithm within the .NET MF projects, the same change is made in the unit testing project without the need for manual copying.
EDIT: In the simplest terms, all I am looking for is some sort of script I can run to copy certain code blocks from one project file to another.
I had a very similar problem where different parts of the same source code file were shared between different projects. Here is a fix that worked for me.
Suppose the source file File1.cs contains four methods, two of which are used by ProjA and two are used by ProjB, where ProjA and ProjB are different projects inside a Visual Studio solution.
To keep just one copy of File1.cs, use conditional compilation symbols, e.g. PROJ_A and PROJ_B for the two projects. Wrap the methods used by ProjA in #if PROJ_A and the methods used by ProjB in #if PROJ_B. Then add File1.cs as a linked file to both projects and make sure the corresponding conditional compilation symbols are defined in each project's build settings.
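Here is a minimal sketch of how File1.cs might look with this approach (the class and method names are just examples):

// File1.cs, added as a linked file to both ProjA and ProjB.
namespace Shared
{
    public static class SharedAlgorithms
    {
#if PROJ_A
        // Compiled only into ProjA, which defines the PROJ_A symbol.
        public static int UsedByProjA(int x)
        {
            return x * 2;
        }
#endif

#if PROJ_B
        // Compiled only into ProjB, which defines the PROJ_B symbol.
        public static int UsedByProjB(int x)
        {
            return x + 2;
        }
#endif
    }
}

The symbols themselves are defined per project, either under Project Properties > Build > Conditional compilation symbols, or via <DefineConstants> in the project file.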
If this is what you were looking for, let me know if you get any problems implementing it.
In my project I use some SDK libraries written by an external team. These libraries use Prism. For various reasons we had to roll back to a previous version of their SDK, and now the build fails trying to find a reference to Microsoft.Practices.Composite.dll. Am I right that this is what the Prism library was called in an earlier version, or is it something completely different?
You are correct. Pre-v4, Prism's DLLs included Microsoft.Practices.Composite.dll, but as of v4 this has been rolled into Microsoft.Practices.Prism.dll along with some other functionality.
The documented list of changes is this:
The Composite Application Library was renamed to the Prism Library.
The Composite and Composite.Presentation portions of the namespaces were removed and the Composite and Composite.Presentation assemblies collapsed into a single assembly named Microsoft.Practices.Prism.
The Microsoft.Practices.Prism libraries for Silverlight and WPF now register the Microsoft.Practices.Prism.Regions, Microsoft.Practices.Prism.Commands, and Microsoft.Practices.Prism.ViewModel namespaces with the http://www.codeplex.com/prism xmlns definition (see the XAML sketch below).
Several reusable user interface (UI)–based behaviors were extracted into the Prism.Interactivity assembly, including the interaction request behavior.
You can now use MEF as the dependency injection container. This functionality required two new projects in the Prism Library solutions: Prism.MefExtensions.Desktop and Prism.MefExtensions.Silverlight. These projects create a new assembly, Microsoft.Practices.Prism.MefExtensions.dll, in the respective Desktop and Silverlight folders. Also included in the solutions are new unit test projects for the new functionality.
Source here.
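As a small illustration of the consolidated xmlns registration mentioned in the list above, a view can map a single prism prefix and use, for example, the RegionManager attached property from Microsoft.Practices.Prism.Regions; the view and region names here are made up:

<UserControl x:Class="MyModule.Views.ShellView"
             xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
             xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
             xmlns:prism="http://www.codeplex.com/prism">
    <!-- RegionManager.RegionName resolves through the single
         http://www.codeplex.com/prism xmlns definition. -->
    <ContentControl prism:RegionManager.RegionName="MainRegion" />
</UserControl>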
Microsoft.Practices.Composite is from Prism 2.x.
The current version of Prism is 4.1!
Source: http://msdn.microsoft.com/en-us/library/microsoft.practices.composite.aspx
If you have problems with the upgrade/rollback, you might take a look at:
http://msdn.microsoft.com/en-us/library/ff921073%28v=PandP.40%29.aspx
and, more specifically about the SDK's assemblies:
http://msdn.microsoft.com/en-us/library/ff921144(v=pandp.40).aspx#AssemblyRef
While these documents talk about the upgrade process, they should help you understand what needs to be taken care of during a rollback.
What's the motivation behind using a C# test project rather than a C# class library project to hold my unit tests?
Thanks.
The test project will, by default, have all the MSTest references added for you automatically. Some default content, such as a simple example test, is also created for you.
With a class library project you can build a test project too, but you have to add the MSTest references manually. Not a major problem really, but the test project can save you time and hassle.
EDIT:
As noted in the comments, the big difference between the two project types is that, with a class project, you can choose whichever unit testing framework you like.
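For example, an NUnit test can live in a plain class library once the NUnit package is referenced; nothing about it requires the Test project subtype (the class and assertion below are just a sketch):

using NUnit.Framework;

namespace MyProduct.Tests
{
    // An ordinary class library project; NUnit comes in via its
    // NuGet package, no special project type is required.
    [TestFixture]
    public class CalculatorTests
    {
        [Test]
        public void Add_ReturnsSum()
        {
            Assert.AreEqual(4, 2 + 2);
        }
    }
}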
I wasn't able to find much information about this Test project subtype, identified by {3AC096D0-A1C2-E12C-1390-A8335801FDAB}; however, according to https://msdn.microsoft.com/en-us/library/bb166488.aspx, project subtypes can include customizations such as:
- saving additional data in the project file,
- adding or filtering items in the Add New Item dialog box,
- controlling how assemblies are debugged and deployed,
- and extending the project Property Pages dialog box.
Indeed, right-clicking a Test project and opening the "Add" submenu shows the test items "Unit Test", "Ordered Test", and "Generic Test". Adding "Unit Test" generates a test class template for MSTest.
The project file also contains additional data like:
<TestProjectType>UnitTest</TestProjectType>
<IsCodedUITest>False</IsCodedUITest>
So it seems that this project subtype is rather MSTest-specific. If I'm not using MSTest (using xUnit, for example), I prefer a class library.
An additional argument may be the fact that in .NET Core, project flavoring (subtyping) is not the desired way of doing things, and indeed the .NET Core test templates are class libraries. See this discussion: https://social.msdn.microsoft.com/Forums/vstudio/en-US/061aaf74-bb13-4646-9d69-064f6f1b8ef6/net-core-project-subtypes
Bonus: the xUnit.net.TestGenerator extension generates a Test project subtype if you don't select an existing project (probably because it is just an xUnit adapter for the VS test generation tool).
We are doing a project that uses interfaces and Unity to resolve concrete implementations of classes.
My question is the following: I need to get all my DLLs into the same folder, otherwise Unity will not be able to resolve the interfaces, etc. So, as I see it, I have a couple of options:
1. Add the projects with the implementations as references and let VS copy the files to the output folder (for some reason this just feels like a hack)
2. Change the build location of all my projects to build to the same folder
3. Create a post-build event to copy all the files needed to wherever they need to go
I have implemented the second option, but this could lead to files in your build folder that should not be there. I am not a big fan of post-build events, so I would like to ask other people using Unity what they found to be the best solution for them.
Thanks in advance
The first approach sounds like the right one to me. Your project does depend on the implementation libraries; it doesn't express that dependency directly in code, but it requires them, so it seems reasonable to add a reference to them.
This is basically the same situation as where you've got three projects, where project A depends on project B, which depends on project C - you need to explicitly add project C as a reference within project A. Visual Studio doesn't work out transitive dependencies for you (at least it didn't the last time I checked).
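To make that dependency concrete, consider a minimal Unity composition root (the type names are hypothetical). Registering the concrete type requires a compile-time reference to the implementation project, and that project reference is exactly what makes Visual Studio copy the implementation DLL to the output folder:

using Microsoft.Practices.Unity;

// In the real solution these would live in separate projects:
// the interface in a contracts project, the class in an
// implementation project referenced by the composition root.
public interface IOrderRepository { }
public class SqlOrderRepository : IOrderRepository { }

public static class CompositionRoot
{
    public static IUnityContainer Configure()
    {
        var container = new UnityContainer();

        // Naming SqlOrderRepository here forces a reference to the
        // implementation assembly, so its DLL ends up in the output folder.
        container.RegisterType<IOrderRepository, SqlOrderRepository>();
        return container;
    }
}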