After reading about using Git with Visual Studio solutions, I understand that it is preferable to use one Git repository per project, at least for projects that represent a (re-usable) common library. If you have a solution with many projects specific to that solution, using only one repository may be acceptable. For a common library, it seems logical to me to have a dedicated repository, so that you can fix bugs in the library from wherever you detect them and keep a single history of all its changes.
Using one Git repository per common project means that any solution that produces an executable using one or more common libraries involves more than one Git repository.
According to this Visual Studio suggestion, Allow multiple Git repositories to be active at once (1995 requests), Visual Studio does not seem to support multiple Git repositories seamlessly.
Does Visual Studio 2017 implement the above suggestion and manage multiple Git repositories per solution (e.g. one per project for some projects, plus one for the solution itself and all other projects specific to that solution)? In other words, will Visual Studio see and manage changes per project/repository, or do I have to work with only one Git repository at the solution level?
Just as a side question (this is not the primary question; the real one is in the previous paragraph): if Visual Studio does not allow multiple Git repositories to be active at once, wouldn't it be better to stick with TFS for the moment for any development involving a common library?
This is a controversial topic. You have a few options:
Mono-repo. All your code lives in a single repository, whether they're split up into separate solutions or not is up to you. If your dependencies are "possibly reusable" then this is the simplest starting point. If you ever really start to reuse your components, you can always break them out.
Separate repositories + Package Manager (npm, NuGet). Put each reusable project in a separate solution, optionally in a separate repository. Have a CI build automatically create a package and publish it to the VSTS Package Management feed.
Submodules: Create a master repository with your solution, and a separate repository for each reusable component with just the csproj and the sources for that component. Visual Studio 2017 has rudimentary support for submodules, but with a separate Git client on the side (Tower, SourceTree) or simply with some command-line mastery, you can make this work just fine.
Subtree: Create a master repository with your solution, and a separate repository for each reusable component with just the csproj and the sources for that component. Visual Studio 2017 has no support for subtrees at the moment, but with a separate Git client on the side (Tower, SourceTree) or simply with some command-line mastery, you can make this work just fine.
Multi-repo: Create separate repositories for each project and put a solution alongside each one. Manage each sub-component separately; there is no submodule tracking. Visual Studio will actively fight you in this setup, as it only supports a single active repository at a time.
In the end it is your choice. Each option has its own benefits, and there are examples of each out there. The Windows sources are all stored in a single monstrous mono-repo with all their reusable components in the same repository. That has great advantages with regard to knowing which files and which versions work together and which won't. But the repository is so big that Microsoft built GVFS in order to allow their developers to work on a 300 GB working directory.
If your components are currently not being re-used and if your tests are more integration tests than real unit tests, then it makes sense to join them up into the same repository.
You can always split things out when the need arises; in the meantime, try to keep these projects as loosely coupled as possible. The .NET route will most logically lead you to NuGet though, also because it handles the versioning aspects and makes sharing between projects easy without having to build the sources everywhere.
While support for multi-repo solutions has not been great in the past, this is about to change. As of Visual Studio 16.11 Preview 1 there is a much more fully featured Git client: multiple repositories are detected, and users can easily switch between them using a repository selector in the status bar.
See this blog for further information: Enhanced Productivity with Git in Visual Studio
I have a test framework written in C#, I am using Selenium, nUnit and Specflow.
Visual Studio is the tool and I have the normal project folder structure (pages, hooks, steps, features).
Test cases are associated with Azure DevOps, the code is pushed to the repo, and the pipelines are working great.
My issue is that I want to hide the code that sits beneath the feature level.
The idea is that other people could create new feature files and create tests within them, but I do not want them to be able to see the code that is under those Gherkin sentences.
So they should be only able to create and write new features, but not seeing the code below.
Could you please give me some ideas how this can be achieved?
You can import bindings from external assemblies. In SpecFlow.json, add:
{
  "stepAssemblies": [
    {
      "assembly": "Name.Of.Your.StepDefinitions.dll"
    }
  ]
}
While this is typically used as a means to reuse step definitions in multiple projects, you could use this to "hide" the step definitions from test writers. The step definitions would be a pre-compiled DLL file that they can import into a different Visual Studio solution.
This requires two different solutions in Visual Studio:
A regular class library that holds the step definition classes. You will probably need to target .NET Core or .NET 5+, since it must integrate with a unit testing framework; I don't believe .NET Standard would work (but you can certainly try). A minimal sketch of such a step definition class is shown after this list.
Another Visual Studio solution that testers can use, which contains the feature files. This project would reference a DLL file you would create from project #1.
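For illustration, here is a minimal sketch of what a step definition class in solution #1 might look like; the namespace, class name, and page object call are hypothetical:

using TechTalk.SpecFlow;

namespace Name.Of.Your.StepDefinitions
{
    // Compiled into the DLL that testers reference via "stepAssemblies";
    // they see the Gherkin sentence, never this code.
    [Binding]
    public class LoginSteps
    {
        [Given(@"the user is logged in as ""(.*)""")]
        public void GivenTheUserIsLoggedInAs(string userName)
        {
            // Hypothetical Selenium page object call, e.g.:
            // new LoginPage(driver).LogInAs(userName);
        }
    }
}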
The challenge is disseminating this DLL file. Any update you make would need to be given to the testers' Visual Studio solution. A couple of options come to mind:
Manually update solution #2.
Create a NuGet package for solution #1. Set up a build pipeline in Azure DevOps to build, package, and push this NuGet package to your project's "Artifacts" feed whenever you make code changes. Someone (you or a tester) would need to update that NuGet package in solution #2.
As an alternative, give everyone access to the features and step definitions, but enforce a pull request policy on the repository requiring you as a reviewer. That way you can protect the main branch as the gatekeeper. Testers can submit changes to step definitions, and you are free to reject their pull requests if the changes are not satisfactory.
This, of course, depends on the team structure. If you have outside contractors, sharing the step definition code might not be desirable. Just don't limit people's abilities simply because you don't trust their skill level. Use pull request policies requiring you as a reviewer if they change step definition files. This allows you to benefit from someone else's labor while still giving you control.
I am currently working on a personal project, but I want to use Azure and Visual Studio Online build facilities for self-teaching purposes. I am having a hard time resolving this problem:
I have a WPF app connected to an Azure Web API.
The WPF app is in its own Git repo, and the Web API is also in its own Git repo.
Since both apps share a common model, I put the common model in its own repo as well, to avoid code duplication.
I must be missing something ....
What I want to do
When I build on Visual Studio Online, I want to build "common" and feed its output DLLs to the Web API and WPF apps so that they can reference the model.
Solutions considered so far
NuGet package
Making a NuGet package of "model", but where do I push it? It's definitely not going to be of any value to nuget.org, so no go.
I would need some private NuGet feed in Visual Studio Online; I'm not sure that exists.
Post-build event
I also considered adding a post-build event to the "common" build to copy its bin\*.dll output to some "dependencies" folder in the WPF and Web API apps, but I find this dirty. Moreover, I am not sure a build can push its output to the input of another build (I know Jenkins can, but I am unsure about Visual Studio Online). And how can I reference DLLs which do not exist yet in my csproj?
commit bins in repo (ugh)
Of course, I could build the model locally and push the resulting DLL into the Git repos but, well, I am against putting binaries in version control :)
Change my design
Consider that the WPF app only needs DTOs and not the real entities (which is true), but the Web API will need to deserialize those DTOs anyway, so back to square one, just with DTOs this time :)
Thanks for your input !
Thanks a lot to CrowCoder!
That's exactly what I needed: using the "Package Management" extension in Visual Studio Online, which is free for up to 5 users.
Steps required:
configure NuGet on my local machine
create the nuspec
create the feed
package the model library
configure the build to push the library to the feed
use NuGet packages to reference the model
I'm trying to implement my own feature request for the Visual Studio extension CommitFormatter, and I need the git diff patch for that. I could use libgit2sharp (which I expect to be easy); however, that would pull in an additional dependency, a burden I don't want to add to the extension if it's not needed.
I expect that it's possible to get the same result using one of the Team Explorer APIs, but I'm a bit overwhelmed by the number of Microsoft.TeamExplorer.*.dll libraries that Visual Studio contains, and I cannot find any good MSDN pages for this.
What I want to achieve is to get the "diff patch" of the staging area, the output that git diff --cached gives you from the command line, but using the Team Explorer API. Similar to what libgit2sharp's repo.Diff.Compare<Patch>(repo.Head.Tip.Tree, DiffTargets.Index) would give you.
There is no Microsoft.TeamExplorer assembly that provides a git diff, public or private.
Depending on the version, Team Explorer either uses LibGit2Sharp to interact with the git repository (prior to VS 2017) or uses git (VS 2017).
However, no version actually creates git diff files. The difference view takes the raw files out of the repository, calculates the differences, and displays them itself; it does not use patch files as an input or as an intermediate step.
You should either use LibGit2Sharp or call git to produce a diff.
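If you do take the LibGit2Sharp route, the call you already mentioned gives you the patch text directly. A minimal sketch, assuming a placeholder repository path:

using System;
using LibGit2Sharp;

class Program
{
    static void Main()
    {
        // Compare HEAD's tree against the index: the equivalent of
        // running `git diff --cached`.
        using (var repo = new Repository(@"C:\path\to\repo"))
        {
            Patch patch = repo.Diff.Compare<Patch>(
                repo.Head.Tip.Tree, DiffTargets.Index);
            Console.WriteLine(patch.Content); // the full diff patch text
        }
    }
}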
So I have a solution which contains 4 projects: a "Core" project, which is the actual application (as a class library), and 3 wrapper projects, "Console", "WinForm" and "Service". Each wrapper basically wraps a Facade class from the Core project, contains the settings for a different logging strategy (Console/Trace/File), and launches the application as either a console app, a WinForms app or a service, depending on how the customer wishes to deploy it.
In the Core project I have 3 resource files which contain simple template views for the Nancy web framework. However, Nancy looks for these views on the current path, and since the files in the Core project aren't on the current path for any of the 3 other projects, I need a simple way to access these files across projects.
Somewhat naively, I thought this was where the concept of a "Solution" came in: handling dependencies between projects. However, by searching the Internet, much to my surprise, it appears there is no elegant way to do this. The only two solutions I've been able to find involve either copying the files to a scratch/temporary directory in the solution and then copying them to the needed directories as post-build actions, or adding each item manually using "Add as Link". While both of these technically work, the first leaves (possibly out-of-date) build artefacts lying about where they don't really belong (IMHO), and the second is tedious, time-consuming and prone to human error (because you can't just link to a directory).
Are these really my only two options, or is there some third, totally obvious way I've just missed because I'm new to Visual Studio?
You could use a custom IRootPathProvider in Nancy, if the only things you need are Nancy-specific.
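For example, here is a minimal sketch of a root path provider that points Nancy at the directory containing the Core assembly instead of the host's current path (the class name is hypothetical):

using System.IO;
using System.Reflection;
using Nancy;

// Nancy resolves views relative to the path this returns.
public class CoreRootPathProvider : IRootPathProvider
{
    public string GetRootPath()
    {
        // Assumption: the view files are deployed alongside the
        // Core assembly, so all three wrapper hosts find them here.
        return Path.GetDirectoryName(
            Assembly.GetExecutingAssembly().Location);
    }
}

You would then register it by overriding the RootPathProvider property in your custom Nancy bootstrapper.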
The other option is to link a folder. You can do this, but it involves manually hacking on the csproj file; there are a few questions on here about it, including this one:
Visual Studio Linked Files Directory Structure
NuGet is a package management system that I have used to share artifacts between projects as dependencies. You could include libraries available via nuget.org or define your own NuGet packages.
TeamCity has good support for generating NuGet packages with every build and can serve as a NuGet server.
Here is a reference on including files in a NuGet package.
In a quest to reduce the dependencies in my projects, I now have everything depending on and implementing interfaces, glued together by an IoC container. This means projects only need direct references to the interface libraries.
However, if you don't give the project a reference to the implementation (even though you don't need it at compile time), the implementation libraries are not included with the executable or in the setup project.
Is Visual Studio in a way promoting bad practices by requiring explicit references when they are not needed? Is it possible to have dependencies only on the required interfaces, and in that case, what is the best method to get the implementation libraries available?
Is Visual Studio in a way promoting bad practices by requiring explicit references when they are not needed?
Not really. Your issue is one of deployment, not building.
Deploying an application and building it are separate things - if you have a good deployment strategy, you will be deploying implementations where they belong.
Is it possible to have dependencies only on the required interfaces, and in that case, what is the best method to get the implementation libraries available?
The easiest way is indeed to reference the implementation assemblies. This will definitely make building and running locally as easy as F5, but do you really want that? To be honest, if you and your team have the discipline to only code to interfaces, that's all you need (and there are static analysis tools like NDepend that help ensure that remains the case).
One way forward is to create a deployment script that will deploy all dependencies whether local or elsewhere.
Visual Studio does not require these references, but your IoC container does.
When you add a reference to a project, its binaries are automatically included in the output folder, which is what your IoC container needs to glue the code together. There are other ways to get these binaries into the output folder than referencing their projects in Visual Studio - perhaps a post-build step?
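To illustrate why it is the container, not the compiler, that needs the binaries, here is a hypothetical sketch (the assembly and type names are made up): the implementation is loaded by name at runtime, so MSBuild never knows it must be copied to the output folder.

using System;
using System.Reflection;

public interface ILogger
{
    void Log(string message);
}

public static class CompositionRoot
{
    public static ILogger ResolveLogger()
    {
        // The compiler never sees "Implementations.dll", so nothing
        // copies it to the output folder automatically. This throws
        // FileNotFoundException at runtime if the DLL was not deployed
        // next to the executable.
        Assembly impl = Assembly.Load("Implementations");
        Type type = impl.GetType("Implementations.FileLogger", throwOnError: true);
        return (ILogger)Activator.CreateInstance(type);
    }
}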
No. It is simply the minimum Visual Studio needs to do in order to give developers working code without any extra effort (aside from hitting F5). The alternative, adding all references by default, would likely be a mess and slow on older hard discs.
For local development builds, you can simply add a post-build step on the relevant project to copy the DLLs to the main working directory. Just make sure the project is added to the active configuration to be built; otherwise you'll go through a whole annoying debug session after the post-build only to realise there was no build... lol
VS 2010. Post-build. Copy files in to multiple directories/multiple output path
Copy bin files on to Physical file location on Post Build event in VS2010
For full-scale application deployment, you'd likely be looking at Visual Studio setup projects at a bare minimum, but ideally something like WiX or another deployment toolset.