Due to the wording (many meanings of "extension" and "method"), I am utterly unable to find any information about my question on the interwebs. So I am asking here:
Is it possible to deploy "extension methods" with a VSIX extension?
Clarification:
By "extension method" I mean something like
public static string SomeExtension(this string s, string p) {
    return s + p;
}
By VSIX extension I refer to a custom extension, that gets installed via a
SomeCoolExtension.vsix
The goal is:
A user installs the VSIX, gets a few features (mainly custom code generators in my case) and additionally has access to "Hello".SomeExtension(" World"); within their source code.
I am slowly thinking this just isn't possible, as I have tried everything I could come up with and, as stated in the beginning, it is virtually impossible to search for this on Google.
If it really is impossible, I would at least love to understand why.
So a simple "no" might be a valid answer, but if you could elaborate, that would put my coding soul to rest :-)
Specs: I am using VS2017 and the new "Visual Studio AsyncPackage", but if you know an answer for older versions, I would be happy to try them.
You can take either of two approaches:
An extension (.vsix) provides the greatest flexibility, because it can provide commands (buttons on menus, context menus and toolbars) that, on demand by the user, can 1) insert code in the active document, 2) add files with code or other assets, 3) add references to DLLs that the extension can deploy in the source folder of the project, etc. It can also do all of that not only on demand but automatically, by watching events and checking whether certain conditions are met: a solution is created or loaded, a project is added, the project already contains a given code file or not, it already has a given reference or not, and so on. Needless to say, all this flexibility comes with some complexity.
A NuGet package can add DLLs to the references or code files to a project, it can execute a PowerShell script when the package is added to a project to modify that project, and it can modify the build process by adding new MSBuild targets/tasks (the Microsoft.VSSDK.BuildTools NuGet package used to create VSIX extensions is the prime example). It is a one-time operation when the package is installed into a project; after that there are no events, no commands, etc., but for most scenarios it is much simpler.
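In either case, the extension methods themselves live in an ordinary class library (DLL) that your VSIX command or NuGet package adds as a reference to the user's project. A minimal sketch of such a library (the names are illustrative, not from the question):

```csharp
// MyCompany.Extensions/StringExtensions.cs
// Compiled into a class library; the VSIX command (or the NuGet package)
// then only needs to add a reference to the resulting DLL to the user's project.
namespace MyCompany.Extensions
{
    public static class StringExtensions
    {
        // Once the project references the DLL and the namespace is imported,
        // the user can write "Hello".SomeExtension(" World").
        public static string SomeExtension(this string s, string p)
        {
            return s + p;
        }
    }
}
```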
Ever since I've been using the (relatively) new .NET Standard Library project type in Visual Studio, I've been having some problems getting a complete set of DLL files that are required by my project.
The problem is usually limited to 3rd-party libraries which I reference as NuGet packages. I've noticed that these don't get copied to the output folder of my project when I build it. This didn't use to be the case in classic project types.
While I can appreciate the de-cluttering effect that this change has brought for .NET Standard projects, I'm now faced with a problem. I sometimes absolutely need to be able to get the entire list of all files that my project depends on!
I have several different cases, where I might require this list for one reason or another, but the one I believe is most crucial for me, is when I want to gather these files from the csproj itself, right after it's built. In there, I have a custom MSBuild <Target> which should take all the files from the output dir and zip them together for distribution. The problem is, I'm missing all the files that come from NuGet dependencies, because they're not there!
How can I solve this in a general (i.e. not project-specific) way?
UPDATE
There's this deps.json file that contains basically everything I'm after, and then some. It's just a matter of extracting the relevant information and finding the files in the local NuGet cache. But that would involve writing a specialized app and calling it from my target. Before I start writing one myself... is there something like this already out there somewhere?
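For illustration, a minimal sketch of what such a helper might look like, assuming the usual deps.json layout and the default NuGet cache location (this is hypothetical, not an existing tool):

```csharp
// ListDeps.cs - hypothetical helper, not an existing tool.
// Reads a *.deps.json file and prints, for each package dependency, the folder
// in the local NuGet cache where its files live. Assumes the default cache
// location (%USERPROFILE%\.nuget\packages) and the usual deps.json layout;
// the "targets" section would be needed to resolve the exact assembly names.
using System;
using System.IO;
using Newtonsoft.Json.Linq;

class ListDeps
{
    static void Main(string[] args)
    {
        var deps = JObject.Parse(File.ReadAllText(args[0]));
        var cache = Path.Combine(
            Environment.GetFolderPath(Environment.SpecialFolder.UserProfile),
            ".nuget", "packages");

        foreach (var lib in (JObject)deps["libraries"])
        {
            if ((string)lib.Value["type"] != "package")
                continue; // skip the project itself

            // "path" is typically "<packageid>/<version>", lower-cased
            var folder = Path.Combine(
                cache, ((string)lib.Value["path"]).Replace('/', Path.DirectorySeparatorChar));
            Console.WriteLine($"{lib.Key} -> {folder}");
        }
    }
}
```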
I followed this answer and it sort of works.
The suggestion was to include the following in my csproj:
<CopyLocalLockFileAssemblies>true</CopyLocalLockFileAssemblies>
My main concern is that it also outputs some other DLLs from the framework (such as System.Memory.dll and System.Buffers.dll, among others), which I didn't expect. But maybe that's a good thing. They do seem to be dependencies, just not direct ones. I'll see how it plays out.
If it turns out ok, my only wish would be that this directive was more prominently displayed in project settings (as a simple checkbox, maybe?) so I wouldn't have to hunt the web to find it.
How do I build sample code, split into folders in a repo, from a class or tutorial, in Visual Studio?
So - I'm pretty much a noob at C#, I've gone through a lot of tutorials and browsed through some large C# projects from work and built them, and done some other minor things. I'm going through a course on writing testable code on Pluralsight. He has a public Github repo for the code examples, writing-testable-code. I connected to the repo and downloaded it okay into a local Git repo. I was able to download all the packages from NuGet and they are all showing as the version he used (a few have updates, but I figured updating might break things).
I can't figure out how to run this code, build it, or run the tests in it.
What I tried so far
My issue is - I open the solution, and there are a bunch of files and folders - each module/chapter is split into folders (i.e. Module1/Easy, Module1/Hard, Module2/Easy, etc.). I want to build the Module1/Easy folder, including unit test examples, and run the tests.
When reviewing Module1/Easy, it has 3 files that should build okay - Program.cs has a Main() and looks like a console app, Calculator.cs has a simple class, and CalculatorTests.cs has unit tests built for NUnit. The solution has NUnit, Castle.Core, and things from later modules (Moq, AutoMoq, Unity, Ninject, etc.). It didn't seem to have a VS runner, so I added NUnit3TestAdapter - the instructor in the course has ReSharper installed, which I don't, and he was using the ReSharper test runner, which would explain why he didn't include it.
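For context, the test file follows the usual NUnit shape, roughly like this (a representative sketch, not the actual file from the repo - the Calculator.Add method is assumed):

```csharp
using NUnit.Framework;

[TestFixture]
public class CalculatorTests
{
    [Test]
    public void Add_ReturnsSumOfOperands()
    {
        var calculator = new Calculator();          // assumed simple class from Calculator.cs
        Assert.AreEqual(5, calculator.Add(2, 3));   // picked up by the NUnit3TestAdapter runner
    }
}
```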
I tried using "Set as Startup Item" on the "Module1/Easy/Program.cs" file, since it has a Main() and looks set up as a console app. However, when running it (the "Start" button turned into a "Program.cs" button), it fails, saying it can't run a DLL. The tests aren't showing up in the Test Explorer like they do for some other small projects I've built from examples.
What's the right way to do this?
I'm not sure where to go from here. On the Build menu there is only "Build Solution" and an option about Code Analysis - I'm used to a lot more options here. It feels like I have to turn this folder into a project, maybe? I can always reinstall the packages - but what is the best solution here?
I've run into this before on other book, tutorial, or class repos, but finally decided to figure out how to get this one working. I appreciate any help!
Notes
I'm running Visual Studio Community 2017 at the moment.
I can post some of the files, but the repo is publicly available, and I'm not sure exactly what to post to help.
Progress from comments and answers
Per Biker-Dude's answer, I switched the project to build a console app rather than a dll, and now I get a compile-time error for having multiple entry points (i.e. every module and sub folder has its own Main() function and should probably be a separate project).
After #1, if I remove all folders but one from the solution, it will then compile, run the tests, etc. - but I eventually want to be able to compile every sub-folder separately - what's the best way?
The problem must be that the project has its output type set to Class Library. Browse through the solution tree and:
Select your class's project > right click > Properties > Application > Output Type > Console Application / Windows Application.
This should fix it, if the other things are set up properly.
With the help of BikerDude's answer and stijn's comments, I was eventually able to play around with this and get some things working.
First of all, don't try to exclude any folders in this situation, that will just make things worse! They will still be in your underlying folder/repo, just won't be showing up in your solution anywhere, and you won't be able to create a new folder with the same name (weird decision...). And you'll have to add them back in as individual files - I think.
The best solution (so far)
The best solution seems to be:
Create a new project for each buildable set of files in the solution (in my case, at least one project per "module" folder). I used the ".NET Framework console app" project type (right click on the solution, use Add/New Project) to get things to work, but this would depend on the particular course or tutorial repo you downloaded.
Move the folder or sub-folder that has the files you want to build out of the main solution and into the new project - you can click and drag to move it.
Visual Studio will make an empty, pre-formatted file in your project that you will likely have to delete - for .NET console apps in C#, this is the "Program.cs" file. For one of my folders this file already existed, and I had to delete the new one in order to build. Another folder from a different module was set up more like a library and couldn't build standalone, but this procedure did get me to the point of being able to build the files, which allowed the unit tests to show up in the Test Explorer and run successfully (which was the main point of that module).
Go to the solution and right-click and choose "Manage Nuget Packages for Solution". As long as all the packages are installed for the main solution, then they will all show up in the list of Installed Packages (you might need to click on the "Installed" tab). You can click on each package in turn, then on the right you can checkmark the new project, and the "Install" button should be available - click it. Repeat for all the packages to install them all. Note that you can cut out some repetition here if you create all the projects you need first, then you can install all of them at the same time in this step (i.e. checkmark all the new projects at once instead of reopening the package manager each time).
You might have to fix the namespace - it should be consistent within the files/folders you transferred from the original solution, but if you add any new files to play with things, their namespace will likely not match, and to see the classes, etc. from the original files, you'll have to update the namespace of the new files.
Per BikerDude's answer - After transferring everything to new projects, if you keep anything in the original project that came with the solution, it might not be trying to build the right type of item. You may be able to fix that by right-clicking the project, selecting properties, and adjusting the "output type", but it may not have the options you need. If it doesn't, just create a new project with the right type and transfer the files as above.
After following the above steps, I was able to build each new project I created, using the original files from folders that I moved. Mainly I just needed to build, which enabled all the unit tests that this tutorial/class was focused on, but this allowed me to build the console apps as well, when present.
Thanks for the help from all in pointing me in the right direction!
I am working on a conversion from Accurev to TFS and am being blocked by Accurev's usage of symbolic links, which TFS does not work with. I have tried several methods, but they all seem to fail.
What I would like to do is have a file in the project/branch, stored in source control, that lists all the linked files and folders. On every get operation, I would like to read this file and link the folders and files specified in that central file. However, I cannot find a way to extend the get operation. Does anyone have any experience extending it in VS?
TFS does not provide a way to extend what happens on a Get operation. You could easily create a custom PowerShell or batch file that you use in place of calling tf.exe, but since Team Build and Visual Studio call into TFS directly using the Client Object Model, you're not going to make this easy on yourself.
In the end everything is possible, of course. You could write a custom build action for Team Build to replace the standard get operation, or create one that triggers after the standard get operation has completed. You could write a VSIX Visual Studio extension that replaces the standard Get operation everywhere in Visual Studio's menus and get to something that could be considered workable. But I would not recommend this. It is far from standard and it is far from sustainable. You'd have to unwire so much default behavior in Visual Studio (checking out files when they're changed, adding files to source control when they're added to the project file, etc.).
SourceSafe used to have this feature as well (it was called pinning) and Microsoft removed it when they created TFVC. They now recommend you use branching and merging to synchronize these files across multiple projects, making sure that the source structure in Source control is the same as the ones on disk during build.
You can also make use of the Add-as-Link option in your project files. This allows you to keep the original files in their original location, but MsBuild will understand that in the project structure this file actually lives somewhere else. Or package the linked files up in a NuGet package and use the Dependency Management using NuGet guide to help you place the files in the right location during build.
And finally, you can get very creative using Workspace Mappings. Many people never get further than mapping $/project -> $(SourceDir), but in essence the workspace mapping is like the file you describe: a way to lay out your sources from source control onto disk. You could do:
$/Project/DEV/MyProject -> $(SourceDir)\MyProject
$/Project/Shared/FilesToCopy -> $(SourceDir)\Shared
And you can even add files from other projects in the same collection:
$/AnotherProject/Shared -> $(SourceDir)\MoreShared
And something not many people know: you can map individual files:
$/AnotherProject/CompanyAssemblyInfoItems.cs -> $(SourceDir)\CompanyAssemblyInfoItems.cs
The only thing you cannot do is map files to be children of an already mapped folder. In that case you might need to have the workspace mapping do the fetching of the sources, and then a .targets file that you include in your .csproj to do the copying of the files.
This thing I want to do might not even be worth doing but I thought it would be cool.
So what I want to do is to have some code that runs when my project is building (not only when compiling), and adds stuff to my classes based on things like attributes and general code analysis. What I want to do is have dynamically generated fields/properties that are usable through intellisense, but not visible in the actual source.
The reason for that being that I might potentially want to generate a lot of them, and outputting them to source would turn into a mess very quickly.
A system like that would open up possibilities such as vector swizzling.
Is there maybe a library of some sort for this that I could just plug my generation code into? If not, what would be the best way to approach this, if there is one?
The most visible example of this is done by Microsoft for XAML files. During the build, a C# source file is created for each XAML file and placed in the obj/Debug or obj/Release folder. In addition to that, the MSBuild .targets file where the relevant tasks are defined is specially configured to tell Visual Studio that the generated files are required for proper IntelliSense support, which means you don't actually have to explicitly build the project in order for IntelliSense to allow items declared in XAML to be used in C# code elsewhere in the project.
This is exactly the method I use for generating code for ANTLR grammar files during a build. You can see a complete example with a build task assembly and custom .targets file here:
https://github.com/antlr/antlrcs/tree/master/AntlrBuildTask
You should be aware that some 3rd party extensions for Visual Studio completely replace the IntelliSense support with their own implementation of code completion. Some of these extensions are known to not support the MSBuild IntelliSense extensibility features required for this to work with custom code generators. If you run into problems with IntelliSense and have any extensions installed, you may find that removing the extensions completely resolves the problems.
You could compile code at run time using the CSharpCodeProvider/ICodeCompiler/CompilerParameters classes.
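For what it's worth, here is a minimal sketch of that run-time compilation approach (note that because the code is compiled while the application is running, it won't give you design-time IntelliSense for the generated members):

```csharp
using System;
using System.CodeDom.Compiler;
using Microsoft.CSharp;

class RuntimeCompileDemo
{
    static void Main()
    {
        // Source generated at run time; in practice this would come from
        // your attribute/code-analysis-driven generator.
        var source = @"
            public static class Generated
            {
                public static int Answer() { return 42; }
            }";

        using (var provider = new CSharpCodeProvider())
        {
            var parameters = new CompilerParameters
            {
                GenerateInMemory = true,
                GenerateExecutable = false
            };
            parameters.ReferencedAssemblies.Add("System.dll");

            CompilerResults results = provider.CompileAssemblyFromSource(parameters, source);
            if (results.Errors.HasErrors)
                throw new InvalidOperationException("Compilation failed.");

            // Invoke the freshly compiled member via reflection.
            var type = results.CompiledAssembly.GetType("Generated");
            Console.WriteLine(type.GetMethod("Answer").Invoke(null, null)); // prints 42
        }
    }
}
```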
Are there any good programs out there to compare two compiled .NET assemblies?
For example, I have HelloWorld.dll (1.0.0.0) and HelloWorld.dll (2.0.0.0) and I want to compare the differences. How can I do this?
I know I can use .NET Reflector and use the Assembly Diff plugin. Are there any other good tools out there to do this?
Ways to Compare .NET Assemblies suggests
Commercial:
NDepend
Free:
JustAssembly (only shows differences in API)
BitDiffer (same)
Reflector Diff Add-in (which you've already discovered, but not available anymore)
Existing compare tools like Beyond Compare (commercial) can do this by special configuration. Here's how to do this for Beyond Compare:
Go to Tools → Options
Click New.., select "Text format", click OK
Give it a name (say, EXE, or DLL), and specify the mask as *.exe or *.dll
Click on tab Conversion and select "External program (Unicode filenames)"
Under "Loading", specify the path to ildasm and add %s /OUT:%t /NOBAR (i.e.: C:\Program Files (x86)\Microsoft SDKs\Windows\v10.0A\bin\NETFX 4.8 Tools\ildasm.exe %s /OUT:%t /NOBAR)
Make sure to check "Disable editing".
Click Save, then Close
Restart BC and open two exe files to compare; it should disassemble them into IL automatically now.
You can also add syntax highlighting to this new format. I plan to send the syntax file to them so that it'll become available to share.
Two ways.
You could run ildasm on both assemblies and diff the output with classic tools.
Or you could use NDepend, which charges for that feature.
I just had to compare two revisions of the same DLL, which had the same version (I needed to implement a small hotfix and deploy the DLL to production, but I wanted to make sure that no other changes had leaked into the code). Ideally, I would want the Assembly Diff add-in to show me the difference, but it does not work (it thinks that I'm comparing a DLL to itself). So this is what I did:
Created two folders to hold disassembled source files.
Used Reflector's Export option (from the context menu) to generate source files from each DLL into the folders created in the previous step.
Used the free DiffMerge tool to compare two directories; the tools showed me the modified files and the difference.
It's a bit kludgy, but seems to work. I wish the Assembly Diff add-in worked, though.
UPDATE: The latest version of the Assembly Diff add-in is supposed to fix the issue of comparing two versions of the same assembly. Give it a try.
The tool NDepend offers many features to compare compiled .NET assemblies.
First from the NDepend Start Page click: Compare 2 versions of a code base. This will let you provide older and newer versions of your assemblies.
Then, after NDepend has analyzed both the older and newer assemblies, you can use the Search by Change panel. It is dedicated to browsing the assemblies' code diff. Notice that:
If source code is available, just right click an element and click Diff Source. In the NDepend options you can plug any code diff tool into NDepend (Visual Studio, Beyond Compare...).
If you don't have the source code and only the raw assemblies, there is the option Compare older and newer version disassembled with ILSpy. ILSpy v7.0 and later versions are supported. This menu works at the assembly, namespace, type and method level, and you can choose to decompile to C# or IL.
Notice also in the screenshot that a CQLinq code query is generated to browse the diff.
from m in Application.Methods
where m.CodeWasChanged()
select new { m, m.NbLinesOfCode }
Many other diff queries and rules are proposed by default that will let you browse the .NET code diff in a smart way, for example:
Types that used to be 100% covered but not anymore
API Breaking Changes: Methods
Avoid making complex methods even more complex
Avoid decreasing code coverage by tests of types
From now, all types added or refactored should respect basic quality principles
Avoid transforming an immutable type into a mutable one
Heuristic to find types moved from one namespace or assembly to another
Disclaimer: I am one of the developers of the tool.
One more option is LibCheck from Microsoft.
It's a pretty old console tool for just getting a public API diff. I could not run it without debugging and retargeting it to a more recent .NET version. However, it gave me very clear output and I am going to use it later.
Here is an article with screenshots.
Here's a thinking-outside-the-box approach which works fine.
Dump your old and new assemblies with dnSpy, dotPeek or JustDecompile into projects.
Create a new Git repo and commit the old assembly code first.
In your local repo folder delete all the files/folders except for ".git" and paste the new assembly files.
Either commit the new changes and view the changes on, say, GitHub, or use a Git viewer like Fork. Easy code comparison for free.
Java has a nice one: Semantic Diff Utilities