I've worked out that in my C#/F# code I can load information about any .NET project using
collection.LoadProject(path_to_my_proj_file)
where collection is of type ProjectCollection. I can then get access to all the properties and items defined in the project and all of its dependencies. As an example, I can get access to all the files included via Compile in the following way:
project.GetItems "Compile"
Let's assume I want to define a custom ItemGroup in my fsproj file:
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
...
<ItemGroup>
<MyGroup Include="Test.txt" />
<MyGroup Include="TestFiles\**\*" />
</ItemGroup>
</Project>
Some comments:
MyGroup is my custom item identifier. I've found out that things like this are allowed
The "TestFiles" folder contains 4 files altogether, spread across subfolders, and they all match the pattern "TestFiles\**\*"
When I load the project using the method I mentioned at the beginning and run
project.GetItems "MyGroup"
I get only one item, that is "Test.txt". The other files don't get discovered, unless I define them explicitly (i.e. without the wildcard) in the fsproj.
Is there a way for me to discover the files included using wildcards as well? I'm even happy to get them in the unresolved form. So getting "TestFiles\**\*" instead of the specific files that match the pattern is fine as well.
Found the solution.
First, let me say that the problem was on my side. In general, when you load a project, it DOES also load all the files, even those defined with a wildcard, as long as they of course match the pattern.
And here's the deal: in my application I don't use the plain collection.LoadProject, but rather a more sophisticated library. And yes - the library does some magic stuff that caused the problem.
In this situation the problem was simple: all the *.[cs|fs]proj files were copied to C:\temp\<random_folder_name>. Only *.[cs|fs]proj files - nothing less, nothing more. As a result, when the project loader attempted to evaluate the solution, there were simply no files to pattern-match, resulting in zero elements.
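For completeness, a small sketch (reusing the project object from the snippet further up) that prints both the resolved and the unresolved form of each item; the UnevaluatedInclude property is what still contains the raw "TestFiles\**\*" text:
// Evaluated vs. unevaluated includes of the custom item group.
foreach (ProjectItem item in project.GetItems("MyGroup"))
{
    // prints e.g. "TestFiles\sub\a.txt  (from TestFiles\**\*)"
    System.Console.WriteLine($"{item.EvaluatedInclude}  (from {item.UnevaluatedInclude})");
}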
The lesson here is: writing a proper MCVE is important; it would have saved me (and probably you as well) some time. Apologies for the trouble, guys!
I'm trying to set AppendTargetFrameworkToOutputPath for a large number of projects, but depending on the TargetFramework.
So basically I have
<PropertyGroup Condition="'$(TargetFramework)'=='net48'">
  <AppendTargetFrameworkToOutputPath>false</AppendTargetFrameworkToOutputPath>
  <AppendRuntimeIdentifierToOutputPath>false</AppendRuntimeIdentifierToOutputPath>
</PropertyGroup>
and have to figure out where to put it so I don't have to duplicate code in all projects.
What I've figured out so far:
Directory.Build.props: If I don't specify a condition (Condition="'$(TargetFramework)'=='net48'"), it works reliably. If I limit it to a specific TargetFramework, it only works when the project is multi-targeting. I.e. <TargetFrameworks>net48;net5.0</TargetFrameworks> works fine, but for <TargetFramework>net48</TargetFramework> the condition evaluates to false - not too surprising, since the Build.props file is evaluated before the TargetFramework is set in the csproj (also not too surprising that multi-targeting works, given how this is implemented).
Directory.Build.targets: While this would solve the problem with the TargetFramework not being set in the csproj, it seems like this is too late in the evaluation process and the output path is already set.
Is there any way around this apart from requiring all projects to use <TargetFrameworks>? (which would be an incredibly fragile solution).
Note that this is NOT a duplicate, since the user there imports the common file manually in the csproj, which gives much finer control over when the definitions are imported.
You can use the <BeforeTargetFrameworkInferenceTargets> property to specify the path to a targets file that is imported at the right point during evaluation, when the TargetFramework is already known.
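For example (a hedged sketch; the file name TargetFramework.targets is my own choice), Directory.Build.props can point the property at a shared file:
<Project>
  <PropertyGroup>
    <BeforeTargetFrameworkInferenceTargets>$(MSBuildThisFileDirectory)TargetFramework.targets</BeforeTargetFrameworkInferenceTargets>
  </PropertyGroup>
</Project>
TargetFramework.targets then holds the conditional block from the question, which is now evaluated after TargetFramework has been set:
<Project>
  <PropertyGroup Condition="'$(TargetFramework)' == 'net48'">
    <AppendTargetFrameworkToOutputPath>false</AppendTargetFrameworkToOutputPath>
    <AppendRuntimeIdentifierToOutputPath>false</AppendRuntimeIdentifierToOutputPath>
  </PropertyGroup>
</Project>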
I work frequently in multiple EF Core projects across multiple solutions. It's getting very frustrating seeing IDE0058 analysis hints everywhere whenever I'm saving a DbContext.
From what I can gather, suppressing this code style violation requires at least one of the following changes:
Adding a local discard for every call to database.SaveChangesAsync (looks terrible; see the sketch after this list)
Adding a System.Diagnostics.CodeAnalysis.SuppressMessage annotation on a per-method basis (again not ideal)
Adding a GlobalSuppressions.cs file to each project in the solution (really not ideal either)
Adding a .editorconfig file to each project to configure this violation. None of the projects I work with use editorconfig files.
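To illustrate the first two options, here is a rough sketch (OrderService and the injected DbContext are made-up stand-ins, not code from my projects):
using System.Diagnostics.CodeAnalysis;
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore;

public class OrderService
{
    private readonly DbContext _db;

    public OrderService(DbContext db) => _db = db;

    // Option 1: discard the return value so IDE0058 has nothing to flag.
    public async Task SaveWithDiscardAsync() => _ = await _db.SaveChangesAsync();

    // Option 2: suppress the rule for a single method.
    [SuppressMessage("Style", "IDE0058:Expression value is never used",
        Justification = "The number of affected rows is not needed here.")]
    public async Task SaveWithAttributeAsync() => await _db.SaveChangesAsync();
}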
For code review reasons, I can't just keep adding irrelevant files/changes like this whenever I work on a different project.
The thing that gets me is that I swear this is a recent issue. I've been working in EF Core for years up until now and this has not been an issue.
Further to this, a Roslyn team member commented on GitHub saying it has "no UI impact" and "is hidden by default" (clearly not the case here). There appears to be no way of "resetting" this to the default value, as the linked comment suggests, either.
Is there any way to suppress this violation, once and for all, across every project and solution that I work on?
Yes, you can create a global Analyzers.ruleset file and add the rule ids you wish to ignore.
In the .csproj file of each project, you will have to add a <CodeAnalysisRuleSet> property to a PropertyGroup and point it at the ruleset file.
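Something along these lines should do (the relative path is just an example):
<PropertyGroup>
  <CodeAnalysisRuleSet>..\Analyzers.ruleset</CodeAnalysisRuleSet>
</PropertyGroup>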
The ruleset file is an XML file and this is an example of what you can do:
<?xml version="1.0" encoding="utf-8"?>
<RuleSet Name="Rules for Core Application" Description="Custom Rules" ToolsVersion="16.0">
<Rules AnalyzerId="Microsoft.CodeAnalysis.CSharp.Features" RuleNamespace="Microsoft.CodeAnalysis.CSharp.Features">
<Rule Id="IDE0058" Action="None" />
</Rules>
</RuleSet>
Opening the ruleset file in Visual Studio shows the full list of rules, and you can decide for each one if it is a warning/message/error/hidden/info/none.
Update:
One more way would be to add a .editorconfig file and set the rule's severity there:
dotnet_diagnostic.IDE0058.severity = none
Ever since I've been using the (relatively) new .NET Standard Library project type in Visual Studio, I've been having some problems getting a complete set of DLL files that are required by my project.
The problem is usually limited to 3rd-party libraries which I reference as NuGet packages. I've noticed that these don't get copied to the output folder of my project when I build it. This didn't use to be the case in classic project types.
While I can appreciate the de-cluttering effect that this change has brought for .NET Standard projects, I'm now faced with a problem. I sometimes absolutely need to be able to get the entire list of all files that my project depends on!
I have several different cases, where I might require this list for one reason or another, but the one I believe is most crucial for me, is when I want to gather these files from the csproj itself, right after it's built. In there, I have a custom MSBuild <Target> which should take all the files from the output dir and zip them together for distribution. The problem is, I'm missing all the files that come from NuGet dependencies, because they're not there!
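For context, the kind of target I mean looks roughly like this (only a sketch; ZipDirectory needs MSBuild 15.8+, and the target and folder names are made up):
<Target Name="ZipOutput" AfterTargets="Build">
  <MakeDir Directories="$(MSBuildProjectDirectory)\dist" />
  <ZipDirectory SourceDirectory="$(OutputPath)"
                DestinationFile="$(MSBuildProjectDirectory)\dist\$(MSBuildProjectName).zip"
                Overwrite="true" />
</Target>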
How can I solve this in a general (i.e. not project-specific) way?
UPDATE
There's this deps.json file that contains basically all I'm after and then some. It's just a matter of extracting the relevant information and finding the files in the local NuGet cache. But that would involve writing a specialized app and calling it from my target. Before I start writing one myself... Is there something like this already out there somewhere?
I followed this answer and it sort of works.
The suggested thing was to include the following in my csproj:
<CopyLocalLockFileAssemblies>true</CopyLocalLockFileAssemblies>
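For the record, it just sits in a plain PropertyGroup; my assumption (not part of the answer I followed) is that dropping the same PropertyGroup into a Directory.Build.props next to the solution would apply it to every project at once:
<PropertyGroup>
  <CopyLocalLockFileAssemblies>true</CopyLocalLockFileAssemblies>
</PropertyGroup>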
My main concern is that it also outputs some other DLLs from the framework (such as System.Memory.dll and System.Buffers.dll, among others), which I didn't expect. But maybe that's a good thing. They do seem to be dependencies, just not direct ones. I'll see how it plays out.
If it turns out ok, my only wish would be that this directive was more prominently displayed in project settings (as a simple checkbox, maybe?) so I wouldn't have to hunt the web to find it.
I'm currently developing an application which requires external DLLs that I do not control. I'd like to add documentation for these classes, so that others can understand why I'm making certain calls to these external DLL files.
Adding the external DLL files to the documentation sources does indeed pick up the classes, but all the summaries and other information is unavailable. Is it possible to document these files (preferably without having to decompile/recreate the assembly as a project), so I can generate the related HTML documentation with Sandcastle?
I've tried keeping Sandcastle's working directory enabled, to see if the .xml files (which I can see were copied over from my other projects) were somehow generated and placed in this directory. This doesn't seem to be the case: no files were generated, and it goes straight to generating the HTML files.
As far as I understand your question about creating documentation for an external DLL, I see two possible ways you can go:
Add "missing" notes for all items of the external DLL that you might want to document, and/or
add conceptual topics to your own program documentation.
My sample solution's WindowsApplication2 project has a form to add two values using a simple PDUNZDLL. A Sandcastle Help File Builder project "Documentation1" was added, with two Documentation Sources (at this stage without an XML comments file, see first snapshot below). You know - a DLL without an XML comments file results in red "missing summary" notes.
Proposed solution (1):
Create a blank XML comments file like the following and name it after the assembly with a .xml extension, e.g. PDUNZDLL.xml:
<?xml version="1.0"?>
<doc>
<assembly>
<name>PDUNZDLL</name>
</assembly>
<members>
</members>
</doc>
Save this file to the output folder, e.g. the Debug folder D:\Visual-Studio-2015\Projects\WindowsApplication2\WindowsApplication2\bin\Debug
Double-click the "Project Properties" (see second snapshot below)
In the Component Configurations dialog, add the "IntelliSense Component" to the project.
Select "Missing Tags" and set the project's Show Missing Tags properties to your liking. This will force the build to add "missing" notes for all items that you might want to document.
Build the project and you will find a new XML comments file named after the assembly in the project's output folder e.g. D:\Visual-Studio-2015\Projects\WindowsApplication2\Documentation1\Help
Edit the <member> elements in the XML comments file to add the comments that you want for each member, as shown in the second snapshot (a sketch of such an entry follows these steps).
When you are done, replace your original placeholder file, e.g. D:\Visual-Studio-2015\Projects\WindowsApplication2\WindowsApplication2\bin\Debug\PDUNZDLL.xml, with the one generated by the build containing your edited comments. Rebuild your documentation project.
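An edited entry could look roughly like this (the member name is hypothetical, since the real types inside PDUNZDLL are not shown here):
<member name="M:PDUNZDLL.Calculator.Add(System.Int32,System.Int32)">
  <summary>Adds the two values entered on the form and returns the sum.</summary>
</member>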
This is of course time consuming, as all help authoring is. And it really was the task of the DLL developer in the first place.
Proposed solution (2):
Reading between the lines of "... so that others can understand why I'm making certain calls to these external DLL files" leads me to suggest adding supplementary documentation to your own program.
So, what I mean is to add conceptual topics describing how you call the features of the external DLL files.
I removed the DLL under "Documentation Sources",
added a new folder "ExternalDLL",
added new Conceptual and Walkthrough items,
double-clicked ContentLayout.content in Solution Explorer,
and did all the steps for the content layout, then rebuilt the documentation project, resulting in a help file like the one shown in the third snapshot below (see background info too).
Happy help authoring!
I have a solution which is built for several customers, and I need to be able to specify different xml files for each customer. How can I do this automatically? I was thinking it might be done with different configurations, but can't seem to figure out how.
Any suggestions?
EDIT:
This is the code used for declaring the xml file right now:
protected readonly static string XML_PATH = @"Resources/xml/Description.xml";
And the way it is solved now is to manually copy the correct file over Description.xml before building. This is of course error prone, and I would like to automate it, preferably based on the configuration. I'm looking for a quick fix right now, as unfortunately we haven't got the time to refactor the code.
Build-configuration-dependent config files are a tricky issue, and there are multiple ways to solve it.
If you want to go down the road you outlined, you would need to manually edit the *.csproj file and add a conditional ItemGroup to include the correct xml file. Something like this should do:
<ItemGroup Condition="'${Configuration}' == 'DEBUG'">
<Content Include="blablabl.xml"/>
</ItemGroup>
I don't remember if Content is the right item type; simply check what ItemGroup your current .xml files are in and use that.
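Extending that idea, here is a hedged sketch that assumes you create one build configuration per customer (the configuration names and folder layout are made up):
<ItemGroup Condition="'$(Configuration)' == 'CustomerA'">
  <Content Include="Resources\xml\CustomerA\Description.xml" />
</ItemGroup>
<ItemGroup Condition="'$(Configuration)' == 'CustomerB'">
  <Content Include="Resources\xml\CustomerB\Description.xml" />
</ItemGroup>
You would still need to make sure the selected file ends up at the Resources/xml/Description.xml path your code expects, for example via a copy step in the build.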
Based on your reformulated question:
You could use conditional compilation (caveat: It's messy and not the right way to manage config files!):
protected readonly static string XML_PATH =
#if DEBUG
    @"Resources/xml/Description.xml";
#else
    @"Resources/xml/Description2.xml";
#endif
If you want to read up on better techniques for managing config files, this is worth a read.
Now, I know self-promotion is frowned upon, but in this case I hope it's OK as it sounds relevant to the question, and I don't gain anything from this.
Recently I wrote a couple of blog posts on how to target multiple environments/machines:
Targeting multiple environments and machines - part 1/2
Targeting multiple environments and machines – part 2/2
As I understand it, the problem in this case is how to automatically build the correct set of files without having to manually figure out which files belong to which customer/environment. The solution I propose in the blog posts suggests using nAnt along with some extensions built on top of it. nAnt is the .NET version of Ant, a build tool that lets you generate e.g. xml files from a given set of input files, allowing you, for example, to generate a customer-specific web.config file.
In the following appSettings section of the web.config file, say you want to specify a different value for the CustomerName key for each customer:
<appSettings>
  <add key="CustomerName" value="${CustomerName}"/>
</appSettings>
Instead of specifying a value for the CustomerName key, you define a property called CustomerName. Now, assuming we are using nAnt, you create another customer specific file with the following content:
<?xml version="1.0" encoding="utf-8"?>
<target xmlns="http://nant.sf.net/release/0.86-beta1/nant.xsd">
<property name="CustomerName" value="Acme Incorporated"/>
</target>
nAnt can then merge these two files and automatically build customer/environment specific files for you.
The solution I go through allows you to automatically build environment- and machine-specific files, such as the web.config file, but also lets you output static files such as license files or libraries, all depending on which environment/machine you are targeting. I also supply a sample Visual Studio 2010 solution that shows a very basic example of how to do it, which you can download here.
You can of course just go ahead and take a look at nAnt, but I thought I'd provide you with the option to use my solution.