I have a simple C# project which loads external C# files at startup to be used as scripts. Unfortunately, when editing any of these 'non-project' files in Visual Studio, I only get the most basic syntax highlighting, since classes and types within the project are not known in the context of the external file.
Without adding the files to my project (defeating the purpose of them being external scripts), is there any way I can define an external interface or somehow otherwise convince Visual Studio (2008) to parse the code within these files in the context of the classes in the project?
A couple of clarifications (with thanks to the early answerers)
People should be able to edit these scripts without access to my source code
People shouldn't have to set up an entire Visual Studio project to edit one source file that's likely to contain less than 10 lines of actual code.
You will always need a reference to those classes; Visual Studio needs that information to resolve them. Maybe you can add these files as a link to the project, or to a new project that has a reference.
I would think about the Bridge pattern; you would need to include the class body in the same file:
http://en.wikipedia.org/wiki/Bridge_pattern
Or use mock objects: they can easily provide syntax highlighting without sharing your code (the same idea here, all in one file):
http://en.wikipedia.org/wiki/Mock_object
You can separate the script and the assisting classes if you are willing to allow a project file.
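For example, a rough sketch of the stub/mock idea: ship a single file of stand-in declarations alongside the scripts so the editor can resolve the types without ever seeing the real implementation. All names below are invented for illustration; the host application would still compile the scripts against the real types.

    // ScriptHost.Stubs.cs - distributed with the scripts instead of the real source.
    // Hypothetical stand-ins exposing only the surface the scripts may use.
    public interface IScriptHost
    {
        void Log(string message);
        int PlayerCount { get; }
    }

    // Empty stand-in so member access still resolves in the editor;
    // the real class in the host application does the actual work.
    public class ScriptContext
    {
        public IScriptHost Host { get { return null; } }
    }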
Related
I have a rather odd scenario. I have an external library like the one below:
Now I need to use it in another solution, but in a rather unnatural way (it has to be like that :/). So it is included in the following fashion:
As you can see, we have a new project named ExternalLib (just like the one created before in the other solution). It contains only the .dll of that library, and we reference this project in Consumer (Add Reference -> Project -> ExternalLib). I also had to add one post-build action for this 'proxy' ExternalLib: copy /Y "$(ProjectDir)/ExternalLib.dll" "$(TargetDir)". It builds without errors; the only errors are those displayed by the Visual Studio text editor (as visible in the screenshot above).
My question: is there any way to make Visual Studio aware of the 'real' contents of that library? (I also do not have the .pdb files.) I know it is a really odd way of including an external library, but it is a constraint I can't avoid :/
So I'm having some trouble creating an item template and would like some help.
What I'm trying to do:
Create a template that adds three files: a Class.cs and two config files at "./Config/Acc/Config.xml" and "./Config/Prod/Config.xml".
I've managed to create the template through the wizard and by editing the resulting files, but I would like an easy method of distributing the template on my team's TFS.
From some googling it seems that I should use a VSIX project to deploy this easily. The problem is that I can't get it to compile. I have two projects: VSIXproject and ItemTemplateProject. I've set the assembly info on the VSIX project to use the ItemTemplateProject and I've modified my Class.cs, but when I compile, Visual Studio doesn't know how to handle the Class.cs file.
What am I doing wrong? Is there a better way of including my ItemTemplate so that anyone who pulls the repo can use it?
The template project (a project that only contains Visual Studio project templates or item templates) is only used to be able to work with templates. Its output (say [myproject].dll) is not important (the project type may be C# but it's irrelevant) and you will only distribute the .vsix file in the end.
Files (.vstemplate files and any other files) in this project are like "static" files. They will also ultimately be included into the .vsix file output during build.
So, to ensure these files (.cs or otherwise) stay "static", you must make sure their Build Action is set to None (for example), not Compile.
I am working on a conversion from AccuRev to TFS and am being blocked by AccuRev's use of symbolic links, which TFS does not support. I have tried several methods, but they all seem to fail.
What I would like to do is have a file, stored in source control, in the project/branch that lists all the linked files and folders. On every get operation, I would like to read this file and link the folders and files it specifies. However, I cannot find a way to extend the get operation. Does anyone have any experience extending it in VS?
TFS does not provide a way to extend what happens on a Get operation. You could easily create a custom PowerShell or batch script to use in place of calling tf.exe, but since Team Build and Visual Studio call into TFS directly using the Client Object Model, that approach won't get you very far.
In the end, everything is possible, of course. You could write a custom build action for Team Build to replace the standard get operation, or create one that triggers after the standard get operation has completed. You could write a VSIX Visual Studio extension that replaces the standard Get operation everywhere in the Visual Studio menus and get to something that could be considered workable. But I would not recommend this: it is far from standard and far from sustainable. You'd have to unwire so much default behavior in Visual Studio (checking out files that are changed, adding files to source control when they're added to the project file, etc.).
SourceSafe used to have this feature as well (it was called pinning), and Microsoft removed it when they created TFVC. They now recommend you use branching and merging to synchronize these files across multiple projects, making sure that the source structure in source control is the same as the one on disk during build.
You can also make use of the Add-as-Link option in your project files. This allows you to keep the original files in their original location, while MSBuild understands that in the project structure the file actually lives somewhere else. Or package the linked files up in a NuGet package and use the Dependency Management using NuGet guide to help you place the files in the right location during the build.
And finally, you can get very creative with Workspace Mappings. Many people never get further than mapping $/project -> $(SourceDir), but in essence a workspace mapping is like the file you describe: a way to lay out your sources from source control onto disk. You could do:
$/Project/DEV/MyProject -> $(SourceDir)\MyProject
$/Project/Shared/FilesToCopy -> $(SourceDir)\Shared
And you can even add files from other projects in the same collection:
$/AnotherProject/Shared -> $(SourceDir)\MoreShared
And something not many people know: you can map individual files:
$/AnotherProject/CompanyAssemblyInfoItems.cs -> $(SourceDir)\CompanyAssemblyInfoItems.cs
The only thing you cannot do is map files to be children of an already mapped folder. In that case you might need to have the workspace mapping fetch the sources, and then use a .targets file, included in your .csproj, to do the copying of files.
This thing I want to do might not even be worth doing but I thought it would be cool.
So what I want is some code that runs while my project is building (not only when compiling) and adds members to my classes based on things like attributes and general code analysis. The goal is dynamically generated fields/properties that are usable through IntelliSense but not visible in the actual source.
The reason being that I might want to generate a lot of them, and emitting them into the source would turn into a mess very quickly.
One possible use of a system like that would be things like vector swizzling.
Is there maybe a library of some sort that I could just plug my generation code into? If not, what would be the best way to approach this, if there is one?
The most visible example of this is done by Microsoft for XAML files. During the build, a C# source file is created for each XAML file and placed in the obj/Debug or obj/Release folder. In addition to that, the MSBuild .targets file where the relevant tasks are defined is specially configured to tell Visual Studio that the generated files are required for proper IntelliSense support, which means you don't actually have to explicitly build the project in order for IntelliSense to allow items declared in XAML to be used in C# code elsewhere in the project.
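As a rough illustration of what such a generated file could look like for the swizzling scenario in the question (the type names and file name are hypothetical, and the hand-written type would need to be declared partial), a build task might emit something like this into obj/Debug:

    // obj/Debug/Vector3.Swizzles.g.cs - hypothetical generator output, never checked in.
    // Assumes the project defines a partial Vector3 with X, Y, Z fields, plus Vector2/Vector3
    // constructors taking the component values.
    public partial struct Vector3
    {
        // Two generated swizzle properties; a real generator would emit many more.
        public Vector2 XY  { get { return new Vector2(X, Y); } }
        public Vector3 ZYX { get { return new Vector3(Z, Y, X); } }
    }

Because the .targets file tells Visual Studio that the generated file participates in IntelliSense, members like these show up in completion without ever appearing in the hand-written source.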
This is exactly the method I use for generating code for ANTLR grammar files during a build. You can see a complete example with a build task assembly and custom .targets file here:
https://github.com/antlr/antlrcs/tree/master/AntlrBuildTask
You should be aware that some 3rd party extensions for Visual Studio completely replace the IntelliSense support with their own implementation of code completion. Some of these extensions are known to not support the MSBuild IntelliSense extensibility features required for this to work with custom code generators. If you run into problems with IntelliSense and have any extensions installed, you may find that removing the extensions completely resolves the problems.
You can compile the code at runtime using the CSharpCodeProvider / ICodeCompiler / CompilerParameters classes.
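A minimal sketch of that approach (the script path and the set of referenced assemblies are placeholders you would adapt to your project):

    using System;
    using System.CodeDom.Compiler;
    using System.Reflection;
    using Microsoft.CSharp;

    static class ScriptLoader
    {
        // Compiles one external script file and returns the resulting in-memory assembly.
        public static Assembly CompileScript(string path)
        {
            var provider = new CSharpCodeProvider();
            var options = new CompilerParameters
            {
                GenerateInMemory = true,
                GenerateExecutable = false
            };
            options.ReferencedAssemblies.Add("System.dll");
            // Reference the host application itself so scripts can use its public types.
            options.ReferencedAssemblies.Add(Assembly.GetExecutingAssembly().Location);

            CompilerResults results = provider.CompileAssemblyFromFile(options, path);
            if (results.Errors.HasErrors)
                throw new InvalidOperationException(results.Errors[0].ToString());

            return results.CompiledAssembly;
        }
    }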
When using Visual Studio 2010, I open an .aspx and a .cs file to edit. (I have not created a project; I am simply opening one .aspx file and one .cs file from my web directory to edit.) IntelliSense will not detect System.Web or a large variety of other things; in fact, only basic resources seem available. Is there a way to correct this?
As you are not in a project, you lack much of the context that would permit full IntelliSense support. VS has no idea what assemblies are included, and does not have the imports from web.config.
Remember that IntelliSense tries to present only code completions that actually apply in the current build configuration. Without assemblies referenced, it can't guess that you have anything at all in, say, System.Web.
IntelliSense is pretty much based on the content of the "using" clauses at the beginning of your file. It matches what you have already typed against a list of possible members contained in the referenced assemblies.
For example, if you want IntelliSense to have access to the Convert class, you need a using directive for the System namespace (and a reference to the assembly that defines it). Without that, IntelliSense won't know the type exists.
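A trivial illustration of that point:

    using System;   // without this directive, the editor cannot offer completions for Convert

    class Example
    {
        static void Main()
        {
            int n = Convert.ToInt32("42");   // resolves because the System namespace is imported
            Console.WriteLine(n);
        }
    }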