I am a software developer working on a webshop. We are using nopCommerce 3.30 with custom plugins, and the whole project is checked in to a TFS server (Visual Studio 2013, Team Foundation Server 2012).
The problem is the following:
nopCommerce 3.40 was released and we downloaded the ZIP with the source code, but I am not sure how to compare the differences and check in the new version. I can't just replace all the files, because I need to compare the folder structure and delete the files and folders that are not in the 3.40 version.
Is there a compare function between two projects, on either the client or the server side?
If you're using local workspaces, you can just delete all the code in your workspace, then copy the new code into the same workspace folder. Then examine the pending changes window. TFS will automatically detect all add/deletes/edits (you may have to promote some of the changes from the Excluded Changes section of the Pending Changes window).
Extract the files to a local folder; you can then use the compare tool.
Map one side to the source location of the original package and the other side to your local directory. This will show you differences in the folder structure, file names, etc. You can then drill down and repeat the same comparison at the file level.
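If you prefer to script the structural comparison yourself, Python's standard filecmp module can report entries that exist on only one side and files whose content differs. A minimal sketch; the nop330/nop340 folder names and the fixture files are placeholders standing in for the two source trees:

```python
import filecmp
import tempfile
from pathlib import Path

def report_diff(old_dir, new_dir, prefix=""):
    """Recursively list entries unique to each side and files whose content differs."""
    cmp = filecmp.dircmp(old_dir, new_dir)
    lines = [f"only in old: {prefix}{n}" for n in sorted(cmp.left_only)]
    lines += [f"only in new: {prefix}{n}" for n in sorted(cmp.right_only)]
    lines += [f"changed: {prefix}{n}" for n in sorted(cmp.diff_files)]
    for name, sub in sorted(cmp.subdirs.items()):
        lines += report_diff(sub.left, sub.right, f"{prefix}{name}/")
    return lines

# Tiny throwaway fixture standing in for the 3.30 and 3.40 source trees.
root = Path(tempfile.mkdtemp())
old, new = root / "nop330", root / "nop340"
(old / "Plugins").mkdir(parents=True)
(new / "Plugins").mkdir(parents=True)
(old / "Plugins" / "Removed.cs").write_text("only in the 3.30 tree")
(new / "Plugins" / "Added.cs").write_text("new in 3.40")
(old / "web.config").write_text("nopCommerce 3.30 configuration")
(new / "web.config").write_text("3.40")

for line in report_diff(old, new):
    print(line)
```

The "only in old" entries are exactly the files and folders you would need to delete from the workspace before checking in the new version.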
I'm working on a UWP app using Visual Studio 2019 Community Edition and added a directory with lots of files as assets. Those files need to be deployed with the application while preserving their directory structure; they are not language- or screen-related resources, but YAML files for a different purpose.
What I did was simply drag and drop the folder of interest onto the Assets folder in the VS GUI, which made the folder available in the project. The problem is that afterwards one needs to set the build options for all those added files. While one can select all the files and set their build options at once, this is still tedious with lots of files in lots of directories.
Making things worse, my added assets change from time to time, folders and files get removed entirely, renamed, new ones added etc. I don't want to manually keep track of those changes, so the only solution I currently have would be to remove my assets from the project, add them again and once again mark all files in all folders and set their build options.
So, instead of marking individual files, can build options be set per-directory for all files underneath?
Any other ideas on how to better handle my use case?
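One common approach, sketched here without having verified it against your exact project layout, is to edit the .csproj by hand and include the whole directory with a wildcard; the Assets path and the .yaml extension below are placeholders for your setup:

```xml
<ItemGroup>
  <!-- Pull in every YAML file under Assets, preserving the folder structure,
       and set the build action / copy behavior once for all of them. -->
  <Content Include="Assets\**\*.yaml">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </Content>
</ItemGroup>
```

With a wildcard include, files added, renamed, or removed on disk are picked up on the next build without re-adding anything in the IDE; be aware, though, that Visual Studio may expand the glob into individual entries if you later manipulate those files through the GUI.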
I have two projects, A and B. A is a creator and B is a consumer of what A creates. Each project needs to use a global.txt file which holds configuration. Each project has its own solution in TFS. Is there a way to link global.txt from a location in TFS into each project, so that if someone edits global.txt and then rebuilds project A, the build will contain the updated file without anyone having to update it manually?
When you configure the source repositories in the build definitions for the two solutions separately, you can include a mapping for the global.txt file in both.
At the beginning of the build process, the agent downloads files from your remote repository (on TFS) into a local sources directory (on the build agent).
Then set a CI trigger: the builds will run whenever changes to global.txt are checked in.
I have a C# web application that I have developed using Visual Studio 2010 and commit changes to a VisualSVN repository using the AnkhSVN plugin for Visual Studio.
I have created a Jenkins project that checks the repository every five minutes for new commits and then builds the web application, using the MSBuild plugin, if it sees a change.
This is working fine, however it is building the application to C:\Program Files (x86)\Jenkins\workspace\[Jenkins Project Name]\[Web Application Name] and I would like it to build to D:\Web\[Web Application Name] as this is the directory my IIS site is pointing at. (Both locations are on the same server)
Is there a setting in the Jenkins project where I can change this or do I have to add a build step that copies to a different location using a batch command or something similar?
Many thanks in advance.
You can specify a custom workspace for the Jenkins job to run in.
In your Jenkins job, look for a button on the right-hand side labeled Advanced. On Jenkins 2.46.1 it is at the bottom of the General section, just before the SCM section of the job configuration. Click it and a new set of options will appear, one of them being Use custom workspace. Check the box and enter the path to the folder you want to use. Make sure the Jenkins user has permissions on this folder, or bad things might happen.
Note that this will perform the entire build in this folder, so anything else that is also in the workspace folder for that build job will be in the new location.
If you want just the output files without all the source and other artifacts, you will indeed have to add another build step (a batch command is one option) to copy the relevant files from your build job's workspace (which can be accessed in batch via the WORKSPACE environment variable Jenkins defines for the job) to the desired destination folder.
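As a sketch of that extra copy step. On a Windows server you would use an "Execute Windows batch command" step with xcopy or robocopy and the real D:\Web target; the portable shell version below uses hard-coded demo paths and a placeholder app name so it can run anywhere:

```shell
# Demo stand-in for a finished build inside the Jenkins workspace; in a real
# build step, Jenkins sets $WORKSPACE for you and the target is the IIS folder.
WORKSPACE="/tmp/jenkins-demo/workspace/job"
APP_NAME="MyWebApp"                 # placeholder for [Web Application Name]
DEST="/tmp/jenkins-demo/web"        # stands in for D:\Web on the real server
mkdir -p "$WORKSPACE/$APP_NAME"
printf '<html></html>\n' > "$WORKSPACE/$APP_NAME/index.html"

# The actual post-build copy step: mirror the app folder into the site root.
mkdir -p "$DEST/$APP_NAME"
cp -R "$WORKSPACE/$APP_NAME/." "$DEST/$APP_NAME/"
```

The equivalent Windows batch command would be something along the lines of xcopy "%WORKSPACE%\MyWebApp" "D:\Web\MyWebApp" /E /Y /I, again with the app name as a placeholder.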
I am working on a project that is bound to TFS; this project contains a Log folder.
When I execute my project and do some testing, it generates text files and stores them in the Log folder.
After that, when I try to check in, those files show up as new files to check in to TFS.
I want TFS to exclude these files. The folders where I store temp files are:
\WebPages\ErrorLogs
\WebPages\TempReports
Is there a setting in TFS that tells it not to include these folders, while still getting the latest version from the server if any files exist there?
Is it possible to create a Check-in-Policy for this problem?
Use the Forbidden Patterns Policy included in the TFS Power Tools.
The following regex prevents .suo files from being checked in:
\.((?i)suo$)
The following regex prevents files under bin, obj and debug folders from being checked in (the parentheses are needed so the alternation stays between the two backslashes):
\\(bin|obj|debug)\\
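Forbidden patterns are easy to get wrong: an ungrouped alternation such as \\bin|obj|debug\\ blocks any path that merely contains the substring obj, because each alternative stands alone. It is worth sanity-checking a candidate pattern against sample paths before deploying it. A quick check, shown in Python for illustration (its regex dialect is close enough to .NET's for these patterns):

```python
import re

# Block Visual Studio .suo files (case-insensitive).
suo = re.compile(r"\.suo$", re.IGNORECASE)
# Block anything under a bin, obj or debug folder; the group keeps the
# alternation inside the two backslashes.
build_dirs = re.compile(r"\\(bin|obj|debug)\\", re.IGNORECASE)

paths = [
    r"C:\src\WebPages\bin\site.dll",        # blocked: under bin
    r"C:\src\WebPages\obj\Debug\site.pdb",  # blocked: under obj
    r"C:\src\Shop.v12.SUO",                 # blocked: .suo file
    r"C:\src\WebPages\Default.aspx",        # allowed
    r"C:\src\object_model\types.cs",        # allowed: "obj" is only a substring
]
for p in paths:
    blocked = bool(suo.search(p) or build_dirs.search(p))
    print(f"{'BLOCK' if blocked else 'allow'}  {p}")
```

The last path is the interesting case: with an ungrouped alternation it would be blocked, since the bare obj alternative matches anywhere in the string.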
If you are using Team Explorer Everywhere you can use this document to find out how to configure the policies directly through the Eclipse plugin.
I currently have a requirement to validate that all files in a folder structure that are under SVN are up to date. Is there a way to do this using the SVN artefacts held in the same directory as the files, without interacting with SVN?
Apologies, this appears to need some further explanation...
I'm iterating through the projects loaded within a solution and can access the corresponding files on the file system. I need to check
a) whether the corresponding files on the file system match the latest revision, if the file is under source control, and
b) whether any of the files or projects have not been added to SVN.
This is essentially a validation check prior to calling a build server to start a build in case a user has forgotten to check in some changes, new files or new projects to a solution. The build server retrieves the code directly from the source control.
Are you asking how to detect which files stored in the SVN folder(s) have changed?
You want to do that without interacting with SVN, but I would recommend avoiding this, because the file and folder structure can be changed by the Subversion team in the future.
Try SharpSVN; I use it in one of my projects. It is handy and stable, has a nice API, and is actively developed.
Also check the comments on your question. In any case, you can read the SharpSVN source code to understand how to implement it yourself.
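If you do end up shelling out to the svn command-line client instead of using a binding, svn status --xml produces output that is stable and easy to parse. A minimal sketch, in Python for illustration; the canned SAMPLE string stands in for output you would read from the svn process:

```python
import xml.etree.ElementTree as ET

# Canned output of `svn status --xml`; in the real check this would come from
# something like subprocess.run(["svn", "status", "--xml"], ...).stdout.
SAMPLE = """\
<status>
  <target path=".">
    <entry path="ProjectA/Program.cs">
      <wc-status item="modified" revision="42"/>
    </entry>
    <entry path="ProjectA/Helpers/New.cs">
      <wc-status item="unversioned"/>
    </entry>
    <entry path="ProjectB">
      <wc-status item="unversioned"/>
    </entry>
  </target>
</status>
"""

def dirty_entries(status_xml):
    """Return {path: state} for every entry that should block the build."""
    result = {}
    for entry in ET.fromstring(status_xml).iter("entry"):
        state = entry.find("wc-status").get("item")
        if state != "normal":  # modified, added, unversioned, missing, ...
            result[entry.get("path")] = state
    return result

print(dirty_entries(SAMPLE))
```

Note that plain svn status only reports local state, which covers check (b) and locally modified files; to also catch files that are out of date with respect to the repository for check (a), add --show-updates (-u), which contacts the server.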