I have the following directory structure on the local machine before the initial checkout:
base_dir/somefolder/someotherfolder/file.txt
After the checkout I want the following:
base_dir/somefolder/someotherfolder/file.txt
base_dir/somefolder/checked_out_folder/new_file.txt
So basically the checkout should add new files into the already existing directory: download everything that doesn't exist locally, while leaving the files and folders that already exist untouched. I can't get that to work with SharpSvn, however.
TortoiseSvn seems to be able to do that. I've read here that it should work somehow, but as the person asking there points out, setting the AllowObstructions option to true does nothing.
My checkout code (I'm using SharpSVN 1.7):
using (SvnClient client = new SvnClient()) {
    client.Progress += new EventHandler<SvnProgressEventArgs>(cl_Progress);

    SvnCheckOutArgs sco = new SvnCheckOutArgs();
    sco.Depth = SvnDepth.Infinity;
    sco.AllowObstructions = true;

    // Pass the args object so the options above actually apply to the checkout.
    client.CheckOut(from, to, sco, out result);
}
I don't know what to do; the documentation on SharpSvn is very thin. I hope somebody here can help me out.
Even with .AllowObstructions it is still possible to run into conflicts.
E.g. with AllowObstructions, a local file that already exists is left in place as a modified version of the new file. But if a directory exists in its place, you get a tree conflict.
(I would really recommend not using .AllowObstructions, as it makes it easy to commit a local file over another file without noticing that you accidentally did this.)
There are more than a few cases where you can get obstructions or skips, so you should really look at the notifications (the .Notify event on the client or on the args object) or at the status after the update.
Even a checkout into an empty directory (or a location that doesn't exist) can cause conflicts, e.g. when there are problems in the svn:externals definitions.
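A minimal sketch of watching those notifications during the checkout (the event and enum names below are SharpSvn's; from, to and result are the placeholders from the question above):

using (SvnClient client = new SvnClient()) {
    // Log anything that was skipped or ran into a tree conflict.
    client.Notify += delegate(object sender, SvnNotifyEventArgs e) {
        if (e.Action == SvnNotifyAction.Skip ||
            e.Action == SvnNotifyAction.TreeConflict)
        {
            Console.WriteLine("{0}: {1}", e.Action, e.FullPath);
        }
    };

    SvnCheckOutArgs sco = new SvnCheckOutArgs();
    sco.Depth = SvnDepth.Infinity;

    client.CheckOut(from, to, sco, out result);
}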
I basically have what's a poor man's versioning...
At one point someone copied/renamed 'file.cs' to 'old-file.cs', with all its history up to that point going with it.
And then created a new 'file.cs', with all the new history going forward.
I ended up with the same file having its history split between these two files.
I know this must be simple (if possible):
- I've tried searching, but my problem is how to phrase the question
- This isn't a 'merge' (I think; I don't have branches involved)
- It's not the typical 'move' either
- I've looked up the tf command line, but nothing resembles what I need
- I have the TFS Source Control Explorer Extension installed (but it can't really help with this)
FWIW, I'm using VS 2015 and a C# project (both files are part of the same project), though I don't mind if the solution is the command-line 'tf' or whatever gets the job done.
So if anyone could help point me in the right direction at least, it would be much appreciated. Thanks!
I have tested with TFS 2015.3 + VS 2015.3, but couldn't reproduce your scenario. In my test, the history of the old file was carried over to the new file. You may check my steps to see whether they are the same as yours:
Rename a file gulpfile.js to old-gulpfile.js, and check it in in Source Control Explorer. Then copy old-gulpfile.js in the workspace, rename the copy to gulpfile.js, add it to source control, and check it in.
Check the history of old-gulpfile.js, then the history of gulpfile.js: you can see that all the history in old-gulpfile.js is also in the new gulpfile.js file.
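If you'd rather verify this programmatically, here is a rough sketch against the TFS client API (the collection URL and server path are placeholders; as far as I know, the slotMode parameter of QueryHistory controls whether history follows the path "slot" or the item itself across renames):

TfsTeamProjectCollection tpc = new TfsTeamProjectCollection(
    new Uri("http://tfsserver:8080/tfs/DefaultCollection")); // placeholder URL
VersionControlServer vcs = tpc.GetService<VersionControlServer>();

// slotMode: false should follow the item across renames rather than the path.
foreach (Changeset cs in vcs.QueryHistory(
    "$/MyProject/gulpfile.js",    // placeholder server path
    VersionSpec.Latest, 0, RecursionType.None,
    null, null, null, int.MaxValue,
    false,                        // includeChanges
    false))                       // slotMode
{
    Console.WriteLine("{0} {1} {2}", cs.ChangesetId, cs.Committer, cs.Comment);
}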
This one is proving very difficult to debug. Let's start with my situation:
I have an ASP.NET MVC3 web app developed in C# using .NET 4. Part of the system allows a user to upload a zip file; this works fine, and the zip file is saved. There is also a Windows service which periodically looks for new zip files, extracts them, does some work and then re-zips them back up (I use System.IO.Compression for the zipping stuff). This part all works fine; after the processing I end up with a structure something like:
Object1Folder
\_ Input.zip
\_ ExtractedFolder
\_ Output.zip
There is also a feature that allows the user to delete the item, and the requirement is to delete the whole object folder, in this case "Object1Folder". Because I need to delete all subfolders and files, I have the following recursive function to do the work...
public static void DeleteDirectory(string directoryPath)
{
    string[] files = Directory.GetFiles(directoryPath);
    string[] directories = Directory.GetDirectories(directoryPath);

    // Clear the read-only flag on each file, then delete it.
    foreach (string file in files)
    {
        File.SetAttributes(file, FileAttributes.Normal);
        File.Delete(file);
    }

    // Recurse into each subdirectory.
    foreach (string directory in directories)
    {
        DeleteDirectory(directory);
    }

    // Finally remove the (now empty) directory itself.
    Directory.Delete(directoryPath, true);
}
Which is initially called something like...
DeleteDirectory("C:\\Objects\\Object1Folder");
But it doesn't work! Firstly, no error is thrown; the code appears to execute successfully, which is annoying. But the result is that only the "Input.zip" file is removed; "ExtractedFolder" and "Output.zip" remain.
As far as I can tell, the code is sound, and I can't see how it could fail to even attempt to delete the remaining file and folder. Unfortunately, we don't have VS installed on the target server, so I cannot step through the code and check that it attempts to delete them. Please point it out if you can see a potential issue here, though.
My best guess so far is that it is a permissions issue of some sort. What is interesting (perhaps) is that when I go to clean up the problem manually (i.e. deleting "Object1Folder" in Windows Explorer), it says "Denied" and asks me to confirm with the admin-rights button thing that it does.
I know it's hard for you all to work the problem out remotely, but I am looking for things I should check to try and solve this. What kind of things should I ensure have correct permissions? What permissions do they need? Is there a good way for me to debug this issue? If something else has a hold of those files (maybe from the extraction process of the Windows service), how can I check whether that is the problem?
Some information about that server that might help: It's running "Windows Server 2008 R2 Datacenter" with service pack 1 and is 64-bit. The web app is assigned a user that is a member of both "Users" and "IIS_IUSRS".
Let me know if any of you need some extra information.
Have a look at the event log on the server; you might find an exception or error message there.
You could consider using Directory.Delete(path, true) so you delete the folder and all its contents in one call (unless I'm misunderstanding your code).
Have a look at whether files are in use. If a file is in use, the OS won't let you delete it, so make sure you are releasing all files correctly.
Finally, you can't force files to be not in use, so you might end up writing a clean-up script that runs every night to delete unwanted files and folders.
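The usual culprit for files staying in use is a stream that was never disposed. A minimal sketch of the pattern, assuming the service extracts with stream classes such as GZipStream (zipPath and extractedPath are placeholders):

// Disposing every stream guarantees the handles are released
// before anything later tries to delete the files.
using (FileStream input = File.OpenRead(zipPath))
using (GZipStream gz = new GZipStream(input, CompressionMode.Decompress))
using (FileStream output = File.Create(extractedPath))
{
    gz.CopyTo(output); // Stream.CopyTo is available in .NET 4
}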
You really should use a temp destination folder for your unpacked files.
When they have been used, you can try deleting them, and if that fails, just leave them.
The permission to write to a non-temp folder should not really be given to a presentation app residing in IIS.
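For example, a minimal sketch of creating a unique scratch folder per job:

// A unique scratch folder under the service account's temp directory.
string workDir = Path.Combine(Path.GetTempPath(), Guid.NewGuid().ToString("N"));
Directory.CreateDirectory(workDir);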
Is there a way to get the version history of a file if you only know an old name of the file?
I am currently looking at an old copy of our repository (I don't know the exact date the copy was taken). When I compare it to the current repository, there is one file that exists only in the copy, not in the current repository. It has not been deleted in the repository; I guess it has been moved or renamed. Is there any way in TFS to find the version history using the old path and name?
I know that I could dig around using the name or some code fragments, but IMO this is not an acceptable solution when using a repository :)
Thank you very much
Andreas
In Team Explorer 2010, you can simply turn on the "Show Deleted Files" option and navigate to the original folder; you'll then be able to see the file that was moved or deleted. You can view history on the item to see its last changeset. This will show you whether it was outright deleted, or whether it was just renamed, so that the item no longer exists under the current path name (aka "slot") and was deleted that way. You can drill further into the changeset details for that changeset to see the new path name (slot) the item occupies.
As you mention, you could certainly do this with a little bash against the TFS API using the GetItems method. Though I understand that it's not what you want to do, I thought it worth saying just because the TFS API is surprisingly easy to work with.
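A rough sketch of that approach (the collection URL and server path are placeholders; passing DeletedState.Any to GetItems includes deleted items in the result):

TfsTeamProjectCollection tpc = new TfsTeamProjectCollection(
    new Uri("http://tfsserver:8080/tfs/DefaultCollection")); // placeholder URL
VersionControlServer vcs = tpc.GetService<VersionControlServer>();

// Ask for deleted items too, so renamed-away or deleted files show up.
ItemSet items = vcs.GetItems(
    "$/MyProject/SomeFolder/*",   // placeholder server path
    VersionSpec.Latest,
    RecursionType.OneLevel,
    DeletedState.Any,
    ItemType.File);

foreach (Item item in items.Items)
{
    Console.WriteLine("{0} (deletion id: {1})", item.ServerItem, item.DeletionId);
}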
A couple of simple approaches (not already suggested in other answers) may help:
In your new repository, go to the folder that used to contain the old file, right-click and choose Show History. This will show all the versioned changes to files in that folder. Now look through the list of changes for files that no longer exist in the folder, and double-click them to view them and determine whether the file looks like an ancestor of your new file.
Or go for a brute-force approach: get all the source code onto your disk and search for files of the same name, or files containing some of the same text as the file you're looking for. (I'd look for comments that seem fairly old and which use a distinctive wording that is unlikely to have appeared in many places; comments are less likely to have changed than class/method names, which might have been refactored if the file was renamed.)
Grep may be an ugly, brute-force way of approaching the problem, but sometimes it's the quickest and easiest. The TFS CLI tools are powerful but unhelpful, complex and poorly documented, so unless you're already an expert, it can take a lot of trial and error to get them to do what you want.
I have a single-threaded program that processes folders and files in a source directory. Is there a way to block a folder, with files in it, within my source directory from being modified by other processes while my program is working on it? I'm thinking something along the lines of placing some kind of exclusive lock on the folder itself, so only my program's process can use it.
NOTE:
I do not want to block the root source directory itself, just whatever folder(s) in the top level of that directory I might be processing at any particular moment. I still want outside processes to be able to add folders to the source directory while I'm processing other folders.
UPDATE:
@Yuri - Yes, this is a Windows program, a Windows Service application to be exact.
Part of what makes this both challenging and necessary is that I need to recreate the structure of whatever folder(s) I'm processing in the source directory within a separate destination directory. So I can't have any other process modifying the folder(s) and file(s) while my program is working with them.
Ways to lock a folder:
1. Mark it readonly. (Included because people always try this.)
Con - This does not actually work. On folders, the readonly flag is used for other purposes.
2. Mark all of the contents readonly.
Con - Won't prevent certain types of actions (e.g., creating new items).
3. Use ACLs (see the sketch after this list).
Con - Won't prevent administrators from messing with the folder. Requires certain types of permissions.
4. Use ACLs with a specially created user, and enable folder encryption.
Con - This is really kind of horrifying. If anything goes wrong, your data might be lost.
5. Rename/move the folder.
Con - Can be bypassed by user stupidity.
6. (Edit) gjvdkamp's answer: lock the individual files.
Con - As with #2, this still allows creating new files within the folders. That said, this is The Right Way™ to do it.
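A rough sketch of the ACL approach from option 3 (the path is a placeholder, and the blanket "Everyone" deny rule is purely illustrative; in practice you'd target specific accounts):

var dirInfo = new DirectoryInfo(@"C:\Source\FolderBeingProcessed"); // placeholder
DirectorySecurity security = dirInfo.GetAccessControl();

// Deny write/delete to everyone, inherited by all children.
security.AddAccessRule(new FileSystemAccessRule(
    "Everyone",
    FileSystemRights.Write | FileSystemRights.Delete,
    InheritanceFlags.ContainerInherit | InheritanceFlags.ObjectInherit,
    PropagationFlags.None,
    AccessControlType.Deny));

dirInfo.SetAccessControl(security);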
Edit: don't use this hack, it doesn't work:
http://itknowledgeexchange.techtarget.com/itanswers/folder-lock/
Otherwise, I would loop over all files in the folder, create a List<FileStream> and then call Lock on each. This would give you finer-grained control over which files to lock.
GJ
Edit: Try something along these lines:
var locks = new List<FileStream>();
var di = new DirectoryInfo(@"C:\Test");

foreach (var file in di.GetFiles()) {
    // FileShare.None stops other processes from opening the file at all.
    var fs = new FileStream(file.FullName, FileMode.Open,
                            FileAccess.ReadWrite, FileShare.None);
    fs.Lock(0, fs.Length); // byte-range lock over the whole file
    locks.Add(fs);         // keep the stream alive, or the lock is released
}
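When you're done with the folder, dispose the streams to release the locks again:

foreach (var fs in locks)
    fs.Dispose(); // releases the byte-range lock and closes the handle
locks.Clear();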
Why don't you take a new copy of the folder, work on that one, then replace the old one if necessary?
I guess this might help you: http://www.codeproject.com/KB/files/Unique_Folder_Protection.aspx
What the article basically does is change the extension of the folder to one of the following:
".{2559a1f2-21d7-11d4-bdaf-00c04f60b9f0}"
".{21EC2020-3AEA-1069-A2DD-08002B30309D}"
".{2559a1f4-21d7-11d4-bdaf-00c04f60b9f0}"
".{645FF040-5081-101B-9F08-00AA002F954E}"
".{2559a1f1-21d7-11d4-bdaf-00c04f60b9f0}"
".{7007ACC7-3202-11D1-AAD2-00805FC1270E}"
And stores the password inside the folder
Hope this helps
Alex
While attempting to use the following code, I am receiving the following error: "Already working for different url", error code 155000.
string targetPath = @"C:\Documents and Settings\Admin\My Documents\CPM Creator\";

client.Authentication.DefaultCredentials = new NetworkCredential("guestUser", "hjk$#&123");

// Check out the code to the specified directory
client.CheckOut(new Uri("http://svn.peerlis.com:8080/CPM Creator"), targetPath);
Well, is it correct? Is that path already the working path for an SVN folder? Are there any hidden .svn folders in that location?
I use SharpSVN in a "get to scratch area, work locally, throw away" cycle, so I always start with a clean (empty) folder (with no SVN folders in the ancestors). This has always worked fairly well.
IMHO, the best way to troubleshoot SVN problems is to use the command-line client. Sometimes it offers more clues that way, so you might want to look at the documentation on svn checkout.
You said there are hidden .svn folders; this means that targetPath is already a working copy. You'll have to check out to another folder, or delete the existing working copy if it's no longer needed.
In case you want to do an update of the existing working copy instead, do something like:
client.Update(targetPath);
Check out the Subversion docs for more info on what command you need in what case.
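If you need to decide between a fresh checkout and an update programmatically, something like this sketch should work (repositoryUri is a placeholder; GetUriFromWorkingCopy returns null when the path is not a working copy):

using (SvnClient client = new SvnClient()) {
    Uri existing = client.GetUriFromWorkingCopy(targetPath);

    if (existing == null)
        client.CheckOut(new SvnUriTarget(repositoryUri), targetPath); // fresh checkout
    else if (existing == repositoryUri)
        client.Update(targetPath); // same URL: just update
    else
        throw new InvalidOperationException(
            "Path is already a working copy of " + existing);
}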