webservice (iisexpress) keeps files open - c#

I need to create a sema4 (semaphore) file that stops other sessions from opening/writing to a database while another session is already performing the same 'transaction'. By 'transaction', in this case, I mean making a booking similar to one that is already 'in progress'.
Here's the code:
HttpSessionState ss = HttpContext.Current.Session;
string sessionID = ss.SessionID;

DirectoryInfo di = new DirectoryInfo(dataDirectory + "Semaphores");
string facilityIDExt = requestedFacilityID.ToString().PadLeft(3, '0');
string sema4File = string.Format("{0}.{1:yyyyMMdd}.{2}", sessionID, RequestedStartDT, facilityIDExt);
sema4FilePath = Path.Combine(di.FullName, sema4File);

File.Create(sema4FilePath);

FileInfo[] fiPaths = di.GetFiles(string.Format("*.{0}", facilityIDExt));

bool bookingInProgress = true;
int waitPeriod = 60;
while (waitPeriod > 0 && bookingInProgress)
{
    fiPaths = di.GetFiles(string.Format("*.{0}", facilityIDExt));

    bookingInProgress = false;
    foreach (FileInfo item in fiPaths)
    {
        if (item.Name.Contains(string.Format("{0:yyyyMMdd}.{1}", RequestedStartDT, facilityIDExt)) && item.Name != sema4File)
        {
            if (item.LastWriteTime > DateTime.Now.AddMinutes(-1))
            {
                bookingInProgress = true;
                break;
            }
        }
    }

    System.Threading.Thread.Sleep(5000);
    waitPeriod = waitPeriod - 5;
}
The idea is that the actual booking will take much less than 60 seconds to record in the database; in the meantime, however, no other booking requests will be permitted.
The problem that I am having is that when I call the following:
if (File.Exists(sema4FilePath))
File.Delete(sema4FilePath);
iisexpress won't delete the file as it is 'in use'. It is 'in use' by iisexpress.
I assume that this will happen with iis as well.
I don't understand why iisexpress keeps the sema4 file open.
How do I get around the 'in use' issue when I want to delete the sema4 file?

When you do this:
File.Create(sema4FilePath);
You get back a FileStream. You should close it to release the file. Preferably wrap it in a using:
using (var stream = File.Create(sema4FilePath)) {
    // Do your stuff
}
Or just close it directly if you don't need the stream:
File.Create(sema4FilePath).Close();
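Applied to the question's code, a minimal sketch (reusing the sema4FilePath variable from above) is simply to create the marker file without keeping the stream open; the later delete then succeeds:
// Create the semaphore file and immediately release the handle.
File.Create(sema4FilePath).Close();

// ... wait/booking logic ...

// No handle is held, so this no longer fails with "in use".
if (File.Exists(sema4FilePath))
    File.Delete(sema4FilePath);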

Related

The operation has timed out on uploading document and updating metadata in sharepoint library using clientcontext.executequery()

I wrote a program using CSOM to upload documents to SharePoint and insert metadata into the document properties. Once in a while (roughly every 3 months) the SharePoint server gets busy, or we reset IIS, or there is some other communication problem, and we get "The operation has timed out" on clientContext.ExecuteQuery(). To resolve the issue I wrote an extension method for ExecuteQuery that retries every 10 seconds, up to 5 times, to connect to the server and execute the query.
My code works in the Dev and QA environments without any problem, but in Prod, when it fails the first time with the timeout error, on the second attempt it only uploads the document and doesn't update the properties, so all the properties end up empty in the library. ExecuteQuery() doesn't return any error, but of the two requests in the batch, uploading the file and updating the properties, it seems only the upload happens and I don't know what happens to the properties. It's as if the property update is dropped from the batch on the second attempt!
I used both upload methods, docs.RootFolder.Files.Add and File.SaveBinaryDirect, in different parts of my code, but I am copying just one of them here so you can see what I have.
I appreciate your help.
public static void ExecuteSharePointQuery(ClientContext context)
{
    int cnt = 0;
    bool isExecute = false;
    while (cnt < 5)
    {
        try
        {
            context.ExecuteQuery();
            isExecute = true;
            break;
        }
        catch (Exception ex)
        {
            cnt++;
            Logger.Error(string.Format("Communication attempt with SharePoint failed. Attempt {0}", cnt));
            Logger.Error(ex.Message);
            Thread.Sleep(10000);

            if (cnt == 5 && isExecute == false)
            {
                Logger.Error(string.Format("Couldn't execute the query in SharePoint."));
                Logger.Error(ex.Message);
                throw;
            }
        }
    }
}
public static void UploadSPFileWithProperties(string siteURL, string listTitle, FieldMapper item)
{
    Logger.Info(string.Format("Uploading to SharePoint: {0}", item.pdfPath));

    using (ClientContext clientContext = new ClientContext(siteURL))
    {
        using (FileStream fs = new FileStream(item.pdfPath, FileMode.Open))
        {
            try
            {
                FileCreationInformation fileCreationInformation = new FileCreationInformation();
                fileCreationInformation.ContentStream = fs;
                fileCreationInformation.Url = Path.GetFileName(item.pdfPath);
                fileCreationInformation.Overwrite = true;

                List docs = clientContext.Web.Lists.GetByTitle(listTitle);
                Microsoft.SharePoint.Client.File uploadFile = docs.RootFolder.Files.Add(fileCreationInformation);
                uploadFile.CheckOut();

                //Update the metadata
                ListItem listItem = uploadFile.ListItemAllFields;

                //Set field values on item
                foreach (List<string> list in item.fieldMappings)
                {
                    if (list[FieldMapper.SP_VALUE_INDEX] != null)
                    {
                        TrySet(ref listItem, list[FieldMapper.SP_FIELD_NAME_INDEX], (FieldType)Enum.Parse(typeof(FieldType), list[FieldMapper.SP_TYPE_INDEX]), list[FieldMapper.SP_VALUE_INDEX]);
                    }
                }

                listItem.Update();
                uploadFile.CheckIn(string.Empty, CheckinType.OverwriteCheckIn);

                SharePointUtilities.ExecuteSharePointQuery(clientContext);
            }
            catch (Exception ex)
            {
            }
        }
    }
}
There are too many possible reasons for me to really suggest a solution, especially considering it only happens in the prod environment.
What I can say is that it's probably easiest to keep a reference to the last uploaded file. If your code fails, check whether that last file was uploaded correctly.
Side note: I'm not sure if this is relevant, but if it's a large file you want to upload it in slices.
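A rough sketch of that check, assuming you keep the server-relative URL of the last uploaded file (the serverRelativeUrl and fieldName parameters below are hypothetical, not from the question):
// Hypothetical verification step after a retried ExecuteQuery: reload the last
// uploaded file and check whether one of its metadata fields actually got written.
private static bool WasUploadedWithMetadata(ClientContext ctx, string serverRelativeUrl, string fieldName)
{
    Microsoft.SharePoint.Client.File file = ctx.Web.GetFileByServerRelativeUrl(serverRelativeUrl);
    ListItem listItem = file.ListItemAllFields;
    ctx.Load(listItem);
    ctx.ExecuteQuery();

    // If the field is still empty, re-run the metadata update (listItem.Update + CheckIn)
    // for this one file instead of repeating the whole batch.
    object value = listItem[fieldName];
    return value != null && !string.IsNullOrEmpty(value.ToString());
}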

.NET FileSystemWatcher goes into infinite loop when moving file

I have an issue with the FileSystemWatcher. I'm using it in a Windows service to monitor certain folders, and when a file is copied it processes that file using an SSIS package. Everything works fine, but every now and then the FileSystemWatcher picks up the same file and fires the Created event multiple times in an infinite loop. The code below works as follows:
Firstly, this method is called by the Windows service and creates a watcher:
private void CreateFileWatcherEvent(SSISPackageSetting packageSettings)
{
    // Create a new FileSystemWatcher and set its properties.
    FileSystemWatcher watcher = new FileSystemWatcher();
    watcher.IncludeSubdirectories = false;
    watcher.Path = packageSettings.FileWatchPath;

    /* Watch for changes in LastAccess and LastWrite times, and
       the renaming of files or directories. */
    watcher.NotifyFilter = NotifyFilters.LastAccess | NotifyFilters.LastWrite
        | NotifyFilters.FileName | NotifyFilters.DirectoryName | NotifyFilters.Size;

    //Watch for all files
    watcher.Filter = "*.*";

    watcher.Created += (s, e) => FileCreated(e, packageSettings);

    // Begin watching.
    watcher.EnableRaisingEvents = true;
}
Next up, the Watcher.Created event handler looks something like this:
private void FileCreated(FileSystemEventArgs e, SSISPackageSetting packageSettings)
{
    //Bunch of other code not important to the issue
    ProcessFile(packageSettings, e.FullPath, fileExtension);
}
The ProcessFile method looks something like this:
private void ProcessFile(SSISPackageSetting packageSetting, string Filename, string fileExtension)
{
    //COMPLETE A BUNCH OF SSIS TASKS TO PROCESS THE FILE

    //NOW WE NEED TO CREATE THE OUTPUT FILE SO THAT SSIS CAN WRITE TO IT
    string errorOutPutfileName = packageSetting.ImportFailurePath + @"\FailedRows" + System.DateTime.Now.ToFileTime() + packageSetting.ErrorRowsFileExtension;
    File.Create(errorOutPutfileName).Close();

    MoveFileToSuccessPath(Filename, packageSetting);
}
Lastly, the MoveFile Method looks like this:
private void MoveFileToSuccessPath(string filename, SSISPackageSetting ssisPackage)
{
    try
    {
        string newFilename = MakeFilenameUnique(filename);
        System.IO.File.Move(filename, ssisPackage.ArchivePath.EndsWith("\\") ? ssisPackage.ArchivePath + newFilename : ssisPackage.ArchivePath + "\\" + newFilename);
    }
    catch (Exception ex)
    {
        SaveToApplicationLog(string.Format
            ("Error occurred while moving a file to the success path. Filename {0}. Archive Path {1}. Error {2}", filename, ssisPackage.ArchivePath, ex.ToString()), EventLogEntryType.Error);
    }
}
So somewhere in there we go into an infinite loop and the FileSystemWatcher keeps picking up the same file. Does anyone have any idea? This happens randomly and intermittently.
When using the FileSystemWatcher I tend to add files to a dictionary when the notification event fires. A separate timer thread then picks files up from this collection once they are more than a few seconds old, somewhere around 5 seconds.
If my processing is also likely to change the last access time, and I watch that too, then I also keep a checksum in the dictionary along with the filename and last processed time for every file, and use that to suppress the event firing multiple times in a row. You don't have to use an expensive algorithm; I have used MD5 and even CRC32 - you are only trying to prevent duplicate notifications.
EDIT
This example code is very situation specific and makes lots of assumptions you may need to change. It doesn't list all your code, just something like the bits you need to add:
// So, first thing to do is add a dictionary to store file info:
internal class FileWatchInfo
{
    public DateTime LatestTime { get; set; }
    public bool IsProcessing { get; set; }
    public bool Processed { get; set; }          // set by ProcessFile once the file has been handled
    public DateTime ProcessedTime { get; set; }  // when it was handled
    public string FullName { get; set; }
    public string Checksum { get; set; }
}

SortedDictionary<string, FileWatchInfo> fileInfos = new SortedDictionary<string, FileWatchInfo>();
private readonly object SyncRoot = new object();
// Now, when you set up the watcher, also set up a System.Threading.Timer to monitor that dictionary.
CreateFileWatcherEvent(new SSISPackageSetting { FileWatchPath = "H:\\test" });

int processFilesInMilliseconds = 5000;
Timer timer = new Timer(ProcessFiles, null, processFilesInMilliseconds, processFilesInMilliseconds);
// In FileCreated, don't process the file but add it to a list
private void FileCreated(FileSystemEventArgs e) {
    var finf = new FileInfo(e.FullPath);
    DateTime latest = finf.LastAccessTimeUtc > finf.LastWriteTimeUtc
        ? finf.LastAccessTimeUtc : finf.LastWriteTimeUtc;
    latest = latest > finf.CreationTimeUtc ? latest : finf.CreationTimeUtc;

    // Beware of issues if other code sets the file times to crazy times in the past/future
    lock (SyncRoot) {
        // You need to work out what to do if you actually need to add this file again (i.e. someone
        // has edited it in the 5 seconds since it was created, and the time it took you to process it)
        if (!this.fileInfos.ContainsKey(e.FullPath)) {
            FileWatchInfo info = new FileWatchInfo {
                FullName = e.FullPath,
                LatestTime = latest,
                IsProcessing = false,
                Processed = false,
                Checksum = null
            };
            this.fileInfos.Add(e.FullPath, info);
        }
    }
}
And finally, here is the process method as it now stands:
private void ProcessFiles(object state) {
    FileWatchInfo toProcess = null;
    List<string> toRemove = new List<string>();
    lock (this.SyncRoot) {
        foreach (var info in this.fileInfos) {
            // You may want to sort your list by latest to avoid files being left in the queue for a long time
            if (info.Value.Checksum == null) {
                // If this fires the watcher, it doesn't matter, but beware of big files,
                // which may mean you need to move this outside the lock
                using (var md5 = MD5.Create()) {
                    using (var stream = File.OpenRead(info.Value.FullName)) {
                        info.Value.Checksum =
                            BitConverter.ToString(md5.ComputeHash(stream)).Replace("-", "").ToLower();
                    }
                }
            }
            // Data store (myFileInfoStore) is code I haven't included - use a Dictionary which you remove files from
            // after a few minutes, or a permanent database to store file checksums
            if ((info.Value.Processed && info.Value.ProcessedTime.AddSeconds(5) < DateTime.UtcNow)
                || myFileInfoStore.GetFileInfo(info.Value.FullName).Checksum == info.Value.Checksum) {
                toRemove.Add(info.Key);
            }
            else if (!info.Value.Processed && !info.Value.IsProcessing
                && info.Value.LatestTime.AddSeconds(5) < DateTime.UtcNow) {
                info.Value.IsProcessing = true;
                toProcess = info.Value;
                // This processes one file at a time, you could equally add a bunch to a list for parallel processing
                break;
            }
        }
        foreach (var filePath in toRemove) {
            this.fileInfos.Remove(filePath);
        }
    }
    if (toProcess != null) {
        ProcessFile(packageSettings, toProcess.FullName, new FileInfo(toProcess.FullName).Extension);
    }
}
Finally, ProcessFile needs to process your file; once completed, go inside a lock, mark the info in the fileInfos dictionary as Processed, set the ProcessedTime, then exit the lock and move the file. You will also want to update the checksum if it changes after an acceptable amount of time has passed.
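A minimal sketch of that completion step, reusing the fileInfos dictionary, SyncRoot and FileWatchInfo from above (the actual SSIS work is elided):
private void ProcessFile(SSISPackageSetting packageSetting, string filename, string fileExtension) {
    // ... run the SSIS package against the file ...

    // Mark the entry as processed under the lock so the extra notifications
    // caused by the move are suppressed by the checks in ProcessFiles.
    lock (this.SyncRoot) {
        FileWatchInfo info;
        if (this.fileInfos.TryGetValue(filename, out info)) {
            info.Processed = true;
            info.ProcessedTime = DateTime.UtcNow;
            info.IsProcessing = false;
        }
    }

    MoveFileToSuccessPath(filename, packageSetting);
}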
It is very hard to provide a complete sample as I know nothing about your situation, but this is the general pattern I use. You will need to consider file rates, how frequently they are updated, etc. You can probably bring the time intervals down to sub-second instead of 5 seconds and still be OK.

Out of memory exception. May be from webservice

I have had an out of memory exception problem for 4 months. My client has a webservice and wants me to test it. The webservice has a function called upload, and I test that function with 1500 users uploading at the same time. I tried calling the garbage collector (GC). With a 2 MB file there is no exception, but with an 8 MB file there is still an out of memory exception. I have tried many times and a lot of solutions, but it still happens. I'm going crazy now.
While the upload is ongoing I watch the memory on all the test computers, and they do not run out of memory, so I think the problem is in the webservice and the server. But my client says I have to prove to them that the cause is in the webservice and the server. Do you guys have any solutions for this?
Additionally, our client does not share the code; I can only call the webservice's functions to test. Also, I have to use a VPS to connect to their webservice, and the network is rather slow when connecting to the VPS.
I have to make sure that my test script doesn't have any problems. Here is my test script for the upload function.
public void UploadNewJob(string HalID, string fileUID, string jobUID, string fileName, out List<string> errorMessages)
{
    errorMessages = null;
    try
    {
        int versionNumber;
        int newVersionNumber;
        string newRevisionTag;
        datasyncservice.ErrorObject errorObj = new datasyncservice.ErrorObject();

        PfgDbJob job = new PfgDbJob();
        job.CompanyName = Constant.SEARCH_CN;
        job.HalliburtonSalesOffice = Constant.SEARCH_SO;
        job.HalliburtonOperationsLocation = Constant.SEARCH_OL;
        job.UploadPersonHalId = HalID;
        job.CheckOutState = Constant.CHECKOUT_STATE;
        job.RevisionTag = Constant.NEW_REVISION_TAG;

        var manifestItems = new List<ManifestItem>();
        var newManifestItems = new List<ManifestItem>();
        var manifestItem = new ManifestItem();

        if (fileUID == "")
        {
            if (job.JobUid == Guid.Empty)
                job.JobUid = Guid.NewGuid();
            if (job.FileUid == Guid.Empty)
                job.FileUid = Guid.NewGuid();
        }
        else
        {
            Guid JobUid = new Guid(jobUID);
            job.JobUid = JobUid;
            Guid fileUid = new Guid(fileUID);
            job.FileUid = fileUid;
        }

        // Change the next line when we transfer .ssp files by parts
        manifestItem.PartUid = job.FileUid;
        job.JobFileName = fileName;
        manifestItem.BinaryFileName = job.JobFileName;
        manifestItem.FileUid = job.FileUid;
        manifestItem.JobUid = job.JobUid;
        manifestItem.PartName = string.Empty;
        manifestItem.SequenceNumber = 0;
        manifestItems.Add(manifestItem);

        errorMessages = DataSyncService.Instance.ValidateForUploadPfgDbJobToDatabase(out newVersionNumber, out newRevisionTag, out errorObj, out newManifestItems, HalID, job, false);

        if (manifestItems.Count == 0)
            manifestItems = newManifestItems;

        if (errorMessages.Count > 0)
        {
            if (errorMessages.Count > 1 || errorMessages[0].IndexOf("NOT AN ERROR") == -1)
            {
                return;
            }
        }

        //upload new Job
        Guid transferUid;
        long a = GC.GetTotalMemory(false);
        byte[] fileContents = File.ReadAllBytes(fileName);
        fileContents = null;
        GC.Collect();
        long b = GC.GetTotalMemory(false);
        //Assert.Fail((b - a).ToString());

        //errorMessages = DataSyncService.Instance.UploadFileInAJob(out transferUid, out errorObj, job.UploadPersonHalId, job, manifestItem, fileContents);
        DataSyncService.Instance.UploadPfgDbJobToDatabase(out errorObj, out versionNumber, job.UploadPersonHalId, job, false, manifestItems);
    }
    catch (Exception ex)
    {
        Assert.Fail("Error from Test Scripts: " + ex.Message);
    }
}
Please review my test code. If there isn't any problem in my test code, I have to prove that the cause is not from my test script T_T
My guess would be that you hit the 2 GB object size limit of .NET (1500 * 8MB > 4GB).
You should consider changing to .NET 4.5 and using the large object mode - see here - the setting is called gcAllowVeryLargeObjects.
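For reference, that setting goes into the application's config file (it only lifts the 2 GB limit for arrays, on 64-bit processes, from .NET 4.5 onwards); a minimal example:
<configuration>
  <runtime>
    <!-- Allow arrays larger than 2 GB (64-bit, .NET 4.5+). -->
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>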

How to set a dynamic number of threadCounter variables?

I'm not really into multithreading, so the question is probably stupid, but it seems I cannot find a way to solve this problem (especially because I'm using C# and I've only been using it for a month).
I have a dynamic number of directories (I got it from a query in the DB). Inside those directories there is a certain number of files.
For each directory I need a method to transfer these files using FTP in a concurrent way, because I have basically no limit on the FTP max connections (not my words, it's written in the specs).
But I still need to control the maximum number of files transferred per directory. So I need to count the files I'm transferring (increment/decrement).
How could I do it? Should I use something like an array and use the Monitor class?
Edit: Framework 3.5
You can use the Semaphore class to throttle the number of concurrent files per directory. You would probably want to have one semaphore per directory so that the number of FTP uploads per directory can be controlled independently.
public class Example
{
    public void ProcessAllFilesAsync()
    {
        var semaphores = new Dictionary<string, Semaphore>();
        foreach (string filePath in GetFiles())
        {
            string filePathCapture = filePath; // Needed to perform the closure correctly.
            string directoryPath = Path.GetDirectoryName(filePath);
            if (!semaphores.ContainsKey(directoryPath))
            {
                int allowed = NUM_OF_CONCURRENT_OPERATIONS;
                semaphores.Add(directoryPath, new Semaphore(allowed, allowed));
            }
            var semaphore = semaphores[directoryPath];
            ThreadPool.QueueUserWorkItem(
                (state) =>
                {
                    semaphore.WaitOne();
                    try
                    {
                        DoFtpOperation(filePathCapture);
                    }
                    finally
                    {
                        semaphore.Release();
                    }
                }, null);
        }
    }
}
var allDirectories = db.GetAllDirectories();
foreach (var directoryPath in allDirectories)
{
    DirectoryInfo directories = new DirectoryInfo(directoryPath);

    //Loop through every file in that Directory
    foreach (var fileInDir in directories.GetFiles())
    {
        //Check if we have reached our max limit
        if (numberFTPConnections == MAXFTPCONNECTIONS)
        {
            Thread.Sleep(1000);
        }

        //code to copy to FTP
        //This can be async; when the transfer is completed,
        //decrement numberFTPConnections so the next file can be transferred.
    }
}
You can try something along the lines above. Note that it's just the basic logic and there are probably better ways to do this.
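If you go down the counter route from the comments above, here is a hypothetical helper for the per-directory bookkeeping (all names are made up; lock, i.e. Monitor, keeps the counts consistent and works on .NET 3.5):
using System.Collections.Generic;

// Hypothetical helper: tracks how many transfers are in flight per directory.
public class DirectoryTransferCounter
{
    private readonly Dictionary<string, int> counts = new Dictionary<string, int>();
    private readonly object sync = new object();

    // Returns true and takes a slot if another transfer may start for this directory.
    public bool TryStart(string directoryPath, int maxPerDirectory)
    {
        lock (sync)
        {
            int current;
            counts.TryGetValue(directoryPath, out current);
            if (current >= maxPerDirectory)
                return false; // caller waits (e.g. Thread.Sleep) and retries
            counts[directoryPath] = current + 1;
            return true;
        }
    }

    // Call this from the transfer-completed callback to free the slot.
    public void End(string directoryPath)
    {
        lock (sync)
        {
            counts[directoryPath] = counts[directoryPath] - 1;
        }
    }
}
That said, the Semaphore approach in the first answer is usually cleaner, because WaitOne blocks until a slot is free instead of polling with Sleep.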

TFS 2008 Source Control - Quick way to destroy all deleted items

I have a bunch of source control folders for which I want to get rid of all items that are no longer required. These items have been deleted (code has been moved or rewritten) and, because most of us use the 'Show Deleted Items' option by default, some of these folders now show more deleted items and folders than legitimate items. I want to make sure all this redundant code is gone, forever - as it most definitely will not ever be required. These are new projects being built from branches of older ones that, as yet, nobody is using.
There are quite a lot of files spread across multiple folders, though, so I'd rather avoid having to do each one individually. I'm at the command line too, not using the API.
I know ultimately I will need the tf destroy command.
I also know that tf dir [wildcard] /recursive /deleted will return all deleted items within a path (unfortunately alongside all legitimate items).
Can anyone think of a good way of doing this quickly?
I've thought of two solutions:
1) Take the output of the dir command and find all the items that have ;Xnnnnnnn after them - these are the deleted items; then simply spit out a bunch of tf destroy calls, or construct a response file (not sure about this bit though). This sounds like a potential use for PowerShell, but I haven't actually done anything with that yet...
2) Get all the projects ready, and then simply destroy them from TFS and then re-add them so that only the required stuff is then in TFS. However, this does remove the branch relationship which could be useful because for a while I'm going to have to maintain two versions of some of these libraries (pre and post upgrade). Not ideal, but nothing I can do about it.
Obviously Option 2 is a cheat but it'll work - I'd just ideally like a reusable script that could be used for any folder in TFS in the future (a couple of other teams have other long-lived projects that could do with a full purge!).
Thanks in advance.
Okay so I wrote a console app (.Net 4):
IT GOES WITHOUT SAYING I OFFER NO WARRANTY ABOUT THIS - IT WILL DESTROY ITEMS IN TFS!!!!
Update (8th May 2012) If you run this on a folder that has masses and masses (I mean thousands or tens-of-thousands) of deleted items it might not complete before the TFS command line times out. The majority of the time taken by this command is in generating the .tfc script. If you run it and find this happening, try targeting some child folders first.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.IO;
using System.Text.RegularExpressions;
using System.Diagnostics;

namespace TFDestroyDeleted
{
    class Program
    {
        static void Main(string[] args)
        {
            if (args.Length < 1 || args.Length > 3)
                Usage();

            bool prepareOnly = false;
            bool previewOnly = false;

            if (args.Any(s => StringComparer.InvariantCultureIgnoreCase
                .Compare(s, "preview") == 0)) previewOnly = true;
            if (args.Any(s => StringComparer.InvariantCultureIgnoreCase
                .Compare(s, "norun") == 0)) prepareOnly = true;

            string tfOutput = null;

            Process p = new Process();
            p.StartInfo = new ProcessStartInfo("tf")
            {
                Arguments = string.Format
                    ("dir /recursive /deleted \"{0}\"", args[0]),
                UseShellExecute = false,
                RedirectStandardOutput = true,
                RedirectStandardError = true,
                RedirectStandardInput = true
            };
            p.Start();
            tfOutput = p.StandardOutput.ReadToEnd();
            p.WaitForExit();

            string basePath = null;
            string nextDelete = null;
            List<string> toDelete = new List<string>();

            using (var ms =
                new MemoryStream(Encoding.Default.GetBytes(tfOutput)))
            {
                using (StreamReader sr = new StreamReader(ms))
                {
                    while (!sr.EndOfStream)
                    {
                        nextDelete = null;
                        string line = sr.ReadLine();

                        if (string.IsNullOrWhiteSpace(line))
                            basePath = null;
                        else
                        {
                            if (basePath == null)
                            {
                                if (line.EndsWith(":"))
                                    basePath = line.Substring(0, line.Length - 1);
                                else
                                    continue;
                            }
                            else
                            {
                                nextDelete = Regex.Match(line, @"^.*?;X[0-9]+").Value;
                                if (!string.IsNullOrWhiteSpace(nextDelete))
                                {
                                    toDelete.Add(
                                        string.Format
                                        (
                                            "{0}/{1}", basePath,
                                            nextDelete.StartsWith("$") ? nextDelete.Substring(1)
                                                : nextDelete
                                        ));
                                }
                            }
                        }
                    }
                }
            }

            using (var fs = File.OpenWrite("destroy.tfc"))
            {
                fs.SetLength(0);
                using (var sw = new StreamWriter(fs))
                {
                    //do the longest items first, naturally deleting items before their
                    //parent folders
                    foreach (var s in toDelete.OrderByDescending(s => s.Length))
                    {
                        if (!previewOnly)
                            sw.WriteLine("destroy \"{0}\" /i", s);
                        else
                            sw.WriteLine("destroy \"{0}\" /i /preview", s);
                    }
                    sw.Flush();
                }
            }

            if (!prepareOnly)
            {
                p.StartInfo = new ProcessStartInfo("tf")
                {
                    Arguments = string.Format("@{0}", "destroy.tfc"),
                    UseShellExecute = false
                };
                p.Start();
                p.WaitForExit();
            }
            p.Close();
        }

        static void Usage()
        {
            Console.WriteLine(@"Usage:
TFDestroyDeleted [TFFolder] (preview) (norun)
Where [TFFolder] is the TFS root folder to be purged - it should be quoted if there are spaces. E.g: ""$/folder/subfolder"".
norun - Specify this if you only want a command file prepared for tf.
preview - Specify this if you want each destroy to be only a preview (i.e. when run, it won't actually do the destroy) ");
            Environment.Exit(0);
        }
    }
}
You must pass the TFS folder to be purged, e.g. '$/folder'. If you just pass that, then all matching deleted items will be detected and destroyed, one by one.
For some reason, if you accidentally pass a folder that doesn't actually exist, the operation takes forever. A CTRL+C will stop it, of course.
The app does a recursive dir on the folder, with the /deleted switch.
It then runs through each line in the output, looking for the delete hint, i.e. items with ;Xnnnnnnn. If found, it adds the full TFS path for that item to a list.
Once complete, the list is sorted by length in descending order and the contents are written out to a .tfc response file for the tf.exe command line.
If the preview option is specified, the tf commands are written out with the /preview switch (see TFS Destroy on MSDN), so the deletions aren't actually performed.
Finally, you can specify norun which causes the tfc file to be created, but not actually run.
I know it is an old question, but I think this can be helpful.
We have an old collection with 20+ team projects under VSO and really needed to clean up our team projects. This code worked perfectly for us.
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.VersionControl.Client;

static void Main(string[] args)
{
    TfsTeamProjectCollection tfs = new TfsTeamProjectCollection(new Uri("COLLECTION_URL")); //Example: https://xxxx.visualstudio.com
    var versionControl = tfs.GetService<VersionControlServer>();

    ItemSpec spec = new ItemSpec("$/", RecursionType.Full);

    var folderItemSet = versionControl.GetItems(spec, VersionSpec.Latest, DeletedState.Deleted, ItemType.Folder, true);
    DestroyItemSet(versionControl, folderItemSet);

    //Delete remaining files
    var fileItemSet = versionControl.GetItems(spec, VersionSpec.Latest, DeletedState.Deleted, ItemType.File, true);
    DestroyItemSet(versionControl, fileItemSet);
}

private static void DestroyItemSet(VersionControlServer versionControl, ItemSet itemSet)
{
    foreach (var deletedItem in itemSet.Items)
    {
        try
        {
            versionControl.Destroy(new ItemSpec(deletedItem.ServerItem, RecursionType.Full, deletedItem.DeletionId), VersionSpec.Latest, null, Microsoft.TeamFoundation.VersionControl.Common.DestroyFlags.None);
            Console.WriteLine("{0} destroyed successfully.", deletedItem.ServerItem);
        }
        catch (ItemNotFoundException) //To get rid of the exception when destroying nested objects
        {
        }
        catch (Exception)
        {
            throw;
        }
    }
}
I used the Microsoft.TeamFoundationServer.ExtendedClient NuGet package.
