Following the code found here:
How to check if file is under source control in SharpSvn?
I'm trying to make a small utility application that will iterate over a designated folder and print out the status of all the files.
private void btnCheckSVN_Click(object sender, EventArgs e)
{
ParseSVNResults(CheckSVN());
}
private Collection<SvnStatusEventArgs> CheckSVN()
{
string path = @"C:\AMG\trunk\AMC";
if (!Directory.Exists(path))
return null;
DevExpress.Utils.WaitDialogForm wait = new DevExpress.Utils.WaitDialogForm();
wait.Caption = "Please wait, loading SVN file statuses. This may take a moment.";
wait.Caption += Environment.NewLine + path;
wait.Show();
SvnClient client = new SvnClient();
SvnStatusArgs sa = new SvnStatusArgs();
sa.Depth = SvnDepth.Infinity;
Collection<SvnStatusEventArgs> statuses;
client.GetStatus(path, sa, out statuses);
wait.Close();
return statuses;
}
private void ParseSVNResults(Collection<SvnStatusEventArgs> results)
{
if (results == null)
return;
int modified = 0;
int unversioned = 0;
foreach (SvnStatusEventArgs item in results)
{
memoEditSVNFiles.Text += item.LocalContentStatus.ToString() + " -- " + item.Path + Environment.NewLine;
if (item.LocalContentStatus.ToString() == "Modified")
modified++;
else if (item.LocalContentStatus.ToString() == "NotVersioned")
unversioned++;
}
memoEditSVNFiles.Text += Environment.NewLine + "Modified: " + modified + Environment.NewLine;
memoEditSVNFiles.Text += "Not Versioned: " + unversioned + Environment.NewLine;
memoEditSVNFiles.Text += "Total: " + results.Count;
}
When the code executes, I get a total of 147 Files & Folders. The actual folder has a few thousand files. Is it possible I'm looking at too many files and SharpSVN just quits after a while?
edit: I just tried creating about 100 text files and putting roughly 30 into each of 3 'nested' folders. So I've got:
C:\AMG\trunk\test which has ~30 files
C:\AMG\trunk\test\Folder1 which has ~30 files
C:\AMG\trunk\test\Folder1\Sub which has another 30
Without committing this to the repository, when I run the above code on C:\AMG\trunk\test instead of the given path in my code snippet, the output says 1 total file.
So it turns out the SvnStatusArgs class has a "RetrieveAllEntries" boolean flag that defaults to false.
As the name implies, setting this true returns every file, whether it was modified / unversioned or up to date.
1 extra line in the CheckSVN() method in my original post:
SvnClient client = new SvnClient();
SvnStatusArgs sa = new SvnStatusArgs();
sa.Depth = SvnDepth.Infinity;
sa.RetrieveAllEntries = true; //the new line
Collection<SvnStatusEventArgs> statuses;
client.GetStatus(path, sa, out statuses);
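As an aside, the string comparisons in ParseSVNResults can compare the enum value directly instead of calling ToString(); a small sketch, assuming SharpSvn's SvnStatus enum members match the status names counted above:
    if (item.LocalContentStatus == SvnStatus.Modified)
        modified++;
    else if (item.LocalContentStatus == SvnStatus.NotVersioned)
        unversioned++;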
So this is with Visual Studio 2003. The code works perfectly on higher frameworks, but I need it in 1.1, so this is my only option. Also, the program is in German, so the actual message was "Auf die Methode (mymethod) wurde ohne Anführungszeichen verwiesen" (roughly: the method (mymethod) was referenced without quotation marks). The code and everything is on a separate PC, so I can retrieve it if necessary, but unfortunately there's no copy/paste, so for now I'm just writing my main question: how on earth is that even an error? Adding quotation marks doesn't work and gives the expected error complaining that you can't implicitly convert a method type to a string. The line it failed on was fileSystemWatcher.Changed += FileSystemWatcher_Changed. There are a bunch more errors which I've managed to work around, and more to fix, but this is the only one that has truly stumped me, and any guidance would be useful.
I've tried using ' as quotation marks around it, as well as * on either side. I've tried passing the arguments it uses, which was of no help, and I honestly don't know what else to try, as I'm new to coding and it worked perfectly on framework 2.0. Just not in this German 2003 / 1.1 framework.
Any help is kindly appreciated!
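For context, the C# compiler shipped with Visual Studio 2003 (.NET 1.1) does not do the implicit method-group-to-delegate conversion that later versions allow, so an event has to be wired with an explicit delegate instantiation, the same way the OnChanged handlers are attached in ProgramSwapMonitor below. A one-line sketch of the 1.1-compatible form, using the handler shown later in this post:
    fileSystemWatcher.Changed += new FileSystemEventHandler(FileSystemWatcher_Changed);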
edit: Here's the code related to the error
private static void GetCurrentJob()
{
//This gets the file
string CurrentJobPath = @"C:\LuK\Master\Daten\dnocontainer.cfg";
//This makes it so it only reads the 5th line of the file and ignores everything else
//string line = File.ReadLines(CurrentJobPath).Skip(4).Take(1).First();
string[] lines = File.ReadAllLines(CurrentJobPath);
string line = lines[4];
//This is getting rid of all parts of the line I don't care about
line = line.Replace(" ", "").Replace("(STRING)Dno=", "").Replace("\"", "").Replace(";", "");
//This isn't necessary but I like it okay
JobName = line;
//Testing To prove its behaving
Console.WriteLine(JobName);
//This is making a path to the exact folder for the job currently running
AreaOfMonitor = @"C:\LuK\Master\Zeichnungsdaten\" + JobName;
Console.WriteLine(AreaOfMonitor);
}
private static void ProgramSwapMonitor(string ProgramChange)
{
// Create a new FileSystemWatcher and set its properties.
FileSystemWatcher watcher = new FileSystemWatcher();
watcher.Path = ProgramChange;
/* Watch for changes in LastAccess and LastWrite times, and
the renaming of files or directories. */
watcher.NotifyFilter = NotifyFilters.LastAccess | NotifyFilters.LastWrite
| NotifyFilters.FileName | NotifyFilters.DirectoryName;
// Only watch this specific file
watcher.Filter = "dnocontainer.cfg";
// Add event handlers.
watcher.Changed += new FileSystemEventHandler(OnChanged);
watcher.Created += new FileSystemEventHandler(OnChanged);
watcher.Deleted += new FileSystemEventHandler(OnChanged);
watcher.Renamed += new RenamedEventHandler(OnRenamed);
// Begin watching.
watcher.EnableRaisingEvents = true;
}
// Define the event handlers.
private static void OnChanged(object source, FileSystemEventArgs e)
{
// Specify what is done when a file is changed, created, or deleted.
Console.WriteLine("File: " + e.FullPath + " " + e.ChangeType);
Console.WriteLine(Spacer);
//This disables the directory monitor, changes the active job in its memory, restarts the directory monitor so it watches the new location, then removes the old Changed handler so there are no duplicates.
fileSystemWatcher.EnableRaisingEvents = false;
GetCurrentJob();
MonitorDirectory(path);
fileSystemWatcher.Changed -= FileSystemWatcher_Changed;
}
public static void MonitorDirectory(string path)
{
//fileSystemWatcher.EnableRaisingEvents = false;
path = AreaOfMonitor;
//Declaring variables outside the catch for use of error handling
string ErrorHandlingPathName = path;
string ErrorPathNameMaster = "JobNameError";
string Unlikely = "";
string errorpath = @"D:\JobEditsSaved\" + ErrorPathNameMaster + ".txt";
string PathError = "This file isn't in the directory, does a new job need adding?";
string Aidpath = @"D:\JobEditsSaved\CODEISSUES.txt";
int retryName = 0;
int NewNameCount = 0;
//Tries to do the normal function of beginning monitoring of the path extracted from the dnocontainer
//If the file extracted doesn't have a corresponding folder, rather than crash the program we write an error log which,
//in the backup location, writes a text file saying it's had this problem and writes the name of the job it couldn't find
//If this exception's been thrown, the name of the errorTextFile I'm creating would already exist, so I'm doing a simple
//counter which will add a number to the end of the error file name if it exists already.
try
{
//Tries the code we want to work
fileSystemWatcher.Path = path;
}
catch (IOException ex)
{
//Both these catches do the exact same thing but for different exceptions. However, the exceptions are for effectively the same thing (directory doesn't exist)
//It will try to write an error log; if an error log's been made before, it adds 1 to its name, tries again and repeats this process up to 5000 times.
//If there are serious issues leading to errors then it will make a file saying to get me to look at the code, because something major must have happened.
while (retryName < 5000)
{
if (!File.Exists(errorpath))//check if the error log file already exists
{
PathError = PathError + ex;
File.WriteAllText(errorpath, PathError);
Console.WriteLine(PathError);
retryName = 0;
NewNameCount = 0;
break;
}
else if (errorpath.Length <= 250)//checks that the new error log file name it's giving is less than the max characters for a file name in Windows
{
string ErrorPathName = ErrorPathNameMaster;
NewNameCount++;
retryName++;
ErrorPathName = ErrorPathNameMaster + Convert.ToString(NewNameCount);
Console.WriteLine("ErrorFile Name exits. Attempting new file name -" + ErrorPathName);
errorpath = #"D:\JobEditsSaved\" + ErrorPathName + ".txt";
}
else if (errorpath.Length > 250)//if it's more than the max characters, stop the numbers, reset them, then change part of the name and run again, allowing double the error logs
{
NewNameCount = 0;
retryName++;
NewNameCount++;
Unlikely = "JobNameError,ClearSomeErrorsOut" + Convert.ToString(NewNameCount);
errorpath = @"D:\JobEditsSaved\" + Unlikely + ".txt";
}
else //at this point there are over 500 errors that have occurred. Someone should have analysed it and seen a few and followed them up, never mind over 500. The only scenario I see where this could happen is a glitch in the code causing it to make way too many at once. Hence, call me.
{
string Jesus = "Contact me (Josh Simpson), error handling needs looking at in static void monitor directory";
File.WriteAllText(Aidpath, Jesus);
}
Console.WriteLine(Spacer);
//If everything goes wrong it will create errors; then, to allow it to continue running and not crash, it has to monitor somewhere.
//I'm making it monitor this path as it's the most common one I've seen so far.
path = @"C:\LuK\Master\Zeichnungsdaten\L-01026-0G20-04";
JobName = "L-01026-0G20-04";
fileSystemWatcher.Path = path;
}
}
catch (SystemException ex)
{
while (retryName < 5000)
{
if (!File.Exists(errorpath))
{
PathError = PathError + ex;
File.WriteAllText(errorpath, PathError);
Console.WriteLine(PathError);
retryName = 0;
NewNameCount = 0;
break;
}
else if (errorpath.Length <= 250)
{
string ErrorPathName = ErrorPathNameMaster;
NewNameCount++;
retryName++;
ErrorPathName = ErrorPathNameMaster + Convert.ToString(NewNameCount);
Console.WriteLine("ErrorFile Name exits. Attempting new file name -" + ErrorPathName);
errorpath = #"D:\JobEditsSaved\" + ErrorPathName + ".txt";
}
else if (errorpath.Length > 250)
{
NewNameCount = 0;
retryName++;
NewNameCount++;
Unlikely = "JobNameError,ClearSomeErrorsOut" + Convert.ToString(NewNameCount);
errorpath = @"D:\JobEditsSaved\" + Unlikely + ".txt";
Console.WriteLine("Errors need clearing - Still making logs fine though");
}
else
{
Console.WriteLine("What manner of thing hath gone wrong, I don't actually expect any of these statements to ever be called.\r\n Is there a new job thats not been added? Or format of dnoconfig??");
string Jesus = "Contact me (Josh Simpson), error handling needs looking at in static void monitor directory";
File.WriteAllText(Aidpath, Jesus);
}
}
Console.WriteLine(Spacer);
//If everything goes wrong it will create errors; then, to allow it to continue running and not crash, it has to monitor somewhere.
//I'm making it monitor this path as it's the most common one I've seen so far.
path = @"C:\LuK\Master\Zeichnungsdaten\L-01026-0G20-04";
JobName = "L-01026-0G20-04";
fileSystemWatcher.Path = path;
}
//Allows monitoring of subdirectories - not needed as there shouldn't be any:
fileSystemWatcher.IncludeSubdirectories = true;
//Declaring the filters; I don't really know why it's needed for the monitoring of changes, creation, deletion etc., but it is.
fileSystemWatcher.NotifyFilter = NotifyFilters.LastAccess | NotifyFilters.LastWrite
| NotifyFilters.FileName | NotifyFilters.DirectoryName;
//Calling the SystemWatcherFunctions
fileSystemWatcher.Changed += FileSystemWatcher_Changed;
fileSystemWatcher.Error += new ErrorEventHandler(OnError);
Console.WriteLine("This is monitoring {0}{1}", path, Spacer);
//enables the monitoring
fileSystemWatcher.EnableRaisingEvents = true;
}
public static void FileSystemWatcher_Changed(object sender, FileSystemEventArgs e)
{
//This is for the sizes and file info. We are grabbing the results but also creating a converted version so I can display MB for large files with accuracy
/*var info = new FileInfo(e.FullPath);
var size = Convert.ToInt64(0);
//declaring to be used, double allows decimals. 64bitInt does not.
double MBSIZE = 0;
//This is to avoid potential error. Ensures the info we are grabbing is in the direct directory and no sub ones.
if ((info.Attributes & FileAttributes.Directory) != FileAttributes.Directory)
{
size = info.Length;
}
//This is for making it into megabytes for unnecessary user friendliness
if (size > 2000)
{
MBSIZE = Convert.ToDouble(size);
MBSIZE = MBSIZE / 1000000;
Console.WriteLine("File Changed/Edited/saved: {0}\r\nwas modified on {1}\r\nit is {2} Megabytes in size\r\nIs was originally created on {3}", e.Name, modification, MBSIZE, creation, Spacer);
}
else
*///{
Console.WriteLine("File Changed/Edited/saved: {0}\r\nwas modified on {1}\r\nIs was originally created on {2}{3} ", e.Name, modification, creation, Spacer);
//}
//The changed Files Name, then its location, then the date it was edited, then we take that date and convert to integer string, then remove all / as not to mess with directories.
FileNameStore = e.Name;
FilesPath = e.FullPath;
DateTime FilesLastWrite = File.GetLastAccessTime(FilesPath);
string FilesLastDone = FilesLastWrite.ToString("dd/MM/yyyy-HH:mm:ss");
FilesLastDone = FilesLastDone.Replace("/", "-").Replace(":", ".");
//This will be the files name when saved, this is a combo of when It was edited and its original name
string NewFileName = FilesLastDone + "_" + JobName + "_" + FileNameStore;
string BackupLocation = @"D:\JobEditsSaved\";
PasteLocation = BackupLocation + NewFileName;
//ConsoleWrites to check everything passing what it should be
Console.WriteLine("\r\n -- NewFileName:{0}\r\n -- FilesPath:{1}\r\n -- FileNameStore:{2}\r\n -- FilesLastDone:{3}\r\n -- PasteLocation:{4}", NewFileName, FilesPath, FileNameStore, FilesLastDone, PasteLocation);
if (CanRunTime)
{
CanRunTime = false;
t.Interval = 2000;
t.Elapsed += new ElapsedEventHandler(t_Elapsed);
t.Start();
}
}
I have an issue with files.
I am building an image importer: clients put their files on an FTP server and then they can import them in the application.
During the import process I copy the file from the FTP folder to another folder with File.Copy.
public List<Visuel> ImportVisuel(int galerieId, string[] images)
{
Galerie targetGalerie = MemoryCache.GetGaleriById(galerieId);
List<FormatImage> listeFormats = MemoryCache.FormatImageToList();
int i = 0;
List<Visuel> visuelAddList = new List<Visuel>();
List<Visuel> visuelUpdateList = new List<Visuel>();
List<Visuel> returnList = new List<Visuel>();
foreach (string item in images)
{
i++;
Progress.ImportProgress[Progress.Guid] = "Image " + i + " sur " + images.Count() + " importées";
string extension = Path.GetExtension(item);
string fileName = Path.GetFileName(item);
string originalPath = HttpContext.Current.Request.PhysicalApplicationPath + "Uploads\\";
string destinationPath = HttpContext.Current.Server.MapPath("~/Images/Catalogue") + "\\";
Visuel importImage = MemoryCache.GetVisuelByFilName(fileName);
bool update = true;
if (importImage == null) { importImage = new Visuel(); update = false; }
Size imageSize = importImage.GetJpegImageSize(originalPath + fileName);
FormatImage format = listeFormats.Where(f => f.width == imageSize.Width && f.height == imageSize.Height).FirstOrDefault();
string saveFileName = Guid.NewGuid() + extension;
File.Copy(originalPath + fileName, destinationPath + saveFileName);
if (format != null)
{
importImage.format = format;
switch (format.key)
{
case "Catalogue":
importImage.fileName = saveFileName;
importImage.originalFileName = fileName;
importImage.dossier = targetGalerie;
importImage.dossier_id = targetGalerie.id;
importImage.filePath = "Images/Catalogue/";
importImage.largeur = imageSize.Width;
importImage.hauteur = imageSize.Height;
importImage.isRoot = true;
if (update == false) { MemoryCache.Add(ref importImage); returnList.Add(importImage); }
if (update == true) visuelUpdateList.Add(importImage);
foreach (FormatImage f in listeFormats)
{
if (f.key.StartsWith("Catalogue_"))
{
string[] keys = f.key.Split('_');
string destinationFileName = saveFileName.Insert(saveFileName.IndexOf('.'), "-" + keys[1].ToString());
string destinationFileNameDeclinaison = destinationPath + destinationFileName;
VisuelResizer declinaison = new VisuelResizer();
declinaison.Save(originalPath + fileName, f.width, f.height, 1000, destinationFileNameDeclinaison);
Visuel visuel = MemoryCache.GetVisuelByFilName(fileName.Insert(fileName.IndexOf('.'), "-" + keys[1].ToString()));
update = true;
if (visuel == null) { visuel = new Visuel(); update = false; }
visuel.parent = importImage;
visuel.filePath = "Images/Catalogue/";
visuel.fileName = destinationFileName;
visuel.originalFileName = string.Empty;
visuel.format = f;
//visuel.dossier = targetGalerie; we don't care about this for the derived versions
visuel.largeur = f.width;
visuel.hauteur = f.height;
if (update == false)
{
visuelAddList.Add(visuel);
}
else
{
visuelUpdateList.Add(visuel);
}
//importImage.declinaisons.Add(visuel);
}
}
break;
}
}
}
MemoryCache.Add(ref visuelAddList);
// Function still to be implemented
MemoryCache.Update(ref visuelUpdateList);
return returnList;
}
After some processing on the copy (the original file is no longer used), the client gets a pop-up asking whether they want to delete the original files in the FTP folder.
If they click OK, another method is called on the same controller, and this method uses:
public void DeleteImageFile(string[] files)
{
for (int i = 0; i < files.Length; i++)
{
File.Delete(HttpContext.Current.Request.PhysicalApplicationPath + files[i].Replace(@"/", @"\"));
}
}
This method works fine and really deletes the right files when I use it in other contexts.
But here I get an error message:
The process cannot access the file ... because it is being used by another process.
Does anyone have an idea?
Thank you.
Here's the screenshot of Process Explorer
There are a couple of things you can do here.
1) If you can reproduce it, use Process Explorer at that moment to see which process is locking the file; if it is your own process, make sure you close the file handle once your work is done.
2) Put a try/catch around the delete statement and retry after a few seconds to see whether the file handle has been released (see the sketch after this list).
3) If the deletion can happen offline, you can put the paths in a queue and do the deletion later.
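A rough sketch of option 2, assuming a helper (DeleteWithRetry is a hypothetical name) that simply retries the delete a few times before giving up:
    // requires System.IO and System.Threading
    private static void DeleteWithRetry(string path, int attempts)
    {
        for (int i = 0; i < attempts; i++)
        {
            try
            {
                File.Delete(path);
                return; // deleted successfully
            }
            catch (IOException)
            {
                // the file is still locked by another process; wait and try again
                Thread.Sleep(2000);
            }
        }
    }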
You can solve this by using C# locks. Just embed your code inside a lock statement and your threads will be safe and wait for each other to complete processing.
I found the solution:
in my import method, there is a call to this method:
public void Save(string originalFile, int maxWidth, int maxHeight, int quality, string filePath)
{
Bitmap image = new Bitmap(originalFile);
Save(ref image, maxWidth, maxHeight, quality, filePath);
}
The Bitmap keeps the file open, which blocks the delete.
just added
image.Dispose();
in the method and it works fine.
Thank you for your help, and thank you for Process Explorer. Very useful tool.
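As an aside, a try/finally (or a using block, if the ref parameter is dropped) releases the file even when the resize throws. A sketch of the same Save method under the assumption that the inner Save overload does not itself dispose or replace the bitmap:
    public void Save(string originalFile, int maxWidth, int maxHeight, int quality, string filePath)
    {
        Bitmap image = new Bitmap(originalFile);
        try
        {
            Save(ref image, maxWidth, maxHeight, quality, filePath);
        }
        finally
        {
            // always release the handle the Bitmap keeps on originalFile
            image.Dispose();
        }
    }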
I have 10 txt files in Debug\Tests\Text\. I need to write a program to open all 10 files and update every single one, and I'm not sure how to do it. Right now I'm reading the folder, getting the file names and storing them in an array. Below is my code:
private void getFilesName()
{
string[] fileArray = Directory.GetFiles(@"Tests\Text");
//looping through the folder and getting the file names
for (int i = 0; i<fileArray.Length; i++)
{
MessageBox.Show(fileArray[i]); // I'm doing this to double-check that I manage to get the file names.
}
}
After doing this, it does read all the text file names, but the challenge now is accessing each filename and updating every file. I have also created another method just for updating the values in the txt files; below is the code:
private bool modifySQLFile()
{
string destFileName = @"Tests\Text\"; // I need the fileName?
string[] fileTexts = File.ReadAllLines(destFileName);
int counter = 0;
//Processing the File
foreach(string line in fileTexts)
{
//only read those non-comments line
if(line.StartsWith("--") == false)
{
//Start to replace instances of Access ID
if(line.Contains(Variable) == true)
{
fileTexts[counter] = fileTexts[counter].Replace(Variable, textBox2.Text);
}
}
counter++;
}
//check if file exists in the backup folder
if(File.Exists("Tests\\Text\\file name "+ textBox1.Text +".sql") == true)
{
MessageBox.Show("This file already exist in the backup folder");
return false;
}
else
{
//update the file
File.WriteAllLines(destFileName, fileTexts);
File.Move(destFileName, "Tests\\Text\\file name"+ textBox1.Text +".sql");
MessageBox.Show("Completed");
return true;
}
}
Your problem seems to be passing the filename variable from the loop to the method.
In order to do what you want, add a parameter to the method:
private bool ModifySQLFile(string filename)
{
string[] fileTexts = File.ReadAllLines(filename);
// ...
}
Then call the method with this parameter:
for (int i = 0; i<fileArray.Length; i++)
{
ModifySQLFile(fileArray[i]);
}
But in general you really don't want to treat a formal language as plain text like you do. It's very easy to break the SQL that way. What if the user wanted to replace the text "insert", or replace something with "foo'bar"?
First, implement one (file) modification:
private bool modifySQLFile(String file) {
// given source file, let´s elaborate target file name
String targetFile = Path.Combine(
Path.GetDirectoryName(file),
String.Format("{0}{1}.sql",
Path.GetFileNameWithoutExtension(file),
textBox1.Text));
// In case you want a back up
//TODO: given source file name, elaborate back up file name
//String backUpFile = Path.Combine(...);
// Check (validate) before processing: do not override existing files
if (File.Exists(targetFile))
return false;
//TODO: what if back up file exists? Should we override it? skip?
// if line doesn't start with SQL commentary --
// and contains a variable, substitute the variable with its value
var target = File
.ReadLines(file)
.Select(line => (!line.StartsWith("--") && line.Contains(Variable))
? line.Replace(Variable, textBox2.Text)
: line);
// write modified above lines into file
File.WriteAllLines(targetFile, target);
// In case you want a back up
// Move file to backup
//File.Move(file, backUpFile);
return true;
}
Then call it in the loop:
// enumerate all the text files in the directory
var files = Directory
.EnumerateFiles("#"Tests\Text", "*.txt");
//TODO: you may want filter out some files with .Where
//.Where(file => ...);
// update all the files found above
foreach (var file in files) {
if (!modifySQLFile(file))
MessageBox.Show(String.Format("{0} already exist in the backup folder", file));
}
Please, do not:
Use magic values: what is @"Tests\Text\" within your modifySQLFile?
Mix UI (MessageBox.Show(...)) and logic: modifySQLFile should return true or false, and it's the caller who can display the message box.
Materialize when it's not required (Directory.GetFiles, File.ReadAllLines).
If you would like to edit the files in parallel, you can parallelize the work with threads.
for (int i = 0; i < fileArray.Length; i++)
new Thread(UpdateFileThread).Start(fileArray[i]);
private void UpdateFileThread(object path)
{
string filePath = (string)path;
//ToDo: Edit file
}
In your case you would create 10 Threads. That solution works, but is a bad pattern if you have to deal with more than 10 files.
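If the number of files grows, Parallel.ForEach (System.Threading.Tasks, .NET 4 and later) is usually a better fit than one thread per file, because it schedules the work over a bounded pool of threads. A minimal sketch, assuming an UpdateFile method that edits a single file:
    // using System.Threading.Tasks;
    Parallel.ForEach(fileArray, filePath =>
    {
        // edit one file; exceptions thrown here are aggregated and rethrown by Parallel.ForEach
        UpdateFile(filePath);
    });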
Below I have posted real code which I have used in a project.
protected void btnSqlfinder_Click(object sender, EventArgs e)
{
//Defining the path of directory where all files saved
string filepath = @"D:\TPMS\App_Code\";
//get the all file names inside the directory
string[] files = Directory.GetFiles(filepath);
//loop through the files to search file one by one
for (int i = 0; i < files.Length; i++)
{
string sourcefilename = files[i];
StreamReader sr = File.OpenText(sourcefilename);
string sourceline = "";
int lineno = 0;
while ((sourceline = sr.ReadLine()) != null)
{
lineno++;
//defining the Keyword for search
if (sourceline.Contains("from"))
{
//append the result to multiline text box
TxtResult.Text += sourcefilename + lineno.ToString() + sourceline + System.Environment.NewLine;
}
if (sourceline.Contains("into"))
{
TxtResult.Text += sourcefilename + lineno.ToString() + sourceline + System.Environment.NewLine;
}
if (sourceline.Contains("set"))
{
TxtResult.Text += sourcefilename + lineno.ToString() + sourceline + System.Environment.NewLine;
}
if (sourceline.Contains("delete"))
{
TxtResult.Text += sourcefilename + lineno.ToString() + sourceline + System.Environment.NewLine;
}
}
sr.Close(); // release the file before moving on to the next one
}
}
This code will fetch the files in the given directory and show the lines matching each keyword in a separate textbox.
You can easily change it as per your requirements. Kindly let me know your thoughts.
Thanks
I am using the code below to start at a path (root) provided by a GET variable, recursively go into every sub folder and display its contents as list items. The path I'm using has about 3800 files and 375 sub folders. It takes about 45 seconds to render the page; is there any way I can cut this time down, as this is unacceptable for my users?
string output;
protected void Page_Load(object sender, EventArgs e) {
getDirectoryTree(Request.QueryString["path"]);
itemWrapper.InnerHtml = output;
}
private void getDirectoryTree(string dirPath) {
try {
System.IO.DirectoryInfo rootDirectory = new System.IO.DirectoryInfo(dirPath);
foreach (System.IO.DirectoryInfo subDirectory in rootDirectory.GetDirectories()) {
output = output + "<ul><li><a>" + Regex.Replace(subDirectory.Name, "_", " ");
if (subDirectory.GetFiles().Length != 0 || subDirectory.GetDirectories().Length != 0) {
output = output + " +</a>";
} else {
output = output + "</a>";
}
getDirectoryTree(subDirectory.FullName);
if (subDirectory.GetFiles().Length != 0) {
output = output + "<ul>";
foreach (System.IO.FileInfo file in subDirectory.GetFiles()) {
output = output + "<li><a href='" + file.FullName + "'>" + file.Name + "</a></li>";
}
output = output + "</ul>";
}
output = output + "</li></ul>";
}
} catch (System.UnauthorizedAccessException) {
//This throws when we don't have access.
}
}
You should use System.Text.StringBuilder (good performance) instead of string concatenation (strings are immutable, so repeated concatenation performs badly), as sketched below.
You should also use the plain string Replace, since you are not doing a complex search: subDirectory.Name.Replace("_", " ");
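A rough sketch of what the StringBuilder change looks like in getDirectoryTree, assuming output becomes a StringBuilder field and is rendered with ToString() in Page_Load:
    StringBuilder output = new StringBuilder();
    private void getDirectoryTree(string dirPath) {
        System.IO.DirectoryInfo rootDirectory = new System.IO.DirectoryInfo(dirPath);
        foreach (System.IO.DirectoryInfo subDirectory in rootDirectory.GetDirectories()) {
            // Append mutates the builder instead of allocating a new string on every +=
            output.Append("<ul><li><a>");
            output.Append(subDirectory.Name.Replace("_", " "));
            // ... rest of the method unchanged, with output.Append(...) instead of output = output + ...
        }
    }
    // in Page_Load:
    itemWrapper.InnerHtml = output.ToString();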
The main reason for slowness in your code is most likely the multiple calls to GetFiles and GetDirectories. You are calling them over and over again in the if conditions as well as in your initial lookups, when you only need the counts once. Also, concatenating strings isn't helping the cause.
The following code was able to run through my simple USB drive in 300 ms and return over 400 folders and 11000 files. On a slow network drive, it returned in 9 seconds for 4000 files in 300 folders. It can probably be further optimized with Parallel.ForEach during the recursion.
protected void Page_Load(object sender, EventArgs e) {
itemWrapper.InnerHtml = GetDirectory(Request.QueryString["path"]);
}
static string GetDirectory(string path)
{
StringBuilder output = new StringBuilder();
var subdir = System.IO.Directory.GetDirectories(path);
var files = System.IO.Directory.GetFiles(path);
output.Append("<ul><li><a>");
output.Append(path.Replace("_", " "));
output.Append(subdir.Length > 0 || files.Length > 0 ? "+</a>" : "</a>");
foreach(var sb in subdir)
{
output.Append(GetDirectory(sb));
}
if (files.Length > 0)
{
output.Append("<ul>");
foreach (var file in files)
{
output.AppendFormat("<li>{1}</li>", file, System.IO.Path.GetFileName(file));
}
output.Append("</ul>");
}
output.Append("</ul>");
return output.ToString();
}
Can someone tell me what is going to happen in this code when an error is encountered? Ideally it should continue the foreach statement until it gets to the last record, but I suspect it's stopping in the middle of the operation because when I check the number of files moved it's off by 225. If it is in fact stopping because of an error, what can I do to make it continue the loop?
I'm creating a new upload manager for our software and need to clean the old files up. There are about 715 orphaned files totaling around 750 MB after a year and a half of use, because the original developers didn't write the code to correctly overwrite old files when a new one was uploaded. They also saved the files in a single directory. I can't stand that, so I'm moving all of the files into a structure - Vessel Name - ServiceRequestID - files uploaded for that service. I'm also giving the users a gridview to view and delete files they no longer need as they work the service.
protected void Button1_Click(object sender, EventArgs e)
{
GridViewRow[] rowArray = new GridViewRow[gv_Files.Rows.Count];
gv_Files.Rows.CopyTo(rowArray, 0);
int i = -1;
foreach(GridViewRow row in rowArray)
{
i++;
string _serviceRequestID = ((Label)gv_Files.Rows[row.RowIndex].FindControl("lbl_SRID")).Text;
string _vesselName = ((Label)gv_Files.Rows[row.RowIndex].FindControl("lbl_VesselID")).Text;
string _uploadDIR = Server.MapPath("uploadedFiles");
string _vesselDIR = Server.MapPath("uploadedFiles" + "\\" + _vesselName);
string _fileName = ((Label)gv_Files.Rows[row.RowIndex].FindControl("lbl_FileName")).Text;
DirectoryInfo dInfo = new DirectoryInfo(_uploadDIR);
DirectoryInfo dVessel = new DirectoryInfo(_vesselDIR);
DirectoryInfo dSRID = new DirectoryInfo(_serviceRequestID);
dInfo.CreateSubdirectory(_vesselName);
dVessel.CreateSubdirectory(_serviceRequestID);
string _originalFile = _uploadDIR + "\\" + _fileName;
string _fileFullPath = Path.Combine(Server.MapPath("uploadedFiles/" + _vesselName + "/" + _serviceRequestID + "/"), _fileName);
FileInfo NewFile = new FileInfo(_fileFullPath);
string _fileUploadPath = _vesselName + "/" + _serviceRequestID + "/" + _fileName;
string sourceFile = _originalFile;
FileInfo _source = new FileInfo(sourceFile);
string destinationFile = _fileFullPath;
try
{
{
File.Move(sourceFile, destinationFile);
movefiles.InsertNewUploadPath(Convert.ToDecimal(_serviceRequestID), 1, _fileUploadPath);
}
}
catch (Exception ex)
{
CreateLogFiles Err = new CreateLogFiles();
Err.ErrorLog(Server.MapPath("Logs/ErrorLog"), ex.Message);
}
}
_utility.MessageBox("Completed processing files.");
}
As long as the error occurs within the try/catch block, the code will continue executing the foreach loop. However, if the error occurs outside the try/catch, the function will exit and throw. How many files does your error log report?
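One way to see where the missing moves go is to count successes and failures around the existing try/catch, so the loop keeps running but the totals become visible at the end. A small sketch of just the inner try/catch from the loop above, with moved and failed counters (declared as ints before the foreach) added for illustration:
    try
    {
        File.Move(sourceFile, destinationFile);
        movefiles.InsertNewUploadPath(Convert.ToDecimal(_serviceRequestID), 1, _fileUploadPath);
        moved++;
    }
    catch (Exception ex)
    {
        failed++; // the exception is swallowed here, so the foreach continues with the next row
        CreateLogFiles Err = new CreateLogFiles();
        Err.ErrorLog(Server.MapPath("Logs/ErrorLog"), ex.Message);
    }
    // after the loop:
    _utility.MessageBox("Completed processing files. Moved: " + moved + ", failed: " + failed);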