Throw exception if more than one file in a folder - c#

Below is the snippet of code I'm using to count the files in a folder (just files, not additional folders). If there is more than one file in this folder I need to throw an exception.
private bool CheckCondition2(String FolderName)
{
bool ConditionPassed = false;
System.IO.DirectoryInfo dir = new System.IO.DirectoryInfo(FolderName);
int count = dir.GetFiles().Length;
ConditionPassed = (count > 1);
return ConditionPassed;
}
I then call it in main with:
if (!CheckCondition2(SourceFolder))
{
CanCopy = false;
throw new Exception("More than one mark-off file.");
}
Currently when I test it, it tells me there is more than one file in the directory despite there only being one. What have I done wrong in my code?

In your method, you return true if there is more than one file.
In your if-statement, however, you check for false. You seem to have mixed these up a little.
It's always a good idea to debug your code and follow the value as it changes to see whether you have any logic errors. A more automatic and reliable way to do this is, of course, to write a unit test.
You could switch the condition in your method, to be
ConditionPassed = (count <= 1);
That way, the method returns true when you're in a 'correct' state. You could instead change the if-statement to read
if (CheckCondition2(SourceFolder))
Either would probably work for you. In the latter example, I would also suggest changing the name of the method to something like HasMoreThanOneFile to make it abundantly obvious what it does.

Try this:
ConditionPassed = (count <= 1); //check should pass if there is at most one file

Either change the condition
ConditionPassed = (count <= 1);
or the if statement
if (CheckCondition2(SourceFolder))
I presume your success scenario is to have at most one file in the source folder.

Instead of using a bool, why not try...
System.IO.DirectoryInfo dir = new System.IO.DirectoryInfo(FolderName);
int count = dir.GetFiles().Length;
if (count > 1)
{
throw new Exception("More than one mark-off file.");
}
else
{
// Something else
}
It's a bit neater code (sorry, the OCD is kicking in!)

I think there's a flaw in your logic. ConditionPassed = (count > 1) should be ConditionPassed = (count <= 1). Hope this helps!

Please try debugging your code.
It worked after making a small change in the if statement; it should have been if (CheckCondition2(SourceFolder)).
Try updating the name of the function to avoid confusion.
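Putting the suggestions together, a minimal sketch of the renamed method might look like the following. This assumes using System.IO; and using System.Linq;, and uses EnumerateFiles, which is available from .NET 4.0 on.
private static bool HasMoreThanOneFile(string folderName)
{
    // EnumerateFiles streams results, so we can stop as soon as a second file shows up.
    var dir = new DirectoryInfo(folderName);
    return dir.EnumerateFiles().Skip(1).Any();
}

// In the caller:
if (HasMoreThanOneFile(SourceFolder))
{
    CanCopy = false;
    throw new Exception("More than one mark-off file.");
}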

Related

"The process cannot access the file because it is being used by another process." with SystemReader

I have no coding experience, but I have been trying to fix a broken program from many years ago. I've been fumbling through fixing things but have stumbled upon a piece that I can't fix. From what I've gathered, you get Alexa to append a Dropbox file, and the program reads that file looking for the change and, depending on what it is, executes a certain command based on a customizable list in an XML document.
I've gotten this to work about five times in the hundreds of attempts I've made; every other time it will crash and Visual Studio gives me: "System.IO.IOException: 'The process cannot access the file 'C:\Users\\"User"\Dropbox\controlcomputer\controlfile.txt' because it is being used by another process.'"
This is the file that Dropbox appends, and this only happens when I append the file; otherwise, the program works fine and I can navigate it.
I believe this is the code that handles it, as this is the only mention of StreamReader in all of the code:
public static void launchTaskControlFile(string path)
{
int num = 0;
StreamReader streamReader = new StreamReader(path);
string str = "";
while (true)
{
string str1 = streamReader.ReadLine();
string str2 = str1;
if (str1 == null)
{
break;
}
str = str2.TrimStart(new char[] { '#' });
num++;
}
streamReader.Close();
if (str.Contains("Google"))
{
MainWindow.googleSearch(str);
}
else if (str.Contains("LockDown") && Settings.Default.lockdownEnabled)
{
MainWindow.executeLock();
}
else if (str.Contains("Shutdown") && Settings.Default.shutdownEnabled)
{
MainWindow.executeShutdown();
}
else if (str.Contains("Restart") && Settings.Default.restartEnabled)
{
MainWindow.executeRestart();
}
else if (!str.Contains("Password"))
{
MainWindow.launchApplication(str);
}
else
{
SendKeys.SendWait(" ");
Thread.Sleep(500);
string str3 = "potato";
for (int i = 0; i < str3.Length; i++)
{
SendKeys.SendWait(str3[i].ToString());
}
}
Console.ReadLine();
}
I've searched online but have no idea how I could apply anything I've found to this. Once again, before working on this I had no coding experience, so act like you're talking to a toddler.
Sorry if anything I added here is unnecessary; I'm just trying to be thorough. Any help would be appreciated.
I set up a try/delay pattern like Adriano Repetti said and it seems to be working. However, doing that flat out would only stop it from crashing, so I had to add a loop around it and set the loop to stop when a variable hit 1, which happens whenever any command type is triggered. That takes it out of the loop and sets the integer back to 0, triggering the loop again. That seems to be working now.
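For reference, a minimal sketch of such a try/delay pattern is shown below. The helper name ReadLastLine, the attempt count, and the 500 ms delay are illustrative assumptions, not values from the original program; it needs using System.IO; and using System.Threading;.
private static string ReadLastLine(string path)
{
    for (int attempt = 1; ; attempt++)
    {
        try
        {
            // Same idea as launchTaskControlFile: keep the last line, stripped of leading '#'.
            string last = "";
            using (var reader = new StreamReader(path))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    last = line.TrimStart('#');
                }
            }
            return last;
        }
        catch (IOException)
        {
            if (attempt >= 5) throw;   // give up after a few tries
            Thread.Sleep(500);         // wait for the other process to release the file
        }
    }
}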

Why does my file sometimes disappear in the process of reading from it or writing to it?

I have an app that reads from text files to determine which reports should be generated. It works as it should most of the time, but once in a while, the program deletes one of the text files it reads from/writes to. Then an exception is thrown ("Could not find file") and progress ceases.
Here is some pertinent code.
First, reading from the file:
List<String> delPerfRecords = ReadFileContents(DelPerfFile);
. . .
private static List<String> ReadFileContents(string fileName)
{
List<String> fileContents = new List<string>();
try
{
fileContents = File.ReadAllLines(fileName).ToList();
}
catch (Exception ex)
{
RoboReporterConstsAndUtils.HandleException(ex);
}
return fileContents;
}
Then, writing to the file -- it marks the record/line in that file as having been processed, so that the same report is not re-generated the next time the file is examined:
MarkAsProcessed(DelPerfFile, qrRecord);
. . .
private static void MarkAsProcessed(string fileToUpdate, string qrRecord)
{
try
{
var fileContents = File.ReadAllLines(fileToUpdate).ToList();
for (int i = 0; i < fileContents.Count; i++)
{
if (fileContents[i] == qrRecord)
{
fileContents[i] = string.Format("{0}{1} {2}", qrRecord, RoboReporterConstsAndUtils.COMPLETED_FLAG, DateTime.Now);
}
}
// Will this automatically overwrite the existing?
File.Delete(fileToUpdate);
File.WriteAllLines(fileToUpdate, fileContents);
}
catch (Exception ex)
{
RoboReporterConstsAndUtils.HandleException(ex);
}
}
So I do delete the file, but immediately replace it:
File.Delete(fileToUpdate);
File.WriteAllLines(fileToUpdate, fileContents);
The files being read have contents such as this:
Opas,20170110,20161127,20161231-COMPLETED 1/10/2017 12:33:27 AM
Opas,20170209,20170101,20170128-COMPLETED 2/9/2017 11:26:04 AM
Opas,20170309,20170129,20170225-COMPLETED
Opas,20170409,20170226,20170401
If "-COMPLETED" appears at the end of the record/row/line, it is ignored - will not be processed.
Also, if the second element (at index 1) is a date in the future, it will not be processed (yet).
So, for these examples shown above, the first three have already been done and will be subsequently ignored. The fourth one will not be acted on until on or after April 9th, 2017 (at which time the data within the date range of the last two dates will be retrieved).
Why is the file sometimes deleted? What can I do to prevent it from ever happening?
If helpful, in more context, the logic is like so:
internal static string GenerateAndSaveDelPerfReports()
{
string allUnitsProcessed = String.Empty;
bool success = false;
try
{
List<String> delPerfRecords = ReadFileContents(DelPerfFile);
List<QueuedReports> qrList = new List<QueuedReports>();
foreach (string qrRecord in delPerfRecords)
{
var qr = ConvertCRVRecordToQueuedReport(qrRecord);
// Rows that have already been processed return null
if (null == qr) continue;
// If the report has not yet been run, and it is due, add it to the list
if (qr.DateToGenerate <= DateTime.Today)
{
var unit = qr.Unit;
qrList.Add(qr);
MarkAsProcessed(DelPerfFile, qrRecord);
if (String.IsNullOrWhiteSpace(allUnitsProcessed))
{
allUnitsProcessed = unit;
}
else if (!allUnitsProcessed.Contains(unit))
{
allUnitsProcessed = allUnitsProcessed + " and " + unit;
}
}
}
foreach (QueuedReports qrs in qrList)
{
GenerateAndSaveDelPerfReport(qrs);
success = true;
}
}
catch
{
success = false;
}
if (success)
{
return String.Format("Delivery Performance report[s] generated for {0} by RoboReporter2017", allUnitsProcessed);
}
return String.Empty;
}
How can I ironclad this code to prevent the files from being periodically trashed?
UPDATE
I can't really test this, because the problem occurs so infrequently, but I wonder if adding a "pause" between the File.Delete() and the File.WriteAllLines() would solve the problem?
UPDATE 2
I'm not absolutely sure what the answer to my question is, so I won't add this as an answer, but my guess is that the File.Delete() and File.WriteAllLines() were occurring too close together and so the delete was sometimes occurring on both the old and the new copy of the file.
If so, a pause between the two calls may have solved the problem 99.42% of the time, but from what I found here, it seems the File.Delete() is redundant/superfluous anyway, and so I tested with the File.Delete() commented out, and it worked fine; so, I'm just doing without that occasionally problematic call now. I expect that to solve the issue.
// Will this automatically overwrite the existing?
File.Delete(fileToUpdate);
File.WriteAllLines(fileToUpdate, fileContents);
I would simply add an extra parameter to WriteAllLines() (which could default to false) to tell the function to open the file in overwrite mode, and not call File.Delete() at all then.
Do you currently check the return value of the file open?
Update: OK, it looks like WriteAllLines() is a .NET Framework function and therefore cannot be changed, so I deleted this answer. However, this now shows up in the comments as a proposed solution on another forum:
"just use something like File.WriteAllText where if the file exists,
the data is just overwritten, if the file does not exist it will be
created."
And this was exactly what I meant (while thinking WriteAllLines() was a user-defined function), because I've had similar problems in the past.
So, a solution like that could solve some tricky problems (instead of deleting and quickly recreating the file, just overwrite it): also less work for the OS, and possibly less file/disk fragmentation.
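For what it's worth, a minimal sketch of the write step without the Delete call (same fileToUpdate and fileContents as in the question):
// WriteAllLines creates the file if it does not exist and overwrites it if it does,
// so the preceding File.Delete() is unnecessary:
File.WriteAllLines(fileToUpdate, fileContents);

// Or, as in the quoted suggestion, with WriteAllText:
// File.WriteAllText(fileToUpdate, string.Join(Environment.NewLine, fileContents));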

Cosmos custom OS, addmapping?

I am new to C# and am currently using COSMOS to make a simple file system for my OS class. Currently I'm trying to implement a "reformat" function so that, when the word "reformat" is typed into the console, the OS (emulated via QEMU) partitions the disk. Currently this is my code:
public static void console()
{
while (true)
{
Console.WriteLine("Console: ");
String input = Console.ReadLine();
if (input == "exit")
{
Cosmos.Sys.Deboot.ShutDown();
}
else if (input == "cpumem")
{
Console.WriteLine(Cosmos.Kernel.CPU.AmountOfMemory.ToString());
}
else if (input == "restart")
{
Cosmos.Sys.Deboot.Reboot();
}
else if (input == "devices")
{
var devices = Cosmos.Sys.FileSystem.Disk.Devices.ToArray();
}
else if (input == "reformat")
{
try
{
Partition part = null;
for (int j = 0; j < Cosmos.Hardware.BlockDevice.Devices.Count; j++)
{
if (Cosmos.Hardware.BlockDevice.Devices[j] is Partition)
{
part = (Partition)Cosmos.Hardware.BlockDevice.Devices[j];
}
}
var fs = new Cosmos.Sys.FileSystem.FAT32.FAT32(part);
uint cluster = 100;
fs.Format("newCluster", cluster);
}
catch
{
//Do Something warn user.
}
}
}
}
Most important is this bit:
else if (input == "reformat")
{
try
{
Partition part = null;
for (int j = 0; j < Cosmos.Hardware.BlockDevice.Devices.Count; j++)
{
if (Cosmos.Hardware.BlockDevice.Devices[j] is Partition)
{
part = (Partition)Cosmos.Hardware.BlockDevice.Devices[j];
}
}
var fs = new Cosmos.Sys.FileSystem.FAT32.FAT32(part);
uint cluster = 100;
fs.Format("newCluster", cluster);
}
catch
{
//Do Something warn user.
}
}
Which is analogous to what is located here: http://cosmos-tutorials.webs.com/atafat.html
However, when I run it, I get this error:
I believe this is because I lack this line:
Cosmos.System.Filesystem.FileSystem.AddMapping("C", FATFS);
FATFileList = FATFS.GetRoot();
Located in the link above. Is there any other way to map? Or am I missing something completely? The COSMOS documentation doesn't really tell much, the source code is honestly confusing for a beginner like me as it has no comments whatsoever on how the functions work or what they do. I am using an older version of COSMOS (Milestone 4) as it's the only one that works for Visual Studio C# 2008. Newer versions run only in Visual Studio C# 2010.
Ah, I recognize this... had to debug a similar situation on a Cosmos project I'm working on myself (I'm using the VS2010-compatible Cosmos but the same situation might apply to older versions as well...)
This can happen if you try to call a method on a null object. Type 0x........, Method 0x........ is specifically mentioning the location in the compiled code where the call failed. "Not FOUND!" means that the method it is looking for cannot be found, presumably because you called it on a null reference.
I'm testing with VirtualBox myself, and found that if you're using a brand-new blank hard disk image, there will be no Partitions on it. Thus, the condition will never get satisfied, your Partition will never get set and then Cosmos will try to execute a method on the null Partition!
Look closely at how you set the Partition (it's initialized to null). For starters I would print a simple message each time the "if (block device is partition)" condition is satisfied... I would be willing to bet it will never print.
Hope this helps... I am still learning about Cosmos and custom kernels myself but fixing the null reference in my case solved my occurrence of the problem. If that's the problem, then the next step, of course, is figuring out why you're not getting any Partitions in the first place...
The rest of your code looks fine but I am not sure how you implemented the rest of your classes. Kernel debugging can be a nightmare, good luck to you!
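If it helps, here is a sketch of such a guard, reusing only names that already appear in the question (this is not the Cosmos API beyond what is shown above):
// After the loop that searches BlockDevice.Devices for a Partition:
if (part == null)
{
    // A brand-new blank disk image has no partitions, so warn instead of
    // passing a null reference to the FAT32 constructor.
    Console.WriteLine("No partition found - create a partitioned disk image first.");
}
else
{
    var fs = new Cosmos.Sys.FileSystem.FAT32.FAT32(part);
    uint cluster = 100;
    fs.Format("newCluster", cluster);
}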

Check if DirectoryInfo.FullName is special folder

My goal is to check whether DirectoryInfo.FullName is one of the special folders.
Here is what I'm doing for this (comparing directoryInfo.FullName to each special folder to see if they are equal):
DirectoryInfo directoryInfo = new DirectoryInfo("Directory path");
if (directoryInfo.FullName == Environment.GetFolderPath(Environment.SpecialFolder.Windows) ||
directoryInfo.FullName == Environment.GetFolderPath(Environment.SpecialFolder.ProgramFiles) ||
...
...
)
{
// directoryInfo is the special folder
}
But there are many special folders (Cookies, ApplicationData, InternetCache, etc.). Is there any way to do this task more efficiently?
Thanks.
Try the following code:
bool result = false;
DirectoryInfo directoryInfo = new DirectoryInfo("Directory path");
foreach (Environment.SpecialFolder suit in Enum.GetValues(typeof(Environment.SpecialFolder)))
{
if (directoryInfo.FullName == Environment.GetFolderPath(suit))
{
result = true;
break;
}
}
if (result)
{
// Do what ever you want
}
Hope this helps.
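If LINQ is available, the same check can be written a bit more compactly (a sketch, assuming using System.Linq;):
bool result = Enum.GetValues(typeof(Environment.SpecialFolder))
    .Cast<Environment.SpecialFolder>()
    .Any(sf => directoryInfo.FullName == Environment.GetFolderPath(sf));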
I'm afraid the answers given seem to be the only way. I hate the special folders, because what ought to be a very simple function -
void CollectFiles(string strDir, string pattern) {
DirectoryInfo di = new DirectoryInfo(strDir);
foreach (FileInfo fi in di.GetFiles(pattern)) {
//store file data
}
foreach (DirectoryInfo diInfo in di.GetDirectories()) {
CollectFiles(diInfo.FullName, pattern);
}
}
Becomes ugly because you have to include
Check If This Is A Special Folder And Deal With It And Its Child Folders Differently ();
Fair enough, Microsoft, to have a folder that could exist anywhere: on a remote PC, on a server, etc. But really, what is wrong with the UNIX/Linux way: use links to folders, and if the destination physical folder has to move, alter the link. Then you can iterate them in a nice, neat function, treating them all as ordinary folders.
I don't have enough reputation to add a comment, so as a +1 to BobRassler's answer: string comparisons might be more useful.
bool isSpecialFolder = false;
DirectoryInfo directoryInfo = new DirectoryInfo(Path.Combine(tbx_FolderName.Text, fileName));
foreach (Environment.SpecialFolder specialFolder in Enum.GetValues(typeof(Environment.SpecialFolder)))
{
if (directoryInfo.FullName.ToLower() == Environment.GetFolderPath(specialFolder).ToLower())
{
isSpecialFolder = true;
break;
}
}
if (isSpecialFolder)
{
// something
}
else
{
// something else
}
Use reflection to get all values from that enum, as shown here: http://geekswithblogs.net/shahed/archive/2006/12/06/100427.aspx, and check against the collection of generated paths you get.
I ended up using it this way:
public static bool IsSpecialFolder(DirectoryInfo directoryInfo, out Environment.SpecialFolder? _specialFolder) {
bool isSpecialFolder = false;
_specialFolder = null;
string directoryInfo_FullPath = directoryInfo.FullName;
foreach (Environment.SpecialFolder specialFolder in Enum.GetValues(typeof(Environment.SpecialFolder))) {
var specialFolder_FullPath = Environment.GetFolderPath(specialFolder);
if (string.Equals(directoryInfo_FullPath, specialFolder_FullPath, StringComparison.OrdinalIgnoreCase)) {
isSpecialFolder = true;
_specialFolder = specialFolder;
break;
}
}
return isSpecialFolder;
}
If handling strings from dubious sources (the user :-) ), there are three caveats to keep in mind:
Path.Combine vs. Path.Join, since they handle absolute paths (or paths that look like absolute paths) differently.
Path.GetFullPath, which takes a string and produces the full, normalized version of it.
GetFolderPath can return an empty string, which generates a System.ArgumentException: 'The path is empty. (Parameter 'path')' when used for creating a DirectoryInfo.
I like to keep this logic outside the method, but I am not sure if the OrdinalIgnoreCase or any other normalization is still necessary. I guess not.
P.S.: I think in modern lingo the method should be called TrySpecialFolder or something :-)
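A usage sketch that applies those caveats, calling the IsSpecialFolder method above (the input path is just a placeholder, and Path.GetFullPath is used to normalize it before the comparison; needs using System.IO;):
string input = @"C:\Users\Public\Desktop";                 // placeholder path
var dirInfo = new DirectoryInfo(Path.GetFullPath(input));  // normalize first

Environment.SpecialFolder? match;
if (IsSpecialFolder(dirInfo, out match))
{
    Console.WriteLine("{0} is the special folder {1}", dirInfo.FullName, match);
}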

C# Unreachable code detected

I'm getting a "Unreachable code detected" message in Visual Studio at the point i++ in my code below. Can you spot what I've done wrong?
try
{
RegistryKey OurKey = Registry.CurrentUser;
OurKey.CreateSubKey("Software\\Resources\\Shared");
OurKey = OurKey.OpenSubKey("Software\\Resources\\Shared", true);
for (int i = 0; i < cmbPaths.Items.Count; i++) //<---- problem with i
{
OurKey.SetValue("paths" + i, cmbPaths.Items[i]);
break;
}
}
The problem is that this actually isn't a loop. You don't have any condition on the break so you could equivalently write something like
if(cmbPaths.Items.Count > 0)
{
OurKey.SetValue("paths" + 0, cmbPaths.Items[0]);
}
Alternatively, you have to correct it with something like
for (int i = 0; i < cmbPaths.Items.Count; i++)
{
OurKey.SetValue("paths" + i, cmbPaths.Items[i]);
if(someConditionHolds)
break;
}
You're breaking out of the loop before the end of the first iteration.
The problem is that because you break; in the loop with no chance of it doing anything else, the increment of i (i++) will never be reached.
Although your problem is solved, I need to tell you this:
you can just use the CreateSubKey() method for your purpose. I think it's a better choice.
:)
//Creates a new subkey or opens an existing subkey for write access.
var ourKey = Registry.CurrentUser.CreateSubKey("Software\\Resources\\Shared");
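Combining that with the loop fix above, a sketch of what the whole block might look like (assuming the intent is to write one registry value per combo-box item; needs using Microsoft.Win32;):
// CreateSubKey creates the key if needed and opens it for write access.
using (var ourKey = Registry.CurrentUser.CreateSubKey("Software\\Resources\\Shared"))
{
    for (int i = 0; i < cmbPaths.Items.Count; i++)
    {
        ourKey.SetValue("paths" + i, cmbPaths.Items[i]);
    }
}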
You can also end up getting unreachable code if you use, for example, Entity Framework and you didn't add that reference to the project.
Say you have several projects, like a data-layer project and a domain-classes project, and then you create a console app for testing or whatever and reference the project where your DbContext lives. If you don't use, say, NuGet to add in EF, you will get "code unreachable" when trying to write a loop, etc.
