I have other C# code that drops a call recording file into the folder c:\Recordings
Each file has the extension .wma
I'd like to check the folder every 5 minutes. If the folder contains a file ending in .wma, I'd like to execute some code.
If the folder does not contain a file with the .wma extension, I'd like the code to pause for 5 minutes and then re-check (infinitely).
I've started with the following to check whether the folder has any files in it at all, but when I run it, it always reports that the folder contains files, even though it does not.
string dirPath = @"c:\recordings\";
if (Directory.GetFiles(dirPath).Length == 0)
{
NewRecordingExists = true;
Console.WriteLine("New Recording exists");
}
else
{
NewRecordingExists = false;
Console.WriteLine("No New Recording exists");
System.Threading.Thread.Sleep(300000);
}
if (Directory.GetFiles(dirPath).Length == 0)
This checks whether there are no files, yet you then report "New Recording exists". I think you just have your logic the wrong way around; the else branch is where you have found some files.
In addition, if you want to check for just *.wma files then you can use the GetFiles overload that takes a search pattern parameter, for example:
if (Directory.GetFiles(dirPath, "*.wma").Length == 0)
{
//NO matching *.wma files
}
else
{
//has matching *.wma files
}
SIDE NOTE: You may be interested in the FileSystemWatcher, which would enable you to monitor your recordings folder for changes (including when files are added). This would eliminate your need to poll every 5 minutes, and you would get near-instant execution when the file is added, as opposed to waiting for the 5-minute interval to tick over.
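A minimal sketch of how that might look (the path comes from the question; the handler body is illustrative):
using System;
using System.IO;
class RecordingWatcher
{
    static void Main()
    {
        using (var watcher = new FileSystemWatcher(@"c:\Recordings", "*.wma"))
        {
            // Fires when a matching file appears in the folder.
            watcher.Created += (sender, e) =>
                Console.WriteLine("New recording: {0}", e.FullPath);
            watcher.EnableRaisingEvents = true;
            Console.WriteLine("Watching... press Enter to stop.");
            Console.ReadLine();
        }
    }
}
One caveat: Created can fire while the recorder is still writing the file, so you may need to wait or retry until the file opens cleanly before processing it.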
First of all, your logic is reversed! ;)
Here is your corrected code:
bool NewRecordingExists;
string dirPath = @"c:\recordings\";
string[] fileNames = Directory.GetFiles(dirPath, "*.wma", SearchOption.TopDirectoryOnly);
if (fileNames.Length != 0)
{
NewRecordingExists = true;
foreach (string fileName in fileNames)
{
Console.WriteLine("New Recording exists: {0}", fileName);
/* do your processing for each file here */
}
}
else
{
NewRecordingExists = false;
Console.WriteLine("No New Recording exists");
System.Threading.Thread.Sleep(300000);
}
Although, I recommend using the System.Timers.Timer class for your application!
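For example, a sketch of the same check driven by a timer instead of Thread.Sleep (the 5-minute interval is from the question; the handler body is illustrative):
using System;
using System.IO;
using System.Timers;
class RecordingPoller
{
    static void Main()
    {
        var timer = new Timer(300000); // 5 minutes, in milliseconds
        timer.Elapsed += (sender, e) =>
        {
            string[] fileNames = Directory.GetFiles(@"c:\recordings\", "*.wma");
            Console.WriteLine(fileNames.Length != 0
                ? "New Recording exists"
                : "No New Recording exists");
        };
        timer.AutoReset = true; // raise Elapsed every interval, not just once
        timer.Start();
        Console.ReadLine(); // keep the process alive while the timer runs
    }
}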
Don't use GetFiles if you're going to throw the result away.
Use an enumeration so you can exit early:
Directory.EnumerateFiles(Folder, "*.wma", SearchOption.AllDirectories).FirstOrDefault() != null
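For example, the original 5-minute poll could be built around that check like this (a sketch; Folder is the path from the question and the processing step is a placeholder):
using System;
using System.IO;
using System.Linq;
using System.Threading;
class PollLoop
{
    const string Folder = @"c:\Recordings";
    static void Main()
    {
        while (true)
        {
            // EnumerateFiles streams results, so FirstOrDefault stops
            // the search at the first matching file.
            bool hasRecording = Directory.EnumerateFiles(
                Folder, "*.wma", SearchOption.AllDirectories)
                .FirstOrDefault() != null;
            if (hasRecording)
                Console.WriteLine("New recording exists"); // process it here
            else
                Console.WriteLine("No new recording");
            Thread.Sleep(TimeSpan.FromMinutes(5)); // re-check every 5 minutes
        }
    }
}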
I am trying to avoid empty files in my program, and the way I am doing it doesn't work.
I have a machine that creates logs. At the weekend nobody is here, but the machine still creates a file with only 5 or 6 lines; the right one should have 20k lines.
I know there is FileInfo.Length, but I don't know how to use it with what I have right now.
public List<SystemLogFileData> ProcessSystemLogFiles(List<string> systemLogsFilePaths)
{
List<SystemLogFileData> systemLogFilesData = new List<SystemLogFileData>();
foreach (var filePath in systemLogsFilePaths)
{
string[] lines = System.IO.File.ReadAllLines(filePath);
var systemLogFileData = ProcessSystemLogFile(lines.ToList());
if (File.ReadAllText(systemLogFileData).Length > 100) // does not compile: systemLogFileData is not a file path
{
systemLogFilesData.Add(systemLogFileData);
}
}
return systemLogFilesData;
}
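If the goal is to skip the short weekend files without reading them first, FileInfo.Length (the size in bytes) could be checked up front; a sketch reusing the method and types from the question, where the 100-byte threshold is illustrative:
using System.Collections.Generic;
using System.IO;
using System.Linq;
public List<SystemLogFileData> ProcessSystemLogFiles(List<string> systemLogsFilePaths)
{
    List<SystemLogFileData> systemLogFilesData = new List<SystemLogFileData>();
    foreach (var filePath in systemLogsFilePaths)
    {
        // FileInfo.Length reports the size in bytes without opening the file.
        if (new FileInfo(filePath).Length <= 100)
            continue; // skip truncated logs
        string[] lines = File.ReadAllLines(filePath);
        systemLogFilesData.Add(ProcessSystemLogFile(lines.ToList()));
    }
    return systemLogFilesData;
}
Note that File.ReadAllText(...).Length counts characters of content, while FileInfo.Length is the size on disk in bytes; pick the threshold accordingly.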
I solved it and it works well, not in my main program but in Autobackup.
Instead of ignoring the unimportant files, I just don't copy them, and I reached my goal.
Thanks for the help!
// path of the original file
string pathToOriginalFile = @"C:\Users\Desktop\c#\Logging\Systemlog.bk66";
// duplicate file path
string PathForDuplicateFile = @"C:\\\Desktop\c#\Systemlog";
// rename fileName if it already exists
FixFileName(ref PathForDuplicateFile, ".bk");
// only copy files whose content is longer than 2000 characters
if (File.ReadAllText(pathToOriginalFile).Length > 2000)
{
    File.Copy(pathToOriginalFile, PathForDuplicateFile);
}
I'm writing a simple desktop application to copy files from one PC to another. I'm having trouble with the Windows 10 reparse points, specifically My Music. I thought I was going to get away with one simple line of code:
ZipFile.CreateFromDirectory(documentsFolder, docSavePath + @"\Documents.zip", CompressionLevel.Optimal, false);
But not so: it crashes on the My Music folder. I've also tried a bunch of different ways of doing this, all with the same result: access denied. Can copying and/or zipping the Documents folder really be this hard? I doubt it; I'm just missing something. I tried elevating privileges and that didn't work either. Anyone have an example of how to do this?
I was able to figure out how to check for the ReparsePoint attribute, which was relatively easy, but then had to piece together how to loop through all the files and add them to the ZipArchive. The credit for RecurseDirectory goes to this answer.
Then I added in what I learned about the reparse file attributes.
private void documentBackup(string docSavePath)
{
if (File.Exists(docSavePath + @"\Documents.zip")) File.Delete(docSavePath + @"\Documents.zip");
using (ZipArchive docZip = ZipFile.Open(docSavePath + "\\Documents.zip", ZipArchiveMode.Create))
{
foreach (FileInfo goodFile in RecurseDirectory(documentsFolder))
{
var destination = Path.Combine(goodFile.DirectoryName, goodFile.Name).Substring(documentsFolder.ToString().Length + 1);
docZip.CreateEntryFromFile(Path.Combine(goodFile.Directory.ToString(), goodFile.Name), destination);
}
}
}
public IEnumerable<FileInfo> RecurseDirectory(string path, List<FileInfo> currentData = null)
{
if (currentData == null)
currentData = new List<FileInfo>();
var directory = new DirectoryInfo(path);
foreach (var file in directory.GetFiles())
currentData.Add(file);
foreach (var d in directory.GetDirectories())
{
if ((d.Attributes & FileAttributes.ReparsePoint) == FileAttributes.ReparsePoint)
{
continue;
}
else
{
RecurseDirectory(d.FullName, currentData);
}
}
return currentData;
}
It takes longer than I'd like to run - but after looking at this dang problem for days I'm just happy it works!
My application creates files and directories throughout the year and needs to check the timestamps of those directories to determine whether it's time to create another one. So it's vital that when I move a directory I preserve its timestamps. I can do that like this when Directory.Move() isn't an option (e.g. when moving to a different drive):
FileSystem.CopyDirectory(sourcePath, targetPath, overwrite);
Directory.SetCreationTimeUtc (targetPath, Directory.GetCreationTimeUtc (sourcePath));
Directory.SetLastAccessTimeUtc(targetPath, Directory.GetLastAccessTimeUtc(sourcePath));
Directory.SetLastWriteTimeUtc (targetPath, Directory.GetLastWriteTimeUtc (sourcePath));
Directory.Delete(sourcePath, true);
However, all three of these "Directory.Set" methods fail if File Explorer is open, and it seems that it doesn't even matter whether the directory in question is currently visible in File Explorer or not (EDIT: I suspect this has something to do with Quick Access, but the reason isn't particularly important). It throws an IOException that says "The process cannot access the file 'C:\MyFolder' because it is being used by another process."
How should I handle this? Is there an alternative way to modify a timestamp that doesn't throw an error when File Explorer is open? Should I automatically close File Explorer? Or if my application simply needs to fail, then I'd like to fail before any file operations take place. Is there a way to determine ahead of time if Directory.SetCreationTimeUtc() for example will encounter an IOException?
Thanks in advance.
EDIT: I've made a discovery. Here's some sample code you can use to try recreating the problem:
using System;
using System.IO;
namespace CreationTimeTest
{
class Program
{
static void Main( string[] args )
{
try
{
DirectoryInfo di = new DirectoryInfo( @"C:\Test" );
di.CreationTimeUtc = DateTime.UtcNow;
Console.WriteLine( di.FullName + " creation time set to " + di.CreationTimeUtc );
}
catch ( Exception ex )
{
Console.WriteLine( ex );
//throw;
}
finally
{
Console.ReadKey( true );
}
}
}
}
Create C:\Test, build CreationTimeTest.exe, and run it.
I've found that the "used by another process" error doesn't always occur just because File Explorer is open. It occurs if the folder C:\Test had been visible because C:\ was expanded. This means the time stamp can be set just fine if File Explorer is open and C:\ was never expanded. However, once C:\Test becomes visible in File Explorer, it seems to remember that folder and not allow any time stamp modification even after C:\ is collapsed. Can anyone recreate this?
EDIT: I'm now thinking that this is a File Explorer bug.
I have recreated this behavior using CreationTimeTest on multiple Windows 10 devices. There are two ways an attempt to set the creation time can throw the "used by another process" exception. The first is to have C:\Test open in the main pane, but in that case you can navigate away from C:\Test and then the program will run successfully again. But the second way is to have C:\Test visible in the navigation pane, i.e. to have C:\ expanded. And once you've done that, it seems File Explorer keeps a handle open because the program continues to fail even once you collapse C:\ until you close File Explorer.
I was mistaken earlier. Having C:\Test be visible doesn't cause the problem. C:\Test can be visible in the main pane without issue. Its visibility in the navigation pane is what matters.
Try this:
string sourcePath = "";
string targetPath = "";
bool overwrite = true; // declared here so the snippet compiles
DirectoryInfo sourceDirectoryInfo = new DirectoryInfo(sourcePath);
// FileSystem.CopyDirectory comes from Microsoft.VisualBasic.FileIO
// (add a reference to the Microsoft.VisualBasic assembly).
FileSystem.CopyDirectory(sourcePath, targetPath, overwrite);
DirectoryInfo targetDirectory = new DirectoryInfo(targetPath);
targetDirectory.CreationTimeUtc = sourceDirectoryInfo.CreationTimeUtc;
targetDirectory.LastAccessTimeUtc = sourceDirectoryInfo.LastAccessTimeUtc;
targetDirectory.LastWriteTimeUtc = sourceDirectoryInfo.LastWriteTimeUtc;
Directory.Delete(sourcePath, true);
This will allow you to set the creation/access/write times for the target directory, so long as the directory itself is not open in explorer (I am assuming it won't be, as it has only just been created).
I suspect FileSystem.CopyDirectory ties into Explorer and somehow blocks the directory. Try copying all the files and directories using standard C# methods, like this:
DirectoryCopy(@"C:\SourceDirectory", @"D:\DestinationDirectory", true);
Using these utility methods:
private static void DirectoryCopy(string sourceDirName, string destDirName, bool copySubDirs)
{
// Get the subdirectories for the specified directory.
DirectoryInfo dir = new DirectoryInfo(sourceDirName);
if (!dir.Exists)
{
throw new DirectoryNotFoundException("Source directory does not exist or could not be found: " + sourceDirName);
}
if ((dir.Attributes & FileAttributes.ReparsePoint) == FileAttributes.ReparsePoint)
{
// Don't copy symbolic links
return;
}
var createdDirectory = false;
// If the destination directory doesn't exist, create it.
if (!Directory.Exists(destDirName))
{
Directory.CreateDirectory(destDirName);
createdDirectory = true;
}
// Get the files in the directory and copy them to the new location.
DirectoryInfo[] dirs = dir.GetDirectories();
FileInfo[] files = dir.GetFiles();
foreach (FileInfo file in files)
{
if ((file.Attributes & FileAttributes.ReparsePoint) == FileAttributes.ReparsePoint)
continue; // Don't copy symbolic links
string temppath = Path.Combine(destDirName, file.Name);
file.CopyTo(temppath, false);
CopyMetaData(file, new FileInfo(temppath));
}
// If copying subdirectories, copy them and their contents to new location.
if (copySubDirs)
{
foreach (DirectoryInfo subdir in dirs)
{
string temppath = Path.Combine(destDirName, subdir.Name);
DirectoryCopy(subdir.FullName, temppath, copySubDirs);
}
}
if (createdDirectory)
{
// We must set it AFTER copying all files in the directory - otherwise the timestamp gets updated to Now.
CopyMetaData(dir, new DirectoryInfo(destDirName));
}
}
private static void CopyMetaData(FileSystemInfo source, FileSystemInfo dest)
{
dest.Attributes = source.Attributes;
dest.CreationTimeUtc = source.CreationTimeUtc;
dest.LastAccessTimeUtc = source.LastAccessTimeUtc;
dest.LastWriteTimeUtc = source.LastWriteTimeUtc;
}
I'm not really into multithreading, so this is probably a stupid question, but I can't seem to find a way to solve this problem (especially because I'm using C# and have only been using it for a month).
I have a dynamic number of directories (I got them from a query in the DB). Inside those directories there are a certain number of files.
For each directory I need to transfer those files over FTP concurrently, because I have basically no limit on the number of FTP connections (not my words; it's written in the spec).
But I still need to control the maximum number of files transferred per directory, so I need to count the files I'm transferring (increment/decrement).
How could I do it? Should I use something like an array and the Monitor class?
Edit: Framework 3.5
You can use the Semaphore class to throttle the number of concurrent files per directory. You would probably want to have one semaphore per directory so that the number of FTP uploads per directory can be controlled independently.
public class Example
{
public void ProcessAllFilesAsync()
{
var semaphores = new Dictionary<string, Semaphore>();
foreach (string filePath in GetFiles())
{
string filePathCapture = filePath; // Needed to perform the closure correctly.
string directoryPath = Path.GetDirectoryName(filePath);
if (!semaphores.ContainsKey(directoryPath))
{
int allowed = NUM_OF_CONCURRENT_OPERATIONS;
semaphores.Add(directoryPath, new Semaphore(allowed, allowed));
}
var semaphore = semaphores[directoryPath];
ThreadPool.QueueUserWorkItem(
(state) =>
{
semaphore.WaitOne();
try
{
DoFtpOperation(filePathCapture);
}
finally
{
semaphore.Release();
}
}, null);
}
}
}
var allDirectories = db.GetAllDirectories();
foreach (var directoryPath in allDirectories)
{
    DirectoryInfo directory = new DirectoryInfo(directoryPath);
    // Loop through every file in that directory
    foreach (var fileInDir in directory.GetFiles())
    {
        // Wait while we are at our max limit
        while (numberFTPConnections == MAXFTPCONNECTIONS)
        {
            Thread.Sleep(1000);
        }
        // code to copy to FTP; increment numberFTPConnections when the
        // transfer starts. This can be async: when the transfer completes,
        // decrement numberFTPConnections so the next file can be transferred.
    }
}
You can try something along the lines above. Note that it's just the basic logic and there are probably better ways to do this.
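If you'd rather keep the plain counter from the sketch above, Interlocked can increment and decrement it safely across threads without a lock; a minimal sketch (numberFTPConnections and MAXFTPCONNECTIONS carry over from the pseudocode, and DoFtpOperation is a placeholder for the actual transfer):
using System.Threading;
class FtpThrottle
{
    private const int MAXFTPCONNECTIONS = 5; // assumed limit
    private static int numberFTPConnections;
    public static void TransferFile(string filePath)
    {
        // Claim a slot; if we overshoot the limit, give it back and wait.
        while (Interlocked.Increment(ref numberFTPConnections) > MAXFTPCONNECTIONS)
        {
            Interlocked.Decrement(ref numberFTPConnections);
            Thread.Sleep(1000);
        }
        try
        {
            DoFtpOperation(filePath); // placeholder for the FTP transfer
        }
        finally
        {
            Interlocked.Decrement(ref numberFTPConnections); // free the slot
        }
    }
    private static void DoFtpOperation(string filePath) { /* ... */ }
}
That said, the Semaphore approach above does the same slot counting with less ceremony, and it's also available in Framework 3.5.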
In C#, how do I check if a specific file exists in a directory or any of its subdirectories?
System.IO.File.Exists only seems to accept a single parameter with no overloads to search subdirectories.
I can do it with LINQ and System.IO.Directory.GetFiles using the SearchOption.AllDirectories overload, but that seems a bit heavy-handed.
var MyList = from f in Directory.GetFiles(tempScanStorage, "foo.txt", SearchOption.AllDirectories)
where System.IO.Path.GetFileName(f).ToUpper().Contains(foo)
select f;
foreach (var x in MyList)
{
returnVal = x.ToString();
}
If you're looking for a single specific filename, using *.* is indeed heavy-handed. Try this:
var file = Directory.GetFiles(tempScanStorage, foo, SearchOption.AllDirectories)
.FirstOrDefault();
if (file == null)
{
// Handle the file not being found
}
else
{
// The file variable has the *first* occurrence of that filename
}
Note that this isn't quite what your current query does - because your current query would find "xbary.txt" if your foo was just bar. I don't know whether that's intentional or not.
If you want to know about multiple matches, you shouldn't use FirstOrDefault() of course. It's not clear exactly what you're trying to do, which makes it hard to give more concrete advice.
Note that in .NET 4 there's also Directory.EnumerateFiles which may or may not perform better for you. I highly doubt that it'll make a difference when you're searching for a specific file (instead of all files in the directory and subdirectories) but it's worth at least knowing about. EDIT: As noted in comments, it can make a difference if you don't have permission to see all the files in a directory.
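For reference, a sketch of the same lookup using Directory.EnumerateFiles (tempScanStorage and foo are the names from the question, declared here as placeholders):
using System;
using System.IO;
using System.Linq;
string tempScanStorage = @"C:\Scans"; // placeholder root folder
string foo = "foo.txt";               // placeholder file name
// Streams matches one at a time and stops at the first hit,
// instead of building the full array that GetFiles returns.
string file = Directory.EnumerateFiles(tempScanStorage, foo, SearchOption.AllDirectories)
                       .FirstOrDefault();
Console.WriteLine(file ?? "not found");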
The alternative is to write the search function yourself, one of these should work:
private bool FileExists(string rootpath, string filename)
{
if(File.Exists(Path.Combine(rootpath, filename)))
return true;
foreach(string subDir in Directory.GetDirectories(rootpath, "*", SearchOption.AllDirectories))
{
if(File.Exists(Path.Combine(subDir, filename)))
return true;
}
return false;
}
private bool FileExistsRecursive(string rootPath, string filename)
{
if(File.Exists(Path.Combine(rootPath, filename)))
return true;
foreach (string subDir in Directory.GetDirectories(rootPath))
{
if(FileExistsRecursive(subDir, filename))
return true;
}
return false;
}
The first method still extracts all of the directory names and would be slower when there are many subdirs but the file is close to the top.
The second is recursive which would be slower in 'worst case' scenarios but faster when there are many nested subdirs but the file is in a top level dir.
To check whether a file exists in a specific directory, do the following.
Note: "UploadedFiles" is the name of the folder, and fileName is the file you're looking for.
File.Exists(Server.MapPath("UploadedFiles/" + fileName))
Enjoy coding!
What you need is a recursive search on the filesystem. There are some working examples on CodeProject:
Simple File Search Class (by jabit)
Scan directories using recursion using events (by Jan Schreuder)
This is a recursive search function that will break out as soon as it finds the file you've specified. Please note the parameters should be specified as fileName (e.g. testdb.bak) and directory (e.g. c:\test).
Be aware that this can be quite slow if you run it against a directory with a large number of subdirectories and files.
private static bool CheckIfFileExists(string fileName, string directory) {
var exists = false;
var fileNameToCheck = Path.Combine(directory, fileName);
if (Directory.Exists(directory)) {
//check directory for file
exists = Directory.GetFiles(directory).Any(x => x.Equals(fileNameToCheck, StringComparison.OrdinalIgnoreCase));
//check subdirectories for file
if (!exists) {
foreach (var dir in Directory.GetDirectories(directory)) {
exists = CheckIfFileExists(fileName, dir);
if (exists) break;
}
}
}
return exists;
}