private void getTotalBytes(IEnumerable<string> urls)
{
files = new List<FileInfo>(urls.Count());
For example, the urls count is 399.
But when I set a breakpoint and step through with F11, I see that the count of files is 0.
With new List<FileInfo>(urls.Count()) you create an empty list with an initial capacity of 399. Capacity is not the same as count, so files still contains 0 elements.
The next step is to fill the list with actual FileInfo objects; e.g.
private void getTotalBytes(IEnumerable<string> urls)
{
    files = new List<FileInfo>(urls.Count());
    foreach (var url in urls)
    {
        files.Add(new FileInfo(url));
    }
}
or with Linq
private void getTotalBytes(IEnumerable<string> urls)
{
    files = urls.Select(u => new FileInfo(u)).ToList();
}
To literally answer your question:
var files = new List<FileInfo>();
foreach(var file in myEnumerableStringCollection)
{
files.Add(new FileInfo(file));
}
However, your example's variable is 'urls' so I suspect you're attempting to do more with this than you explained.
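Given the method name getTotalBytes, the end goal is presumably to sum the file sizes. A minimal sketch of that, assuming the strings in urls are actually local file paths (FileInfo does not understand remote URLs):
// Hypothetical sketch: total the size in bytes of every file, assuming local paths.
long totalBytes = urls.Select(u => new FileInfo(u)).Sum(f => f.Length);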
This seems to me to be a simple looping issue; it's just that I keep confusing myself over the logic.
I want to count all the files within a folder, and for every folder inside that folder I want to count its files too.
That means I have to loop through, check whether there is a subfolder, and then check inside it until there are no more folders. But I can't write the algorithm because I keep confusing myself.
I'm pretty sure there is a standard algorithm for something like this, but I can't remember its name.
This is what I have so far:
var rootDir = Directory.GetDirectories(@"C:\");
foreach (var dir in rootDir)
{
    if (Directory.GetDirectories(dir).Length > 0)
    {
    }
}
Do I understand correctly that you need to count only the files in a folder and all of its subfolders? Directory.GetFiles has an option to search all subfolders. Try this:
Directory.GetFiles(WorkingDir, "*", SearchOption.AllDirectories);
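If you only need the number of files, the Length of the returned array is enough. A minimal sketch, using the same WorkingDir variable as above:
// Count every file under WorkingDir, including all of its subdirectories.
int totalFiles = Directory.GetFiles(WorkingDir, "*", SearchOption.AllDirectories).Length;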
Use Directory.GetFiles:
// Process all files in the directory passed in, recurse on any directories
// that are found, and process the files they contain.
public static void ProcessDirectory(string targetDirectory)
{
// Process the list of files found in the directory.
string [] fileEntries = Directory.GetFiles(targetDirectory);
foreach(string fileName in fileEntries)
ProcessFile(fileName);
// Recurse into subdirectories of this directory.
string [] subdirectoryEntries = Directory.GetDirectories(targetDirectory);
foreach(string subdirectory in subdirectoryEntries)
ProcessDirectory(subdirectory);
}
// Insert logic for processing found files here.
public static void ProcessFile(string path)
{
Console.WriteLine("Processed file '{0}'.", path);
}
Here is a solution to count all files and the number of files per directory; I am using a dictionary to store all the data:
class Program
{
    public static Dictionary<string, int> dico = new Dictionary<string, int>();

    public static void CountFiles(string nameDirectory)
    {
        int nbrfiles = Directory.GetFiles(nameDirectory).Length;
        dico[nameDirectory] = nbrfiles;
        string[] subdirectories = Directory.GetDirectories(nameDirectory);
        foreach (string subdir in subdirectories)
            CountFiles(subdir);
    }

    static void Main(string[] args)
    {
        string tdir = "e:\\example";
        CountFiles(tdir);
        var totalfiles = dico.Sum(x => x.Value);
        Console.WriteLine($"Directory {tdir} contains {totalfiles} files");
        foreach (var item in dico)
        {
            Console.WriteLine($"Directory {item.Key} has {item.Value} file(s)");
        }
    }
}
I have a .NET webform that is displaying files found in a directory in a listView. This is the code for the display:
private void files()
{
try
{
DirectoryInfo dinfo = new DirectoryInfo(label2.Text);
FileInfo[] Files = dinfo.GetFiles("*.doc");
foreach (FileInfo file in Files)
{
listView1.Items.Add(file.Name);
}
}
catch (Exception ex)
{
MessageBox.Show(ex.Message);
}
}
label2.Text contains the directory that houses the files. What I need is for a second listView to display the documents housed in another directory that do not appear in the first listView.
The second directory contains templates, whereas the first directory contains completed documents. The names are different in each directory, but they are similar. For example, a completed document displayed in the first listView may be called DEFECT1_AA09890.doc. Its template may be called 05DEFECT.doc.
It is easy enough to display the contents of the template directory using this code:
private void templateDocuments()
{
string path = @"\\directoryname\foldername";
try
{
DirectoryInfo dinfo = new DirectoryInfo(path);
FileInfo[] Files = dinfo.GetFiles("*.doc");
foreach (FileInfo file in Files)
{
listView2.Items.Add(file.Name);
}
}
catch (Exception ex)
{
MessageBox.Show(ex.Message);
}
}
But this does not compare contents and display based on the results.
Long story short, I want to display the contents of a directory in a listView, compare it to the contents of another directory, and display in a second listView what does not appear in the first.
Any help would be much appreciated.
Cheers.
Before adding file names to listView2, you need to check whether you already added them to listView1. One way of doing that is to store the files in listView1 in a HashSet<string>, then checking that before adding to listView2. Something like this should work:
private void filesAndTemplates()
{
string path = @"\\directoryname\foldername";
HashSet<string> files = new HashSet<string>();
try
{
DirectoryInfo dinfo = new DirectoryInfo(label2.Text);
FileInfo[] Files = dinfo.GetFiles("*.doc");
foreach (FileInfo file in Files)
{
files.Add(file.Name);
listView1.Items.Add(file.Name);
}
dinfo = new DirectoryInfo(path);
Files = dinfo.GetFiles("*.doc");
foreach (FileInfo file in Files)
{
if (files.Contains(file.Name))
{
continue; // We already saw this file
}
listView2.Items.Add(file.Name);
}
}
catch (Exception ex)
{
MessageBox.Show(ex.Message);
}
}
EDIT
If you want inexact matching, you need to reduce the file name to its essence -- remove any decorations, which in your case look to be one (or both) of
Leading digits
Underscore followed by whatever
The essence of 01hello_world.doc would thus be hello.
Regex should fit the bill quite nicely -- although the exact definition of the regular expression would depend on your exact requirements.
Define the Regex and a transformation method somewhere suitable:
private static readonly Regex regex = new Regex(
#"[0-9]*(?<core>[^_]+)(_{1}.*)?", RegexOptions.Compiled);
private static string Transform(string fileName)
{
int extension = fileName.LastIndexOf('.');
if (extension >= 0)
{
fileName = fileName.Substring(0, extension);
}
Match match = regex.Match(fileName);
if (match.Success)
{
return match.Groups["core"].Value;
}
return fileName;
}
Then modify the original method to transform the filename before adding files to the HashSet and before checking for their presence:
DirectoryInfo dinfo = new DirectoryInfo(label2.Text);
FileInfo[] Files = dinfo.GetFiles("*.doc");
foreach (FileInfo file in Files)
{
files.Add(Transform(file.Name)); // Here!
listView1.Items.Add(file.Name);
}
dinfo = new DirectoryInfo(path);
Files = dinfo.GetFiles("*.doc");
foreach (FileInfo file in Files)
{
if (files.Contains(Transform(file.Name))) // Here!
{
continue;
}
listView2.Items.Add(file.Name);
}
Note the two calls to Transform.
In my application there is a situation like this. Before creating a file, my application searches a directory for files under a particular filename. If any files are found, it should read the contents of each file and write those contents to a new file. I have googled a lot and tried a few things, like this:
string temp_file_format = "ScriptLog_" + DateTime.Now.ToString("dd_MM_yyyy_HH");
string[] files = Directory.GetFiles(path,temp_file_format);
foreach (FileAccess finfo in files)
{
string text = File.ReadAllText(finfo);
}
and
System.IO.DirectoryInfo dir = new DirectoryInfo(path);
System.IO.FileInfo[] files = dir.GetFiles(temp_file_format);
foreach (FileInfo finfo in files)
{
finfo.OpenRead();
}
But all of these failed. Can anyone show me an alternative?
Is there anything wrong with my temp_file_format string?
It would be nice if I could prepend these contents to the new file. If not, no worries.
Any help would be really appreciated.
This is a complete working implementation that does all of that:
without reading everything in memory at one time (which doesn't work for large files)
without keeping any files open for more than the required time
using System.IO;
using System.Linq;
public static class Program {
public static void Main()
{
var all = Directory.GetFiles("/tmp", "*.cpp")
.SelectMany(File.ReadLines);
using (var w = new StreamWriter("/tmp/output.txt"))
foreach(var line in all)
w.WriteLine(line);
}
}
I tested it on Mono 2.10, and it should work on any .NET 4.0+ (File.ReadLines returns a lazy, line-wise enumerable).
Here's a short snippet that reads all the files and outputs them to the path outputPath:
var lines = from file in Directory.GetFiles(path,temp_file_format)
from line in File.ReadAllLines(file)
select line;
File.WriteAllLines(outputPath, lines);
The problem you are having with your code is not really related to reading files; you are simply trying to use an object as a type it isn't. Directory.GetFiles returns an array of string, and File.ReadXXX and File.OpenRead expect the path as a string. So you simply need to pass each of the returned strings as the path argument to the appropriate method. The above is one such example. Hope it helps both solve your problem and explain the actual issue with your code.
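As a hedged sketch of what a corrected loop could look like (newFilePath is a hypothetical target path; note that the search pattern may also need a trailing * so it matches files that have an extension):
// Each path returned by GetFiles is already a string, so it can be passed
// straight to the File methods; AppendAllText concatenates everything into one file.
// The trailing "*" is an assumption about how the log files are named.
foreach (string file in Directory.GetFiles(path, temp_file_format + "*"))
{
    File.AppendAllText(newFilePath, File.ReadAllText(file));
}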
try this:
foreach (FileInfo finfo in files)
{
try
{
using (StreamReader sr = new StreamReader(finfo.FullName))
{
String line = sr.ReadToEnd();
Console.WriteLine(line);
}
}
catch (Exception e)
{
Console.WriteLine("The file could not be read:");
Console.WriteLine(e.Message);
}
}
using (var output = File.Create(outputPath))
{
foreach (var file in Directory.GetFiles(InputPath,temp_file_format))
{
using (var input = File.OpenRead(file))
{
input.CopyTo(output);
}
}
}
I am having a problem writing the files in folders and subfolders.
For example, test is the main folder:
1) C:\test\
and I want to read and write the subfolder files:
2)C:\test\12-05-2011\12-05-2011.txt
3)C:\test\13-05-2011\13-05-2011.txt
4)C:\test\14-05-2011\14-05-2011.txt
My code is:
private void button1_Click(object sender, EventArgs e)
{
const string Path1 = @"C:\test";
DoOnSubfolders(Path1);
try
{
StreamReader reader1 = File.OpenText(Path1);
string str = reader1.ReadToEnd();
reader1.Close();
reader1.Dispose();
File.Delete(Path1);
string[] Strarray = str.Split(new char[] { Strings.ChrW(10) });
int abc = Strarray.Length - 2;
int xyz = 0;
while (xyz <= abc)
}
I am getting an error. The error is
Access to the path 'C:\test' is denied.
Can anyone tell me what I need to change in this code?
First, you could flatten your recursive calls by calling DirectoryInfo.GetFiles(string, SearchOption) and setting the SearchOption to AllDirectories.
Another common mistake (though it's not clear from your question whether it applies here) is that a directory needs to be created before you can create a file in it. Simply call Directory.CreateDirectory() and pass it the complete path (without the filename). It automatically does nothing if the directory already exists and can also create the whole needed structure, so no checks or recursive calls are needed (maybe a try-catch if you don't have write access).
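A minimal sketch of that order of operations (destinationPath here is a hypothetical full path to the file you want to write):
// Create the whole directory tree first; this is a no-op if it already exists.
Directory.CreateDirectory(Path.GetDirectoryName(destinationPath));
// Now the file itself can be written safely.
File.WriteAllText(destinationPath, "some content");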
Update
So here is an example that reads in a file, does some conversion on each line and writes the result into a new file. If this works properly the original file will be replaced by the converted one.
private static void ConvertFiles(string pathToSearchRecursive, string searchPattern)
{
var dir = new DirectoryInfo(pathToSearchRecursive);
if (!dir.Exists)
{
throw new ArgumentException("Directory doesn't exist: " + dir.ToString());
}
if (String.IsNullOrEmpty(searchPattern))
{
throw new ArgumentNullException("searchPattern");
}
foreach (var file in dir.GetFiles(searchPattern, SearchOption.AllDirectories))
{
var tempFile = Path.GetTempFileName();
// Use the using statement to make sure file is closed at the end or on error.
using (var reader = file.OpenText())
using (var writer = new StreamWriter(tempFile))
{
string line;
while (null != (line = reader.ReadLine()))
{
var split = line.Split((char)10);
foreach (var item in split)
{
writer.WriteLine(item);
}
}
}
// Replace the original file be the converted one (if needed)
////File.Copy(tempFile, file.FullName, true);
}
}
In your case you could call this function
ConvertFiles(@"D:\test", "*.*")
To recursively walk the sub-folders, you need a recursive function, i.e. one that calls itself. Here is an example that should be enough for you to work with:
static void Main(string[] args)
{
const string path = @"C:\temp\";
DoOnSubfolders(path);
}
private static void DoOnSubfolders(string rootPath)
{
DirectoryInfo d = new DirectoryInfo(rootPath);
FileInfo[] fis = d.GetFiles();
foreach (var fi in fis)
{
string str = File.ReadAllText(fi.FullName);
//do your stuff
}
DirectoryInfo[] ds = d.GetDirectories();
foreach (var info in ds)
{
DoOnSubfolders(info.FullName);
}
}
You need to use the DirectoryInfo and FileInfo classes.
DirectoryInfo d = new DirectoryInfo("c:\\test");
FileInfo [] fis = d.GetFiles();
DirectoryInfo [] ds = d.GetDirectories();
Here's a quick one-liner to write the contents of all text files in a given directory (and all subdirectories) to the console:
Directory.GetFiles(myDirectory,"*.txt*",SearchOption.AllDirectories)
.ToList()
.ForEach(a => Console.WriteLine(File.ReadAllText(a)));
This code:
const string Path1 = @"C:\test";
StreamReader reader1 = File.OpenText(Path1);
Says open "c:\test" as a text file... The error you're getting is:
Access to the path 'C:\test' is denied
You're getting the error because as you stated above, 'c:\test' is a folder. You can't open folders like they are text files, hence the error...
A basic (full depth search) for files with a .txt extension looks like this:
static void Main(string[] args) {
ProcessDir(@"c:\test");
}
static void ProcessDir(string currentPath) {
foreach (var file in Directory.GetFiles(currentPath, "*.txt")) {
// Process each file (replace this with your code / function call /
// change signature to allow a delegate to be passed in... etc
// StreamReader reader1 = File.OpenText(file); // etc
Console.WriteLine("File: {0}", file);
}
// recurse (may not be necessary), call each subfolder to see
// if there's more hiding below
foreach (var subFolder in Directory.GetDirectories(currentPath)) {
ProcessDir(subFolder);
}
}
Have a look at http://support.microsoft.com/kb/303974 for a start. The secret is Directory.GetDirectories in System.IO.
You have to configure (NTFS) security on the c:\Test folder.
Normally you would run the application under a non-administrator account, so the account that is running the program needs to have access.
If you are running on Vista or Windows 7 with UAC, you might be an administrator, but you will not be using administrative (elevated) permissions by default.
EDIT
Look at these lines:
const string Path1 = @"C:\test";
DoOnSubfolders(Path1);
try
{
StreamReader reader1 = File.OpenText(Path1);
That last line is trying to read the FOLDER 'c:\test' as if it was a text file.
You can't do that. What are you trying to accomplish there?
I want to keep a list of the existing log files in the log directory. Whenever this list reaches the maximum limit, say 20 files, I will delete the oldest log file.
Each time the application is launched, it will check the log directory and keep all log file names in a list, but this list should be sorted by creation time.
What's a good way to do this? Thanks.
Try this:
List<FileInfo> fi = new List<FileInfo>();
//load fi
List<FileInfo> SortedFi = fi.OrderBy(t => t.CreationTime).ToList();
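To fill fi in the first place, something like this could work (a sketch only; the *.log pattern and the logDir variable are assumptions, not from the question):
// Hypothetical loading step: take every .log file in the log directory.
fi = new DirectoryInfo(logDir).GetFiles("*.log").ToList();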
This grabs all the files older than the newest 20 and deletes them:
int numberOfFilesToKeep = 20;
string logFilePath = @"c:\temp";
FileInfo[] logFiles = (new DirectoryInfo(logFilePath)).GetFiles();
var oldFiles = logFiles.OrderByDescending(t => t.CreationTime).Skip(numberOfFilesToKeep);
foreach (var file in oldFiles)
file.Delete(); //you'll want a try/catch here
Note, you may want to use LastWriteTime rather than CreationTime above, depending upon how the log files are being used.
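If so, only the ordering line changes; a small sketch:
// Same approach, but keep the most recently written files instead of the most recently created.
var oldFiles = logFiles.OrderByDescending(t => t.LastWriteTime).Skip(numberOfFilesToKeep);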
Something like this:
static void Main(string[] args)
{
var files = new List<string>();
foreach (var file in Directory.GetFiles("<path to your log files>"))
{
files.Add(file);
}
files.Sort(
new Comparison<string>(
(a, b) => new FileInfo(b).CreationTime.CompareTo(new FileInfo(a).CreationTime)
)
);
foreach (var file in files.Skip(20))
{
// Delete file.
}
}
using System.Linq;
using System.IO;
DirectoryInfo di = new DirectoryInfo("mylogdir");
FileSystemInfo[] files = di.GetFileSystemInfos();
var orderedFiles = files.OrderBy(f => f.CreationTime);
Cache orderedFiles, and refresh as needed whenever you roll over to the next.
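To actually enforce the 20-file limit from the question with this snippet, one possible sketch (orderedFiles is oldest-first here, and the limit of 20 is taken from the question; this assumes the log directory contains only files):
// Delete the oldest entries so that at most 20 remain.
int excess = Math.Max(0, files.Length - 20);
foreach (var old in orderedFiles.Take(excess))
{
    old.Delete();
}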