File gets locked when overwriting - c#

The title only explains a small part, so let me describe two scenarios. Scenario 1 raises errors; scenario 2 works like a charm.
Scenario 1:
I check out a document with the method below. When the document is saved to a location where a file with that name already exists, it gets overwritten, but surprisingly it also locks the file for some reason:
public bool SaveDocument(int bestandsId, string fileName, string path)
{
//Initialize the Sql Query
var sql = "SELECT DATA FROM Documenten WHERE BESTAND_ID = " + bestandsId;
//Initialize SqlConnection
var connection = new SqlConnection(Instellingen.Instance.DmsConnectionString);
//Initialize SqlCommand
var command = new SqlCommand(sql, connection);
try
{
//Open Connection
connection.Open();
//Fill 'data' from command.ExecuteScalar()
var data = (byte[]) command.ExecuteScalar();
//Write 'data' to file.
File.WriteAllBytes(path + @"\" + fileName, data);
//Return true if no exceptions are raised.
return true;
}
catch (Exception ex)
{
//Initialize Dms Exception
var dmsEx = new DmsException(ex);
//Write Dms Exception to Log File.
DmsException.WriteErrorsToLog(dmsEx);
//Return false, because something went wrong...
return false;
}
finally
{
//Close Sql Connection
connection.Close();
}
}
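As a side note unrelated to the locking itself, the string concatenation in SaveDocument invites SQL injection and path problems; here is a small sketch of the same body using a parameter and Path.Combine (assuming the same table and connection string as above):
//Sketch (not the original code): parameterized query and Path.Combine
//instead of string concatenation.
var sql = "SELECT DATA FROM Documenten WHERE BESTAND_ID = @bestandsId";
using (var connection = new SqlConnection(Instellingen.Instance.DmsConnectionString))
using (var command = new SqlCommand(sql, connection))
{
    command.Parameters.AddWithValue("@bestandsId", bestandsId);
    connection.Open();
    var data = (byte[]) command.ExecuteScalar();
    File.WriteAllBytes(Path.Combine(path, fileName), data);
}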
The SaveDocument method runs smoothly and no problems occur. But when I check the document in with the method below (InsertDocument), I get an exception telling me the file is in use:
Scenario 2:
When I use the SaveDocument method to save the document to a location where there isn't a file with the same name, the file is newly created and is ready to be edited or whatever else you want to do with it.
Scenario 2 works perfectly: the document can be checked in again without the error shown above.
Code added at the request of @CodeCaster:
---------------------------------BEGIN EDIT---------------------------------
public static bool InsertDocument(Document document)
{
try
{
//Exception is thrown when Initializing the FileStream
var fileStream = new FileStream(document.Fileinfo.FullName, FileMode.Open, FileAccess.Read);
var binaryReader = new BinaryReader(fileStream);
var totalNumberOfBytes = new FileInfo(document.Fileinfo.FullName).Length;
var data = binaryReader.ReadBytes((Int32) totalNumberOfBytes);
fileStream.Close();
fileStream.Dispose();
binaryReader.Close();
binaryReader.Dispose();
var pdftext = string.Empty;
try
{
if (document.DocumentType == ".pdf")
{
var reader = new PdfReader(document.Fileinfo.FullName);
var text = string.Empty;
for (var page = 1; page <= reader.NumberOfPages; page++)
{
text += PdfTextExtractor.GetTextFromPage(reader, page);
}
reader.Close();
pdftext = text;
}
}
catch (Exception ex)
{
var dmsEx = new DmsException(ex);
DmsException.WriteErrorsToLog(dmsEx);
}
return InsertIntoDatabase(document.BestandsNaam, document.Eigenaar, document.Omschrijving,
document.DatumToevoeg.ToString(), document.DatumIncheck.ToString(),
document.DatumUitcheck.ToString(), document.UitgechecktDoor,
document.DocumentType, data, pdftext, document.Versie, document.Medewerker,
document.DossierNummer, document.PersonalFolderId.ToString(),
document.DossierFolderId, -1, document.DocumentProgres,
document.OriBestandId.ToString(), 0);
}
catch (Exception ex)
{
var dmsEx = new DmsException("Fout bij inlezen voor toevoeging van nieuw document",
"Klasse Document (InsertDocument)", ex);
ExceptionLogger.LogError(dmsEx);
return false;
}
}
---------------------------------END EDIT---------------------------------
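As an aside, the read at the top of InsertDocument can be written so that the handle is released even if ReadBytes throws; a minimal sketch, keeping the rest of the method as it is:
//Sketch: let using dispose the stream and reader even on exceptions.
byte[] data;
using (var fileStream = new FileStream(document.Fileinfo.FullName, FileMode.Open, FileAccess.Read))
using (var binaryReader = new BinaryReader(fileStream))
{
    data = binaryReader.ReadBytes((int) fileStream.Length);
}
//Or simply: var data = File.ReadAllBytes(document.Fileinfo.FullName);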
My questions:
What causes the file to be locked when it gets overwritten?
How can I prevent this from happening?
Is there some sort of function or parameter that I can set so it doesn't get locked?
Using a tool called "Unlocker" I managed to see which program is locking the file, and yes, DMS.exe is my own application:

These snippets (newPath and item.File come from elsewhere in the application) show how to write the file without leaving a handle open. File.Create returns an open FileStream, so if you pre-create the file you have to dispose that stream before writing:
using (var stream = File.Create(newPath)) { }
File.WriteAllBytes(newPath, item.File);
With a FileStream, writing through the stream you just created:
using (FileStream fs = File.Create(newPath))
{
fs.Write(item.File, 0, item.File.Length);
}
Or simply:
File.WriteAllBytes(newPath, item.File);
Reference: "The process cannot access the file because it is being used by another process" with Images

Related

Windows Service Filestream giving System.IO.IOException: The process cannot access the file "filename" because it is being used by another process

I've got a Windows service that I have to modify. The current code is this:
public IRecord2 GetRecord(string name)
{
string path = Path.Combine(this.DirectoryPath, name);
if (!File.Exists(path))
return null;
byte[] contents;
lock (locker) {
using(FileStream fs = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.ReadWrite, bufferSize:4096, useAsync:true)) //WHERE THE PROBLEM IS OCCURRING
{
using (BinaryReader br = new BinaryReader(fs))
{
contents = br.ReadBytes((int)fs.Length);
br.Close(); //unnecessary but threw it in just to be sure
fs.Close(); //unnecessary but threw it in just to be sure
}
}
}
return new Record2()
{
Name = name,
Contents = contents
};
}
Code that calls the function:
public void Process(string pickupFileName)
{
string uniqueId = DateTime.Now.ToString("(yyyy-MM-dd_HH-mm-ss)");
string exportFileName = Path.GetFileNameWithoutExtension(pickupFileName) + "_" + uniqueId + ".csv";
string archiveFileName = Path.GetFileNameWithoutExtension(pickupFileName) + "_" + uniqueId + Path.GetExtension(pickupFileName);
string unprocessedFileName = Path.GetFileNameWithoutExtension(pickupFileName) + "_" + uniqueId + Path.GetExtension(pickupFileName);
try
{
_logger.LogInfo(String.Format("Processing lockbox file '{0}'", pickupFileName));
IRecord2 record = _pickup.GetRecord(pickupFileName);
if (record == null)
return;
_archive.AddOrUpdate(new Record2() { Name = archiveFileName, Contents = record.Contents });
string pickupFileContents = UTF8Encoding.UTF8.GetString(record.Contents);
IBai2Document document = Bai2Document.CreateFromString(pickupFileContents);
StringBuilder sb = Export(document);
_export.AddOrUpdate(new Record2() { Name = exportFileName, Contents = Encoding.ASCII.GetBytes(sb.ToString()) });
_pickup.Delete(pickupFileName);
}
catch(Exception ex)
{
throw ex;
}
}
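A small aside on the catch block above: catch (Exception ex) { throw ex; } resets the stack trace, while a bare throw preserves it:
catch (Exception)
{
    //rethrow without resetting the original stack trace
    throw;
}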
Function that calls Process:
public void Process()
{
foreach (ConfigFolderPath configFolderPath in _configSettings.ConfigFolderPaths)
{
IRecordRepository pickup = new FileRepository(configFolderPath.PickupFolderPath);
IRecordRepository export = new FileRepository(configFolderPath.ExportFolderPath);
IRecordRepository archive = new FileRepository(configFolderPath.ArchiveFolderPath);
IRecordRepository unprocessed = new FileRepository(configFolderPath.UnprocessedFolderPath);
Converter converter = new Converter(Logger,pickup, export, archive, unprocessed);
foreach (string fileName in pickup.GetNames())
{
if (_configSettings.SupportedFileExtensions.Count > 0 && !_configSettings.SupportedFileExtensions.Any(extension => extension.ToLower() == Path.GetExtension(fileName).ToLower()))
continue;
Action action = () => converter.Process(fileName);
_queue.TryEnqueue(action, new WorkTicket() { Description = String.Format("Processing '{0}'", fileName), SequentialExecutionGroup = fileName });
}
}
}
When one file is sent to the service, it processes and reads the file correctly. However, if two files are sent (about 3 minutes apart), the first file will process correctly, but the second gives me "System.IO.IOException: The process cannot access the file 'filename' because it is being used by another process".
Is the solution to use a mutex as per https://stackoverflow.com/a/29941548/4263285 or is there a better solution to solve this?
Edit: More context:
Service is constantly running - as soon as files are dropped into a folder, it begins the process.
get the file data (function up above)
take the data, transform it, and put it into a different file
Delete the original file from the one up above
rinse and repeat if more files
if one file is placed in the folder, it works correctly.
if two files are placed in the folder, it breaks on the second file
if service is stopped and restarted, it works again
In your code, add ".Close()" here, at the end of the line:
using(FileStream fs = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.ReadWrite, bufferSize:4096, useAsync:true).Close())
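Note that using (new FileStream(...).Close()) will not even compile, because Close() returns void. A pattern that often helps when files are dropped into a pickup folder is to retry the open for a short time, since the producing process may still be writing the file. A sketch only; the helper name and parameters are made up for illustration:
using System.IO;
using System.Threading;

static FileStream OpenWithRetry(string path, int attempts = 5, int delayMs = 500)
{
    for (int i = 0; ; i++)
    {
        try
        {
            //read-only open; fails while the sender still holds the file exclusively
            return new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read);
        }
        catch (IOException) when (i < attempts - 1)
        {
            Thread.Sleep(delayMs); //file still locked, wait and try again
        }
    }
}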

Reading from file and writing to a temporary .txt file in C#?

I know there are a lot of similar topics on this website, but I think I went through most of them and still cannot debug this piece of code. I really need to get this working. I'm a newbie to C# and programming. I did this same assignment in Java, but for some reason I can't make it work here. If someone could please pitch in...
So I have some objects which I keep in a .txt file, one line = the data for one object. The first item on each line is the Id of the object, basically a primary key. Right now I am implementing CRUD operations, specifically Update. This edit function is supposed to contribute to that functionality.
If a user edits some of the selected object's properties, that change needs to be reflected in the .txt file. So I go through every object/line in the file and write it to some temp.txt file; once I hit the object with the same Id as the passed object o, I write that edited object to temp.txt instead. After that I need to replace the original file with temp.txt and clean up the temporary file.
I have tried a bunch of options and combinations, but none worked.
I have made sure that GetTxtPath returns the correct absolute path from within my project.
Version 1:
public static void edit(Transformable o, string fileName)
{
try
{
if (!File.Exists(FileUtils.GetTxtPath("temp.txt")))
{
File.Create(FileUtils.GetTxtPath("temp.txt"));
}
using (FileStream stream = File.OpenRead(FileUtils.GetTxtPath(fileName)))
using (FileStream writeStream = File.OpenWrite(FileUtils.GetTxtPath("temp.txt")))
{
StreamReader reader = new StreamReader(stream);
StreamWriter writer = new StreamWriter(writeStream);
String line;
while ((line = reader.ReadLine()) != null)
{
if (!line.Equals(""))
{
if (o.GetId() == getIdFromString(line))
{
writer.Write(o.WriteToFile());
}
else
{
writer.Write(line + "\n");
}
}
else
{
continue;
}
}
}
}
catch (FileNotFoundException e)
{
Console.WriteLine($"The file was not found: '{e}'");
}
catch (DirectoryNotFoundException e)
{
Console.WriteLine($"The directory was not found: '{e}'");
}
catch (IOException e)
{
Console.WriteLine($"The file could not be opened: '{e}'");
}
}
public static string GetTxtPath(string fileName)
{
var startDirectory =
Directory.GetParent(Directory.GetCurrentDirectory()).Parent.Parent.FullName;
var absPath = startDirectory + @"\data\" + fileName;
return absPath;
}
private static int getIdFromString(string line)
{
return Int32.Parse(line.Split('|')[0]);
}
Version 2:
public static void Edit(Transformable o, string fileName)
{
try
{
if (!File.Exists(FileUtils.GetTxtPath("temp.txt")))
{
File.Create(FileUtils.GetTxtPath("temp.txt"));
}
using (StreamReader reader = FileUtils.GetTxtReader(fileName))
using (StreamWriter writer = FileUtils.GetTxtWriter("temp.txt"))
{
String line;
while ((line = reader.ReadLine()) != null)
{
if (!line.Equals(""))
{
if (o.GetId() == getIdFromString(line))
{
writer.Write(o.WriteToFile());
}
else
{
writer.Write(line + "\n");
}
}
else
{
continue;
}
}
}
File.Move(FileUtils.GetTxtPath("temp.txt"), FileUtils.GetTxtPath(fileName));
File.Delete(FileUtils.GetTxtPath("temp.txt"));
//Here I tried many different options but nothing worked
//Here is Java code which did the job of renaming and deleting
//------------------------------------------------------------
// File original = FileUtils.getFileForName(fileName);
// File backUpFile = new File("backUp");
// Files.move(original.toPath(), backUpFile.toPath(),
// StandardCopyOption.REPLACE_EXISTING);
// File temporary = FileUtils.getFileForName(temporaryFilePath);
// temporary.renameTo(original);
// backUpFile.delete();
// File original = FileUtils.getFileForName(path);
//--------------------------------------------------------
//public static File getFileForName(String name)
//{
// String dir = System.getProperty("user.dir");
// String sP = System.getProperty("file.separator");
// File dirData = new File(dir + sP + "src" + sP + "data");
// File file = new File(dirData.getAbsolutePath() + sP + name);
// return file;
//}
//---------------------------------------------------------------------
}
catch (FileNotFoundException e)
{
Console.WriteLine($"The file was not found: '{e}'");
}
catch (DirectoryNotFoundException e)
{
Console.WriteLine($"The directory was not found: '{e}'");
}
catch (IOException e)
{
Console.WriteLine($"The file could not be opened: '{e}'");
}
public static StreamReader GetTxtReader(string fileName)
{
var fileStream = new FileStream(GetTxtPath(fileName), FileMode.Open, FileAccess.Read);
return new StreamReader(fileStream, Encoding.UTF8);
}
public static StreamWriter GetTxtWriter(string fileName)
{
FileStream fileStream = new FileStream(GetTxtPath(fileName), FileMode.Append);
return new StreamWriter(fileStream, Encoding.UTF8);
}
A reworked version of Edit (from the answer below):
public static void Edit(Transformable o, string fileName)
{
try
{
string tempName = "temp.txt"; // create here correct path
using (var readStream = File.OpenRead(fileName))
using (var writeStream = File.OpenWrite(tempName))
using (var reader = new StreamReader(readStream))
using (var writer = new StreamWriter(writeStream))
{
string line;
while ((line = reader.ReadLine()) != null)
{
if (!line.Equals(""))
{
if (o.GetId() == GetId(line))
{
writer.WriteLine(o.ToWriteableString());
}
else
{
writer.WriteLine(line);
}
}
}
}
File.Delete(fileName);
File.Move(tempName, fileName);
}
catch ...
}
The File.OpenWrite method opens an existing file or creates a new one for writing, so there is no need to manually check for and create the file.
You have wrapped the FileStreams in using statements quite correctly. However, the StreamReader and StreamWriter must also be released after use.
I renamed some methods, giving them names that conform to the C# naming conventions: Edit, GetId, ToWriteableString.
The else branch with the continue statement is not needed.
In the end, just use the File.Delete and File.Move methods.
Note: the int.Parse method can throw exceptions that also need to be handled.
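As a possible refinement of the Delete/Move step, File.Replace swaps the two files in a single call and can keep a backup copy; a sketch using the same variables as the reworked Edit above:
//Replace fileName with tempName in one call; the backup argument may be null.
File.Replace(tempName, fileName, fileName + ".bak");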

Dumping SQL table to .csv C#

I am trying to implement a script in my application that will dump the entire contents of a SQL database (running MS SQL Server Express 2014) to a .csv file. For now it dumps everything, but I am trying to write the code so that I can easily customize it to only grab certain columns.
Here is the code I have written currently:
public void doCsvWrite(string timeStamp){
try {
//specify file name of log file (csv).
string newFileName = "C:/TestDirectory/DataExport-" + timeStamp + ".csv";
//check to see if file exists, if not create an empty file with the specified file name.
if (!File.Exists(newFileName)) {
FileStream fs = new FileStream(newFileName, FileMode.CreateNew);
fs.Close();
//define header of new file, and write header to file.
string csvHeader = "ITEM1,ITEM2,ITEM3,ITEM4,ITEM5";
using (FileStream fsWHT = new FileStream(newFileName, FileMode.Append, FileAccess.Write))
using(StreamWriter swT = new StreamWriter(fsWHT))
{
swT.WriteLine(csvHeader.ToString());
}
}
//set up connection to database.
SqlConnection myDEConnection;
String cDEString = "Data Source=localhost\\NAMEDPIPE;Initial Catalog=db;User Id=user;Password=pwd";
String strDEStatement = "SELECT * FROM table";
try
{
myDEConnection = new SqlConnection(cDEString);
}
catch (Exception ex)
{
//error handling here.
return;
}
try
{
myDEConnection.Open();
}
catch (Exception ex)
{
//error handling here.
return;
}
SqlDataReader reader = null;
SqlCommand myDECommand = new SqlCommand(strDEStatement, myDEConnection);
try
{
reader = myDECommand.ExecuteReader();
while (reader.Read())
{
for (int i = 0; i < reader.FieldCount; i++)
{
if(reader["Column1"].ToString() == "") {
//does nothing if the current line is "bugged" (containing no values at all, typically happens after reboot of 3rd party equipment).
}
else {
//grab relevant tag data and set the csv line for the current row.
string csvDetails = reader["Column1"] + "," + reader["Column2"] + "," + String.Format("{0:0.0}", reader["Column3"]) + "," + String.Format("{0:0.000}", reader["Column4"]) + "," + reader["Column5"];
using (FileStream fsWDT = new FileStream(newFileName, FileMode.Append, FileAccess.Write))
using(StreamWriter swDT = new StreamWriter(fsWDT))
{
//write csv line to file.
swDT.WriteLine(csvDetails.ToString());
}
}
}
}
}
catch (Exception ex)
{
//error handling here.
myDEConnection.Close();
return;
}
myDEConnection.Close();
}
catch (Exception ex)
{
//error handling here.
MessageBox.Show(ex.Message);
}
}
Now, this was working fine when I was using it with a 3rd-party SQLite-based database, but the output I'm getting after modifying this for my MSSQL db looks something like this (ITEM1 is the primary key, a standard auto-incrementing ID field):
ITEM1,ITEM2,ITEM3,ITEM4,ITEM5
1,row1_item2,row1_item3,row1_item4,row1_item5
1,row1_item2,row1_item3,row1_item4,row1_item5
1,row1_item2,row1_item3,row1_item4,row1_item5
1,row1_item2,row1_item3,row1_item4,row1_item5
1,row1_item2,row1_item3,row1_item4,row1_item5
1,row1_item2,row1_item3,row1_item4,row1_item5
2,row2_item2,row2_item3,row2_item4,row2_item5
2,row2_item2,row2_item3,row2_item4,row2_item5
2,row2_item2,row2_item3,row2_item4,row2_item5
2,row2_item2,row2_item3,row2_item4,row2_item5
2,row2_item2,row2_item3,row2_item4,row2_item5
3,row3_item2,row3_item3,row3_item4,row3_item5
3,row3_item2,row3_item3,row3_item4,row3_item5
3,row3_item2,row3_item3,row3_item4,row3_item5
3,row3_item2,row3_item3,row3_item4,row3_item5
....
So it seems that it writes several entries for the same row, where I would just like one single line per row. Any suggestions?
Thanks in advance.
edit: Thanks everyone for your answers!
The for loop isn't needed in the section below. Because it loops from 0 to FieldCount, I assume the loop was originally meant to append the text from each column together, but inside the loop there is a single line that concatenates the text and assigns it to csvDetails.
try
{
reader = myDECommand.ExecuteReader();
while (reader.Read())
{
for (int i = 0; i < reader.FieldCount; i++)
{
if(reader["Column1"].ToString() == "") {
//does nothing if the current line is "bugged" (containing no values at all, typically happens after reboot of 3rd party equipment).
}
else {
//grab relevant tag data and set the csv line for the current row.
string csvDetails = reader["Column1"] + "," + reader["Column2"] + "," + String.Format("{0:0.0}", reader["Column3"]) + "," + String.Format("{0:0.000}", reader["Column4"]) + "," + reader["Column5"];
using (FileStream fsWDT = new FileStream(newFileName, FileMode.Append, FileAccess.Write))
using(StreamWriter swDT = new StreamWriter(fsWDT))
{
//write csv line to file.
swDT.WriteLine(csvDetails.ToString());
}
}
}
}
}
Usually, we use specially designed export/import utilities for dumping data.
However, if you have to implement your own routine, I suggest decomposing it.
private static IEnumerable<IDataRecord> SourceData(String sql) {
using (SqlConnection con = new SqlConnection(ConnectionStringHere)) {
con.Open();
using (SqlCommand q = new SqlCommand(sql, con)) {
using (var reader = q.ExecuteReader()) {
while (reader.Read()) {
//TODO: you may want to add additional conditions here
yield return reader;
}
}
}
}
}
private static IEnumerable<String> ToCsv(IEnumerable<IDataRecord> data) {
foreach (IDataRecord record in data) {
StringBuilder sb = new StringBuilder();
for (int i = 0; i < record.FieldCount; ++i) {
String chunk = Convert.ToString(record.GetValue(i)); //GetValue(i), not GetValue(0), so every column is exported
if (i > 0)
sb.Append(',');
if (chunk.Contains(',') || chunk.Contains(';'))
chunk = "\"" + chunk.Replace("\"", "\"\"") + "\"";
sb.Append(chunk);
}
yield return sb.ToString();
}
}
Having SourceData and ToCsv you can easily implement
private static void WriteMyCsv(String fileName) {
var source = SourceData("SELECT * FROM table");
File.WriteAllLines(fileName, ToCsv(source));
}
You have a for loop which is looping over the FieldCount:
for (int i = 0; i < reader.FieldCount; i++)
I think it will work if you remove the loop as you don't need to iterate through the columns.
It happens because the output is placed inside the for loop
for (int i = 0; i < reader.FieldCount; i++)
so every record is repeated FieldCount times.
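Put together, the fixed reader loop writes one line per row; a sketch of the asker's loop with the inner for removed:
while (reader.Read())
{
    //skip "bugged" rows that contain no values at all
    if (reader["Column1"].ToString() == "")
        continue;
    string csvDetails = reader["Column1"] + "," + reader["Column2"] + "," +
        String.Format("{0:0.0}", reader["Column3"]) + "," +
        String.Format("{0:0.000}", reader["Column4"]) + "," + reader["Column5"];
    using (FileStream fsWDT = new FileStream(newFileName, FileMode.Append, FileAccess.Write))
    using (StreamWriter swDT = new StreamWriter(fsWDT))
    {
        swDT.WriteLine(csvDetails); //one CSV line per data row
    }
}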
Complete example. Verified working .NET 4.8, May 22. Code simplified for demo.
Why the DataTable? Under some circumstances it is useful: if you are converting hundreds of files at once with multiple threads, it works as a large buffer, and you can do pretty complex data mangling at the same time, should you need it.
Unfortunately, Microsoft tries to detect the column types, and if your data does not comply with that mechanism you end up with hard-to-correct errors. In that case use the second solution.
// Get the data from SQLite
SqliteConnection SQLiDataCon = new SqliteConnection(@"Data Source=c:\sqlite.db3");
SQLiDataCon.Open();
SqliteDataReader SQLiDtaReader = new SqliteCommand(@"SELECT * FROM stats;", SQLiDataCon).ExecuteReader();
// Load data to DataTable
DataTable csvTable = new DataTable();
csvTable.Load(SQLiDtaReader);
// Get "one" string with column names
string csvFields = @"""" + String.Join(@""",""", csvTable.Columns.Cast<DataColumn>().Select(dc => dc.ColumnName).ToArray()) + @"""";
// Prep "in memory the entire content of the CSV"
StringBuilder csvString = new StringBuilder();
// Write the header in
csvString.AppendLine(csvFields);
// Write the rows in
foreach (DataRow dr in csvTable.Rows)
{
csvString.AppendLine(@"""" + String.Join(@""",""", dr.ItemArray) + @"""");
}
// Save to file
using (StreamWriter csvFile = new StreamWriter(@"c:\stats.csv"))
{
csvFile.Write(csvString); //disposing the writer flushes the buffer and releases the file
}
Without DataTable.
// SQLITE
SqliteConnection SQLiDataCon = new SqliteConnection(@"Data Source=c:\sqlite.db3");
SQLiDataCon.Open();
StringBuilder csvString = new StringBuilder();
StreamWriter csvFile;
Object[] csvRow;
SqliteDataReader SQLiDtaReader = new SqliteCommand(@"SELECT * FROM sometable;", SQLiDataCon).ExecuteReader();
// CSV HEADER
csvString.AppendLine(@"""" + String.Join(@""",""", SQLiDtaReader.GetSchemaTable().AsEnumerable().Select(dr => dr.Field<string>("ColumnName")).ToArray<string>()) + @"""");
// CSV BODY
while (SQLiDtaReader.Read())
{
SQLiDtaReader.GetValues(csvRow = new Object[SQLiDtaReader.FieldCount]);
csvString.AppendLine(@"""" + String.Join(@""",""", csvRow) + @"""");
}
// WRITE IT
csvFile = new StreamWriter(@"C:\somecsvfile.csv");
csvFile.Write(csvString);
csvFile.Close(); //flush the buffer and release the file

The process cannot access the file because it is being used by another process. (Text File will not close)

I am trying to write to a text file after this code block checks the last time the PC was restarted. The code below reads from a text file the last time the PC was restarted, and from there it determines whether to show a splash screen. However, after this method runs, I need to write to the text file what the current "System Up-Time" is. But I keep getting an error that says the text file is in use. This has driven me insane. I have made sure all StreamWriters and StreamReaders are closed. I have tried using statements. I have tried GC.Collect. I feel like I have tried everything.
Any help would be appreciated.
private void checkLastResart()
{
StreamReader sr = new StreamReader(Path.GetDirectoryName(Application.ExecutablePath) + @"\Settings.txt");
if (sr.ReadLine() == null)
{
sr.Close();
MessageBox.Show("There was an error loading 'System UpTime'. All settings have been restored to default.");
StreamWriter sw = new StreamWriter(Path.GetDirectoryName(Application.ExecutablePath) + @"\Settings.txt", false);
sw.WriteLine("Conversion Complete Checkbox: 0");
sw.WriteLine("Default Tool: 0");
sw.WriteLine("TimeSinceResart: 0");
sw.Flush();
sw.Close();
}
else
{
try
{
StreamReader sr2 = new StreamReader(Path.GetDirectoryName(Application.ExecutablePath) + @"\Settings.txt");
while (!sr2.EndOfStream)
{
string strSetting = sr2.ReadLine();
if (strSetting.Contains("TimeSinceResart:"))
{
double lastTimeRecorded = double.Parse(strSetting.Substring(17));
//If lastTimeRecorded is greater than timeSinceRestart (the computer has been restarted) OR 2 hours have passed since LVT was last run
if (lastTimeRecorded > timeSinceRestart || lastTimeRecorded + 7200 < timeSinceRestart)
{
runSplashScreen = true;
}
else
{
runSplashScreen = false;
}
}
}
sr2.Close();
sr2.Dispose();
}
catch (Exception e) { MessageBox.Show("An error has occured loading 'System UpTime'.\r\n\r\n" + e); }
}
}
Below is a sample of writing to the text file after the above code has run. It doesn't matter if I open a StreamWriter or use File.WriteAllLines, an error is thrown immediately.
StreamWriter sw = new StreamWriter(Path.GetDirectoryName(Application.ExecutablePath) + @"\Settings.txt");
string[] lines = File.ReadAllLines(Path.GetDirectoryName(Application.ExecutablePath) + @"\Settings.txt");
lines[2] = "TimeSinceResart: " + timeSinceRestart;
foreach (string s in lines)
sw.WriteLine(s);
Your writing code should be changed in this way
string file = Path.Combine(Path.GetDirectoryName(Application.ExecutablePath),"Settings.txt");
// First read the two lines in memory
string[] lines = File.ReadAllLines(file);
// then use the StreamWriter that locks the file
using(StreamWriter sw = new StreamWriter(file))
{
lines[2] = "TimeSinceResart: " + timeSinceRestart;
foreach (string s in lines)
sw.WriteLine(s);
}
In this way the lock taken by the StreamWriter doesn't block the reading done with File.ReadAllLines.
That said, please note a couple of things. Do not build path strings with string concatenation; use the static methods of the Path class. But most important, when you create a disposable object like a stream, be sure to use a using statement so the file is closed correctly.
To complete the answer in response to your comment: use a using statement also for the first part of your code:
private void checkLastResart()
{
string file = Path.Combine(Path.GetDirectoryName(Application.ExecutablePath),"Settings.txt");
using(StreamReader sr = new StreamReader(file))
{
if (sr.ReadLine() == null)
{
sr.Close();
MessageBox.Show(...)
using(StreamWriter sw = new StreamWriter(file, false))
{
sw.WriteLine("Conversion Complete Checkbox: 0");
sw.WriteLine("Default Tool: 0");
sw.WriteLine("TimeSinceResart: 0");
sw.Flush();
}
}
else
{
....
}
} // exit using block closes and disposes the stream
}
Where you create sr2, sr still has settings.txt open.
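Building on that, one way to avoid overlapping readers entirely is to read the settings file once into memory and work on the lines; a sketch of checkLastResart reworked that way, assuming the same file format and the same runSplashScreen and timeSinceRestart fields:
private void checkLastResart()
{
    string file = Path.Combine(Path.GetDirectoryName(Application.ExecutablePath), "Settings.txt");
    //Read everything up front so no stream is left open afterwards.
    string[] lines = File.ReadAllLines(file);
    if (lines.Length == 0)
    {
        MessageBox.Show("There was an error loading 'System UpTime'. All settings have been restored to default.");
        File.WriteAllLines(file, new[]
        {
            "Conversion Complete Checkbox: 0",
            "Default Tool: 0",
            "TimeSinceResart: 0"
        });
        return;
    }
    foreach (string strSetting in lines)
    {
        if (!strSetting.Contains("TimeSinceResart:"))
            continue;
        double lastTimeRecorded = double.Parse(strSetting.Substring(17));
        //Splash screen if the PC was restarted or more than 2 hours have passed.
        runSplashScreen = lastTimeRecorded > timeSinceRestart
                       || lastTimeRecorded + 7200 < timeSinceRestart;
    }
}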

File.Move throws error when used with BackgroundWorker in C#

Solved
I figured out that the GetNewFolderNameBasedOnDate method internally didn't close the file. I have fixed that method and it is working normally now.
I am trying to move selected files from one folder to another using a BackgroundWorker in C#. Here is my DoWork() method that determines whether to move the files or just copy them. My File.Move() throws the exception "The process cannot access the file because it is being used by another process". I tried the different methods mentioned in the threads here on Stack Overflow.
private void FileProcessor_DoWork(object sender, DoWorkEventArgs e)
{
// Copy files
long bytes = 0;
string destSubFolder = String.Empty;
string destFile = string.Empty;
foreach (FileInfo file in oSettings.SourceFiles)
{
try
{
this.BeginInvoke(OnChange, new object[] { new UIProgress(file.Name, bytes, oSettings.MaxBytes) });
destSubFolder = GetNewFolderNameBasedOnDate(file);
//Create a new subfolder under the current active folder
string newPath = Path.Combine(oSettings.TargetFolder, destSubFolder);
// Create a new target folder, if necessary.
if (!System.IO.Directory.Exists(newPath))
{
System.IO.Directory.CreateDirectory(newPath);
}
destFile = Path.Combine(oSettings.TargetFolder, destSubFolder, file.Name);
if (chkDeleteSourceFiles.Checked)
{
FileInfo f = new FileInfo(file.FullName);
if (f.Exists)
{
File.Move(file.FullName, destFile);
}
}
else
{
File.Copy(file.FullName, destFile, true);
}
//Thread.Sleep(2000);
}
catch (Exception ex)
{
UIError err = new UIError(ex, file.FullName);
this.Invoke(OnError, new object[] { err });
if (err.result == DialogResult.Cancel) break;
}
bytes += file.Length;
}
}
I tried deleting the files in the RunWorkerCompleted method too, but that didn't resolve the problem: it fails when it tries to delete the last file in the list.
private void FileProcessor_RunWorkerCompleted(object sender, RunWorkerCompletedEventArgs e)
{
// Operation completed, update UI
ChangeUI(false);
foreach (FileInfo file in oSettings.SourceFiles)
{
File.Delete(file.FullName);
}
}
GetNewFolderNameBasedOnDate() calls GetDateTaken(), which was the culprit. Earlier I didn't use a FileStream but used Image myImage = Image.FromFile(filename); I didn't know that Image.FromFile locks the file.
private DateTime GetDateTaken(string fileName)
{
try
{
using (FileStream fs = new FileStream(fileName, FileMode.Open, FileAccess.Read))
{
Image myImage = Image.FromStream(fs);
PropertyItem propItem = myImage.GetPropertyItem(36867);
DateTime dtaken;
//Convert date taken metadata to a DateTime object
string sdate = Encoding.UTF8.GetString(propItem.Value).Trim();
string secondhalf = sdate.Substring(sdate.IndexOf(" "), (sdate.Length - sdate.IndexOf(" ")));
string firsthalf = sdate.Substring(0, 10);
firsthalf = firsthalf.Replace(":", "-");
sdate = firsthalf + secondhalf;
dtaken = DateTime.Parse(sdate);
return dtaken;
}
}
catch (Exception ex)
{
return DateTime.Now;
}
}
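One extra detail in GetDateTaken: the Image created from the stream is itself IDisposable, so it can be wrapped in a using block as well; a sketch of that inner part only:
using (FileStream fs = new FileStream(fileName, FileMode.Open, FileAccess.Read))
using (Image myImage = Image.FromStream(fs))
{
    PropertyItem propItem = myImage.GetPropertyItem(36867); //EXIF DateTimeOriginal
    //...parse the date exactly as above...
}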
Instead of creating new FileInfo objects, keep it simple and re-use the same one. I suspect the problem is that you have multiple references to the same file in your code, which prevents it from being removed. Try something like this to move it:
if (chkDeleteSourceFiles.Checked)
{
if (file.Exists)
{
file.MoveTo(destFile);
}
}
My guess is that it is the BeginInvoke call to OnChange and the new UIProgress() object that is holding onto the file. Does UIProgress open the file? You could try just using Invoke() and see if that helps.
