Unlock file after program executes - C#

When I start my C# program, I cannot delete the executable file while it is running.
How can I unlock my assembly after it starts executing, so the user can delete the file? Maybe copy the assembly to another place and then execute it? But I don't think that's the best way.

I think you'll have to copy your assembly to a temporary location and relaunch it. Something like this (I haven't tested it):
using System;
using System.Diagnostics;
using System.IO;
using System.Windows.Forms;

public class Program {
    static void Main() {
        if (Array.IndexOf(Environment.GetCommandLineArgs(), "/unlocked") == -1) {
            // We were not launched by ourselves: copy the assembly to a temp
            // location so the user can delete the original.
            string tempLocation = Path.Combine(
                Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData),
                Path.GetFileName(Application.ExecutablePath));
            File.Copy(Application.ExecutablePath, tempLocation, true); // overwrite any stale copy
            Process proc = new Process();
            proc.StartInfo.FileName = tempLocation;
            proc.StartInfo.Arguments = "/unlocked";
            proc.Start();
            // No more work in Main, so the application closes...
        } else {
            // ...else initialize the application as normal.
        }
    }
}
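If you also want the relaunched copy to remove the original automatically instead of leaving that to the user, a possible extension (a sketch, untested; it assumes you pass the original path as a second argument when relaunching) would be:

// Hypothetical: when relaunching, also pass the original location.
proc.StartInfo.Arguments = "/unlocked \"" + Application.ExecutablePath + "\"";

// Then, in the relaunched instance, retry the delete briefly, since the
// parent process may still be shutting down and holding the file.
string original = Environment.GetCommandLineArgs()[2];
for (int i = 0; i < 10 && File.Exists(original); i++) {
    try { File.Delete(original); break; }
    catch (IOException) { System.Threading.Thread.Sleep(200); }
}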

You cannot delete your file because the file itself is the backing store for the virtual memory pages that hold the program code in memory.
When a page of your code is paged out of memory by the OS, it does not go to the pagefile; it is simply discarded and (when needed) loaded back from the file.
I suppose you could force the program to be pagefile-backed (IIRC, programs executed from removable media like a CD-ROM are pagefile-backed), but probably the best solution is just to copy the assembly to another location, as you suggested.

Related

Batch file with C# console app, how to stop the .bat file from executing?

I have a batch file. I also have a C# console app that checks whether a certain set of files exists in a certain directory.
What I need is: if the files are present, continue to run the next steps in the .bat file; otherwise, EXIT.
I need some help with how the console app can talk to the .bat file (you know what I mean) and then stop the rest of the .bat file from running if the input files are not present.
It's really unclear what you are asking for; however, let me try to answer. I believe you want to continue the program if "some files" exist, and otherwise exit the console app.
Assuming these files are somewhere you can access consistently through a directory, and their path won't be changing, you could use the following code:
static string pathToFiles = @"C:/Users/YOUR_USER/"; // wherever; static so Main can use it
static string[] fileNames = { "wordDocument.docx", "application.exe", "list.txt" }; // etc.
static void Main()
{
    foreach (string file in fileNames)
    {
        if (!File.Exists(Path.Combine(pathToFiles, file)))
        {
            Environment.Exit(-1); // a non-zero exit code signals that a file is missing
        }
    }
    continueTheProgram(); // all files were found
}
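On the batch side, the console app's exit code is exposed as %ERRORLEVEL%, so immediately after calling the app the batch file can bail out with a line like `if not "%ERRORLEVEL%"=="0" exit /b`. (If you return a positive code such as 1 from Environment.Exit instead of -1, the classic `if errorlevel 1` check also works.)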
Edit: I'm noticing that you mention a batch file. Why not just have the Console App do all the work, rather than bouncing between the two?

How to use Thread Pool and Mutex in C#?

I'm trying to learn how to use a thread pool and a mutex; as practice, I'm making an application that copies files from one path on the computer to another.
To do this I used the thread pool (so a few copies can happen simultaneously):
object[] paths = new object[2]; // The source path and the destination path
string[] files = System.IO.Directory.GetFiles(sourceFolderPath); // string sourceFolderPath = folder path that contains the files
foreach (string s in files)
{
    paths[0] = s; // The file source - s = the file name with the file path
    paths[1] = s.Replace(sourceFolderPath, destFolderPath); // Swaps the source folder for the destination folder but keeps the file name
    ThreadPool.QueueUserWorkItem(new WaitCallback(CopyFiles), paths);
}
So far, the application sends each of the files to the function that copies the file from the source folder to the destination folder.
The CopyFiles function looks like this:
static void CopyFiles(object paths)
{
    object[] temp = paths as object[]; // The cast to object array
    string sourcePath = temp[0].ToString();
    string destPath = temp[1].ToString();
    System.IO.File.Copy(sourcePath, destPath, true); // The copy from the source to the dest
}
The weird thing is that when I run the application it throws an exception: "The process cannot access the file 'C:..........' because it is being used by another process".
When I tried to find the mistake by running the application step by step, it ran properly and all of the files were copied from the source folder to the destination folder.
That made me think that using the ThreadPool let two or more threads open the same file (which should not happen, because I used foreach and each file path is passed as a parameter only once).
To solve this I tried to use a Mutex, and now the CopyFiles function looks like this:
static Mutex mutex = new Mutex();
static void CopyFiles(object paths)
{
    mutex.WaitOne(); // Waits until the critical section is free of other threads
    try
    {
        object[] temp = paths as object[]; // The cast to object array
        string sourcePath = temp[0].ToString();
        string destPath = temp[1].ToString();
        System.IO.File.Copy(sourcePath, destPath, true); // The copy from the source to the dest
    }
    finally
    {
        mutex.ReleaseMutex(); // Release the critical section
    }
}
Now the application should wait until the critical section is free and only then try to copy the file, so the exception "The process cannot access the file 'C:..........' because it is being used by another process" should not appear.
As I expected, the exception did not appear, but the application copied only one file from the source folder to the destination folder, not all of them.
When I ran the application step by step, the same weird thing happened and everything worked properly: all of the files were copied to the destination folder.
Why does this happen? And how can I solve it so that all the files are copied in a normal run, not just step by step?
Your problem isn't the ThreadPool. What goes wrong is the argument.
In your first code snippet, you fill an object array with two argument values and pass it to the Queue method. The trouble is that you always reuse the same array: in the first iteration of your foreach loop you write two values into the array and pass it; eventually the method gets executed by the ThreadPool using that object array, but in the meantime the second iteration of the foreach loop has already written into that exact same array and passed it again. That means two (or more) threads end up working on the same array.
You don't know when CopyFiles becomes active, so you can't tell when the array has been read and is ready for reuse. That could be fixed with mutual exclusion (the easiest way in C# is the lock keyword), but that's not the right tool here.
The better way is to create a new object array in each iteration of the foreach loop. Simply change the code to:
string[] files = System.IO.Directory.GetFiles(sourceFolderPath); // string sourceFolderPath = folder path that contains the files
foreach (string s in files)
{
    object[] paths = new object[2]; // A fresh array per work item: the source path and the destination path
    paths[0] = s; // The file source - s = the file name with the file path
    paths[1] = s.Replace(sourceFolderPath, destFolderPath); // Swaps the source folder for the destination folder but keeps the file name
    ThreadPool.QueueUserWorkItem(new WaitCallback(CopyFiles), paths);
}

jpegoptim on ASP.Net - "error opening temporary file"

I suspect I'm failing to understand where jpegoptim tries to write its temp files.
I have IIS 7.5 running an ASP.Net 4 AppDomain. In it I have a process that optimizes JPEGs with jpegoptim like so:
FileHelper.Copy(existingPath, optimizerPath);
var jpegOptimResult = await ImageHelper.JpegOptim(optimizerPath, 30);
Running locally I get an optimized image. Running on the above server I get:
D:\www\hplusf.com\b\pc\test.jpg 4096x2990 24bit N Adobe [OK] jpegoptim: error opening temporary file.
I can show the code for FileHelper.Copy(), but it's basically just File.Copy() that overwrites if the file already exists.
Here's ImageHelper.JpegOptim:
public static async Task<string> JpegOptim(string path, int quality)
{
string jpegOptimPath = Path.GetDirectoryName(new Uri(Assembly
.GetExecutingAssembly().CodeBase).LocalPath)
+ #"\Lib\jpegoptim.exe";
var jpegOptimResult = await ProcessRunner.O.RunProcess(
jpegOptimPath,
"-m" + quality + " -o -p --strip-all --all-normal \"" + path + "\"",
false, true
);
return jpegOptimResult;
}
jpegOptimResult is what you're seeing there as the error message it's producing. And here's ProcessRunner.RunProcess:
public async Task<string> RunProcess(string command, string args,
bool window, bool captureOutput)
{
var processInfo = new ProcessStartInfo(command, args);
if (!window)
makeWindowless(processInfo);
string output = null;
if (captureOutput)
output = await runAndCapture(processInfo);
else
runDontCapture(processInfo);
return output;
}
protected void makeWindowless(ProcessStartInfo processInfo)
{
processInfo.CreateNoWindow = true;
processInfo.WindowStyle = ProcessWindowStyle.Hidden;
}
protected async Task<string> runAndCapture(ProcessStartInfo processInfo)
{
processInfo.UseShellExecute = false;
processInfo.RedirectStandardOutput = true;
processInfo.RedirectStandardError = true;
var process = Process.Start(processInfo);
var output = process.StandardOutput;
var error = process.StandardError;
while (!process.HasExited)
{
await Task.Delay(100);
}
string s = output.ReadToEnd();
s += '\n' + error.ReadToEnd();
return s;
}
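A side note on runAndCapture: polling HasExited while both output streams are redirected can deadlock if the child process fills a pipe buffer before exiting. jpegoptim's output is small, so that is not the problem here, but a safer pattern (a sketch, assuming .NET 4.5 for ReadToEndAsync) would be:

var process = Process.Start(processInfo);
// Drain both streams concurrently so neither pipe buffer can fill up and
// block the child process.
Task<string> stdout = process.StandardOutput.ReadToEndAsync();
Task<string> stderr = process.StandardError.ReadToEndAsync();
process.WaitForExit();
return await stdout + "\n" + await stderr;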
So:
jpegOptim runs properly on my local machine, and optimizes the file, so it's not how I'm calling jpegOptim.
The Copy operation succeeds without Exception, so it's not a Permissions issue with the ASP.Net user reading/writing from that directory
jpegOptim just optimizes and overwrites the file, so if it is in fact running under the same ASP.Net user, it should have no problem writing this file, but...
It's unclear where jpegOptim attempts to write its temp file, so perhaps the underlying issue is where this temporary file is being written.
However, judging by the Windows source:
http://sourceforge.net/p/jpegoptim/code/HEAD/tree/jpegoptim-1.3.0/trunk/jpegoptim.c
jpegOptim's "temporary file" appears to just be the destination file when used with the above options. Relevant lines of jpegOptim source:
int dest = 0;
int main(int argc, char **argv)
{
...
There's some code here looking for the -d argument that sets dest=1 - meaning here dest remains 0. It then hits an if branch, and the else clause, for dest == 0, does this:
if (!splitdir(argv[i],tmpdir,sizeof(tmpdir)))
fatal("splitdir() failed!");
strncpy(newname,argv[i],sizeof(newname));
That's copying the directory name portion of the input image filename to the variable tmpdir - so like C:\Blah\18.jpg would assign tmpdir="C:\Blah\". Then it dumps the entire input image filename to newname, meaning it's just going to overwrite it in place.
At this point in the code the variables it's using should be:
dest=0
argv[i]=D:\www\hplusf.com\b\pc\test.jpg
tmpdir=D:\www\hplusf.com\b\pc\
newname=D:\www\hplusf.com\b\pc\test.jpg
It then does in fact open the file (there's an opportunity to error out at that point, and we get past it), and it goes on to decompress the file, further confirming the input is being opened successfully.
The specific error message I'm seeing occurs in these lines - I'll confess I don't know if MKSTEMPS is set or not for a default build (which I'm using):
snprintf(tmpfilename,sizeof(tmpfilename),
"%sjpegoptim-%d-%d.XXXXXX.tmp", tmpdir, (int)getuid(), (int)getpid());
#ifdef HAVE_MKSTEMPS
if ((tmpfd = mkstemps(tmpfilename,4)) < 0)
fatal("error creating temp file: mkstemps() failed");
if ((outfile=fdopen(tmpfd,"wb"))==NULL)
#else
tmpfd=0;
if ((outfile=fopen(tmpfilename,"wb"))==NULL)
#endif
fatal("error opening temporary file");
So snprintf is like C# String.Format(), which should produce a path like:
D:\www\hplusf.com\b\pc\jpegoptim-1-2.XXXXXX.tmp
Judging by what I can find, it's likely MKSTEMPS is not defined, so fopen is called with "wb" (write binary); it's returning null, meaning it failed to open the file, and out comes the error message.
So - possible causes:
Bad path in tmpdir: It's possible I'm following the C poorly (likely), but from the looks of it, tmpdir should be identical to the source path of the image. Perhaps jpegoptim mangles it on the way into tmpdir? The input path itself is clearly clean, because jpegoptim emits it cleanly in the error message.
Permissions issue: Seems fairly unlikely. The ASP.Net user this runs under can clearly read and write to the directory, because it copies the file there before jpegoptim fires, and the only user on the machine with any permissions to this dir is that user, so jpegoptim should have failed earlier if it were permissions. It could be attempting to access a different dir, but that would really be the bad-tmpdir scenario.
Something else I've not thought of.
Ideas?
Note: This question is similar:
Using jpegtran, jpegoptim, or other jpeg optimization/compression in C#
However, that question is asking about a shared env on GoDaddy, causing answers to spiral around the likelihood he can't spin up processes. We have full control over our server, and as should be clear from the above, the jpegoptim Process is definitely starting successfully, so it's a different scenario.
As it turns out, my reading of jpegoptim was incorrect. The tmpdir it uses is wherever the executable's Working Directory points, not where the input images are, and not where the executable sits. So the solution was two-fold:
Give the exe permissions to write to its own directory* (but deny it access to modify itself)
Modify ProcessRunner to run processes in-place - set the Working Directory to where the exe resides.
The second modification looks like this:
var processInfo = new ProcessStartInfo(command, args);
// Ensure the exe runs in the path where it sits, rather than somewhere
// less safe like the website root
processInfo.WorkingDirectory = (new FileInfo(command)).DirectoryName;
*Note: I happen to have jpegoptim.exe isolated on the server to its own dir to limit risk. If you had it someplace more global like Program Files, you definitely should not do this - instead set the Working Directory as above, but to someplace isolated/safe like a tmp dir or even better a scratch disk. If you've got the RAM for it a RAMdrive would be fastest.
**Second Note: Because of how hard drives and jpegoptim work, if the tmp location is not on the same disk as the ultimate destination of the output, a potential partial race condition is introduced between jpegoptim and any other code that depends on its outputs. If you use the same disk, then when jpegoptim is done the output JPEG is complete: the OS changes the entry in its file table, but the image data has already been written to completion. When tmp and destination are separate disks, jpegoptim finishes by telling the OS to move the file from tmpdir to the output dir. That data move finishes some time after jpegoptim is done running, and if your waiting code is fast enough it will start its work with an incomplete JPEG.
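For reference, here is a minimal sketch of the isolated scratch-directory variant suggested in the notes above (my own illustration, not from the original answer; mind the cross-disk caveat in the second note):

var processInfo = new ProcessStartInfo(command, args);
// Point the working directory (and therefore jpegoptim's temp files) at an
// isolated scratch folder instead of the exe's own directory.
string scratch = Path.Combine(Path.GetTempPath(), "jpegoptim-scratch");
Directory.CreateDirectory(scratch);
processInfo.WorkingDirectory = scratch;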

Clarification on Build Action

I have a project that uses an Access DB file for reference tables. This is to be used at work, but I am developing it at home. Up until now, I've simply run the debugger in VS2010, then copied the relevant class files, exe, etc from the /bin folder to a flash drive, and it's worked fine. But with the DB added in, it suddenly crashes on launch.
I know the problem is the file location of the DB file. Originally the Build Action of the DB was set to Content. I have changed it to Embedded Resource, which as far as I understand means it will be part of the exe file now.
Am I correct in this? If not, what option do I need to select to have the DB become just a compiled part of the exe, or one of the other dll's?
If the db file is embedded, you can't access it to add/remove rows etc.
Why did you change the build action to Embedded Resource? It would be better to leave it as Content, so the db is a separate file from the exe (but still in the same directory), and then build the path to the db file (e.g. using Application.StartupPath).
Anyway, if you want to keep it as Embedded, you'll need to extract the db at runtime and store it somewhere before using it.
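If you do keep it as Content, building that path is a one-liner (a sketch; "MyFile.db" is a placeholder for your actual file name):

// Assumes the db file is deployed alongside the exe.
string dbPath = Path.Combine(Application.StartupPath, "MyFile.db");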
Here is a method that can extract a file from the embedded resources (of course you'll need to change the filename, or pass it as argument):
private void ExtractFromAssembly()
{
    string strPath = Application.LocalUserAppDataPath + "\\MyFile.db";
    if (File.Exists(strPath)) return; // already exists, don't overwrite
    Assembly assembly = Assembly.GetExecutingAssembly();
    // In the next line you must provide Namespace.FileName.Extension exactly as
    // the resource is embedded (replace with your own namespace and file name)
    var input = assembly.GetManifestResourceStream("MyNamespace.MyFile.db");
    var output = File.Open(strPath, FileMode.CreateNew);
    CopyStream(input, output);
    input.Dispose();
    output.Dispose();
    // Optional: open the extracted file with its default handler
    System.Diagnostics.Process.Start(strPath);
}
private void CopyStream(Stream input, Stream output)
{
byte[] buffer = new byte[32768];
while (true)
{
int read = input.Read(buffer, 0, buffer.Length);
if (read <= 0)
return;
output.Write(buffer, 0, read);
}
}
The file will be copied to the local application path in the user directory. This happens only the first time the app starts; otherwise the db file would be overwritten each time the application starts (overwritten with the clean db packaged in the exe).
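As a side note, if you can target .NET 4 or later, Stream.CopyTo replaces the manual buffer loop in CopyStream (a sketch, using the same hypothetical resource name as above):

using (var input = assembly.GetManifestResourceStream("MyNamespace.MyFile.db"))
using (var output = File.Open(strPath, FileMode.CreateNew))
{
    input.CopyTo(output); // built into Stream since .NET 4
}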

Randomly Thrown IO Exceptions in C#

I have been working on a program to archive old client data for the company I work for. The program copies the data from the work server to the local machine, creates a .zip file of all the data, then copies it to the archive server.
After it does all that, it deletes the original files and the local copies. Every now and then, the program errors out because it can't delete the local copies off my computer. It does not error on every folder that it zips; it will error after doing 300 of them, or after 5. It throws one of the three following errors: "The directory is not empty", "File is being used by another process", or "Access to the file is denied". I have tried setting the file attributes to normal, using a forced garbage collection, and ending the WinZip process manually.
I really do not understand why it does this only sometimes. I am the admin on my computer, and it should be able to delete the files. I figured something else has to be using them, but there should be nothing else using them on my machine except the program in Visual Studio. Thanks.
Below is the cleanup method where it is not deleting the files and the method that zips the files.
[MethodImplAttribute(MethodImplOptions.NoInlining)]
static void CleanUp(SqlConnection Connection, string jNumber, DirectoryInfo dir, bool error, string prefix)
{
if (!error | (!error & emptyFolder))
{
try
{
SqlCommand updateJob = new SqlCommand(string.Format("update job set archived = 1 where job = {0}", jNumber), sdiConnection);
updateJob.ExecuteNonQuery();
}
catch
{
WriteToLog("SQL Error: " + jNumber, "There was an error changing the archive bit to 1 after the job has been archived");
}
try
{
GC.Collect();
GC.WaitForPendingFinalizers();
}
catch
{
WriteToLog("Error cleaning up after processing job", "There was an error garbage collecting.");
}
try
{
//path of the temporary folder created by the program
string tempDir = Path.Combine(Path.Combine(System.Environment.CurrentDirectory, "Temp"), jNumber);
//path of the destination folder
string destDir = Path.Combine(dir.ToString(), jNumber);
//SetFileAttributes(tempDir);
try
{
File.Delete(tempDir + ".zip");
}
catch (System.IO.IOException)
{
File.Delete(tempDir + ".zip");
}
try
{
Directory.Delete(destDir, true);
}
catch (System.IO.IOException)
{
Directory.Delete(destDir, true);
}
try
{
Directory.Delete(tempDir, true);
}
catch (System.IO.IOException)
{
Directory.Delete(tempDir, true);
}
}
catch
{
WriteToLog("File Error: " + jNumber, "There was an error removing files and/or folders after the job has been archived. Please check the source server and destination server to make sure everything copied correctly. The archive bit for this job was set.");
Directory.Delete(Path.Combine(System.Environment.CurrentDirectory, "Temp"), true);
Directory.CreateDirectory(Path.Combine(System.Environment.CurrentDirectory, "Temp"));
}
}
static bool ZipJobFolder(string jNumber, string jPath)
{
try
{
string CommandStr = @"L:\ZipFiles\winzip32.exe";
string parameters = "-min -a -r \"" + jNumber + "\" \"" + jPath + "\"";
ProcessStartInfo starter = new ProcessStartInfo(CommandStr, parameters);
starter.CreateNoWindow = true;
starter.RedirectStandardOutput = false;
starter.UseShellExecute = false;
Process process = new Process();
process.StartInfo = starter;
Console.WriteLine("Creating .zip file");
process.Start();
process.WaitForExit();
Process[] processes;
string procName = "winzip32.exe";
processes = Process.GetProcessesByName(procName);
foreach (Process proc in processes)
{
Console.WriteLine("Closing WinZip Process...");
proc.Kill();
}
}
catch
{
WriteToLog(jNumber, "There was error zipping the files of this job");
return false;
}
return true;
}
I have noticed this behavior using Windows Explorer while deleting large folders with a lot of files and sub-folders: after waiting a bit and then deleting again, it appears to work fine.
Because of that, I have always assumed it was flaky behavior of the operating system.
Although this is not a solution, you could try making your application sleep for a small amount of time before attempting to delete those files, and see if the error still occurs.
If the errors go away, it would appear to be related to some timing issue. I would want to know the source of the issue myself, though.
Commenters are pointing to antivirus software. That would make sense; if it is true, then you need to write some code to check whether the file is locked before trying to delete it. If it is locked, sleep for a bit, then check again until it is no longer locked, and only then go ahead and delete it.
You just need to be careful not to end up in an infinite wait loop.
Edit:
There is a related question about How to best wait for a filelock to release check it out for ideas.
Edit2:
Here is another possible solution Wait until file is unlocked in .Net
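A minimal retry sketch along those lines (my own, with a bounded number of attempts; tune the delay and attempt count to taste):

using System.IO;
using System.Threading;

static void DeleteWithRetry(string path, int maxAttempts)
{
    for (int attempt = 1; ; attempt++)
    {
        try
        {
            if (Directory.Exists(path))
                Directory.Delete(path, true);
            else if (File.Exists(path))
                File.Delete(path);
            return;
        }
        catch (IOException) // sharing violation, directory not empty
        {
            if (attempt >= maxAttempts) throw;
        }
        catch (UnauthorizedAccessException) // access denied, often transient
        {
            if (attempt >= maxAttempts) throw;
        }
        Thread.Sleep(500); // give the AV scanner or indexer time to let go
    }
}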
Most likely you are getting a sharing violation - the delete can't get the exclusive file handle. One way this happens is the AV software gets triggered and doesn't finish scanning before the call to delete. It could also be that the WinZip process isn't fully dead yet.
Several ways to handle it:
1) On failure, sleep for a few seconds & try again.
2) I would probably not use WinZip; instead, use ZipStorer (http://zipstorer.codeplex.com/). It will zip the files in-process, so you won't need the kill step and you will have much more granular control. You could also do the zips in parallel by spinning up multiple threads.
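In the same spirit, if the project can target .NET 4.5 or later, the built-in System.IO.Compression.ZipFile also zips in-process (a sketch with a hypothetical method name; jNumber and jPath as in the question's code):

using System.IO.Compression; // reference System.IO.Compression.FileSystem

static void CreateJobZip(string jNumber, string jPath)
{
    // Creates jNumber.zip from the folder contents; no external process,
    // so there is no WinZip handle left behind to keep files locked.
    ZipFile.CreateFromDirectory(jPath, jNumber + ".zip");
}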
One thing I found that helps is to not create temp files and directories as part of a single File.Move or File.Copy call. Rather, manually create the parent directories, starting at the highest level and working downwards; then, when all parent directories exist, Move or Copy the file.
Antivirus software could be an issue: if it is currently reading your file, it will cause exactly this. To be honest, I've seen this pop up many times when using the .NET Framework, and I just wrap the file operation in a loop and keep attempting it until it no longer throws the exception. This can also happen if a file is currently being copied, or is still sitting in the kernel's buffers while some kind of file watcher is implemented.
