jpegoptim on ASP.Net - "error opening temporary file" - c#

I suspect I'm failing to understand where jpegoptim tries to write its temp files.
I have IIS 7.5 running an ASP.Net 4 AppDomain. In it I have a process that optimizes JPEGs with jpegoptim like so:
FileHelper.Copy(existingPath, optimizerPath);
var jpegOptimResult = await ImageHelper.JpegOptim(optimizerPath, 30);
Running locally I get an optimized image. Running on the above server I get:
D:\www\hplusf.com\b\pc\test.jpg 4096x2990 24bit N Adobe [OK] jpegoptim: error opening temporary file.
I can show the code for FileHelper.Copy(), but it's basically just File.Copy() that overwrites if the file already exists.
Here's ImageHelper.JpegOptim:
public static async Task<string> JpegOptim(string path, int quality)
{
    string jpegOptimPath = Path.GetDirectoryName(new Uri(Assembly
        .GetExecutingAssembly().CodeBase).LocalPath)
        + @"\Lib\jpegoptim.exe";

    var jpegOptimResult = await ProcessRunner.O.RunProcess(
        jpegOptimPath,
        "-m" + quality + " -o -p --strip-all --all-normal \"" + path + "\"",
        false, true
    );

    return jpegOptimResult;
}
jpegOptimResult is where the error message shown above comes from. And here's ProcessRunner.RunProcess:
public async Task<string> RunProcess(string command, string args,
    bool window, bool captureOutput)
{
    var processInfo = new ProcessStartInfo(command, args);
    if (!window)
        makeWindowless(processInfo);

    string output = null;
    if (captureOutput)
        output = await runAndCapture(processInfo);
    else
        runDontCapture(processInfo);

    return output;
}
protected void makeWindowless(ProcessStartInfo processInfo)
{
    processInfo.CreateNoWindow = true;
    processInfo.WindowStyle = ProcessWindowStyle.Hidden;
}
protected async Task<string> runAndCapture(ProcessStartInfo processInfo)
{
    processInfo.UseShellExecute = false;
    processInfo.RedirectStandardOutput = true;
    processInfo.RedirectStandardError = true;

    var process = Process.Start(processInfo);
    var output = process.StandardOutput;
    var error = process.StandardError;

    while (!process.HasExited)
    {
        await Task.Delay(100);
    }

    string s = output.ReadToEnd();
    s += '\n' + error.ReadToEnd();
    return s;
}
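(Aside: polling HasExited while leaving both redirected streams unread can deadlock once the OS pipe buffer fills on chatty child processes; jpegoptim's output is tiny, so that isn't the issue here, but a deadlock-safer variant would look roughly like this - a sketch only, assuming ReadToEndAsync is available, i.e. .NET 4.5+:)

protected async Task<string> runAndCaptureAsync(ProcessStartInfo processInfo)
{
    processInfo.UseShellExecute = false;
    processInfo.RedirectStandardOutput = true;
    processInfo.RedirectStandardError = true;

    using (var process = Process.Start(processInfo))
    {
        // Start draining both streams before waiting, so neither pipe can fill and block the child.
        Task<string> stdout = process.StandardOutput.ReadToEndAsync();
        Task<string> stderr = process.StandardError.ReadToEndAsync();

        while (!process.HasExited)
        {
            await Task.Delay(100);
        }

        return await stdout + '\n' + await stderr;
    }
}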
So:
jpegOptim runs properly on my local machine, and optimizes the file, so it's not how I'm calling jpegOptim.
The Copy operation succeeds without an Exception, so it's not a permissions issue with the ASP.Net user reading/writing in that directory.
jpegOptim just optimizes and overwrites the file, so if it is in fact running under the same ASP.Net user, it should have no problem writing this file, but...
It's unclear where jpegOptim attempts to write its temp file, so perhaps the underlying issue is where this temporary file is being written.
However, judging by the Windows source:
http://sourceforge.net/p/jpegoptim/code/HEAD/tree/jpegoptim-1.3.0/trunk/jpegoptim.c
jpegOptim's "temporary file" appears to just be the destination file when used with the above options. Relevant lines of jpegOptim source:
int dest = 0;
int main(int argc, char **argv)
{
...
There's some code here looking for the -d argument that sets dest=1 - meaning here dest remains 0. It then hits an if branch, and the else clause, for dest == 0, does this:
if (!splitdir(argv[i],tmpdir,sizeof(tmpdir)))
  fatal("splitdir() failed!");
strncpy(newname,argv[i],sizeof(newname));
That's copying the directory-name portion of the input image filename to the variable tmpdir - so, for example, C:\Blah\18.jpg would assign tmpdir="C:\Blah\". It then dumps the entire input image filename into newname, meaning it's just going to overwrite it in place.
At this point in the code the variables it's using should be:
dest=0
argv[i]=D:\www\hplusf.com\b\pc\test.jpg
tmpdir=D:\www\hplusf.com\b\pc\
newname=D:\www\hplusf.com\b\pc\test.jpg
It then in fact opens the file - there's an opportunity to error out there, and it doesn't - and it also decompresses the file, further confirming that the input is being opened successfully.
The specific error message I'm seeing occurs in these lines - I'll confess I don't know if HAVE_MKSTEMPS is set or not for a default build (which I'm using):
snprintf(tmpfilename, sizeof(tmpfilename),
         "%sjpegoptim-%d-%d.XXXXXX.tmp", tmpdir, (int)getuid(), (int)getpid());

#ifdef HAVE_MKSTEMPS
  if ((tmpfd = mkstemps(tmpfilename,4)) < 0)
    fatal("error creating temp file: mkstemps() failed");
  if ((outfile=fdopen(tmpfd,"wb"))==NULL)
#else
  tmpfd=0;
  if ((outfile=fopen(tmpfilename,"wb"))==NULL)
#endif
    fatal("error opening temporary file");
So snprintf is like C# String.Format(), which should produce a path like:
D:\www\hplusf.com\b\pc\jpegoptim-1-2.XXXXXX.tmp
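In C# terms the template works out to roughly this (illustrative only; the variables are made up, not code from jpegoptim):

// Illustrative translation of the snprintf call above.
string tmpdir = @"D:\www\hplusf.com\b\pc\";
int uid = 1, pid = 2; // whatever getuid()/getpid() return
string tmpfilename = string.Format("{0}jpegoptim-{1}-{2}.XXXXXX.tmp", tmpdir, uid, pid);
// -> D:\www\hplusf.com\b\pc\jpegoptim-1-2.XXXXXX.tmp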
Judging by what I can find, HAVE_MKSTEMPS is likely not defined, meaning fopen is being called with "wb" (write, binary). It's returning NULL, meaning it failed to open the file, and out comes the error message.
So - possible causes:
Bad path in tmpdir: It's possible I'm following the C poorly (likely), but from the looks of it, tmpdir should be identical to the directory of the source image. Perhaps jpegoptim is mangling it somewhere? The input path is clearly clean, because jpegoptim echoes it cleanly in the error message.
Permissions issue: Seems fairly unlikely. The ASP.Net user this is running under can clearly read and write, because the copy into the dir succeeds before jpegoptim fires, and the only user on the machine with any permissions to this dir is that user, so jpegoptim should have failed before this point if it were a permissions problem. It could be attempting to access a different dir, but that would really be the bad-tmpdir scenario.
Something else I've not thought of.
Ideas?
Note: This question is similar:
Using jpegtran, jpegoptim, or other jpeg optimization/compression in C#
However, that question is asking about a shared env on GoDaddy, causing answers to spiral around the likelihood he can't spin up processes. We have full control over our server, and as should be clear from the above, the jpegoptim Process is definitely starting successfully, so it's a different scenario.

As it turns out, my reading of jpegoptim was incorrect. The tmpdir it uses is wherever the executable's Working Directory points, not where the input images are and not where the executable sits. So the solution was twofold:
Give the exe permissions to write to its own directory* (but deny it access to modify itself)
Modify ProcessRunner to run processes in-place - set the Working Directory to where the exe resides.
The second modification looks like this:
var processInfo = new ProcessStartInfo(command, args);
// Ensure the exe runs in the path where it sits, rather than somewhere
// less safe like the website root
processInfo.WorkingDirectory = (new FileInfo(command)).DirectoryName;
*Note: I happen to have jpegoptim.exe isolated on the server to its own dir to limit risk. If you had it someplace more global like Program Files, you definitely should not do this - instead set the Working Directory as above, but to someplace isolated/safe like a tmp dir or even better a scratch disk. If you've got the RAM for it a RAMdrive would be fastest.
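If you want to script the first step rather than click through Explorer, a rough sketch with the .NET ACL API looks like this (run once as an administrative setup step; the app pool identity and paths are placeholders, not from the setup above):

// Rough sketch only - adjust identity, paths, and rights to your environment.
// Requires System.IO and System.Security.AccessControl.
string appPoolIdentity = @"IIS AppPool\MyAppPool";      // placeholder
var exeDir = new DirectoryInfo(@"D:\tools\jpegoptim");  // placeholder

// Let the worker process create/write files in the exe's directory...
DirectorySecurity dirSec = exeDir.GetAccessControl();
dirSec.AddAccessRule(new FileSystemAccessRule(
    appPoolIdentity,
    FileSystemRights.Write,
    InheritanceFlags.ContainerInherit | InheritanceFlags.ObjectInherit,
    PropagationFlags.None,
    AccessControlType.Allow));
exeDir.SetAccessControl(dirSec);

// ...but deny it the ability to overwrite or delete the exe itself.
var exe = new FileInfo(Path.Combine(exeDir.FullName, "jpegoptim.exe"));
FileSecurity exeSec = exe.GetAccessControl();
exeSec.AddAccessRule(new FileSystemAccessRule(
    appPoolIdentity,
    FileSystemRights.Write | FileSystemRights.Delete,
    AccessControlType.Deny));
exe.SetAccessControl(exeSec);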
**Second Note: Because of how hard drives and jpegoptim work, if the tmp location is not on the same disk as the ultimate destination of the output, a partial race condition is introduced between jpegoptim and any other code that depends on its outputs. In particular, if you use the same disk, the output JPEG is complete the moment jpegoptim finishes - the OS changes the entry in its file table, but the image data has already been written to the drive in full. When tmp and destination are separate disks, jpegoptim finishes by telling the OS to move the file from the tmpdir to the output dir. That's a data move that completes some time after jpegoptim has exited. If your waiting code is fast enough, it will start its work with an incomplete JPEG.
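If the tmp location and the destination do end up on different disks, one way to guard against picking up a half-written JPEG is to wait until the output file can be opened exclusively before downstream code touches it. A sketch (the retry count and delay are arbitrary assumptions):

// Requires System.IO and System.Threading.Tasks.
private static async Task WaitForFileReady(string path, int maxAttempts = 50)
{
    for (int attempt = 0; attempt < maxAttempts; attempt++)
    {
        try
        {
            // Opening with no sharing fails while another process still has the file open for the move.
            using (new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.None))
                return;
        }
        catch (IOException)
        {
            await Task.Delay(100);
        }
    }
    throw new IOException("Timed out waiting for " + path + " to be released.");
}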

Related

System.IO.File.Delete throws "The process cannot access the file because it is being used by another process"

Every time I save a file and delete it right away using the function below, I keep getting this error message: "System.IO.IOException: The process cannot access the file because it is being used by another process".
Waiting for a couple of minutes or closing Visual Studio seems to only unlock the files that were uploaded previously.
public static bool DeleteFiles(List<String> paths)
{   // Returns true on success
    try
    {
        foreach (var path in paths)
        {
            if (File.Exists(HostingEnvironment.MapPath("~") + path))
                File.Delete(HostingEnvironment.MapPath("~") + path);
        }
    }
    catch (Exception ex)
    {
        return false;
    }
    return true;
}
I think that the way I'm saving the files may cause them to be locked. This is the code for saving the file:
if (FileUploadCtrl.HasFile)
{
    filePath = Server.MapPath("~") + "/Files/" + FileUploadCtrl.FileName;
    FileUploadCtrl.SaveAs(filePath);
}
When looking for an answer I've seen someone say that you need to close the StreamReader, but from what I understand the SaveAs method closes and disposes automatically, so I really have no idea what's causing this.
After some testing, I found the problem. It turns out I forgot about a function I made that was called every time I saved a media file. The function returned the duration of the file and used the NAudio.Wave.WaveFileReader and NAudio.Wave.Mp3FileReader classes, which I forgot to close after I called them.
I fixed these issues by putting those readers inside a using statement.
Here is the working function:
public static int GetMediaFileDuration(string filePath)
{
    filePath = HostingEnvironment.MapPath("~") + filePath;

    if (Path.GetExtension(filePath) == ".wav")
        using (WaveFileReader reader = new WaveFileReader(filePath))
            return Convert.ToInt32(reader.TotalTime.TotalSeconds);
    else if (Path.GetExtension(filePath) == ".mp3")
        using (Mp3FileReader reader = new Mp3FileReader(filePath))
            return Convert.ToInt32(reader.TotalTime.TotalSeconds);

    return 0;
}
The moral of the story is: check whether you are opening the file anywhere else in your project.
I think the problem here is not the StreamReader.
When you run the program, it runs from a specific folder, and that folder is basically locked by your program. It is only unlocked when you close the program.
To fix the issue, I would suggest writing/deleting/updating in a different folder.
Another solution could be to check the file's ReadOnly attribute and change it, as explained here.
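A minimal sketch of that ReadOnly-attribute check (assuming the same MapPath-based path as in the question):

string fullPath = HostingEnvironment.MapPath("~") + path;
FileAttributes attributes = File.GetAttributes(fullPath);

// Clear the ReadOnly flag, if set, before attempting the delete.
if ((attributes & FileAttributes.ReadOnly) == FileAttributes.ReadOnly)
    File.SetAttributes(fullPath, attributes & ~FileAttributes.ReadOnly);

File.Delete(fullPath);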
A last option could be using different users. What I mean is that if you create a file with a different, non-admin user, you can delete it with an Admin user. However, I would definitely not go with this solution, because managing different users is too tricky unless you are an advanced Windows user.

How to use Thread Pool and Mutex in c#?

I'm trying to learn how to use a thread pool and a mutex. As practice, I'm making an application that copies files from one path on the computer to another.
In this application I used the ThreadPool (so a few copies can happen simultaneously):
object[] paths = new object[2]; // The source path and the destination path
string[] files = System.IO.Directory.GetFiles(sourceFolderPath); // string sourceFolderPath = folder path that contains the files

foreach (string s in files)
{
    paths[0] = s;                                           // The file source - s = the file name with the file path
    paths[1] = s.Replace(sourceFolderPath, destFolderPath); // Replaces the source folder with the destination folder but keeps the file name
    ThreadPool.QueueUserWorkItem(new WaitCallback(CopyFiles), paths);
}
So far, the application sends each one of the files to the function that copies the file from the source folder to the destination folder.
The CopyFile function looks like this:
static void CopyFiles(object paths)
{
    object[] temp = paths as object[]; // Cast back to an object array
    string sourcePath = temp[0].ToString();
    string destPath = temp[1].ToString();
    System.IO.File.Copy(sourcePath, destPath, true); // Copy from the source to the dest
}
The weird thing is that when I run the application it throws an exception: "The process cannot access the file 'C:..........' because it is being used by another process".
When I try to find the mistake by running the application step by step, it runs properly and all of the files are copied from the source folder to the destination folder.
That made me think that maybe using the ThreadPool causes two or more threads to open the same file (which should not happen, because I used foreach and each file's paths are passed as a parameter only once).
To solve this I tried to use a Mutex, and now the CopyFiles function looks like this:
static Mutex mutex = new Mutex();

static void CopyFiles(object paths)
{
    mutex.WaitOne(); // Waits until the critical section is free from other threads
    try
    {
        object[] temp = paths as object[]; // Cast back to an object array
        string sourcePath = temp[0].ToString();
        string destPath = temp[1].ToString();
        System.IO.File.Copy(sourcePath, destPath, true); // Copy from the source to the dest
    }
    finally
    {
        mutex.ReleaseMutex(); // Release the critical section
    }
}
Now the application should wait until the critical section is free and only then try to copy the file, so the exception "The process cannot access the file 'C:..........' because it is being used by another process" should not appear.
As I thought, the exception did not appear, but the application copied only one file from the source folder to the destination folder, not all of the files.
When I tried to run the application step by step, the same weird thing happened: everything went properly and all of the files were copied to the destination folder.
Why does this happen? And how can I solve this problem so that all the files are copied to the destination folder when the application runs normally, not step by step?
Your problem isn't the ThreadPool. What goes wrong is the argument.
In your first code snippet, you fill an object array with two argument values and pass it to the Queue method. What happens here is that you always use the same array. So in the first iteration of your foreach loop, you write two values into the array and pass it. Eventually, the method gets executed by the ThreadPool, using that object array. At the same time, in the second iteration of the foreach loop, you write into that exact array again and pass it to the ThreadPool again. That means two (or more) threads start to work on that array.
You don't know when the CopyFiles method is active, so you can't tell when the array has been read and is ready for re-use. That could be coordinated using mutual exclusion (the easiest way in C# is the lock keyword), but that's not the approach you should use here.
The better way is to create a new object array in each iteration of the foreach loop. Simply change the code to:
string[] files = System.IO.Directory.GetFiles(sourceFolderPath); // string sourceFolderPath = folder path that contains the files

foreach (string s in files)
{
    object[] paths = new object[2]; // The source path and the destination path
    paths[0] = s;                                           // The file source - s = the file name with the file path
    paths[1] = s.Replace(sourceFolderPath, destFolderPath); // Replaces the source folder with the destination folder but keeps the file name
    ThreadPool.QueueUserWorkItem(new WaitCallback(CopyFiles), paths);
}
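An equivalent way to give every work item its own values (not part of the original answer, just a sketch) is to capture per-iteration locals in a lambda, which avoids the shared array entirely:

foreach (string s in files)
{
    // Each iteration captures its own copies, so queued work items can't trample each other.
    string sourcePath = s;
    string destPath = s.Replace(sourceFolderPath, destFolderPath);
    ThreadPool.QueueUserWorkItem(_ => System.IO.File.Copy(sourcePath, destPath, true));
}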

Dispose slow on very big filestreams?

I've got this code
string archiveFileName = BuildArchiveFileName(i, null);
string tmpArchiveFileName = BuildArchiveFileName(i, "tmp");
try
{
    using (FileStream tmpArchiveMemoryStream = new FileStream(tmpArchiveFileName, FileMode.Create))
    {
        using (BinaryWriter pakWriter = new BinaryWriter(tmpArchiveMemoryStream))
        {
            if (i == 0)
            {
                WriteHeader(pakWriter, pakInfo.Header);
                WriteFileInfo(pakWriter, pakInfo.FileList);
                uint remainingBytesToDataOffset = pakInfo.Header.DataSectionOffset - CalculateHeaderBlockSize(pakInfo.Header);
                pakWriter.Write(Util.CreatePaddingByteArray((int)remainingBytesToDataOffset));
            }
            foreach (String file in pakInfo.FileList.Keys)
            {
                DosPak.Model.FileInfo info = pakInfo.FileList[file];
                if (info.IndexArchiveFile == i)
                {
                    //Console.WriteLine("Writing " + file);
                    byte[] fileData = GetFileAsStream(file, false);
                    int paddingSize = (int)CalculateFullByteBlockSize((uint)fileData.Length) - fileData.Length;
                    pakWriter.Write(fileData);
                    pakWriter.Write(Util.CreatePaddingByteArray(paddingSize));
                }
            }
        }
    }
}
finally
{
    File.Delete(archiveFileName);
    File.Move(tmpArchiveFileName, archiveFileName);
}
I've tested this with NUnit on small file sizes and it works perfectly. But when I tried it on a real-life example, with files over 1 GB, I get in trouble on the delete: it states the file is still in use by another process. It shouldn't be - that file should have been disposed of after exiting the using block. So I'm wondering whether Dispose on the FileStream is slow to execute and that's why I'm getting in trouble. A small note: in all my file handling I use a FileStream with the using keyword.
It shouldn't be - that file should have been disposed of after exiting the using block
That's not what it is complaining about: you can't delete archiveFileName. Some other process has the file opened, just as the exception message says. If you have no idea what process that might be, start killing them off one by one with Task Manager's Processes tab. That process being your own is not entirely unusual, by the way. The best way to find out is with SysInternals' Handle utility; it can show you the process name.
Deleting files is in general a perilous adventure on a multi-tasking operating system; there are always non-zero odds that some other process is interested in the file as well. They ought to open the file with FileShare.Delete, but that's often overlooked.
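For reference, "open the file with FileShare.Delete" looks roughly like this (the path is a placeholder):

// A reader opened this way won't block a later delete or rename of the file.
using (var stream = new FileStream(@"D:\archives\data.pak", FileMode.Open,
                                   FileAccess.Read, FileShare.Read | FileShare.Delete))
{
    // ... read from the stream ...
}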
The safest way to do this is with File.Replace(). The 3rd argument, the backup filename, is crucial: it allows the file to be renamed and continue to exist, so the other process can continue to use it. You should try to delete that backup file at the start of your code. If that doesn't succeed, then File.Replace() cannot work either. But do check that it isn't a bug in your program first; run the Handle utility.
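Applied to the code above, the finally block could use File.Replace roughly like this (a sketch; the backup file name is made up, and File.Replace requires the destination file to already exist):

finally
{
    string backupFileName = archiveFileName + ".bak";

    // Try to clear out any previous backup first; if this fails, File.Replace would fail too.
    if (File.Exists(backupFileName))
        File.Delete(backupFileName);

    if (File.Exists(archiveFileName))
        File.Replace(tmpArchiveFileName, archiveFileName, backupFileName);
    else
        File.Move(tmpArchiveFileName, archiveFileName);
}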

Unlock file after program executing

Once my C# program has started, I cannot delete the executing file.
How can I unlock my assembly after it starts executing and give the user the possibility to delete the file? Maybe copy the assembly to another place and then execute it? But I think that's not the best way.
I think you'll have to copy your assembly to a temporary location and relaunch it. Something like this (haven't tested it):
public class Program {
    static void Main() {
        if (Array.IndexOf(Environment.GetCommandLineArgs(), "/unlocked") == -1) {
            // We have not been launched by ourself! Copy the assembly to a temp location
            // so the user can delete our original location.
            string temp_location = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData), Path.GetFileName(Application.ExecutablePath));
            File.Copy(Application.ExecutablePath, temp_location);

            Process proc = new Process();
            proc.StartInfo.FileName = temp_location;
            proc.StartInfo.Arguments = "/unlocked";
            proc.Start();
            // no more work in Main, so the application closes...
        } else {
            // else initialize application as normal...
        }
    }
}
You cannot delete your file because the file itself is the backing store for the virtual memory pages that hold (in memory) the program code.
When a page of your code is paged out from memory by the OS, it does not go to the pagefile, but it is simply discarded and (when needed) loaded back from the file.
I suppose you could force the program to be pagefile-backed (IIRC, programs executed from removable media like a CD-ROM are pagefile-backed), but probably the best solution is just to copy the assembly to another location, as you suggested.

Randomly Thrown IO Exceptions in C#

I have been working on a program to archive old client data for the company I work for. The program copies the data from the work server to the local machine, creates a .zip file of all the data, then copies it to the archive server.
After it does all that, it deletes the original files and the local copies. Every now and then, the program errors out because it can't delete the local copies from my computer. It doesn't error on every folder it zips; it will error after doing 300 of them, or after 5. It throws one of the three following errors: "The directory is not empty", "File is being used by another process", or "Access to the file is denied". I have tried setting the file attributes to normal, using a forced garbage collection, and ending the WinZip process manually.
I really do not understand why it does this only sometimes. I am the admin on my computer and it should be able to delete the files. I figured something else has to be using it, but there should be nothing else using it on my machine except the program in Visual Studio. Thanks.
Below is the cleanup method where it is not deleting the files and the method that zips the files.
[MethodImplAttribute(MethodImplOptions.NoInlining)]
static void CleanUp(SqlConnection Connection, string jNumber, DirectoryInfo dir, bool error, string prefix)
{
    if (!error | (!error & emptyFolder))
    {
        try
        {
            SqlCommand updateJob = new SqlCommand(string.Format("update job set archived = 1 where job = {0}", jNumber), sdiConnection);
            updateJob.ExecuteNonQuery();
        }
        catch
        {
            WriteToLog("SQL Error: " + jNumber, "There was an error changing the archive bit to 1 after the job has been archived");
        }
        try
        {
            GC.Collect();
            GC.WaitForPendingFinalizers();
        }
        catch
        {
            WriteToLog("Error cleaning up after processing job", "There was an error garbage collecting.");
        }
        try
        {
            // path of the temporary folder created by the program
            string tempDir = Path.Combine(Path.Combine(System.Environment.CurrentDirectory, "Temp"), jNumber);
            // path of the destination folder
            string destDir = Path.Combine(dir.ToString(), jNumber);
            //SetFileAttributes(tempDir);
            try
            {
                File.Delete(tempDir + ".zip");
            }
            catch (System.IO.IOException)
            {
                File.Delete(tempDir + ".zip");
            }
            try
            {
                Directory.Delete(destDir, true);
            }
            catch (System.IO.IOException)
            {
                Directory.Delete(destDir, true);
            }
            try
            {
                Directory.Delete(tempDir, true);
            }
            catch (System.IO.IOException)
            {
                Directory.Delete(tempDir, true);
            }
        }
        catch
        {
            WriteToLog("File Error: " + jNumber, "There was an error removing files and/or folders after the job has been archived. Please check the source server and destination server to make sure everything copied correctly. The archive bit for this job was set.");
            Directory.Delete(Path.Combine(System.Environment.CurrentDirectory, "Temp"), true);
            Directory.CreateDirectory(Path.Combine(System.Environment.CurrentDirectory, "Temp"));
        }
    }
}
static bool ZipJobFolder(string jNumber, string jPath)
{
    try
    {
        string CommandStr = @"L:\ZipFiles\winzip32.exe";
        string parameters = "-min -a -r \"" + jNumber + "\" \"" + jPath + "\"";

        ProcessStartInfo starter = new ProcessStartInfo(CommandStr, parameters);
        starter.CreateNoWindow = true;
        starter.RedirectStandardOutput = false;
        starter.UseShellExecute = false;

        Process process = new Process();
        process.StartInfo = starter;
        Console.WriteLine("Creating .zip file");
        process.Start();
        process.WaitForExit();

        Process[] processes;
        string procName = "winzip32.exe";
        processes = Process.GetProcessesByName(procName);
        foreach (Process proc in processes)
        {
            Console.WriteLine("Closing WinZip Process...");
            proc.Kill();
        }
    }
    catch
    {
        WriteToLog(jNumber, "There was error zipping the files of this job");
        return false;
    }
    return true;
}
I have noticed this behavior using Windows Explorer while deleting large folders with a lot of files and sub-folders. But after waiting a bit and then deleting again, it appears to work fine.
Because of that, I have always assumed it was flaky behavior of the operating system.
Although this is not a solution, you could try making your application sleep for a small amount of time before attempting to delete those files, and see if the error still occurs.
If the errors go away, it would appear to be related to some timing issue. I would myself want to know the source of the issue, though.
Commenters are pointing to an antivirus program. That would make sense; if that is true, then you need to write some code to check whether the file is locked before trying to delete it. If it is locked, sleep for a bit, then check again until it is no longer locked, and then go ahead and delete it (see the sketch after the linked questions below).
You just need to be careful not to get stuck in an infinite retry loop.
Edit:
There is a related question about How to best wait for a filelock to release check it out for ideas.
Edit2:
Here is another possible solution Wait until file is unlocked in .Net
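A minimal sketch of that check-and-retry approach (the retry count and delay are arbitrary assumptions):

static void DeleteWithRetry(string path, int maxAttempts = 10)
{
    for (int attempt = 1; attempt <= maxAttempts; attempt++)
    {
        try
        {
            File.Delete(path);
            return;
        }
        catch (IOException) // consider also catching UnauthorizedAccessException for "access denied"
        {
            if (attempt == maxAttempts)
                throw;
            // Likely still locked (antivirus scan, WinZip not fully exited, etc.) - wait and retry.
            Thread.Sleep(500);
        }
    }
}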
Most likely you are getting a sharing violation - the delete can't get the exclusive file handle. One way this happens is the AV software gets triggered and doesn't finish scanning before the call to delete. It could also be that the WinZip process isn't fully dead yet.
Several ways to handle it:
1) On failure, sleep for a few seconds & try again.
2) I would probably not use WinZip and instead use ZipStorer (http://zipstorer.codeplex.com/). It will zip the file in-process, you won't have to do the kill step, and you will have much more granular control. You could also do the zips in parallel by spinning up multiple threads.
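As a sketch of the in-process idea - shown here with the built-in System.IO.Compression.ZipFile (.NET 4.5+) rather than ZipStorer, and with placeholder paths:

// Requires references to System.IO.Compression and System.IO.Compression.FileSystem,
// plus "using System.IO.Compression;".
string jobFolder = @"C:\Temp\12345";    // placeholder
string zipPath = @"C:\Temp\12345.zip";  // placeholder

if (File.Exists(zipPath))
    File.Delete(zipPath);

// Zips the folder in-process - no external winzip32.exe to wait on or kill.
ZipFile.CreateFromDirectory(jobFolder, zipPath, CompressionLevel.Optimal, false);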
One thing I found that helps is to not try to create temp files and directories in a single File.Move or File.Copy call. Rather, manually create the parent directories, starting at the highest level and working downwards. Finally, when all parent directories exist, Move or Copy the file.
Antivirus software could be an issue: if it is currently reading your file, it will cause exactly this. To be honest, I've seen this pop up many a time when using the .NET Framework, and I just put the handling in a loop and attempt whatever file operation is needed until it no longer throws the exception. This also happens if a file is currently being copied, or is being held in a kernel buffer when some kind of watcher is implemented.
