I use the following code to scale and crop all images in a folder.
string fileNameWithoutExtension = Path.GetFileNameWithoutExtension(file);
string fileExtension = Path.GetExtension(file);
string filePath = Path.GetDirectoryName(file);
string newFileName = string.Empty;
long fileSize = new FileInfo(file).Length;
if (fileSize > fileSizeLimit)
{
    string tempFile = System.IO.Path.GetTempFileName();
    File.Copy(file, tempFile, true);

    Bitmap sourceImage = (Bitmap)System.Drawing.Image.FromFile(tempFile);
    System.Drawing.Image imgPhoto = ScaleCrop(sourceImage, sourceImage.Width / 4, sourceImage.Height / 4, AnchorPosition.Top);
    Bitmap bitImage = new Bitmap(imgPhoto);

    File.Delete(file);

    newFileName = filePath + "\\" + fileNameWithoutExtension + "_" + DateTime.Now.ToString("yyyyMMddHHmmss") + "_" + CoilWarehouseProcessed + fileExtension;
    bitImage.Save(newFileName, System.Drawing.Imaging.ImageFormat.Jpeg);

    imgPhoto.Dispose();
    bitImage.Dispose();
}
If I run the application locally (in debug mode in VS2010) and point it at a network drive, then all images are processed every time.
If I run it from our local webserver, the app may process no images, or five, or one; it never processes all of the images in a given folder, only ever some of them... then it hangs in the client's browser.
There are no events to view in the event log, and the application does not crash or error in any way. The fact that it will process an image proves it's not a permissions issue.
Any ideas why this is happening?
EDIT: Thanks to wazdev, but I ended up testing a less intrusive solution (I also don't like depending on 3rd-party software), and it all seems good so far. Basically, I changed the code so that where it copies the stream to produce a new image ('System.Drawing.Image imgPhoto = ...'), it uses a using statement to ensure that the 'temp' image is disposed of. I also moved the deletion of the original (uncropped/unscaled) file to be the last operation. (In tests it has worked fine; only time will tell once more users come online and concurrency is tested.)
string tempFile = System.IO.Path.GetTempFileName();
File.Copy(file, tempFile, true);
Bitmap sourceImage = (Bitmap)System.Drawing.Image.FromFile(tempFile);
System.Drawing.Image imgPhoto = ScaleCrop(sourceImage, sourceImage.Width / 4, sourceImage.Height / 4, AnchorPosition.Top);
Bitmap bitImage;
using (var bmpTemp = new Bitmap(imgPhoto))
{
    bitImage = new Bitmap(bmpTemp);
}
newFileName = filePath + "\\" + fileNameWithoutExtension + "_" + DateTime.Now.ToString("yyyyMMddHHmmss") + "_" + CoilWarehouseProcessed + fileExtension;
bitImage.Save(newFileName, System.Drawing.Imaging.ImageFormat.Jpeg);
imgPhoto.Dispose();
bitImage.Dispose();
File.Delete(file);
EDIT2: It's been live now for a few days and I've tested it every day, and it is working well. Here's all that I did:
Basically, inside the ScaleCrop() method there was a GC.Collect() and a GC.WaitForPendingFinalizers() call. I removed the WaitForPendingFinalizers() call and moved the GC.Collect() to after the File.Delete().
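In other words, the tail of the block above now ends up like this (a sketch only; it assumes the GC calls used to live inside ScaleCrop(), and the variable names are from the code above):
bitImage.Save(newFileName, System.Drawing.Imaging.ImageFormat.Jpeg);
imgPhoto.Dispose();
bitImage.Dispose();
File.Delete(file);  // delete the original last
GC.Collect();       // moved here from ScaleCrop(); WaitForPendingFinalizers() removed entirely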
I've seen this behaviour in the past when RAM was exhausted resulting in paging to disk. I've found a great deal of success in utilising the ImageResizing.net libraries:
http://imageresizing.net/
I updated my code to use this library and have never looked back.
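A typical call looks something like this (a sketch of how I use the library; the resize parameters are illustrative, not from the question):
// ImageResizer reads, resizes, and writes in one call, managing its streams itself.
ImageResizer.ImageBuilder.Current.Build(
    file,                                   // source image path
    newFileName,                            // destination path
    new ImageResizer.ResizeSettings("maxwidth=1024&format=jpg"));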
HTH
(I have no affiliation with imageresizing.net - I'm just a very happy user)
Related
I'm using C#. I'm receiving an error saying a path is currently being accessed by another process. What my system is trying to do is access the path @"C:\temps\" + client_ids + "_" + rown + ".pdf" and use the same path for the attachment before sending it to the client's email.
Here's what I've done so far. I commented out some of my code because I'm not sure what to do.
FileStream fs = null;
using (fs = new FileStream(@"C:\temps\" + client_ids + "_" + rown + ".pdf",
    FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
{
    TextReader tr = new StreamReader(fs);
    //report.ExportToDisk
    //(CrystalDecisions.Shared.ExportFormatType.PortableDocFormat, tr);
    //report.Dispose();
    //Attachment files = new Attachment(tr);
    //Mailmsg.Attachments.Add(files);
    //Clients.Send(Mailmsg);
}
You can make a temp copy of the file before you use it as a mail attachment, and then use the copy instead of the original file.
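Something like this (a sketch; it reuses the path and mail objects from the question):
string originalPath = @"C:\temps\" + client_ids + "_" + rown + ".pdf";
string tempCopy = Path.Combine(Path.GetTempPath(), Path.GetFileName(originalPath));
File.Copy(originalPath, tempCopy, true);   // attach the copy, not the locked original
Mailmsg.Attachments.Add(new Attachment(tempCopy));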
You cannot attach a file to an email if that file is open. You must close (save) the file first.
While @ali's answer is technically correct, it is unnecessary. Why go through the overhead of creating a copy of the file, which then needs to be deleted, etc.?
Assuming I understand what you are trying to do correctly, simply move your mail code to after the file is successfully created and saved. And I don't think you need the overhead of either the FileStream or the TextReader: as long as your report object can save the file to disk someplace, you can attach that file to your email message and then send it.
While I do not claim to know anything about how Crystal Decisions handles exports, perhaps something like this would work:
(I got this code from: https://msdn.microsoft.com/en-us/library/ms226036(v=vs.90).aspx)
private void ExportToDisk(string fileName)
{
    ExportOptions exportOpts = new ExportOptions();
    DiskFileDestinationOptions diskOpts =
        ExportOptions.CreateDiskFileDestinationOptions();

    exportOpts.ExportFormatType = ExportFormatType.RichText;
    exportOpts.ExportDestinationType = ExportDestinationType.DiskFile;
    diskOpts.DiskFileName = fileName;
    exportOpts.ExportDestinationOptions = diskOpts;

    Report.Export(exportOpts);
}
You will need to change the ExportFormatType property.
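Since the question exports a PDF, that would presumably be (using the same enum the question's commented-out code references):
exportOpts.ExportFormatType = ExportFormatType.PortableDocFormat;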
Then, simply attach the file to your email and send:
Attachment files = new Attachment(fileName);
Mailmsg.Attachments.Add(files);
Clients.Send(Mailmsg);
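Putting the pieces in order (a sketch reusing the question's objects; the point is that the export completes, and the file is closed, before the attachment is created):
string pdfPath = @"C:\temps\" + client_ids + "_" + rown + ".pdf";
ExportToDisk(pdfPath);   // file is saved and closed here
Mailmsg.Attachments.Add(new Attachment(pdfPath));
Clients.Send(Mailmsg);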
I've started to encounter a problem with File.Copy. This works fine for my data creation script, managing to duplicate thousands of files with no issues. My problem occurs when trying to create temp files later in my code.
I have added the code sample below that isn't working correctly. I've tried numerous different ways to try to resolve this to no avail. What I am doing is copying some user data files created in a directory on the C drive into a temp folder inside that user data folder.
Code
foreach (string originalFile in OriginalDataFileNames)
{
    string tempFile = originalFile;
    TempDataFiles.Add(tempFile);

    Console.WriteLine("GlobalDataCtrl: Original Data File: " + XWSDataDirectory + "\\" + tempFile);
    Console.WriteLine("GlobalDataCtrl: Saved Temp Data File: " + tempPath + "\\" + tempFile);

    File.Copy(XWSDataDirectory + "\\" + originalFile, tempPath + "\\" + tempFile);
}
Exit Error
The program '[6256] XtremeWrestlingSim.vshost.exe' has exited with code -1073741819 (0xc0000005) 'Access violation'.
Any help is appreciated, thanks in advance!
SOLUTION:
using (FileStream outputFS = new FileStream(tempPath + "\\" + tempFile, FileMode.CreateNew, FileAccess.ReadWrite))
using (FileStream inputFS = new FileStream(XWSDataDirectory + "\\" + originalFile, FileMode.Open))
{
    inputFS.CopyTo(outputFS);
}
Not sure how nicely formatted this is, but it works. Replace File.Copy with the above code.
You are using File.Create just before you call File.Copy; I think that is the issue: it is leaving an open stream.
Maybe removing the File.Create call will solve the issue. If not, you could take the returned value (which is a stream) and close it before trying to copy.
The file is opened with read/write access and must be closed before it can be opened by another application.
See remarks https://msdn.microsoft.com/en-us/library/ms143361(v=vs.110).aspx
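That is, either drop the File.Create call entirely, or dispose of the stream it returns first; for example (a sketch, with illustrative path variables):
// File.Create returns an open FileStream; dispose it before File.Copy touches the file.
File.Create(destinationPath).Dispose();
File.Copy(sourcePath, destinationPath, true);
// Simpler still: omit File.Create altogether, since File.Copy creates the destination itself.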
I'm using Magick to convert Adobe files (Pdf, Ai, Psd) to Png images, and it all works fine except that the Ai files can take over a minute to convert and the Psd files lose their shape when converted, as the layers are laid out side by side instead of overlaying each other. This is the code I am using:
MagickReadSettings settings = new MagickReadSettings();
settings.Density = new Density(300);

using (MagickImageCollection images = new MagickImageCollection())
{
    images.Read(file, settings);

    using (MagickImage horizontal = images.AppendHorizontally())
    {
        file = path + "\\" + ThumbnailFolder + "\\TempThumb.Png";
        horizontal.Write(path + "\\" + ThumbnailFolder + "\\TempThumb.Png");
    }
}
Are there changes I can make in the Settings to fix these issues?
I've had some help fixing this issue, which I want to share in case anyone has a similar problem. Firstly, the .ai files take so long because of the high resolution set in my settings; reduce the resolution and they are created faster (see the settings sketch after the code below). Secondly, the Psd files come out side by side because I'm appending them with AppendHorizontally(). When I changed my code to the code below, it worked:
using (MagickImageCollection images = new MagickImageCollection())
{
    images.Read(file, settings);
    images.Write(path + "\\" + ThumbnailFolder + "\\TempThumb.png");
}
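For the .ai speed problem, the change is in the read settings only; something like this (the value 96 is just an example; tune it to your quality needs):
MagickReadSettings settings = new MagickReadSettings();
settings.Density = new Density(96); // lower than 300: faster conversion, less detail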
I am trying to save an image in an Azure worker role's file system.
This is my code :
BlobStream inputImage = inputBlob.OpenRead();
Stream inputStream = inputImage;
string inputPath = "F:\\approot\\input\\" + imageId; // imageId is 582
inputStream.Position = 0;

// save image to a physical file
var img = Image.FromStream(inputStream);
string imagePath = inputPath + "\\" + imageId + ".png";
img.Save(imagePath, ImageFormat.Png);
var xml = "";
This problem came all of a sudden: my application was working, but suddenly it started crashing at this point.
I'm guessing that the issue is with the F: drive. You'll find that, occasionally, E: and F: may swap upon restart. Since you said that things were working and then suddenly started crashing, I'm thinking that if you RDP into your role instance, you'll find that approot is now on E: instead of F:.
There should be an environment variable, RoleRoot, that you can use for determining what drive your approot is on:
Environment.GetEnvironmentVariable("RoleRoot");
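So rather than hard-coding F:, build the path from that variable; roughly (a sketch; note that RoleRoot typically comes back as a bare drive letter such as "E:", hence the leading backslash in the appended part):
string roleRoot = Environment.GetEnvironmentVariable("RoleRoot");
string inputPath = roleRoot + @"\approot\input\" + imageId; // no drive letter hard-coded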
I have the following problem or question. I have this function:
private void SavePic(Canvas canvas, string filename)
{
RenderTargetBitmap renderBitmap = new RenderTargetBitmap(
(int)canvas.Width, (int)canvas.Height,
96d, 96d, PixelFormats.Pbgra32);
// needed otherwise the image output is black
canvas.Measure(new Size((int)canvas.Width, (int)canvas.Height));
canvas.Arrange(new Rect(new Size((int)canvas.Width, (int)canvas.Height)));
renderBitmap.Render(canvas);
//JpegBitmapEncoder encoder = new JpegBitmapEncoder();
PngBitmapEncoder encoder = new PngBitmapEncoder();
encoder.Frames.Add(BitmapFrame.Create(renderBitmap));
using (FileStream file = File.Create(filename))
{
encoder.Save(file);
}
}
and the corresponding call
SavePic(mySuperDefaultPainting, @"C:\KinDraw\out.png");
Now I want to append the date and time to the file name. Can I grab the DateTime in the function call? Maybe someone can help me here?
Try this (updated for the file path):
string fileName = string.Format("{0}-{1:ddMMMyyyy-HHmm}.png",
    @"C:\KinDraw\out", DateTime.Now);

if (!Directory.Exists(Path.GetDirectoryName(fileName)))
{
    Directory.CreateDirectory(Path.GetDirectoryName(fileName));
}

SavePic(mySuperDefaultPainting, fileName);
Say the time is 29-JAN-2013 07:30 PM: it will give you C:\KinDraw\out-29Jan2013-1930.png.
But please check the details about CreateDirectory on this MSDN page. Also look at the exceptions it can throw and wrap the calls in try-catch blocks.
string timestamp = DateTime.Now.ToString("MMddyyyy.HHmmss");
SavePic(mySuperDefaultPainting, @"C:\KinDraw\out" + timestamp + ".png");
Update: (to create the directory if it does not exist; filepath here is the target directory, e.g. @"C:\KinDraw")
if (!Directory.Exists(filepath))
    Directory.CreateDirectory(filepath);
Hope it helps :)
Try adding this at the beginning of your code (note that colons are not valid in Windows file names, so the timestamp format avoids them):
var extension = Path.GetExtension(filename);
var newName = filename.Replace(extension, "") +
    DateTime.Now.ToString("yyyy-MM-dd_HHmmss") + extension;
Just put this line in there:
string stampedFileName = filename.Replace(".",
    string.Format("{0:yyyy-MM-dd HHmmss}", DateTime.UtcNow) + ".");
and then change
using (FileStream file = File.Create(filename))
to
using (FileStream file = File.Create(stampedFileName))
It is important to use DateTime.UtcNow rather than DateTime.Now because the former is not influenced by daylight saving time.
EDIT: The format I propose above has the advantage that sorting your filenames alphabetically then automatically also sorts them chronologically.
This is how I did it, and it works. I tweaked @Avishek's code a bit to make it work for mine. No need to delete the file or lose what's in it.
Call the Rename() method after you output the file, passing the base file name:
public static void Rename(string fileName)
{
    string timestamp = DateTime.Now.ToString("MMddyyyy.HHmmss");
    string originalFile = @"C:\Users\Data_Output\" + fileName + ".csv";
    string newFile = @"C:\Users\Data_Output\" + fileName + "_" + timestamp + ".csv";

    File.Move(originalFile, newFile);
}