Hi, I am currently trying to test the failure handling of a change to the config file.
I use the following code to do this:
string path = _pathInvalidFormat.Substring(0, _pathInvalidFormat.LastIndexOf('\\'));
System.IO.File.Copy("TestInvalidXmlConfigurationFile.xml", _pathInvalidFormat, true);
FileSystemWatcher fileSystemWatcher = new FileSystemWatcher(path);
//catch the invalid file getting deleted
fileSystemWatcher.Deleted += new FileSystemEventHandler(fileSystemWatcher_Deleted);
//catch the temporary config file getting renamed
fileSystemWatcher.Renamed += new RenamedEventHandler(fileSystemWatcher_Renamed);
fileSystemWatcher.EnableRaisingEvents = true;
CreateConfig(_pathInvalidFormat);
System.Threading.Thread.Sleep(5000);
Assert.That(_oldFileDeleted);
Assert.That(_newFileCreated);
ResetConfig(_pathInvalidFormat);
However, I am not happy with this use of System.Threading.Thread.Sleep(5000).
I have tried using fileSystemWatcher.WaitForChanged(WatcherChangeTypes.Deleted, 5000); but can't seem to get it to work, as it always just seems to reach the timeout. Even when I can see the Deleted event has been hit, it still doesn't return.
Alternatively, is there a better way of getting the test to wait for asynchronous events to be fired?
Thanks
Kieran
-- edited:
I use CreateConfig(path) to create a system config file using the following method:
private void ReadConfiguration(string path)
{
var configFileMap = new ExeConfigurationFileMap()
{
ExeConfigFilename = path,
};
// If we try to read a badly formatted file, let's:
// 1. delete the file
// 2. call ReadConfiguration again to form a correct file
try
{
Config = System.Configuration.ConfigurationManager
.OpenMappedExeConfiguration(configFileMap, ConfigurationUserLevel.None);
}
catch (ConfigurationErrorsException)
{
// OK, it's a badly formatted file; delete it and create another one
// TODO: check security rights?
File.Delete(path);
ReadConfiguration(path);
}
}
Hope this gives some insight as to what exactly my test is trying to achieve.
Put all the code in a function and execute it asynchronously on a thread.
Then you can use the manual reset events from System.Threading to continue execution only once the file has been deleted, the temporary config file has been renamed, and the invalid path error has been checked.
This way, code execution will not be blocked.
Refer to this link for an example.
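For instance, a minimal sketch of the test without the Sleep, assuming System.Threading and an NUnit-style Assert (the event variable names here are illustrative):
var oldFileDeleted = new ManualResetEventSlim(false);
var newFileCreated = new ManualResetEventSlim(false);
FileSystemWatcher fileSystemWatcher = new FileSystemWatcher(path);
// signal instead of setting boolean fields
fileSystemWatcher.Deleted += (s, e) => oldFileDeleted.Set();
fileSystemWatcher.Renamed += (s, e) => newFileCreated.Set();
fileSystemWatcher.EnableRaisingEvents = true;
CreateConfig(_pathInvalidFormat);
// wait up to 5 seconds for each event instead of sleeping unconditionally
Assert.That(oldFileDeleted.Wait(TimeSpan.FromSeconds(5)), "invalid file was not deleted");
Assert.That(newFileCreated.Wait(TimeSpan.FromSeconds(5)), "new config file was not created");
ResetConfig(_pathInvalidFormat);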
Every time I save a file and delete it right away using the function below, I keep getting this error message: "System.IO.IOException: The process cannot access the file because it is being used by another process".
Waiting for a couple of minutes or closing Visual Studio seems to unlock only the files that were uploaded previously.
public static bool DeleteFiles(List<String> paths)
{ // Returns true on success
try
{
foreach (var path in paths)
{
if (File.Exists(HostingEnvironment.MapPath("~") + path))
File.Delete(HostingEnvironment.MapPath("~") + path);
}
}
catch (Exception)
{
return false;
}
return true;
}
I think that the way I'm saving the files may cause them to be locked. This is the code for saving the file:
if (FileUploadCtrl.HasFile)
{
filePath = Server.MapPath("~") + "/Files/" + FileUploadCtrl.FileName;
FileUploadCtrl.SaveAs(filePath);
}
When looking for an answer I've seen someone say that you need to close the StreamReader, but from what I understand the SaveAs method closes and disposes automatically, so I really have no idea what's causing this.
After some testing, I found the problem. It turns out I forgot about a function I made that was called every time I saved a media file. The function returned the duration of the file and used the NAudio.Wave.WaveFileReader and NAudio.Wave.Mp3FileReader classes, which I forgot to close after I called them.
I fixed the issue by putting those readers inside using statements.
Here is the working function:
public static int GetMediaFileDuration(string filePath)
{
filePath = HostingEnvironment.MapPath("~") + filePath;
if (Path.GetExtension(filePath) == ".wav")
using (WaveFileReader reader = new WaveFileReader(filePath))
return Convert.ToInt32(reader.TotalTime.TotalSeconds);
else if(Path.GetExtension(filePath) == ".mp3")
using (Mp3FileReader reader = new Mp3FileReader(filePath))
return Convert.ToInt32(reader.TotalTime.TotalSeconds);
return 0;
}
The moral of the story is to check whether you are opening the file anywhere else in your project.
I think that the problem is not about the StreamReader here.
When you run the program, it runs in a specific folder, and that folder is basically locked by your program. It will only be unlocked when you close the program.
To fix the issue, I would suggest writing/deleting/updating in a different folder.
Another solution could be to check the file's ReadOnly attribute and change it, as explained here.
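If that turns out to be the cause, a minimal sketch of clearing the ReadOnly flag before the delete could look like this (the helper name is made up for illustration):
// Hypothetical helper: clear the ReadOnly flag, if set, before deleting.
static void DeleteEvenIfReadOnly(string path)
{
    if (!File.Exists(path))
        return;
    var attributes = File.GetAttributes(path);
    if ((attributes & FileAttributes.ReadOnly) == FileAttributes.ReadOnly)
        File.SetAttributes(path, attributes & ~FileAttributes.ReadOnly);
    File.Delete(path);
}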
The last solution could be to use different users. What I mean is that if you create a file with a different user which is not an admin, you can delete it with the admin user. However, I would definitely not go with this solution because it is too tricky to manage different users if you are not an advanced Windows user.
I have the method here http://teocomi.com/export-revit-warnings-list-from-api/ and am calling it from an application macro method to export warnings for a folder of rvt files:
public async void ExportWarningHTML()
{
Autodesk.Revit.UI.UIApplication uiapp = this;
Document doc = uiapp.ActiveUIDocument.Document;
// Input Directory
string inputDir = @"C:\input";
// Output Directory
string outputDir = @"C:\output";
//Get files from inputDir
string[] files = Directory.GetFiles(inputDir, "*.rvt");
// Set open options to detach from central and preserve ws
OpenOptions openOptions = new OpenOptions();
openOptions.DetachFromCentralOption = DetachFromCentralOption.DetachAndPreserveWorksets;
// Process each *.rvt file in folder
// Naive approach. DOES NOT WORK.
foreach(string file in files)
{
// Get current doc
var docLast = uiapp.ActiveUIDocument.Document;
// Open new document
var docNext = ActiveUIDocument.Application.OpenAndActivateDocument(file);
// Close last document
docLast.Close(false);
// Export Warnings
var html = await Win32Api.ExportWarinings(uiapp, outputDir);
}
}
}
However, this only works for the first file and then crashes. How can I modify this code, or the linked "ExportWarnings" code, to have it process a folder of .rvt files?
Congratulations on your very nice solution to Export Revit Warnings List From Api!
As you know, the Revit API can only be used within a valid Revit API context. Such a context is provided only within callback functions provided by the Revit API, such as external command Execute. Furthermore, the Revit API is not multi-threading. Making calls to the API outside such a context can lead to a crash. That may well be exactly what you are experiencing.
Therefore, I wonder whether async can be used at all in this context. One possibility to handle these restrictions is by making use of external events:
http://thebuildingcoder.typepad.com/blog/about-the-author.html#5.28
Is this code running in an external command Execute method? If so, how about just removing the async stuff, and simply calling Sleep repeatedly until Revit has finished processing the first file?
No, that will probably not work, and is probably not right at all.
Next suggestion: remove async; make the call to process the next file; when it is done, raise an external event; within the external event, repeat the algorithm to process the next file; etc.
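Roughly, the external event handler could look something like this; a sketch only, not tested, with the class name, queue, and export call being illustrative placeholders (assuming the usual Autodesk.Revit.DB and Autodesk.Revit.UI usings):
// Processes one file per Raise(), then re-raises itself for the next file,
// so each open/close/export runs in a valid Revit API context.
public class ExportWarningsHandler : IExternalEventHandler
{
    public Queue<string> PendingFiles = new Queue<string>();
    public ExternalEvent Event; // assigned from ExternalEvent.Create(this)
    public string OutputDir;

    public void Execute(UIApplication app)
    {
        if (PendingFiles.Count == 0) return;

        string file = PendingFiles.Dequeue();
        Document previous = app.ActiveUIDocument.Document;

        // Open the next model, close the previous one, then export its warnings.
        app.OpenAndActivateDocument(file);
        previous.Close(false);
        // ... run the warning export for the newly active document here ...

        // Process the next file in a fresh, valid API context.
        if (PendingFiles.Count > 0) Event.Raise();
    }

    public string GetName() { return "Export warnings for a folder of RVT files"; }
}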
I am very much looking forward to hearing how you resolve this!
I have a C# single-threaded application that creates a file, uses that file, and then deletes it. Sometimes the app has trouble deleting that file. The error I get is:
"The process cannot access the file --file path and file name-- because it is being used by another process."
How can I find out which process has a hold on this file, and how can I make that process let go so that the file can be deleted?
This thing rocks for that very "gotcha".
http://technet.microsoft.com/en-us/sysinternals/bb896645.aspx
Process Monitor v3.05
It has a "Filter" submenu so you can fine tune it to the file that is locked.
You need to post the relevant code so we can see.
It is, however, always important to make sure that your app closes the files it has opened.
Usually something like this will ensure that:
using(var f = File.OpenRead("myfile")) {
...
}
or the equivalent:
FileStream f = null;
try {
    f = File.OpenRead("myfile");
    ...
} finally {
    if (f != null) f.Close();
}
Make sure that you are closing the file before deleting it.
If you are using the StreamWriter class, make sure that you close it via its variable, e.g.:
StreamWriter sw = new StreamWriter("myfile");
// some writing operation
sw.Close();
We are using EventLog to log exceptions. There is a background thread which checks when the event log gets full, programmatically transfers the entries into an XML file, and then clears the event log.
This works fine, but it seems like too much work is being done. I thought it would be better to simply copy the .evt file used for logging by the current application and then clear the event log.
Is there any way to find the location/path of that file which will work on every Windows OS?
It is suggested to use
Registry.LocalMachine.OpenSubKey("System\\CurrentControlSet\\Services\\EventLog\\" + e.Log);
but my application log names don't have a File property.
How are you archiving them now? Maybe that method can be improved to gain performance.
Here's an example.
EventLogSession els = new EventLogSession();
els.ExportLogAndMessages("Security", // Log Name to archive
PathType.LogName, // Type of Log
"*", // Query selecting all events
"C:\\archivedLog.evtx", // Exported Log Path
false, // Stop archive if query is invalid
CultureInfo.CurrentCulture);
Or you can use the ClearLog() method.
EventLogSession els = new EventLogSession();
// Clears all the events and archives them to the .evtx file
els.ClearLog("System", // Channel to Clear
"c:\\myLog.evtx"); // Backup File Path
More information can be found here:
Export, Archive, and Clear Event Logs
I have about 5-6 Server Manager programs that write their own configuration file out to a particular folder, such as C:\ACME. The config files all end with "*ServerConfig.cfg", where * = the name of the program that created it.
I have a Windows service with a FileSystemWatcher set up, and I want it to FTP the configuration files each time a program updates them. I've gotten everything to work, but I'm noticing that the different Server Manager programs are behaving differently.
When saving a configuration file, the FileSystemWatcher is picking up two "change" events. This is causing my program to FTP the configuration file twice when I only need it once.
In other instances I'm seeing it raise 4, 5, or 6 "change" events when saving a configuration file.
What is the best way to handle processing/FTPing these files only once, when they are really done saving?
I really don't want to set something up to poll the directory for file changes every so often... and I like the idea that each time a configuration is saved, I get a duplicate copy with a date/timestamp appended to the filename, copied elsewhere.
I have seen lots of suggestions Googling around and even here on Stack Overflow, but nothing that seems to be all-in-one for me.
I suppose I could put the filename in a queue when a "change" event occurs, if it isn't already in the queue. Not sure if this is the best approach.
Here is my sample code:
Startup-code:
private DateTime _lastTimeFileWatcherEventRaised = DateTime.Now;
_watcherCFGFiles = new FileSystemWatcher();
_watcherCFGFiles.Path = @"C:\ACME";
_watcherCFGFiles.IncludeSubdirectories = true;
_watcherCFGFiles.Filter = "*ServerConfig.cfg";
_watcherCFGFiles.NotifyFilter = NotifyFilters.Size;
//_watcherCFGFiles.NotifyFilter = NotifyFilters.LastAccess | NotifyFilters.FileName;
_watcherCFGFiles.Changed += new FileSystemEventHandler(LogFileSystemChanges);
_watcherCFGFiles.Created += new FileSystemEventHandler(LogFileSystemChanges);
_watcherCFGFiles.Deleted += new FileSystemEventHandler(LogFileSystemChanges);
_watcherCFGFiles.Renamed += new RenamedEventHandler(LogFileSystemRenaming);
_watcherCFGFiles.Error += new ErrorEventHandler(LogBufferError);
_watcherCFGFiles.EnableRaisingEvents = true;
Here is the actual handler for the "change" event. I'm skipping a "change" event if it arrives within 700 ms of the last one that was handled. But this doesn't account for the files that raise 3-4 change events...
void LogFileSystemChanges(object sender, FileSystemEventArgs e)
{
string log = string.Format("{0} | {1}", e.FullPath, e.ChangeType);
if( e.ChangeType == WatcherChangeTypes.Changed )
{
if(DateTime.Now.Subtract(_lastTimeFileWatcherEventRaised).TotalMilliseconds < 700)
{
return;
}
_lastTimeFileWatcherEventRaised = DateTime.Now;
LogEvent(log);
// Process file
FTPConfigFileUpdate(e.FullPath);
}
}
I had the exact same issue. I used a HashMap that mapped filenames to the times of writes, and I then used this as a lookup table to check whether the changed event for a file had been raised again very quickly. I defined some epsilon (for me it was about 2 seconds, to make sure events were flushed). If the time found in the map was older than that, I would put the file on a queue to be processed. Essentially all I had to do was keep the HashMap up to date with events and changes, and this worked out (although you may want to change your epsilon value depending on your application).
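In C# that lookup table would be a Dictionary keyed on the path. A rough sketch of the per-file check inside your existing handler (the epsilon, locking, and field names are illustrative, not the exact code described above):
private readonly Dictionary<string, DateTime> _lastHandled = new Dictionary<string, DateTime>();
private static readonly TimeSpan Epsilon = TimeSpan.FromSeconds(2);

void LogFileSystemChanges(object sender, FileSystemEventArgs e)
{
    if (e.ChangeType != WatcherChangeTypes.Changed)
        return;
    lock (_lastHandled)
    {
        DateTime last;
        // Ignore duplicate events for the same file arriving within the epsilon.
        if (_lastHandled.TryGetValue(e.FullPath, out last) && DateTime.Now - last < Epsilon)
            return;
        _lastHandled[e.FullPath] = DateTime.Now;
    }
    LogEvent(string.Format("{0} | {1}", e.FullPath, e.ChangeType));
    FTPConfigFileUpdate(e.FullPath);
}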
This behavior is normal, because antivirus software or other programs can perform additional writes when a file's content changes. I usually create a (global) Hashtable and check if the filename exists in it; if it doesn't, I put the filename in and start an asynchronous operation to remove it again after 3-5 seconds.
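A sketch of that idea, using a HashSet<string> rather than a Hashtable (the 4-second delay and names are only illustrative):
private static readonly HashSet<string> _recentlySeen = new HashSet<string>();

void LogFileSystemChanges(object sender, FileSystemEventArgs e)
{
    lock (_recentlySeen)
    {
        if (_recentlySeen.Contains(e.FullPath))
            return; // duplicate event for a file we are already handling
        _recentlySeen.Add(e.FullPath);
    }
    FTPConfigFileUpdate(e.FullPath);
    // Forget the path again after a few seconds, on a background thread.
    System.Threading.Tasks.Task.Delay(TimeSpan.FromSeconds(4))
        .ContinueWith(t => { lock (_recentlySeen) _recentlySeen.Remove(e.FullPath); });
}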
This is expected behavior, so you need to figure out how to handle it in your particular case.
The file system does not have a concept of "the program is done working with this file". For example, one could write an editor that updates (opens/writes/closes) the file on every keystroke. The file system would report a lot of updates, but from the user's point of view there is only one update, when the editor is closed.