I am using the SourceSafe COM object (SourceSafeTypeLib) from C# to automate a SourceSafe recursive get (part of a larger build process). The recursive function is shown below. How do I ensure that all the COM objects created in the foreach loop get released correctly?
/// <summary>
/// Recursively gets files/projects from SourceSafe (this is a recursive function).
/// </summary>
/// <param name="vssItem">The VSSItem to get</param>
private void GetChangedFiles(VSSItem vssItem)
{
    // If the object is a file, perform the diff;
    // if not, it is a project, so use recursion to go through it
    if (vssItem.Type == (int)VSSItemType.VSSITEM_FILE)
    {
        bool bDifferent = false; // file is different
        bool bNew = false;       // file is new
        // Surround the diff in a try-catch block. If a file is new (doesn't exist on
        // the local filesystem) an error will be thrown. Catch this error and record it
        // as a new file.
        try
        {
            bDifferent = vssItem.get_IsDifferent(vssItem.LocalSpec);
        }
        catch
        {
            // File doesn't exist
            bDifferent = true;
            bNew = true;
        }
        // If the file is different (or new), get it and log the message
        if (bDifferent)
        {
            if (bNew)
            {
                clsLog.WriteLine("Getting " + vssItem.Spec);
            }
            else
            {
                clsLog.WriteLine("Replacing " + vssItem.Spec);
            }
            string strGetPath = vssItem.LocalSpec;
            vssItem.Get(ref strGetPath, (int)VSSFlags.VSSFLAG_REPREPLACE);
        }
    }
    else // Item is a project, recurse through its sub items
    {
        foreach (VSSItem fileItem in vssItem.get_Items(false))
        {
            GetChangedFiles(fileItem);
        }
    }
}
If it is a short-running program and there is nothing to "commit" on the COM side, it is OK to let them go, believe it or not. The GC will come along and release the interfaces properly when it needs to.
If it is a long-running program (like a server component, or a build that takes hours and hours to complete), or you need to "commit" or "save" changes, the best bet is to release each VSSItem right after your call to GetChangedFiles(fileItem) in your foreach loop.
Example:
foreach (VSSItem fileItem in vssItem.get_Items(false))
{
    GetChangedFiles(fileItem);
    // VSSItem has no Release() or Dispose() of its own; release the
    // runtime callable wrapper explicitly instead:
    Marshal.ReleaseComObject(fileItem);
}
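Note that the collection returned by get_Items is itself a COM object (IVSSItems in the type library), and in a long-running process it can be released the same way once the loop completes. A sketch of the fully released loop, assuming System.Runtime.InteropServices is imported for Marshal:
IVSSItems items = vssItem.get_Items(false);
try
{
    foreach (VSSItem fileItem in items)
    {
        try
        {
            GetChangedFiles(fileItem);
        }
        finally
        {
            // Release each child item even if the recursive call throws
            Marshal.ReleaseComObject(fileItem);
        }
    }
}
finally
{
    // Release the collection object itself
    Marshal.ReleaseComObject(items);
}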
So,
I'm using a Raspberry Pi Model 3B and I have successfully got .NET Core v2.0.1 running on it. I have been able to build the CoreIOT Hello World project and run it without problems on my Pi. Now I have some custom hardware I have created using the GPIO board, and the only way I could think of to use it in .NET Core was the filesystem method (/sys/class/gpio). I have built a very small app:
https://github.com/barkermn01/CanReader
There is not much in there, but for some reason when I run the app all I get is a message saying "Process is terminating due to StackOverflowException". I'm not sure what is causing it; there is nothing in there too big, I don't think, just basic filesystem reading.
The only thing I can think of is that it does not like the infinite loops.
From: https://github.com/barkermn01/CanReader/blob/master/Program.cs
while (!exiting)
{
    Console.Write("\rCurrent State " + state);
    string input = Console.ReadLine();
    if (input == "exit")
    {
        watcher.Stop();
        exiting = true;
    }
}
and From: https://github.com/barkermn01/CanReader/blob/master/GPIOWatcher.cs
public void watch()
{
    GPIO.Value val = Gpio.ReadValue();
    while (val != OldValue)
    {
        val = Gpio.ReadValue();
        if (Delegate != null)
        {
            Delegate.DynamicInvoke();
        }
        Thread.Sleep(10);
    }
}
Check the property getter of GPIO.FullPath which accesses itself:
/// <summary>
/// Generates the full path for the Filesystem mappings for this GPIO pin
/// </summary>
public string FullPath
{
    get
    {
        return this.Path + "/" + this.FullPath;
    }
}
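Every call to FullPath evaluates this.FullPath again, so the getter recurses until the stack overflows. A sketch of the likely intent, assuming the pin's directory name lives in another member (a hypothetical Name is used here purely for illustration):
public string FullPath
{
    get
    {
        // Build the path from other members; referencing FullPath
        // inside its own getter recurses until the stack overflows.
        return this.Path + "/" + this.Name;
    }
}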
I have an app that reads from text files to determine which reports should be generated. It works as it should most of the time, but once in a while, the program deletes one of the text files it reads from/writes to. Then an exception is thrown ("Could not find file") and progress ceases.
Here is some pertinent code.
First, reading from the file:
List<String> delPerfRecords = ReadFileContents(DelPerfFile);
. . .
private static List<String> ReadFileContents(string fileName)
{
    List<String> fileContents = new List<string>();
    try
    {
        fileContents = File.ReadAllLines(fileName).ToList();
    }
    catch (Exception ex)
    {
        RoboReporterConstsAndUtils.HandleException(ex);
    }
    return fileContents;
}
Then, writing to the file -- it marks the record/line in that file as having been processed, so that the same report is not re-generated the next time the file is examined:
MarkAsProcessed(DelPerfFile, qrRecord);
. . .
private static void MarkAsProcessed(string fileToUpdate, string qrRecord)
{
    try
    {
        var fileContents = File.ReadAllLines(fileToUpdate).ToList();
        for (int i = 0; i < fileContents.Count; i++)
        {
            if (fileContents[i] == qrRecord)
            {
                fileContents[i] = string.Format("{0}{1} {2}",
                    qrRecord, RoboReporterConstsAndUtils.COMPLETED_FLAG, DateTime.Now);
            }
        }
        // Will this automatically overwrite the existing?
        File.Delete(fileToUpdate);
        File.WriteAllLines(fileToUpdate, fileContents);
    }
    catch (Exception ex)
    {
        RoboReporterConstsAndUtils.HandleException(ex);
    }
}
So I do delete the file, but immediately replace it:
File.Delete(fileToUpdate);
File.WriteAllLines(fileToUpdate, fileContents);
The files being read have contents such as this:
Opas,20170110,20161127,20161231-COMPLETED 1/10/2017 12:33:27 AM
Opas,20170209,20170101,20170128-COMPLETED 2/9/2017 11:26:04 AM
Opas,20170309,20170129,20170225-COMPLETED
Opas,20170409,20170226,20170401
If "-COMPLETED" appears at the end of the record/row/line, it is ignored - will not be processed.
Also, if the second element (at index 1) is a date in the future, it will not be processed (yet).
So, for the examples shown above, the first three have already been done and will subsequently be ignored. The fourth one will not be acted on until on or after April 9th, 2017 (at which time the data within the date range of the last two dates will be retrieved).
Why is the file sometimes deleted? What can I do to prevent it from ever happening?
If helpful, in more context, the logic is like so:
internal static string GenerateAndSaveDelPerfReports()
{
    string allUnitsProcessed = String.Empty;
    bool success = false;
    try
    {
        List<String> delPerfRecords = ReadFileContents(DelPerfFile);
        List<QueuedReports> qrList = new List<QueuedReports>();
        foreach (string qrRecord in delPerfRecords)
        {
            var qr = ConvertCRVRecordToQueuedReport(qrRecord);
            // Rows that have already been processed return null
            if (null == qr) continue;
            // If the report has not yet been run, and it is due, add it to the list
            if (qr.DateToGenerate <= DateTime.Today)
            {
                var unit = qr.Unit;
                qrList.Add(qr);
                MarkAsProcessed(DelPerfFile, qrRecord);
                if (String.IsNullOrWhiteSpace(allUnitsProcessed))
                {
                    allUnitsProcessed = unit;
                }
                else if (!allUnitsProcessed.Contains(unit))
                {
                    allUnitsProcessed = allUnitsProcessed + " and " + unit;
                }
            }
        }
        foreach (QueuedReports qrs in qrList)
        {
            GenerateAndSaveDelPerfReport(qrs);
            success = true;
        }
    }
    catch
    {
        success = false;
    }
    if (success)
    {
        return String.Format("Delivery Performance report[s] generated for {0} by RoboReporter2017",
            allUnitsProcessed);
    }
    return String.Empty;
}
How can I ironclad this code to prevent the files from being periodically trashed?
UPDATE
I can't really test this, because the problem occurs so infrequently, but I wonder if adding a "pause" between the File.Delete() and the File.WriteAllLines() would solve the problem?
UPDATE 2
I'm not absolutely sure what the answer to my question is, so I won't add this as an answer, but my guess is that the File.Delete() and File.WriteAllLines() calls were occurring too close together, and so the delete was sometimes taking out both the old and the new copy of the file.
If so, a pause between the two calls might have solved the problem 99.42% of the time, but from what I found here, the File.Delete() seems redundant/superfluous anyway, so I tested with the File.Delete() commented out, and it worked fine; I'm just doing without that occasionally problematic call now. I expect that to solve the issue.
// Will this automatically overwrite the existing?
File.Delete(fileToUpdate);
File.WriteAllLines(fileToUpdate, fileContents);
I would simply add an extra parameter to WriteAllLines() (which could default to false) to tell the function to open the file in overwrite mode, and then not call File.Delete() at all.
Do you currently check the return value of the file open?
Update: OK, it looks like WriteAllLines() is a .NET Framework function and therefore cannot be changed, so I deleted this answer. However, this now shows up in the comments as a proposed solution on another forum:
"just use something like File.WriteAllText where if the file exists,
the data is just overwritten, if the file does not exist it will be
created."
And this was exactly what I meant (while thinking WriteAllLines() was a user-defined function), because I've had similar problems in the past.
So a solution like that could solve some tricky problems (instead of deleting and quickly reopening, just overwrite the file), and it also means less work for the OS and possibly less file/disk fragmentation.
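Going one step further, if you want the update to survive a crash mid-write, one common pattern (a sketch, not from the original posts) is to write to a temporary file and atomically swap it in with File.Replace:
// Hypothetical hardened version of the write step in MarkAsProcessed:
// a crash can no longer leave the file deleted or half-written.
string tempFile = fileToUpdate + ".tmp";
File.WriteAllLines(tempFile, fileContents);
// File.Replace requires the destination to exist; the third argument
// is an optional backup path (null = keep no backup).
File.Replace(tempFile, fileToUpdate, null);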
I am using Visual Studio 2010 and I have a .DWG file which I want to open in AutoCAD. Till now I have used this:
Process p = new Process();
ProcessStartInfo s = new ProcessStartInfo("D:/Test File/" + fileName);
p.StartInfo = s;
p.Start();
But what I want is to close the file inside AutoCAD, but not AutoCAD itself (meaning acad.exe should be kept running).
Till now I have used this, but it closes acad.exe, not the file:
foreach (Process Proc in Process.GetProcesses())
{
    if (Proc.ProcessName.Equals("acad"))
    {
        Proc.CloseMainWindow();
        Proc.Kill();
    }
}
Take the AutoCAD .NET libraries from the Autodesk site (http://usa.autodesk.com/adsk/servlet/index?id=773204&siteID=123112).
Then you will be able to use the Application and Document classes.
They will give you full control over opening and closing documents within the application.
You can find many articles on that, and can ask further questions.
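For instance, closing the active drawing without saving, while leaving AutoCAD itself running, might look like this with the managed API (a sketch; it assumes your code runs in-process as a loaded AutoCAD plugin):
using Autodesk.AutoCAD.ApplicationServices;

// Close only the active document; acad.exe keeps running
Document doc = Application.DocumentManager.MdiActiveDocument;
doc.CloseAndDiscard();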
AutoCAD does have an API. There are four assemblies: two for in-process use and two for COM interop.
In-process:
acdbmgd.dll
acmgd.dll
COM interop:
Autodesk.AutoCAD.Interop.dll
Autodesk.AutoCAD.Interop.Common.dll
This is a method that will open a new instance of AutoCAD, or connect to an already-running instance.
You will need to add these DLLs to your project references.
using System;
using System.Diagnostics;
using System.Runtime.InteropServices;
using System.Threading;
using System.Windows.Forms;
using Autodesk.AutoCAD.Interop;
using Autodesk.AutoCAD.Interop.Common;

namespace YourNameSpace {
    public class YourClass {
        AcadApplication AcApp;
        private const string progID = "AutoCAD.Application.18.2"; // this is the AutoCAD 2012 program id
        private string profileName = "<<Unnamed Profile>>";
        private const string acadPath = @"C:\Program Files\Autodesk\AutoCAD 2012 - English\acad.exe";

        public void GetAcApp()
        {
            try
            {
                AcApp = (AcadApplication)Marshal.GetActiveObject(progID);
            } catch {
                try {
                    var acadProcess = new Process();
                    acadProcess.StartInfo.Arguments = string.Format("/nologo /p \"{0}\"", profileName);
                    acadProcess.StartInfo.FileName = acadPath;
                    acadProcess.Start();
                    while (AcApp == null)
                    {
                        try { AcApp = (AcadApplication)Marshal.GetActiveObject(progID); }
                        catch { }
                    }
                } catch (COMException) {
                    MessageBox.Show(String.Format("Cannot create object of type \"{0}\"", progID));
                }
            }
            try {
                int i = 0;
                var appState = AcApp.GetAcadState();
                while (!appState.IsQuiescent)
                {
                    if (i == 120)
                    {
                        Application.Exit();
                    }
                    // Wait .25s
                    Thread.Sleep(250);
                    i++;
                }
                if (AcApp != null) {
                    // set visibility
                    AcApp.Visible = true;
                }
            } catch (COMException err) {
                if (err.ErrorCode.ToString() == "-2147417846") {
                    Thread.Sleep(5000);
                }
            }
        }
    }
}
Closing it is as simple as:
Application.Exit();
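That said, Application.Exit() ends your own application. To address the original question of closing just the drawing while acad.exe keeps running, the same interop types can close an individual document. A sketch, assuming AcApp was obtained as above; the optional COM parameters may differ slightly between AutoCAD versions:
// fileName as in the original question
AcadDocument doc = AcApp.Documents.Open(@"D:\Test File\" + fileName);
// ... work with the drawing ...
// Close just this document; false = discard unsaved changes
doc.Close(false, null);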
And forgive the code; it's atrocious. This was one of my first methods when I had just started developing...
I doubt you will be able to do this unless AutoCAD has an API that you can hook into and ask it to close the file for you.
Your C# app can only do things to the process (acad.exe); it doesn't have access to the internal operations of that process.
Also, you shouldn't use Kill unless the process has become unresponsive, and certainly not immediately after CloseMainWindow.
CloseMainWindow is the polite way to ask an application to close itself. Kill is like pulling the power lead from the socket: you aren't giving it the chance to clean up after itself and exit cleanly.
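If you genuinely need to shut AutoCAD itself down, a more forgiving pattern (a sketch, using the question's Proc variable) is to ask politely first and only fall back to Kill after a timeout:
if (Proc.CloseMainWindow())
{
    // Give it a few seconds to exit cleanly before resorting to Kill
    if (!Proc.WaitForExit(5000))
    {
        Proc.Kill();
    }
}
else
{
    // No main window to close politely (e.g. an unresponsive process)
    Proc.Kill();
}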
There is one other possibility, though it will only work if your C# code is running on the same machine as the AutoCAD process, and it is not really recommended. If you are really stuck and are prepared to put up with the hassle of window switching, you can send keystrokes to an application using the SendKeys command.
MSDN articles here:
http://msdn.microsoft.com/EN-US/library/ms171548(v=VS.110,d=hv.2).aspx
http://msdn.microsoft.com/en-us/library/system.windows.forms.sendkeys.send.aspx
Using this, you could send the keystrokes to simulate a user using the menu commands to close the file.
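For example (a sketch; it assumes the AutoCAD drawing window currently has the focus, and that Ctrl+F4 closes the active MDI child, i.e. the drawing rather than the application):
// Sends Ctrl+F4 to whatever window has the focus, asking the
// active drawing window to close, not AutoCAD itself
System.Windows.Forms.SendKeys.SendWait("^{F4}");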
To close the file, the best approach is to follow the steps in the ObjectARX SDK for C# and replace its sample code with the code below:
[CommandMethod("CD", CommandFlags.Session)]
static public void CloseDocuments()
{
DocumentCollection docs = Application.DocumentManager;
foreach (Document doc in docs)
{
// First cancel any running command
if (doc.CommandInProgress != "" &&
doc.CommandInProgress != "CD")
{
AcadDocument oDoc =
(AcadDocument)doc.AcadDocument;
oDoc.SendCommand("\x03\x03");
}
if (doc.IsReadOnly)
{
doc.CloseAndDiscard();
}
else
{
// Activate the document, so we can check DBMOD
if (docs.MdiActiveDocument != doc)
{
docs.MdiActiveDocument = doc;
}
int isModified =
System.Convert.ToInt32(
Application.GetSystemVariable("DBMOD")
);
// No need to save if not modified
if (isModified == 0)
{
doc.CloseAndDiscard();
}
else
{
// This may create documents in strange places
doc.CloseAndSave(doc.Name);
}
}
}
I've written a simple Windows service to watch a folder and run relog (the Windows tool for exporting data from binary perfmon files) on any files that arrive.
When I run it from my C# process (using System.Diagnostics.Process.Start()) I get:
Error:
Unable to open the specified log file.
But if I copy and paste the command into a console window it works fine.
I've looked all over the net, but everything seems to point to a corrupt file, which I know is not the case, as the file imports perfectly when I run the command manually.
Any help greatly appreciated.
If you are using FileSystemWatcher to monitor for files, it will fire the Created event before the file has been completely written to disk. That would cause this kind of error from relog, since the file might still be locked and is technically corrupt as far as relog is concerned.
I've written the following helper method that I always use in conjunction with FileSystemWatcher. After a Created event it waits until the file is completely written and ready for processing, and it gives up after a timeout:
public static bool WaitForFileLock(string path, int timeInSeconds)
{
    bool fileReady = false;
    int num = 0;
    while (!fileReady)
    {
        if (!File.Exists(path))
        {
            return false;
        }
        try
        {
            // If we can open the file for reading, the writer is done with it
            using (File.OpenRead(path))
            {
                fileReady = true;
            }
        }
        catch (Exception)
        {
            num++;
            if (num >= timeInSeconds)
            {
                // Timed out: give up instead of spinning forever
                return false;
            }
            Thread.Sleep(1000);
        }
    }
    return fileReady;
}
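For example, the watcher side of the service might wire it up like this (a sketch, assuming the usual System.IO and System.Diagnostics imports; the folder, file filter, and relog arguments are assumptions):
var watcher = new FileSystemWatcher(@"C:\PerfLogs", "*.blg");
watcher.Created += (sender, e) =>
{
    // Wait up to 30 seconds for the writer to finish with the file
    if (WaitForFileLock(e.FullPath, 30))
    {
        // Hypothetical relog invocation; adjust the export format to taste
        Process.Start("relog",
            "\"" + e.FullPath + "\" -f CSV -o \"" + e.FullPath + ".csv\"");
    }
};
watcher.EnableRaisingEvents = true;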
Hi guys, I have a dictionary which has to be shared between two different exe files. The first application creates a key and stores it in the dictionary; then the other application creates a key and stores it in the dictionary.
At the moment I do this:
private static void WriteToFile(Dictionary<string, byte[]> dictionary, string path)
{
    Contract.Requires(dictionary != null);
    Contract.Requires(!string.IsNullOrEmpty(path));
    if (!(timestamp == File.GetLastWriteTime(DatabasePath)))
    {
        using (FileStream fs = File.OpenWrite(path))
        using (var writer = new BinaryWriter(fs))
        {
            // Put count.
            writer.Write(dictionary.Count);
            // Write pairs.
            foreach (var pair in dictionary)
            {
                writer.Write(pair.Key);
                writer.Write(pair.Value);
            }
            timestamp = DateTime.Now;
            File.SetLastWriteTime(DatabasePath, timestamp);
        }
    }
}
/// <summary>
/// This is used to read a dictionary from a file
/// http://www.dotnetperls.com/dictionary-binary
/// </summary>
/// <param name="path">The path to the file</param>
/// <returns>The dictionary read from the file</returns>
private static Dictionary<string, byte[]> ReadFromFile(string path)
{
    Contract.Requires(!string.IsNullOrEmpty(path));
    var result = new Dictionary<string, byte[]>();
    using (FileStream fs = File.OpenRead(path))
    using (var reader = new BinaryReader(fs))
    {
        // Determine the amount of key value pairs to read.
        int count = reader.ReadInt32();
        // Read in all the pairs.
        for (int i = 0; i < count; i++)
        {
            string key = reader.ReadString();
            //// The byte value is hardcoded as the keysize is consistent
            byte[] value = reader.ReadBytes(513);
            result[key] = value;
        }
    }
    return result;
}
Then when I want to store a key I call this method:
public static bool StoreKey(byte[] publicKey, string uniqueIdentifier)
{
    Contract.Requires(ValidPublicKeyBlob(publicKey));
    Contract.Requires(publicKey != null);
    Contract.Requires(uniqueIdentifier != null);
    Contract.Requires(uniqueIdentifier != string.Empty);
    bool success = false;
    if (File.Exists(DatabasePath))
    {
        keyCollection = ReadFromFile(DatabasePath);
    }
    if (!keyCollection.ContainsKey(uniqueIdentifier))
    {
        if (!keyCollection.ContainsValue(publicKey))
        {
            keyCollection.Add(uniqueIdentifier, publicKey);
            success = true;
            WriteToFile(keyCollection, DatabasePath);
        }
    }
    return success;
}
When the programs have generated their keys and we then try to access them, the dictionary only has one key. What am I doing wrong? Each key and string is stored perfectly; I'm just afraid the programs are overwriting the file or something.
Thank you very much in advance, any help is greatly appreciated.
PS: DatabasePath is the path where I want to save the file, created as a field.
It is hard to say exactly what is going on, since you haven't provided information such as how many items are in the dictionary, but it seems like you've run into a file-access issue from accessing the same file from multiple processes.
You can use a named Mutex as a cross-process synchronization object: before accessing the file, each process waits to acquire ownership of the Mutex, and releases it afterwards so the other process can proceed.
// Create or open the named mutex (the name makes it visible across processes)
var mutex = new Mutex(false, "DictionaryAccessMutex");
// Acquire ownership before touching the file
mutex.WaitOne();
try { /* read/write the file */ }
finally { mutex.ReleaseMutex(); } // release so the other process can proceed
EDIT: New finding
You are also trying to write immediately after the read, so perhaps a file system operation has not completed yet and the write fails. I'm not 100% sure about this; the managed classes like File/StreamReader may already handle such cases, but I believe it's worth double-checking, since it is not 100% clear what is happening. Try adding a delay such as Thread.Sleep(500) between the read and write operations.
EDIT: One more thing you can do is download the Process Monitor SysInternals utility and see which operations fail when accessing the given file. Just add a new filter (Path = file name) and you will be able to see what is going on at the low level.
Writing to a file in parallel is generally not the best idea. You have two options here:
Use a mutex for cross-process synchronization to regulate access to the file.
Forward all write requests to a third process that has exclusive ownership of the file and does the actual writing.
So Process 1 loads the dictionary, adds an item, and calls write.
Process 2 loads the dictionary, adds an item, and calls write.
You get whichever one writes second, and you don't know which one it will be.
Trying to make this work is way more trouble than it's worth, and it will be as future-proof as an inflatable dartboard.
Use a mutex at a push, or a third process to maintain the dictionary.