I have code, including unit tests, that tries to save the config file multiple times. When I try to save the config file a second time I get this error:
Unable to save config to file [FileName]
My code is basically doing this:
{
Configuration config = ConfigurationManager.OpenExeConfiguration(configName);
...change some values...
config.Save();
}
and then again later
{
Configuration config = ConfigurationManager.OpenExeConfiguration(configName);
...change some values in a different way...
config.Save();
}
When I execute config.Save() the second time, there is a 10-second delay, and then I get the error. Does anyone know how to 'unlock' this file?
I tried keeping the instance of the config variable 'in scope', but that did not work:
private static Dictionary<string, Configuration> _configs;

private static Configuration GetConfiguration(string configName)
{
Configuration retval;
if (_configs == null) _configs = new Dictionary<string, Configuration>();
if (_configs.TryGetValue(configName, out retval)) return retval;
retval = ConfigurationManager.OpenExeConfiguration(configName);
_configs.Add(configName, retval);
return retval;
}
and then
{
Configuration config = GetConfiguration(configName);
...change some values...
config.Save();
}
along with
{
Configuration config = GetConfiguration(configName);
...change some values in a different way...
config.Save();
}
Ugh. I was causing the problem elsewhere in my code.
I had a function like this:
private bool FileHasString(string filename, string searchString)
{
string content = (new StreamReader(filename)).ReadToEnd();
return content.IndexOf(searchString) > 0;
}
I assumed that when the StreamReader went out of scope, the lock on the file would be released. Apparently it did not, and that was what was causing the trouble.
This works much better:
private bool FileHasString(string filename, string searchString)
{
var sr = new StreamReader(filename);
string content = sr.ReadToEnd();
sr.Close();
return content.IndexOf(searchString) > 0;
}
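For completeness, a using block releases the file handle even if ReadToEnd() throws, so a sketch like this avoids relying on an explicit Close(). (Note that IndexOf(searchString) > 0, as written, would miss a match at the very start of the file, so Contains is used here.)
private bool FileHasString(string filename, string searchString)
{
    // Dispose the reader deterministically, even if reading throws.
    using (var sr = new StreamReader(filename))
    {
        string content = sr.ReadToEnd();
        return content.Contains(searchString);
    }
}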
Thanks for the help.
Can you keep config in scope between these two code blocks, so you don't have to load it a second time?
The logic behind ConfigurationManager and the System.Configuration namespace is actually quite complex, because the configuration it produces isn't from just one file, even when you are specifically asking for the EXE config. It's possible, for instance, for the user.config to be corrupted, which prevents loading the entire EXE config. I've seen that happen more times than I can count, and the best fix I know of is just to blow away the user config and start over. Similarly, it's possible that the user.config is locked because ConfigurationManager itself is doing something to it.
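The "blow away the user config" recovery can be automated. This is only a sketch (it assumes the corrupt file really is the per-user config and that deleting it is acceptable), but ConfigurationErrorsException.Filename points at the file that failed to load:
Configuration config;
try
{
    config = ConfigurationManager.OpenExeConfiguration(ConfigurationUserLevel.PerUserRoamingAndLocal);
}
catch (ConfigurationErrorsException ex)
{
    // Filename identifies whichever file failed to parse (usually user.config).
    if (!String.IsNullOrEmpty(ex.Filename) && File.Exists(ex.Filename))
        File.Delete(ex.Filename);   // start over with a fresh user.config
    config = ConfigurationManager.OpenExeConfiguration(ConfigurationUserLevel.PerUserRoamingAndLocal);
}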
Related
With Advanced Installer, I'm trying to make a custom action that encrypts the connection string at installation time.
It seems like I can't use "~" here. (I moved my working code from the MVC project to here.)
Is there a simple alternative to that line, or am I forced to do a complete rewrite and use, e.g., a solution based on some kind of Stream (like this: Modifying Web.Config During Installation)?
Exception thrown by custom action:
System.Reflection.TargetInvocationException: Exception has been thrown by the
target of an invocation. ---> System.ArgumentException:
The application relative virtual path '~' is not allowed here.
Custom Action:
[CustomAction]
public static ActionResult EncryptConnStr(Session session)
{
try
{
var config = WebConfigurationManager.OpenWebConfiguration("~");
var section = (ConnectionStringsSection)config.GetSection("connectionStrings");
var cms = section.ConnectionStrings[GetConnectionStringName()];
var connStr = BuildConnStr(session["CONN_STR_SERVER"], session["CONN_STR_DATABASE"], session["CONN_STR_USERNAME"], session["CONN_STR_PASSWORD"]);
if (cms == null)
{
// Add new Connection String
section.ConnectionStrings.Add(new ConnectionStringSettings(GetConnectionStringName(), connStr));
}
else
{
// Update existing Connection String
cms.ConnectionString = connStr;
}
// Encrypt
section.SectionInformation.ProtectSection(ConnStrEncryptionKey);
// Save the configuration file.
config.Save(ConfigurationSaveMode.Modified);
return ActionResult.Success;
}
catch (Exception ex)
{
MessageBox.Show(ex.StackTrace, ex.Message);
throw;
}
}
The solution to the path issue is to use ConfigurationManager, along with a file mapping like this, instead of the web version, WebConfigurationManager.
var map = new ExeConfigurationFileMap { ExeConfigFilename = path };
Configuration config = ConfigurationManager.OpenMappedExeConfiguration(map, ConfigurationUserLevel.None);
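Combined with the encryption code above, the custom action would look roughly like this. Note the assumptions: the web.config path built from the APPDIR installer property and the "DataProtectionConfigurationProvider" name are placeholders, the latter standing in for whatever provider name ConnStrEncryptionKey actually holds.
// Sketch only: APPDIR and the provider name are assumptions.
string path = Path.Combine(session["APPDIR"], "web.config");
var map = new ExeConfigurationFileMap { ExeConfigFilename = path };
Configuration config = ConfigurationManager.OpenMappedExeConfiguration(map, ConfigurationUserLevel.None);

var section = (ConnectionStringsSection)config.GetSection("connectionStrings");
if (!section.SectionInformation.IsProtected)
{
    section.SectionInformation.ProtectSection("DataProtectionConfigurationProvider");
    config.Save(ConfigurationSaveMode.Modified);
}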
The encryption works fine as the code is, but the issue with Save is still not solved, because the execution happens too early: the installation isn't finished and the web.config hasn't yet been copied to APPDIR.
I have an app that reads from text files to determine which reports should be generated. It works as it should most of the time, but once in awhile, the program deletes one of the text files it reads from/writes to. Then an exception is thrown ("Could not find file") and progress ceases.
Here is some pertinent code.
First, reading from the file:
List<String> delPerfRecords = ReadFileContents(DelPerfFile);
. . .
private static List<String> ReadFileContents(string fileName)
{
List<String> fileContents = new List<string>();
try
{
fileContents = File.ReadAllLines(fileName).ToList();
}
catch (Exception ex)
{
RoboReporterConstsAndUtils.HandleException(ex);
}
return fileContents;
}
Then, writing to the file -- it marks the record/line in that file as having been processed, so that the same report is not re-generated the next time the file is examined:
MarkAsProcessed(DelPerfFile, qrRecord);
. . .
private static void MarkAsProcessed(string fileToUpdate, string qrRecord)
{
try
{
var fileContents = File.ReadAllLines(fileToUpdate).ToList();
for (int i = 0; i < fileContents.Count; i++)
{
if (fileContents[i] == qrRecord)
{
fileContents[i] = string.Format("{0}{1} {2}",
qrRecord, RoboReporterConstsAndUtils.COMPLETED_FLAG, DateTime.Now);
}
}
// Will this automatically overwrite the existing?
File.Delete(fileToUpdate);
File.WriteAllLines(fileToUpdate, fileContents);
}
catch (Exception ex)
{
RoboReporterConstsAndUtils.HandleException(ex);
}
}
So I do delete the file, but immediately replace it:
File.Delete(fileToUpdate);
File.WriteAllLines(fileToUpdate, fileContents);
The files being read have contents such as this:
Opas,20170110,20161127,20161231-COMPLETED 1/10/2017 12:33:27 AM
Opas,20170209,20170101,20170128-COMPLETED 2/9/2017 11:26:04 AM
Opas,20170309,20170129,20170225-COMPLETED
Opas,20170409,20170226,20170401
If "-COMPLETED" appears at the end of the record/row/line, it is ignored - will not be processed.
Also, if the second element (at index 1) is a date in the future, it will not be processed (yet).
So, for the examples shown above, the first three have already been done and will subsequently be ignored. The fourth one will not be acted on until on or after April 9th, 2017 (at which time the data within the date range of the last two dates will be retrieved).
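Expressed as code, that skip rule is roughly the following (only a sketch with a hypothetical helper name; the real check lives inside ConvertCRVRecordToQueuedReport, whose body isn't shown):
// A record is skipped if it is already marked completed or is not yet due.
private static bool ShouldProcess(string record)
{
    if (record.Contains("-COMPLETED")) return false;           // already processed
    string[] fields = record.Split(',');
    DateTime dateToGenerate = DateTime.ParseExact(
        fields[1], "yyyyMMdd", CultureInfo.InvariantCulture);   // second element, e.g. 20170409
    return dateToGenerate <= DateTime.Today;                    // due today or earlier
}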
Why is the file sometimes deleted? What can I do to prevent it from ever happening?
If helpful, in more context, the logic is like so:
internal static string GenerateAndSaveDelPerfReports()
{
string allUnitsProcessed = String.Empty;
bool success = false;
try
{
List<String> delPerfRecords = ReadFileContents(DelPerfFile);
List<QueuedReports> qrList = new List<QueuedReports>();
foreach (string qrRecord in delPerfRecords)
{
var qr = ConvertCRVRecordToQueuedReport(qrRecord);
// Rows that have already been processed return null
if (null == qr) continue;
// If the report has not yet been run, and it is due, add it to the list
if (qr.DateToGenerate <= DateTime.Today)
{
var unit = qr.Unit;
qrList.Add(qr);
MarkAsProcessed(DelPerfFile, qrRecord);
if (String.IsNullOrWhiteSpace(allUnitsProcessed))
{
allUnitsProcessed = unit;
}
else if (!allUnitsProcessed.Contains(unit))
{
allUnitsProcessed = allUnitsProcessed + " and " + unit;
}
}
}
foreach (QueuedReports qrs in qrList)
{
GenerateAndSaveDelPerfReport(qrs);
success = true;
}
}
catch
{
success = false;
}
if (success)
{
return String.Format("Delivery Performance report[s] generated for {0} by RoboReporter2017", allUnitsProcessed);
}
return String.Empty;
}
How can I ironclad this code to prevent the files from being periodically trashed?
UPDATE
I can't really test this, because the problem occurs so infrequently, but I wonder if adding a "pause" between the File.Delete() and the File.WriteAllLines() would solve the problem?
UPDATE 2
I'm not absolutely sure what the answer to my question is, so I won't add this as an answer, but my guess is that the File.Delete() and File.WriteAllLines() were occurring too close together and so the delete was sometimes occurring on both the old and the new copy of the file.
If so, a pause between the two calls may have solved the problem 99.42% of the time. But from what I found here, it seems the File.Delete() is redundant/superfluous anyway, so I tested with the File.Delete() commented out and it worked fine. I'm just doing without that occasionally problematic call now, and I expect that to solve the issue.
// Will this automatically overwrite the existing?
File.Delete(fileToUpdate);
File.WriteAllLines(fileToUpdate, fileContents);
I would simply add an extra parameter to WriteAllLines() (which could default to false) to tell the function to open the file in overwrite mode, and then not call File.Delete() at all.
Do you currently check the return value of the file open?
Update: OK, it looks like WriteAllLines() is a .NET Framework function and therefore cannot be changed, so I deleted this answer. However, this now shows up in the comments as a proposed solution on another forum:
"just use something like File.WriteAllText where if the file exists,
the data is just overwritten, if the file does not exist it will be
created."
And this was exactly what I meant (while thinking WriteAllLines() was a user-defined function), because I've had similar problems in the past.
So, a solution like that can solve some tricky problems (instead of deleting and quickly reopening, just overwrite the file); it's also less work for the OS, and possibly causes less file/disk fragmentation.
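Concretely, the end of MarkAsProcessed can simply be the write call on its own, since File.WriteAllLines creates the file if it is missing and truncates it if it exists; the delete adds nothing except a window in which the file is gone:
// No File.Delete needed: WriteAllLines overwrites an existing file and creates a missing one.
File.WriteAllLines(fileToUpdate, fileContents);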
Maybe someone knows a simple solution to my problem.
I do not know the source path of the file up front, so it's not a static value.
It can be changed through the BizTalk GUI, where we have a URI on the receive port, but I do not believe it's that easy to access. What I want to do is write out the full path as the file name. It works well with the MessageID, where the file is given a specific file path name, but the path where the file was dropped is not working that well.
I keep getting this error:
Message:
Object reference not set to an instance of an object.
the message resource is present but the message is not found in the string/message table
That does not tell me much.
Below you can see a snip from my code
internal static string UpdateMacroPathProperty(IBaseMessage baseMessage, string macroPathProperty, string macroDefsFile)
{
if (macroName == "MessageID")
{
contextPropertyValue = baseMessage.MessageID.ToString();
}
else if (macroName == "SourceFileName")
{
contextPropertyValue = Directory.GetCurrentDirectory();
}
}
This is a custom-created pipeline. Has anyone encountered this problem, or can anyone point me in the right direction?
I know that BizTalk has a built-in macro for this (BizTalk Server: List of Macros), such as %SourceFileName%, but I'm trying to save these as logs in a specific folder structure so that they do not get processed.
It's adapter dependent; some adapters will use the FILE adapter's namespace even though they're not the file adapter, but this is the kind of logic that I've used in the past for this:
string adapterType = (string)pInMsg.Context.Read("InboundTransportType",
"http://schemas.microsoft.com/BizTalk/2003/system-properties");
string filePath = null;
if (adapterType != null)
{
if (adapterType == "FILE")
{
filePath = (string)pInMsg.Context.Read("ReceivedFileName",
"http://schemas.microsoft.com/BizTalk/2003/file-properties");
}
else if (adapterType.Contains("SFTP") && !adapterType.Contains("nsoftware"))
// nsoftware uses the FTP schema
{
filePath = (string)pInMsg.Context.Read("ReceivedFileName",
"http://schemas.microsoft.com/BizTalk/2012/Adapter/sftp-properties");
}
else if (adapterType.Contains("FTP"))
{
filePath = (string)pInMsg.Context.Read("ReceivedFileName",
"http://schemas.microsoft.com/BizTalk/2003/ftp-properties");
}
}
And then you can just fall back to the MessageID if you can't get the file path from any of these.
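The fallback can be as simple as this (a sketch; pInMsg is the IBaseMessage handed to the pipeline component):
// If no adapter-specific property yielded a path, fall back to the message ID.
if (String.IsNullOrEmpty(filePath))
{
    filePath = pInMsg.MessageID.ToString();
}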
I have an application written in C#, and I want to write some information to the hidden ProgramData folder in order to access the same connection string from both the application's front end and back end.
I am accessing the directory using path variables as follows:
private bool ProgramDataWriteFile(string contentToWrite)
{
try
{
string strProgramDataPath = "%PROGRAMDATA%";
string directoryPath = Environment.ExpandEnvironmentVariables(strProgramDataPath) + "\\MyApp\\";
string path = Environment.ExpandEnvironmentVariables(strProgramDataPath)+"\\MyApp\\ConnectionInfo.txt";
if (Directory.Exists(directoryPath))
{
System.IO.StreamWriter file = new System.IO.StreamWriter(path);
file.Write(contentToWrite);
file.Close();
}
else
{
Directory.CreateDirectory(directoryPath);
System.IO.StreamWriter file = new System.IO.StreamWriter(path);
file.Write(contentToWrite);
file.Close();
}
return true;
}
catch (Exception e)
{
}
return false;
}
This seems to work correctly. However, my question is: when I used the path variable %AllUsersProfile%(%PROGRAMDATA%) instead, it expanded into an illegal (and redundant) file path: C:\ProgramData(C:\ProgramData)\
I thought that the latter path variable was the correct full name. Was I just using it incorrectly? I need to ensure that this connection info will be accessible to all users; will just using %PROGRAMDATA% allow that? I am using Windows 7, in case that is relevant.
From here:
FOLDERID_ProgramData / System.Environment.SpecialFolder.CommonApplicationData
The user would never want to browse here in Explorer, and settings changed here should affect every user on the machine. The default location is %systemdrive%\ProgramData, which is a hidden folder, on an installation of Windows Vista. You'll want to create your directory and set the ACLs you need at install time.
So, just use %PROGRAMDATA%, or better still:
Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData)
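For example, a minimal write using the special-folder API (the "MyApp" folder and file name are just the values from the question):
// CommonApplicationData resolves to C:\ProgramData on Vista and later.
string dir = Path.Combine(
    Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData), "MyApp");
Directory.CreateDirectory(dir);   // no-op if the directory already exists
File.WriteAllText(Path.Combine(dir, "ConnectionInfo.txt"), contentToWrite);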
I am trying to create a file with a FileInfo object and I am getting strange behavior.
Here is the gist of what I am doing -
public void CreateLog()
{
FileInfo LogFile = new FileInfo("");
if (!LogFile.Directory.Exists) { LogFile.Directory.Create(); }
if (!LogFile.Exists) { LogFile.Create(); }
if (LogFile.Length == 0)
{
using (StreamWriter Writer = LogFile.AppendText())
{
Writer.WriteLine("Quotes for " + Instrument.InstrumentID);
Writer.WriteLine("Time,Bid Size,Bid Price,Ask Price,Ask Size");
}
}
}
However, when it checks to see the length of the logfile, it says that the file does not exist (I checked - it does exist).
When I substitute LogFile.Length with the following:
File.ReadAllLines(LogFile.FullName).Length;
Then I get an exception that says that it cannot access the file because something else is already accessing it.
BUT, if I do a Thread.Sleep(500) before I do ReadAllLines, then it seems to work fine.
What am I missing?
If you use LogFile.Create(), you may lock the file, so you can wrap the call in a using statement, like this:
using (LogFile.Create()) { }
After that you can use the file again.
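For what it's worth, FileInfo also caches its state, so after creating the file, Exists and Length can still reflect the pre-creation state until Refresh() is called. A sketch combining the two (the path is a placeholder, since the original constructor argument isn't shown):
FileInfo LogFile = new FileInfo(path);   // path is assumed here
if (!LogFile.Directory.Exists) { LogFile.Directory.Create(); }
if (!LogFile.Exists)
{
    using (LogFile.Create()) { }   // dispose the FileStream so the file isn't locked
}
LogFile.Refresh();                 // update the cached Exists/Length values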