I have a strange problem and am having a hard time correcting it. Whenever I update options in my config files, the changes are not detected. I keep getting an exception saying the option was not found, even after I refresh VS 2012, restart VS 2012 and IIS, and refresh the browser. It takes a long time before the changes are detected and I can use them. The error I get is:
System.Exception: unable to vend object, interface [abc.IExec] reference [option.changeEmployees] ---> System.Exception: option set not found [api_changeEmployees]
Meanwhile, the config file with those option values has been saved and updated. What is the fix? Help! Thanks.
Use configuration files for semi-static values, like connection strings and TCP/IP ports. For settings that should be changeable on the fly, use, for example, a configuration table in your database.
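A minimal sketch of the configuration-table idea (the table name dbo.Settings and its columns are hypothetical, not from the original post):

```csharp
using System.Data.SqlClient;

public static class ConfigTable
{
    // Reads a single setting from a (hypothetical) dbo.Settings table,
    // so changed values take effect without restarting the process.
    public static string Get(string connectionString, string key)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "SELECT [Value] FROM dbo.Settings WHERE [Key] = @key", conn))
        {
            cmd.Parameters.AddWithValue("@key", key);
            conn.Open();
            return cmd.ExecuteScalar() as string;
        }
    }
}
```

In practice you would cache the result for a short interval rather than hit the database on every read.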
The process actually has to stop and restart to read in new config values. The config values are read the first time they're accessed and cached in a dictionary for the lifetime of the application.
According to Microsoft, though, changing the config file and saving it should trigger an application restart.
If that doesn't work, you should be able to just stop and start the app pool hosting your site, or issue an IIS reset.
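For appSettings specifically, ConfigurationManager.RefreshSection can force a re-read from disk without any restart; a sketch, assuming the application isn't also caching values in its own dictionary (in which case this won't help):

```csharp
using System.Configuration;

// Discard the cached appSettings section; the next access re-reads the file.
ConfigurationManager.RefreshSection("appSettings");
string value = ConfigurationManager.AppSettings["someKey"]; // freshly loaded
```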
I need to enable the SelfLog of Serilog to a text file. My configuration is as follows:
__serilogLogger = new LoggerConfiguration()
    .Enrich.WithProperty("ApplicationIPv4", _ipv4)
    .Enrich.WithProperty("ApplicationIPv6", _ipv6)
    .WriteTo.MSSqlServer(connectionString, tableName /*, columnOptions: columnOptions*/)
    .WriteTo.Seq(ConfigurationManager.AppSettings["SerilogServer"])
    .CreateLogger();

var file = File.CreateText("Self.log");
Serilog.Debugging.SelfLog.Enable(TextWriter.Synchronized(file));
But it shows a file access error when I run the application. Please find the error details below:
Additional information: The process cannot access the file 'C:\Program
Files (x86)\IIS Express\Self.log' because it is being used by another
process.
Can anyone help me with this?
Try
Serilog.Debugging.SelfLog.Enable(msg => File.AppendAllText(serilogSelfLogFilePath, msg));
for a non-locking way to write to a file. serilogSelfLogFilePath is a valid path string for the file you want to use.
Don't forget Log.CloseAndFlush(); when you're closing your other logs, per https://github.com/serilog/serilog/issues/864, or it won't get written anyway.
Note: if you are using an async sink, you will eventually get an exception when different threads contend to write to that file once it gets busy.
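One way to avoid that contention is to serialize the writes yourself; a sketch, assuming this runs inside your logging bootstrap class and serilogSelfLogFilePath is defined as in the answer above:

```csharp
using System.IO;

private static readonly object _selfLogSync = new object();

// Take a lock around each append so concurrent sinks cannot
// contend on the file handle.
Serilog.Debugging.SelfLog.Enable(msg =>
{
    lock (_selfLogSync)
    {
        File.AppendAllText(serilogSelfLogFilePath, msg);
    }
});
```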
You must have another instance of this application running when it reaches this line, or maybe this code is somehow being invoked twice. Check Task Manager and kill anything that may be using the file. If this is a web app, try recycling the app pool.
I was facing a problem manifesting this same The process cannot access the file 'X' because it is being used by another process. error message. In my case it appeared in the server's event log every time the application pool recycled in IIS 8.5, even though Maximum worker processes was set to 1. It may be worth mentioning that the code enabling SelfLog executes in a static constructor.
I had no luck after properly closing the TextWriter, not even when adding a generous Thread.Sleep before File.CreateText in the hope of waiting for that "another process" to finish.
The solution was to set Disable Overlapped Recycle to true in the Advanced Settings for the application pool.
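The same setting can be scripted with appcmd if you prefer (the pool name is a placeholder):

```
%windir%\system32\inetsrv\appcmd.exe set apppool "MyAppPool" /recycling.disallowOverlappingRotation:true
```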
I have a function in my Global.asax.cs file that applies a couple of changes to the Web.config file, if needed. The file is saved with:
config.Save(ConfigurationSaveMode.Modified);
I recently discovered that TempData wasn't working for my app: it appeared to be empty on the next request. My MCVE:
public ActionResult Foo()
{
    TempData["error"] = "testing error passing";
    return RedirectToAction("Bar");
}

public ActionResult Bar()
{
    throw new Exception(TempData["error"] as string);
}
I added that to my HomeController, and visiting /Home/Foo would redirect to /Home/Bar. However, with the above config.Save line active, I get:
Server Error in '/' Application.
Exception of type 'System.Exception' was thrown.
But if I comment out that line, I get:
Server Error in '/' Application.
testing error passing
as I was expecting originally.
(I had a couple instances where I got the inverse result, but it was usually on the first request after I commented or un-commented the line, so probably I should blame caching.)
Why does this happen, and how can I fix it?
Based on this question:
What happens when I edit web.config?
Imagine each ASP.NET application (as defined in IIS) is a program on
the desktop. Saving web.config will do something similar to closing
the program and reopening it. - Dan Goldstein
IIS's default behavior automatically resets the entire session state and recycles the existing AppDomain when any change is applied to the web.config file at runtime, so all values stored in TempData are lost, leaving null values instead.
When the config.Save line is left uncommented, the line:
throw new Exception(TempData["error"] as string);
will contain a null value from TempData["error"]:
throw new Exception(null);
Since the as operator returns null if the object doesn't exist, a new Exception instance is thrown with a null string, causing the default exception message to show instead of the custom one.
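The null propagation through as can be seen in a two-line sketch:

```csharp
object missing = null;               // what TempData["error"] yields after the restart
string message = missing as string;  // as returns null rather than throwing
throw new Exception(message);        // Exception(null) shows the default message
```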
If the config.Save line is commented out, the configuration changes are not applied to the web.config file, so the existing AppDomain keeps running and the TempData values are still in place.
The default behavior can be changed by these steps:
Open IIS Manager.
Select Application Pools => [your application pool name] => Advanced Settings.
In the Recycling area, find Disable Recycling for Configuration Changes (DisallowRotationOnConfigChange) and set it to True.
Related problem: How to prevent an ASP.NET application restarting when the web.config is modified?
TempData stores its values in session state, and it seems your default session store is InProc, which keeps the data in server memory owned by the AppDomain of the application pool.
By default, when there is any change to the files under the virtual directory, IIS detects it using a file watcher and recycles the application pool, which deallocates all the memory assigned to this web application.
To avoid that, you can change the session mode to either SQL Server or State Server.
But the best thing is to avoid changing the web.config at all; if you have any configuration that is user-defined or needs to change at run time, store it somewhere else and cache it to avoid performance hits.
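For example, switching to the ASP.NET State Server is a small web.config change (the connection string shown is the default local State Server address):

```xml
<system.web>
  <sessionState mode="StateServer"
                stateConnectionString="tcpip=localhost:42424"
                timeout="20" />
</system.web>
```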
I am downloading files from a client's SFTP.
When I do it from FileZilla it always succeeds in the standard way.
On the other hand, when I do it from our app, which uses the Tamir SharpSSH library for SFTP communication, there are recurring periods when all our download attempts for a file fail.
I know the app works, as that code has not been changed for several months and it worked far more often than it failed, but periods keep reemerging when, for a whole day or more, all file downloads fail only for the app.
The exception I get is Tamir.SharpSsh.jsch.SftpException. Obviously not very helpful.
My guess is that the client is making modifications on their side, or changing permissions, as their side is not live yet, but from the exception message I cannot tell.
Does anybody have any suggestions? Where could I look for the solution? What should I test or try?
Thank you for your time!
The real message was 'No such file'. The reason was that a slash had been omitted from the root folder path in one of our config files.
When you open the exception variable in the VS Watch window, you will see that all the info properties from the standard exception are null or simply set to 'Tamir.SharpSsh.jsch.SftpException'.
But an additional property was apparently added to the Tamir.SharpSsh.jsch.SftpException class, "message", and that is where the real message is stored, while Exception.Message is quite often set to just "Tamir.SharpSsh.jsch.SftpException".
The issue is that the additional property is private and is only visible through VS Watch or similar.
Since our exception propagation mechanism is based on logging Exception.Message, I was getting "Tamir.SharpSsh.jsch.SftpException" most of the time.
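As a workaround for logging, the private field can be read with reflection; a sketch (the field name "message" is taken from the description above and may differ between library versions):

```csharp
using System;
using System.Reflection;

static string RealMessage(Exception ex)
{
    // Best effort: look for SharpSSH's private "message" field and fall
    // back to the ordinary Exception.Message when it isn't there.
    FieldInfo field = ex.GetType().GetField("message",
        BindingFlags.Instance | BindingFlags.NonPublic);
    object value = field == null ? null : field.GetValue(ex);
    return (value as string) ?? ex.Message;
}
```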
I'm building a web site and want to count clicks on a button. I created and tried this counter class:
public static class Counter
{
    public static int counter = 0;
}
Every time I click the button, the counter is incremented (counter++) and I see it on my site. But if I close Chrome and enter my site again, the counter starts from zero.
How can I save the counter? Isn't static supposed to take care of that?
My bet is that it happens because the application space is flushed; it shouldn't reset just because you closed your browser window, thus abandoning the current session (if the session cookie isn't persistent, that is).
Visual Studio may republish your files (if using a remote IIS) or simply restart a local IIS Express instance, depending on how you set up your development environment; I do believe marking a member as static causes it to be shared across all current sessions.
That said, you may want to keep it under the current session (using the Session object).
Optionally, if you want to persist information in between server restarts, you may try reading and writing to a local storage, such as a plaintext or XML file. You can find a very nice article about this on the following link:
http://www.codeproject.com/Articles/544839/Implement-ASP-NET-custom-XML-file-configuration
A more sophisticated version would use a local (or remote) database, for example.
Hope it works for you.
static fields are unique per-process. Depending on your application pool configuration, you could have 2, 20 or 100 copies of that.
They're also not thread safe. There are very, very few instances (pun intended) where a static member is appropriate.
Just off the top of my head, a particular "instance" of a static will disappear when:
The application pool is recycled. On IIS, this defaults to 20 minutes of inactivity.
The application process exits (you may have multiple processes running within your app pool). This happens as part of (1), but will also happen if, say, you're using the Visual Studio debug web server (Cassini), have your project configured to launch the site for debugging, and close the browser that was launched initially. (This happens because VS considers closing the browser that it launched equivalent to saying "I'm done playing. Back to coding now," or hitting the stop button.)
Another thread overwrites the value you've stored (google "race condition.")
You really, really should be storing this in a database. If you're building a website, you need a database anyway. ANYTHING related to application state should be stored in the database.
ALSO, this really, really shouldn't be happening server-side. Are you really performing a postback every time someone clicks anywhere on a page? If so, you have JavaScript in place to handle that, so just skip this insanity, have said script fire off an AJAX request, and have the target handler log it in the database.
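If you do keep a static in-process counter despite these caveats, at least make the increment thread-safe; a sketch:

```csharp
using System.Threading;

public static class Counter
{
    private static int _count;

    // Interlocked prevents lost updates when concurrent requests increment.
    public static int Increment()
    {
        return Interlocked.Increment(ref _count);
    }
}
```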
It looks like you're using a web site, so I'm presuming ASP.NET. There are a number of ways to store the information: a database could be one, or a persistent cookie could be the way to do it. See this article on how to create cookies: How do I set/unset a cookie with jQuery?
You can try saving it in session state, where it will stay until the session times out (20 minutes by default). If you want it to last long-term, write the value to a file in a known location when the site shuts down, and read the value back from the file when the site comes up again.
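A minimal sketch of that file-backed idea (the file path is an assumption; the lock guards against concurrent requests):

```csharp
using System.IO;

public static class PersistentCounter
{
    private static readonly object _sync = new object();
    private const string FilePath = @"App_Data\counter.txt"; // hypothetical location

    public static int Increment()
    {
        lock (_sync)
        {
            int value = File.Exists(FilePath)
                ? int.Parse(File.ReadAllText(FilePath))
                : 0;
            value++;
            File.WriteAllText(FilePath, value.ToString());
            return value;
        }
    }
}
```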
I've written a Windows Service in C#/VS2008/.NET3.5 that monitors some FTP directories with FileSystemWatchers and moves the files out to another location for processing. I noticed today that it throws errors stating "The parameter is incorrect" soon after the service has started up, but if we wait a few minutes the file gets copied without incident. I saw that the error message is often related to incorrect permissions, but I verified permissions on the directories (target and source) were correct and as I said the file move works just a few minutes later.
Here's a snippet of the code that gets called when the file is finished copying into the FTP directory being monitored:
// found the correct source path
string targetDir = dir.TargetDirectory;
string fileName = Path.GetFileName(e.FullPath);
errorlocation = "move file";
string targetFilePath = Path.Combine(targetDir, fileName);

if (File.Exists(targetFilePath))
{
    File.Delete(targetFilePath);
}

File.Move(e.FullPath, targetFilePath);
dir refers to an object with information about the directory the file was being loaded into. e is the FileSystemEventArgs. TargetDirectory is grabbed from the directory's settings in a custom configuration block in the app.config that tells the service where to copy the new files to.
I didn't include the code here, but I know it's failing on the File.Move (last line above) due to some EventLog entries I made to trace the steps.
Any idea as to why the move fails soon after the service startup, but works fine later?
Basic overview of the process in case it sheds some light: external vendors FTP us a number of files each day. When a file comes in, my code identifies who the file is coming from based on the FTP directory and then loads settings to pass on to SSIS jobs that will parse and save the files. There are maybe a dozen or so directories being monitored right now, each of which has its own configuration setting for the SSIS job. Is it possible that the system gets confused at startup and just needs some time to populate all the settings? Each source directory does have its own FileSystemWatcher on it.
Thanks for your help.
The first question I'd answer is: what are the values of these when it fails?
e.FullPath
targetDir
fileName
Chances are one of those values isn't what you expect.
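One way to capture those values at failure time is to wrap the move and write them to the Event Log the service already uses (a sketch; the event source name is a placeholder):

```csharp
using System.Diagnostics;
using System.IO;

try
{
    File.Move(e.FullPath, targetFilePath);
}
catch (Exception ex)
{
    // Record the exact inputs so the bad value is visible in the Event Log.
    EventLog.WriteEntry("FtpMoverService", string.Format(
        "Move failed: FullPath='{0}', targetDir='{1}', fileName='{2}' - {3}",
        e.FullPath, targetDir, fileName, ex.Message),
        EventLogEntryType.Error);
    throw;
}
```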
I'm marking this answered because the problem went away. We haven't changed anything in the code, but it now works immediately after a restart. The best theory we have: since I posted this, the client I was working for moved offices, and as part of the migration a lot of system and network policies were updated and server settings were tweaked for the new environment. It's likely one (or more) of those changes fixed this issue.
Further support for this theory: prior to the move my development VM could not run web browsers. (I'd click to load the browser and it wouldn't work, sometimes it would appear briefly in Task Manager and then disappear.) After the office move, this problem no longer occurs.
So it was likely some network setting somewhere that caused issues. Sorry I can't be more specific.