WCF request/response logging - is a Mutex advisable for synchronization? - C#

Is it advisable to use a Mutex to synchronize writing the request/response XMLs for a WCF service? Is there a better approach?
Background
We have a very old WCF service that is used by another Java service, and thousands of users rely on it.
To troubleshoot some production issues I am implementing logging to capture all request and response XMLs, so there will be a huge amount of logging.
Since multiple processes are trying to write to the same XML file, I am getting an IO exception. To resolve that, I thought of using a Mutex based on the link below; however, my questions are:
Is it advisable to use one for my service? Thousands of requests hit the service in a day, so would my service become slower because of the Mutex?
What is the best solution for this type of situation? I am sure this is a very common scenario where heavy logging is needed to troubleshoot issues in WCF.
Or should lock or a Monitor be OK in my case while writing to the log file? My service runs with ConcurrencyMode.Multiple - or is a Mutex required?
System.ServiceModel.FaultException - The process cannot access the file 'xxx' because it is being used by another process
Note: since this is a very old service, I don't have the option of using the Log4Net library here.
Thanks for your help.
Like the method below, I have around 30 methods that log request and response XML, and for these I am getting the IO error - file is in use.
Code
void WebMethod1(object request)
{
    string requestId = Guid.NewGuid().ToString();
    LogRequestResponse(request);
    var response = CallOtherService();
    LogRequestResponse(response);
}

void LogRequestResponse(object xmlReqRes)
{
    try
    {
        using (StreamWriter myWriter = new StreamWriter(filePath, true, Encoding.UTF8))
        {
            XmlSerializer mySerializer = new XmlSerializer(xmlReqRes.GetType());
            mySerializer.Serialize(myWriter, xmlReqRes);
        }
    }
    catch { } // logging failures are deliberately swallowed
}
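For what it's worth, if all the writers live in one WCF host process (ConcurrencyMode.Multiple gives you multiple threads, not multiple processes), a simple lock around the write is usually enough; a named Mutex is only needed when separate processes share the file. A minimal sketch, assuming a single host process (the file path and class name are illustrative):

```csharp
using System;
using System.IO;
using System.Text;
using System.Xml.Serialization;

public static class RequestResponseLogger
{
    // Serializes writers within this process. If separate processes write
    // to the same file, a named Mutex would be needed instead of a lock.
    private static readonly object Sync = new object();

    // Illustrative path, not taken from the original question.
    private static readonly string FilePath = @"C:\logs\wcf-requests.log";

    public static void Log(object xmlReqRes)
    {
        try
        {
            lock (Sync)
            {
                using (var writer = new StreamWriter(FilePath, true, Encoding.UTF8))
                {
                    var serializer = new XmlSerializer(xmlReqRes.GetType());
                    serializer.Serialize(writer, xmlReqRes);
                }
            }
        }
        catch (IOException)
        {
            // Swallow logging failures so they never break the service call.
        }
    }
}
```

The lock is held only for the duration of one write, so with short log entries the contention cost is small compared to the disk I/O itself.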

Related

Tracking outgoing requests in Azure Functions

As part of a microservice-based solution that we are building, we have a number of Azure Functions sitting in one Azure Function App. The functions orchestrate numerous requests to different APIs, some of which take a long time to complete. We added Application Insights to the functions to allow for some tracking of the requests made, but dependency tracking is not working yet in Azure Functions. It is possible to manually track dependencies, but that involves inserting tracking code around each dependency call, and we want to avoid manually tracking dependencies on each and every call.
One of the solutions I have thought of would be to create a request tracker that tracks all outgoing web requests from the functions. Within the request tracker I could then track the dependency requests including their time. I want to hook the request tracker into some sort of web traffic handler, unfortunately I was unable to find much about doing this. A lot of posts mention using System.Net trace writer for this, but as far as I can see this requires a Web.config to setup and functions do not have one.
I have seen a few posts mentioning creating a request wrapper and placing that on my outgoing requests, but unfortunately that is not an option, as we use a number of packages that make requests internally. If you have any ideas that could get me going in the right direction, please let me know. Thanks
Update:
I added the following helper method, which allows me to manually track tasks as dependency requests:
public static async Task<T> TrackDependency<T>(this Task<T> task, string dependencyName, string callName, string operationId)
{
    var telemetryClient = new TelemetryClient();
    var startTime = DateTime.UtcNow;
    var timer = System.Diagnostics.Stopwatch.StartNew();
    var success = true;
    T result = default(T);
    try
    {
        result = await task;
    }
    catch (Exception)
    {
        success = false; // failure is recorded in the telemetry, not rethrown
    }
    finally
    {
        timer.Stop();
        var dependencyTelemetry = new DependencyTelemetry(dependencyName, callName, startTime, timer.Elapsed, success);
        dependencyTelemetry.Context.Operation.Id = operationId;
        telemetryClient.Track(dependencyTelemetry);
    }
    return result;
}
It can then be used as follows:
client.Accounts.UpdateWithHttpMessagesAsync(accountId, account).TrackDependency("Accounts", "UpdateAccounts", requestContextProvider.CorrelationId);
I can now see individual request dependencies in Application Insights, but obviously the actual telemetry on them is very limited; it does not contain path info or much else.
So when you say dependency tracking is not working in Azure Functions, what exactly do you mean? Have you actually added and configured the Application Insights SDK in your function project yet? The out-of-the-box monitoring experience with Azure Functions doesn't automatically add dependency tracing, but if you add and configure the Application Insights SDK in your function project, it should start tracking everything going on in there.

C# read/write same file - more robust way?

I'm developing a progress-tracking and monitoring type of application (C# .NET 4.5). A single file both gets written to and read from a network location.
I'm having trouble (unresponsive UI / crashes) reading and writing that file in cases such as:
if the network location is momentarily not responding,
if the network location is reached over the internet and there is considerable lag,
at startup, while the client firewall kicks in and grants delayed access to network resources.
So I'm in need of a more robust way of reading and writing rather than
using (StreamWriter wfile = File.AppendText(path))
{
//...
}
using (StreamReader rfile = new StreamReader(path))
{
//...
}
Async methods seem to make the reader and writer threads conflict. What is the best practice, and what are your suggestions on this issue? Thanks
You need a service to resolve the issue, where the service does the reading and writing of the file; Windows does not properly handle the conflicts. If you don't want to create a service, use a database like SQL Server, which automatically resolves the conflicts. By the way, SQL Server is a service.
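Another common mitigation, if you stay with direct file access, is a retry loop with a short backoff around each read or write, so a transient network failure doesn't surface as a crash. A sketch (the attempt count and delay are illustrative, not tuned values):

```csharp
using System.IO;
using System.Threading;

public static class RobustFile
{
    // Appends a line to the file, retrying on IOException a few times
    // with a growing delay. Papers over brief network-share outages;
    // the final attempt lets the exception propagate to the caller.
    public static void AppendWithRetry(string path, string text,
                                       int attempts = 5, int delayMs = 200)
    {
        for (int i = 1; ; i++)
        {
            try
            {
                using (StreamWriter writer = File.AppendText(path))
                {
                    writer.WriteLine(text);
                }
                return;
            }
            catch (IOException)
            {
                if (i >= attempts) throw;
                Thread.Sleep(delayMs * i); // linear backoff before retrying
            }
        }
    }
}
```

This only hides short interruptions; for longer outages or true concurrent readers/writers, the service or database approach above is still the more robust design.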

StreamWriter: file is in use on the server, but works locally

I have a web services project. I'm trying to write a log for each method using StreamWriter; on my local machine everything works fine.
Something like this:
static StreamWriter sw;
try
{
    if (File.Exists(Directorio + FILE_NAME))
    {
        sw = File.AppendText(Directorio + FILE_NAME);
    }
    else
    {
        sw = File.CreateText(Directorio + FILE_NAME);
        sw.WriteLine("---LOG ");
    }
    sw.WriteLine(Date);
    sw.WriteLine(Header);
    sw.WriteLine();
    sw.Close();
}
catch (Exception) { }
But when it is uploaded to the server, it sometimes throws an error saying it can't write because the file is in use. But I close it every time, and I thought that with the try/catch it would ignore that part and continue with the method, because I don't want logging to affect the processing of each method.
I know this is little information and I can't reproduce my problem here, but I hope someone who has had an error like this can give me a hint.
Web servers typically handle multiple requests at once. The occasional error is most likely due to one request trying to log while another request is already logging, and not yet done.
If you wish do use your own logging framework, you will need to coordinate writes to the file.
You could also use one of the excellent open-source logging frameworks, such as NLog.
This could be due to multiple requests coming to the web server: one request trying to write to this log file while another is trying to open it. A possible fix is thread synchronization, though that would degrade the performance of the web service. Alternatively, I'd recommend using NLog (http://nlog-project.org/), which I have used in several projects without any issues.
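For comparison, the NLog route keeps the call sites trivial, since NLog serializes concurrent writers internally. A minimal sketch (the file target and layout are assumed to be set up in NLog.config; the class and member names are illustrative):

```csharp
using NLog;

public class ServiceLogger
{
    // NLog coordinates concurrent writes to the target file itself,
    // so no manual locking is needed at the call sites.
    private static readonly Logger Log = LogManager.GetCurrentClassLogger();

    public void LogRequest(string date, string header)
    {
        Log.Info("{0} {1}", date, header);
    }
}
```

A logging call that fails (for example, because the file target is briefly unavailable) is handled inside NLog rather than throwing into your web method.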

Recycle App Pool on IIS6 using C#

I'm attempting to recycle an app pool on IIS6 programmatically through a web application. I have searched all over the net and found a bunch of solutions (most involving impersonation), but none of them seem to work. The most common error I get is E_ACCESSDENIED, despite entering a valid username and password. Can anybody point me in the right direction?
The solution I use for this sort of thing (where you're trying to run a process from ASP.NET that needs administrative privileges) is the following:
Write whatever you need done as a self-hosted WCF service, preferably an HTTP REST service, so it's easy to call (even using just a browser for testing).
Make sure your service runs under an administrator account. You can use the Task Scheduler to make sure the service is running at all times, as well as to run it under an administrator account.
Execute methods on the service from your ASP.NET application using a WCF client.
And it works all the time no matter what "process" I'm trying to run from within an ASP.NET application.
Now as far as the details (code) are concerned, let me know if you need help. The code below is what you'd have in a console application to make it a self-hosted WCF service. In this case it's an HTTP service listening on port 7654.
static void Main(string[] args)
{
    var webServiceHost = new WebServiceHost(typeof(AppCmdService), new Uri("http://localhost:7654"));
    ServiceEndpoint ep = webServiceHost.AddServiceEndpoint(typeof(AppCmdService), new WebHttpBinding(), "");
    var serviceDebugBehavior = webServiceHost.Description.Behaviors.Find<ServiceDebugBehavior>();
    serviceDebugBehavior.HttpHelpPageEnabled = false;
    webServiceHost.Open();
    Console.WriteLine("Service is running");
    Console.WriteLine("Press enter to quit ");
    Console.ReadLine();
    webServiceHost.Close();
}
AppCmdService is a WCF Service class that looks like this (in my case). In your case you probably don't need a response from your service. In my case I'm getting a Json response. The actual implementation of what it is you're trying to do will be different obviously. But I'm assuming you already have that piece worked out. So simply call a method of that class from here.
[ServiceContract]
public class AppCmdService
{
    [WebGet(UriTemplate = "/GetCurrentExcutingRequests/?", ResponseFormat = WebMessageFormat.Json)]
    [OperationContract]
    public IEnumerable<ExecutingRequestJson> GetCurrentExcutingRequests()
    {
        return CurrentExecutingRequestJsonProvider.GetCurrentExecutingRequests("localhost");
    }
}
On your ASP.NET side, you don't really need a WCF client. All you need is a way to make an http call to the service. So you can simply use HttpWebRequest to make the call out to your service, which in turn execute your process.
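For example, the ASP.NET-side call could be as simple as this (the URL and route are taken from the sample service; error handling is omitted for brevity):

```csharp
using System.IO;
using System.Net;

public static class AppCmdClient
{
    // Calls the self-hosted service on port 7654 and returns the raw
    // JSON response. The route name matches the service's UriTemplate.
    public static string GetCurrentExecutingRequests()
    {
        var request = (HttpWebRequest)WebRequest.Create(
            "http://localhost:7654/GetCurrentExcutingRequests/");
        request.Method = "GET";

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            return reader.ReadToEnd();
        }
    }
}
```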
Hope all of this makes sense?
Maybe this SO question helps you. There are several solutions (also for IIS6):
Restarting (Recycling) an Application Pool
IMHO the best you could do is decide on a concrete approach and then, when you run into an exception, ask a concrete question with the source code of your approach. Otherwise it's too vague to answer your question.

WebService parameter object being renamed

This is the first time I've used a webservice for anything, so the question may be a little basic. Anyhow, I have a webservice that acts as a proxy to our vendor's site. It simplifies the "screen scrape" that we would usually have to do. The webservice function looks like this:
namespace foo
{
    public class MyService : WebService
    {
        [WebMethod]
        public string UploadFile(System.IO.FileStream fileToUpload)
        {
            return _obj.Upload(fileToUpload);
        }
    }
}
The client throws an error when you try to give it the FileStream that the method asks for. Somewhere during proxy generation, the webservice changed the type of the parameter from System.IO.FileStream to foo.FileStream. Does anyone have any ideas as to how I did this to myself?
Thanks in advance!
In .NET when you are making calls across application domains (as you are here), you can't pass data that is specific to that application domain.
The general version of this is that when you are making calls between two separate processes, you can't send information that is specific (i.e. only has significance in that context) to that process and expect it to have significance in the other process.
This is what you are doing with the filestream. The filestream is a handle to the file on the OS that is specific to a process. There is no guarantee that a process on the same computer, let alone a process on another machine will be able to understand this.
This being a web service, that's exactly the situation you have, as you have two processes on different machines.
To address the issue, the data you send has to be self-contained. In this specific case, that means sending the contents of the entire file.
You should change the parameter to a byte array, and then process the bytes appropriately in your method.
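Concretely, the byte-array version of the service method might look like the following sketch. The added fileName parameter is illustrative, and _obj.Upload is assumed to have been adapted to accept a Stream (an assumption, since the vendor call isn't shown):

```csharp
using System.IO;
using System.Web.Services;

namespace foo
{
    public class MyService : WebService
    {
        // byte[] is self-contained data, so it serializes cleanly across
        // the service boundary, unlike a process-local FileStream handle.
        [WebMethod]
        public string UploadFile(byte[] fileToUpload, string fileName)
        {
            using (var stream = new MemoryStream(fileToUpload))
            {
                // Hand the reconstructed stream to the existing vendor
                // upload; the overload taking a Stream is assumed.
                return _obj.Upload(stream);
            }
        }
    }
}
```

The client then reads the file into a byte array itself (e.g. with File.ReadAllBytes) and sends the bytes, so no OS handle ever crosses the process boundary.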
ASMX web services do not support the use of System.IO.Stream or any derived type of Stream. You would need WCF for this.