Trouble debugging asp.net web service - c#

I'm having trouble figuring out why my web service is not working correctly when called from my asp.net application, but only in production.
I am able to run my asp.net application locally, calling the web service (in production) and it completes correctly.
I am able to modify the web.config to allow me to use the test form on the production web service and it calls correctly.
I'm calling the web service from a shared dll, and I have already checked that I am actually getting the updated dll.
This is a very basic web service to log any exceptions on our sites that aren't handled elsewhere. I'm adding additional parameters to it, and also moving it to another project so it is grouped with more of our web services. It is in its own asmx file called ExceptionServices.
My shared dll has a class ErrorHandler which calls (at this point, a test method) TestEmail(string to), and all that does is send me an email.
Like I said, when running my app locally, it calls the production web service and all is good.
I am running in a hosted environment and am unable to install the remote debugging tools so I cannot step through the production code (unless anybody knows any tricks).
It just seems like this should work (banging head on keyboard)...
Here is my basic web method:
[WebMethod]
public void TestEmail(string to)
{
MailMessage mm = new MailMessage("no-reply@mydomain.com", to, "test", "body here");
SmtpClient client = new SmtpClient("localhost"); // already tried tweaking smtp server, and all my options work when I use the test form
client.Send(mm);
}
ErrorHandler class in the shared dll:
public class ErrorHandler
{
public static void ThrowError(Exception ex, string sitename, string ip, string username)
{
//if (ip != "127.0.0.1") // exclude local errors when developing
{
EILib.ExceptionServices.EIExceptionHandler eh = new EILib.ExceptionServices.EIExceptionHandler();
eh.TestEmail("senloe@....com");
}
}
}
and finally my global.asax where it all begins:
protected void Application_Error(object sender, EventArgs e)
{
//Get the Error.
System.Exception anError = Server.GetLastError();
EILib.ErrorHandler.ThrowError(anError, "mydomain.com", EILib.Utilities.GetUserHostAddress(Request), User.Identity.Name);
....
}

Verify that the SMTP service is running on your production server. Check that it is configured to allow the default credentials (a hosting company might well require credentials for email). Check that the service is configured to receive mail from the local computer (as opposed to using one of the pickup directory delivery methods).
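If the hosting company requires authenticated SMTP, here is a minimal sketch of supplying the host and credentials explicitly; the host name, port, and account values are placeholders for whatever your host actually requires:
[WebMethod]
public void TestEmail(string to)
{
    MailMessage mm = new MailMessage("no-reply@mydomain.com", to, "test", "body here");

    // Placeholder host, port and credentials - replace with your hosting company's values.
    SmtpClient client = new SmtpClient("mail.mydomain.com", 25);
    client.Credentials = new System.Net.NetworkCredential("smtpUser", "smtpPassword");
    client.Send(mm);
}
The same values can also be supplied declaratively in the <system.net>/<mailSettings> section of web.config, so the code does not have to change between environments.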

Blazor Server App on Azure: AuthenticationException: The remote certificate is invalid according to the validation procedure

I'm developing a Blazor Server App with VS2019. When running locally (debug or release) it runs and works fine. After publishing it to Azure App Services I get the invalid remote certificate message the moment I call a controller method.
Part of the razor page code is:
protected override async Task OnParametersSetAsync()
{
await Task.Run(async () => await GetExperimentInfo());
}
protected async Task GetExperimentInfo()
{
if (string.IsNullOrEmpty(eid))
{
ExperimentName = "Experiment entity not provided";
return;
}
HttpClient client = new HttpClient();
client.BaseAddress = new Uri(NavigationManager.BaseUri);
ExpInfo = await client.GetFromJsonAsync<ExperimentInfo>("api/experiment/" + eid);
if (ExpInfo == null)
{
ExperimentName = "Error: Experiment not found";
return;
}
ExperimentName = ExpInfo.Name;
}
The 'eid' is passed as an argument when the razor page is called.
Calling the controller GET method directly on the server app in Azure App Service returns the correct information.
Calling the same controller GET method from within the razor page throws the AuthenticationException about an invalid remote certificate!
The method called in the controller is:
[Route("api/experiment/{eid}")]
[HttpGet]
public ExperimentInfo GetExperimentInfo(string eid)
{
var ExpInfo = GetSNExperimentData(eid);
return ExpInfo;
}
I've browsed a lot of articles on the web, but so far have not found an answer explaining why this happens or how to resolve it.
Does anyone have any ideas or experience with this? Thanks.
The problem was solved by Microsoft Azure Support (by Kevin Kessler) with the following explanation:
This indicates that whichever Root CA is in the remote web service's certificate chain is not trusted, because that Root CA is not contained in the app service's Trusted Root store.
The Azure web app resides on an App Service Environment (ASE). In this case you may be able to resolve the issue by uploading the needed certificates and assigning their thumbprint values to the app service settings.
Please see this document, which covers the use of certificates on an ASE and how to configure on an app service:
https://learn.microsoft.com/en-us/azure/app-service/environment/certificates#private-client-certificate
Additionally, this StackOverflow article may provide further guidance:
How to make the azure app service trust the certificates issued by the OnPrem CA?
Resolution: Uploaded Root and intermediate certificates to ASE
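If the app does sit on an ASE, the fix itself is the certificate upload described above; as a rough sketch, the snippet below only checks from code that an uploaded certificate is actually visible to the app once its thumbprint has been added to the WEBSITE_LOAD_CERTIFICATES app setting (the thumbprint parameter is a placeholder):
using System.Security.Cryptography.X509Certificates;

static bool IsCertificateVisible(string thumbprint)
{
    // Certificates exposed via the WEBSITE_LOAD_CERTIFICATES app setting show up in CurrentUser\My.
    using (var store = new X509Store(StoreName.My, StoreLocation.CurrentUser))
    {
        store.Open(OpenFlags.ReadOnly);
        var matches = store.Certificates.Find(X509FindType.FindByThumbprint, thumbprint, false);
        return matches.Count > 0;
    }
}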

Is my code running in a web role or a web site?

I'm moving the deployment of a web app from an Azure Website into a Web Role in a Cloud Service.
Part of the migration has involved reserving some local storage in the role config and changing interactions with the local file-system to use the following mantra to find a path that is good for writing to:
LocalResource tempStorageResource = RoleEnvironment
.GetLocalResource("SomeRoleStorage");
var targetFolderPath = tempStorageResource.RootPath;
However, I'd like to keep things working in the WebSite instance. I'm going to write a path provider that abstracts the actual location away. Part of implementing this will require detecting whether I'm running locally/in the debugger, but I also need to know whether the code is running under a WebSite or a WebRole. How can I do this?
public class AzurePathProvider : ILocalStoragePathProvider
{
public string GetStoragePath(string key)
{
var isWebRole = //????;
if(isWebRole)
{
LocalResource tempStorageResource =
RoleEnvironment
.GetLocalResource(key);
return tempStorageResource.RootPath;
}
else
{
return "/some/other/storage/location";
}
}
}
Check RoleEnvironment.IsAvailable to decide whether the code is running in a Cloud Service. It is always true when your code is running in a Cloud Service and false otherwise.
Furthermore, to detect whether the code is running in the compute emulator, you can check RoleEnvironment.IsEmulated along with RoleEnvironment.IsAvailable.
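Applied to the path provider from the question, a minimal sketch might look like this (the fallback path is the placeholder from the question):
using Microsoft.WindowsAzure.ServiceRuntime;

public class AzurePathProvider : ILocalStoragePathProvider
{
    public string GetStoragePath(string key)
    {
        // True only when running under a web/worker role (including the compute emulator).
        if (RoleEnvironment.IsAvailable)
        {
            LocalResource tempStorageResource = RoleEnvironment.GetLocalResource(key);
            return tempStorageResource.RootPath;
        }
        else
        {
            // Running as an Azure Website (or locally outside the emulator).
            return "/some/other/storage/location";
        }
    }
}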

Windows Service - WCF Service Design

I have a Windows Service that hosts a WCF service and I am successfully able to connect to it using WCFTestClient and a Custom Client. The windows service is based upon what used to be an exe, but since the program will be used for a long running process on a server, the service is a better route. The problem is that I cannot access static variables in the application from the WCF service.
In the .exe (I switched this to a .dll which is the server application) I use a global class implemented as such:
public static class Globals
{
....
}
This holds references to the major parts of the program so that if any part needs to reference another I can use the syntax Globals.JobManager.RunJob().
The problem that I am encountering is that the WCF service is not able to reference Globals at run-time. One example of where I need this to be done is in the GetJob method:
public class ConsoleConnection : IConsoleConnection
{
public string[] RetrieveJobList()
{
string[] jobs = Globals.JobManager.GetAllJobNames().ToArray();
return jobs;
}
}
This method returns null when tested in WCFTestClient and throws an exception in the created client.
I believe this issue to be caused by the way the Windows Service, WCF Service, and the application DLL are initiated. The current method is such:
public class ETLWindowsService : ServiceBase
{
....
protected override void OnStart(string[] args)
{
if (serviceHost != null)
{
serviceHost.Close();
}
Globals.InitializeGlobals();
serviceHost = new ServiceHost(typeof(ConsoleConnection));
serviceHost.Open();
}
....
}
Here the Windows Service starts, calls Globals.InitializeGlobals() to create all the necessary parts of the application, and then starts the WCF service (if this is the wrong way to do this, let me know; I'm piecing this together as I go). I'm assuming that these actions are being done in the wrong order and that this is the cause of the problem.
Do I need to have the Windows Service create the WCF service, which in turn creates the application (this doesn't make sense to me), or do I have the Windows Service create the application, which then creates the WCF service? Or is there a third option that I am missing?
The application is in a .dll with the WCF in a separate .dll
I totally agree with Andy H.
If I were reviewing this kind of code, I wouldn't try to make it work with the global static variable (even if in the end that is probably possible). A static global class is smelly. First of all, I would figure out how to make it work without it.
There are several solutions: dependency injection, message-based communication, an event-driven design...
To help you: a long-running process behind a web service is very common; you have a good description here. But in any case, it never uses a static class to synchronize the jobs :)
Improve your design, and you will see that your current problem won't exist at all.
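For example, here is a minimal sketch of one way to drop the static Globals class by handing the JobManager to the service instance directly; JobManager and RetrieveJobList are taken from the question, while the constructor-injection and singleton-hosting details are illustrative:
using System.Linq;
using System.ServiceModel;

[ServiceBehavior(InstanceContextMode = InstanceContextMode.Single)] // required when passing a pre-built instance to ServiceHost
public class ConsoleConnection : IConsoleConnection
{
    private readonly JobManager _jobManager;

    public ConsoleConnection(JobManager jobManager)
    {
        _jobManager = jobManager;
    }

    public string[] RetrieveJobList()
    {
        return _jobManager.GetAllJobNames().ToArray();
    }
}

// In ETLWindowsService.OnStart, build the dependencies first and hand the instance to the host:
// var jobManager = new JobManager();                  // whatever InitializeGlobals used to set up
// serviceHost = new ServiceHost(new ConsoleConnection(jobManager));
// serviceHost.Open();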

Recycle App Pool on IIS6 using C#

I'm attempting to recycle an app pool on IIS6 programmatically through a web application. I have searched all over the net and found a bunch of solutions (Most involving impersonation) but none of them seem to work. The most common error I get is E_ACCESSDENIED despite entering a valid username and password. Can anybody point me in the right direction?
The solution I use for this sort of thing (Where you're trying to run a process from ASP.NET that needs administrative privileges) is the following:
Write whatever you need done as a self-hosted WCF service, preferably an HTTP REST service so it's easy to call (even using just a browser for testing).
Make sure your service runs under an administrator account. You can use the Task Scheduler to keep the service running at all times and to run it under an administrator account.
Execute methods on the service from your ASP.NET application using a WCF client.
And it works all the time no matter what "process" I'm trying to run from within an ASP.NET application.
Now as far as the details (code) are concerned, let me know if you need help. The code below is what you'd have in a console application in order to make it a self-hosted WCF service. In this case it's an HTTP service listening on port 7654.
static void Main(string[] args)
{
var webServiceHost = new WebServiceHost(typeof(AppCmdService), new Uri("http://localhost:7654"));
ServiceEndpoint ep = webServiceHost.AddServiceEndpoint(typeof(AppCmdService), new WebHttpBinding(), "");
var serviceDebugBehavior = webServiceHost.Description.Behaviors.Find<ServiceDebugBehavior>();
serviceDebugBehavior.HttpHelpPageEnabled = false;
webServiceHost.Open();
Console.WriteLine("Service is running");
Console.WriteLine("Press enter to quit ");
Console.ReadLine();
webServiceHost.Close();
}
AppCmdService is a WCF Service class that looks like this (in my case). In your case you probably don't need a response from your service. In my case I'm getting a Json response. The actual implementation of what it is you're trying to do will be different obviously. But I'm assuming you already have that piece worked out. So simply call a method of that class from here.
[ServiceContract]
public class AppCmdService
{
[WebGet(UriTemplate = "/GetCurrentExcutingRequests/?", ResponseFormat= WebMessageFormat.Json)]
[OperationContract]
public IEnumerable<ExecutingRequestJson> GetCurrentExcutingRequests()
{
return CurrentExecutingRequestJsonProvider.GetCurrentExecutingRequests("localhost");
}
}
On your ASP.NET side, you don't really need a WCF client. All you need is a way to make an HTTP call to the service. So you can simply use HttpWebRequest to make the call out to your service, which in turn executes your process.
Hope all of this makes sense?
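For example, a rough sketch of calling the sample method above from the ASP.NET side with HttpWebRequest (the URL matches the self-hosted endpoint and UriTemplate shown earlier; adjust host and port as needed):
using System.IO;
using System.Net;

static string CallAppCmdService()
{
    var request = (HttpWebRequest)WebRequest.Create("http://localhost:7654/GetCurrentExcutingRequests/");
    request.Method = "GET";

    using (var response = (HttpWebResponse)request.GetResponse())
    using (var reader = new StreamReader(response.GetResponseStream()))
    {
        return reader.ReadToEnd(); // the JSON returned by the self-hosted service
    }
}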
Maybe this SO question helps you. There are several solutions (also for IIS6):
Restarting (Recycling) an Application Pool
IMHO the best you could do is to decide on a concrete approach and then, when you run into an exception, ask a concrete question with the source code of your approach. Otherwise it's just too vague to answer your question.
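For reference, one of the approaches discussed in that question goes through the IIS6 metabase with System.DirectoryServices; a rough sketch (the app pool name is a placeholder, and the calling identity needs administrative rights on the server, which is exactly where E_ACCESSDENIED tends to come from):
using System.DirectoryServices;

static void RecycleAppPool(string appPoolName) // e.g. "MyAppPool" - placeholder
{
    // Requires the IIS6 metabase (ADSI) and an identity with admin rights on the server.
    using (var appPool = new DirectoryEntry("IIS://localhost/W3SVC/AppPools/" + appPoolName))
    {
        appPool.Invoke("Recycle", null);
    }
}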

Web service cannot write to Event Log when called by anonymous user

SUMMARY: How to configure a web service such that writing to the Event Log is always possible (regardless of caller)?
DETAILS:
I have a web service which writes an entry to the Application Log. I established the event source for this by means of a little console application and I think I understand that part of things. When I test this WS, I see I am successfully writing my entry to the Event log.
The virtual directory which hosts this WS does NOT allow anonymous access and is configured for Integrated Windows Auth only.
I have a web client application that calls this Webservice. When the web client site is configured for Integrated Windows Auth only, calls to the Webservice result in logging as desired.
Yet if I change the web client site to allow anonymous access, then the web service's attempt to log results in an InvalidOperationException. I ignore it, but it would be nice to know how to get logging in the web service regardless of how it is called. Here is a bit of my code:
public FileService()
{
try
{
if (!EventLog.SourceExists(g_EventSource))
EventLog.CreateEventSource(g_EventSource, g_EventLog);
System.Security.Principal.WindowsIdentity UserIdentityInfo;
UserIdentityInfo = System.Security.Principal.WindowsIdentity.GetCurrent();
string AuthType = UserIdentityInfo.AuthenticationType;
if (AuthType == "Kerberos")
{ engineWSE.Credentials = System.Net.CredentialCache.DefaultCredentials; }
else
{ engineWSE.Credentials = new System.Net.NetworkCredential("u", "p", "domain"); }
EventLog.WriteEntry(g_EventSource,
"Caller: " + UserIdentityInfo.Name +
" AuthType: " + UserIdentityInfo.AuthenticationType,
EventLogEntryType.Information, 1);
}
catch (InvalidOperationException e)
{
// do nothing to ignore: "Cannot open log for source 'myAppSourceName'. You may not have write access."
}
}
The example in the constructor above is sort of contrived for here (I am mainly interested in being able to write out info related to errors in the web service).
I hope there is a way to configure the web service virtual directory (or the code within) so that logging is possible regardless of how it got called.
Network Service is allowed to write to the Event Log, but not to create an event source. You could give permissions on HKLM\SYSTEM\CurrentControlSet\Services\Eventlog\ to allow it to create one - but if you've already created it at install time, there's no need.
It's possible that it's failing on the SourceExists call as well, since that requires enumerating the same registry key. I'd probably just remove the SourceExists/Create check and trust that the source is there - if you're anonymous, you can't create it anyway.
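A rough sketch of the constructor with that check removed, showing only the logging part (g_EventSource is assumed to have been created at install time by the little console application mentioned in the question):
public FileService()
{
    try
    {
        var userIdentityInfo = System.Security.Principal.WindowsIdentity.GetCurrent();

        // g_EventSource is assumed to already exist, so no SourceExists/CreateEventSource here.
        EventLog.WriteEntry(g_EventSource,
            "Caller: " + userIdentityInfo.Name +
            " AuthType: " + userIdentityInfo.AuthenticationType,
            EventLogEntryType.Information, 1);
    }
    catch (InvalidOperationException)
    {
        // Still swallow the exception so a logging failure never breaks the service itself.
    }
}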
You should also check your web.config.
If IIS is set to anonymous access and web.config is set to Windows authentication with impersonation, then it will be the anonymous IIS user that is trying to write to the event log.
