I have a C# application which runs as a Windows service. This application uses an open-source tiny HTTP server for URL communication. There is a Flex application that updates and selects data from an SQLite database via the C# application using GET/POST methods.
I have a URL, https://domainname:portnumber/folder/tree/200,
which reads data from the database with the help of the C# service and returns a huge amount of data in XML form to the client.
Sometimes when this URL gets called, the whole C# Windows service gets restarted,
and the Flex application then needs to be refreshed to start it again.
The firewall on the server where the Windows service is installed is turned off, and the machine is reachable.
When I checked the log I found that the service restarts after this URL call. When I checked the traffic in Fiddler, I got the error below:
HTTP/1.1 502 Fiddler - Connection Failed
Content-Type: text/html; charset=UTF-8
Connection: close
Timestamp: 10:18:52.685
[Fiddler] The socket connection to (domainname) failed. ErrorCode: 10061.
The code that handles this folder/tree call is below:
public string Tree()
{
try
{
string langstr = "";
if (Request.QueryString["lang"] != null && !string.IsNullOrEmpty(Request.QueryString["lang"].Value))
{
langstr = Request.QueryString["lang"].Value.ToString();
}
else
{
ThingzDatabase db = SessionDatabase;
langstr = db.DefaultLanguage;
db = null;
}
folderTree = new FolderTree(Convert.ToInt32(Id), true, SessionDatabase, langstr);
XmlDocument doc = folderTree.XML;
Response.ContentType = ContentType.Xml;
langstr = null;
folderTree.db2 = null;
folderTree = null;
//GC.Collect();
return doc.InnerXml;
}
catch (Exception e)
{
TouchServer.Log(Logger.MessageType.Error, 1, e.ToString());
return "Get folder tree failed, reason:" + e.Message;
}
}
To execute queries against the SQLite database, the code below is used:
public SQLiteDataReader ExecuteSQL(String sqlExpr)
{
    if (conn.State != ConnectionState.Open)
        Open(DataFile);
    using (SQLiteCommand cmd = conn.CreateCommand())
    {
        cmd.CommandText = sqlExpr + ";PRAGMA read_uncommitted = 1;";
        cmd.CommandType = CommandType.Text;
        // Note: the reader is returned open; the caller is responsible for closing it.
        return cmd.ExecuteReader();
    }
}
What's the size of the returned string? You could write it to a file to verify its length.
There may be problems with the web service if the length exceeds a certain limit.
The following link discusses a similar problem:
http://social.msdn.microsoft.com/forums/en-US/wcf/thread/58e420e9-43a3-4119-b541-d18158038e36/
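For instance, a quick and entirely hypothetical way to check this inside Tree(), assuming a writable folder such as C:\temp exists, would be:
// Dump the XML to disk so its size can be inspected, then return it as before.
string xml = doc.InnerXml;
System.IO.File.WriteAllText(@"C:\temp\tree-response.xml", xml);   // example path only
return xml;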
The service may crash if you have not handled the exception.
In that service, which is sending a big chunk of text over HTTP: when writing to the response, add response.BufferOutput = false; at the beginning of the method and call Flush after every few writes.
Note: I am not sure whether this works in your embedded HTTP server, but it does work for IIS.
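As a rough sketch of the idea against the standard ASP.NET HttpResponse (your embedded server's Response object may not expose the same members, so treat this as an assumption):
public void WriteTree(HttpResponse response, XmlDocument doc)
{
    // Stream the XML instead of buffering the whole response in memory.
    response.BufferOutput = false;
    response.ContentType = "text/xml";
    // Root element attributes are omitted here for brevity.
    response.Write("<" + doc.DocumentElement.Name + ">");
    foreach (XmlNode node in doc.DocumentElement.ChildNodes)
    {
        response.Write(node.OuterXml);
        response.Flush();   // push what has been written so far to the client
    }
    response.Write("</" + doc.DocumentElement.Name + ">");
    response.Flush();
}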
It looks as if your SQL code is not catching exceptions... At least not immediately within the method. It stands to reason that if an exception were thrown, the service would crash due to the unhandled exception. Try hooking into the AppDomain.UnhandledException event and printing any caught exceptions to a file, log source, etc. This will allow you to determine if SQL exceptions (read timeouts, etc.) are causing the crash.
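A minimal sketch of hooking that event at service start-up (the log file path is just an example; use whatever logging you already have):
// In the service's start-up code (e.g. OnStart):
AppDomain.CurrentDomain.UnhandledException += (sender, args) =>
{
    var ex = args.ExceptionObject as Exception;
    // Write the exception somewhere durable before the process dies.
    System.IO.File.AppendAllText(@"C:\logs\service-crash.log",
        DateTime.Now + " " + (ex != null ? ex.ToString() : args.ExceptionObject.ToString()) + Environment.NewLine);
};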
Related
How can I check whether a method in a web service is working fine or not? I cannot use ping. I still want to check any kind of method being invoked from the web service by the client. I know it is difficult to generalize, but there should be some way.
I use this method and it works fine:
public bool IsAddressAvailable(string address)
{
    try
    {
        // Try to download the resource; any failure means the address is not reachable.
        using (var client = new System.Net.WebClient())
        {
            client.DownloadData(address);
        }
        return true;
    }
    catch
    {
        return false;
    }
}
The only way to know if a web service method is working "fine" is to call the method and then to evaluate whether the result is "fine". If you want to keep a record of "fine" vs. time, then you can log the result of the evaluation.
There's no more general way to do this that makes any sense. Consider:
You could have code that creates an HTTP connection to the service endpoint, but success doesn't tell you whether the service will immediately throw an exception as soon as you send it any message.
You could connect and send it an invalid message, but that doesn't tell you much.
You could connect and send it a valid message, then check the result to ensure that it is valid. That will give you a pretty good idea that when a real client calls the service immediately afterwards, the real client should expect a valid result.
Unless the service takes that as an opportunity to crash, just to spite you!
The best technique would be to use WCF tracing (possibly with message-level tracing) to log what actually happens with the service, good or bad. A human can then look at the logs to see if they are "fine".
PowerShell is by far the easiest way to 'ping' a web service endpoint.
Use the following expression:
Test-NetConnection -Port 4408 -ComputerName 192.168.134.1
Here is a failure response for a port that does not exist or is not listening:
WARNING: TCP connect to 192.168.134.1:4408 failed
ComputerName : 192.168.134.1
RemoteAddress : 192.168.134.1
RemotePort : 4408
InterfaceAlias : Ethernet0
SourceAddress : 192.168.134.1
PingSucceeded : True
PingReplyDetails (RTT) : 0 ms
TcpTestSucceeded : False
Here is a success result if the address/port is listening and accessible:
ComputerName : 192.168.134.1
RemoteAddress : 192.168.134.1
RemotePort : 4407
InterfaceAlias : Ethernet0
SourceAddress : 192.168.134.1
TcpTestSucceeded : True
Just use try/catch inside the method of your web service and log exceptions to a log file or to the event log.
Example:
[OperationContract]
public bool isGUID(string input)
{
    bool functionReturnValue = false;
    try
    {
        Guid guid;
        functionReturnValue = Guid.TryParse(input, out guid);
    }
    catch (Exception ex)
    {
        Log.WriteServerErrorLog(ex);
    }
    return functionReturnValue;
}
You don't need to ping the web service itself; instead, ping the server with a watchdog service or something similar (a sketch follows below). That said, I don't think you need to do this at all.
Either your web service works or it doesn't, because of bad code.
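If you do go the watchdog route, a minimal sketch of pinging the host (not the web service) might look like this; the timeout value is arbitrary:
using System.Net.NetworkInformation;

bool IsHostAlive(string host)
{
    using (var ping = new Ping())
    {
        // An ICMP ping only tells you the machine is reachable, not that the service works.
        PingReply reply = ping.Send(host, 2000);   // 2-second timeout
        return reply != null && reply.Status == IPStatus.Success;
    }
}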
You may try curl. It's a Linux tool and should be available in Cygwin too.
$ curl http://google.com
<HTML><HEAD><meta http-equiv="content-type" content="text/html;charset=utf-8">
<TITLE>301 Moved</TITLE></HEAD><BODY>
<H1>301 Moved</H1>
The document has moved
here.
</BODY></HTML>
There are lots of options; examples can be found in the 'net.
You can write yourself a little tool or Windows service or whatever you need, then look at these 2 articles:
C#: How to programmatically check a web service is up and running?
check to if web-service is up and running - efficiently
EDIT:
This was my implementation in a similar scenario, where I needed to know whether an external service still existed every time before the call was made:
bool IsExternalServiceRunning
{
    get
    {
        bool isRunning = false;
        try
        {
            // Resolve the address from the generated WCF client's endpoint and issue a plain HTTP GET.
            var endpoint = new ServiceClient();
            var serviceUri = endpoint.Endpoint.Address.Uri;
            var request = (HttpWebRequest)WebRequest.Create(serviceUri);
            request.Timeout = 1000000;
            using (var response = (HttpWebResponse)request.GetResponse())
            {
                if (response.StatusCode == HttpStatusCode.OK)
                    isRunning = true;
            }
        }
        catch (Exception ex)
        {
            // Handle error
        }
        return isRunning;
    }
}
As I see it, you have 2 options:
If you can access the server it is running on, log every call (and any exceptions thrown). Read the log file with a tool like BareTail that updates as the file is being written.
If you can't access the server, then you have to make the web service write that log remotely to another computer you do have access to.
Popular loggers have this functionality built in (log4net, ...).
You could also use tracing.
http://msdn.microsoft.com/en-us/library/ms732023.aspx
http://msdn.microsoft.com/en-us/library/ms733025.aspx
I have an ASP.NET web page which interacts with a SQL Server database, grabs some data and then returns an XML response (which I feed into FreeSWITCH using xml_curl).
Because FreeSWITCH (FS from now on) does not store cookies, each request creates a new session.
When the number of requests gets too high (about 97 to 100), the SqlConnection.Open() method times out against the SQL Server instance, which then results in an HTTP Error 500.
To test my assumption, I created a small script using PHP and cURL which makes repeated requests to my ASP.NET page. If I store cookies (and thus sessions) in the PHP script, I can make 10000 successful requests in about 314 seconds.
But without sessions, I get stuck at about 97~100 requests, and then I get HTTP Error 500.
Is there any way to overcome this problem?
==Edit==
Here is how I interact with the database:
String connectionString = WebConfigurationManager.ConnectionStrings["SqlServerConnection"].ConnectionString;
SqlConnection connection = new SqlConnection(connectionString);
SqlCommand command = connection.CreateCommand();
command.CommandType = CommandType.Text;
command.CommandText = "Select * from dbo.Template where Name = '" + name + "'";
Template template = new Template();
connection.Open();
SqlDataReader reader = command.ExecuteReader();
if (reader.HasRows)
{
reader.Read();
template.Name = reader["Name"].ToString();
template.XMLContent = reader["XMLContent"].ToString();
}
else
{
template.Name = "";
template.XMLContent = "";
}
reader.Close();
connection.Close();
return template;
And the Template table has these fields:
ID => int, identity, primary key
Name => nvarchar(255), unique
XMLContent => ntext
It appears you are using a connection pool. By default the pool has a maximum of 100 connections to your SQL Server and queues any additional requests. The queue has a timeout (15 seconds by default) which can be extended if you want requests to wait in the queue longer. This means that you might get backed up on your server. You can also increase the pool's maximum size if your SQL Server can handle it.
You can change these settings by adding parameters such as these to your connection string:
Timeout=60
Max Pool Size=150
etc.
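For example, appended to the connection string (the values here are purely illustrative; size the pool to what your SQL Server can actually handle):
// Illustrative values only - tune them to your environment.
string connectionString =
    "Data Source=myServer;Initial Catalog=myDb;User Id=user;Password=pass;" +
    "Connect Timeout=60;Max Pool Size=150;Pooling=true;";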
Some steps to improve this code:
If you do not need session state, disable it for this page so no cookie is created.
Use some caching here based on the name: if a request for the same name comes in, get it from the cache instead of opening the database.
Use a static variable so the connection string is read only once.
Wrap it in [try catch | using] to be sure you close the connection in case of failure.
Maybe you can try mutex/lock logic to avoid too many requests running together.
Use parameters in your SQL call (see the sketch after this list).
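A rough sketch combining the caching, using, and parameter suggestions above (the cache key, method shape, and connection string name are assumptions based on the code in the question):
// Sketch only: cache the template per name and use a parameterized query.
Template GetTemplate(string name)
{
    var cached = HttpRuntime.Cache["Template_" + name] as Template;
    if (cached != null)
        return cached;

    var template = new Template { Name = "", XMLContent = "" };
    string cs = WebConfigurationManager.ConnectionStrings["SqlServerConnection"].ConnectionString;
    using (var connection = new SqlConnection(cs))
    using (var command = new SqlCommand("SELECT Name, XMLContent FROM dbo.Template WHERE Name = @name", connection))
    {
        command.Parameters.AddWithValue("@name", name);
        connection.Open();
        using (var reader = command.ExecuteReader())
        {
            if (reader.Read())
            {
                template.Name = reader["Name"].ToString();
                template.XMLContent = reader["XMLContent"].ToString();
            }
        }
    }
    // Cache for 10 minutes (arbitrary) so repeated requests skip the database.
    HttpRuntime.Cache.Insert("Template_" + name, template, null,
        DateTime.Now.AddMinutes(10), System.Web.Caching.Cache.NoSlidingExpiration);
    return template;
}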
In addition to #Aristos suggestions:
Use Async-Pages!
Example and "Benchmark"
Some time ago I asked nearly the same question here on SO.
Currently I have a very simple piece of code that downloads a file from a server; however, I keep running into the following exceptions:
The remote server returned an error: (500)
Unable to connect to the remote server
There is nothing wrong with the web server; it has to do with my service, and I guess it times out. How can I handle these more robustly? My code is shown below; it's really simple.
try
{
    string[] splitCrawlerid = StaticStringClass.crawlerID.Split('t');
    WebClient webClient = new WebClient();
    // Make sure the target folder exists before downloading into it.
    if (!Directory.Exists("C:\\ImageDepot\\" + splitCrawlerid[2]))
    {
        Directory.CreateDirectory("C:\\ImageDepot\\" + splitCrawlerid[2]);
    }
    webClient.DownloadFile(privateHTML, @"C:\ImageDepot\" + splitCrawlerid[2] + "\\" + "AT" + carID + ".jpeg");
}
catch (Exception ex)
{
    //not sure how to really handle these two exceptions reliably
}
The ideal situation for me would be to attempt to download the file again.
Try setting a user-agent header. The WebClient doesn't send one by default, and MSDN warns that some web servers will return a 500 error if the user-agent isn't set.
A WebClient instance does not send optional HTTP headers by default.
If your request requires an optional header, you must add the header
to the Headers collection. For example, to retain queries in the
response, you must add a user-agent header. Also, servers may return
500 (Internal Server Error) if the user agent header is missing.
See the example on the MSDN page for how to add the header.
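Something along these lines (the user-agent string itself is arbitrary):
WebClient webClient = new WebClient();
// Some servers reject requests that have no user-agent header, so set one explicitly.
webClient.Headers.Add("user-agent", "Mozilla/5.0 (compatible; MyDownloader/1.0)");
webClient.DownloadFile(privateHTML, @"C:\ImageDepot\" + splitCrawlerid[2] + "\\" + "AT" + carID + ".jpeg");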
You could wrap the whole thing in a for loop that goes 0..3, and the line after webClient.DownloadFile(...) could be a break;. That way if there's an exception, the break gets skipped and the app tries again. But that seems to be more of a band-aid to me; I'd spend more time figuring out exactly why things are going wrong.
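A sketch of that retry loop (three attempts here, and destinationPath stands in for the target path built in your code):
for (int attempt = 0; attempt < 3; attempt++)
{
    try
    {
        webClient.DownloadFile(privateHTML, destinationPath);
        break;   // success - stop retrying
    }
    catch (WebException)
    {
        // swallow and retry; after the last attempt the loop simply ends
    }
}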
If you want to remove all the "try while blah else until rethrow whatever" code from the business logic of your app, you could define an extension method like
public static T TryNTimes<T>(this Func<T> func, int n)
{
    int i = 0;
    while (true)
    {
        try
        {
            return func();
        }
        catch
        {
            // Rethrow once all n attempts have been used up.
            if (++i == n) throw;
        }
    }
}
and use it like
// WebClient.DownloadFile returns void, so wrap the call in a Func that returns a dummy value:
Func<bool> downloader = () => { client.DownloadFile(...); return true; };
downloader.TryNTimes(5);
I have some problems with my database connection and wonder if I have something wrong in my code. Please review. This question is related: Switch between databases, use two databases simultaneously question.
cs="Data Source=mywebsite.com;Initial Catalog=database;User Id=root;Password=toor;Connect Timeout=10;Pooling='true';"
using (SqlConnection cnn = new SqlConnection(WebConfigurationManager.ConnectionStrings["cs"].ConnectionString))
{
using (SqlCommand cmmnd = new SqlCommand("", cnn))
{
try
{
cnn.Open();
#region Header & Description
cmmnd.Parameters.Add("@CatID", SqlDbType.Int).Value = catId;
cmmnd.CommandText = "SELECT UpperID, Title, Description FROM Categories WHERE CatID=@CatID;";
string mainCat = String.Empty, rootCat = String.Empty;
using (SqlDataReader rdr = cmmnd.ExecuteReader())
{
if (rdr.Read())
{
mainCat = rdr["Title"].ToString();
upperId = Convert.ToInt32(rdr["UpperID"]);
description = rdr["Title"];
}
else { Response.Redirect("/", false); }
}
if (upperId > 0) //If upper category exists add its name
{
cmmnd.Parameters["@CatID"].Value = upperId;
cmmnd.CommandText = "SELECT Title FROM Categories WHERE CatID=@CatID;";
using (SqlDataReader rdr = cmmnd.ExecuteReader())
{
if (rdr.Read())
{
rootCat = "<a href='x.aspx'>" + rdr["Title"] + "</a> » ";
}
}
}
#endregion
#region Sub-Categories
if (upperId == 0) //show only at root categories
{
cmmnd.Parameters["@CatID"].Value = catId;
cmmnd.CommandText = "SELECT Count(CatID) FROM Categories WHERE UpperID=@CatID;";
if (Convert.ToInt32(cmmnd.ExecuteScalar()) > 0)
{
cmmnd.CommandText = "SELECT CatID, Title FROM Categories WHERE UpperID=@CatID ORDER BY Title;";
using (SqlDataReader rdr = cmmnd.ExecuteReader())
{
while (rdr.Read())
{
subcat.InnerHtml += "<a href='x.aspx'>" + rdr["Title"].ToString().ToLower() + "</a>\n";
description += rdr["Title"] + ", ";
}
}
}
}
#endregion
}
catch (Exception ex) { HasanG.LogException(ex, Request.RawUrl, HttpContext.Current); Response.Redirect("/", false); }
finally { cnn.Close(); }
}
}
The random errors I'm receiving are:
A transport-level error has occurred when sending the request to the server. (provider: TCP Provider, error: 0 - An existing connection was forcibly closed by the remote host.)
A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: Named Pipes Provider, error: 40 - Could not open a connection to SQL Server)
Timeout expired. The timeout period elapsed prior to obtaining a connection from the pool. This may have occurred because all pooled connections were in use and max pool size was reached.
Cannot open database "db" requested by the login. The login failed. Login failed for user 'root'.
There are no real issues here.
You don't need the extraneous finally { cnn.Close(); } as the using clause will take care of that for you. However, changing it will have exactly zero impact.
Another thing is that I would put the try..catch outside of the using clause with a redirect. But, again, I don't think that would stop the dispose from being called.
It's interesting that you would get connection pool errors (timeout expired) if you are always properly disposing of your connections, as you've shown.
Which leaves us with only one real solution: switch hosting providers. They have either overloaded their DB server to the point of unusability or some hardware element in their network setup (nic, switch, router, etc) is bad and dropping packets.
There are a couple of inconsistencies which need to be fixed:
description = rdr["Title"]; has no proper casting defined.
The same command object is used for every SQL statement and you are not even clearing its parameters; ideally a separate command should be used for each SQL statement (see the sketch after this list).
There are too many redirections as well; it is best to handle redirection at the end of the method.
Check the database server's health too; it looks like the database server is not responsive enough.
Hope this helps.
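For the second point, a small sketch of what that might look like, using either approach (names taken from the question's code):
// Option A: reuse the command but reset its parameters between statements.
cmmnd.Parameters.Clear();
cmmnd.Parameters.Add("@CatID", SqlDbType.Int).Value = upperId;
cmmnd.CommandText = "SELECT Title FROM Categories WHERE CatID=@CatID;";

// Option B: a dedicated command per statement.
using (var titleCmd = new SqlCommand("SELECT Title FROM Categories WHERE CatID=@CatID;", cnn))
{
    titleCmd.Parameters.Add("@CatID", SqlDbType.Int).Value = upperId;
    // ... ExecuteReader as before
}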
If you're connecting remotely to a database provider, you need to look at several possibilities like your own network configuration, firewall setup, etc.
Use a packet sniffer to figure out if lost packets are the issue.
Connection pooling is set up on your local machine, i.e. the server making the connections. If the database provider only allows for 5 connections and your connection pool is set up for 50 connections, well... you can do the math. It looks like you're closing the connections properly, so no issues there.
True... one error on "description = rdr["Title"];", should that be "description = rdr["Description"].ToString()"?
No need to put a using statement around the SqlCommand object and since you're using ad-hoc queries, just use string.Format("sql test {0}", param). This way, you can reuse the SqlCommand object without having to clear the parameters.
The biggest issue I see here is that you've mixed the presentation layer with the business layer with the datasource layer. Dump the try...catch and allow the business layer to handle logging stuff. Return an object to the presentation layer and allow it to perform the redirects. Keep the datasource layer very simple... get the data and return an entity. The business layer can handle any business logic on the entity itself.
SQL Server not found could be your fault or the providers... if the provider is at fault often, change providers.
Are you sure that the DB is configured to grant remote access using TCP?
I am limiting the file size users can upload to the site from Web.config. As explained here, it should throw a ConfigurationErrorsException if the size is not accepted. I tried to catch it from the action method or controller for upload requests, but with no luck. The connection is reset and I can't get it to show an error page.
I tried catching it in the BeginRequest event, but no matter what I do the exception is unhandled.
Here's the code:
protected void Application_BeginRequest(Object sender, EventArgs e)
{
HttpContext context = ((HttpApplication)sender).Context;
try
{
if (context.Request.ContentLength > maxRequestLength)
{
IServiceProvider provider = (IServiceProvider)context;
HttpWorkerRequest workerRequest = (HttpWorkerRequest)provider.GetService(typeof(HttpWorkerRequest));
// Check if body contains data
if (workerRequest.HasEntityBody())
{
// get the total body length
int requestLength = workerRequest.GetTotalEntityBodyLength();
// Get the initial bytes loaded
int initialBytes = 0;
if (workerRequest.GetPreloadedEntityBody() != null)
initialBytes = workerRequest.GetPreloadedEntityBody().Length;
if (!workerRequest.IsEntireEntityBodyIsPreloaded())
{
byte[] buffer = new byte[512];
// Set the received bytes to initial bytes before start reading
int receivedBytes = initialBytes;
while (requestLength - receivedBytes >= initialBytes)
{
// Read another set of bytes
initialBytes = workerRequest.ReadEntityBody(buffer, buffer.Length);
// Update the received bytes
receivedBytes += initialBytes;
}
initialBytes = workerRequest.ReadEntityBody(buffer, requestLength - receivedBytes);
}
}
}
}
catch(HttpException)
{
context.Response.Redirect(this.Request.Url.LocalPath + "?action=exception");
}
}
But I still get this:
Maximum request length exceeded.
Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.Web.HttpException: Maximum request length exceeded.
Source Error:
An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using the exception stack trace below.
Update:
What method raises the exception anyway? If I read the request, it raises the exception; if I don't read it at all, I get "101 Connection Reset" in the browser. What can be done here?
You can't catch the error in the action method because the exception comes earlier, but you can catch it here:
protected void Application_Error() {
var lastError = Server.GetLastError();
if(lastError !=null && lastError is HttpException && lastError.Message.Contains("exceed")) {
Response.Redirect("~/errors/RequestLengthExceeded");
}
}
Actually, when the file size exceeds the limit, an HttpException is raised.
There is also an IIS limit on content, which can't be caught in the application. IIS 7 throws:
HTTP Error 404.13 - Not Found. The request filtering module is configured to deny a request that exceeds the request content length.
You can Google it; there is a lot of information about this IIS error.
There is no way to do this right without client-side help. You cannot determine whether the request is too long unless you read all of it. If you read each request to the end, anyone can come along and keep your server busy. If you just look at the content length and drop the request, the other side is going to think there is a connection problem. There is nothing you can do with error handling; it's a shortcoming of HTTP.
You can use Flash or JavaScript components to make it right, because this thing cannot fail nicely.
I am not 100% sure on this, but I think it might help if you tried changing:
context.Response.Redirect(this.Request.Url.LocalPath + "?action=exception");
to
Server.Transfer(this.Request.Url.LocalPath + "?action=exception", false);
My thinking is that the over-max-request-length request is still being processed in the Redirect call, but if you tell it to ditch the form data, it will come in under the max request length and then it might behave differently.
No guarantees, but it's easy to check.
catch (Exception ex)
{
if (ex is HttpException && (ex as HttpException).WebEventCode == 3004)
{
//-- you can now inform the client that file uploaded was too large.
}
else
throw;
}
I have a similar issue in that I want to catch the 'Maximum request length exceeded' exception within the Application_Error handler and then do a Redirect.
(The difference is that I am writing a REST service with ASP.Net Web API and instead of redirecting to an error page, I wanted to redirect to an Error controller which would then return the appropriate response).
However, what I found was that when running the application through the ASP.Net Development Server, the Response.Redirect didn't seem to be working. Fiddler would state "ReadResponse() failed: The server did not return a response for this request."
My client (Advanced REST Client for Chrome) would simply show "0 NO RESPONSE".
If I then ran the application via a local copy of IIS on my development machine then the redirect would work correctly!
I'm not sure I can definitively say that Response.Redirect does not work on the ASP.NET Development Server, but it certainly wasn't working in my situation.
So, I recommend trying to run your application through IIS instead of IIS Express or the Development Server and see if you get a different result.
See this link on how to Specify the Web Server for Web Projects in Visual Studio:
http://msdn.microsoft.com/en-us/library/ms178108(v=vs.100).aspx