I am looking for a way to close a WebTest response stream (a JSON object) without having to use the timeout property, as it makes the test fail and doesn't always work. The reason for this is that the stream ticks infinitely unless it is closed by the client; right now my tests just time out because I haven't found a way to close the streams from code.
The JSON object doesn't need to be valid, but an example of such an object and what it looks like when streamed can be found here: http://tradestation.github.io/webapi-docs/en/stream/
My load test parses an IIS log and then sends the Web API requests it finds as WebTestRequests. Some of those requests are answered with JSON objects that stream endlessly, and I need to close those streams based on the time the request took to complete in the IIS log.
public class WebTest1Coded : WebTest
{
    public WebTest1Coded()
    {
        this.PreAuthenticate = true;
    }

    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        // Substitute the path below with the path of the IIS log file
        IISLogReader reader = new IISLogReader(@"C:\IisLogsToWebPerfTest\TestData\log.log");

        foreach (WebTestRequest request in reader.GetRequests())
        {
            if (this.LastResponse != null)
            {
                this.LastResponse.HtmlDocument.ToString();
            }
            yield return request;
        }
    }
}
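For reference, the duration I want to use comes from the log's time-taken field; a rough sketch of pulling it out (a hypothetical helper, assuming the default W3C layout where time-taken is the last field of each entry):
// Hypothetical helper: extract time-taken (in milliseconds) from a W3C-format
// IIS log line, assuming time-taken is logged as the last field.
private static int GetTimeTakenMs(string logLine)
{
    string[] fields = logLine.Split(' ');
    return int.Parse(fields[fields.Length - 1]);
}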
Thanks!
So I have an app that works using a token for a user. A user can be signed in on only 1 device (after login, previous tokens get expired). So I came up with the idea of caching some data, and I created a CacheManager that is a singleton.
The CacheManager has a Dictionary with previously fetched data in it.
Here is an example:
/// <summary>
/// Tries to get Global settings data from the cache. If it is not present - asks ServiceManager to fetch it.
/// </summary>
/// <returns>Global setting object</returns>
public async Task<GlobalSettingsModel> GetGlobalSettingAsync()
{
    GlobalSettingsModel result;
    if (!this.cache.ContainsKey("GlobalSettings"))
    {
        result = await ServiceManager.Instance.RequestGlobalSettingAsync();
        if (result != null)
        {
            this.cache.Add("GlobalSettings", result);
        }
        // TODO: Figure out what to do in case of null
    }
    return (GlobalSettingsModel)this.cache["GlobalSettings"];
}
So the question is: how can I modify this method to handle the following case?
When the method I call on the server takes longer than the user's navigation, I want to show a loading indicator and hide it when the data has actually been received.
Why do I need this? We have 2 pages - ExtendedSplashScreen and UpdatesPage - and the user can either quickly skip them (1 s) or stay and read the interesting info (let's say 1 m).
In this time, I have already started to fetch the GetGlobalSetting data, in order to have the process finished, or at least partially downloaded (to minimize the wait for the user), by the time he gets to the LoginPage.
On my ExtendedSplashScreen I launch:
CacheManager.Instance.GetGlobalSettingAsync();
For test purposes, I have modified the ServiceManager method:
/// <summary>
/// Fetches the object of Global Settings from the server
/// </summary>
/// <returns>Global setting object</returns>
public async Task<GlobalSettingsModel> RequestGlobalSettingAsync()
{
    await Task.Delay(60000);

    // Request and response JSONs are kept here because we will need them to be
    // logged if any unexpected exceptions occur
    // Response JSON
    string responseData = string.Empty;
    // Request JSON
    string requestData = JsonConvert.SerializeObject(new GlobalSettingsRequestModel());

    // Posting the list of keys that we want to get from the GlobalSettings table
    HttpResponseMessage response = await client.PostAsync("ServerMethod", new StringContent(requestData, Encoding.UTF8, "application/json"));
    // TODO: HANDLE ALL THE POSSIBLE SERVER ERRORS

    Stream receiveStream = await response.Content.ReadAsStreamAsync();
    StreamReader readStream = new StreamReader(receiveStream, Encoding.UTF8);

    // Read the response data
    responseData = readStream.ReadToEnd();
    return JsonConvert.DeserializeObject<GlobalSettingsResponseModel>(responseData).GlobalSettings;
}
So, when the user gets to the LoginPage, I do the following:
// The await is here because there is no way to proceed further without this data
GlobalSettingsModel settings = await CacheManager.Instance.GetGlobalSettingAsync();
And here I would like to get the data from the cache if it has already been downloaded, or have the CacheManager return the data to me as soon as it has finished downloading.
One way would be to cache the Task<GlobalSettingsModel> instead of the GlobalSettingsModel itself. When you obtain it from the cache, you could check if it has completed and then either await or use its result accordingly.
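A minimal sketch of that idea, assuming the cache is (or can be made into) a dictionary that holds tasks rather than results:
// Cache the Task<GlobalSettingsModel> itself: the request starts once, and
// every caller awaits the same task, whether it is still running or finished.
private readonly Dictionary<string, Task<GlobalSettingsModel>> cache =
    new Dictionary<string, Task<GlobalSettingsModel>>();

public Task<GlobalSettingsModel> GetGlobalSettingAsync()
{
    Task<GlobalSettingsModel> cached;
    if (!this.cache.TryGetValue("GlobalSettings", out cached))
    {
        cached = ServiceManager.Instance.RequestGlobalSettingAsync();
        // Note: a faulted task would stay cached here, so in a real
        // implementation you may want to evict it on failure.
        this.cache.Add("GlobalSettings", cached);
    }
    return cached;
}
Awaiting an already-completed task returns its result immediately, so the await on the LoginPage works unchanged whether the download is still in flight or long finished.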
I'm working with a frustrating API that has an annoying habit of varying its throttling rate. Sometimes I can send one request per second, and sometimes I can only send a request every three to four seconds.
With this in mind, I need to create a way to manage this. Whenever a request fails, it returns a 503 response (Service Unavailable). My current plan is to use the HttpStatusCode of my WebResponse to determine whether or not I should swallow the current WebException. I can repeat this x number of times until either the request is successful or the process is cancelled altogether.
Note that I cannot stop and restart the process, because that is both time consuming for the user and damaging to the structure of the data.
Currently, I have wrapped up the API call and XML load into a method of its own:
int webexceptionnumber = 200;

public bool webResponseSuccessful(string uri, XmlDocument doc)
{
    try
    {
        WebRequest request = HttpWebRequest.Create(uri);
        WebResponse response = request.GetResponse();
        doc.Load(response.GetResponseStream());
        return true;
    }
    catch (WebException l)
    {
        // l.Response can be null (e.g. on a timeout), so check before casting
        HttpWebResponse errorResponse = l.Response as HttpWebResponse;
        if (errorResponse != null && errorResponse.StatusCode == HttpStatusCode.ServiceUnavailable)
        {
            webexceptionnumber = 503; // I need to do this in a much neater
            return false;             // fashion, but this is quick and dirty for now
        }
        else
        {
            return false;
        }
    }
}
I can then call this, checking to see if it returns a false value, like so:
if (!webResponseSuccessful(signedUri, xDoc))
{
    // Here is where I'm struggling - see below
}
I'm stumped as to how to do this cleanly. I'd like to avoid getting messier by using a goto statement, so how do I repeat the action when a 503 response is returned? Is a while loop the answer, or do I really need to do away with that extra integer and do this in a cleaner fashion?
Change the bool to a "return type"; in that type, have a bool that says IsSuccessful and another that says ShouldTryAgain. Then have the caller decide whether to run the operation again or continue.
public class ReturnType
{
    public bool IsSuccessful { get; set; }
    public bool ShouldTryAgain { get; set; }
}
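A hedged sketch of the calling side, assuming the method from the question is reworked into a hypothetical TryWebResponse that returns a ReturnType:
// Hypothetical caller: retry while the API reports 503, with no goto and
// no extra status integer.
ReturnType result;
do
{
    result = TryWebResponse(signedUri, xDoc); // hypothetical reworked method
    if (!result.IsSuccessful && result.ShouldTryAgain)
    {
        Thread.Sleep(2000); // back off to respect the varying throttle rate
    }
} while (!result.IsSuccessful && result.ShouldTryAgain);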
I have a scenario where we maintain a Rates file (.xml) which is accessed by 3 different applications running on 3 different servers. All 3 applications use RatesMaintenance.dll, which has the 4 methods below: Load, Write, Read and Close.
All 3 applications write into the file continuously, and hence I have added a Monitor.Enter and Monitor.Exit mechanism, assuming that these operations from the 3 different applications will not collide. But at the moment, in some cases, I am getting the error - "Could not open the rates file".
As per my understanding, this means that for some reason the 3 applications try to access the file at the same time. Could anyone please suggest how to handle such a scenario?
Monitor.Enter(RatesFileLock);
try
{
    // Open Rates file
    LoadRatesFile(false);

    // Write Rates into file
    WriteRatesToFile();

    // Close Rates file
    CloseRatesFile();
}
finally
{
    Monitor.Exit(RatesFileLock);
}
Method signature of Load:
LoadRatesFile(bool isReadOnly)
For opening the file:
new FileStream(RatesFilePath,
    isReadOnly ? FileMode.Open : FileMode.OpenOrCreate,
    isReadOnly ? FileAccess.Read : FileAccess.ReadWrite,
    isReadOnly ? FileShare.ReadWrite : FileShare.None);
// ... remaining Rates reading logic code here
For reading Rates from the file:
Rates = LoadRatesFile(true);
For writing Rates into the file:
if (_RatesFileStream != null && _RatesInfo != null && _RatesFileSerializer != null)
{
    _RatesFileStream.SetLength(0);
    _RatesFileSerializer.Serialize(_RatesFileStream, _RatesInfo);
}
In the file-closing method:
_RatesFileStream.Close();
_RatesFileStream = null;
I hope I have explained my scenario in enough detail. Please let me know if anyone needs more details.
While the other answers are correct that you won't be able to get a perfect solution with files that are accessed concurrently by multiple processes, adding a retry mechanism may make it reliable enough for your use case.
Before I show one way to do that, I've got two minor suggestions - C#'s "using" blocks are really useful for dealing with resources such as files and locks that you really want to be sure to dispose of after use. In your code, the monitor is always exited because you use try..finally (though this would still be clearer with an outer "lock" block) but you don't close the file if the WriteRatesToFile method fails.
So, firstly, I'd suggest changing your code to something like the following -
private static object _ratesFileLock = new object();

public void UpdateRates()
{
    lock (_ratesFileLock)
    {
        using (var stream = GetRatesFileStream())
        {
            var rates = LoadRatesFile(stream);
            // Apply any other update logic here
            WriteRatesToFile(rates, stream);
        }
    }
}

private Stream GetRatesFileStream()
{
    return File.Open("rates.txt", FileMode.OpenOrCreate, FileAccess.ReadWrite, FileShare.ReadWrite);
}

private IEnumerable<Rate> LoadRatesFile(Stream stream)
{
    // Apply any other logic here
    return RatesSerialiser.Deserialise(stream);
}

private void WriteRatesToFile(IEnumerable<Rate> rates, Stream stream)
{
    RatesSerialiser.Serialise(rates, stream);
}
This tries to open the file stream once and then reuses it between the load and write actions - and reliably disposes of it, even if an error is encountered inside the using block (the same applies to the "lock" block, which is simpler than Monitor.Enter/Exit and try..finally).
This could quite simply be extended to include a retry mechanism so that if the file is locked by another process then we wait a short time and then try again -
private static object _ratesFileLock = new object();

public void UpdateRates()
{
    Attempt(TryToUpdateRates, maximumNumberOfAttempts: 50, timeToWaitBetweenRetriesInMs: 100);
}

private void TryToUpdateRates()
{
    lock (_ratesFileLock)
    {
        using (var stream = GetRatesFileStream())
        {
            var rates = LoadRatesFile(stream);
            // Apply any other update logic here
            WriteRatesToFile(rates, stream);
        }
    }
}

private Stream GetRatesFileStream()
{
    return File.Open("rates.txt", FileMode.OpenOrCreate, FileAccess.ReadWrite, FileShare.ReadWrite);
}

private IEnumerable<Rate> LoadRatesFile(Stream stream)
{
    // Apply any other logic here
    return RatesSerialiser.Deserialise(stream);
}

private void WriteRatesToFile(IEnumerable<Rate> rates, Stream stream)
{
    RatesSerialiser.Serialise(rates, stream);
}

private static void Attempt(Action work, int maximumNumberOfAttempts, int timeToWaitBetweenRetriesInMs)
{
    var numberOfFailedAttempts = 0;
    while (true)
    {
        try
        {
            work();
            return;
        }
        catch
        {
            numberOfFailedAttempts++;
            if (numberOfFailedAttempts >= maximumNumberOfAttempts)
                throw;
            Thread.Sleep(timeToWaitBetweenRetriesInMs);
        }
    }
}
What you're trying to do is difficult bordering on impossible. I won't say that it's impossible because there is always a way, but it's better not to try to make something work in a way it wasn't intended to.
And even if you get it to work and you could ensure that applications on multiple servers don't overstep each other, someone could write some other process that locks the same file because it doesn't know about the system in place for gaining access to that file and playing well with the other apps.
You could check to see if the file is in use before opening it, but there's no guarantee that another server won't open it in between when you checked and when you tried to open it.
The ideal answer is not to try to use a file as a database accessed by multiple applications concurrently. That's exactly what databases are for: they can handle multiple concurrent requests to read and write records. Sometimes we use files for logs or other data, but if you've got applications running on three servers then you really need a database.
I'm working with .NET 3.5 with a simple handler for HTTP requests. Right now, on each HTTP request my handler opens a TCP connection to 3 remote servers in order to receive some information from them. It then closes the sockets and writes the server status back to Context.Response.
However, I would prefer to have a separate object that connects to the remote servers via TCP every 5 minutes, gets the information and keeps it. Then the handler, on each HTTP request, would be much faster, just asking this object for the information.
So my question here is: how do I keep a shared global object in memory all the time, one that can also "wake up" and do those TCP connections even when no HTTP requests are coming, and have the object accessible to the HTTP request handler?
A service may be overkill for this.
You can create a global object in your application start and have it create a background thread that does the query every 5 minutes. Take the response (or what you process from the response) and put it into a separate class, creating a new instance of that class with each response, and use System.Threading.Interlocked.Exchange to replace a static instance each time a response is retrieved. When you want to look at the response, simply copy a reference to the static instance to a stack reference and you will have the most recent data.
Keep in mind, however, that ASP.NET will kill your application whenever there are no requests for a certain amount of time (idle time), so your application will stop and restart, causing your global object to get destroyed and recreated.
You may read elsewhere that you can't or shouldn't do background stuff in ASP.NET, but that's not true--you just have to understand the implications. I have similar code to the following example working on an ASP.NET site that handles over 1000 req/sec peak (across multiple servers).
For example, in global.asax.cs:
public class BackgroundResult
{
    public string Response; // for simplicity, just a public field for this example--for a real implementation, public fields are probably bad
}

class BackgroundQuery
{
    private BackgroundResult _result; // interlocked
    private readonly Thread _thread;

    public BackgroundQuery()
    {
        _thread = new Thread(new ThreadStart(BackgroundThread));
        _thread.IsBackground = true; // allow the application to shut down without errors even while this thread is still running
        _thread.Name = "Background Query Thread";
        _thread.Start();
        // maybe you want to get the first result here immediately?? Otherwise, the first result may not be available for a bit
    }

    /// <summary>
    /// Gets the latest result. Note that the result could change at any time, so do NOT expect to reference this directly and get the same object back every time--for example, if you write code like: if (LatestResult.IsFoo) { LatestResult.Bar }, the object returned to check IsFoo could be different from the one used to get the Bar property.
    /// </summary>
    public BackgroundResult LatestResult { get { return _result; } }

    private void BackgroundThread()
    {
        try
        {
            while (true)
            {
                try
                {
                    HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create("http://example.com/samplepath?query=query");
                    request.Method = "GET";
                    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
                    {
                        using (StreamReader reader = new StreamReader(response.GetResponseStream(), System.Text.Encoding.UTF8))
                        {
                            // get what I need here (just the entire contents as a string for this example)
                            string result = reader.ReadToEnd();
                            // put it into the results
                            BackgroundResult backgroundResult = new BackgroundResult { Response = result };
                            System.Threading.Interlocked.Exchange(ref _result, backgroundResult);
                        }
                    }
                }
                catch (Exception ex)
                {
                    // the request failed--catch here and notify us somehow, but keep looping
                    System.Diagnostics.Trace.WriteLine("Exception doing background web request:" + ex.ToString());
                }

                // wait for five minutes before we query again. Note that this is five minutes between the END of one request and the start of another--if you want 5 minutes between the START of each request, this will need to change a little.
                System.Threading.Thread.Sleep(5 * 60 * 1000);
            }
        }
        catch (Exception ex)
        {
            // we need to get notified of this error here somehow by logging it or something...
            System.Diagnostics.Trace.WriteLine("Error in BackgroundQuery.BackgroundThread:" + ex.ToString());
        }
    }
}

private static BackgroundQuery _BackgroundQuerier; // set only during application startup

protected void Application_Start(object sender, EventArgs e)
{
    // other initialization here...
    _BackgroundQuerier = new BackgroundQuery();
    // get the value here (it may or may not be set quite yet at this point)
    BackgroundResult result = _BackgroundQuerier.LatestResult;
    // other initialization here...
}
I have created my own implementation (pretty straightforward) in order to talk to a REST service. The code for GET requests can be found below. However, I would like to hear if there are any obvious pitfalls in my code that make the requests perform worse than they could. They work decently at the moment, but I have a feeling I could have done a better job.
Any feedback would be greatly appreciated!
public static void Get<T>(string url, Action<Result<T>> callback, NetworkCredential credentials = null, JsonConverter converter = null)
{
    // Checks for no internet
    if (!NetworkInterface.GetIsNetworkAvailable())
    {
        callback(new Result<T>(new NoInternetException()));
        return;
    }

    // Sets up the web request for the given URL (REST call)
    var webRequest = WebRequest.Create(url) as HttpWebRequest;

    // Makes sure we'll accept gzip-encoded responses
    webRequest.Headers[HttpRequestHeader.AcceptEncoding] = "gzip";

    // If any credentials were sent, attach them to the request
    webRequest.Credentials = credentials;

    // Queues things up in a thread pool
    ThreadPool.QueueUserWorkItem((object ignore) =>
    {
        // Starts receiving the response
        webRequest.BeginGetCompressedResponse(responseResult =>
        {
            try
            {
                // Fetches the response
                var response = (HttpWebResponse)webRequest.EndGetResponse(responseResult);

                // If there _is_ a response, convert the JSON
                if (response != null)
                {
                    // Gives us a standard variable to put stuff into
                    var result = default(T);

                    // Creates the settings object to insert all custom converters into
                    var settings = new JsonSerializerSettings();

                    // Inserts the relevant converters
                    if (converter != null)
                    {
                        if (converter is JsonMovieConverter)
                        {
                            settings.Converters.Add(new JsonMovieListConverter());
                        }
                        settings.Converters.Add(converter);
                    }

                    // Depending on whether or not the response is gzip-encoded, deserialize from JSON in the correct way
                    if (response.Headers[HttpRequestHeader.ContentEncoding] == "gzip")
                    {
                        var gzipStream = response.GetCompressedResponseStream();
                        result = JsonConvert.DeserializeObject<T>(new StreamReader(gzipStream).ReadToEnd(), settings);
                    }
                    else
                    {
                        result = JsonConvert.DeserializeObject<T>(new StreamReader(response.GetResponseStream()).ReadToEnd(), settings);
                    }

                    // Close the response
                    response.Close();

                    // Launch callback
                    callback(new Result<T>(result));
                }
            }
            catch (Exception ex) // Deals with errors
            {
                if (ex is WebException && ((WebException)ex).Response != null && ((HttpWebResponse)((WebException)ex).Response).StatusCode == HttpStatusCode.Unauthorized)
                {
                    callback(new Result<T>(new UnauthorizedException()));
                }
                else
                {
                    callback(new Result<T>(ex));
                }
            }
        }, webRequest);
    });
}
In general this code should be quite self-explanatory, but here are a few more facts:
I am using Delay's optimized gzip decoder, which provides me with the method GetCompressedResponse() (basically the same as the original method).
I have created some JSON.NET custom JsonConverter classes in order to deserialize my JSON correctly. These are fairly simple and don't affect performance.
The Result class is simply a wrapper class for my results (it contains a Value and an Error field).
I don't know JSON.net, but is there a form that takes a stream or a streamreader rather than forcing you to read the entire string into memory first? That's rather wasteful if the streams could be large, though it'll make no difference if they're all small.
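(For what it's worth, JSON.NET can deserialize directly from a stream via JsonTextReader; a minimal sketch, assuming the same settings object as in the question:)
using (var reader = new StreamReader(response.GetResponseStream()))
using (var jsonReader = new JsonTextReader(reader))
{
    // Deserialize straight from the response stream, avoiding the intermediate string
    var serializer = JsonSerializer.Create(settings);
    result = serializer.Deserialize<T>(jsonReader);
}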
There's been an HttpWebRequest.AutomaticDecompression property since .NET 2.0 that can simplify your code (in fairness, I'm always forgetting about that myself).
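A minimal sketch of that property in use:
var request = (HttpWebRequest)WebRequest.Create(url);
// The framework sends the Accept-Encoding header and transparently
// decompresses the response stream - no manual gzip handling needed.
request.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;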
You can use the CachePolicy property to have the request use the IE cache, which can be a big saving if you'll hit the same URIs and the server handles it appropriately (appropriate max-age, correct handling of conditional GET). It also allows some flexibility - e.g. if your use case has a high requirement for freshness, you can use the Revalidate level so you'll always contact the server even when the max-age suggests the server shouldn't be contacted, but you can still act on a 304 appropriately (it is presented to your code as if it were a 200, so you don't need to rewrite everything).
You could even build an object cache on top of this, where you use the IsFromCache property to know whether it's safe to use the cached object, or if you need to rebuild it because the data it was built from has changed. (This is really sweet, actually: there's the famous line about cache invalidation being a hard problem, and this lets us pass the buck for that hard bit down to the HTTP layer, while having the actual cached items living in the .NET layer and not needing to be deserialised again. It's a bit of work, so don't do it if you won't have frequent cache hits due to the nature of your data, but where it does work, it rocks.)
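A minimal sketch of both ideas together, assuming the System.Net.Cache types are available on your platform:
var request = (HttpWebRequest)WebRequest.Create(url);
// Always revalidate with the server; a 304 is then served from the IE cache
request.CachePolicy = new HttpRequestCachePolicy(HttpRequestCacheLevel.Revalidate);

using (var response = (HttpWebResponse)request.GetResponse())
{
    if (response.IsFromCache)
    {
        // body unchanged - safe to reuse a previously deserialised object
    }
    else
    {
        // fresh data - deserialise and update the object cache
    }
}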