I recently started a little project to explore the world of JS and HTML5.
I tried a few months ago already, but I stopped because I didn't have enough time to create an MVC single page application from scratch. There were too many concepts and patterns that had to be understood, and I would only have lost all that knowledge again from lack of use in my daily work. Use it or lose it!
Yesterday I found this post on John Papa's blog and thought it would be great to use as a start. Basically it's an MVC template called HotTowel, which already implements great concepts like data binding, minification and so forth. I would explore the code as far as I needed for the moment, and dig deeper as soon as I had to.
I'd like to build an application that fetches data from my work's existing data model project. In our Silverlight application, we bootstrap it by preloading and initializing dictionaries and other properties and calling async Init() methods (e.g. for downloading XML files containing custom codes and putting them into dictionaries). MEF is used to get rid of unwieldy dependencies.
As far as I understand, server-side initialization has to be done in the Application_Start() method in Global.asax. I wonder how I'd await async calls in this method? What are the best practices? My queries on the client side heavily rely on these classes being initialized. What options are there?
My thoughts were the following:
Application_Start() fires and forgets the async initialization process. If a request hit a controller before the initialization finished, I'd have to wait for the callback of the initialization process and start the queries as soon as it arrives. The advantage of this would be that initialization runs while the user can already navigate through the application.
I'd implement some kind of lazy initialization and run the initialization as soon as the first request is made. That may make the first request take a long time, though.
I'd run the initialization process synchronously in Application_Start(). The major disadvantage I've seen so far is that the browser window appears frozen to the user. I'd be comfortable with this solution if it were possible to let the user keep track of the current initialization status (some kind of splash screen).
I don't know how any of these would work concretely, though, and I'd be glad if any of you could give me some advice on how and where to start.
You can use a Task<MyDataModel> to represent the data.
static Task<MyDataModel> dataTask;

public static async Task<MyDataModel> LoadDataModelAsync()
{
    var ret = new MyDataModel();
    await ret.Init();
    return ret;
}
Kick it off in Application_Start (or a static constructor):
dataTask = LoadDataModelAsync();
Then each of your actions that needs the data can await the task:
MyDataModel data = await dataTask;
...
If it's already complete, the await will detect that and continue (synchronously).
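For example, a Web API action could consume it roughly like this (just a sketch: the controller name and the CustomCodes property are illustrative, and it assumes dataTask is reachable from the controller, e.g. exposed as a public static member wherever you declared it):

using System.Collections.Generic;
using System.Threading.Tasks;
using System.Web.Http;

public class CodesController : ApiController
{
    public async Task<IEnumerable<string>> Get()
    {
        // If initialization already finished, the await continues synchronously;
        // otherwise the request waits without blocking a thread pool thread.
        MyDataModel data = await dataTask;

        return data.CustomCodes.Keys; // "CustomCodes" stands in for your own property
    }
}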
Related
I know this is a general question; however, I want to get a little advice before investing a lot of time looking into something that may not be suitable.
I have a C# .NET MVP web application. It needs to run a new complex/heavy query when a user logs in (to find new messages for the user), but it needs to do so without impacting performance. I have looked into multithreading in the past; however, the page would still need to wait until the new query completes, and it adds a lot of complexity to the solution (we do not use multithreading currently).
I am wondering if Ajax would be a solution: once the screen loads, is it possible to kick off an Ajax call after page load that would execute and refresh part of the screen with the results?
How does the application handle the user navigating to a different screen? (Note: I planned to add the Ajax component to the master/base page, which is part of every screen.)
I would appreciate feedback; if people think it's a possible approach I'll do a proof of concept to prove it out.
If the query doesn't need to return a result to the client side, you can use HostingEnvironment.QueueBackgroundWorkItem to start the task and return to the client straight away.
If the query needs to return a result to the client side, then you need to use an Ajax call from client to server and update the view with the data returned. You may want to have a look at a client-side framework such as AngularJS.
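For the first case, a rough sketch of the QueueBackgroundWorkItem approach (requires .NET 4.5.2 or later; messageService and FindNewMessagesAsync are placeholders for your own query code):

using System.Web.Hosting;
using System.Web.Mvc;

public ActionResult Index()
{
    string userName = User.Identity.Name;   // capture request data up front

    // Queued work is tracked by ASP.NET, which tries to delay app pool shutdown
    // until it finishes (not guaranteed); the request itself returns immediately.
    HostingEnvironment.QueueBackgroundWorkItem(async cancellationToken =>
    {
        await messageService.FindNewMessagesAsync(userName, cancellationToken);
    });

    return View();
}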
I've read myself blue and am hoping there's a simple answer.
I have a web API that handles telemetry from various apps "in the wild". In one of my controllers, I want to receive a request to log an error to my central monitoring database and return a response as near to immediately as possible (I have no real way of knowing how critical performance might be on the caller's end, and there's already a significant hit for making the initial web service request).
Essentially, what I'm looking for is something like this:
public IHttpActionResult Submit()
{
    try
    {
        var model = MyModel.Parse(Request.Content.ReadAsStringAsync().Result);

        // ok, I've got content, now log it but don't wait
        // around to see the results of the logging, just return
        // an Accepted result and begone
        repository.SaveSubmission(model); // <-- fire and forget, don't wait

        return Accepted();
    }
    catch (Exception)
    {
        return InternalServerError();
    }
}
It seems like it ought to be straightforward, but apparently not. I've read any number of posts indicating everything from "yup, just use Task.Run()" to "this is a terrible mistake and you can never achieve what you want!"
The problem in my scenario appears to be that this process could be terminated mid-work because it runs on the ASP.NET worker process, regardless of the mire of different ways to invoke async methods (I've spent the last two hours or so reading various SO questions and Stephen Cleary blogs... whew).
If the underlying issue in this case is that the method I'd 'fire and forget' is bound to the HTTP context and subject to early termination by the ASP.NET worker process, then my question becomes...
Is there some way to remove this method/task/process from that ASP.NET context? Once the request is parsed into the model, I have no more need to be operating within the HTTP context. If there's an easy way to move it out of there (and thus let the thing run barring a website/app pool restart), that'd be great.
For the sake of due diligence, let's say I get rid of the repository context in the controller and delegate it to some other context:
public IHttpActionResult Submit()
{
    try
    {
        var model = MyModel.Parse(Request.Content.ReadAsStringAsync().Result);

        SomeStaticClass.SaveSubmission(model); // <-- fire and forget, don't wait

        return Accepted();
    }
    catch (Exception)
    {
        return InternalServerError();
    }
}
... then the only thing that has to "cross lines" is the model itself - no other code logic dependencies.
Granted, I'm probably making a mountain out of a molehill - the insert into the database won't take long anyway... it seems like it should be easy though, and I'm apparently too stubborn to settle for "good enough" tonight.
OK, I found a few more posts that were actually helpful to my scenario. The basic gist of it seems to be: don't do it.
To do this correctly, one needs to submit the work to a separate component in a distributed architecture (e.g., a message or service queue of some sort where it can be picked up separately for processing). This appears to be the only way to break out of the ASP.NET worker process entirely.
One SO comment (on another SO post) led me to two articles I hadn't seen before posting: one by Stephen Cleary and another by Phil Haack.
SO post of interest: How to queue background tasks in ASP.NET Web API
Stephen's Fire and Forget on ASP.NET blog post (excellent, wish I had found this first): http://blog.stephencleary.com/2014/06/fire-and-forget-on-asp-net.html
And Phil's article: http://haacked.com/archive/2011/10/16/the-dangers-of-implementing-recurring-background-tasks-in-asp-net.aspx/
The following project by Stephen may be of interest as well: https://github.com/StephenCleary/AspNetBackgroundTasks
I thought I'd delete my question, but then figured it took me so long digging around to find my answer that maybe another question floating around SO wouldn't hurt...
(In this particular case, submitting to another service is going to take nearly as long as writing to the database anyway, so I'll probably forgo the async processing for this API method, but at least now I know for when I actually do need it.)
A database insert shouldn't take so long that you have to offload the processing to a background task. For starters, just writing the task to a queue (or, as you suggested, handing it off to a service) is going to take just as long, but either approach should be sub-second.
However, if time is critical for you, one way to speed up your response time is to make the database write as fast as possible using some form of in-memory cache, so that the slower write to physical database storage becomes a queued background task. High-volume sites frequently use in-memory databases that implement this kind of behaviour (I've never needed one, so I can't help you choose a product), but you could also code this yourself using just a per-application-instance list of objects and a background loop of some form.
This is where the articles you've linked apply, and it gets complicated, so a pre-built implementation is almost always the best approach - check out Hangfire if you want a pre-built fire-and-forget implementation.
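For example, the Submit() action from the question could hand the work to Hangfire roughly like this (assumes Hangfire is installed and its storage configured; ISubmissionRepository is a placeholder for however you resolve the repository, and the model must be serializable):

using Hangfire;
using System.Web.Http;

public IHttpActionResult Submit()
{
    var model = MyModel.Parse(Request.Content.ReadAsStringAsync().Result);

    // Hangfire serializes the call and its arguments to persistent storage and
    // runs it on a background server, so it survives app pool recycles.
    BackgroundJob.Enqueue<ISubmissionRepository>(r => r.SaveSubmission(model));

    return Accepted();
}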
I am currently working on a project in ASP.NET MVC 4 and came across a module where a progress bar is needed. The question I am having right now is: "What is the best way to implement an async progress bar?"
After some research I came across the following method:
Create a startEvent() and getProgress() method in the C# code.
Use JavaScript's setTimeout() to call the getProgress() method asynchronously.
(Example: https://www.devexpress.com/Support/Center/Example/Details/E4244)
My remark with this method is that it makes the code dependent on the timeout you choose, so it would take some fiddling to find the best and most performant timeout.
Now, the method that I would most likely have used before I researched the matter is the following:
In the code-behind, create a method handleItem(int index) which takes an index and does everything you want to do with the item at that index.
Determine the number of items you want to handle and pass that to your JavaScript.
In JavaScript, initiate a for loop that loops from 0 to the amount - 1 and, for each index, initiates an Ajax call to handleItem(i).
In that Ajax call's complete handler, you can update the progress bar with the new amount (a rough server-side sketch follows).
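On the server side, handleItem(i) would map to something like this MVC action (a sketch; itemService is a placeholder for whatever actually processes an item):

[HttpPost]
public ActionResult HandleItem(int index)
{
    itemService.Process(index);   // do whatever work item 'index' needs

    // The Ajax complete handler uses the response to bump the progress bar.
    return Json(new { index });
}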
My questions here are the following:
Does this expose too much of the program logic?
Does this create too much overhead seeing as every call goes to the server and back?
Are there any other reasons why I should refrain from using this method?
Thanks in advance
Koen Morren
This is not a recommended strategy, because the client drives the process. If there is any loss of connectivity, or the user closes the browser, the process will stop.
Generally, if you use plain HTTP you will need to poll (aka pull) from JavaScript. The pseudo code is pretty much this:
The call creates a task ID and sends it to the client.
The client queries the status of the task with the given ID.
Another possibility is WebSockets, which allow your client to listen for changes pushed by the server.
There are many options for storing the progress of a given state. You can index the progress by the HttpContext, a task ID, or some user ID, or even store it in a database and use SqlDependency to get notified when the status changes.
In summary, polling has more lag than push mechanisms. Clients should not drive an asynchronous process; they should either be notified of, or given some mechanism to query, the status of an async process.
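A minimal sketch of the poll/pull flow in MVC (names are illustrative; progress is held in a static in-memory dictionary, which only works on a single server - use a database or cache in a web farm):

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;
using System.Web.Mvc;

public class ProgressController : Controller
{
    static readonly ConcurrentDictionary<Guid, int> Progress =
        new ConcurrentDictionary<Guid, int>();

    [HttpPost]
    public ActionResult Start()
    {
        var id = Guid.NewGuid();
        Progress[id] = 0;

        // Note: work started with Task.Run dies with an app pool recycle,
        // as discussed in the fire-and-forget question above.
        Task.Run(() =>
        {
            for (int i = 1; i <= 100; i++)
            {
                DoWorkItem(i);      // placeholder for the real per-item work
                Progress[id] = i;   // percent complete
            }
        });

        return Json(new { taskId = id });
    }

    [HttpGet]
    public ActionResult Status(Guid taskId)
    {
        int percent;
        Progress.TryGetValue(taskId, out percent);
        return Json(new { percent }, JsonRequestBehavior.AllowGet);
    }

    static void DoWorkItem(int i)
    {
        // placeholder for the real per-item work
    }
}

The JavaScript side calls Start() once, then polls Status(taskId) on a timer until percent reaches 100.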
Unlike ASP.NET Web Forms, MVC has few ways to push data from the server to the client; WebSockets or SignalR-like APIs can work for you.
The Ajax approach is good and gives you a reliable mechanism to update data no matter whether the user goes to another page or closes the browser; every time the Ajax call runs it will update the UI. So there is nothing wrong there, just use a fair interval in the JavaScript.
1. Does this expose too much of the program logic?
Code will be written only in the class file to calculate the current percentage.
2. Does this create too much overhead seeing as every call goes to the server and back?
No, Ajax calls are lightweight.
3. Are there any other reasons why I should refrain from using this method?
This method will allow the user to freely navigate to other resources, as the Ajax calls work independently.
This might be quite complex, so sorry for the wordy question.
1) I'm going to redesign my application now to work with multiple threads (BackgroundWorkers, to be precise). I will probably have 5 or 6 BackgroundWorkers for a particular GUI. My first issue is that I have one method call that a GUI needs in order to get its "core" data. Various parts of this data are then used for various other calls, which also produce data that is displayed on the same page as the core data. How can I split this across background workers so that BackgroundWorker1 gets the core data, BackgroundWorker2 uses a particular item of the core data to get more data, BackgroundWorker3 uses some other core data, and so on, thus leaving my GUI and main thread unblocked?
2) As I said previously, the GUI has to get a set of core data first and then make a fair few other database calls to get the rest of the important data. As I have read, I need to get this data outside of the GUI constructor so there aren't such big demands when the GUI is created. In a design sense, how should I construct my GUI so that it has access to data that just needs to be displayed on creation, as opposed to being accessed and then displayed?
I hope these aren't too wordy questions. I can already see that a lot of this comes down to program design, which as a novice I find quite difficult. Hopefully someone can advise me as to what they would do in this situation.
Thanks
This sounds like a good task for a work queue. The main idea is that you add a work item to the queue, and the work item has an associated function that does the work on the data. The work is typically distributed to however many threads you specify.
Several of these exist; just google for it.
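As a rough illustration of the idea (no particular library; uses BlockingCollection from System.Collections.Concurrent, and LoadCoreData/LoadDetailData stand in for your own calls):

// Work items here are plain delegates; a richer implementation would carry data.
var workQueue = new BlockingCollection<Action>();

// Start a few long-running consumers that process items as they arrive.
for (int i = 0; i < 3; i++)
{
    Task.Factory.StartNew(() =>
    {
        foreach (var work in workQueue.GetConsumingEnumerable())
            work();
    }, TaskCreationOptions.LongRunning);
}

// The GUI thread only enqueues work, so it stays responsive.
workQueue.Add(() => LoadCoreData());
workQueue.Add(() => LoadDetailData());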
Have you had a look at the .NET 4 Task Parallel Library?
Check out the area titled Creating Task Continuations, about halfway down the page.
This is an example from the linked site:
Task<byte[]> getData = new Task<byte[]>(() => GetFileData());
Task<double[]> analyzeData = getData.ContinueWith(x => Analyze(x.Result));
Task<string> reportData = analyzeData.ContinueWith(y => Summarize(y.Result));
getData.Start();
System.IO.File.WriteAllText(@"C:\reportFolder\report.txt", reportData.Result);

// or...
Task<string> reportData2 = Task.Factory.StartNew(() => GetFileData())
    .ContinueWith((x) => Analyze(x.Result))
    .ContinueWith((y) => Summarize(y.Result));
System.IO.File.WriteAllText(@"C:\reportFolder\report.txt", reportData2.Result);
I have code in my ASP.NET page where I am inserting some data into the database while uploading a file to the server. The problem is, it seems that the application is waiting for the file to be uploaded before it will insert into the database. Below is code similar to mine.
public partial class _Default : System.Web.UI.Page
{
    protected HtmlInputFile XLSFileInput;
    ...

    protected void ImportButton_Click(object sender, EventArgs e)
    {
        InsertToDatabase(); // method to insert to database
        XLSFileInput.PostedFile.SaveAs(filePath + fileName);
    }
    ...
}
The problem here is that it seems that the InsertToDatabase() method is executing only after the file is uploaded to the server. Any help is appreciated.
This has NOTHING to do with IIS (web server) threading; it's more of an HTTP "problem".
The file is selected on the client, and then all of the request data (including the file) is posted to the server before any server code runs.
So the file is uploaded, but not yet saved, before InsertToDatabase(); is executed.
This behaviour can only be worked around by doing several posts (e.g. with Ajax), and that is probably not a good solution for you.
Tell us more about what you are trying to accomplish and we might come up with some better suggestions :).
I would not recommend trying to manually control threading in ASP.NET; that is a job best left purely to IIS.
IMO the better way to handle this in ASP.NET is either to invoke these requests through multiple AJAX operations from the browser, or to set up a WCF service that supports one-way operations. That way, when you call "InsertToDatabase" it executes a fire-and-forget operation against the WCF service sitting on top of your database: the call returns immediately and execution continues to the next line of code, while IIS runs the code that the service method calls on its own thread.
IMO using WCF is one of the most appropriate ways of handling threading in ASP.NET, since you can easily set any service method to be synchronous or asynchronous.
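A minimal sketch of what a one-way WCF operation looks like (System.ServiceModel; the contract name and the RecordData type are illustrative, and RecordData would be a [DataContract] type carrying whatever you insert):

[ServiceContract]
public interface ILoggingService
{
    // IsOneWay = true: the client's call returns as soon as the message is
    // dispatched, without waiting for the insert to complete.
    [OperationContract(IsOneWay = true)]
    void InsertToDatabase(RecordData data);
}

public class LoggingService : ILoggingService
{
    public void InsertToDatabase(RecordData data)
    {
        // the actual database insert happens here, on the service side
    }
}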
Edit:
Introduction to Building Windows Communication Foundation Services
What You Need To Know About One-Way Calls, Callbacks, And Events
The file upload is a part of the HTML form post. Since the upload is a part of the form post process, it's always going to happen before any of your server-side code executes. The PostedFile.SaveAs() call doesn't cause the upload to happen, it just saves what was already uploaded as part of the request.
If you absolutely need the database insert to happen before the upload begins, you could do as #Chris Marisic suggests and run the insert as an AJAX call prior to submitting the form.
That's generally how single-threaded applications work.
From your code, I am guessing you're using ASP.NET Web Forms.
You would have to consider sending the InsertToDatabase() operation off elsewhere to free up the program to do your file upload. Depending on your version, perhaps consider UpdatePanels as a quick and dirty way to achieve this.
You suggest these operations are separate; provide more details to help figure out what each task does and whether JavaScript is an option.
But your code sample indicates that InsertToDatabase() should be running before the file save.
If you are unsure of AJAX/JavaScript, you could use the thread pool to run your InsertToDatabase() method. However, it all depends on what that method does, so please provide more code/details. I personally use this for a database insert which happens in the background (a logging action filter in ASP.NET MVC), so other users may disagree on the validity of this usage. However, it might save you learning another language.
ThreadPool.QueueUserWorkItem(delegate
{
    // Code here
});
You could start another thread for the database insert. For Example:
ThreadStart job = new ThreadStart(InsertToDatabase);
Thread thread = new Thread(job);
thread.Start();