I have a web part in MOSS 2007 that iterates through a given list using the SharePoint API and sets a field value to some predetermined value. Basically bulk editing a list.
For small lists, the web part works great. But when I am dealing with lists that have a large number of items, I get an internal error or a request timeout.
Is there a way that I can click the button and have the actions performed asynchronously in the background, so that the submitter doesn't have to wait for the operation to complete, or can come back for the result later?
How would I do this?
Thanks in advance; I need all the help I can get.
PS. There is no .aspx page in the project, just a straight-up class that overrides CreateChildControls and other methods.
PPS. I have very little control over the web.config or anything of that nature on the server itself.
You could build a custom timer job that performs the bulk editing; the web part would then just add items for the timer job to process.
If you can deploy a web service to the farm you could re-write your web part to use AJAX to call into the web service to perform the work.
Another option would be to have your web part spawn background threads, but I would be very wary of this option. Not only is managing threads difficult, but they will be running in SharePoint's IIS worker process thus increasing the opportunity for your web part to bring down the site.
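If the timer-job route fits, a minimal sketch might look like the following. Everything specific here is an assumption, not part of the question: the "BulkEditQueue" work-queue list, the site URL, and the field-editing step are all hypothetical placeholders.

```csharp
using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;

// Hypothetical timer job: drains a "BulkEditQueue" list that the web part
// writes work items into, applying the predetermined value for each entry.
public class BulkEditJob : SPJobDefinition
{
    public BulkEditJob() : base() { }

    public BulkEditJob(string name, SPWebApplication webApp)
        : base(name, webApp, null, SPJobLockType.Job)
    {
        Title = "Bulk Edit Job";
    }

    public override void Execute(Guid targetInstanceId)
    {
        using (SPSite site = new SPSite("http://server/sites/yoursite")) // assumed URL
        using (SPWeb web = site.OpenWeb())
        {
            SPList queue = web.Lists["BulkEditQueue"]; // hypothetical work queue
            // Walk backwards so deleting processed items doesn't shift indexes.
            for (int i = queue.ItemCount - 1; i >= 0; i--)
            {
                SPListItem work = queue.Items[i];
                // ... look up the target list item and set the field value here ...
                work.Delete(); // remove the work item once it has been handled
            }
        }
    }
}
```

The job would be registered once (for instance from a feature receiver) with a schedule such as an SPMinuteSchedule; the web part's button click then only inserts rows into the queue list, which returns to the user quickly.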
Related
I am working on an ASP.NET MVC project that should implement multi-threading functionality. In this application, a user can navigate from one page to another, so he can change the action of his current controller. My question is: is there a way in ASP.NET MVC to guarantee that an action keeps running in the background, even though the user has switched to another action? That means that even when he returns to the view after navigating away, he can still get what he launched in his current session (knowing that it may take a bit of time to complete). I know that this is somewhat contradictory to the MVC pattern, but this application should be a server-side application.
I did some research and it seems that a thread pool and asynchronous controllers (http://msdn.microsoft.com/en-us/library/ee728598%28v=vs.100%29.aspx#performing_multiple_operations_in_parallel) may be a solution to this problem. I would be glad to hear any other suggestions to help me implement this project in the right way.
I think you should revisit the premise that this should be implemented as a parallel pattern.
For this to work the way I think you want it to, you will need a shared cache keyed by session IDs. Your asynchronous tasks would store their results there. You will need some middleware that is initialized when the app initializes; this middleware would consist of a managed thread pool and a buffered queue of tasks. Your UI threads/web server threads would queue a task for the middleware to handle, and the middleware would dump the results in the shared cache, which you would then check on subsequent web requests from that client. That's a lot of work, especially if your client and server side applications are already intimately tied together, as they usually are in ASP.NET.
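As a rough illustration of that middleware, here is a minimal sketch of a task queue whose results land in a shared cache keyed by session ID. All names are hypothetical, and the "managed thread pool" is simplified down to Task.Run for brevity:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

// Minimal sketch: background tasks drop their results into a shared,
// thread-safe cache keyed by session ID, for pickup by later web requests.
public static class BackgroundWorkCache
{
    private static readonly ConcurrentDictionary<string, object> Results =
        new ConcurrentDictionary<string, object>();

    // Queue work for a session; the result is cached when the task completes.
    public static Task Enqueue(string sessionId, Func<object> work)
    {
        return Task.Run(() => Results[sessionId] = work());
    }

    // Called from a later request to see whether the result is ready yet.
    public static bool TryGetResult(string sessionId, out object result)
    {
        return Results.TryGetValue(sessionId, out result);
    }
}
```

A controller action would call Enqueue and return immediately; a later request (or a polling action) calls TryGetResult with the same session ID to see whether the work has finished.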
Or, you could move the application away from ASP.NET and implement the server side as a REST API in C# that your client application, written in JavaScript, hits using Ajax requests. You build the client app as a single-page app in some JS MVC framework. This allows the user to use the client app in a seamless experience, as calls to the server are non-blocking unless the client app wants them to be. Then there's really no need for the asynchronous patterns you mentioned above, which, honestly, are not going to give you any sort of performance gain and are not going to scale well.
I am writing an MVC webAPI that will be used to return values that will be bound to dropdown boxes or used as type-ahead textbox results on a website, and I want to cache values in memory so that I do not need to perform database requests every time the API is hit.
I am going to use the MemoryCache class and I know I can populate the cache when the first request comes in but I don't want the first request to the API to be slower than others. My question is: Is there a way for me to automatically populate the cache when the WebAPI first starts? I see there is an "App_Start" folder, maybe I just throw something in here?
After the initial population, I will probably run an hourly/daily request to update the cache as required.
MemoryCache:
http://msdn.microsoft.com/en-us/library/system.runtime.caching.memorycache.aspx
UPDATE
Ela's answer below did the trick; basically I just needed to look at the capabilities of Global.asax.
Thanks for the quick help here, this has spun up a separate question for me about the pros/cons of different caching types.
Pros/Cons of different ASP.NET Caching Options
You can use the Global.asax Application_Start method to initialize resources, basically resources which will be used application-wide.
The following link should help you to find more information:
http://www.asp.net/web-forms/tutorials/data-access/caching-data/caching-data-at-application-startup-cs
Hint:
If you use in-process caching (which is usually the case if you cache something within the web context/thread), keep in mind that your web application is controlled by IIS.
The standard IIS configuration will shut down your web application after 20 minutes if no user requests have to be served.
This means that any resources you hold in memory will be freed.
After this happens, the next time a user accesses your web application, Application_Start in Global.asax will be executed again, because IIS reinitializes your web application.
If you want to prevent this behaviour, you can either configure the application pool's idle timeout to not time out after 20 minutes, or use a different cache strategy (persistent cache, distributed cache, ...).
To configure IIS for this, here you can find more information:
http://brad.kingsleyblog.com/IIS7-Application-Pool-Idle-Time-out-Settings/
Here's an API server which can give me real-time news: every minute there will be something new to retrieve. There's also my web page, with JavaScript that asks the API for news once every minute.
And this is not fine... unless my web page is made for a single user and will be open on only one machine at a time (which is not the case on the internet). The API, in fact, restricts the number of calls I can make per minute: let's suppose the API will ban me if I make more than 1 call per minute. If 100 users load my web page, the API will receive 100 calls per minute (!!!).
Since the flow is my web page >> calls >> the API, I think there is no solution without inserting another node which lazy-loads from the API server.
my web page >> calls >> my server >> calls every minute >> the API
Since the instances of my web page may be many while my server is just one I think this is the solution.
However, I have no idea if:
a) is this the correct solution? Or could I somehow get my web page to behave correctly without the need of an intermediary server?
b) how can I implement this in ASP.NET MVC4? Is there any support for server side timers?
c) even if I can get IIS to retrieve the data every minute, should I then store it in a database to serve it to my web page?
d) The API server I'm talking about is The Times Newswire API. If anyone has ever used a similar API, did you really create a domain model, a database table, a service and a routine just to retrieve data from it, or did you just write some JavaScript code in the web page? What then if you have millions of users?
You can use SignalR for this purpose. It is a push service which works using sockets and can therefore be configured to send out one message to 1,000,000 listeners (or more, obviously).
I've used this to great effect when creating a little prototype game last year and found it to be very reliable. You can use NuGet to grab the package in vs2010 and vs2012.
See ASP.NET SignalR, look at the examples, or simply google SignalR and you'll find a host of examples.
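To connect SignalR to this scenario: a single server-side job fetches the news once a minute and pushes it out to every connected browser. A minimal sketch (hub name, method name, and the broadcaster are all assumptions) could look like:

```csharp
using Microsoft.AspNet.SignalR;

// Hypothetical hub; clients connect to it but it needs no server methods here.
public class NewsHub : Hub { }

// Called from a server-side timer callback, e.g. started in Application_Start.
public static class NewsBroadcaster
{
    public static void Broadcast(string latestNews)
    {
        var context = GlobalHost.ConnectionManager.GetHubContext<NewsHub>();
        // "newsUpdated" is whatever handler name the JavaScript client registers.
        context.Clients.All.newsUpdated(latestNews);
    }
}
```

This way your server makes exactly one API call per minute regardless of how many users have the page open, which is the whole point of the intermediary.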
the API will ban me if I do more than 1 call per minute
Then you need a solution that calls the API for you every minute and stores the result on your server. There are tons of ways of doing this, depending on your requirements. You can even go as far as writing out a static HTML file which you then serve to the client.
You should call the API from your server side. Fetch it once every minute into your database, then serve it to the users from your database via, for example, a RESTful service. ASP.NET MVC 4 can do all of this.
Also, yes, there are server-side timers: check out the Timer class.
You won't have to struggle with the API restrictions with this solution.
A) The best solution is probably to use your web server as an intermediary to the API.
B) There are a lot of possibilities to choose from. The easiest would be starting a new thread when the web application starts (i.e. in the Global.asax OnApplicationStarted event), and have that thread poll the external API and store the result for your clients to fetch. Another option, if you want full control of the lifecycle of this background process, would be to create a Windows service and host, for instance, a Web API in it that your clients can connect to.
Of course, these are just suggestions.
C) Depends on whether you want your clients to be able to access the latest fetched data even if the background process has failed and the data is old. Otherwise, you can just store it in a static field somewhere. As long as the application isn't terminated, the static field keeps its value.
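The "background poll into a static field" approach can be sketched like this; the class, the fetch delegate, and the one-minute interval are illustrative, not a definitive implementation:

```csharp
using System;
using System.Threading;

// Sketch: a server-side timer, started once at application start (e.g. from
// Global.asax), refreshes a static field on a fixed interval.
public static class NewsPoller
{
    private static Timer _timer;          // kept in a field so it isn't collected
    private static string _latest;        // the "static field somewhere"

    public static string Latest { get { return _latest; } }

    public static void Start(Func<string> fetchFromApi, TimeSpan interval)
    {
        _latest = fetchFromApi(); // fetch once immediately at startup
        _timer = new Timer(_ => _latest = fetchFromApi(), null, interval, interval);
    }
}
```

At startup you would call something like NewsPoller.Start(CallExternalApi, TimeSpan.FromMinutes(1)), and your own controller actions simply return NewsPoller.Latest to the browsers.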
I'm working on an ASP.NET application in which I have multiple web parts on a page. Each web part has its own data source that it pulls data from and displays. When a user loads the page, I want the page to load instantly and have all the web parts render their data asynchronously.
But I see that using Ajax I cannot initiate multiple requests simultaneously. I can queue the requests and process them one by one, but that is not going to help in any way, since if any request takes a long time, all the subsequent requests have to wait.
Any ideas on how to achieve this behavior?
I would like to set up a timer to fetch the page source of some page and compare it against yesterday's version to see whether anything has changed.
What logic should I use to call some page once a day? My program is just a web page, and it can't send requests every 24 hours. How can I send the request and check whether anything has changed?
You don't want to use a web application to do this, since a web application typically responds to requests and doesn't wake up and make requests of its own (which is what you need).
What you need is a regular .NET application. It could be a console application that makes the call out to this other website. You could use the WebClient class or similar to do the job.
Once you have it all working, you can use Windows Scheduler to schedule the task at whatever interval you need.
If you then need to communicate with your web application from the console app, you can do the same thing, that is, make a request to a specific page or handler from your console app (just like you called a page on the other website). When your web application receives the request, you can act on it and do what you need to.
Keep in mind that if all you need to do is update some database tables, you might as well do that from the console application.
Think of your console app (that uses WebClient) as a custom browser. You can call any URL you need to, and the web application on the other end sees that call as if it were made by a browser. So using your specialized "browser" you can call other websites as well as your own.
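Putting the pieces together, the console app could look roughly like this. The URL, the snapshot file name, and the hash-based comparison are assumptions; Windows Scheduler would run it once a day:

```csharp
using System;
using System.IO;
using System.Net;
using System.Security.Cryptography;
using System.Text;

// Sketch of the daily checker: download the page, compare it with yesterday's
// saved copy, then save today's copy for tomorrow's run.
public static class PageChecker
{
    // True when the two page sources differ (compared via SHA-256 hashes).
    public static bool HasChanged(string todaySource, string yesterdaySource)
    {
        return Hash(todaySource) != Hash(yesterdaySource);
    }

    private static string Hash(string text)
    {
        using (var sha = SHA256.Create())
            return Convert.ToBase64String(sha.ComputeHash(Encoding.UTF8.GetBytes(text)));
    }

    // Call this from the console app's Main; scheduled daily by Windows Scheduler.
    public static void Run()
    {
        const string url = "http://example.com/page"; // page to watch (placeholder)
        const string snapshot = "yesterday.html";     // previous copy on disk

        string today;
        using (var client = new WebClient())
            today = client.DownloadString(url);

        string yesterday = File.Exists(snapshot) ? File.ReadAllText(snapshot) : "";
        if (HasChanged(today, yesterday))
            Console.WriteLine("Page changed since last run."); // or notify your web app

        File.WriteAllText(snapshot, today); // keep today's copy for tomorrow
    }
}
```

The "notify your web app" step would be another WebClient call to a handler on your own site, exactly as described above.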