Occasionally our office printer craps out on us in the middle of a print job, or someone simply forgets to print because they get interrupted. In the good ol' days, I built up my response using a StringBuilder and wrote the contents both to the screen and to a log file, in case we ever needed to go back and re-print.
Now I'm working with a system that makes use of all the .NET yumminess (Repeaters, page events, etc.) rather than building up the HTML in code. Is there a way for me to log/archive the entire HTML response generated by the server for a particular page (e.g. hook into the page's render stage and dump the output to a file)?
Thanks,
Sam
You can write an HTTP module (IHttpModule) that plugs in at the end of the request pipeline and records the complete output.
See this example that should get you most of the way there.
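One hedged sketch of the recording part: an IHttpModule can swap HttpResponse.Filter for a stream that tees everything written to the response into a log. The module wiring itself is only outlined in a comment (it also needs a web.config registration), and TeeFilterStream is an invented name:

```csharp
using System;
using System.IO;

// A write-through stream that copies everything written to it into a
// second "log" stream. In an IHttpModule you would hook it up per request:
//     context.Response.Filter =
//         new TeeFilterStream(context.Response.Filter, File.Create(logPath));
// (The IHttpModule wiring is omitted so the class stays testable on its own.)
public class TeeFilterStream : Stream
{
    private readonly Stream _inner; // the original response filter chain
    private readonly Stream _log;   // the archive destination

    public TeeFilterStream(Stream inner, Stream log)
    {
        _inner = inner;
        _log = log;
    }

    public override void Write(byte[] buffer, int offset, int count)
    {
        _inner.Write(buffer, offset, count); // pass through to the client
        _log.Write(buffer, offset, count);   // and archive a copy
    }

    public override void Flush() { _inner.Flush(); _log.Flush(); }

    public override bool CanRead => false;
    public override bool CanSeek => false;
    public override bool CanWrite => true;
    public override long Length => _inner.Length;
    public override long Position
    {
        get { return _inner.Position; }
        set { _inner.Position = value; }
    }
    public override int Read(byte[] buffer, int offset, int count)
        => throw new NotSupportedException();
    public override long Seek(long offset, SeekOrigin origin)
        => throw new NotSupportedException();
    public override void SetLength(long value)
        => throw new NotSupportedException();
}
```

Since the filter sees the final rendered bytes, this captures Repeater output and everything else the page emits, without touching the page code.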
Related
I have this specific scenario:
The user sends me a request containing a URL to a file in my private repository.
The server catches this request and downloads the file.
The server performs some calculations on the downloaded file.
The server sends the results back to the client.
I implemented this in the naive way, which means I download the file (step 2) for each request. In most cases the user will send the same file, so I thought of a better approach: keep the downloaded file in a short-term cache.
That means I download the item once and use it for every subsequent user request.
Now the question is: how do I manage those files?
In a perfect world, I would use the downloaded file for up to 30 minutes; after that, I wouldn't use it any more. So the candidate solutions are:
Build a file-system mechanism for handling short-lived files. Downside: a complex solution.
Use a temporary directory for the job (e.g. Path.GetTempFileName()). Downside: what if the system starts deleting those files while I'm in the middle of reading one?
So it seems each solution has its downsides. What do you recommend?
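A minimal sketch of a third option: keep the downloads in an in-memory cache with a fixed time-to-live, which sidesteps the temp-file deletion problem at the cost of RAM. All names here are illustrative, and the clock is passed in explicitly so the expiry logic is easy to test:

```csharp
using System;
using System.Collections.Generic;

// Minimal in-memory short-term cache, keyed by URL, with a fixed
// time-to-live. Illustrative only; a production version might use
// System.Runtime.Caching.MemoryCache instead.
public static class ShortTermCache
{
    private static readonly object Gate = new object();
    private static readonly Dictionary<string, Entry> Entries =
        new Dictionary<string, Entry>();

    public static TimeSpan TimeToLive = TimeSpan.FromMinutes(30);

    private class Entry
    {
        public byte[] Data;
        public DateTime StoredAtUtc;
    }

    public static void Put(string url, byte[] data, DateTime nowUtc)
    {
        lock (Gate)
        {
            Entries[url] = new Entry { Data = data, StoredAtUtc = nowUtc };
        }
    }

    // Returns the cached bytes, or null if missing or older than TimeToLive.
    public static byte[] Get(string url, DateTime nowUtc)
    {
        lock (Gate)
        {
            Entry e;
            if (!Entries.TryGetValue(url, out e)) return null;
            if (nowUtc - e.StoredAtUtc > TimeToLive)
            {
                Entries.Remove(url); // expired: evict and report a miss
                return null;
            }
            return e.Data;
        }
    }
}
```

The caller downloads only on a null result, then calls Put; since the cache owns the bytes, nothing can delete them mid-read.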
My web application has a specific component (widget) that connects to another server (outside my control) to read from an XML file.
Sometimes the admin of the server I connect to puts up a firewall or changes some configuration, and then, when my application tries to connect, it takes a long time before the widget comes up empty.
The problem is that the time spent trying to connect to that server is part of the page load time, and it feels wrong for the page request to take that long.
How can I determine whether I can connect to that server to read the data, or whether some issue is preventing me from doing so?
I don't quite understand what your widget is composed of, and thus why it blocks the page load. But two ways to decouple the widget from the page load are:
Put the widget in an iframe element.
First insert a placeholder for the widget (e.g. a div element with a "Loading..." text). Then, after the page has loaded, use JavaScript to replace the placeholder with the actual HTML.
Are you allowed to use this XML feed on your site? If not, they may be deliberately blocking your access to it.
However, I would cache the XML file locally and let a cron job (or scheduled task) regularly pull the newest version from the other server.
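One hedged way to combine the two suggestions (a short timeout plus a locally cached copy as fallback) might look like the sketch below. FeedReader and its members are illustrative names, and the actual HTTP fetch is passed in as a delegate; with HttpWebRequest you would set request.Timeout to a few seconds before calling GetResponse so a firewalled server fails fast:

```csharp
using System;

// "Try the remote feed, fall back to the last good copy." The fetch is a
// delegate so the fallback logic is independent of the transport.
public static class FeedReader
{
    public static string LastGoodXml; // refreshed on every successful fetch

    public static string GetXml(Func<string> fetch)
    {
        try
        {
            string xml = fetch();   // may throw on firewall/timeout
            LastGoodXml = xml;      // remember the last good copy
            return xml;
        }
        catch (Exception)
        {
            return LastGoodXml;     // stale data beats a hung page
        }
    }
}
```

The page render then never waits longer than the fetch timeout, and the widget degrades to slightly stale data instead of coming up empty.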
I have a page that downloads a large HTML file from another domain and then serves it to the user. The file is around 100 KB to 10 MB and the download usually takes about 5 minutes. I was thinking about doing something like this to improve the user experience:
download the file
if the file has not finished downloading within 10 seconds, display a page telling the user that the file is being downloaded
if the server completes the download in 1 second, serve the downloaded HTML directly
Can this be done? Do I need to use the async feature?
Update: the downloaded file is an HTML file.
In order to provide an 'asynchronous' file download, try a trick that Google uses: create a hidden iframe and set its source to the file you want to download. You can then still run JavaScript on your original page while the file is being downloaded through the iframe.
I think you should:
Return an HTML page to the user straight away, to tell them the transfer has started.
Start the download from the other domain in a separate process on your server.
Have the HTML from step 1 repeatedly reload, so you can check if the download has completed already, and possibly give an ETA or update to the user.
Return a link to the user when the initial transfer is complete.
It sounds like you need a waiting page that refreshes itself every so often and displays the status of your download. The download can run on a separate thread, using a System.Threading.Tasks.Task for instance.
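A minimal sketch of that pattern: the download runs on a background Task and publishes its state into a shared table that the refreshing page (or an AJAX poll) reads by job id. DownloadJobs and the status strings are invented for illustration; in a real app the table would live in application state or a database:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

// Background download with a pollable status table.
public static class DownloadJobs
{
    public static readonly ConcurrentDictionary<string, string> Status =
        new ConcurrentDictionary<string, string>();

    public static Task Start(string jobId, Func<string> download)
    {
        Status[jobId] = "downloading";
        return Task.Run(() =>
        {
            try
            {
                string html = download();         // the slow remote fetch
                Status[jobId] = "done:" + html;   // waiting page sees "done"
            }
            catch (Exception ex)
            {
                Status[jobId] = "failed:" + ex.Message;
            }
        });
    }
}
```

The waiting page's meta-refresh (or JavaScript timer) just reads Status[jobId] each time around and serves the HTML once the prefix flips to "done".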
I'm having an issue within my application Pelotonics. When a user downloads a file, the system seems to block all incoming requests until that file is done downloading. What is the proper technique to open a download dialog box (the standard one from the browser), let the user start downloading the file, and then, while the file is downloading, let the user continue using the application?
The way we get the file from the server: we have a separate ASPX page that gets passed a value through the query string, retrieves the file's stream from the server, adds the "content-disposition" header to the Response, and then loops through the file's stream, writing 2 KB chunks to Response.OutputStream. Once that's done, I call Response.End.
Watch this for a quick screencast on the issue:
http://www.screencast.com/users/PeloCast/folders/Jing/media/8bb4b1dd-ac66-4f84-a1a3-7fc64cd650c0
by the way, we're in ASP.NET and C#...
Thanks!!!
Daniel
I think ASP.NET allows only one simultaneous page execution per session when session state is writable, and I'm not aware of any way to configure this otherwise, short of marking the download page's session state read-only.
This is not a very pretty workaround, but it might help if you rewrote the ASP.NET_SessionId value in the request cookie in Application_BeginRequest (in global.asax). Of course, you would then need to handle authentication some other way. I haven't tried this, though.
Another way would be launching a separate thread for the download process, but you would need to find a way to do this without the worker thread closing its resources.
May I ask, is there a reason why you don't just use HttpResponse.TransmitFile?
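For reference, the hand-rolled 2 KB loop the question describes reduces to a plain stream-to-stream copy like the sketch below. In ASP.NET you would pass the file stream and Response.OutputStream (flushing periodically), whereas HttpResponse.TransmitFile does effectively this for you without buffering the whole file in memory:

```csharp
using System.IO;

// Stream-to-stream copy in fixed-size chunks; returns total bytes copied.
public static class ChunkedCopy
{
    public static long Copy(Stream source, Stream destination)
    {
        byte[] buffer = new byte[2048]; // 2 KB chunks, as in the question
        long total = 0;
        int read;
        while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            destination.Write(buffer, 0, read);
            total += read;
        }
        return total;
    }
}
```

Note the loop itself is not what blocks other requests; the per-session lock on writable session state is, which is why a read-only session or TransmitFile helps.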
I have a web page with a link to generate PDF reports. When the user clicks the link, the PDF generation process should start. Since the generated PDF is huge (>15 MB), the user cannot wait and has to proceed with his other activities. That means the following things would be happening simultaneously:
The PDF Generation process continues unabated
The user continues with browsing without any jolt
After the PDF generation is complete, the user should receive an email containing the download link.
Basically the implementation is:
The user clicks the generate-report button
Using AJAX, I make a call to a C# function, say generateReport()
The problem
When I do this, the user is not allowed to do anything until the entire process completes. Of course he can click different links, but he gets no response because the AJAX call is still executing.
How do I achieve this? I am a .NET (Framework 2.0) developer creating ASPX web pages in C#. I use JavaScript and AJAX (AjaxPro) to get rid of the postbacks typical of ASP.NET web applications.
In cases like this, you might want to consider splitting your PDF generation code out into a separate service, which your AJAX code could interact with to kick off the PDF creation. Once the service has created the PDF file, the service can email the user with the relevant info.
The AJAX code would use remoting to communicate with the service.
The concept you have, creating the reports and emailing them to the user, is a good one.
The client side of this should be quite simple: a basic call to indicate that a report needs to be created, i.e. save the request to a table (a report queue).
The actual creation of the report should not be triggered directly by any calls from the front end. Create a Windows Service that runs through the report queue, generating the PDF files and sending the emails.
As an added option, assuming the PDFs are not destroyed (i.e. it's not an email-only solution), an AJAX popup could be shown on the client, and the user could then go to a reports page and download the already generated file.
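The report-queue idea above can be sketched roughly as follows. ReportQueue is an invented name, and the generate-and-email step is injected as a delegate so the queue logic stands alone; a real Windows Service would block on the queue forever rather than draining it once:

```csharp
using System;
using System.Collections.Concurrent;

// Producer/consumer report queue: the web page only enqueues report ids;
// a background worker drains the queue and does the heavy lifting.
public static class ReportQueue
{
    private static readonly BlockingCollection<string> Queue =
        new BlockingCollection<string>();

    // Called from the page (or AJAX handler): cheap and non-blocking.
    public static void Enqueue(string reportId) => Queue.Add(reportId);

    // Called by the worker: processes everything currently queued.
    public static void DrainOnce(Action<string> generateAndEmail)
    {
        string id;
        while (Queue.TryTake(out id))
            generateAndEmail(id);
    }
}
```

Because the page returns as soon as Enqueue does, the user can keep browsing while the service churns through the PDFs, and the durable-table variant survives restarts where this in-memory version would not.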
You could try making an AJAX call on a timer to a function isReportDone(), which in turn checks a repository for the PDF. For generating the PDF, I think you'll need to move that task out of the normal request/response thread, either by spawning a separate thread (multi-threading is fun) or by passing the data out to a separate service on the server.
These are just two ideas, really. I had a similar issue generating a file from a DB that had to be forcibly downloaded. I ended up calling a separate page, in a separate window, that wrote the data to the response stream. It actually worked great, until the customer's network blocked any and all popups.
The user asks for a report to be generated (clicks a button)
The page kicks off an async service call to GeneratePdf(args).
The user sees a spinner of some sort while the PDF is being generated, so they know something is happening.
The PDF is generated (iTextSharp might come in handy here).
The PDF is stored somewhere. A database blob field would be ideal; that way the web service can just pass back the id of the new file. Failing that, pass a filename back to the AJAX code.
The web service's async complete event (see this one) fires, and the AJAX code throws up either a popup or redirects to a new page. Maybe you want to replace the spinner with a "Ready!" link.
A link to the PDF report file is emailed to the user.