I am experiencing a strange issue where some sections of the ASP.NET (ASPX pages) site take much longer to load than others. After the page and the master page go through all stages of the page life cycle and the code exits, it takes about 20 seconds for the application to reach the Application_EndRequest event. I am not sure what it is doing for those 20 seconds. I know this description is not very specific, so I am just asking for suggestions on how to debug the issue, or any helpful tips for finding out what the holdup is.
Thanks
You can try to use a code profiler. I used Stackify's Prefix in a similar situation. It's a free download.
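If you just want a quick sanity check before installing anything, you could also time the pipeline yourself from Global.asax. This is only a sketch (standard auto-wired Application_* handlers, with plain Debug.WriteLine logging) to confirm whether the 20 seconds falls between your page code finishing and EndRequest:

// Global.asax.cs - rough per-request timing, not production logging.
using System;
using System.Diagnostics;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        // Start a stopwatch for this request and stash it on the request context.
        Context.Items["RequestTimer"] = Stopwatch.StartNew();
    }

    protected void Application_PostRequestHandlerExecute(object sender, EventArgs e)
    {
        // Fires right after the page handler (your code-behind) has finished.
        LogElapsed("handler finished");
    }

    protected void Application_EndRequest(object sender, EventArgs e)
    {
        // If this is ~20s later than the line above, the delay happens after your
        // page code, e.g. in response filters, modules, or flushing the response.
        LogElapsed("end request");
    }

    private void LogElapsed(string stage)
    {
        var sw = Context.Items["RequestTimer"] as Stopwatch;
        if (sw != null)
        {
            Debug.WriteLine(Context.Request.Path + " - " + stage + ": " + sw.ElapsedMilliseconds + " ms");
        }
    }
}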
Related
I have a .NET web application. There is one page where we enter data and submit the form; we upload the attachment before submitting. The submit action takes a long time, almost a minute, for files with an attachment of about 650 KB. The code-behind is C#. We use a third-party API (Ektron), which is a CMS tool.
Please let me know all the ways I can analyse the bottleneck for this issue. Please suggest open-source tools and browser add-ons other than PageSpeed and YSlow.
Please check whether the time is spent initiating the request or waiting for the response to come back to your browser.
Only then can you look for a solution.
To answer the second half of your question: at the very least, most modern browsers (Firefox, Chrome and Safari) have a developer console that will give you a breakdown of the time spent in each request state, on a per-request basis. My personal preference is Firefox with Firebug, as I find the Network pane easy to interpret.
Redgate ANTS Performance Profiler is pretty much the bee's knees for troubleshooting performance problems in ASP.NET.
I have an ASP.NET application that takes a long time to load the first time. After that first load, the page loads much faster.
My page has an image gallery that is loaded based on the selected category, via Ajax. When I click a particular category, the gallery for it is loaded with an Ajax request. The problem is that the first Ajax request for a category takes a long time; the second time we access the same category, it loads quickly.
I have not enabled server-side or client-side caching, so what is actually happening behind the scenes? My assumption is that when a file is read from disk for the first time it gets cached in memory, and the second time it is served from memory. Is that assumption correct? So my questions are:
1. Will the OS cache the file read in its disk cache?
2. If not, what is causing the slowness the first time?
3. How can I resolve this problem? Is there a setting at the IIS or page level?
Please help.
Try deploying a precompiled solution to the server:
http://msdn.microsoft.com/en-us/library/ms228015(v=vs.85).aspx
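The slow first hit is largely ASP.NET compiling the pages on demand, which precompilation avoids. A typical aspnet_compiler call looks something like this (the virtual path and folder names are placeholders for your own):

aspnet_compiler -v /MyGalleryApp -p "C:\Source\MyGalleryApp" -f "C:\Deploy\MyGalleryApp"

You then deploy the contents of the target folder (C:\Deploy\MyGalleryApp in this example) instead of the source site.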
So I'm making a program that is basically a web crawler. It downloads the HTML of a page, parses it for specific text using a regex, and then adds the matches to a list.
To achieve this, I used asynchronous HTTP requests: the GET request is sent asynchronously and the parsing is performed on the returned HTML.
My issue, and I'm not sure whether it's a simple one, is that the program doesn't run smoothly. It sends a bunch of requests, pauses for a couple of seconds, then increments the parsed-item count all at once (even though the counter is programmed to increment every time an item is added), so that, for example, it jumps from 53 to 69 instead of showing 54, 55, 56, ...
Sorry for being a newbie, but I taught myself all of this and some experienced advice would go a long way.
Thanks
That sounds correct.
The slowest part of your task is downloading the pages over the network.
Your program starts downloading a bunch of pages at once, waits for them to arrive, then parses them all almost instantly.
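If the burstiness bothers you, one option is to cap how many downloads are in flight at once, so completions (and counter increments) are spread out rather than all arriving together. A rough C# sketch, with placeholder URLs and the regex step omitted:

using System;
using System.Linq;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

class CrawlerSketch
{
    static readonly HttpClient Http = new HttpClient();
    // Allow only a few downloads in flight at once so results trickle in
    // steadily instead of all arriving (and being counted) in one burst.
    static readonly SemaphoreSlim Throttle = new SemaphoreSlim(4);
    static int parsedCount;   // stand-in for your counter

    static async Task Main()
    {
        string[] urls = { "http://example.com/1", "http://example.com/2" };   // placeholders

        await Task.WhenAll(urls.Select(DownloadAndParseAsync));
        Console.WriteLine("Done, parsed " + parsedCount + " pages.");
    }

    static async Task DownloadAndParseAsync(string url)
    {
        await Throttle.WaitAsync();
        try
        {
            string html = await Http.GetStringAsync(url);   // the slow network part
            // ... run your regex over html and add the matches to your list here ...
            int soFar = Interlocked.Increment(ref parsedCount);
            Console.WriteLine("Parsed " + soFar + ": " + url);
        }
        finally
        {
            Throttle.Release();
        }
    }
}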
My scenario is this: the user selects the list of reports they wish to print; once they make their selection and click a button, I open another page with the selected reports ready for printing. I am using a session variable to pass the reports from one page to the other.
The first time you try it, it works fine; the second time, it opens the report window with the previously selected reports. I have to refresh the page to make sure it loads the latest selections.
Is there a way to get the latest value from the session every time you use it? Or is there a better way to solve this problem? I'm open to suggestions.
Thanks
C#, ASP.NET, IE 7 / IE 8
After doing some more checking, maybe COMET could help here.
The idea is that you can have code in your second page that keeps checking the server for updated values every few seconds and, if it finds updated values, refreshes the page.
There are two very good links explaining the implementation:
Scalable COMET Combined with ASP.NET
Scalable COMET Combined with ASP.NET - Part 2
The first link explains what COMET is and how it ties in with ASP.NET, the second link has an example using a chat room. However, I'm sure the code querying for updates will be pretty generic and can be applied to your scenario.
I have never implemented COMET myself, so I'm not sure how complex it is or how easily it would fit into your solution.
Maybe someone who develops the SO application can weigh in, since SO uses a similar real-time feature for notifications on a page, e.g. you are in the middle of writing an answer and a message pops up letting you know someone else has added an answer, with a "click here" link to refresh.
The proper fix is to set the caching directives on the HTTP response correctly, so that the cached response is not reused without validation from the server.
When you fail to specify the cache lifetime, the client has to "guess" how long the response is good for, and the browser's guess probably isn't what you want. See http://blogs.msdn.com/b/ie/archive/2010/07/14/caching-improvements-in-internet-explorer-9.aspx
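As a rough illustration only (assuming the page that shows the reports is an ASPX page and you add this in its code-behind), telling the browser not to reuse a cached copy looks something like:

protected void Page_Load(object sender, EventArgs e)
{
    // Emit Cache-Control: no-cache, no-store plus an already-expired Expires header,
    // so IE7/IE8 go back to the server instead of replaying the previously cached page.
    Response.Cache.SetCacheability(HttpCacheability.NoCache);
    Response.Cache.SetNoStore();
    Response.Cache.SetExpires(DateTime.UtcNow.AddMinutes(-1));
}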
It's better to use URL parameters, so the values being passed are visible in the request itself rather than hidden in session state.
I have a report that is 68,000 pages plus that needs to be generated in PDF. (I put the comma in so you knew it wasn’t a typo :)
And this actually works fine if I generate it through the Website/WebService of SSRS.
It takes a little over a minute to generate the first page. During this time the server's CPU goes to 100% and memory to 2 GB; after the first page is generated, both CPU and memory drop back to their pre-report levels. Now, if I choose to export to PDF, CPU on the server goes to 100% but memory does not jump significantly, maybe 0.05 GB (50 MB), while the PDF is being generated. This takes about 10 to 15 minutes.
Now if I use the Render method in code
rs.Render(Me.ReportName, Me.ContentType, Nothing, Nothing, ....
I set rs.Timeout to 1800000 (30 minutes). CPU and memory on the server spike, and after about 10 minutes I get an out-of-memory exception, which I believe comes from the server and not from the machine hosting the calling code (a web service).
Now, I did notice that when the PDF gets rendered through the SSRS website, it creates a new URL with these parameters:
ReportSession=gvrjxt4504wtpkiydu0o51fo
ControlID=5754f0889fb34bea80e7b5e97c120cfd
Culture=1033
UICulture=9
ReportStack=1
OpType=Export
FileName=Invoice+Session+Register+Batch
ContentDisposition=OnlyHtmlInline
Format=PDF
Now, it is my belief that this ReportSession or ControlID is what keeps the PDF generation from using so much memory.
Either way, my question is: how can I mimic in code the behavior the website is showing?
I am about to look into the LoadReport method, and also into NULL, which is one of the content types you can pass to the Render method, but I cannot find an example or an explanation of what it does.
So before I go down all these rabbit holes, has anyone else done something like this or encountered a project like this?
Background: if I generate the report one page at a time, it takes 9.5 hours to run and generate all the PDFs. I was really excited when I could generate the whole report in 10 minutes and use PDFSharp to split it. I can most likely generate 10,000 or 20,000 pages at a time, but it really frustrates me when a method works through the website and I cannot duplicate it in code.
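For reference, the direction I plan to try with LoadReport is roughly this, a C# sketch against a web reference to ReportExecution2005.asmx (the server URL, report path and output path are placeholders, and I have not verified that it reproduces the website's lower memory use):

// ReportExecutionService is the proxy class generated from ReportExecution2005.asmx.
ReportExecutionService rs = new ReportExecutionService();
rs.Url = "http://myserver/ReportServer/ReportExecution2005.asmx";   // placeholder
rs.Credentials = System.Net.CredentialCache.DefaultCredentials;
rs.Timeout = 1800000;   // 30 minutes, same as before

// The ExecutionHeader keeps all calls tied to one server-side execution session,
// which looks similar to the ReportSession the website URL carries.
rs.ExecutionHeaderValue = new ExecutionHeader();
rs.LoadReport("/Invoices/InvoiceSessionRegisterBatch", null);   // placeholder report path

string extension, mimeType, encoding;
Warning[] warnings;
string[] streamIds;

byte[] pdf = rs.Render("PDF", null,
    out extension, out mimeType, out encoding, out warnings, out streamIds);

System.IO.File.WriteAllBytes(@"C:\Reports\InvoiceBatch.pdf", pdf);   // placeholder output path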