A lot of lead up:
I have a simple ASP.NET 3.5 data-entry webform with a series of dropdown lists, text boxes, and text areas; user authentication is handled by Active Directory.
The user enters an alphanumeric ID and clicks a button. The button's click handler in the .aspx.cs:
1. calls a stored procedure to determine whether this is a new record, returning the existing data if the record already exists
2. if the record exists, pre-fills the form with the existing values.
There are three textboxes which have been extended to use ASP.NET AJAX AutoComplete (each contained in its own asp:UpdatePanel), which also post back successfully.
When the user is done entering the data, there is a single button to save the record, which:
1. calls a stored procedure, which either inserts or updates accordingly
2. clears the webform
3. displays a quick success message.
More than 4000 records have been inserted and updated through this form since its launch. I now have a problem where ONE record cannot be updated; it was inserted a month ago, through the same form, without issue.
In Internet Explorer (6, 7 and 8): When you click the save button, it asks for you to provide your domain username and password. Entering a valid username and password displays the "Internet Explorer cannot display the webpage" screen.
In Firefox 3: When you click the save button, it displays the "Connection interrupted" screen. Clicking the "Try Again" button does not change the results.
There are no entries in the application's logging, the server's Event Log, or SQL Server 2005's logs.
I have tried:
- on different computers, and it failed.
- with different users, and it failed.
- with numerous other records, and they updated perfectly.
I have loaded the record into the test environment in two different manners:
1. copied and pasted directly from the production database to the test database
2. copied and pasted directly from the production webform to the test webform.
The issue does not happen in test or my local development system. Both production and test are running ASP .NET 3.5 SP1.
I even saved copies of the failing production page and the working test page as HTML and ran them through Total Commander's "Compare Files by Content" function, in the hope that the differences would be highlighted in red and be nice and obvious. The only differences were in the areas auto-generated by .NET at runtime, and the occasional place where a dropdown's item list has more entries in production than in test.
I realize there is probably something wrong with the data that is causing the final postback to fail, but it appears to be failing before the postback even starts. I have looked through the record's data through viewing the source of the loaded aspx page and in the database in the hopes of finding a stray invisible character or a textarea that has too many characters that may be causing it to choke, but no luck.
A coworker suggested setting viewStateEncryptionMode="never" in the web.config; this has 'fixed' the problem, and the record can now be updated without error.
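For reference, the workaround is a one-attribute change in web.config (a minimal sketch; any other attributes already on your <pages> element stay as they are):

```xml
<system.web>
  <!-- Disables viewstate encryption site-wide; "Auto" is the default. -->
  <pages viewStateEncryptionMode="Never" />
</system.web>
```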
Unfortunately, I cannot provide the data that is causing the form to fail.
My question:
Does anyone have an idea why this happened in the first place, or why setting viewStateEncryptionMode="never" fixed it? A better solution than setting viewStateEncryptionMode to "never" would also be welcome.
Thanks!
First, and most important - besides .NET 3.5 SP1 - make sure you have the latest patches installed on the server (this should always be the first step).
Assuming the server is up to date, I'd start by checking the firewall and anti-virus software on your server (they should have logs). Anti-virus can block web sites that use suspicious code - like a known JavaScript exploit (I'll admit I tried that one. For science). It's possible that a specific byte combination in the viewstate looks like suspicious code or a known bad file (seems weird, but possible).
Next, check IIS. Enable logging and see if you get any errors. Check whether you have any ISAPI filters installed (these can be in several places - the 'Web Service Extensions' folder, a tab on the web site's Properties, or one of its parents' Properties).
I have a C# project that needs to upload files on a page alongside a lot of other information.
Problems:
1 - File upload gives no feedback to the user, so they can't tell how long it will take (FileUpload doesn't work with UpdateProgress and UpdatePanel).
2 - Some validations I can't do with JavaScript (relationships, for example), so if I get any error on the server side, FileUpload loses the file and the user needs to upload the whole file AGAIN.
3 - End users have a really poor link, so 10 MB will take a long time (10 MB is the maximum allowed).
Solutions (none of them works well):
1 - I can use a client-side file upload with JavaScript (like Uploadify) and get the percentage, but it works asynchronously, so I have to block the screen to keep the user from doing other things. My bigger worry is when I receive and save the file, because I need to link the file to the other entity or I lose the bridge between the file and the entity. (The same happens with the AsyncFileUpload control.) This doesn't solve problem 2.
2 - Do everything synchronously with FileUpload: when all the files arrive at the server, save them and put all the information I need in HiddenFields, so if I get any error on the server side I can recover the file. The problem is that I can't give the user any feedback while the file uploads. This doesn't solve problem 1.
3 - Split the files off from the other entities (this will mess up the project a lot) and upload each file individually. The problem is that I then need some mechanism to create the link between the file and the entity, AND I can't allow a file to be used more than once, so checking that will probably take a lot of resources. This solves the problems listed, but I think it creates others: complexity for the user and a lot of new verifications for the system.
4 - Create two buttons: one VALIDATE button for server-side validation (with no file uploads), and after that check passes, allow the user to click the SAVE button. This doesn't solve problem 1.
Well, as you can see, I've thought a lot about the problem, but I can't find a really good solution - one that fits all my needs. Does anyone have an idea?
PS: I also have FileUploads inside Repeaters, so the IDs are automatic.
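A minimal sketch of solution 2, assuming a FileUpload control named fileUpload and a HiddenField named savedFilePath on the page; ValidateRelationships and LinkFileToEntity are hypothetical stand-ins for your real validation and entity logic:

```csharp
protected void SaveButton_Click(object sender, EventArgs e)
{
    if (fileUpload.HasFile)
    {
        // Persist the upload to a temp location immediately, so a later
        // validation failure does not force the user to re-upload.
        string tempPath = Path.Combine(
            Server.MapPath("~/App_Data/Uploads"),
            Guid.NewGuid() + Path.GetExtension(fileUpload.FileName));
        fileUpload.SaveAs(tempPath);
        savedFilePath.Value = tempPath;
    }

    if (!ValidateRelationships())
    {
        // The file is already on disk and savedFilePath survives the
        // postback, so the user only has to fix the failing fields.
        return;
    }

    // Link the stored file to the entity and move it to its final home.
    LinkFileToEntity(savedFilePath.Value);
}
```

Note that storing a server path in a HiddenField means a user could tamper with it, so in practice you would store an opaque key (e.g. the GUID) and resolve it server-side.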
In my web app I have a lot of GridViews. I was trying to change some data formatting in a GridView, and the changes did not take effect when I debugged, in either Chrome or Internet Explorer; it kept showing the same GridView as before. I tried changing the SQL data I was feeding it, and that change actually appeared, so the problem occurs when I alter my web project.
Wondering what this could be, I ran another test by removing the function that is triggered when a button is pressed. When I debugged, not only did I not get the error that usually comes from not assigning anything to an OnButtonClick, but when I pressed the button it still did what it was doing before I removed the function.
Did this already happen to any of you?
Edit 1: I then tried to create an empty webform, and when I debugged I got this error:
Parser Error
Description: An error occurred during the parsing of a resource required to service this request. Please review the specific parse error details below and modify your source file appropriately.
Are you hosting in IIS or debugging with Visual Studio's built-in server?
Try cleaning and rebuilding your code, then clearing your browser cache.
Try deleting the cache and cookies from both of your browsers.
Stale cached files sometimes cause problems while debugging, because you keep seeing earlier versions of your current app.
I'm facing some problems providing content authors with a solution/advice for deleting huge sections of a website that is translated (has language versions) and has references set on presentation or on data items inside the content tree.
Removing all the items plus their references raises another problem: how to make the changes visible in the web database. Publish cannot be used: even though workflow is present, changes are made to presentation on pages by different agencies (the master instance is mapped to a preview site for the client) and must not reach live until they decide to publish. We also limit the number of versions per language to 6 (the rest are removed); if all 6 variants are trapped in workflow, publishing would publish the item with all fields empty, and we don't want a blank page.
There are other concerns regarding incorrect work: sometimes agencies publish/submit incorrect/unapproved content and blame each other when someone accidentally publishes pages (presentation settings/datasource), so I want to pick the best strategy here and leave content-authoring agencies no room to blame their incorrect work on anyone but themselves.
I have a few approaches to do it right, but...
Doing the same deletion of items on the web database via the CMS works in theory (tested locally on Sitecore 7.2 - it works), but on the production CMS (Sitecore 7.0 - I have admin rights but no access to server files) it fails as in the next image:
Writing a module won't work either: a Sitecore command to remove all references and then remove/recycle the web-database item cannot be performed via code, because it throws the same error as above.
Question 1. Why does this error occur on CMS item deletion when the web database is selected (Sitecore 7.0)?
In this deprecated post I found something that seems to be the solution: add publishing commands to the recycle bin. Here I get a bit blocked: I don't know whether adding the publish command to the Recycle Bin ribbon is possible (and I don't understand what would be published). I created something, but it's not working:
This is the result :(
Question 2. Is adding publishing commands to the recycle bin possible? If so, please point me to documentation so I can implement it and understand what is published and how.
If you have other suggestions (besides publish item + subitems, or publish the whole site), please help me out.
Thank You.
I sometimes find that I need to press CTRL+REFRESH BUTTON (or simply REFRESH BUTTON) in order for pages to be updated.
I thought this might be a problem with using an AJAX UpdatePanel, but it also happens on pages with no AJAX partial rendering.
I have also removed if(!IsPostBack), and yet I still need to refresh the page for the contents to update.
Is it to do with the cache?
Does anyone know of a fix for this?
I believe it only happens with IE 7 (which I am using). I tried the same feature with Chrome, and it worked as it is supposed to.
EDIT: Unfortunately, it is not as simple as setting the cache header to 0 or telling IE to always retrieve the latest page on load. I have done both, and the same problem happens.
For instance, one part of my site lets you change your profile picture. If I choose to remove the profile picture (which should then fall back to the default picture), it only deletes the picture but doesn't display the default one. The page loads again but still references the picture I deleted (so I get an X for the picture). I have to go to a different page and then back to the profile page to see the default picture. CTRL+REFRESH also works.
Note that this particular problem happens under all browsers (Chrome included).
If it helps, I am using Content pages which are in a master page.
Changing your browser cache settings will fix the problem locally, but to fix it for a general case, add the header "Expires: 0" to your outbound page, which will prevent browsers from caching it at all.
To do this in C#, add this code to the page load event:
Response.AddHeader("Expires", "0");
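If "Expires: 0" alone doesn't do it, a fuller belt-and-braces variant is a sketch like the following (these are standard HttpResponse/HttpCachePolicy calls, combined to cover browsers that honor different headers):

```csharp
protected void Page_Load(object sender, EventArgs e)
{
    // Tell browsers and proxies not to cache or store this response.
    Response.Cache.SetCacheability(HttpCacheability.NoCache);
    Response.Cache.SetNoStore();
    Response.AddHeader("Pragma", "no-cache");
    Response.AddHeader("Expires", "0");
}
```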
Ctrl+Refresh forces IE to reload the page from the server instead of using the locally cached version. First, check your browser's settings under Settings - General - Browsing history: "Check for newer versions of stored pages" should be set to "Automatically". Then check whether you're adding any "Expires" header to your pages.
You can also consider setting the caching policy on the response object or set the entity tag to something different every time...
http://msdn.microsoft.com/en-us/library/system.web.httpcachepolicy.aspx
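As a sketch of the entity-tag idea (HttpCachePolicy.SetETag is the relevant call; deriving the tag from the current ticks is just one way to make it differ per response):

```csharp
// Give each response a different ETag so the browser revalidates
// with the server instead of serving its cached copy.
Response.Cache.SetCacheability(HttpCacheability.Private);
Response.Cache.SetETag(DateTime.UtcNow.Ticks.ToString());
```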
This may be some "best practices" thing I've overlooked or don't know about, so go easy on me please.
I have an asp.net website that populates a GridView with columns from my database table. One of those columns gets processed into a link to a Word document on another server. The issue is that if a user clicks the Word document to view it, and that document is then updated on the remote server, the user cannot see the changed document until their browser cache is cleared and the browser is forced to go out to the network for a fresh copy when the link is clicked.
Basically I want to somehow force the machine never to use the cached copy of the document, but always go out to the network to get the newest copy.
Bonus question: Would this be better handled somehow by storing the documents in SharePoint?
UPDATE: Using Response.Cache.SetCacheability(HttpCacheability.NoCache); in my code-behind, I have now resolved the issue in Firefox, but IE8 is weird. If I update the document and then left-click it, it brings up the Word doc in the IE window without the changes. However, if I make changes, save them, and then middle-click the document so it opens in a new tab, the document reflects the changes. I'm mostly there...
Try adding a little extra data to the link. Here's an example using js; if you're building the url server side, it should be essentially the same:
var url = "http://www.mydomain.com/mywordfile.doc?ts=" + (new Date()).getTime();
That'll force the url to have a different query url each time, which (in theory) should force the browser to re-request and re-download it.
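The server-side equivalent is essentially the same; a sketch, assuming you build the link in code (e.g. in the GridView's RowDataBound handler) and reusing the example URL from above:

```csharp
// Append a changing query-string value so each rendered link is unique
// and the browser cannot reuse a cached copy of the document.
string url = "http://www.mydomain.com/mywordfile.doc?ts="
             + DateTime.UtcNow.Ticks;
```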
By chance, are you seeing this with IE8 specifically? We've seen IE8 show this behavior where caching was previously not an issue.
Typically it can be cleared up with a couple of steps: explicitly telling the browser not to cache via HTTP headers, and also expiring the page immediately. Google the "Pragma: no-cache" header; there are typically a couple of different lines you need to add to cover all browsers.
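The usual set of lines (standard HTTP response headers, not specific to this app) looks like:

```
Cache-Control: no-cache, no-store, must-revalidate
Pragma: no-cache
Expires: 0
```

Cache-Control covers modern browsers, while Pragma and Expires catch older HTTP/1.0 clients and proxies.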