Kentico custom table data editing issue - c#

A custom table's data listing hangs on the loading screen after saving any changes. This happens on some of the tables, and it seems that the majority of records are saved; however, I have noticed a couple in one custom table that didn't save until I reapplied the change!
I was wondering what could cause this issue.

I found the issue using the browser's developer tools.
Issue
Clicking the save button produced the following JavaScript error, and the browser blocked the request:
Mixed Content: The page at 'https://address' was loaded over HTTPS, but requested an insecure form action 'http://address'. This request has been blocked; the content must be served over HTTPS.
However, the form action was not pointing to an absolute URL.
Solution
As the server SSL config was fine, there was no other way than changing the core Kentico file CustomTableForm.ascx.cs, although that is not recommended. The problem was solved by adjusting the RedirectUrlAfterSave property of the customTableForm object so that the redirect keeps the correct protocol instead of using an absolute URL.
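A minimal sketch of the kind of change described, assuming the customTableForm object exposes the RedirectUrlAfterSave property mentioned above (verify against your Kentico version):

// Sketch: convert the computed absolute redirect URL into a relative one,
// so the browser keeps the page's own (HTTPS) protocol after saving.
string url = customTableForm.RedirectUrlAfterSave;
Uri absolute;
if (!string.IsNullOrEmpty(url) && Uri.TryCreate(url, UriKind.Absolute, out absolute))
{
    customTableForm.RedirectUrlAfterSave = absolute.PathAndQuery; // e.g. "/cms/page?param=1"
}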
Hope it will help you guys.

This was just brought to my attention; not sure how I missed it before. So, I will post my answer just for future reference :-)
I guess there is some SSL offloading going on before the actual IIS server where Kentico is running. In this case, an SSL accelerator must be implemented. The link goes to the Xperience 13 version of the documentation, but the same idea applies to older versions - just use the version selector in the top bar; there could be some API differences.
The same applies, e.g., when uploading media files - the browser console will show a mixed content warning. This is for security reasons: the browser sees HTTPS, but behind the offloader there is HTTP communication, and the GetAbsoluteURL method takes the protocol from the request - thus, mixed content. Using the SSL accelerator will tell Kentico to use HTTPS internally.
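For reference, a sketch of what enabling it looks like; the CMSUseSSLAccelerator appSettings key is the switch I recall from the Kentico documentation, so verify the exact key name against your version's docs:

<!-- web.config: tell Kentico an SSL offloader sits in front of IIS, -->
<!-- so generated URLs use HTTPS even though IIS itself sees HTTP. -->
<appSettings>
  <add key="CMSUseSSLAccelerator" value="true" />
</appSettings>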

Related

Why is Chrome flooding my site with GET requests?

I'm getting a periodic issue with my IIS-hosted website whereby one of my clients' browsers (Google Chrome 77/78 or higher) suddenly begins submitting dozens of requests per second to my website for the same page.
The user is always a valid, authenticated user of my application. The requests also don't seem to follow any standard pattern that I can determine from our logs. For instance, it's not an authorization redirect issue; it's almost as if the browser is sending dozens of requests that have somehow been initiated by the user - for instance, opening a bookmarked version of our page dozens of times.
Looking at the request details I can see the following fetch headers:
HTTP_SEC_FETCH_USER: ?1
HTTP_SEC_FETCH_SITE: none
HTTP_SEC_FETCH_MODE: navigate
Which, from what I can understand, means that the action was user-initiated and that it did not come from my own application in the sense of an AJAX request or page refresh. I can only get the above combination of fetch headers when I open my page in a new tab in Chrome, for instance.
Could this actually be related to the Chrome browser itself? I cannot replicate the issue in development, but it's happened a few times now and I'm not sure where to start in terms of determining a cause.
As other users have pointed out in the comments, this can in fact be caused by the Mobile Chrome predictive loading mechanism.
A recent version of Chrome for Android (78.0.3924.108) has experimented with predictive loading, changing the rules for when links are selected for prefetching. This can cause arbitrary links to be "loaded" (issuing a GET request, distorting stats, and causing any side effect that action has) without any user input when visiting your website.
This has been rolling out over the past week, and has caused many issues in many different scenarios (logging users out, clicking on paid or aggregator links, etc.)
More info on the Chromium issue tracker:
https://bugs.chromium.org/p/chromium/issues/detail?id=1027991
Requests made by prefetching carry a Purpose: prefetch header - at least in Chrome; other browsers might send other headers.
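As an illustrative sketch (not part of the original answer), a server-side guard using that header might look like this in an ASP.NET Global.asax.cs; hedged - test before relying on it:

protected void Application_BeginRequest(object sender, EventArgs e)
{
    // Chrome marks predictive-loading requests with "Purpose: prefetch".
    // Short-circuit them so they cannot trigger side effects (e.g. logout links).
    string purpose = Request.Headers["Purpose"];
    if (string.Equals(purpose, "prefetch", StringComparison.OrdinalIgnoreCase))
    {
        Response.StatusCode = 503; // tell the browser to come back with a real navigation
        Response.AppendHeader("Cache-Control", "no-store");
        CompleteRequest();
    }
}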
This has since been fixed (on the morning of 25 November 2019).

Deploy WebAPI to external website

We have a public website that is already exposed to the outside, although in reality there's really nothing there - simply a default.htm file with "Coming Soon" text in it. (http://vensuresoftware.com/)
We also have a WebAPI we've put together that we want to add to this website. When I publish locally to my IIS6, it works no problem. It's accessed as http://localhost/HRConnect/api/Claims just fine. I've used PostMan, a C# client, and Javascript AJAX to access this just fine. I can also load it in a browser at that URL, and I get the appropriate default controller and action.
However, I have been totally unable to accomplish this same thing on the website. Ideally I'd like to include it as a Virtual Directory to the http://vensuresoftware.com and access it as http://vensuresoftware.com/HRConnect/api/Claims but I've had zero luck doing so.
I have tried to add it as a Virtual Directory as well as an Application under that specific website, but when I access the URL, all I get is "The resource you are looking for has been removed, had its name changed, or is temporarily unavailable."
I've ensured the application pool is correct, with an appropriate user, and the pass-through connection test succeeds. But I just cannot access the service or the URL.
Any ideas or suggestions at all on what I can try? I'm not sure what else I can include here. Nothing special in IIS, nothing special in the service really. There are only 3 actions in it. As I said, it all works beautifully locally, under localhost though.
IIS 7 doesn't have built-in support for extensionless URLs, which causes a lot of headaches when trying to get MVC and Web API apps to run. I've gotten it to work using both of these options; pick the one that applies to you.
Install this IIS patch, which allows IIS 7 to handle extensionless URLs.
If the patch isn't an option because you're worried about breaking other sites on the server, you can make the Web.config adjustment found in this answer (see the sketch below). You'll have to do this for every MVC/Web API app you have running on the server.
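A sketch of the adjustment, assuming the commonly cited fix of routing all requests through the managed pipeline (my assumption about what the linked answer contains):

<!-- Web.config: run all managed modules for every request, -->
<!-- so extensionless MVC/Web API URLs reach their handlers on IIS 7. -->
<system.webServer>
  <modules runAllManagedModulesForAllRequests="true" />
</system.webServer>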

Entire website to migrate from Http to Https

This seems like a duplicate question - but after hours of searching, it seems there is no clear question and answer that summarizes the issues I'm raising here.
We have a web application (built using ASP.NET MVC4) which stores sensitive customer information.
We've decided to migrate our entire application to https.
My question is: apart from the IIS and certificate technical issues, which we already know how to deal with, what should be changed at the code level?
What will happen for instance for:
Included external scripts referencing http, such as http://code.jquery.com/jquery-1.7.1.min.js - will they work automatically, without any problems, popup messages, or blocking in client browsers?
Internal links we've forgotten to change, which redirect to our site using http?
Images/sources which have http in their URL.
Should we change all references from http to relative URLs, or just specify // without the http/https protocol? (as seen in other posts on this subject)
Should we do nothing - will it happen automatically?
Is there a way to do something in IIS or Global.asax, etc., to automatically take care of all http leftovers?
What else should we take into account when migrating to https?
Thanks in advance.
For all internal static resources, hopefully you have used the @Url.Content helper, and for all internal dynamic resources the @Html.ActionLink, @Html.BeginForm, ... helpers to generate the links. This way you don't need to worry about anything.
For all external resources you could use the // syntax in the link, which will respect the current protocol. For instance:
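(An illustrative sketch; the local script path is a made-up example.)

<!-- Protocol-relative external reference: the browser reuses the page's scheme. -->
<script src="//code.jquery.com/jquery-1.7.1.min.js" type="text/javascript"></script>
<!-- Internal static resource via the Razor helper - no scheme is baked in. -->
<script src="@Url.Content("~/Scripts/app.js")" type="text/javascript"></script>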
Since you are switching to HTTPS, you might also consider marking all your cookies (if any) with the secure flag to ensure that they are transmitted only over a secure channel.
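A sketch of setting that flag declaratively in Web.config (the forms element applies only if you use forms authentication):

<!-- Web.config: mark cookies as secure so they are sent only over HTTPS. -->
<system.web>
  <httpCookies requireSSL="true" />
  <!-- The forms authentication ticket cookie has its own switch: -->
  <authentication mode="Forms">
    <forms requireSSL="true" />
  </authentication>
</system.web>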

IE 11 broke all our sites

When Microsoft rolled out IE 11 on the automatic updates it broke all our sites.
For example, one of our sites started getting a weird string in the URL; after the '/' the following gets added:
(F(IZOtnSYyVIaxfgEbqezGvIKHeTq8scRxJzvlSVK2airuqpB29zOonBkpv3_Lf61u7hveLZH053qcPgI6cTpejnOWojBJBiePNrC1Z7lShzsKs7VdayYOlA9dF_vIodMiRbUCzDRHbf9UlxsYNbuo_UabOT81))
And because of this nothing on the site works.
Adding the site to Compatibility View in IE 11 works. But what has Microsoft done with IE 11 that destroys everything?
Is there any way to fix this without having to add the site to Compatibility View?
What you see is a forms authentication ticket sent in the URL instead of in a cookie. ASP.NET thinks that you are using an unknown browser that cannot handle cookies, JavaScript, etc. You should update your browser definition files. A similar issue actually happened with IE 10, not 11: http://www.hanselman.com/blog/BugAndFixASPNETFailsToDetectIE10CausingDoPostBackIsUndefinedJavaScriptErrorOrMaintainFF5ScrollbarPosition.aspx
Or, as a workaround for this one case only, modify your web.config and force forms authentication to use cookies, as sketched below. But it won't solve all your problems.
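A sketch of that workaround (the loginUrl value is a placeholder - keep your own):

<!-- Web.config: always store the forms auth ticket in a cookie, -->
<!-- never fall back to embedding it in the URL. -->
<system.web>
  <authentication mode="Forms">
    <forms loginUrl="~/Account/Login" cookieless="UseCookies" />
  </authentication>
</system.web>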
Try installing the latest Windows updates. It helped us in the same situation.

Replicate steps in downloading file

I'm trying to automate the download of a file from a website. Normally, to download the file, I log in with a username and password, navigate to a particular screen, then click a button.
I've been trying to watch the sequence of POSTs using Chrome's developer mode and then replicate all the steps using the .NET WebClient class, but with no success. I've derived from the WebClient class and added cookie handling, which seems to be working (roughly as sketched below). I go to the login page and post using WebClient.UploadValues; about half the time it seems to work. The next step appears to make another POST to a reporting URL. Once again I use WebClient.UploadValues, but the response from the server is a page showing an internal error.
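For context, the cookie-handling WebClient derivation described above typically looks something like this (a generic sketch, not the asker's actual code):

using System;
using System.Net;

// WebClient that carries a CookieContainer across requests, so the
// session cookie returned by the login POST is sent on later requests.
public class CookieAwareWebClient : WebClient
{
    public CookieContainer Cookies { get; } = new CookieContainer();

    protected override WebRequest GetWebRequest(Uri address)
    {
        WebRequest request = base.GetWebRequest(address);
        var http = request as HttpWebRequest;
        if (http != null)
        {
            http.CookieContainer = Cookies;
        }
        return request;
    }
}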
I have a couple of questions.
1) Are there better tools than hand-coding C# to replicate a bunch of web browser interactions? I really only care about being able to download the file at a particular time each day onto a Windows box.
2) WebClient does not seem to be the best class to use for this; perhaps it's a bit too simplistic. I tried using HttpWebRequest, but it has no built-in facilities for encoding POST requests (see the sketch after this question). Any other recommendations?
3) Although Chrome's developer tools appear to show all the interaction, I find them a bit cumbersome to use. I'd be interested in seeing all of the raw communication (unencrypted though, since the site is only accessed via https), so I can see if I'm really replicating all of the steps.
I can even post the exact code I'm using. The site I'm pulling data from is, specifically, the Standard and Poor's website. They have the ability to create custom reports for downloading historical data, which I need for reporting, not republishing.
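On point 2: HttpWebRequest can send a POST; you just encode the form body by hand (which is what WebClient.UploadValues automates). A minimal sketch - the URL and field names are placeholders:

using System;
using System.IO;
using System.Net;
using System.Text;

class PostExample
{
    static void Main()
    {
        // Placeholder endpoint and credentials - substitute the real ones.
        var request = (HttpWebRequest)WebRequest.Create("https://example.com/login");
        request.Method = "POST";
        request.ContentType = "application/x-www-form-urlencoded";

        // Hand-encode the form fields.
        string body = "username=" + Uri.EscapeDataString("user")
                    + "&password=" + Uri.EscapeDataString("secret");
        byte[] bytes = Encoding.UTF8.GetBytes(body);
        request.ContentLength = bytes.Length;
        using (Stream s = request.GetRequestStream())
        {
            s.Write(bytes, 0, bytes.Length);
        }

        // Read the response page (e.g. to check the login succeeded).
        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            Console.WriteLine(reader.ReadToEnd());
        }
    }
}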
Using IE to download the file would be much easier than writing C# / Perl / Java code to replicate the HTTP requests.
The reason is that even a slight change in the JavaScript code can break the flow.
With IE, you can automate it using COM. The following VBA example opens IE and performs a Google search:
Sub Search_Google()
    Dim IE As Object
    Set IE = CreateObject("InternetExplorer.Application")
    IE.Navigate "http://www.google.com"  ' load web page google.com
    While IE.Busy
        DoEvents                         ' wait until IE is done loading the page
    Wend
    IE.Document.all("q").Value = "what you want to put in the text box"
    ' Click the button named "btnG", Google's "Google Search" button.
    IE.Document.all("btnG").Click
    While IE.Busy
        DoEvents                         ' wait until IE is done loading the page
    Wend
End Sub
3) Although Chrome's developer tools appear to show all the interaction, I find them a bit cumbersome to use. I'd be interested in seeing all of the raw communication (unencrypted though, since the site is only accessed via https), so I can see if I'm really replicating all of the steps.
For this you can use Fiddler to view all the interaction going on and the raw data going back and forth. To make it work with HTTPS, you will need to install Fiddler's root certificate to enable decryption of the traffic.
