Securing returning querystrings from third-party website - c#

So I am implementing a payment system with 2co on my website. I am using their 'Header Redirect' which returns the customer to an ASPX page on my website with a bunch of querystrings after a successful payment.
Anyway, I was wondering: what is the proper way to secure this? What if a customer typed the parameters themselves, such as `Payment.aspx?params-here`, and added credits to their account as they wished?
How can I make sure that this is 100% authentic?
Suggestions? Thanks!

A common defence against parameter tampering is to map the query parameters to something that cannot be easily manipulated, e.g. by using a one-way hash function to create a digest that is sent along with the original parameter, and by limiting the duration during which a particular digest is valid. If the digest matches the query parameter, you know the request has not been tampered with.
E.g. your URL
Payment.aspx?Amount=100
could become
Payment.aspx?Amount=100&Digest=53e5e07397f7f01c2b276af813901c2
Here's an old but still relevant, detailed article on the topic: Passing Tamper-Proof QueryString Parameters
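As a sketch of that approach, the digest can be a keyed hash (HMAC) computed over the parameter with a server-side secret, so a customer who edits Amount cannot recompute a matching Digest. The secret key and parameter layout below are illustrative, not 2Checkout's actual scheme:

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

// Server-side secret; never sent to the client. Placeholder value.
const string secretKey = "replace-with-a-long-random-secret";

// Keyed digest (HMAC-SHA256) over the parameter, hex-encoded.
string ComputeDigest(string amount)
{
    using var hmac = new HMACSHA256(Encoding.UTF8.GetBytes(secretKey));
    byte[] hash = hmac.ComputeHash(Encoding.UTF8.GetBytes("Amount=" + amount));
    return Convert.ToHexString(hash).ToLowerInvariant();
}

// On Payment.aspx: recompute and compare before trusting the value.
bool IsValid(string amount, string digest) =>
    string.Equals(ComputeDigest(amount), digest, StringComparison.Ordinal);

string url = $"Payment.aspx?Amount=100&Digest={ComputeDigest("100")}";
Console.WriteLine(url);
```

Note that in the real payment scenario you would also verify the transaction server-to-server with the gateway's API rather than trusting the redirect alone.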
In ASP.NET you can use Page.EnableEventValidation, which uses a hidden field rendered into the form to validate that a postback originated from the user interface the page actually rendered:
When the EnableEventValidation property is set to true, ASP.NET validates that a control event originated from the user interface that was rendered by that control. A control registers its events during rendering and then validates the events during postback or callback handling. For example, if a list control includes options numbered 1, 2, or 3 when the page is rendered, and if a postback request is received specifying option number 4, ASP.NET raises an exception. All event-driven controls in ASP.NET use this feature by default.
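In a Web Forms page this is controlled by the @ Page directive. It defaults to true, so you only need to set it explicitly if it was disabled somewhere:

```aspx
<%@ Page Language="C#" EnableEventValidation="true" %>
```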

Related

Is sanitizing user input necessary when Request Validation is already on guard?

Request Validation is a powerful mechanism to prevent malicious code being injected via a request to the server. It is done server-side, so regardless of whether any client-side validation has run, one can be sure that if something unusual comes in, an exception will be thrown automatically.
My question:
While we have "Request Validation" in hand, is it still necessary to sanitize requests?
I'm using ASP.NET MVC 5.0.
PS:
I'm talking solely in the context of the web (not the DB or anything else) and its potential vulnerabilities (such as XSS).
Yes! There is plenty of perfectly valid input in ASP.NET's eyes that could cause issues in your application if not dealt with correctly.
For example, if somebody passed some data in a request and you weren't correctly parameterizing queries in your data layer then this input:
x'; DROP TABLE users; --
Could result in this query:
SELECT FieldList
FROM Users
WHERE Email = 'x'; DROP TABLE Users; --
Oh noes! You've lost your Users table!
You should always treat user input as hostile, irrespective of request validation. The example above is one scenario where request validation wouldn't save your skin.
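For the injection example above, the standard fix is a parameterized query. A minimal sketch with ADO.NET — the connection string, table, and helper are placeholders:

```csharp
using System.Data.SqlClient;

// Parameterized lookup: the input is sent as data, never spliced into SQL.
static bool UserExists(string email)
{
    using (var conn = new SqlConnection("Server=.;Database=App;Integrated Security=true"))
    using (var cmd = new SqlCommand(
        "SELECT COUNT(*) FROM Users WHERE Email = @Email", conn))
    {
        // "x'; DROP TABLE users; --" is harmless here: it is just a string value
        cmd.Parameters.AddWithValue("@Email", email);
        conn.Open();
        return (int)cmd.ExecuteScalar() > 0;
    }
}
```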
HTML encoding when you render user-provided input is important. Never render untrusted input with @Html.Raw, and make sure your HtmlHelpers correctly encode anything coming from a user.
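As a sketch of what that encoding does (Razor's @ syntax applies it automatically when you don't use Html.Raw):

```csharp
using System;
using System.Net;

string hostile = "<script>alert('xss')</script>";
string safe = WebUtility.HtmlEncode(hostile);

// safe no longer contains a live <script> tag; the angle brackets
// come out as &lt; and &gt;, so the browser renders text, not markup.
Console.WriteLine(safe);
```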
Defence in depth is important. Think of request validation as just one piece of that process.
Here's an MSDN article on XSS and Request Validation:
https://msdn.microsoft.com/en-us/library/vstudio/hh882339%28v=vs.100%29.aspx?f=255&MSPPError=-2147217396

Prevent multiple form submissions without JavaScript

I've got an ASP.NET MVC action that creates a user account (after input validation), and a view containing the registration form that invokes this action. While the action validates input, the user is left waiting for the server response and may click the submit button a few more times, which creates several accounts. Is there a way to prevent the user from resubmitting the form without JavaScript? I cannot use JavaScript in this project; it is intended for non-JavaScript browsers. Or can you suggest another (server-side) solution?
EDIT:
This form uses the POST method.
JavaScript is not allowed because this web application is aimed at special web browsers for people with disabilities that do not support JavaScript.
You have to handle the situation on the server-side then, there's no way around that.
There are three options that come to mind:
create a cookie and for each submit check if it exists
similar, but using a session object
before creating a new account, always check if the user exists in the database. THIS should be a no-brainer anyway!
You can add a unique hidden token as part of the form. This token can also be saved as part of the session on the server.
When the user posts the form on the first action, the token is validated and a flag set to indicate the request is being processed. The action processed and results presented. If, while awaiting results, the user attempts to repost the request, the token validation fails as the request is still being processed.
On a side note, the main reason people click repeatedly is that there is no feedback on whether the request was received by the server. To this effect, it might be better to redirect the user to an interim page that shows the request is being processed, which in conjunction with the above can be used to show progress and redirect to the appropriate page when completed.
Of course, you should also consider making the process lighter, so that the system responds quickly to input rather than making the user wait.
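A minimal sketch of the hidden-token approach described above, in ASP.NET MVC — the action names, model type, and CreateAccount helper are illustrative:

```csharp
// GET: render the form with a one-time token stored in the session.
public ActionResult Register()
{
    var token = Guid.NewGuid().ToString("N");
    Session["FormToken"] = token;
    ViewBag.FormToken = token;   // render as <input type="hidden" name="formToken" ... />
    return View();
}

// POST: consume the token before doing any work; a resubmit finds it gone.
[HttpPost]
public ActionResult Register(RegistrationModel model, string formToken)
{
    if (!string.Equals(Session["FormToken"] as string, formToken))
        return RedirectToAction("AlreadySubmitted");

    Session.Remove("FormToken");        // consume immediately, before the slow work
    CreateAccount(model);               // hypothetical helper
    return RedirectToAction("Success"); // Post/Redirect/Get also prevents refresh re-posts
}
```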
Is it a requirement to use MVC? I think you can accomplish something similar using WebForms. When the user submits the request, you can disable the submit button in the code-behind like this:
btnSubmit.Enabled = false;
But if MVC is a must, @walther's answer would be correct.

How should I handle users browsing to pages in my site meant only for AJAX? Should I ever use GET?

Similar questions have been asked about the nature of when to use POST and when to use GET in an AJAX request
Here:
What are the advantages of using a GET request over a POST request?
and here: GET vs. POST ajax requests: When and how to use either?
However, I want to make it clear that that is not exactly what I am asking. I get idempotence, sensitive data, the ability for browsers to be able to try again in the event of an error, and the ability for the browser to be able to cache query string data.
My real scenario is that I want to prevent my users from simply typing the URL of my "Compute.cshtml" file into the address bar (i.e. the file on the server that my jQuery $.ajax function posts to).
I am in a WebMatrix C#.net Web Pages environment. I have tried preceding the file name with an underscore (_), but WebMatrix blocks requests to files whose names start with an underscore, and an AJAX request falls under the same restriction, so this of course breaks the request.
So if I use POST I can simply use this logic:
if (!IsPost) //if this is not a post...
{
    Response.Redirect("~/"); //...redirect back to home page.
}
If I use GET, I suppose I can send additional data like a string containing the value "AccessGranted" and check it on the other side to see if it equals this value and redirect if not, but this could be easily duplicated through typing in the address bar (not that the data is sensitive on the other side, but...).
Anyway, I suppose I am asking if it is okay to always use POST to handle this logic or what the appropriate way to handle my situation is in regards to using GET or POST with AJAX in a WebMatrix C#.net web-pages environment.
My advice is, don't try to stop them. It's harmless.
You won't have direct links to it, so it won't really come up. (You might want your robots.txt to exclude the whole /api directory, for Google's sake).
It is data they have access to anyway (otherwise you need server-side trimming), so you can't be exposing anything dangerous or sensitive.
The advantages in using GETs for GET-like requests are many, as you linked to (caching, semantics, etc)
So what's the harm in having that url be accessible via direct browser entry? They can POST directly too, if they're crafty enough, using Fiddler "compose" for example. And having the GETs be accessible via url is useful for debugging.
EDIT: See sites like http://www.robotstxt.org/orig.html for lots of details, but a robots.txt that excluded search engines from your web services directory called /api would look like this:
User-agent: *
Disallow: /api/
Similar to IsPost, you can use IsAjax to determine whether the request was initiated by the XmlHttpRequest object in most browsers.
if(!IsAjax){
Response.Redirect("~/WhatDoYouThinkYoureDoing.cshtml");
}
It checks whether the request has an X-Requested-With header with the value XMLHttpRequest, or an item in the Request collection with the key X-Requested-With whose value is XMLHttpRequest.
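That check can be sketched as a plain function over the request headers (a simplification of what the helper does; jQuery and most AJAX libraries send this header automatically):

```csharp
using System;
using System.Collections.Generic;

// Simplified version of the X-Requested-With check described above.
static bool LooksLikeAjax(IDictionary<string, string> headers)
{
    return headers.TryGetValue("X-Requested-With", out var value)
        && string.Equals(value, "XMLHttpRequest", StringComparison.OrdinalIgnoreCase);
}

var ajaxHeaders  = new Dictionary<string, string> { ["X-Requested-With"] = "XMLHttpRequest" };
var plainHeaders = new Dictionary<string, string>();

Console.WriteLine(LooksLikeAjax(ajaxHeaders));  // an XHR-style request
Console.WriteLine(LooksLikeAjax(plainHeaders)); // a direct browser navigation
```

Keep in mind the header is trivially spoofable, so this is a convenience check, not a security boundary.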
One way to detect a direct (typed-in) call is to check for the presence of the Referer header. Directly typed URLs won't send a referrer, but you still won't be able to differentiate the call from a simple anchor link.
(Just keep in mind that some browsers don't generate the header for XHR requests.)

How to better store the filters from the client side in the Session

Nearly every page of our application has several filters on it. My current goal is to implement a mechanism to store the filters and preselect them when a user re-opens a page, so that, at least during one session, the user doesn't have to select them over and over again when opening a page or moving from page to page.
The application is written with ASP.NET MVC, and we use a lot of JavaScript to handle filtering. At the moment much of the filtering is done only on the client side (for example, the complete data for the grid is retrieved and all further filtering happens on the client).
I was thinking of these steps:
Base class for the controllers: Method1 takes the data sent by the method in common.js and saves it in the Session.
common.js: add a method which accepts a selection made by the user and, together with the name of the control and the name of the page, sends it to Method1 on the server, in order to store the new selection in the Session object.
Base class for the controllers: Method2 accepts the name of the controller and the name of the page, and retrieves the stored selection from the Session object.
JS of individual pages: in the onload event, enumerate all existing filters and get their stored values from Method2.
However, I'm not sure that this solution is universal and optimal.
And I don't want to reinvent the wheel. Is there any already existing solutions or patterns for this task? Or any ideas how this can be done better?
One approach that comes to mind is using cookies rather than the session, since it is just the selection and you can read cookies from JavaScript itself. It will also save server resources, as you won't store anything in the Session. If your selection criteria are not sensitive, there should not be any security issue.
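A minimal sketch of the cookie approach in an MVC controller — the action and cookie names are illustrative. Note the cookie is deliberately left readable by client script, which is only acceptable because the filter state is non-sensitive:

```csharp
// Hypothetical endpoint: persist a page's filter state in a cookie so
// the page's JavaScript can read it back without a server round-trip.
[HttpPost]
public ActionResult SaveFilters(string page, string filtersJson)
{
    Response.Cookies.Add(new HttpCookie("filters_" + page, filtersJson)
    {
        Expires = DateTime.Now.AddDays(7),
        HttpOnly = false // readable from document.cookie on the client
    });
    return new EmptyResult();
}
```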

Using the POST method for only some parameters in ASP.NET

I have a form in asp.net containing more than 20 textboxes. I have to use POST method to send only some of the parameters to payment gateway. Is there any way so that only required parameters from the form can be posted to payment gateway? Any help will be appreciated.
You could either make an explicit post with only the values you want, e.g. using jQuery (see jQuery.post()), or copy the values to another form that contains only the fields of interest and submit that one.
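If you post to the gateway from the server rather than the browser, you can build a request body containing only the required fields. A sketch with System.Net.Http — the field names here are made up; use the ones your gateway documents:

```csharp
using System;
using System.Collections.Generic;
using System.Net.Http;

// Only the fields the gateway needs, not all 20+ textboxes.
var required = new Dictionary<string, string>
{
    ["amount"]   = "100.00",
    ["currency"] = "USD",
    ["orderId"]  = "12345",
};

var content = new FormUrlEncodedContent(required);
string body = content.ReadAsStringAsync().Result;
Console.WriteLine(body); // application/x-www-form-urlencoded key=value pairs
```

You would then send it with HttpClient, e.g. `await client.PostAsync(gatewayUrl, content);` — though note many payment gateways require the redirect to happen in the customer's browser, in which case the second-form approach above is the right one.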
Have you tried just posting ALL your web form elements to the payment gateway? Chances are their page will just use the parameters it expects to receive and ignore the others (that is the default behavior in web pages: interpret what you can and ignore the rest).
If so, you would just need to make sure the client name of your form elements matches the name of the parameters the payment gateway is expecting to receive.
Since it's a webforms application, you may want to look at How to: Post ASP.NET Web Pages to a Different Page on the MSDN.
An input element without a name attribute won't be submitted, so just remove the name attribute from those fields and you're fine.
