I have a .NET C# web application that allows users to purchase products.
My site has a payment page, with input fields etc.
I have had some attacks recently from bots automating payment submissions just to validate credit card authorization.
So I need to change my page so that bots can't do this, and I am looking for advice on how to go about it. I have started by changing the field names so that they are different each time the page loads, via a hash. Any other tips?
CAPTCHA will help to stop this. It requires the user to complete a validation check before the submission will go through. Here is a sample of how a CAPTCHA can be implemented in ASP.NET.
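As a minimal sketch, assuming Google reCAPTCHA v2 and a placeholder secret key: the widget on the page posts a g-recaptcha-response field, which you verify against Google's siteverify endpoint on the server.

```csharp
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;
using System.Web.Script.Serialization; // from System.Web.Extensions

public static class RecaptchaVerifier
{
    private const string SecretKey = "YOUR_SECRET_KEY"; // placeholder - use your real reCAPTCHA secret

    // Returns true when Google confirms the CAPTCHA was solved by a human.
    public static async Task<bool> IsHumanAsync(string recaptchaResponse, string userIp)
    {
        using (var client = new HttpClient())
        {
            var form = new FormUrlEncodedContent(new Dictionary<string, string>
            {
                { "secret",   SecretKey },
                { "response", recaptchaResponse }, // value of the g-recaptcha-response form field
                { "remoteip", userIp }
            });

            var reply = await client.PostAsync(
                "https://www.google.com/recaptcha/api/siteverify", form);
            var json = await reply.Content.ReadAsStringAsync();

            // siteverify returns JSON with a boolean "success" field.
            var result = new JavaScriptSerializer().Deserialize<Dictionary<string, object>>(json);
            return result.ContainsKey("success") && (bool)result["success"];
        }
    }
}
```

In the payment button's click handler you would call this with Request.Form["g-recaptcha-response"] and Request.UserHostAddress, and refuse to process the payment when it returns false.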
A "captcha" is the standard way of preventing bot submitted forms. Recaptcha is free, works well and is actually helping to scan books through its use.
I tried the following approach once and it gave me good results.
In short, the idea is to create an invisible field, name it so that a robot will treat it as a real field and, on the server side, check the value of this field. If it is populated, then it was definitely a robot and you can safely ignore the request.
For example:
Rename your FirstName field to, say, EmanTsrif.
Add another field <input id="FirstName" class="trap_for_robots">.
Define the CSS class trap_for_robots as {display: none} (preferably in a standalone .css file - don't use style="display:none"!).
In your code-behind, check if (FirstName.Text != "") { //do nothing, log something }.
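For completeness, a minimal code-behind sketch of that check (the control names follow the example above; the logging call is just a placeholder):

```csharp
using System;
using System.Web.UI;

public partial class PaymentPage : Page
{
    // FirstName (the hidden decoy TextBox) and EmanTsrif (the real first-name
    // TextBox) are declared in the .aspx markup as in the steps above.
    protected void Submit_Click(object sender, EventArgs e)
    {
        // A human never sees the trap field, so it should always arrive empty.
        if (FirstName.Text != "")
        {
            // Almost certainly a bot: log it and silently drop the request.
            System.Diagnostics.Trace.TraceWarning(
                "Honeypot field populated from " + Request.UserHostAddress);
            return;
        }

        string realFirstName = EmanTsrif.Text;
        // ...process the legitimate submission as usual...
    }
}
```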
I can understand that you are trying to find a solution which does not involve human interaction in order to keep the user experience as good as possible.
Since the evildoers are using your site to check credit card validity, you are probably dealing with a more targeted misuse of your resources, as opposed to the common blocking scenarios for automated processes like comment spam bots and the like. Depending on how valuable "your service" is to the people exploiting your website, locking them out without requiring human interaction might only work until they figure out what you changed.
Alternating field names isn't going to stop them from populating the fields by order of appearance on your site, for example.
Solutions like having JavaScript populate a hidden form field are only good as long as the bot does not speak JavaScript.
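One hedged variant of that idea (the field name js_token and the expected value are made up for this sketch): render an empty hidden input, let a tiny script fill it in after the page loads, and reject posts where it is still empty.

```csharp
// Markup placed on the payment page (sketch):
//   <input type="hidden" id="js_token" name="js_token" value="" />
//   <script>document.getElementById("js_token").value = "human";</script>

protected void Pay_Click(object sender, EventArgs e)
{
    // A client that doesn't execute JavaScript never fills the token in.
    if (Request.Form["js_token"] != "human")
    {
        Response.StatusCode = 400; // reject (or just log) the request
        return;
    }

    // ...continue with the normal payment flow...
}
```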
I would suggest using all the techniques you find when searching for CAPTCHA alternatives, applying them in random combinations for each request - and then hoping that another site is less secure, so they start using a different site to get what they need.
If all else fails, you can still use a solution that involves human interaction.
The AJAX Control Toolkit has a "NoBot" control you can use.
http://www.asp.net/ajax/ajaxcontroltoolkit/samples/NoBot/NoBot.aspx
Have a look at this Wikipedia article and read the "Technical measures to stop bots" section; it lists a number of measures to stop bots.
Related
I am building a comments section for a blog and I would really like some help/guidance with it, specifically around reCAPTCHA. I have questions around:
Adding a variable number of captchas to one page
validating that the user has completed the captcha before posting to the server
A blog post can have X number of comments, and I want to add a reply form for each comment left. I've tried to mock this up in a .NET Fiddle (https://dotnetfiddle.net/3GruG1) to demo it. In my application the form is located in a partial, and when the foreach over the blog comments is processed it loads in a form at the bottom. The fiddle is purely for demo purposes and not how it's implemented.
To try and limit spam submissions, I would like to add a reCAPTCHA to each form. I've read that you can have multiple v2 reCAPTCHAs in a single view, but can anyone suggest the best way to achieve this with a variable number of forms per page? I am generating a GUID in the partial, so maybe I can lean on that?
I assume this is achievable in JS, but my client-side is weak, and before going down that route I wanted to get people's opinions on the best approach.
I would also like to validate that the user has interacted with the captcha - I played around with disabling the button until the box was checked, but I'm unsure if this is good for UX.
Any thoughts, please?
I want to create an application that basically searches for something, with some filters, across various websites (I don't need to log in to those third-party websites, so the data is open to the public) and shows the results in my application. I have a few questions:
1. Is it legal?
2. Is this web scraping or a meta search engine?
3. Can I get more information (any web links/articles) to learn more about it? How do I achieve it technically? One way I know of is the XPath technique for scraping, but I am wondering if there are more ways.
I am NOT asking for the entire code, just how to start / any guidance.
Thank you in advance!
Firstly, you need to understand how search engines work!
Search engines like Google have special programs designed to mine information from the web; they are called "spiders". What a spider does is crawl over the web pages that match the search query and pull out the relevant information. However, that's a really complex thing to build - it takes good code and real algorithm expertise to develop a spider yourself. If you can master it you'll be earning a smooth sum of money, but that's really rare unless you're a blatant genius!
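To make the idea concrete, here is a deliberately naive single-threaded spider sketch (the seed URL is a placeholder, and the regex-based link extraction is far cruder than what a real crawler would use):

```csharp
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Text.RegularExpressions;

class Spider
{
    static readonly HttpClient Http = new HttpClient();

    static void Main()
    {
        var queue = new Queue<string>(new[] { "https://example.com/" }); // placeholder seed
        var seen = new HashSet<string>();

        while (queue.Count > 0 && seen.Count < 50) // small cap for the demo
        {
            string url = queue.Dequeue();
            if (!seen.Add(url)) continue; // skip pages we already visited

            string html;
            try { html = Http.GetStringAsync(url).Result; }
            catch { continue; } // skip pages that fail to load

            // Crude link extraction; a proper HTML parser is a better fit in practice.
            foreach (Match m in Regex.Matches(html, "href=\"(https?://[^\"]+)\""))
                queue.Enqueue(m.Groups[1].Value);

            Console.WriteLine("Crawled {0} ({1} chars)", url, html.Length);
        }
    }
}
```

A production crawler also needs politeness (robots.txt, rate limiting), deduplication by canonical URL and an index to actually answer queries - that's where the real complexity lives.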
I want to get some information from a web page (where I have customers, current balance, etc.) into my C# application. I thought of using POST/GET methods, but I don't know how to use them. The first problem is that I have to log in to the page:
Login page
I guess I have to get the boxes' ids and the button id, then fill them in from the C# app. Next I want to get the table contents. The table looks like this: Table
How can I get the customer id, customer name, and customer balance? And if the customer balance is updated, how can I detect that in the app? With this I think I'll be able to do the rest on my own.
This could be very difficult depending on exactly how the site is architected. Does the site have an API you might be able to use instead of trying to go through the UI?
If you have no choice, I would consider using something like PhantomJS which is basically a scriptable headless browser. To make it a little easier to use with C#, you might also consider Selenium WebDriver and their PhantomJSDriver.
Selenium is used for automated testing of websites, but you can just as easily use it to automate repetitive UI interaction in websites. The biggest issue you generally face is keeping your scripting from being so brittle that anytime a change is made to the site, it breaks your automation. You might want to read up on the Page Object Pattern as it can help you minimize redundancy and keep your code maintainable.
Also, it was recently announced that Chrome v59 will be able to run headless, so that might ultimately be an alternative to PhantomJS.
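To give a feel for it, here is a rough Selenium WebDriver sketch in C#. The element ids ("user", "pass", "loginBtn"), the URLs and the table layout are all placeholders - inspect the real pages to find the actual ones - and ChromeDriver is shown here, but PhantomJSDriver plugs in the same way:

```csharp
using System;
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;

class BalanceScraper
{
    static void Main()
    {
        using (IWebDriver driver = new ChromeDriver())
        {
            // Log in through the UI, exactly as a user would.
            driver.Navigate().GoToUrl("https://example.com/login"); // placeholder URL
            driver.FindElement(By.Id("user")).SendKeys("myUserName");
            driver.FindElement(By.Id("pass")).SendKeys("myPassword");
            driver.FindElement(By.Id("loginBtn")).Click();

            // Read the customers table (assumes id, name and balance cells in that order).
            driver.Navigate().GoToUrl("https://example.com/customers"); // placeholder URL
            foreach (IWebElement row in driver.FindElements(By.CssSelector("table tbody tr")))
            {
                var cells = row.FindElements(By.TagName("td"));
                Console.WriteLine("{0} | {1} | {2}", cells[0].Text, cells[1].Text, cells[2].Text);
            }
        }
    }
}
```

Run it on a schedule and compare the balances against previously stored values to detect updates.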
The solution is for a project in which changing all instances of Session[string] is not an option. My thought has been to implement SessionStateStoreProviderBase. I understand that creating a Session wrapper class with properties like Session.UserName would be a good idea.
Edit: The goal here is to turn off Session per user request, not application-wide, without changing code in each aspx page.
First you need a way to tell a bot and a human apart.
When you're through, consider what you want to achieve.
If you wish to disable Session for bots, then be sure it won't break your site. If a search engine bot gets a crashed page, it will index and rank it as such.
Set up your robots.txt file to direct (most) bots to a page of your choice, where you have control over session and other information. If you want them to have free access to all pages, you have to put in code that distinguishes bots by HTTP header information - that's a research project in itself.
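If you go the header route, a sketch of what that can look like in Global.asax (the user-agent token list is only an example; extend it to match the bots you actually see, and note that SetSessionStateBehavior needs .NET 4.0+ and must run before the session is acquired):

```csharp
using System;
using System.Web;
using System.Web.SessionState;

public class Global : HttpApplication
{
    // Very rough bot detection by User-Agent substring - an example list only.
    private static readonly string[] BotTokens = { "bot", "crawler", "spider", "slurp" };

    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        string userAgent = Request.UserAgent ?? string.Empty;

        foreach (string token in BotTokens)
        {
            if (userAgent.IndexOf(token, StringComparison.OrdinalIgnoreCase) >= 0)
            {
                // Turn session state off for this request only, without touching
                // the Session[...] code in the individual aspx pages.
                Context.SetSessionStateBehavior(SessionStateBehavior.Disabled);
                break;
            }
        }
    }
}
```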
I have a simple form that dumps the selected answers to an XML file. How do I prevent an anonymous user from submitting this form many times?
I am not looking for a totally bulletproof solution, and I have the limitation that I cannot use a database, and therefore no SqlMembershipProvider.
Will some cookie checking work? How do I do this right?
Thank you in advance.
Update: To be more precise, I don't mean only accidental resubmission of the form, but preventing a user who visited the site a week ago from submitting the form again.
In short, you can't do this totally reliably. Since you're not bothered about something bulletproof, as you say, you could either:
a) Persist a cookie to the client machine and check for it the next time someone posts (see the sketch below). Obviously the user can delete cookies, so it's not brilliant.
b) Store the IP address the form was submitted from. The problem with this is that you'll prevent multiple users behind the same IP (e.g. a proxy) from submitting, and the same user could post from different locations.
Neither is particularly good, and if it's possible I'd recommend asking them to enter an email address at the start, storing the post as unconfirmed, emailing a confirmation link, and only making the post official if they click on it. Again, not bulletproof, as it doesn't stop people posting with multiple email accounts, but it's a little better than the options above.
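For option (a), a rough sketch of the cookie check (SaveAnswersToXml and the Message label are placeholders for your existing code):

```csharp
private const string SubmittedCookie = "survey_submitted"; // cookie name is arbitrary

protected void Submit_Click(object sender, EventArgs e)
{
    // Refuse submits from browsers that already carry the cookie.
    if (Request.Cookies[SubmittedCookie] != null)
    {
        Message.Text = "It looks like you have already submitted this form.";
        return;
    }

    SaveAnswersToXml(); // your existing code that writes the answers to the XML file

    // Mark this browser as having submitted, for roughly a year.
    var cookie = new HttpCookie(SubmittedCookie, "1")
    {
        Expires = DateTime.Now.AddYears(1),
        HttpOnly = true
    };
    Response.Cookies.Add(cookie);
}
```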
HTH
Use session variables to keep track of what your user is doing.
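For example (a minimal sketch - note that Session only lasts for the browser session, so on its own it won't stop someone coming back a week later):

```csharp
protected void Submit_Click(object sender, EventArgs e)
{
    // Already submitted during this session? Then ignore the repeat.
    if (Session["FormSubmitted"] != null)
    {
        return;
    }

    // ...write the answers to the XML file...

    Session["FormSubmitted"] = true;
}
```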