I'm getting a "banned" message from a forum site because of my Windows application, which tries to connect through the WebBrowser control with this code:
webbrowser1.Navigate("http://www.xyz.com");
I can connect without any "banned" message from my normal browsers, both IE 8.0 and Firefox 3.6. I cannot find any difference between my application and a normal browser. The "banned" message does not seem to be related to my IP address, cookies, or header info (User-Agent, HTTP-Accept).
Please help: how can this forum site tell whether my request comes from a normal browser or from an application?
Note: Sorry for my English. Thank you for your understanding.
At its most basic you're only sending the following information:
IP Address
Headers
GET data
There must be a difference in one of the above for the site to be able to differentiate between the browser control and your actual browser; there's simply no other difference, unless multiple requests are involved.
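One practical way to find that difference is to capture what each client actually sends. A minimal sketch, assuming a free local port (8080 here is arbitrary): run a small HttpListener, point both your real browser and your WebBrowser control at it, and diff the logged headers.

```csharp
using System;
using System.Net;

class HeaderDump
{
    static void Main()
    {
        // Navigate both the real browser and the WebBrowser control
        // to http://localhost:8080/ and compare the output line by line.
        var listener = new HttpListener();
        listener.Prefixes.Add("http://localhost:8080/");
        listener.Start();
        Console.WriteLine("Waiting on http://localhost:8080/ ...");

        while (true)
        {
            HttpListenerContext ctx = listener.GetContext();
            Console.WriteLine("--- request ---");
            foreach (string key in ctx.Request.Headers.AllKeys)
                Console.WriteLine(key + ": " + ctx.Request.Headers[key]);
            ctx.Response.Close();
        }
    }
}
```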
I think they banned your system IP, not your connection IP: the connection IP can change, but the system IP cannot.
Some applications set their user agent to match Firefox's so that the server cannot tell the request did not come from a browser. They do this because they don't want the server to detect that somebody is going over their site with code rather than just using a browser.
The answer I posted doesn't exactly match your question. By "fake identity" I mean a request made by code inside an application, not by a browser a user operates.
Changing the user agent makes such requests look like normal ones, so the server cannot tell they came from code instead of a browser.
I'm not a network expert, but for one of my projects, I need to ensure that the website I'm sending the request to is alive. Some websites do not respond to ping; basically, their configuration prevents response to ping requests.
I tried Arping instead of pinging the websites, but Arping only works on the local network and will not go beyond the network segment.
I could download all or part of the webpage and check whether the content matches the previous state, but I would rather have one more level of confirmation before downloading the HTML.
Is there any other method that enables the app to get a response back from non-pingable websites outside the network?
Common practice is to use ping, telnet, and tracert as a client against the requested server (here, the website or service you want to reach) and make sure all three commands are allowed from your side. You can also try to access it in your browser.
If it's an API, you can also use Postman to call the service.
Good luck and happy coding :)
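When ICMP is blocked, a plain TCP connect to the service port (80 or 443 for a website) usually still gets an answer. A hedged sketch, where the host, port, and timeout are just examples:

```csharp
using System;
using System.Net.Sockets;

class PortCheck
{
    // Returns true if a TCP connection to host:port succeeds within timeoutMs.
    static bool IsReachable(string host, int port, int timeoutMs)
    {
        try
        {
            using (var client = new TcpClient())
            {
                IAsyncResult ar = client.BeginConnect(host, port, null, null);
                // Wait for the connect to finish, but no longer than timeoutMs.
                return ar.AsyncWaitHandle.WaitOne(timeoutMs) && client.Connected;
            }
        }
        catch (SocketException)
        {
            return false;  // DNS failure, host unreachable, ...
        }
    }

    static void Main()
    {
        Console.WriteLine(IsReachable("www.example.com", 80, 3000)
            ? "port open" : "unreachable");
    }
}
```

This only confirms something is listening on the port, not that the site serves correct content, which is why a content comparison afterwards is still a reasonable second step.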
I'm not sure if this is a bug or not.
If it's a bug, I have already filed it on GitHub:
https://github.com/microsoftgraph/microsoft-graph-docs/issues/3106
If not, I need help.
I'm using Microsoft Graph in my Unity3D game to store the save file on the user's OneDrive account.
Where I can use localhost everything works, but on an Android device I have to use 127.0.0.1 as localhost.
For Dropbox and Google I was able to do it, but Microsoft Graph does not work with http on an IP loopback like 127.0.0.1: it only allows http for localhost, and requires https for 127.0.0.1. Of course HttpListener in C# only works over http on the loopback (and since this is a game, users must be able to run it on Android without strange configurations).
I think this is a bug, because both Dropbox and Google let you use http for 127.0.0.1. But if it's not a bug, how would you solve this problem? Is it even possible to use https on a loopback with HttpListener, or something like it, without any strange configuration?
You should know that if I manually change the redirect URL to http after the login, I can make it work.
The only problem is that I don't want to set up a server to redirect the user multiple times; I want my game to look for the redirect URL response on localhost (which on Android has to be 127.0.0.1).
I need to know whether it's possible to use https with HttpListener (or something similar) for 127.0.0.1, or how to work around this problem without any web application.
var httpListener = new HttpListener();
// anyfreeportonyourpc is any free local port on the device
httpListener.Prefixes.Add("http://127.0.0.1:" + anyfreeportonyourpc + "/");
httpListener.Start();
I am not sure whether what you're describing is a bug or a feature, but I have a few ideas you can try.
Probably the easiest workaround is to use http://readme.localtest.me/ - this way you can use http://localtest.me:80, a public DNS record pointing back to 127.0.0.1. It's meant for testing purposes, but in this case it's quite a clever workaround.
But as you mentioned, the traffic needs to be encrypted, so building on the idea above you could do the same with one of your own domains: get a certificate for it (a free Let's Encrypt one, for example), have the domain point to 127.0.0.1, and use that on your devices.
So say you have a domain called a.pl. Create a subdomain local.a.pl and set its A record to 127.0.0.1 (this record would usually point to a server running the web application, but here we use it for the workaround). Use Let's Encrypt to put a certificate on it so you can use HTTPS; maybe self-signed will work too, I don't know.
Then in your code you do this:
httpListener.Prefixes.Add("https://local.a.pl:" + anyfreeportonyourpc + "/");
Yes, your app will initially need an internet connection to look up the DNS record and cache it on the mobile device. Setting the TTL to the maximum will help keep it cached for when there is no internet. Once it's resolved, you can use that domain for every connection to the local device; it just takes those two seconds of internet to cache the DNS entry.
Another way is to add a hosts-file lookup somehow - not sure how, but a simple local A record such as myapp.local pointing back to 127.0.0.1 in the hosts file. This would be a private-level lookup just for your workaround, but adding such a record may not be straightforward on mobile devices due to excessive abuse in the past.
It also just sounds like the Graph server's binding may only be set to the hostname localhost; check whether you can change or relax that to an IP address. That would solve the root cause of your problem. I know these things sometimes have stupid bindings.
It's a bug. There is simply no way to solve this problem unless Microsoft updates their code.
EDIT. Finally, a simple solution:
- just make a simple blog with WordPress
- activate https
- publish your blog on a free site like Altervista
- install the Insert Headers and Footers plugin
- use this simple code:
<script>
// Runs on the blog post that Microsoft redirects to after login.
// numberofcharinyoururl = length of the blog post URL prefix, so
// stringPartUrl keeps only the OAuth query string to pass along.
if (window.location.href.startsWith("https://yourblogaddress.altervista.org/blog/yourpostpath/")) {
    var stringPartUrl = window.location.href.substring(numberofcharinyoururl);
    window.location.replace("http://localhostor127.0.0.1:yourport/" + stringPartUrl);
}
</script>
After that, you just need to set https://yourblogaddress.altervista.org/blog/yourpostpath/ in the new Azure portal app and change the OpenURL and redirect_uri in your app (but not the Prefixes.Add URL, which should stay your 127.0.0.1 or localhost URL).
Every time users log in, they authenticate with their Microsoft account, which redirects to your https blog address, which in turn redirects to localhost or 127.0.0.1, passing the code variable to your app, which can finally receive the auth code to upload/download files.
The main drawback is that your blog has to be online and you can only support one port (since you redirect exactly once). The best part is that you don't need to change the desktop implementation (the new portal allows localhost with plain http again) and you don't need a paid domain for any of this.
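On the app side, completing the loop only requires reading the code query parameter from the redirected request. A minimal sketch, where the port (50000) is a placeholder for whatever free port you registered:

```csharp
using System;
using System.Net;

class AuthCodeListener
{
    // Extracts the "code" query parameter from the redirect URL, or null.
    static string ExtractAuthCode(Uri uri)
    {
        foreach (string pair in uri.Query.TrimStart('?').Split('&'))
        {
            string[] kv = pair.Split('=');
            if (kv.Length == 2 && kv[0] == "code")
                return Uri.UnescapeDataString(kv[1]);
        }
        return null;
    }

    static void Main()
    {
        var listener = new HttpListener();
        listener.Prefixes.Add("http://127.0.0.1:50000/");  // port is an example
        listener.Start();

        // Blocks until the blog script redirects the browser back to us.
        HttpListenerContext ctx = listener.GetContext();
        string authCode = ExtractAuthCode(ctx.Request.Url);
        Console.WriteLine("auth code: " + authCode);
        ctx.Response.Close();
    }
}
```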
I have a project that contains two pages: test1.aspx and test2.aspx. From test1.aspx I want to manually request test2.aspx and get the HTML out of it. I could do this using HttpClient or HttpWebRequest. The problem is that I'm behind a firewall and suspect it won't work. Is there any other way to download the content of the webpage without actually using HttpWebRequest?
Thanks in advance.
I don't really like what you are trying to do ;) Anyway, since your page doesn't seem to be static (.aspx), you must make a request to your web server, whatever method you use (HttpClient or HttpWebRequest).
Usually, a request to the same machine does not pass through the network: if the DNS alias points to the machine's own IP address, a loopback occurs.
In this case:
- if your firewall is somewhere on your network, you don't care about it: the request will not leave your host
- if you mean a software firewall on your machine, it may block the request. You may have to authorize such requests, or force the DNS locally in your hosts file to 127.0.0.1 (which is a true localhost); that works with most firewall software
- if you are on a Windows Server and your site requires authentication, you may have to deal with the Loopback Check
NB: Loopbacks are usually considered a security breach and are not recommended.
You should think about another solution, like AJAX web services, or Web or User controls (as already said), etc.
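If both pages live in the same application, you can skip HTTP entirely and let ASP.NET render the second page in-process, which sidesteps the firewall altogether. A sketch using Server.Execute (the page names come from the question; the class name is an assumption):

```csharp
using System;
using System.IO;
using System.Web.UI;

public partial class Test1 : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Render test2.aspx in-process and capture its output:
        // no HTTP request is made, so no firewall is involved.
        var writer = new StringWriter();
        Server.Execute("test2.aspx", writer);
        string html = writer.ToString();  // the rendered HTML of test2.aspx
    }
}
```

Note that Server.Execute runs the target page within the current request, so test2.aspx must belong to the same ASP.NET application.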
I am working in a multiple-server environment and so have created a management program to start, stop, and open pages on my Tomcat servers.
I want a way to determine from C# whether a server is up at any given moment. I have tried connecting to ports but haven't had any luck. Does anyone know how to do this - poll a port on an IP address to determine whether Tomcat is bound to it?
What you can do is create a Windows service or WinForms app that uses HTTP requests to fetch a specific page on your Tomcat server. This page can, for example, contain the text "server online".
Through the HttpResponse class you can read the HTML content returned by the server.
If the HTML contains an error message, your server is probably down or misconfigured; if it contains the right text, your server is up and running.
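That check comes down to a few lines: request a known page and look for the marker text. A sketch, where the URL and marker string are placeholders you would replace with your own:

```csharp
using System;
using System.Net;

class TomcatCheck
{
    // True if the health page is served and contains the marker text.
    static bool IsServerUp(string url)
    {
        try
        {
            using (var client = new WebClient())
            {
                string body = client.DownloadString(url);
                return body.Contains("server online");
            }
        }
        catch (WebException)
        {
            return false;  // connection refused, timeout, HTTP error, ...
        }
    }

    static void Main()
    {
        Console.WriteLine(IsServerUp("http://myserver:8080/health.html")
            ? "Tomcat is up" : "Tomcat is down");
    }
}
```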
You can also create a program that checks the Windows service status of the Tomcat service.
Note that this only tells you the service is running, not that it actually works the way it is supposed to.
You may have to use JMX, perhaps in connection with a web service.
I'm using the Html Agility Pack and I keep getting this error on certain pages: "The remote server returned an error: (500) Internal Server Error."
I'm not sure what causes it, as I can reach these pages in Firefox without any problems.
I have a feeling the website itself is blocking my requests and not sending a proper response. Is there a way to make my Html Agility Pack call look more like a call coming from Firefox?
I've already set a timer in there so it only sends to the website every 20 seconds.
Is there any other method I can use?
Set a User-Agent similar to a regular browser's. The User-Agent is an HTTP header passed by the HTTP client (the browser) to identify itself to the server.
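If you load pages through HtmlWeb, it exposes a UserAgent property for exactly this. A sketch, assuming the UA string is simply copied from a real browser (the one below is an example Firefox 3.6 string) and the URL is a placeholder:

```csharp
using System;
using HtmlAgilityPack;

class Scraper
{
    static void Main()
    {
        var web = new HtmlWeb();
        // Present a browser-like identity; copy the exact string
        // your own browser sends if the site is picky.
        web.UserAgent =
            "Mozilla/5.0 (Windows NT 6.1; rv:1.9.2) Gecko/20100101 Firefox/3.6";
        HtmlDocument doc = web.Load("http://www.example.com/page.html");
        Console.WriteLine(doc.DocumentNode.SelectSingleNode("//title").InnerText);
    }
}
```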
There are a lot of ways servers can detect scraping, and it's really just an arms race between the scraper and the scrapee(?), depending on how badly one or the other wants to access/protect the data. Some of the things that help you go undetected are:
- Make sure all HTTP headers sent are the same as a normal browser's, especially the User-Agent and the URL referrer.
- Download all images and CSS scripts as a normal browser would, in the order a browser would.
- Make sure any cookies that are set are sent with each subsequent request.
- Make sure requests are throttled according to the site's robots.txt.
- Make sure you aren't following any no-follow links, because the server could be setting up a honeypot and stop serving requests from your IP.
- Get a bunch of proxy servers to vary your IP address.
- Watch whether the site has started sending you captchas because it thinks you are a robot.
Again, the list could go on depending on how sophisticated the server setup is.
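Several of the points above (browser-like headers, a referrer, persistent cookies) can be combined in one request. A sketch with HttpWebRequest, where every header value and URL is just an example:

```csharp
using System;
using System.IO;
using System.Net;

class BrowserLikeRequest
{
    // Fetches a page while presenting browser-like headers and reusing cookies.
    static string Fetch(string url, CookieContainer cookies)
    {
        var req = (HttpWebRequest)WebRequest.Create(url);
        req.UserAgent = "Mozilla/5.0 (Windows NT 6.1) Gecko/20100101 Firefox/3.6";
        req.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
        req.Referer = "http://www.example.com/";   // look like a follow-on click
        req.CookieContainer = cookies;             // carry cookies between requests

        using (var resp = (HttpWebResponse)req.GetResponse())
        using (var reader = new StreamReader(resp.GetResponseStream()))
            return reader.ReadToEnd();
    }

    static void Main()
    {
        var cookies = new CookieContainer();       // shared across all requests
        string html = Fetch("http://www.example.com/", cookies);
        Console.WriteLine(html.Length + " bytes received");
    }
}
```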