I would like to know why my HTTP request returns a 500 Internal Server Error in the response.
I am using this C# code:
string postData = "receiver=" + b.ToString() + "&PHPSESSID=" + _SESSIONID;
HttpWebRequest req = (HttpWebRequest)WebRequest.Create("http://svabyss.66ghz.com/getmsg.php");
req.ContentLength = Encoding.UTF8.GetByteCount(postData); // length of the body in bytes, not characters
req.ContentType = "application/x-www-form-urlencoded";
req.Method = "POST";
req.KeepAlive = true;
req.CookieContainer = new CookieContainer();
req.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2.13) Gecko/20101203 Firefox/3.6.13";
req.Headers[HttpRequestHeader.Pragma] = "cache";
req.UseDefaultCredentials = true;
req.Credentials = CredentialCache.DefaultNetworkCredentials;
req.PreAuthenticate = true;
req.Proxy = new WebProxy("http://svabyss.66ghz.com:80", true);
using (StreamWriter writer = new StreamWriter(req.GetRequestStream()))
{
    writer.Write(postData); // same body the ContentLength was computed from
}
try
{
HttpWebResponse response = (HttpWebResponse)req.GetResponse();
StreamReader reader = new StreamReader(response.GetResponseStream());
OperationCompleted(new SVWorkerEventArgs("GET|" + b.ToString(), reader.ReadToEnd()));
response.Close();
reader.Dispose();
}
catch (WebException ex)
{
StreamReader str = new StreamReader(ex.Response.GetResponseStream());
string err = str.ReadToEnd();
}
I don't know which header I missed. Any help would be great!
Thanks.
PROBLEM FIXED
I received the reply via the error message returned by the server. The error message is actually the result I want.
Now I don't care what the server replies with. Whether it's a 404 or a 505, the error message returned is what I want.
Thanks guys :)
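For reference, a minimal sketch that treats the normal response body and the error response body the same way (it reuses req, b, OperationCompleted and SVWorkerEventArgs from the code above):
string body = null;
try
{
    using (HttpWebResponse response = (HttpWebResponse)req.GetResponse())
    using (StreamReader reader = new StreamReader(response.GetResponseStream()))
    {
        body = reader.ReadToEnd();
    }
}
catch (WebException ex)
{
    if (ex.Response == null) throw; // no HTTP reply at all (DNS failure, timeout, ...)
    using (StreamReader reader = new StreamReader(ex.Response.GetResponseStream()))
    {
        body = reader.ReadToEnd(); // the 4xx/5xx body is the result we actually want
    }
}
OperationCompleted(new SVWorkerEventArgs("GET|" + b.ToString(), body));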
The 500 error code is returned because the server (not your client) experienced an internal error. To find out what the error is, you would need access to the server's logs, or you would need to ask someone who has access.
It is entirely possible that the error is caused by something in your code, but without knowing what the error is on the other end, there's no way for you to be sure. It could just as easily be something in their code, a connection to their database, etc.
I suggest you make the request with a browser while running Fiddler:
http://www.fiddler2.com/fiddler2/
Look at the headers the browser sends there and compare them with yours.
Related
So I am currently trying to log into my account on a website using WebRequest.
I have been reading about it, and at this point I want to work from an example and learn by trial and error.
This is the example I am using
Login to website, via C#
So when I try to execute my code it throws an unhandled exception, this one:
System.Net.WebException: 'The remote server returned an error: (404)
Not Found.'
I tried stepping through the code and I THINK it might be that it's trying to POST somewhere where it can't.
I wanted to fix this before moving onto getting a confirmation that it successfully logged in.
I changed the username and password to dummy text for the sake of this question.
What did I do wrong here, and what's the most logical way to fix this issue?
Thanks in advance.
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
string formUrl = "https://secure.runescape.com/m=weblogin/login.ws"; // NOTE: This is the URL the form POSTs to, not the URL of the form (you can find it in the "action" attribute of the HTML's form tag)
string formParams = string.Format("login-username={0}&login-password={1}", "myUsername", "password");
string cookieHeader;
WebRequest req = WebRequest.Create(formUrl);
req.ContentType = "application/x-www-form-urlencoded";
req.Method = "POST";
byte[] bytes = Encoding.ASCII.GetBytes(formParams);
req.ContentLength = bytes.Length;
using (Stream os = req.GetRequestStream())
{
os.Write(bytes, 0, bytes.Length);
}
WebResponse resp = req.GetResponse();
cookieHeader = resp.Headers["Set-cookie"];
When you scrape a website, you have to make sure you mimic everything that happens. That includes any client-side state (cookies) that is set before the form is POSTed. As most sites don't like to be scraped or steered by bots, they are often rather picky about the payload. The same is true for the site you're trying to control.
Three important things you have missed:
You didn't start with an initial GET, so you don't have the required cookies in a CookieContainer.
On the POST you missed a header (Referer) and three hidden fields from the form.
The form fields are named username and password (as can be seen in the name attribute of the input tags); you used the ids instead.
Fixing those omissions will result in the following code:
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
string useragent = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/59.0.3071.115 Safari/537.36";
// capture cookies, this is important!
var cookies = new CookieContainer();
// do a GET first, so you have the initial cookies needed
string loginUrl = "https://secure.runescape.com/m=weblogin/loginform.ws?mod=www&ssl=0&dest=community";
// HttpWebRequest
var reqLogin = (HttpWebRequest) WebRequest.Create(loginUrl);
// minimal needed settings
reqLogin.UserAgent = useragent;
reqLogin.CookieContainer = cookies;
reqLogin.Method = "GET";
var loginResp = reqLogin.GetResponse();
//loginResp.Dump(); // LinqPad testing
string formUrl = "https://secure.runescape.com/m=weblogin/login.ws"; // NOTE: This is the URL the form POSTs to, not the URL of the form (you can find it in the "action" attribute of the HTML's form tag)
// in the HTML the form has 3 more hidden fields; those are needed as well
string formParams = string.Format("username={0}&password={1}&mod=www&ssl=0&dest=community", "myUsername", "password");
string cookieHeader;
// notice the cast to HttpWebRequest
var req = (HttpWebRequest) WebRequest.Create(formUrl);
// put the earlier cookies back on the request
req.CookieContainer = cookies;
// the Referer header is mandatory; without it the request times out
req.Referer = "https://secure.runescape.com/m=weblogin/loginform.ws?mod=www&ssl=0&dest=community";
req.UserAgent = useragent;
req.ContentType = "application/x-www-form-urlencoded";
req.Method = "POST";
byte[] bytes = Encoding.ASCII.GetBytes(formParams);
req.ContentLength = bytes.Length;
using (Stream os = req.GetRequestStream())
{
os.Write(bytes, 0, bytes.Length);
}
WebResponse resp = req.GetResponse();
cookieHeader = resp.Headers["Set-cookie"];
This returns success for me. It is up to you to parse the resulting HTML and plan your next steps.
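For example, a minimal sketch of grabbing that HTML for inspection; the "Logout" marker below is only an assumption about what a logged-in page might contain:
string html;
using (var reader = new StreamReader(resp.GetResponseStream()))
{
    html = reader.ReadToEnd();
}
// Hypothetical check: look for something that only appears once you are logged in.
bool loggedIn = html.Contains("Logout");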
After hours of trying, I decided to ask for help.
I am trying to send a post request to a website (https://www.instagram.com/)
More exactly here: "https://instagram.com/web/likes/xxxxxxxxxxxxxxx/like/"
The problem doesn't always appear; for example, I can successfully send the request 4-5 times, then I get this error 3-4 times, then I can send it again, and so on.
That's why I can't understand what's causing the problem.
My code:
byte[] bytes = Encoding.UTF8.GetBytes(""); // empty POST body
HttpWebRequest postReq = (HttpWebRequest)WebRequest.Create("https://instagram.com/web/likes/" + ID + "/like/");
WebHeaderCollection postHeaders = postReq.Headers;
postReq.Method = "POST";
postReq.Host = "instagram.com";
postReq.UserAgent = "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:38.0) Gecko/20100101 Firefox/38.0";
postReq.Accept = "*/*";
postHeaders.Add("Accept-Language", "it-IT,it;q=0.8,en-US;q=0.5,en;q=0.3");
postHeaders.Add("Accept-Encoding", "gzip, deflate");
postHeaders.Add("X-Instagram-AJAX", "1");
postHeaders.Add("X-CSRFToken", CSRF);
postHeaders.Add("X-Requested-With", "XMLHttpRequest");
postReq.Referer = "https://instagram.com/";
postReq.CookieContainer = cookies;
postReq.KeepAlive = true;
postHeaders.Add("Pragma", "no-cache");
postHeaders.Add("Cache-Control", "no-cache");
postReq.ContentLength = bytes.Length;
Stream postStream = postReq.GetRequestStream();
postStream.Write(bytes, 0, bytes.Length);
postStream.Close();
HttpWebResponse postResponse;
postResponse = (HttpWebResponse)postReq.GetResponse();
StreamReader reader = new StreamReader(postResponse.GetResponseStream());
ID is a 26-character number, different for each photo.
I got the headers using Live HTTP Headers, and they are the same as in my request.
This is the full error description:
"An unhandled exception of type 'System.Net.WebException' occurred in System.dll
Additional information: The remote server returned an error: (403) Forbidden."
Any ideas? Thanks in advance for any help :).
The following function is written in C#, and it's used for logging in to a website (using the POST method and setting up cookies).
The problem is that if my first login uses a bad username or password, I cannot log in again until I restart the program. The function is executed once, and if the login information is wrong, it ends after a few minutes with this error:
Stream newStream = getRequest.GetRequestStream(); // open connection
WebException was unhandled by user code:
timeout expired
I would like to ask for a little help to find out what is wrong. In my opinion, the mistake could be in the use of the CookieCollection: I would like to delete all existing cookies in case of an unsuccessful login, but I cannot figure out how. I'm using this solution:
private bool Login(string name, string password)
{
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://.../login-page/");
request.CookieContainer = new CookieContainer();
request.CookieContainer.Add(cookies);
//Get the response from the server and save the cookies from the first request..
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
cookies = response.Cookies;
string sourceCode;
string getUrl = "http://.../login/";
string postData = String.Format("username={0}&password={1}", name, password);
HttpWebRequest getRequest = (HttpWebRequest)WebRequest.Create(getUrl);
getRequest.CookieContainer = new CookieContainer();
getRequest.CookieContainer.Add(cookies); //recover cookies First request
getRequest.Method = WebRequestMethods.Http.Post;
getRequest.UserAgent = "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/535.2 (KHTML, like Gecko) Chrome/15.0.874.121 Safari/535.2";
getRequest.AllowWriteStreamBuffering = true;
getRequest.ProtocolVersion = HttpVersion.Version11;
getRequest.AllowAutoRedirect = true;
getRequest.ContentType = "application/x-www-form-urlencoded";
byte[] byteArray = Encoding.ASCII.GetBytes(postData);
getRequest.ContentLength = byteArray.Length;
Stream newStream = getRequest.GetRequestStream(); // open connection
newStream.Write(byteArray, 0, byteArray.Length); // Send the data.
newStream.Close();
HttpWebResponse getResponse = (HttpWebResponse)getRequest.GetResponse();
cookies = getResponse.Cookies;
using (StreamReader sr = new StreamReader(getResponse.GetResponseStream()))
{
sourceCode = sr.ReadToEnd();
}
if (sourceCode.Contains("<div id='login'>Přihlášení se zdařilo</div>")) // Czech for "Login succeeded"
{
return true;
}
return false;
}
Code from: https://stackoverflow.com/a/8542205/2715725
I would really appreciate any kind of help. I'm not that into C# and I have trouble putting this kind of code together. I have been trying to solve this for days, but even Google hasn't helped me; I've looked for the solution everywhere. Thank you!
It looks like the fault most likely lies in the external site to which you are posting the username and password. Are you sure it displays that exact message both when you authenticate correctly the first time, and when you manage to authenticate after a failed logon? Try both actions directly on that site to find out.
PROBLEM SOLVED
I added this to my function:
request.KeepAlive = false;
response.Close();
getRequest.KeepAlive = false;
getResponse.Close();
Add this to your function:
request.KeepAlive = false;
response.Close();
getRequest.KeepAlive = false;
getResponse.Close();
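An equivalent way to make sure both responses are released even when the login fails is to wrap them in using blocks. A sketch based on the function above, with the same variable names:
request.KeepAlive = false;
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
{
    cookies = response.Cookies;
}
// ... build getRequest exactly as before ...
getRequest.KeepAlive = false;
using (HttpWebResponse getResponse = (HttpWebResponse)getRequest.GetResponse())
using (StreamReader sr = new StreamReader(getResponse.GetResponseStream()))
{
    cookies = getResponse.Cookies;
    sourceCode = sr.ReadToEnd();
}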
I'm trying to make a small application for myself, and I found this question:
How to upload video on Dailymotion with c# ?? Is somebody has a complete code?
I tried everything, but publishing is not working. I used Fiddler, but I can't find the error.
Here is the code
var request = WebRequest.Create("https://api.dailymotion.com/me/videos?url=" + Uri.EscapeUriString(uploadResponse.url));
request.Method = "POST";
request.ContentType = "application/x-www-form-urlencoded";
request.Headers.Add("Authorization", "OAuth " + accessToken);
var requestString = "title=test 123&channel=Funny&tags=Humor&description=Testing testing&published=true";
var requestBytes = Encoding.UTF8.GetBytes(requestString);
var requestStream = request.GetRequestStream();
requestStream.Write(requestBytes, 0, requestBytes.Length);
var response = request.GetResponse();
var responseStream = response.GetResponseStream();
string responseString;
using (var reader = new StreamReader(responseStream))
{
responseString = reader.ReadToEnd();
}
When it reaches request.GetResponse(), it throws the error. So what is the problem here?
I believe you need to get rid of the "me" in the url as you're using OAuth instead of basic authentication, like this:
"https://api.dailymotion.com/videos?url="
Instead of:
"https://api.dailymotion.com/me/videos?url="
At least from a quick scan, that looks like the problem. I wrote an auto-publisher for a client a year ago and it didn't use the "me" in the URL. My credentials are invalid now, so unfortunately I can't test it. It seems to be a bug in the answer you linked.
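Applied to the code in the question, that would be just a change to the URL. A sketch only; I haven't verified it against the current Dailymotion API:
// Same request as in the question, but without /me in the path when authenticating via OAuth.
var request = WebRequest.Create("https://api.dailymotion.com/videos?url=" + Uri.EscapeUriString(uploadResponse.url));
request.Method = "POST";
request.ContentType = "application/x-www-form-urlencoded";
request.Headers.Add("Authorization", "OAuth " + accessToken);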
If you can read other languages, I found it helpful just going through their SDKs and converting the code:
http://www.dailymotion.com/doc/api/sdk-php.html
https://github.com/dailymotion/dailymotion-sdk-php/blob/master/Dailymotion.php
I have a URL like:
http://www.matweb.com/search/DataSheet.aspx?MatGUID=849e2916ab1541be9ff6a17b78f95c82
I want to download the source of that page using this code:
private static string urlTemplate = @"http://www.matweb.com/search/DataSheet.aspx?MatGUID=";
static string GetSource(string guid)
{
try
{
Uri url = new Uri(urlTemplate + guid);
HttpWebRequest webRequest = (HttpWebRequest)WebRequest.Create(url);
webRequest.Method = "GET";
HttpWebResponse webResponse = (HttpWebResponse)webRequest.GetResponse();
Stream responseStream = webResponse.GetResponseStream();
StreamReader responseStreamReader = new StreamReader(responseStream);
String result = responseStreamReader.ReadToEnd();
return result;
}
catch (Exception) // swallow any error and return null
{
return null;
}
}
When I do so I get:
You do not seem to have cookies enabled. MatWeb Requires cookies to be enabled.
OK, that I understand, so I added these lines:
CookieContainer cc = new CookieContainer();
webRequest.CookieContainer = cc;
I got:
Your IP Address has been restricted due to excessive use. The problem may be compounded when an IP address may be shared by many people in a company or through an internet service provider. We apologize for any inconvenience.
I can understand this, but I don't get that message when I visit the page in a web browser. What can I do to get the source code? Some cookies or HTTP headers?
It probably doesn't like your UserAgent. Try this:
webRequest.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2.13) Gecko/20101203 Firefox/3.6.13 (.NET CLR 3.5.30729)"; //maybe substitute your own in here
It looks like you're doing something that the company doesn't like, if you got an "excessive use" response.
You are downloading pages too fast.
When you use a browser you might fetch at most about one page per second. With an application you can fetch several pages per second, and that's probably what their web server is detecting. Hence the excessive usage.
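A sketch of one way to throttle the calls to GetSource from the question; the guids collection and the one-second pause are only illustrative assumptions:
// Hypothetical driver loop: pause between downloads so the server does not flag excessive use.
foreach (string guid in guids)
{
    string source = GetSource(guid);
    // ... process source here ...
    System.Threading.Thread.Sleep(1000); // arbitrary 1-second pause between requests
}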