Based on this source code, I'm not able to retrieve the data from the API into an XDocument.
I get the error message:
{"The remote server returned an error: (400) Bad Request."}
Question:
What should I do?
XDocument xml = XDocument.Parse(new WebClient().DownloadString("http://api.arbetsformedlingen.se/af/v0/platsannonser/matchning?lanid=1&kommunid=180&yrkesid=2419&1&antalrader=10000"));
You need to send HTTP headers:
using (WebClient client = new WebClient())
{
    client.Headers.Add("Accept-Language", "en-US");
    client.Headers.Add("Accept", "application/xml");
    client.Headers.Add("User-Agent", "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Trident/5.0)");
    XDocument xml = XDocument.Parse(client.DownloadString("http://api.arbetsformedlingen.se/af/v0/platsannonser/matchning?lanid=1&kommunid=180&yrkesid=2419&1&antalrader=10000"));
}
Related
In my C# application, the method below calls a service and returns data. A Fortify scan flagged the return statement under the category "Server-Side Request Forgery". No parameter is appended to the URL in the method; the uri is used as-is to fetch the data. I am completely new to this.
using (var wc = new WebClient())
{
    wc.Encoding = System.Text.Encoding.ASCII;
    wc.Headers["User-Agent"] = "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; .NET CLR 2.0.50727; .NET4.0C; .NET4.0E)";
    return JsonObject.GetDynamicJsonObject(wc.DownloadString(uri.ToString())); // issue raised here
}
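Fortify raises SSRF here because the URI is fetched without any validation, so the scanner cannot tell whether an attacker could steer the request to an arbitrary host. A common mitigation is to check the target against an allowlist before making the request. A minimal sketch, assuming a hypothetical allowlist (the host name `api.example.com` is a placeholder, not from the question):

```csharp
using System;
using System.Collections.Generic;

static class UriValidator
{
    // Hypothetical allowlist of hosts this application is permitted to call.
    static readonly HashSet<string> AllowedHosts =
        new HashSet<string>(StringComparer.OrdinalIgnoreCase)
        {
            "api.example.com"
        };

    public static bool IsAllowed(Uri uri)
    {
        // Only absolute HTTP(S) URIs pointing at known hosts pass the check.
        return uri.IsAbsoluteUri
            && (uri.Scheme == Uri.UriSchemeHttp || uri.Scheme == Uri.UriSchemeHttps)
            && AllowedHosts.Contains(uri.Host);
    }
}
```

Calling `UriValidator.IsAllowed(uri)` before `wc.DownloadString(...)` and refusing disallowed targets is the kind of guard that typically satisfies this finding, though whether Fortify clears it depends on your ruleset.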
I currently have this code that is supposed to grab the HTML source of the website. Specifically, I am telling it to read the source of 4chan. It WILL get the source code for a board, such as /pol/ or /news/, but it will NOT get the source code for specific threads. It throws the error: [System.Net.WebException: 'The remote server returned an error: (403) Forbidden.']
Here is the code I am working with.
public string GetSource(string url)
{
    WebClient client = new WebClient();
    ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12; // tried with and without this
    client.Headers.Add("user-agent", "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.1; WOW64; Trident/6.0;)");
    try
    {
        return client.DownloadString(url);
    }
    catch
    {
        Error(2); // error code 2
    }
    return "";
}
It will download the source of "https://boards.4chan.org/pol" for example.
It will not download the source of "https://boards.4chan.org/pol/thread/#"
I am completely lost as to how to proceed. I have a "user-agent" tag, and it works sometimes, so I don't know what the problem is. Any help would be appreciated. Thanks.
All I'm trying to do is create a program that gets a web response from Nike's upcoming-shoes page, but I keep running into an error saying this is forbidden. No other threads on this topic have been of use to me. Is there anything I can do, or am I just screwed? This is the code:
WebRequest request = WebRequest.Create("https://www.nike.com/launch/?s=upcoming");
WebResponse response = request.GetResponse();
and this is the error:
System.Net.WebException: 'The remote server returned an error: (403) Forbidden.'
Seems like a header issue, try this:
WebClient client = new WebClient();
client.Headers.Add("user-agent", "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)");
client.Headers.Add("Content-Type", "application/zip, application/octet-stream");
client.Headers.Add("Referer", "http://whatevs");
client.Headers.Add("Accept", "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8");
String someStuff = client.DownloadString("https://www.hassanhabib.com");
Console.WriteLine(someStuff);
Console.Read();
Removed the Accept-Encoding line, should be fine now.
Code I tried:
string contents = string.Empty;
using (var wc = new System.Net.WebClient())
{
    contents = wc.DownloadString("http://www.bizjournals.com/albany/blog/health-care/2015/10/what-this-local-bank-did-to-control-health-care.html");
}
but it's throwing this error:
The remote server returned an error: (416) Requested Range Not
Satisfiable
It appears that some webservers may return a 416 if your client does not send a User-Agent header. Try adding the header like this:
wc.Headers.Add("User-Agent", "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.0.3705)");
This page uses frames, but your browser doesn't support them.
This error occurs when I am trying to get information from our SMS gateway site.
The code is as follows:
WebClient client = new WebClient();
client.Headers.Add("user-agent", "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.0.3705;)");
string baseurl = "http://smsoutbox.in/?user=test&password=test#123";
Stream data = client.OpenRead(baseurl);
StreamReader reader = new StreamReader(data);
string s = reader.ReadToEnd();
data.Close();
reader.Close();
I am sending a request to the http://smsoutbox.in page, which asks for a username and password; if they are valid, it shows my gateway balance on the same page in a frame.
But when I get the response, I found this error:
This page uses frames, but your browser doesn't support them. (This line appears instead of the balance in the response stream.)
How can I solve this?
View the source of the page yourself and look at the frames being used. Open each one separately to determine which URL you need to retrieve.
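If you want to find those frame URLs programmatically, one option is to scan the downloaded HTML for the src attributes of frame and iframe tags. A rough sketch using a regex (this is fragile against unusual markup; an HTML parser would be more robust):

```csharp
using System.Collections.Generic;
using System.Text.RegularExpressions;

static class FrameFinder
{
    // Returns the src attribute of every <frame>/<iframe> tag in the HTML,
    // in document order.
    public static List<string> GetFrameSources(string html)
    {
        var sources = new List<string>();
        var pattern = new Regex(
            @"<i?frame[^>]*\bsrc\s*=\s*[""']([^""']+)[""']",
            RegexOptions.IgnoreCase);
        foreach (Match m in pattern.Matches(html))
        {
            sources.Add(m.Groups[1].Value);
        }
        return sources;
    }
}
```

Each returned src can then be fetched with the same WebClient (resolving relative URLs against the base URL first) to get the content that actually holds the balance.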
The problem might be that WebClient already submits a User-Agent, and by adding another "user-agent" header you're not replacing the original one.
Use this modified WebClient that internally uses HttpWebRequest's UserAgent property:
http://codehelp.smartdev.eu/2009/05/08/improve-webclient-by-adding-useragent-and-cookies-to-your-requests/
Alternatively it should work to correctly modify the UserAgent like this:
client.Headers["user-agent"] = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.0.3705;)";
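The approach from the linked article can be sketched as a WebClient subclass that overrides GetWebRequest and sets the UserAgent property on the underlying HttpWebRequest, so the header is applied consistently to every request. A minimal version (the class name and default agent string are illustrative, not from the link):

```csharp
using System;
using System.Net;

// WebClient that applies a configurable User-Agent to every request
// by setting it on the underlying HttpWebRequest.
class UserAgentWebClient : WebClient
{
    public string UserAgent { get; set; } =
        "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.0.3705;)";

    protected override WebRequest GetWebRequest(Uri address)
    {
        var request = base.GetWebRequest(address);
        var http = request as HttpWebRequest;
        if (http != null)
        {
            http.UserAgent = UserAgent;
        }
        return request;
    }
}
```

Usage is the same as a plain WebClient: `new UserAgentWebClient { UserAgent = "..." }.DownloadString(url)`, and the override takes effect on each call.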