I am building a C# client for a CGI service (I'm not sure exactly what it's called).
It accepts a chunk of XML and returns a response. I have tested it straight in Firefox and it works (see below).
Now I'm not sure how to do this from C# code. Does anyone have a helpful snippet? I can't imagine it would be that difficult.
http://www.travelcommunications.co.uk/cgi-bin/d3web_gzip.ssh?%3CTCOML%20version=%22NEWFORMAT%22%3E%3CTransferOnly%3E%3CAvailability%3E%3CRequest%3E%3CAgentCode%3ETEST%3C/AgentCode%3E%3CAgentType%3ETA%3C/AgentType%3E%3CDeparturePointCode%3EALC%3C/DeparturePointCode%3E%3CDeparturePointType%3EAIRPORT%3C/DeparturePointType%3E%3CArrivalPointCode%3EBEN%3C/ArrivalPointCode%3E%3CArrivalPointType%3ERESORT%3C/ArrivalPointType%3E%3CSectorType%3ERETURN%3C/SectorType%3E%3CArrDate%3E10.10.10%3C/ArrDate%3E%3CArrTime%3E10:00%3C/ArrTime%3E%3CRetDate%3E17.10.10%3C/RetDate%3E%3CRetTime%3E10:00%3C/RetTime%3E%3CBrochure%3E001%3C/Brochure%3E%3CAdults%3E2%3C/Adults%3E%3CChildren%3E0%3C/Children%3E%3CInfants%3E0%3C/Infants%3E%3CCurrencyCode%3EUKL%3C/CurrencyCode%3E%3C/Request%3E%3C/Availability%3E%3C/TransferOnly%3E%3C/TCOML%3E
You're looking for the WebClient class.
For example: (2nd EDIT: With GZIP; this code is tested and actually works)
string response;
using (var client = new WebClient())
{
    // Download the raw (gzipped) bytes, then decompress them into a string.
    byte[] bytes = client.DownloadData(url);
    using (var reader = new StreamReader(new GZipStream(new MemoryStream(bytes), CompressionMode.Decompress)))
    {
        response = reader.ReadToEnd();
    }
}
However, if the URL returns raw XML, you can also load the XML directly from the URL, like this:
var doc = XDocument.Load(url);
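For completeness, the long query string in the URL above is just the XML request percent-encoded. A rough sketch of building such a URL before passing it to the code above (the payload here is abbreviated; the real request is the full TCOML document shown in the question):
string xmlPayload = "<TCOML version=\"NEWFORMAT\"><TransferOnly>...</TransferOnly></TCOML>";
string url = "http://www.travelcommunications.co.uk/cgi-bin/d3web_gzip.ssh?"
           + Uri.EscapeDataString(xmlPayload); // percent-encodes <, >, quotes, etc.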
If anyone loads this url https://de.visiblealpha.com/links/80488d55-ae41-4def-9452-bae3ac2e2b06 in a browser, an Excel file starts downloading. But when I invoke the same url with HttpWebRequest, no Excel file is downloaded. This is the code I tried:
string address = "https://de.visiblealpha.com/links/80488d55-ae41-4def-9452-bae3ac2e2b06";
using (WebClient client = new WebClient())
{
    client.DownloadString(address);
}
I also tried this:
string url = "https://de.visiblealpha.com/links/80488d55-ae41-4def-9452-bae3ac2e2b06";
WebRequest request = HttpWebRequest.Create(url);
WebResponse response = request.GetResponse();
StreamReader reader = new StreamReader(response.GetResponseStream());
string responseText = reader.ReadToEnd();
but I failed to reach my goal. The code executes successfully, but no Excel file starts downloading, which is what I am trying to achieve.
When I load the same url https://de.visiblealpha.com/links/80488d55-ae41-4def-9452-bae3ac2e2b06 into a WebBrowser control I see the same problem: no Excel file starts downloading. Here is the code I tried:
webBrowser1.Navigate("https://de.visiblealpha.com/links/80488d55-ae41-4def-9452-bae3ac2e2b06");
webBrowser1.ScriptErrorsSuppressed = true;
I just do not understand why the Excel file does not download when I invoke the very same url.
So please, can someone tell me what I need to do so that the moment I execute the url, the Excel file starts downloading on the client PC?
Please share a working code example.
DownloadString returns the content into a variable, i.e. in memory. No file gets saved on the system. If saving a file is what you intended, there's a small change you need to make in your code:
string address = "https://de.visiblealpha.com/links/80488d55-ae41-4def-9452-bae3ac2e2b06";
using (WebClient client = new WebClient())
{
    string contents = client.DownloadString(address);
}
The variable "contents" will contain html of the URL in your question. If you want it as a file, then I you need to use DownloadFile method instead. The spreadsheet itself is a different URL.
There's an example at the end of this documentation.
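A minimal sketch of DownloadFile, assuming you have resolved the direct URL of the spreadsheet (the URL and path below are placeholders, not the real file location):
string fileUrl = "https://example.com/path/to/report.xlsx"; // placeholder: the direct spreadsheet URL
string localPath = @"C:\temp\report.xlsx";                  // where to save it on the client PC
using (WebClient client = new WebClient())
{
    // Saves the response body straight to disk instead of returning it as a string.
    client.DownloadFile(fileUrl, localPath);
}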
I have a web application written in ASP.NET. All is working okay, except that I would like to compress the data being returned. The data is basically a List of custom models. Currently I do something like:
string json_string = new JavaScriptSerializer().Serialize(my_models);
using (var output = new MemoryStream())
{
    using (var compressor = new Ionic.Zlib.GZipStream(output, Ionic.Zlib.CompressionMode.Compress, Ionic.Zlib.CompressionLevel.BestCompression))
    {
        compressor.Write(Encoding.ASCII.GetBytes(json_string), 0, json_string.Length);
    }
}
I then proceed with the following:
HttpResponseMessage json = Request.CreateResponse(HttpStatusCode.OK, json_string);
json.Content.Headers.Add("content-encoding", "gzip");
This causes an application error. In Chrome (through the console), I see the following message:
ERR_CONTENT_DECODING_FAILED
Where am I going wrong here? Thanks!
In your code you wrote the compressed content into your MemoryStream, but json_string is still the original, uncompressed JSON. You then sent that original string as the response while marking it as gzip-encoded.
So Chrome tries to gzip-decode a plain JSON string and fails.
You need to send the bytes that ended up in output as the response body, instead of the original json_string.
Alternatively, you can configure IIS to compress responses for you and save yourself the trouble.
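A rough sketch of that fix, reusing the names from the question (this assumes a Web API action method returning HttpResponseMessage; it is not tested against your exact setup):
byte[] compressedBytes;
string json_string = new JavaScriptSerializer().Serialize(my_models);
using (var output = new MemoryStream())
{
    using (var compressor = new Ionic.Zlib.GZipStream(output, Ionic.Zlib.CompressionMode.Compress, Ionic.Zlib.CompressionLevel.BestCompression))
    {
        byte[] raw = Encoding.UTF8.GetBytes(json_string);
        compressor.Write(raw, 0, raw.Length);
    }
    // Read the buffer only after the GZipStream has been closed, so everything is flushed.
    compressedBytes = output.ToArray();
}

// Send the compressed bytes, not the original string, and label them as gzip.
HttpResponseMessage json = Request.CreateResponse(HttpStatusCode.OK);
json.Content = new ByteArrayContent(compressedBytes);
json.Content.Headers.ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue("application/json");
json.Content.Headers.ContentEncoding.Add("gzip");
return json;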
Dumb question, but anyway... I am wondering how one can load an XML file from a url (in a Universal project).
It was quite easy with WPF:
XmlDocument xml = new XmlDocument();
xml.Load(url);
but this isn't working for me here and I really cannot find a way around it, which is annoying.
Thanks in advance!
You can use HttpClient to do the request. (It's the Microsoft.Net.Http Nuget package.) Once you have a stream from it, there's an overload on XmlDocument.Load which accepts a stream. If you need it parsed as an object, skip XmlDocument.Load and use XmlSerializer instead.
using (HttpClientHandler hHandler = new HttpClientHandler())
using (HttpClient hClient = new HttpClient(hHandler))
{
    HttpResponseMessage response = await hClient.GetAsync(URL);
    System.IO.Stream oStrm = await response.Content.ReadAsStreamAsync();
    XmlSerializer oSer = new XmlSerializer(typeof(T));
    return (T)oSer.Deserialize(oStrm);
}
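If you just want the document rather than a typed object, here is a short sketch of the XmlDocument route mentioned above (this assumes System.Xml's XmlDocument is available in your project profile; the names are illustrative):
using (HttpClient client = new HttpClient())
using (System.IO.Stream stream = await client.GetStreamAsync(url))
{
    XmlDocument doc = new XmlDocument();
    doc.Load(stream); // the overload that accepts a stream
    // ... work with doc ...
}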
I have a javascript file on a remote server, and when I fetch it with HttpWebRequest it returns some weird characters.
The url is http://goo.gl/0Ug5QI
Is this some kind of compressed content?
static string GetScriptSource(string _url)
{
    string _retValue = string.Empty;
    HttpWebRequest hwr = (HttpWebRequest)WebRequest.Create(_url);
    hwr.Method = "GET";
    HttpWebResponse res = (HttpWebResponse)hwr.GetResponse();
    StreamReader sr = new StreamReader(res.GetResponseStream());
    return sr.ReadToEnd();
}
My code to read that script file's content is very simple.
Looking at the js source that you linked to, it could be that it has been gzipped. Try saving the response as a file and use 7-Zip or something to see if you can unzip it. There is GZip support in C#, so if it has been gzipped you should be able to decompress it easily enough.
Although it's a Korean web site, so maybe the encoding is simply not correct.
Either way, it's not a problem with the code that you posted.
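A quick sketch of letting HttpWebRequest handle gzip/deflate for you (the UTF-8 encoding is an assumption; swap in the site's actual encoding if it differs):
static string GetScriptSource(string url)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    request.Method = "GET";
    // Transparently decompress gzip/deflate response bodies.
    request.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;

    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
    using (StreamReader reader = new StreamReader(response.GetResponseStream(), Encoding.UTF8))
    {
        return reader.ReadToEnd();
    }
}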
I am trying to pass through some XML from an external website.
What is the best way of doing this: a C# web page or ASP.NET MVC?
I tend to use something like this for working with external XML documents / RSS feeds etc:
string sURL = ".....";
// Create a request for the URL.
WebRequest oRequest = WebRequest.Create(sURL);
// Get the response.
WebResponse oResponse = oRequest.GetResponse();
// Get the stream containing content returned by the server.
Stream oDataStream = oResponse.GetResponseStream();
// Open the stream using a StreamReader for easy access.
StreamReader oReader = new StreamReader(oDataStream, System.Text.Encoding.Default);
// Read the content.
string sXML = oReader.ReadToEnd();
// Convert the string to XML.
XDocument oFeed = XDocument.Parse(sXML);
Either should be fine. MVC is probably easiest (in terms of getting a raw response), but you could do the same in regular ASP.NET just by using a handler (possibly .ashx), or just by clearing the response.
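For the handler route, here is a minimal sketch of an .ashx that relays XML fetched from an external URL (the class and variable names are just illustrative):
public class XmlRelayHandler : System.Web.IHttpHandler
{
    public void ProcessRequest(System.Web.HttpContext context)
    {
        string sURL = "....."; // external feed URL, as in the snippet above
        string sXML;
        using (var client = new System.Net.WebClient())
        {
            sXML = client.DownloadString(sURL);
        }
        // Return the XML verbatim with the right content type.
        context.Response.ContentType = "text/xml";
        context.Response.Write(sXML);
    }

    public bool IsReusable
    {
        get { return false; }
    }
}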