Forgive me if this is a stupid question. I am not very experienced with Web programming.
I am implementing the payment component of my .NET MVC application. The component interacts with an external payment service. The payment service accepts HTTP POST requests in the following form:
http://somepaymentservice.com/pay.do?MerchantID=xxx&Price=xxx&otherparameters
I know this is dead easy to do by adding a form to the view. However, I do not want my views to deal with third-party parameters. I would like my view to submit information to my controller; the controller then generates the required URL and sends out the request. The following is pseudo code.
[AcceptVerbs(HttpVerbs.Post)]
public ActionResult PayForOrder(OrderForm order)
{
    // Build the provider's URL from the order details.
    var url = _paymentService.GetUrlFromOrder(order);

    // This is the part I don't know how to implement.
    SendPostRequest(url);

    return View("FinishedPayment");
}
Is it possible to do this? Does C# have a built-in library for generating HTTP requests?
Thanks in advance.
You'll want to use the HttpWebRequest class. Be sure to set the Method property to "POST".
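Here's a minimal sketch (the URL and form-field values are placeholders for your payment provider's real parameters; you'll need the System.Net, System.IO, and System.Text namespaces):
public static string PostToProvider()
{
    // Placeholder URL and form data for the payment provider.
    string url = "http://somepaymentservice.com/pay.do";
    string postData = "MerchantID=xxx&Price=xxx";
    byte[] body = Encoding.UTF8.GetBytes(postData);

    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    request.Method = "POST";
    request.ContentType = "application/x-www-form-urlencoded";
    request.ContentLength = body.Length;

    // Write the form data into the request body.
    using (Stream requestStream = request.GetRequestStream())
    {
        requestStream.Write(body, 0, body.Length);
    }

    // Read whatever the provider sends back.
    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
    using (StreamReader reader = new StreamReader(response.GetResponseStream()))
    {
        return reader.ReadToEnd();
    }
}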
There certainly is a built-in library for generating HTTP requests. Below are two helpful functions that I quickly converted from VB.NET to C#. The first method performs a POST, the second performs a GET. I hope you find them useful.
You'll want to make sure to import the System.Net, System.IO, and System.Text namespaces.
public static HttpWebResponse SendPostRequest(string data, string url)
{
    // Data parameter example:
    // string data = "name=" + value
    HttpWebRequest httpRequest = (HttpWebRequest)WebRequest.Create(url);
    httpRequest.Method = "POST";
    httpRequest.ContentType = "application/x-www-form-urlencoded";
    httpRequest.ContentLength = Encoding.UTF8.GetByteCount(data);

    // Write the form-encoded data into the request body.
    using (var streamWriter = new StreamWriter(httpRequest.GetRequestStream()))
    {
        streamWriter.Write(data);
    }

    return (HttpWebResponse)httpRequest.GetResponse();
}

public static HttpWebResponse SendGetRequest(string url)
{
    HttpWebRequest httpRequest = (HttpWebRequest)WebRequest.Create(url);
    httpRequest.Method = "GET";
    return (HttpWebResponse)httpRequest.GetResponse();
}
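In the controller action from your question, usage could look something like this (FormatOrderAsFormData is a hypothetical helper that builds the "key=value&..." body from the order):
[AcceptVerbs(HttpVerbs.Post)]
public ActionResult PayForOrder(OrderForm order)
{
    // GetUrlFromOrder and FormatOrderAsFormData are assumed helpers on your payment service.
    string url = _paymentService.GetUrlFromOrder(order);
    string data = _paymentService.FormatOrderAsFormData(order);

    using (HttpWebResponse response = SendPostRequest(data, url))
    {
        // Inspect response.StatusCode here if the provider reports success or failure.
    }

    return View("FinishedPayment");
}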
It really makes a difference whether ASP.NET makes the request or the client makes the request.
If the provider's documentation says you should use a form with the given action that has to be submitted by the client browser, then that might be necessary.
In many cases the user (the client) posts some values to the provider, enters some data at the provider's site, and then gets redirected back to your site. You cannot implement that application flow on the server side.
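If the provider does require a browser-submitted form, one common pattern is to let the controller build the provider's parameters and hand them to a view that renders (and auto-submits) the form. A rough sketch, where ProviderPostModel, _settings, and the "ProviderRedirect" view are all hypothetical names:
[AcceptVerbs(HttpVerbs.Post)]
public ActionResult PayForOrder(OrderForm order)
{
    // The server only prepares the parameters; the browser performs the actual
    // POST to the provider via a form rendered by the "ProviderRedirect" view.
    var model = new ProviderPostModel
    {
        Action = "http://somepaymentservice.com/pay.do",
        Fields = new Dictionary<string, string>
        {
            { "MerchantID", _settings.MerchantId },
            { "Price", order.Price.ToString() }
        }
    };
    return View("ProviderRedirect", model);
}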
I have a Node server hosted with Azure, where I can send a POST request to the API for it to perform some function. The API itself works; I have tested it with Postman.
A call to the API would look something like this:
http://website.com/api/Foo?name=bar&second=example
This doesn't necessarily need to return anything, as the call is silent and does something in the background. (note: perhaps it must return something and this is a hole in my understanding of the concept?)
Using C#, how can I make a web request to this URL?
I am already constructing the URL based on parameters passed to my method (so name and second as above could be whatever was passed to the method).
It's the POSTing to this URL that I cannot get working correctly.
This is the code I have tried:
void MakeCall(string name, string second)
{
    string url = "http://website.com/api/Foo?name=" + name + "&second=" + second;
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    request.Method = "POST";
    request.ContentType = "application/json";
    request.ContentLength = url.Length;
    HttpWebResponse response = (HttpWebResponse)request.GetResponse();
}
You have to create a request stream and write to it. The link below shows several ways to do this, either with HttpWebRequest, HttpClient, or third-party libraries:
Posting data using C#
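For what it's worth, the snippet above sets ContentLength but never writes a body, which is likely why GetResponse() fails. Here is a minimal sketch with HttpClient that keeps the parameters in the query string (as your API expects) and sends an empty body; it needs the System, System.Net.Http, and System.Threading.Tasks namespaces:
// Minimal sketch: POST to the API with the parameters in the query string.
static async Task MakeCallAsync(string name, string second)
{
    string url = "http://website.com/api/Foo?name=" + Uri.EscapeDataString(name)
               + "&second=" + Uri.EscapeDataString(second);

    using (var client = new HttpClient())
    {
        // The API takes its parameters in the query string, so the body can stay empty.
        HttpResponseMessage response = await client.PostAsync(url, new StringContent(string.Empty));

        // Even if the API returns nothing useful, the status code tells you whether it worked.
        response.EnsureSuccessStatusCode();
    }
}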
I have an external URL, like http://a.com/?id=5 (not in my project), and I want my website to show that URL's contents.
For example, my website (http://MyWebsite.com/?id=123) shows the third-party URL's (http://a.com/?id=5) contents, but I don't want the client side to see the real URL (http://a.com/?id=5). I'll check the auth first and then show the page.
I assume that you do not have control over the server at "http://a.com/?id=5". I think there's no way to completely hide the external link from users. They can always look at the HTML source code and the HTTP requests and trace back the original location.
One possible solution to partially hide that external site is to do the equivalent of curl in your controller: after the user is authenticated, you request the page from "http://a.com/?id=5" and return its contents to your user:
ASP.NET MVC - Using cURL or similar to perform requests in application:
I assume the request to "http://a.com/?id=5" uses the GET method:
public string GetResponseText(string userAgent) {
    string url = "http://a.com/?id=5";
    string responseText = String.Empty;

    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    request.Method = "GET";
    request.UserAgent = userAgent; // forward the client's user agent

    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
    using (StreamReader sr = new StreamReader(response.GetResponseStream())) {
        responseText = sr.ReadToEnd();
    }

    return responseText;
}
Then you just need to call this in your controller. Pass the same user agent from the client so that they see the website exactly as if they had opened it in their own browser:
return GetResponseText(Request.UserAgent);
// Request is the current request passed to the controller for http://MyWebsite.com/?id=123
PS: I may not be using the correct MVC API, but the idea is there. You just need to look up the MVC documentation on HttpWebRequest to make it work correctly.
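In an MVC controller, that might look roughly like the following (the action name and the auth check are placeholders):
// Hypothetical controller action: check auth first, then proxy the external page.
public ActionResult Show(int id)
{
    if (!User.Identity.IsAuthenticated)
        return new HttpUnauthorizedResult();

    string html = GetResponseText(Request.UserAgent);
    return Content(html, "text/html");
}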
We have a service provider that allows us to connect to their payment page for payments; however, their sample code is PHP and we would like to do it in ASP.NET.
The problem is I don't really understand what the method should be, POST or GET. Basically we need to redirect to the client with underlying parameters (not query strings), and then our current page that makes the request must be redirected to the client page with the parameters as well.
I do get the response, which is basically markup, but that's not what I want. I want it to redirect to the payment page. Can someone please tell me what I'm doing wrong? Thanks.
Here is the code I use for the POST method:
string query = string.Format("description={0}&amount={1}&merchantIdent={2}&email={3}&transaction={4}&merchantKey={5}",
    description, amount, merchantIdent, email, id, merchantKey);

// Create the request
string url = "https://www.webcash.co.za/pay";
HttpWebRequest req = (HttpWebRequest)WebRequest.Create(url);
req.Method = "POST";
req.AllowAutoRedirect = true;
req.ContentType = "application/x-www-form-urlencoded";
req.ContentLength = query.Length;

// Write the form data
StreamWriter stOut = new StreamWriter(req.GetRequestStream(), System.Text.Encoding.ASCII);
stOut.Write(query);
stOut.Close();

// Do the request
StreamReader stIn = new StreamReader(req.GetResponse().GetResponseStream());
string response = stIn.ReadToEnd();
stIn.Close();
Not sure I totally understand your question, but going by your title, here is the difference between POST and GET:
The GET method passes variables through the url. This can be practical or impractical (for instance if you plan to pass sensitive material to another page)
The POST method does not pass variables through the url, it passes the variables behind the scenes.
You'll need to decide which better fits your situation.
Normally GETs are safe and idempotent (they shouldn't change data, and repeating them has no additional effect). Use a GET if you want to be able to issue a request without changing anything. Use a POST if you're performing some sort of update/processing/etc.
Is there any way to retrieve the DOM results when I click "Older Posts" on the site:
http://www.facebook.com/FamilyGuy
using C# or Java? I heard that it is possible to execute a script with onclick and get the results. How can I execute this script:
onclick="(JSCC.get('j4eb9ad57ab8a19f468880561') && JSCC.get('j4eb9ad57ab8a19f468880561').getHandler())(); return false;"
I think the "Older Posts" link sends an Ajax request and appends the response to the page. (I'm not sure; you should check the page source.)
You can emulate this behavior in C#, Java, and JavaScript (you already have the code for JavaScript).
Edit:
It seems that Facebook uses some sort of internal API (JSCC) to load the content, and it's undocumented.
I don't know about the Facebook Developers' APIs (you may want to check that first), but if you want to emulate exactly what happens in your browser, you can use TamperData to intercept the GET requests when you click the "Older Posts" link and find the request URL and its parameters.
After you get this information, you have to log in to your account from your application and get the authentication cookie.
C# sample code as you requested:
private CookieContainer GetCookieContainer(string loginURL, string userName, string password)
{
    var webRequest = WebRequest.Create(loginURL) as HttpWebRequest;
    var responseReader = new StreamReader(webRequest.GetResponse().GetResponseStream());
    string responseData = responseReader.ReadToEnd();
    responseReader.Close();

    // Now you may need to extract some values from the login form and build the POST data with your username and password.
    // I don't know what exactly you need to POST but again a TamperData observation will help you to find out.
    string postData = String.Format("UserName={0}&Password={1}", userName, password); // I emphasize that this is just an example.

    // cookie container
    var cookies = new CookieContainer();

    // post the login form
    webRequest = WebRequest.Create(loginURL) as HttpWebRequest;
    webRequest.Method = "POST";
    webRequest.ContentType = "application/x-www-form-urlencoded";
    webRequest.CookieContainer = cookies;

    // write the form values into the request message
    var requestWriter = new StreamWriter(webRequest.GetRequestStream());
    requestWriter.Write(postData);
    requestWriter.Close();

    webRequest.GetResponse().Close();

    return cookies;
}
Then you can perform GET requests with the cookies you have, against the URL you found by analyzing those JSCC.get().getHandler() requests with TamperData, and eventually you'll get what you want as a response stream:
var webRequest = WebRequest.Create(url) as HttpWebRequest;
webRequest.CookieContainer = GetCookieContainer(url, userName, password);
var responseStream = webRequest.GetResponse().GetResponseStream();
You can also use Selenium for browser automation. It also has C# and Java APIs (I have no experience using Selenium).
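A rough, untested C# sketch of what that could look like (the link text is a placeholder, not Facebook's actual markup; it needs the OpenQA.Selenium and OpenQA.Selenium.Firefox namespaces):
// Drive a real browser so Facebook's JavaScript runs for you.
IWebDriver driver = new FirefoxDriver();
driver.Navigate().GoToUrl("http://www.facebook.com/FamilyGuy");

// Click the "Older Posts" link (placeholder selector) and read the updated DOM.
driver.FindElement(By.LinkText("Older Posts")).Click();
string html = driver.PageSource;

driver.Quit();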
Facebook loads its content dynamically with AJAX. You can use a tool like Firebug to examine what kind of request is made, and then replicate it.
Or you can use a browser rendering engine like WebKit to process the JavaScript for you and expose the resulting HTML:
http://webscraping.com/blog/Scraping-JavaScript-webpages-with-webkit/
I'm trying to login to a website using C# and the WebRequest class. This is the code I wrote up last night to send POST data to a web page:
public string login(string URL, string postData)
{
    Stream webpageStream;
    WebResponse webpageResponse;
    StreamReader webpageReader;
    byte[] byteArray = Encoding.UTF8.GetBytes(postData);

    _webRequest = WebRequest.Create(URL);
    _webRequest.Method = "POST";
    _webRequest.ContentType = "application/x-www-form-urlencoded";
    _webRequest.ContentLength = byteArray.Length;

    webpageStream = _webRequest.GetRequestStream();
    webpageStream.Write(byteArray, 0, byteArray.Length);

    webpageResponse = _webRequest.GetResponse();
    webpageStream = webpageResponse.GetResponseStream();
    webpageReader = new StreamReader(webpageStream);
    string responseFromServer = webpageReader.ReadToEnd();

    webpageReader.Close();
    webpageStream.Close();
    webpageResponse.Close();

    return responseFromServer;
}
and it works fine, but I have no idea how I can modify it to send POST data to a login script and then save a cookie(?) and log in.
I have looked at my network transfers using Firebug on the website's login page, and it is sending POST data that looks like this:
accountName=myemail%40gmail.com&password=mypassword&persistLogin=on&app=com-sc2
As far as I'm aware, to be able to use my account with this website in my C# app, I need to save the cookie that the web server sends and then use it on every request. Is this right? Or can I get away with no cookie at all?
Any help is greatly appreciated, thanks! :)
The login process depends on the particular website. If it uses cookies, you need to use them.
I recommend using Firefox with an HTTP-header-watching plugin to see how the headers are sent to your particular website, and then implementing it the same way in C#. I answered a very similar question the day before yesterday, including an example with cookies. Look here.
I've had more luck using the HtmlElement class to manipulate websites.
Here is a cross post to an example of how logging in through code would work (provided you're using a WebBrowser control).
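A rough sketch of that approach, assuming a Windows Forms WebBrowser control (the element IDs and URL are placeholders for whatever the real login form uses; it needs the System.Windows.Forms namespace):
// Fill in and submit a login form through the WebBrowser control once the page has loaded.
void LoginWhenLoaded(WebBrowser browser)
{
    browser.DocumentCompleted += (sender, e) =>
    {
        // Placeholder element IDs; in practice you would also check e.Url so this
        // only runs on the login page and not on every subsequent navigation.
        HtmlDocument doc = browser.Document;
        doc.GetElementById("accountName").SetAttribute("value", "myemail@gmail.com");
        doc.GetElementById("password").SetAttribute("value", "mypassword");
        doc.GetElementById("loginButton").InvokeMember("click");
    };
    browser.Navigate("https://example.com/login");
}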