How to retrieve values after # in a URL using C#

I am trying to implement AJAX back/forward button support and am therefore writing variables after a # in my URL. I would also like the user to be able to copy the URL and then link back to it. Does anyone know how I can parse the URL and grab my "querystrings" even though they are behind a #?

The value after the hash is not transmitted to the server. There's another SO question about that somewhere, but I'm having trouble finding it. Likewise it's taken me a while to find a decent reference to cite, but this Wikipedia article has some confirmation:
"The fragment identifier functions differently than the rest of the URI: namely, its processing is exclusively client-side with no participation from the server. When an agent (such as a Web browser) requests a resource from a Web server, the agent sends the URI to the server, but does not send the fragment."
I assume you want to respond to it on the server side rather than the browser side? (Given that you're asking about doing it in C#...)

http://msdn.microsoft.com/en-us/library/system.uri.fragment.aspx
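If client-side script does pass the full URL (hash included) back to the server, System.Uri exposes the fragment directly. A minimal sketch; the URL and key names are made up for illustration:

```csharp
using System;

class FragmentDemo
{
    static void Main()
    {
        // The fragment never reaches the server on its own; assume
        // client-side JavaScript has sent the full URL back to us.
        var uri = new Uri("http://example.com/page#tab=2&item=5");

        // Uri.Fragment includes the leading '#'.
        string fragment = uri.Fragment.TrimStart('#'); // "tab=2&item=5"

        // Parse it like a querystring.
        foreach (string pair in fragment.Split('&'))
        {
            string[] kv = pair.Split('=');
            Console.WriteLine("{0} = {1}", kv[0], kv[1]);
        }
    }
}
```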

Related

Log in to account server-side instead of client-side

What I'm trying to do is load a webpage server-side, for example www.facebook.com, then insert a username and password programmatically and log in. I know it's possible from a desktop application; I know how to do that in C#, but on the desktop/client side. What I'm looking for is how to do it server-side.
For example:
I send a request with a username and password to a site [my site], let's say www.fbloger.com. The server then logs in to Facebook using those details, so the server can send me important information. My final requirement is to get an alert when a specific friend is online, so I don't need to stay logged in and keep checking whether she is online; I can log in to Facebook as soon as the server gives me an alert. I don't know if it's really possible.
It sounds like you are trying to write some kind of server-side web crawler/spider. If this is the case, all you need to do is examine the network requests being performed in a browser then emulate these in C#.
In C#, if you send the request with HttpClient exactly as your browser does, you can then capture the returned web page and scrape the content with something like the Html Agility Pack, which lets you query the HTML like an XML document to extract the values you require. See http://html-agility-pack.net (get it via NuGet).
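A minimal sketch of that approach, assuming the HtmlAgilityPack NuGet package is installed; the URL and XPath expression are placeholders, not real endpoints:

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;
using HtmlAgilityPack;

class Scraper
{
    static async Task Main()
    {
        using var client = new HttpClient();

        // Fetch the page exactly as a browser would (URL is a placeholder).
        string html = await client.GetStringAsync("https://example.com/some-page");

        // Query the HTML like an XML document, via XPath.
        var doc = new HtmlDocument();
        doc.LoadHtml(html);

        // "//table//td" is an assumed structure, for illustration only.
        var cells = doc.DocumentNode.SelectNodes("//table//td");
        if (cells != null)
            foreach (var cell in cells)
                Console.WriteLine(cell.InnerText.Trim());
    }
}
```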
Since you know how to do that in C#, simply use C# for the server-side code.
ASP.NET lets you use C# for code-behind, and you can copy (or better, reuse) the desktop code that signs in to the web site.
If your desktop code used the WebBrowser control, you'll need to rewrite the crawling code with something like HttpClient and avoid pages that require JavaScript execution to render or log in.

How do I send data to a website without writing anything in the URL?

I'm trying to make a login script in PHP; however, a user will sign up/log in via a C# program.
For obvious reasons I don't want to use GET, but I don't want to make forms and use POST either.
How should I (securely) send the data over?
This is only a proof of concept, so there isn't any SSL on my website etc., and no extra hoops to jump through.
Just so nobody's confused:
Web-based login will be made in PHP, and data will be sent to the website via a program made in C#
You have to use GET or POST to submit your form data back to the PHP page. There is no way around that.
Once your PHP code has it on the backend you can do whatever you want to get it to the C# program.
Using a POST would eliminate items being added to the query string like GET does and of course doing it over HTTPS would make it more secure.
The only way to be secure is to use HTTPS. You can look at client-side encryption, but that involves private/public keys; HTTPS will be easier.
Client side is HTML/CSS/JavaScript. If you want a user to enter an ID and password then you will have forms, and GET or POST is what you will use. Even if you wrap it up in AJAX, it will still be the same.
Go and think this through, and come back with a proper question. Telepathy isn't sufficiently reliable for your needs. Yet.
I'm not quite sure I understand correctly, but what I understood is that you want to send the login via a C# program. In that case, check this question: HTTP request with POST.
If, however, you want to send the data via the webpage without having a form, you'd need to make the POST request via JavaScript.
Both of these methods require POST requests, even though you don't use forms.
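As a sketch, the C# side of this can POST the credentials with HttpClient; the URL and field names below are assumptions, and PHP would read them from $_POST:

```csharp
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;

class LoginClient
{
    static async Task Main()
    {
        using var client = new HttpClient();

        // Field names must match what the PHP script reads from $_POST.
        var form = new FormUrlEncodedContent(new Dictionary<string, string>
        {
            ["username"] = "alice",
            ["password"] = "secret"
        });

        // URL is a placeholder; in production use https:// so the
        // request body is not readable on the wire.
        HttpResponseMessage response = await client.PostAsync(
            "https://example.com/login.php", form);

        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}
```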

ASP.NET: manipulating and firing actions on HTML

I am working on a project in which one piece of functionality is that a page obtains data from another page (not a web service), then displays it in a grid and uses Highcharts for charting.
The problem is that the data I want to read is on another page.
I know that I can read HTML from other pages... but to get this information, I need to fill in two text inputs for a filter and press a submit button; it then displays a table, and that is the table I need to extract the information from.
Is there a way to do this automatically in C#?
There are plenty of ways to do this; the most common revolve around AJAX. You can initiate a callback from the client via Javascript to a method on the server, which can update controls in an UpdatePanel, for example.
You can also make client-side calls to server-side page methods. Effectively, this is a static method on your webform that you can call from the client via JavaScript/jQuery and AJAX.
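A page method is just a public static method on the code-behind decorated with [WebMethod]; the class and method names here are illustrative, and the page needs a ScriptManager with EnablePageMethods="true" for the PageMethods JavaScript proxy to exist:

```csharp
using System.Web.Services;

public partial class MyPage : System.Web.UI.Page
{
    // Callable from the client, e.g. PageMethods.GetChartData("2024")
    // in JavaScript, or a jQuery $.ajax POST to MyPage.aspx/GetChartData.
    [WebMethod]
    public static string GetChartData(string filter)
    {
        // Fetch or compute the data server-side and return it to the
        // client script (typically serialized as JSON).
        return "{\"filter\": \"" + filter + "\"}";
    }
}
```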
EDIT.
It turns out that you want to scrape another site. The easiest way to do this is to have a server-side page method on your website that requests the page from the remote site, extracts the info you want, and returns that to your client. Your client can of course call this as a page method.
See https://web.archive.org/web/20210513000146/http://www.4guysfromrolla.com/webtech/070601-1.shtml for a tutorial, and I do suggest using the HTML Agility Pack as that article mentions.
Further EDIT
You want to further manipulate the page on the remote site; if you can't or don't want to talk to the developers of that site to work out a way of doing it programmatically, then you'll have to cheat. Get Firebug and Tamper Data, and use them to see how clicking the button on the remote site builds a request and posts it to the server; you want to emulate the same thing. If you know what data is being posted, then you can make exactly the same post from your server.
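Once the capture shows you the exact field names the remote form posts, replaying that request from your server is a few lines; everything below (URL, field names, values) is assumed to come from your own Tamper Data capture:

```csharp
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;

class FormReplayer
{
    static async Task Main()
    {
        using var client = new HttpClient();

        // Field names and values copied from the captured POST
        // (these are placeholders for whatever the capture shows).
        var form = new FormUrlEncodedContent(new Dictionary<string, string>
        {
            ["filterFrom"] = "2024-01-01",
            ["filterTo"]   = "2024-12-31"
        });

        // Post to the same URL the remote form posts to (placeholder).
        var response = await client.PostAsync(
            "https://remote-site.example/report", form);

        // The returned HTML contains the table you want to scrape.
        string tableHtml = await response.Content.ReadAsStringAsync();
        Console.WriteLine(tableHtml.Length);
    }
}
```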
You often have this kind of problem when trying to scrape AJAX websites.

Need Help in building a "robot" that extracts data from HTTP request

I am building a web site in ASP.NET and C#. One of its components involves logging in, on behalf of the user, to a website where the user has an account (for example, a cellular phone company), taking information from that site, and storing it in our database.
I think this action is called "scraping".
Are there any products that already do this that I can integrate with my software?
I don't need software that does it; I need some sort of SDK that I can integrate with my C# code.
Thanks,
Koby
Use the HtmlAgilityPack to parse the HTML that you get from a web request once you've logged in.
See here for logging in: Login to website, via C#
I haven't found any product that does it right so far.
One way to handle this is to:
- do the requests yourself
- use http://htmlagilitypack.codeplex.com/ to extract the important information from the downloaded HTML
- save the extracted information yourself
The thing is that, depending on context, there are so many things to tune/configure that you would need a very large product, and it still wouldn't reach the performance/accuracy of a custom solution:
a) multithreading control
b) extraction rules
c) persistence control
d) web spidering (or how the next link to parse is chosen)
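The middle step of that list (extracting values from the downloaded HTML) might look like this with the Html Agility Pack; the HTML snippet and the XPath query are made up purely for illustration:

```csharp
using System;
using HtmlAgilityPack;

class Extractor
{
    static void Main()
    {
        // Stand-in for HTML you downloaded yourself in step one.
        const string html =
            "<html><body><div class='price'>42.50</div></body></html>";

        var doc = new HtmlDocument();
        doc.LoadHtml(html);

        // Step two: pull out the value with an XPath query.
        var node = doc.DocumentNode.SelectSingleNode("//div[@class='price']");
        if (node != null)
            Console.WriteLine(node.InnerText); // then save it yourself (step three)
    }
}
```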
Check the Web Scraping Wikipedia Entry.
However, I would say that since what we need to acquire via web scraping is application-specific most of the time, it may be more efficient to scrape whatever you need from the web response stream yourself.

Comparison between POST and GET (HttpWebRequest in C#) in terms of "visibility"

I would like to implement an online activation feature (something like activating using a cd key), and I would like to do so via http.
I would like to send the key, together with an internal password, to the server each time an activation request is sent.
The password's purpose is that, since the HTTP service is exposed publicly, I would like it to be usable only by my application, not by any unknown third party (for example, someone brute-forcing different keys).
I know that in a web browser end users can see GET form values in the address bar, but the POST method doesn't have this issue.
What I want to ask is:
Since, obviously, I don't want this internal password to be known to others, when submitting the CD key and the password via the C# class HttpWebRequest, is there any difference between using the GET and POST method, in terms of "visibility"?
It makes sense that I should use POST on the rationale that form values are visible in the browser address bar when using GET, but since I'm using HttpWebRequest there is no browser "address bar" to be seen, so is there actually no difference at all?
Or is there a difference when, say, a hacker intercepts my application's web requests?
Thanks a lot for your help!
No. Both GET and POST are HTTP verbs. Anyone can observe your app using a tool like Fiddler (both GETs and POSTs).
You could try using HTTPS, but there exist "interceptors" for this as well (using function call hooking, not a man-in-the-middle attack).
You may want to consider using a challenge/response kind of communication, where the server sends a challenge to your app and your app has to respond appropriately. Usually asymmetric encryption is used as part of that handshake.
There isn't a perfect solution; any method of copy protection can be broken with a moderate investment of time and resources. You just have to decide where to draw the line.
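A sketch of the challenge/response idea, using an HMAC with a shared secret instead of asymmetric keys for brevity; since the secret is embedded in the client binary, an attacker who extracts it can forge responses, which is exactly the "where to draw the line" trade-off:

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

class Activation
{
    // Shared secret baked into the app (assumption for this sketch).
    static readonly byte[] Secret = Encoding.UTF8.GetBytes("app-secret");

    // Server side: issue a fresh random challenge per activation attempt.
    static string NewChallenge()
    {
        var nonce = new byte[16];
        RandomNumberGenerator.Fill(nonce);
        return Convert.ToBase64String(nonce);
    }

    // Client side: prove knowledge of the secret without sending it.
    static string Respond(string challenge, string cdKey)
    {
        using var hmac = new HMACSHA256(Secret);
        byte[] mac = hmac.ComputeHash(
            Encoding.UTF8.GetBytes(challenge + "|" + cdKey));
        return Convert.ToBase64String(mac);
    }

    static void Main()
    {
        string challenge = NewChallenge();
        string response = Respond(challenge, "ABCD-1234");

        // Server side: recompute the same HMAC and compare.
        bool valid = Respond(challenge, "ABCD-1234") == response;
        Console.WriteLine(valid); // True
    }
}
```

Because the challenge is random per attempt, a sniffed response cannot simply be replayed for a different key.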
From an HTTP perspective, both GET and POST requests can be intercepted and modified; use Fiddler to see what I mean. Fiddler can also be used to create a GET request or submit a form to your site.
There are a number of possible ways to deal with this:
1) use SSL
2) on the initial request, your web service can return a token, which must be used when sending the data; you can then verify and validate the token on the next request, which performs the actual activation.
However, without SSL it is still possible for the user of the application to trace the internal password being sent over HTTP. What you can do is use the token from the first request to encrypt the password, so the password sent will be different every time, depending on your validation token.
Hope it makes sense.
Both GET and POST data will be in plain text unless you use HTTPS. It's trivial for anyone on the same subnet (or on an intervening network) to snoop on the data.
In short, don't send sensitive data via HTTP.
