I have been looking around for hours trying to find a clear, simple solution to my question and have yet to find a good answer. I am trying to make a URL request from Flash to my NOPCommerce site. I want to pass a GET value to my .cs file, then use that value to grab specific information and return it to Flash. How would I set up the C#/ASP.NET side of things? If anyone could give me an example of what I am looking for, I would greatly appreciate it.
I don't know if I am supposed to use a .aspx, .cs or .ascx file.
Thanks,
Brennan
I found it to be extremely simple with web services in AS3. Here is a link to see what I mean:
As3 Web Services
On the ASP.NET side, read the GET variables from the request's query string, do the magic, and write the result back in the response. (The HttpWebRequest class is what you'd use if the server itself needs to fetch data from another URL.)
Examples and usage here:
http://www.csharp-station.com/HowTo/HttpWebFetch.aspx
You have a few options for server-side communication with flash.
1. Flash remoting. This is the most popular because it's the most performant, but it's not the easiest to understand at first glance. It transfers data in a binary format. Available libraries are Weborb and Fluorine.
2. Web services, as mentioned in a previous post.
3. Ajax/JSON. I think with Flash Player 11.3, JSON decoding is native in the player now.
4. A straight-up HTTP request.
5. Sockets (not recommended for beginners).
To answer your question as you asked it, though: for all but #4, you'd be using a .cs file to retrieve your data. For #4, you'd most likely be using an .aspx page, but it could be a combination of .aspx and .ascx files.
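To make option #4 concrete: the server's whole job is to read a query-string value and write a response body back. Here is a minimal sketch of that contract in Python purely to show the shape (the URL and the `productId` parameter are made up for illustration); in ASP.NET the same two steps would live in a handler that reads Request.QueryString and writes to the response.

```python
import json
from urllib.parse import urlparse, parse_qs

def handle_request(url):
    """Read a GET parameter and return a JSON response body.
    In ASP.NET this maps to Request.QueryString plus Response.Write."""
    params = parse_qs(urlparse(url).query)
    product_id = params.get("productId", [""])[0]  # hypothetical parameter name
    # "do the magic": look up data for this id, then serialize it for Flash
    data = {"productId": product_id, "name": "example product"}
    return json.dumps(data)

# What the Flash side would request and receive:
body = handle_request("http://example.com/product.ashx?productId=42")
```

Flash then parses the returned JSON string on its side of the wire.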
My recommendation is that you do some research on each of these methods to decide what would work best with your development environment, required level of security, and project. Then, ask specific questions about each method as necessary.
Good Luck!
I'd like to preface this by saying that I'm feeling a bit overwhelmed at a summer internship; there are a lot of new technologies and concepts here that I've never used before, and good resources that explain these things, rather than just regurgitating situational tutorials, are very scarce. In particular, I'd like to talk about ASP.NET.
What I currently have is a simple web page made using ASP.NET in Visual Studio. My general task is to access a database, get the info from it, turn the data into JSON, and send that data to a Python script, which will then parse the JSON and do stuff with it, etc.
I can get as far as making the JSON object, and I can get the user to download it as a file, but what I really want is to pass it over the network by accessing my web site from the Python script using urllib2. This is where I've become completely lost. There are so many terms I've never heard before: services, web APIs, controllers, routing. I've spent hours digging around and following basic tutorials, but I still can't get a firm grasp on the concepts, let alone how to accomplish this in a practical manner.
To be completely clear here are my goals:
1. Send 5 parameters using urllib2 in Python to my ASP.NET site
2. Use these parameters to query the database and get a JSON object (COMPLETE)
3. Return the JSON to the Python script
I have no idea how to set up a "service" or how to go about doing so. I know I have to attach it to my website somehow, but I'm not sure how. Any suggestions or good resources would be much appreciated. I'm just looking for some direction and advice on how to accomplish #1 and #3 on my list.
Thank you for taking the time to read through my post!
For part one you could do this:
import urllib2
response = urllib2.urlopen('http://mysite.io/?paramOne=ValueOne')
Now the response object will have your JSON so you can do this:
json = response.read()
urllib2's companion module urllib has a nice way of preparing URL parameters (urlencode) which you might want to look into.
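Since the goal is five parameters, that URL-preparation helper beats concatenating strings by hand. A sketch, with made-up parameter names (use whatever your ASP.NET side expects); in Python 2 the function is urllib.urlencode, in Python 3 it moved to urllib.parse.urlencode:

```python
try:
    from urllib import urlencode  # Python 2, where urllib2 lives
except ImportError:
    from urllib.parse import urlencode  # Python 3

# Hypothetical parameter names; substitute the five your site expects.
params = {
    "name": "widget",
    "color": "blue",
    "size": "10",
    "page": "1",
    "sort": "asc",
}
# urlencode handles the escaping and the key=value&key=value joining.
url = "http://mysite.io/?" + urlencode(params)
```

Passing that `url` to urlopen() gives you the same response object as above, and response.read() yields the JSON your site returns.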
Okay guys, here's the deal. I'm a total C# beginner, but I've been advised to learn it. I'm desperate to get a working Bitcoin value grabber going, so that I can record the values to a text file. Another thing is that the value must come from MtGox, either their API or their homepage.
I've spent a while dealing with HTTP requests and JSON decoding (grrr...), but I don't see the point in spending so much time on my learning code when I'm sure there is someone else out there who can just help me write it.
Does anyone think they might be able to help with this? Just a couple of lines to pull the last Bitcoin value from MtGox.
Any contributions are much appreciated.
Will.
EDIT:
var client = new WebClient();
var json = client.DownloadString("http://data.mtgox.com/api/2/BTCUSD/money/ticker_fast"); // DownloadString is an instance method and already returns a string
That is all I needed to write. Wow. Thanks for the help though, Oliver.
This is a Q&A site, so don't expect others to write your code for you.
You can access their API using any language that supports HTTP calls.
Here's a great document to get you started:
MtGOX api docs
This is a judgement call, but it sounds like you're new to programming in general. Don't commit yourself to a language; always use the best tool for the job. I'd suggest using something a bit more "web friendly" for this sort of work. NodeJS would make this much easier for you, as it understands the JSON returned from their API natively, without all the wrappers and instances of the WebClient class or HttpResponseMessage that you use in .NET. Also consider the purpose of writing the data to a text file. What is going to consume that data? It's possible that you can skip the file and just interact with the down-level consumer directly.
What format does the data file need to be in? If you can save it as JSON, you could do this in C#:
using System.Net;
//...
var client = new WebClient();
client.DownloadFile("https://mtgox.com/api/1/BTCUSD/ticker", @"C:\folder\results.json");
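Whether the JSON lands in a file or stays in memory, pulling the last price out of it is one decode step. The payload below is an invented placeholder, not MtGox's actual schema, so treat every field name in it as a guess to be checked against the real response:

```python
import json

# Placeholder payload; inspect the actual API response for the real field names.
raw = '{"data": {"last": {"value": "123.45", "currency": "USD"}}}'

ticker = json.loads(raw)
last_value = ticker["data"]["last"]["value"]  # hypothetical path to the last price
```

From C#, a JSON library such as Json.NET performs the same decode step before you write the value to your text file.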
This is possibly a duplicate of this, and I have seen Is there an existing Google+ API?
Now my question: I have seen the Google+ History API and also the Google+ API.
I have never worked with APIs before and don't know where to start. All I know is that I have to write code to post to the Google+ page for my brand. I understand I need an access token to do this, but can someone tell me where I should start and what I need to know before understanding and implementing this?
I know there are a number of websites that let us post to a Google+ page, for example HootSuite, which posts to all the different networking sites at once, and it does the same for Google+. So I am assuming there is definitely a way to do this. Can someone help me figure out where to start?
Thanks !
PS: let me know if i am not clear or my question is too vague!
There is currently no publicly documented API that lets you automatically post to your Google+ page.
There are some tools (such as HootSuite) that do allow this, however, and since you have never used an API, this may be a good path for you to investigate.
The API that HootSuite is using is slowly opening up to other vendors. See https://plus.google.com/u/0/104946722942277428266/posts/LUi2ZNyRHag for more information about what is coming and how you can sign up to request access to this.
Glancing at the API really quickly, I'm not seeing what call you would make to create a post, I'm only seeing a read API so far...
But if you can find where to send the data, you need to look into how to HTTP POST data to that URL.
Your main options built into C# are WebClient.UploadString() or an HttpWebRequest with the Method property set to POST.
Then you upload the JSON object specified to create the post. There are probably frameworks for AJAX or JSON that would make it easier, but for just a single type of post, doing it manually wouldn't be too hard.
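For reference, the shape of such a POST, sketched with Python's standard library (the URL and payload here are placeholders, since no documented posting endpoint exists); WebClient.UploadString() in C# wraps the same steps:

```python
import json
import urllib.request

# Placeholder payload; a real API would document the required fields.
payload = json.dumps({"object": "example post body"}).encode("utf-8")

req = urllib.request.Request(
    "http://example.com/posts",  # placeholder endpoint
    data=payload,
    headers={"Content-Type": "application/json"},
)
# Attaching a body makes urllib send this request as a POST.
# urllib.request.urlopen(req) would perform the actual call.
```

The essentials are the same in any language: a POST verb, a Content-Type header, and a serialized JSON body.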
I've been entrusted with a tedious and thankless task by my boss.
The task: given a web application that returns a table with pagination, write a piece of software that "reads and parses it", since there is nothing like a web service that provides the raw data. It's like a "spider" or "crawler" application to steal data that is not meant to be accessed programmatically.
Now the thing: the application is built with the standard ASPX WebForms engine, so there are no clean URLs or plain POSTs, just the dreadful postback engine crowded with JavaScript and inaccessible HTML. The pagination links call the infamous javascript:__doPostBack(param, param), so I think it wouldn't even work if I tried to simulate clicks on those links.
There are also inputs to filter the results and they are also part of the postback mechanism, so I can't simulate a regular post to get the results.
I was forced to do something like this in the past, but it was on a standard-like website with parameters in the querystring like pagesize and pagenumber so I was able to sort it out.
Does anyone have a vague idea whether this is doable, or should I just tell my boss to quit asking me to do this stuff?
EDIT: maybe I was a bit unclear about what I have to achieve. I have to parse, extract, and convert that data into another format, say Excel, not just read it. And this must be automated without user input. I don't think Selenium would cut it.
EDIT: I just blogged about this situation. If anyone is interested can check my post at http://matteomosca.com/archive/2010/09/14/unethical-programming.aspx and comment about that.
Stop disregarding the tools suggested.
No, the parser you could write yourself isn't the answer; WatiN and Selenium will both work in that scenario.
PS: had you mentioned needing to extract the data from Flash/Flex/Silverlight or similar, this would be a different answer.
BTW, the reason to proceed or not is definitely not technical, but ethical and maybe even legal. See my comment on the question for my opinion on this.
WatiN will help you navigate the site from the perspective of the UI and grab the HTML for you, and you can find information on .NET DOM parsers here.
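Once WatiN has handed you the rendered HTML, the parsing half is ordinary DOM work. A sketch of pulling table cells out of the markup, using Python's stdlib parser for brevity (the sample HTML is invented; a .NET version would use one of the DOM parsers linked above):

```python
from html.parser import HTMLParser

class TableCellExtractor(HTMLParser):
    """Collect the text of every <td> encountered in the page."""
    def __init__(self):
        super().__init__()
        self.in_cell = False
        self.cells = []

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self.in_cell = True

    def handle_endtag(self, tag):
        if tag == "td":
            self.in_cell = False

    def handle_data(self, data):
        if self.in_cell:
            self.cells.append(data.strip())

# Invented sample of the kind of grid a WebForms page renders.
html = "<table><tr><td>row1-col1</td><td>row1-col2</td></tr></table>"
parser = TableCellExtractor()
parser.feed(html)
```

From there, writing the collected cells out to Excel or CSV is the easy part.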
Already commented, but I think this is actually an answer.
You need a tool that can click client-side links and wait while the page reloads.
Tools like Selenium can do that.
Also (from the comments): WatiN, WatiR.
#Insane, the CDC's website has this exact problem, and the data is public (we taxpayers have paid for it). I'm trying to get the survey and question data from http://wwwn.cdc.gov/qbank/Survey.aspx and it's absurdly difficult. Not illegal or unethical, just a terrible implementation that appears to intentionally make the data difficult to get (it's also inaccessible to search engines).
I think Selenium is going to work for us, thanks for the suggestion.
I'm looking at converting a web site from classic ASP to ASP.NET. I'm thinking of doing an agile style approach and providing deliverables as quickly as possible and so am thinking of doing a line by line conversion and creating "bad" ASP.NET and have it all in the ASPX file for phase 1 and get that working. That, I figure, will be the fastest and safest (i.e. preserving identical functionality). The next phase would be to split the code out into codebehind and multi-tiers.
I plan on replacing the VBScript in the ASP files with C# in the ASPX files.
So apart from general comments about what I'm planning on doing (which I welcome) the specific question that I have is: Are there any helper functions out there that wrap the VBScript functions from ASP to a C# equivalent that someone's already done?
So I'd be looking for a C# file (library) that has wrappers like:
string Mid(string txt, int start, int length)
{
    return txt.Substring(start - 1, length); // VBScript's Mid is 1-based; Substring is 0-based
}
double Abs(double num)
{
    return Math.Abs(num);
}
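That "or is it start - 1?" question is the crux: VBScript's Mid counts from 1 while .NET's Substring counts from 0, so the wrapper needs the start - 1 adjustment. The same off-by-one, shown in Python for illustration:

```python
def mid(txt, start, length):
    """VBScript-style Mid: start counts from 1, not 0."""
    return txt[start - 1:start - 1 + length]

# Mid("Hello, World", 1, 5) in VBScript returns "Hello".
result = mid("Hello, World", 1, 5)
```

Forgetting the adjustment shifts every extraction by one character, which is exactly the kind of subtle bug a line-by-line port invites.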
Look in the Microsoft.VisualBasic namespace to get access to the old VBScript/VB6 functions. You can use them directly from C#.
Additionally, you're in for a shock. ASP.NET uses a different compiler model, so some of the things you did in Classic ASP aren't allowed at all in ASP.NET. Include files are the big one that comes to mind: all of your supporting code (code outside of the *.asp file for the page itself) must be rethought to support the new model.
I don't think a line-by-line approach is a good idea - you'll just end up with a bunch of bad code that you'll still have to refactor. That doesn't sound very Agile to me either.
I'd try breaking the site up logically into modules, and then try rewriting specific pages or modules as ASPX. For example, if the site has an admin section, maybe you could just rewrite the admin portion. Then repeat for the next section. This could be done iteratively for each section until you're done.
I agree with the comment by Jason, and would like to also point out a higher level issue that you'd have to address, if you haven't thought about it already.
Does your classic ASP site make use of session variables to manage state between pages? If so, have you thought about how you're going to share session state between ASP and ASP.NET? Classic ASP and ASP.NET implement session state in different ways, so you will need methods to transfer between the two.
Smaller applications may be easier to "convert", but if you've got a large app that makes heavy use of session variables, you may want to think about other options.
If using ASP.NET MVC is an option for you, I'd take a look at that first. I would think it would be a much easier translation than trying to go from a scripting language to WebForms.
I don't think you gain anything by redoing the site as-is in .Net. Implementing it properly will be so radically different than what you'll start with in .Net that it just seems like wasted effort. Furthermore, it will be very difficult to write unit tests if you implement everything in .aspx pages without even code-behind.
I would aim for an MVC conversion. If you do port it over as-is, at least investigate master pages, that should save you some headaches.