Recently I have been thinking about combining C# applications with a website, so that we can run multiple threads in C# and complete more tasks in the time it would take PHP to do one.
Most of us know that PHP runs on a single thread and therefore we can't do multiple actions at the same time.
But it would be nice if we could do something like:
$SharpEntity = new CSharpExecute();
$SharpEntity->add("downloader.class http://server.com/file.ext");
$SharpEntity->add("downloader.class http://server.com/file2.ext");
$SharpEntity->add("downloader.class http://server.com/file3.ext");
$SharpEntity->initialize();
while ($SharpEntity->completed === false)
{
    $SharpEntity->Update();
}
echo 'Files Grabbed';
This is just a basic example, but would it actually be possible to do such a thing, and if so, how can you do this?
I know that Facebook and other large systems do something like this, but with C++. What do you guys think?
A strategy similar to JSON (or JSON itself):
break the objects down to strings and map them to a C# object.
If you go with JSON itself you can get away with using some of the C# helpers already provided for translating JSON.
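For example, here is a minimal sketch of the C# side, assuming the PHP page serializes each job with json_encode and that a DownloadJob class like the one below exists (both are assumptions, not part of the question); it uses the built-in JavaScriptSerializer from System.Web.Extensions:
using System.Web.Script.Serialization;

// Hypothetical shape of one job sent over from PHP
public class DownloadJob
{
    public string Handler { get; set; }  // e.g. "downloader.class"
    public string Url { get; set; }      // e.g. "http://server.com/file.ext"
}

// ...
string json = "{\"Handler\":\"downloader.class\",\"Url\":\"http://server.com/file.ext\"}";
var serializer = new JavaScriptSerializer();
DownloadJob job = serializer.Deserialize<DownloadJob>(json);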
Hope that helps.
I'd like to preface this by saying that I'm feeling a bit overwhelmed at a summer internship; it involves a lot of new technologies and concepts I've never used before, and good resources that explain these things rather than just regurgitating situational tutorials are very scarce. In particular I'd like to talk about ASP.NET.
What I currently have is a simple webpage made using ASP.NET in Visual Studio. My general task is to access a database, get the info from the database, turn the data into JSON, and send that data to a Python script which will then parse the JSON, do stuff with it, etc.
I am able to get as far as making the JSON object. I can get the user to download the JSON object as a file, but what I really want to do is just pass it over the network by accessing my web site from the Python script using urllib2. This is where I have become completely lost. There are so many terms I've never heard of before (services, web APIs, controllers, routing), and I've spent hours digging around and following basic tutorials, but I still cannot get a firm grasp on the concepts, let alone how to accomplish this in a practical manner.
To be completely clear here are my goals:
1) Send 5 parameters using urllib2 in Python to my ASP.NET site
2) Use these parameters to query the database and get a JSON object (COMPLETE)
3) Return the JSON to the Python script
I have no idea how to set up a "service" or how to even go about doing so. I know that I have to attach it to my website somehow but I'm not sure. Any suggestions or good resources would be much appreciated. I'm just looking for some direction and advice on how to go about accomplishing #1 and #3 on my list.
Thank you for taking the time to read through my post!
For part one you could do this:
import urllib2
response = urllib2.urlopen('http://mysite.io/?paramOne=ValueOne')
Now the response object will have your JSON so you can do this:
json = response.read()
The urllib module's urlencode function is a nice way of preparing URL parameters, which you might want to look into.
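On the ASP.NET side, one simple way to expose something urllib2 can call is a generic handler (.ashx). This is only a rough sketch: the handler name, parameter name and the GetJsonFromDatabase helper are hypothetical stand-ins for the asker's existing code.
using System.Web;

public class JsonHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // read the parameters sent by urllib2, e.g. ?paramOne=ValueOne
        string paramOne = context.Request.QueryString["paramOne"];

        // placeholder for the already-completed "query the database, build JSON" step
        string json = GetJsonFromDatabase(paramOne);

        context.Response.ContentType = "application/json";
        context.Response.Write(json);
    }

    public bool IsReusable { get { return true; } }

    private string GetJsonFromDatabase(string paramOne)
    {
        return "{}"; // hypothetical
    }
}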
Okay guys, here's the deal. I'm a total C# beginner, but I've been advised to learn it. I'm desperate to get a working Bitcoin value grabber going, so that I can record the values to a text file. Another thing is that the value must be from MtGox, either their API or their homepage.
I've spent a while dealing with HTTP requests and JSON decoding (grrr...), but I don't see the point in spending so much time on my learning code when I'm sure there is someone else out there who can just help me to write it.
Does anyone think they might be able to help with this? Just a couple of lines to pull the last Bitcoin value from MtGox.
Any contributions are much appreciated.
Will.
EDIT:
var client = new WebClient();
string json = client.DownloadString("http://data.mtgox.com/api/2/BTCUSD/money/ticker_fast");
// DownloadString already returns a string, so no Convert.ToString call is needed
That is all I needed to write. Wow. Thanks for the help though, Oliver.
This is a Q&A site, so don't expect others to write your code for you.
You can access their API using any language that supports HTTP calls.
Here's a great document to get you started:
MtGOX api docs
This is a judgement call, but it sounds like you're new to programming in general. Don't commit yourself to a language; always use the best tool for the job. I'd suggest using something a bit more "web friendly" for this sort of work. NodeJS would make this much easier for you, as it understands the JSON returned from their API natively, without all sorts of wrappers and instances of the WebClient class or HttpResponseMessages as you have in .NET. Also consider the purpose of writing the data to a text file: what is going to consume that data? It's possible that you can skip the file and just interact with the down-level consumer directly.
What format does the data file need to be in? If you can save it as JSON, you could do this in C#:
using System.Net;
//...
WebClient client = new WebClient();
client.DownloadFile("https://mtgox.com/api/1/BTCUSD/ticker", @"C:\folder\results.json");
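If the goal is the one from the question (recording the values to a text file over time), a hedged variant is to download the ticker as a string and append it with a timestamp. The file path here is just an example, and you would parse the JSON first if you only want the last-price field.
using System;
using System.IO;
using System.Net;
// ...
var client = new WebClient();
string json = client.DownloadString("https://mtgox.com/api/1/BTCUSD/ticker");
// one line per sample: timestamp followed by the raw ticker JSON
File.AppendAllText(@"C:\folder\ticker-log.txt", DateTime.UtcNow + " " + json + Environment.NewLine);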
We are developing a product that will be deployed for a number of clients. In one area of this product, we need to execute some formulas. The problem is that these formulas can be different for each client, so we need a good architecture that can 'execute code dynamically'.
The application is a web application hosted on SharePoint 2010 (and therefore .NET 3.5).
A simplified example:
I have a class MyClassA with two numeric properties, PropA and PropB.
For one client the formula is PropA + PropB. For another it is PropA - PropB.
This is just a simplified example, as the real formula is more complex than this.
I need a way to define the formula for client A (for example PropA + PropB), perhaps in an XML file or database.
I would then load and execute that code dynamically?
Does this make sense? Has anyone implemented a similar scenario in a production environment?
I have found that the following article describes a similar situation but I do not know whether it is 100% reliable for a production environment:
http://west-wind.com/presentations/dynamicCode/DynamicCode.htm
I have also found that IronPython could solve a similar problem, but I cannot understand how I would use my ClassA with IronPython.
Any assistance would be greatly appreciated.
Update ...
Thanks everyone for the detailed feedback. It was a very constructive exercise. I have reviewed the different approaches and it seems very likely that we will go ahead with the nCalc approach. nCalc seems to be a wonderful tool and I am already loving the technology :)
Thank you all!!
Look into nCalc.
You could store your calculations in a database field then simply execute them on demand. Pretty simple and powerful. We are using this in a multi-tenant environment to help with similar types of customization.
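A minimal sketch of that idea, assuming NCalc's Expression / Parameters / Evaluate API and using the formula from the question (the myClassA variable is just the instance described there):
using NCalc;
// ...
// formula string loaded from a per-client database field or XML file
string formula = "PropA + PropB";

var expression = new Expression(formula);
expression.Parameters["PropA"] = myClassA.PropA;
expression.Parameters["PropB"] = myClassA.PropB;

object result = expression.Evaluate();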
I'm only proposing this because I don't know the problem very well, but the idea would be a DLL for each formula (so you can write the code as normal C#, with normal C# functionality, instead of an uncomfortable XML file).
With MEF you can inject DLLs into your code (you just have to upload a new one when you develop it, no need to recompile the exe file) and have a different formula for each client.
This is my idea because it looks like a perfect example for the Strategy pattern.
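A rough sketch of how that could look. This is hedged: the IFormula interface and plug-in folder are assumptions, and MEF only ships with .NET 4, so on the .NET 3.5 / SharePoint 2010 stack from the question you would need the pre-release MEF bits or a similar plug-in loader.
using System.ComponentModel.Composition;
using System.ComponentModel.Composition.Hosting;
using System.Linq;

public interface IFormula
{
    double Calculate(MyClassA input);
}

[Export(typeof(IFormula))]  // lives in a per-client DLL
public class ClientAFormula : IFormula
{
    public double Calculate(MyClassA input) { return input.PropA + input.PropB; }
}

// at runtime, load whatever formula DLLs are deployed for this client
var catalog = new DirectoryCatalog(@"C:\MyApp\FormulaPlugins");  // example path
var container = new CompositionContainer(catalog);
IFormula formula = container.GetExportedValues<IFormula>().First();
double result = formula.Calculate(myClassA);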
Do you have a fixed set of formulas, or does the client have the capacity to dynamically type those in, i.e. as for a calculator?
In the first case, I'd recommend the following: a set of C# delegates, which get called (or call each other) in a particular order, and a Dictionary (or Dictionaries) of closures which fit the delegates. The closures would then be assigned to the delegates based on your predefined conditions.
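A minimal sketch of that first case (the client keys, the Func signature and the myClassA variable are placeholders):
using System;
using System.Collections.Generic;
// ...
// one closure per client, keyed by whatever identifies the tenant
var formulas = new Dictionary<string, Func<MyClassA, double>>
{
    { "ClientA", x => x.PropA + x.PropB },
    { "ClientB", x => x.PropA - x.PropB }
};

double result = formulas["ClientA"](myClassA);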
In the alternative case, I wouldn't compile .NET code based on what the client types in, since that (unless preempted) represents a server-side security risk. I would implement/adapt a simple parser for expressions that you're expecting.
P.S. nCalc suggested by Chris Lively is definitely a viable option for this kind of task, and is better than directly using delegates if you have tons and tons of formulas that you don't want to keep in memory.
ClassA with IronPython?
Keep it simple
Run through the ClassA instance for each member (maybe using a custom attribute to mark up the ones you want to use in the calculation) and end up with name=value pairs, which by some unfortunate coincidence look like assignments.
e.g.
PropA = 100
PropB = 200
Prepend that to your python script
PropAPropB = PropA + PropB
Execute the script which is the assignments and the calculation
Then it's basically
ClassB.PropAPropB = ipCalc.Eval("PropAPropB");
You can start getting really clever with it, but basically you need a method to get the inputs from an instance and one that evaluates the result of the calc and sets the properties.
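In code, that flow might look like this. It is only a sketch using the IronPython 2.x hosting API; the classA/classB variables and property names are the ones from the question, and the script string stands in for whatever per-client calculation you load from configuration.
using System;
using IronPython.Hosting;
using Microsoft.Scripting.Hosting;
// ...
ScriptEngine engine = Python.CreateEngine();
ScriptScope scope = engine.CreateScope();

// the "assignments" generated from the ClassA instance, prepended to the calculation
string script =
    "PropA = " + classA.PropA + "\n" +
    "PropB = " + classA.PropB + "\n" +
    "PropAPropB = PropA + PropB";

engine.Execute(script, scope);
classB.PropAPropB = Convert.ToDouble(scope.GetVariable("PropAPropB"));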
Bob's your mother's sister's brother...
What is the easiest way to programmatically extract structured data from a bunch of web pages?
I am currently using an Adobe AIR program I have written to follow the links on one page and grab a section of data off of the subsequent pages. This actually works fine, and for programmers I think this (or other languages) provides a reasonable approach, to be written on a case-by-case basis. Maybe there is a specific language or library that allows a programmer to do this very quickly, and if so I would be interested in knowing what they are.
Also, do any tools exist which would allow a non-programmer, like a customer support rep or someone in charge of data acquisition, to extract structured data from web pages without the need to do a bunch of copy and paste?
If you do a search on Stack Overflow for WWW::Mechanize & pQuery you will see many examples using these Perl CPAN modules.
However, because you have mentioned "non-programmer", perhaps the Web::Scraper CPAN module may be more appropriate? It's more DSL-like and so perhaps easier for a "non-programmer" to pick up.
Here is an example from the documentation for retrieving tweets from Twitter:
use URI;
use Web::Scraper;
my $tweets = scraper {
    process "li.status", "tweets[]" => scraper {
        process ".entry-content", body => 'TEXT';
        process ".entry-date", when => 'TEXT';
        process 'a[rel="bookmark"]', link => '@href';
    };
};
my $res = $tweets->scrape( URI->new("http://twitter.com/miyagawa") );
for my $tweet (@{$res->{tweets}}) {
    print "$tweet->{body} $tweet->{when} (link: $tweet->{link})\n";
}
I found YQL to be very powerful and useful for this sort of thing. You can point it at any web page on the internet, it will make the markup valid, and then you can use XPath to query sections of it. You can output the result as XML or JSON, ready for loading into another script/application.
I wrote up my first experiment with it here:
http://www.kelvinluck.com/2009/02/data-scraping-with-yql-and-jquery/
Since then YQL has become more powerful with the addition of the EXECUTE keyword, which allows you to write your own logic in JavaScript and run it on Yahoo!'s servers before returning the data to you.
A more detailed writeup of YQL is here.
You could create a data table for YQL to get at the basics of the information you are trying to grab, and then the person in charge of data acquisition could write very simple queries (in a DSL which is pretty much English) against that table. It would be easier for them than "proper programming", at least...
There is Sprog, which lets you graphically build processes out of parts (Get URL -> Process HTML Table -> Write File), and you can put Perl code in any stage of the process, or write your own parts for non-programmer use. It looks a bit abandoned, but still works well.
I use a combination of Ruby with hpricot and watir; it gets the job done very efficiently.
If you don't mind it taking over your computer, and you happen to need JavaScript support, WatiN is a pretty damn good browsing tool. Written in C#, it has been very reliable for me in the past, providing a nice browser-independent wrapper for running through pages and getting text from them.
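A minimal sketch of what driving a page with WatiN looks like (hedged: the URL and the Find criteria are placeholders, and this assumes WatiN's IE automation class, which opens a real Internet Explorer window and should be run on an STA thread):
using WatiN.Core;
// ...
using (var browser = new IE("http://example.com/listing"))
{
    // the rendered text of the page, after any JavaScript has run
    string pageText = browser.Text;

    // or pull a specific element, e.g. a link found by its text, and follow it
    Link detailLink = browser.Link(Find.ByText("Details"));
    detailLink.Click();
}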
Are commercial tools viable answers? If so, check out http://screen-scraper.com/; it is super easy to set up and use for scraping websites. They have a free version which is actually fairly complete. And no, I am not affiliated with the company :)
So here's the concept. I'm trying to write a C# backend for CouchDB (keep reading, this has little or nothing to do with CouchDB).
Here's the basic idea:
1) The DB Driver uploads an arbitrary string to the DB Server.
2) The DB Driver tells the DB server to run the code and produce a view
3) The DB Server loads my C# backend, hands it the input string
4) The DB Server hands the function to the C# backend
5) The DB Server hands the C# backend data to execute the function over
6) The C# backend executes the code and returns the data to the DB Server
7) The DB Server stores (or otherwise sorts) the results and sends them back to the DB Driver
So the problem I'm having is with the remote code execution. All my models are C# objects, so it'd be really nice to be able to write the code on the front end and have it transferred to the back end for execution, without having to copy DLLs etc. Currently in CouchDB this query language is JavaScript, so it's super simple to just copy the code into a string.
Any options like this in C#? Can I get a string version of a function? Should I create a external server that hosts these DLL assemblies and serves them up to the CouchDB backends? Should I compile them on the front end, and embed the .cs files as a resource? I'm just brainstorming at this point. Any thoughts/ideas/questions are welcome.
Would you be able to achieve this using Expression trees (System.Linq.Expressions)?
Perhaps some means of passing a serialised expression tree to the back end, compiling this into a lambda/Func<> on the server side and then executing it?
Have not played with them except for some curiosity research when Linq was unveiled, but it may help.
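A sketch of the second half of that idea (compiling an expression tree and running it on the server). Note that .NET has no built-in serializer for expression trees, so how the tree travels from front end to back end is left as an assumption here, and the example expression is purely illustrative.
using System;
using System.Linq.Expressions;
// ...
// built (or deserialized) on the server side
Expression<Func<string, int>> mapExpr = doc => doc.Length;

// compile the tree into a delegate and execute it over the data
Func<string, int> map = mapExpr.Compile();
int result = map("some document");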
It sounds like what you're looking for is a JavaScript-style eval method for C#. Unfortunately that does not exist today and is not trivial to implement. The two main approaches for dynamically executing C# code are:
1) Building lightweight methods via Reflection.Emit
2) Compiling assemblies on the fly (via CodeDom or straight calls to csc); a sketch of this route is shown below
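A minimal sketch of the second approach. The namespace, class and method names and the source string are hypothetical; in practice you would also sandbox and cache the compiled assembly.
using System;
using System.CodeDom.Compiler;
using System.Reflection;
using Microsoft.CSharp;
// ...
// C# source received as a string, e.g. uploaded by the DB driver
string source = @"
    namespace Dynamic
    {
        public static class MapFunction
        {
            public static string Run(string input) { return input.ToUpper(); }
        }
    }";

var provider = new CSharpCodeProvider();
var options = new CompilerParameters { GenerateInMemory = true };
options.ReferencedAssemblies.Add("System.dll");

CompilerResults results = provider.CompileAssemblyFromSource(options, source);
if (results.Errors.HasErrors)
    throw new InvalidOperationException("Compilation of the uploaded code failed");

MethodInfo run = results.CompiledAssembly.GetType("Dynamic.MapFunction").GetMethod("Run");
object output = run.Invoke(null, new object[] { "some document data" });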
Another option to consider here is to take advantage of a dynamic language. Starting with C# 4.0, it's very easy to interop with dynamic languages. Instead of the backend being passed C# code, why not pass Python or Ruby and take advantage of the DLR?
Define "function." Is this arbitrary code (which must be dynamically compiled and executed), or a list of methods that could be attached to a class?
If the latter (which is what you should be doing: allowing arbitrary dynamic code execution not only violates the .NET development model (in my opinion, no flames please), it's also a huge security risk), you can use a web service.
Web services and WCF services are an excellent way to distribute processing between machines.
There are a couple of projects I've used in the past that might help you out:
Vici Parser
Vici Parser is a .NET library for late-bound expression parsing and template rendering. It also includes an easy to use tokenizer that can be used for parsing all kinds of file formats (a full-featured JSON parser is included).
This CodeProject article
This example shows an "Eval" function for C# and implements different usages of this evaluate function. The library is written in C# and can be tested with the MainForm in the solution.
Other alternatives include using XSLT scripting (something Umbraco CMS makes use of) and perhaps something like LUA.NET, which would enable you to sandbox what the code can do.