A .NET math expression COMPILER for Windows Phone? - c#

I'm making a function-graphing app for Windows Phone where the user inputs a function for the app to draw. I need a fast (ideally the fastest possible) expression evaluator. I've seen a lot of math parsers out there, but it seems none of them separate compilation from evaluation. I need that separation because I have to calculate a lot of data points (1000+) at 30 or, even better, 60 fps, and all of the parsers I found take a string and parse and evaluate it in the same step. Since I'm making this for Windows Phone, I cannot compile C# code directly because of platform restrictions.
It should be able to handle something like: 2^2*sin(x/20)+abs(x)/log(x, 2)
SOLVED:
I'm really angry with myself because I couldn't google this, and finally, when I ask a question here, I find the answer myself.
This did the trick:
http://nicoschertler.wordpress.com/2011/09/22/math-parser-using-lambda-expressions/
It's so good that a 1.5 GHz dual-core phone can run it at 1/4-pixel precision at 60 fps!
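For anyone landing here later, the key point is that the expression tree is built once and compiled to a delegate, after which evaluation is just a method call per data point. Below is a minimal sketch of that idea using System.Linq.Expressions (the same mechanism the linked parser uses); the tree for the example formula is built by hand here, whereas a real parser would produce it from the input string.

using System;
using System.Linq.Expressions;

class ExpressionDemo
{
    static void Main()
    {
        // Parameter standing in for the user's variable x.
        ParameterExpression x = Expression.Parameter(typeof(double), "x");

        // Hand-built tree for: 2^2 * sin(x/20) + abs(x) / log(x, 2)
        Expression body = Expression.Add(
            Expression.Multiply(
                Expression.Constant(Math.Pow(2, 2)),
                Expression.Call(typeof(Math).GetMethod("Sin"),
                    Expression.Divide(x, Expression.Constant(20.0)))),
            Expression.Divide(
                Expression.Call(typeof(Math).GetMethod("Abs", new[] { typeof(double) }), x),
                Expression.Call(typeof(Math).GetMethod("Log", new[] { typeof(double), typeof(double) }),
                    x, Expression.Constant(2.0))));

        // Compile once per formula edit...
        Func<double, double> f = Expression.Lambda<Func<double, double>>(body, x).Compile();

        // ...then evaluate as many points as you like at delegate speed.
        for (double i = 1; i <= 1000; i++)
            Console.WriteLine(f(i));
    }
}

Compilation costs a few milliseconds once per formula change; the per-point cost afterwards is roughly that of hand-written code, which is what makes 1000+ points per frame feasible.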

Strange, because there are quite a few parser generators out there, which in return give you a ready-to-use parser. It is up to you whether you evaluate while parsing or just build a math expression for later evaluation.
One disadvantage of a general parser generator is, well, the fact that it is general, so it does more than you ask. On the other hand, it is easy to add something when the need strikes (I wouldn't dare touch anything in the code you linked).
My own framework (i.e. parser generator) contains a very simple calculator (so it does math expression parsing) in two forms -- written as C# code (slow) and generated into an action table (fast -- it could of course be faster).
http://sourceforge.net/projects/naivelangtools/

Related

User defined formulas, evaluation, and speed

Let's say we have an array of a million elements, and we want to accept user input on what kind of math to apply to each of those elements.
Would the program need to evaluate the user formula (a string) a million times, once for each of the elements? Or can the formula itself somehow be saved, interpreted only once, and then applied in a loop to the million elements?
I'm just trying to get a general idea of how this works, because right now it looks like Excel-type programs are always interpreted, which is way too slow for my data research. So basically what I'm doing is re-compiling my user-defined math each and every time I change it, and there seems to be no way of compiling a program and recompiling part of it while it is running, so that it could accept user input, compile it, and run it without quitting. Maybe with DLLs and separate app domains, but then all the data has to be marshalled.
I just don't get how one gets top speed for data research when everything is going against you, including the operating system, which won't allow a DLL to be unloaded and reloaded because of security concerns, or something. Then again, maybe I'm just spouting nonsense, since I only began programming a few months ago.
In case you are interested in, or prefer, implementing the solution yourself, you could parse the formula into a (binary) tree, where each node represents an operator (+, -, *, ...) or a function (cos(), max(), ...) and each leaf represents either a constant or one of the formula's variables.
This can be done in a pretty efficient way and then applied to the entire data set by recursively calculating the value of the root while filling in the correct variables at the leaves.
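A minimal sketch of such a tree in C# (the node types and the dictionary-based variable lookup are illustrative choices, not taken from the linked question):

using System;
using System.Collections.Generic;

// Parse once into nodes like these, then evaluate per data point.
abstract class Node
{
    public abstract double Eval(IDictionary<string, double> vars);
}

class Const : Node
{
    public double Value;
    public override double Eval(IDictionary<string, double> vars) => Value;
}

class Var : Node
{
    public string Name;
    public override double Eval(IDictionary<string, double> vars) => vars[Name];
}

class BinOp : Node
{
    public Node Left, Right;
    public Func<double, double, double> Op;   // e.g. (a, b) => a + b
    public override double Eval(IDictionary<string, double> vars) =>
        Op(Left.Eval(vars), Right.Eval(vars));
}

class TreeDemo
{
    static void Main()
    {
        // Tree for x * 2 + 1, built by hand (a parser would produce it from the formula string).
        Node tree = new BinOp
        {
            Op = (a, b) => a + b,
            Left = new BinOp { Op = (a, b) => a * b, Left = new Var { Name = "x" }, Right = new Const { Value = 2 } },
            Right = new Const { Value = 1 }
        };

        var vars = new Dictionary<string, double>();
        for (double x = 0; x < 1000000; x++)
        {
            vars["x"] = x;                     // fill in the variable at the leaf
            double y = tree.Eval(vars);        // no re-parsing per element
        }
    }
}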
To figure out how best to do this, this question might provide some pointers (though I am fairly certain the answer provided there is only one of several ways):
Create Binary Tree from Mathematical Expression
Hope this helps or at least is of some interest to other readers!
If you want to run user input against a data set with expressions compiled on the fly, then check out a suitable C# REPL.
You should be able to simply type in your transformations etc. and have the REPL compile and apply them. It's a useful tool/technique for trying out solutions prior to coding them up properly in classes/components.
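If you would rather embed that ability in your own program than drive a standalone REPL, the Roslyn scripting API (the Microsoft.CodeAnalysis.CSharp.Scripting NuGet package) gives you roughly the same compile-once behaviour. A sketch under that assumption; the Globals class and the formula are made up for the example:

using System;
using Microsoft.CodeAnalysis.CSharp.Scripting;
using Microsoft.CodeAnalysis.Scripting;

public class Globals { public double x; }

class ScriptDemo
{
    static void Main()
    {
        // Compile the user's formula once...
        Script<double> script = CSharpScript.Create<double>(
            "Math.Sin(x) * x + 1",
            ScriptOptions.Default.WithImports("System"),
            globalsType: typeof(Globals));
        ScriptRunner<double> runner = script.CreateDelegate();

        // ...then apply it to every element without re-parsing or re-compiling.
        var globals = new Globals();
        for (int i = 0; i < 1000000; i++)
        {
            globals.x = i;
            double y = runner(globals).Result;
        }
    }
}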

What Python features will excite the interest of a C# developer? [closed]

As someone who's been happily programming in C# for quite some time now and is planning to learn a new language, I find the Python community more closely knit than many others.
Personally, dynamic typing puts me off, but I am fascinated by the way the Python community rallies around it. There are a lot of other things I expect I would miss in Python (LINQ, expression trees, etc.).
What are the good things about Python that developers love? Stuff that'll excite me more than C# does.
For me it's the flexibility and elegance, though there are a handful of things I wish could be pulled in from other languages (better threading, more robust expressions).
Typically I can write a little bit of code in Python and do a lot more than the same number of lines would in many other languages. Also, in Python code form is of utmost importance, and the syntax lends itself to highly readable, clean-looking code. That of course helps out with maintenance.
I love having a command-line interpreter that I can quickly prototype an algorithm in, rather than having to start up a new project, code, compile, test, repeat. Not to mention that I can use it to help me automate my server maintenance as well (I double as an SA for my company).
The last thing that comes to mind immediately is the vast number of libraries. A lot of things are already solved out there: the built-in library has a lot to offer, and the third-party ones are often very good (though not always).
Being able to type in some code and get the result back immediately.
(Disclaimer: I use both C# and Python regularly, and I think both have their good and bad points.)
I'm primarily a .NET developer and use Python for my personal projects.
What are the good things about python that developers love?
I can say for myself - Python is like a breath of fresh air.
1) It's simple to learn; it took me about a week of evenings. I'm talking about Python + Django. The Python syntax is quite simple.
2) It's simple to use. No trouble installing Python + Django on Windows at all.
3) It can be run on Windows and UNIX.
4) I need it for the web, so I get cheaper hosting than with ASP.NET.
5) All the advantages of the Python language over C#. Like tuples - so useful!
The only thing I don't like is that my favorite IDE Visual Studio doesn't support it (I know about IronPython, don't you worry).
I'm a very heavy user of both C# and Python; I've built very complicated applications in both languages, and I've also embedded Python scripting in my major C# application. I'm not using either to do much in the way of web work right now, but other than that I feel like I'm pretty qualified to answer the question.
The things about Python that excite me, in particular:
The deep integration of generators into the language. This was the first thing that made me realize that I needed to take a long, serious look at Python. My appreciation for this has deepened considerably since I've become conversant with the itertools module, which looks like a nifty set of tools but is in fact a new way of life.
The coupling of dynamic typing and the fact that everything's an object makes pretty sophisticated techniques extremely simple to implement. It's so easy to replace logic with tables in Python (e.g. o = class_map[k]() instead of if k == 'foo': o = Foo()) that it becomes a basic technique. It's so normal in Python to write methods that take methods as parameters that you don't raise an eyebrow when you see d = defaultdict(list).
zip, and the methods that are designed with it in mind. It takes a while before you can intuitively grasp what dict(zip(k, v)) and d.update(zip(k, v)) are doing, but it's a paradigm-shifting moment when you get there. An entire universe of uninteresting and potentially error-laden code eliminated, just by using one function. Then you start designing functions and classes with the expectation that they'll be used in conjunction with zip, and suddenly your code gets simpler and easier. (Protip: Or itertools.izip. Or itertools.izip_longest.)
Speaking of dictionaries, the way that they're deeply integrated into the language. Understanding what a line of code like self.__dict__.update(**kwargs) does is another one of those paradigm-shifting moments.
List comprehensions and generator expressions, of course.
Inexpensive exceptions.
An interactive interpreter.
Function decorators.
IronPython, which is so much simpler to use than we have any right to expect.
And that's without even getting into the remarkable array of functionality in the standard modules, or the ridiculous bounty of third-party tools like BeautifulSoup or SQL Alchemy or Pylons.
One of the most direct benefits that I've gotten from getting deeply into Python is that it has greatly improved my C# code. I could generally understand code that had a variable of type Dictionary<string, Action<Foo>> in it, but it didn't seem natural to write it. (I use static dictionaries to replace hard-coded logic far more frequently today than I did a year ago.) I have no difficulty understanding what LINQ is doing now, or how IEnumerable<T> and return yield work.
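As a hypothetical illustration (Foo, Bar and the keys are made-up names), that table-driven style looks like this in C#:

using System;
using System.Collections.Generic;

class Foo { }
class Bar { }

class DispatchDemo
{
    // A static table replacing a switch/if chain, in the spirit of Python's class_map[k]().
    static readonly Dictionary<string, Func<object>> Factory = new Dictionary<string, Func<object>>
    {
        { "foo", () => new Foo() },
        { "bar", () => new Bar() },
    };

    static void Main()
    {
        object o = Factory["foo"]();
        Console.WriteLine(o.GetType().Name);   // prints "Foo"
    }
}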
So what don't I like about Python?
Dynamic typing really limits what you can do with static code analysis. Not only isn't there a tool like Resharper for Python, in a language where it's possible to write getattr(x, y)() there really can't be.
It has a bunch of inelegant conventions. How I would love to be able to go back in time and try to talk GVR out of the idea that lambda expressions should be introduced with the word lambda - it's pretty damning that something as fundamental as lambda expressions should be more concise in C# than they are in Python. The leading and trailing double-underscore convention is horrible, and the fact that people mutely acquiesce to it is testimony to Dostoevsky's observation that man is the animal who can get used to anything. And don't get me started on the fact that a module with the name of StringIO was allowed to get out the door.
Some of the features that make Python work on multiple platforms also make it kind of baffling. It's easy to use import, but it's really not easy to understand what the hell it's actually doing. (Where is it looking? What does __init__.py do? Etc.)
The amazingly rich library of standard modules is so amazingly rich that it's hard to know what's in it. It's often easier to write a function than it is to find out whether or not there's something in the standard library that does the same thing - I'm looking at you, itertools.chain.
Your question is kind of like a plumber asking why carpenters are always going on and on about hammers. After all, the plumber doesn't have a hammer and has never missed it. Python (even IronPython) and C# target different types of developers and different types of programs. I am very comfortable in Python and enjoy the freedom to focus on the business rules without being distracted by the syntax requirements of the language. On the other hand, I have written some fairly substantial code in C# and would be very concerned about the lack of type safety had I taken on the same task in Python. This is not to say that Python is a "toy" language. You can write (and people have written) a complete medium-sized or large application in Python. You have the freedom of dynamic typing, but you also have the responsibility to keep it all straight (frameworks help here). Similarly, you can write a small application in C#, but you will bring along some overhead you likely do not need.
So if the problem is a nail, use a hammer; if the problem is a screw, use a screwdriver. In other words, spend some time learning Python, get to know its strengths (text processing, quick coding cycles, simple clean code, etc.), and then when you are looking at tackling a new problem, ask whether you would be better off in Python or C#. One thing is certain: as long as C# is the only programming language you know, it is the only one you will ever use.
Pat O
My language of choice is C#, and I didn't quite see the point for me to learn Python so far. This talk from PDC09 really piqued my interest: the guy demonstrates how you can use IronPython (or IronRuby) to make a C# app scriptable (in his demo, drop a Python script in a text box, and it works with/extends your C# code). I found this really fascinating: I don't even know where I would start to do something similar in C#, and this made me at least appreciate that it brings something different to the table, which could really enrich what I can develop!
I'm an asymmetrical user of both languages, in a sense that I use C# mostly professionally and Python for all my "fun" projects (not that work is never fun, but... you know...)
This difference of context may skew my perspective, including my opinion that they are two distinct types (pun intended) of languages for, generally, distinct purposes.
This said, it may not be a coincidence that Python is, at this point in time, [one of?] the languages of choice for all kinds of cutting-edge, somewhat scholarly, technology/science-oriented projects. (And BTW, "scholarly" here does NOT imply that Python is a university toy; plenty of "serious" applications in plenty of domains/industries are proof to the contrary.) This may be due to several factors:
(I don't develop most points, readily well expressed in other responses)
the openness and quasi universal availability of Python (unlike C# !)
the lightweight / ease of use / low learning curve
the extensive, high-quality "standard" library and the even more extensive (occasionally of poor quality, but on the whole available, open-sourced, etc.) additional libraries
the wide array of open source projects in Python language
the relative ease to bind with C/C++ for reusing legacy code, but also for placing performance-critical portions of a project
the generally higher level of abstraction of many constructs of the language
the multi-paradigms (imperative, object oriented and functional)
the availability of practitioners in so many domains of science and technology
and, yes, the "herd mentality effect" mentioned in a remark, possibly in a [self?-]deriding way. The fact that a language attracts a broad, "closely knit" community makes it attractive too, beyond the superficial ("looks cool" and such) traits of herd mentality. Put in a broader context, sometimes the best technology/language to use is not measured by its intrinsic merits but by the overall "picture", including the user community.
I like all the stuff with [] and {}, and selectors like [-1:1]. The possibility of writing less code that nevertheless says something more meaningful makes writing models and other declarative things very DRY.
Like any programming language, it is just a tool in the box or a brush by which you may paint your creation. Any creative endeavour requires that the artist loves the tools he uses; otherwise, the outcome suffers. Some people like Python for the same reason others love Perl. Incidentally, I have found that most Python lovers loathe Perl's flexible and expressive syntax. As a Perl lover, I don't hate Python, but consider it to be overly structured and restrictive.
If you ask me, all of these throngs of people who seem to love Python were silently suffering under the tool choices before Python came into being. Some suffered under Perl, others under something else. In other words, I believe that when Python came along, it found a large group of silent sufferers longing for a tool like Python.
I can't program in Python because I can't "think" in Python. I can "think" in Perl, therefore, it is the tool I prefer. The silently suffering mass of, now, Python users seem to have found some long lost salvation. Now if they could only keep their evangelism to themselves :).
If you are familiar with the .NET CLR and prefer a statically-typed language, but you like Python's lightweight syntax, then perhaps Boo is the language for you.
Don't get me wrong, I am and will always be a devoted fan of C#.
But sometimes there are things I can't do in C#. Although C# keeps reducing those gaps, Python is still the language I go to to fill them.
It's dynamic, flexible, powerful, and clean. Lovely language. Whenever I need to script or build dynamic or functional (as in functional programming) software, I go Python.
For me, Python is the most elegant language I've used. The syntax is minimalist (significantly less punctuation than most) and intentionally modeled after the pseudo-code conventions that programmers ubiquitously use to outline their intentions.
Python's if __name__ == '__main__': suite encourages re-use and test driven development.
For example, the night before last I hacked together a script to run thousands of ssh jobs (about 100 concurrently) and gather up all the results (output, error messages, exit values) ... and record the time taken on each. It also handles timeouts (an ssh command can stall indefinitely when connecting to a thrashing system -- its connection timeout and retry options don't apply after the socket connection is made, no matter whether the authentication stalls). This only takes a few dozen lines of Python, and it really is easiest to create it as a class (defined above the __main__ suite) and do my command-line parsing in a simple wrapper down inside __main__. That was sufficient to do the job at hand (I ran the script on 25,000 hosts the next day, in about two hours), and I can now use this code in other scripts as easily as:
from sshwrap import SSHJobMan

cmd = '/etc/init.d/foo restart'
targets = queryDB(some_criteria)

job = SSHJobMan(cmd, targets)
job.start()

while not job.done():
    completed = job.poll()
    # ...
    # Deal with incremental disposition of completed jobs

for each in sorted(job.results):
    # ...
    # Summarize results
... and so on.
So my script can be used for simple jobs ... and it can be imported as a module for more specialized work that couldn't be described on my wrapper's command line. (For example I could start up "consumer" subprocesses for handling other work on each host where the job was successful while spitting out service tickets or automated reboot requests for all hosts reporting timeouts or failures, etc).
For modules which have no standalone usage I can use the __main__ suite to contain unit-tests. Thus every module can contain its own tests ... which, in fact, can be integrated into the "doc strings" using the doctest module from the standard libraries. (Which, incidentally, means that properly formatted examples in the documentary comments can be kept in sync with the implementation ... since they are parts of the unit-test suite).
The main thing I like about Python is its very concise, readable syntax. Though using indentation as a block delimiter can seem strange at first, once you begin to code a lot in the language it starts to make sense. Though the core language is quite simple, its more advanced features, e.g. list comprehensions, decorators and generators, are rather useful too.
In addition, the Python standard library is just fantastic; its documentation is very well written, and it contains a lot of very useful packages. I also find that there are plenty of good bindings for C libraries, such as PyGTK, Webkit and Qt, to name but a few.
One caveat is that Python, like most dynamic languages, is quite slow in comparison with compiled, statically-typed languages. However, you can easily extend it with C, allowing you to write code requiring better performance in C and the rest in Python.
It's a great language overall, and (for me at least) makes coding more productive and enjoyable.

Is there a tool to convert ANSI C code to C#?

I need to import some ANSI C code into a project I'm working on. For reasons I prefer not to go into, I want to refactor the code to C# rather than trying to wrap the original code. It will take me perhaps a couple of days to do the work by hand, but before I start, is there an automated tool that can get me most of the way there? I'm a cheapskate and the work I'm doing is pro bono, so free tools only please.
If manual refactoring is "only" going to take a few days, that would get my vote. Depending on what the C code is doing and how it is written (pointers, custom libraries, etc.) an automated converter may just make a mess. And untangling that mess could be a larger task than just converting by hand.
Setting up a converter (if one is even available), then cleaning up and refactoring the converted code, would probably be something on the order of weeks or months, so you'll be better served just going ahead with the manual rewrite.
There is an experimental CLI Back-End and Front-End for GCC. It is already capable of compiling a subset of C programs into CIL, the byte-code that the CLR runs.
(The webpage makes it seem like the code was only developed over a few months and then ignored since then, but it's out of date; ST Microelectronics is continuing maintenance and development.)
You don't specify why you want a C to C# translator, but if you just want to get C and C# to play together without P/Invoke or COM, it might be good enough.
It may make sense to start by getting the existing code to compile as managed C++ aka C++/CLI. Assuming that went smoothly enough, then you have a working, testable foundation on which to build. Move key features to their own classes, and as needed, rewrite as C# along the way.

Algorithm visualization for C#

Is there any software that visualizes algorithms from code? As a flow chart or something similar. Not dependencies, inheritance and that kind of thing, but the code inside a function, or a series of functions.
I don't know about inside a function, but VS2010 has sequence diagram generation from code - see here or here
I think you may be looking for Code Rocket.
It provides flowchart and pseudocode visualizations of code methods and algorithms, embedded directly inside Visual Studio and Eclipse - and there is a separate Designer application for working outwith IDEs too.
Aivosto has a tool to visualize source code from many languages: Visustin. It has been on the market for a long time. I tried it a very long time ago and was not 100% convinced. Maybe you want to give it a try and evaluate whether it's worth the money for you.
For me, in order to understand some complex algorithm I have to experiment with it anyway; having a graph helps a bit, but as a coder you can probably visualize loops and decision trees well enough without software. I don't want to discourage you from using it, just try it out before investing money: having graphs is nice, but it also has to be useful.

What's the "Hello World!" of genetic algorithms good for?

I found this very cool C++ sample, literally the "Hello World!" of genetic algorithms.
So I decided to re-code the whole thing in C#, and this is the result.
Now I am asking myself: is there any practical application along the lines of generating a target string starting from a population of random strings?
EDIT: my buddy on Twitter just tweeted that it "is useful for transcription type things such as translation. Does not have to be Monkey's". I wish I had a clue.
Is there any practical application along the lines of generating a target string starting from a population of random strings?
Sure. Imagine any scenario in which you know how to evaluate the fitness of a particular string, and in which the choices are discrete and constrained in some way (a small fitness sketch follows the list below):
Picking pronounceable names ("Xhjkxc" has low fitness; "Artekzo" has high fitness)
Trying out a series of chess moves
Guessing the combination to a safe, assuming you can tell how close you are to unlocking each tumbler
Picking phone numbers that evaluate to words (e.g. "843-2378" has high fitness because it spells "THE-BEST")
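To make that concrete, here is a small C# sketch of two interchangeable fitness functions: the distance-to-target fitness from the "Hello World" GA itself, and a toy pronounceability heuristic for the first bullet above. The names and the heuristic are made up for the example; the point is that only the fitness function has to change for the same GA loop to search a different space.

using System;
using System.Linq;

class FitnessDemo
{
    // Classic "Hello World" fitness: character distance of a candidate from a known target (lower is better).
    static int TargetFitness(string candidate, string target) =>
        candidate.Zip(target, (a, b) => Math.Abs(a - b)).Sum();

    // Toy pronounceability heuristic: reward vowel/consonant alternation (negated so lower is still better).
    static int PronounceabilityFitness(string candidate)
    {
        int alternations = 0;
        for (int i = 1; i < candidate.Length; i++)
            if (IsVowel(candidate[i]) != IsVowel(candidate[i - 1]))
                alternations++;
        return -alternations;
    }

    static bool IsVowel(char c) => "aeiou".IndexOf(char.ToLower(c)) >= 0;

    static void Main()
    {
        Console.WriteLine(TargetFitness("Hdllo", "Hello"));        // 1
        Console.WriteLine(PronounceabilityFitness("Artekzo"));     // -4: scores better than...
        Console.WriteLine(PronounceabilityFitness("Xhjkxc"));      // 0: ...this one
    }
}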
No. Each time you run the GA, you are giving it the eventual answer. This is great for showing how a GA works and to show how powerful it can be, but it does not have any purpose beyond that.
You could write an EA that writes code in a dynamic language like IronPython with the goal of creating code that a) executes without crashing and b) analyzes the stock market and intelligently buys and sells stock.
That's a very simplistic take on what would be necessary, but it's possible. You would need a host that provides a lot of methods for the IronPython code (technical indicators, etc) and a database of ticks.
It would also be smart not to just generate any old random code, lest you format your own hard drive. You need a sandbox, you need to limit the namespaces that are accessible, and you would need to impose a time limit to avoid infinite loops. You could also provide semantic guidelines that allow it to choose appropriate approved keywords instead of just stringing random letters together -- this would greatly speed up evolution.
So, I was involved with a project that did everything but the EA. We had a satellite dish that got real-time stock ticks from the NASDAQ, a service for trading that had an API, and a primitive decision making "brain" that made decisions as the ticks came in.
Sadly, one of the partners flipped out, quit his job, forked the project (got his own dish, etc), and started trading with logic that wasn't ready. He lost a bunch of money. It turns out that for some people this type of project is only a step away from common gambling. But anyway, the project kind of fizzled out after that. Evolving the logic part is the missing link though. And I know there are people out there doing this type of thing.
I have used GA in 2 real life research problems.
One was a power optimization problem (maximize number of appliances turned on, meeting the available power constraint and service guarantee for each appliance)
Another was for radio network optimization, maximizing the coverage area given a fixed equipment budget
GA has one main disadvantage: it usually works at genetic speed, so using it in seriously time-dependent projects is quite risky.
