I'm looking for a .NET distributed map/reduce framework. I intend to use it for real-time data querying and to process queries in parallel across multiple nodes. I'm currently using WCF for communication between the web tier and the app tier.
For example, suppose I have 5 nodes, each holding data in memory. If I pass a filter to the 5 nodes, each node executes the filter against its chunk of the data, and the results are reduced back into a final answer.
I'm just wondering if there is already a framework that can map the jobs out and reduce the results back. I was looking for something more like Storm's Nimbus (Twitter's real-time map/reduce). I can't use Nimbus because of many complications, and ZooKeeper has too much overhead.
I'm trying to achieve the following using the framework:
1) Map the job (essentially a request broadcast to all available nodes) to the available nodes and reduce the results.
2) On failover, map the job to a new node.
3) Manage the cluster (if a node is down, remove it from the list of available servers).
The data will be in memory, so I don't need a distributed file system. .NET with WCF as the communication layer underneath would be ideal, but if there are other frameworks (any language), please let me know.
Any help (framework, code project, research papers, actual code :) ) would be appreciated.
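For what it's worth, the scatter/gather part of this is not much code on top of plain WCF and the TPL. Below is a minimal sketch of the pattern under some assumptions: the INodeService contract, the ExecuteFilterAsync operation, and the node addresses are hypothetical placeholders (not part of any existing framework), and failover/cluster management (points 2 and 3) are left out entirely.

    // A minimal scatter/gather sketch over WCF, not an existing framework.
    // INodeService, ExecuteFilterAsync, and the endpoint addresses are
    // hypothetical; failover and cluster membership are omitted.
    using System.Collections.Generic;
    using System.Linq;
    using System.ServiceModel;
    using System.Threading.Tasks;

    [ServiceContract]
    public interface INodeService
    {
        // Each node runs the filter against its in-memory chunk of the data.
        [OperationContract]
        Task<int[]> ExecuteFilterAsync(string filter);
    }

    public static class ScatterGather
    {
        public static async Task<int[]> QueryAsync(
            IEnumerable<string> nodeAddresses, string filter)
        {
            var binding = new NetTcpBinding();

            // "Map": broadcast the filter to every available node in parallel.
            Task<int[]>[] calls = nodeAddresses
                .Select(addr => ChannelFactory<INodeService>
                    .CreateChannel(binding, new EndpointAddress(addr))
                    .ExecuteFilterAsync(filter))
                .ToArray();

            // "Reduce": wait for all partial results and merge them.
            int[][] partials = await Task.WhenAll(calls);
            return partials.SelectMany(p => p).ToArray();
        }
    }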
I'm not sure what your comments on Nimbus/ZooKeeper mean, as these are management components.
Storm does sound suitable for your use case, but so do others, like Hazelcast. I'd need more info on your needs to see which of the solutions might be suitable.
I guess one of the important questions would be what you mean by real time. If you just need short response times and need to work with lots of data, Hazelcast may be better. If you have unstructured data coming in that you have to parse/process and make available to the user quickly, then Storm might be a better fit.
Can anyone give any examples of a single-tier architecture that is web-based? I understand that single-tier means that all layers run on the same machine... Would a SOAP service that returns a number from a database be an example, or is that two-tiered?
Would a SOAP service that returns a number from a database be an example, or is that two-tiered?
Using a database back end makes it a two-tier architecture. Another example is the old-school ASP style of development, where the .asp file directly accesses the database.
Can anyone give any examples of a single-tier architecture that is web-based?
A single-tier example might be a web page that directly opens a CSV file and reads from it. Another example is a web service that does not require data at all, like a time service.
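To make that concrete, here is a minimal sketch of such a single-tier page as an ASP.NET HTTP handler; the file name products.csv is an assumption for illustration.

    // A minimal single-tier example: an ASP.NET handler that reads a local
    // CSV file directly. No separate database tier is involved; everything
    // runs inside the web server process. "products.csv" is a made-up name.
    using System.IO;
    using System.Web;

    public class CsvHandler : IHttpHandler
    {
        public bool IsReusable { get { return true; } }

        public void ProcessRequest(HttpContext context)
        {
            string path = context.Server.MapPath("~/App_Data/products.csv");
            foreach (string line in File.ReadLines(path))
                context.Response.Write(line + "\n");
        }
    }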
Actually, there is no such thing as a single-tier web application.
Yes, I insist :)
That's because the web browser and the client machine are actually a tier of their own.
But to make things easy, the community tends to drop this tier, since it is out of the developer's hands.
Anyway, if you take any web page that does not deal with a database, or, put better, as csharptest.net said:
A single-tier example might be a web page that directly opens a CSV file and reads from it. Another example is a web service that does not require data at all, like a time service.
then you may consider that single-tier.
I need to develop an evaluation tool using C# that will run on the system for hours and then show the overall performance of the system.
The system is supposed to run a service, and we want to evaluate how this service affects the performance of the system. It would be great if I could use the performance counters that are available in "Windows Performance Monitor"... I'm not sure if there is an API available for developers to use them.
I was just looking for suggestions...
Thanks
If it were me, I'd use perfmon. The advantages are:
A well-known data archiving model that offers multiple formats.
Existing tooling to slice and dice the data, including visualization.
Integrates with other systems if the client cares (i.e., it lets them pull the data into other performance tooling).
Someone else's code. :)
You can wrap perfmon and invoke it programmatically if you want. Worst case, just invoke it via the command line and start/stop collection that way.
Of course, you can also expose your own performance counters for app-specific stuff. There are loads of APIs for this in just about every programming environment I can think of on Windows, including, of course, C#.
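As a rough sketch, reading the standard perfmon counters from C# only takes System.Diagnostics.PerformanceCounter. The category and counter names below are the standard Windows ones, but availability can vary by machine and locale, so treat them as assumptions to verify on the target system.

    // Sampling built-in perfmon counters from C# via PerformanceCounter.
    using System;
    using System.Diagnostics;
    using System.Threading;

    class CounterSampler
    {
        static void Main()
        {
            var cpu = new PerformanceCounter("Processor", "% Processor Time", "_Total");
            var mem = new PerformanceCounter("Memory", "Available MBytes");

            cpu.NextValue(); // the first CPU sample is always 0; prime the counter

            for (int i = 0; i < 10; i++)
            {
                Thread.Sleep(1000); // sample once per second
                Console.WriteLine("CPU: {0:F1}%  Free RAM: {1} MB",
                    cpu.NextValue(), mem.NextValue());
            }
        }
    }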
I would strongly suggest you use an existing option, like automating the collection of Windows Performance Monitor statistics.
Otherwise, C# may not be the best choice, since hardware is almost completely abstracted away from the code by the runtime. Additionally, the evaluation application itself may consume enough resources and time to contaminate your results. Usually the performance cost between C++ and C# is negligible, but in this case it could be a problem.
Good luck.
I'm working on a system (a .NET website) that uses class libraries to manipulate data located on an MS SQL Server. Lately, I've been wondering about ditching those classes and doing the data manipulation through WCF. That way I could consume the services from Android, Java, etc...
Well, some classes return a large amount of data, say a 125x10000 DataTable... and I'm worried that WCF will not be able to handle that, or that the system's performance will suffer too much.
What do you guys think?
Is WCF OK to use for retrieving/updating large amounts of data on a multi-user system?
If not, what are some other options?
Maybe I'm misunderstanding what you're trying to achieve, but it sounds like you're suggesting moving the database load to the application side, pulling entire data tables into the application in order to run what would otherwise be a DB operation. That's exactly the kind of work the SQL server is designed for.
By doing so, you're putting all of the load onto your application server and underutilizing the DB's area of concern. Write some code to handle locking and then let the database do its job. You'll end up writing more procs, but that's normal, and better in the long run, since you can edit them on the back end without having to recompile app code. IMO, do all of your database operations in stored procedures and return the smaller result sets through WCF.
I mean... are you really planning on piping that much info over a WCF service onto a cell phone?!
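As a rough illustration of the "small result sets through WCF" advice above, here is a minimal sketch. The OrderSummary type, the IOrderService contract, and the GetOrderSummaries stored procedure are hypothetical names, and the connection string is deliberately elided.

    // Sketch of "do the work in a stored procedure, return small DTOs over
    // WCF". OrderSummary, IOrderService, and GetOrderSummaries are made-up
    // names for illustration.
    using System.Collections.Generic;
    using System.Data;
    using System.Data.SqlClient;
    using System.Runtime.Serialization;
    using System.ServiceModel;

    [DataContract]
    public class OrderSummary
    {
        [DataMember] public int OrderId { get; set; }
        [DataMember] public decimal Total { get; set; }
    }

    [ServiceContract]
    public interface IOrderService
    {
        [OperationContract]
        List<OrderSummary> GetOrderSummaries(int customerId);
    }

    public class OrderService : IOrderService
    {
        public List<OrderSummary> GetOrderSummaries(int customerId)
        {
            var results = new List<OrderSummary>();
            using (var conn = new SqlConnection("...")) // connection string elided
            using (var cmd = new SqlCommand("GetOrderSummaries", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.Parameters.AddWithValue("@CustomerId", customerId);
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                    while (reader.Read())
                        results.Add(new OrderSummary
                        {
                            OrderId = reader.GetInt32(0),
                            Total = reader.GetDecimal(1)
                        });
            }
            return results; // small DTOs instead of a 125x10000 DataTable
        }
    }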
I mainly develop Java EE web apps, so I don't have any experience with desktop applications at all. Now a friend of mine needed a little tool for daily business, which I built with Seam and a MySQL DB in the background. Given my experience, this was done really fast.
Now I want to go further and produce a real small desktop app for him. I've looked at various options, and developing a GTK# application with Mono seems like the way to go for this little project. The application should be small and fast, so I've been wondering whether a whole MySQL server is really needed for my solution here.
What options could I evaluate instead of a database server that has to run as a service on the working machine? Storing data as XML?
To clarify, the application now has six entities (Products, ProductTypes, Colors, Sizes, Orders, Production). On a daily basis, orders and production are added to a ProductType; very simple stuff.
XML would work for small sets of data, but if you are going to have larger sets, I would recommend something like SQLite.
http://www.sqlite.org/
I have looked at various options, and I tend to like SQLite for client applications on .NET. It is a file-based solution that does not require a database server to be installed on the machine, much like using an Access database, but better.
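For illustration, here is a minimal sketch of using SQLite from Mono via Mono.Data.Sqlite (which ships with Mono); the orders.db file and the Orders table are assumptions.

    // A file-based SQLite database via Mono.Data.Sqlite; no server process
    // is required. "orders.db" and "Orders" are made-up names.
    using Mono.Data.Sqlite;

    class SqliteDemo
    {
        static void Main()
        {
            using (var conn = new SqliteConnection("Data Source=orders.db;Version=3;"))
            {
                conn.Open();
                using (var cmd = conn.CreateCommand())
                {
                    cmd.CommandText = "CREATE TABLE IF NOT EXISTS Orders (" +
                                      "Id INTEGER PRIMARY KEY, Amount INTEGER)";
                    cmd.ExecuteNonQuery();

                    cmd.CommandText = "INSERT INTO Orders (Amount) VALUES (3)";
                    cmd.ExecuteNonQuery();
                }
            }
        }
    }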
Try SQLite
Other databases may also be of interest, for example Db4o or SQL CE 4.
I'm currently in the planning phase of a project, and I came across an interesting question (for me): how the f* do DataSets work, anyway?
I mean, until now I have always used simple SQL statements hard-coded into the application, but that doesn't seem very sophisticated to me. So basically I want to start using DataSets, but there are no tutorials out there covering that. I mean, sure, I can design them in VS, but how can I access them in my code? I searched a whole day long but couldn't find a single useful tutorial on that...
Is it because DataSets are no good or just because nobody uses them?
I would be very thankful for any information concerning that...
I think you'd be interested in the first few tutorials on Microsoft's ASP.NET Data Access Tutorial page. The tutorials there guide you through creating a simple ASP.NET site that accesses data in the Northwind database (SQL Server 2005).
It does a decent job of creating, configuring, and using typed DataSets.
DataSets are an important part of ADO.NET; they work as in-memory representations of your data and are widely used. You can store more than one table in a DataSet and define relationships among them if needed. A DataSet also supports both reading and writing data, while a DataReader is only for reading data.
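To give a flavor of accessing one in code (separate from the designer-generated typed DataSets in the tutorials), here is a minimal sketch using SqlDataAdapter; the connection string and the Customers table are assumptions.

    // Filling and updating a DataSet with SqlDataAdapter. The connection
    // string and the Customers table are made up for illustration.
    using System.Data;
    using System.Data.SqlClient;

    class DataSetDemo
    {
        static void Main()
        {
            var adapter = new SqlDataAdapter(
                "SELECT Id, Name FROM Customers",
                "Server=.;Database=Shop;Integrated Security=true;");
            // Generates the INSERT/UPDATE/DELETE commands from the SELECT.
            var builder = new SqlCommandBuilder(adapter);

            var ds = new DataSet();
            adapter.Fill(ds, "Customers");       // read rows into memory

            DataTable table = ds.Tables["Customers"];
            table.Rows[0]["Name"] = "Renamed";   // edit offline (assumes a row exists)

            adapter.Update(ds, "Customers");     // push the changes back
        }
    }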
You can follow this link for a better understanding:
http://msdn.microsoft.com/en-us/library/system.data.dataset(VS.71).aspx