I am supposed to develop an enterprise-class ASP.NET Web Application that supports multiple companies.
What are the pros and cons of creating a separate database for each company? Won't it be very resource-consuming for the server to support connections to multiple databases, or is it better to have only one database? Let's say I have 2000 companies and each of them has 100 employees.
What is the best approach to design the system in this case?
You might be interested in the links below:
Multi-Tenant Data Architecture
Stack Exchange
SO
Pros: much faster querying; better overall performance, security, and scalability; easier fine-tuning of each individual database; easier deployment and troubleshooting.
Cons: maintaining 2000 backup jobs.
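If you do go with a database per company, the application typically resolves the tenant's connection string at request time. Here is a minimal sketch, assuming the tenant key is already known and that connection strings follow a hypothetical Tenant_&lt;key&gt; naming convention in web.config, with a shared fallback:

```csharp
using System.Configuration;
using System.Data.SqlClient;

public static class TenantConnectionFactory
{
    // Looks up a connection string named after the tenant, e.g. "Tenant_Contoso",
    // and falls back to a shared database if no dedicated one is configured.
    // The "Tenant_" prefix and "SharedDb" name are illustrative assumptions.
    public static SqlConnection CreateConnection(string tenantKey)
    {
        var dedicated = ConfigurationManager.ConnectionStrings["Tenant_" + tenantKey];
        var settings = dedicated ?? ConfigurationManager.ConnectionStrings["SharedDb"];
        return new SqlConnection(settings.ConnectionString);
    }
}
```

On the resource question: ADO.NET pools connections per connection string, so 2000 databases means up to 2000 separate connection pools. Idle pools are cheap, but it is something to keep an eye on.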
I have a web page where I need to get fields from many tables in SQL that can all be joined. Is it better in terms of performance to make a few separate queries to the database or one big SQL statement? I'm using MVC with Web API calls.
Thanks.
In the past I've created DB views to define the data I'm looking for. Depending on the data access framework you are using, this can also be helpful in returning the data and translating it to objects.
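As a rough illustration of that approach, here is a minimal sketch that reads from a hypothetical dbo.vw_OrderSummary view (which would encapsulate the JOINs) and maps the rows to plain objects with ADO.NET; with an ORM the mapping step would be shorter:

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;

public class OrderSummary
{
    public int OrderId { get; set; }
    public string CustomerName { get; set; }
    public decimal Total { get; set; }
}

public static class OrderSummaryReader
{
    // Reads the pre-joined view and translates each row into an object.
    // View and column names are hypothetical.
    public static List<OrderSummary> Load(string connectionString)
    {
        var results = new List<OrderSummary>();
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "SELECT OrderId, CustomerName, Total FROM dbo.vw_OrderSummary", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    results.Add(new OrderSummary
                    {
                        OrderId = reader.GetInt32(0),
                        CustomerName = reader.GetString(1),
                        Total = reader.GetDecimal(2)
                    });
                }
            }
        }
        return results;
    }
}
```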
As far as performance goes, I'm most familiar with SQL Server, and in Management Studio there is an option to "Include Actual Execution Plan". This will show you a full breakdown of your JOIN statements and which indexes are being used, if any, and will suggest indexes to improve performance. I recommend this tool to all developers on my teams when they are stuck with a slow-performing page.
One other thing to note: the database configuration also makes a difference. If you are running a local database you will have fewer concerns than if you were running a cloud-based database (Azure SQL, etc.), as those have management overhead beyond your control when it comes to availability and the physical location of your instance at any given time.
I am considering NPoco for a large business application. I am a bit concerned about whether it is worth developing a repository pattern on top of it.
I will need to handle complex SQL statements and extensive database operations on several entities in the same use case.
NPoco provides a variety of database operation functions, which I think will not be (directly) exposed to the consumer layer(s) by my repository layer.
Edit-1
Which approach is better to get the most out of NPoco?
If you want to invest time in unit tests, then the repository pattern is worth it. However, if most of your application logic is in stored procedures and functions, then you will be limited in what you can unit test and will spend most of your time on integration testing.
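If you do go with a repository layer, a thin wrapper is usually enough to keep NPoco out of the consumer layers. A minimal sketch, assuming a hypothetical Customer entity and NPoco's IDatabase with Fetch&lt;T&gt;, SingleOrDefaultById&lt;T&gt;, and Insert (PetaPoco-style method names; check them against the NPoco version you use):

```csharp
using System.Collections.Generic;
using NPoco;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public interface ICustomerRepository
{
    Customer GetById(int id);
    IList<Customer> Search(string nameFragment);
    void Add(Customer customer);
}

public class CustomerRepository : ICustomerRepository
{
    private readonly IDatabase _db; // injected NPoco database

    public CustomerRepository(IDatabase db)
    {
        _db = db;
    }

    public Customer GetById(int id)
    {
        return _db.SingleOrDefaultById<Customer>(id);
    }

    public IList<Customer> Search(string nameFragment)
    {
        // Complex SQL stays inside the repository; callers only see domain objects.
        return _db.Fetch<Customer>(
            "SELECT * FROM Customer WHERE Name LIKE @0", "%" + nameFragment + "%");
    }

    public void Add(Customer customer)
    {
        _db.Insert(customer);
    }
}
```

Consumer layers then depend only on ICustomerRepository, which is also what makes the unit testing mentioned above practical.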
I want to test my ASP.NET Web API REST service, which is hosted on an HP-580dl, and measure the performance and response time when 10,000 simultaneous requests hit the service.
Is there any way to do that in C#?
Siege is a good tool for measuring load under concurrent requests: http://www.joedog.org/siege-home/
It's not written in C#, but there's no reason why it should be.
The Visual Studio load testing tools provide this functionality, and can control multiple agents in cases where you want a distributed profile and/or a greater concurrency level than a single client machine can support.
Create and run a load test
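If you just want a rough concurrency smoke test in plain C# (not a substitute for Siege or the Visual Studio load testing tools), a minimal sketch with HttpClient and Task.WhenAll could look like this; the URL and request count are placeholders:

```csharp
using System;
using System.Diagnostics;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;

class LoadSmokeTest
{
    static async Task Main()
    {
        const string url = "http://localhost:5000/api/values"; // placeholder endpoint
        const int requestCount = 10000;

        using (var client = new HttpClient())
        {
            var stopwatch = Stopwatch.StartNew();

            // Fire all requests and wait for them all to complete.
            var tasks = Enumerable.Range(0, requestCount)
                                  .Select(_ => client.GetAsync(url))
                                  .ToArray();
            var responses = await Task.WhenAll(tasks);

            stopwatch.Stop();

            int succeeded = responses.Count(r => r.IsSuccessStatusCode);
            Console.WriteLine(
                $"{succeeded}/{requestCount} succeeded in {stopwatch.Elapsed.TotalSeconds:F1}s");
        }
    }
}
```

Note that a single client machine will rarely achieve 10,000 truly simultaneous connections (thread pool limits and ServicePointManager.DefaultConnectionLimit get in the way), which is exactly why the distributed agents mentioned above exist.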
Since the term "Big Data" is widely used for managing huge amounts of data, I want to try building a small Big Data application to understand its structure and how I can get started with ASP.NET technology.
Is it possible?
"Big data" is a marketing term for "highly scalable large load computing". So can you use ASP.NET for highly scalable large load computing...
Yes, and here is how (Scaling Strategies for ASP.NET Applications).
Adding to Scott's answer: apart from ASP.NET being capable of scaling to high loads with effective strategies, the .NET ecosystem also provides HDInsight in Azure, which implements the MapReduce programming model for querying data across large clusters.
Azure HDInsight is closely related to the marketing buzzwords "Hadoop", "Big Data", etc.
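To get a feel for the MapReduce model before touching a cluster, a toy in-memory word count in C#/LINQ shows the same map, shuffle/group, and reduce stages that cluster jobs use at scale (this is only a conceptual sketch, not the HDInsight SDK):

```csharp
using System;
using System.Linq;

class WordCount
{
    static void Main()
    {
        string[] documents =
        {
            "big data is big",
            "asp net can talk to big data services"
        };

        var counts = documents
            // Map: split each document into (word, 1) pairs.
            .SelectMany(doc => doc.Split(' ').Select(word => new { Word = word, Count = 1 }))
            // Shuffle: group the pairs by key.
            .GroupBy(pair => pair.Word)
            // Reduce: sum the counts for each key.
            .Select(group => new { Word = group.Key, Total = group.Sum(p => p.Count) });

        foreach (var entry in counts.OrderByDescending(e => e.Total))
            Console.WriteLine($"{entry.Word}: {entry.Total}");
    }
}
```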
I am building a framework for a high-traffic site; the site only records logs for other application sites.
I use Redis or files as a middle layer to cache the log entries.
At regular intervals, the program moves the cached entries into the database.
Because there is a large amount of data, I need an ORM that is lightweight and agile.
I use MySQL 5.x, because SQL Server is relatively expensive.
I am more familiar with iBATIS (for Java), but iBATIS.NET has not been updated for two years. Can iBATIS.NET meet these requirements?
If it can, I could save a lot of learning time.
Or do you have any better suggestions?
I am not very familiar with newer C# technologies, so please point me in the right direction.
For anything high performance, I use Dapper. It's a micro-ORM with very high performance, developed and used by the very website you are looking at (stackoverflow.com).
It has a NuGet package too, which you can install with the Install-Package Dapper command in the Package Manager Console.
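For the scenario described above (periodically flushing cached log entries into MySQL), a minimal Dapper sketch might look like this; the table and column names are hypothetical, and MySql.Data is assumed as the ADO.NET provider:

```csharp
using System;
using System.Collections.Generic;
using Dapper;
using MySql.Data.MySqlClient;

public class LogEntry
{
    public string Source { get; set; }
    public string Message { get; set; }
    public DateTime CreatedAt { get; set; }
}

public static class LogWriter
{
    // Dapper runs the INSERT once per item when given a collection of parameter objects.
    // Table and column names ("app_log", etc.) are placeholders.
    public static void Flush(string connectionString, IEnumerable<LogEntry> entries)
    {
        using (var conn = new MySqlConnection(connectionString))
        {
            conn.Open();
            conn.Execute(
                "INSERT INTO app_log (source, message, created_at) VALUES (@Source, @Message, @CreatedAt)",
                entries);
        }
    }
}
```

For very large batches, MySQL's bulk-load facilities may be faster, but per-row inserts like this are usually a reasonable starting point.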