Closed. This question is opinion-based. It is not currently accepting answers. Closed 7 years ago.
I'm developing a small financial app for simple operations, and I need to keep a record of any changes made to my records. I use C# ASP.NET for my application code and MS SQL 2014 for my DB.
My question is: which is better?
Application layer: write a method in code that creates a record in a history table in the DB?
Database layer: write a trigger/stored procedure that creates a record in a history table in the DB?
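For example, the trigger version might look roughly like this (table and column names here are made up, just to illustrate the idea):

```sql
-- Hypothetical audit trigger: log every insert/update on Transactions
-- into a history table, entirely inside the database.
CREATE TRIGGER trg_Transactions_Audit
ON dbo.Transactions
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.TransactionsHistory (TransactionId, Amount, ChangedAt, ChangedBy)
    SELECT TransactionId, Amount, SYSUTCDATETIME(), SUSER_SNAME()
    FROM inserted;  -- pseudo-table holding the new row values
END;
```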
My main concern is performance - the app is hosted on an IIS server in my home, so I need to fine-tune this so it stays lag-free for my users and doesn't put a big workload on my server.
This has a lot to do with preferences and maintainability.
For the sake of maintainability, I always go with the Layered approach, myself.
Here's an example why.
Let's say you have your SQL inserts and updates scattered throughout the code. Then, one day, your database changes in such a way that you have to track down each and every one of those inserts and updates and change them one by one.
For the sake of argument, you could also create one function that does all of these inserts and updates and call that specific function every time from your app. That would work, but it could become cluttered, as you'll eventually end up with a lot of those functions for different tables, views, etc.
Now let's say you have used the layered approach. You would simply find the class that does all updates and inserts to one specific table and make your changes there. The rest of the application may not even need to be aware of the change.
Performance is (in my opinion) not really a factor here. Creating an object isn't expensive at all; the overhead is hardly measurable. So I'd say go with the layered approach.
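As a sketch of what the layered approach might look like (all class, table, and column names here are assumptions, not your actual schema):

```csharp
using System;
using System.Data.SqlClient;

// Hypothetical repository that owns all writes to the Accounts table.
// Because history logging lives in one place, a schema change only
// touches this class.
public class AccountRepository
{
    private readonly string _connectionString;

    public AccountRepository(string connectionString)
    {
        _connectionString = connectionString;
    }

    public void UpdateBalance(int accountId, decimal newBalance, string user)
    {
        using (var conn = new SqlConnection(_connectionString))
        {
            conn.Open();
            using (var tx = conn.BeginTransaction())
            {
                using (var cmd = new SqlCommand(
                    "UPDATE Accounts SET Balance = @balance WHERE Id = @id", conn, tx))
                {
                    cmd.Parameters.AddWithValue("@balance", newBalance);
                    cmd.Parameters.AddWithValue("@id", accountId);
                    cmd.ExecuteNonQuery();
                }

                using (var cmd = new SqlCommand(
                    "INSERT INTO AccountsHistory (AccountId, NewBalance, ChangedBy, ChangedAt) " +
                    "VALUES (@id, @balance, @user, SYSUTCDATETIME())", conn, tx))
                {
                    cmd.Parameters.AddWithValue("@id", accountId);
                    cmd.Parameters.AddWithValue("@balance", newBalance);
                    cmd.Parameters.AddWithValue("@user", user);
                    cmd.ExecuteNonQuery();
                }

                tx.Commit();
            }
        }
    }
}
```

One nice property of doing it this way: the update and its history row are in the same transaction, so you never get one without the other.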
Closed. This question is opinion-based. It is not currently accepting answers. Closed 6 years ago.
I have a web page where I need to get fields from many SQL tables that can all be joined. Is it better in terms of performance to run a few separate queries against the database, or one big SQL statement? I'm using MVC with Web API calls.
Thanks.
In the past I've created DB Views to define the data I'm looking for. Depending on the data access framework you are using this can also be helpful in returning the data and translating it to objects.
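For example (table and column names invented for illustration), a view can pre-bake the join so the page only ever queries one object:

```sql
-- Hypothetical view joining orders to customers so the web page
-- issues a single SELECT instead of repeating the JOIN everywhere.
CREATE VIEW dbo.vOrderSummary AS
SELECT o.OrderId,
       o.OrderDate,
       o.Total,
       c.CustomerId,
       c.Name AS CustomerName
FROM dbo.Orders o
JOIN dbo.Customers c ON c.CustomerId = o.CustomerId;
```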
As far as performance goes, I'm most familiar with SQL Server, and in Management Studio there is an option to "Include Actual Execution Plan". This will show you a full breakdown of your JOIN statements and which indexes are being used, if any, and will suggest indexes to improve performance. I recommend this tool to all developers on my teams when they are stuck on a slow-performing page.
One other thing to note, the database configuration also makes a difference. If you are running a local database you will have fewer concerns than if you were running a cloud based database (Azure SQL, etc) as those have management overhead beyond your control when it comes to availability and physical location of your instance at any given time.
Closed. This question needs to be more focused. It is not currently accepting answers. Closed 6 years ago.
Problem:
I have an instance of an application on one computer (C#, VS 2015), connected to a database (SQL).
I have other instances of the same application on other computers.
All computers are connected, and all instances are working with the same database.
When the software writes into the database, the UI will change accordingly.
Question:
How do I refresh the UI of EVERY instance when any machine changes the database?
Example:
Clicking buttonA:
1. Create record in database.
2. Change buttonA background color to green.
3. Tell other computers to refresh their instances to show their buttonA as green too.
Hope this wasn't a stupid question, I appreciate all help!
This is not a stupid question, but rather a very common requirement. The best solution (in my opinion) would be to create a WCF service all clients talk to.
Only the service performs operations on the database and is designed as a two-way service. That way, once a client is connected, the service can instruct it to refresh itself if another client changed something. No client would change the database itself.
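A duplex contract for that could be sketched like this (names are placeholders, and this is only the contract, not the hosting or subscription code):

```csharp
using System.ServiceModel;

// Hypothetical two-way (duplex) WCF contract: clients call the service,
// and the service can call back into every connected client.
[ServiceContract(CallbackContract = typeof(IClientCallback))]
public interface IRecordService
{
    // Clients never touch the database directly; they go through this.
    [OperationContract]
    void CreateRecord(string data);
}

public interface IClientCallback
{
    // The service invokes this on each registered client after a change,
    // so every instance refreshes its UI (e.g. turns buttonA green).
    [OperationContract(IsOneWay = true)]
    void RefreshUi();
}
```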
If that isn't possible, I'm afraid polling the database for changes is the only common option that works for any RDBMS. There may be other options for the RDBMS you're using.
How fast do you need to refresh? If you can tolerate a delay of a few seconds, then I would suggest a polling approach: just query the database every x seconds for all records updated after the previous poll.
The returned records are those to be refreshed in the UI. Indexing that datetime field would help to minimize the impact of frequent/multiple queries.
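A minimal polling loop along those lines might look like this (the table, the `UpdatedAt` column, and the `RefreshRow` helper are all assumptions standing in for your actual schema and UI code):

```csharp
using System;
using System.Data.SqlClient;

public class ChangePoller
{
    private DateTime _lastPoll = DateTime.UtcNow;
    private readonly string _connectionString;

    public ChangePoller(string connectionString)
    {
        _connectionString = connectionString;
    }

    // Call this from a timer every x seconds.
    public void PollForChanges()
    {
        using (var conn = new SqlConnection(_connectionString))
        {
            conn.Open();
            using (var cmd = new SqlCommand(
                "SELECT Id, Status FROM Records WHERE UpdatedAt > @since", conn))
            {
                cmd.Parameters.AddWithValue("@since", _lastPoll);
                _lastPoll = DateTime.UtcNow;

                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        // Refresh only the UI elements tied to changed rows.
                        RefreshRow(reader.GetInt32(0), reader.GetString(1));
                    }
                }
            }
        }
    }

    private void RefreshRow(int id, string status)
    {
        // Placeholder: update the relevant control for this record.
    }
}
```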
If you have a web based UI then something like SignalR would do this for you. I've used this before and it works great.
Closed. This question needs to be more focused. It is not currently accepting answers. Closed 7 years ago.
We have a .NET application using ADO.NET and Entity Framework and a lot of legacy stored procedures. Occasionally an operation will bring the database server's CPU to 100% for seconds or even minutes. During this time no other operations can be executed against the database. There is some culprit code which is far too complex and business-critical to refactor in the short term, but this can also happen with newer code, depending on the situation.
I would like to prevent any one SQL operation from taking 100% of the CPU. Is there any way to configure MS SQL to give no more than, say, 20% of the CPU to any one query?
I know that ideally we would rewrite the code to not be as intensive, but that is not feasible in the short term, so I'm looking for a general setting which ensures this can never happen.
Take a look at the Resource Governor (assuming you're using SQL Server 2008 or later). Though it won't necessarily work on a specific query, a reasonable classifier function will let you narrow it down pretty closely if you like. If you search for "SQL Server classifier function" you'll find some decent guidance.
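A minimal Resource Governor setup along those lines might look like this (the pool/group names and the login the classifier checks are placeholders; run it in master):

```sql
-- Cap a class of sessions at roughly 20% CPU.
-- Note: MAX_CPU_PERCENT is only enforced under CPU contention;
-- CAP_CPU_PERCENT (SQL Server 2012+) is a hard cap.
CREATE RESOURCE POOL LimitedPool WITH (MAX_CPU_PERCENT = 20);
CREATE WORKLOAD GROUP LimitedGroup USING LimitedPool;
GO

-- Classifier functions live in master and must be schema-bound.
CREATE FUNCTION dbo.rgClassifier() RETURNS sysname
WITH SCHEMABINDING
AS
BEGIN
    -- Route the heavy application's login into the limited group.
    IF SUSER_SNAME() = N'LegacyAppLogin'
        RETURN N'LimitedGroup';
    RETURN N'default';
END;
GO

ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION = dbo.rgClassifier);
ALTER RESOURCE GOVERNOR RECONFIGURE;
```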
Closed. This question needs to be more focused. It is not currently accepting answers. Closed 8 years ago.
I'm working on a project with a PostgreSQL database containing various tables. I have a C# application that needs to be able to enter new data, change existing data, and retrieve data from the database.
My question is: is there some tool/framework I should be using to make this interaction easier? Currently, we have written a bunch of database helper functions to make common operations simple in C#; we've written classes that define what a Column and a Table are, defined data model classes for all the tables in the database, and defined table classes corresponding to each data model class.
Basically, we've written a lot of code just to be able to interface with the database, and I feel like there has to be a simpler way. But maybe not! That's why I'm asking.
I saw some people using Entity Framework, but I'm not familiar with it and don't know if it does what I'm looking for. Also, it seemed geared towards Microsoft SQL, and the free PostgreSQL providers for it were a bit hacky.
The short answer is: you are probably already doing it the best way with current technologies.
I have tried several ORM solutions and all of them failed to live up to expectations on one level or another. Entity Framework is starting to get mature, but as you already found, the PostgreSQL adapters are essentially hacks since EF is baked for Microsoft SQL at a low level. One of the better ones I have used is Telerik's ORM, but it costs $$$$ and is far from perfect (http://www.telerik.com/data-access).
One to look at is this: https://www.nuget.org/packages/Shaolinq.Postgres.DotConnect/ Note, though, that it is an early version; I have not used it myself, so use at your own risk. :)
There are some code-first and Node.js-specific tools out there that are starting to get interesting, but in the end, having a decent reusable data access library and object classes is generally still the best solution for general use.
Closed. This question is opinion-based. It is not currently accepting answers. Closed 9 years ago.
I am writing a small C# database application that will store sports statistics. There will be about 7 tables in the local .sdf file that will store the data: one table for the player details, another for the game info, and another for the actual stats from each game, using the player ID and game ID as foreign keys. I personally see this as a small database, as it will only grow by about 30 entries per week. The end goal is to be able to pull reports out of the collected stats.
I am a little confused as to which way to access the data in the database. Datasets look OK, but when I want queries that access multiple tables or use a WHERE clause, things get a little troublesome. I was thinking of just accessing the database directly, without the need for datasets.
Opinions on the best options are appreciated.
Thanks
Datasets are a relatively old technology which is steadily being replaced by Entity Framework. For any new development looking for a standard data access technology, Entity Framework should be your primary solution. The model-based option feels a lot like DataSets in the designer (you can design your model by dragging tables and relationships onto the surface), but Entity Framework can also work directly against your code (EF Code-First), which many people find better, since you have total control over what your code looks like (plus it won't get overwritten each time you save the data model).
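A Code-First sketch for the three tables you describe might look like this (property names are guesses based on your description):

```csharp
using System;
using System.Data.Entity;

// Hypothetical EF Code-First model for players, games, and per-game stats.
public class Player
{
    public int PlayerId { get; set; }
    public string Name { get; set; }
}

public class Game
{
    public int GameId { get; set; }
    public DateTime PlayedOn { get; set; }
}

public class Stat
{
    public int StatId { get; set; }
    public int PlayerId { get; set; }   // foreign key to Player
    public int GameId { get; set; }     // foreign key to Game
    public int Points { get; set; }

    public virtual Player Player { get; set; }
    public virtual Game Game { get; set; }
}

public class StatsContext : DbContext
{
    public DbSet<Player> Players { get; set; }
    public DbSet<Game> Games { get; set; }
    public DbSet<Stat> Stats { get; set; }
}
```

The multi-table WHERE queries you found troublesome with datasets then become plain LINQ, e.g. `context.Stats.Where(s => s.GameId == gameId).ToList()`.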
That is, unless you are open to third-party libraries, in which case there are a couple of great open-source alternatives, including NHibernate.