If the EDM for the 3 tables is as shown in the figure given below, then:
Is it a good approach to use such foreign keys in the EDM and in the DB?
Can such a DB design slow down execution?
Which architecture is preferable: with foreign keys or without?
If someone wants data from Table_3 given an ID from Table_1, it can be done in 2 ways.
I.
db.TAble_1.FirstOrDefault(m => m.TAble_1ID == 2).Table_2.FirstOrDefault().TAble_3;
II.
By a join in LINQ
So which one is faster?
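For reference, the join form of this first lookup might look roughly like the sketch below (the foreign-key property names TAble_1ID and TAble_3ID on Table_2 are assumed from the figure and may differ in your model). With lazy loading enabled, the navigation-property version above can issue several separate queries, while the join version is translated into a single SELECT.
// Sketch only; entity and key names are taken from the question and may differ.
var table3Row = (from t1 in db.TAble_1
                 join t2 in db.Table_2 on t1.TAble_1ID equals t2.TAble_1ID
                 join t3 in db.TAble_3 on t2.TAble_3ID equals t3.TAble_3ID
                 where t1.TAble_1ID == 2
                 select t3).FirstOrDefault();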
If someone wants data from Table_1 given an ID from Table_3, it can be done in 2 ways.
I.
db.TAble_1.FirstOrDefault(m => m.TAble_3ID == 2).Table_2.TAble_3;
II.
By a join in LINQ
So which one is faster?
Thank you all, in advance.
No. Basically, what is EDM? Let's take the example of LINQ to SQL: it is just a converter of your object queries to SQL, and it then fires them the same way as an inline query, using a SqlConnection object.
This question is two-fold:
Is the database design proper? Because this is actually going to make a difference in performance. The design forms the core of the queries, and refining it can take care of most of your performance issues.
EDMX, LINQ to SQL, Entity Framework or some other ORM (Object Relational Mapper): why would one normally use an ORM? Because the queries are small, the developers are not very proficient in SQL Server, or they want the learning curve for the developers to be comparatively gentle and to be able to write the simple, small queries easily and pretty rapidly (having a stored procedure for a simple SELECT doesn't make sense). So it solves those problems by converting your object queries (LINQ) to SQL statements.
Answer for your case:
It depends on which queries you use it for. For the queries you have stated, it won't make a difference whether they use joins (assuming your design is proper); EDM, or anything stated above, is fine for such queries.
But don't use it for very big queries, and don't replace stored procedures with it. What happens is that it converts your object queries to SQL statements and sends them to the DB server, where the SQL Server engine parses the queries and then executes them. So if the queries are too big, you pay for the network latency and the compilation of the queries (and since only the query text is being compiled, the execution plan isn't already in place), whereas with stored procedures the plan is already there and the queries run faster.
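For example, when a query really is too big to generate comfortably, you can keep it in a stored procedure and still let the context materialize the results. A sketch only, assuming an ObjectContext-based EDM; dbo.GetBigReport and ReportRow are hypothetical names:
// Hypothetical procedure and result type; EF fills @p0/@p1 from the values passed in.
var rows = db.ExecuteStoreQuery<ReportRow>(
    "EXEC dbo.GetBigReport @p0, @p1", startDate, endDate).ToList();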
Useful links:
Execution Plan
When I am developing an ASP.NET website I really like to use Entity Framework with either database-first or code-first models (+ ASP.NET MVC controller scaffolding).
For an application requiring to access an existing database, I naturally thought to create a database model and to use asp.net mvc scaffolding to get all the basic CRUD operations done in a few minutes with nearly no development costs.
But I discussed with a friend who told me that accessing data stored in the database only through stored procedures is the best approach to take.
My question is thus, what do you think of this sentence? Is it better to create stored procedures for any required operations on a table in the database (e.g. create and read on this table, update and delete only on another one, ...)? And what are the advantages/disadvantages of doing so instead of using a database-first model created from the tables in the database?
What I thought at first is that it doubles the cost of development to do everything through stored procedures, as you have to write these stored procedures where Entity Framework could have provided a DbContext in a few clicks, allowing me to use LINQ over entities, etc. But then I've read a few things about Ownership Chains that might improve security by granting only permissions to execute stored procedures and no permissions for any operations (select, insert, update, delete) on the tables.
Thank you for your answers.
It's a cost-benefit analysis. Being a DB-focused guy, I would agree with that statement. It is best. It also makes your code easier to read (no crazy SQL statements uglifying it), gives increased performance with cached execution plans, offers ease of modifying the queries without recompiling the code, etc.
Many of the people I work with are not all that familiar with writing stored procedures, so it tends to be a constant fight to get them to use them. Personally, I don't see any reason to ever bury SQL statements in your code. They tend to shy away from stored procedures because it is more work for them up front.
Yes, it's a good approach.
Whether it's the best approach or not, that depends on a lot of factors, some of which you don't even know yet.
One important factor is how much further development there will be, and how much maintenance. If the initial development is a big part of the total job, then you should use a method that gets you there as quickly and easily as possible.
If you will be working with and maintaining the system for a long time, you should focus less on the initial development time and more on how easy it is to make changes to the system once it's up and running. Using stored procedures is one way to make the code less dependent on the exact data layout, and allows you to make changes without a lot of down time.
Note that it's not necessarily a choice between stored procedures and Entity Framework; you can also use stored procedures with Entity Framework.
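For instance, with a DbContext you can route a read through a procedure and still get typed entities back. A sketch; dbo.GetCustomerOrders, the Order entity and the parameter are assumed names:
// Requires using System.Data.SqlClient;
// The procedure, entity and parameter names here are placeholders.
var orders = db.Database.SqlQuery<Order>(
    "EXEC dbo.GetCustomerOrders @CustomerId",
    new SqlParameter("@CustomerId", customerId)).ToList();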
This is primarily an opinion-based question and the answer may depend on the situation. Using stored procedures is definitely one of the best ways to query the database, but since its emergence Entity Framework has also become widely used. The advantage of Entity Framework is that it provides a higher level of abstraction.
Entity Framework applications provide the following benefits:
Applications can work in terms of a more application-centric conceptual model, including types with inheritance, complex members, and relationships.
Applications are freed from hard-coded dependencies on a particular data engine or storage schema.
Mappings between the conceptual model and the storage-specific schema can change without changing the application code.
Developers can work with a consistent application object model that can be mapped to various storage schemas, possibly implemented in different database management systems.
Multiple conceptual models can be mapped to a single storage schema.
Language-integrated query (LINQ) support provides compile-time syntax validation for queries against a conceptual model.
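As a small illustration of the last point, a query like the sketch below is checked by the compiler, so renaming a property produces a build error instead of a runtime SQL error (Customer, City and Name are assumed entity/property names):
// Compile-time checked: if Customer.City is renamed, this no longer builds.
var londonCustomers = db.Customers
    .Where(c => c.City == "London")
    .OrderBy(c => c.Name)
    .ToList();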
You may also check this related question Best practice to query data from MS SQL Server in C Sharp?
Following are some stored procedure advantages:
Encapsulate multiple statements as single transactions using stored procedures (a calling-side sketch follows this list)
Implement business logic using temp tables
Better error handling by having tables for capturing/logging errors
Parameter validations / domain validations can be done at database level
Control the query plan by forcing the choice of an index
Use sp_getapplock to enforce single execution of procedure at any time
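As a calling-side sketch of the first point, the application makes a single round trip and the procedure wraps its statements in one transaction on the server. dbo.TransferOrder, its parameters and connectionString are hypothetical:
// Requires using System.Data; and using System.Data.SqlClient;
// One round trip; the transaction lives inside the (hypothetical) procedure.
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("dbo.TransferOrder", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.AddWithValue("@OrderId", orderId);
    cmd.Parameters.AddWithValue("@NewCustomerId", newCustomerId);
    conn.Open();
    cmd.ExecuteNonQuery();
}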
In addition, Entity Framework adds an overhead for each request you make, as it uses reflection for each query. So by implementing stored procedures you will gain time, as they are compiled rather than interpreted each time like a normal Entity Framework query.
The link below gives some reasons why you should use Entity Framework:
http://kamelbrahim.blogspot.com/2013/10/why-you-should-use-entity-framework.html
Hope this can enlighten you a bit
So I'm gonna give you a suggestion, and it will be something I've done, but not many would say "I do that".
So, yes, I used stored procedures when using ADO.NET.
I also (at times) use ORM's, like NHibernate and EntityFramework.
When I use ADO.NET, I use stored procedures.
When you get data from the database, you have to turn it into something on the DotNet side.
The quickest thing is to put data into a DataTable or DataSet.
I no longer favor this method. While it may make for RAPID development ("just stuff the data into a datatable"), it does not work well for MAINTENANCE, even if that maintenance is only 2-3 months down the road.
So what do I put the data into?
I create DTO/POCO objects and hydrate the data from the database into these objects.
For example.
The NorthWind database has
Customer(s)
Order(s)
and OrderDetail(s)
So I create C# classes called Customer.cs, Order.cs and OrderDetail.cs.
These ONLY contain properties of the entity. Most of the time, the properties simply reflect the columns in the database for that entity. (Order.cs has properties that mirror a Select * from dbo.Order where OrderID = 123, for example.)
Then I create a child-collection object
public class OrderCollection : List<Order>{}
and then the parent object gets a property.
public class Customer
{
/* a bunch of scalar properties */
public OrderCollection Orders {get;set;}
}
So now you have a stored procedure. And it gets data.
When that data comes back, one way to get it is with an IDataReader. (.ExecuteReader).
When this IDataReader comes back, I loop over it, and populate the Customer(.cs), the Orders, and the OrderDetails.
This is basic, poor man's ORM (object-relational mapping).
Back to how I code my stored procedures: I write a procedure that returns 3 result sets (one DB hit) with the info about the Customer, the Order(s) (if any) and the OrderDetail(s) (if any exist).
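A minimal sketch of that hydration loop, assuming the procedure returns the three result sets in that order. The cmd variable (a SqlCommand set up for the procedure), the column handling and a Details collection on Order are assumptions for illustration:
// Requires using System; and using System.Data;
using (IDataReader reader = cmd.ExecuteReader())
{
    var customer = new Customer { Orders = new OrderCollection() };

    // Result set 1: the customer row
    if (reader.Read())
    {
        customer.CustomerID = (string)reader["CustomerID"];
        customer.CompanyName = (string)reader["CompanyName"];
        // ...remaining Customer columns...
    }

    // Result set 2: the orders
    reader.NextResult();
    while (reader.Read())
    {
        customer.Orders.Add(new Order
        {
            OrderID = (int)reader["OrderID"]
            // ...remaining Order columns (handle DBNull for nullable ones)...
        });
    }

    // Result set 3: the order details, matched to their order by OrderID
    reader.NextResult();
    while (reader.Read())
    {
        int orderId = (int)reader["OrderID"];
        Order order = customer.Orders.Find(o => o.OrderID == orderId);
        if (order != null)
        {
            // Details is a hypothetical OrderDetail collection property on Order.
            order.Details.Add(new OrderDetail
            {
                ProductID = (int)reader["ProductID"],
                Quantity = (short)reader["Quantity"]
            });
        }
    }
}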
Note that I do NOT do a lot of JOINING.
When you do a "Select * from dbo.Customer c join dbo.Orders o on c.CustomerID = o.CustomerID", you'll note that you get redundant data in the first columns. This is what I do not like.
I prefer multiple resultsets OVER joining and bringing back a single resultset with redundant data.
Now for the little special trick.
Whenever I select from a table, I always select all columns on that table.
So whenever I write a stored procedure that needs customer data, I do a
Select A,B,C,D,E,F,G from dbo.Customer where (......)
Now, a lot of people will argue with that: "Why do you bring back more info than you need?"
Well, real ORMs do this anyway, so I am reflecting that in poor man's fashion.
And my code for taking the result set(s) from the stored procedure and turning them into instances of objects stays consistent.
Because if you write 3 stored procedures, and each one selects data from the Customer table, BUT you select different columns and/or in a different order, your "object mapper" code needs a method for each stored procedure.
This method of ADO.NET has served me well.
And once my team swapped out ADO.NET for a real ORM, that transition was very painless because of the way we did the ADO.NET from the get-go.
Quick rules of thumb:
1. If using ADO.NET, use stored procedures.
2. Get multiple result-sets, instead of redundant data via joins.
3. Make your columns consistent from any table you select from.
4. Take the results of your stored procedure call and write a "hydrater" to take that info and put it into your domain model (the .cs classes) as soon as you can.
That has served me well for many years.
Good luck.
In my opinion:
Stored Procedures are written in big iron database "languages" like PL/SQL or T-SQL
Stored Procedures typically cannot be debugged in the same IDE you write your UI in.
Stored Procedures don't provide much feedback when things go wrong.
Stored Procedures can't pass objects.
Stored Procedures hide business logic.
Source:
http://www.codinghorror.com/blog/2004/10/who-needs-stored-procedures-anyways.html
I normally use Entity Framework for my SQL queries which is great as it allows me to dynamically construct queries in a strongly-typed and maintainable fashion.
However, I'm on a project at the moment which utilises spatial queries. Most of my queries will output a result set ordered by time or distance from a given co-ordinate. However, I have found that ordering by STDistance slows the query down 10-fold (actually, it slows down if I join on another table in addition to the "order by").
I have managed to optimize the query myself and got the performance back to where it should be, however, this query cannot be produced by Entity Framework.
So I could end up having a set of "order by time" queries generated by EF and then another set of "order by distance" queries as stored procedures in SQL Server. The trouble is that, basically, this latter set of queries would have to be created via string concatenation (either in an SP or C#).
I have been trying to get away from SQL string concatenation for years now, and these ORM frameworks are great for 99% of queries; however, I always find myself having to go back to string concatenation to get the most optimal queries sent to the server. This is a maintenance nightmare though.
String concatenation was solved in ASP.NET with templating engines, which essentially allow you to build html strings. Does anyone know such a solution for SQL strings? Although it's a bit messy in some respects it would allow for the most optimal queries. In my view this would be better than
String concat in a stored proc
String concat in C#
Masses of duplicate code in stored procs covering all possible input parameters
LINQ queries which can create sub-optimal SQL
I'd love to know your thoughts about this general problem and what you think of my proposed solution.
thanks
Kris
Have you checked out the latest Entity Framework Beta? It is supposed to have support for spatial data types.
Also, if you want to build SQL queries dynamically, check out PetaPoco's SQL Builder. Some examples from the site:
Example 1:
var id=123;
var a=db.Query<article>(PetaPoco.Sql.Builder
.Append("SELECT * FROM articles")
.Append("WHERE article_id=#0", id)
.Append("AND date_created<#0", DateTime.UtcNow)
)
Example 2:
var id=123;
var sql=PetaPoco.Sql.Builder
.Append("SELECT * FROM articles")
.Append("WHERE article_id=#0", id);
if (start_date.HasValue)
sql.Append("AND date_created>=#0", start_date.Value);
if (end_date.HasValue)
sql.Append("AND date_created<=#0", end_date.Value);
var a=db.Query<article>(sql)
Example 3:
var sql=PetaPoco.Sql.Builder()
.Select("*")
.From("articles")
.Where("date_created < #0", DateTime.UtcNow)
.OrderBy("date_created DESC");
Maybe it would be an option to use a T4 template to generate the views (and corresponding queries). You could modify either the template or override the generated output selectively (although this may possibly also require you to modify the template).
Microsoft provides a T4 template for this purpose (assuming you're not using code-first, as I'm not sure the equivalent exists for that scenario).
Incidentally, pre-compiling the views also provides for faster startup as the views don't have to be generated at run-time.
Hard to understand the question.
Any kind of EF/ORM will always be slower than handcrafted SQL, so we're left with manually creating and managing those SQL queries. This can be done in several ways:
manually writing the procs
writing "smart procs" with SQL string concatenation in them
using a templating engine to generate those procs at compile time
writing your own LINQ-to-SQL provider for runtime generation of queries
All of those have upsides and downsides, but if you have good unit test coverage it should protect you from obvious errors, such as someone renaming a field in the database.
I am working on a piece of functionality where a user may select multiple parameters with multiple values in each parameter. I am trying to figure out a way to design this functionality in my application using C# and Entity Framework, with entities being mapped to stored procedures. For security reasons, my application has to access the database via a surrogate database which has only stored procedures. Therefore my entities are mapped to stored procedures for insert, update, and select.
Ultimately, I need to pass the filters chosen by the user to the stored procedure for querying the database. One of the solutions I thought of is to retrieve all the data into my business layer and use LINQ to filter it further. But this is not ideal, due to the amount of data being filtered in memory rather than in the database, which is much better suited to this kind of complex query.
I have seen posts on constructing dynamic queries with LINQ, but in those kinds of posts the entities are mapped to the tables, which makes it easier.
Any help here will be greatly appreciated.
Thank You,
sirkal
EF (and LINQ for that matter) uses deferred execution. You can fairly easily create a dynamic query using IQueryable (Google search time?) by building the filter criteria in an object you construct (you can do it without the object, but think reusable).
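A rough sketch of that pattern, with a hypothetical filter object (an instance named filter) and an assumed Order entity; nothing hits the database until ToList() because of deferred execution:
// Requires using System; using System.Collections.Generic; using System.Linq;
// Hypothetical filter object built from the user's selections.
public class OrderFilter
{
    public List<int> StatusIds { get; set; }
    public DateTime? FromDate { get; set; }
    public DateTime? ToDate { get; set; }
}

// Compose the query conditionally; only the final shape is translated to SQL.
IQueryable<Order> query = db.Orders;

if (filter.StatusIds != null && filter.StatusIds.Count > 0)
    query = query.Where(o => filter.StatusIds.Contains(o.StatusId));
if (filter.FromDate.HasValue)
    query = query.Where(o => o.OrderDate >= filter.FromDate.Value);
if (filter.ToDate.HasValue)
    query = query.Where(o => o.OrderDate <= filter.ToDate.Value);

var results = query.ToList();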
As for the SQL sproc, you can also solve it there by passing in all of the items that can possibly change the filter and having SQL dynamically work on the data to produce a result set.
Which to choose? It really depends on where the core competency is in your group. I prefer C# code mostly due to familiarity (spent years doing sprocs, but dynamic sprocs can be a royal pain).
Now, one thing you do want to be wary of is ending up with dynamic queries that cannot easily be tuned by the server (like statistics in SQL Server, although other RDBMSes use similar concepts). One issue I have seen with LINQ to SQL, for example, is dynamic queries that cause SQL to perform less than optimally, requiring a lot of handholding from the DBA.
I ultimately have two areas with a few questions each about Entity Framework, but let me give a little background so you know what context I am asking for this information in.
At my place of work my team is planning a complete re-write of our application structure so we can adhere to more modern standards. This re-write includes a completely new data layer project. In this project most of the team wants to use Entity Framework. I too would like to use it because I am very familiar with it from my using it in personal projects. However, one team member is opposed to this vehemently, stating that Entity Framework uses reflection and kills performance. His other argument is that EF uses generated SQL that is far less efficient than stored procedures. I'm not so familiar with the inner-workings of EF and my searches haven't turned up anything extremely useful.
Here are my questions. I've tried to make them as specific as possible. If you need some clarification please ask.
Issue 1 Questions - Reflection
Is this true about EF using reflection and hurting performance?
Where does EF use reflection if it does?
Are there any resources that compare performance? Something that I could use to objectively compare technologies for data access in .NET and then present it to my team?
Issue 2 Questions - SQL
What are the implications of this?
Is it possible to use stored procedures to populate EF entities?
Again are there some resources that compare the generated queries with stored procedures, and what the implications of using stored procedures to populate entities (if you can) would be?
I did some searching on my own but didn't come up with too much about EF under the hood.
Yes, it does, like many other ORMs (NHibernate) and useful frameworks (DI tools). For example, WPF cannot work without reflection.
While the performance implications of using reflection have not changed much over the 10 years since .NET 1.0 (although there have been improvements), with faster hardware and the general trend towards readability it is becoming less of a concern now.
Remember that the main performance hit is at the time of reflecting, a.k.a. binding, which is reading the type metadata into xxxInfo objects (such as MethodInfo), and this happens at application startup.
Calling a reflected method is definitely slower, but it is not considered much of an issue.
UPDATE
I have used Reflector to look at the source code of EF and I can confirm it heavily uses Reflection.
Answer for Issue 1:
You can take a look at exactly what is output by EF by examining the Foo.Designer.cs file that is generated. You will see that the resulting container does not use reflection, but does make heavy use of generics.
Here are the places that Entity Framework certainly does use reflection:
The Expression<T> type is used in creating the SQL statements. The extension methods in System.Linq are based around the idea of expression trees, which use the types in System.Reflection to represent function calls, types, etc. (see the sketch below).
When you use a stored procedure like this: db.ExecuteStoreQuery<TEntity>("GetWorkOrderList @p0, @p1", ...), Entity Framework has to populate the entity, and at the very least has to check that the TEntity type provided is tracked.
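To make the first point concrete: a lambda typed as Expression<Func<...>> is kept as a data structure instead of being compiled to IL, and that tree (described via System.Reflection metadata) is what the provider translates to SQL. A minimal sketch, with Customer as an assumed entity type:
// Requires using System; using System.Linq.Expressions;
// Func<Customer, bool> would compile to IL; Expression<Func<Customer, bool>>
// stays as an inspectable tree of nodes.
Expression<Func<Customer, bool>> predicate = c => c.City == "London";

var body = (BinaryExpression)predicate.Body;
Console.WriteLine(body.NodeType);   // Equal
Console.WriteLine(body.Left);       // c.City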
Answer for Issue 2:
It is true that the queries are often strange-looking, but that does not mean they are less efficient. You would be hard pressed to come up with a query whose actual query plan is worse.
On top of that, you certainly can use stored procedures, or even inline SQL, with Entity Framework, for querying and for creating, updating and deleting.
Aside:
Even if it used reflection everywhere, and didn't let you use stored procedures, why would that be a reason not to use it? I think that you need to have your coworker prove it.
I can comment on Issue 2, about generated EF queries being less efficient than stored procedures.
Basically, yes, sometimes the generated queries are a mess and need some tuning. There are many tools to help you correct this: SQL Profiler, LINQPad, and others. The generated queries may look like crap, but they typically run quickly.
Yes, you can map EF entities to procedures. This will allow you to control some of the nasty generated EF queries. You could also map views to your entities, allowing you to control how the views select the data.
I cannot think of resources, but I must say this: comparing EF with SQL stored procedures is apples to oranges. EF provides a robust way of mapping your database directly to your code. This, combined with LINQ to Entities queries, will allow your developers to produce code quickly. EF is an ORM, whereas SQL stored procedures are not.
The Entity Framework likely uses reflection, but I would not expect this to hurt performance. High-end libraries that are based on reflection typically use lightweight code generation to mitigate the cost. They inspect each type just once to generate the code and then use the generated code from that point on. You pay a little when your application starts up, but the cost is negligible from there on out.
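The usual shape of that trick looks something like the sketch below (a generic illustration, not EF's actual internals): reflect over the type once, compile a delegate, and reuse it on every subsequent call.
// Requires using System; using System.Linq.Expressions; using System.Reflection;
// Reflect once per property, then reuse a compiled setter delegate.
static Action<T, object> BuildSetter<T>(string propertyName)
{
    PropertyInfo prop = typeof(T).GetProperty(propertyName);

    ParameterExpression target = Expression.Parameter(typeof(T), "target");
    ParameterExpression value = Expression.Parameter(typeof(object), "value");
    MethodCallExpression setCall = Expression.Call(
        target,
        prop.GetSetMethod(),
        Expression.Convert(value, prop.PropertyType));

    return Expression.Lambda<Action<T, object>>(setCall, target, value).Compile();
}

// Built once (e.g. at startup), then called many times without further reflection:
// var setCity = BuildSetter<Customer>("City");
// setCity(someCustomer, "London");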
As for stored procedures, they are faster than plain old queries, but this benefit is often oversold. The primary advantage is that the database will precompile and store the execution plan for each stored procedure. But the database will also cache the execution plans it creates for plain old SQL queries. So this benefit varies a great deal depending on the number and type of queries your application executes. And yes, you can use stored procedures with the entity framework if you need to.
I don't know if EF uses reflection (I don't believe it does... I can't think of what information would have to be determined at run-time); but even if it did, so what?
Reflection is used all over the place in .NET (e.g. serializers), some of which are called all of the time.
Reflection really isn't all that slow, especially in the context of this discussion. I imagine the overhead of making a database call, running the query, returning it and hydrating the objects dwarfs the performance overhead of reflection.
EDIT
Article by Rick Strahl on reflection performance: .NET Reflection and Performance (old, but still relevant).
Entity Framework's generated SQL queries are fine, even if they are not exactly what your DBA would write by hand.
However, it generates complete rubbish when it comes to queries over base types. If you are planning on using a Table-per-Type inheritance scheme and you anticipate queries on the base types, then I would recommend proceeding with caution.
For a more detailed explanation of this bizarre shortcoming, see my question here; take special note of the accepted answer to that question, as this is arguably a bug.
As for the issue of reflection -- I think your co-worker is grasping at straws. It's highly unlikely that reflection will be the root cause of any performance problems in your app.
EF uses reflection. I didn't check it, but I think it is the core of the mapping when materializing an entity instance from a database record: you state the column's name and the property's name, and when a data reader is executed you must somehow fill a property which you only know by its name.
I believe that all these "performance-like" issues are addressed by caching the information needed for mapping. The performance impact caused by reflection will probably be nothing compared to network communication with the server, executing a complex query, etc.
If we are talking about EF performance, check these two documents: Part 1 and Part 2. They describe some reasons why people sometimes think EF is very slow.
Are stored procedures faster than EF queries? My answer: I don't think so. The main advantage of LINQ is that you can build your query in the application. This is extremely handy when you need anything like list filtering, paging and sorting, changing the number of displayed columns, loading related entities, etc. If you want to implement this in a stored procedure you will either have tens of procedures for different query configurations or you will use dynamic SQL. Dynamic SQL is exactly what EF uses, but in the case of EF the query has compile-time validation, which is not the case for plain SQL. The only performance difference is the amount of data transferred when you send the whole query to the server versus sending only the procedure name and parameters.
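For instance, paging and sorting compose naturally in LINQ instead of needing a separate procedure per query shape. A sketch; the Product entity, its properties and the filter/paging variables are placeholders:
IQueryable<Product> query = db.Products;

if (!string.IsNullOrEmpty(nameFilter))
    query = query.Where(p => p.Name.Contains(nameFilter));

query = sortByPrice
    ? query.OrderBy(p => p.Price)
    : query.OrderBy(p => p.Name);

var page = query
    .Skip(pageIndex * pageSize)   // paging requires an ordering first
    .Take(pageSize)
    .ToList();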
It is true that the queries are sometimes very strange; inheritance in particular sometimes produces bad queries. But this can already be worked around: you can always use a custom stored procedure or SQL query to return entities, complex types or custom types, and EF will materialize the results for you so you don't need to bother with data readers, etc.
If you want to be objective when presenting Entity Framework, you should also mention its cons. Entity Framework doesn't support command batching: if you need to update or insert multiple entities, each SQL command has its own round trip to the database. Because of that you should not use EF for data migrations, etc. Another problem with EF is that it has almost no hooks for extensibility.
I want to convert all of my DB stored procedures to LINQ to SQL expressions. Are there any limitations to this work? Note that there are some complicated queries in my DB.
Several features of SQL Server are not supported by Linq to SQL:
Batch updates (unless you use non-standard extensions);
Table-Valued Parameters;
CLR types, including spatial types and hierarchyid;
DML statements (I'm thinking specifically of table variables and temporary tables);
The OUTPUT INTO clause;
The MERGE statement;
Recursive Common Table Expressions, i.e. hierarchical queries on a nested set;
Optimized paging queries using SET ROWCOUNT (ROW_NUMBER is not the most efficient);
Certain windowing functions like DENSE_RANK and NTILE;
Cursors - although these should obviously be avoided, sometimes you really do need them;
Analytical queries using ROLLUP, CUBE, COMPUTE, etc.
Statistical aggregates such as STDEV, VAR, etc.
PIVOT and UNPIVOT queries;
XML columns and integrated XPath;
...and so on...
With some of these things you could technically write your own extension methods, parse the expression trees and actually generate the correct SQL, but that won't work for all of the above, and even when it is a viable option, it will often simply be easier to write the SQL and invoke the command or stored procedure. There's a reason that the DataContext gives you the ExecuteCommand, ExecuteQuery and ExecuteMethodCall methods.
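For example, dropping down to hand-written SQL through the DataContext is straightforward. A sketch; CustomerSummary and the SQL text are placeholders:
// Hypothetical projection class for the hand-written query below.
class CustomerSummary
{
    public string CustomerID { get; set; }
    public int OrderCount { get; set; }
}

// LINQ to SQL escape hatch: raw SQL, still materialized into objects.
var summaries = db.ExecuteQuery<CustomerSummary>(
    @"SELECT c.CustomerID, COUNT(o.OrderID) AS OrderCount
      FROM dbo.Customers c
      LEFT JOIN dbo.Orders o ON o.CustomerID = c.CustomerID
      GROUP BY c.CustomerID").ToList();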
As I've stated in the past, ORMs such as Linq to SQL are great tools, but they are not silver bullets. I've found that for larger, database-heavy projects, L2S can typically handle about 95% of the tasks, but for that other 5% you need to write UDFs or Stored Procedures, and sometimes even bypass the DataContext altogether (object tracking does not play nice with server triggers).
For smaller/simpler projects it is highly probable that you could do everything in Linq to SQL. Whether or not you should is a different question entirely, and one that I'm not going to try to answer here.
I've found that in almost all cases where I've done a new project with L2S, I've completely removed the need for stored procedures. In fact, many of the cases where I would have been forced to use a stored proc, multivariable filters for instance, I've found that by building the query dynamically in LINQ, I've actually gotten better queries in the vast majority of cases since I don't need to include those parts of the query that get translated to "don't care" in the stored proc. So, from my perspective, yes -- you should be able to translate your stored procs to LINQ.
A better question, though, might be: should you translate your stored procs to LINQ? The answer to that, I think, depends on the state of the project, your relative expertise with C#/VB and LINQ vs SQL, the size of the conversion, etc. On an existing project I'd only make the effort if it improves the maintainability or extensibility of the code base, or if I was making significant changes and the new code would benefit. In the latter case you may choose to incrementally move your code to pure LINQ as you touch it to make changes. You can use stored procs with LINQ, so you may not need to change them to make use of LINQ.
I'm not a fan of this approach. This is a major architectural change, because you are now removing a major interface layer you previously put in place to gain a decoupling advantage.
With stored procedures, you have already chosen the interface your database exposes. You will now need to grant users SELECT privileges on all the underlying tables/views instead of EXECUTE on just the application stored procedures and potentially you will need to restrict column read rights at the column level in the tables/views. Now you will need to re-implement at a lower level every explicit underlying table/view/column rights which your stored procedure was previously implementing with a single implicit EXECUTE right.
Whereas before the services expected from the database could be enumerated by an appropriate inventory of stored procedures, now the potential database operations are limited to the exposed tables/views/columns, vastly increasing the coupling and potential for difficulty in estimating scope changes for database refactorings and feature implementations.
Unless there are specific cases where the stored procedure interface is difficult to create/maintain, I see little benefit of changing a working SP-based architecture en masse. In cases where LINQ generates a better implementation because of application-level data coupling (for instance joining native collections to database), it can be appropriate. Even then, you might want to LINQ to the stored procedure on the database side.
If you chose LINQ from the start, you would obviously have done a certain amount of work up front in determining column/view/table permissions and limiting the scope of application code affecting database implementation details.
What does this mean? Does this mean you want to use L2S to call your stored procedures, or do you want to convert all the T-SQL statements in your stored procs to L2S? If it's the latter, you should not have too many problems doing this. Most T-SQL statements can be represented in LINQ without problem.
I might suggest you investigate a tool like Linqer to help you with your T-SQL conversion. It will convert most any T-SQL statement into LINQ. It has saved me quite a bit of time in converting some of my queries.
There are many constructs in T-SQL which have no parallel in LINQ to SQL, starting with flow control, the ability to return multiple row sets, and recursive queries.
You will need to approach this on a case-by-case basis. Remember that any time the SP does significant filtering work in the database, much of that filtering may end up on the client, requiring far more data to be moved from server to client.
If you already have tested and working stored procedures, why convert them at all? That's just making work for no reason.
If you were starting a new product from scratch and were wondering whether to use stored procedures or not, that would be an entirely different question.