My current project is an E-Commerce API. The database I am reading from is from Microsoft Dynamics GP. The database is currently hosted on everything from SQL Server 2012 to SQL Server 2019. I cannot modify the schema. We have many different projects, all reading and writing to this same database.
I have done everything I can find to increase my API's speed. I have even cheated a few times and created an index or two. With the latest cumulative update (CU13) for SQL Server 2019, I ran SQL Server Profiler and started a script that benchmarks all the heavy queries. I then ran the Tuning Wizard against the trace log. It recommended half a page of new indexes, which I implemented.
Now, since I am using Entity Framework Core and I initially migrated/imported the tables I needed, do I need to re-import my tables to take advantage of the recently created indexes? Or will SQL Server use them automatically, regardless of whether EF knows about them or not?
SQL Server will use them automatically, regardless of whether EF knows about them or not. EF only generates the SQL; which indexes to use is decided by the SQL Server query optimizer at execution time, so no re-scaffolding is needed.
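If you want to double-check, a minimal sketch (assuming EF Core 5 or later, and with hypothetical entity and context names standing in for whatever was scaffolded from the Dynamics GP database) is to grab the SQL that EF Core generates and inspect its plan in SSMS:

```csharp
using System;
using System.Linq;
using Microsoft.EntityFrameworkCore;

// Hypothetical entity/context names.
public class SalesLine
{
    public int Id { get; set; }
    public string ItemNumber { get; set; }
    public DateTime DocumentDate { get; set; }
}

public class GpContext : DbContext
{
    public DbSet<SalesLine> SalesLines => Set<SalesLine>();

    protected override void OnConfiguring(DbContextOptionsBuilder options) =>
        options.UseSqlServer("Server=.;Database=GP;Trusted_Connection=True;");
}

public static class Program
{
    public static void Main()
    {
        using var db = new GpContext();

        var query = db.SalesLines
            .Where(l => l.ItemNumber == "WIDGET-01")
            .OrderByDescending(l => l.DocumentDate);

        // ToQueryString() (EF Core 5+) prints the generated SELECT. Paste it
        // into SSMS with "Include Actual Execution Plan" to confirm the new
        // indexes are picked up; EF never references an index directly.
        Console.WriteLine(query.ToQueryString());
    }
}
```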
I'm developing an application, in part to learn new technologies - in this instance Entity Framework.
I'm using .NET 4.5, Visual Studio 2013. I need some way to store data that doesn't involve running an SQL server on my tired old laptop (not even the lightweight one) and works with Entity Framework. Ideally I would like something that just stores data in a file. I have a few basic requirements:
A simple relational database. Max 10 tables, this is a prototype, so the tables won't get very big
Ability to design the database in some sort of GUI tool
Ability to generate Entity Framework database model from the DB. I don't want to type those in manually.
No proprietary software other than what I'm already using - i.e. I don't own an MS Access license
Something that ACTUALLY WORKS
I've spent the better part of today trying to get VS2013 to connect to an SQLite database that I've created. I never got anywhere. SQLite is simply not listed among the available types of data sources. I've read StackOverflow posts about this issue; I'm done trying to make it work. (This is where the frustration you can sense in this post stems from - this is SO not the point of the exercise, yet I've wasted so much time on it.)
Is there any other file based database technology I could use?
Your requirements are not all compatible. The best choice is SQLite, but you cannot generate a model from the database and must use EF Code First.
Your problem with SQLite can easily be fixed by using the correct provider and configuration: http://www.bricelam.net/2012/10/entity-framework-on-sqlite.html
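For illustration, a minimal Code First sketch (hypothetical entity and file names, assuming the System.Data.SQLite provider is installed and registered as described in the linked article) might look like this:

```csharp
using System.Data.Entity;
using System.Data.SQLite;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class ShopContext : DbContext
{
    static ShopContext()
    {
        // The SQLite provider cannot create the database or schema for you,
        // so disable the initializer and create the tables yourself (or with
        // a small SQL script).
        Database.SetInitializer<ShopContext>(null);
    }

    // Hand EF an existing SQLite connection; "true" means EF disposes it.
    public ShopContext()
        : base(new SQLiteConnection("Data Source=shop.db"), true)
    {
    }

    public DbSet<Customer> Customers { get; set; }
}
```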
I suggest you use SQL Server Compact in combination with my free SQL Server Compact Toolbox VS extension:
A simple relational database. Max 10 tables, this is a prototype, so the tables won't get very big
SQL CE supports all standard relational paradigms, and has a max database size of 4 GB
Ability to design the database in some sort of GUI tool
The extension allows you to do this in Visual Studio
Ability to generate Entity Framework database model from the DB. I don't want to type those in manually.
Yes, fully supported
No proprietary software other than what I'm already using - i.e. I don't own a MS Access license
SQL Compact is free and freely distributable
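For what it's worth, a rough sketch of what a model over such a database might look like (hypothetical entity and file names, assuming the EntityFramework.SqlServerCompact NuGet package is installed so the SQL CE provider is registered for EF 6):

```csharp
using System.Data.Entity;
using System.Data.SqlServerCe;

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class StoreContext : DbContext
{
    static StoreContext()
    {
        // The schema lives in the .sdf file designed in the Toolbox,
        // so EF should never try to create or migrate anything.
        Database.SetInitializer<StoreContext>(null);
    }

    // The whole database is the single Store.sdf file in the app's data folder.
    public StoreContext()
        : base(new SqlCeConnection(@"Data Source=|DataDirectory|\Store.sdf"), true)
    {
    }

    public DbSet<Product> Products { get; set; }
}
```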
I am trying to use Entity Framework 6.1.1 in Visual Studio 2013 with C#. I am using SQL Server 2012 in the back end. I have tested a regular SQL Server database which works fine. I used the Database First design pathway. I have a database that is actually a Linked Server in Microsoft SQL. The back end of the linked server is SQLite. The linked server is interacted with via several views in a regular Microsoft SQL database. I do not have to write or modify data in the linked server in any way.
I cannot get Database First to build a model of this database with the views, regardless of what I try. I have tried to use Code First to manually write an interface with the database, without any success.
Is it possible to get Entity Framework to talk to this database? How can I do so?
Okay, if anyone later finds this and has the same problem: I resolved it by using Code First and manually creating entities whose names match the SQL tables/views and whose properties match the fields. This worked fine; I could even use navigation properties, and everything worked great.
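A minimal sketch of that approach (hypothetical view, column, and connection string names):

```csharp
using System.Data.Entity;
using System.ComponentModel.DataAnnotations;
using System.ComponentModel.DataAnnotations.Schema;

// Maps to one of the SQL views that sit in front of the linked server.
[Table("vw_Orders")]
public class Order
{
    [Key]
    [Column("OrderID")]
    public int OrderId { get; set; }

    [Column("CustomerName")]
    public string CustomerName { get; set; }
}

public class ReportingContext : DbContext
{
    static ReportingContext()
    {
        // Read-only access to an existing database: never create or migrate.
        Database.SetInitializer<ReportingContext>(null);
    }

    // "ReportingDb" is a connection string name expected in app.config.
    public ReportingContext() : base("name=ReportingDb") { }

    public DbSet<Order> Orders { get; set; }
}
```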
I'm rewriting an old VB6 application, which used SQL Server 2005 and an Access MDB, in VS 2013 C# with EF 6.0 against SQL Server 2008 R2 and SQL Server CE 4.0. The app has to function offline in the event of network errors, of which there are many. The offline version needs read-only access. Re-use of code was easy in VB6; I just changed the connections.
I have completed 98% of the code using a Database First EF 6.0 model (just upgraded). Now I need to add support for offline. I thought I could swap out the connections but have read that I cannot. I'm open to changing the model used but would prefer to keep the Database First or Model First approach.
I have an external app that creates and copies all the necessary data from SQL Server to SQL Server CE, so I have no need for SQL Server technology when offline. I don't want to rewrite all my code just to support offline connections.
I'm willing to support multiple EDMX models, one for SQL Server and one for SQL Server CE but before I do I wanted to get some opinions & advice.
In theory, the application will test access to the server at start-up and connect to the offline database if the server is not available (assuming the offline SQL Server CE file is already generated and ready).
I am assuming SSCE means SQL Server CE.
EF does support multiple back ends for the same model. In this scenario you would have one CSDL and two SSDL and two MSL files. The CSDL represents your model, while the SSDL represents your database (and is database specific - e.g. it uses database-specific types). The MSL just tells EF how the CSDL is mapped to the SSDL. However, the designer does not support this scenario, since it works only with EDMX files and not with separate CSDL/MSL/SSDL artifacts. You could potentially have two EDMX files, but then you would have to make sure the CSDL is the same in both cases, and you would have to disable code generation for one of the EDMX files - otherwise you would end up with two sets of entities, which would create a conflict.
Even though this could work, there is a bigger problem: Visual Studio 2013 no longer supports SQL Server CE. This means there is no DDEX provider for SQL Server CE in VS2013, which makes it impossible to create or update EF models that use SQL Server CE, since the EF designer relies on the DDEX provider (this is what happens if you try to use SQL Server CE with the EF Designer in VS2013).
Because of the above, you might actually be better off trying the Code First approach. You would not need to worry about CSDL/MSL/SSDL, since they are built from your code. (If you need to tweak the model in a way that touches the store - e.g. you want to map to a column of a specific store type - you would still need an if statement and to tweak the model depending on the back end you are using, but that should be much more manageable than separate SSDLs.) Code First does not depend on DDEX, so you can use it in VS2013 just by adding the NuGet packages.
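A minimal Code First sketch of that setup (hypothetical entity names and connection strings; it assumes the EntityFramework and EntityFramework.SqlServerCompact NuGet packages so that both providers are available):

```csharp
using System.Data.Common;
using System.Data.Entity;
using System.Data.SqlClient;
using System.Data.SqlServerCe;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class HybridContext : DbContext
{
    static HybridContext()
    {
        // Both databases already exist; EF should never try to create them.
        Database.SetInitializer<HybridContext>(null);
    }

    // EF 6 builds the model from the code, so the same context type works
    // against either provider. Passing "true" lets EF dispose the connection.
    public HybridContext(DbConnection connection) : base(connection, true) { }

    public DbSet<Customer> Customers { get; set; }
}

public static class ContextFactory
{
    public static HybridContext Create(bool serverAvailable)
    {
        DbConnection connection = serverAvailable
            ? (DbConnection)new SqlConnection("Server=PROD;Database=App;Trusted_Connection=True;")
            : new SqlCeConnection(@"Data Source=|DataDirectory|\Offline.sdf");

        return new HybridContext(connection);
    }
}
```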
I am looking for a solution in which several applications on the same machine access one and the same database. Generally the operations are just reads, so I am not interested in providing concurrent write access as well.
I checked into SQL Server Express, SQL Server LocalDb, SQL CE, SQLite, MySQL and am not convinced which one is the best solution. I read that SQL CE allows concurrent read access but SQL Server LocalDb does not, which I find very odd given LocalDb is hyped by MS as a version that is very similar in functionality to the SQL Server family and which is supposed to make it easy to later on scale out.
I'd like to manage 5-10 tables, each of which holds fewer than 5000 rows, so really lightweight content.
I am looking for a solution that meets the following requirements:
Concurrent read access by several applications on the same machine
Should be somewhat lightweight. I intend to move all applications within a solution to a different machine later and do not want to have to install a 200 MB full-blown SQL Server Express version if possible.
Should play well with VS2012 Express (SQLite and MySQL are largely unsupported in that regard, either not supporting EF5 or not showing up in the Server Explorer).
Should be a SQL solution so I can manually update database tables within a management console such as Workbench, Management Studio, or another third-party app.
Should work somewhat with EF or other ORM solution. I want to be able to create an entity class and create a database from that or update tables using class objects. Also I want to populate class object collections from table rows without having to go through SQL code.
I target C# on .NET 4.5, and I guess it boils down to the question of whether SQL CE is up to the task of allowing concurrent reads, and how I can load CE data tables and edit and visualize the content in some sort of management console. Also, does SQL CE play well with EF5? Any better suggestions?
Since you're asking for an opinion, SQLite is my answer.
We are aware of no other embedded SQL database engine that supports as much concurrency as SQLite. SQLite allows multiple processes to have the database file open at once, and for multiple processes to read the database at once. When any process wants to write, it must lock the entire database file for the duration of its update. But that normally only takes a few milliseconds. Other processes just wait on the writer to finish then continue about their business. Other embedded SQL database engines typically only allow a single process to connect to the database at once.
Entity Framework on SQLite
System.Data.SQLite
Setups for 32-bit Windows (.NET Framework 4.5)
This setup package is capable of installing the design-time components for Visual Studio 2012.
SQL CE works with EF5 and VS 2012 Express, is very lightweight, supports multiple readers on the same machine, and can be managed in VS Pro and above combined with the SQL Server Compact Toolbox add-in (or standalone); I am the author.
I have a simple app written using SQL Server, Entity Framework, C# and WCF. When I wanted to share this app with my friends, I realised they didn't have SQL Server on their machines. I could go for SQL Server Express edition, as the usage of my app is personal and non-commercial.
I found MySQL as a popular alternative to SQL Server.
1) Would I be required to update my entities when moving to MySQL?
2) Should I anticipate code changes in my BL layer due to changes in the entities layer? (I am wondering whether the entities were built specifically for SQL Server.)
Are there any databases similar to MS Access that are lightweight compared to MySQL?
Are there any databases that need not be installed but can be copied around like MS Access?
Appreciate your response!
Sounds like you want SQLite.
SQLite is a software library that implements a self-contained, serverless, zero-configuration, transactional SQL database engine.
Very easy to deploy. Also, check out System.Data.SQLite.
According to the System.Data.SQLite page ...
Supports nearly all the entity framework functionality that Sql Server supports, and passes 99% of the tests in MS's EFQuerySamples demo application.
You should be good. :)
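To make the "copy it around like an Access file" point concrete, here is a tiny sketch using plain System.Data.SQLite (hypothetical file and table names); the entire database is the single friends.db file sitting next to the executable:

```csharp
using System;
using System.Data.SQLite;

public static class SqliteDemo
{
    public static void Main()
    {
        // Opening the connection creates friends.db if it doesn't exist yet.
        using (var connection = new SQLiteConnection("Data Source=friends.db"))
        {
            connection.Open();

            using (var create = new SQLiteCommand(
                "CREATE TABLE IF NOT EXISTS Friend (Id INTEGER PRIMARY KEY, Name TEXT)",
                connection))
            {
                create.ExecuteNonQuery();
            }

            using (var count = new SQLiteCommand("SELECT COUNT(*) FROM Friend", connection))
            {
                Console.WriteLine(count.ExecuteScalar());
            }
        }
    }
}
```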
I'm not sure what your BLL looks like, and I have no experience with Entity Framework, but I've found multiple times that LINQ to SQL works much better with SQL Server than with any other database.
So unless you have a good reason not to use SQL Express, I'd advise sticking with SQL Express.
After all, you should always install something when deploying (unless you use XML as storage, which is quite possible with LINQ to XML).
VistaDB Express Edition is also free for non-commercial usage and integrates well with .NET and VS. AFAIK it also works on a single local data file and thus requires no specific installation on your friends' computers.
Otherwise I recommend using PostgreSQL over MySQL, since it is more standards-compliant and has a nicer license.
I think what you're after is just a change of providers. What you need to use MySQL is the .NET Connector, which supports most simple scenarios. It's not very mature yet, so you may have issues with anything very complex, but it should do most of what you want through Entity Framework.
With Entity Framework, yes, you can do updates; it's LINQ to SQL that doesn't work against any other databases (unless you use a third-party provider like DotConnect).
SQLite is one alternative, but multiple threads against it can cause major issues with its operation, so if you need a major data store I'd go with SQL Express or MySQL.
Yes, you could use MySQL with EF, but I don't know if it would require changes... I wouldn't be surprised if it did, though. At the very least your physical DB would have to be ported/converted to MySQL, and that will take time.
If you need to install a DB on your friends' PCs anyway, why not stick with SQL Express, since you already developed against SQL Server on your box? There should be fewer issues with this than with migrating to MySQL.
I'd also vote for VistaDB 3 as it's so easy to deploy.