In a .NET Core project I have the following model:
public class WrapperContext : DbContext
{
    public DbSet<WrappedFunction> Functions { get; set; }

    protected override void OnConfiguring(DbContextOptionsBuilder builder)
    {
        builder.UseSqlite("Data Source=functions.db");
    }
}
public class WrappedFunction
{
    [Key]
    public int FunctionId { get; set; }
    public String FunctionName { get; set; }
    public String AssemblyPath { get; set; }
    public String FactoryName { get; set; }
}
As you can see, it is a SQLite database.
I have added the initial migration and updated the database. Before running my test application I have populated the Functions table with a single record. I can open the DB with SqliteManager to see the record is there.
I then try to access the DB with the following function:
static UFInfo getInfoFor(String command)
{
    UFInfo rv = null;
    using (var ctx = new WrapperContext()) // <---- record disappears here
    {
        foreach (var fn in ctx.Functions)
        {
            if (fn.FunctionName.Equals(command))
            {
                rv = new UFInfo()
                {
                    AssemblyPath = fn.AssemblyPath,
                    FactoryName = fn.FactoryName,
                    CommandName = command
                };
            }
        }
    }
    if (rv != null) return rv;
    throw new InvalidOperationException("No Info for " + command);
}
However, when stepping through the debugger I see that the foreach loop never executes as there are no records in the DB. When I stop before executing the foreach I can verify (using SqliteManager) that my single record has been deleted from the database.
So, it would appear that the DbContext object is somehow deleting the data from the database. Why would that be and what have I done wrong here? I am fairly new to EntityFramework so I may have done something obvious, but it isn't obvious to me.
One additional bit of info here. The project that does the db access is a library project. I have to copy the DB over to the build directory of the application before populating it and running the application. I don't know if that makes a difference or not.
Edit 6/19/17 (18:51):
OK, a bit more information (thanks to @StevePy). NUnit 3 seems to conflict with EF Core (though I don't get any errors on restore), so I couldn't create a unit test for this. Instead, I inserted a using (var ctx...) block above the one in the listing above and inserted a couple of dummy records. Then I exited that using block, and when I entered the one that traverses the records, they were there.
However, I then commented out the dummy insertions and reran my test. And, once again, both records disappeared. So there is something very weird going on here with EF Core.
Edit 6/20/17 (15:30):
Well, I'm still not sure what is going on, but there is an odd interaction between EF Core and Microsoft.Data.Sqlite.
I decided to remove all references to EF Core and simply use SQL queries on the DB. I did this without recreating the DB (so I was using the one already created by EF Core). I noticed the exact same behavior when I created a new SqliteConnection without using EF Core.
In desperation I deleted the DB and recreated it from scratch using Microsoft.Data.Sqlite. This DB didn't have any of the EF Core information in it, obviously. I then populated the DB with some test records and went to use it. No more disappearing records.
Apparently there was something very strange in the way EF Core set up the database in the first place that caused the issue. But I don't have any idea what it was.
What is likely happening is that you are using code-first with EF but taking a DB-first approach when it comes to testing your new code. EF tracks schema changes within the database, and my guess is that because you created the table ahead of time, it does not know that the schema is vetted, so it drops and recreates it on context start-up.
The fix here should be to create a "stub" test that populates a test record one time using your EF model. From there you should have a table that EF recognizes and accepts against that context, and you can then create test records in whatever editor you like for testing purposes.
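For illustration, a minimal sketch of that stub seeding against the model from the question. The record values are placeholders, and the Database.Migrate() call is just one way to let EF build and record the schema itself (it needs using Microsoft.EntityFrameworkCore;):

// One-time "stub" seeding done through the EF model, so the schema EF creates
// (and the migration history it records) is the one this context expects.
using (var ctx = new WrapperContext())
{
    ctx.Database.Migrate();   // apply the existing migration rather than creating the table by hand

    ctx.Functions.Add(new WrappedFunction
    {
        FunctionName = "echo",           // placeholder values
        AssemblyPath = @".\Echo.dll",
        FactoryName = "EchoFactory"
    });
    ctx.SaveChanges();
}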
SQLite has several limitations when it comes to schema migration that you probably will want to consider as well. (see: https://learn.microsoft.com/en-us/ef/core/providers/sqlite/limitations)
You might be better off setting up DB-first, telling EF how to map to an existing SQLite schema, rather than trying to use code-first, since migration support is pretty limited for that DB engine.
Related
So, I have a Xamarin app with a local DB, and I'm using this.Database.Migrate() to apply any pending migrations. It works fine at first, but the problem is that when I uninstall the app and install it again, the app tries to execute the same pending migration and I get the error "Table 'name' already exists". Is there a way to ignore tables that already exist, because I don't want to delete the users' local data every time they uninstall the app?
I'm using the command dotnet ef migrations add initial to create migrations.
To everyone facing this problem: I solved it by creating a method using SQLiteConnection. You need to add using SQLite; (plus System.IO and System.Linq for Path and Any()):
public static bool TableExists(string tableName)
{
    // databaseName is assumed to be a field elsewhere in the class holding the database file name.
    var dbPath = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.Personal), databaseName);
    using (var db = new SQLiteConnection(dbPath))
    {
        // GetTableInfo returns an empty list when the table does not exist.
        var info = db.GetTableInfo(tableName);
        if (info.Any())
        {
            return true;
        }
    }
    return false;
}
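A hypothetical way to wire that helper into start-up, so Migrate() is skipped when the schema survived an uninstall ("Recipes" is just a placeholder table name):

// Only apply migrations when the schema isn't there yet; otherwise a reinstall
// would try to recreate tables that still exist in the local database file.
if (!TableExists("Recipes"))   // placeholder table name
{
    this.Database.Migrate();
}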
I'm working on a small project in WPF where you can insert, edit and search Recipes. I'm using Entity for this.
Recipe has a property of type List<Ingredient>. When I insert an ingredient it's OK, but when I insert a recipe with a List<Ingredient>, everything is OK except that in the Recipe table there is no ingredient column at all. But in the debugger I can clearly see that, in the method where I insert a recipe, the recipe object has a non-zero ingredient count... I have spent a couple of hours trying different things out, but with no luck.
This is how I insert (same as for an ingredient, where everything works):
public static void InsertRecipe(Recipe recipe)
{
    RecipeDbContext ctx = new RecipeDbContext();
    ctx.Recipes.Add(recipe);
    ctx.SaveChanges();
}
Screenshots from debugging (not reproduced here) showed the Recipe table and the Recipe object's properties.
This is just for testing purposes; the ingredients should come from a ListBox later on.
The whole User Instance and AttachDbFileName= approach is flawed - at best! When running your app in Visual Studio, it will be copying around the .mdf file (from your App_Data directory to the output directory - typically .\bin\debug - where your app runs), and most likely your INSERT works just fine - but you're just looking at the wrong .mdf file in the end!
If you want to stick with this approach, then try putting a breakpoint on the myConnection.Close() call - and then inspect the .mdf file with SQL Server Mgmt Studio Express - I'm almost certain your data is there.
The real solution in my opinion would be to
install SQL Server Express (and you've already done that anyway)
install SQL Server Management Studio Express
create your database in SSMS Express, give it a logical name (e.g. RecipeDataBase)
connect to it using its logical database name (given when you create it on the server) - and don't mess around with physical database files and user instances. In that case, your connection string would be something like:
Data Source=.\\SQLEXPRESS;Database=RecipeDataBase;Integrated Security=True
and everything else is exactly the same as before...
Also see Aaron Bertrand's excellent blog post Bad habits to kick: using AttachDbFileName for more background info.
First, you must insert the Recipe item and get the ID:
public static int InsertRecipe(Recipe recipe)
{
    RecipeDbContext ctx = new RecipeDbContext();
    ctx.Recipes.Add(recipe);
    ctx.SaveChanges();
    return recipe.recipeID;
}
And after that you can insert all the ingredients, something like this:
public static void InsertIngredient(List<Ingredient> Ingredients, int recipeID)
{
    IngredientDbContext ctx = new IngredientDbContext();
    foreach (var ingredient in Ingredients)
    {
        Ingredient newItem = new Ingredient { IngredientID = ingredient.Ingredient, ..., recipeID = recipeID };
        ctx.Ingredient.Add(newItem);
    }
    ctx.SaveChanges();
}
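A hypothetical call site for those two helpers (newRecipe and its Ingredients property are assumed names for the objects coming from the UI):

int recipeID = InsertRecipe(newRecipe);              // save the recipe and get its generated key
InsertIngredient(newRecipe.Ingredients, recipeID);   // then save each ingredient pointing back at it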
I'm trying to remove a database from my application using Entity Framework.
The code I use is the following:
using (var dbContext = container.Resolve<ApplicationDbContext>())
{
    dbContext.Database.Delete();
}
According to MSDN this should work, but nothing happens.
The dbContext is registered using ContainerControlledLifetimeManager and should be the same instance used to create the DB.
Adding, updating, and deleting instances of entity types needs dbContext.SaveChanges() to reflect the changes.
However, dbContext.Database.Delete() does not need dbContext.SaveChanges().
If you have a connection open to your database, for example from SQL Server Management Studio, and you try dbContext.Database.Delete(), then you receive Cannot drop database "dbContext" because it is currently in use. If you restart your SQL Server, you drop those connections, and if you then retry dbContext.Database.Delete(), you successfully drop the database.
The last thing is to refresh the database list in SQL Server Management Studio to see that the database is no longer there.
Testing with this code snippet:
using (var dbContext = new ApplicationDbContext())
{
    dbContext.Database.Delete();
}
After @moguzalp and this (the MSDN Database.Delete function), I came up with a solution for my case:
using System.Data.Entity;
Database.Delete("connectionStringOrName");
In my case I was trying to recreate an MSSQLLocalDB database for test purposes. But whenever I used the same DbContext (or an immediately new one, disposing the first and opening another), it looked like the database was still up when I tried to create it.
Bad Example: (x)
public static void RecreateDatabase(string schemaScript)
{
    using (DbContext context = new DbContext("connectionStringName"))
    {
        context.Database.Delete(); // Delete returns true, but...
        Database database = context.Database;
        database.Create(); // Database already exists!
        database.ExecuteSqlCommand(schemaScript);
    }
}
Working example: (/)
public static void RecreateDatabase(string schemaScript)
{
    Database.Delete("connectionStringName"); // Opens and disposes its own connection
    using (DbContext context = new DbContext("connectionStringName")) // New connection
    {
        Database database = context.Database;
        database.Create(); // Works!
        database.ExecuteSqlCommand(schemaScript);
    }
}
Context: I'm using this on an [AssemblyInitialize] function to test my ModelContexts
All of your changes occurred on the local variable dbContext.
The final kick is to add dbContext.SaveChanges(); at the end of your using block, like so:
using (var dbContext = container.Resolve<ApplicationDbContext>())
{
    dbContext.Database.Delete();
    dbContext.SaveChanges();
}
protected override void Seed(Fitlife.Domain.Concrete.EFDBContext context)
{
    List<List<string>> foodweights = GetLines(basePath + "FoodWeights.txt");
    int counter = 0;
    foodweights.ForEach(line =>
    {
        FoodWeights newVal = new FoodWeights()
        {
            FoodCode = int.Parse(line[0]),
            PortionCode = int.Parse(line[1]),
            PortionWeight = decimal.Parse(line[2])
        };
        context.FoodWeights.Add(newVal);
        if (++counter == 1000)
        {
            counter = 0;
            context.SaveChanges();
        }
    });
}
The above method is used to populate my database, but it takes 50 seconds per 1000 entries and I have a file with 470k entries. How can I improve the performance? I am using Entity Framework, and this method is called when I do
PM> update-database
with the Package Manager Console. I need similar functionality; I am very new to ASP.NET and Entity Framework, so any guidance will be appreciated. Thanks.
PS: Is it OK to take 50 seconds for 1000 entries, or am I doing something wrong?
The Seed method runs every time the application starts, so the way you have coded it will attempt to add the FoodWeights over and over again. EF provides AddOrUpdate as a convenient method to prevent that, but it is really not appropriate for bulk inserts.
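For reference, a minimal sketch of that AddOrUpdate pattern (the identifier expression and values are assumptions based on the question's model, and it needs using System.Data.Entity.Migrations;):

// Idempotent seeding: match on a natural key so re-running Seed updates
// existing rows instead of duplicating them. Fine for a few rows, not for 470k.
context.FoodWeights.AddOrUpdate(
    fw => new { fw.FoodCode, fw.PortionCode },
    new FoodWeights { FoodCode = 1, PortionCode = 1, PortionWeight = 10m });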
You could use SQL directly on the database, and if you are using SQL Server that SQL could be BULK INSERT.
I would put the SQL in an Up migration, because you probably only want to run the insert once from a known state, and it avoids having to worry about the efficiency of the context, change tracking, etc.
There is example code and more information here: how to seed data using sql files
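As a rough sketch only, this is what the Up-migration approach could look like; the migration class name, file path, and field terminators are assumptions and would have to match the real FoodWeights.txt layout:

using System.Data.Entity.Migrations;

public partial class SeedFoodWeights : DbMigration
{
    public override void Up()
    {
        // Let SQL Server load the file directly instead of tracking 470k entities in the context.
        Sql(@"BULK INSERT dbo.FoodWeights
              FROM 'C:\Data\FoodWeights.txt'
              WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');");
    }

    public override void Down()
    {
        Sql("DELETE FROM dbo.FoodWeights;");
    }
}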
Okay, this is really weird. I made a simple database with a single table, Customer, which has a single column, Name. From the database I auto-generated an ADO.NET Entity Data Model, and I'm trying to add a new Customer to it like so:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;

namespace Test
{
    class Program
    {
        static void Main()
        {
            Database1Entities db = new Database1Entities();
            Customer c = new Customer();
            c.Name = "Harry";
            db.AddToCustomer(c);
            db.SaveChanges();
        }
    }
}
But it doesn't persist Customer "Harry" to the database! I've been scratching my head for a while now wondering why such a simple operation doesn't work. What on earth could be the problem!?
EF requires that you have a unique index for many operations.
Try adding an identity field (primary key) to your table. Delete and recreate your model and try again.
Edit:
It looks like you are running this from a console app.
Do you have a connection string in the app.config file?
Do you have more than one project in your solution?
Are you getting any exceptions?
Edit2:
Next things to try:
Use SQL Server profiler to see what is being sent to the database
Open the EF model in an editor to see the xml, check if there are any errors
Place db in a using statement to ensure the connection/transaction is closed cleanly before the process exits (see the sketch below).
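A minimal sketch of that last point, reusing the Main() from the question (Database1Entities and AddToCustomer come from the generated model):

static void Main()
{
    // Disposing the context deterministically ensures the connection/transaction
    // is closed and any pending work is flushed before the process exits.
    using (var db = new Database1Entities())
    {
        var c = new Customer { Name = "Harry" };
        db.AddToCustomer(c);
        db.SaveChanges();
    }
}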
OK, here's a longshot. Are you sure your connection string is pointing to the right place? And the connection is functioning?
You can verify it by using SQL Server Management Studio to add some records to your test database, and then do something like
foreach (Customer c in db.Customers)
{
    Console.WriteLine(c.Name);
}
Make sure that the "Copy to Output Directory" property of the database file is not set to "Copy always." If it is, every time you rebuild the application, the database may be clobbered with a pristine copy.
See: Entity Framework not flushing data to the database