.NET Core - Change database migrations depending on environment - C#

So, I'm trying to figure out what the golden path is for running an application against a PostgreSQL database in development and a SQL Server database in production. The difficult part is that the migrations will be different. Currently, my approach is like this:
public void ConfigureServices(IServiceCollection services)
{
    services.AddDbContext<ApplicationDbContext>(SetDbContextOptionsForEnvironment, ServiceLifetime.Transient);
}

private void SetDbContextOptionsForEnvironment(DbContextOptionsBuilder options)
{
    if (_environmentName == "production")
    {
        options.UseSqlServer(Configuration["Data:DefaultConnection:ConnectionString"]);
    }
    else
    {
        options.UseNpgsql(Configuration["Data:DefaultConnection:ConnectionString"]);
    }
}
Is the preferred way to keep the migrations in a separate assembly and specify that assembly in the options? If so, would I then need multiple definitions of the same DbContext in those assemblies as well?
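Concretely, I imagine the separate-assembly approach would look something like this (a sketch only; the "MyApp.Migrations.SqlServer" and "MyApp.Migrations.Npgsql" project names are hypothetical, each holding a set of migrations generated against the same ApplicationDbContext):

private void SetDbContextOptionsForEnvironment(DbContextOptionsBuilder options)
{
    // Same connection string key as above; only the provider and the
    // migrations assembly differ per environment.
    var connectionString = Configuration["Data:DefaultConnection:ConnectionString"];

    if (_environmentName == "production")
    {
        options.UseSqlServer(connectionString,
            sql => sql.MigrationsAssembly("MyApp.Migrations.SqlServer"));
    }
    else
    {
        options.UseNpgsql(connectionString,
            npgsql => npgsql.MigrationsAssembly("MyApp.Migrations.Npgsql"));
    }
}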

After thinking on this for quite a while, I believe this is an anti-pattern. Regardless of my particular use case (Postgres in dev, SQL Server in prod), using different database systems in different environments is generally frowned upon, as there may be unexpected issues after deployment. Best to stick with the same database system for development and production.

Related

Custom scaffolding logic for database first Entity Framework Core

I'm new to Entity Framework Core, as I'm currently migrating over from EF6. We use a database-first model.
I've created a .NET Standard class library where I've imported EF Core, and I have used the scaffolding command in VS to import one of my tables as a test, which worked.
We have a couple of different replica databases that we use for geo-redundancy and reporting purposes, but all have the same model.
I've set up the different connection options, somewhat like point 2 in the answer in this post Using Entity Framework Core migrations for class library project, so we can support connections to the different databases.
Now I want to go about adding the rest of the tables. I have seen mention of Migrations, but that seems to be for code-first models.
I then tried to use the "-Force" switch on the scaffolding command, which did import an additional table, but I lost my multi-database support.
In EF6 I had this logic in the Context.tt file, so when I retrieved updates from the database it would retain the custom connection options I had.
Is there a way to replicate this in EF Core, or is there something I am missing?
Also, for something as simple as a new column on a table, should I still run the same command?
Thanks in advance,
David
* UPDATED *
I ended up using EF Core Power Tools which is a great package and allows complete control over the model, much like the context.tt file used to.
For anyone looking in future: I reverse engineered the model and used the Handlebars templates in EF Core Power Tools. I pass the 3 connection strings I use in the startup of the application, and can then set optional booleans to indicate which connection to use. An enum may be more elegant for anyone starting from scratch, but this made migration easier for us.
In the DbConstructor.hbs, I updated it to:
{{spaces 8}}public {{class}}(bool ReadOnlyDatabase = false, bool ReportsDatabase = false) : base()
{{spaces 8}}{
{{spaces 12}}if (ReadOnlyDatabase)
{{spaces 16}}_connectionString = ReadOnlyContext;
{{spaces 12}}else if (ReportsDatabase)
{{spaces 16}}_connectionString = ReportsContext;
{{spaces 12}}else
{{spaces 16}}_connectionString = ReadWriteContext;
{{spaces 8}}}
The DbContext.hbs file is:
{{> dbimports}}

namespace {{namespace}}
{
    public partial class {{class}} : DbContext
    {
        public static string ReadWriteContext = "";
        public static string ReadOnlyContext = "";
        public static string ReportsContext = "";

        private readonly string _connectionString;

{{{> dbsets}}}
{{#if entity-type-errors}}
{{#each entity-type-errors}}
{{spaces 8}}{{{entity-type-error}}}
{{/each}}
{{/if}}
{{{> dbconstructor}}}

        protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
        {
            optionsBuilder.UseSqlServer(_connectionString);
        }

{{{on-model-creating}}}
    }
}
Supply a DbContextOptions to the constructor:
public MetadataContext(DbContextOptions<MetadataContext> options)
    : base(options)
{
}
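In an ASP.NET Core app those options usually come from the DI registration rather than being built by hand; a minimal sketch (the "Metadata" connection string name is an assumption):

// Startup.ConfigureServices: DI builds DbContextOptions<MetadataContext>
// and passes it to the constructor above.
services.AddDbContext<MetadataContext>(options =>
    options.UseSqlServer(Configuration.GetConnectionString("Metadata")));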
Maybe EF Core Power Tools can help you? (I am the author)
You need to post more of your code so we can see what's going on. At a guess, it sounds like you are customizing the generated context class. That class will be overwritten with each scaffold operation, and as you note, migration support is there to drive db schema updates from model changes; i.e., it's for code first.
The db-first scaffold approach seems to be written with the expectation that the scaffold will be done once, and that thereafter models and contexts will be updated manually.
We have a workflow that involves re-scaffolding the DbContext in response to db schema updates, and consequently we do three things:
- scaffold from a localhost or SSPI db, so the embedded connection string contains no passwords of value;
- tolerate the annoying warning about the generated connection string that the scaffolder emits each time in CI/CD and that can't be disabled;
- use extension methods to add functionality to the DbContext when necessary, so we never need to alter the generated code.
Now, I would not suggest extension methods to initialize the different connection strings. Those I am sure are already coming from config; but if not, and you are building a console app rather than an ASP.NET Core app (ASP.NET Core makes configuration-driven creation of DbContexts really easy), I would suggest learning how to use at least Microsoft.Extensions.Configuration to configure your DbContext instances from an appsettings file.
I can't share a code sample of initializing a DbContext with a DbContextOptionsBuilder directly, because my guilty secret is I always get the DI in ASP.NET Core to do it for me.
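For anyone who does need to build the options by hand in a console app, a minimal sketch of that configuration-driven approach might look like the following (not code from the answer above; the "MetadataContext" name and the "Metadata" connection string key are assumptions):

using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Configuration;

public static class ContextFactory
{
    public static MetadataContext CreateFromConfig()
    {
        // Read appsettings.json via Microsoft.Extensions.Configuration.
        IConfiguration config = new ConfigurationBuilder()
            .AddJsonFile("appsettings.json")
            .Build();

        // Build the options by hand instead of through DI.
        var options = new DbContextOptionsBuilder<MetadataContext>()
            .UseSqlServer(config.GetConnectionString("Metadata"))
            .Options;

        return new MetadataContext(options);
    }
}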

Parallel integration testing with SQL database

Wanted Result: Run integration tests in parallel, with Entity Framework configured to use some kind of database that enforces constraints.
Current Situation: I have a project with some integration tests, and by that I mean tests set up with WebApplicationFactory<Startup>. Currently I use it with EF Core set up to use the in-memory database:
protected override void ConfigureWebHost(IWebHostBuilder builder)
{
    ConfigureWebHostBuilder(builder);

    builder.ConfigureTestServices(services =>
    {
        var serviceProvider = new ServiceCollection()
            .AddEntityFrameworkInMemoryDatabase()
            .BuildServiceProvider();

        services.AddDbContext<AppDbContext>(options =>
        {
            options.UseInMemoryDatabase("testDb");
            options.UseInternalServiceProvider(serviceProvider);
            options.EnableDetailedErrors();
            options.EnableSensitiveDataLogging();
        });
    });
}
While this works, the problem is that the in-memory provider doesn't enforce relational constraints, and I have already run into multiple cases where the integration tests show a feature works, but when reproducing it with the actual project running and EF connected to a SQL Server database, the feature fails because of some violated database constraint.
Ideas:
At first I thought of using the SQLite in-memory database. Didn't work; I think it's some concurrency issue, because at the start of every test case I wipe the database with .EnsureDeleted(), .EnsureCreated(), and .Migrate().
Then SQLite with files with random names. Didn't work, because I couldn't manage to wipe out the internal DbContext cache. I was sure the file was recreated and clean, but the DbContext simply had cached entities.
Finally, after some migrations, I realized that I probably wouldn't be able to use SQLite at all, because it doesn't support a lot of migration operations (removing a FK, altering a column, etc.).
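One possible way out, sketched below (an assumption-laden example, not from the question: it presumes SQL Server LocalDB is available on the test machine), is to keep a real relational engine but give each test-server factory a uniquely named database, so tests can run in parallel while constraints are enforced:

protected override void ConfigureWebHost(IWebHostBuilder builder)
{
    ConfigureWebHostBuilder(builder);

    builder.ConfigureTestServices(services =>
    {
        // A unique database name per factory instance gives per-test isolation.
        var dbName = $"testDb_{Guid.NewGuid():N}";

        services.AddDbContext<AppDbContext>(options =>
        {
            options.UseSqlServer(
                $"Server=(localdb)\\mssqllocaldb;Database={dbName};Trusted_Connection=True");
            options.EnableDetailedErrors();
        });
    });
}

Remember to drop each database afterwards (e.g. context.Database.EnsureDeleted() when the factory is disposed), or LocalDB will accumulate leftover test databases.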

Running Unit Tests on ASP.NET MVC5 with DbContext

I'm currently using unit testing on my projects, but it seems I have trouble getting the data from my site's DbContext. I assumed that I need to run the web project and then run the test, but I don't think Visual Studio permits that.
So I tried opening 2 instances of Visual Studio, but it still did not work.
This is my code:
private ApplicationDbContext db = new ApplicationDbContext();

[TestMethod]
public void IsDbNull()
{
    var dblist = db.StudentInformation.ToList();
    Assert.IsNotNull(dblist);
}
A simple assert to check if the project can read my database. I tried debugging, and dblist doesn't have any items in it.
I saw some code that uses a fake database by loading it from CSV, but I don't want to go that route since I want to test the real data itself.
I also saw "Run Tests" on GitHub, but it only supports VS 2012.
Putting aside discussions about what constitutes a unit test: as has been suggested by @jdphenix in the comments, your problem is likely to be a configuration issue. The main evidence for this comes from your statement:
I tried debuging and dblist doesn't have any items on it.
If your test code had failed to connect to a database, you would get an exception when attempting to read the contents from it rather than an empty list, which is what you are reporting.
In response to your comment:
If I make the configuration in the test project then I will need to replicate the database which is pointless because its a Unit Test. Its purpose is just to test the main project not to have its own configurations
When using Entity Framework, if you don't supply it with configuration information, it will derive a connection string based on the namespace and class name of the context you are using to connect to the database (I believe it will also assume you want to use a SQLEXPRESS instance, although that may be version dependent). If you don't provide configuration information in your unit-test project, this is what Entity Framework will do. If you don't have any configuration in your application either, then the two will match and everything will point at the same database.
In most real-world applications, however, your application will have some kind of connection information in it. This will either be because you want a sensible database name, or because you need to control which machine the database is on, or because you need to use a different provider. If this is the case, then you need to replicate the connection string in your test project. Note: duplicating the connection string does not "replicate the database"; it simply creates another pointer to the same database.
It's also worth noting that if there is a mismatch in the test project, EF will try to create a new database with its defaults if one doesn't already exist.
A simple way to check if it is a configuration issue is to debug your test and real code and compare the values of:
(((System.Data.Entity.DbContext)(db)).Database.Connection).ConnectionString
where db is the instantiated name of your database context (note you'll have to execute a line like db.StudentInformation.ToList() before it is populated).
As you appear to be using the MS test framework, any configuration you need to be available for testing should be placed in an App.config file in the test project (one may already have been added if you used NuGet to reference Entity Framework). Note that this varies with different testing frameworks and may also be different if you're using a non-standard test runner.
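For example, a minimal App.config entry for the test project might look like this (a sketch; the server and database names are placeholders, and the name attribute must match whatever your context actually uses):

<configuration>
  <connectionStrings>
    <!-- The name must match the context class name, or whatever name
         the context passes to its base constructor. -->
    <add name="ApplicationDbContext"
         connectionString="Data Source=.\SQLEXPRESS;Initial Catalog=MySiteDb;Integrated Security=True"
         providerName="System.Data.SqlClient" />
  </connectionStrings>
</configuration>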

Connection string for MySQL missing in other projects

I am currently developing a solution with both an ASP.NET project and a WPF application. I have a common project too, with an ADO.NET entity model, where entities are generated from a database.
If I try to call anything from my database (MySQL) from the WPF, ASP.NET, or test project, I get an InvalidOperationException saying that no connection string named "DataModel" could be found in the application config file.
I then have to add Entity Framework connection strings and other settings to each project in order to be able to fetch data through my common project. It also means that if I want to change the db connection, I have to do it in every single project. Isn't there a smarter way to do this?
Thanks..
Isn't there a smarter way to do this?

You're doing what most people do, at least for small and medium environments: putting the connection string in each project.
Most projects need different connection strings for different environments (DEV, QA, PRODUCTION). I use and highly recommend the free add-in Slow Cheetah. That tool allows you to define XDT transforms that modify the XML config files in your project; in this case, to drop in the correct connection string depending on the build configuration.
When you are ready to create a build for the PRODUCTION environment, you just change the Visual Studio solution configuration to Release, and the generated web.config/app.config contains the PRODUCTION connection string.
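Such a transform might look like this (a sketch using standard XDT syntax; server, database, and credentials are placeholders):

<!-- App.Release.config: swaps in the PRODUCTION connection string -->
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <connectionStrings>
    <add name="DataModel"
         connectionString="server=prod-sql;database=AppDb;uid=app;pwd=secret"
         xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
  </connectionStrings>
</configuration>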
You can pass the connection string that has to be used to the constructor of your DbContext. You have 3 options for passing a connection string:
Pass nothing; EF will look for the default connection string in the config file and throw an error if none is found.
Pass the name of the connection string you want to use; if it's not found in the config file, EF falls back to its default connection factory:
public partial class MyDb : DbContext
{
    public MyDb(string connectionStringName) : base(connectionStringName)
    {
    }
}
Pass a connection string to the constructor. EF won't look in any config files and will just use that one; this is probably what you're looking for:
public partial class MyDb : DbContext
{
    public MyDb(string connectionString) : base(connectionString)
    {
    }

    // Or hardcode it in:
    public MyDb() : base("datasource=...")
    {
    }
}
Edit: It is indeed not good practice to do the above; I'm just saying it's possible.
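Combined with that third option, the startup project can own the single connection string and hand it to the common project's context (a usage sketch; it assumes the generated context exposes the string-accepting constructor shown above):

// Requires a reference to System.Configuration.
var connectionString = ConfigurationManager
    .ConnectionStrings["DataModel"].ConnectionString;

using (var db = new MyDb(connectionString))
{
    // Query through the common project as usual;
    // only the executable project's config holds the string.
}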

Integration testing multiple Entity Framework DbContexts that share a database

In my application I have multiple small Entity Framework DbContexts which share the same database, for example:
public class Context1 : DbContext {
    public Context1()
        : base("DemoDb") {
    }
}

public class Context2 : DbContext {
    public Context2()
        : base("DemoDb") {
    }
}
All database updates are done via scripts and do not rely on migrations (nor will they going forward). The question is - how would you do integration testing against these contexts?
I believe there are three options here (there may be more that I just don't know about).
Option 1 - Super context - a context which contains all models and configurations required for setting up the database:
public class SuperContext : DbContext
{
    public SuperContext()
        : base("DemoDb") {
    }
}
In this option the test database would be set up against the super context, and all subsequent testing would be done through the smaller contexts.
The reason I am not keen on this option is that I will be duplicating all the configurations and entity models that I have already built.
Option 2 - create a custom initialiser for integration tests that will run all the appropriate db initialisation scripts:
public class IntegrationTestInitializer : IDatabaseInitializer<DbContext> {
    public void InitializeDatabase(DbContext context) {
        /* run scripts to set up the database here */
    }
}
This option allows for testing against the true database structure, but will also require updating every time new db scripts are added.
Option 3 - just test the individual contexts:
In this option one would just let EF create the test database based upon each context, and all tests would operate within their own "sandbox".
The reason that I don't like this is that it doesn't feel like you would be testing against a true representation of the database.
I'm currently swaying towards option 2. What do you all think? Is there a better method out there?
I'm using integration testing a lot, because I still think it's the most reliable way of testing when data-dependent processes are involved. I also have a couple of different contexts, and DDL scripts for database upgrades, so our situations are very similar.
What I ended up with was Option 4: maintaining unit test database content through the regular user interface. Of course most integration tests temporarily modify the database content, as part of the "act" phase of the test (more on this "temporary" later), but the content is not set up when the test session starts.
Here's why.
At some stage we also generated database content at the start of the test session, either by code or by deserializing XML files. (We didn't have EF yet, but otherwise we would probably have had some Seed method in a database initializer.) Gradually I started to have misgivings about this approach. It was a hell of a job to maintain the code/XML when the data model or the business logic changed, especially when new use cases had to be devised. Sometimes I allowed myself a minor corruption of these test data, knowing that it would not affect the tests.
Also, the data had to make sense, in that they had to be as valid and coherent as data from the real application. One way to ensure that is to generate the data through the application itself, or else you will inevitably end up duplicating business logic in the seed method. Mocking real-world data is actually very hard; that's the most important thing I found out. Testing data constellations that don't represent real use cases isn't only a waste of time, it's false security.
So I found myself creating the test data through the application's front end and then painstakingly serializing this content into XML or writing code that would generate exactly the same. Until one day it occurred to me that I had the data readily available in this database, so why not use it directly?
Now maybe you ask: how do you make the tests independent?
Integration tests, just as unit tests, should be executable in isolation. They should not depend on other tests, nor should they be affected by them. I assume that the background of your question is that you create and seed a database for each integration test. This is one way to achieve independent tests.
But what if there is only one database, and no seed scripts? You could restore a backup for each test. We chose a different approach. Each integration test runs within a TransactionScope that's never committed. It is very easy to achieve this. Each test fixture inherits from a base class that has these methods (NUnit):
[SetUp]
public void InitTestEnvironment()
{
    SetupTeardown.PerTestSetup();
}

[TearDown]
public void CleanTestEnvironment()
{
    SetupTeardown.PerTestTearDown();
}
and in SetupTeardown:
public static void PerTestSetup()
{
    _transactionScope = new TransactionScope();
}

public static void PerTestTearDown()
{
    if (_transactionScope != null)
    {
        _transactionScope.Dispose(); // Roll back any changes made in a test.
        _transactionScope = null;
    }
}
where _transactionScope is a static member variable.
Option 2, or any variation thereof that runs the actual DB update scripts, would be the best. Anything short of this means you are not necessarily integration testing against the same database you have in production (with respect to the schema, at least).
In order to address your concern about requiring an update every time new DB scripts are added: if you keep all the scripts in a single folder, perhaps within the project with a build action of "Copy if newer", you can programmatically read each file and execute the script therein. As long as the place you read the files from is your canonical repository for the update scripts, you will never need to go in and make any further changes.
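The reading-and-executing step itself is small. A sketch (EF6-style; it assumes the scripts are plain .sql files copied to the output directory and that running them in name order is safe):

using System;
using System.Data.Entity;
using System.IO;
using System.Linq;

public static class ScriptRunner
{
    public static void RunUpdateScripts(DbContext context)
    {
        // "Scripts" folder copied to the build output via "Copy if newer".
        var scriptDir = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "Scripts");

        foreach (var file in Directory.GetFiles(scriptDir, "*.sql").OrderBy(f => f))
        {
            // Execute each update script in order against the test database.
            context.Database.ExecuteSqlCommand(File.ReadAllText(file));
        }
    }
}

Note that scripts containing GO batch separators would need to be split first, since ExecuteSqlCommand sends its text as a single batch.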
