Bowing to Visual Studio's request, I started my latest project using Entity Framework Core (1.0.1).
So I wrote my database models as I always have, using the 'virtual' modifier to enable lazy loading for a List. But when loading the parent table, it appears that the child list never loads.
Parent Model
public class Events
{
[Key]
public int EventID { get; set; }
public string EventName { get; set; }
public virtual List<EventInclusions> EventInclusions { get; set; }
}
Child Model
public class EventInclusions
{
[Key]
public int EventIncSubID { get; set; }
public string InclusionName { get; set; }
public string InclusionDesc { get; set; }
public Boolean InclusionActive { get; set; }
}
Adding new records to these tables seems to work as I am used to where I can nest the EventInclusions records as a List inside the Events record.
Though when I query this table
_context.Events.Where(e => e.EventName == "Test")
The Issue
EventInclusions will return a null value regardless of the data behind the scenes.
After reading a bit, I get the feeling this is a change between EF6, which I normally use, and EF Core.
I could use some help either with a blanket lazy-loading-on statement or with figuring out the new syntax for specifying lazy loading.
Caz
Lazy loading is now available as of EF Core 2.1, and here is a link to the relevant docs:
https://learn.microsoft.com/en-us/ef/core/querying/related-data#lazy-loading
So it appears that EF Core does not currently support lazy loading. It's coming, but may be a while off.
For now, if anyone else comes across this problem and is struggling, below is a demo of eager loading, which is what you have to use for now.
Say before you had a person object and that object contained a List of Hats in another table.
Rather than writing
var person = _context.Person.FirstOrDefault(p => p.id == id);
var hats = person.Hats.Where(h => h.id == hat).ToList();
You need to write
var person = _context.Person.Include(p => p.Hats).FirstOrDefault(p => p.id == id);
And then person.Hats.Where(h=> h.id == hat).ToList(); will work
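Applied to the models from the original question, the same eager-loading pattern would look something like this (a sketch; it assumes the context exposes the Events DbSet shown in the question):

```csharp
// Eager-load the child list together with the parent row.
var testEvent = _context.Events
    .Include(e => e.EventInclusions)
    .FirstOrDefault(e => e.EventName == "Test");

// testEvent.EventInclusions is now populated instead of null.
```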
If you have multiple Lists - Chain the Includes
var person = _context.Person.Include(p => p.Hats).Include(p => p.Tickets)
    .Include(p => p.Smiles).FirstOrDefault(p => p.id == id);
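If the children have children of their own, EF Core also supports chaining ThenInclude after an Include to load deeper levels. A sketch, where the Feathers collection on Hat is a hypothetical example:

```csharp
var person = _context.Person
    .Include(p => p.Hats)
        .ThenInclude(h => h.Feathers) // hypothetical grandchild collection
    .FirstOrDefault(p => p.id == id);
```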
I kind of get why this method is safer: you're not loading huge data sets that could slow things down. But I hope they bring lazy loading back soon!
Caz
You can install this package to enable lazy loading in EF Core 2.1:
Microsoft.EntityFrameworkCore.Proxies
Then set this configuration in your EF DbContext:
protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
=> optionsBuilder
.UseLazyLoadingProxies()
.UseSqlServer("myConnectionString");
Note: this package only works on EF Core 2.1 and above.
For EF Core 2.1 and above,
Install:
dotnet add package Microsoft.EntityFrameworkCore.Proxies --version 2.2.4
Then update your Startup.cs file as indicated below.
using Microsoft.EntityFrameworkCore.Proxies;
services.AddEntityFrameworkProxies();
services.AddDbContext<BlogDbContext>(options =>
{
options.UseSqlite(Configuration.GetSection("ConnectionStrings")["DefaultConnection"]);
options.UseLazyLoadingProxies(true);
});
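Note that lazy-loading proxies only work on navigation properties declared virtual, because the generated proxy overrides them to intercept the first access. Using the models from the original question, the parent would look like:

```csharp
public class Events
{
    public int EventID { get; set; }
    public string EventName { get; set; }

    // Must be virtual so the runtime proxy can override it and
    // trigger the lazy load on first access.
    public virtual List<EventInclusions> EventInclusions { get; set; }
}
```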
There's a pre-release version that just came out; regardless, it's supposed to be available in a full release soon.
A couple of caveats:
All your navigation properties that are more than simple types (i.e. any other classes/tables) need to be public virtual (with the default scaffolding they're not).
This line gets tucked into OnConfiguring on your data context:
optionsBuilder.UseLazyLoadingProxies();
It's (currently) pre-release so may the force be with you.
LazyLoading is not yet supported by EF Core, but there is a non-official library that enables LazyLoading: https://github.com/darxis/EntityFramework.LazyLoading. You can use it until it is officially supported.
It supports EF Core v1.1.1. It is available as a nuget package: https://www.nuget.org/packages/Microsoft.EntityFrameworkCore.LazyLoading/
Disclaimer: I am the owner of this repo and invite you to try it out, report issues and/or contribute.
Lazy loading is planned for EF Core 2.1 - you can read more on why it is a must-have feature here.
Related
I have an API that uses code-first Entity Framework, and uses AutoMapper to map the entities to models before surfacing the data to the consumer. The mapping profiles exist in the same files as the respective model classes.
I'm trying to restructure the projects a bit and pull the models out into a separate project, while keeping the mapping profiles where they are in the existing project (and updating the references to point to the new project after the models are removed locally, of course). Upon doing so, AutoMapper stops working.
Before the restructure
The structure of the solution before the restructure is as follows:
Api project (Houses the controllers and api endpoints)
Startup.cs
public void ConfigureServices(IServiceCollection services) {
...
services.AddAutoMapper(typeof(LocaleModel).GetTypeInfo().Assembly)
...
}
Application project (Houses the MediatR handlers, models, mapping profiles, etc.)
Models/LocaleModel.cs
public class LocaleModel {
public long Id { get; set; }
public string Code { get; set; }
public string Name { get; set; }
}
public class LocaleModelMapping : Profile {
public LocaleModelMapping() {
CreateMap<Locale, LocaleModel>();
CreateMap<LocaleModel, Locale>();
}
}
Queries/Locales/Get/GetLocalesRequestHandler.cs
public async Task<IEnumerable<LocaleModel>> Handle(GetLocalesRequest request, CancellationToken cancellationToken) {
var locales = await DbContext.Locales
.AsNoTracking()
.ToListAsync(cancellationToken);
return Mapper.Map<List<LocaleModel>>(locales);
}
After the restructure
The structure of the solution after the restructure:
Api project (Houses the controllers and api endpoints)
Startup.cs
public void ConfigureServices(IServiceCollection services) {
...
services.AddAutoMapper(typeof(LocaleModelMapping).GetTypeInfo().Assembly)
...
}
Application project (Houses the MediatR handlers, models, mapping profiles, etc.)
Mappings/LocaleModelMapping.cs
public class LocaleModelMapping : Profile {
public LocaleModelMapping() {
CreateMap<Locale, LocaleModel>();
CreateMap<LocaleModel, Locale>();
}
}
Models project (Houses the models only)
LocaleModel.cs
public class LocaleModel {
public long Id { get; set; }
public string Code { get; set; }
public string Name { get; set; }
}
The references were updated as necessary so that the Application project is aware of the Models project.
The moment I remove the models from the Application project and refer to the Models project, AutoMapper stops working, even though I updated Startup.cs to use the assembly of a mapping profile from the Application project.
The error that is generated is as follows (full namespace censored):
Mapping types:
List`1 -> List`1
List[...Domain.Locale, ...Domain, Version=2.0.0.0]
-> List[...Application.Models.LocaleModel, ...Application.Models, Version=1.0.0.0]
---> AutoMapper.AutoMapperMappingException: Missing type map configuration or unsupported mapping.
Things I've tried:
Putting all the mapping profiles in a single class.
Several different ways of passing in the assembly, types, etc. into services.AddAutoMapper().
Registering the mapping profiles manually in Startup.cs.
Mirrored another project with the same intended structure, where it works without issue.
Any help is appreciated!
Moving the model and mapping should not have required any change to the services.AddAutoMapper call beyond where the project resolves the assembly reference from. You should have been able to leave Startup.cs as services.AddAutoMapper(typeof(LocaleModel).GetTypeInfo().Assembly); the only difference would be the namespace that LocaleModel resolves from.
The first thing I would check is whether you accidentally selected an IntelliSense option to create a new LocaleModelMapping class when you updated Startup.cs. Instead of appending a using directive for the new namespace in your Application project, it may have created a dummy LocaleModelMapping class somewhere in your API project, leaving AutoMapper trying to resolve mappings from the API project instead of the Application project. You can verify this by right-clicking on LocaleModelMapping inside (typeof(LocaleModelMapping)... and selecting "Go to Definition". Does this navigate you into your Application project, to an empty class in your API project, or somewhere else (i.e. a disassembly of a stale assembly reference)?
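Another way to surface missing or misplaced map registrations early is AutoMapper's built-in configuration check; a sketch (AddMaps is available in recent AutoMapper versions):

```csharp
var config = new MapperConfiguration(cfg =>
    cfg.AddMaps(typeof(LocaleModelMapping).Assembly));

// Throws AutoMapperConfigurationException up front, describing any
// incomplete mappings, instead of failing later at Map() time.
config.AssertConfigurationIsValid();
```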
Found the solution after a bit more searching.
In our Autofac dependency injection module, which is called from ConfigureContainer(...) within Startup.cs, there was some code registering the AutoMapper profiles using one of the models. The models used to be coupled with the mappings before I pulled them out into a separate project, but they no longer are.
// AutoMapper Profiles
var profiles = typeof(LocaleModelMapping).Assembly.GetTypes()
.Where(t => typeof(Profile).IsAssignableFrom(t))
.Select(t => (Profile)Activator.CreateInstance(t));
builder.Register(ctx => new MapperConfiguration(cfg =>
{
foreach (var profile in profiles) cfg.AddProfile(profile);
}));
builder.Register(ctx => ctx.Resolve<MapperConfiguration>().CreateMapper())
.As<IMapper>()
.InstancePerLifetimeScope();
Simply updating the reference to use a mapping class instead of a model class fixed the issue.
Alternatively, removing the code altogether also worked. The Startup.cs code is already registering the AutoMapper profiles via services.AddAutoMapper(...), so this code in the Autofac module is not necessary.
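With the Autofac registration removed, the one surviving registration can stay in Startup.cs; a sketch, assuming the AutoMapper.Extensions.Microsoft.DependencyInjection package:

```csharp
// Scans the Application assembly (which contains the profiles)
// and registers IMapper in the service container.
services.AddAutoMapper(typeof(LocaleModelMapping).Assembly);
```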
For Blazor WebAssembly I came up with the idea of using SQLite. This question mentions it is not possible. Is it possible to use SQLite in Blazor WebAssembly and if so, how?
As of .NET 6, you can include native dependencies in Blazor WebAssembly, one example in fact being SQLite. See the example code here: https://github.com/SteveSandersonMS/BlazeOrbital/blob/6b5f7892afbdc96871c974eb2d30454df4febb2c/BlazeOrbital/ManufacturingHub/Properties/NativeMethods.cs#L6
Starting with .NET 6, it is now possible to use SQLite with Blazor WebAssembly.
Here are the steps:
Add references to the following NuGet packages:
a. Microsoft.EntityFrameworkCore.Sqlite
b. SQLitePCLRaw.bundle_e_sqlite3 - currently in preview as of posting this answer. This package avoids needing a NativeFileReference to e_sqlite3.o.
Add the following to the .csproj to avoid unwanted warnings:
<AllowUnsafeBlocks>true</AllowUnsafeBlocks>
<EmccExtraLDFlags>-s WARN_ON_UNDEFINED_SYMBOLS=0</EmccExtraLDFlags>
Add the following code in Program.cs. This is required to avoid the runtime exception: Could not find method 'AddYears' on type 'System.DateOnly'.
public partial class Program
{
/// <summary>
/// FIXME: This is required for EF Core 6.0 as it is not compatible with trimming.
/// </summary>
[DynamicallyAccessedMembers(DynamicallyAccessedMemberTypes.All)]
private static Type _keepDateOnly = typeof(DateOnly);
}
I'm using the SQLite in-memory store. Here is the code example.
Database Model:
public class Name
{
public int Id { get; set; }
public string FullName { get; set; }
}
Database Context:
public class TestDbCOntext : DbContext
{
public DbSet<Name> Names { get; set; } = default!;
public TestDbCOntext(DbContextOptions<TestDbCOntext> options) : base(options)
{
}
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
base.OnModelCreating(modelBuilder);
modelBuilder.Entity<Name>().ToTable("Names");
modelBuilder.Entity<Name>().HasIndex(x => x.FullName);
modelBuilder.Entity<Name>().Property(x => x.FullName).UseCollation("nocase");
}
protected override void OnConfiguring(DbContextOptionsBuilder options)
{
options.LogTo(Console.WriteLine, LogLevel.Warning)
.EnableDetailedErrors()
.EnableSensitiveDataLogging(true);
}
}
Page/Component:
<button @onclick="RunEfCore">Run Ef Core</button>
@code {
private async Task RunEfCore()
{
var connectionStringBuilder = new SqliteConnectionStringBuilder { DataSource = ":memory:" };
var connection = new SqliteConnection(connectionStringBuilder.ToString());
var options = new DbContextOptionsBuilder<TestDbCOntext>()
.UseSqlite(connection)
.Options;
using var db = new TestDbCOntext(options);
db.Database.OpenConnection();
await db.Database.EnsureCreatedAsync();
var nextId = db.Names!.Count() + 1;
db.Names.Add(new Name { Id = nextId, FullName = "Abdul Rahman" });
await db.SaveChangesAsync();
Console.WriteLine();
await foreach (var name in db.Names.AsAsyncEnumerable())
{
Console.WriteLine(name.FullName);
}
db.Database.CloseConnection();
}
}
For persistence, you can make use of IndexedDB in the browser and sync to save on your server.
A sample working demo can be found in my GitHub repo - BlazorWasmEfCore
Live Demo
Refer to the GitHub issue for the complete history.
Steve Sanderson Video Reference
Points to consider:
The details below are taken from the following Stack Overflow answer.
A good rule of programming is KISS - Keep It Simple. So if the requirements of your app are satisfied by LINQ to Objects, then complicating it with SQLite would seem to be the wrong thing to do.
However, in the video Steve S. does come up with requirement parameters that lend themselves to using SQLite - the application needs to process:
lots of data
in the client
offline
quickly
The use of SQLite here is not for persisting beyond client app memory.
So the answer to your question on the advantage of a Blazor app using SQLite is simply:
If an app needs the functionality of an in-memory relational database but is trying to avoid using one, it will either be reinventing the wheel in its implementation, or missing necessary functionality.
If an app doesn't need the functionality of an in-memory database but is using one, it will be introducing unnecessary complexity (cost).
Your Blazor WebAssembly C# code still runs in the sandbox of the browser, that means it is not allowed to open files on the local drive.
Blazor WebAssembly has the same access to the machine as any regular website.
Even if someone was to port SQLite to WebAssembly you would not be able to open a database file.
For storage on the client computer you are limited to local storage, which is capped at around 5 MB (this might differ per browser brand) and can only contain strings. It is not a reliable option either, as the data will be removed when the user clears the cache, browser history, etc.
The only option you have is storing data on the server.
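For the small amounts of string data that local storage can hold, a Blazor WebAssembly component can reach it through JavaScript interop; a minimal sketch using IJSRuntime (the key and value names are illustrative):

```csharp
@inject IJSRuntime JS

@code {
    private async Task SaveAsync()
    {
        // localStorage stores strings only, so serialize anything complex first.
        await JS.InvokeVoidAsync("localStorage.setItem", "lastUser", "alice");
    }

    private async Task<string?> LoadAsync()
        => await JS.InvokeAsync<string>("localStorage.getItem", "lastUser");
}
```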
I'm currently working on an ASP.NET 5.0 application - I migrated the app from .NET 3.1 to .NET 5.0. I use Entity Framework Core with Devart.Data.Oracle.EFCore to connect to an Oracle database.
When I try to save a simple record with a string value and call the Context.SaveChangesAsync() method, I get this error: Sequence contains no elements. Oddly, the source of the exception is System.Linq => System.Linq.ThrowHelper.ThrowNoElementsException(). There is a stack trace but unfortunately no detailed inner exception message.
My save method in the repository looks like this:
public async Task<T> Add(T entity)
{
// ...
await Context.AddAsync<T>(entity);
await Context.SaveChangesAsync();
return entity;
}
The Tree entity class looks like this:
public class Tree
{
public int Id { get; set; }
public string Name { get; set; }
}
The Tree Entity Configuration looks like this (Scaffolded with EF Core):
public void Configure(EntityTypeBuilder<Tree> entity)
{
entity.ToTable("TREES");
entity.HasIndex(e => e.Id, "SYS_C001234")
.IsUnique();
entity.Property(e => e.Id).HasColumnName("ID");
entity.Property(e => e.Name).HasColumnName("NAME");
}
On my AppContext, the Trees are registered:
public virtual DbSet<Tree> Trees { get; set; }
In the DB, the Tree table looks like this:
Column_Name | Data_Type
------------------------
ID | NUMBER(10,0)
NAME | NCLOB
It is odd, because it worked before I ported my app to .NET 5.0 and updated all the NuGet packages...
I tracked the error down to the following NuGet package updates:
I cannot update Devart.Data.Oracle.EFCore alone, because it depends on the Microsoft NuGet packages listed below; I must update all 5 NuGet packages at once.
After this NuGet package update, I can no longer save the entity.
Maybe I'm missing something in my configuration?
Do you know how to solve this issue?
The bug with throwing "Sequence contains no elements" on insert or update in EF Core 5 is fixed. We are going to release the new public build of dotConnect for Oracle this week.
I am building a simple ASP.NET Core app on Linux (Pop!_OS). I am using Vue.js + ASP.NET Core 3.1.101.
I am trying to do a POST call to my app and my model is like below:
public class AddConfigurationContextValueApiRequest
{
public int ContextId { get; set; }
[Required(ErrorMessage = "Value is required to continue")]
[StringLength(500, ErrorMessage = "Value can not be longer than 500 characters")]
public string Value { get; set; }
[StringLength(500, ErrorMessage = "Display name can not be longer than 500 characters")]
public string DisplayName { get; set; }
}
As you can see, there is no Required attribute on the DisplayName field, but whenever I pass a null value for this field from the Vue.js app, I get "The DisplayName field is required."
I am trying to figure out why ASP.NET Core would complain about this, since there is no Required attribute on the field!
Does anybody know if this is intentional? I tried removing the StringLength attribute, and it still triggers the required validation.
My action is fairly simple:
[HttpPost(UrlPath + "addConfigurationContextValue")]
public async Task AddConfigurationContextValue([FromBody]AddConfigurationContextValueApiRequest request)
{
using var unitOfWork = _unitOfWorkProvider.GetOrCreate();
if (!ModelState.IsValid)
{
//Here it throws because ModelState is invalid
throw new BadRequestException(ModelState.GetErrors());
}
//do stuff
await unitOfWork.CommitAndCheckAsync();
}
I have seen the same issue, where the .csproj Nullable setting caused a property that was not marked as [Required] to act as though it were. I took a different approach than changing the Nullable setting in the .csproj file.
In my case it came down to a property that is required by the database, but which the model allows to be null during POST, as this particular property is a secret from the user. So I had initially avoided changing string to string?.
Once again, the Fluent API has provided an alternative solution.
Original Property
[JsonIgnore]
[StringLength(15)]
public string MyProperty { get; set; }
Updated Property
[JsonIgnore]
public string? MyProperty { get; set; }
Fluent API Directives (in your DbContext file)
protected override void OnModelCreating(ModelBuilder builder) {
builder.Entity<MyClass>(c => {
c.Property(p => p.MyProperty)
.IsRequired()
.HasMaxLength(15)
.IsFixedLength();
});
}
Apparently .NET 6 Web APIs have the Nullable property enabled by default. I simply had to remove it.
.csproj file:
Edit: As Luke pointed out (without elaborating any further), the above behavior is working as intended and actually makes sense. If your JSON has a null value, your code might crash if it is not handled. Throwing a compile-time error is impossible, since the JSON is unknown. Forcing yourself to use nullable reference types makes the code more resilient.
I haven't checked yet how Swagger behaves; the default config surely doesn't care about a .NET 6 setting and will no longer flag mandatory fields if you leave out the [Required] attribute and use only nullable reference types to indicate what is mandatory and what isn't.
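In practice, if the intent is for DisplayName to stay optional while keeping Nullable enabled, the model from the question can declare it as a nullable reference type; a sketch:

```csharp
[StringLength(500, ErrorMessage = "Display name can not be longer than 500 characters")]
public string? DisplayName { get; set; } // nullable: no implicit required validation
```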
After @devNull's suggestion, I found out that while I was playing around with the Rider IDE, it seems it switched that feature on!
There is an option in Rider that allows changing that configuration at the project level:
If somebody has the same problem: right-click at the project level, go to Properties, then Application, and there you can see this configuration.
Thank you @devNull for the help :)
I am working on a site where we'd like the schema to be the differentiator between projects and, in doing so, secure things between projects.
I have had some luck with this using the HasDefaultSchema method in OnModelCreating in my data context, combined with making my DbContext implement IDbModelCacheKeyProvider and implementing the CacheKey property. It has worked before. Unfortunately, this solution seems to be inconsistent, and I am currently having a problem where I am trying to update my model but get the following error when running Update-Database:
The specified schema name "dbo" either does not exist or you do not have permission to use it.
The connection string has a user that only has access to my xma schema, so this error would make sense if I had changed the schema; but as you can see in the following code, I haven't:
public class DataContext : DbContext, IDbModelCacheKeyProvider
{
public DataContext()
//: base("name=DataContext")
: base("DataContext")
{
}
protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
...
modelBuilder.HasDefaultSchema("xma");
base.OnModelCreating(modelBuilder);
}
public virtual DbSet<Article> Articles { get; set; }
public virtual DbSet<Author> Authors { get; set; }
...
public string CacheKey
{
get { return Utility.SchemaPrefix ?? "xma"; }
}
}
This problem occurs with other team members too, so any help would be much appreciated.
Thanks
Edit: I forgot to mention, this appears to be a problem on a clean database.
I had removed the pending changes to the model and was able to run update-database -script as advised by Steve. It turns out that the initial migrations are based on the dbo schema, so I changed to a more privileged user and was able to continue with recreating the database. My mistake, unfortunately.
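For reference, the scripted variant mentioned above is run from the Package Manager Console; it emits the migration SQL as a script for inspection instead of applying it directly:

```powershell
# EF6 Package Manager Console: generate the migration SQL without applying it
Update-Database -Script
```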