I'm pretty new to ASP.NET MVC, but I know a lot more about PHP. If I had two connected tables in a database, I'd normally connect them with an ID, i.e. a foreign key.
Example:
Movies:
+ ID
+ Title
+ Description
+ Genre
Comments:
+ ID
+ MoviesID
+ Comment
Thus creating a one-to-many relationship. But I saw that in ASP.NET MVC people create models that reference one another:
public class Movie {
    // Annotations not included
    public int ID { get; set; }
    public string Title { get; set; }
    public string Description { get; set; }
    public string Genre { get; set; }
    public List<Comment> Comment { get; set; }
}

public class Comment {
    public int ID { get; set; }
    public Movie Movie { get; set; }
    public string Text { get; set; } // the comment text (a member cannot share its enclosing type's name)
}
So for the parts where one model references another: what does that look like in the database, how do you fill those properties up, and how do you pass the values of one comment/movie class into the other class's table when you want to create a new row? (If that makes sense - an example would be how to reference the movie in the database when you insert a new comment.) Or, at the very least, could you give me some source to read about it? I have found nothing.
Of course, another question is whether this is a smart thing to do, or whether you should just do it the "traditional" way, following the pattern I showed at the top.
Hope it's understandable, thanks!
What you are referring to is an ORM, which stands for Object-Relational Mapping.
Basically, an ORM lets you treat tables and the relations between them as objects. This approach makes programming much easier because your code and your database share a common model.
ORM tools are also widely used in PHP (look at Propel or Doctrine). For C# you can look at NHibernate, Entity Framework, or micro-ORMs like Dapper. The example you provided as the .NET approach is similar to your PHP approach. The only difference is that you explicitly mark one property as the foreign key. You could change your example to:
public class Movie {
    // Annotations not included
    public int ID { get; set; }
    public string Title { get; set; }
    public string Description { get; set; }
    public string Genre { get; set; }
    public List<int> CommentIds { get; set; }
}

public class Comment {
    public int ID { get; set; }
    public int MovieID { get; set; }
    public string Text { get; set; } // the comment text
}
but this would only load the identifiers of the related records. When using an ORM, you can instead declare the property (which in the table is just an identifier pointing at another table's record) as a strongly typed class, which enables you to load all of its data from the database.
To make a long story short: when using an ORM you can load the whole Movie when fetching a Comment from the database, not just its identifier.
The whole process and its configuration depend on the ORM tool you are using; you can use mapping attributes (e.g. in Entity Framework) or fluent mappings (when using NHibernate with Fluent NHibernate). Those tools are quite complex - there are many issues to solve (eager/lazy loading, connection management, session management, LINQ to Entities and many more), so it is impossible to explain it all on SO :)
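To give you a taste of how this looks in practice with Entity Framework Code First (the MoviesContext class and the method names below are my own assumptions, not something prescribed by ASP.NET MVC), inserting a comment for an existing movie could look roughly like this:

using System.Data.Entity;
using System.Linq;

// Hypothetical context exposing the two model classes from your example.
public class MoviesContext : DbContext
{
    public DbSet<Movie> Movies { get; set; }
    public DbSet<Comment> Comments { get; set; }
}

public static class CommentService
{
    // Adds a comment to an existing movie; EF writes the foreign key column for you.
    public static void AddComment(int movieId, string text)
    {
        using (var db = new MoviesContext())
        {
            var movie = db.Movies.Find(movieId);          // load the parent row
            db.Comments.Add(new Comment { Movie = movie, Text = text });
            db.SaveChanges();                             // INSERT with the movie's ID as the foreign key
        }
    }

    // Loads a movie together with all of its comments in a single query.
    public static Movie GetMovieWithComments(int movieId)
    {
        using (var db = new MoviesContext())
        {
            return db.Movies
                     .Include(m => m.Comment)
                     .Single(m => m.ID == movieId);
        }
    }
}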
I am using the code-first approach for my .NET Core project. We are using multiple schemas in a single database.
We maintain the models in different class library projects, e.g. Inventory, Sales, Finance, etc.
The model mapping looks like this (note: the classes below are in different namespaces):
[Table(name: "Product", Schema = "Inventory")]
public class Product
{
    public int ProductId { get; set; }
    public string Name { get; set; }
}

[Table(name: "Order", Schema = "Sales")]
public class Order
{
    public int OrderId { get; set; }
    public int ProductId { get; set; }
    public virtual Product Product { get; set; }
}
Within a schema I am able to add relations using attributes. Now I want to add a relation between these tables across schemas.
I have tried some tweaks, but nothing has worked.
Any help is appreciated.
Update:
The DbContext is also different for each schema, and each one is placed in its respective class library.
You have conflicting approaches in your current architecture:
On one hand, you have gone for a micro-services approach where each service is dealing with its own bounded context (e.g. Inventory, Sales, etc.). This seems to be confirmed by your use of different database schemas, which can be viewed as logically different databases that happen to be deployed in a single physical database. This is fine, and allows for future scaling / segregation where you might move the inventory data into its own database, for example.
On the other hand, you are trying to treat the datastore as a monolithic artefact where you can build table relationships across the bounded context boundary established by your micro-service approach.
If you wish to maintain your micro-services approach, which is perfectly reasonable, then you have to accept that you cannot rely on database-enforced referential integrity for table relationships that span entities in different micro-services.
You'd need a layer above that can retrieve data from separate micro-services and put them together into entities (preferably DTO entities, not the EF Data Entities used for code-first) that the consumer is looking for.
This layer would first retrieve Orders from the 'Sales' service, and then enumerate your Orders and retrieve the relevant Products from the 'Inventory' service and then map those into DTO entities that include navigation properties between OrderDTO and ProductDTO.
Your Order data entity should not have a virtual navigation property to "Product", but should instead just hold a unique ID of the Product that the Order relates to (not enforced as a database relationship). Personally, I would go further and introduce a GUID identifier on the Product data class that the Order uses to uniquely identify the product. That way, if you ever do migrate your Inventory tables to a new database, you don't have to worry about preserving the database identity column during that migration, because the reference 'outside' of the Inventory service to the Product table would be the GUID.
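A rough sketch of what that could look like (the ProductGuid property and the exact shapes are my own illustration, not code from your project):

using System;
using System.ComponentModel.DataAnnotations.Schema;

// Inventory bounded context
[Table(name: "Product", Schema = "Inventory")]
public class Product
{
    public int ProductId { get; set; }

    // Stable identifier other services can reference; it survives a later move
    // of the Inventory tables to their own database.
    public Guid ProductGuid { get; set; }

    public string Name { get; set; }
}

// Sales bounded context: no navigation property and no database-enforced foreign key.
[Table(name: "Order", Schema = "Sales")]
public class Order
{
    public int OrderId { get; set; }

    // Loose reference to the product, resolved by the composition layer
    // that builds OrderDTO/ProductDTO for consumers.
    public Guid ProductGuid { get; set; }
}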
You have a typo
public virtal Product Product {get;set;}
should be virtual
Also, you need a navigation property in Product:
[Table(name: "Product", Schema = "Inventory")]
public class Product
{
    public int ProductId { get; set; }
    public string Name { get; set; }
    public virtual ICollection<Order> Orders { get; set; }
}

[Table(name: "Order", Schema = "Sales")]
public class Order
{
    public int OrderId { get; set; }
    public int ProductId { get; set; }
    public virtual Product Product { get; set; }
}
EF Core has a "code first mentality" by default, i.e. it is supposed to be used in a code-first manner, and even though database-first approach is supported, it is described as nothing more than reverse-engineering the existing database and creating code-first representation of it. What I mean is, the model (POCO classes) created in code "by hand" (code-first), and generated from the database (by Scaffold-DbContext command), should be identical.
Surprisingly, the official EF Core docs demonstrate significant differences. Here is an example of creating the model in code: https://ef.readthedocs.io/en/latest/platforms/aspnetcore/new-db.html And here is the example of reverse-engineering it from an existing database: https://ef.readthedocs.io/en/latest/platforms/aspnetcore/existing-db.html
This is the entity class in first case:
public class Blog
{
    public int BlogId { get; set; }
    public string Url { get; set; }
    public List<Post> Posts { get; set; }
}

public class Post
{
    public int PostId { get; set; }
    public string Title { get; set; }
    public string Content { get; set; }
    public int BlogId { get; set; }
    public Blog Blog { get; set; }
}
and this is the entity class in second case:
public partial class Blog
{
    public Blog()
    {
        Post = new HashSet<Post>();
    }

    public int BlogId { get; set; }
    public string Url { get; set; }
    public virtual ICollection<Post> Post { get; set; }
}
The first example is a very simple, quite obvious POCO class. It is shown everywhere in the documentation (except in the examples generated from a database). The second example, though, has some additions:
The class is declared partial (even though no other partial definition of it is anywhere to be seen).
The navigation property is of type ICollection<T> instead of just List<T>.
The navigation property is initialized to new HashSet<T>() in the constructor. There is no such initialization in the code-first example.
The navigation property is declared virtual.
The DbSet members in the generated context class are also virtual.
I've tried scaffolding the model from a database (with the latest tooling as of this writing) and it generates entities exactly as shown, so this is not an outdated-documentation issue. So the official tooling generates different code, and the official documentation suggests writing different (trivial) code - without the partial class, virtual members, constructor initialization, etc.
My question is: when building the model in code, how should I write my classes? I like using ICollection instead of List because it is more generic, but other than that, I'm not sure whether I should follow the docs or the MS tools. Do I need to declare the navigation properties as virtual? Do I need to initialize them in a constructor? Etc.
I know from the old EF times that virtual navigation properties allow lazy loading, but that is not even supported (yet) in EF Core, and I don't know of any other uses. Maybe it affects performance? Maybe the tools try to generate future-proof code, so that when lazy loading is implemented, the POCO classes and context will be able to support it? If so, can I ditch them, as I don't need lazy loading (all data querying is encapsulated in a repo)?
In short, please help me understand why there is a difference, and which style I should use when building the model in code.
I'll try to give a short answer to each point you mentioned.
Partial classes are especially useful for tool-generated code. Suppose you want to implement a model-only derived property. For code first you would just add it wherever you want. For database first, the class file will be rewritten whenever you update your model, so if you want to keep your extension code you have to place it in a different file outside the managed model - this is where partial helps you extend the class without tweaking the auto-generated code by hand (see the sketch after these points).
ICollection is definitely a suitable choice, even for code first. Your database probably won't support a defined order anyway without a sorting statement.
Constructor initialization is a convenience at the very least: suppose the collection is empty in the database, or you didn't load the property at all. Without the constructor initialization you would have to handle null cases explicitly at arbitrary points in your code. Whether you should go with List or HashSet is something I can't answer right now.
virtual enables proxy creation for the database entities, which can help with two things: lazy loading, as you already mentioned, and change tracking. A proxy object can track changes to virtual properties immediately in the setter, while normal objects in the context need to be inspected on SaveChanges. In some cases this might be more efficient (though not in general).
virtual DbSet properties on the context make it easier to design mock contexts for unit tests. Other use cases might also exist.
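To make the partial-class point concrete, here is a small sketch (the file names and the HasPosts property are invented for illustration; the generated part mirrors the scaffolded Blog class shown above):

using System.Collections.Generic;

// Blog.cs - re-created every time you scaffold from the database.
public partial class Blog
{
    public Blog()
    {
        Post = new HashSet<Post>();   // never null, even before anything is loaded
    }

    public int BlogId { get; set; }
    public string Url { get; set; }
    public virtual ICollection<Post> Post { get; set; }
}

// BlogExtensions.cs - your own file, never touched by the scaffolder.
public partial class Blog
{
    // Model-only derived property; it survives re-scaffolding because it
    // lives in a separate partial file.
    public bool HasPosts => Post.Count > 0;
}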
I want to have a page where the user selects a category from a drop-down list, adds a small text about that category, and uploads an image; the path of that image should be saved in the database rather than the whole image. I have created a table "Categories" which only the admin is authorized to fill; the user just selects from the categories list.
Here is what I have done so far:
The create categories model:
using System;
using System.Collections.Generic;
using System.ComponentModel.DataAnnotations;
using System.Data.Entity;
using System.Linq;
using System.Web;

namespace DemoIdentity.Models
{
    public class CategoriesAdmin
    {
        public int ID { get; set; }

        [Required(AllowEmptyStrings = false)]
        [Display(Name = "category name")]
        public string categoryName { get; set; }
    }

    public class DefaultConnection : DbContext
    {
        public DbSet<CategoriesAdmin> categories { get; set; }
    }
}
Now I want to have another table (Data) which includes ID, Category (the category name selected from the Categories table), News, and Image_Path. This table is in the DefaultConnection database. The category is the name selected from a drop-down list, and the image path comes from an uploaded image, where only the path is saved rather than the whole image.
I am unsure of how to achieve this.
It appears that you are confusing components of ASP.NET MVC and Entity Framework.
As the Entity Framework site states:
Entity Framework (EF) is an object-relational mapper that enables .NET
developers to work with relational data using domain-specific objects.
It eliminates the need for most of the data-access code that
developers usually need to write.
And the MVC site states that:
The ASP.NET MVC is an open source web application framework that
implements the model–view–controller (MVC) pattern.
The two frameworks meet through your model classes. MVC uses the model class to define the data, logic and rules of the application. In Entity Framework, your model class is mapped to tables in your database where it handles the direct reads and writes for you.
By creating your CategoriesAdmin model class and exposing it as a property in your DbContext class as such:
public class DefaultConnection : DbContext
{
    public DbSet<CategoriesAdmin> categories { get; set; }
}
Entity Framework will have mapped your model class to a database table called CategoriesAdmins. If this table does not yet exist in your database, it will automatically create it for you. This approach in Entity Framework is known as Code First to a new Database.
Now since you already have a table that stores the available categories (CategoriesAdmin), you will need to create a second model class (called Data for the sake of your example) which contains properties for the other bits of information that you want to store.
public class Data
{
    // gets or sets the ID of this Data record.
    public int ID { get; set; }

    public string ImagePath { get; set; }

    // other properties
    // ...
}
Now that you have two model classes, you need to create a relationship between the two. In a SQL database this is achieved by Foreign Keys. In Entity Framework, you can achieve the same by using Navigational Properties.
So we update the Data model class as such:
public class Data
{
    // gets or sets the ID of this Data record.
    public int ID { get; set; }

    public string ImagePath { get; set; }

    // gets or sets the ID of the related CategoriesAdmin record.
    public int CategoriesAdminId { get; set; }

    // gets or sets the related CategoriesAdmin record. Entity Framework will
    // automatically populate this property with an object for the related
    // CategoriesAdmin record.
    [ForeignKey("CategoriesAdminId")]
    public virtual CategoriesAdmin CategoriesAdmin { get; set; }

    // other properties
    // ...
}
The ForeignKeyAttribute on the CategoriesAdmin property gives Entity Framework a further hint about which foreign key column to load the navigation property from.
Finally to be able to use your new Data model class with Entity Framework, you need to add another property to your DbContext class so that you have a means of accessing your data:
public class DefaultConnection : DbContext
{
    public DbSet<CategoriesAdmin> Categories { get; set; }
    public DbSet<Data> Data { get; set; }
}
Now that you have created your model classes and wired them into Entity Framework, you will now be able to use them in MVC. If you load your Data model into your view (using DefaultConnection.Data), you will be able to access the related CategoriesAdmin record by accessing the CategoriesAdmin property on the Data object.
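For illustration, a rough sketch of a controller action that saves a new Data record could look like the following. The controller and action names, the upload folder, and the assumption that Data also has a News string property among its "other properties" are mine, not something prescribed by MVC or Entity Framework:

using System.IO;
using System.Web;
using System.Web.Mvc;
using DemoIdentity.Models;

public class DataController : Controller
{
    [HttpPost]
    [ValidateAntiForgeryToken]
    public ActionResult Create(int selectedCategoryId, string news, HttpPostedFileBase image)
    {
        // Save the uploaded file to disk and keep only its relative path.
        var relativePath = "~/Uploads/" + Path.GetFileName(image.FileName);
        image.SaveAs(Server.MapPath(relativePath));

        using (var db = new DefaultConnection())
        {
            db.Data.Add(new Data
            {
                CategoriesAdminId = selectedCategoryId, // value posted from the drop-down list
                News = news,                            // assumed string property on Data
                ImagePath = relativePath                // store the path, not the image bytes
            });
            db.SaveChanges();
        }

        return RedirectToAction("Index");
    }
}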
In short: two tables means you need two models. Both models can be loaded into the single view.
Footnote: Apologies if there are large gaps in my answer; there is a lot to explain that has already been explained elsewhere far better than I can. The references I have linked should hopefully fill in the gaps.
Should you need more help, please see all of the tutorials on the ASP.NET MVC website on working with data. They're much better written than my concise attempt. I would recommend following them exactly and getting the examples to work before completing your own project so that you have a better understanding of how the two frameworks work and interact with each other.
I have a quick question about the sqlite-net library, which can be found here: https://github.com/praeclarum/sqlite-net.
The thing is, I have no idea how collections and custom objects will be inserted into the database, or how to convert them back when querying, if that is even needed.
Take this model for example:
[PrimaryKey, AutoIncrement]
public int Id { get; set; }
private string _name; // The name of the subject. i.e "Physics"
private ObservableCollection<Lesson> _lessons;
Preface: I've not used sqlite-net; rather, I spent some time simply reviewing the source code on the github link posted in the question.
From the first page on the sqlite-net github site, there are two bullet points that should help in some high level understanding:
Very simple methods for executing CRUD operations and queries safely (using parameters) and for retrieving the results of those queries in a strongly typed fashion
In other words, sqlite-net works well with non-complex models; it will probably work best with flattened models.
Works with your data model without forcing you to change your classes. (Contains a small reflection-driven ORM layer.)
In other words, sqlite-net will transform/map the result set of the SQL query to your model; again, it will probably work best with flattened models.
Looking at the primary source code of SQLite.cs, there is an InsertAll method and a few overloads that will insert a collection.
When querying for data, you should be able to use the Get<T> and Table<T> methods, and there is also a Query<T> method you could take a look at as well. Each should map the results to the type parameter.
Finally, take a look at the examples and tests for a more in-depth look at using the framework.
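Put together, a minimal usage sketch could look like this (the Subject class here is my own flattened stand-in for your model; the methods are the ones named above from SQLite.cs):

using System.Linq;
using SQLite;

public class Subject
{
    [PrimaryKey, AutoIncrement]
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class Demo
{
    public static void Run(string dbPath)
    {
        var db = new SQLiteConnection(dbPath);
        db.CreateTable<Subject>();                                  // creates the table if it doesn't exist

        // InsertAll takes any IEnumerable and inserts each item in a transaction.
        db.InsertAll(new[]
        {
            new Subject { Name = "Physics" },
            new Subject { Name = "Maths" }
        });

        var byKey = db.Get<Subject>(1);                             // fetch by primary key
        var all = db.Table<Subject>().ToList();                     // strongly typed table query
        var physics = db.Query<Subject>(
            "select * from Subject where Name = ?", "Physics");     // raw SQL, typed results
    }
}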
I've worked quite a bit with SQLite-net in the past few months (including this presentation yesterday)
how collections, and custom objects will be inserted into the database
I think the answer is they won't.
While it is a very capable database and ORM, SQLite-net is targeting lightweight mobile apps. Because of this lightweight focus, the classes used are generally very simple flattened objects like:
public class Course
{
    public int CourseId { get; set; }
    public string Name { get; set; }
}

public class Lesson
{
    public int LessonId { get; set; }
    public string Name { get; set; }
    public int CourseId { get; set; }
}
If you then need to join these back together, and to handle insertion and deletion of related objects, then that's down to you - the app developer - to handle. There's no auto-tracking of related objects like there is in a larger, more complicated ORM stack.
In practice, I've not found this a problem. I find SQLite-net very useful in my mobile apps.
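For example, a sketch of how you might wire the related objects together yourself with the Course and Lesson classes above (the method names are mine; db is an open SQLiteConnection):

using System.Collections.Generic;
using System.Linq;
using SQLite;

public static class CourseQueries
{
    // Inserting a related object: you set the foreign key value yourself.
    public static void AddLesson(SQLiteConnection db, Course course, string lessonName)
    {
        db.Insert(new Lesson { Name = lessonName, CourseId = course.CourseId });
    }

    // "Joining" back together: a second query composed in code rather than by the ORM.
    public static List<Lesson> GetLessons(SQLiteConnection db, int courseId)
    {
        return db.Table<Lesson>()
                 .Where(l => l.CourseId == courseId)
                 .ToList();
    }
}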
I'm really looking for advice on best practices here, so I will explain the situation. We have a fairly large application built on top of POCOs and EF 4, with a complicated database. While we have been happy with Entity Framework, there are definite performance improvements to be made, for example in the following (quite simplified) scenario.
We have a table called News, which has a collection of users that have added it to their favourites and a collection of ratings (1 - 5) by users, for example:
public class News
{
    public virtual int NewsId { get; set; }
    public virtual string Title { get; set; }
    // .......etc....
    public virtual ICollection<User> UserFavourites { get; set; }
    public virtual ICollection<Rating> Ratings { get; set; }
}
We have written a stored procedure that returns news for a user, along with whether it is a favourite, whether it has already been rated by the user we are requesting the data for, and its current rating, rather than using EF to build this data from the ICollections. We end up with an object like the one below.
public class NewsDataModel
{
    public int NewsId { get; set; }
    public string Title { get; set; }
    // .......etc....
    public bool IsFavourite { get; set; }
    public bool IsRated { get; set; }
    public double Rating { get; set; }
}
The stored procedure is much faster and is a single database hit, rather than the multiple calls EF with lazy loading could make, but the data returned by the sproc does not match the News POCO class above.
We have been trying to work out the best way to move forward with this, as we have an INewsRepository which could either return the Entity Framework entity class or the custom data model class we populate with a stored procedure and ADO.NET. This doesn't feel right, and I would appreciate any advice or insight from others' experience about the best way to handle scenarios where you want a single object with data built from multiple tables, and where a sproc would be a lot faster than an Entity Framework call with lazy loading enabled.
Many thanks for any help
There is nothing wrong with a new method on your repository returning instances of NewsDataModel - it is still in the scope of your INewsRepository because it is a data class constructed from news information. Otherwise you would end up with a repository for every data model you define.
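As an illustration only (the method and stored procedure names are invented, and I'm assuming the DbContext API, where Database.SqlQuery maps result columns onto NewsDataModel properties by name; with a plain ObjectContext you would use ExecuteStoreQuery instead), the repository could expose both shapes side by side:

using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;

public interface INewsRepository
{
    News GetNews(int newsId);                            // full EF entity with navigation properties
    IList<NewsDataModel> GetNewsForUser(int userId);     // flattened read model built by the sproc
}

public class NewsRepository : INewsRepository
{
    private readonly DbContext _context;

    public NewsRepository(DbContext context)
    {
        _context = context;
    }

    public News GetNews(int newsId)
    {
        return _context.Set<News>().Find(newsId);
    }

    public IList<NewsDataModel> GetNewsForUser(int userId)
    {
        // Single round trip; the sproc's column names must match the NewsDataModel property names.
        return _context.Database
                       .SqlQuery<NewsDataModel>("EXEC GetNewsForUser @p0", userId)
                       .ToList();
    }
}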