Save/load properties to a file or database provider - C#

I need to save and load the properties of a class dynamically, but what is the best practice for this?
For now I have two classes that I need to save.
public abstract class BaseComponent {
    protected int ComponentID { get; set; }
    protected string ComponentName { get; set; }

    // Uses reflection to collect the property names and values of the concrete type
    // (requires System.Linq and System.Reflection).
    protected Dictionary<string, object> GetAllProperties() {
        return GetType()
            .GetProperties(BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic)
            .ToDictionary(p => p.Name, p => p.GetValue(this, null));
    }
}

public class Article : BaseComponent {
    protected string Title { get; set; }
    protected string Content { get; set; }
}
Here is what I'm thinking for the tables:
Table: Component: ComponentID, Parent -> and more
Table: ComponentProperties: ComponentID, Key, Value -> and more
I want to use as much of the .NET Framework as possible, but still keep it simple.
I'm thinking of using a provider, because I need to be able to write different data providers that can save to an XML file, a SQL Server database, an Oracle database, you name it.
Do I use a provider, or something else?

The right answer for your persistence will depend very much on the scale of what you're writing. Storing XML in files or a database certainly works, as does a metadata-driven generic schema (as you described), but making them work for large-scale data can be problematic. Salesforce.com did a recent presentation on how they handle dynamic entities in a multi-tenant architecture (see here), but I doubt you need anything of that scale.
My recommendation would be to store as XML to start. The key issues to solve with the generic schema are joins between logical entities and the trouble of pulling together data that belongs to a single entity when it's split out into key/value pairs in a table.
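If you go the XML route first, something like the following can work. This is only a minimal sketch, assuming your component classes (e.g. Article) are public and XML-serializable; the XmlComponentStore name is made up for the example.

using System.IO;
using System.Xml.Serialization;

public static class XmlComponentStore
{
    public static void Save<T>(T component, string path)
    {
        var serializer = new XmlSerializer(typeof(T));
        using (var stream = File.Create(path))
            serializer.Serialize(stream, component);
    }

    public static T Load<T>(string path)
    {
        var serializer = new XmlSerializer(typeof(T));
        using (var stream = File.OpenRead(path))
            return (T)serializer.Deserialize(stream);
    }
}

Keep in mind XmlSerializer only serializes public read/write properties, so the protected properties in your current classes would need to be exposed for this approach.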

The .NET Framework 2.0 and later has a class that helps with exactly this task: SettingsBase, which you can find in the System.Configuration namespace. Create a class, for instance BaseComponentSettings, and derive it from System.Configuration.SettingsBase. SettingsBase follows the provider model, so you can support many different data sources.
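As a rough illustration (not part of the original answer): a sketch using ApplicationSettingsBase, which derives from SettingsBase and is usually the more convenient entry point. The class and property names are invented; persistence goes through whichever SettingsProvider is configured (the default LocalFileSettingsProvider writes a local config file, and a custom provider could target SQL Server, Oracle, or an XML file).

using System.Configuration;

public class BaseComponentSettings : ApplicationSettingsBase
{
    [UserScopedSetting]
    [DefaultSettingValue("0")]
    public int ComponentID
    {
        get { return (int)this["ComponentID"]; }
        set { this["ComponentID"] = value; }
    }

    [UserScopedSetting]
    [DefaultSettingValue("")]
    public string ComponentName
    {
        get { return (string)this["ComponentName"]; }
        set { this["ComponentName"] = value; }
    }
}

// Usage:
// var settings = new BaseComponentSettings();
// settings.ComponentName = "Article";
// settings.Save();   // persisted by the configured SettingsProvider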

Related

How to use document linking with RethinkDB?

I'm using RethinkDB with C# via RethinkDb.Driver (https://github.com/bchavez/RethinkDb.Driver).
I know that other document databases have a feature for creating references from one document to another. In MongoDB it is ObjectID; in LiteDB it is the BsonRef attribute or the DbRef function.
LiteDB example:
public class Order
{
    public int OrderId { get; set; }

    [BsonRef("customers")] // where "customers" is the Customer collection name
    public Customer Customer { get; set; }
}
The question is: how can I declare a reference from an object to another table?
I read this article, https://rethinkdb.com/docs/data-modeling/#linking-documents-in-multiple-tables, but there are no examples of how to insert documents with references.
The C# driver doesn't expose annotations like that for your classes. RethinkDB just stores JSON documents, which you can then query with relations. Consider writing a small ORM layer of your own, or managing the reference properties on your classes by hand.
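For illustration only (my sketch, not part of the original answer): the "linking documents" style from the RethinkDB docs boils down to storing the other document's id yourself. The Order/Customer shapes, the "test" database, and the table names are assumptions.

using System;
using RethinkDb.Driver;
using RethinkDb.Driver.Net;

public class Customer
{
    public string Id { get; set; }
    public string Name { get; set; }
}

public class Order
{
    public string Id { get; set; }
    public string CustomerId { get; set; }   // the "reference" is just an id field you maintain yourself
}

public static class OrderRepository
{
    private static readonly RethinkDB R = RethinkDB.R;

    public static void InsertLinked(IConnection conn, Customer customer, Order order)
    {
        // Assign the key on the client so it can be copied into the referencing document.
        customer.Id = customer.Id ?? Guid.NewGuid().ToString();
        R.Db("test").Table("customers").Insert(customer).Run(conn);

        // "Linking" is manual: store the customer's id on the order, then insert the order.
        order.CustomerId = customer.Id;
        R.Db("test").Table("orders").Insert(order).Run(conn);
    }
}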

SqliteNetExtensions: is it possible to ignore a specified child when calling InsertOrReplaceWithChildren()?

I'm building a mobile app in Xamarin that has a lot of relationships between classes. For simplicity, consider it to be a "University" app that has the classes ExamPapers, Students, and ExamAnswers.
The ExamPaper class would look like this:
public class ExamPapers {
    [ManyToMany(typeof(Student_ExamPaper))]
    public List<Student> Students { get; set; }

    [OneToMany]
    public List<ExamAnswers> Files { get; set; }

    [OneToMany(CascadeOperations = CascadeOperation.All)]
    public List<ExamSection> Sections { get; set; }

    public string ExamName { get; set; }

    [PrimaryKey]
    public string Id { get; set; }
}
So at the moment any SQLite-Net Extensions operation (the ones that end with "WithChildren") will interact with all the relationships, when I often just want to interact with one at a time. Here are some example scenarios:
A student "Lana" is just granted access to an ExamPaper "Mathematics 101". Now, to grant her access, I need to load all the other students with access, all the files for the exam, and all the sections. Otherwise these relationships get deleted when I call "InsertOrReplaceWithChildren".
I want to find out the first question in an ExamPaper. I call "GetWithChildren(examId)". I now have an object with a lot of information I don't want (e.g. 300 students and 300 ExamAnswers).
Sorry if I missed something relevant in the documentation, but I've read it through a couple times now. https://bitbucket.org/twincoders/sqlite-net-extensions
Thanks.
The answer to the question in your title: no, you cannot.
SQLite-Net Extensions does not provide such a flexible API for manipulating related data.
But there is one helpful thing that can be used in specific cases:
You can work with junction tables as plain tables through the SQLite-Net methods (those without the *WithChildren suffix), as long as the junction table has a primary key (Id).
For example, if you want to get some data without its additional (related) data, simply call the Table<T>() method on the specific table with a Where(...) clause so that you only get the data you really need. Then you can save the modified data through the Update method.
Unfortunately, this will not work for updating relationships (for example, moving one ExamAnswer from one ExamPaper to another), because all SQLite-Net Extensions relationship attributes inherit from the SQLite-Net Ignore attribute, so those properties are ignored by all plain SQLite-Net operations.
But there is another workaround (a little hacky): you can declare a second class, mapped to the same table but without any SQLite-Net Extensions attributes, and that lets you CRUD any field in that table. A sketch follows below.
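For illustration (my own sketch, not the answerer's code), assuming the Student_ExamPaper junction table and its column names look roughly like this:

using SQLite;

// Plain sqlite-net class mapped onto the junction table; no SQLite-Net Extensions
// attributes, so every column can be read and written directly.
[Table("Student_ExamPaper")]
public class StudentExamPaperLink
{
    [PrimaryKey, AutoIncrement]
    public int Id { get; set; }

    public string StudentId { get; set; }
    public string ExamPaperId { get; set; }
}

// Grant "Lana" access to "Mathematics 101" without loading or touching the
// paper's files, sections, or the other students:
// db.Insert(new StudentExamPaperLink { StudentId = lana.Id, ExamPaperId = mathsExam.Id });
//
// Read just the links you need:
// var links = db.Table<StudentExamPaperLink>()
//               .Where(l => l.ExamPaperId == mathsExam.Id)
//               .ToList();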

What is a proper way of writing entity POCO classes in Entity Framework Core?

EF Core has a "code first mentality" by default, i.e. it is supposed to be used in a code-first manner, and even though database-first approach is supported, it is described as nothing more than reverse-engineering the existing database and creating code-first representation of it. What I mean is, the model (POCO classes) created in code "by hand" (code-first), and generated from the database (by Scaffold-DbContext command), should be identical.
Surprisingly, official EF Core docs demonstrate significant differences. Here is an example of creating the model in code: https://ef.readthedocs.io/en/latest/platforms/aspnetcore/new-db.html And here is the example of reverse-engineering it from existing database: https://ef.readthedocs.io/en/latest/platforms/aspnetcore/existing-db.html
This is the entity class in first case:
public class Blog
{
    public int BlogId { get; set; }
    public string Url { get; set; }
    public List<Post> Posts { get; set; }
}

public class Post
{
    public int PostId { get; set; }
    public string Title { get; set; }
    public string Content { get; set; }

    public int BlogId { get; set; }
    public Blog Blog { get; set; }
}
and this is the entity class in second case:
public partial class Blog
{
    public Blog()
    {
        Post = new HashSet<Post>();
    }

    public int BlogId { get; set; }
    public string Url { get; set; }
    public virtual ICollection<Post> Post { get; set; }
}
The first example is a very simple, quite obvious POCO class. It is shown everywhere in the documentation (except for the examples generated from database). The second example though, has some additions:
The class is declared partial (even though another partial definition of it is nowhere to be seen).
The navigation property is of type ICollection<T>, instead of just List<T>.
The navigation property is initialized to new HashSet<T>() in the constructor. There is no such initialization in the code-first example.
The navigation property is declared virtual.
The DbSet members in the generated context class are also virtual.
I've tried scaffolding the model from database (latest tooling as of this writing) and it generates entities exactly as shown, so this is not an outdated documentation issue. So the official tooling generates different code, and the official documentation suggests writing different (trivial) code - without partial class, virtual members, construction initialization, etc.
My question is, trying to build the model in code, how should I write my code? I like using ICollection instead of List because it is more generic, but other than that, I'm not sure whether I need to follow docs, or MS tools? Do I need to declare them as virtual? Do I need to initialize them in a constructor? etc...
I know from the old EF times that virtual navigation properties allow lazy loading, but it is not even supported (yet) in EF Core, and I don't know of any other uses. Maybe it affects performance? Maybe tools try to generate future-proof code, so that when lazy-loading will be implemented, the POCO classes and context will be able to support it? If so, can I ditch them as I don't need lazy loading (all data querying is encapsulated in a repo)?
In short, please help me understand why the difference exists, and which style I should use when building the model in code.
I'll try to give a short answer to each point you mentioned:
Partial classes are especially useful for tool-generated code. Suppose you want to implement a model-only derived property. For code first, you would just do it, wherever you want. For database first, the class file will be rewritten whenever you update your model, so if you want to keep your extension code, you have to place it in a different file outside the managed model; this is where partial helps you extend the class without tweaking the auto-generated code by hand.
ICollection is definitely a suitable choice, even for code first. Your database probably won't support a defined order anyway without a sorting statement.
Constructor initialization is a convenience, at least: suppose you have either an empty collection database-wise, or you didn't load the property at all. Without the constructor you have to handle null cases explicitly at arbitrary points in code. Whether you should go with List or HashSet is something I can't answer right now.
virtual enables proxy creation for the database entities, which can help with two things: lazy loading, as you already mentioned, and change tracking. A proxy object can track changes to virtual properties immediately in the setter, while normal objects in the context need to be inspected on SaveChanges. In some cases this might be more efficient (not generally).
virtual DbSet properties on the context allow easier creation of mock contexts for unit tests. Other use cases might also exist.
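To make those points concrete, here is a rough sketch (my illustration, not from the docs) of a hand-written entity that follows the scaffolded style, using the Blog/Post model from the question; the second partial file and the DisplayName property are invented for the example.

using System.Collections.Generic;

// Blog.cs - matches the scaffolded style.
public partial class Blog
{
    public Blog()
    {
        Posts = new HashSet<Post>();   // never null, even when nothing has been loaded
    }

    public int BlogId { get; set; }
    public string Url { get; set; }

    // virtual keeps the door open for proxy-based lazy loading / change tracking.
    public virtual ICollection<Post> Posts { get; set; }
}

// Blog.Extensions.cs - survives re-scaffolding because it lives in its own file.
public partial class Blog
{
    public string DisplayName => $"Blog {BlogId}: {Url}";
}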

How to insert an ObservableCollection property into a local sqlite-net db?

I have a quick question about the sqlite-net library, which can be found here: https://github.com/praeclarum/sqlite-net.
The thing is, I have no idea how collections and custom objects will be inserted into the database, and how I convert them back when querying, if needed.
Take this model for example:
public class Subject // class name implied by the comments below
{
    [PrimaryKey, AutoIncrement]
    public int Id { get; set; }

    private string _name;                          // the name of the subject, e.g. "Physics"
    private ObservableCollection<Lesson> _lessons; // the lessons belonging to this subject
}
Preface: I've not used sqlite-net; rather, I spent some time simply reviewing the source code on the github link posted in the question.
From the first page on the sqlite-net github site, there are two bullet points that should help in some high level understanding:
Very simple methods for executing CRUD operations and queries safely (using parameters) and for retrieving the results of those queries in a strongly typed fashion
In other words, sqlite-net will work well with non-complex models; will probably work best with flattened models.
Works with your data model without forcing you to change your classes. (Contains a small reflection-driven ORM layer.)
In other words, sqlite-net will transform/map the result set of the SQL query to your model; again, will probably work best with flattened models.
Looking at the primary source code of SQLite.cs, there is an InsertAll method and a few overloads that will insert a collection.
When querying for data, you should be able to use the Get<T> method and the Table<T> method, and there is also a Query<T> method you could take a look at. Each should map the results to the type parameter.
Finally, take a look at the examples and tests for a more in-depth look at using the framework.
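As a rough sketch of what that looks like in practice (my example, not the answerer's), assuming a flattened Lesson class that carries its own SubjectId column instead of living in an ObservableCollection on the parent:

using System.Collections.Generic;
using System.Linq;
using SQLite;

public class Lesson
{
    [PrimaryKey, AutoIncrement]
    public int Id { get; set; }
    public string Name { get; set; }
    public int SubjectId { get; set; }   // flattened reference back to the owning subject
}

public static class LessonStore
{
    public static void SaveAll(SQLiteConnection db, IEnumerable<Lesson> lessons)
    {
        db.CreateTable<Lesson>();
        db.InsertAll(lessons);           // inserts the whole collection in one transaction
    }

    public static List<Lesson> LoadForSubject(SQLiteConnection db, int subjectId)
    {
        // Table<T>() returns a strongly typed, LINQ-queryable view of the table.
        return db.Table<Lesson>()
                 .Where(l => l.SubjectId == subjectId)
                 .ToList();
    }
}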
I've worked quite a bit with SQLite-net in the past few months (including this presentation yesterday)
how collections, and custom objects will be inserted into the database
I think the answer is they won't.
While it is a very capable database and ORM, SQLite-net is targeting lightweight mobile apps. Because of this lightweight focus, the classes used are generally very simple flattened objects like:
public class Course
{
    public int CourseId { get; set; }
    public string Name { get; set; }
}

public class Lesson
{
    public int LessonId { get; set; }
    public string Name { get; set; }
    public int CourseId { get; set; }
}
If you then need to join these back together, and to handle the insertion and deletion of related objects, that's down to you, the app developer. There's no auto-tracking of related objects like there is in a larger, more complicated ORM stack.
In practice, I've not found this a problem. I find SQLite-net very useful in my mobile apps.
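For example (my sketch, not the answerer's), a manual "join" over the two flattened classes above could look like this, assuming db is an open SQLiteConnection with the data already inserted and using SQLite plus using System.Linq in scope:

// Find the course first, then pull only the lessons that reference it.
var course = db.Table<Course>()
               .Where(c => c.Name == "Physics")
               .FirstOrDefault();

var lessons = db.Table<Lesson>()
                .Where(l => l.CourseId == course.CourseId)
                .ToList();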

NHibernate: Lazy Loading Properties

So, according to Ayende, lazy-loaded properties are already in the NHibernate trunk.
My problem is: I can't use the trunk, because I use Fluent NHibernate and LINQ to NHibernate, so I depend on the version they are linked against (version 2.x). I can't, and don't want to, build all the assemblies myself against the newest version of NHibernate.
So, has anyone got information about when NHibernate 3.0 will leave beta and the auxiliaries (LINQ etc.) will be compiled against it?
I'd appreciate any estimate!
I need this feature so I can use it on blob fields. I don't want to use workarounds that mess up my object model.
You can compile Fluent with the NH 3.0 binaries, and you don't need L2NH anymore; there's a new integrated provider.
Alternatively, it isn't much of a model change. Make a new class, Blob, that has Id, Version, and Bytes properties, and a new table to match. Add the new class as a protected property to each of your classes that currently has a blob, and use it like a backing store. Change your mapping to map the underlying property instead of the public one.
// Backing entity that owns the binary data in its own table.
public class Blob
{
    public virtual int Id { get; set; }
    public virtual int Version { get; set; }
    public virtual byte[] Bytes { get; set; }
}

public class MyClass
{
    public MyClass()
    {
        MyBlobProperty_Blob = new Blob();
    }

    public virtual byte[] MyBlobProperty
    {
        get { return MyBlobProperty_Blob.Bytes; }
    }

    protected virtual Blob MyBlobProperty_Blob { get; private set; }
}
It is a significant schema change however. This particular solution moves all your binary data into one table.
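Since the question mentions Fluent NHibernate, the "map the underlying property instead of the public one" step might look roughly like this. This is my sketch, not the answerer's: the Reveal-based mapping of the protected MyBlobProperty_Blob property, the Id property on MyClass, and the column settings are all assumptions.

using FluentNHibernate;
using FluentNHibernate.Mapping;

public class MyClassMap : ClassMap<MyClass>
{
    public MyClassMap()
    {
        Id(x => x.Id);   // assumes MyClass has an Id property, not shown in the answer

        // Map the protected backing property rather than the public read-only one.
        References<Blob>(Reveal.Member<MyClass>("MyBlobProperty_Blob"))
            .Cascade.All()
            .LazyLoad();
    }
}

public class BlobMap : ClassMap<Blob>
{
    public BlobMap()
    {
        Id(x => x.Id);
        Version(x => x.Version);
        Map(x => x.Bytes).Length(int.MaxValue);
    }
}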
