C# LINQ to SQL - selecting tables dynamically

I have the following scenario: there is a database that generates a new log table every year. It started in 2001 and now has 11 tables. They all have the same structure, thus the same fields, indexes, PKs, etc.
I have some classes called managers that, as the name says, manage every operation on this DB. For each different table I have a manager, except for these log tables, for which I have only one manager.
I've read a lot and tried different things, like using ITable to get tables dynamically or an interface that all my tables implement. Unfortunately, I lose the strongly-typed properties, and with that I can't do any searches or updates or anything, since I can't use logTable.Where(q => q.ID == paramId).
Considering that those tables have the same structure, a query that searches logs from 2010 could be exactly the one that searches logs from 2011 and on.
I'm only asking this because I wouldn't like to rewrite the same code for each table, since they are identical in structure.
EDIT
I'm using LINQ to SQL as my ORM, and these tables use all DB operations, not just SELECT.

Consider putting all your logs in one table and using partitioning to maintain performance. If that is not feasible, you could create a view that unions all the log tables together and use that when selecting log data. That way, when you add a new log table, you just update the view to include the new table.
EDIT Further to the most recent comment:
Sounds like you need a new DBA if he won't let you create new SPs. Yes, I think you could define an ILogTable interface and then make your log table classes implement it, but that would not allow you to do GetTable<ILogTable>(). You would have to have some kind of DAL class with a method that creates a union query, e.g.
public IEnumerable<ILogTable> GetLogs()
{
    // The generated table properties can't start with a digit, so names like
    // Logs2010/Logs2011 are assumed here.
    var logs2010 = from log in DBContext.Logs2010
                   select (ILogTable)log;
    var logs2011 = from log in DBContext.Logs2011
                   select (ILogTable)log;
    return logs2010.Concat(logs2011);
}
Above code is completely untested and may fail horribly ;-)
Edited to keep @AS-CII happy ;-)
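For this to compile, all of the generated log entity classes would need to implement the shared interface. A minimal sketch of what ILogTable might look like, assuming the yearly tables share columns such as ID, LogDate and Message (these member names are hypothetical):
using System;

public interface ILogTable
{
    int ID { get; set; }
    DateTime LogDate { get; set; }
    string Message { get; set; }
}

// The LINQ to SQL designer generates partial classes, so the interface can be
// attached in a separate file without touching the generated code:
public partial class Logs2010 : ILogTable { }
public partial class Logs2011 : ILogTable { }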

You might want to look into the CodePlex Fluent LINQ to SQL project. I've never used it, but I'm familiar with the ideas from using similar mapping techniques in EF4. You could create a single object and map it dynamically to different tables using syntax such as:
public class LogMapping : Mapping<Log> {
    public LogMapping(int year) {
        Named("Logs" + year);
        // Column mappings...
    }
}

As long as each of your queries returns the same shape, you can use ExecuteQuery<Log>("SELECT cols FROM LogTable" + instance). Just be aware that ExecuteQuery is one case where LINQ to SQL allows for SQL injection. I discuss how to parameterize ExecuteQuery at http://www.thinqlinq.com/Post.aspx/Title/Does-LINQ-to-SQL-eliminate-the-possibility-of-SQL-Injection.
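For example, here is a minimal sketch of a parameterized call (the column names are hypothetical, and paramId is the search value from your code). The year is assumed to come from the known list of log years, while the value is passed via the {0} placeholder so LINQ to SQL sends it as a SqlParameter instead of concatenating it into the SQL text:
int year = 2010;                  // assumed to be validated against the known log years
string table = "Logs" + year;     // the table name itself cannot be a query parameter
IEnumerable<Log> logs = dataContext.ExecuteQuery<Log>(
    "SELECT ID, LogDate, Message FROM " + table + " WHERE ID = {0}",
    paramId);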

Related

Manipulating large quantities of data in ASP.NET MVC 5

I am currently working towards implementing a charting library with a database that contains a large amount of data. For the table I am using, the raw data is spread across 148 columns, with over 1000 rows. As I have only created models for tables that contain a few columns, I am unsure how to go about implementing a model for this particular table. My usual method of creating a model and using Entity Framework to connect it to a database doesn't seem practical, as implementing 148 properties for each column does not seem like an efficient approach.
My questions are:
What would be a good method to implement this table into an MVC project so that there are read actions that allow one to pull the data from the table?
How would one structure a model so that one could read 148 columns of data from it without having to declare 148 properties?
Is the Entity Framework an efficient way of achieving this goal?
Entity Framework Database First sounds like the perfect solution for your problem.
Database First models are just what they sound like: the data exists before the code does. Entity Framework will create the models as partial classes for you, based on the tables you point it to.
Additionally, exceptions won't be thrown if the table changes (as long as nothing is accessing a field that doesn't exist), which can be extremely beneficial in a lot of cases. Migrations are not necessary. Instead, all you have to do is right-click on the generated model and click "Update Model from Database", and it works like magic. The whole process can be significantly faster than Code First.
Here is another tutorial to help you.
Yes, with Database First you can create the entities very quickly. Also remember that it is good practice to return only the fields you really need: your entity has 148 columns, but if your app needs only 10 fields, convert the original entity to a model or view model and use that!
One excellent tool that can help you with this is AutoMapper.
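As a rough illustration (the entity and view model class names here are hypothetical), the mapping can be configured once with AutoMapper's MapperConfiguration and then reused wherever you load the entity:
// Configure once at application startup; properties with matching names map automatically.
var config = new MapperConfiguration(cfg => cfg.CreateMap<ChartEntity, ChartViewModel>());
var mapper = config.CreateMapper();

// Later, after loading the full 148-column entity, hand the UI only the fields it needs.
ChartViewModel viewModel = mapper.Map<ChartViewModel>(entity);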
Regards,
Wow, that's a lot of columns!
Given your circumstances, a few thoughts come to mind:
1: If your problem is the legwork of creating that many properties, you could look at Entity Framework Power Tools. EF Power Tools can reverse-engineer a database and create the necessary models/entity relation mappings for you, saving you a lot of the grunt work.
To save you pulling all of that data out in one go, you can then use projections like so:
var result = DbContext.ChartingData.Select(x => new PartialDto {
    Property1 = x.Column1,
    Property50 = x.Column50,
    Property109 = x.Column109
});
A tool like AutoMapper will allow you to do this with ease via simple, configurable mapping profiles:
var result = DbContext.ChartingData.Project().To<PartialDto>().ToList();
2: If you have concerns about the performance of manipulating such large entities through Entity Framework, then you could also look at using something like Dapper (which will happily work alongside Entity Framework).
This would save you the hassle of modelling the entities for the larger tables, but still allow you to easily query/update specific columns:
public class ModelledDataColumns
{
    public string Property1 { get; set; }
    public string Property50 { get; set; }
    public string Property109 { get; set; }
}
const string sqlCommand = "SELECT Property1, Property50, Property109 FROM YourTable WHERE Id = @Id";
IEnumerable<ModelledDataColumns> collection = connection.Query<ModelledDataColumns>(sqlCommand, new { Id = 5 }).ToList();
Ultimately, if you're keen to go the Entity Framework route, then as far as I'm aware there's no way to pull that data from the database without creating all of the properties one way or another.

Making generic getters and setters

I have a problem where I have to get the column names and their values from all the tables in my schema and show the result in a grid.
I have used the direct approach for this, but now I have to implement the SqlSiphon structure. For that I would have to write getters and setters for each column of each table in the schema, which is impractical.
What should I use to get the column names and their values dynamically from a table?
SELECT * FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = @TableName AND TABLE_SCHEMA = 'dbo'
What will be the best dynamic solution?
And what would be best to use: a List, a Dictionary, or something like a 2D array that gives the column names as well as the column values?
A few suggestions:
I'm not completely clear on what you're trying to achieve, but consider using an ORM (LINQ to SQL, EF, NHibernate).
In .NET, a suitable type to represent a database table would be a DataTable (see the sketch below).
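For instance, here is a rough sketch that uses plain ADO.NET to pull any table's column names and values into a DataTable (the connection string and table name are assumed to come from elsewhere, and the table name should be validated against INFORMATION_SCHEMA before being concatenated into the query):
using System.Data;
using System.Data.SqlClient;

public static DataTable LoadTable(string connectionString, string tableName)
{
    using (var connection = new SqlConnection(connectionString))
    using (var adapter = new SqlDataAdapter("SELECT * FROM [" + tableName + "]", connection))
    {
        var table = new DataTable();
        adapter.Fill(table);   // column names end up in table.Columns, values in table.Rows
        return table;
    }
}
The resulting DataTable can then be bound directly to a grid control.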
Edit: After a few more re-reads I think I understand what you're asking - you already have a database/schema and you want to automatically create the entity classes needed by SqlSiphon. This is called "database-first" (as opposed to model-first). However, from a brief scan of the SqlSiphon documentation it appears it does not support database-first. Is this why you are trying to put the columns into a grid - to make it easier to manually create the entity classes for SqlSiphon?
Edit2: Note that trying to use an ORM on top of a database whose schema is frequently modified will be problematic. My next guess is that you're trying to figure out how to create an entity class in SqlSiphon which you can use to retrieve database schema information like table columns? I'm still struggling to understand what you're actually asking here - perhaps you can update your question?
Edit3: I think the answer to your question is take a different approach to your design - ORM's like SqlSiphon are not intended to be used to retrieve and modify the database schema itself.
It might be worth taking a step back and comparing against how other people solve similar problems.
Typically, each table in a database represents an entity, and you also have a class per entity, and you may use an ORM system to avoid duplication of work. So, in a typical system, you have a table for customers, a table for invoices, a table for invoice lines, etc., and then a class that represents a customer, a class for an invoice, a class for an invoice line, etc. As you later add functionality (and possibly columns/properties) you change the classes, rather than just seeing what columns are on the database - you can of course decorate these with XML documentation and get IntelliSense goodness.
There are many ORM systems out there, and each have their strengths and weaknesses, but I personally like LINQ to SQL for adding onto an existing data model.

Selecting different tables in NHibernate

I need to execute a query that will return N tables. In my program, I have the following tables (some of them):
TABLES:
HM_RECEIVE;
HM_SEND;
SM_RECEIVE;
SM_SEND;
P_SLAB;
P_SLAB_PDO;
...
Entities:
HMreceive;
HMsend;
SMreceive;
SMsend;
PSlab;
PSlabPDO;
...
I have two questions, and it would be nicer if I could accomplish this using only one criteria:
How do I select all tables? At first I believe there's no need to match IDs, but if that changes anything I really would like to know;
Is there a way to select only the HM and SM tables, ignoring all others?
Thanks in advance.
Edit:
How do I do something like: select * from HMReceive, SMReceive, HMSend, SMSend?
If the tables have similar layout, you could use inheritance.
Define the classes similar to:
public class HMReceive : BaseClass
public class HMSend : BaseClass
public class SMReceive : BaseClass
public class SMSend : BaseClass
and you can use HQL such as
from BaseClass
or use criteria to query against BaseClass. The result will be an IList, but each entity returned will be of its actual type: HMReceive, HMSend, SMReceive, or SMSend.
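A minimal sketch of both query styles, assuming BaseClass and the HM/SM subclasses have been mapped with one of NHibernate's inheritance strategies (entity names as above):
// HQL polymorphic query: returns instances of every mapped subclass of BaseClass.
IList<BaseClass> all = session.CreateQuery("from BaseClass").List<BaseClass>();

// Criteria equivalent; target a specific subclass instead to restrict the result.
IList<BaseClass> allViaCriteria = session.CreateCriteria<BaseClass>().List<BaseClass>();
IList<BaseClass> hmReceiveOnly = session.CreateCriteria<HMReceive>().List<BaseClass>();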
I'm not very sure about your requirement, but using NHibernate, there are several ways to fetch multiple tables in a single query:
Use eager fetching if your tables are connected with each other. The following query will fetch child records along with their Parent:
session.QueryOver<Child>().Fetch(child => child.Parent).Eager.List();
If the tables are not connected with each other and your database is Oracle, then you are out of luck. But with MS SQL Server, you can use Future() to make multiple queries go to the database at once:
// Future() returns a lazy enumerable, not actually queries the database.
var childs = session.QueryOver<Child>().Future();
// NHibernate will populate the lazy enumerable once it is enumerated,
// or when it has to hit the database anyway, such as when a call to List() happens:
var parents = session.QueryOver<Parent>().List();
Hope this helps.
It sounds like you are asking for a way to fetch pretty much all of the data in the database. You really should just fetch only the data that you need. That said, sometimes it is useful to be able to fetch data from tables without having to explicitly name the tables that you want to fetch from. For example, you may want to write a unit test that simply verifies that the NHibernate mappings correctly match your database schema. Ayende has a blog post illustrating how to write such a test. This is a slightly modified version of that test:
[Test]
public void SchemaShouldMatchMappings()
{
    // `GetAllClassMetadata` returns a collection of all of the mapped entities.
    foreach (var entry in _sessionFactory.GetAllClassMetadata())
    {
        // Build a query that fetches this entity...
        _session.CreateCriteria(entry.Value.EntityName)
            // ... but tell it to just check the schema and not actually bring any back.
            .SetMaxResults(0)
            // Execute the query.
            .List();
    }
}
I think you can use a loop similar to the above as a starting point for what you are trying to accomplish. I'm not going to actually post a code example that fetches all of the data in the database because I don't want to encourage people to do bad things - but this should be enough to get you started.

Creating data access to possibly changing schemas

The product I'm working on will need to support different database types. At first it needs to support SQL Server and Oracle, but in the future it may need to support IBM DB2 and PostgreSQL.
And the product will need to work for different customers who might have slightly different schemas. For example, a column name for one client on SQL Server might be _ID, and for another client on Oracle it could be I_ID.
The general schema will be the same except for the column names. They could all potentially be mapped to the same object. But there may be some extra columns that are specific to each customer. These do not need to be mapped to an object, though; they can be retrieved in a master-detail scenario in a simpler way.
I wanted to use an ORM, as we will need to support different types of database providers. But as far as I can understand, ORMs are not good at creating mappings at runtime.
To summarize the requirements:
Column names may be different for each customer, but they are pretty much the same columns except for the names.
Database provider may be different for each customer.
There may be extra columns for each customer.
Edit: The program should be able to support a new database by changing the configuration at runtime.
What is a good way to create data access for such specifications? Is there a way to do it with ORMs? Or do I need to write database-specific code to support this scenario? Do I have any other option that would make it easier than using ADO.NET directly?
Edit: I think I wrote my question a bit too broadly and didn't explain it clearly, sorry about that. The problem is I won't be creating the databases. They will already exist, and the program should be able to work with a new database by being configured at runtime. I have no control over the databases.
The other thing is, of course it is possible to do it by creating SQL statements in the program, but that is really cumbersome. All these providers have slightly different rules and different SQL implementations, so it is a lot of work. I was wondering if I could use something like an ORM to make it easier for me.
Edit 2: I am totally aware that this is a stupid way to do things and that it reflects bad design decisions, but I have spent so many hours trying to convince my company not to do it this way. They don't want to change their way of thinking because an intern tells them so. So any help would be appreciated.
Column names may be different for each customer, but they are pretty much the same columns except for the names.
Because of this requirement alone, you're going to have to build your SQL statements dynamically, on your own, but it's really pretty straightforward. I would recommend building a table like this:
CREATE TABLE DataTable (
    ID INT PRIMARY KEY NOT NULL,
    Name SYSNAME NOT NULL
)
to store all of the tables in the database. Then build one like this:
CREATE TABLE DataTableField (
    ID INT PRIMARY KEY NOT NULL,
    DataTableID INT NOT NULL,
    Name SYSNAME NOT NULL
)
to store the base names for the fields. You'll just have to pick a schema and call it the baseline. That's what goes in those two tables. Then you have a table like this:
CREATE TABLE Customer (
    ID INT PRIMARY KEY NOT NULL,
    Name VARCHAR(256) NOT NULL
)
to store all the unique customers you have using the product, and then finally a table like this:
CREATE TABLE CustomerDataTableField (
    ID INT PRIMARY KEY NOT NULL,
    CustomerID INT NOT NULL,
    DataTableFieldID INT NOT NULL,
    Name SYSNAME,
    IsCustom BIT
)
to store the different field names for each customer. We'll discuss the IsCustom column in a minute.
Now you can leverage these tables to build your SQL statements dynamically. In C#, you might cache all this data up front when the application first loads and then use those data structures to build the SQL statements (a rough sketch follows). But get started on that, and if you have specific questions about it, create a new question, add the code you already have, and let us know where you're having trouble.
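As a rough sketch of that idea (the class and method names here are hypothetical), the cached metadata for one customer could be turned into a SELECT statement along these lines:
using System.Collections.Generic;
using System.Linq;

public class TableMap
{
    public string TableName { get; set; }                   // from DataTable
    // Baseline field name -> customer-specific column name, from CustomerDataTableField.
    public Dictionary<string, string> Columns { get; set; }
}

public static class SqlBuilder
{
    public static string BuildSelect(TableMap map)
    {
        // e.g. "SELECT I_ID AS ID, CUST_NAME AS Name FROM Invoices"
        // Note: identifier quoting rules differ per provider (e.g. [..] on SQL Server, ".." on Oracle).
        var columnList = string.Join(", ",
            map.Columns.Select(c => c.Value + " AS " + c.Key));
        return "SELECT " + columnList + " FROM " + map.TableName;
    }
}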
Database provider may be different for each customer.
Here you're going to need to use something like Dapper, because it works with POCO classes (like the ones you'll be building) and it simply extends the IDbConnection interface, so it doesn't matter which concrete class you use (e.g. SqlConnection or OracleConnection); it works the same.
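A small sketch of what that provider independence can look like (the connection factory delegate is a hypothetical detail; Dapper's Query extension works against any IDbConnection):
using System;
using System.Collections.Generic;
using System.Data;
using System.Linq;
using Dapper;

public class DataAccess
{
    // Returns a SqlConnection, OracleConnection, etc., chosen by configuration.
    private readonly Func<IDbConnection> _createConnection;

    public DataAccess(Func<IDbConnection> createConnection)
    {
        _createConnection = createConnection;
    }

    public IEnumerable<dynamic> Query(string sql, object parameters = null)
    {
        using (var connection = _createConnection())
        {
            // Same Dapper call regardless of the underlying provider.
            return connection.Query(sql, parameters).ToList();
        }
    }
}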
There may be extra columns for each customer.
This is actually quite straightforward. Leverage the IsCustom field in the CustomerDataTableField table to add those fields to your dynamically built SQL statements. That solves the database side. Now, to solve the class side, I'm going to recommend you leverage partial classes. So consider a class like this:
public partial class MyTable
{
    public int ID { get; set; }
    public string Field1 { get; set; }
}
and that represents the baseline schema. Now, everything maps into those fields except those marked IsCustom, so we need to do something about those. Well, let's build an extension to this class:
public partial class MyTable
{
    public string Field2 { get; set; }
}
and so now when you build a new MyTable() it will always have these additional fields. But you don't want that for every customer, do you? Well, that's why we use partial classes: you define these partial class extensions in customer-specific source files that are only compiled into the build for the right customer (all parts of a partial class have to end up in the same assembly). Now you have a bunch of small, customer-specific extensions to the system, and they are easily developed, installed, and maintained.

LinqToSQL - no supported translation to SQL

I have been puzzling over a problem this morning with LINQ to SQL. I'll try to summarise with the abbreviated example below to explain my point.
I have two DB tables:
table Parent
{
    ParentId
}
table Child
{
    ChildId
    ParentId [FK]
    Name
    Age
}
These have LINQ to SQL equivalent classes in my project; however, I have written two custom model classes that I want my UI to use instead of the LINQ to SQL classes.
My data access from the front end goes through a service class, which in turn calls a repository class, which queries the data via linq.
At the repository level I return an IQueryable by:
return from data in _data.Children
       select new CustomModel.Child
       {
           ChildId = data.ChildId,
           ParentId = data.ParentId
       };
My service layer then adds an additional query restriction by parent before returning the list of children for that parent.
return _repository.GetAllChildren().Where(c => c.Parent.ParentId == parentId).ToList();
So at this point, I get the "method has no supported translation to SQL" error when I run everything, as the c.Parent property of my custom model cannot be converted. [The c.Parent property is an object reference to the linked parent model class.]
That all makes sense, so my question is this:
Can you provide the querying process with some rules that will convert a predicate expression into the correct piece of SQL to run at the database and therefore not trigger an error?
I haven't done much work with LINQ up to now, so forgive my lack of experience if I haven't explained this well enough.
Also, for those commenting on my choice of architecture: I have changed it to get around this problem, and I am just playing around with ideas at this stage. I'd like to know if there is an answer for future reference.
Many thanks if anyone can help.
Firstly, it begs the question: why is the repository returning the UI types? If the repo returned the database types, this wouldn't be an issue. Consider refactoring so that the repo deals only with the data model and the UI does the translation at the end (after any composition), for example:
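A rough sketch of that shape, reusing the names from the question (the mapping in the service is illustrative, not your exact code):
// Repository: expose the LINQ to SQL entities so the query stays composable.
public IQueryable<Child> GetAllChildren()
{
    return _data.Children;
}

// Service: compose the filter against the entity, then translate to the UI model at the end.
public IList<CustomModel.Child> GetChildrenForParent(int parentId)
{
    return _repository.GetAllChildren()
        .Where(c => c.ParentId == parentId)      // translated to SQL by LINQ to SQL
        .Select(c => new CustomModel.Child
        {
            ChildId = c.ChildId,
            ParentId = c.ParentId
        })
        .ToList();
}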
If you mean "and have it translate down to the database" - then basically, no. Composable queries can only use types defined in the LINQ-to-SQL model, and a handful of supported standard functions. Something similar came up recently on a related question, see here.
For some scenarios (unusual logic, but using the types defined in the LINQ-to-SQL model), you can use UDFs at the database and write the logic yourself (in TSQL) - but only with LINQ-to-SQL (not EF).
If the volume isn't high, you can use LINQ-to-Objects for the last bit. Just add an .AsEnumerable() before the affected Where - this will do this bit of logic back in managed .NET code (but the predicate won't be used in the database query):
return _repository.GetAllChildren().AsEnumerable()
.Where(c => c.Parent.ParentId == parentId).ToList();
