I'm just getting my head around C#. I've been creating classes and objects. So, say I created a class called Member:
public class Member
{
public int MemberID;
public string FirstName;
public string LastName;
public string UserName;
}
and I create a new object of that class by doing this:
Member Billy = new Member();
Billy.UserName = "Jonesy";
Billy.FirstName = "Billy";
Billy.LastName = "Jones";
That's all fine, but what if I've queried a database and gotten back 5 members? Can I create objects on the fly, or what is the best way to store these members in memory?
I've used VB.Net, where I would just add them into a DataTable. But I've never really done any object-oriented programming before, and I thought that since I'm learning C#, now is the best time to learn OOP.
If you don't go with LINQ to SQL (or the Entity Framework) then using a regular ADO.NET DataReader you would loop through the results, instantiate a new object with the details, and add it to a list.
Roughly it would look like this:
List<Member> members = new List<Member>();

using (SqlConnection connection = new SqlConnection(connectionString))
{
    connection.Open();
    using (SqlCommand command = new SqlCommand(queryString, connection))
    {
        using (SqlDataReader reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                Member member = new Member();
                // Ordinals 0, 1, 2 assume the query selects UserName, FirstName, LastName in that order.
                member.UserName = reader.GetString(0);
                member.FirstName = reader.GetString(1);
                member.LastName = reader.GetString(2);
                members.Add(member);
            }
        }
    }
}

foreach (Member member in members)
{
    // do something
}
This is a common problem. Fortunately there is a good answer to it: LINQ to SQL! You can read about it here: http://weblogs.asp.net/scottgu/archive/2007/05/19/using-linq-to-sql-part-1.aspx
What it basically does is create one class per table you choose in your database, which makes it very easy to get your objects from the database straight into your object-oriented code.
Saving is as easy as calling SubmitChanges(). There are other providers like this, but I think LINQ to SQL will suit you well as a beginner, since it abstracts away a lot of the annoying parts.
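For illustration, querying and saving with a generated data context might look roughly like this. The MembersDataContext name is an assumption; the designer or SqlMetal generates the real context and entity classes for you.
// Hypothetical generated context; the designer/SqlMetal would create this for you.
using (var db = new MembersDataContext(connectionString))
{
    // Query: translated to SQL and executed when enumerated.
    var jones = db.Members.Where(m => m.LastName == "Jones").ToList();

    // Insert a new row.
    var billy = new Member { UserName = "Jonesy", FirstName = "Billy", LastName = "Jones" };
    db.Members.InsertOnSubmit(billy);

    // Persist all pending changes in one call.
    db.SubmitChanges();
}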
I'd recommend that you look at LINQ to SQL. Then you can write code like this to query the database to get a specific user:
Member member = db.Members.Single(m => m.UserName == "Jonesy");
or to get users matching a criterion:
IQueryable<Member> members = db.Members
.Where(member => member.LastName == "Jones");
LINQ to SQL also takes care of writing the boilerplate code to declare the classes based on the database structure.
LINQ to SQL, suggested twice already, is not the only way, but when all of your objects map one-to-one to tables in your database it works just fine. Personally I would go for Entity Framework instead, because it gives you one more layer of abstraction.
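As a rough sketch of the Entity Framework route (in its code-first flavour), it could look something like the following. The MembersContext class is an assumption, and the Member class would need public properties and a key such as MemberID rather than public fields.
// Minimal code-first sketch (assumes Member is rewritten with properties and a MemberID key).
public class MembersContext : DbContext
{
    public DbSet<Member> Members { get; set; }
}

using (var db = new MembersContext())
{
    db.Members.Add(new Member { UserName = "Jonesy", FirstName = "Billy", LastName = "Jones" });
    db.SaveChanges();

    var jones = db.Members.Where(m => m.LastName == "Jones").ToList();
}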
I can also suggest that you look at db4o and similar object databases, where you can just save your POCO and that's all.
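A minimal sketch of the db4o idea, assuming its embedded API (the file name is arbitrary; check the exact overloads against the db4o documentation):
// Rough sketch only; verify the db4o API against its docs.
IObjectContainer db = Db4oEmbedded.OpenFile(Db4oEmbedded.NewConfiguration(), "members.db4o");
try
{
    db.Store(new Member { UserName = "Jonesy", FirstName = "Billy", LastName = "Jones" });

    IList<Member> jones = db.Query<Member>(m => m.LastName == "Jones");
}
finally
{
    db.Close();
}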
Hi,
Can somebody please give me a pointer on this? I have 8 servers, each with 8 databases which look identical except for the server/database name. We are talking thousands of tables.
I create my data contexts with sqlmetal.exe
After creating my data contexts, I import them into the application and then I run comparison scripts over the databases to compare results.
My problem is dynamically switching between data contexts.
Datacontext.DAL.DUK1 duk1sdi = new Datacontext.DAL.DUK1(connectionString);
Datacontext.DAL.DUK3 duk3sdi = new Datacontext.DAL.DUK3(connectionString);
string fromOne = runQuery(duk1sdi);
string fromThree = runQuery(duk3sdi);
public static string runQuery(DataContext duk)
{
    var query =
        from result in duk.TableA
        select result.Total;

    string returnString = query;
    return returnString;
}
I have no problem with the query running when the duk is predefined; however, how do I define and pass the data context to the function?
The error I get is:
Error 1 'System.Data.Linq.DataContext' does not contain a definition
for 'TableA' and no extension method 'TableA' accepting a first
argument of type 'System.Data.Linq.DataContext' could be found (are
you missing a using directive or an assembly reference?)
You could use the GetTable<T> method, where T is the type of the table, e.g. TableA.
public static string runQuery(DataContext duk) {
    var table = duk.GetTable<TableA>();
    var query = from result in table select result.Total;
    ...
}
However, all of the TableA types will need to be literally the same CLR type (I'm pretty sure).
Otherwise you would need to branch the logic explicitly for each context. Since you can extend your DataContext instances (in general, maybe not in your specific case), you could have them share an interface that exposes a collection property of TableA, but then you would need a higher-level context wrapper to pass around, unless you pass the collection itself by altering the method signature.
You can use interfaces. Check this answer, but be sure to script the interfaces using a .tt file, given the number of tables you have.
Edit:
If you have generated contexts which you want to use interchangeably in a reusable method, you have the problem that the generated TableA classes are not interchangeable, since they are different types (the names may match, but that doesn't make them the same type). Therefore you need to abstract the actual types, and one way to do this is with interfaces. You build your reusable method around an interface which abstracts the specific context type and table type. The downside is that you have to implement the interfaces on the generated context and table types, but that is something you can solve with a .tt script.
Pseudo code:
// Define interface for table
public interface ITableA {
    // ... properties
}

// Define interface for context
public interface IMyContext {
    IQueryable<ITableA> TableA { get; }
}

// Extend TableA from DUK1
public partial class TableA : ITableA {
}

// Extend DUK1
public partial class Datacontext.DAL.DUK1 : IMyContext {
    IQueryable<ITableA> IMyContext.TableA {
        get { return TableA; }
    }
}

// Same for DUK3 and TableA from DUK3

// Finally, your code
Datacontext.DAL.DUK1 duk1sdi = new Datacontext.DAL.DUK1(connectionString);
Datacontext.DAL.DUK3 duk3sdi = new Datacontext.DAL.DUK3(connectionString);

string fromOne = runQuery(duk1sdi);
string fromThree = runQuery(duk3sdi);

public static string runQuery(IMyContext duk) {
    // Note: the method accepts the interface, not a specific context type
    var query = from result in duk.TableA
                select result.Total;
    string returnString = query;
    return returnString;
}
If your schema is identical between databases, why script the DBML for all of them? Just create one context with its associated classes and dynamically switch out the connection string when instantiating the context.
var duk1sdi = new Datacontext.DAL.DUK1(connectionString1);
var duk3sdi = new Datacontext.DAL.DUK1(connectionString2);
Thanks, guys. I think I found the simplest solution for me, based on a bit of both your answers and on RTFM (Programming Microsoft LINQ in Microsoft .NET Framework 4 by Paolo Pialorsi and Marco Russo).
This way I don't have to use the large DBML files. It is a shame that I'm going to have to declare hundreds of tables like this, but I can now switch between connection strings on the fly.
First I create the table mapping (outside the program code block):
[Table(Name = "TableA")]
public class TableA
{
    [Column] public int result;
}
Then I define the table for use:
Table<TableA> TableA = dc.GetTable<TableA>();
And then I can query from it:
var query =
    from row in TableA
    select row.result;
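For completeness, the dc used above would presumably be a plain System.Data.Linq.DataContext created with whichever connection string is needed at the time; a rough sketch, assuming the attribute mapping on TableA is all that is required:
// A plain DataContext is enough here because the mapping lives on the TableA class itself.
using (var dc = new DataContext(connectionString))   // swap in whichever connection string you need
{
    Table<TableA> tableA = dc.GetTable<TableA>();
    var query = from row in tableA
                select row.result;
}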
TL;DR: I'm using Entity Framework 5.0 with Oracle and need to query a table for two columns only, using an index on NVL of two columns.
Details after hours of attempts... I'll try to organize them as well as possible.
The desired SQL query should be:
SELECT t.Code, NVL(t.Local, t.Global) Description
FROM Shows t
Where t.Code = 123
So what is the problem? If I want to use Context.Shows.Parts.SqlQuery(query) I must return the whole row (*), but then I get a TABLE ACCESS FULL, so I must return only the desired columns.
The next thing (actually there were a lot of attempts before the following...) that I tried, which gives a very close result, was using the null-coalescing operator (??):
Context.Shows.Where(x => x.Code == 123)
       .Select(x => new { x.Code, Description = x.Local ?? x.Global });
But the SQL it generates is complicated, using CASE/WHEN, and does not use my index on Code, NVL(Local, Global), which is critical!
My next step was using Database.SqlQuery
context.Database.SqlQuery<Tuple<int, string>>("the Raw-SQLQuery above");
But I get an error that the type must not be abstract and must have a default constructor (Tuple doesn't have one).
The final step, which I dislike, is creating a class that has only those two properties (Code, Description). Now it works great, but I don't want to write a class for each query like that.
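For reference, the working but disliked approach looks roughly like this (the ShowDescription class name is just illustrative; the property names must match the column aliases in the query):
// Plain result class used only for materializing the two columns.
public class ShowDescription
{
    public int Code { get; set; }
    public string Description { get; set; }
}

var results = context.Database.SqlQuery<ShowDescription>(
    "SELECT t.Code, NVL(t.Local, t.Global) Description FROM Shows t WHERE t.Code = 123").ToList();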
Ideas?
This is a no-solution answer.
I think whatever you try, you can't do that. Even if you define your own mutable generic Tuple, it will fail, since the names of the properties must match the names of the columns:
SqlQuery(String, Object[]): Creates a raw SQL query that will
return elements of the given generic type. The type can be any type
that has properties that match the names of the columns returned from
the query, or can be a simple primitive type.
I think the best you can do is create your own generic method for querying the database via the classic Command and ExecuteReader pattern. Untested, but you get the idea:
public static IEnumerable<Tuple<T>> SqlQuery<T>(this DbContext context, string sql)
{
    using (var connection = new SqlConnection(context.Database.Connection.ConnectionString))
    using (var command = new SqlCommand(sql, connection))
    {
        connection.Open();
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                yield return new Tuple<T>((T)reader[0]);
            }
        }
    }
}

public static IEnumerable<Tuple<T1, T2>> SqlQuery<T1, T2>(this DbContext context, string sql)
{
    using (var connection = new SqlConnection(context.Database.Connection.ConnectionString))
    using (var command = new SqlCommand(sql, connection))
    {
        connection.Open();
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                yield return new Tuple<T1, T2>((T1)reader[0], (T2)reader[1]);
            }
        }
    }
}
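Used against the query from the question, that might look like the following (untested, same caveats as the methods above):
// Item1 = Code, Item2 = Description
var rows = context.SqlQuery<int, string>(
    "SELECT t.Code, NVL(t.Local, t.Global) Description FROM Shows t WHERE t.Code = 123").ToList();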
I have a C# .NET 3.5 project using a MySQL database.
I have an object Task which I would like to be able to create by pulling it from a series of database tables.
public class Task
{
    public Task()
    {
        Values = new List<string>();
        OtherValues = new List<string>();
        Requirement = string.Empty;
        Minimum = 1;
        Children = new List<Foo>();
    }

    public IList<string> Values { get; set; }
    public IList<string> OtherValues { get; set; }
    public string Requirement { get; set; }
    public int Minimum { get; set; }
    public int Maximum { get; set; }
    public IList<Foo> Children { get; set; }
}
I'd like to be able to get the tasks from a TaskList which would lazily read elements of the task as they were accessed by an enumerator.
public class TaskList : IEnumerable<Task>
{
    /* ... */
    public IEnumerator<Task> GetEnumerator()
    {
        string query = @"SELECT my_task.*, `Order` FROM my_task ORDER BY `Order` DESC";
        using (MySqlConnection connection = new MySqlConnection(connection_string_))
        using (MySqlCommand command = connection.CreateCommand())
        {
            command.CommandText = query;
            connection.Open();
            using (MySqlDataReader reader = command.ExecuteReader())
            {
                yield /* ??? */
            }
        }
    }
}
How is this done?
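For what it's worth, a minimal sketch of how that placeholder could be filled in; the column names here are assumptions, and the list properties and Children would need their own queries:
while (reader.Read())
{
    var task = new Task
    {
        // Column names are assumptions; adjust to the real schema.
        Requirement = reader.GetString(reader.GetOrdinal("Requirement")),
        Minimum = reader.GetInt32(reader.GetOrdinal("Minimum")),
        Maximum = reader.GetInt32(reader.GetOrdinal("Maximum"))
    };
    // Values, OtherValues, and Children would be loaded lazily or via extra queries.
    yield return task;
}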
You can serialize it to XML and store it as a string. Add the following function to Task:
public XElement Serialize()
{
    return new XElement("Task",
        new XElement("Values", from val in Values select new XElement("Item", val)),
        new XElement("OtherValues", from val in OtherValues select new XElement("Item", val)),
        new XElement("Requirement", Requirement),
        new XElement("Minimum", Minimum),
        new XElement("Maximum", Maximum)
    );
}
You will need to put using System.Linq; and using System.Xml.Linq; at the top of the .cs file.
I didn't write the code to serialize Children because I don't know what the data type Foo looks like, but you should serialize it in a similar manner. After you've done that, you can easily write the XML to the database and read it back (write a constructor that parses the XML into a Task object).
EDIT (addition):
Here is an example of constructors that receive XML (or parse a string as XML):
public Task(string xmlSourceAsString)
    : this(XElement.Parse(xmlSourceAsString))
{
}

public Task(XElement xmlSource)
{
    Values = (from itm in xmlSource.Element("Values").Elements("Item") select itm.Value).ToList();
    OtherValues = (from itm in xmlSource.Element("OtherValues").Elements("Item") select itm.Value).ToList();
    Requirement = xmlSource.Element("Requirement").Value;
    Minimum = int.Parse(xmlSource.Element("Minimum").Value);
    Maximum = int.Parse(xmlSource.Element("Maximum").Value);
}
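A quick round trip using the members above might look like this (myTask is an assumed Task instance; the MySQL column and parameter handling is omitted):
// Serialize to a string for storage in a text column...
string xml = myTask.Serialize().ToString();

// ...and rebuild the object later from that string.
Task restored = new Task(xml);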
EDIT (explanation):
You can't store your object in the database "as is", because it refers to other objects. For example, the list Values doesn't sit in the same place in memory as the rest of the object, because it's a reference type: it refers to another object that sits in a different place in memory. In fact, the only parts of your object that are stored in the same place as the main object are Minimum and Maximum, which are value types. So if you could somehow store the object "as is" (the laziest solution possible, if it worked), you would get the Minimum and Maximum fields right, but all the other fields would point to the memory addresses where those objects were placed when you stored the Task object, and those are now most likely invalid pointers (I say "most likely" because it is also possible, though rare, that they would point to legitimate objects, maybe even of the same type, but they still wouldn't have your data).
If you want the object with all its data stored in a database (or in a file, or passed to a process that runs on another computer via the network), you have to serialize it. Performance-wise the best approach is to serialize it to binary (.NET has some tools for that, but it's still more complex than XML).
XML also has the advantage of being easily readable from most modern programming languages and database engines. MySQL has some functions to read and write XML, so you can update the object in the database and access its fields from MySQL queries.
Conclusion
You asked for a solution that is easy (lazy), efficient, and SQL-compatible (access to the object's fields from MySQL queries). I say you can only have two of your three requirements, but you can choose which two:
If you want something easy and efficient, even at the price of losing compatibility, serialize your objects to binary. True, it's not as easy as XML, but .NET has some tools to help you with that (see the sketch after this list).
If you want something efficient and compatible, and are willing to do some work for it, put your object in MySQL the way databases are meant to be used: separate tables for the lists that refer to the objects via OIDs, and so on. This will require some work, but after you add the tables and write the MySQL functions and the C# functions that handle everything, you should be able to store, retrieve, and access your objects with ease.
If you want something easy and compatible, and you can afford to lose some efficiency, use my solution and serialize your objects to XML. This is the laziest solution: unless someone knows a library that can automatically serialize any object, LINQ to XML is the easiest way to do it, and it requires much less code than any other solution.
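A rough sketch of the binary option from the first point above, assuming Task and Foo are marked [Serializable] (BinaryFormatter lives in System.Runtime.Serialization.Formatters.Binary; myTask is an assumed Task instance):
// Requires [Serializable] on Task and Foo.
var formatter = new BinaryFormatter();

byte[] blob;
using (var stream = new MemoryStream())
{
    formatter.Serialize(stream, myTask);   // object graph -> bytes (store in a BLOB column)
    blob = stream.ToArray();
}

using (var stream = new MemoryStream(blob))
{
    Task restored = (Task)formatter.Deserialize(stream);   // bytes -> object graph
}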
I have written a generic database helper method, which returns the records of a particular entity.
Here is how I do it:
I have a class called Customer with 10 properties, and also a property called TableName.
There is a method which just takes a Type parameter and returns an array of the passed type.
The method works like this: using reflection it gets the table name and fires a select statement, and then, based on the DataReader, it loops through each column and each property of the passed type.
So the problem is: suppose there are 1 million records and 10 properties. It loops 10 (properties) * 1,000,000 (records) = 10,000,000 times.
Is there any optimized way to do this, something like using LINQ against a DataReader?
Here is the code:
object[] LoadAll(Type type)
{
    try
    {
        object obj = Activator.CreateInstance(type);
        SqlConnection conn = new SqlConnection("connection string");
        string tableName = type.GetField("TableName").GetValue(obj) as string;
        SqlCommand cmd = conn.CreateCommand();
        cmd.CommandText = string.Format("select * from {0}", tableName);
        conn.Open();
        List<object> list = new List<object>();
        SqlDataReader reader = cmd.ExecuteReader();
        while (reader.Read())
        {
            object obj1 = Activator.CreateInstance(type);
            foreach (PropertyInfo propertyInfo in type.GetProperties())
            {
                obj.GetType().GetProperty(propertyInfo.Name).SetValue(obj1, reader[propertyInfo.Name], null);
            }
            list.Add(obj1);
        }
        return list.ToArray();
    }
    catch
    {
        throw;
    }
}
Thanx
Try an object-relational mapper like NHibernate.
Sounds like you are calculating the size for every instance of an entity. You should have some meta-data controller which caches the size of an entity for each table name. Assuming I understand your problem right, for the same table name, the size will always be the same.
If I understand the problem correctly, it sounds like you could simply make the DB do the work for you. You say "and fire a select statement": can you not fire a smarter select statement that does what you are explaining?
I don't fully understand what you are trying to do when you say that you are looping through each column, but look into GROUP BY and the aggregate operators ("aggrop") and see if any of those can help you out.
From an optimization point of view, there is no need to keep a connection to the database open for 1 million records; as written, you are interacting with the database until your loop ends.
Instead, cache the whole table in a DataSet and then iterate over that, rather than keeping a live connection to the database open for a long time. I hope this answers your question.
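A minimal sketch of that caching idea, using a SqlDataAdapter to pull everything into a DataTable first (the table name and connection string are placeholders):
var table = new DataTable();
using (var adapter = new SqlDataAdapter("select * from SomeTable", "connection string"))
{
    adapter.Fill(table);   // opens and closes the connection internally
}

foreach (DataRow row in table.Rows)
{
    // map row to an object here, with no connection held open
}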
You could probably tighten up the loop a bit to reduce calls involving reflection. You don't need to create that initial obj either:
PropertyInfo[] properties = type.GetProperties();
while (reader.Read())
{
    object obj = Activator.CreateInstance(type);
    foreach (PropertyInfo propertyInfo in properties)
    {
        propertyInfo.SetValue(obj, reader[propertyInfo.Name], null);
    }
    list.Add(obj);
}
But to get it even faster you could pass the LoadAll() function a way to map the row to a new object, something along the lines of:
IEnumerable<T> LoadAll<T>(Func<IDataReader, T> map) {
    var tablename = typeof(T).GetField("TableName")......
    // other connection and query stuff
    while (reader.Read()) {
        yield return map(reader);
    }
}

// use it like:
var kittens = LoadAll<Kitten>(reader => new Kitten {
    Name = (string)reader["Name"],
    Colour = (string)reader["Colour"]
});
This also gives you more control over the mapping from the data layer to your domain objects. For example, the reflection-based method would take a lot of modification to handle an enum property, which would be straightforward to code in an explicit map function.
You could try installing the free trial of ReSharper, its inspection tools can suggest a variety of ways to optimize your code from the ground up.
I am new to C# and this may end up being a dumb question, but I need to ask anyway.
Is there a mechanism in C# to deserialize the result of an executed SQL statement into a C# object?
I have a C# program that reads a table from a SQL Server database, storing each row in an object. I am assigning each column value to an object member manually, so I was wondering if there is a way to deserialize a row automagically into an object. Or even better, a whole table into a collection of objects of the same type.
My environment is C#, VS2010, .NET4, SQLServer2008.
The assumption is that i know the columns i need, it's not a select * query.
A link to a neat example will also be appreciated.
Thanks.
You could use an ORM to do this. ADO.NET Entity Framework (check out the video tutorials) and NHibernate are popular choices.
If the columns are named to match the type's property names, you can do this with LINQ to SQL without any mapping code, just using ExecuteQuery:
using (var dc = new DataContext(connectionString)) {
    var objects = dc.ExecuteQuery<YourType>(sql); // now iterate the objects
}
Additionally, the sql can be automatically parameterized using string.Format rules:
class Person {
    public int Id { get; set; }
    public string Name { get; set; }
    public string Address { get; set; }
}

using (var dc = new DataContext(connectionString)) {
    List<Person> people = dc.ExecuteQuery<Person>(@"
        SELECT Id, Name, Address
        FROM [People]
        WHERE [Name] = {0}", name).ToList(); // some LINQ too
}
Of course, you can also use the VS tools to create a typed data-context, allowing things like:
using (var dc = new SpecificDataContext(connectionString)) {
    List<Person> people =
        (from person in dc.People
         where person.Name == name
         select person).ToList();
}