What alternatives are there to PropertyInfo.SetValue()? I've read that it is very slow. I'm using it to map an IDataReader to POCO objects.
This is a truncated version of the code. Everything here is very new to me. I know there are a lot of frameworks that accomplish this task; however, we can't use them.
public class DbAutoMapper<T>
{
public IEnumerable<T> MapToList(IDataReader reader)
{
var list = new List<T>();
while (reader.Read())
{
var obj = Activator.CreateInstance<T>();
foreach (PropertyInfo prop in obj.GetType().GetProperties())
{
foreach (var attribute in prop.GetCustomAttributes(true))
{
// 'value' is computed from the reader/attribute in the full version; omitted in this truncated example
prop.SetValue(obj, value, null);
}
}
list.Add(obj);
}
return list;
}
}
First: why are you repeating the reflection for every attribute when you don't use the attribute?
Second: assuming you intended to map this by name, column-to-property (which isn't what the code currently does), consider a tool like dapper, which does all this for you, including cached high-performance reflection-emit. It'll also handle the command itself for you. For example:
string region = "North";
var customers = conn.Query<Customer>(
#"select * from Customers where Region = #region",
new { region } // full parameterization, the easy way
).ToList();
If you need more control, consider FastMember, which provides fast member-access (again, reflection-emit), but without being specific to data access:
var accessor = TypeAccessor.Create(typeof(T));
while( /* some loop of data */ ) {
var obj = new T();
foreach(var col in cols) {
string propName = // ... something known only at runtime
object cellValue = // ...
accessor[obj, propName] = cellValue;
}
yield return obj;
}
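Applied to the original IDataReader question, a minimal sketch might look like the following. This assumes every column name matches a settable property on T and that T has a parameterless constructor; FastReaderMapper is just an illustrative name, and FastMember caches the generated accessors internally, so this stays cheap per row:

using System.Collections.Generic;
using System.Data;
using FastMember;

public class FastReaderMapper<T> where T : new()
{
    public IEnumerable<T> MapToList(IDataReader reader)
    {
        var accessor = TypeAccessor.Create(typeof(T));

        // capture the column names once, outside the row loop
        var columns = new string[reader.FieldCount];
        for (int i = 0; i < columns.Length; i++)
            columns[i] = reader.GetName(i);

        while (reader.Read())
        {
            var obj = new T();
            for (int i = 0; i < columns.Length; i++)
            {
                if (reader.IsDBNull(i)) continue;              // leave nulls at their default
                accessor[obj, columns[i]] = reader.GetValue(i);
            }
            yield return obj;
        }
    }
}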
A few approaches come to mind...
Skip Reflection
public class DbAutoMapper<T> where T : IInitFromReader, new()
{
public IEnumerable<T> MapToList(IDataReader reader)
{
var list = new List<T>();
while (reader.Read())
{
IInitFromReader obj = new T();
obj.InitFromReader(reader);
list.Add(obj);
}
return list;
}
}
Then you'll have to implement InitFromReader in each of your entity objects. Obviously, this forgoes the main benefit of reflection (less code to maintain).
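The interface and one hand-written entity might look something like this (a sketch; the Customer type and its column names are just placeholders):

using System.Data;

public interface IInitFromReader
{
    void InitFromReader(IDataReader reader);
}

public class Customer : IInitFromReader
{
    public int Id { get; set; }
    public string Name { get; set; }

    public void InitFromReader(IDataReader reader)
    {
        // plain, strongly typed reads: no reflection involved
        Id = reader.GetInt32(reader.GetOrdinal("Id"));
        Name = reader.GetString(reader.GetOrdinal("Name"));
    }
}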
Code Generation
Maintaining the InitFromReader code by hand is painful, so you could opt to generate it; a rough sketch follows below. In many ways this gives you the best of both worlds:
You don't have to maintain (by hand) a lot of code
You don't take the performance hit of reflection.
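As a rough sketch of the idea (a hypothetical helper, not production code): emit the InitFromReader body from the entity type's properties and paste or build the result into your project. Nullable columns and generic type-name formatting would need more care than shown here.

using System;
using System.Linq;
using System.Text;

public static class InitFromReaderGenerator
{
    // Emits C# source for an InitFromReader method, assuming column names match property names.
    public static string Generate(Type entityType)
    {
        var sb = new StringBuilder();
        sb.AppendLine("public void InitFromReader(System.Data.IDataReader reader)");
        sb.AppendLine("{");
        foreach (var prop in entityType.GetProperties().Where(p => p.CanWrite))
        {
            sb.AppendFormat("    {0} = ({1})reader[\"{0}\"];", prop.Name, prop.PropertyType.FullName);
            sb.AppendLine();
        }
        sb.AppendLine("}");
        return sb.ToString();
    }
}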
Related
This question is more of an "is my understanding accurate?" one, and if not, please help me get my head around it. I have this bit of code to explain my question:
class Example
{
public string MyString { get; set; }
}
var wtf = new[] { "string1", "string2"};
IEnumerable<Example> transformed = wtf.Select(s => new Example { MyString = s });
IEnumerable<Example> transformedList = wtf.Select(s => new Example { MyString = s }).ToList();
foreach (var i in transformed)
i.MyString = "somethingDifferent";
foreach (var i in transformedList)
i.MyString = "somethingDifferent";
foreach(var i in transformed)
Console.WriteLine(i.MyString);
foreach (var i in transformedList)
Console.WriteLine(i.MyString);
It outputs:
string1
string2
somethingDifferent
somethingDifferent
Both Select() methods at first glance return IEnumerable<Example>. However, the underlying types are WhereSelectArrayIterator<string, Example> and List<Example>.
This is where my sanity started to come into question. From my understanding the difference in output above is because of the way both underlying types implement the GetEnumerator() method.
Using this handy website, I was able to (I think) track down the bit of code that was causing the difference.
class WhereSelectArrayIterator<TSource, TResult> : Iterator<TResult>
{ }
Line 169 there points me to Iterator<TResult>, since that appears to be where GetEnumerator() is implemented.
Starting on line 90 I see:
public IEnumerator<TSource> GetEnumerator() {
if (threadId == Thread.CurrentThread.ManagedThreadId && state == 0) {
state = 1;
return this;
}
Iterator<TSource> duplicate = Clone();
duplicate.state = 1;
return duplicate;
}
What I gather from that is when you enumerate over it, you're actually enumerating over a cloned source (as written in the WhereSelectArrayIterator class' Clone() method).
This satisfies my need to understand for now, but as a bonus, could someone help me figure out why `this` isn't returned the first time I enumerate over the data? From what I can tell, state should be 0 on the first pass, unless perhaps there is magic happening under the hood that calls the same method from different threads.
Update
At this point I'm thinking my 'findings' were a bit misleading (damn Clone method taking me down the wrong rabbit hole) and it was indeed due to deferred execution. I mistakenly thought that even though I deferred execution, once it was enumerated the first time it would store those values in my variable. I should have known better; after all I was using the new keyword in the Select. That said, it still did open my eyes to the idea that a particular class' GetEnumerator() implementation could still return a clone which would present a very similar problem. It just so happened that my problem was different.
Update2
This is an example of what I thought my problem was. Thanks everyone for the information.
IEnumerable<Example> friendly = new FriendlyExamples();
IEnumerable<Example> notFriendly = new MeanExamples();
foreach (var example in friendly)
example.MyString = "somethingDifferent";
foreach (var example in notFriendly)
example.MyString = "somethingDifferent";
foreach (var example in friendly)
Console.WriteLine(example.MyString);
foreach (var example in notFriendly)
Console.WriteLine(example.MyString);
// somethingDifferent
// somethingDifferent
// string1
// string2
Supporting classes:
class Example
{
public string MyString { get; set; }
public Example(Example example)
{
MyString = example.MyString;
}
public Example(string s)
{
MyString = s;
}
}
class FriendlyExamples : IEnumerable<Example>
{
Example[] wtf = new[] { new Example("string1"), new Example("string2") };
public IEnumerator<Example> GetEnumerator()
{
return wtf.Cast<Example>().GetEnumerator();
}
IEnumerator IEnumerable.GetEnumerator()
{
return wtf.GetEnumerator();
}
}
class MeanExamples : IEnumerable<Example>
{
Example[] wtf = new[] { new Example("string1"), new Example("string2") };
public IEnumerator<Example> GetEnumerator()
{
return wtf.Select(e => new Example(e)).Cast<Example>().GetEnumerator();
}
IEnumerator IEnumerable.GetEnumerator()
{
return wtf.Select(e => new Example(e)).GetEnumerator();
}
}
LINQ works by making each operator return another IEnumerable that is typically a deferred processor. No actual execution occurs until the finally returned IEnumerable is enumerated. This allows for the creation of efficient pipelines.
When you do
var transformed = wtf.Select(s => new Example { MyString = s });
The select code has not actually executed yet. Only when you finally enumerate transformed is the select done, i.e. here:
foreach (var i in transformed)
i.MyString = "somethingDifferent";
Note that if you do
foreach (var i in transformed)
i.MyString = "somethingDifferent";
the pipeline will be executed again. Here that's not a big deal, but it can be huge if IO is involved.
This line
var transformedList = wtf.Select(s => new Example { MyString = s }).ToList();
is the same as
var transformedList = transformed.ToList();
The real eye-opener is to place debug statements or breakpoints inside a Where or Select to actually see the deferred pipeline execution.
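For example (reusing the Example class from the question, with using System.Linq in scope), a side effect inside the Select makes the re-execution visible:

var source = new[] { "string1", "string2" };

var transformed = source.Select(s =>
{
    Console.WriteLine("projecting " + s);    // runs once per item, per enumeration
    return new Example { MyString = s };
});

foreach (var item in transformed) { }        // prints "projecting ..." twice
foreach (var item in transformed) { }        // prints it twice again: the pipeline re-ran

var materialized = transformed.ToList();     // runs the pipeline one final time
foreach (var item in materialized) { }       // prints nothing; the list is already built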
Reading the implementation of LINQ is useful; here is Select: https://referencesource.microsoft.com/#System.Core/System/Linq/Enumerable.cs,5c652c53e80df013,references
I have a need to sort a collection of these based upon criteria determined at run-time.
I was using the code from this article to perform the sorting - originally my code used the dynamic class.
Then I hit issues with serialization over WCF, so I switched to using a SerializableDynamicObject, and now the sorting code breaks on the line:
PropertyInfo pi = type.GetProperty(prop);
with the error that SerializableDynamicObject does not have a property called "Name" - where "Name" was the value of prop.
I guess the simplest thing to do is to find an alternate way of serializing a dynamic type that the sorting algorithm works with. Any pointers in this direction would be appreciated!
I have looked at this example, but I get the error message:
The constructor with parameters (SerializationInfo, StreamingContext) is not found in ISerializable type
Here's some code using FastMember for this, which works for both reflection-based and dynamic-based objects (depending on what you pass to TypeAccessor.Create)
using System;
using System.Collections;
using System.Collections.Generic;
using System.Dynamic;
using FastMember;
namespace ConsoleApplication6
{
class Program
{
static void Main()
{
var list = new List<dynamic>();
dynamic obj = new ExpandoObject();
obj.Foo = 123;
obj.Bar = "xyz";
list.Add(obj);
obj = new ExpandoObject();
obj.Foo = 456;
obj.Bar = "def";
list.Add(obj);
obj = new ExpandoObject();
obj.Foo = 789;
obj.Bar = "abc";
list.Add(obj);
var accessor = TypeAccessor.Create(
typeof(IDynamicMetaObjectProvider));
string propName = "Bar";
list.Sort((x,y) => Comparer.Default.Compare(
accessor[x, propName], accessor[y,propName]));
foreach(var item in list) {
Console.WriteLine(item.Bar);
}
}
}
}
It may be worth mentioning that for reflection-based types, this does not use reflection on a per-item basis; all that is optimized away via meta-programming.
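For comparison, the same sort works against an ordinary POCO; only the type passed to TypeAccessor.Create changes (a sketch, with Poco being a made-up class and the statements placed inside Main):

class Poco
{
    public int Foo { get; set; }
    public string Bar { get; set; }
}

var pocos = new List<Poco>
{
    new Poco { Foo = 123, Bar = "xyz" },
    new Poco { Foo = 456, Bar = "abc" }
};

var pocoAccessor = TypeAccessor.Create(typeof(Poco));   // IL accessors generated and cached once
pocos.Sort((x, y) => Comparer.Default.Compare(
    pocoAccessor[x, "Bar"], pocoAccessor[y, "Bar"]));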
Marc Gravell's answer gave me what I needed to complete this: a sorter that can handle multiple sort criteria not known until runtime. I'm accepting Marc's answer, but posting this as someone may find it useful too.
There may be a more elegant way of achieving this, if so please let me know and I'll update the answer.
public class SerializableDynamicObjectComparer: IComparer
{
private readonly List<KeyValuePair<string, bool>> sortCriteria = new List<KeyValuePair<string, bool>>();
private readonly TypeAccessor accessor;
public SerializableDynamicObjectComparer(IEnumerable<string> criteria)
{
foreach (var criterium in criteria)
{
string[] sortCriterium = criterium.Split('.');
// default to ascending when no direction suffix (e.g. plain "Count") is supplied
bool isAscending = sortCriterium.Length < 2
|| sortCriterium[1].ToUpper() == "ASC";
this.sortCriteria.Add(new KeyValuePair<string, bool>(sortCriterium[0], isAscending));
}
this.accessor = TypeAccessor.Create(typeof (IDynamicMetaObjectProvider));
}
public int Compare(object x, object y)
{
for(int i=0; i< this.sortCriteria.Count; i++)
{
string fieldName = this.sortCriteria[i].Key;
bool isAscending = this.sortCriteria[i].Value;
int result = Comparer.Default.Compare(this.accessor[x, fieldName], this.accessor[y, fieldName]);
if(result != 0)
{
//If we are sorting DESC, then return the -ve of the default Compare result
return isAscending ? result : -result;
}
}
//if we get here, then objects are equal on all sort criteria.
return 0;
}
}
Usage:
var sorter = new SerializableDynamicObjectComparer(sortCriteria);
var sortableData = reportData.ToList();
sortableData.Sort(sorter.Compare);
where sortCriteria is an array of strings e.g.
new {"Name.DESC", "Age.ASC", "Count"}
I have the following method that takes an extremely long time to run, and I would love some help to make it run faster and/or be more efficient.
The main responsibility of the method is to take a list of data points created from a CSV file, map the Name property of the file datapoints to the HistorianTagname property in a list of tagnames (matching on the DataLoggerTagname property), and create a resulting list from the mapping. If no mapping exists, the file datapoint is ignored.
I know that was long-winded, but I hope it makes sense. It may be easier just to look at the method:
private IEnumerable<DataPoint> GetHistorianDatapoints(IEnumerable<DataPoint> fileDatapoints, IEnumerable<Tagname> historianTagnames)
{
/**
** REFACTOR THIS
**/
foreach (var fileDatapoint in fileDatapoints)
{
var historianTagname = historianTagnames.FirstOrDefault(x => x.DataLoggerTagname.Equals(fileDatapoint.Name, StringComparison.OrdinalIgnoreCase));
if (historianTagname != null)
{
var historianDatapoint = new DataPoint();
historianDatapoint.Name = historianTagname.HistorianTagname;
historianDatapoint.Date = fileDatapoint.Date;
historianDatapoint.Value = fileDatapoint.Value;
yield return historianDatapoint;
}
}
}
Notes:
I have complete control of the classes and methods involved in the mapping, so if I am doing something fundamentally wrong, I would love to know!
Thanks!
I would start by fixing up:
var historianTagname = historianTagnames.FirstOrDefault(x => x.DataLoggerTagname.Equals(fileDatapoint.Name, StringComparison.OrdinalIgnoreCase))
That's a pretty expensive operation to run every iteration through this loop.
Below is my proposed version:
private IEnumerable<DataPoint> GetHistorianDatapoints(IEnumerable<DataPoint> fileDatapoints, IEnumerable<Tagname> historianTagnames)
{
var tagNameDictionary = historianTagnames.ToDictionary(t => t.DataLoggerTagname, StringComparer.OrdinalIgnoreCase);
foreach (var fileDatapoint in fileDatapoints)
{
if (tagNameDictionary.ContainsKey(fileDatapoint.Name))
{
var historianTagname = tagNameDictionary[fileDatapoint.Name];
var historianDatapoint = new DataPoint();
historianDatapoint.Name = historianTagname.HistorianTagname;
historianDatapoint.Date = fileDatapoint.Date;
historianDatapoint.Value = fileDatapoint.Value;
yield return historianDatapoint;
}
}
}
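A small variation on the same idea uses TryGetValue in the loop body of the same iterator method, so the key is only looked up once per datapoint (behaviour is otherwise identical):

foreach (var fileDatapoint in fileDatapoints)
{
    Tagname historianTagname;
    if (tagNameDictionary.TryGetValue(fileDatapoint.Name, out historianTagname))
    {
        yield return new DataPoint
        {
            Name = historianTagname.HistorianTagname,
            Date = fileDatapoint.Date,
            Value = fileDatapoint.Value
        };
    }
}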
As @Sheldon Warkentin said, FirstOrDefault is probably the bottleneck of your function; it is better to make historianTagnames a Dictionary where the name is the key, then in your function you can get the value by key.
Something like below:
// this is passed to method
IDictionary<string, Tagname> historianTagnames;
// .. method body
var historianTagname = historianTagnames[fileDatapoint.Name];
Of course, you need to add the proper checks (e.g. for missing keys).
As others have said, a Dictionary<string, Tagname> might perform better.
var historianDict = new Dictionary<string, Tagname>();
foreach (var tagName in historianTagnames) {
historianDict[tagName.DataLoggerTagname.ToLowerInvariant()] = tagName;
}
foreach (var fileDatapoint in fileDatapoints) {
if (historianDict.ContainsKey(fileDatapoint.Name.ToLowerInvariant())) {
// ...
}
}
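A variation on the above: construct the dictionary with StringComparer.OrdinalIgnoreCase so neither side needs ToLowerInvariant, which avoids allocating a lowered string on every lookup (a sketch using the same types as above):

var historianDict = new Dictionary<string, Tagname>(StringComparer.OrdinalIgnoreCase);
foreach (var tagName in historianTagnames)
{
    historianDict[tagName.DataLoggerTagname] = tagName;
}

foreach (var fileDatapoint in fileDatapoints)
{
    Tagname match;
    if (historianDict.TryGetValue(fileDatapoint.Name, out match))
    {
        // ... build the historian datapoint from 'match' as before
    }
}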
Foo is a class with a lot of string fields. I want to create a method Wizardify that performs an operation on many of the fields of the object. I could do it like this:
Foo Wizardify(Foo input)
{
Foo result = new Foo();
result.field1 = Bar(input.field1);
result.field2 = Bar(input.field2);
result.field3 = Bar(input.field3);
...
This is some easily generated code, but I prefer not to waste fifty lines on this. Is there a way to go over selected fields of an object? Note that there are four or five fields I want to work on in a different way and they should be excluded from the iteration.
Try:
foreach (FieldInfo FI in input.GetType().GetFields())
{
object value = FI.GetValue(input);
FI.SetValue(input, someValue);   // 'someValue' being whatever transformed value you need
}
Though I would not recommend the reflection approach for known Types - it is slow and depending on your specific scenario could pose some permission issue at runtime...
This is what I have; it gives me a list of the names of all properties in my classes, which I can later work on with reflection or expression trees:
private static string xPrev = "";
private static List<string> result;
// Recursively collects dotted property-name paths, skipping properties whose name
// contains "Parent" and descending into properties whose type lives in "MyCms".
private static List<string> GetContentPropertiesInternal(Type t)
{
System.Reflection.PropertyInfo[] pi = t.GetProperties();
foreach (System.Reflection.PropertyInfo p in pi)
{
string propertyName = string.Join(".", new string[] { xPrev, p.Name });
if (!propertyName.Contains("Parent"))
{
Type propertyType = p.PropertyType;
if (!propertyType.ToString().StartsWith("MyCms"))
{
result.Add(string.Join(".", new string[] { xPrev, p.Name }).TrimStart(new char[] { '.' }));
}
else
{
xPrev = string.Join(".", new string[] { xPrev, p.Name });
GetContentPropertiesInternal(propertyType);
}
}
}
xPrev = "";
return result;
}
public static List<string> GetContentProperties(object o)
{
result = new List<string>();
xPrev = "";
result = GetContentPropertiesInternal(o.GetType());
return result;
}
Usage: List<string> myProperties = GetContentProperties(myObject);
Loop through typeof(YourType).GetProperties() and call GetValue or SetValue.
Note that reflection is rather slow.
You could use the Dynamic Language Runtime to generate a compiled lambda (e.g. a Func or Action delegate). You'll just need to generate the lambda once (you can cache it away), and there'll be no reflection performance hit.
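A minimal sketch of that idea with System.Linq.Expressions (names here are illustrative; build each delegate once, cache it, and the per-call cost is then just a delegate invocation). This assumes the members are properties; fields would use Expression.Field instead.

using System;
using System.Linq.Expressions;
using System.Reflection;

static class PropertySetterFactory
{
    // Builds a compiled setter equivalent to: (target, value) => ((TDecl)target).Prop = (TProp)value
    public static Action<object, object> Build(PropertyInfo prop)
    {
        var target = Expression.Parameter(typeof(object), "target");
        var value = Expression.Parameter(typeof(object), "value");

        var body = Expression.Assign(
            Expression.Property(Expression.Convert(target, prop.DeclaringType), prop),
            Expression.Convert(value, prop.PropertyType));

        return Expression.Lambda<Action<object, object>>(body, target, value).Compile();
    }
}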
I'm migrating an older .NET application to .NET 4. The migration has to be done in several stages, which is why some of the methods might seem a bit unconventional. Anyway...
What I have is a stored procedure (Analysis_select) returning one row with several columns as the result. If I call it with
var result = dbContext.Analysis_select(user.UserId, Year, Week);
everything is fine; I can view the data with the debugger or display it in a grid view or something like that, so the expression and stored procedure really work! But the result is not compatible with the rest of the code, so...
If I try to cast it to a DataSet it fails. Visual Studio actually says this is OK, but when rendering on a web page it crashes:
var result = (DataSet)dbContext.Analysis_select(user.UserId, Year, Week);
The error is as follows
Unable to cast object of type 'SingleResult`1[Analysis_select]' to type 'System.Data.DataSet'.
I've read about some other conversions from LINQ to DataSet, but most of the methods seem a bit excessive for this. The reason I want to keep the DataSet is that there are tens of thousands of lines of code depending on such results. It sucks, yes, but can you help me fix this?
Any help is highly appreciated, thanks!
I'm not suggesting this as a great solution or best practices; there is most definitely a different (and probably better) way.
For a case where you have IEnumerable and no other means to create a data table, reflection can step in.
You could use something like below...
using System.Collections.Generic;
using System.Data;
using System.Data.Linq;   // Binary
using System.Linq;
using System.Xml.Linq;    // XElement

public static class ExtensionMethods
{
public static DataTable ToDataTable<T>(this IEnumerable<T> items)
{
DataTable table = new DataTable();
var properties = typeof(T).GetProperties();
foreach (var propertyInfo in properties)
{
table.Columns.Add(propertyInfo.Name, typeof(object));
}
foreach (var item in items)
{
var row = properties.Select(p => NormalizeObject(p.GetValue(item, null))).ToArray();
table.Rows.Add(row);
}
return table;
}
private static object NormalizeObject(object value)
{
Binary bin = value as Binary;
if (bin != null)
{
return bin.ToArray();
}
XElement element = value as XElement;
if (element != null)
{
return element.ToString();
}
return value;
}
}
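Usage might then look like this (Analysis_select is the stored procedure from the question, and the designer-generated result type is assumed to carry the columns you need; adding the table to a DataSet is one extra step):

var table = dbContext.Analysis_select(user.UserId, Year, Week).ToDataTable();

var dataSet = new DataSet();
dataSet.Tables.Add(table);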
You will need to write a helper (or extension) method to convert the result into a DataTable, which you can then add to a DataSet. Here is an example of how to convert a List<T> to a DataTable.
private DataTable ToDataTable<T>(List<T> items)
{
var table = new DataTable(typeof (T).Name);
PropertyInfo[] props = typeof (T).GetProperties(BindingFlags.Public | BindingFlags.Instance);
foreach (PropertyInfo prop in props)
{
Type t = GetCoreType(prop.PropertyType);
table.Columns.Add(prop.Name, t);
}
foreach (T item in items)
{
var values = new object[props.Length];
for (int i = 0; i < props.Length; i++)
{
values[i] = props[i].GetValue(item, null);
}
table.Rows.Add(values);
}
return table;
}
public static Type GetCoreType(Type t)
{
if (t != null && IsNullable(t))
{
if (!t.IsValueType)
{
return t;
}
else
{
return Nullable.GetUnderlyingType(t);
}
}
else
{
return t;
}
}
public static bool IsNullable(Type t)
{
return !t.IsValueType || (t.IsGenericType && t.GetGenericTypeDefinition() == typeof(Nullable<>));
}
Here's a link to the source of this solution: http://www.chinhdo.com/20090402/convert-list-to-datatable/
Did you check this tutorial from MS: http://msdn.microsoft.com/en-us/library/bb386921.aspx? Otherwise, there is no direct conversion between a LINQ result and a DataSet.
With LINQ to SQL stored procedures you never get DataSets. What you get is exactly that, a SingleResult, which is an IEnumerable.