Convert DataTable to LINQ Anonymous Type - C#

I want a function which takes in a DataTable and returns a List<object> (where the objects are not DataRows).
For example, I know I can do this (but it requires the column names to be known at compile time):
// DataTable dt = filled from a database query & has 3 columns: Code, Description & ShortCode
List<object> rtn = new List<object>();
var x = from vals in dt.Select()
        select new
        {
            Code = vals["Code"],
            Description = vals["Description"],
            ShortCode = vals["ShortCode"],
        };
rtn.AddRange(x);
return rtn;
What I want is a generic version, so that I can pass in any DataTable and it will generate the objects based on the column names in that DataTable.

Since the property names are not known at compile time and you want to use the data for JSON serialization, you can use the following to create a list of dictionaries. If you use Newtonsoft Json.NET, the serializer takes care of converting the key-value pairs into JSON object format.
IEnumerable<Dictionary<string, object>> result = dt.Select()
    .Select(x => x.ItemArray
        .Select((a, i) => new { Name = dt.Columns[i].ColumnName, Value = a })
        .ToDictionary(a => a.Name, a => a.Value));
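For example, a minimal sketch of serializing that result (assuming the Newtonsoft.Json package is referenced):
// Each dictionary becomes a JSON object keyed by column name.
string json = Newtonsoft.Json.JsonConvert.SerializeObject(result);
// e.g. [{"Code":"A1","Description":"Alpha","ShortCode":"A"}, ...]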

In order to dynamically create properties, so that DataTables with different sets of columns can be handled, we can use System.Dynamic.ExpandoObject. It implements IDictionary<string, object>, a format which can easily be converted to JSON.
int colCount = dt.Columns.Count;
var rtn = new List<object>();
foreach (DataRow dr in dt.Rows)
{
    dynamic objExpando = new System.Dynamic.ExpandoObject();
    var obj = objExpando as IDictionary<string, object>;
    for (int i = 0; i < colCount; i++)
    {
        string key = dr.Table.Columns[i].ColumnName;
        string val = dr[key].ToString();
        obj[key] = val;
    }
    rtn.Add(obj);
}
string json = new System.Web.Script.Serialization.JavaScriptSerializer().Serialize(rtn);

You can use the following generic function:
private static List<T> ConvertDataTable<T>(DataTable dt)
{
    List<T> data = new List<T>();
    foreach (DataRow row in dt.Rows)
    {
        T item = GetItem<T>(row);
        data.Add(item);
    }
    return data;
}
private static T GetItem<T>(DataRow dr)
{
    Type temp = typeof(T);
    T obj = Activator.CreateInstance<T>();
    foreach (DataColumn column in dr.Table.Columns)
    {
        foreach (PropertyInfo pro in temp.GetProperties())
        {
            if (pro.Name == column.ColumnName)
                pro.SetValue(obj, dr[column.ColumnName], null);
            else
                continue;
        }
    }
    return obj;
}
Please check my article, which has a complete demonstration of how to use this generic method.
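For example, a minimal usage sketch (CodeEntry is a hypothetical class whose property names match the DataTable columns from the question):
// Hypothetical model whose property names match the DataTable columns.
public class CodeEntry
{
    public string Code { get; set; }
    public string Description { get; set; }
    public string ShortCode { get; set; }
}

// Usage:
List<CodeEntry> entries = ConvertDataTable<CodeEntry>(dt);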

Here is the original question:
// DataTable dt = filled from a database query & has 3 columns: Code, Description & ShortCode
List<object> rtn = new List<object>();
var x = from vals in dt.Select()
        select new
        {
            Code = vals["Code"],
            Description = vals["Description"],
            ShortCode = vals["ShortCode"],
        };
rtn.AddRange(x);
return rtn;
Just replace it with:
List<object> rtn = JsonConvert.DeserializeObject<List<object>>(JsonConvert.SerializeObject(dt));

You will have to provide the anonymous object as a parameter and use JSON/XML serialization:
protected static List<T> ToAnonymousCollection<T>(DataTable dt, T anonymousObject)
{
    List<DataColumn> dataColumns = dt.Columns.OfType<DataColumn>().ToList();
    return dt.Rows.OfType<DataRow>().Select(dr =>
    {
        Dictionary<string, object> dict = new Dictionary<string, object>();
        dataColumns.ForEach(dc => dict.Add(dc.ColumnName, dr[dc]));
        return JsonConvert.DeserializeAnonymousType(JsonConvert.SerializeObject(dict), anonymousObject);
    }).ToList();
}
Usage:
var anonymousCollection = ToAnonymousCollection(dt, new { Code = [ColumnTypeValue, e.g. 0], Description = [ColumnTypeValue, e.g. string.Empty], ShortCode = [ColumnTypeValue, e.g. 0] });

Related

jObject.Parse and JsonConvert.DeserializeObject data into a DataTable resulting in Parameter Count Mismatch exception

I have trouble with the following (test) code. This gives me a "Parameter Count Mismatch" error at the line
dataTable.Merge(CreateDataTableFromObject(info.GetValue(inputObject)));
The entire code looks like this:
public object SerializeThis(DataTable dataTable1, DataTable dataTable2)
{
    string jsonString = @"{'EquipmentNumber':'CP5301078','Data_General_Exp': {'Authgrp':'CP01','Objecttype':'9A1B'}}";
    var jConvertObject = (JsonConvertObject)JsonConvert.DeserializeObject(jsonString, typeof(JsonConvertObject));
    var jObject = JObject.Parse(jsonString);
    dataTable1 = CreateDataTableFromObject(jConvertObject);
    dataTable2 = CreateDataTableFromObject(jObject);
    return jConvertObject;
}
public DataTable CreateDataTableFromObject(object inputObject)
{
    DataTable dataTable = new DataTable();
    Type type = inputObject.GetType();
    var properties = type.GetProperties();
    PropertyInfo info;
    for (int i = 0; i < properties.Length; i++)
    {
        info = properties[i];
        if (info.GetValue(inputObject).GetType().GetProperties().Count() > 2)
            dataTable.Merge(CreateDataTableFromObject(info.GetValue(inputObject)));
        else if (!dataTable.Columns.Contains(info.Name))
            dataTable.Columns.Add(new DataColumn(info.Name, Nullable.GetUnderlyingType(info.PropertyType) ?? info.PropertyType));
    }
    return dataTable;
}
Note that I am trying to do the same thing with both the JsonConvert object and the JObject - the error emerges when executing CreateDataTableFromObject(object inputObject) on the JObject, not on the JsonConvert object.
I need a solution for the JObject, as I have to handle some unknown JSON strings which I need to put into a DataTable (column names being the property names and row values being the values of the JSON objects). I have omitted the usings.
I don't see that this is answered by any of the other Stack Overflow articles.
OK - I found that I had tangled things up a bit, and came to this solution:
public static DataTable DeSerializeThis(string jsString)
{
    const string json1 = @"{""EquipmentNumber"":""CP1"",""Authgrp"":""CP01"",""Objecttype"":""9A1A""}";
    const string json2 = @"{""EquipmentNumber"":""CP2"",""Authgrp"":""CP02"",""Objecttype"":""9B1B""}";
    List<JObject> list = new List<JObject>();
    list.Add(JObject.Parse(json1));
    list.Add(JObject.Parse(json2));
    DataTable table = ToDataTable(list);
    return table;
}
static public DataTable ToDataTable(List<JObject> list)
{
    DataTable dataTable = new DataTable();
    int i = 0;
    foreach (JToken content in list.ToList<JToken>())
    {
        dataTable.Rows.Add();
        foreach (JProperty prop in content)
        {
            if (i == 0)
            {
                dataTable.Columns.Add(prop.Name);
            }
            dataTable.Rows[i][prop.Name] = prop.Value;
        }
        i++;
    }
    return dataTable;
}
The only remaining question is whether this could be rewritten so that ToDataTable(List<JObject> list) could take a List<T> instead - I haven't found the answer for that...
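One possible direction (not a definitive answer, just a sketch assuming Newtonsoft.Json and System.Linq are available): convert each item to a JObject with JObject.FromObject and reuse the overload above.
// Sketch only: wraps the existing List<JObject> logic for an arbitrary T.
static public DataTable ToDataTable<T>(List<T> list)
{
    var jObjects = list.Select(item => JObject.FromObject(item)).ToList();
    return ToDataTable(jObjects); // reuse the List<JObject> overload above
}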

Alternate of Class Object while creating formatted List

I have a DataTable which is being added to a List in a specific format. As per my requirement, I do not want to create the generic list from a class, while still preserving the format of my data, but I am not able to do it. Below is my code:
List<Data> datalist = new List<Data>();
for (int i = 0; i < dt.Rows.Count; i++)
{
    Data dd1 = new Data();
    dd1.ID = Convert.ToString(dt.Rows[i]["ID"]);
    dd1.STATUS = Convert.ToString(dt.Rows[i]["Name"]);
    dd1.TYPE = Convert.ToString(dt.Rows[i]["TYPE"]);
    datalist.Add(dd1);
}
Is there a way to remove the dependency on the Data class from the above code while keeping the format the same?
You can use a LINQ query on your dt like below and project your result into an anonymous type:
var datalist = (from r in dt.AsEnumerable()
                select new
                {
                    ID = r.Field<string>("ID"),
                    Name = r.Field<string>("Name"),
                    TYPE = r.Field<string>("TYPE"),
                }).ToList();
If you want to get the Name for a given ID from datalist:
string name = datalist.Where(x => x.ID == "123").FirstOrDefault()?.Name;
You could use a dictionary in its place. No more Data class dependency, but the same basic data and format!
var datalist = new List<IDictionary<string, string>>();
for (var i = 0; i < dt.Rows.Count; ++i)
{
    var data = new Dictionary<string, string>()
    {
        { "ID", Convert.ToString(dt.Rows[i]["ID"]) },
        { "STATUS", Convert.ToString(dt.Rows[i]["Name"]) },
        { "TYPE", Convert.ToString(dt.Rows[i]["TYPE"]) }
    };
    datalist.Add(data);
}
Then you'd just access the values datalist[i]["ID"] instead of datalist[i].ID.

How do I convert a BigQuery row to JSON using the C# API?

I am pulling some data from a BigQuery table using the code below in C#
BigQueryClient client = BigQueryClient.Create("<Project Name>");
BigQueryTable table = client.GetTable("<Database>", "Students");
string sql = $"select * FROM {table} where Marks='50'";
BigQueryResults results = client.ExecuteQuery(sql);
foreach (BigQueryRow row in results.GetRows())
{
}
I want to be able to either read the entire results variable into JSON or be able to get the JSON out of each row.
Of course, I could create a class that models the table, and inside the foreach loop read each row into a class object. The class object I could then serialize into JSON using a third-party library like Newtonsoft Json.NET.
Something like:
class Student {
    int id; // assume these are columns in the db
    string name;
}
My foreach would now look like:
foreach (BigQueryRow row in results.GetRows())
{
    Student s = new Student();
    s.id = Convert.ToInt32(row["id"]);
    s.name = Convert.ToString(row["name"]);
    // something like: string x = x + s.ToJSON(); // using Newtonsoft
}
This way string x will have the JSON generated and appended for each row.
Or is there a way I can just add each student to a collection or List and then get the JSON from the whole list?
Reading row by row and field by field seems tedious to me, and I feel there must be a simpler way. I did not see any support in Google BigQuery's C# library for converting directly to JSON; they did have something in Python.
If not, then the list-to-JSON approach would be better, but I am not sure if it is supported.
Update:
https://github.com/GoogleCloudPlatform/google-cloud-dotnet/blob/master/apis/Google.Cloud.BigQuery.V2/Google.Cloud.BigQuery.V2/BigQueryRow.cs
It looks like the BigQueryRow class has a RawRow field which is of type TableRow, and the class uses JSON references, so I am sure they have the data of the row in JSON format. How can I get access to it?
This might be a little late but you can use:
var latestResult = _bigQueryClient.ExecuteQuery($"SELECT TO_JSON_STRING(t) FROM `{ProjectId}.{DatasetId}.{TableName}` as t", null);
All columns will be serialized as json and placed in the first column on each row. You can then use something like Newtonsoft to parse each row easily.
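For example, a minimal sketch of consuming that result (assuming the Student class from the question; the single result column holds the whole row as JSON):
foreach (BigQueryRow row in latestResult)
{
    string json = (string)row[0]; // the TO_JSON_STRING(t) column
    var student = Newtonsoft.Json.JsonConvert.DeserializeObject<Student>(json);
}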
I ran into the same issue.
I am posting this solution which is not optimized for performance but very simple for multiple data types.
This allows you to deserialize almost anything:
public class BQ
{
    private string projectId = "YOUR_PROJECT_ID";

    public BQ()
    {
    }

    public List<T> Execute<T>(string sql)
    {
        var client = BigQueryClient.Create(projectId);
        List<T> result = new List<T>();
        try
        {
            string query = sql;
            BigQueryResults results = client.ExecuteQuery(query, parameters: null);
            List<string> fields = new List<string>();
            foreach (var col in results.Schema.Fields)
            {
                fields.Add(col.Name);
            }
            Dictionary<string, object> rowoDict;
            foreach (var row in results)
            {
                rowoDict = new Dictionary<string, object>();
                foreach (var col in fields)
                {
                    rowoDict.Add(col, row[col]);
                }
                string json = Newtonsoft.Json.JsonConvert.SerializeObject(rowoDict);
                T o = Newtonsoft.Json.JsonConvert.DeserializeObject<T>(json);
                result.Add(o);
            }
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex.ToString());
        }
        finally
        {
            client.Dispose();
            Console.WriteLine("Done.");
        }
        return result;
    }
}
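Usage might look like this (the project/dataset/table names are placeholders, and Student is the model from the question; it would need public properties for Json.NET to populate it):
var bq = new BQ();
List<Student> students = bq.Execute<Student>(
    "SELECT id, name FROM `your-project.your_dataset.Students` WHERE Marks = '50'");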
You can use Newtonsoft.Json. First, install the NuGet package via the Package Manager Console.
After installing, you can use it as in the following code:
List<Student> list = new List<Student>();
foreach (BigQueryRow row in results.GetRows())
{
    Student s = new Student();
    s.id = Convert.ToInt32(row["id"]);
    s.name = Convert.ToString(row["name"]);
    list.Add(s);
}
var jsonResult = Newtonsoft.Json.JsonConvert.SerializeObject(list);
I hope this can help you.
Here is the complete solution for casting BigQueryResults or GetQueryResultsResponse or QueryResponse data to Model/JSON format using C# reflection:
public List<T> GetBQAsModel<T>(string query) where T : class, new()
{
    var bqClient = GetBigqueryClient();
    var res = bqClient.ExecuteQuery(query, parameters: null);
    return GetModels<T>(res);
}

private List<T> GetModels<T>(BigQueryResults tableRows) where T : class, new()
{
    var lst = new List<T>();
    foreach (var item in tableRows)
    {
        var lstColumns = new T().GetType().GetProperties(BindingFlags.DeclaredOnly | BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic).ToList();
        var newObject = new T();
        for (var i = 0; i < item.RawRow.F.Count; i++)
        {
            var name = item.Schema.Fields[i].Name;
            PropertyInfo prop = lstColumns.FirstOrDefault(a => a.Name.ToLower().Equals(name.ToLower()));
            if (prop == null)
            {
                continue;
            }
            var val = item.RawRow.F[i].V;
            prop.SetValue(newObject, Convert.ChangeType(val, prop.PropertyType), null);
        }
        lst.Add(newObject);
    }
    return lst;
}

private List<T> GetModels<T>(GetQueryResultsResponse getQueryResultsResponse) where T : class, new()
{
    var lst = new List<T>();
    foreach (var item in getQueryResultsResponse.Rows)
    {
        var lstColumns = new T().GetType().GetProperties(BindingFlags.DeclaredOnly | BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic).ToList();
        var newObject = new T();
        for (var i = 0; i < item.F.Count; i++)
        {
            var name = getQueryResultsResponse.Schema.Fields[i].Name;
            PropertyInfo prop = lstColumns.FirstOrDefault(a => a.Name.ToLower().Equals(name.ToLower()));
            if (prop == null)
            {
                continue;
            }
            var val = item.F[i].V;
            prop.SetValue(newObject, Convert.ChangeType(val, prop.PropertyType), null);
        }
        lst.Add(newObject);
    }
    return lst;
}

private List<T> GetModels<T>(QueryResponse queryResponse) where T : class, new()
{
    var lst = new List<T>();
    foreach (var item in queryResponse.Rows)
    {
        var lstColumns = new T().GetType().GetProperties(BindingFlags.DeclaredOnly | BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic).ToList();
        var newObject = new T();
        for (var i = 0; i < item.F.Count; i++)
        {
            var name = queryResponse.Schema.Fields[i].Name;
            PropertyInfo prop = lstColumns.FirstOrDefault(a => a.Name.ToLower().Equals(name.ToLower()));
            if (prop == null)
            {
                continue;
            }
            var val = item.F[i].V;
            prop.SetValue(newObject, Convert.ChangeType(val, prop.PropertyType), null);
        }
        lst.Add(newObject);
    }
    return lst;
}
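A usage sketch (hypothetical model and query; note that Convert.ChangeType requires the property types to be convertible from the raw values BigQuery returns, and the property-name matching above is case-insensitive):
public class Student
{
    public int Id { get; set; }
    public string Name { get; set; }
}

var students = GetBQAsModel<Student>("SELECT id, name FROM `your-project.your_dataset.Students`");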
I would do something like this:
var res = results.GetRows().Select(x => new Student { id = Convert.ToInt32(x["id"]) }).ToList();
And then:
var js = JsonConvert.SerializeObject(res);
This way is much faster and clearer.

Convert datatable to list of generic type

How do I convert a DataTable into a list of a generic type? Below is the scenario.
I have a DataTable named table1 that contains the columns col1 and col2. How could we convert this table into a list of a type named table1bj (which can differ per DataTable name), with properties col1 and col2 whose data types are compatible with the DataTable column data types?
There are many posts on SO, but they deal with converting a DataTable into a predefined object list. In my case I have to generate the object and the list dynamically from the DataTable. Thanks.
Assuming that you've already created the class table1bj (consider making it uppercase per .NET naming conventions) with the two properties col1 and col2, you just have to use Enumerable.Select to create instances of this class and ToList to create a generic List<table1bj>:
List<table1bj> result = table1.AsEnumerable()
    .Select(row => new table1bj
    {
        col1 = row.Field<string>("col1"),
        col2 = row.Field<string>("col2")
    }).ToList();
I have also presumed that these properties are strings, otherwise use the correct type with the Field extension method. If you don't know the type you should stay with your DataTable since it's already an in-memory collection with dynamic types.
You can do it like this...
Create a class with properties:
public class table1bj
{
    public string col1 { get; set; }
    public string col2 { get; set; }
}
Convert the DataTable to the generic type (ToCollection here is a helper extension method, not part of the BCL; the ToList<T> extension further down this page is one way to implement the same idea):
List<table1bj> Objtable1bj = table1.ToCollection<table1bj>();
I know this question was asked a long time ago, but I also needed a solution for converting a DataTable to dynamic or generic types in one method and couldn't find an answer, so I'm posting mine.
You can use an extension method to convert a DataTable to any type, like below:
public static class Extension
{
    public static IList<T> ToList<T>(this DataTable dt, bool isFirstRowColumnsHeader = false) where T : new()
    {
        var results = new List<T>();
        if (dt != null && dt.Rows.Count > 0)
        {
            var columns = dt.Columns.Cast<DataColumn>().ToList();
            var rows = dt.Rows.Cast<DataRow>().ToList();
            var headerNames = columns.Select(col => col.ColumnName).ToList();

            // Find property names or column names
            if (isFirstRowColumnsHeader)
            {
                for (var i = 0; i < headerNames.Count; i++)
                {
                    if (rows[0][i] != DBNull.Value && !string.IsNullOrEmpty(rows[0][i].ToString()))
                        headerNames[i] = rows[0][i].ToString();
                }
                // Remove the first row because it is the header
                rows.RemoveAt(0);
            }

            // Create dynamic or anonymous objects for `T`
            if (typeof(T) == typeof(System.Dynamic.ExpandoObject) ||
                typeof(T) == typeof(System.Dynamic.DynamicObject) ||
                typeof(T) == typeof(System.Object))
            {
                var dynamicDt = new List<dynamic>();
                foreach (var row in rows)
                {
                    dynamic dyn = new ExpandoObject();
                    dynamicDt.Add(dyn);
                    for (var i = 0; i < columns.Count; i++)
                    {
                        var dic = (IDictionary<string, object>)dyn;
                        dic[headerNames[i]] = row[columns[i]];
                    }
                }
                return (dynamic)dynamicDt;
            }
            else // other types of `T`
            {
                var properties = typeof(T).GetProperties();
                if (columns.Any() && properties.Any())
                {
                    foreach (var row in rows)
                    {
                        var entity = new T();
                        for (var i = 0; i < columns.Count; i++)
                        {
                            if (!row.IsNull(columns[i]))
                            {
                                typeof(T).GetProperty(headerNames[i])? // ? -> a property named `headerNames[i]` may not exist on the entity, in which case this is null
                                    .SetValue(entity, row[columns[i]] == DBNull.Value ? null : row[columns[i]]);
                            }
                        }
                        results.Add(entity);
                    }
                }
            }
        }
        return results;
    }
}
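Usage might look like this (Customer is a hypothetical class whose property names match the column names; for the dynamic case, pass object and each element comes back as an ExpandoObject with one property per column):
IList<Customer> customers = dt.ToList<Customer>();

IList<object> dynamicRows = dt.ToList<object>();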
We can also do it via reflection; this method sets a class object's properties from a DataTable:
using System.Reflection;

public void SetObjectProperties(object objClass, DataTable dataTable)
{
    DataRow _dataRow = dataTable.Rows[0];
    Type objType = objClass.GetType();
    List<PropertyInfo> propertyList = new List<PropertyInfo>(objType.GetProperties());
    foreach (DataColumn dc in _dataRow.Table.Columns)
    {
        var _prop = propertyList.Where(a => a.Name == dc.ColumnName).Select(a => a).FirstOrDefault();
        if (_prop == null) continue;
        _prop.SetValue(objClass, Convert.ChangeType(_dataRow[dc], Nullable.GetUnderlyingType(_prop.PropertyType) ?? _prop.PropertyType), null);
    }
}
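For example (Person is a hypothetical class; note that the method only reads the first row of the table):
var person = new Person();          // hypothetical class with properties matching the column names
SetObjectProperties(person, dt);    // copies the values of dt.Rows[0] onto person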

How to convert DataTable to List<object>?

I have a strongly typed class PersonExport. I initially get data into a DataTable and call the following method on the DataTable to convert it to List<PersonExport>:
public static List<T> ConvertToList<T>(DataTable dt, out string message)
{
    message = string.Empty;
    var list = new List<T>();
    try
    {
        var columnNames = dt.Columns.Cast<DataColumn>()
            .Select(c => c.ColumnName)
            .ToList();
        var properties = typeof(T).GetProperties();
        list = dt.AsEnumerable().Select(row =>
        {
            var objT = Activator.CreateInstance<T>();
            foreach (var pro in properties)
            {
                if (columnNames.Contains(pro.Name))
                {
                    var value = row[pro.Name];
                    var typeName = value.GetType().FullName;
                    if (typeName == "MySql.Data.Types.MySqlDateTime")
                    {
                        var mySqlDateTime = (MySqlDateTime)value;
                        if (mySqlDateTime.IsValidDateTime)
                        {
                            value = Convert.ToDateTime(mySqlDateTime.ToString());
                            pro.SetValue(objT, value, null);
                        }
                    }
                    else
                    {
                        pro.SetValue(objT, row.IsNull(pro.Name) ? null : value, null);
                    }
                }
            }
            return objT;
        }).ToList();
    }
    catch (Exception ex)
    {
        message = (ex.InnerException != null) ? ex.InnerException.Message : ex.Message;
    }
    return list;
}
However, once I start removing columns from the returned DataTable, it no longer works, because the columns in the DataTable don't match up with the properties of the PersonExport list.
I am eventually using the exported list here to export to Excel, but it is not working since I have modified my DataTable and it can't deserialize into a List<PersonExport>:
//Trying to get data into List<object>
List<object> persons = GetPersonExport(query, out message);
var exportData = new Dictionary<string, List<object>> { { "xldata", persons} };
//Deserialize to List<object> to export
var persons = JsonConvert.DeserializeObject<List<object>>(args["xldata"]);
The above line just returns a List of empty objects.
A few things got me thinking, but I am wondering what might be the best approach. I am using the EPPlus library to export data to Excel, and it has the option of hiding columns. Would it be better to just export the whole object and hide the columns you don't want (that way you avoid the anonymous type), or should I still get the whole object, convert it to a DataTable, and then remove the columns? Thoughts?
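(For reference, hiding a column with EPPlus is a one-liner; a minimal sketch, assuming EPPlus is referenced and persons is a strongly typed List<PersonExport>:)
using (var package = new OfficeOpenXml.ExcelPackage())
{
    var ws = package.Workbook.Worksheets.Add("Export");
    ws.Cells["A1"].LoadFromCollection(persons, true);  // export the whole object, headers included
    ws.Column(3).Hidden = true;                        // hide a column you don't want visible
    package.SaveAs(new System.IO.FileInfo("export.xlsx"));
}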
All that you want is:
public IEnumerable<object> GetListOfObject()
{
    foreach (var prod in TenMostExpensiveProducts().Tables[0].AsEnumerable())
    {
        yield return prod;
    }
}
Or:
TenMostExpensiveProducts().Tables[0].AsEnumerable().Select (x => x).ToList<object>()
But you can get it to work more elegantly via LINQ, like this:
from prod in TenMostExpensiveProducts().Tables[0].AsEnumerable()
where prod.Field<decimal>("UnitPrice") > 62.500M
select prod
Or like this (AsDynamic is called directly on DataSet):
TenMostExpensiveProducts().AsDynamic().Where (x => x.UnitPrice > 62.500M)
I prefer the last approach, as it is the most flexible.
P.S.: Don't forget to add a reference to System.Data.DataSetExtensions.dll.
List<Dictionary<string, object>> rows = new List<Dictionary<string, object>>();
Dictionary<string, object> row;
foreach (DataRow dr in dt.Rows)
{
    row = new Dictionary<string, object>();
    foreach (DataColumn col in dt.Columns)
    {
        row.Add(col.ColumnName, dr[col]);
    }
    rows.Add(row);
}
var jSon = new System.Web.Script.Serialization.JavaScriptSerializer();
StringBuilder sbRes = new StringBuilder();
jSon.Serialize(rows, sbRes);
string ret = sbRes.ToString();
