LoadFromCollection horizontally - c#

Using EPPlus I want to load data horizontally.
var randomData = new[] { "Foo", "Bar", "Baz" }.ToList();
ws.Cells["B4"].LoadFromCollection(randomData);
The default behaviour is vertical, so this code puts Foo, Bar and Baz in B4, B5 and B6. What I need is for them to go across the row instead: B4, C4 and D4.
A downside of EPPlus is that its documentation is sketchy.

What if you did something like this:
var randomData = new[] { "Foo", "Bar", "Baz" }.ToList();
//ws.Cells["B4"].LoadFromCollection(randomData);
ws.Cells["B4"].LoadFromArrays(new List<string[]>(new[] { randomData.ToArray() }));
Which gives me this in the output: Foo, Bar and Baz across row 4 (B4, C4, D4).
Bear in mind that if you are concerned about performance, say with very large collections, you are better off writing your own code anyway as the LoadFrom* methods do add overhead to account for multiple scenarios.

If I am condemned to looping myself, I can write the code:
public byte[] TestExcellGeneration_HorizontalLoadFromCollection()
{
    byte[] result = null;
    using (ExcelPackage pck = new ExcelPackage())
    {
        var foo = pck.Workbook.Worksheets.Add("Foo");
        var randomData = new[] { "Foo", "Bar", "Baz" }.ToList();
        //foo.Cells["B4"].LoadFromCollection(randomData);
        int startColumn = 2; // "B"
        int startRow = 4;
        for (int i = 0; i < randomData.Count; i++)
        {
            foo.Cells[startRow, startColumn + i].Value = randomData[i];
        }
        result = pck.GetAsByteArray();
    }
    return result;
}
And when you call this from a TestMethod:
[TestMethod]
public void TestExcellGeneration_HorizontalLoadFromCollection()
{
    var excelFileBytes = (new MyExcelGenerator()).TestExcellGeneration_HorizontalLoadFromCollection();
    OpenExcelFromTempFile(excelFileBytes);
}

private void OpenExcelFromTempFile(byte[] data)
{
    string tempPath = System.IO.Path.GetTempFileName();
    System.IO.File.WriteAllBytes(tempPath, data);
    Application excelApplication = new Application(); // Microsoft.Office.Interop.Excel
    _Workbook excelWorkbook;
    excelWorkbook = excelApplication.Workbooks.Open(tempPath);
    excelApplication.Visible = true;
}
It results in Foo, Bar and Baz across row 4 (B4, C4, D4), as required.

Here is an extension method:
public static void LoadFromCollectionHorizontally<T>(this ExcelWorksheet excelWorksheet, IEnumerable<T> objects, string cellAddress = "A1")
{
    List<object[]> valuesHorizontally = new List<object[]>();
    if (typeof(T).IsClass)
    {
        var properties = typeof(T)
            .GetProperties(BindingFlags.Instance | BindingFlags.Public)
            .Where(p => !Attribute.IsDefined(p, typeof(EpplusIgnore)));
        foreach (var prop in properties)
        {
            var values = new List<object>();
            foreach (T item in objects)
            {
                values.Add(prop.GetValue(item));
            }
            valuesHorizontally.Add(values.ToArray());
        }
    }
    else
    {
        valuesHorizontally.Add(objects.Cast<ValueType>().ToArray());
    }
    var startingCellRange = excelWorksheet.Cells[cellAddress];
    var filledUpCellRange = startingCellRange.LoadFromArrays(valuesHorizontally);
    ...
}
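For completeness, a minimal usage sketch. It assumes the extension method above is compiled into the project; the Person type is a made-up example, not from the original post. For classes, the method writes one row per property and one column per collection item:

public class Person
{
    public string Name { get; set; }
    public int Age { get; set; }
}

var people = new List<Person>
{
    new Person { Name = "Foo", Age = 30 },
    new Person { Name = "Bar", Age = 40 },
    new Person { Name = "Baz", Age = 50 }
};
// Starting at B4: one row per property (Name, then Age), one column per person.
ws.LoadFromCollectionHorizontally(people, "B4");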

Related

How to read data from any file and insert to class object using c#

I am reading data from a CSV file and trying to add it to an object, but when I add a new record while looping through the records, the data in the object gets overridden.
public static void Process()
{
    var filePath = @"ABC.csv";
    var header = false;
    var lines = System.IO.File.ReadAllLines(filePath);
    var element = new List<Content>();
    var mapTo = new Content();
    for (var i = 0; i < lines.Length; i++)
    {
        var t = lines[i];
        if (!header)
        {
            header = true;
        }
        else
        {
            var data = t.Split(',');
            mapTo.Key = data[0];
            mapTo.Name = data[1];
            mapTo.Number = data[2];
            mapTo.Date = data[3];
            element.Add(mapTo);
        }
    }
    var result = element;
}
So here, when I try to add mapTo to element, all the content in the element variable gets overridden. Can anybody suggest what I am doing wrong here?
Move var mapTo = new Content(); into the for loop. Otherwise you have the same instance of Content for every loop iteration, which gets overwritten by the assignments inside the loop.
public static void Process()
{
    var filePath = @"ABC.csv";
    var header = false;
    var lines = System.IO.File.ReadAllLines(filePath);
    var element = new List<Content>();
    for (var i = 0; i < lines.Length; i++)
    {
        var mapTo = new Content(); // here
        var t = lines[i];
        if (!header)
        {
            header = true;
        }
        else
        {
            var data = t.Split(',');
            mapTo.Key = data[0];
            mapTo.Name = data[1];
            mapTo.Number = data[2];
            mapTo.Date = data[3];
            element.Add(mapTo);
        }
    }
    var result = element;
}
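For reference, a minimal Content class these snippets assume; the property types are a guess, since the original post does not show the class:

public class Content
{
    public string Key { get; set; }
    public string Name { get; set; }
    public string Number { get; set; }
    public string Date { get; set; }
}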

Assigning values to array elements based on a look up table

Hi, I am writing a C# program where I need to populate an array based on a lookup table and a set of string arrays with metadata. My lookup table looks like this (table with key: transmitter, value: array of receivers):
{
LED1: ["px1","px2","px3"],
LED2: ["px4","px5","px6"]
}
and my meta arrays look like this (they are dynamic; this is just an example, the data comes back from a DB query):
var transmitters = new string[] { "LED1", "LED2" };
var receivers = new string[] { "px1", "px2", "px3", "px4", "px5", "px6" };
My requirement is:
If a transmitter (LED1, LED2, or any other) is present in the lookup table, its value (e.g. ["px1","px2","px3"]) has to be compared against the receivers array, and the transmitter plus every matching receiver has to be marked yellow.
An orphan transmitter or receiver has to be marked red.
Example
LookUp
{
LED1: ["px1", "px2", "px3"],
LED2: ["px5", "px8"]
}
Transmitters and receivers
var transmitters = new string[] { "led1", "led2" };
var receivers = new string[] { "px1", "px2", "px3", "px4", "px5", "px6" };
the result should be a list like this:
led1-yellow
px1-yellow
px2-yellow
px3-yellow
led2-yellow
px5-yellow
px4-red
px6-red.
I have written code that works
public class Program
{
    public static void Main()
    {
        var transmitters = new string[] { "led1", "led2" };
        var receivers = new string[] { "px1", "px2", "px3", "px4", "px5", "px6" };
        var lookup = new Dictionary<string, string[]>()
        {
            { "led1", new string[] { "px1", "px2", "px3" } },
            { "led2", new string[] { "px5", "px8" } }
        };
        var blocks = new List<Block>();
        var blocksTracker = new List<string>();
        foreach (var transmitter in transmitters)
        {
            if (lookup.ContainsKey(transmitter))
            {
                var receiverLookup = lookup[transmitter];
                var intersection = receivers.Intersect(receiverLookup).ToArray();
                if (intersection.Length > 0)
                {
                    blocks.Add(new Block() { Id = transmitter, status = "yellow" });
                    blocksTracker.Add(transmitter);
                    foreach (var receiver in intersection)
                    {
                        blocks.Add(new Block() { Id = receiver, status = "yellow" });
                        blocksTracker.Add(receiver);
                    }
                }
                else
                {
                    blocks.Add(new Block() { Id = transmitter, status = "red" });
                    blocksTracker.Add(transmitter);
                }
            }
        }
    }
}
I am new to C# and I wanted to know if there is a better way of doing this. Please help. You can see the working fiddle here.
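One possible tidier version, sketched under the assumption of the same Block class (Id and status) used above; note it also marks orphan receivers red, which the loop above does not yet do:

var matchedReceivers = new HashSet<string>();
var blocks = new List<Block>();
foreach (var transmitter in transmitters)
{
    if (lookup.TryGetValue(transmitter, out var receiverLookup))
    {
        var intersection = receivers.Intersect(receiverLookup).ToList();
        blocks.Add(new Block() { Id = transmitter, status = intersection.Any() ? "yellow" : "red" });
        foreach (var receiver in intersection)
        {
            blocks.Add(new Block() { Id = receiver, status = "yellow" });
            matchedReceivers.Add(receiver);
        }
    }
    else
    {
        // Transmitter missing from the lookup table: orphan.
        blocks.Add(new Block() { Id = transmitter, status = "red" });
    }
}
// Receivers that never matched any transmitter are orphans too.
foreach (var orphan in receivers.Except(matchedReceivers))
{
    blocks.Add(new Block() { Id = orphan, status = "red" });
}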

foreach and index in .ToDictionary C#

I am web-scraping some data and trying to write the scraped data to a JSON file using C# and Newtonsoft.Json.
I am stuck trying to put a foreach inside my .ToDictionary call, and I also cannot increment an index inside the .ToDictionary call.
My class:
public class JsonParametersData
{
    public bool required { get; set; }
    public bool list { get; set; }
    public List<string> options { get; set; }
}
My arrays
var jsonData = new List<Dictionary<string, Dictionary<string, JsonParametersData>>>();
var moduleParameters = new List<string>();
var parameterOptionsArray = new List<List<string>>();
var parameterOptions = new List<string>();
var requiredArray = new List<bool>();
var listArray = new List<bool>();
string moduleName = item.Attributes["href"].Value.Replace("_module.html", "");
The commented-out code shows what I am trying to do.
int index = 0;
jsonData.Add(new Dictionary<string, Dictionary<string, JsonParametersData>>()
{
    {
        moduleName,
        moduleParameters
            .ToDictionary(n => n,
                          n => new JsonParametersData
                          {
                              required = requiredArray[index],
                              list = listArray[index],
                              options = new List<string>() { "option1", "option2" },
                              /*
                              foreach (var parameteroption in parameterOptionsArray[index])
                              {
                                  options.Add(parameteroption);
                              }
                              index++;
                              */
                          })
    }
});
string json = JsonConvert.SerializeObject(jsonData.ToArray());
//write string to file
System.IO.File.WriteAllText(@"path", json);
Your parameterOptionsArray is not an Array, but a List of lists.
The thing is that parameterOptionsArray[index] is a List, not a string. So you should use AddRange() instead of Add().
parameterOptionsArray.ForEach(parameteroption => options.AddRange(parameteroption));
As I've written in the comments, you can only make assignments in an object initializer. Thus the following is allowed:
var a = new { MyMember = anInstance }
whilst this is not:
var a = new { MyMember = anInstance, anInstance.DoSomething() };
That's one of those cases where you should not use LINQ at all, as it creates more confusion than it helps. Instead use a good old-fashioned loop:
int index = 0;
var innerDict = new Dictionary<string, JsonParametersData>();
foreach (var name in moduleParameters)
{
    innerDict[name] = new JsonParametersData
    {
        required = requiredArray[index],
        list = listArray[index],
        options = new List<string>() { "option1", "option2" },
    };
    innerDict[name].options.AddRange(parameterOptionsArray[index]);
    index++;
}
var dict = new Dictionary<string, Dictionary<string, JsonParametersData>>();
dict[moduleName] = innerDict;
jsonData.Add(dict);
string json = JsonConvert.SerializeObject(jsonData.ToArray());
You appear to have a jagged array in parameterOptionsArray. You can make use of SelectMany here. Perhaps the following sample can help:
string[][] parameterOptionsArray = new string[2][];
parameterOptionsArray[0] = new string[2];
parameterOptionsArray[0][0] = "1";
parameterOptionsArray[0][1] = "2";
parameterOptionsArray[1] = new string[2];
parameterOptionsArray[1][0] = "3";
parameterOptionsArray[1][1] = "4";
var testing = new {options = parameterOptionsArray.SelectMany(x => x).ToList()};
testing.options.ForEach(x => Console.WriteLine(x));

How do I convert a BigQuery row to JSON using the C# API?

I am pulling some data from a BigQuery table using the code below in C#
BigQueryClient client = BigQueryClient.Create("<Project Name>");
BigQueryTable table = client.GetTable("<Database>", "Students");
string sql = $"select * FROM {table} where Marks='50'";
BigQueryResults results = client.ExecuteQuery(sql);
foreach (BigQueryRow row in results.GetRows())
{
}
I want to be able to either read the entire results variable into JSON or be able to get the JSON out of each row.
Of course, I could create a class that models the table, and inside the foreach loop read each row into an instance of that class. That object I could then serialize to JSON using a third-party library like Newtonsoft.Json.
Something like:
class Student {
    int id; // assume these are columns in the db
    string name;
}
My foreach would now look like:
foreach (BigQueryRow row in results.GetRows())
{
    Student s = new Student();
    s.id = Convert.ToString(row["id"]);
    s.name = Convert.ToString(row["name"]);
    // something like string x = x + s.toJSON(); // using Newtonsoft
}
This way string x will have the JSON generated and appended for each row.
Or is there a way I can just add each student to a collection or List and then get the JSON from the whole list?
Reading row by row and field by field seems tedious to me; I feel there must be a simpler way. I did not see any support in the Google BigQuery C# library for converting directly to JSON; they do have something for Python.
If not, then going from the list to JSON would be better, but I am not sure whether that is supported.
Update :
https://github.com/GoogleCloudPlatform/google-cloud-dotnet/blob/master/apis/Google.Cloud.BigQuery.V2/Google.Cloud.BigQuery.V2/BigQueryRow.cs
It looks like the BigQueryRow class has a RawRow field of type TableRow, and the class uses JSON references, so I am sure the row data exists in JSON form. How can I get at it?
This might be a little late but you can use:
var latestResult = _bigQueryClient.ExecuteQuery($"SELECT TO_JSON_STRING(t) FROM `{ProjectId}.{DatasetId}.{TableName}` as t", parameters: null);
All columns will be serialized as json and placed in the first column on each row. You can then use something like Newtonsoft to parse each row easily.
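A minimal sketch of consuming that result, assuming the query above where each row has a single column containing the JSON text:

foreach (BigQueryRow row in latestResult)
{
    // Column 0 is the TO_JSON_STRING output for the whole row.
    var obj = Newtonsoft.Json.Linq.JObject.Parse((string)row[0]);
    Console.WriteLine(obj["name"]); // "name" is a hypothetical column; use your own schema
}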
I ran into the same issue.
I am posting this solution, which is not optimized for performance but is very simple for multiple data types.
It allows you to deserialize (almost) anything:
public class BQ
{
    private string projectId = "YOUR_PROJECT_ID";

    public BQ()
    {
    }

    public List<T> Execute<T>(string sql)
    {
        var client = BigQueryClient.Create(projectId);
        List<T> result = new List<T>();
        try
        {
            string query = sql;
            BigQueryResults results = client.ExecuteQuery(query, parameters: null);
            List<string> fields = new List<string>();
            foreach (var col in results.Schema.Fields)
            {
                fields.Add(col.Name);
            }
            Dictionary<string, object> rowoDict;
            foreach (var row in results)
            {
                rowoDict = new Dictionary<string, object>();
                foreach (var col in fields)
                {
                    rowoDict.Add(col, row[col]);
                }
                string json = Newtonsoft.Json.JsonConvert.SerializeObject(rowoDict);
                T o = Newtonsoft.Json.JsonConvert.DeserializeObject<T>(json);
                result.Add(o);
            }
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex.ToString());
        }
        finally
        {
            client.Dispose();
            Console.WriteLine("Done.");
        }
        return result;
    }
}
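A hypothetical usage sketch, assuming a Student class whose property names match the query's column names and a placeholder table path:

var students = new BQ().Execute<Student>("select * FROM `myproject.mydataset.Students` where Marks = '50'");
string json = Newtonsoft.Json.JsonConvert.SerializeObject(students);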
You can use Newtonsoft.Json. First install the NuGet package, for example from the Package Manager Console.
After installing it you can use it as in the following code:
List<Student> list = new List<Student>();
foreach (BigQueryRow row in results.GetRows())
{
    Student s = new Student();
    s.id = Convert.ToString(row["id"]);
    s.name = Convert.ToString(row["name"]);
    list.Add(s);
}
var jsonResult = Newtonsoft.Json.JsonConvert.SerializeObject(list);
I hope this can help you.
Here is the complete solution for casting BigQueryResults or GetQueryResultsResponse or QueryResponse data to Model/JSON format using C# reflection:
public List<T> GetBQAsModel<T>(string query) where T : class, new()
{
    var bqClient = GetBigqueryClient();
    var res = bqClient.ExecuteQuery(query, parameters: null);
    return GetModels<T>(res);
}

private List<T> GetModels<T>(BigQueryResults tableRows) where T : class, new()
{
    var lst = new List<T>();
    foreach (var item in tableRows)
    {
        var lstColumns = new T().GetType().GetProperties(BindingFlags.DeclaredOnly | BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic).ToList();
        var newObject = new T();
        for (var i = 0; i < item.RawRow.F.Count; i++)
        {
            var name = item.Schema.Fields[i].Name;
            PropertyInfo prop = lstColumns.FirstOrDefault(a => a.Name.ToLower().Equals(name.ToLower()));
            if (prop == null)
            {
                continue;
            }
            var val = item.RawRow.F[i].V;
            prop.SetValue(newObject, Convert.ChangeType(val, prop.PropertyType), null);
        }
        lst.Add(newObject);
    }
    return lst;
}

private List<T> GetModels<T>(GetQueryResultsResponse getQueryResultsResponse) where T : class, new()
{
    var lst = new List<T>();
    foreach (var item in getQueryResultsResponse.Rows)
    {
        var lstColumns = new T().GetType().GetProperties(BindingFlags.DeclaredOnly | BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic).ToList();
        var newObject = new T();
        for (var i = 0; i < item.F.Count; i++)
        {
            var name = getQueryResultsResponse.Schema.Fields[i].Name;
            PropertyInfo prop = lstColumns.FirstOrDefault(a => a.Name.ToLower().Equals(name.ToLower()));
            if (prop == null)
            {
                continue;
            }
            var val = item.F[i].V;
            prop.SetValue(newObject, Convert.ChangeType(val, prop.PropertyType), null);
        }
        lst.Add(newObject);
    }
    return lst;
}

private List<T> GetModels<T>(QueryResponse queryResponse) where T : class, new()
{
    var lst = new List<T>();
    foreach (var item in queryResponse.Rows)
    {
        var lstColumns = new T().GetType().GetProperties(BindingFlags.DeclaredOnly | BindingFlags.Instance | BindingFlags.Public | BindingFlags.NonPublic).ToList();
        var newObject = new T();
        for (var i = 0; i < item.F.Count; i++)
        {
            var name = queryResponse.Schema.Fields[i].Name;
            PropertyInfo prop = lstColumns.FirstOrDefault(a => a.Name.ToLower().Equals(name.ToLower()));
            if (prop == null)
            {
                continue;
            }
            var val = item.F[i].V;
            prop.SetValue(newObject, Convert.ChangeType(val, prop.PropertyType), null);
        }
        lst.Add(newObject);
    }
    return lst;
}
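Again a hypothetical usage sketch for the reflection-based helper, assuming the Student model from the question and a placeholder table path:

var students = GetBQAsModel<Student>("select * FROM `myproject.mydataset.Students` where Marks = '50'");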
I would do something like this:
var res = results.GetRows().Select(x => new Student() { id = Convert.ToInt32(x["id"]) }).ToList();
And then:
var js = JsonConvert.SerializeObject(res);
This way is much faster and clearer.

Setting null in list cause null value in other list

I have two lists from which I want to get the differing items.
SearchElement[] criteria = new SearchElement[] {
    new SearchElement
    {
        Comparison = "=",
        FieldName = "CableProperty.ProjectId",
        FieldValue = int.Parse(comboBoxSource.SelectedValue.ToString()),
        LogicalOperator = ""
    }
};
sourceCables = client.GetCables(criteria, null, "Cores,CableProperty,CableProperty.CableApplication").ToList();

criteria = new SearchElement[] {
    new SearchElement
    {
        Comparison = "=",
        FieldName = "CableProperty.ProjectId",
        FieldValue = int.Parse(comboBoxDestination.SelectedValue.ToString()),
        LogicalOperator = ""
    }
};
destinationCables = client.GetCables(criteria, null, "Cores,CableProperty,CableProperty.CableApplication").ToList();

diffCables = sourceCables.Except(destinationCables, new CableComparer())
                         .ToList();
Now I have the differing items in diffCables. Sometimes I want to set
diffCable.CableProperty.CableApplication = null;
but when I do that, all the navigation properties (CableApplication) in the source list are also set to null.
This is the code:
if (destinationCableApplications.Contains(diffCable.CableProperty.CableApplication, new CableApplicationComparer()))
{
    criteria = new SearchElement[] { new SearchElement { Comparison = "=", FieldName = "ProjectId", FieldValue = int.Parse(comboBoxDestination.SelectedValue.ToString()), LogicalOperator = "" } };
    cableApplication = client.GetCableApplications(criteria, null, "").SingleOrDefault();
    diffCable.CableProperty.CableApplication = null;
}
Exactly after this line
diffCable.CableProperty.CableApplication = null;
all of
sourceCables[0].CableProperty.CableApplication
sourceCables[1].CableProperty.CableApplication
.....
sourceCables[100].CableProperty.CableApplication
are set to null.
What should I do so that I do not lose the navigation property in the source list when I set the navigation property in diffCable to null?
The easiest way is to deep-clone the list using a MemoryStream and BinaryFormatter.
Here is a sample:
using System;
using System.Collections.Generic;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

[Serializable]
public class temp
{
    public int a;
}

class Program
{
    public static T DeepClone<T>(T a)
    {
        using (MemoryStream stream = new MemoryStream())
        {
            BinaryFormatter formatter = new BinaryFormatter();
            formatter.Serialize(stream, a);
            stream.Position = 0;
            return (T)formatter.Deserialize(stream);
        }
    }

    static void Main(string[] args)
    {
        List<temp> list1 = new List<temp>();
        list1.Add(new temp { a = 1 });
        list1.Add(new temp { a = 2 });
        list1.Add(new temp { a = 3 });

        List<temp> list2 = DeepClone<List<temp>>(list1);
        list1[1].a = 4;
        Console.WriteLine(list2[1].a); // prints 2: list2 is fully independent of list1
        Console.ReadKey();
    }
}
Note: the class must be marked [Serializable].
This works for value types and for reference types that are serializable.
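Applied to the question, a hedged sketch (assuming the cable entity classes, and everything they reference, are marked serializable):

var diffCablesCopy = DeepClone(diffCables);
diffCablesCopy[0].CableProperty.CableApplication = null; // sourceCables is no longer affected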
You are doing a copy by reference without realising it. Look into cloning your objects or creating new lists:
List<int> newCopyList = new List<int>(originalList);
Bear in mind that constructing a new list like this copies only the element references when the elements are reference types, so for objects such as the cables above you would still need a deep clone (like the MemoryStream approach).
