Issue while sorting Combo Box - c#

I'm working on a task where I need to populate a ComboBox from a List&lt;T&gt;, and the List is populated from an employee.dat file. I populate the combo successfully, but I'm running into an issue while sorting it: the values I'm sorting are numbers stored as strings. They display in order from 0-9, but once the IDs go beyond 9 the combo looks goofy.
Here is the screenshot.
What I need is to sort these values numerically, meaning 10 should come after 9.
Here is the code snippet I have tried so far.
private void FormDelete_Load(object sender, EventArgs e)
{
    //LOAD EVENT for FORM
    //Clear all form controls
    Reset();

    //Fill the ID Combo box using data in List<T>
    comboBoxID.Items.Clear();
    foreach (MyClass.Employee obj in MyClass.listEmployees)
    {
        comboBoxID.Items.Add(obj.ID);
    }

    //Sort the combo box in ascending order
    comboBoxID.Sorted = true;
}
Employee.cs
public class Employee
{
    public int ID;
    public string fName;
    public string lName;
    public string gender;
    public int age;
    public double hourWage;
}

public static List<Employee> listEmployees = new List<Employee>();
Employee.dat
1, Ann, Crowe, F, 34, 12.95
2, Bob, Costas Jr., M, 27, 8.75
3, Sue, Soppala, F, 22, 7.95
4, Bill, Barton, M, 45, 15.25
5, Jill, Jordan, F, 33, 14.75
6, Art, Ayers, M, 33, 14.75
7, Larry, Stooge, M, 55, 21.05
8, Art, Ayers, M, 33, 14.75
9, Larry, Stooge, M, 55, 21.05
10, Art, Ayers, M, 33, 14.75
11, Larry, Stooge, M, 55, 21.05
List Populating code
if (File.Exists("employee.data"))
{
    try
    {
        streamEmployee = new StreamReader("employee.data");
        string line;                             //to read a line from the text file
        string[] arrFields = new string[5];
        while (streamEmployee.Peek() > -1)
        {
            line = streamEmployee.ReadLine();    //read a line of records from file
            arrFields = line.Split(',');         //split the line at the comma junction, and save split fields in array
            MyClass.Employee objEmp = new MyClass.Employee();   //create a "specific" employee object instance

            //Assign each field from line as generic object's properties to make it "specific"
            objEmp.ID = Convert.ToInt32(arrFields[0].Trim());        //ID is integer
            objEmp.fName = arrFields[1].Trim();
            objEmp.lName = arrFields[2].Trim();
            objEmp.gender = arrFields[3].Trim();
            objEmp.age = Convert.ToInt32(arrFields[4].Trim());       //age is integer
            objEmp.hourWage = Convert.ToDouble(arrFields[5].Trim()); //hourly wage is double

            //Add this specific employee object to the List
            MyClass.listEmployees.Add(objEmp);
        } //while
    } //try
    catch (IOException err)
    {
        MessageBox.Show(err.Message);
        error = true;
    } //catch
    finally
    {
        if (streamEmployee != null) //it is indeed representing a file and may not be closed
            streamEmployee.Close();
    } //finally
} //if

Your file is already sorted, so there is no need to set Sorted = true. In fact, Sorted is what breaks the order: the ComboBox sorts its items by their string representation, which is why "10" and "11" land between "1" and "2".
Anyway, if you want to be sure that, whatever order is present in the input file, your code adds the employees in ID order, you could change your loop to
foreach (MyClass.Employee obj in MyClass.listEmployees.OrderBy(x => x.ID))
{
    comboBoxID.Items.Add(obj.ID);
}
Again, there is no need to set the Sorted property to true.
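Putting the pieces together, a minimal sketch of the load handler (reusing the Reset, comboBoxID and MyClass.listEmployees names from the question, and assuming a using System.Linq; directive for OrderBy) might look like this:
private void FormDelete_Load(object sender, EventArgs e)
{
    //Clear all form controls
    Reset();

    //Fill the ID combo box in numeric ID order
    comboBoxID.Items.Clear();
    foreach (MyClass.Employee obj in MyClass.listEmployees.OrderBy(x => x.ID))
    {
        comboBoxID.Items.Add(obj.ID);
    }

    //Do NOT set comboBoxID.Sorted = true here; that would re-sort the items
    //as strings and put "10" before "2" again.
}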

Related

How to sort an array of strings of zeros and ones?

I am trying to sort an array of ints by its binary form. For example, if I have {1, 2, 3} I must convert them to binary and then sort them based on how many 1s each has: the number with only one 1 comes first, then the number that has two 1s in its binary form, and so on.
I tried to use the following two methods, but I couldn't get them to work because of errors I don't understand. I searched all of Stack Overflow but can't figure it out; I've been stuck on this problem for hours.
public static int[] xx = { 1, 2, 3, 4, 5, 6, 87, 8, 9, 7, 5, 24, 58, 63, 10, 11, 87, 20, 22, 90, 14, 17, 19, 21, 18, 24 };

public string[] ChangeToBinary(int[] data)
{
    string[] binary = new string[data.Length];
    var i = 0;
    foreach (int d in data)
    {
        string newvalue = Convert.ToString(d, 2);
        binary[i] = newvalue;
        i++;
    }
    return binary;
}

public int[] Sort(string[] binaries)
{
    Array.Sort(binaries, (x, y) => x.Count(z => z == '1') > y.Count(e => e == '1'));
    string[] sorted = binaries.OrderBy(b => b.Count(z => z == '1') > b.Count(e => e == '1'));
}
I know the two lines inside the Sort method are wrong in some way that I don't get. Can someone tell me how to do it? I'm trying to sort the elements of the array by the number of 1s in each element, fewest first.
I won't use both lines; I only put both in to show what I wanted to use.
Thanks in advance.
With the help of Convert and LINQ you can easily count the 1s:
int value = ...
int ones = Convert
    .ToString(value, 2)    // Binary representation
    .Count(c => c == '1'); // Count 1's
Having this done, let's sort the array:
int[] xx = ...
int[] sorted = xx
    .OrderBy(value => Convert          // criterion: see the code above
        .ToString(value, 2)
        .Count(c => c == '1'))
    .ThenBy(value => value)            // on a tie, e.g. 0b101 and 0b110, fall back to the value itself
    .ToArray();
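As a quick illustration of the ordering (my own example, not part of the original answer): 4 is 100 in binary (one 1), 3 is 11 (two 1s) and 7 is 111 (three 1s), so { 3, 4, 7 } sorts to { 4, 3, 7 }:
int[] demo = { 3, 4, 7 };
int[] demoSorted = demo
    .OrderBy(value => Convert.ToString(value, 2).Count(c => c == '1'))
    .ThenBy(value => value)
    .ToArray();
// demoSorted is { 4, 3, 7 }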
There are two different approaches to custom sorting:
You provide a comparison between any two elements. This is what Array.Sort takes. The comparison has to return an int with a value that's negative, zero or positive to indicate whether the first element is less than, equal to or greater than the second element. (Your lambda expression currently returns bool because you're using >.)
You provide a projection from one element, and the sort algorithm compares the results of that projection. That's what OrderBy does - but the lambda expression you're providing in the OrderBy compares the count of one element with the count from the same element.
I would suggest using the OrderBy approach, because that's much simpler. You're just trying to order by the count, so you just need to count the 1s in the element you're provided, and return that count:
string[] sorted = binaries.OrderBy(b => b.Count(z => z == '1')).ToArray();
Now your method is meant to return an int[] at the moment. Presumably it would convert each binary string back to an int. I'd suggest not doing that - instead change your Sort method to return the sorted string[]. You can then perform the conversion in another method, whether that's a separate method or the calling method.
That's the approach I'd take first, as the minimal change to your code - and as a way of understanding what's going on in the Sort and OrderBy methods. You can then potentially change to order the integers directly without ever directly creating an array of strings, as Dmitry suggests.
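For completeness, here is a sketch of the first (comparison-based) approach described above, using int.CompareTo to produce the required negative/zero/positive result; the sample array is only illustrative, and Count needs a using System.Linq; directive:
//Sort the binary strings in place by their count of '1' characters.
//CompareTo returns a negative, zero or positive int, which is what Array.Sort expects.
string[] binaries = { "111", "100", "11" };
Array.Sort(binaries, (x, y) => x.Count(c => c == '1').CompareTo(y.Count(c => c == '1')));
// binaries is now { "100", "11", "111" }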
A solution for those of us who are just tip-toeing around LINQ. It is more code, but remember that the LINQ is looping underneath anyway.
public static int[] xx = { 1, 2, 3, 4, 5, 6, 87, 8, 9, 7, 5, 24, 58, 63, 10, 11, 87, 20, 22, 90, 14, 17, 19, 21, 18, 24 };

public class MyBinary
{
    public string BinString { get; set; }
    public int OneCount { get; set; }
    public int OriginalInt { get; set; }

    public MyBinary(string input, int count, int original)
    {
        OriginalInt = original;
        BinString = input;
        OneCount = count;
    }
}

private List<MyBinary> SortByOnes(int[] input)
{
    List<MyBinary> lst = new List<MyBinary>();
    //The following will give a result first ordered by number of ones but also
    //secondarily by the original numbers
    //just a bit of LINQ
    int[] SortedArray = (from i in input orderby i select i).ToArray();
    List<MyBinary> lstSorted = new List<MyBinary>();
    int index = 0;
    string[] Bin = new string[SortedArray.Count()];
    foreach (int i in SortedArray)
    {
        Bin[index] = Convert.ToString(i, 2);
        int oneCount = 0;
        foreach (char c in Bin[index])
        {
            if (c == '1')
                oneCount += 1;
        }
        //sends the binary string, the count of ones, and the original number
        //to the constructor of MyBinary
        lst.Add(new MyBinary(Bin[index], oneCount, i));
        lstSorted = (from b in lst orderby b.OneCount descending select b).ToList();
        index += 1;
    }
    return lstSorted;
}

//To use:
private void TestSort()
{
    List<MyBinary> lst = SortByOnes(xx);
    foreach (MyBinary b in lst)
    {
        Debug.Print($"{b.OneCount} - {b.BinString} - {b.OriginalInt}");
    }
}

Time series LINQ query

I have a central repository for IoT device logs, and as the logs arrive they have a timestamp. The problem I want to solve is that, over a given time span, the same device might send multiple logs regarding its interaction with a specific catalyst. I want to consider that set of logs as a single event and not five disparate logs; that is, I want to count the number of interactions, not the number of logs.
Data Set
public class Data
{
    public Guid DeviceId { get; set; }
    public DateTime StartTime { get; set; }
    public DateTime EndDateTime { get; set; }
    public int Id { get; set; }
    public int Direction { get; set; }
}

Data d1 = new Data(); // imagine it's populated
Data d2 = new Data(); // imagine it's populated
I am looking for a LINQ query that would yield something along the lines of
If ((d1.DeviceId == d2.DeviceId ) && (d1.Id == d2.Id) && (d1.Direction == d2.Direction) && (d1.StartTime - d2.StartTime < 15 minutes ))
If I know that the same IoT device is interacting with the same Id (catalyst) and the Direction is the same, and all of those logs occur within a 15-minute time span, it can be presumed that they correspond to the same catalyst event.
I do not control the log creation, so no, I cannot update the data to include "something" that would indicate the relationship.
Sample data per request... nothing fancy. I am sure most people suspect that I have 30+ properties and am only providing the ones involved in the calculation, but this is a simple set of possibilities:
class SampleData
{
    public List<Data> GetSampleData()
    {
        Guid device1 = Guid.NewGuid();
        List<Data> dataList = new List<Data>();

        Data data1 = new Data();
        data1.DeviceId = device1;
        data1.Id = 555;
        data1.Direction = 1;
        data1.StartTime = new DateTime(2010, 8, 18, 16, 32, 0);
        data1.EndDateTime = new DateTime(2010, 8, 18, 16, 32, 30);
        dataList.Add(data1);

        //so this data point should be excluded in the final result
        Data data2 = new Data();
        data2.DeviceId = device1;
        data2.Id = 555;
        data2.Direction = 1;
        data2.StartTime = new DateTime(2010, 8, 18, 16, 32, 32);
        data2.EndDateTime = new DateTime(2010, 8, 18, 16, 33, 30);
        dataList.Add(data2);

        //Should be included because ID is different
        Data data3 = new Data();
        data3.DeviceId = device1;
        data3.Id = 600;
        data3.Direction = 1;
        data3.StartTime = new DateTime(2010, 8, 18, 16, 32, 2);
        data3.EndDateTime = new DateTime(2010, 8, 18, 16, 32, 35);
        dataList.Add(data3);

        //exclude due to time
        Data data4 = new Data();
        data4.DeviceId = device1;
        data4.Id = 600;
        data4.Direction = 1;
        data4.StartTime = new DateTime(2010, 8, 18, 16, 32, 37);
        data4.EndDateTime = new DateTime(2010, 8, 18, 16, 33, 40);
        dataList.Add(data4);

        //include because the gap is more than 15 minutes
        Data data5 = new Data();
        data5.DeviceId = device1;
        data5.Id = 600;
        data5.Direction = 1;
        data5.StartTime = new DateTime(2010, 8, 18, 16, 58, 42);
        data5.EndDateTime = new DateTime(2010, 8, 18, 16, 58, 50);
        dataList.Add(data5);

        return dataList;
    }
}
This turned out to be more complex than I had hoped.
I used a custom LINQ extension method of mine called ScanPair, which is a variation of my Scan method, which in turn is a version of the APL scan operator (like Aggregate, but returning the intermediate results). ScanPair returns the intermediate result of the operation along with each original value. I need to think about how to make all of these more general purpose, as the pattern is used by a bunch of other extension methods I have for grouping by various conditions (e.g. sequential, runs, while a test is true or false).
public static class IEnumerableExt {
    public static IEnumerable<(TKey Key, T Value)> ScanPair<T, TKey>(this IEnumerable<T> src, Func<T, TKey> seedFn, Func<(TKey Key, T Value), T, TKey> combineFn) {
        using (var srce = src.GetEnumerator()) {
            if (srce.MoveNext()) {
                var seed = (seedFn(srce.Current), srce.Current);

                while (srce.MoveNext()) {
                    yield return seed;
                    seed = (combineFn(seed, srce.Current), srce.Current);
                }

                yield return seed;
            }
        }
    }
}
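To get a feel for what ScanPair yields before applying it to the logs, here is a small illustrative example of my own (not from the original answer): the key carries a running sum, and each element is returned alongside it.
//Pairs each element with a running sum carried in the Key.
var pairs = new[] { 1, 2, 3, 4 }
    .ScanPair(first => first, (acc, cur) => acc.Key + cur);

foreach (var (key, value) in pairs)
    Console.WriteLine($"running total = {key}, value = {value}");
//Yields (1,1), (3,2), (6,3), (10,4).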
Now, you can use a tuple as the intermediate result to track the initial timestamp and a group number, and move on to the next (timestamp, group number) when the interval goes over 15 minutes. If you first group by the interaction, and then count those 15-minute groups per interaction, you get the answer:
var ans = interactionLogs
    .GroupBy(il => new { il.DeviceId, il.Id, il.Direction })
    .Select(ilg => new {
        ilg.Key,
        Count = ilg.OrderBy(il => il.Timestamp)
                   .ScanPair(il => (firstTimestamp: il.Timestamp, groupNum: 1),
                             (kvp, cur) => (cur.Timestamp - kvp.Key.firstTimestamp).TotalMinutes <= 15
                                 ? kvp.Key
                                 : (cur.Timestamp, kvp.Key.groupNum + 1))
                   .GroupBy(ilkvp => ilkvp.Key.groupNum, ilkvp => ilkvp.Value)
                   .Count()
    });
Here is a portion of a sample of intermediate results from ScanPair. Each result is a ValueTuple with two fields: Key is the intermediate result (itself a ValueTuple of firstTimestamp and groupNum) and Value is the corresponding source (log) item. The seed-function overload passes the first source item to the seed function to begin the process.
Key_firstTimestamp  Key_groupNum  Timestamp
7:58 PM             1             7:58 PM
7:58 PM             1             8:08 PM
7:58 PM             1             8:12 PM
8:15 PM             2             8:15 PM
8:15 PM             2             8:20 PM

C# merge multiple lists based on timestamp

I have a data set that comes as a list of objects in C#, looking something like the below.
public class MotorDataModel
{
    public DateTime timestamp { set; get; }
    public decimal MotorSpeed { set; get; }
    public decimal MotorTemp { set; get; }
    public decimal MotorKw { set; get; }
}

public class MotorModel
{
    public string MotorName { set; get; }
    public List<MotorDataModel> MotorData { set; get; }
}
When I do the query, I will get one or more MotorModel records back (say motor 1, 2, 3, ...), each with its own timestamps and various data points at those timestamps.
I am then sending this data to a JavaScript charting library, which takes the data in as a data table (i.e. a spreadsheet-like format), such as:
TimeStamp | Motor1:kW | Motor1:Speed | Motor1:Temp | Motor2:kW | Motor2:Speed ...
with the data following in rows. The data will be grouped on the timestamp; the timestamps should be within a couple of minutes of each other, in a consistent increment (say 15 minutes).
The plan is to transform the data in C#, convert it to JSON, and send it to the chart library (Google Charts).
I don't have to format this in C#; I could convert the object list to JSON and reformat it in JavaScript on the client, but it seems better to transform it on the server.
Either way, I am struggling with how to transform the data from multiple lists of objects into a "datatable"-like view.
This answer via LINQ seems to be close, but I have multiple lists of equipment, not a defined number.
I have also looked at just looping through and building the data table (or array), but unsure of what structure makes the most sense.
So, if anyone has done something similar, or has any feedback, it would be much appreciated.
Suggested format for providing sample data
Below is some sample data provided by BlueMonkMN. Please update the question providing sample data representative of your actual question.
List<MotorModel> allData = new List<MotorModel>() {
    new MotorModel() { MotorName = "Motor1", MotorData = new List<MotorDataModel> {
        new MotorDataModel() { timestamp = new DateTime(2016, 9, 18, 2, 56, 0),  MotorSpeed = 20.0M, MotorTemp = 66.2M, MotorKw = 5.5M },
        new MotorDataModel() { timestamp = new DateTime(2016, 9, 18, 3, 10, 30), MotorSpeed = 10.0M, MotorTemp = 67.0M, MotorKw = 5.5M },
        new MotorDataModel() { timestamp = new DateTime(2016, 9, 18, 3, 25, 45), MotorSpeed = 17.5M, MotorTemp = 66.1M, MotorKw = 5.8M },
        new MotorDataModel() { timestamp = new DateTime(2016, 9, 18, 3, 40, 23), MotorSpeed = 22.2M, MotorTemp = 65.8M, MotorKw = 5.4M }
    }},
    new MotorModel() { MotorName = "Motor2", MotorData = new List<MotorDataModel> {
        new MotorDataModel() { timestamp = new DateTime(2016, 9, 18, 2, 58, 0),  MotorSpeed = 21.0M, MotorTemp = 67.2M, MotorKw = 5.6M },
        new MotorDataModel() { timestamp = new DateTime(2016, 9, 18, 3, 11, 30), MotorSpeed = 11.0M, MotorTemp = 68.0M, MotorKw = 5.6M },
        new MotorDataModel() { timestamp = new DateTime(2016, 9, 18, 3, 24, 45), MotorSpeed = 18.5M, MotorTemp = 67.1M, MotorKw = 5.9M },
        new MotorDataModel() { timestamp = new DateTime(2016, 9, 18, 3, 39, 23), MotorSpeed = 23.2M, MotorTemp = 66.8M, MotorKw = 5.5M }
    }}
};
One possibility is to iterate through all the data and build the table, as you suggest. I would suggest using a Dictionary with the timestamp as the key. For each timestamp there can be multiple MotorDataModel entries, so the value can be a list, like this:
Dictionary<DateTime, List<MotorDataModel>>
A code snippet to build this table would look like this:
List<MotorModel> motorModels; // filled in previously

// build result structure in this dictionary:
Dictionary<DateTime, List<MotorDataModel>> table = new Dictionary<DateTime, List<MotorDataModel>>();

// iterate through all motors and their data, and fill in the table
foreach (MotorModel m in motorModels)
{
    foreach (MotorDataModel md in m.MotorData)
    {
        DateTime ts = md.timestamp;
        // if this is the first occurrence of the timestamp, create a new 'row'
        if (!table.ContainsKey(ts)) table[ts] = new List<MotorDataModel>();
        // add the data to the 'row' of this timestamp
        table[ts].Add(md);
    }
}

// output the table
foreach (DateTime ts in table.Keys)
{
    ...
    foreach (MotorDataModel md in table[ts])
    {
        ...
    }
}
I'd use Json.NET from Newtonsoft.
JObject o = new JObject();
foreach (MotorModel mm in allData) {
    foreach (MotorDataModel mdm in mm.MotorData) {
        string key = mdm.timestamp.ToString(); // Or use your own format
        if (o[key] == null) o[key] = new JObject(); // create the 'row' the first time this timestamp appears
        o[key][mm.MotorName + ":kW"] = mdm.MotorKw;
        o[key][mm.MotorName + ":Speed"] = mdm.MotorSpeed;
        o[key][mm.MotorName + ":Temp"] = mdm.MotorTemp;
    }
}
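To get the JSON string to hand to the charting code (my addition, using Json.NET's standard serialization), something like this should do:
//Serialize the assembled object; each property is a timestamp 'row'
//with Motor1:kW, Motor1:Speed, ... as nested properties.
string json = o.ToString(Newtonsoft.Json.Formatting.Indented);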
Could you try something like this to compute your data:
var motorData = allData.SelectMany(x => x.MotorData).ToArray();
var starting = motorData.Min(x => x.timestamp);
var ending = motorData.Max(x => x.timestamp);
var duration = ending.Subtract(starting);
var blocks = (int)Math.Ceiling(duration.TotalMinutes / 15.0);

var query =
    from b in Enumerable.Range(0, blocks)
    let s = starting.AddMinutes(b * 15.0)
    let e = starting.AddMinutes((b + 1.0) * 15.0)
    select new
    {
        Timestamp = s,
        MotorSpeedAverage =
            motorData
                .Where(x => x.timestamp >= s && x.timestamp < e)
                .Average(x => x.MotorSpeed),
    };
Running this against the sample data, I get one row per 15-minute block, each with that block's starting timestamp and the average motor speed of the readings that fall inside it.

Adding up columns in a CSV file to a distinct Key

I have a CSV file that has numerous rows and columns. All rows are distinct except for the first column, which holds an account name. There are, for example, ten different account names, each with about 100 rows. For each account name that appears, I need a sum of each of the following columns across that account's rows.
So a csv file like this:
Smith, 10, 5, 9
Smith, 9, 5, 6
Jones, 10, 5, 7
jones, 9, 6, 5
Needs to be written to another file like this:
Smith, 19, 10, 15
Jones, 19, 11, 12
I've been trying all morning to do this using either an array or a Dictionary, but I can't seem to work out the logic.
static void Main(string[] args)
{
    String path = @"C:\Users\jhochbau\documents\visual studio 2015\Projects\CsvReader\CsvReader\Position_2016_02_25.0415.csv";
    string[] lines = File.ReadAllLines(path);
    foreach (string line in lines)
    {
        string[] parsedLine = line.Split(',');
        //for each iteration through the loop where parsedLine[0] already exists
        //I want to have sum = sum + parsedLine[1]
    }
    Console.Read();
}
I also tried using a dictionary for this but ended up only grabbing the row when a distinct name came up.
//To add account as key and each line as value
foreach (var s in data)
{
    string[] temp = s.Split(',');
    //Adds key of account into dictionary, full string line as value.
    dictionary.Add(temp[0], temp);
    foreach (KeyValuePair<string, string[]> accountKeyValuePair in dictionary)
    {
        Console.WriteLine("Account = {}", accountKeyValuePair.Key); //Additional information: Input string was not in a correct format.
    }
}
Looking for either a link to a similar example, or maybe a gentle nudge in the right logical direction. I don't really want the answer coded for me.
Check this:
public static void Run()
{
    var lines = new List<string>() {
        "Smith, 10, 5, 9",
        "Smith, 9, 5, 6",
        "Jones, 10, 5, 7",
        "Jones, 9, 6, 5"
    };

    var aList = from l in lines
                select new
                {
                    Name = l.Split(',')[0],
                    Value1 = Convert.ToInt32(l.Split(',')[1]),
                    Value2 = Convert.ToInt32(l.Split(',')[2]),
                    Value3 = Convert.ToInt32(l.Split(',')[3])
                };

    var vList = from a in aList
                group a by a.Name into g
                select new
                {
                    Name = g.Key,
                    Value1Sum = g.Sum(a => a.Value1),
                    Value2Sum = g.Sum(a => a.Value2),
                    Value3Sum = g.Sum(a => a.Value3)
                };

    foreach (var v in vList)
        Console.WriteLine("{0} {1} {2} {3}", v.Name, v.Value1Sum, v.Value2Sum, v.Value3Sum);
}
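Since the question asks for a nudge toward the array/Dictionary idea rather than a finished answer, here is a minimal sketch of the Dictionary approach under the same assumptions (comma-separated lines with numeric columns after the name; lines is the string[] read from the file, and the variable names are only illustrative). Passing StringComparer.OrdinalIgnoreCase makes "Jones" and "jones" count as the same account, as the desired output implies:
//Sum the numeric columns per account name.
var sums = new Dictionary<string, int[]>(StringComparer.OrdinalIgnoreCase);
foreach (string line in lines)
{
    string[] parts = line.Split(',');
    string name = parts[0].Trim();

    if (!sums.ContainsKey(name))
        sums[name] = new int[parts.Length - 1]; //one running total per numeric column

    for (int i = 1; i < parts.Length; i++)
        sums[name][i - 1] += int.Parse(parts[i].Trim());
}

foreach (var kvp in sums)
    Console.WriteLine($"{kvp.Key}, {string.Join(", ", kvp.Value)}");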

Array value resets automatically

OK, this silly problem is connected to the post HERE. What I did is basically remove the return, and I was able to set the values of the xValues array depending on the combobox selection index, as per this pic.
But as soon as I try to call another method that divides a certain variable by xValues.Length, it gives me a System.DivideByZeroException, as the values of xValues and xValues.Length reset to zero. Here is the code snippet:
int[] xValues = { }; //declaring empty array

private void comboBox1_SelectedValueChanged(object sender, EventArgs e) //using selection
                                                                        //to set the xValues
{
    if (comboBox1.SelectedIndex == 0)
    {
        int[] xValues = { 1, 2, 3, 4, 5 };
    }
    else if (comboBox1.SelectedIndex == 1)
    {
        int[] xValues = { 6, 7, 8, 9, 10 };
    }
    else if (comboBox1.SelectedIndex == 2)
    {
        int[] xValues = { 11, 12, 13, 14, 15 };
    }
}
And then let's say I'm calling a method doSomeThing():
public void doSomeThing()
{
    int bSum = bValues.Sum(); //bValues comes from different input and in the debugger
                              //it shows the expected values.
    int aSum = xValues.Sum(); //Here the debugger tells me aSum doesn't exist
    int slope = bSum / xValues.Length; //The divide-by-zero exception is thrown here.
}
Why and how are the values of xValues being reset?
waka's answer is right about what's wrong - you're declaring new local variables which are entirely independent of the instance variable.
However, waka's fix isn't quite right - you can only initialize arrays in that particular way at the point of the declaration of the variable. To assign a new value to the variable, you need slightly different syntax:
xValues = new[] { 1, 2, 3, 4, 5 };
Or if you want to specify the element type explicitly:
xValues = new int[] { 1, 2, 3, 4, 5 };
You are not resetting anything. You declare a new int array (as in: a new object) every time the selected value changes! That's why the length of the array in the other method is always 0: because it's still the original empty int array.
int[] xValues = { }; //declaring empty array

private void comboBox1_SelectedValueChanged(object sender, EventArgs e) //using selection
                                                                        //to set the xValues
{
    if (comboBox1.SelectedIndex == 0)
    {
        xValues = new int[] { 1, 2, 3, 4, 5 };
    }
    else if (comboBox1.SelectedIndex == 1)
    {
        xValues = new int[] { 6, 7, 8, 9, 10 };
    }
    else if (comboBox1.SelectedIndex == 2)
    {
        xValues = new int[] { 11, 12, 13, 14, 15 };
    }
}
This should do the trick.
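One extra point worth noting (my addition, not part of the original answers): until the user actually changes the selection, xValues is still the empty array, so doSomeThing can still throw the divide-by-zero exception. A small guard is enough:
public void doSomeThing()
{
    //Bail out (or handle it however you prefer) while no selection has populated xValues yet.
    if (xValues.Length == 0)
        return;

    int bSum = bValues.Sum();
    int aSum = xValues.Sum();
    int slope = bSum / xValues.Length; //safe now: Length is at least 1
}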
Set the length of the xValues array to five if it will always be five, i.e.
int[] xValues = new int[5];
Then in your if statements just assign the xValues array the new values, i.e.
xValues = new int[] { 1, 2, 3, 4, 5 };
You never told the computer how big the array was. And by writing
int[] xValues = { 1, 2, 3, 4, 5 };
inside the if blocks you are changing the scope of that variable; you are not actually referencing the original xValues array.
