How can I get the sum of the values in the "Quantity" column for rows that match a selected value in the "Le Nom" column? For example:
Le Nom   Quantity
HEA200   200
HEB300   150
IPE450   700
HEA200   300
Let's say the first row is selected: I want to sum the first value (200) and the fourth value (300), since those rows share the same "Le Nom" value (HEA200).
Here is my code, which gets the sum of all the values, but I can't achieve my goal.
int sum = 0;
string NameProfil = gridView2.GetRowCellValue(gridView2.FocusedRowHandle, "Le Nom").ToString();
for (int i = 0; i < gridView2.RowCount; i++)
{
    sum += int.Parse(Convert.ToString(gridView2.GetRowCellValue(i, "Quantité")));
}
MessageBox.Show(sum.ToString() + " " + NameProfil);
int sum = 0;
string NameProfil = gridView2.GetRowCellValue(gridView2.FocusedRowHandle, "Le Nom").ToString();
for (int i = 0; i < gridView2.RowCount; i++)
{
    // Only add the quantity when the row's "Le Nom" matches the selected value
    if (gridView2.GetRowCellValue(i, "Le Nom").ToString() == NameProfil)
    {
        sum += int.Parse(gridView2.GetRowCellValue(i, "Quantité").ToString());
    }
}
MessageBox.Show(sum + " " + NameProfil);
should do the trick.
Also, .ToString() is called implicitly on sum when it is concatenated with strings, so, while not incorrect, the explicit call is not needed.
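If you prefer something more compact, a LINQ query over the row handles should give the same result. This is just a sketch that reuses gridView2, NameProfil and the column names from the snippets above, and assumes every "Quantité" cell holds integer text:
using System.Linq;

// Sum the "Quantité" cells of every row whose "Le Nom" matches the focused row's value
int total = Enumerable.Range(0, gridView2.RowCount)
    .Where(i => gridView2.GetRowCellValue(i, "Le Nom").ToString() == NameProfil)
    .Sum(i => int.Parse(gridView2.GetRowCellValue(i, "Quantité").ToString()));

MessageBox.Show(total + " " + NameProfil);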
The objective of this code is to add up all the digits of a whole number into one value (e.g. "2013" => 6).
In C# I have written code that outputs each character of the number as its corresponding ASCII value, one at a time, but I am at a loss as to how to convert it back into its numeric value.
Note that I am new to C#.
string Year;
int Total = 0;
int Adding = 0;
int Adding2 = 0;
Console.WriteLine("Write The year you want converted");
Year = Console.ReadLine();
for (int i = 0; i < Year.Length; i++)
{
    Adding2 = Year[i];
    Adding = Convert.ToInt32(Adding2);
    Total = Adding + Total;
    Console.WriteLine(Total);
}
You should sum the digit values, not the ASCII codes:
...
for (int i = 0; i < Year.Length; i++)
{
    Adding2 = Year[i];
    Adding = Adding2 - '0'; // subtracting '0' converts the digit character to its numeric value
    Total = Adding + Total;
}
Console.WriteLine(Total);
In the general case, you can use char.GetNumericValue():
// double: some characters have fractional values, e.g. '⅝'
double Total = 0.0;
foreach (char c in Year)
{
    double value = char.GetNumericValue(c);
    // GetNumericValue returns -1 for characters that have no numeric value (e.g. 'A')
    if (value != -1)
        Total += value;
}
Console.WriteLine(Total);
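If you only care about the decimal digits 0-9, a compact LINQ version does the same job; this is a sketch that assumes Year has already been read in as above:
using System.Linq;

// Keep only the digit characters and sum their numeric values
int total = Year.Where(char.IsDigit).Sum(c => c - '0');
Console.WriteLine(total);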
I have a datagridview column with values formed by multiple digits.
I'm trying to obtain the sum of all the cell values in this column, but first I need to sum the digits within each single cell.
Example: every cell has the value "201". I need to obtain "3" for it and then sum the values of all the rows. So if I have 3 rows with 201, I need to obtain 3 for each and then 9 as the total.
How can I do this?
This is my code but it seems to give wrong results.
int sum4 = 0;
for (int p = 0; p < dataGridView1.Rows.Count; ++p)
{
    sum4 += Convert.ToInt32(dataGridView1.Rows[p].Cells[4].Value);
}
int sum5 = 0;
while (sum4 != 0)
{
    sum5 += sum4 % 10;
    sum4 /= 10;
}
label22.Text = sum5.ToString();
With 3 rows it gives me 9 as the result, and with 4 rows it gives me 12, but when I have 5 rows it gives me 6!
EDIT
The solution provided works but I missed a step.
In my DataGridView I first need to multiply the "Ripetizioni" value by the "Serie" value, and then multiply that by the sum of the seconds in the "Time under Tension" column. At the end I need the sum of all the rows' values.
So, for a single row (the first row, for example) I need to multiply 11 x 3, take the digit sum of 201 (which is 3), and then multiply 11 x 3 = 33 by that result. At the end, the sum of all the rows.
How can I achieve this?
Try this. In your solution you add all the cell values into sum4 first and only take the digit sum of that total at the end, instead of taking the digit sum of each cell. This is wrong.
int finalSum = 0;
for (int p = 0; p < dataGridView1.Rows.Count; ++p)
{
    finalSum += getCellDigitSum(Convert.ToInt32(dataGridView1.Rows[p].Cells[4].Value));
}
label22.Text = finalSum.ToString();

int getCellDigitSum(int cellValue)
{
    int l = cellValue;
    int result = 0;
    while (l > 0)
    {
        // Peel off the last digit and add it to the running total
        result += (l % 10);
        l = l / 10;
    }
    return result;
}
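For the extra step described in the EDIT, the same helper can be reused per row: multiply "Ripetizioni" by "Serie", multiply that by the digit sum of the "Time under Tension" cell, and accumulate the grand total. The sketch below assumes the columns can be indexed by those names; adjust the names (or use numeric indices) to match your actual grid:
// Column names are assumptions taken from the question's description
int grandTotal = 0;
for (int p = 0; p < dataGridView1.Rows.Count; ++p)
{
    var row = dataGridView1.Rows[p];
    int ripetizioni = Convert.ToInt32(row.Cells["Ripetizioni"].Value);
    int serie = Convert.ToInt32(row.Cells["Serie"].Value);
    int tutDigitSum = getCellDigitSum(Convert.ToInt32(row.Cells["Time under Tension"].Value));
    grandTotal += ripetizioni * serie * tutDigitSum; // e.g. 11 x 3 x 3 = 99 for the first row
}
label22.Text = grandTotal.ToString();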
Assignment for game design, using the Microsoft C# tutorial as a basis: display to the console the value held by number1 in a grid with number2 rows and number3 columns.
Something like "if num1 is 8, row = 4 and column = 5".
I'm sorry I'm just beginning to learn this.
for (int row = number2; row < number1; row++)
{
    for (int column = number3; column <= number1; column++)
    {
        Console.WriteLine($"{row}{column}");
    }
}
Given your comment description
What I'm basically trying to do is to print a grid of numbers using
user input. Num1 is the number that'll be printed, num2 is how many
rows they'll be, and num3 is how many columns they'll be. I've tried
switching up the Var placements but it hasn't given me the result I
prefer.
Potentially, this might be what you are looking for
var charToPrint = 'x';
var rows = 5;
var columns = 4;

for (var i = 0; i < rows; i++)
{
    for (var j = 0; j < columns; j++)
        Console.Write(charToPrint); // writes one character at a time for each column
    Console.WriteLine(); // writes a new line at the end of each row
}
Output
xxxx
xxxx
xxxx
xxxx
xxxx
Another way might be using Enumerable.Range, Select, string.Join and the string constructor
var result = Enumerable
    .Range(1, rows)
    .Select(x => new string(charToPrint, columns));

Console.WriteLine(string.Join(Environment.NewLine, result));
To do the input validation you could do something like this
var input = string.Empty;
var rows = 5;
var columns = 4;

Console.WriteLine("Enter character to print");
while ((input = Console.ReadLine()).Length != 1)
    Console.WriteLine("you had one job! Enter character to print");
var charToPrint = input[0];

Console.WriteLine("Enter the number of rows");
while (!int.TryParse(Console.ReadLine(), out rows))
    Console.WriteLine("you had one job! Enter the number of rows");

Console.WriteLine("Enter the number of columns");
while (!int.TryParse(Console.ReadLine(), out columns))
    Console.WriteLine("you had one job! Enter the number of columns");
Example output
Enter character to print
23423
you had one job! Enter character to print
d
Enter the number of rows
d
you had one job! Enter the number of rows
5
Enter the number of columns
4
dddd
dddd
dddd
dddd
dddd
It was this:
for (int row = 0; row < number2; row++)
{
    for (int column = 0; column < number3; column++)
    {
        Console.Write(number1); // prints number1 once per column
    }
    Console.WriteLine(); // then moves to the next line
}
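For example, with number1 = 8, number2 = 4 and number3 = 5 (the "num1 is 8, row = 4 and column = 5" case from the question), this prints four rows of five 8s:
88888
88888
88888
88888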
I have a method to work out intervals from a file containing training data. It detects spikes and drops in power to figure out the intervals, and I'm developing a method to work out averages for each interval. Here is the method:
public void getIntervalData()
{
    // Remove first drop anomaly
    drops.Remove(0);
    int intervalAltitude;
    int intervalPower;
    int intervalSpeed;
    int altitudeAddUp = 0;
    int powerAddUp = 0;
    int speedAddUp = 0;
    int counter = 0;
    // Loop through to get all spikes and drops
    for (int j = 0; j < spikes.Count(); j++)
    {
        int firstNumber = Convert.ToInt32(spikes[j]);
        int secondNumber = Convert.ToInt32(drops[j]);
        MessageBox.Show(firstNumber.ToString());
        counter++;
        // Difference to work out averages
        int difference = secondNumber - firstNumber;
        // Get separate interval data (first number = spike, second number = drop)
        for (int i = firstNumber; i < secondNumber; i++)
        {
            int altitudeNumber = altitudeList[i];
            int powerNumber = powerList[i];
            int speedNumber = Convert.ToInt32(speedList[i]);
            // Add up data
            altitudeAddUp += altitudeNumber;
            powerAddUp += powerNumber;
            speedAddUp += speedNumber;
        }
        MessageBox.Show("Alt add up:" + altitudeAddUp.ToString());
        intervalAltitude = altitudeAddUp / difference;
        intervalPower = powerAddUp / difference;
        intervalSpeed = speedAddUp / difference;
        intervalAverages.Add(new Tuple<int, int, int>(intervalAltitude, intervalPower, intervalSpeed));
        MessageBox.Show("Interval: " + counter.ToString() + ": Avgs: " + intervalAverages[0]);
    }
    MessageBox.Show("Interval averages added. There were: " + counter + " intervals");
}
altitudeAddUp, powerAddUp and speedAddUp are always 0, but I can't figure out why they're not adding up. Probably a rookie error, but I just can't see it.
I've previously used a message box to check whether altitudeNumber, powerNumber and speedNumber contain data, and they do, but the totals still won't add up.
I think the problem is that all your variables are integers, and integers don't have any decimal precision. That means that if an average works out to 0.999, the actual integer value is 0 (0.999 doesn't exist as an integer, so the value is truncated when you call ToInt32).
Use float, double or decimal, depending on the precision and range you need.
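A minimal illustration of the truncation, as a standalone sketch not tied to the question's grid:
int altitudeAddUp = 999;
int difference = 1000;

Console.WriteLine(altitudeAddUp / difference);          // 0 (integer division truncates)
Console.WriteLine((double)altitudeAddUp / difference);  // 0.999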
I am looking to accumulate the value of each cell and display this in a "total column" at the end of the grid in FlexCell. What is the best way of going about this?
I have the following code so far, which I don't think is correct!
int total = 0;
for (int B = 3; B < 27; B++)
{
    total = total + int.Parse(this.grid2.Cell(0, B).ToString());
    this.grid2.Cell(0, 27).Text = total.ToString();
}
Your code seems correct to me, but it can be improved:
int total = 0;
for (int B = 3; B < 27; B++)
{
    total = total + Convert.ToInt32(this.grid2.Cell(0, B));
}
this.grid2.Cell(0, 27).Text = total.ToString();
Only put things that need to be repeated inside the for-loop. If it only needs to be done once, put it after (or before if possible) the loop.
Also, try to use more meaningful names for your variables. I would change 'B' to either 'i' if you don't want to give it a long name, or to 'column', so that you (and other developers, like us) know what it stands for.
BTW, the code calculates the sum for one row (the first one). If you want to do it for every row, then you will need a double for-loop:
for (int row = 0; row < numRows; row++)
{
    int total = 0;
    for (int column = 3; column < 27; column++)
    {
        total = total + Convert.ToInt32(this.grid2.Cell(row, column));
    }
    this.grid2.Cell(row, 27).Text = total.ToString();
}