I have the following problem: I am creating a chart with MigraDoc in C#.
Suppose I have the following points for my xAxis:
20.4, 20.6, 30.6, 100.4, 200.3
The problem is that it places every x point in the series at an equal distance in the chart.
What I need is a graph that places the x points at a relative distance; for example, the distance between 20.6 and 30.6 needs to be far smaller than the distance between 30.6 and 100.4. (The points always differ, as does the number of points.)
One way to get the spacing right is to add extra points between the existing points. For example, the first step is only 0.2 wide while the second step is 10.0 wide, so I would insert, say, 50 extra points in the second step so that the spacing becomes roughly proportional.
This is the only approach I can come up with; can somebody give me some advice on how to accomplish this (or suggest another possible solution)?
This method worked out for me. I first made the distances relative:
// make each distance relative: a percentage (0-100) of the full range
Int64[] relAfstand = new Int64[afstand.Count()];
for (int i = 0; i < afstand.Count(); i++)
{
    double tussenRel = Convert.ToDouble(afstand[i]);
    double eindRel = Convert.ToDouble(afstand[afstand.Count() - 1]);
    double beginRel = Convert.ToDouble(afstand[0]);
    double Rel = ((eindRel - beginRel) - (eindRel - tussenRel)) / (eindRel - beginRel);
    relAfstand[i] = Convert.ToInt64(Rel * 100);
}
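With the example points above (20.4, 20.6, 30.6, 100.4, 200.3) this yields roughly relAfstand = {0, 0, 6, 44, 100}: each value is the point's position as a percentage of the full 20.4-200.3 range. Note that the first two points collapse onto the same percentage, which is why I later scale with a larger factor (see below).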
Then I converted the data so that it scales with the same relative factor as the distances:
List<double> ConvertedData = new List<double>();
int c = 0;
int c2 = 1;
double steps = 0;
bool calcSteps = false;
bool calcDistance = false;
for (int i = 0; i < 100; i++)
{
    if (calcDistance == false)
    {
        distance.Add(i);
    }
    if (relAfstand[c] == i)
    {
        // an original data point lands exactly on this relative position
        ConvertedData.Add(data[c]);
        calcSteps = false;
        c2 = 1;
        c++;
    }
    else
    {
        if (calcSteps == false)
        {
            // step size for the interpolated points between two originals
            steps = (data[c] - data[c - 1]) / (relAfstand[c] - relAfstand[c - 1]);
            calcSteps = true;
        }
        ConvertedData.Add(data[c - 1] + (steps * c2));
        c2++;
    }
}
calcDistance = true;
Probably not the best workaround, but it works. Since the percentages can end up close together, I now scale both with a factor of around 200-300 instead of 100.
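For reference, here is a more compact sketch of the same densify-by-interpolation idea; xValues, yValues and pointsPerUnit are hypothetical names, not the variables above, and the density factor would need tuning just like the 100 vs. 200-300 scaling above:
static void Densify(double[] xValues, double[] yValues, double pointsPerUnit,
                    out List<double> newX, out List<double> newY)
{
    // Hypothetical helper: the number of points inserted in each gap is
    // proportional to the gap's width on the x axis.
    newX = new List<double>();
    newY = new List<double>();
    for (int i = 0; i < xValues.Length - 1; i++)
    {
        double gap = xValues[i + 1] - xValues[i];
        int steps = Math.Max(1, (int)(gap * pointsPerUnit));
        for (int s = 0; s < steps; s++)
        {
            double t = (double)s / steps;                  // 0 .. <1 within the gap
            newX.Add(xValues[i] + t * gap);
            newY.Add(yValues[i] + t * (yValues[i + 1] - yValues[i]));
        }
    }
    newX.Add(xValues[xValues.Length - 1]);                 // keep the last original point
    newY.Add(yValues[yValues.Length - 1]);
}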
In my C# program I have a dataset where each data point consists of:
a stimulus intensity (intensity) as x-coordinate
the percentage of correct response (percentageCorrect) to stimulus as y-coordinate
When the intensity is low, percentageCorrect is low; when the intensity is high, percentageCorrect is high. The graph of the function is an S-shaped curve, since percentageCorrect reaches an asymptote at the low and high ends.
I am trying to find the threshold intensity where percentageCorrect is halfway between the asymptotes at either end (the center of the S-shaped curve).
I understand this to be a function minimization problem (a least-squares curve fit) that can be solved by the Nelder-Mead simplex algorithm.
I am trying to solve my problem using the Nelder-Mead simplex algorithm in Math.NET Numerics and its IObjectiveFunction parameter.
However, I am having trouble understanding the API of the NelderMeadSimplex class's FindMinimum method and the IObjectiveFunction EvaluateAt method.
I am new to the numerical analysis that is a prerequisite for this question.
Specific questions are:
For the NelderMeadSimplex class's FindMinimum method, what are the initialGuess and initialPertubation parameters?
For the IObjectiveFunction EvaluateAt method, what is the point parameter? I vaguely understand that the point parameter is a datum in the dataset being minimized
How can I map my data set to this API and solve my problem?
Thanks for any guidance on this.
The initial guess is a guess at the model parameters.
I've always used the forms that don't require an entry of the initialPertubation parameter, so I can't help you there.
The objective function is what you are trying to minimize. For example, for a least-squares fit, it would calculate the sum of squared errors at the point given in the argument. Something like this:
private double SumSqError(Vector<double> v)
{
    double err = 0;
    for (int i = 0; i < 100; i++)
    {
        // model: y = v[0] + v[1] * exp(v[2] * x)
        double y_val = v[0] + v[1] * Math.Exp(v[2] * x[i]);
        err += Math.Pow(y_val - y[i], 2);
    }
    return err;
}
You don't have to supply the point. The algorithm does that over and over while searching for the minimum. Note that the subroutine has access to the vector x.
Here is the code for a test program fitting a function to random data:
private void btnMinFit_Click(object sender, EventArgs e)
{
    Random RanGen = new Random();
    x = new double[100];
    y = new double[100];
    // fit an exponential expression with three parameters
    double a = 5.0;
    double b = 0.5;
    double c = 0.05;
    // create the data set
    for (int i = 0; i < 100; i++) x[i] = 10 + Convert.ToDouble(i) * 90.0 / 99.0; // values span 10 to 100
    for (int i = 0; i < 100; i++)
    {
        double y_val = a + b * Math.Exp(c * x[i]);
        y[i] = y_val + 0.1 * RanGen.NextDouble() * y_val; // add an error term scaled to the y-value
    }
    var f1 = new Func<Vector<double>, double>(v => SumSqError(v)); // the objective function defined above
    var obj = ObjectiveFunction.Value(f1);
    var solver = new NelderMeadSimplex(1e-5, maximumIterations: 10000);
    var initialGuess = new DenseVector(new[] { 3.0, 6.0, 0.6 });
    var result = solver.FindMinimum(obj, initialGuess);
    Console.WriteLine(result.MinimizingPoint.ToString());
}
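For the S-shaped data in the question, the same pattern applies with a different model. As a sketch only: assuming a four-parameter logistic model, and that intensity[] and percentageCorrect[] are fields holding your data, the objective function could look like the one below, and the fitted threshold would then be read off as result.MinimizingPoint[2]:
// Hypothetical objective for the psychometric-curve case: a four-parameter logistic.
// v[0] = lower asymptote, v[1] = upper asymptote, v[2] = threshold (x at the halfway
// point between the asymptotes), v[3] = slope.
private double SigmoidSumSqError(Vector<double> v)
{
    double err = 0;
    for (int i = 0; i < intensity.Length; i++)
    {
        double model = v[0] + (v[1] - v[0]) /
                       (1.0 + Math.Exp(-v[3] * (intensity[i] - v[2])));
        err += Math.Pow(model - percentageCorrect[i], 2);
    }
    return err;
}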
I currently have a line graph in my C# program, and I have a min and a max variable. If the graph ever exceeds the max or goes below the min, is there any built-in way of showing on the graph (such as a dot at the point) that the limit was passed, and of displaying the x/y values for that point?
int max = 2000;
int min = 2000;
for (int i = 0; i < dgvLoadedValues.RowCount - 1; i++)
{
    DateTime x = Convert.ToDateTime(dgvLoadedValues.Rows[i].Cells[0].Value.ToString());
    try
    {
        float y = float.Parse(dgvLoadedValues.Rows[i].Cells[e.ColumnIndex].Value.ToString());
        chart1.Series["Series1"].Points.AddXY(x, y);
    }
    catch
    {
        Console.WriteLine("Unable to plot point");
    }
}
The code above simply takes values from a DataGridView and displays them in a line graph.
Thank you
Unfortunately there seems to be no way to define such an automatic alert.
But since you know when the DataPoints are added or bound, you can set a Marker where necessary.
Here is a loop that does it after the fact in one go, but of course you can just as well set the markers as you add the points:
foreach (DataPoint dp in chart1.Series[0].Points)
{
if (dp.YValues[0] < max && dp.YValues[0] > min ) continue;
dp.MarkerStyle = MarkerStyle.Circle;
dp.MarkerColor = Color.Red;
}
Or in your case:
try
{
    float y = float.Parse(dgvLoadedValues.Rows[i].Cells[e.ColumnIndex].Value.ToString());
    int index = chart1.Series["Series1"].Points.AddXY(x, y);  // AddXY returns the index of the new point
    if (y < min || y > max)
    {
        chart1.Series["Series1"].Points[index].MarkerStyle = MarkerStyle.Circle;
        chart1.Series["Series1"].Points[index].MarkerColor = Color.Red;
    }
}
To clear a marker you can set its MarkerStyle = MarkerStyle.None.
Of course you could easily give the min and max points different colors.
Here is an example with the simple circle style, but there are others, including images:
To add the values in a label use a format like this:
dp.Label = "(#VALX{0.0} / #VAL{0.0})" ;
I have a set of (rectangular) bounding boxes in 3D space. The bounds of each box are computed and stored in a dictionary named "RegionBounds". A set of points is also populated in a List named "PointsToCategorize". Given a point's (x, y, z) coordinates from that list and a bounding box to check, I can test whether the point is inside the box or not. The problem is that this is a big dataset: there are about 1000 points to check and 250-300 bounding boxes, so if I loop through every bounding box for every point, the total time is around 5-6 minutes. Is there any efficient method that would make this quicker? If possible, a small code example would be great.
public struct iBounds {
    public double x1, x2;
    public double y1, y2;
    public double z1, z2;
}
public struct iPoint {
    public double x, y, z;
}
Dictionary<String, iBounds> RegionBounds = new Dictionary<String, iBounds>();
List<iPoint> PointsToCategorize = new List<iPoint>();
int no_of_bounding_boxes = 300;
int no_of_points_to_categorize = 1000;
for (int i = 1; i <= no_of_bounding_boxes; i++)
{
    String boundingBoxName = "bound_" + i;
    iBounds boundingBox = new iBounds
    {
        // x1, x2, y1, y2, z1 and z2 are each computed by some other method and formulas
        x1 = ..., x2 = ...,
        y1 = ..., y2 = ...,
        z1 = ..., z2 = ...
    };
    RegionBounds.Add(boundingBoxName, boundingBox);
}
//////////// Start of output section ////////////
for (int i = 0; i < PointsToCategorize.Count; i++)
{
    foreach (var pair in RegionBounds)
    {
        String myboxName = pair.Key;
        iBounds myboxBounds = pair.Value;
        Console.WriteLine(PointInside(PointsToCategorize[i], myboxBounds).ToString());
    }
}
//////////// End of output section ////////////
private bool PointInside(iPoint mypoint, iBounds boxToBeCheckedIn)
{
    if ((mypoint.x > boxToBeCheckedIn.x1) && (mypoint.x < boxToBeCheckedIn.x2) &&
        (mypoint.y > boxToBeCheckedIn.y1) && (mypoint.y < boxToBeCheckedIn.y2) &&
        (mypoint.z > boxToBeCheckedIn.z1) && (mypoint.z < boxToBeCheckedIn.z2))
    {
        return true;
    }
    return false;
}
You may want to use an octree or a k-d tree data structure, which is far more efficient than iterating through all the boxes.
See also this article, at the section on 2-D orthogonal range searching; it has a very good summary of available techniques and algorithms, which are easily extendable to 3D.
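Short of a full octree or k-d tree, even a coarse uniform grid over the boxes illustrates the idea and already avoids testing every point against every box. Below is a minimal sketch (not a full octree), reusing iBounds, iPoint, RegionBounds, PointsToCategorize and the fixed PointInside from the question; cellSize is an assumed tuning value and the (int, int) cell keys use C# value tuples:
// Sketch of a coarse uniform grid: register each box in every x/y cell its extent
// overlaps, then test a point only against the boxes stored in its own cell.
double cellSize = 10.0;                                   // assumption: pick to match typical box size
Func<double, int> cell = v => (int)Math.Floor(v / cellSize);
var grid = new Dictionary<(int, int), List<iBounds>>();

foreach (var box in RegionBounds.Values)
{
    for (int cx = cell(box.x1); cx <= cell(box.x2); cx++)
        for (int cy = cell(box.y1); cy <= cell(box.y2); cy++)
        {
            var key = (cx, cy);
            if (!grid.TryGetValue(key, out var list))
                grid[key] = list = new List<iBounds>();
            list.Add(box);
        }
}

foreach (var p in PointsToCategorize)
{
    var key = (cell(p.x), cell(p.y));
    if (grid.TryGetValue(key, out var candidates))
        foreach (var box in candidates)
            if (PointInside(p, box))
            {
                // the point is inside this box
            }
}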
I'm writing a Geometry Wars-inspired game, except with added 2D rigid-body physics, AI pathfinding, some waypoint analysis, line-of-sight checks, load balancing, etc. Even though it runs reasonably fast with all of that enabled at around 80-100 enemies on screen, performance completely breaks down once I get to a total of roughly 250 objects (150 enemies). I've searched for any O(n^2) parts in the code, but there don't seem to be any left, and I'm also using spatial grids.
Even if I disable pretty much everything from the supposedly expensive AI-related processing, it doesn't seem to matter; it still breaks down at around 150 enemies.
I implemented all the code from scratch (currently even the matrix multiplication code), I'm relying almost completely on the GC, and I'm using C# closures for some things, so I expect this to be far from optimized. Still, it doesn't make sense to me that with about 1/15 of the processing work but double the objects the game suddenly slows down to a crawl. Is this normal? How is the XNA platform supposed to scale with respect to the number of objects being processed?
I remember that a slerp-spinning-cube test I did at first could handle more than 1000 objects at once, so I think I'm doing something wrong.
edit:
Here's the grid structure's class
public abstract class GridBase
{
    public const int WORLDHEIGHT = (int)AIGridInfo.height;
    public const int WORLDWIDTH = (int)AIGridInfo.width;

    protected float cellwidth;
    protected float cellheight;
    int no_of_col_types;

    // a dictionary of lists that gets cleared every frame
    // 3 (= no_of_col_types) groups of objects (enemy side, player's side, neutral)
    // 4000 initial Dictionary hash positions for each group
    // I have also tried using an array of lists of 100*100 cells
    // with pretty much identical results
    protected Dictionary<CoordsInt, List<Collidable>>[] grid;

    public GridBase(float cellwidth, float cellheight, int no_of_col_types)
    {
        this.no_of_col_types = no_of_col_types;
        this.cellheight = cellheight;
        this.cellwidth = cellwidth;
        grid = new Dictionary<CoordsInt, List<Collidable>>[no_of_col_types];
        for (int u = 0; u < no_of_col_types; u++)
            grid[u] = new Dictionary<CoordsInt, List<Collidable>>(4000);
    }

    public abstract void InsertCollidable(Collidable c);
    public abstract void InsertCollidable(Grid_AI_Placeable aic);

    // gets called in the update loop
    public void Clear()
    {
        for (int u = 0; u < no_of_col_types; u++)
            grid[u].Clear();
    }

    // gets the grid cell of the lower-left corner
    protected void BaseCell(Vector3 v, out int gx, out int gy)
    {
        gx = (int)((v.X + (WORLDWIDTH / 2)) / cellwidth);
        gy = (int)((v.Y + (WORLDHEIGHT / 2)) / cellheight);
    }

    // gets all cells covered by the AABB
    protected void Extent(Vector3 pos, float aabb_width, float aabb_height, out int totalx, out int totaly)
    {
        var xpos = pos.X + (WORLDWIDTH / 2);
        var ypos = pos.Y + (WORLDHEIGHT / 2);
        totalx = -(int)(xpos / cellwidth) + (int)((xpos + aabb_width) / cellwidth) + 1;
        totaly = -(int)(ypos / cellheight) + (int)((ypos + aabb_height) / cellheight) + 1;
    }
}
public class GridBaseImpl1 : GridBase
{
    public GridBaseImpl1(float widthx, float widthy)
        : base(widthx, widthy, 3)
    {
    }

    // adds a collidable to the grid
    // caches for intersection test
    // checks if it should be tested to prevent penetration
    // tests penetration
    // updates close, intersecting, touching lists
    // Collidable is an interface for all objects that can be tested geometrically
    // the dictionary is indexed by a simple struct that wraps the row and column number in the grid
    public override void InsertCollidable(Collidable c)
    {
        // some tag so that objects don't get checked more than once
        Grid_Query_Counter.current++;

        // the AABB is allocated on the heap
        var aabb = c.CollisionAABB;
        if (aabb == null) return;

        int gx, gy, totalxcells, totalycells;
        BaseCell(aabb.Position, out gx, out gy);
        Extent(aabb.Position, aabb.widthx, aabb.widthy, out totalxcells, out totalycells);

        // gets which groups to test this object with, in an IEnumerable (from a statically created array)
        var groupstestedagainst = CollidableCalls.GetListPrevent(c.CollisionType).Select(u => CollidableCalls.group[u]);
        var groups_tested_against = groupstestedagainst.Distinct();
        var own_group = CollidableCalls.group[c.CollisionType];

        foreach (var list in groups_tested_against)
            for (int i = -1; i < totalxcells + 1; i++)
                for (int j = -1; j < totalycells + 1; j++)
                {
                    var index = new CoordsInt((short)(gx + i), (short)(gy + j));
                    if (grid[list].ContainsKey(index))
                        foreach (var other in grid[list][index])
                        {
                            if (Grid_Query_Counter.Check(other.Tag))
                            {
                                // marks the pair as close; I've tried only keeping the 20 closest but it's still slow
                                other.Close.Add(c);
                                c.Close.Add(other);

                                // caches the pair so that checking if the pair intersects
                                // doesn't go through the grid structure loop again
                                c.CachedIntersections.Add(other);

                                var collision_function_table_id = c.CollisionType * CollidableCalls.size + other.CollisionType;

                                // gets the function to use on the pair for testing penetration;
                                // the function is in a delegate array statically created to simulate multiple dispatch
                                // and decides what coarse test to use before descending to some complete geometric query
                                var prevent_delegate = CollidableCalls.preventfunctions[collision_function_table_id];
                                if (prevent_delegate == null) { Grid_Query_Counter.Put(other.Tag); continue; }

                                var a = CollidableCalls.preventfunctions[collision_function_table_id](c, other);

                                // if the query returns true, mark as touching
                                if (a) { c.Contacted.Add(other); other.Contacted.Add(c); }

                                // marks it as tested in this query
                                Grid_Query_Counter.Put(other.Tag);
                            }
                        }
                }

        // adds it to the grid; if the key doesn't exist it creates the list first
        for (int i = -1; i < totalxcells + 1; i++)
            for (int j = -1; j < totalycells + 1; j++)
            {
                var index = new CoordsInt((short)(gx + i), (short)(gy + j));
                if (!grid[own_group].ContainsKey(index)) grid[own_group][index] = new List<Collidable>();
                grid[own_group][index].Add(c);
            }
    }

    [...]
}
First: profile your code, even if you just use manually inserted timestamps to surround the blocks you're interested in. I prefer to use the profiler built into Visual Studio Pro.
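If you don't have a profiler available, even the manual-timestamp version of that advice narrows things down; for example with System.Diagnostics.Stopwatch (collisionGrid.Update() here is just a placeholder for whatever block you suspect):
// Crude manual profiling: time a suspect block and log it once per frame.
var sw = System.Diagnostics.Stopwatch.StartNew();
collisionGrid.Update();          // placeholder for the code under suspicion
sw.Stop();
Console.WriteLine("grid update: {0:0.00} ms", sw.Elapsed.TotalMilliseconds);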
However, based on your description, I would assume your problems are due to too many draw calls. Once you exceed 200-400 draw calls per frame, performance can drop dramatically. Try batching your rendering and see if this improves performance.
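In XNA terms, batching usually means drawing everything that shares a texture inside a single SpriteBatch.Begin/End pair rather than one pair per object; a rough sketch (spriteBatch, enemies and enemyTexture are placeholders for your own objects):
// One Begin/End pair for the whole wave instead of one per enemy.
spriteBatch.Begin(SpriteSortMode.Deferred, BlendState.AlphaBlend);
foreach (var enemy in enemies)
    spriteBatch.Draw(enemyTexture, enemy.Position, Color.White);
spriteBatch.End();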
You can use a profiler such as ANTS Profiler to see what may be the problem.
Without any code there's not much I can do.
I am required to create a program which reads in data from a .csv file and uses these (x, y and z) values for a series of calculations.
I read in the file as a string, and then split this into 3 smaller strings for x, y and z.
The x and y values represent the coordinates of the contours of a lake, and z is the depth.
One of the calculations I have to do is to calculate the surface area of the lake, using the formula (x[i]*y[i+1]) - (x[i+1]*y[i]), over the points where z (depth) = 0.
I can get my code to run perfectly up until the x[i+1] and y[i+1], where it keeps giving me a value of 0.
Can someone please tell me how to fix this?
Here is my code:
{
    string[] ss = File.ReadAllLines(@"C:File.csv");
    for (int i = 1; i < ss.Length; i++)
    {
        string[] valuesAsString = ss[i].Split(new char[] { ' ', ',' }, StringSplitOptions.RemoveEmptyEntries);
        double[] X = new double[valuesAsString.Length];
        double[] Y = new double[valuesAsString.Length];
        double[] Z = new double[valuesAsString.Length];
        for (int n = 0; n < 1; n++)
        {
            X[n] = double.Parse(valuesAsString[0]);
            Y[n] = double.Parse(valuesAsString[1]);
        }
        do
        {
            double SurfaceArea = (X[n] * Y[n + 1]) - (X[n + 1] * Y[n]);
            Console.WriteLine(SurfaceArea);
        }
        while (Z[n] == 0);
    }
}
OK, I'm not sure if I got it right, so please take a look at what I did and tell me if it's of any help.
After reviewing it a little I came up with the following:
A class for the values:
public class ValueXyz
{
    public double X { get; set; }
    public double Y { get; set; }
    public int Z { get; set; }
}
A class to manage the calculation:
public class SurfaceCalculator
{
    private ValueXyz[] _valuesXyz;
    private double _surface;
    private readonly string _textWithValues;

    public SurfaceCalculator(string textWithValues)
    {
        _textWithValues = textWithValues;
        SetValuesToCalculate();
    }

    public double Surface
    {
        get { return _surface; }
    }

    public void CalculateSurface()
    {
        for (var i = 0; i < _valuesXyz.Length - 1; i++)   // stop one short so i + 1 stays in range
        {
            if (_valuesXyz[i].Z == 0)
                _surface = (_valuesXyz[i].X * _valuesXyz[i + 1].Y) - (_valuesXyz[i + 1].X * _valuesXyz[i].Y);
        }
    }

    private void SetValuesToCalculate()
    {
        var valuesXyz = _textWithValues.Split(' ');
        _valuesXyz = valuesXyz.Select(item => new ValueXyz
        {
            X = Convert.ToDouble(item.Split(',')[0]),
            Y = Convert.ToDouble(item.Split(',')[1]),
            Z = Convert.ToInt32(item.Split(',')[2])
        }).ToArray();
    }
}
So now your client code could do something like:
[TestMethod]
public void TestSurfaceCalculatorGetsAValue()
{
    //var textWithValues = File.ReadAllText(@"C:File.csv");
    var textWithValues = "424.26,424.26,0 589.43,231.46,0 720.81,14.22,1";
    var calculator = new SurfaceCalculator(textWithValues);
    calculator.CalculateSurface();
    Assert.IsNotNull(calculator.Surface);
}
I'm not very sure I got the idea of how to implement the formula right, but I just wanted to show an alternative you can use; you can never have too many ways of doing one thing :).
Cheers.
By the way, part of my intent was to avoid tying your functionality to the CSV file, in case your source for the text changes in the future.
Step through your code in the debugger. Pay special attention to the behavior of the line
for (int n = 0; n < 1; n++)
This loop will execute how many times? What will the value of n be during each iteration through the loop?
Well, one thing I noticed is that when you're setting up your X, Y, Z vars, you're setting them to the Length of the array object instead of its values; is that intentional?
Put a debug break on the line with:
double SurfaceArea = (X[n] * Y[n + 1]) - (X[n + 1] * Y[n]);
and check the datatype of "X", "Y" and "Z".
I've had problems in the past where it tries to calculate them as strings (because it took them out of the data source as strings). I ended up fixing it by adding CInt() to each of the variables (or Convert.ToInt32()).
Hope this helps.
As this looks like it might be a homework problem, I am trying not to give a direct solution in my answer, but I see a number of questionable parts of your code that you should examine.
Why are X, Y, Z arrays? You are creating a new array each time through the outer loop, setting the length of the array to the number of elements in the line, then only assigning a value to one element of X and Y, and never assigning Z to anything.
As phoog suggests in his answer, what is the purpose of: for (int n = 0; n < 1; n++)?
What are you trying to accomplish with the do-while loop? As has been mentioned in the comments by Mr Skeet, X[n], Y[n] and Z[n] don't exist there, because n does not exist outside of the loop it is declared for. Even if it did exist, Z[n] would always be zero, because you never assign anything to the Z array after it is initialized, so the do-while loop would run forever.