Circle and line segment collision in C#

I'm trying to implement a method to check if a circle and a line intersect. I took most of this code (fixed based on the answer) and also modified it a bit to use Point instead of Vector2f.
This is currently what I have:
private bool CircleLineIntersect(int x, int y, int radius, Point linePoint1, Point linePoint2) {
    Point p1 = new Point(linePoint1.X, linePoint1.Y);
    Point p2 = new Point(linePoint2.X, linePoint2.Y);
    p1.X -= x;
    p1.Y -= y;
    p2.X -= x;
    p2.Y -= y;
    float dx = p2.X - p1.X;
    float dy = p2.Y - p1.Y;
    float dr = (float)Math.Sqrt((double)(dx * dx) + (double)(dy * dy));
    float D = (p1.X * p2.Y) - (p2.X * p1.Y);
    float di = (radius * radius) * (dr * dr) - (D * D);
    if (di < 0) return false;
    else return true;
}
It looks consistent with this algorithm, so I'm not sure what the problem is.
If anyone could provide guidance it would be much appreciated.
EDIT:
It doesn't seem to be calculating correctly. For example, with input x=1272, y=1809, radius=80, linePoint1={X=1272, Y=2332}, linePoint2={X=1272, Y=2544}, there shouldn't be an intersection (y + radius is less than both y values of the line segment), but the function is returning true.

The error is in your test case. Not only does the line intersect, it goes through the center of the circle: the line is the vertical line X = 1272, and your circle is centered at (1272, 1809), so the line passes through the center.
Perhaps you have a misunderstanding between the terms line and line segment in mathematics: the determinant formula you are using tests against the infinite line, not just the segment between the two points.
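If you need the check against the segment itself (as in the test case above), one common approach is to clamp the projection of the circle's center onto the segment and compare the closest point's distance with the radius. A minimal sketch, assuming the same Point-based signature as the method above:
private bool CircleSegmentIntersect(int x, int y, int radius, Point linePoint1, Point linePoint2)
{
    double dx = linePoint2.X - linePoint1.X;
    double dy = linePoint2.Y - linePoint1.Y;
    double lengthSquared = dx * dx + dy * dy;

    // Projection of the circle's center onto the line, clamped to the segment [0, 1].
    // t stays 0 for a degenerate segment where both endpoints coincide.
    double t = 0;
    if (lengthSquared > 0)
    {
        t = ((x - linePoint1.X) * dx + (y - linePoint1.Y) * dy) / lengthSquared;
        t = Math.Max(0, Math.Min(1, t));
    }

    // Closest point on the segment to the circle's center.
    double closestX = linePoint1.X + t * dx;
    double closestY = linePoint1.Y + t * dy;
    double distX = x - closestX;
    double distY = y - closestY;

    // Intersects if the closest point lies within the radius.
    return distX * distX + distY * distY <= (double)radius * radius;
}
With your test case this returns false, because the closest point on the segment is (1272, 2332), which is 523 pixels from the center.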

Related

How to decrease and increase values as a curve

I am trying to make a digging tool for my game. I have the x and y coordinates of points A and B, and I want to create a curve between these points. Nothing graphical, I just need to loop through the coordinates (float x, float y).
I am not good at explaining, so here is a visual example:
The first image is what happens if I just use a for loop to decrease the y value until the middle and then increase it from the middle to the end.
//Very specific code for my example
//I wrote it just for this example so I am not sure if it works
float y;
float x;

public void Example(float startX, float endX, float startY, float endY, float depth)
{
    y = startY;
    x = startX;
    float changeAmountOfY = depth / (endX - startX);
    for (int i = (int)startX; i < (startX + endX) / 2; i++)
    {
        x++;
        y -= changeAmountOfY;
    }
    for (int i = (int)(startX + endX) / 2; i < endX; i++)
    {
        x++;
        y += changeAmountOfY;
    }
}

public void ChangeCoordinates()
{
    Example(100f, 200f, 100f, 100f, 50f);
}
The second image is what I need.
I am developing the game in Unity and I am using Vector2 for the coordinates, but that is not important.
Pure C# or even C++ is welcome.
It is also fine if someone can just explain the math behind what I am trying to do.
Maybe this can help:
// Calculate radius
int radius = (B.X - A.X) / 2;

// Calculate middle
int middle_x = A.X + radius;
int middle_y = A.Y;
// or: int middle_y = (A.Y + B.Y) / 2;

// Coordinates for a semicircle, 0 to 180 degrees
for (int i = 0; i <= 180; i++)
{
    double x_coordinate = middle_x + radius * Math.Cos(i * Math.PI / 180);
    // Opened to bottom
    double y_coordinate = middle_y + radius * Math.Sin(i * Math.PI / 180);
    // or, opened to top:
    // double y_coordinate = middle_y - radius * Math.Sin(i * Math.PI / 180);
}
Take a look at the unit circle.
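As a self-contained sketch of that idea (the names are illustrative, and it assumes A and B lie on the same horizontal line with A to the left of B):
using System;
using System.Collections.Generic;

// Collects the semicircle coordinates between A and B, opened towards the bottom.
static List<(float x, float y)> SemicirclePoints(float ax, float ay, float bx)
{
    var points = new List<(float x, float y)>();

    double radius = (bx - ax) / 2.0;
    double middleX = ax + radius;
    double middleY = ay;

    // 0 to 180 degrees; use -Sin instead of +Sin to open the curve towards the top.
    for (int i = 0; i <= 180; i++)
    {
        double angle = i * Math.PI / 180;
        double x = middleX + radius * Math.Cos(angle);
        double y = middleY + radius * Math.Sin(angle);
        points.Add(((float)x, (float)y));
    }

    return points;
}
Note that a semicircle's depth always equals its radius (half the distance between A and B); if you need the arbitrary depth from your Example method, scale the Sin term by depth instead of radius, which turns the curve into a half-ellipse.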

Find an index for a given Point coordinate from an array of Points

Given a Point array and an arbitrary x,y coordinate, find the index into _points that is closest to the given coordinate.
PointD[] _points;

//create a list of x,y coordinates:
for (int i = 0; i < _numberOfArcSegments + 1; i++)
{
    double x1 = _orbitEllipseSemiMaj * Math.Sin(angle) - _focalDistance; //we add the focal distance so the focal point is "center"
    double y1 = _orbitEllipseSemiMinor * Math.Cos(angle);

    //rotates the points to allow for the LongditudeOfPeriapsis.
    double x2 = (x1 * Math.Cos(_orbitAngleRadians)) - (y1 * Math.Sin(_orbitAngleRadians));
    double y2 = (x1 * Math.Sin(_orbitAngleRadians)) + (y1 * Math.Cos(_orbitAngleRadians));

    angle += _segmentArcSweepRadians;
    _points[i] = new PointD() { x = x2, y = y2 };
}
I'm drawing an ellipse which represents an orbit. I'm first creating the point array above, then when I draw it, I (attempt) to find the point closest to where the orbiting body is.
To do this I've been attempting to calculate the angle from the center of the ellipse to the body:
public void Update()
{
    //adjust so moons get the right positions (body position - focal point position)
    Vector4 pos = _bodyPositionDB.AbsolutePosition - _positionDB.AbsolutePosition;

    //adjust for focal point
    pos.X += _focalDistance;

    //rotate to the LonditudeOfPeriapsis.
    double x2 = (pos.X * Math.Cos(-_orbitAngleRadians)) - (pos.Y * Math.Sin(-_orbitAngleRadians));
    double y2 = (pos.X * Math.Sin(-_orbitAngleRadians)) + (pos.Y * Math.Cos(-_orbitAngleRadians));

    _ellipseStartArcAngleRadians = (float)(Math.Atan2(y2, x2)); //Atan2 returns a value between -180 and 180;
}
then:
double unAdjustedIndex = (_ellipseStartArcAngleRadians / _segmentArcSweepRadians);
while (unAdjustedIndex < 0)
{
    unAdjustedIndex += (2 * Math.PI);
}
int index = (int)unAdjustedIndex;
The ellipse draws fine (the point array is correct, and all is good once adjusted for viewscreen and camera offsets and zoom), but it does not start at the correct point (I'm decreasing the alpha of the color so the resulting ellipse fades away the further it gets from the body).
I've spent days trying to figure out what I'm doing wrong here and tried a dozen different things, but I'm not seeing where my math is wrong.
I assume that _points is an array of PointD.
This is the shortest way to get the closest point from your array (CalcDistance should be a simple function that calculates the Euclidean distance):
PointD p = _points.OrderBy(pt => CalcDistance(pt, givenPoint)).First();
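For reference, a sketch of what CalcDistance could look like, plus a variant that returns the index rather than the point, since that is what the question asks for (helper names are illustrative):
// Plain Euclidean distance between two PointD values.
static double CalcDistance(PointD a, PointD b)
{
    double dx = a.x - b.x;
    double dy = a.y - b.y;
    return Math.Sqrt(dx * dx + dy * dy);
}

// The question asks for the index, so a simple linear scan keeps that explicit.
static int ClosestIndex(PointD[] points, PointD givenPoint)
{
    int bestIndex = 0;
    double bestDistance = double.MaxValue;
    for (int i = 0; i < points.Length; i++)
    {
        double d = CalcDistance(points[i], givenPoint);
        if (d < bestDistance)
        {
            bestDistance = d;
            bestIndex = i;
        }
    }
    return bestIndex;
}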

How to detect if the line intersects in C#?

I want to add an additional feature to my project in C#. I can already draw lines in my program, but I want to detect the intersecting lines of a drawn line and display the point where they intersect. Is that possible? Thank you.
My program also computes the perpendicular distance; here is the sample code:
public static Double PerpendicularDistance(Point Point1, Point Point2, Point Point)
{
    Double area = Math.Abs(.5 * (Point1.X * Point2.Y + Point2.X * Point.Y + Point.X * Point1.Y - Point2.X * Point1.Y - Point.X * Point2.Y - Point1.X * Point.Y));
    Double bottom = Math.Sqrt(Math.Pow(Point1.X - Point2.X, 2) + Math.Pow(Point1.Y - Point2.Y, 2));
    Double height = area / bottom * 2;
    return height;
}
The Point here is a class for my X and Y coordinates.
If you are trying to find the intersection of two lines, the solution is fairly trivial.
If the two lines are in the form Ax + By = C:
// a1, b1, c1 describe the first line and a2, b2, c2 the second, each in the form Ax + By = C
float delta = a1*b2 - a2*b1;

if (delta == 0)
    throw new ArgumentException("Lines are parallel");

float x = (b2*c1 - b1*c2)/delta;
float y = (a1*c2 - a2*c1)/delta;
My concern is the comment above that says there is only one drawn line. I'm not sure what you mean: does the app provide one line and the user the other, or are we dealing with curved lines where the line intersects itself?
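For reference, a self-contained sketch of that formula, assuming each line is given by two points and converted to the Ax + By = C form (the names are illustrative, and System.Drawing.PointF stands in for your own Point class):
public static bool TryGetIntersection(PointF line1Start, PointF line1End, PointF line2Start, PointF line2End, out PointF intersection)
{
    // Coefficients of line 1 in the form a1*x + b1*y = c1
    float a1 = line1End.Y - line1Start.Y;
    float b1 = line1Start.X - line1End.X;
    float c1 = a1 * line1Start.X + b1 * line1Start.Y;

    // Coefficients of line 2 in the form a2*x + b2*y = c2
    float a2 = line2End.Y - line2Start.Y;
    float b2 = line2Start.X - line2End.X;
    float c2 = a2 * line2Start.X + b2 * line2Start.Y;

    float delta = a1 * b2 - a2 * b1;
    if (Math.Abs(delta) < 1e-6f)
    {
        intersection = PointF.Empty;
        return false; // parallel (or coincident) lines never meet at a single point
    }

    intersection = new PointF((b2 * c1 - b1 * c2) / delta, (a1 * c2 - a2 * c1) / delta);
    return true;
}
Note this is the intersection of the infinite lines; if the point must lie on both drawn segments, you would additionally check that it falls between each pair of endpoints.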

How do I calculate angle from two coordinates?

I'm working on a project with 3D based objects and manipulating them via my program. I currently have a textbox that allows me to put a heading in degrees and a button that will calculate the required values to make my main object change its heading. This is the code for that function:
private void btnSetHeading_Click(object sender, EventArgs e)
{
    if (this.textBoxHeading.Text.Length == 0)
        return;

    float heading = (float)0;
    try
    {
        heading = float.Parse(this.textBoxHeading.Text);
    }
    catch (FormatException ex)
    {
        MessageBox.Show(ex.Message);
        return;
    }

    if (heading < (float)0 || heading > (float)360)
    {
        MessageBox.Show("Invalid heading parameter. Acceptable range: 0 - 360");
        return;
    }

    float tempCosine = (float)Math.Cos(((heading * Math.PI) / (float)360.0));
    float tempSine = -((float)Math.Sin(((heading * Math.PI) / (float)360.0)));

    try
    {
        ProgramInterop.CreateInstance.SetHeading(tempCosine, tempSine);
    }
    catch (Exception ex)
    {
        MessageBox.Show("Caught: " + ex.Message);
    }
}
If I supply 90 as the heading to face, the results are tempCosine=0.7071068 and tempSine=-0.7071068, which then makes my main object face 90 degrees or due east.
The program requires the heading to be given as two separate values (tempCosine and tempSine). I'm not familiar enough with geometry to understand why I would use 360 instead of 180, but this is how it's required to work.
The next part of my project involves making my main object face another object, given both of their (x, y) coordinates. If, for example, my main object is at (9112.94, 22088.74) and the new object I want to face is at (9127.04, 22088.88), it would require almost exactly a 90 degree heading to make it face the new object.
How can I calculate the tempCosine and tempSine from those two coordinates?
Regarding 180, that's true. I'm used to having an extension class like this for working with radians and degrees.
public static class Extension
{
    public static double ToRadians(this double degree)
    {
        return degree * Math.PI / 180;
    }

    public static double ToDegrees(this double val)
    {
        return val * 180 / Math.PI;
    }
}
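Usage is then just a call on any double (assuming the Extension class above is in scope):
double radians = 90.0.ToRadians();    // about 1.5708
double degrees = Math.PI.ToDegrees(); // 180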
Regarding sine and cosine (I'm not sure I understood everything right), if I use the code below
float x1 = 9112.94f;
float y1 = 22088.74f;
float x2 = 9127.04f;
float y2 = 22088.88f;
float r = (float) Math.Pow((x2 - x1) * (x2 - x1) + (y2 - y1) * (y2 - y1), 0.5);
float cosine = (x2 - x1) /r;
float sine = (y2 - y1) /r;
I'll get the angle 0.5712978 (not equal to 90).
Sorry if I misunderstood the problem.
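If you want the actual angle rather than the sine/cosine pair, Math.Atan2 on the same difference vector gives it directly; a sketch building on the values above and the ToDegrees extension from the earlier answer:
// Angle of the direction from (x1, y1) to (x2, y2), measured from the positive X axis.
double angleRadians = Math.Atan2(y2 - y1, x2 - x1);
double angleDegrees = angleRadians.ToDegrees(); // roughly 0.57 degrees for these points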
I was able to work out the answer using Dani's suggestion to use Atan2(y, x). Thanks Dani.
float x1 = 9112.94f;
float y1 = 22088.74f;
float x2 = 9127.04f;
float y2 = 22088.88f;

float angleRadians;
float diffX = x2 - x1;
float diffY = y2 - y1;
float atan2Result = (float)Math.Atan2(diffX, diffY);
angleRadians = atan2Result / 2;
if (angleRadians < 0.0f)
    angleRadians += (float)Math.PI;

float tempCosine = (float)Math.Cos(angleRadians);
float tempSine = -((float)Math.Sin(angleRadians));

Shorten a line by a number of pixels

I'm drawing a custom diagram of business objects using .NET GDI+. Among other things, the diagram consists of several lines that are connecting the objects.
In a particular scenario, I need to shorten a line by a specific number of pixels, let's say 10 pixels, i.e. find the point on the line that lies 10 pixels before the end point of the line.
Imagine a circle with radius r = 10 pixels, and a line with start point (x1, y1) and end point (x2, y2). The circle is centered at the end point of the line, as in the following illustration.
How do I calculate the point marked with a red circle, i.e. the intersection between circle and line? This would give me the new end point of the line, shortening it by 10 pixels.
Solution
Thank you for your answers from which I was able to put together the following procedure. I named it LengthenLine, since I find it more natural to pass a negative number of pixels if I want the line shortened.
Specifically, I was trying to put together a function that could draw a line with rounded corners, which can be found here.
public void LengthenLine(PointF startPoint, ref PointF endPoint, float pixelCount)
{
    if (startPoint.Equals(endPoint))
        return; // not a line

    double dx = endPoint.X - startPoint.X;
    double dy = endPoint.Y - startPoint.Y;
    if (dx == 0)
    {
        // vertical line:
        if (endPoint.Y < startPoint.Y)
            endPoint.Y -= pixelCount;
        else
            endPoint.Y += pixelCount;
    }
    else if (dy == 0)
    {
        // horizontal line:
        if (endPoint.X < startPoint.X)
            endPoint.X -= pixelCount;
        else
            endPoint.X += pixelCount;
    }
    else
    {
        // non-horizontal, non-vertical line:
        double length = Math.Sqrt(dx * dx + dy * dy);
        double scale = (length + pixelCount) / length;
        dx *= scale;
        dy *= scale;
        endPoint.X = startPoint.X + Convert.ToSingle(dx);
        endPoint.Y = startPoint.Y + Convert.ToSingle(dy);
    }
}
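A small usage sketch (the points are chosen arbitrarily): passing a negative pixel count shortens the line, so trimming 10 pixels off the end looks like this:
PointF start = new PointF(10f, 10f);
PointF end = new PointF(110f, 60f);

// Moves 'end' 10 pixels back towards 'start' along the line.
LengthenLine(start, ref end, -10f);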
Find the direction vector: with the position vectors (using floats) B = (x2, y2) and A = (x1, y1), the direction is AB = B - A. Normalize that vector by dividing by its length (Math.Sqrt(x*x + y*y)). Then multiply the direction vector AB by the original length minus the circle's radius, and add it back to the line's starting position:
double dx = x2 - x1;
double dy = y2 - y1;
double length = Math.Sqrt(dx * dx + dy * dy);
if (length > 0)
{
    dx /= length;
    dy /= length;
}
dx *= length - radius;
dy *= length - radius;
int x3 = (int)(x1 + dx);
int y3 = (int)(y1 + dy);
Edit: Fixed the code, aaand fixed the initial explanation (thought you wanted the line to go out from the circle's center to its perimeter :P)
I'm not sure why you even had to introduce the circle. For a line stretching from (x2,y2) to (x1,y1), you can calculate any point on that line as:
(x2+p*(x1-x2),y2+p*(y1-y2))
where p is the percentage along the line you wish to go.
To calculate the percentage, you just need p = r/L, where r is the distance to trim (10 pixels here) and L is the length of the line.
So in your case, (x3,y3) can be calculated as:
(x2+(10/L)*(x1-x2),y2+(10/L)*(y1-y2))
For example, if you have the two points (x2=1, y2=5) and (x1=-6, y1=22), they have a length of sqrt(7² + 17²), or 18.38477631, and 10 divided by that is 0.543928293. Putting all those figures into the equation above:
(x2 + (10/L) * (x1-x2), y2 + (10/L) * (y1-y2))
= (1 + 0.543928293 * (-6 - 1), 5 + 0.543928293 * (22 - 5))
= (1 + 0.543928293 * -7, 5 + 0.543928293 * 17)
= (x3 = -2.807498053, y3 = 14.24678098)
The distance between (x3,y3) and (x1,y1) is sqrt(3.192501947² + 7.753219015²), or 8.384776311, a difference of 10 from the original length to within one part in a thousand million, and that's only because of rounding errors on my calculator.
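The same percentage approach as a small C# sketch (the helper name and PointF types are my own, not from the original post); it returns the point r pixels back from (x2, y2) towards (x1, y1):
// Trims r pixels off the p2 end of the segment p1-p2 and returns the new end point.
static PointF TrimFromEnd(PointF p1, PointF p2, float r)
{
    float dx = p1.X - p2.X;
    float dy = p1.Y - p2.Y;
    float length = (float)Math.Sqrt(dx * dx + dy * dy);
    if (length <= r)
        return p1; // segment is shorter than the trim distance; fall back to the start point

    float p = r / length; // the percentage along the line, measured from p2 towards p1
    return new PointF(p2.X + p * dx, p2.Y + p * dy);
}
With the example points above, TrimFromEnd(new PointF(-6f, 22f), new PointF(1f, 5f), 10f) gives roughly (-2.807, 14.247).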
You can use similar triangles. For the main triangle, d is the hypotenuse and the extension of r is the vertical line that meets the right angle. Inside the circle you will have a smaller triangle with a hypotenuse of length r.
r/d = (x2-a0)/(x2-x1) = (y2-b0)/(y2-y1)
a0 = x2 - (x2-x1)r/d
b0 = y2 - (y2-y1)r/d
