Project 3D coordinates to a 2D plane - C#

I have a plane defined by a normal vector, and a normalised direction vector that moves along that plane, both in 3D space.
I'm trying to figure out how to project that 3D direction vector onto the plane so that it ends up as a 2D vector with x/y coordinates.

It sounds like you need the angle between the direction vector and the plane. The size of the projection scales with the cosine of that angle. Since the plane's normal vector is perpendicular to the plane, you can get at it through the angle between the normal vector and your direction vector instead.
The angle between two vectors is given by cos(theta) = dot(a, b) / (|a| * |b|). That gives us theta. Take sin(theta), and we have the scaling factor (I'll call it s).
Next, you need to define unit-length axis vectors on the plane to project onto. It's probably easiest to set one of the unit vectors in the direction the projection moves forward.
With a unit vector set in the direction of the projection, the length of the projection in that unit space is s multiplied by the length of the original vector.
After that, multiply the unit vector by that length to express your vector relative to your normally defined x/y/z axes.
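Putting that together, here is a minimal untested sketch in Unity-style C# (assuming UnityEngine's Vector3/Vector2; the method name and the choice of in-plane axes e1/e2 are my own):
// n: unit normal of the plane; v: the 3D direction vector to flatten.
Vector2 ProjectToPlane2D(Vector3 v, Vector3 n)
{
    // In-plane component of v: subtract the part of v along the normal.
    Vector3 inPlane = v - Vector3.Dot(v, n) * n;
    // Two perpendicular unit axes spanning the plane.
    Vector3 e1 = Vector3.Cross(n, Vector3.up).normalized;
    if (e1 == Vector3.zero) // n was parallel to up; pick a different helper axis
        e1 = Vector3.Cross(n, Vector3.right).normalized;
    Vector3 e2 = Vector3.Cross(n, e1);
    // The 2D coordinates are the components along those axes.
    return new Vector2(Vector3.Dot(inPlane, e1), Vector3.Dot(inPlane, e2));
}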
I hope this helps.

Try something like this. I wrote a paper on this exact method a while ago and can provide you with a copy if you would like.
PointF Transform32(Point3 P)
{
    // Assumes: V is the position vector of P (components I, J, K);
    // sxy, cxy, sz, cz are the camera-angle sines/cosines described below;
    // Origin is the screen-space origin and zoom is the display scale.
    float pX = (float)(((V.J * sxy) - (V.I * cxy)) * zoom);
    float pY = (float)((V.K * cz) - (V.I * sxy * sz) - (V.J * sz * cxy));
    return new PointF(Origin.X + pX, Origin.Y - pY);
}
cxy is the cosine of the x-y camera angle, measured in radians from the positive x-axis on the xy plane.
sxy is the sine of the x-y camera angle.
cz is the cosine of the z camera angle, measured in radians from the x-y plane (so the angle is zero if the camera rests on that plane).
sz is the sine of the z camera angle.
Alternatively:
// V is the position vector of the point, View is the viewing vector,
// and zA is the unit z vector (0, 0, 1).
Vector3 V = new Vector3(P.X, P.Y, P.Z);
Vector3 R = Operator.Project(V, View);  // component of V along the viewing vector
Vector3 Q = V - R;                      // component of V orthogonal to the viewing vector
Vector3 A = Operator.Cross(View, zA);   // artificial X-axis
Vector3 B = Operator.Cross(A, View);    // artificial Y-axis
int pY = (int)(Operator.Dot(Q, B) / B.GetMagnitude());
int pX = (int)(Operator.Dot(Q, A) / A.GetMagnitude());
pY and pX should be your coordinates. Here, vector V is the position vector of the point in question, R is the projection of that vector onto your viewing vector, Q is the component of V orthogonal to the viewing vector, A is an artificial X-axis formed by the cross product of the viewing vector with (0, 0, 1), and B is an artificial Y-axis formed by the cross product of A and the viewing vector.
It sounds like what you're looking for is something like a simple rendering engine, which is what the above formulae came from.
Hope this helps.

Related

How to calculate a point of intersection on the surface of a sphere from a point within the sphere facing out along a given vector

Working in Unity
I have a working solution using raycasts and a sphere collider, but would like to understand how to accomplish the same result using maths alone.
The scenario is as follows:
a) I have a theoretical sphere
b) I project a line from any point within the sphere along any path/direction (both expressed as Vector3s)
c) I would like to determine the point (also a Vector3) at which the path intersects with the surface of the sphere, as if detected by a raycast returning point data.
I am familiar with using cos and sin to plot points on a 2D plane, but not in 3 dimensions.
Hopefully my description is clear enough.
Any help would be most appreciated.
Let the sphere be described by the equation (where (cx, cy, cz) is the sphere center and R is the radius):
(x-cx)^2 + (y-cy)^2 + (z-cz)^2 = R^2
Given point P0 and direction vector dir, the parametric ray equation is
P(t) = P0 + t * dir
or in coordinates
x = p0.x + t * dir.x
y = p0.y + t * dir.y
z = p0.z + t * dir.z
Substitute these expressions into the sphere equation and solve the resulting quadratic equation for the unknown parameter t.
(p0x + t*dirx - cx)^2 + ... = R^2
p0x^2 + t^2*dirx^2 + cx^2 + 2*p0x*t*dirx - 2*p0x*cx - 2*t*dirx*cx + ... = R^2
t^2 * (dirx^2 + diry^2 + dirz^2) +
t * 2*(p0x*dirx - dirx*cx + p0y*diry - diry*cy + p0z*dirz - dirz*cz) +
(p0x^2 + p0y^2 + p0z^2 + cx^2 + cy^2 + cz^2 - 2*(p0x*cx + p0y*cy + p0z*cz) - R^2) = 0
This is a quadratic equation in the unknown t.
You may get 0, 1, or 2 solutions, for the cases: the ray does not intersect the sphere, the ray touches the sphere, or the ray (as a line) intersects the sphere in two points. For a point P0 inside the sphere, one root will be negative; ignore it.
After that, put the t value into the coordinate equations to get the intersection point.
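Since you're working in Unity, here is a minimal untested sketch of that quadratic in C# (the method name and signature are my own):
// Returns the intersection point, or null if the ray misses the sphere.
Vector3? RaySphereIntersect(Vector3 p0, Vector3 dir, Vector3 center, float radius)
{
    Vector3 m = p0 - center;
    float a = Vector3.Dot(dir, dir);               // t^2 coefficient
    float b = 2f * Vector3.Dot(m, dir);            // t coefficient
    float c = Vector3.Dot(m, m) - radius * radius; // constant term
    float disc = b * b - 4f * a * c;
    if (disc < 0f) return null;                    // ray misses the sphere
    float t = (-b + Mathf.Sqrt(disc)) / (2f * a);  // larger root: the forward hit when p0 is inside
    if (t < 0f) return null;                       // intersection lies behind p0
    return p0 + t * dir;
}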

Unity - Apply two local rotations to object (trying to recreate the rotation of a controller joystick into a 3D mesh joystick)

I would like to recreate one on one the rotation of the real life controller joystick (i.e. 360 controller) into a 3D joystick mesh (that resembles the 360 controller one).
I thought about doing it by rotating the joystick on the X axis according to the magnitude of the input (mapping it to a min and max rotation on the X axis), and then figuring out the angle of the input and applying it to the Y axis of the 3D joystick.
This is the code I have; the joystick tilts properly on the X axis, but the rotation on the Y axis doesn't work:
public void SetStickRotation(Vector2 stickInput)
{
    float magnitude = stickInput.magnitude;
    // This function converts the magnitude to a range between the min and max rotation I want to apply to the 3D stick in the X axis
    float rotationX = Utils.ConvertRange(0.0f, 1.0f, m_StickRotationMinX, m_StickRotationMaxX, magnitude);
    float angle = Mathf.Atan2(stickInput.x, stickInput.y);
    // I try to apply both rotations to the 3D model
    m_Stick.localEulerAngles = new Vector3(rotationX, angle, 0.0f);
}
I am not sure why it is not working, or even if I am doing it the right way (i.e. perhaps there is a better way to achieve it).
Many thanks for your input.
I would recommend rotating it by an amount determined by the magnitude, around a single axis determined by the direction. This avoids the joystick spinning around its own axis, which would be especially noticeable with asymmetric joysticks such as pilot-style flight sticks:
Explanation in comments:
public void SetStickRotation(Vector2 stickInput)
{
    /////////////////////////////////////////
    // CONSTANTS (consider making a field) //
    /////////////////////////////////////////
    float maxRotation = 35f; // can rotate 35 degrees from neutral position (up)

    ///////////
    // LOGIC //
    ///////////
    // Convert input to x/z plane
    Vector3 stickInput3 = new Vector3(stickInput.x, 0f, stickInput.y);
    // Determine axis of rotation to produce that direction
    Vector3 axisOfRotation = Vector3.Cross(Vector3.up, stickInput3);
    // Determine angle of rotation
    float angleOfRotation = maxRotation * Mathf.Min(1f, stickInput.magnitude);
    // Apply that rotation to the joystick as a local rotation
    transform.localRotation = Quaternion.AngleAxis(angleOfRotation, axisOfRotation);
}
This will work for joysticks where:
the direction from its axle to its end is the local up direction,
it should have zero (identity) rotation on neutral input, and
stickInput with y=0 should rotate the knob around the stick's forward/back axis, and stickInput with x=0 should rotate the knob around the stick's left/right axis.
Figured out the problem: Atan2 returns the angle in radians, but the code assumed Euler degrees. As soon as I did the conversion, it worked well.
I put the code here in case anyone is interested (note the change in the Atan2 line):
public void SetStickRotation(Vector2 stickInput)
{
    float magnitude = stickInput.magnitude;
    // This function converts the magnitude to a range between the min and max rotation I want to apply to the 3D stick in the X axis
    float rotationX = Utils.ConvertRange(0.0f, 1.0f, m_StickRotationMinX, m_StickRotationMaxX, magnitude);
    // Atan2 returns radians, so convert to degrees before using it as a Euler angle
    float angle = Mathf.Atan2(stickInput.x, stickInput.y) * Mathf.Rad2Deg;
    // Apply both rotations to the 3D model
    m_Stick.localEulerAngles = new Vector3(rotationX, angle, 0.0f);
}

Angle of camera to intersect with object vector

I am trying to determine the angle at which the camera's forward vector intersects with an object's vector.
Sorry, this is not straightforward to explain with my knowledge, so please see the attached diagram: the camera may not be looking directly at the object (OBJ), and I'd like to know the angle (? in the diagram) at which the camera's forward vector (V1, in red) intersects the object's vector (V2, in red), if it does, e.g. at point A, B, or C, depending on the x-rotation of the camera.
I tried calculating a normalized vector for the red lines, v1 and v2.
Then I calculated the angle between the two vectors (https://onlinemschool.com/math/library/vector/angl/).
But the results don't match the expected values when testing.
//v1
Vector3 hypoth = Camera.main.transform.forward.normalized;
//v2
Vector3 adjacent = (new Vector3(obj.transform.position.x, obj.transform.position.y, Camera.main.transform.position.z)
    - obj.transform.position).normalized;
float dotProd = Vector3.Dot(adjacent, hypoth);
// both vectors are already normalized, so the magnitudes here are 1
float cosOfAngle = dotProd / (Vector3.Magnitude(adjacent) * Vector3.Magnitude(hypoth));
double radAngle = Math.Acos(cosOfAngle);
float angle = (float)((180 / Math.PI) * radAngle);
Finding the angle between v1 and v2 gives you a different angle from the one marked in your diagram. Instead, solve for the angle between v1 and the plane normal to v2.
We can do this in Unity by projecting v1 onto the plane normal to v2 using Vector3.ProjectOnPlane, and then finding the angle between that projection and v1 using Vector3.Angle:
Vector3 projection = Vector3.ProjectOnPlane(hypoth, adjacent);
float angle = Vector3.Angle(projection, hypoth);
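For reference, a self-contained untested sketch (the method name is my own; hypoth and adjacent are built exactly as in the question's snippet):
float AngleToObject(Camera cam, Transform obj)
{
    // v1: camera forward vector
    Vector3 hypoth = cam.transform.forward.normalized;
    // v2: the object's vector, constructed as in the question
    Vector3 adjacent = (new Vector3(obj.position.x, obj.position.y, cam.transform.position.z)
        - obj.position).normalized;
    // Flatten v1 onto the plane whose normal is v2, then measure the angle in degrees
    Vector3 projection = Vector3.ProjectOnPlane(hypoth, adjacent);
    return Vector3.Angle(projection, hypoth);
}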
I had a similar situation where I wanted to set the colliders of the terrain units at the same height as the player's jet while keeping them on the camera's line of sight; otherwise, when you shoot the terrain units, the bullets appear to pass through the enemy units on the ground. This only matters with a perspective camera; with an orthographic one you may not need it at all, since just setting the object at the same height as the camera keeps everything aligned.
Here is my code:
void SetColliderLocation()
{
    // Object on the ground
    A = TerrainUnit.transform.position;
    // Camera location
    B = cam.transform.position;
    // Player jet height
    height = mainPlayerTransform.position.y;
    // Normalized direction from the camera (B) towards the ground unit (A)
    AB_Normalized = (A - B).normalized;
    // Distance along that direction needed to reach the jet's height
    unitVector = (height - A.y) / AB_Normalized.y;
    // Set the collider's position so it sits at the jet's height on the camera's line of sight
    collidarGameObject.transform.position = (AB_Normalized * unitVector) + A;
}
I hope this is somewhat similar to what you are looking for.
Edit:
If you apply this script and put a box in place of the collider, you will see that the box always stays on the line between the camera in the sky and the object on the ground, however the camera or the ground object moves.

Unity destination Vector based on rotation and the amount to move

I'm having an issue where I can't figure out the algorithm for finding the destination point based on an object's rotation and the amount to move. I have to move in the direction of my rotation by a certain amount, but I don't know how to calculate the destination point I end up at. Example:
Object location = (0, 0)
Object rotation = 45
Amount to move = 4
with these variables the destination point would be (2.5, 2.5)
Example 2:
Object location = (0, 0)
Object rotation = 0
Amount to move = 4
and with these it would be (0, 4)
The problem is, I don't know how to calculate the destination point when I know those variables. I need an algorithm that will calculate the destination point, can somebody help with this? :)
Regards, Tuukka.
If this is a strictly algorithmic question where you want to calculate the destination point (i.e. no game object to move around, but abstract data), you can do this:
Consider the two-dimensional plane in Cartesian coordinates (i.e. the standard x/y system). Let O be an object at point (0,0). From your "destination point" (2.5, 2.5) I assume you want to move along a line segment at 45° to the x-axis.
So 45° is the angle and 4 (the amount to move) is the length of the line segment you want to move along. Starting from (0,0), the end point can be calculated using sine and cosine via the polar representation of a point:
(x, y) = (r * cos(alpha), r * sin(alpha))
Note that your expected destination (2.5, 2.5) is slightly off, as the following computation shows: if the movement is along a line with a slope angle of 45°, you land a little elsewhere.
For this example, alpha would be 45°, which is pi/4 in radians (you get this by dividing by 180 and multiplying by pi), and the radius r would be 4 (the amount we want to move), so the destination point works out to:
(x, y) = (4 * cos(pi/4), 4 * sin(pi/4)) ≈ (2.83, 2.83)
If the point is located anywhere in the plane (not at (0,0) but at (x_0, y_0)), then you can still add it as an offset:
(x, y) = (x_0 + r * cos(alpha), y_0 + r * sin(alpha))
So in code you'd write:
public static Vector2 ComputeDestination(Vector2 origin, float amountToMove, float angle)
{
    // convert degrees to radians
    var rad = angle * Mathf.Deg2Rad;
    // calculate end point
    var end_point = origin + amountToMove * new Vector2(Mathf.Cos(rad), Mathf.Sin(rad));
    return end_point;
}
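For example, calling it with the numbers from the question (a hypothetical call, matching the examples above):
Vector2 dest = ComputeDestination(Vector2.zero, 4f, 45f);
// dest ≈ (2.828427f, 2.828427f), not (2.5, 2.5)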
float howMuchToMove = 4f;
float angle = 45f;
float pointX = Mathf.Cos(ConvertToRadians(angle)) * howMuchToMove;
float pointY = Mathf.Sin(ConvertToRadians(angle)) * howMuchToMove;

public float ConvertToRadians(float angle)
{
    return (Mathf.PI / 180f) * angle;
}
For these values you will get both coordinates at 2.828427f.

How do I convert these 3D camera trig equations to work on a new axis

The following function calculates the target vector of my FPS camera to pass to the OpenGL LookAt method. A camera orientation (in radians) of (0,0,0) means the camera is parallel to the z axis, looking in the negative direction, with the camera's right vector parallel to the x axis in the positive direction.
static Matrix4 GetViewMatrix()
{
    Vector3 cameraup = Vector3.Transform(Vector3.UnitY, Quaternion.FromAxisAngle(LineOfSightVector, Orientation.Z));
    LineOfSightVector.X = (float)(Math.Sin((float)Orientation.X) * Math.Cos((float)Orientation.Y));
    LineOfSightVector.Y = (float)Math.Sin((float)Orientation.Y);
    LineOfSightVector.Z = (float)(Math.Cos((float)Orientation.X) * Math.Cos((float)Orientation.Y));
    return Matrix4.LookAt(Position, Position + LineOfSightVector, cameraup) * View; // View = CreatePerspectiveFieldOfView Matrix4
}
It works fine when the camera's up axis is (0,1,0). However, I have introduced a Z value to my camera orientation (roll), which I use to get the "cameraup" vector. I now need to adjust the three trig equations for the LineOfSightVector to take the change in the "up" vector into account, so that the camera controls move in the right directions. Can someone please advise me on this?
Thanks
Having
lineOfSight = vec3(sin(phi)*cos(ksi), sin(ksi), cos(phi)*cos(ksi));
you could compute right and up directions as follows:
right = vec3(cos(phi)*cos(ksi), 0, -sin(phi)*cos(ksi));
up = cross(lineOfSight, right);
up = normalize(up);
Notice that in such a model, the cases where cos(ksi) == 0 (looking straight up or down) need to be handled separately.
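For illustration, here is a minimal untested sketch in OpenTK-style C# (phi and ksi are the X/Y orientation angles in radians; the 1e-6 threshold and variable names are my own):
Vector3 los = new Vector3(
    (float)(Math.Sin(phi) * Math.Cos(ksi)),
    (float)Math.Sin(ksi),
    (float)(Math.Cos(phi) * Math.Cos(ksi)));
Vector3 right;
if (Math.Abs(Math.Cos(ksi)) < 1e-6)
    // Looking straight up or down: the right formula below degenerates to zero,
    // so drop the cos(ksi) factor and keep the horizontal right direction.
    right = new Vector3((float)Math.Cos(phi), 0f, -(float)Math.Sin(phi));
else
    right = new Vector3(
        (float)(Math.Cos(phi) * Math.Cos(ksi)),
        0f,
        (float)(-Math.Sin(phi) * Math.Cos(ksi)));
Vector3 up = Vector3.Normalize(Vector3.Cross(los, right));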
