Determine screen coordinates from a point in DirectX world - C#

I have a point in space represented by a 4x4 matrix. I'd like to get the screen coordinates for the point. Picking appears to be the exact opposite of what I need. I'm using the screen coordinates to determine where to draw text.
Currently the text I draw is floating in space far in front of the points. I've attached zoomed-in and zoomed-out screenshots to better explain. As you can see in the screenshots, the distance between each point is the same when zoomed in, when it should be smaller.
Am I missing a transformation? World coordinates consider 0,0,0 to be the center of the grid. I'm using SlimDX.
var viewProj = mMainCamera.View * mMainCamera.Projection;
// Transform the origin by the point's matrix to get its position (returns a Vector4)
var originalXyz = Vector3.Transform(Vector3.Zero, matrix);
// Vector4 to Vector3
Vector3 worldSpaceCoordinates = new Vector3(originalXyz.X, originalXyz.Y, originalXyz.Z);
// Transform the point by the view-projection matrix
var transformedCoords = Vector3.Transform(worldSpaceCoordinates, viewProj);
Vector3 clipSpaceCoordinates = new Vector3(transformedCoords.X, transformedCoords.Y, transformedCoords.Z);
// Map from [-1, 1] to pixel coordinates
Vector2 pixelPosition = new Vector2(
    (float)(0.5 * (clipSpaceCoordinates.X + 1) * ActualWidth),
    (float)(0.5 * (clipSpaceCoordinates.Y + 1) * ActualHeight));

Turns out I was way overthinking this. Just project the point to the screen by passing your viewport information to Vector3.Project. It's a three-line solution:
var viewProj = mMainCamera.View * mMainCamera.Projection;
var vp = mDevice.ImmediateContext.Rasterizer.GetViewports()[0];
var screenCoords = Vector3.Project(worldSpaceCoordinates, vp.X, vp.Y, vp.Width, vp.Height, vp.MinZ, vp.MaxZ, viewProj);
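For reference, here is what the manual route in the question was missing: the perspective divide by W and the Y flip. A minimal sketch, assuming SlimDX and that ActualWidth/ActualHeight match the viewport dimensions:
var clip = Vector3.Transform(worldSpaceCoordinates, viewProj); // returns a Vector4
// Perspective divide: clip space -> normalized device coordinates
var ndc = new Vector3(clip.X / clip.W, clip.Y / clip.W, clip.Z / clip.W);
// Map NDC [-1, 1] to pixels; Y flips because screen Y grows downward
var pixel = new Vector2(
    (ndc.X + 1f) * 0.5f * (float)ActualWidth,
    (1f - ndc.Y) * 0.5f * (float)ActualHeight);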

Related

Arc lines between 2 objects on a sphere Unity3D

I need a little help with the maths for drawing lines between 2 points on a sphere. I have a 3D globe with some markers on it, and I need to draw a curved line from point 1 to point 2. I managed to draw lines from point to point with LineRenderer, but they are drawn at the wrong angle and I can't figure out how to implement lines that go at the right angle. The code so far:
public static void DrawLine(Transform From, Transform To)
{
    float count = 12f;
    GameObject line = new GameObject("Line");
    LineRenderer linerenderer = line.AddComponent<LineRenderer>();
    var points = new List<Vector3>();
    Vector3 center = new Vector3(
        (From.transform.position.x + To.transform.position.x) / 2f,
        (From.transform.position.y + To.transform.position.y),
        (From.transform.position.z + To.transform.position.z) / 2f
    );
    for (float ratio = 0; ratio <= 1; ratio += 1 / count)
    {
        var tangent1 = Vector3.Lerp(From.position, center, ratio);
        var tangent2 = Vector3.Lerp(center, To.position, ratio);
        var curve = Vector3.Lerp(tangent1, tangent2, ratio);
        points.Add(curve);
    }
    linerenderer.positionCount = points.Count;
    linerenderer.SetPositions(points.ToArray());
}
So what I have now is creepy lines rising up along the Y axis:
What should I take into account to make the lines follow the sphere?
I suggest you find the normal vector of your two points with a cross product (if your sphere is centered at the origin), then normalize it and use it as a rotation axis for a quaternion rotation. To do the interpolation, simply rotate the first point around this axis by an angle of k * a, where k is a parameter from 0 to 1 and a is the angle between your two vectors, which you can find with the acos() of the dot product of your two normalized points.
EDIT: I thought of a much easier solution (again, if the sphere is centered): you can do a lerp between your two vectors, then normalize the result and multiply it by the radius of the sphere. However, the spacing between the resulting points won't be constant, especially if the points are far from each other.
EDIT 2: you can fix the uneven spacing of the second solution by using a function instead of a linear parameter for the lerp: f(t) = sin(t*a)/sin((PI + a*(1 - 2*t))/2)/dist(point1, point2), where a is the angle between the two points.
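For the first suggestion, a minimal Unity sketch (assuming the sphere is centered at the origin and both points lie on its surface; ArcPoint is a hypothetical helper):
// Interpolate along the great circle by rotating `from` toward `to`
// around their common normal.
static Vector3 ArcPoint(Vector3 from, Vector3 to, float k)
{
    Vector3 axis = Vector3.Cross(from, to).normalized;                     // rotation axis
    float angle = Mathf.Acos(Vector3.Dot(from.normalized, to.normalized)); // radians
    // Rotate `from` by the fraction k of the full angle (AngleAxis takes degrees)
    return Quaternion.AngleAxis(k * angle * Mathf.Rad2Deg, axis) * from;
}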

Getting the position of the tip of a sprite while rotated

I want to get the position of the sprite's tip while it's rotated.
Example
The bigger dot is the location I want to get (while rotated). Since the sprite is rotated, it's quite hard to get, and I'm clueless about how to do it. The origin is (0,0), for anyone asking. Is there any math that needs to be done to get this?
What you need to do is create a Vector2 from the origin to the target point before it's rotated, and then rotate that vector.
For example, let's say we have a sprite with the origin point in the center and we want to know where the front point will be when the sprite is rotated.
First, create a vector from the origin to the point you want rotated on the unrotated sprite. In this case it's probably going to be something like 20 pixels to the right of the center:
var vector = new Vector2(20, 0);
Now we can rotate our vector with this simple function (borrowed from MonoGame.Extended)
public static Vector2 Rotate(Vector2 value, float radians)
{
    var cos = (float)Math.Cos(radians);
    var sin = (float)Math.Sin(radians);
    return new Vector2(value.X * cos - value.Y * sin, value.X * sin + value.Y * cos);
}
Now we can get our rotated vector like so:
var radians = MathHelper.ToRadians(degrees: -33);
var rotatedVector = Rotate(vector, radians);
To get this vector back into "sprite" space we can add it back to our origin point:
var point = origin + rotatedVector;
Or alternatively, if you want it in "world" space, you can also add your sprite's position:
var worldPoint = position + origin + rotatedVector;
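For example, putting it together with hypothetical values (a 40x40 sprite whose origin is its center):
var origin = new Vector2(20, 20);        // center of a 40x40 sprite
var position = new Vector2(100, 100);    // where the sprite is drawn
var vector = new Vector2(20, 0);         // origin -> tip, unrotated
var worldPoint = position + origin + Rotate(vector, MathHelper.ToRadians(-33));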
Happy coding!

How do I convert these 3D camera trig equations to work on a new axis

The following function calculates the target vector of my FPS camera to put in the OpenGL LookAt method. A camera orientation of (0, 0, 0) (in radians) means the camera is parallel to the z axis looking in the negative direction, with the camera right vector parallel to the x axis in the positive direction.
static Matrix4 GetViewMatrix()
{
    Vector3 cameraup = Vector3.Transform(Vector3.UnitY, Quaternion.FromAxisAngle(LineOfSightVector, Orientation.Z));
    LineOfSightVector.X = (float)(Math.Sin(Orientation.X) * Math.Cos(Orientation.Y));
    LineOfSightVector.Y = (float)Math.Sin(Orientation.Y);
    LineOfSightVector.Z = (float)(Math.Cos(Orientation.X) * Math.Cos(Orientation.Y));
    return Matrix4.LookAt(Position, Position + LineOfSightVector, cameraup) * View; // View = CreatePerspectiveFieldOfView Matrix4
}
It works fine when the camera y axis is (0, 1, 0). However, I have introduced a Z value to my camera orientation (roll), which I use to get the "cameraup" vector. I now need to adjust the three trig equations for the LineOfSightVector to take the change in the up vector into account, so the camera controls move in the right direction. Can someone please advise me on this?
Thanks
Having
lineOfSight = vec3(sin(phi)*cos(ksi), sin(ksi), cos(phi)*cos(ksi));
you could compute right and up directions as follows:
right = vec3(cos(phi)*cos(ksi), 0, -sin(phi)*cos(ksi));
up = cross(lineOfSight, right);
up = normalize(up);
Notice that in such a model, the case cos(ksi) == 0 should be handled separately.
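In C# this might look like the following sketch (OpenTK-flavored; it assumes Orientation.X = phi, Orientation.Y = ksi, and Orientation.Z is the roll angle):
float phi = Orientation.X, ksi = Orientation.Y, roll = Orientation.Z;
Vector3 lineOfSight = new Vector3(
    (float)(Math.Sin(phi) * Math.Cos(ksi)),
    (float)Math.Sin(ksi),
    (float)(Math.Cos(phi) * Math.Cos(ksi)));
Vector3 right = new Vector3(
    (float)(Math.Cos(phi) * Math.Cos(ksi)),
    0f,
    (float)(-Math.Sin(phi) * Math.Cos(ksi)));
Vector3 up = Vector3.Normalize(Vector3.Cross(lineOfSight, right));
// Apply roll by rotating `up` around the line of sight
up = Vector3.Transform(up, Quaternion.FromAxisAngle(lineOfSight, roll));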

How to get coords inside a transformed sprite?

I am trying to get the x and y coordinates inside a transformed sprite. I have a simple 200x200 sprite which rotates in the middle of the screen, with an origin of (0, 0) to keep things simple.
I have written a piece of code that can transform the mouse coordinates but only with a specified x OR y value.
int ox = (int)(MousePos.X - Position.X);
int oy = (int)(MousePos.Y - Position.Y);
Relative.X = (float)((ox - (Math.Sin(Rotation) * Y /* problem here */)) / Math.Cos(Rotation));
Relative.Y = (float)((oy + (Math.Sin(Rotation) * X /* problem here */)) / Math.Cos(Rotation));
How can I achieve this? Or how can I fix my equation?
The most general way is to express the transformation as a matrix. This way, you can add any other transformation later, if you find you need it.
For the given transformation, the matrix is:
var mat = Matrix.CreateRotationZ(Rotation) * Matrix.CreateTranslation(new Vector3(Position, 0f)); // CreateTranslation takes a Vector3, so lift the 2D position
This matrix can be interpreted as the system transformation from sprite space to world space. You want the inverse transformation - the system transformation from world space to sprite space.
var inv = Matrix.Invert(mat);
You can transform the mouse coordinates with this matrix:
var mouseInSpriteSpace = Vector2.Transform(MousePos, inv);
And you get the mouse position in the sprite's local system.
You can check if you have the correct matrix mat by using the overload of SpriteBatch.Begin() that takes a matrix. If you pass the matrix, draw the sprite at (0, 0) with no rotation.
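For example, a quick visual check might look like this (a sketch assuming MonoGame's SpriteBatch.Begin overload with a transform matrix):
// If `mat` is correct, the sprite drawn at the origin with no rotation
// appears exactly where it normally does.
spriteBatch.Begin(transformMatrix: mat);
spriteBatch.Draw(texture, Vector2.Zero, Color.White);
spriteBatch.End();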

Rotate a point by another point in 2D

I want to know how to work out the new co-ordinates for a point when rotated by an angle relative to another point.
I have a block arrow and want to rotate it by an angle theta relative to a point in the middle of the base of the arrow.
This is required to allow me to draw a polygon between 2 onscreen controls. I can't simply use and rotate an image.
From what I have considered so far, what complicates the matter further is that the origin of the screen is in the top left-hand corner.
If you rotate point (px, py) around point (ox, oy) by angle theta you'll get:
p'x = cos(theta) * (px - ox) - sin(theta) * (py - oy) + ox
p'y = sin(theta) * (px - ox) + cos(theta) * (py - oy) + oy
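As a C# helper (a direct translation of the formulas above, using System.Drawing's PointF; theta is in radians, and with the screen's Y axis pointing down a positive theta appears clockwise):
static PointF RotateAround(PointF p, PointF o, double theta)
{
    double cos = Math.Cos(theta), sin = Math.Sin(theta);
    return new PointF(
        (float)(cos * (p.X - o.X) - sin * (p.Y - o.Y) + o.X),
        (float)(sin * (p.X - o.X) + cos * (p.Y - o.Y) + o.Y));
}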
If you are using GDI+ to do that, you can use the Transform methods of the Graphics object:
graphics.TranslateTransform(ox, oy);    // move the origin to the rotation point
graphics.RotateTransform(thetaDegrees); // RotateTransform expects degrees
Then draw the actual stuff.
If you have the System.Windows.Media namespace available, then you can use the built-in transformations:
using System.Windows.Media;
var transform = new RotateTransform() {Angle = angleInDegrees, CenterX = center.X, CenterY = center.Y};
var transformedPoint = transform.Transform(point);
Alternatively, you can apply a layout transform to an image in WPF to rotate it by the angle you want:
progress_image.LayoutTransform = new RotateTransform(90);
