XNA rotation of a texture according to a point - c#

I'm writing a little test project. I have an object (with a position and bounding box) at an origin, and when something happens (say a mouse click/touch on phone), I want a line to be drawn from the origin object to the point.
Since I'm using a texture for the line, I realise I'm going to have to rotate it, but I have no idea how to work out how much to rotate the texture by. Any help would be appreciated.
So far, I have:
Vector2 Origin
Vector2 TouchPoint
and that's about it.
Thanks all!

There's a simple formula for calculating the angle from the X and Y coordinates:
float angle = (float)Math.Atan2(TouchPoint.Y - Origin.Y, TouchPoint.X - Origin.X);
You can use this angle in an overload of SpriteBatch.Draw() that accepts a rotation angle (in radians).
See this for reference:
http://msdn.microsoft.com/en-us/library/ff433992.aspx
If you need to convert between degrees and radians (Atan2 and Draw both work in radians):
float rad = deg * (float)Math.PI / 180f;
float deg = rad * 180f / (float)Math.PI;
XNA also provides MathHelper.ToRadians(deg) and MathHelper.ToDegrees(rad) for this.
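Putting the two together, here is a minimal sketch of the draw call, assuming a 1x1 white Texture2D called lineTexture and an already-begun SpriteBatch called spriteBatch (both names are made up for this example):
Vector2 delta = TouchPoint - Origin;
float angle = (float)Math.Atan2(delta.Y, delta.X); // rotation in radians
float length = delta.Length();                     // distance from Origin to TouchPoint

// Stretch the 1x1 texture to 'length' pixels, 2 pixels thick, rotated by 'angle'.
// The origin parameter (0, 0.5f) pins the left edge of the sprite to 'Origin'.
spriteBatch.Draw(
    lineTexture,
    Origin,                    // position
    null,                      // source rectangle (whole texture)
    Color.White,
    angle,                     // rotation in radians
    new Vector2(0f, 0.5f),     // rotation origin within the texture
    new Vector2(length, 2f),   // scale: X stretches along the line, Y is the thickness
    SpriteEffects.None,
    0f);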

Related

create vector from magnitude and angle

I'm trying to test some code that creates a Vector2 from its angle and its distance from the player position. I looked up how to do this and found the Vector2.Angle function to get the angle between two vectors and the Vector2.Distance function to find the distance between two vectors, and I thought I would be able to recreate the original vector (check_points[i]) from these values. But when I run this code, this is what I get (the white lines go from position to check_points[i], and the red lines go from position to other, which is the vector I'm trying to create from those values):
I would have expected the red and white lines to be the same. How can I make them match?
float angle = Vector2.Angle(position, check_points[i]) * Mathf.Deg2Rad;
float dist = Vector2.Distance(position, check_points[i]);
Vector2 other = new Vector2(Mathf.Cos(angle) * dist, Mathf.Sin(angle) * dist);
Debug.DrawLine(position, check_points[i], Color.white);
Debug.DrawLine(position, other, Color.red);
The point of this is to track the corners of obstacles so I can generate a mesh from the triangles made between the corners and the player, to do line-of-sight checking. I previously got this working by casting a ray out for every degree of rotation (0-360), but I read this tutorial and am trying to implement corner tracking to make it look smoother: https://ncase.me/sight-and-light/
So the issue is that position and check_points[i] are both positions; you are not actually working with directions here.
As noted in the comments, a position of (0,0,0), for example, wouldn't work at all: that vector has no direction, so the angle between it and another position is undefined.
What you want instead is to work with directions, like Vector2.right and check_points[i] - position.
The next issue is that Vector2.Angle is always positive, so you should use Vector2.SignedAngle here instead:
float angle = Vector2.SignedAngle(Vector2.right, check_points[i] - position) * Mathf.Deg2Rad;
float dist = Vector2.Distance(position, check_points[i]);
Then note that a point needs one more piece of information than just an angle and a distance (which on their own only define an offset): a start point. Add that offset to your position (which can now also be (0,0,0)), like
Vector2 other = position + new Vector2(Mathf.Cos(angle) * dist, Mathf.Sin(angle) * dist);
Debug.DrawLine(position, check_points[i], Color.white);
Debug.DrawLine(position, other, Color.red);
As you can see, the white lines are exactly covered by the red ones, so you only see the red lines recalculated from the angle and distance.
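If you want this in a reusable form, here is a minimal helper sketch (the method name is made up) that rebuilds a point from a signed angle relative to Vector2.right and a distance from a start position:
Vector2 FromAngleAndDistance(Vector2 position, float signedAngleDeg, float dist)
{
    // Convert the signed angle (degrees, measured from Vector2.right) to radians,
    // build the offset, and add it back onto the start position.
    float rad = signedAngleDeg * Mathf.Deg2Rad;
    return position + new Vector2(Mathf.Cos(rad) * dist, Mathf.Sin(rad) * dist);
}
Calling it with Vector2.SignedAngle(Vector2.right, check_points[i] - position) and Vector2.Distance(position, check_points[i]) should give back check_points[i].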

Converting coordinates placed on 360texture plane to spherical

I have a 360 spherical video. I use this video as a texture on a sphere in Unity. Inside the sphere is a camera and this functions as the setup for my Virtual Reality experience. Pretty basic.
I am trying to write a bit of code on the web where people can upload 360 images and videos, place a marker/hotspot on the 360 spherical image/video, and then apply the image/video texture to the sphere in Unity3D. If I overlay a simple x/y coordinate grid on the 360 video/image texture, put in some x/y coordinates to place the marker/hotspot, and put the texture back on the sphere, Unity will not interpret this correctly, since we are now in 3D space looking at the texture from inside the sphere, with all the distortion the mapping introduces.
My question is, how do I convert these x and y coordinates on the 2D plane of the 360 video texture to coordinates that can be understood in 3D within Unity3D?
My first thought was to use 2-dimensional cartesian coordinates and convert these into spherical coordinates, but I seem to be missing a z-axis in the cartesian coordinates to make this work.
Is the z-axis simply 0, or is it the radius from the center of the sphere to the x/y coordinate? What does the z-axis represent? Are there maybe two coordinate systems: one for coordinates on the plane and one measured from the centre of the sphere?
This is the conversion code that I have so far:
public static void CartesianToSpherical(Vector3 cartCoords, out float outRadius, out float outPolar, out float outElevation)
{
    if (cartCoords.x == 0)
        cartCoords.x = Mathf.Epsilon;
    outRadius = Mathf.Sqrt((cartCoords.x * cartCoords.x)
                         + (cartCoords.y * cartCoords.y)
                         + (cartCoords.z * cartCoords.z));
    outPolar = Mathf.Atan(cartCoords.z / cartCoords.x);
    if (cartCoords.x < 0)
        outPolar += Mathf.PI;
    outElevation = Mathf.Asin(cartCoords.y / outRadius);
}
This is my very first post so please excuse me if I am doing anything wrong and let me know how to improve.
Spherical co-ordinates are different from 2D or 3D Cartesian co-ordinates, as you need to measure them in radians. They are usually measured from the center of the sphere relative to the XY axes, and a range of angles marks out a rectangular area on the surface of the sphere. Please refer to this link for spherical co-ordinates in Unity: https://blog.nobel-joergensen.com/2010/10/22/spherical-coordinates-in-unity/
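Going the other way (from a marker's x/y on the equirectangular 360 texture to a point on the sphere), one approach is to treat x as longitude and y as latitude and convert from spherical to Cartesian. A rough sketch, assuming u and v are normalised to the 0..1 range, with the exact axis mapping left as an assumption you may need to tweak for your sphere's UV layout:
// u runs 0..1 left to right (longitude), v runs 0..1 bottom to top (latitude).
public static Vector3 EquirectangularUvToSphere(float u, float v, float radius)
{
    float longitude = (u - 0.5f) * 2f * Mathf.PI;   // -PI .. +PI around the vertical axis
    float latitude  = (v - 0.5f) * Mathf.PI;        // -PI/2 .. +PI/2 from the equator

    float y = Mathf.Sin(latitude) * radius;
    float r = Mathf.Cos(latitude) * radius;         // radius of the horizontal circle at that latitude
    float x = Mathf.Sin(longitude) * r;
    float z = Mathf.Cos(longitude) * r;

    return new Vector3(x, y, z);
}
The z you felt you were missing is not stored in the 2D coordinates at all; it falls out of this spherical-to-Cartesian conversion, scaled by the sphere's radius.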

Get an object to orbit around another when button is pressed C# XNA

I have a ship and I want the Ship to follow the mouse, it does that fine and dandy. When forward and backwards are pressed it goes towards and away from the mouse perfectly, but I can not figure out how to make the left and right buttons make the ship circle around the mouse in a clockwise/counterclockwise direction.
I have tried taking the ship's location and the mouse's location, creating a slope, and then getting the perpendicular to that slope, but that doesn't work either.
How can I achieve this? I do not think it needs code, more of an equation, but if there is code, please tell me.
You need the parametric form for the equation of a circle. Since you want it centered about the mouse's current location, you need an offset translation. Try something like:
float radius = 10f;
float shipX;
float shipY;
float angle = current_angle; // update this each frame to animate the orbit
shipX = mouseX + (radius * (float)Math.Sin(angle));
shipY = mouseY + (radius * (float)Math.Cos(angle));
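To tie that to the left and right buttons, here is a rough sketch of an Update step. The fields shipPosition and orbitSpeed are assumptions of mine, not from the question's code, and the sign of the angle change may need flipping depending on which way you want each key to circle:
// Inside the game's Update method; shipPosition and orbitSpeed are assumed
// fields of the game/ship class (names invented for this sketch).
protected override void Update(GameTime gameTime)
{
    float dt = (float)gameTime.ElapsedGameTime.TotalSeconds;
    KeyboardState keys = Keyboard.GetState();
    MouseState mouse = Mouse.GetState();
    Vector2 mousePos = new Vector2(mouse.X, mouse.Y);

    // The current offset from the mouse to the ship gives both the radius and the angle.
    Vector2 offset = shipPosition - mousePos;
    float radius = offset.Length();
    float angle = (float)Math.Atan2(offset.Y, offset.X);

    if (keys.IsKeyDown(Keys.Left))
        angle -= orbitSpeed * dt;   // circle one way
    if (keys.IsKeyDown(Keys.Right))
        angle += orbitSpeed * dt;   // circle the other way

    // Re-place the ship on the circle around the mouse at the adjusted angle.
    shipPosition = mousePos + radius * new Vector2(
        (float)Math.Cos(angle),
        (float)Math.Sin(angle));

    base.Update(gameTime);
}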

XNA 4.0: 2D Camera Y and X are going in wrong direction

So I know there are a few questions/answers regarding building a 2D Camera for XNA however people seem to just be happy posting their code without explanation.
I'm looking for more of an explanation of what I'm doing wrong.
First off, I understand the whole World -> View -> Projection -> Screen transformation.
My goal is to have a camera object that is centered in the center of the viewport and that when the camera's position moves up it correlates to moving up in the viewport and when it moves to the right it correlates moving right in the viewport.
I'm having difficulty implementing that functionality because the Y value of the viewport is inverted.
//In Camera Class
private void UpdateViewTransform()
{
    //My thinking here was that I would create a projection matrix to center the camera and then flip the Y axis appropriately
    Matrix proj = Matrix.CreateTranslation(new Vector3(_viewport.Width * 0.5f, _viewport.Height * 0.5f, 0)) *
                  Matrix.CreateScale(new Vector3(1f, -1f, 1f));
    //Here is the camera Matrix. I have to give the Inverse of this matrix to the Spritebatch I believe since I want to go from World Coordinates to Camera Coordinates
    _viewMatrix = Matrix.CreateRotationZ(_rotation) *
                  Matrix.CreateScale(new Vector3(_zoom, _zoom, 1.0f)) *
                  Matrix.CreateTranslation(_position.X, _position.Y, 0.0f);
    _viewMatrix = proj * _viewMatrix;
}
Can someone help me understand how I can build my view transformation to pass into the SpriteBatch so that I achieve what I'm looking for.
EDIT
This transformation seems to work, however I am unsure why. Can someone perhaps break it down for me so I can understand it:
Matrix proj = Matrix.CreateTranslation(new Vector3(_viewport.Width * 0.5f, _viewport.Height * 0.5f, 0));
_viewMatrix = Matrix.CreateRotationZ(_rotation) *
              Matrix.CreateScale(new Vector3(_zoom, _zoom, 1.0f)) *
              Matrix.CreateTranslation(-1 * _position.X, _position.Y, 0.0f);
_viewMatrix = proj * _viewMatrix;
I've built a raytracer before, so I should be able to follow your explanation; my confusion lies with the fact that this is 2D and that SpriteBatch is hiding what it's doing from me.
Thanks!
Farid
If you flip everything on the Y axis with your scale matrix, that means you are flipping the models that SpriteBatch is drawing (textured quads). This means you also have to change your winding-order (ie: backface culling is interpreting that you are drawing the backs of the triangles facing the camera, so it culls them, so you have to change the rule that it uses).
By default, SpriteBatch uses RasterizerState.CullCounterClockwise. When you call SpriteBatch.Begin you need to pass in RasterizerState.CullClockwise instead.
And, of course, Begin is where you pass in your transformation matrix.
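A minimal sketch of that Begin call, assuming the matrix built in UpdateViewTransform is stored in _viewMatrix and the batch is called spriteBatch; every parameter other than the rasterizer state and the transform is just a common default:
spriteBatch.Begin(
    SpriteSortMode.Deferred,
    BlendState.AlphaBlend,
    SamplerState.LinearClamp,
    DepthStencilState.None,
    RasterizerState.CullClockwise,   // needed when the view matrix flips the Y axis
    null,                            // no custom effect
    _viewMatrix);                    // the camera/view transform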
I haven't carefully checked your matrix operations - although I have a suspicion that the order is incorrect. I recommend you create a very simple testing app and build up your transformations one at a time.
I've fought with XNA a lot trying to get it to behave like other engines I have worked with before... my recommendation: it isn't worth it. Just go with the XNA standards and use the Matrix helper methods for creating your Perspective / Viewport matrices.
So after just thinking logically, I was able to deduce the proper transformation.
I'll outline the steps here in case anyone wants the real breakdown:
It is important to understand what is a Camera Transformation or View Transformation.
A View Transformation is normally what is needed to go from Coordinates relative to your Camera to World-Space coordinates. The inverse of the View Transformation would then make a world coordinate relative to your Camera!
Creating a View Matrix
Apply the camera's rotation. We do it around the Z axis only because this is a 2D camera.
Apply the translation of the camera. This makes sense since the world coordinate is the sum of the camera coordinate and the object's position relative to the camera.
You might notice that I multiplied the Y component by -1. This is so that increasing the camera's Y position correlates to moving up, since the screen's Y value points downward.
Apply the camera's zoom.
Now the inverse of the matrix will do World Coordinates -> View Coordinates.
I also chose to center my camera in the center of the screen, so I included a pre-appended translation.
Matrix proj = Matrix.CreateTranslation(new Vector3(_viewport.Width * 0.5f, _viewport.Height * 0.5f, 0));
_viewMatrix = Matrix.CreateRotationZ(moveComponent.Rotation) *
              Matrix.CreateTranslation(moveComponent.Position.X, -1 * moveComponent.Position.Y, 0.0f) *
              Matrix.CreateScale(new Vector3(zoomComponent.Zoom, zoomComponent.Zoom, 1.0f));
_viewMatrix = proj * Matrix.Invert(_viewMatrix);

xna make an object move around ITS axis (not around point 0,0,0)

How can I make an object rotate around its axis? i.e. having the moon rotating BOTH around point 0, 0, 0 and its own axis? So far I have only been able to do the point 0, 0, 0 point by using the gametime component and creating a rotation matrix.
Translate the object's center to (0,0,0), do the rotation, and translate back.
Let's say we have the following:
class Moon2D
{
    Texture2D texture;
    Vector2 axis;
    Vector2 origin;
}
The origin point could be (0,0), but let's say it's something more complex--something like (29,43). Now let's say the texture's width is 50 and the height is 90.
To get the axis for the texture for it to rotate around, assuming you want the center, you would do the following (assuming the origin (ie. current position) and texture are loaded):
axis.X = 0.5f * texture.Width;
axis.Y = 0.5f * texture.Height;
As you know, that would make the axis a vector of (25,45).
As BlueRaja states above, you could then make a method that looks like this:
void Rotate()
{
    origin.X -= axis.X;
    origin.Y -= axis.Y;
    // rotation goes here
    origin.X += axis.X;
    origin.Y += axis.Y;
}
This should work for any sort of standard texture. (And of course, you don't HAVE to have the Vector2 I made up called "axis" -- it's just for easy reference.)
Now, take the same logic and apply it for the 3D.
A word of advice: if you are trying to work through logic in 3D, look at the logic in 2D first. 9 times out of 10, you'll find the answer you're looking for!
(If I made any mistake in the translation steps in my Rotate() method, please let me know -- I'm sort of tired where I'm at and I'm not testing it, but the rotation should work like that, no?)
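In 3D with XNA, the same idea is usually expressed with matrices: spin around the moon's own axis first, then translate out to the orbit radius, then rotate around the central point. A rough sketch, where spinAngle, orbitAngle and orbitRadius are names invented for this example:
// Spin around the moon's own axis first, then push out to the orbit radius,
// then revolve around the world origin (0,0,0).
Matrix world =
    Matrix.CreateRotationY(spinAngle) *              // rotation around its own axis
    Matrix.CreateTranslation(orbitRadius, 0f, 0f) *  // move out to the orbit distance
    Matrix.CreateRotationY(orbitAngle);              // rotation around point (0,0,0)

// Use 'world' as the world matrix when drawing the moon, e.g. effect.World = world
// for a BasicEffect, and increment both angles with gameTime each frame.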
