Align 3D object to camera position - C#

I am using the Viewport3D control to visualize 3D tool models with the help of AB4D PowerToys.
Currently I'm trying to align my 3D text (which is a group of lines) to the camera, so that it always rotates toward the camera and appears like 2D text.
Is it possible to bind the transformation of my visual to the position of my camera?

Finally found the answer myself:
The following code should be called in the "CameraChanged" event handler.
private void UpdateTransforms()
{
    if (!this.IsAttachedToViewport3D())
    {
        return;
    }

    Matrix3D view, projection;
    if (Camera.GetCameraMatrixes(out view, out projection))
    {
        // Camera up and look directions, taken from the view matrix (world space)
        Vector3D up = new Vector3D(view.M12, view.M22, view.M32);
        Vector3D look = new Vector3D(view.M13, view.M23, view.M33);

        Transform3D transform = Transform;

        // Transform the camera directions into the model's local space
        Matrix3D inverseWorld = this.GetTransform();
        inverseWorld.Invert();

        look = inverseWorld.Transform(look);
        look.Normalize();

        up = inverseWorld.Transform(up);
        up.Normalize();

        Quaternion q = LookAtRotation(look, up);

        Transform3DGroup grp = new Transform3DGroup();
        grp.Children.Add(new RotateTransform3D(new QuaternionRotation3D(q), CenterPosition));
        grp.Children.Add(transform);
        Transform = grp;

        Camera.Refresh();
    }
}

private static Quaternion LookAtRotation(Vector3D lookAt, Vector3D upDirection)
{
    Vector3D forward = lookAt;
    Vector3D up = upDirection;

    // Orthonormalize
    forward.Normalize();
    up -= forward * Vector3D.DotProduct(up, forward);
    up.Normalize();

    Vector3D right = Vector3D.CrossProduct(up, forward);

    Quaternion q = new Quaternion
    {
        W = Math.Sqrt(1.0 + right.X + up.Y + forward.Z) * 0.5
    };
    double w4Recip = 1.0 / (4.0 * q.W);
    q.X = (up.Z - forward.Y) * w4Recip;
    q.Y = (forward.X - right.Z) * w4Recip;
    q.Z = (right.Y - up.X) * w4Recip;
    return q;
}
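For reference, the wiring might look like the following minimal sketch, assuming the same Camera reference used inside UpdateTransforms exposes the "CameraChanged" event mentioned above:

```csharp
// Hypothetical wiring: re-align the text whenever the camera moves.
// "Camera" is assumed to be the PowerToys camera used by the viewport.
Camera.CameraChanged += (sender, e) => UpdateTransforms();
```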

Related

Random Rotation on a 3D sphere given an angle

This question is somewhere between computer graphics, probability, and programming, but since I am coding it for a Unity project in C# I decided to post it here. Sorry if it's not appropriate.
I need to solve this problem: given an object at a certain position on a 3D sphere, and given a range of degrees, sample points on the sphere uniformly within that range.
For example:
Left picture: the cube represents the center of the sphere and the green sphere is the starting position. I want to uniformly cover the surface of the sphere within a certain angular range around the green sphere, for example from -90 to 90 degrees. My approach (right picture) doesn't work, as it over-samples points that are close to the starting position.
My sampler:
Vector3 getRandomEulerAngles(float min, float max)
{
    float degree = Random.Range(min, max);
    return degree * Vector3.Normalize(new Vector3(Random.Range(min, max), Random.Range(min, max), Random.Range(min, max)));
}
and for covering the top half of the sphere I would call getRandomEulerAngles(-90, 90).
Any idea?
Try this:
public class Sphere : MonoBehaviour
{
    public float Radius = 10f;
    public float Angle = 90f;

    private void Start()
    {
        for (int i = 0; i < 10000; i++)
        {
            var randomPosition = GetRandomPosition(Angle, Radius);
            Debug.DrawLine(transform.position, randomPosition, Color.green, 100f);
        }
    }

    private Vector3 GetRandomPosition(float angle, float radius)
    {
        var rotationX = Quaternion.AngleAxis(Random.Range(-angle, angle), transform.right);
        var rotationZ = Quaternion.AngleAxis(Random.Range(-angle, angle), transform.forward);
        var position = rotationZ * rotationX * transform.up * radius + transform.position;
        return position;
    }
}
We can use a uniform sphere sampling for that. Given two random variables u and v (uniformly distributed), we can calculate a random point (p, q, r) on the sphere (also uniformly distributed) with:
float azimuth = v * 2.0 * PI;
float cosDistFromZenith = 1.0 - u;
float sinDistFromZenith = sqrt(1.0 - cosDistFromZenith * cosDistFromZenith);
(p, q, r) = (cos(azimuth) * sinDistFromZenith, sin(azimuth) * sinDistFromZenith, cosDistFromZenith);
If we put our reference direction (your object location) into zenithal position, we need to sample v from [0, 1] to get all directions around the object and u in [cos(minDistance), cos(maxDistance)], where minDistance and maxDistance are the angle distances from the object you want to allow. A distance of 90° or Pi/2 will give you a hemisphere. A distance of 180° or Pi will give you the full sphere.
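As a quick sanity check of that claim, here is a small self-contained C# sketch (plain math, no Unity) that draws samples with this formula and verifies that every sample lies on the unit sphere and inside the requested angular band around the zenith:

```csharp
using System;

class ZenithSamplingDemo
{
    static void Main()
    {
        var rng = new Random(42);
        double minDist = 0.0;            // angular distance from the zenith, radians
        double maxDist = Math.PI / 2.0;  // a 90 degree band -> the upper hemisphere

        double cosMax = Math.Cos(maxDist);
        double cosMin = Math.Cos(minDist);

        for (int i = 0; i < 10000; i++)
        {
            // v uniform in [0, 1]; u chosen so cosDistFromZenith is uniform in [cos(max), cos(min)]
            double v = rng.NextDouble();
            double cosZen = cosMax + rng.NextDouble() * (cosMin - cosMax);
            double sinZen = Math.Sqrt(1.0 - cosZen * cosZen);
            double azimuth = v * 2.0 * Math.PI;

            double p = Math.Cos(azimuth) * sinZen;
            double q = Math.Sin(azimuth) * sinZen;
            double r = cosZen;

            // Every sample lies on the unit sphere...
            if (Math.Abs(p * p + q * q + r * r - 1.0) > 1e-9) throw new Exception("not on sphere");
            // ...and within the requested angular distance from the zenith.
            if (r < cosMax - 1e-9 || r > cosMin + 1e-9) throw new Exception("outside band");
        }
        Console.WriteLine("ok");
    }
}
```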
Now that we can sample the region around the object in zenithal position, we need to account for other object locations as well. Let the object be positioned at (ox, oy, oz), which is a unit vector describing the direction from the sphere center.
We then build a local coordinate system:
rAxis = (ox, oy, oz)
pAxis = if |ox| < 0.9 : (1, 0, 0)
else : (0, 1, 0)
qAxis = normalize(cross(rAxis, pAxis))
pAxis = cross(qAxis, rAxis)
And finally, we can get our random point (x, y, z) on the sphere surface:
(x, y, z) = p * pAxis + q * qAxis + r * rAxis
Adapted from Nico Schertler's answer, this is the code I am using:
Vector3 GetRandomAroundSphere(float angleA, float angleB, Vector3 aroundPosition)
{
    Assert.IsTrue(angleA >= 0 && angleB >= 0 && angleA <= 180 && angleB <= 180, "Both angles should be in [0, 180]");
    var v = Random.Range(0F, 1F);
    var a = Mathf.Cos(Mathf.Deg2Rad * angleA);
    var b = Mathf.Cos(Mathf.Deg2Rad * angleB);
    float azimuth = v * 2.0F * Mathf.PI;
    float cosDistFromZenith = Random.Range(Mathf.Min(a, b), Mathf.Max(a, b));
    float sinDistFromZenith = Mathf.Sqrt(1.0F - cosDistFromZenith * cosDistFromZenith);
    Vector3 pqr = new Vector3(Mathf.Cos(azimuth) * sinDistFromZenith, Mathf.Sin(azimuth) * sinDistFromZenith, cosDistFromZenith);
    Vector3 rAxis = aroundPosition; // Vector3.up when around the zenith
    Vector3 pAxis = Mathf.Abs(rAxis[0]) < 0.9 ? new Vector3(1F, 0F, 0F) : new Vector3(0F, 1F, 0F);
    Vector3 qAxis = Vector3.Normalize(Vector3.Cross(rAxis, pAxis));
    pAxis = Vector3.Cross(qAxis, rAxis);
    Vector3 position = pqr[0] * pAxis + pqr[1] * qAxis + pqr[2] * rAxis;
    return position;
}

Unity3d: Rotate a gameobject towards forward vector of another gameobject

BackCube.position = cameraEye.position - cameraEye.forward * 2;
float back = cameraEye.position.z - 2f;
BackCube.position = new Vector3(BackCube.position.x, BackCube.position.y, back);
var lookPosBack = cameraEye.position - BackCube.position;
lookPosBack.y = 0;
var rotationBack = Quaternion.LookRotation(lookPosBack);
BackCube.rotation = Quaternion.Slerp(BackCube.rotation, rotationBack, 1);
So, I want my BackCube to rotate towards the forward vector of cameraEye. The code above looks at cameraEye, but not along the forward vector of cameraEye. I want the forward vectors of both objects pointing at each other, 2 units apart. I only have control over the BackCube.
This worked!
Vector3 direction = Vector3.ProjectOnPlane(cameraEye.forward, Vector3.up);
BackCube.transform.position = cameraEye.position - direction.normalized * 2;
var lookPosBack = cameraEye.position - BackCube.position;
lookPosBack.y = 0;
var rotationBack = Quaternion.LookRotation(lookPosBack);
BackCube.rotation = Quaternion.Slerp(BackCube.rotation, rotationBack, 1);

Calculate position on ground for camera

I'm trying to calculate the position on the ground that a camera is looking at (without doing a raytrace since I know the forward vector of the camera and the height of the ground). I tried doing this using the dot product but I still seem to be getting the wrong answer. This is what I did with the values I was testing with:
const float groundHeight = 0f;
Vector3 cameraPosition = new Vector3(0f, 3f, 0f);
Vector3 cameraForward = new Vector3(0f, -0.7f, 0.7f); //Unit vector
Vector3 positionOnGround = cameraPosition;
positionOnGround.y = groundHeight;
Vector3 cameraToGround = positionOnGround - cameraPosition;
float scalar = Vector3.Dot(cameraToGround, cameraForward);
Vector3 forwardToGround = cameraForward * scalar;
return cameraPosition + forwardToGround;
For some reason this is giving me the position 0, 1.5, 1.5 when I'm looking for something that has a height of 0. Any ideas of what I'm doing wrong?
Let's define:
the camera is located at point C.
the camera is looking at its target point P on the ground (you want to find P).
yC = camera.transform.position.y is the camera's height above the ground.
G is the projection of the camera onto the ground.
zP is the distance of P from G.
the angle GCP is a, which in your case is 45 degrees since the forward vector CP is (0f, -0.7f, 0.7f).
Now:
tan(a) = magnitude(PG) / magnitude(CG)
       = zP / yC
       = zP / camera.transform.position.y
P = new Vector3(xP, yP, zP)
  = new Vector3(xC, 0, zP)
  = new Vector3(camera.transform.position.x, 0, tan(a) * camera.transform.position.y)
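As for the original dot-product attempt: projecting the camera-to-ground vector onto forward gives the distance to the point on the ray nearest to G, not to the ground itself. A ray-plane intersection avoids the trigonometry entirely and works for any forward vector with a downward component. A minimal self-contained sketch, using System.Numerics types instead of Unity's for illustration:

```csharp
using System;
using System.Numerics;

class GroundIntersection
{
    // Intersect the camera's forward ray with the horizontal plane y = groundHeight.
    static Vector3 LookAtOnGround(Vector3 cameraPosition, Vector3 cameraForward, float groundHeight)
    {
        // Distance along the forward ray needed to reach the ground plane.
        float t = (groundHeight - cameraPosition.Y) / cameraForward.Y;
        return cameraPosition + cameraForward * t;
    }

    static void Main()
    {
        // The test values from the question: camera 3 units up, looking 45 degrees down.
        var pos = new Vector3(0f, 3f, 0f);
        var fwd = new Vector3(0f, -0.7f, 0.7f);
        Vector3 hit = LookAtOnGround(pos, fwd, 0f);
        Console.WriteLine($"({hit.X:0.###}, {hit.Y:0.###}, {hit.Z:0.###})"); // (0, 0, 3)
    }
}
```

This agrees with the tan-based derivation above: tan(45 degrees) * 3 = 3 along z, at ground height 0.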

C# / OpenTK, why does my sphere not look smooth?

This should hopefully be a simple question. I finally figured out how to render things in 3D in OpenTK. Great! The only problem is that it doesn't quite look how I expect. I'm drawing a sphere using the polar method, and drawing with PrimitiveType.Polygon.
Here's the algorithm for calculating the coordinates. I step through each phi and theta on the sphere, incrementally adding adjacent quads to my final point list:
Point 1: Theta1, Phi1
Point 2: Theta1, Phi2
Point 3: Theta2, Phi2
Point 4: Theta2, Phi1
protected static RegularPolygon3D _create_unit(int n)
{
    List<Vector3> pts = new List<Vector3>();
    float theta = 0.0f;
    float theta2 = 0.0f;
    float phi = 0.0f;
    float phi2 = 0.0f;
    float segments = n;
    float cosT = 0.0f;
    float cosT2 = 0.0f;
    float cosP = 0.0f;
    float cosP2 = 0.0f;
    float sinT = 0.0f;
    float sinT2 = 0.0f;
    float sinP = 0.0f;
    float sinP2 = 0.0f;
    List<Vector3> current = new List<Vector3>(4);
    for (float lat = 0; lat < segments; lat++)
    {
        phi = (float)Math.PI * (lat / segments);
        phi2 = (float)Math.PI * ((lat + 1.0f) / segments);
        cosP = (float)Math.Cos(phi);
        cosP2 = (float)Math.Cos(phi2);
        sinP = (float)Math.Sin(phi);
        sinP2 = (float)Math.Sin(phi2);
        for (float lon = 0; lon < segments; lon++)
        {
            current = new List<Vector3>(4);
            theta = TWO_PI * (lon / segments);
            theta2 = TWO_PI * ((lon + 1.0f) / segments);
            cosT = (float)Math.Cos(theta);
            cosT2 = (float)Math.Cos(theta2);
            sinT = (float)Math.Sin(theta);
            sinT2 = (float)Math.Sin(theta2);
            current.Add(new Vector3(cosT * sinP, sinT * sinP, cosP));
            current.Add(new Vector3(cosT * sinP2, sinT * sinP2, cosP2));
            current.Add(new Vector3(cosT2 * sinP2, sinT2 * sinP2, cosP2));
            current.Add(new Vector3(cosT2 * sinP, sinT2 * sinP, cosP));
            pts.AddRange(current);
        }
    }
    var rtn = new RegularPolygon3D(pts);
    rtn.Translation = Vector3.Zero;
    rtn.Scale = Vector3.One;
    return rtn;
}
And so my Sphere class looks like this:
public class Sphere : RegularPolygon3D
{
    public static Sphere Create(Vector3 center, float radius)
    {
        var rp = RegularPolygon3D.Create(30, center, radius);
        return new Sphere(rp);
    }

    private Sphere(RegularPolygon3D polygon) : base(polygon)
    {
    }
}
I should also mention that the color of this sphere is not constant. In 2 dimensions, I have code that works great for gradients. In 3D... not so much. That's why my sphere has multiple colors. The way the 2D gradient code works is that there is a list of colors coming from a class I created called GeometryColor. When the polygon is rendered, every vertex gets colored based on the list of colors within GeometryColor. So if there are 3 colors the user wished to gradient between, and there were 6 vertices (a hexagon), then the code would assign the first 2 vertices color 1, the second 2 color 2, and the last 2 color 3. The following code shows how the color for a vertex is calculated.
public ColorLibrary.sRGB GetVertexFillColor(int index)
{
    var pct = ((float)index + 1.0f) / (float)Vertices.Count;
    var colorIdx = (int)Math.Round((FillColor.Colors.Count - 1.0f) * pct);
    return FillColor.Colors[colorIdx];
}
Anyway, here's the output I'm getting...hope somebody can see my error...
Thanks.
Edit: If I only use ONE vertex color (i.e. instead of my array of 4 different colors), then I get a completely smooth sphere... although without lighting and such it's hard to tell it's anything but a circle.
Edit: ...so somehow my sphere is slightly see-through, even though all my alphas are set to 1.0f and I'm doing depth testing:
GL.DepthMask(true);
GL.Enable(EnableCap.DepthTest);
GL.ClearDepth(1.0f);
GL.DepthFunc(DepthFunction.Lequal);
Final edit: OK, it has SOMETHING to do with my vertices I'm guessing, because when I use PrimitiveType.Quads it works perfectly....

Planet rotate around its local y-axis using Matrix4x4

I am having a bit of trouble figuring out how to rotate a sphere (planet) from a static starting position around its local y-axis using a Matrix4x4 with custom rotation formulas. The planet doesn't rotate at all. I've posted the code below. Thanks for any advice.
Note: ori is the rotation rate in degrees per second.
void OrientationRate()
{
    Matrix4x4 o = new Matrix4x4();
    float theta = ori * Time.fixedDeltaTime;
    o[0, 0] = Mathf.Cos(theta);
    o[0, 2] = -Mathf.Sin(theta);
    o[2, 0] = Mathf.Sin(theta);
    o[2, 2] = Mathf.Cos(theta);
    Vector4 p = this.transform.eulerAngles;
    p.w = 1;
    Vector4 pprime = (o * p);
    // set the new position
    this.transform.Rotate(new Vector3(pprime.z, pprime.x, pprime.y) * Time.fixedDeltaTime);
}
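A few things stand out in this code: in Unity, new Matrix4x4() is the zero matrix, so the unset diagonal entries (o[1,1] and o[3,3]) must be set to 1; Mathf.Cos/Mathf.Sin expect radians while ori is in degrees; and a rotation matrix should be applied to positions or direction vectors, not to transform.eulerAngles. The y-axis rotation matrix itself is fine, as this plain C# sketch (no Unity dependency, hypothetical names) shows by applying it to a point:

```csharp
using System;

class YAxisRotationDemo
{
    static void Main()
    {
        // The same y-axis rotation entries as in the question, in radians,
        // plus the identity entries that new Matrix4x4() would leave at zero in Unity.
        double theta = Math.PI / 2.0; // 90 degrees
        double[,] o = new double[4, 4];
        o[0, 0] = Math.Cos(theta);  o[0, 2] = -Math.Sin(theta);
        o[2, 0] = Math.Sin(theta);  o[2, 2] = Math.Cos(theta);
        o[1, 1] = 1.0;              o[3, 3] = 1.0;

        // Rotate the point (1, 0, 0) (homogeneous w = 1) about the y-axis.
        double[] p = { 1.0, 0.0, 0.0, 1.0 };
        double[] r = new double[4];
        for (int i = 0; i < 4; i++)
            for (int j = 0; j < 4; j++)
                r[i] += o[i, j] * p[j];

        Console.WriteLine($"({r[0]:0.##}, {r[1]:0.##}, {r[2]:0.##})"); // (0, 0, 1)
    }
}
```

In practice, a continuous spin around the local y-axis is usually just transform.Rotate(Vector3.up, ori * Time.fixedDeltaTime), which sidesteps the matrix entirely.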
