How does Unity implement Vector3.Slerp exactly? - C#

For my research I need to know how exactly Unity implements the Slerp function for Vector3.
The Unity3D documentation describes that the input vectors are treated as directions rather than points in space. But it doesn't explain whether quaternions are used internally.
The Unity C# reference mentions Vector3.Slerp here:
[FreeFunction("VectorScripting::Slerp", IsThreadSafe = true)] extern public static Vector3 Slerp(Vector3 a, Vector3 b, float t);
However, I cannot find the definition anywhere. I think it's a reference into C++ code, and Unity's C++ source is only available with a license (as far as I know).
Can someone help me answer this question? All I need to know is whether Unity3D internally uses quaternions for Vector3.Slerp(Vector3, Vector3, float).
Thank you in advance for your help.

I'm of course not sure, because we don't have the source code for these internal methods, but I'm pretty sure they would not use Quaternion, which would be comparatively slow, but rather pure and simple float-based math (sine, cosine, etc.), something that in C# would look similar to the solution mentioned here:
Vector3 Slerp(Vector3 start, Vector3 end, float percent)
{
    // Dot product - the cosine of the angle between the 2 vectors.
    float dot = Vector3.Dot(start, end);
    // Clamp it to the valid input range of Acos().
    // This may be unnecessary, but floating point
    // precision can be a fickle mistress.
    dot = Mathf.Clamp(dot, -1.0f, 1.0f);
    // Acos(dot) returns the angle between start and end,
    // and multiplying that by percent gives the angle between
    // start and the final result.
    float theta = Mathf.Acos(dot) * percent;
    // Orthonormal basis: the component of end perpendicular to start.
    Vector3 relativeVec = end - (start * dot);
    relativeVec.Normalize();
    // The final result.
    return (start * Mathf.Cos(theta)) + (relativeVec * Mathf.Sin(theta));
}
Their version of course lives in the underlying C++ engine and won't use Mathf, and therefore should perform a bit better.
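For completeness, a quick usage sketch of the function above. Note that this version assumes its inputs are already normalized (the dot product is only the cosine of the angle for unit vectors), whereas Unity's built-in Vector3.Slerp also interpolates the magnitudes of its inputs:

// Usage sketch; inputs are unit vectors because the dot-as-cosine step assumes them.
Vector3 from = Vector3.forward;
Vector3 to = (Vector3.right + Vector3.up).normalized;
Vector3 halfway = Slerp(from, to, 0.5f);
// halfway points midway along the arc between from and to, still unit length.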

Related

Unity C# transform functions

I am new to Unity. I am learning to make a driving game, following a course from Udemy. So far things seem to be going well, but what the course didn't explain is why Unity takes floating-point values for its transform functions such as transform.Rotate() and transform.Translate(), and how we calculate a value for smooth motion. For example, transform.Rotate(0, 0, 45) makes the sprite rotate around the z axis super fast, while z = 0.1f makes it smooth. What does 0.1f correspond to in terms we humans can understand, and how does Unity read and interpret it?
I tried values with and without the f suffix but couldn't tell the difference. I also want to understand how Unity reads and interprets floating-point values.
What's the difference between:
float steerSpeed = 0.1f;

void Update()
{
    // difference between this
    float steerAmount = Input.GetAxis("Horizontal") * steerSpeed;
    // and this
    float steerAmount = Input.GetAxis("Horizontal");
    transform.Rotate(0, 0, steerAmount);
}
In C# (and some other programming languages), the f suffix means float.
For example, float x = 0.1f means the value of x is 0.1.
So, if you set the value of z in transform.Rotate to 0.1f, it rotates much more slowly than with z = 45.
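To make the difference concrete, here is a small sketch of what the two lines from the question do, assuming the standard "Horizontal" input axis (which returns a value between -1 and 1) and keeping in mind that transform.Rotate takes degrees:

float steerSpeed = 0.1f;

void Update()
{
    float raw = Input.GetAxis("Horizontal");  // e.g. 1.0 when steering fully right
    float scaled = raw * steerSpeed;          // e.g. 0.1 -- one tenth of the raw value
    // Rotates 0.1 degrees this frame instead of 1 degree, so the motion looks smooth.
    transform.Rotate(0, 0, scaled);
}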

How to determine a 360° angle from a relative direction?

I have a minimap on the screen and am having problems determining the angle to a target relative to the camera.
Here is some drawing of what I mean with some examples of camera position and direction:
The black triangles represent the camera.
The black arrows define their forward direction.
The blue arrows are the direction to the target (= red dot in the middle) from the camera.
The circle at each camera marks the desired orientation of its red dot.
Here's my code so far:
// Anchored position around minimap circle
void CalculateScreenPos()
{
    Vector3 dir = transform.position - cam.position; // xz distance
    dir.y = 0;
    angle = Angle360(cam.forward.normalized, dir.normalized, cam.right);
    Vector2 desiredPosition = new Vector2(
        eX * -Mathf.Cos(angle * Mathf.PI / 180f) * dir.z,
        eY * Mathf.Sin(angle * Mathf.PI / 180f) * dir.x
    );
    minimapBlip.anchoredPosition = desiredPosition;
}
public static float Angle360(Vector3 from, Vector3 to, Vector3 right)
{
    float angle = Vector3.Angle(from, to);
    return (Vector3.Angle(right, to) > 90f) ? 360f - angle : angle;
}
But the angle doesn't seem to work properly. I found out that it ranges from
0° + cam.eulerAngles.x to 360° - cam.eulerAngles.x
so it only works when the camera is never looking toward the ground or the sky.
How do I get rid of the unwanted added x euler angle, without subtracting/adding it back onto the angle?
angle -= cam.transform.eulerAngles.x
is a bad choice: when the result of Angle360 is 360, it gets subtracted again, immediately leading to a wrong angle.
Also, the circle can be an ellipse; that's why I have put eX and eY into the desired position, which determine the extents of the ellipse.
I don't know what data types you are using (e.g. cam.position, cam.heading, etc.), but here is some general guidance that will help you debug this problem.
Write a unit test. Prepare some canned data, both for input (e.g. set cam.position/cam.heading, transform, etc.) and output (the expected angle). Do this for a few cases, e.g. all six examples you've shown. This makes it easier to repeat your test and to understand which case isn't working - you might see a pattern. It also makes it easy to run your code through a debugger.
Break your functions into logical units of work. What does Angle360 do? I guess you understand what it is supposed to do, but I think it is really two functions:
Get the angle between the two vectors (current direction and target direction)
Rotate the map (or something like that)
Write tests for those broken-out functions; see the sketch after this answer. You're just using some library angle-difference function - is it behaving as you expect?
Name your variables. You have a vector called right - what is that? There's no comment. Is it right as in 'correct', or as in 'opposite of left'? Why is it Vector3.Angle(right, to), and why not Vector3.Angle(to, right)?
Generally you are performing some math and getting tripped up because your code is not clear, things are not well named, and the approach is not clear. Break the problem into smaller pieces and the issue will become obvious.
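As an illustration of the canned-data approach, a minimal test for Angle360 might look like this (the names and expected values are mine, chosen so the right answers are known by construction):

void TestAngle360()
{
    Vector3 forward = Vector3.forward; // camera looking down +Z
    Vector3 right = Vector3.right;
    // A target directly to the right should read 90 degrees...
    Debug.Assert(Mathf.Approximately(Angle360(forward, right, right), 90f));
    // ...and a target directly behind should read 180 degrees.
    Debug.Assert(Mathf.Approximately(Angle360(forward, -forward, right), 180f));
}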
I solved it:
angle = Mathf.Atan2(heading.x, heading.z) * Mathf.Rad2Deg - Camera.main.transform.eulerAngles.y;
Vector2 desiredPosition = new Vector2(
    eX * -Mathf.Cos((angle + 90) * Mathf.Deg2Rad),
    eY * Mathf.Sin((angle + 90) * Mathf.Deg2Rad)
);
Thanks for the help so far and happy coding :D
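Putting the fix in context, the corrected CalculateScreenPos might look roughly like this (a sketch assuming heading is the flattened camera-to-target vector from the original code; using only eulerAngles.y means the camera's pitch no longer leaks into the bearing, which was the original problem):

void CalculateScreenPos()
{
    // Flattened direction from the camera to the target.
    Vector3 heading = transform.position - cam.position;
    heading.y = 0;
    // World-space bearing to the target, minus the camera's yaw only.
    float angle = Mathf.Atan2(heading.x, heading.z) * Mathf.Rad2Deg
                  - Camera.main.transform.eulerAngles.y;
    Vector2 desiredPosition = new Vector2(
        eX * -Mathf.Cos((angle + 90) * Mathf.Deg2Rad),
        eY * Mathf.Sin((angle + 90) * Mathf.Deg2Rad)
    );
    minimapBlip.anchoredPosition = desiredPosition;
}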

Unable to create relative Quaternion in Unity3D

I am trying to rotate two Quaternions by the same amount (using the same Quaternion). I have an absoluteRotation and a relativeRotation (relative to its parent). I want to rotate the absoluteRotation towards another Transform's absoluteRotation, and then update both the stored relative rotation and the stored absoluteRotation to match again:
void LateUpdate()
{
    Quaternion newAbsRotation = Quaternion.RotateTowards(absRotationChain[0], boneChain[0].rotation, 0.5f);
    Quaternion modX = Quaternion.Inverse(absRotationChain[0]) * newAbsRotation;
    if (newAbsRotation == (absRotationChain[0] * modX)) {
        Debug.Log("Equal!");
    } else {
        Debug.Log("Unequal!");
    }
    absRotationChain[0] = newAbsRotation;
    // absRotationChain[0] = (absRotationChain[0] * modX); // doesn't work
}
However, I am unable to create a relative Quaternion for the rotation between the two absolute Quaternions. I've read in multiple sources that it is supposed to be q' = q1^-1 * q2, and in fact, the code above never prints "Unequal!". However, this changes when I replace
absRotationChain[0] = newAbsRotation
with
absRotationChain[0] = (absRotationChain[0] * modX)
which is supposed to be equal. Instead of producing the same result, the Quaternion quickly (within half a second) collapses toward a (0, 0, 0, 0) Quaternion. If I then also replace the third parameter of RotateTowards (0.5f) with 11f, the Quaternion becomes (NaN, NaN, NaN, NaN).
I've worked on this problem for hours now and can't find the reason for this issue, let alone a solution :(
I finally solved this problem!!
Apparently Unity has a precision problem with Quaternions.
First of all, the Quaternion == Quaternion comparison I made isn't exact, only an approximation (they even wrote it in the docs!). So I found out that the two quaternions actually differed at a very, very insignificant digit. I guess Unity doesn't properly normalize its Quaternions.
In the end, using
Quaternion.Euler((absRotationChain[0] * modX).eulerAngles)
proved to be a decent workaround -.-'
Edit: Oh look, knowing that it is a normalization issue brought me this link from a guy having the same problem: https://gamedev.stackexchange.com/questions/74437/unity-quaternion-rotate-unrotate-error
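If the root cause is drift away from unit length, an alternative to the Euler round-trip is to renormalize the quaternion explicitly after each multiplication. A minimal sketch (the helper below is my own, not part of Unity's API):

// Rescale a quaternion to unit length so repeated multiplications don't drift.
static Quaternion Renormalize(Quaternion q)
{
    float mag = Mathf.Sqrt(q.x * q.x + q.y * q.y + q.z * q.z + q.w * q.w);
    return new Quaternion(q.x / mag, q.y / mag, q.z / mag, q.w / mag);
}

// Usage in the original LateUpdate:
// absRotationChain[0] = Renormalize(absRotationChain[0] * modX);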

Distance to a plane

I've written a simple little helper method which calculates the distance from a point to a plane. However, it seems to be returning nonsensical results. The code I have for creating a plane is this:
Plane = new Plane(vertices.First().Position, vertices.Skip(1).First().Position, vertices.Skip(2).First().Position);
Fairly simple, I hope you'll agree. It creates an XNA plane structure using three points.
Now, immediately after this I do:
foreach (var v in vertices)
{
    float d = Math.Abs(v.ComputeDistance(Plane));
    if (d > Constants.TOLERANCE)
        throw new ArgumentException("all points in a polygon must share a common plane");
}
Using the same set of vertices I used to construct the plane, I get that exception thrown! Mathematically this is impossible, since those three points must lie on the plane.
My ComputeDistance method is:
public static float ComputeDistance(this Vector3 point, Plane plane)
{
    // Project the point onto the plane normal...
    float dot = Vector3.Dot(plane.Normal, point);
    // ...then offset by the plane's distance term.
    float value = dot - plane.D;
    return value;
}
As I understand it, this is correct. So what could I be doing wrong? Or might I be encountering a bug in XNA's implementation?
Some example data:
Points:
{X:0 Y:-0.5000001 Z:0.8660254}
{X:0.75 Y:-0.5000001 Z:-0.4330128}
{X:-0.75 Y:-0.5000001 Z:-0.4330126}
Plane created:
{Normal:{X:0 Y:0.9999999 Z:0} D:0.5} //I believe D should equal -0.5?
Distance from point 1 to plane:
1.0
It seems that your Plane is implemented so that D is not the projection of one of your points onto the plane normal, but rather the negative of this. You can think of this as projecting a vector from the plane to the origin onto the normal.
In any case, I believe that changing
float value = dot - plane.D;
to
float value = dot + plane.D;
should fix things. HTH.
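As a quick sanity check with the example data from the question (judging from that output, XNA builds D as the negative of the projection of a point onto the normal):

// With D = +0.5 as XNA produced it, dot + D lands on zero for a point on the plane.
Vector3 normal = new Vector3(0f, 1f, 0f);
float D = 0.5f;
Vector3 point = new Vector3(0f, -0.5000001f, 0.8660254f);
float dot = Vector3.Dot(normal, point); // -0.5000001
float distance = dot + D;               // ~0, within floating point tolerance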
Ok, I'm not totally sure I understand the math here, but I suspect (based on formulas from http://mathworld.wolfram.com/Point-PlaneDistance.html, among others) that
float value = dot - plane.D;
should actually be
float value = dot / plane.D;
EDIT: Ok, as mentioned in comments below, this didn't work. My best suggestion then is to go look at the link or google "distance between a point and a plane" and try implementing the formula a different way.

How do I account for gravity using a wiimote's accelerometer?

For a project, my team and I have been trying to track a Wiimote in 3D space using the built-in accelerometer and the Wii MotionPlus gyroscope.
We've been able to track the rotation and position using an ODE solver (found at http://www.alglib.net/), but we've run into a problem with removing the gravity component from the accelerometer readings.
We looked at Accelerometer gravity components, which had this formula (implemented in C# / XNA):
private Vector3 RemoveGravityFactor(Vector3 accel)
{
    float g = -1f;
    float pitchAngle = Rotation.Z;
    float rollAngle = Rotation.Y;
    float yawAngle = Rotation.X;

    float x = (float)(g * Math.Sin(pitchAngle));
    float y = (float)(-g * Math.Cos(pitchAngle) * Math.Sin(rollAngle));
    float z = (float)(-g * Math.Cos(pitchAngle) * Math.Cos(rollAngle));

    Vector3 offset = new Vector3(x, y, z);
    accel = accel - offset;
    return accel;
}
But it doesn't work at all. For reference, the acceleration comes straight from the accelerometer, and the rotation is measured in radians after being run through the ODE.
We are also having trouble understanding how this formula works. Since our tracking takes all dimensions into account, why is yaw not part of the formula?
Thanks in advance for any advice or help that is offered.
EDIT:
After discussing it with my teammates and boss, we found that this formula would actually work if we were using X, Y, and Z correctly. We've hit another stumbling block, though.
The problem is that the Wiimote library we're using returns relative rotational values based on the gyroscope movement. In other words, if the buttons are facing up, rotating the Wiimote left and right is yaw; but if the buttons are facing toward you, yaw reads the same, when it SHOULD be the rotation of the entire Wiimote.
We've found that Euler angles may be our answer, but we're unsure how to use them appropriately. If there is any input on this new development, or any other suggestions, please share them.
I'd bet that your accelerometer was not calibrated in zero gravity, so removing the effect of gravity will be difficult, at the very least.
First, I'd suggest not using individual components to store the rotation (gimbal lock); a matrix works better. Calibrate by holding the Wiimote still and measuring (it will read 1 g downward). Then, for each rotation, multiply the rotation matrix by it. You can then tell which way is up and subtract a vector representing 1 g downward from the vector representing the acceleration. I know that doesn't make a lot of sense, but I'm in a bit of a rush; add comments if you have questions.
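One way to read that advice in code: keep an orientation matrix, rotate the known 1 g world-space gravity vector into the sensor frame, and subtract it. Here is a rough sketch with XNA types; orientation is assumed to be the accumulated sensor-to-world rotation, and all names are illustrative. (As an aside on the earlier yaw question: gravity points along the world's vertical axis and is unchanged by rotation about it, which is why yaw drops out of the formula.)

// Remove gravity from a raw accelerometer reading, given the sensor's
// accumulated sensor-to-world rotation matrix.
private Vector3 RemoveGravity(Vector3 rawAccel, Matrix orientation)
{
    // Gravity is 1 g straight down in world space.
    Vector3 gravityWorld = new Vector3(0f, -1f, 0f);
    // Rotate it into the sensor's frame using the inverse (world-to-sensor) rotation.
    Vector3 gravitySensor = Vector3.Transform(gravityWorld, Matrix.Invert(orientation));
    // What remains after subtraction is acceleration due to motion alone.
    return rawAccel - gravitySensor;
}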
