I am trying to rotate two Quaternions by the same amount (using the same Quaternion). I have an absoluteRotation and a relativeRotation (relative to its parent). I want to rotate the absoluteRotation towards another Transform's absoluteRotation, and then update both the stored relative rotation and the stored absoluteRotation to match again:
void LateUpdate()
{
    Quaternion newAbsRotation = Quaternion.RotateTowards(absRotationChain[0], boneChain[0].rotation, 0.5f);

    // Relative rotation between the old and the new absolute rotation: q' = q1^-1 * q2
    Quaternion modX = Quaternion.Inverse(absRotationChain[0]) * newAbsRotation;

    if (newAbsRotation == (absRotationChain[0] * modX)) {
        Debug.Log("Equal!");
    } else {
        Debug.Log("Unequal!");
    }

    absRotationChain[0] = newAbsRotation;
    // absRotationChain[0] = (absRotationChain[0] * modX); // doesn't work
}
However, I am unable to create a relative Quaternion for the rotation between the two absolute Quaternions. I've read in multiple sources that it is supposed to be q' = q1^-1 * q2, and in fact the code above never prints "Unequal!". However, this changes when I replace
absRotationChain[0] = newAbsRotation
with
absRotationChain[0] = (absRotationChain[0] * modX)
which is supposed to be equal. Instead of producing the same result, the Quaternion very quickly (within half a second) approaches a (0, 0, 0, 0) Quaternion. If I then also replace the third parameter of the RotateTowards function (0.5f) with 11f, my Quaternion becomes (NaN, NaN, NaN, NaN).
I've worked on this problem for hours now and can't find the reason for this issue, let alone a solution :(
I finally solved this problem!!
Apparently Unity has a precision problem with Quaternions.
First of all, the Quaternion == Quaternion comparison I used isn't exact but only an approximation (it even says so in the docs!). That's how I found out that the two quaternions actually differed at a very, very insignificant digit. I guess Unity doesn't properly normalize the Quaternions.
In the end, using
Quaternion.Euler((absRotationChain[0] * modX).eulerAngles)
proved to be a decent workaround -.-'
Edit: Oh look, knowing that it is a normalization issue brought me this link from a guy having the same problem: https://gamedev.stackexchange.com/questions/74437/unity-quaternion-rotate-unrotate-error
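An alternative to the Euler round-trip is to renormalize the quaternion yourself after each multiplication. A minimal sketch (NormalizeQuaternion is a hypothetical helper, not a Unity API; newer Unity versions also expose Quaternion.Normalize):

// Hypothetical helper: rescale the components so the quaternion has unit length again.
static Quaternion NormalizeQuaternion(Quaternion q)
{
    float mag = Mathf.Sqrt(q.x * q.x + q.y * q.y + q.z * q.z + q.w * q.w);
    if (mag < Mathf.Epsilon)
        return Quaternion.identity; // degenerate input, avoid division by zero

    return new Quaternion(q.x / mag, q.y / mag, q.z / mag, q.w / mag);
}

// Usage in LateUpdate:
// absRotationChain[0] = NormalizeQuaternion(absRotationChain[0] * modX);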
Basically I am looking for a simple way to use the rigidbody/physics engine to have my ship look at another object (the enemy). I figured that getting the direction between my transform and the enemy transform, converting that to a rotation, and then using the built-in MoveRotation might work, but it causes this weird effect where it just kind of tilts the ship. I posted the bit of code as well as images from before and after the attempt to rotate the ship (the sphere is the enemy).
private void FixedUpdate()
{
    // There is a user on the ship's control panel.
    if (user != null)
    {
        var direction = (enemyOfFocus.transform.position - ship.transform.position);
        var rotation = Quaternion.Euler(direction);
        ship.GetComponent<Rigidbody>().MoveRotation(rotation);
    }
}
Before.
After.
Well, Quaternion.Euler creates a rotation about the given Euler angles, for convenience provided as a Vector3.
Your direction is rather a vector pointing from the ship towards enemyOfFocus, and it has a magnitude, so the x, y, z values also depend on the distance between your objects. These are not Euler angles!
What you rather want is Quaternion.LookRotation (the example in the docs is basically exactly your use case ;) )
var direction = enemyOfFocus.transform.position - ship.transform.position;
var rotation = Quaternion.LookRotation(direction);
ship.GetComponent<Rigidbody>().MoveRotation(rotation);
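If you also want the ship to turn gradually instead of snapping to the target, you could limit the step per physics frame, e.g. with Quaternion.RotateTowards (a sketch; the degreesPerSecond value is just an assumed tuning parameter):

var direction = enemyOfFocus.transform.position - ship.transform.position;
var targetRotation = Quaternion.LookRotation(direction);
var rb = ship.GetComponent<Rigidbody>();

// Rotate at most degreesPerSecond * Time.fixedDeltaTime per physics step.
float degreesPerSecond = 90f; // assumed tuning value
rb.MoveRotation(Quaternion.RotateTowards(rb.rotation, targetRotation, degreesPerSecond * Time.fixedDeltaTime));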
For my research I need to know how exactly Unity implements the Slerp function for Vector3.
The Unity3D documentation describes that the input vectors are treated as directions rather than points in space. But it doesn't explain whether quaternions are used internally.
The Unity3D C# reference mentions Vector3.Slerp here:
[FreeFunction("VectorScripting::Slerp", IsThreadSafe = true)] extern public static Vector3 Slerp(Vector3 a, Vector3 b, float t);
However, I cannot find the definition anywhere. I think it's implemented in C++, and Unity's C++ code is only available with a license (as far as I know).
Can someone help me answer this question? All I need to know is whether Unity3D internally uses Quaternions for Vector3.Slerp(Vector3, Vector3, float).
Thank you in advance for your help.
I'm of course not sure, because we don't have the source code for these internal methods, but I'm pretty sure they would not use Quaternion, which would be rather inefficient, but pure and simple float-based math (sine, cosine, etc.), something that in C# would look similar to e.g. the solution mentioned here:
Vector3 Slerp(Vector3 start, Vector3 end, float percent)
{
    // Dot product - the cosine of the angle between 2 vectors.
    float dot = Vector3.Dot(start, end);

    // Clamp it to be in the range of Acos().
    // This may be unnecessary, but floating point
    // precision can be a fickle mistress.
    dot = Mathf.Clamp(dot, -1.0f, 1.0f);

    // Acos(dot) returns the angle between start and end,
    // and multiplying that by percent returns the angle between
    // start and the final result.
    float theta = Mathf.Acos(dot) * percent;

    // Component of end that is orthogonal to start;
    // together with start this forms an orthonormal basis.
    Vector3 relativeVec = end - start * dot;
    relativeVec.Normalize();

    // The final result.
    return (start * Mathf.Cos(theta)) + (relativeVec * Mathf.Sin(theta));
}
Though theirs is of course implemented in the underlying C++ environment and won't use Mathf, so it should perform a bit better.
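If you want to see how close this approximation comes to the built-in version, here is a quick comparison sketch (assuming the Slerp method above lives in the same MonoBehaviour; the test vectors are arbitrary unit vectors):

void Start()
{
    Vector3 a = new Vector3(1f, 0f, 0f);
    Vector3 b = new Vector3(0f, 1f, 0f);

    for (float t = 0f; t <= 1f; t += 0.25f)
    {
        // Compare the hand-rolled version against Unity's built-in one.
        Debug.Log("t=" + t + " manual=" + Slerp(a, b, t) + " builtin=" + Vector3.Slerp(a, b, t));
    }
}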
I have a minimap on the screen and I'm having problems determining the angle to a target relative to the camera.
Here is a drawing of what I mean, with some examples of camera position and direction:
The black triangles represent the camera.
The black arrows define their forward direction.
The blue arrows are the direction to the target (= red dot in the middle) from the camera.
The circles drawn at the specific cameras show the wanted orientation of their red dot.
Here's my code so far:
// Anchored position around the minimap circle
void CalculateScreenPos()
{
    Vector3 dir = transform.position - cam.position; // xz distance
    dir.y = 0;
    angle = Angle360(cam.forward.normalized, dir.normalized, cam.right);

    Vector2 desiredPosition = new Vector2(
        eX * -Mathf.Cos(angle * Mathf.PI / 180f) * dir.z,
        eY * Mathf.Sin(angle * Mathf.PI / 180f) * dir.x
    );

    minimapBlip.anchoredPosition = desiredPosition;
}
public static float Angle360(Vector3 from, Vector3 to, Vector3 right)
{
    float angle = Vector3.Angle(from, to);
    return (Vector3.Angle(right, to) > 90f) ? 360f - angle : angle;
}
But the angle doesn't seem to work properly; I found out that it ranges from
0° + cam.eulerAngles.x to 360° - cam.eulerAngles.x
so it only works when the cam is never looking towards the ground or the sky.
How do I get rid of the unwanted added x-eulerAngles without subtracting/adding them again to the angle?
angle -= cam.transform.eulerAngles.x
is a bad choice, because when the result of Angle360 is 360 it gets subtracted again, immediately leading to a wrong angle.
Also, the circle can be an ellipse; that's why I have put eX and eY into the desired position, which determine the extents of the ellipse.
I don't know what data types you are using for e.g. cam.position, cam.heading, etc., but here is some general guidance that will help you debug this problem.
Write a unit test. Prepare some canned data, both for input (e.g. set cam.position/cam.heading, transform, etc.) and output (the expected angle). Do this for a few cases, e.g. all six examples you've shown. This makes it easier to repeat your test and understand which case isn't working - you might see a pattern. It also makes it easy to run your code through a debugger.
Break your functions into logical units of work. What does Angle360 do? I guess you understand what it is supposed to do, but I think it is really two functions:
- Get the angle between the two vectors (current direction and target direction)
- Rotate the map (or something like that)
Write tests for those broken-out functions. You're just using some library angle-difference function - is it behaving as you expect?
Name your variables. You have a vector called right - what is that? There's no comment. Is it right as in 'correct', or as in 'opposite of left'? Why is it Vector3.Angle(right, to), and why not Vector3.Angle(to, right)?
Generally you are performing some math and getting tripped up because your code is not clear, things are not well named, and the approach is not clear. Break the problem into smaller pieces and the issue will become obvious.
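To make the first point concrete, a test for Angle360 could look like this (a sketch using NUnit via the Unity Test Framework; Angle360 is assumed to be accessible on some class, here called MinimapMath, and the expected values would come from your diagram):

using NUnit.Framework;
using UnityEngine;

public class Angle360Tests
{
    [Test]
    public void TargetStraightAhead_ReturnsZero()
    {
        // Camera looking down +Z, target straight ahead.
        float angle = MinimapMath.Angle360(Vector3.forward, Vector3.forward, Vector3.right);
        Assert.AreEqual(0f, angle, 0.001f);
    }

    [Test]
    public void TargetToTheRight_Returns90()
    {
        // Target exactly to the camera's right.
        float angle = MinimapMath.Angle360(Vector3.forward, Vector3.right, Vector3.right);
        Assert.AreEqual(90f, angle, 0.001f);
    }
}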
I solved it:
angle = Mathf.Atan2(heading.x, heading.z) * Mathf.Rad2Deg - Camera.main.transform.eulerAngles.y;

Vector2 desiredPosition = new Vector2(
    eX * -Mathf.Cos((angle + 90f) * Mathf.Deg2Rad),
    eY * Mathf.Sin((angle + 90f) * Mathf.Deg2Rad)
);
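For completeness, the whole method could look like this (assuming heading is the world-space vector from the camera to the target, as in the original CalculateScreenPos; taking Atan2 of x/z only measures yaw, so the camera's pitch no longer leaks into the angle):

void CalculateScreenPos()
{
    // World-space direction from the camera to the target; only yaw matters.
    Vector3 heading = transform.position - cam.position;

    // Yaw of the heading minus the camera's yaw - unaffected by the camera's pitch.
    float angle = Mathf.Atan2(heading.x, heading.z) * Mathf.Rad2Deg - Camera.main.transform.eulerAngles.y;

    Vector2 desiredPosition = new Vector2(
        eX * -Mathf.Cos((angle + 90f) * Mathf.Deg2Rad),
        eY * Mathf.Sin((angle + 90f) * Mathf.Deg2Rad)
    );
    minimapBlip.anchoredPosition = desiredPosition;
}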
Thanks for the help so far and happy coding :D
I'm making a game in Unity where the game's main camera is controlled by the phone's orientation. The problem I have is that the gyroscope's data is very noisy and makes the camera rotate in a very jittery way. I tried on different phones to make sure the hardware is not the problem (Galaxy S6 and Sony Xperia T2).
I've tried the following but none seems to work:
- Slerp between the current rotation and the new attitude (too much jitter, no matter what I multiply Time.deltaTime by):
// called at each update
transform.rotation = Quaternion.Slerp(transform.rotation,
    new Quaternion(-Input.gyro.attitude.x, -Input.gyro.attitude.y,
                   Input.gyro.attitude.z, Input.gyro.attitude.w),
    60 * Time.deltaTime);
- Average the last gyro samples (either an Euler angle average or a Quaternion average; both still show too much jitter, no matter how many samples I track):
// called at each update
if (vectQueue.Count >= 5)
{
    vectQueue.Enqueue(Input.gyro.attitude.eulerAngles);
    vectQueue.Dequeue();

    avgr = Vector3.zero; // reset the accumulator, otherwise old sums pile up
    foreach (Vector3 vect in vectQueue) {
        avgr = avgr + vect;
    }
    avgr = new Vector3(avgr.x / 5, avgr.y / 5, avgr.z / 5);

    transform.rotation = Quaternion.Slerp(transform.rotation, Quaternion.Euler(avgr), Time.deltaTime * 100);
}
else
{
    vectQueue.Enqueue(Input.gyro.attitude.eulerAngles);
}
- Round the gyroscope data (the best solution so far to prevent jitter, but obviously this isn't smooth at all)
- Apply a high/low-pass filter (doesn't seem to do anything):
public float alpha = 0.5f;
private bool initialized; // Vector3 is a struct and can never be null, so track the first frame explicitly

// called at each update
v1 = Input.gyro.attitude.eulerAngles;
if (!initialized)
{
    v2 = v1;
    v3 = Input.gyro.attitude.eulerAngles;
    initialized = true;
}
else
{
    v3 = (1 - alpha) * v3 + (1 - alpha) * (v1 - v2);
    v2 = v1;
    transform.Rotate(v3);
}
- Copy-paste code from other people, like this; but the jitter is way too strong in all cases.
So now I'm at a point where I guess I'll try learning Kalman filters and implement one in Unity, but obviously I'd prefer a black-box solution to save time.
Always use noise cancellation when using analogue input. You can do so by calculating the difference between the gyro values in the current frame and the gyro values in the previous frame, and only rotating your camera on the gyro values if the difference is greater than a desired amount (0.002 or 0.03 might be good).
This will eventually solve your jittering problem. Hope you get it working!
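A minimal sketch of that idea (using Quaternion.Angle instead of per-component differences, with an assumed threshold in degrees that you'd have to tune; GyroToUnity is just the same axis conversion as in the question's Slerp attempt):

Quaternion lastAttitude = Quaternion.identity;

void Update()
{
    Quaternion current = Input.gyro.attitude;

    // Only apply the new reading if it differs enough from the last applied one,
    // so sensor noise below the threshold doesn't move the camera.
    if (Quaternion.Angle(lastAttitude, current) > 0.5f) // assumed threshold, in degrees
    {
        transform.rotation = GyroToUnity(current);
        lastAttitude = current;
    }
}

// Convert the gyro attitude to Unity's coordinate conventions.
static Quaternion GyroToUnity(Quaternion q)
{
    return new Quaternion(-q.x, -q.y, q.z, q.w);
}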
Found the problem: all of the approaches I mentioned earlier are valid under normal circumstances (use a combination for optimal effect). The thing that was causing the insane jitter was the position of the camera. It had very large numbers as coordinates, and I guess this was messing up the engine.
Hello, although I think you have already found a solution, I had an idea about how to smooth it using Quaternion.Lerp:

// rot is used to face forward
Quaternion rot = new Quaternion(0, 0, 1, 0);

// gyro is assumed to hold the (converted) gyro attitude.
// Note: Lerp's third parameter is clamped to [0, 1], so a value like 10 just snaps
// to the target; use a small fraction per frame for actual smoothing.
transform.rotation = Quaternion.Lerp(transform.rotation * rot, gyro * rot, 0.1f);
I have some code I wrote for displaying a mirror as a plane whose texture is rendered to a RenderTarget2D each frame.
This all works perfectly fine, but I still think that there is something wrong with the way the reflection works (like the mirror isn't looking exactly where it's supposed to be looking).
There's a screenshot of the mirror that doesn't really look bad; the distortion mainly occurs when the player gets close to the mirror.
Here is my code for creating the mirror texture; notice that the mirror is rotated by 15 degrees on the X axis.
RenderTarget2D rt;
...
rt = new RenderTarget2D(device,
    (int)(graphics.PreferredBackBufferWidth * 1.5),
    (int)(graphics.PreferredBackBufferHeight * 1.5));
...
device.SetRenderTarget(rt);
device.Clear(Color.Black);

Vector3 camerafinalPosition = camera.position;
if (camera.isCrouched) camerafinalPosition.Y -= (camera.characterOffset.Y * 6 / 20);

// Center of the mirror's bounding box.
Vector3 mirrorPos = new Vector3(
    (room.boundingBoxes[8].Min.X + room.boundingBoxes[8].Max.X) / 2,
    (room.boundingBoxes[8].Min.Y + room.boundingBoxes[8].Max.Y) / 2,
    (room.boundingBoxes[8].Min.Z + room.boundingBoxes[8].Max.Z) / 2);

// Mirror the camera position across the mirror position, then apply the 15-degree tilt.
Vector3 cameraFinalTarget = new Vector3(
    (2 * mirrorPos.X) - camera.position.X,
    (2 * mirrorPos.Y) - camerafinalPosition.Y,
    camera.position.Z);
cameraFinalTarget = Vector3.Transform(cameraFinalTarget - mirrorPos,
    Matrix.CreateRotationX(MathHelper.ToRadians(-15))) + mirrorPos;

Matrix mirrorLookAt = Matrix.CreateLookAt(mirrorPos, cameraFinalTarget, Vector3.Up);

room.DrawRoom(mirrorLookAt, camera.projection, camera.position, camera.characterOffset, camera.isCrouched);
device.SetRenderTarget(null);
And then the mirror is drawn using the rt texture.
I suppose something isn't completely right with the reflection math or the way I create the LookAt matrix. Thanks for the help.
I didn't use XNA, but I did some Managed C# DirectX a long time ago, so I don't remember too much. But are you sure mirrorLookAt should point at cameraFinalTarget? Matrix.CreateLookAt creates a matrix out of from/to/up vectors - 'to' in your example is the point the mirror aims at. You need to calculate a vector from the camera position to the mirror position and then reflect it, and I don't see that in your code.
Unless your room.DrawRoom method calculates another mirrorLookAt matrix, I'm pretty sure your mirror target vector is the problem.
edit: Your reflection vector would be

Vector3 vectorToMirror = mirrorPos - camera.position;
Vector3 mirrorReflectionVector = vectorToMirror - 2 * Vector3.Dot(vectorToMirror, mirrorNormal) * mirrorNormal;

Also, I don't remember whether mirrorReflectionVector should be inverted (whether it points towards or away from the mirror). Just check both ways and you'll see. Then you create your mirrorLookAt from it (note that CreateLookAt expects a target point, so offset from the mirror position):

Matrix mirrorLookAt = Matrix.CreateLookAt(mirrorPos, mirrorPos + mirrorReflectionVector, Vector3.Up);
Though I don't know where the normal of your mirror is. Also, I've noticed one line I can't really understand:
if (camera.isCrouched) camerafinalPosition.Y -= (camera.characterOffset.Y * 6 / 20);
What's the point of that? Let's assume your camera is crouched - shouldn't its Y value be lowered already? I don't know how you render your main camera, but look at the mirror's rendering position - it's way lower than your main eye. I don't know how you use your isCrouched member, but if you want to lower the camera, just write yourself a method Crouch() or something similar that lowers the Y value a little. Later on you're using your DrawRoom method, in which you pass the camera.position parameter - yet it's not "lowered" by the crouch value, it's just the "pure" camera.position. That may be the reason it's not rendering properly. Let me know if that helped you.
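A sketch of that idea, reusing the names from the question (EyePosition is a hypothetical helper property on the camera class, not part of the original code; the offset is the same one the mirror code uses):

// Hypothetical helper: apply the crouch offset in one place, so the main view,
// the mirror, and DrawRoom all see the same eye position.
public Vector3 EyePosition
{
    get
    {
        Vector3 pos = position;
        if (isCrouched)
            pos.Y -= characterOffset.Y * 6f / 20f; // same offset as in the mirror code
        return pos;
    }
}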